WO2020189636A1 - Information providing system, moving body, information providing method, and information providing program - Google Patents


Info

Publication number
WO2020189636A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
moving body
vehicle
projector
information providing
Prior art date
Application number
PCT/JP2020/011521
Other languages
French (fr)
Inventor
Yuuki Suzuki
Hiroshi Yamaguchi
Tomohiro Nakajima
Original Assignee
Ricoh Company, Ltd.
Priority date
Filing date
Publication date
Application filed by Ricoh Company, Ltd.
Publication of WO2020189636A1 publication Critical patent/WO2020189636A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28 Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26 Arrangement of optical signalling or lighting devices, the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50 Arrangement of optical signalling or lighting devices for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/525 Arrangement of optical signalling or lighting devices automatically indicating risk of collision between vehicles in traffic or with pedestrians, e.g. after risk assessment using the vehicle sensor data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G01C21/3697 Output of additional, non-guidance related information, e.g. low fuel level
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/002 Control arrangements or circuits to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16 Type of output information
    • B60K2360/178 Warnings
    • B60K2360/179 Distances to obstacles or vehicles
    • B60K2360/20 Optical features of instruments
    • B60K2360/33 Illumination features
    • B60K2360/334 Projection means
    • B60K2360/77 Instrument locations other than the dashboard
    • B60K2360/797 Instrument locations other than the dashboard at the vehicle exterior
    • B60Q2400/00 Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50 Projected symbol or information, e.g. onto the road or car body
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20 Details of viewing arrangements characterised by the type of display used
    • B60R2300/205 Details of viewing arrangements characterised by the type of display used, using a head-up display
    • G09G2380/00 Specific applications
    • G09G2380/10 Automotive applications

Definitions

  • the disclosures discussed herein relate to an information providing system, a moving body, an information providing method, and an information providing program.
  • Notification technologies have been known in the art. For example, to notify pedestrians, oncoming cars, and the like of information regarding a vehicle, technologies such as projecting (displaying) graphics, shapes, characters, and the like on a road surface using headlights and the like of the vehicle have been used. In addition, to notify drivers of various types of information, head-up display devices installed in vehicles have been used to display a virtual image in a display area inside the vehicles.
  • the disclosed technology is intended to reduce a burden on drivers when providing information.
  • an information providing system installed in a moving body includes a projector; and a controller configured to control the projector, wherein the controller includes an external information acquiring unit configured to acquire external information outside the moving body, a target recognition unit configured to recognize one or more targets based on the external information, an estimating unit configured to estimate rankings of the one or more recognized targets, and a projection control unit configured to cause the projector to project an image outside the moving body, the image representing notification to a desired target, the desired target being specified, from among the one or more recognized targets, in accordance with the rankings of the one or more recognized targets.
  • the disclosed technology will reduce a burden on drivers when providing information.
  • FIG. 1 is a diagram illustrating an example of an information providing system according to a first embodiment.
  • FIG. 2A is a diagram illustrating an arrangement example of devices of the information providing system installed in a moving body.
  • FIG. 2B is another diagram illustrating an arrangement example of devices of the information providing system installed in a moving body.
  • FIG. 3 is a first diagram illustrating a hardware configuration example of a projector.
  • FIG. 4 is a second diagram illustrating a hardware configuration example of the projector.
  • FIG. 5 is a third diagram illustrating a hardware configuration example of the projector.
  • FIG. 6 is a diagram illustrating a hardware configuration example of a controller.
  • FIG. 7 is a diagram illustrating a functional configuration of a controller according to the first embodiment.
  • FIG. 8 is a diagram illustrating an example of a priority estimation table according to the first embodiment.
  • FIG. 9 is a diagram illustrating an example of a notification content table according to the first embodiment.
  • FIG. 10 is a flowchart illustrating an operation of the controller according to the first embodiment.
  • FIG. 11 is a flowchart illustrating a process of a projection control unit according to the first embodiment.
  • FIG. 12 is a first diagram illustrating an example of notification projected by a projector according to the first embodiment.
  • FIG. 13 is a second diagram illustrating an example of notification projected by the projector according to the first embodiment.
  • FIG. 14 is a third diagram illustrating an example of notification projected by the projector according to the first embodiment.
  • FIG. 15 is a diagram illustrating an example of an information providing system according to a second embodiment.
  • FIG. 16 is a schematic diagram illustrating a configuration of a HUD device according to the second embodiment.
  • FIG. 17 is a diagram illustrating a configuration of an optical unit of the HUD device.
  • FIG. 18 is a diagram illustrating a hardware configuration example of a controller of the HUD device.
  • FIG. 19 is a diagram illustrating a functional configuration of a controller according to the second embodiment.
  • FIG. 20 is a diagram illustrating an example of a notification content table according to the second embodiment.
  • FIG. 21 is a flowchart illustrating the controller according to the second embodiment.
  • FIG. 22 is a first diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
  • FIG. 23 is a second diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
  • FIG. 24 is a third diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
  • FIG. 25 is a diagram illustrating an example of notification by the projector according to the second embodiment.
  • FIG. 1 is a diagram illustrating an example of an information providing system according to the first embodiment.
  • the information providing system 100 is installed in a moving body.
  • a four-wheeled vehicle is described as an example of the moving body.
  • the information providing system 100 includes a controller 200 and a projector 300.
  • the controller 200 and projector 300 are connected to each other.
  • the controller 200 performs, for example, control with respect to a vehicle.
  • the controller 200 acquires information relating to the vehicle (hereinafter called "vehicle information" or "moving body information"), and information external to the vehicle, where the information providing system 100 is installed in the vehicle.
  • the information external to the vehicle is acquired, for example, by a sensor device 400 connected to the controller 200.
  • the sensor device 400 may, for example, be a stereo camera or the like installed in the vehicle.
  • the projector 300 is controlled by the controller 200.
  • the projector 300 according to the first embodiment uses a light source installed in the moving body to project (render) an image with respect to a space outside the moving body (vehicle), where the space acts as a projecting surface.
  • the space outside the vehicle may, for example, include a road surface on which the moving body is traveling, or a wall surface or the like outside the moving body.
  • the image projected by the projector 300 according to the first embodiment acts primarily as an image for notifying a person outside the vehicle of the presence of the vehicle.
  • thus, those who are present within a viewable range of the projected image can be notified of the presence of the vehicle.
  • FIGS. 2A and 2B are diagrams each illustrating an arrangement example of devices of the information providing system 100 installed in a moving body.
  • the information providing system 100 includes a controller 200, and projectors 300-1 and 300-2.
  • the projector 300-1 is, for example, located at a left headlight of a vehicle 1 to project a predetermined image ahead of the vehicle 1.
  • the projector 300-2 is, for example, located at a right headlight of the vehicle 1 to project a predetermined image ahead of the vehicle 1.
  • the projectors 300-1 and 300-2 according to the first embodiment have the same configuration. Accordingly, in the following description, the projectors 300-1 and 300-2 are referred to as a projector(s) 300 where identification of the projectors 300-1 and 300-2 is not required.
  • the controller 200 according to the first embodiment is disposed inside a dashboard of the vehicle 1, for example, as illustrated in FIG. 2B.
  • the controller 200 generates projection image data to be projected by the projector 300, and transmits the generated projection image data to the projector 300.
  • the number of projectors 300 installed in the information providing system 100 is not limited to two, but may be one.
  • a single projection device is, for example, located at a front-center position of the vehicle 1 to project a predetermined image ahead of the vehicle 1.
  • the number of projectors installed in the information providing system 100 may be three or more.
  • in such a case, projectors 300 may be disposed on the two sides of the vehicle 1 and on a rear surface (e.g., at a tail lamp position) of the vehicle 1, in addition to at the two headlights of the vehicle 1.
  • projectors 300 installed in the above manner can project a predetermined image ahead of, to the two sides of, and behind the vehicle 1.
  • FIG. 3 is a first diagram illustrating an example of a hardware configuration of a projector.
  • the projector 300 includes a light source 301, a collimator lens 302, MEMS (Micro Electro Mechanical Systems) 303, a wavelength conversion element 305, and a projection lens 306.
  • the light source 301 outputs light having, for example, a blue wavelength band to render predetermined projection image data generated by the controller 200.
  • the collimator lens 302 collects the luminous flux emitted from the light source 301 onto the MEMS 303.
  • the MEMS 303 has a reflective mirror, and is driven by a mechanism that tilts the mirror about two axes, i.e., longitudinally and laterally, based on control signals from the controller 200.
  • the MEMS 303 reflects the light collected by the collimator lens 302, and two-dimensionally scans the wavelength conversion element 305 by sweeping the reflected light within a range indicated by the scan width 304.
  • the wavelength conversion element 305 is a reflective phosphor to which predetermined projection image data is rendered.
  • the wavelength conversion element 305 emits yellow fluorescence (fluorescence with at least green and red wavelength bands) when illuminated by the blue luminous flux that is two-dimensionally scanned by the MEMS 303.
  • the projection lens 306 projects white light ahead of the vehicle 1. The white light is obtained by mixing light converted by the wavelength conversion element 305 with unconverted light. Accordingly, the projector 300 illustrated in FIG. 3 is enabled to project an image corresponding to the predetermined projection image data, generated by the controller 200, within a space ahead of the vehicle 1.
  • FIG. 4 is a second diagram illustrating an example of a hardware configuration of a projector 300.
  • the difference between the projector 300 illustrated in FIG. 4 and the projector 300 illustrated in FIG. 3 is that, in FIG. 4, the blue luminous flux two-dimensionally scanned by the MEMS 303 enters the wavelength conversion element 305a from its back side, because the wavelength conversion element 305a is a transparent phosphor.
  • the projector 300 illustrated in FIG. 4 is enabled to project an image corresponding to the generated predetermined projection image data within a space ahead of the vehicle 1.
  • FIG. 5 is a third diagram illustrating an example of a hardware configuration of a projector 300.
  • the difference between the projector 300 in FIG. 5 and the projectors 300 in FIGS. 3 and 4 is that the projector 300 in FIG. 5 includes a light source 301a instead of the light source 301, and a microdisplay 305b instead of the wavelength conversion elements 305 and 305a.
  • the light source 301a is a white LED that emits white light based on control signals from the controller 200. Light emitted from the light source 301a is directed to the microdisplay 305b via the collimator lens 302.
  • the microdisplay 305b is, for example, a digital micromirror device (DMD, registered trademark).
  • the microdisplay 305b displays predetermined projection image data generated by the controller 200, and controls on and off of image light on a per pixel basis, in accordance with the displayed predetermined projection image data.
  • when the image light is on, light emitted from the light source 301a and applied to the microdisplay 305b is reflected toward the projection lens 306.
  • the projector 300 illustrated in FIG. 5 is enabled to project an image corresponding to the predetermined projection image data generated by the controller 200 within a space ahead of the vehicle 1.
  • the microdisplay 305b is not limited to DMD (TM).
  • the microdisplay 305b may be a reflective liquid crystal panel or a transparent liquid crystal panel.
  • FIG. 6 is a diagram illustrating an example of a hardware configuration of the controller 200.
  • the controller 200 includes a CPU (Central Processing Unit) 201, a RAM (Random Access Memory) 202, a storage device 203, and an input/output device 204.
  • the above-described devices and units of the controller 200 are interconnected via a bus 205.
  • the CPU 201 is a computer that executes a program stored in the storage device 203 (such as a control program described below).
  • the RAM 202 is a main storage device such as DRAM (Dynamic Random Access Memory) and SRAM (Static Random Access Memory).
  • when the CPU 201 executes a program stored in the storage device 203, the program is loaded into the RAM 202, which functions as a work area.
  • the storage device 203 is a non-volatile memory such as an EPROM or an EEPROM, and stores a program to be executed by the CPU 201.
  • the input/output device 204 is an interface device for communicating with the projector 300 or CAN (Controller Area Network).
  • FIG. 7 is a diagram illustrating a functional configuration of the controller 200 according to the first embodiment.
  • the controller 200 includes a storage unit 210.
  • the storage unit 210 is implemented, for example, by the storage device 203 in FIG. 6.
  • the controller 200 includes a vehicle information acquiring unit 221, an external information acquiring unit 222, a target recognition unit 223, a recognition information generating unit 224, a priority estimating unit 225, a notification target specifying unit 226, a notification determining unit 227, and a projection control unit 230. Each of these units is implemented by executing a control program installed in the controller 200.
  • the storage unit 210 stores a priority estimation table 211 and a notification content table 212.
  • the priority estimation table 211 holds information about targets recognized in the vicinity of the vehicle 1, and is used when estimating notification priorities of those targets.
  • the notification content table 212 manages the projection image data to be projected by the projector 300.
  • the vehicle information acquiring unit 221 acquires vehicle information relating to the vehicle 1, in which the information providing system 100 is installed.
  • the vehicle information acquiring unit 221 according to the first embodiment acquires the vehicle information through the input/output device 204, via communication over the CAN, from a navigation system, or the like.
  • the vehicle information includes information indicating a speed, direction, and position of the vehicle 1.
  • the external information acquiring unit 222 acquires external information representing an external condition outside the vehicle 1. Specifically, the external information acquiring unit 222 may acquire the external information of the vehicle 1 from video data around the vehicle 1 captured by the sensor device 400. The external information acquiring unit 222 may acquire the external information by the input/output device 204 through communication from a monitoring device or the like other than the vehicle 1.
  • the monitoring device other than the vehicle 1 may, for example, be an imaging device disposed in a traffic light or the like.
  • the external information includes information indicating the presence or absence of an object, such as a person, a vehicle, or a building in a space outside the vehicle 1, or information indicating a shape of such an object.
  • the external information also includes three-dimensional information in a space outside the vehicle 1.
  • information representing a moving speed and moving direction of a target may be acquired by the sensor device 400 that sequentially acquires the external information.
  • notification priorities with respect to these targets may be estimated according to movements of the targets.
  • a target refers to a person who is to be notified of the presence of the vehicle 1, or an object of which the presence is to be notified to a driver of the vehicle 1.
  • the target recognition unit 223 recognizes a target present in a space outside the vehicle 1, based on the external information. Specifically, the target recognition unit 223 stores pattern information representing a shape of an object that may become a target, and recognizes a target by matching the external information with the pattern information.
  • the pattern information may include information representing various shapes of, for example, bicycles, motorcycles, guardrails, humans, vehicles, buildings, and the like.
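  • As an illustration of the matching described above, the following is a minimal sketch in Python; the shape descriptors, the cosine similarity measure, and the 0.8 threshold are assumptions, since the patent does not specify a concrete matching algorithm.

```python
# Minimal sketch of pattern-based target recognition (hypothetical
# shape descriptors; the matching algorithm is an assumption).
from dataclasses import dataclass
from typing import List

@dataclass
class Pattern:
    target_type: str        # e.g., "bicycle", "human", "vehicle", "guardrail"
    template: List[float]   # stored shape descriptor for this type

def similarity(a: List[float], b: List[float]) -> float:
    """Toy cosine similarity between two fixed-length shape descriptors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def recognize_targets(shapes: List[List[float]],
                      patterns: List[Pattern],
                      threshold: float = 0.8) -> List[str]:
    """Return the target type of each detected shape that matches a pattern."""
    recognized = []
    for shape in shapes:
        best = max(patterns, key=lambda p: similarity(shape, p.template))
        if similarity(shape, best.template) >= threshold:
            recognized.append(best.target_type)
    return recognized
```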
  • the recognition information generating unit 224 acquires recognition information on a per target basis, in accordance with the recognition results obtained by the target recognition unit 223 and the vehicle information acquired by the vehicle information acquiring unit 221.
  • the recognition information includes, for example, as items of information, a type of a target, a moving speed of a target, a position of a target relative to the vehicle 1, a distance between the vehicle 1 and a target, a moving direction of a target, and the like. That is, the recognition information is information indicating a status of each of targets recognized by the target recognition unit 223.
  • the recognition information generating unit 224 generates the above-described recognition information about targets recognized by the target recognition unit 223 on a per target basis, in accordance with the external information and the vehicle information.
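  • The vehicle information and the recognition information can be pictured as simple records; the following sketch models the items listed above as Python dataclasses, with field names and types chosen for illustration only.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class VehicleInfo:
    """Vehicle information acquired via the CAN or a navigation system."""
    speed_kmh: float                # speed of the vehicle 1
    heading: str                    # moving direction, e.g., "north"
    position: Tuple[float, float]   # e.g., (latitude, longitude)

@dataclass
class RecognitionInfo:
    """Per-target status generated from external and vehicle information."""
    target_type: str        # e.g., "bicycle"
    relative_position: str  # position relative to the vehicle 1, e.g., "left front"
    speed_kmh: float        # moving speed of the target
    distance_m: float       # distance between the vehicle 1 and the target
    heading: str            # moving direction of the target
```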
  • the priority estimating unit 225 estimates a notification priority relating to each of the recognized targets, based on a combination of the recognition information generated by the recognition information generating unit 224 and the vehicle information. Specifically, the priority estimating unit 225 refers to the priority estimation table 211, and acquires a priority corresponding to a combination of the recognition information and the vehicle information, with respect to each of the targets.
  • the notification target specifying unit 226 specifies a target to which the notification is projected by the projector 300, in accordance with the notification priority relating to the target.
  • the notification determining unit 227 determines whether to cause the projector 300 to project projection image data, in accordance with the target specified by the notification target specifying unit 226. In other words, the notification determining unit 227 determines whether to notify the specified target of the presence of the vehicle 1.
  • the notification determining unit 227 determines that the projection image data is to be projected to the specified target when the specified target is a person or a moving object driven by a person, for example. Further, the notification determining unit 227 according to the first embodiment determines that the projection image data is to be projected when the priority corresponding to the specified target is higher than a predetermined level.
  • the projection control unit 230 controls projection of the projection image data performed by the projector 300.
  • the projection control unit 230 includes a projection position determining unit 231, a projectability determining unit 232, a content determining unit 233, and a projection instructing unit 234.
  • the projection position determining unit 231 determines a position, on the road surface, at which projection image data is projected. Specifically, the projection position determining unit 231 determines a projection position at which projection image data is projected, in accordance with a position, a moving speed, and a moving direction of a specified target.
  • the projectability determining unit 232 determines whether the projection position determined by the projection position determining unit 231 is a projectable position on a road surface at which the projection image data is projected. Specifically, the projectability determining unit 232 determines whether unevenness of the road surface is within an allowable range, based on three-dimensional external information.
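  • The unevenness check could look like the following sketch; the (x, y, height) sampling and the 5 cm allowable range are assumptions.

```python
from typing import List, Tuple

def is_projectable(area_points: List[Tuple[float, float, float]],
                   max_unevenness_m: float = 0.05) -> bool:
    """Check whether road-surface unevenness within a candidate projection
    area is inside the allowable range, using (x, y, height) samples taken
    from the three-dimensional external information."""
    if not area_points:
        return False
    heights = [p[2] for p in area_points]
    return (max(heights) - min(heights)) <= max_unevenness_m
```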
  • the content determining unit 233 refers to the notification content table 212, and determines the projection image data to be projected, in accordance with the type and the priority of the specified target.
  • the projection instructing unit 234 instructs the projector 300 to project the projection image data determined by the content determining unit 233.
  • FIG. 8 is a diagram illustrating an example of a priority estimation table according to the first embodiment.
  • the priority estimation table 211 includes, as items of information, a type of a target, a position of a target, a moving speed of a target, a distance from a vehicle (vehicle 1 as a reference), a moving direction of a target, a vehicle speed (a speed of the vehicle 1), a moving direction of the vehicle (moving direction of the vehicle 1), and a priority.
  • the items "a type of a target”, “a position of a target”, “a moving speed of a target”, “a distance from the vehicle,” and "a moving direction of a target” are items included in the recognition information generated from the external information by the recognition information generating unit 224.
  • the items "vehicle speed" and "a moving direction of the vehicle” are the items included in the vehicle information acquired by the vehicle information acquiring unit 221.
  • the priority estimation table 211 associates the recognition information and the vehicle information with the item "priority".
  • a value of the item "type of target” indicates a type of a target.
  • a value of the item "position of target” indicates a position of a target relative to the vehicle 1 acting as a reference. Note that a value of the item "position of target” may be indicated by latitude and longitude.
  • a value of the item "moving speed of target” indicates a speed of a target
  • a value of the item “distance from vehicle” indicates a distance between the vehicle 1 and a target
  • a value of the item "moving direction of target” indicates a moving direction of a target.
  • a value of the item "vehicle speed” indicates a speed of the vehicle 1
  • a value of the item "moving direction of the vehicle” indicates a moving direction of the vehicle 1.
  • a value of the item "priority” indicates a notification priority with respect to a combination of the recognition information and the vehicle information.
  • the recognition information is indicated by respective values of the "type of target”, “position of target”, “moving speed of target”, “distance from vehicle”, and “moving direction of target”.
  • the vehicle information is indicated by respective values of the "vehicle speed” and "moving direction of the vehicle”.
  • as one example, the recognition information indicates that a target is a bicycle moving at a speed of 10 km/h toward the south at the left front of the vehicle 1, at a distance of 50 m from the vehicle 1, while the vehicle information indicates that the vehicle 1 is moving at a speed of 30 km/h toward the north. In this case, the bicycle and the vehicle 1 are moving in close proximity to each other. Hence, it is preferable to notify the rider of the bicycle of the presence of the vehicle 1. Accordingly, the priority of this combination is "high".
  • as another example, a target is a vehicle moving toward the northeast at a speed of 50 km/h behind the vehicle 1 acting as a reference, at a distance of 100 m from the vehicle 1, while the vehicle 1 is also moving toward the northeast at a speed of 50 km/h. In this case, the distance between the two vehicles does not decrease, so the priority of this combination is "low".
  • the priority estimation table 211 according to the first embodiment includes a notification priority in accordance with the combination of the recognition information and the vehicle information.
  • the priority estimation table 211 according to the first embodiment is referred to when sorting the recognized targets into two types: one type is a target to which notification is presented preferentially, and the other type is a target to which notification is not presented preferentially.
  • the priority estimating unit 225 refers to the priority estimation table 211 and sets rankings (i.e., the notification priority) with respect to the recognized targets. That is, according to the first embodiment, the notification priority may also be referred to as "ranking".
  • the combination of the recognition information and the vehicle information with high priority indicates a status in which notification should be preferentially presented.
  • the combination of the recognition information and the vehicle information with high priority may indicate a high-risk status.
  • the priority estimation table 211 according to the first embodiment may be stored in the storage unit 210 in advance.
  • the priority estimation table 211 according to the first embodiment may be periodically updated by, for example, a server that collects traffic information, or the like.
  • the controller 200 may periodically access the server to acquire an updated priority estimation table 211, and overwrite the storage unit 210 with the updated priority estimation table 211.
  • by updating the priority estimation table 211 as described above, cases that have resulted in an accident and the like may be reflected in the priority estimation table 211. Further, according to the first embodiment, inclusion of the priority estimation table 211 may enable priority estimation without the need for communication outside the vehicle 1; this reduces the communication load and improves the processing speed.
  • a priority with respect to a combination of the recognition information and the vehicle information may not necessarily be stored as a table such as the priority estimation table 211.
  • the priority may be a value calculated based on conditions indicated by a combination of the recognition information and the vehicle information.
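  • A minimal sketch of this estimation, combining the table lookup with a computed fallback, is given below; the row layout and the time-to-encounter heuristic are assumptions, and the dataclass fields follow the earlier sketch.

```python
def estimate_priority(rec, veh, table):
    """Return the priority for a (recognition info, vehicle info) pair.

    `table` rows mirror FIG. 8: (recognition-info key, vehicle-info key,
    priority). If no row matches, fall back to a value computed from the
    conditions; the head-on time-to-encounter heuristic below is purely
    illustrative.
    """
    key_rec = (rec.target_type, rec.relative_position, rec.speed_kmh,
               rec.distance_m, rec.heading)
    key_veh = (veh.speed_kmh, veh.heading)
    for row_rec, row_veh, priority in table:
        if row_rec == key_rec and row_veh == key_veh:
            return priority
    closing_mps = (rec.speed_kmh + veh.speed_kmh) / 3.6   # crude head-on case
    return "high" if rec.distance_m / max(closing_mps, 0.1) < 5.0 else "low"
```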
  • FIG. 9 is a diagram illustrating an example of the notification content table 212 according to the first embodiment.
  • the notification content table 212 includes a type of a target, a priority, and a notification content as items of information, which are associated with each other.
  • a value of the item "notification content” is projection image data to be rendered by the projector 300.
  • for example, when the notification content associated with the type and the priority of a specified target is projection image data 11, the projector 300 projects the projection image data 11.
  • the notification content table 212 may be stored in the storage unit 210 in advance.
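  • The notification content table can be pictured as a mapping keyed by target type and priority; in the following sketch, the association of projection image data 11 with a particular row, and all other entries, are hypothetical.

```python
# Sketch of the notification content table as a dictionary; the specific
# (type, priority) -> image associations are assumptions.
NOTIFICATION_CONTENT = {
    ("bicycle",    "high"): "projection_image_data_11",
    ("pedestrian", "high"): "projection_image_data_12",  # hypothetical entry
}

def determine_content(target_type: str, priority: str):
    """Return the projection image data for the given target type and
    priority, or None when the table has no corresponding entry."""
    return NOTIFICATION_CONTENT.get((target_type, priority))
```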
  • FIG. 10 is a flowchart illustrating an operation of the controller 200 according to the first embodiment.
  • in step S1001, the controller 200 according to the first embodiment acquires vehicle information and external information via the vehicle information acquiring unit 221 and the external information acquiring unit 222, respectively.
  • in step S1002, the controller 200 determines, via the target recognition unit 223, whether a target is present outside the vehicle 1.
  • when no target is present (NO in step S1002), the controller 200 proceeds to step S1009, which will be described later.
  • the controller 200 When one or more targets are present outside the vehicle 1 (YES in step S1002), the controller 200 generates, via the recognition information generating unit 224, recognition information for the one or more targets recognized by the target recognition unit 223 on a per target basis (step S1003).
  • in step S1004, the controller 200 estimates, via the priority estimating unit 225, notification priorities of the one or more recognized targets, on a per target basis.
  • the priority estimating unit 225 refers to the priority estimation table 211, and acquires a priority in association with a combination of the recognition information generated for each of the one or more recognized targets and the vehicle information, as a priority relating to notification to the corresponding recognized target.
  • the recognition information is generated by the recognition information generating unit 224 in step S1003.
  • the combination of the recognition information generated by the recognition information generating unit 224 and the vehicle information does not have to completely match a combination of the recognition information and the vehicle information stored in the priority estimation table 211.
  • the combination of the recognition information and the vehicle information may be considered to match the combination stored in the priority estimation table 211 when a value of the moving speed of a target in the recognition information generated by the recognition information generating unit 224 falls within a predetermined range of the moving speed of the target in the recognition information stored in the priority estimation table 211.
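  • Such a tolerance match might be implemented as in the following sketch, where the +/-5 km/h range is an illustrative assumption.

```python
def speeds_match(observed_kmh: float, stored_kmh: float,
                 tolerance_kmh: float = 5.0) -> bool:
    """Treat the generated recognition information as matching a table row
    when the observed target speed falls within a predetermined range of
    the stored speed."""
    return abs(observed_kmh - stored_kmh) <= tolerance_kmh
```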
  • in step S1005, the controller 200 specifies, via the notification target specifying unit 226, the target having the highest priority from among the recognized targets with estimated priorities.
  • in step S1006, the controller 200 determines, via the notification determining unit 227, whether the priority of the specified target is at a predetermined level that requires notification. When the priority is equal to or lower than the predetermined level, the projection image data is not necessarily projected by the projector 300.
  • the predetermined level may be set in advance.
  • in step S1006, when the priority of the specified target is "low", for example, the notification determining unit 227 may determine that the priority of the specified target does not require notification, and may determine not to project the projection image data.
  • when the notification determining unit 227 determines in step S1006 that the priority of the specified target is not at the predetermined level that requires notification, the controller 200 proceeds to step S1009, which will be described later.
  • when the notification determining unit 227 determines in step S1006 that the priority of the specified target is at the predetermined level that requires notification, the notification determining unit 227 determines, in step S1007, whether the type of the specified target is a person or an object involving a person. Specifically, the notification determining unit 227 determines whether the specified target is a pedestrian, or an object operated by a person, such as a bicycle or a motorcycle.
  • in step S1007, when the type of the specified target is not a person or an object involving a person, the controller 200 proceeds to step S1009, which will be described later.
  • Targets that are not a person or an object involving a person may, for example, be guardrails or buildings.
  • in step S1007, when the type of the specified target is a person or an object involving a person, the controller 200 causes, via the projection control unit 230, the projector 300 to project projection image data (step S1008). Details of step S1008 are described below.
  • in step S1009, the controller 200 determines whether the engine of the vehicle 1 has been stopped. When the engine has not been stopped, the controller 200 returns to step S1001; when the engine has been stopped, the controller 200 ends the process.
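  • The overall flow of FIG. 10 can be summarized by the following sketch; every method name on the hypothetical `ctrl` object is an assumption.

```python
PRIORITY_RANK = {"low": 0, "high": 1}   # ranking used for step S1005

def control_loop(ctrl):
    """Sketch of the FIG. 10 flow with assumed method names."""
    while not ctrl.engine_stopped():                        # S1009
        veh = ctrl.acquire_vehicle_info()                   # S1001
        ext = ctrl.acquire_external_info()
        targets = ctrl.recognize_targets(ext)               # S1002
        if not targets:
            continue
        recs = [ctrl.generate_recognition_info(t, veh)      # S1003
                for t in targets]
        for rec in recs:                                    # S1004
            rec.priority = ctrl.estimate_priority(rec, veh)
        top = max(recs, key=lambda r: PRIORITY_RANK[r.priority])  # S1005
        if PRIORITY_RANK[top.priority] == 0:                # S1006: "low"
            continue
        if top.target_type not in ("pedestrian", "bicycle",  # S1007
                                   "motorcycle"):
            continue
        ctrl.project_notification(top)                      # S1008
```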
  • FIG. 11 is a flowchart illustrating a process of the projection control unit 230 according to the first embodiment.
  • the process illustrated in FIG. 11 illustrates details of step S1008 in FIG. 10.
  • in step S1101, the projection control unit 230 according to the first embodiment determines, via the projection position determining unit 231, a projection position at which the projection image data is projected.
  • the projection position at which the projection image data is projected indicates a projection area, on a road surface, in which an image is projected.
  • the projection position determining unit 231 may, for example, determine a projection position, based on a position, a moving direction, and a moving speed of a target.
  • a projection position may be included in a projectable range within which the projection image data is projected by the projector 300, and may also be included in a viewable range within which the projection image data is viewed by a person who is a target, or by a person who drives a target.
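  • One way to place the image ahead of a moving target is sketched below; the two-second lead time and the coordinate convention are assumptions.

```python
import math
from typing import Tuple

def determine_projection_position(target_xy: Tuple[float, float],
                                  heading_deg: float,
                                  speed_kmh: float,
                                  lead_time_s: float = 2.0) -> Tuple[float, float]:
    """Place the projection area ahead of the target along its moving
    direction, so the image falls within the target's viewable range."""
    speed_mps = speed_kmh / 3.6
    rad = math.radians(heading_deg)
    return (target_xy[0] + math.cos(rad) * speed_mps * lead_time_s,
            target_xy[1] + math.sin(rad) * speed_mps * lead_time_s)
```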
  • in step S1102, the projection control unit 230 determines, via the projectability determining unit 232, whether the road surface at the determined position is a projectable road surface onto which the projection image data can be projected.
  • when the unevenness of the road surface exceeds the allowable range, the projectability determining unit 232 determines that the road surface is not a projectable road surface.
  • in step S1102, when the projectability determining unit 232 determines that the road surface at the determined position is a projectable road surface, the projection control unit 230 refers to the notification content table 212 and determines the projection image data to be projected on the road surface, in accordance with the specified target and the priority of the specified target (step S1103).
  • in step S1104, the projection control unit 230 instructs, via the projection instructing unit 234, the projector 300 to project the determined projection image data, and ends the process.
  • in step S1102, when the projectability determining unit 232 determines that the road surface at the determined position is not a projectable road surface, the projection control unit 230 changes, via the projection position determining unit 231, the projection position (step S1105). Specifically, the projection position determining unit 231 changes the projection position by making the projection area smaller.
  • in step S1106, the projection control unit 230 determines, via the projectability determining unit 232, whether the road surface of the projection area changed by the projection position determining unit 231 is a projectable road surface.
  • in step S1106, when the projectability determining unit 232 determines that the road surface of the changed projection area is a projectable road surface, the projection control unit 230 proceeds to step S1104.
  • in step S1106, when the projectability determining unit 232 determines that the road surface of the changed projection area is not a projectable road surface, the projection control unit 230 ends the process.
  • in this manner, the projection control unit 230 determines the projection area of an image in accordance with the unevenness of the road surface.
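  • The FIG. 11 flow can be summarized by the following sketch; the method names on the hypothetical `pcu` (projection control unit) object and the 50% shrink factor are assumptions.

```python
def project_notification(pcu, target):
    """Sketch of the FIG. 11 flow with assumed method names."""
    position, area = pcu.determine_projection_position(target)   # S1101
    if not pcu.is_projectable(position, area):                   # S1102
        area = pcu.shrink_area(area, factor=0.5)                 # S1105
        if not pcu.is_projectable(position, area):               # S1106
            return                                               # not projectable: end
    image = pcu.determine_content(target)                        # S1103
    pcu.instruct_projection(image, position, area)               # S1104
```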
  • FIG. 12 is a first diagram illustrating an example of notification projected by a projector according to the first embodiment.
  • FIG. 12 schematically illustrates a driver's field of view from the vehicle 1.
  • the controller 200 of the vehicle 1 recognizes, via the target recognition unit 223, targets 121, 122, and 123.
  • the target 121 is a vehicle driving on a lane opposite to the lane on which the vehicle 1 is driving
  • the target 122 is a bicycle moving close to the vehicle 1
  • the targets 123 are stationary pedestrians.
  • FIG. 12 also illustrates a case in which the notification target specifying unit 226 specifies, from among the targets 121, 122, and 123, the target 122 as the target having the highest priority, i.e., the target that needs to be notified of the presence of the vehicle 1.
  • the controller 200 causes the projector 300 to project images 126 in a projection area 125, which is viewable by the target 122 who is a rider of a bicycle.
  • the images 126 represent notification indicating that the vehicle 1 is approaching. Note that the images 126 may, for example, be projected when the vehicle 1 is about to turn left.
  • projecting the images 126 in this manner notifies the bicycle rider of the presence of the vehicle 1.
  • FIG. 13 is a second diagram illustrating an example of notification projected by the projector according to the first embodiment. As illustrated in FIG. 13, an image 126A is projected in the projection area 125. The image 126A may be projected, for example, when vehicle 1 is about to drive straight ahead.
  • FIG. 14 is a third diagram illustrating an example of notification projected by the projector 300 according to the first embodiment.
  • in the example of FIG. 14, a target, namely a pedestrian behind the vehicle 1, is recognized, and this recognized target is specified as the target subject to notification.
  • the vehicle 1 projects an image 142 in a projection area 141 behind the vehicle 1.
  • the image 142 in this example indicates the presence of the vehicle 1.
  • as described above, according to the first embodiment, the target having the highest notification priority is specified, and the notification is projected only to the specified target.
  • accordingly, the information presented in the driver's field of view of the vehicle 1 is simplified.
  • further, according to the first embodiment, the process illustrated in FIG. 10 is performed constantly, so that projection image data can be projected by the projector 300 based on the external condition outside the vehicle 1, which changes over time.
  • that is, an image can be projected onto the road surface, at the necessary time, only to a target to whom notification of the presence of the vehicle 1 is desirable. Hence, the sight of the target may be guided to the vehicle 1 to alert the target.
  • the information providing system according to the second embodiment differs from the information providing system according to the first embodiment in that the information providing system according to the second embodiment includes a HUD (head-up display) device.
  • FIG. 15 is a diagram illustrating an example of an information providing system according to the second embodiment.
  • An information providing system 100A according to the second embodiment includes a controller 200A, a projector 300, and a HUD device 500.
  • the controller 200A is connected to the projector 300 and the HUD device 500.
  • the controller 200A is connected to the sensor device 400 and a sensor device 450.
  • the sensor device 450 may, for example, be a stereo camera, provided inside a vehicle 1, and may be configured to image an inside of the vehicle.
  • the information acquired by the sensor device 450 is thus internal information indicating the inside of the vehicle 1.
  • the HUD device 500 according to the second embodiment may also be connected to a navigation system 600 or the like provided to the vehicle 1, and may be configured to display vehicle information or information provided by the navigation system 600.
  • FIG. 16 is a schematic diagram illustrating a configuration of a HUD device according to the second embodiment.
  • the HUD device 500 is installed in a vehicle 1.
  • the HUD device 500 is embedded within the dashboard.
  • the HUD device 500 may be a display device configured to display an image toward the windshield 12 through a projection window 501.
  • the projection window 501 is disposed on an upper surface of the HUD device 500.
  • the displayed image is presented as a virtual image I ahead of the windshield 12.
  • the HUD device 500 may be an aspect of a display device.
  • a driver V is thus enabled to visually observe information that assists his or her driving while keeping his or her eyes on a preceding vehicle and on the road surface, with only a small gaze movement.
  • the information that assists the driver's driving may be any information; examples of such information other than the vehicle speed will be described later.
  • the HUD device 500 may be disposed at any place other than the dashboard, such as on a ceiling, a sun visor, or the like, insofar as the HUD device 500 can display an image onto the windshield 12.
  • the HUD device 500 may be a general-purpose information processing terminal or a HUD dedicated terminal.
  • the HUD dedicated terminal is simply referred to as a head-up display device.
  • the HUD dedicated terminal that is integrated with a navigation device may be referred to as a navigation device.
  • the HUD dedicated terminal is also called a PND (Portable Navigation Device).
  • the HUD dedicated terminal may be referred to as a display audio system (or a connected audio system).
  • the display audio system is a device that mainly provides an audio-visual (AV) function and a communication function without installing a navigation function.
  • Examples of the general-purpose information processing terminal include a smartphone, a tablet terminal, a mobile phone, a PDA (Personal Digital Assistant), a notebook PC, and a wearable PC (e.g., a wristwatch type, a sunglass type).
  • the general-purpose information processing terminal is not limited to these examples; the general-purpose information processing terminal may simply include typical functions of an information processing apparatus.
  • a typical general-purpose information processing terminal is used as an information processing apparatus that executes various applications. For example, as with the HUD dedicated terminal, the general-purpose information processing terminal displays information for assisting a driver's driving, upon executing application software for a HUD device.
  • the HUD device 500 may be switched between a vehicle-mounted mode and a mobile mode when the HUD device 500 is used as a general-purpose information processing terminal or a HUD dedicated terminal.
  • the HUD device 500 has an optical unit 510 and a controller 520, as major components.
  • as projection modes for forming the intermediate image, a panel mode and a laser scanning mode are known in the related art.
  • the panel mode indicates forming an intermediate image using an imaging device, such as a liquid crystal panel, a DMD panel (digital micromirror device panel), or a vacuum fluorescent display (VFD).
  • the laser scanning mode indicates forming an intermediate image using a two-dimensional scanning device for scanning a laser beam emitted from a laser light source.
  • the laser scanning mode is preferred in that the laser scanning mode can typically form a high contrast image. This is because, unlike the panel mode that forms an image with partial blocking of light emission to the full screen, the laser scanning mode can assign emission or non-emission of light to each pixel.
  • the laser scanning mode is employed as a projection mode of the HUD device 500.
  • any projection mode may be applicable insofar as the projection mode enables a process of reducing a floating feeling.
  • FIG. 17 is a diagram illustrating a configuration example of an optical unit of the HUD device 500.
  • the optical unit 510 includes a light source unit 502, an optical deflector 503, a mirror 504, a screen 505, and a concave mirror 506. Note that FIG. 17 merely illustrates the main components included in the HUD device 500.
  • the light source unit 502 includes, for example, three laser light sources corresponding to RGB (hereinafter referred to as LD: laser diodes), a coupling lens, an aperture, a combining element, a lens, and the like.
  • the light source unit 502 is configured to combine laser beams emitted from the three LDs, and guide the combined laser beam toward a reflecting surface of the optical deflector 503.
  • the laser beam guided to the reflecting surface of the optical deflector 503 is two-dimensionally deflected by the optical deflector 503.
  • as the optical deflector 503, for example, one micro-mirror oscillating with respect to two orthogonal axes, two micro-mirrors each oscillating with respect to or rotating around one axis, or the like may be used.
  • the optical deflector 503 may, for example, be MEMS (Micro Electro Mechanical Systems) fabricated by a semiconductor process, or the like.
  • the optical deflector 503 may be driven, for example, by an actuator that uses the deformation force of a piezoelectric element as a driving force.
  • the optical deflector 503 may be a galvano mirror, a polygon mirror, or the like.
  • the laser beam two-dimensionally deflected by the optical deflector 503 enters the mirror 504 and is reflected by the mirror 504, so as to render a two-dimensional image (intermediate image) on the surface of the screen 505 (scanned surface).
  • a concave mirror may be used as the mirror 504 in this example; however, a convex mirror or a planar mirror may also be used as the mirror 504.
  • as the screen 505, a microlens array or a micro-mirror array having a function of diverging a laser beam at a desired divergence angle may preferably be used; however, a diffuser plate for diffusing a laser beam, a transmitter plate having a smooth surface, a reflector plate, or the like may also be used.
  • the laser beam emitted from the screen 505 is reflected by the concave mirror 506, and the reflected laser beam is then projected onto the windshield 12.
  • the concave mirror 506 has a function similar to a lens to form an image at a predetermined focal length. Accordingly, a virtual image I is displayed at a position determined based on a distance between the screen 505 (corresponding to an object) and the concave mirror 506, and also on a focal length of the concave mirror 506. As illustrated in FIG. 17, a laser beam is projected via the concave mirror 506 onto the windshield 12 such that the virtual image I is displayed (formed) at a distance L from a viewpoint E of the driver V.
  • the virtual image I is the intermediate image on the screen 505 magnified through the windshield 12. That is, the intermediate image is magnified as the virtual image I across the windshield 12 when viewed from the driver V.
  • an image-forming position of the virtual image I is determined based not only on a focal length of the concave mirror 506, but also on a curved surface of the windshield 12.
  • the light collection power of the concave mirror 506 is preferably set such that the virtual image I is displayed at a position having a distance L from the viewpoint E of the driver V being 4 m or more and 10 m or less (preferably 6 m or less).
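As a worked illustration of this relationship, the standard mirror equation can be applied. The focal length f and the screen-to-mirror distance d_o below are illustrative assumptions, since the disclosure gives no concrete values; the remaining optical path to the viewpoint E and the windshield curvature (discussed next) further shift the apparent distance L.

```latex
% Mirror equation, with d_i < 0 denoting a virtual image:
%   1/d_o + 1/d_i = 1/f
% Illustrative values (not from the disclosure):
%   f   = 0.20 m  (focal length of the concave mirror 506)
%   d_o = 0.19 m  (distance from the screen 505 to the concave mirror 506)
\frac{1}{d_i} = \frac{1}{f} - \frac{1}{d_o}
             = \frac{1}{0.20} - \frac{1}{0.19}
             \approx -0.26\ \mathrm{m}^{-1}
\qquad\Rightarrow\qquad
d_i \approx -3.8\ \mathrm{m},\quad
m = -\frac{d_i}{d_o} \approx 20
```

Under these assumed values, the intermediate image appears as a virtual image roughly 3.8 m behind the mirror, magnified about 20 times; moving the screen 505 slightly relative to the focal point changes d_i, which is one way the distance L can be tuned into the 4 m to 10 m range.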
  • the windshield 12 may cause optical distortion, such as a horizontal line of an intermediate image being curved in a concave or convex manner, due to the shape of the windshield 12.
  • it is therefore preferable that at least one of the mirror 504 and the concave mirror 506 be designed or arranged to compensate for such optical distortion, so that the displayed image is corrected.
  • a combiner may be disposed as a transparent-reflective member at a position closer to the viewpoint E than to the windshield 12.
  • in this case, a virtual image I may be displayed in a manner similar to the configuration in which the windshield 12 receives light from the concave mirror 506.
  • “to display a virtual image” indicates to display an image in a visually perceivable manner to a driver through a transparent member.
  • “to display a virtual image” may be expressed as "to display an image” for simplifying the explanation.
  • the windshield 12 may be designed to emit light to display an image.
  • FIG. 18 is a diagram illustrating an example of a hardware configuration of a controller 520 of a HUD device 500.
  • the controller 520 includes an FPGA 511, a CPU 512, a ROM 513, a RAM 514, an I/F 515, a bus line 516, an LD driver 517, and a MEMS controller 518.
  • the FPGA 511, the CPU 512, the ROM 513, the RAM 514, and the I/F 515 are interconnected via the bus line 516.
  • the CPU 512 controls respective functions of the HUD device 500.
  • the ROM 513 stores a program 530, which is executed by the CPU 512 to control respective functions of the HUD device 500.
  • the program 530 is loaded into the RAM 514, and the CPU 512 uses the RAM 514 as a work area for executing the loaded program 530.
  • the RAM 514 also includes an image memory 519.
  • the image memory 519 is used to generate an image, which is displayed as a virtual image I.
  • the I/F 515 is an interface for communicating with other devices installed in the vehicle 1.
  • the I/F 515 is connected, for example, to a CAN (Controller Area Network) bus or Ethernet (TM) of the vehicle 1.
  • the FPGA 511 controls an LD (Laser Diode) driver 517, based on an image created by the CPU 512.
  • the LD driver 517 drives an LD of the light source unit 502 of the optical unit 510 so as to control light emission of the LD in accordance with the image created by the CPU 512.
  • the FPGA 511 operates the optical deflector 503 of the optical unit 510 via the MEMS controller 518 such that a laser beam is deflected toward a direction corresponding to a pixel position of an image.
  • FIG. 19 is a diagram illustrating a functional configuration of the controller 200A according to the second embodiment.
  • the controller 200A includes a storage unit 210A, a vehicle information acquiring unit 221, an external information acquiring unit 222, a target recognition unit 223, a recognition information generating unit 224, a priority estimating unit 225, a notification target specifying unit 226, a notification determining unit 227A, an internal information acquiring unit 228, a projection control unit 230, and a HUD control unit 240.
  • the storage unit 210A stores the priority estimation table 211 and a notification content table 212A.
  • the notification content table 212A stores projection image data to be projected by a projector 300A and image data to be displayed by the HUD device 500. Details of the notification content table 212A will be described later.
  • the notification determining unit 227A determines whether to project projection image data by the projector 300A, and whether to display image data by the HUD device 500, in accordance with a type of a specified target and a priority corresponding to the specified target.
  • when the priority of the specified target is a level that requires notification, the notification determining unit 227A determines to perform projection by the projector 300 and display by the HUD device 500. Specifically, when the type of the specified target is a person or a moving body driven by a person, the notification determining unit 227A determines to perform both projection of projection image data by the projector 300 and display of image data by the HUD device 500. Further, when the type of the specified target is not a person or a moving body driven by a person, the notification determining unit 227A determines to perform only display of image data by the HUD device 500.
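A minimal sketch of this decision logic follows; the function, type, and priority names are hypothetical, as the disclosure does not prescribe an implementation.

```python
# Hypothetical sketch of the decision made by the notification determining
# unit 227A; names and priority levels are illustrative, not from the disclosure.

PERSON_LIKE = {"pedestrian", "bicycle", "motorcycle"}  # a person, or a moving body driven by a person

def decide_notification(target_type: str, priority: str):
    """Return (project_on_road, display_in_hud) for the specified target."""
    if priority == "low":              # a priority level that does not require notification
        return (False, False)
    if target_type in PERSON_LIKE:     # notify the target outside and the driver inside
        return (True, True)
    return (False, True)               # notify only the driver via the HUD device 500

print(decide_notification("bicycle", "high"))    # -> (True, True)
print(decide_notification("guardrail", "high"))  # -> (False, True)
```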
  • the internal information acquiring unit 228 acquires internal information.
  • the internal information is information inside the vehicle 1 acquired via the sensor device 450.
  • the internal information includes, for example, information indicating the number of occupants in the vehicle 1, information indicating seating positions of the occupants, information indicating whether each of occupants is an adult or a child, and the like.
  • the HUD control unit 240 controls the HUD device 500.
  • the HUD control unit 240 is a display control unit that controls the HUD device 500 acting as a display device.
  • the HUD control unit 240 includes a display position determining unit 241, a content determining unit 242, and a display instructing unit 243.
  • the display position determining unit 241 determines a position at which image data is displayed inside the vehicle 1.
  • the display position determining unit 241 according to the second embodiment determines a display position of image data within a display area of the HUD device 500 such that the display position of the image data is not superimposed on a projection area of the projection image data, which is projected by the projector 300.
  • the display position determining unit 241 may locate (obtain) a projection area of the projector 300 within a driver's field of view, from video data and the like in a range corresponding to the driver's field of view. The display position determining unit 241 may then determine a display position of image data to be within an area excluding the projection area of the projector 300. Note that the video data and the like may be captured (imaged) from inside the vehicle 1 by the sensor device 450.
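As a sketch of this logic, the display position can be chosen from candidate rectangles that do not intersect the projection area located in the video data; the coordinate convention, rectangle representation, and names below are assumptions.

```python
# Illustrative sketch: choose a HUD display position that does not overlap
# the projector 300's projection area within the driver's field of view.
# Rectangles are (x, y, width, height) in image coordinates of the sensor
# device 450's video data; all values are hypothetical.

def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def choose_display_position(candidates, projection_area):
    """Return the first candidate display rectangle that avoids the projection area."""
    for rect in candidates:
        if not overlaps(rect, projection_area):
            return rect
    return None  # no non-overlapping position; caller may shrink or skip display

projection_area = (400, 500, 300, 150)            # located from the video data
candidates = [(420, 480, 200, 80), (50, 80, 200, 80)]
print(choose_display_position(candidates, projection_area))  # -> (50, 80, 200, 80)
```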
  • the content determining unit 242 refers to the notification content table 212A, and determines image data corresponding to the specified target, and a notification priority of the specified target.
  • the display instructing unit 243 instructs the HUD device 500 to display the image data determined by the content determining unit 242.
  • FIG. 20 is a diagram illustrating an example of a notification content table 212A according to the second embodiment.
  • the notification content table 212A according to the second embodiment includes a type of a target, a priority, and notification content as items of information.
  • the item "notification content" includes projection image data (projection image data 11 to 13, and 21 to 23) to be projected by the projector 300 on the road surface, and image data (image data 1 to 9) to be displayed by the HUD device 500 in a display area inside the vehicle 1.
  • the display area according to the second embodiment is an area in which an image can be displayed by the HUD device 500.
  • the display area may be determined according to the structure of the optical unit 510 of the HUD device 500.
  • for example, projection image data 22 is projected onto the road surface by the projector 300 while image data 5 is displayed inside the vehicle 1 by the HUD device 500.
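A minimal sketch of the table lookup follows. The data names mirror FIG. 20, but the pairing of keys to data names and the dictionary structure are assumptions; the disclosure defines the data names, not which row they belong to.

```python
# Illustrative sketch of the notification content table 212A as a lookup keyed
# by (target type, priority). The key-to-entry pairing is hypothetical.

NOTIFICATION_CONTENT = {
    ("person", "high"):   ("projection image data 22", "image data 5"),
    ("person", "medium"): ("projection image data 21", "image data 4"),
    ("vehicle", "high"):  (None, "image data 7"),  # no road-surface projection
}

def lookup_content(target_type: str, priority: str):
    """Return (projection image data, HUD image data) for the specified target."""
    return NOTIFICATION_CONTENT.get((target_type, priority), (None, None))

projection, hud = lookup_content("person", "high")
print(projection, "/", hud)  # -> projection image data 22 / image data 5
```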
  • the projector 300 according to the second embodiment projects projection image data onto a road surface when a type of a target is a person or a moving body driven by a person. Accordingly, the projector 300 according to the second embodiment may also be called a device for presenting notification to a person outside the vehicle 1.
  • the HUD device 500 according to the second embodiment displays image data inside the vehicle 1 even when a type of a target is not a person or a moving body driven by a person. Accordingly, the HUD device 500 according to the second embodiment may also be called a device for presenting notification to a driver of the vehicle 1.
  • FIG. 21 is a flowchart illustrating an operation of the controller 200A according to the second embodiment.
  • the controller 200A acquires vehicle information, external information, and internal information via the vehicle information acquiring unit 221, the external information acquiring unit 222, and the internal information acquiring unit 228, respectively (step S2101), and then proceeds to step S2102.
  • since the process from step S2102 to step S2105 is the same as the process from step S1002 to step S1005 in FIG. 10, a description of the process from step S2102 to step S2105 will not be repeated.
  • the controller 200A determines, via the notification determining unit 227A, whether to perform projection of the projection image data by the projector 300 and display of the image data by the HUD device 500, based on the estimated priority of the specified target (step S2106).
  • the notification determining unit 227A determines whether to present notification using the projector 300 and the HUD device 500.
  • when the notification determining unit 227A determines not to present notification in step S2106, the controller 200A proceeds to step S2110, which will be described later.
  • when the notification determining unit 227A determines to present notification in step S2106, the notification determining unit 227A determines, in step S2107, whether the specified target is a person or a moving body driven by a person.
  • when the specified target is not a person or a moving body driven by a person in step S2107, the controller 200A proceeds to step S2109, which will be described later.
  • when the type of the specified target is a person or a moving body driven by a person in step S2107, the controller 200A instructs the projection control unit 230 to cause the projector 300 to project projection image data onto a road surface (step S2108).
  • Step S2108 is similar to step S1008 in FIG. 10.
  • in step S2109, the controller 200A instructs the HUD control unit 240 to cause the HUD device 500 to display image data.
  • the HUD control unit 240 determines, via the display position determining unit 241, a display position of image data, where the image data represents notification to a driver of the vehicle 1.
  • the HUD control unit 240 determines, via the content determining unit 242, notification content (i.e., image data) corresponding to a priority of the specified target, by referring to the notification content table 212A.
  • the HUD control unit 240 then instructs, via the display instructing unit 243, the HUD device 500 to display image data determined by the content determining unit 242.
  • an image corresponding to the determined image data is displayed on the windshield 12.
  • in step S2110, the controller 200A determines whether the engine has been turned off. When the engine has not been turned off, the controller 200A returns to step S2101.
  • when the engine has been turned off in step S2110, the controller 200A ends the process.
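Summarizing steps S2101 to S2110, the control flow can be sketched as follows; the unit interfaces are hypothetical stand-ins for the units described above, not an API defined by the disclosure.

```python
# Illustrative sketch of the control flow of FIG. 21 (steps S2101 to S2110).

def run(controller):
    while not controller.engine_off():                     # S2110
        vehicle, external, internal = controller.acquire_information()  # S2101
        target = controller.recognize_and_specify(vehicle, external)    # S2102-S2105
        if target is None:
            continue
        if not controller.notification_required(target):   # S2106: priority check
            continue
        if controller.is_person_or_person_driven(target):  # S2107
            controller.project_on_road(target)             # S2108: projector 300
        controller.display_in_hud(target)                  # S2109: HUD device 500
```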
  • FIG. 22 is a first diagram illustrating an example of notification by the projector 300 and the HUD device 500, according to the second embodiment.
  • FIG. 22 schematically illustrates a driver's field of view of the vehicle 1 in which the information providing system 100A is installed.
  • in FIG. 22, respective targets 121, 122, and 123 are recognized, and the priority corresponding to the target 122 is the highest.
  • the target 122 is specified as a target to which notification is presented.
  • images 126 are projected in a projection area 125 on the road surface in order to present notification indicating that the vehicle 1 is approaching.
  • the HUD device 500 displays an image 260 to highlight the target 122 in a display area inside the vehicle 1 so as to notify a driver of the vehicle 1 of the presence of the target 122.
  • the HUD device 500 also displays an image 262 and an image 263 in an area 261. Note that this area 261 is a part of the display area of the HUD device 500.
  • the image 262 is a mark indicating that a bicycle as the target 122 is approaching, and the image 263 is a mark indicating that the vehicle 1 is about to turn left.
  • the controller 200A determines, via the display position determining unit 241 of the HUD control unit 240 (an example of a display control unit), a position of the area 261 such that the area 261 is not superimposed onto the projection area 125 of the projector 300, within the driver's field of view of the vehicle 1.
  • because the HUD device 500 displays an image to highlight a specified target (e.g., the target 122), a driver of the vehicle 1 can be promptly notified of the presence of the specified target that needs to be attended to.
  • a projection area, in which projection image data is projected by the projector 300, is not superimposed on a display area, in which image data is displayed by the HUD device 500, within the driver's field of view of the vehicle 1.
  • accordingly, an image displayed by the HUD device 500 is not superimposed on an image projected by the projector 300 on the road surface, which acts as a background within the driver's field of view of the vehicle 1. This improves viewability for the driver of the vehicle 1.
  • information about the vehicle 1 can be notified to a person outside the vehicle by projecting an image onto a road surface, and information about a target that needs to be attended to can be notified to a person inside the vehicle (e.g., a driver) by displaying an image in a display area inside the vehicle 1.
  • in the example of FIG. 22, only the image 260 for highlighting the specified target 122 is displayed; however, the present invention is not limited to this example.
  • the HUD device 500 may, for example, display images for highlighting any other pedestrians, from among the recognized targets, regardless of priorities of the pedestrians.
  • in this case, the images for highlighting the targets 123 are also displayed; however, the display mode of the images for highlighting the targets 123 may be different from the display mode of the image 260 for highlighting the target 122. For example, the color and brightness of an image for highlighting the target with the highest priority may be different from those of images for highlighting other targets.
  • FIG. 23 is a second diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
  • FIG. 23 also schematically illustrates a driver's field of view of the vehicle 1 in which the information providing system 100A is installed, as with FIG. 22.
  • FIG. 23 illustrates a case in which a target 271 is recognized ahead of the vehicle 1.
  • the target 271 in this case is a vehicle (a second vehicle) moving ahead of the vehicle 1.
  • FIG. 23 also illustrates that another vehicle (a third vehicle) moving next to the lane along which the vehicle 1 is moving is trying to enter between the vehicle 1 and the target 271 (second vehicle).
  • images 282 are projected in a projection area 281 by a projector 300 of another vehicle (third vehicle), and the images 282 come into the field of view of the driver of the vehicle 1.
  • the projection control unit 230 needs to prevent an image projected by the vehicle 1 from being superimposed on the images 282 projected by another vehicle (third vehicle).
  • an image 126A is projected by the vehicle 1 in a projection area 125A ahead of the vehicle 1.
  • the image 126A is projected by the vehicle 1 at a position so as not to be superimposed on the images 282 projected by another vehicle (third vehicle).
  • the projection position determining unit 231 of the projection control unit 230 obtains, from video data or the like of the vehicle 1, positions at which the images 282 have been projected. Note that the video data or the like of the vehicle 1 may be imaged or captured by the sensor device 400. The projection position determining unit 231 subsequently determines a projection position that is not superimposed on the obtained positions of the images 282, so as to cause the projector 300 to project the image 126A to the determined projection position.
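One way to realize this, sketched below with hypothetical road-surface coordinates in meters, is to shift the intended projection area laterally until it clears every area already projected by another vehicle.

```python
# Illustrative sketch of the projection position determining unit 231 avoiding
# superimposition on images already projected by another vehicle (e.g., the
# images 282). Rectangles are (x, y, width, height) in meters on the road
# surface, relative to the vehicle 1; all values are hypothetical.

def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def shift_until_clear(area, foreign_areas, step=0.5, max_shift=3.0):
    """Shift the intended projection area laterally until it no longer
    overlaps any area already projected by another vehicle."""
    x, y, w, h = area
    offset = 0.0
    while offset <= max_shift:
        for dx in (offset, -offset):
            candidate = (x + dx, y, w, h)
            if not any(overlaps(candidate, f) for f in foreign_areas):
                return candidate
        offset += step
    return None  # nowhere to project without superimposition; skip projection

images_282 = [(2.0, 5.0, 1.5, 1.0)]  # located from the vehicle 1's video data
print(shift_until_clear((2.0, 5.0, 1.5, 1.0), images_282))  # -> (3.5, 5.0, 1.5, 1.0)
```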
  • an image 292 and an image 293 are displayed in an area 290.
  • the image 292 is an image indicating that another vehicle (third vehicle) is approaching, and the image 293 is an image indicating that the distance between the vehicle 1 and the vehicle (target 271) ahead of the vehicle 1 has been narrowed.
  • the image 292 and the image 293 displayed in the area 290 are not superimposed on the images 282.
  • FIG. 24 is a third diagram illustrating an example of notification presented by the projector 300 and the HUD device 500 according to the second embodiment.
  • an image 272 is projected by a projector 300 disposed at the back of a preceding vehicle (i.e., the target 271) moving ahead of the vehicle 1.
  • in this case, the projector 300 of the vehicle 1 (i.e., a projector differing from the projector 300 of the target 271) projects images at positions so as not to be superimposed onto the image 272 projected by the projector 300 of the target 271.
  • the projector 300 of the vehicle 1 projects images 128 in the projection area 125A of the vehicle 1 so as not to be superimposed onto the image 272. Further, as illustrated in FIG. 24, the HUD device 500 of the vehicle 1 displays an image 293 in an area 290A that is not superimposed onto the images 128 and the image 272, within the driver's field of view of the vehicle 1.
  • FIG. 25 is a diagram illustrating an example of notification projected by the projector of the vehicle 1 according to the second embodiment.
  • the controller 200A according to the second embodiment may cause the projector 300 to project notification to those around the vehicle 1, in accordance with internal information acquired by the internal information acquiring unit 228.
  • the controller 200A causes the projector 300 to project an image 130 in a projection area 129 behind the vehicle 1.
  • the controller 200A causes the projector 300 to project the image 130 as a result of detecting, from the internal information, an infant being present among persons (occupants) inside the vehicle 1.
  • further, when an abnormal change occurs inside the vehicle 1, an image illustrating the abnormal change may be projected by the projector 300 or displayed by the HUD device 500.
  • the abnormal change occurring inside the vehicle 1 may thus be propagated to the outside of the vehicle 1.
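A minimal sketch of this internal-information-driven projection follows; the field names and image identifiers are assumptions, not defined by the disclosure.

```python
# Illustrative sketch: selecting a rear notification from internal information
# acquired via the sensor device 450. Field names are hypothetical.

def rear_notification(internal_info: dict):
    """Return the image to project behind the vehicle 1, if any."""
    if internal_info.get("infant_on_board"):
        return "image 130"          # e.g., an infant-on-board notification
    if internal_info.get("abnormal_change"):
        return "abnormality image"  # propagate an abnormal change to the outside
    return None

print(rear_notification({"occupants": 3, "infant_on_board": True}))  # -> image 130
```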

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An information providing system installed in a moving body is disclosed. The information providing system includes a projector; and a controller configured to control the projector, wherein the controller includes an external information acquiring unit configured to acquire external information outside the moving body, a target recognition unit configured to recognize one or more targets based on the external information, an estimating unit configured to estimate rankings of the one or more recognized targets, and a projection control unit configured to cause the projector to project an image outside the moving body, the image representing notification to a desired target, the desired target being specified, from among the one or more recognized targets, in accordance with the rankings of the one or more recognized targets.

Description

INFORMATION PROVIDING SYSTEM, MOVING BODY, INFORMATION PROVIDING METHOD, AND INFORMATION PROVIDING PROGRAM
The disclosures discussed herein relate to an information providing system, a moving body, an information providing method, and an information providing program.
Notification technologies have been known in the art. For example, to notify pedestrians, opposing cars, and the like of information regarding a vehicle, technologies such as projecting (displaying) graphics, shapes, characters, and the like on a road surface using headlights and the like of the vehicle have been used. In addition, to notify drivers of various types of information, head-up display devices installed in the vehicles have been used to display a virtual image in a display area inside the vehicles.

[PTL 1]  Japanese Unexamined Patent Application Publication No. 2016-55691
[PTL 2]  Japanese Unexamined Patent Application Publication No. 2018-106655
Using such notification technologies described above, information for a driver is presented inside a vehicle, and information for others outside the vehicle is presented on a road surface. However, use of the above technologies results in an increase in the amount of information presented in a driver's field of view. This may have an adverse effect on a driver's viewability, thereby making a driver feel irritable.
The disclosed technology is intended to reduce a burden on drivers when providing information.
According to an aspect of disclosed embodiments, an information providing system installed in a moving body is provided. The information providing system includes a projector; and a controller configured to control the projector, wherein the controller includes an external information acquiring unit configured to acquire external information outside the moving body, a target recognition unit configured to recognize one or more targets based on the external information, an estimating unit configured to estimate rankings of the one or more recognized targets, and a projection control unit configured to cause the projector to project an image outside the moving body, the image representing notification to a desired target, the desired target being specified, from among the one or more recognized targets, in accordance with the rankings of the one or more recognized targets.
Advantageous Effect of Invention
The disclosed technology can reduce a burden on drivers when providing information.

FIG. 1 is a diagram illustrating an example of an information providing system according to a first embodiment.
FIG. 2A is a diagram illustrating an arrangement example of devices of the information providing system installed in a moving body.
FIG. 2B is another diagram illustrating an arrangement example of devices of the information providing system installed in a moving body.
FIG. 3 is a first diagram illustrating a hardware configuration example of a projector.
FIG. 4 is a second diagram illustrating a hardware configuration example of the projector.
FIG. 5 is a third diagram illustrating a hardware configuration example of the projector.
FIG. 6 is a diagram illustrating a hardware configuration example of a controller.
FIG. 7 is a diagram illustrating a functional configuration of a controller according to the first embodiment.
FIG. 8 is a diagram illustrating an example of a priority estimation table according to the first embodiment.
FIG. 9 is a diagram illustrating an example of a notification content table according to the first embodiment.
FIG. 10 is a flowchart illustrating an operation of the controller according to the first embodiment.
FIG. 11 is a flowchart illustrating a process of a projection control unit according to the first embodiment.
FIG. 12 is a first diagram illustrating an example of notification projected by a projector according to the first embodiment.
FIG. 13 is a second diagram illustrating an example of notification projected by the projector according to the first embodiment.
FIG. 14 is a third diagram illustrating an example of notification projected by the projector according to the first embodiment.
FIG. 15 is a diagram illustrating an example of an information providing system according to a second embodiment.
FIG. 16 is a schematic diagram illustrating a configuration of a HUD device according to the second embodiment.
FIG. 17 is a diagram illustrating a configuration of an optical unit of the HUD device.
FIG. 18 is a diagram illustrating a hardware configuration example of a controller of the HUD device.
FIG. 19 is a diagram illustrating a functional configuration of a controller according to the second embodiment.
FIG. 20 is a diagram illustrating an example of a notification content table according to the second embodiment.
FIG. 21 is a flowchart illustrating the controller according to the second embodiment.
FIG. 22 is a first diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
FIG. 23 is a second diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
FIG. 24 is a third diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
FIG. 25 is a diagram illustrating an example of notification by the projector according to the second embodiment.

(First Embodiment)
Hereinafter, an information providing system according to a first embodiment will be described with reference to the accompanying drawings. FIG. 1 is a diagram illustrating an example of an information providing system according to the first embodiment.
The information providing system 100 according to the first embodiment is installed in a moving body. According to the first embodiment, a four-wheeled vehicle is described as an example of the moving body.
The information providing system 100 according to the first embodiment includes a controller 200 and a projector 300. The controller 200 and projector 300 are connected to each other.
The controller 200 according to the first embodiment performs, for example, control with respect to a vehicle. The controller 200 acquires information relating to the vehicle (hereinafter called "vehicle information" or "moving body information"), and information external to the vehicle, where the information providing system 100 is installed in the vehicle. The information external to the vehicle is acquired, for example, by a sensor device 400 connected to the controller 200. The sensor device 400 may, for example, be a stereo camera or the like installed in the vehicle.
The projector 300 is controlled by the controller 200. The projector 300 according to the first embodiment uses a light source installed in the moving body to project (render) an image with respect to a space outside the moving body (vehicle), where the space acts as a projecting surface. The space outside the vehicle may, for example, include a road surface on which the moving body is traveling, or a wall surface or the like outside the moving body. The image projected by the projector 300 according to the first embodiment acts primarily as an image for notifying a person outside the vehicle of the presence of the vehicle.
According to the first embodiment, by projecting an image onto a road surface by the projector 300, the presence of a vehicle can be notified to those who are present within a viewable range of the projected image.
FIGS. 2A and 2B are diagrams each illustrating an arrangement example of devices of the information providing system 100 installed in a moving body. The information providing system 100 includes a controller 200, and projectors 300-1 and 300-2. As illustrated in FIG. 2A, the projector 300-1 is, for example, located at a left headlight of a vehicle 1 to project a predetermined image ahead of the vehicle 1. As illustrated in FIG. 2A, the projector 300-2 is, for example, located at a right headlight of the vehicle 1 to project a predetermined image ahead of the vehicle 1.
The projectors 300-1 and 300-2 according to the first embodiment have the same configuration. Accordingly, in the following description, the projectors 300-1 and 300-2 are referred to as a projector(s) 300 where identification of the projectors 300-1 and 300-2 is not required.
The controller 200 according to the first embodiment is disposed inside a dashboard of the vehicle 1, for example, as illustrated in FIG. 2B. The controller 200 generates projection image data to be projected by the projector 300, and transmits the generated projection image data to the projector 300.
Note that the number of projectors 300 installed in the information providing system 100 is not limited to two; the number may be one. In this case, a single projection device is, for example, located at a front-center position of the vehicle 1 to project a predetermined image ahead of the vehicle 1. Alternatively, the number of projectors 300 installed in the information providing system 100 may be three or more. In this case, the projectors 300 are disposed on the two sides of the vehicle 1 and on a rear surface (e.g., at a tail lamp position) of the vehicle 1, in addition to the two headlights of the vehicle 1. The projectors 300 installed in the above manner can project a predetermined image ahead of, to the two sides of, and behind the vehicle 1.
Next, a hardware configuration example of the projector 300 will be described. Hereinafter, three types of hardware configuration examples will be described.
FIG. 3 is a first diagram illustrating an example of a hardware configuration of a projector.
The projector 300 according to the first embodiment includes a light source 301, a collimator lens 302, MEMS (Micro Electro Mechanical Systems) 303, a wavelength conversion element 305, and a projection lens 306.
The light source 301 outputs light having, for example, a blue wavelength band to render predetermined projection image data generated by the controller 200. The collimator lens 302 collects luminous flux emitted from the light source 301 into the MEMS 303.
The MEMS 303 has a reflective mirror. The MEMS 303 is driven by a mechanism that tilts the mirror about two axes, i.e., longitudinally and laterally, based on control signals from the controller 200. When the MEMS 303 reflects light collected by the collimator lens 302, the MEMS 303 two-dimensionally scans the wavelength conversion element 305 with the reflected light within a range indicated by a scan width 304.
The wavelength conversion element 305 is a reflective phosphor onto which predetermined projection image data is rendered. The wavelength conversion element 305 emits yellow fluorescence (fluorescence with at least green and red wavelength bands) when the blue luminous flux, two-dimensionally scanned by the MEMS 303, illuminates its front side.
The projection lens 306 projects white light ahead of the vehicle 1. The white light is obtained as a result of mixing of light converted by the wavelength conversion element 305 and unconverted light. Accordingly, the projector 300 illustrated in FIG. 3 is enabled to project an image corresponding to the predetermined projection image data within a space ahead of the vehicle 1. The predetermined projection image data is generated by the controller 200.
FIG. 4 is a second diagram illustrating an example of a hardware configuration of a projector 300. The difference between the projector 300 illustrated in FIG. 4 and the projector 300 illustrated in FIG. 3 is that the blue luminous flux, which is two-dimensionally scanned by the MEMS 303, enters from the back side of the wavelength conversion element 305a in FIG. 4. This is because the wavelength conversion element 305a is a transparent phosphor.
As with FIG. 3, the projector 300 illustrated in FIG. 4 is enabled to project an image corresponding to the generated predetermined projection image data within a space ahead of the vehicle 1.
FIG. 5 is a third diagram illustrating an example of a hardware configuration of a projector 300. The difference between the projector 300 in FIG. 5 and the projectors 300 in FIGS. 3 and 4 is that the projector 300 in FIG. 5 includes a light source 301a instead of the light source 301, and a microdisplay 305b instead of the wavelength conversion elements 305 and 305a.
The light source 301a is a white LED that emits white light based on control signals from the controller 200. Light emitted from the light source 301a is directed to the microdisplay 305b via the collimator lens 302.
The microdisplay 305b is, for example, a digital micromirror device (DMD, registered trademark). The microdisplay 305b displays predetermined projection image data generated by the controller 200, and controls on and off of image light on a per pixel basis, in accordance with the displayed predetermined projection image data. When the image light is on, light emitted from the light source 301a is applied to the microdisplay 305b, and the light applied to the microdisplay 305b is reflected toward the projection lens 306. As a result, the projector 300 illustrated in FIG. 5 is enabled to project an image corresponding to the predetermined projection image data generated by the controller 200 within a space ahead of the vehicle 1.
When the image light is off, light emitted from the light source 301a is applied to the microdisplay 305b, and the light applied to the microdisplay 305b is directed toward a direction differing from a direction toward the projection lens 306 (see dotted arrows indicated by 307 in FIG. 5). In this case, a black image is projected within a space ahead of the vehicle 1.
The microdisplay 305b is not limited to DMD (TM). The microdisplay 305b may be a reflective liquid crystal panel or a transparent liquid crystal panel.
Next, a hardware configuration of the controller 200 will be described. FIG. 6 is a diagram illustrating an example of a hardware configuration of the controller 200.
The controller 200 according to the first embodiment includes a CPU (Central Processing Unit) 201, a RAM (Random Access Memory) 202, a storage device 203, and an input/output device 204. The above-described devices and units of the controller 200 are interconnected to each other via a bus 205.
The CPU 201 is a computer that executes a program stored in the storage device 203 (such as a control program described below). The RAM 202 is a main storage device, such as a DRAM (Dynamic Random Access Memory) or an SRAM (Static Random Access Memory). When a program stored in the storage device 203 is executed by the CPU 201, the program is loaded into the RAM 202, and the RAM 202 functions as a work area.
The storage device 203 is a non-volatile memory such as an EPROM or an EEPROM, and stores a program to be executed by the CPU 201. The input/output device 204 is an interface device for communicating with the projector 300 or CAN (Controller Area Network).
Next, a functional configuration of the controller 200 according to the first embodiment will be described with reference to FIG. 7. FIG. 7 is a diagram illustrating a functional configuration of the controller 200 according to the first embodiment.
The controller 200 according to the first embodiment includes a storage unit 210. The storage unit 210 is implemented, for example, by the storage device 203 in FIG. 6.
The controller 200 includes a vehicle information acquiring unit 221, an external information acquiring unit 222, a target recognition unit 223, a recognition information generating unit 224, a priority estimating unit 225, a notification target specifying unit 226, a notification determining unit 227, and a projection control unit 230. Each of these units is implemented by executing a control program installed in the controller 200.
The storage unit 210 according to the first embodiment stores a priority estimation table 211 and a notification content table 212. The priority estimation table 211 manages information about targets recognized in the vicinity of the vehicle 1, and is used when estimating priorities of the recognized targets. The notification content table 212 manages projection image data to be projected by the projector 300.
The vehicle information acquiring unit 221 according to the first embodiment acquires vehicle information relating to the vehicle 1, in which the information providing system 100 is installed. The vehicle information acquiring unit 221 acquires the vehicle information through the input/output device 204 via communication over the CAN, from a navigation system, or the like. The vehicle information includes information indicating a speed, direction, and position of the vehicle 1.
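For illustration, a sketch of reading one such value over CAN using the python-can library is shown below. The channel name, arbitration ID, and scaling are hypothetical, since actual frame encodings are vehicle-specific.

```python
# Illustrative sketch of acquiring vehicle speed over CAN with python-can.
# The channel ("can0"), arbitration ID (0x3E9), and scale factor are all
# hypothetical assumptions; real values depend on the vehicle.

import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

def read_vehicle_speed(timeout=1.0):
    """Return speed in km/h, or None if no matching frame arrives in time."""
    msg = bus.recv(timeout)
    if msg is not None and msg.arbitration_id == 0x3E9:
        raw = int.from_bytes(msg.data[0:2], "big")
        return raw * 0.01  # hypothetical scale factor
    return None
```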
The external information acquiring unit 222 acquires external information representing an external condition outside the vehicle 1. Specifically, the external information acquiring unit 222 may acquire the external information of the vehicle 1 from video data around the vehicle 1 captured by the sensor device 400. The external information acquiring unit 222 may acquire the external information by the input/output device 204 through communication from a monitoring device or the like other than the vehicle 1. The monitoring device other than the vehicle 1 may, for example, be an imaging device disposed in a traffic light or the like.
The external information according to the first embodiment includes information indicating the presence or absence of an object, such as a person, a vehicle, or a building in a space outside the vehicle 1, or information indicating a shape of such an object. The external information also includes three-dimensional information in a space outside the vehicle 1.
According to the first embodiment, information representing a moving speed and moving direction of a target may be acquired by the sensor device 400 that sequentially acquires the external information. Hence, according to the first embodiment, even when a plurality of targets is present in the vicinity of the vehicle 1, notification priorities with respect to these targets may be estimated according to movements of the targets.
In the following description, an object present in a space outside the vehicle 1 is referred to as a target. A target according to the first embodiment refers to a person who is to be notified of the presence of the vehicle 1, or an object of which the presence is to be notified to a driver of the vehicle 1.
The target recognition unit 223 recognizes a target present in a space outside the vehicle 1, based on the external information. Specifically, the target recognition unit 223 stores pattern information representing a shape of an object that may become a target, and recognizes a target by matching the external information with the pattern information.
Specifically, the pattern information may include information representing various shapes of, for example, bicycles, motorcycles, guardrails, humans, vehicles, buildings, and the like.
The recognition information generating unit 224 acquires recognition information on a per target basis, in accordance with the recognition results obtained by the target recognition unit 223 and the vehicle information acquired by the vehicle information acquiring unit 221.
The following illustrates details of the recognition information. The recognition information according to the first embodiment includes, for example, as items of information, a type of a target, a moving speed of a target, a position of a target relative to the vehicle 1, a distance between the vehicle 1 and a target, a moving direction of a target, and the like. That is, the recognition information is information indicating a status of each of targets recognized by the target recognition unit 223.
The recognition information generating unit 224 according to the first embodiment generates the above-described recognition information about targets recognized by the target recognition unit 223 on a per target basis, in accordance with the external information and the vehicle information.
The priority estimating unit 225 estimates a notification priority relating to each of the recognized targets, based on a combination of the recognition information generated by the recognition information generating unit 224 and the vehicle information. Specifically, the priority estimating unit 225 refers to the priority estimation table 211, and acquires a priority corresponding to a combination of the recognition information and the vehicle information, with respect to each of the targets.
The notification target specifying unit 226 specifies a target to which notification is to be presented by the projector 300, in accordance with the notification priorities of the recognized targets.
The notification determining unit 227 determines whether to cause the projector 300 to project projection image data, in accordance with the target specified by the notification target specifying unit 226. In other words, the notification determining unit 227 determines whether to notify the specified target of the presence of the vehicle 1.
Specifically, the notification determining unit 227 determines that the projection image data is projected to the specified target when the specified target is a person or a moving object driven by a person, for example. Further, the notification determining unit 227 according to the first embodiment further determines that the projection image data is projected to the specified target when a priority corresponding to the specified target is higher than a predetermined level.
That is, according to the first embodiment, when the specified target is a person or a moving object driven by a person, and a priority corresponding to the specified target is higher than the predetermined level, the presence of a vehicle 1 is notified to the specified target.
The projection control unit 230 controls projection of the projection image data performed by the projector 300. The projection control unit 230 includes a projection position determining unit 231, a projectability determining unit 232, a content determining unit 233, and a projection instructing unit 234.
The projection position determining unit 231 determines a position, on the road surface, at which projection image data is projected. Specifically, the projection position determining unit 231 determines a projection position at which projection image data is projected, in accordance with a position, a moving speed, and a moving direction of a specified target.
The projectability determining unit 232 determines whether the projection position determined by the projection position determining unit 231 is a projectable position on a road surface at which the projection image data is projected. Specifically, the projectability determining unit 232 determines whether unevenness of the road surface is within an allowable range, based on three-dimensional external information.
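A sketch of such a flatness check over heights sampled from the three-dimensional external information is shown below; the threshold and data layout are assumptions.

```python
# Illustrative sketch of the projectability determining unit 232: the road
# surface is projectable if the height variation within the candidate area
# stays inside an allowable range. Threshold and layout are hypothetical.

def is_projectable(height_map, allowable_range_m=0.05):
    """height_map: road-surface heights (m) sampled over the candidate
    projection area from three-dimensional external information."""
    if not height_map:
        return False
    return (max(height_map) - min(height_map)) <= allowable_range_m

print(is_projectable([0.00, 0.01, 0.02, 0.01]))  # -> True (nearly flat)
print(is_projectable([0.00, 0.12, 0.01]))        # -> False (e.g., a curb)
```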
The content determining unit 233 refers to the notification content table 212, and determines the projection image data to be projected, in accordance with a type and a priority of the specified target.
The projection instructing unit 234 instructs the projector 300 to project the projection image data determined by the content determining unit 233.
Next, a priority estimation table 211 according to the first embodiment will be described with reference to FIG. 8. FIG. 8 is a diagram illustrating an example of a priority estimation table according to the first embodiment.
The priority estimation table 211 according to the first embodiment includes, as items of information, a type of a target, a position of a target, a moving speed of a target, a distance from a vehicle (vehicle 1 as a reference), a moving direction of a target, a vehicle speed (a speed of the vehicle 1), a moving direction of the vehicle (moving direction of the vehicle 1), and a priority. The above-described items are associated with each other in the priority estimation table 211.
Note that among the items of information included in the priority estimation table 211, the items "a type of a target", "a position of a target", "a moving speed of a target", "a distance from the vehicle," and "a moving direction of a target" are items included in the recognition information generated from the external information by the recognition information generating unit 224.
Among the items of information included in the priority estimation table 211, the items "vehicle speed" and "a moving direction of the vehicle" are the items included in the vehicle information acquired by the vehicle information acquiring unit 221.
That is, the priority estimation table 211 associates the recognition information and the vehicle information with the item "priority".
Of the priority estimation table 211 according to the first embodiment, a value of the item "type of target" indicates a type of a target. A value of the item "position of target" indicates a position of a target relative to the vehicle 1 acting as a reference. Note that a value of the item "position of target" may be indicated by latitude and longitude.
A value of the item "moving speed of target" indicates a speed of a target, a value of the item "distance from vehicle" indicates a distance between the vehicle 1 and a target, and a value of the item "moving direction of target" indicates a moving direction of a target.
A value of the item "vehicle speed" indicates a speed of the vehicle 1, and a value of the item "moving direction of the vehicle" indicates a moving direction of the vehicle 1.
A value of the item "priority" indicates a notification priority with respect to a combination of the recognition information and the vehicle information. The recognition information is indicated by respective values of the "type of target", "position of target", "moving speed of target", "distance from vehicle", and "moving direction of target". The vehicle information is indicated by respective values of the "vehicle speed" and "moving direction of the vehicle".
As illustrated in FIG. 8, for example, the type of a target indicated by the recognition information is a bicycle; the bicycle is moving at a speed of 10 km/h toward the south on the left front of the vehicle 1; and the distance between the bicycle and the vehicle 1 is 50 m. Meanwhile, the vehicle 1 indicated by the vehicle information is moving at a speed of 30 km/h toward the north.
In this combination, the bicycle and the vehicle 1 are approaching each other. Hence, it is preferable to notify a rider of the bicycle of the presence of the vehicle 1. Accordingly, the priority of this combination is "high".
In addition, as illustrated in FIG. 8, a target is a vehicle, the vehicle as a target is moving toward northeast at a speed of 50 km/h behind the vehicle 1 acting as a reference, and the distance between the vehicle as a target and the vehicle 1 acting as a reference is 100 m. The vehicle 1 acting as a reference is also moving toward northeast at a speed of 50 km/h. Thus, it is unlikely that the vehicle 1 as a reference will come into contact with the vehicle as a target. Accordingly, the priority of this combination is "low".
As described above, the priority estimation table 211 according to the first embodiment includes a notification priority in accordance with the combination of the recognition information and the vehicle information. In other words, the priority estimation table 211 according to the first embodiment is referred to when sorting the recognized targets into two types: one type is a target to which notification is presented preferentially, and the other type is a target to which notification is not presented preferentially. Thus, according to the first embodiment, it may be said that the priority estimating unit 225 refers to the priority estimation table 211 and sets rankings (i.e., the notification priority) with respect to the recognized targets. That is, according to the first embodiment, the notification priority may also be referred to as "ranking".
In addition, the combination of the recognition information and the vehicle information with high priority indicates a status in which notification should be preferentially presented. Thus, the combination of the recognition information and the vehicle information with high priority may indicate a high-risk status.
The priority estimation table 211 according to the first embodiment may be stored in the storage unit 210 in advance. The priority estimation table 211 according to the first embodiment may be periodically updated by, for example, a server that collects traffic information, or the like. In this case, the controller 200 may periodically access the server to acquire an updated priority estimation table 211, and overwrite the storage unit 210 with the updated priority estimation table 211.
According to the first embodiment, by updating the priority estimation table 211 as described above, cases that have resulted in an accident and the like may be stored in the priority estimation table 211. Further, according to the first embodiment, inclusion of the priority estimation table 211 may enable priority estimation without need for communication outside the vehicle 1; this will reduce a communication load and improve the processing speed.
Note that a priority with respect to a combination of the recognition information and the vehicle information may not necessarily be stored as a table such as the priority estimation table 211. The priority may be a value calculated based on conditions indicated by a combination of the recognition information and the vehicle information.
Next, a notification content table 212 according to the first embodiment will be described with reference to FIG. 9. FIG. 9 is a diagram illustrating an example of the notification content table 212 according to the first embodiment.
The notification content table 212 according to the first embodiment includes a type of a target, a priority, and a notification content as items of information, which are associated with each other.
A value of the item "notification content" is projection image data to be rendered by the projector 300. As illustrated in the example in FIG. 9, when a type of a target is a bicycle, and the priority is "high", the projector 300 projects projection image data 11.
The notification content table 212 according to the first embodiment may be stored in the storage unit 210 in advance.
Next, an operation of the controller 200 according to the first embodiment will be described with reference to FIG. 10. FIG. 10 is a flowchart illustrating an operation of the controller 200 according to the first embodiment.
In step S1001, the controller 200 according to the first embodiment acquires vehicle information and external information by the vehicle information acquiring unit 221 and the external information acquiring unit 222, respectively.
Subsequently, in step S1002, the controller 200 determines, via the target recognition unit 223, whether there is a target present outside the vehicle 1. In step S1002, when a target is not present (NO in step S1002), the controller 200 proceeds to step S1009, which will be described later.
When one or more targets are present outside the vehicle 1 (YES in step S1002), the controller 200 generates, via the recognition information generating unit 224, recognition information for the one or more targets recognized by the target recognition unit 223 on a per target basis (step S1003).
Subsequently, in step S1004, the controller 200 estimates, via the priority estimating unit 225, notification priorities of the one or more recognized targets, on a per target basis. Specifically, the priority estimating unit 225 refers to the priority estimation table 211, and acquires a priority in association with a combination of the recognition information generated for each of the one or more recognized targets and the vehicle information, as a priority relating to notification to the corresponding recognized target. Note that the recognition information is generated by the recognition information generating unit 224 in step S1003.
Note that the combination of the recognition information generated by the recognition information generating unit 224 and the vehicle information need not completely match a combination of the recognition information and the vehicle information stored in the priority estimation table 211. For example, the combinations may be considered to match when the moving speed of a target in the generated recognition information falls within a predetermined range of the moving speed of the target in the recognition information stored in the priority estimation table 211.
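A sketch of this tolerant matching follows; the row fields, tolerance value, and table contents are illustrative, and only a subset of the columns of FIG. 8 is shown.

```python
# Illustrative sketch of matching generated recognition information against a
# row of the priority estimation table 211 with a speed tolerance. All names,
# values, and the tolerance are hypothetical.

SPEED_TOLERANCE_KMH = 5.0

def row_matches(row, recog, vehicle):
    return (row["type"] == recog["type"]
            and row["direction"] == recog["direction"]
            and abs(row["speed_kmh"] - recog["speed_kmh"]) <= SPEED_TOLERANCE_KMH
            and row["vehicle_direction"] == vehicle["direction"])

table = [{"type": "bicycle", "direction": "south", "speed_kmh": 10,
          "vehicle_direction": "north", "priority": "high"}]
recog = {"type": "bicycle", "direction": "south", "speed_kmh": 12}
vehicle = {"direction": "north"}

priorities = [row["priority"] for row in table if row_matches(row, recog, vehicle)]
print(priorities)  # -> ['high'] (12 km/h falls within tolerance of 10 km/h)
```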
Subsequently, in step S1005, the controller 200 specifies, via the notification target specifying unit 226, the target having the highest priority from among the recognized targets whose priorities have been estimated.
Subsequently, in step S1006, the controller 200 determines, via the notification determining unit 227, whether the priority of the specified target is at a predetermined level that requires notification. According to the first embodiment, when the priority is equal to or lower than the predetermined level, the projection image data is not projected by the projector 300. The predetermined level may be set in advance.
In step S1006, when a priority of the specified target is "low", for example, the notification determining unit 227 may determine that the priority of the specified target does not require notification, and may determine not to project the projection image data.
In step S1006, when the notification determining unit 227 determines that the priority of the specified target is not at the predetermined level that requires notification, the controller 200 proceeds to step S1009, which will be described later.
When the notification determining unit 227 determines in step S1006 that the priority of the specified target is at the predetermined level that requires notification, the notification determining unit 227 determines, in step S1007, whether the type of the specified target is a person or an object involving a person. Specifically, the notification determining unit 227 determines whether the specified target is a pedestrian or an object operated by a person, such as a bicycle or a motorcycle.
In step S1007, when the type of the specified target is not a person or an object involving a person, the controller 200 proceeds to step S1009, which will be described later. Targets that are neither a person nor an object involving a person include, for example, guardrails and buildings.
In step S1007, when a type of the specified target is a person or an object involving a person, the controller 200 causes, via the projection control unit 230, the projector 300 to project projection image data (step S1008). Details of step S1008 are described below.
Subsequently, in step S1009, the controller 200 determines whether the engine of the vehicle 1 has been stopped. In step S1009, when the engine of the vehicle 1 has not been stopped, the controller 200 returns to step S1001. In step S1009, when the engine of the vehicle 1 has been stopped, the controller 200 ends the process.
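Taken together, the flow of FIG. 10 can be summarized in the following Python sketch. Every helper on `controller` is an assumed stand-in for the corresponding unit described above, not an actual API of this disclosure.

```python
LEVELS = {"low": 0, "intermediate": 1, "high": 2}

def control_loop(controller):
    """Sketch of the FIG. 10 flow, assuming each target carries a priority
    that is one of the three levels above."""
    while not controller.engine_stopped():                     # S1009
        vehicle = controller.acquire_vehicle_information()     # S1001
        external = controller.acquire_external_information()   # S1001
        targets = controller.recognize_targets(external)       # S1002
        if not targets:
            continue
        for target in targets:                                 # S1003, S1004
            target.recognition = controller.generate_recognition_information(target)
            target.priority = controller.estimate_priority(target.recognition, vehicle)
        specified = max(targets, key=lambda t: LEVELS[t.priority])  # S1005
        if specified.priority == "low":                        # S1006: no notification
            continue
        if controller.involves_person(specified):              # S1007
            controller.project_notification(specified)         # S1008
```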
Next, an operation of the projection control unit 230 according to the first embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart illustrating a process of the projection control unit 230 according to the first embodiment. The process illustrated in FIG. 11 illustrates details of step S1008 in FIG. 10.
In step S1101, the projection control unit 230 according to the first embodiment determines, via the projection position determining unit 231, a projection position at which the projection image data is projected. The projection position at which the projection image data is projected indicates a projection area, on a road surface, in which an image is projected.
Specifically, the projection position determining unit 231 may, for example, determine a projection position, based on a position, a moving direction, and a moving speed of a target. Such a projection position may be included in a projectable range within which the projection image data is projected by the projector 300, and may also be included in a viewable range within which the projection image data is viewed by a person who is a target, or by a person who drives a target.
Subsequently, in step S1102, the projection control unit 230 determines, via the projectability determining unit 232, whether the road surface at the determined position is a projectable road surface, onto which the projection image data is projected.
Specifically, when a width of an uneven portion of the road surface is equal to or greater than a predetermined width, the projectability determining unit 232 determines that the road surface is not a projectable road surface, onto which the projection image data is projected.
In step S1102, when the projectability determining unit 232 determines that the road surface at the determined position is a projectable road surface, the projection control unit 230 refers to the notification content table 212, and determines projection image data to be projected on the road surface, in accordance with the specified target and the priority of the specified target (step S1103). In step S1104, the projection control unit 230 instructs, via the projection instructing unit 234, the projector 300 to project the determined projection image data, and ends the process.
In step S1102, when the projectability determining unit 232 determines that the road surface at the determined position is not a projectable road surface, the projection control unit 230 changes, via the projection position determining unit 231, the projection position (step S1105). Specifically, the projection position determining unit 231 changes the projection position by reducing the size of the projection area.
Subsequently, in step S1106, the projection control unit 230 determines, via the projectability determining unit 232, whether the road surface of the projection area changed by the projection position determining unit 231 is a projectable road surface. In step S1106, when the projectability determining unit 232 determines that the road surface of the changed projection area is a projectable road surface, the projection control unit 230 proceeds to step S1104.
In step S1106, when the projectability determining unit 232 determines that the road surface of the changed projection area is not a projectable road surface, the projection control unit 230 ends the process.
As can be seen from the above process, the projection control unit 230 according to the first embodiment determines a projection area of an image, in accordance with unevenness of the road surface.
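A sketch of this flow (step S1008, FIG. 11) follows; the helper names and the unevenness threshold are assumptions, not values given by this disclosure.

```python
def project_notification(ctrl, target):
    """Sketch of the FIG. 11 flow; `ctrl` methods are assumed stand-ins for
    the projection position determining unit 231, the projectability
    determining unit 232, and the projection instructing unit 234."""
    area = ctrl.determine_projection_position(target)   # S1101
    if not ctrl.road_surface_projectable(area):         # S1102
        area = ctrl.shrink_projection_area(area)        # S1105: smaller area
        if not ctrl.road_surface_projectable(area):     # S1106
            return                                      # give up projecting
    image = ctrl.select_projection_image(target)        # S1103: table 212
    ctrl.instruct_projection(image, area)               # S1104

def road_surface_projectable(max_uneven_width_m: float,
                             limit_m: float = 0.05) -> bool:
    """Unevenness test of step S1102: the surface is treated as
    non-projectable when the width of an uneven portion is equal to or
    greater than a predetermined width (the 5 cm limit here is assumed)."""
    return max_uneven_width_m < limit_m
```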
Next, an example of notification according to the first embodiment will be described with reference to FIGS. 12 to 14.
FIG. 12 is a first diagram illustrating an example of notification projected by a projector according to the first embodiment. FIG. 12 schematically illustrates a driver's field of view from the vehicle 1.
As illustrated in the example of FIG. 12, the controller 200 of the vehicle 1 recognizes, via the target recognition unit 223, targets 121, 122, and 123. The target 121 is a vehicle traveling in the lane opposite to the lane in which the vehicle 1 is traveling, the target 122 is a bicycle moving close to the vehicle 1, and the targets 123 are stationary pedestrians.
FIG. 12 also illustrates a case in which the notification target specifying unit 226 specifies, from among the targets 121, 122, and 123, the target 122 as the target having the highest priority, that is, the target that needs to be notified of the presence of the vehicle 1.
In this case, the controller 200 causes the projector 300 to project images 126 in a projection area 125, which is viewable by the target 122, who is riding the bicycle. The images 126 represent a notification indicating that the vehicle 1 is approaching. Note that the images 126 may, for example, be projected when the vehicle 1 is about to turn left.
According to the first embodiment, projecting of the images 126 in this manner can notify the bicycle rider of the presence of the vehicle 1.
FIG. 13 is a second diagram illustrating an example of notification projected by the projector according to the first embodiment. As illustrated in FIG. 13, an image 126A is projected in the projection area 125. The image 126A may be projected, for example, when the vehicle 1 is about to drive straight ahead.
FIG. 14 is a third diagram illustrating an example of notification projected by the projector 300 according to the first embodiment. In the example of FIG. 14, a target is recognized behind the vehicle 1, and this recognized target is specified as a target subject to notification. Specifically, in the example of FIG. 14, it is assumed that there is a pedestrian (as a target) behind the vehicle 1.
In this case, the vehicle 1 projects an image 142 in a projection area 141 behind the vehicle 1. Note that the image 142 in this example indicates the presence of the vehicle 1.
According to the first embodiment, the target having the highest notification priority is specified, and the notification is projected only for the specified target. Thus, the information presented in the driver's field of view of the vehicle 1 is kept simple.
Further, according to the first embodiment, the process illustrated in FIG. 10 is performed repeatedly. Hence, projection image data can be projected by the projector 300 in accordance with the external conditions outside the vehicle 1, which change over time.
Further, according to the first embodiment, an image can be projected onto the road surface, only at the necessary time, toward a target to whom notification of the presence of the vehicle 1 is desirable. Hence, the target's gaze may be guided to the vehicle 1 to alert the target.
(SECOND EMBODIMENT)
An information providing system according to a second embodiment will be described below with reference to the accompanying drawings. The information providing system according to the second embodiment differs from that according to the first embodiment in that it includes a HUD (head-up display) device. In the description of the second embodiment below, only the differences from the first embodiment are described. Components having the same functional configuration as in the first embodiment are denoted by the same reference numerals as those used in the description of the first embodiment, and descriptions thereof will be omitted.
FIG. 15 is a diagram illustrating an example of an information providing system according to the second embodiment. An information providing system 100A according to the second embodiment includes a controller 200A, a projector 300, and a HUD device 500. The controller 200A is connected to the projector 300 and the HUD device 500.
The controller 200A according to the second embodiment is connected to the sensor device 400 and a sensor device 450. The sensor device 450 may, for example, be a stereo camera provided inside the vehicle 1, and may be configured to image the inside of the vehicle. The information acquired by the sensor device 450 is thus internal information indicating conditions inside the vehicle 1.
The HUD device 500 according to the second embodiment may also be connected to a navigation system 600 or the like provided to the vehicle 1, and may be configured to display vehicle information or information provided by the navigation system 600.
Details of the HUD device 500 will be described below. FIG. 16 is a schematic diagram illustrating a configuration of a HUD device according to the second embodiment.
The HUD device 500 is installed in the vehicle 1, embedded within the dashboard. The HUD device 500 is a display device configured to display an image toward the windshield 12 through a projection window 501 disposed on an upper surface of the HUD device 500. The displayed image is presented as a virtual image I ahead of the windshield 12; in this sense, the HUD device 500 is an aspect of a display device. This enables the driver V to visually observe information that assists his or her driving while keeping his or her eyes, with only a small gaze movement, on a preceding vehicle and on the road surface. The information that assists the driver's driving may be any information; examples of such information other than the vehicle speed will be described later. Further, the HUD device 500 may be disposed at any place other than the dashboard, such as on a ceiling or a sun visor, insofar as the HUD device 500 can display an image onto the windshield 12.
The HUD device 500 may be a general-purpose information processing terminal or a HUD-dedicated terminal. A HUD-dedicated terminal is simply referred to as a head-up display device; when integrated with a navigation device, it may be referred to as a navigation device. A HUD-dedicated terminal is also called a PND (Portable Navigation Device), or may be referred to as a display audio system (or a connected audio system). A display audio system is a device that mainly provides an audio-visual (AV) function and a communication function without a built-in navigation function.
Examples of the general-purpose information processing terminal include a smartphone, a tablet terminal, a mobile phone, a PDA (Personal Digital Assistant), a notebook PC, and a wearable PC (e.g., a wristwatch type, a sunglass type). However, the general-purpose information processing terminal is not limited to these examples; the general-purpose information processing terminal may simply include typical functions of an information processing apparatus. A typical general-purpose information processing terminal is used as an information processing apparatus that executes various applications. For example, as with the HUD dedicated terminal, the general-purpose information processing terminal displays information for assisting a driver's driving, upon executing application software for a HUD device.
The HUD device 500 according to the second embodiment may be switched between a vehicle-mounted mode and a mobile mode, whether the HUD device 500 is used as a general-purpose information processing terminal or as a HUD-dedicated terminal.
The HUD device 500 includes an optical unit 510 and a controller 520 as major components. As a projection mode of the HUD device 500, a panel mode and a laser scanning mode are known in the related art. In the panel mode, an intermediate image is formed using an imaging device such as a liquid crystal panel, a DMD (digital micromirror device) panel, or a vacuum fluorescent display (VFD). In the laser scanning mode, an intermediate image is formed using a two-dimensional scanning device that scans a laser beam emitted from a laser light source.
The laser scanning mode is preferred in that it can typically form a high-contrast image. This is because, unlike the panel mode, which forms an image by partially blocking light emitted over the full screen, the laser scanning mode can assign emission or non-emission of light to each pixel. According to the second embodiment, the laser scanning mode is employed as the projection mode of the HUD device 500; however, any projection mode may be applicable insofar as it enables a process of reducing a floating feeling.
FIG. 17 is a diagram illustrating a configuration example of the optical unit of the HUD device 500. The optical unit 510 includes a light source unit 502, an optical deflector 503, a mirror 504, a screen 505, and a concave mirror 506. Note that FIG. 17 illustrates only the main components of the HUD device 500.
The light source unit 502 includes, for example, three laser light sources corresponding to RGB (hereinafter referred to as LD: laser diodes), a coupling lens, an aperture, a combining element, a lens, and the like. The light source unit 502 is configured to combine laser beams emitted from the three LDs, and guide the combined laser beam toward a reflecting surface of the optical deflector 503. The laser beam guided to the reflecting surface of the optical deflector 503 is two-dimensionally deflected by the optical deflector 503.
As the optical deflector 503, for example, one micro-mirror oscillating about two orthogonal axes, or two micro-mirrors each oscillating about or rotating around a single axis, may be used. The optical deflector 503 may, for example, be a MEMS (Micro Electro Mechanical Systems) device fabricated by a semiconductor process or the like, and may be driven, for example, by an actuator that uses the deformation force of a piezoelectric element. Alternatively, the optical deflector 503 may be a galvanometer mirror, a polygon mirror, or the like.
The laser beam two-dimensionally deflected by the optical deflector 503 enters the mirror 504 and is reflected by the mirror 504 so as to render a two-dimensional image (intermediate image) on the surface of the screen 505 (the scanned surface). A concave mirror may be used as the mirror 504 in this example; however, a convex mirror or a planar mirror may also be used. Deflecting the direction of the laser beam using the optical deflector 503 and the mirror 504 allows the size of the HUD device 500 to be reduced, or the arrangement of its components to be changed flexibly.
As the screen 505, a microlens array or a micro-mirror array having a function of diverging a laser beam at a desired divergence angle is preferably used; however, a diffuser plate for diffusing a laser beam, a transmissive plate having a smooth surface, a reflector plate, or the like may also be used.
The laser beam emitted from the screen 505 is reflected by the concave mirror 506, and the reflected laser beam is then projected onto the windshield 12. The concave mirror 506 has a function similar to a lens to form an image at a predetermined focal length. Accordingly, a virtual image I is displayed at a position determined based on a distance between the screen 505 (corresponding to an object) and the concave mirror 506, and also on a focal length of the concave mirror 506. As illustrated in FIG. 17, a laser beam is projected via the concave mirror 506 onto the windshield 12 such that the virtual image I is displayed (formed) at a distance L from a viewpoint E of the driver V.
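The relationship among these distances can be made concrete with the standard mirror equation; the formula below is a textbook approximation added here for reference and is not stated in this disclosure.

```latex
% Mirror equation and magnification for the concave mirror 506:
% d_o = distance from the screen 505 (the object) to the mirror,
% d_i = image distance, f = focal length, m = magnification.
\frac{1}{d_o} + \frac{1}{d_i} = \frac{1}{f}, \qquad m = -\frac{d_i}{d_o}
```

When the screen 505 lies inside the focal length (d_o < f), d_i is negative, meaning the image is virtual and enlarged; the distance L from the viewpoint E is then, approximately, the optical path length from the viewpoint to the mirror plus |d_i|.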
At least a portion of the luminous flux directed to the windshield 12 is reflected toward the viewpoint E of the driver V. As a result, the driver V can see the virtual image I, which is the intermediate image on the screen 505 enlarged across the windshield 12; that is, viewed from the driver V, the intermediate image appears magnified as the virtual image I beyond the windshield 12.
Note that the windshield 12 is typically slightly curved rather than flat. Thus, the image-forming position of the virtual image I is determined not only by the focal length of the concave mirror 506, but also by the curved surface of the windshield 12. The light-collecting power of the concave mirror 506 is preferably set such that the virtual image I is displayed at a distance L from the viewpoint E of the driver V of 4 m or more and 10 m or less (preferably 6 m or less).
Note that, due to its shape, the windshield 12 may cause optical distortion, such as a horizontal line of the intermediate image being curved in a concave or convex manner. Hence, at least one of the mirror 504 and the concave mirror 506 is preferably designed or arranged to compensate for this distortion, or designed or arranged in consideration of the distortion so that the displayed image is corrected.
Further, a combiner may be disposed as a transparent reflective member at a position closer to the viewpoint E than the windshield 12 is. When the combiner receives light from the concave mirror 506, a virtual image I may be displayed in a manner similar to the configuration in which the windshield 12 receives light from the concave mirror 506. Note that "to display a virtual image" indicates displaying an image in a manner visually perceivable to a driver through a transparent member; for simplicity, this may also be expressed as "to display an image".
Further, instead of an image being displayed onto the windshield 12, the windshield 12 may be designed to emit light to display an image.
FIG. 18 is a diagram illustrating an example of a hardware configuration of a controller 520 of a HUD device 500. The controller 520 includes an FPGA 511, a CPU 512, a ROM 513, a RAM 514, an I/F 515, a bus line 516, an LD driver 517, and a MEMS controller 518. The FPGA 511, the CPU 512, the ROM 513, the RAM 514, and the I/F 515 are interconnected via the bus line 516.
The CPU 512 controls respective functions of the HUD device 500. The ROM 513 stores a program 530, which is executed by the CPU 512 to control the respective functions of the HUD device 500. The program 530 is loaded into the RAM 514, which the CPU 512 uses as a work area for executing the loaded program 530. The RAM 514 also includes an image memory 519, which is used to generate an image to be displayed as the virtual image I. The I/F 515 is an interface for communicating with other devices installed in the vehicle 1, and is connected, for example, to a CAN (Controller Area Network) bus or Ethernet (TM) of the vehicle 1.
The FPGA 511 controls the LD (Laser Diode) driver 517, based on an image created by the CPU 512. The LD driver 517 drives the LDs of the light source unit 502 of the optical unit 510 so as to control their light emission in accordance with the image created by the CPU 512. The FPGA 511 also operates the optical deflector 503 of the optical unit 510 via the MEMS controller 518 such that the laser beam is deflected toward a direction corresponding to each pixel position of the image.
Next, a functional configuration of a controller 200A of an information providing system according to the second embodiment will be described with reference to FIG. 19. FIG. 19 is a diagram illustrating a functional configuration of the controller 200A according to the second embodiment.
The controller 200A according to the second embodiment includes a storage unit 210A, a vehicle information acquiring unit 221, an external information acquiring unit 222, a target recognition unit 223, a recognition information generating unit 224, a priority estimating unit 225, a notification target specifying unit 226, a notification determining unit 227A, an internal information acquiring unit 228, a projection control unit 230, and a HUD control unit 240.
The storage unit 210A according to the second embodiment stores the priority estimation table 211 and a notification content table 212A. The notification content table 212A stores projection image data to be projected by the projector 300 and image data to be displayed by the HUD device 500. Details of the notification content table 212A will be described later.
The notification determining unit 227A determines whether to project projection image data by the projector 300, and whether to display image data by the HUD device 500, in accordance with the type of a specified target and the priority corresponding to the specified target.
Specifically, when a level of the priority corresponding to the specified target is higher than a predetermined value, the notification determining unit 227A determines to perform projection by the projector 300 and display by the HUD device 500. Also, when the type of the specified target is a person or a moving body driven by a person, the notification determining unit 227A determines to perform projection of projection image data by the projector 300 and display of image data by the HUD device 500. Further, when the type of the specified target is not a person or a moving body driven by a person, the notification determining unit 227A determines to perform only display of image data by the HUD device 500.
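The routing just described can be condensed into the following sketch; the threshold level and the set of person-related types are assumptions made for illustration.

```python
LEVELS = {"low": 0, "intermediate": 1, "high": 2}
PERSON_RELATED = {"pedestrian", "bicycle", "motorcycle"}

def decide_notification(target_type: str, priority: str,
                        threshold: str = "intermediate") -> dict:
    """Sketch of the notification determining unit 227A; the threshold and
    the person-related types are assumed values."""
    if LEVELS[priority] < LEVELS[threshold]:
        # Below the level that requires notification: neither device is used.
        return {"projector": False, "hud": False}
    # A person, or a moving body driven by a person, is notified outside the
    # vehicle (projector) as well as inside (HUD); other targets, such as a
    # guardrail, are reported to the driver only (HUD).
    return {"projector": target_type in PERSON_RELATED, "hud": True}
```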
The internal information acquiring unit 228 acquires internal information, i.e., information about the inside of the vehicle 1 acquired via the sensor device 450. The internal information includes, for example, information indicating the number of occupants in the vehicle 1, information indicating the seating positions of the occupants, information indicating whether each of the occupants is an adult or a child, and the like.
The HUD control unit 240 according to the second embodiment controls the HUD device 500. In other words, the HUD control unit 240 is a display control unit that controls the HUD device 500 acting as a display device. The HUD control unit 240 includes a display position determining unit 241, a content determining unit 242, and a display instructing unit 243.
The display position determining unit 241 determines a position at which image data is displayed inside the vehicle 1. The display position determining unit 241 according to the second embodiment determines a display position of image data within a display area of the HUD device 500 such that the display position of the image data is not superimposed on a projection area of the projection image data, which is projected by the projector 300.
Specifically, the display position determining unit 241 may locate (obtain) a projection area of the projector 300 within a driver's field of view, from video data and the like in a range corresponding to the driver's field of view. The display position determining unit 241 may then determine a display position of image data to be within an area excluding the projection area of the projector 300. Note that the video data and the like may be captured (imaged) from inside the vehicle 1 by the sensor device 450.
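A minimal geometric sketch of this non-superimposition constraint is given below, modeling the projection area and candidate display positions as axis-aligned rectangles in the driver's-view image; the types and the candidate-selection strategy are assumptions.

```python
from typing import List, NamedTuple, Optional

class Rect(NamedTuple):
    """Axis-aligned rectangle in image coordinates of the driver's view."""
    x: float
    y: float
    w: float
    h: float

def overlaps(a: Rect, b: Rect) -> bool:
    return not (a.x + a.w <= b.x or b.x + b.w <= a.x
                or a.y + a.h <= b.y or b.y + b.h <= a.y)

def choose_display_position(candidates: List[Rect],
                            projection_area: Rect) -> Optional[Rect]:
    """Return the first candidate HUD display area that is not superimposed
    on the projection area located in the driver's-view video data."""
    for candidate in candidates:
        if not overlaps(candidate, projection_area):
            return candidate
    return None  # no non-overlapping position within the display area
```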
The content determining unit 242 refers to the notification content table 212A, and determines image data corresponding to the specified target and to the notification priority of the specified target.
The display instructing unit 243 instructs the HUD device 500 to display the image data determined by the content determining unit 242.
FIG. 20 is a diagram illustrating an example of a notification content table 212A according to the second embodiment. The notification content table 212A according to the second embodiment includes a type of a target, a priority, and notification content as items of information. The item "notification content" includes projection image data (projection image data 11 to 13, and 21 to 23) to be projected by the projector 300 on the road surface, and image data (image data 1 to 9) to be displayed by the HUD device 500 in a display area inside the vehicle 1. The display area according to the second embodiment is an area in which an image can be displayed by the HUD device 500. The display area may be determined according to the structure of the optical unit 510 of the HUD device 500.
According to an example of the notification content table 212A illustrated in FIG. 20, when a type of target is a pedestrian and a priority is "intermediate", projection image data 22 is projected onto the road surface by the projector 300 and image data 5 is displayed inside the vehicle 1 by the HUD device 500.
Further, as illustrated in FIG. 20, when the type of the target is a guardrail and the priority is "high", no projection image data is projected by the projector 300, and image data 7 is displayed inside the vehicle 1 by the HUD device 500. That is, in this case, an image represented by the image data 7 is displayed on the windshield 12 by the HUD device 500.
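One plausible, hypothetical encoding of the notification content table 212A keys the content on a (target type, priority) pair, with `None` marking entries for which nothing is projected on the road; only the entries stated above are grounded in the examples, and the rest is assumed.

```python
# Hypothetical encoding of the notification content table 212A: the key is
# (target type, priority); the value names the projector image (None when
# nothing is projected on the road surface) and the HUD image.
NOTIFICATION_CONTENT_TABLE_212A = {
    ("bicycle", "high"): ("projection_image_11", "image_1"),
    ("pedestrian", "intermediate"): ("projection_image_22", "image_5"),
    ("guardrail", "high"): (None, "image_7"),
}

def look_up_notification_content(target_type: str, priority: str):
    """Return (projection image, HUD image) for the specified target, or
    (None, None) when the table has no matching entry."""
    return NOTIFICATION_CONTENT_TABLE_212A.get((target_type, priority),
                                               (None, None))
```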
As described above, the projector 300 according to the second embodiment projects projection image data onto a road surface when the type of a target is a person or a moving body driven by a person. Accordingly, the projector 300 according to the second embodiment may also be regarded as a device for presenting notification to a person outside the vehicle 1.
Further, the HUD device 500 according to the second embodiment displays image data inside the vehicle 1 even when the type of a target is not a person or a moving body driven by a person. Accordingly, the HUD device 500 according to the second embodiment may also be regarded as a device for presenting notification to the driver of the vehicle 1.
Next, an operation of the controller 200A according to the second embodiment will be described with reference to FIG. 21. FIG. 21 is a flowchart illustrating an operation of the controller 200A according to the second embodiment.
The controller 200A according to the second embodiment acquires vehicle information, external information, and internal information via the vehicle information acquiring unit 221, the external information acquiring unit 222, and the internal information acquiring unit 228, respectively (step S2101), and then proceeds to step S2102.
Since a process from step S2102 to step S2105 is the same as the process from step S1002 to step S1005 in FIG. 10, a description of the process from step S2102 to step S2105 will not be repeated.
Following step S2105, the controller 200A determines, via the notification determining unit 227A, whether to perform projection of the projection image data by the projector 300 and display of the image data by the HUD device 500, based on the estimated priority of the specified target (step S2106). In other words, the notification determining unit 227A determines whether to present notification using the projector 300 and the HUD device 500.
When the notification determining unit 227A determines not to present notification in step S2106, the controller 200A proceeds to step S2110, which will be described later.
When the notification determining unit 227A determines in step S2106 to present notification, the notification determining unit 227A determines, in step S2107, whether the specified target is a person or a moving body driven by a person.
When the specified target is not a person or a moving body driven by a person in step S2107, the controller 200A proceeds to step S2109, which will be described later.
In step S2107, when a type of the specified target is a person or a moving body driven by a person, the controller 200A instructs the projection control unit 230 to cause the projector 300 to project projection image data onto a road surface (step S2108). Step S2108 is similar to step S1008 in FIG. 10.
Subsequently, in step S2109, the controller 200A instructs the HUD control unit 240 to cause the HUD device 500 to display image data.
Specifically, the HUD control unit 240 determines, via the display position determining unit 241, a display position of image data, where the image data represents notification to a driver of the vehicle 1. The HUD control unit 240 determines, via the content determining unit 242, notification content (i.e., image data) corresponding to a priority of the specified target, by referring to the notification content table 212A. The HUD control unit 240 then instructs, via the display instructing unit 243, the HUD device 500 to display image data determined by the content determining unit 242. Thus, an image corresponding to the determined image data is displayed on the windshield 12.
Subsequently, in step S2110, the controller 200A determines whether the engine has been turned off. In step S2110, when the engine has not been turned off, the controller 200A returns to step S2101.
In step S2110, when the engine has been turned off, the controller 200A ends the process.
Next, an example of notification according to the second embodiment will be described with reference to FIGS. 22 to 25. FIG. 22 is a first diagram illustrating an example of notification by the projector 300 and the HUD device 500, according to the second embodiment.
FIG. 22 schematically illustrates the driver's field of view of the vehicle 1 in which the information providing system 100A is installed.
In FIG. 22, the targets 121, 122, and 123 are recognized, and the priority corresponding to the target 122 is the highest. Hence, the target 122 is specified as the target to which notification is presented.
In this case, images 126 are projected in a projection area 125 on the road surface in order to present notification indicating that the vehicle 1 is approaching.
The HUD device 500 displays an image 260 to highlight the target 122 in a display area inside the vehicle 1 so as to notify a driver of the vehicle 1 of the presence of the target 122.
The HUD device 500 also displays an image 262 and an image 263 in an area 261. Note that this area 261 is a part of the display area of the HUD device 500.
The image 262 is a mark indicating that a bicycle as the target 122 is approaching, and the image 263 is a mark indicating that the vehicle 1 is about to turn left.
The controller 200A according to the second embodiment determines, via the display position determining unit 241 of the HUD control unit 240 (an example of a display control unit), the position of the area 261 such that the area 261 is not superimposed onto the projection area 125 of the projector 300, within the driver's field of view of the vehicle 1.
According to the second embodiment, since the HUD device 500 displays an image to highlight a specified target (e.g., target 122), a driver of the vehicle 1 can be promptly notified of the presence of the specified target that needs to be attended to.
Further, according to the second embodiment, a projection area, in which projection image data is projected by the projector 300, is not superimposed on a display area, in which image data is displayed by the HUD device 500, within the driver's field of view of the vehicle 1.
Thus, according to the second embodiment, an image displayed by the HUD device 500 will not be superimposed on an image projected by the projector 300, on the road surface, which acts as a background within the driver's field of view of the vehicle 1. This improves viewability of a driver of the vehicle 1.
In addition, according to the second embodiment, information about the vehicle 1 can be notified to a person outside the vehicle by projecting an image onto a road surface, and information about a target that needs to be attended to can be notified to a person inside the vehicle (e.g., a driver) by displaying an image in a display area inside the vehicle 1. Hence, according to the second embodiment, it is possible to alert both a person outside the vehicle and a person inside the vehicle (e.g., vehicle driver) so as to draw their attention simultaneously. This improves safety of a person outside the vehicle and a person inside the vehicle simultaneously.
As illustrated in the example of FIG. 22, the image 260 for highlighting only the specified target 122 is displayed; however, the present invention is not limited to this example. The HUD device 500 may, for example, display images for highlighting any other pedestrians, from among the recognized targets, regardless of priorities of the pedestrians.
In such a case, images for highlighting the targets 123 are also displayed; however, the display mode of the images for highlighting the targets 123 may differ from the display mode of the image 260 for highlighting the target 122. For example, the color and brightness of an image for highlighting the target with the highest priority may differ from those of images for highlighting the other targets.
FIG. 23 is a second diagram illustrating an example of notification by the projector and the HUD device according to the second embodiment.
FIG. 23 also schematically illustrates a driver's field of view of the vehicle 1 in which the information providing system 100A is installed, as with FIG. 22.
FIG. 23 illustrates a case in which a target 271 is recognized ahead of the vehicle 1. The target 271 in this case is a vehicle (a second vehicle) moving ahead of the vehicle 1.
FIG. 23 also illustrates that another vehicle (a third vehicle), moving in the lane next to the lane along which the vehicle 1 is moving, is trying to enter between the vehicle 1 and the target 271 (the second vehicle). As illustrated in FIG. 23, images 282 are projected in a projection area 281 by the projector 300 of the other vehicle (the third vehicle), and the images 282 come into the field of view of the driver of the vehicle 1.
Hence, the projection control unit 230 according to the second embodiment needs to prevent an image projected by the vehicle 1 from being superimposed on the images 282 projected by another vehicle (third vehicle).
For example, as illustrated in FIG. 23, an image 126A is projected by the vehicle 1 in a projection area 125A ahead of the vehicle 1. According to the second embodiment, the image 126A is projected by the vehicle 1 at a position so as not to be superimposed on the images 282 projected by another vehicle (third vehicle).
Specifically, the projection position determining unit 231 of the projection control unit 230 obtains, from video data or the like of the vehicle 1, positions at which the images 282 have been projected. Note that the video data or the like of the vehicle 1 may be imaged or captured by the sensor device 400. The projection position determining unit 231 subsequently determines a projection position that is not superimposed on the obtained positions of the images 282, so as to cause the projector 300 to project the image 126A to the determined projection position.
As described above, according to the second embodiment, it is possible to prevent images projected on a road surface by a plurality of vehicles from being superimposed on one another, while clearly presenting the notification represented by each image.
Further, as illustrated in FIG. 23, an image 292 and an image 293 are displayed in an area 290. The image 292 indicates that the other vehicle (the third vehicle) is approaching, and the image 293 indicates that the distance between the vehicle 1 and the vehicle ahead (the target 271) has narrowed. According to the second embodiment, the image 292 and the image 293 displayed in the area 290 are not superimposed on the images 282.
FIG. 24 is a third diagram illustrating an example of notification presented by the projector 300 and the HUD device 500 according to the second embodiment.
As illustrated in FIG. 24, an image 272 is projected by a projector 300 disposed at the back of a preceding vehicle (i.e., the target 271) moving ahead of the vehicle 1.
In this case, the projector 300 of the vehicle 1 (i.e., a projector differing from the projector 300 of the target 271) projects images at positions so as not to be superimposed onto the image 272 projected by the projector 300 of the target 271.
As illustrated in FIG. 24, the projector 300 of the vehicle 1 projects images 128 in the projection area 125A of the vehicle 1 so as not to be superimposed onto the image 272. Further, as illustrated in FIG. 24, the HUD device 500 of the vehicle 1 displays an image 293 in an area 290A that is not superimposed onto the images 128 and the image 272, within the driver's field of view of the vehicle 1.
FIG. 25 is a diagram illustrating an example of notification projected by the projector of the vehicle 1 according to the second embodiment. According to this example, the controller 200A according to the second embodiment may cause the projector 300 to project notification to those around the vehicle 1, in accordance with internal information acquired by the internal information acquiring unit 228.
According to the example illustrated in FIG. 25, the controller 200A causes the projector 300 to project an image 130 in a projection area 129 behind the vehicle 1. In this case, the controller 200A causes the projector 300 to project the image 130 as a result of detecting, from the internal information, an infant being present among persons (occupants) inside the vehicle 1.
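A minimal sketch of this internal-information-driven projection follows; the field names and the helper methods are assumptions made for illustration.

```python
def notify_from_internal_information(ctrl, internal: dict) -> None:
    """Sketch of the FIG. 25 behavior: when the internal information reports
    an infant among the occupants, project a notice (such as the image 130)
    in a projection area behind the vehicle."""
    if any(occupant.get("is_infant") for occupant in internal["occupants"]):
        ctrl.project_behind(ctrl.load_image("infant_on_board_notice"))
```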
Further, according to the second embodiment, upon an abnormal change being detected in a person inside the vehicle 1 based on the internal information, an image indicating the abnormal change may be projected by the projector 300 or displayed by the HUD device 500.
The abnormal change occurring inside the vehicle 1 may thus be communicated to the outside of the vehicle 1.
Although the invention has been described in accordance with the above-described embodiments, the invention is not limited to the requirements illustrated in the embodiments. In these respects, the subject matter of the present invention may be altered without departing from the gist of the invention, and may be suitably defined according to the application.

100, 100A  information providing system
200, 200A  controller
210  storage unit
211  priority estimation table
212, 212A  notification content table
221  vehicle information acquiring unit
222  external information acquiring unit
223  target recognition unit
224  recognition information generating unit
225  priority estimating unit
226  notification target specifying unit
227  notification determining unit
228  internal information acquiring unit
230  projection control unit
240  HUD control unit
300  projector
400, 450  sensor device
500  HUD device

The present application is based on and claims the benefit of priority of Japanese Priority Application No. 2019-052959 filed on March 20, 2019, the entire contents of which are hereby incorporated herein by reference.

Claims (17)

  1.     An information providing system installed in a moving body, the information providing system comprising:
        a projector; and
        a controller configured to control the projector, wherein the controller includes
        an external information acquiring unit configured to acquire external information outside the moving body,
        a target recognition unit configured to recognize one or more targets based on the external information,
        an estimating unit configured to estimate rankings of the one or more recognized targets, and
        a projection control unit configured to cause the projector to project an image representing notification to a desired target outside the moving body, the desired target being specified, from among the one or more recognized targets, in accordance with the rankings of the one or more recognized targets.
  2.     The information providing system according to claim 1, wherein
        the outside the moving body, to which the image representing notification to the desired target is projected, is a road surface on which the moving body is traveling.
  3.     The information providing system according to claim 2, wherein the controller further includes
        a storage unit configured to store information including a priority in association with a combination of moving body information and recognition information, the moving body information indicating a status of the moving body, and the recognition information indicating a status of each of the one or more recognized targets, and
        the estimating unit refers to the storage unit and estimates the rankings of the one or more recognized targets, based on the recognition information indicating respective statuses of the one or more recognized targets and the moving body information indicating the status of the moving body.
  4.     The information providing system according to claim 3, wherein
        the projection control unit causes the projector to project an image representing notification to the desired target onto a road surface on which the moving body is traveling, the desired target having a highest priority.
  5.     The information providing system according to any one of claims 2 to 4, wherein
        the external information includes three-dimensional information representing a space around the moving body.
  6.     The information providing system according to any one of claims 2 to 5, wherein
        the projection control unit determines an area in which the image representing notification to the desired target is projected in accordance with unevenness of the road surface.
  7.     The information providing system according to any one of claims 2 to 6, wherein
        in response to another image being already projected in an area on the road surface by another moving body differing from the moving body, the projection control unit causes the projector to project the image representing notification to the desired target in an area differing from the area in which the another image is already projected.
  8.     The information providing system according to any one of claims 1 to 7, further comprising:
        a display device configured to superimpose a virtual image onto a background, the background being a view viewed from inside the moving body, wherein the controller further includes
        a display control unit configured to cause the display device to display the virtual image in accordance with the rankings of the one or more recognized targets.
  9.     The information providing system according to claim 8, wherein
        the display control unit causes the display device to display a virtual image for highlighting a desired target, the desired target being specified in accordance with the rankings of the one or more recognized targets.
  10.     The information providing system according to claim 8 or 9, wherein
        the display control unit displays the virtual image corresponding to the desired target in a display mode differing from a display mode of a virtual image corresponding to another recognized target, the desired target being specified in accordance with the rankings of the one or more recognized targets.
  11.     The information providing system according to any one of claims 8 to 10, wherein
        in response to an image being projected on the road surface acting as the background viewed from inside the moving body, the display control unit determines a display area for displaying the virtual image such that the virtual image is not superimposed onto the image projected on the road surface.
  12.     The information providing system according to any one of claims 8 to 11, wherein
        in response to a type of a recognized target being a person or a moving body driven by a person, the projection control unit causes the projector to project the image representing the notification to the desired target, and the display control unit causes the display device to display the virtual image, and
        in response to the type of the recognized target not being a person or a moving body driven by a person, the display control unit causes the display device to display the virtual image.
  13.     The information providing system according to any one of claims 8 to 12, wherein
        the controller further includes an internal information acquiring unit configured to acquire internal information of the moving body, and wherein
        the display control unit causes the display device to display the virtual image, in accordance with a status of an occupant of the moving body, the status of the occupant of the moving body being indicated by the acquired internal information of the moving body.
  14.     The information providing system according to claim 13, further comprising:
        a first sensor device configured to acquire the external information; and
        a second sensor device configured to acquire the internal information, wherein
        each of the first sensor device and the second sensor device is a stereo camera.
  15.     A moving body comprising the information providing system according to any one of claims 1 to 14.
  16.     An information providing method performed by an information providing system installed in a moving body, the information providing system including a projector and a controller configured to control the projector, the information providing method comprising:
        acquiring external information outside the moving body;
        recognizing one or more targets based on the external information;
        estimating rankings of the one or more recognized targets; and
        causing the projector to project an image representing notification to a desired target outside the moving body, the desired target being specified, from among the one or more recognized targets, in accordance with the rankings of the one or more recognized targets.
  17.     An information providing program having instructions executed by a controller, the controller being configured to control a projector, wherein the instructions, when executed by at least one processor, cause the at least one processor to perform a process comprising:
        acquiring external information outside a moving body, the moving body being provided with the projector;
        recognizing one or more targets based on the external information;
        estimating rankings of the one or more recognized targets; and
        causing the projector to project an image representing a notification to a desired target outside the moving body, the desired target being specified, from among the one or more recognized targets, in accordance with the rankings of the one or more recognized targets.
PCT/JP2020/011521 2019-03-20 2020-03-16 Information providing system, moving body, information providing method, and information providing program WO2020189636A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019052959A JP2020152246A (en) 2019-03-20 2019-03-20 Information provision system, movable body, information provision method, and information provision program
JP2019-052959 2019-03-20

Publications (1)

Publication Number Publication Date
WO2020189636A1 true WO2020189636A1 (en) 2020-09-24

Family

ID=70050174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/011521 WO2020189636A1 (en) 2019-03-20 2020-03-16 Information providing system, moving body, information providing method, and information providing program

Country Status (2)

Country Link
JP (1) JP2020152246A (en)
WO (1) WO2020189636A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024029374A1 (en) * 2022-08-05 2024-02-08 ソニーグループ株式会社 Information processing device, information processing method, and road surface projection system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016055691A (en) 2014-09-08 2016-04-21 株式会社小糸製作所 Vehicular display system
US20170253177A1 (en) * 2016-03-07 2017-09-07 Toyota Jidosha Kabushiki Kaisha Vehicle lighting system
JP2017159881A (en) * 2016-03-10 2017-09-14 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Recognition result presentation device, recognition result presentation method and autonomous movable body
US20180118099A1 (en) * 2015-04-10 2018-05-03 Maxell, Ltd. Image projection apparatus
JP2018106655A (en) 2016-12-28 2018-07-05 株式会社リコー HUD device, vehicle device and display method
US20180260182A1 (en) * 2017-03-10 2018-09-13 Subaru Corporation Image display device
US20180261081A1 (en) * 2017-03-10 2018-09-13 Subaru Corporation Image display device
JP2019052959A (en) 2017-09-15 2019-04-04 日本電信電話株式会社 Method, device and program for inspecting state of columnar structure

Also Published As

Publication number Publication date
JP2020152246A (en) 2020-09-24

Similar Documents

Publication Publication Date Title
US10551619B2 (en) Information processing system and information display apparatus
US10546561B2 (en) Display device, mobile device, display method, and recording medium
US10890762B2 (en) Image display apparatus and image display method
US11333521B2 (en) Head-up display, vehicle device, and information display method
JP6699675B2 (en) Information provision device
JP6658859B2 (en) Information provision device
KR20190015552A (en) DISPLAY DEVICE, MOBILE DEVICE, AND METHOD OF MANUFACTURING DISPLAY DEVICE
KR20180103947A (en) Information display device
EP3348433B1 (en) Information display device and vehicle apparatus
US12005832B2 (en) Vehicle display system, vehicle system, and vehicle
WO2021065617A1 (en) Vehicular display system and vehicle
US20210309145A1 (en) Vehicle display system and vehicle
JP2023175794A (en) Head-up display
WO2020189636A1 (en) Information providing system, moving body, information providing method, and information providing program
JP2018058521A (en) Virtual display mirror device
US10914948B2 (en) Display device, display control method, and storage medium
JP6642103B2 (en) Head-up display device
JP2017105245A (en) Head-up display device
JP2019174519A (en) Display unit, display system, moving body, display intensity control method, and program
EP3961291B1 (en) Vehicular head-up display and light source unit used therefor
JP2020148950A (en) Head-up display device
KR102598912B1 (en) Laser safety vehicle head up display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20715214

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20715214

Country of ref document: EP

Kind code of ref document: A1