EP2793193B1 - Display device and display method - Google Patents

Display device and display method

Info

Publication number
EP2793193B1
Authority
EP
European Patent Office
Prior art keywords
information
display
roadway
road
outside
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP11877396.9A
Other languages
German (de)
French (fr)
Other versions
EP2793193A4 (en)
EP2793193A1 (en)
Inventor
Tetsuya Fujie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pioneer Corp
Original Assignee
Pioneer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=48612041&patent=EP2793193(B1). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Pioneer Corp
Publication of EP2793193A1
Publication of EP2793193A4
Application granted
Publication of EP2793193B1
Legal status: Active
Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
      • B60: VEHICLES IN GENERAL
        • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
          • B60K35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
          • B60K35/10: Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
          • B60K35/20: Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
          • B60K35/21: Output arrangements using visual output, e.g. blinking lights or matrix displays
          • B60K35/22: Display screens
          • B60K35/23: Head-up displays [HUD]
          • B60K35/26: Output arrangements using acoustic output
          • B60K35/28: Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
          • B60K35/50: Instruments characterised by their means of attachment to or integration in the vehicle
          • B60K35/53: Movable instruments, e.g. slidable
          • B60K35/60: Instruments characterised by their location or relative disposition in or on vehicles
          • B60K2360/00: Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
          • B60K2360/148: Instrument input by voice
          • B60K2360/16: Type of output information
          • B60K2360/166: Navigation
          • B60K2360/20: Optical features of instruments
          • B60K2360/33: Illumination features
          • B60K2360/334: Projection means
          • B60K2360/77: Instrument locations other than the dashboard
          • B60K2360/771: Instrument locations other than the dashboard, on the ceiling
          • B60K2360/785: Instrument locations other than the dashboard, on or in relation to the windshield or windows
    • G: PHYSICS
      • G01: MEASURING; TESTING
        • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
          • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
          • G01C21/26: Navigation specially adapted for navigation in a road network
          • G01C21/34: Route searching; Route guidance
          • G01C21/36: Input/output arrangements for on-board computers
          • G01C21/3626: Details of the output of route guidance instructions
          • G01C21/3647: Guidance involving output of stored or live camera images or video streams
          • G01C21/365: Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
          • G01C21/3697: Output of additional, non-guidance related information, e.g. low fuel level
      • G02: OPTICS
        • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
          • G02B27/01: Head-up displays
          • G02B27/0101: Head-up displays characterised by optical features
          • G02B2027/0123: Head-up displays characterised by optical features comprising devices increasing the field of view
          • G02B2027/0138: Head-up displays characterised by optical features comprising image capture systems, e.g. camera
          • G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems
      • G08: SIGNALLING
        • G08G: TRAFFIC CONTROL SYSTEMS
          • G08G1/00: Traffic control systems for road vehicles
          • G08G1/09: Arrangements for giving variable traffic instructions
          • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
          • G08G1/0968: Systems involving transmission of navigation instructions to the vehicle
          • G08G1/0969: Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
      • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
          • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
          • G09B29/003: Maps
          • G09B29/006: Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
          • G09B29/007: Representation of non-cartographic information on maps using computer methods
          • G09B29/10: Map spot or coordinate position indicators; Map reading aids

Definitions

  • the present invention relates to a technology for displaying information.
  • Patent Reference-1 discloses a technique for making map data so that landmarks do not overlap with any road in the map data.
  • Patent Reference-2 discloses the preamble of claim 1.
  • An object of the present invention is to provide a display device and a display method thereof capable of appropriately displaying information on an object contained in the front scenery of the moving body without superposing it on any road.
  • One invention is a display device according to the subject-matter of claim 1.
  • Another invention is a display method executed by a display device, according to the subject-matter of claim 7.
  • a display device including: a present position information obtaining unit configured to obtain present position information of a moving body; and a display control unit configured to display information on an object through a transparent member based on the present position information, the object being contained in scenery in traveling direction of the moving body, the transparent member being positioned between a viewpoint of a traveler in the moving body and the scenery, wherein the display control unit changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the transparent member.
  • the display device is used for displaying information on an object contained in scenery through a transparent member based on the present position information of a moving body.
  • the above-mentioned transparent member is positioned between a viewpoint of a traveler in the moving body and the scenery in the traveling direction of the moving body.
  • the display control unit in the display device changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, wherein the road is contained in the scenery seen through the transparent member.
  • the display control unit adjusts the display position of the information on the object to a position outside the range of the road when the display position of the information on the object is positioned within the range of the road.
  • the display device further includes an object information obtaining unit configured to obtain category information on a category of the object, and the display control unit determines whether or not to change the display position of the information on the object based on the category information.
  • the display control unit decides to change the display position of the information on the object to the outward direction of the road in a case that the category of the object according to the category information is relevant to a facility, and decides not to change the display position of the information on the object to the outward direction of the road in a case that the category of the object according to the category information is relevant to a road sign.
  • the display control unit decides to change the display position of the information on the object to the outward direction of the road in a case that the category of the object according to the category information is relevant to a road sign.
  • the display control unit changes the display position of the information on the object to the outward direction of the road in a case that the object is a facility existing outside the road, and does not change the display position of the information on the object to the outward direction of the road in a case that the object is a facility existing inside the road. Thereby, it is possible to properly display information on a facility existing inside the road in accordance with the actual position of the facility.
  • a display method executed by a display device including: a present position information obtaining process for obtaining present position information of a moving body; and a display control process for displaying information on an object through a transparent member based on the present position information, the object being contained in scenery in traveling direction of the moving body, the transparent member being positioned between a viewpoint of a traveler in the moving body and the scenery, wherein in the display control process, a display position of the information on the object is changed to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the transparent member.
  • a display device including: a present position information obtaining unit configured to obtain present position information of a moving body; and an image obtaining unit configured to obtain an actual image of a travelling direction of the moving body; and a display control unit configured to display information on an object contained in the actual image on the actual image based on the present position information, wherein the display control unit changes a display position of the information on the object to an outward direction of a road contained in the actual image in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance.
  • a display device including: a display unit configured to let a traveler in a moving body visually recognize scenery in front of the moving body and information on an object contained in the scenery and a display control unit configured to change a display position of the information on the object to an outward direction of a road contained in the scenery in order to prevent the display position of the information on the object from being a position inside the road by at least a predetermined criterion distance.
  • the display device further includes a present position information obtaining unit configured to obtain present position information of the moving body, and the display control unit changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the display unit.
  • a display device including: a present position information obtaining unit configured to obtain present position information of a moving body; and a display control unit configured to display information on an object through a transparent member based on the present position information, the object being contained in scenery in traveling direction of the moving body, the scenery being seen through the transparent member, wherein the display control unit changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the transparent member.
  • FIG. 1 illustrates an example of the configuration of a system according to the embodiment.
  • the system includes a navigation device 1 and a head-up display 2.
  • the system is mounted on a vehicle.
  • the navigation device 1 has a guide function of a route from a departure place to a destination.
  • Examples of the navigation device 1 include a stationary navigation device set in a vehicle, a PND (Portable Navigation Device) and a cell phone such as a smart phone.
  • the head-up display 2 generates an image indicating information for assisting the driving operation such as map information indicating the present position, route guide information and a running speed, and lets the driver visually recognize the image as a virtual image from the position of the eye (eye point) of the driver.
  • the navigation device 1 supplies the head-up display 2 with various kinds of information used for the navigation processing such as the position of the vehicle, the running speed of the vehicle, map information and facility data.
  • the head-up display 2 is an example of "the display unit" in the present invention.
  • the navigation device 1 may be held by a cradle if the navigation device 1 is a cell phone such as a smart phone. In this case, the navigation device 1 may exchange the information with the head-up display 2 via the cradle.
  • FIG. 2 shows the configuration of the navigation device 1.
  • the navigation device 1 includes a stand-alone position measurement device 10, a GPS receiver 18, a system controller 20, a disc drive 31, a data storage unit 36, a communication interface 37, a communication device 38, a display unit 40, a sound output unit 50, and an input device 60.
  • the stand-alone position measurement device 10 includes an acceleration sensor 11, an angular velocity sensor 12 and a distance sensor 13.
  • the acceleration sensor 11 includes a piezoelectric element, for example, and detects the acceleration degree of the vehicle and outputs the acceleration data.
  • the angular velocity sensor 12 includes a vibration gyroscope, for example, and detects the angular velocity of the vehicle at the time of changing the direction of the vehicle and outputs the angular velocity data and the relative direction data.
  • the distance sensor 13 measures vehicle speed pulses including a pulse signal generated with the wheel rotation of the vehicle.
  • the GPS receiver 18 receives an electric wave 19 for transmitting downlink data including position measurement data from plural GPS satellites.
  • the position measurement data is used for detecting the absolute position (hereinafter referred to as "present position” or “own position”) of the vehicle from longitude and latitude information.
  • the system controller 20 includes an interface 21, a CPU (Central Processing Unit) 22, a ROM (Read Only Memory) 23 and a RAM (Random Access Memory) 24, and is configured to control the entire navigation device 1.
  • the interface 21 executes the interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13 and the GPS receiver 18. Then, the interface 21 inputs the vehicle speed pulse, the acceleration data, the relative direction data, the angular velocity data, the GPS measurement data and the absolute direction data into the system controller 20.
  • the CPU 22 controls the entire system controller 20.
  • the ROM 23 includes a non-volatile memory (not shown) in which a control program for controlling the system controller 20 is stored.
  • the RAM 24 readably stores various kinds of data such as route data preset by the user via the input device 60, and supplies a working area to the CPU 22.
  • the system controller 20, the disc drive 31 such as a CD-ROM drive or a DVD-ROM drive, the data storage unit 36, the communication interface 37, the display unit 40, the sound output unit 50 and the input device 60 are connected to each other via a bus line 30.
  • the disc drive 31 reads contents data such as sound data and video data from a disc 33 such as a CD and a DVD to output the contents data.
  • the disc drive 31 may be the CD-ROM drive or the DVD-ROM drive, or may be a drive compatible between the CD and the DVD.
  • the data storage unit 36 includes a HDD, for example, and stores various kinds of data used for a navigation process such as map data.
  • the communication device 38 includes an FM tuner or a beacon receiver, a mobile phone and a dedicated communication card for example, and obtains information (hereinafter referred to as "VICS information"; "VICS" is a registered trademark) delivered from a VICS (Vehicle Information Communication System) center by the electric wave 39.
  • the communication interface 37 executes the interface operation of the communication device 38 to input the VICS information into the system controller 20.
  • the communication device 38 sends information on the present position obtained from the GPS receiver 18 to the head-up display 2.
  • the display unit 40 displays various kinds of display data on a display screen of a display 44 under the control of the system controller 20.
  • the system controller 20 reads the map data from the data storage unit 36, and the display unit 40 displays, on its display screen, the map data read from the data storage unit 36 by the system controller 20.
  • the display unit 40 includes a graphic controller 41 for controlling the entire display unit 40 on the basis of the control data transmitted from the CPU 22 via the bus line 30, a buffer memory 42 having a memory such as a VRAM (Video RAM) for temporarily storing immediately displayable image information, a display control unit 43 for controlling a display 44 such as a liquid crystal and a CRT (Cathode Ray Tube) on the basis of the image data outputted from the graphic controller 41, and the display 44.
  • the display 44 is formed by a liquid crystal display device with a diagonal of 5 to 10 inches, and is mounted in the vicinity of a front panel of the vehicle.
  • the sound output unit 50 includes a D/A converter 51 for executing D/A (Digital to Analog) conversion of the sound digital data transmitted from the CD-ROM drive 31, a DVD-ROM 32 or the RAM 24 via the bus line 30 under the control of the system controller 20, an amplifier (AMP) 52 for amplifying a sound analog signal outputted from the D/A converter 51, and a speaker 53 for converting the amplified sound analog signal into the sound and outputting it to the vehicle compartment.
  • the input device 60 includes keys, switches, buttons, a remote controller and a sound input device, which are used for inputting various kinds of commands and data.
  • the input device 60 is arranged in the vicinity of the display 44 and a front panel of a main body of an on-vehicle electric system loaded on the vehicle. Additionally, in such a case that the display 44 is a touch panel type, a touch panel provided on the display screen of the display 44 also functions as the input device 60.
  • FIG. 3 illustrates the schematic configuration of the head-up display 2.
  • the head-up display 2 includes a light source unit 3, a camera 6 and a combiner 9, and is installed in a vehicle having a front window 25, a ceiling board 27, a hood 28 and a dashboard 29.
  • the light source unit 3 is provided on the ceiling board 27 in the vehicle interior through the support parts 5a and 5b, and emits the light for displaying an image illustrating driver assist information towards the combiner 9.
  • under the control of the control unit 4, the light source unit 3 generates an original image (real image) of the display image, and emits the light for displaying the image towards the combiner 9, thereby letting the driver visually recognize the virtual image "Iv" via the combiner 9.
  • the light source unit 3 has a display unit such as a laser light source or an LCD light source, and emits the light by means of such a display unit.
  • the display image emitted from the light source unit 3 is projected onto the combiner 9, and the combiner 9 shows the display image as the virtual image Iv by reflecting the display image towards the eye point "Pe" of the driver.
  • the combiner 9 has a support shaft 8 provided on the ceiling board 27 and rotates on the support shaft 8.
  • the support shaft 8 is provided on the ceiling board 27 near the top edge of the front window 25, i.e., near the position of a sun visor (not shown) for the driver. It is noted that the support shaft 8 may be provided in place of the above-mentioned sun visor.
  • the combiner 9 is an example of "the transparent member” and "the display unit” in the present invention.
  • the control unit 4 includes a CPU, a RAM and a ROM which are not shown, and controls the entire head-up display 2.
  • the control unit 4 is capable of communicating with the navigation device 1 and receives various kinds of information used for the navigation processing from the navigation device 1.
  • the control unit 4 receives present position information indicating the own position and information for display from the navigation device 1, and thereby performs the control of displaying the information while superposing the information on the scenery seen through the combiner 9.
  • the control unit 4 performs a control for the AR display.
  • the control unit 4 receives information on an object existing in the scenery seen through the combiner 9 and performs such a control that the virtual image corresponding to the information on the object is seen overlapping with the object in the scenery.
  • the control unit 4 is an example of "the present position information obtaining unit", "display control unit” and "object information obtaining unit” according to the present invention.
  • the camera 6 captures the scenery in the travelling direction of the vehicle. Concretely, the camera 6 captures the scenery within the display range of the virtual image Iv. Then, the camera 6 supplies the captured image to the control unit 4 in the light source unit 3.
  • an overview of the control performed by the control unit 4 in the light source unit 3 of the head-up display 2 will be described below.
  • the control unit 4 includes a roadway inside/outside distinction unit 4a, a rendering process unit 4b, a display category information database 4c, frame memories 4d and 4e, and a display output unit 4f.
  • the frame memory 4d is supplied with the captured image from the camera 6, and stores the captured image.
  • the captured image illustrates the scenery (i.e., scenery seen through the combiner 9 in the traveling direction of the vehicle) within the display range of the virtual image Iv.
  • the captured image stored on the frame memory 4d is supplied to the roadway inside/outside distinction unit 4a.
  • the roadway inside/outside distinction unit 4a performs a process (hereinafter referred to as “roadway inside/outside distinction") for distinguishing between the range corresponding to the inside of a roadway in the captured image and the range corresponding to the outside of a roadway in the captured image by applying image processing to the captured image. Then, the roadway inside/outside distinction unit 4a supplies the rendering process unit 4b with information on the range corresponding to the inside of the roadway in the captured image and the range corresponding to the outside of the roadway in the captured image.
  • the term "inside of the roadway" herein indicates the range of the road where the own vehicle is running.
  • when the road where the own vehicle is running includes a plurality of lanes, the term "inside of the roadway" indicates the range of the road including the plurality of lanes.
  • the term “outside of the roadway” herein indicates the range outside the above-mentioned range of the road. In other words, the term indicates the range other than the road where the own vehicle is running. It is noted that the term “inside of the roadway” may include not only the road where the own vehicle is running but also the road corresponding to the oncoming lane(s).
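
The description does not prescribe a particular image-processing algorithm for the roadway inside/outside distinction (any known approach can be applied, per the later description of step S102). Purely as an illustrative sketch under that freedom, the roadway range in the captured image can be approximated by a left and a right roadway line; the class names, the x = a*y + b parametrisation and the numeric values below are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: the roadway range in the captured image is
# approximated by two boundary lines (the left and right roadway lines),
# each expressed in image coordinates as x = a*y + b (y grows downward,
# i.e. toward the own vehicle). All names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class RoadwayLine:
    a: float  # slope of x with respect to the image row y
    b: float  # x value at y = 0

    def x_at(self, y: float) -> float:
        return self.a * y + self.b

@dataclass
class RoadwayRange:
    left: RoadwayLine
    right: RoadwayLine
    horizon_y: float  # rows above the horizon are treated as outside the roadway

    def is_inside(self, x: float, y: float) -> bool:
        """True if the image point (x, y) lies inside the roadway range."""
        if y < self.horizon_y:
            return False
        return self.left.x_at(y) <= x <= self.right.x_at(y)

# Example: a roadway that converges to a vanishing point at y = 200.
roadway = RoadwayRange(left=RoadwayLine(-0.8, 480.0),
                       right=RoadwayLine(0.8, 160.0),
                       horizon_y=200.0)
print(roadway.is_inside(320, 400))  # point near the image centre -> True
print(roadway.is_inside(40, 400))   # point near the left image edge -> False
```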
  • the rendering process unit 4b receives information on the range corresponding to the inside of the roadway in the captured image and the range corresponding to the outside of the roadway in the captured image from the roadway inside/outside distinction unit 4a.
  • the rendering process unit 4b also receives information on the present position of the vehicle (own position) and information to be displayed from the navigation device 1. Examples of the information to be displayed include navigation information, information on the driving operation and information on a nearby facility. Then, the rendering process unit 4b determines whether to display the target information inside the roadway or to display it outside the roadway.
  • the rendering process unit 4b makes such a determination by referring to the display category information database 4c including information (hereinafter referred to as "display category information") which associates each kind of information to be displayed with information on whether to display the information inside the roadway or to display it outside the roadway.
  • the display category information registered in the display category information database 4c is an example of "the category information on the category of the object" in the present invention.
  • the rendering process unit 4b generates an image whose display range overlaps with the range of the roadway when it determines, on the basis of the information on the range corresponding to the inside of the roadway and the range corresponding to the outside of the roadway supplied from the roadway inside/outside distinction unit 4a, that the information is to be displayed inside the roadway. In contrast, when the rendering process unit 4b determines that the information is to be displayed outside the roadway, it generates an image whose display position is outside the range of the roadway.
  • the rendering process unit 4b calculates the display position (hereinafter arbitrarily referred to as "first display position") of the target information based on the positional relationship between the position of the object corresponding to the target information to be displayed and the own position, and thereafter displays the target information on the first display position.
  • when the display position (first display position) calculated as to the information determined to be displayed outside the roadway exists within the roadway, the rendering process unit 4b performs a process for changing the display position of the information to the outward direction, i.e., a process ("outside roadway drawing process") for drawing the information into a position outside the range of the roadway.
  • the display position obtained by changing the first display position through the outside roadway drawing process is hereinafter referred to as the "second display position".
  • the rendering process unit 4b stores the image generated as mentioned above on the frame memory 4e.
  • the image stored on the frame memory 4e is supplied to the display output unit 4f, and the display output unit 4f outputs the image stored on the frame memory 4e to the display unit.
  • the rendering process unit 4b performs the control of superimposing and displaying the target information obtained from the navigation device 1 on the position corresponding to the information in the scenery seen through the combiner 9. Namely, the rendering process unit 4b performs the control for the AR display.
  • the rendering process unit 4b performs the control for the AR display.
  • it can be considered to calculate the display position (first display position) of the information based on the positional relationship between the own position and the position of the object corresponding to the information to be displayed and thereafter to display the information at that position.
  • in that case, however, the information could be displayed inside the roadway.
  • FIG. 5 illustrates an example corresponding to a case that information to be displayed outside the roadway is actually displayed inside the roadway.
  • FIG. 5 illustrates an example of an image seen through the combiner 9 by the driver.
  • in FIG. 5, information on a POI (Point of Interest) 70 is displayed.
  • POI indicates a facility or a place registered in the map data stored on the navigation device 1 or the head-up display 2, or the user-specified facility or the user-specified place specified out of them.
  • the POI indicates a nearby facility existing near a road (i.e., outside the road), and is treated as information to be displayed outside the roadway.
  • the display position of the POI 70 corresponds to the first display position calculated based on the positional relationship between the own position and the position of the facility corresponding to the POI 70. When the information to be displayed outside the roadway is actually displayed inside the roadway in this way, it could mislead the driver, and too much information could end up being displayed inside the roadway.
  • in the embodiment, as for each piece of information to be displayed, the rendering process unit 4b determines whether to display it inside the roadway or outside the roadway, and for the information determined to be displayed outside the roadway, it performs the control for displaying it within the range outside the roadway. In this case, when the information determined to be displayed outside the roadway would otherwise be displayed inside the roadway, the rendering process unit 4b performs the process (i.e., the outside roadway drawing process) for changing the display position of the information to the outward direction of the roadway.
  • the rendering process unit 4b performs the process for calculating the second display position obtained by shifting the first display position towards the outward direction of the roadway according to the outside roadway drawing process.
  • in contrast, when the first display position is not within the roadway, the rendering process unit 4b displays the information at the first display position.
  • the rendering process unit 4b determines whether to display the information inside the roadway or outside the roadway. Concretely, the rendering process unit 4b makes such a determination by referring to the display category information registered in the display category information database 4c, which stores, for each kind of information targeted for display, display category information indicating whether to display the information inside the roadway or outside the roadway.
  • Examples of information to be displayed inside the roadway include information directly associated with the driving operation, information associated with the actual road, and information for enhancing the safety of the driver. Examples categorized into the information to be displayed inside the roadway include route guide information (lane information), information on traffic regulations and traffic rules, information for displaying a direction sign in close-up, and information on the destination. On the other hand, examples of information to be displayed outside the roadway include additional information such as information directly irrelevant to the driving operation and information directly irrelevant to the running road (i.e., roadway). Examples categorized into the information to be displayed outside the roadway include a POI indicating a nearby facility.
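
As a minimal sketch of such a display category lookup under the categorization examples just listed (the dictionary keys and the default behaviour are illustrative assumptions, not the actual contents of the display category information database 4c):

```python
# Hypothetical display category information: each kind of information to be
# displayed is associated with whether it is shown inside or outside the roadway.
DISPLAY_CATEGORY_INFO = {
    "route_guide_lane":       "inside",   # route guide (lane) information
    "traffic_regulation":     "inside",   # traffic regulations and traffic rules
    "direction_sign_closeup": "inside",   # close-up of a direction sign
    "destination":            "inside",   # information on the destination
    "poi_nearby_facility":    "outside",  # POI indicating a nearby facility
}

def display_inside_roadway(category: str) -> bool:
    """Return True when the category is to be displayed inside the roadway.
    Unknown categories default to outside here; that default is an assumption."""
    return DISPLAY_CATEGORY_INFO.get(category, "outside") == "inside"

print(display_inside_roadway("poi_nearby_facility"))  # -> False (displayed outside)
```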
  • FIGS. 6A to 6C are drawings for explaining concrete examples of the outside roadway drawing process executed by the rendering process unit 4b.
  • FIGS. 6A to 6C each indicate an example of calculating the second display position through the outside roadway drawing process in a case that the first display position of the POI 80 indicating a nearby facility is inside the roadway 81.
  • the rendering process unit 4b firstly calculates the coordinates corresponding to the first display position of the POI 80 in the captured image and thereafter performs the above-mentioned outside roadway drawing process based on the coordinates and the range (range of coordinates) of the roadway 81 distinguished in the captured image by the roadway inside/outside distinction unit 4a.
  • hereinafter, the term "first display position" indicates not only the first display position calculated as mentioned before but also the coordinates corresponding to the first display position in the captured image.
  • the position of the bottom edge of the POI 80 is herein used as the first display position and the second display position.
  • FIG. 6A illustrates an example (hereinafter referred to as "first process") of the outside roadway drawing process.
  • FIG. 6A illustrates the first display position of the POI 80 on the left side and the second display position of the POI 80 after the first process (i.e., the outside roadway drawing process) on the right side.
  • the rendering process unit 4b draws the perpendicular line 84 from the first display position of the POI 80 to the roadway line 82 (i.e., the line 84 is perpendicular to the roadway line 82), and adopts the intersection point between the line 84 and the roadway line 82 as the second display position.
  • FIG. 6B illustrates another example (hereinafter referred to as "second process") of the outside roadway drawing process.
  • FIG. 6B illustrates the first display position of the POI 80 on the left side and the second display position of the POI 80 after the second process (i.e., the outside roadway drawing process) on the right side.
  • the rendering process unit 4b draws the horizontal line 85 (i.e., a straight line parallel to the horizon in the captured image) from the first display position of the POI 80, and adopts the intersection point between the horizontal line 85 and the roadway line 82 as the second display position.
  • FIG. 6C illustrates still another example (hereinafter referred to as "third process") of the outside roadway drawing process.
  • FIG. 6C illustrates the first display position of the POI 80 on the left side and the second display position of the POI 80 after the third process (i.e., the outside roadway drawing process) on the right side.
  • the rendering process unit 4b draws the straight line 86 extending vertically from the first display position of the POI 80 towards the ground, and adopts the intersection point between the straight line 86 and the roadway line 82 as the second display position.
  • the left drawing in FIG. 6C indicates the straight line 86 by the dashed line since the straight line 86 overlaps with the POI 80.
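
The three processes of FIGS. 6A to 6C all move the first display position onto the roadway line 82 along a particular line. The following hedged sketch implements that geometry for a roadway line written as x = a*y + b in image coordinates; the parametrisation, function names and numbers are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class RoadwayLine:
    # Roadway line 82 expressed in image coordinates as x = a*y + b.
    a: float
    b: float

def perpendicular_foot(line: RoadwayLine, x0: float, y0: float):
    """First process (FIG. 6A): foot of the perpendicular from (x0, y0) onto the roadway line."""
    t = (line.a * (x0 - line.b) + y0) / (line.a ** 2 + 1.0)  # parameter along the line
    return line.b + line.a * t, t

def horizontal_intersection(line: RoadwayLine, x0: float, y0: float):
    """Second process (FIG. 6B): intersection of the horizontal line through (x0, y0) with the roadway line."""
    return line.a * y0 + line.b, y0

def vertical_intersection(line: RoadwayLine, x0: float, y0: float):
    """Third process (FIG. 6C): intersection of the vertical straight line through (x0, y0) with the roadway line."""
    if abs(line.a) < 1e-9:
        raise ValueError("roadway line is vertical in the image; a vertical line through the point is parallel to it")
    return x0, (x0 - line.b) / line.a

# First display position of the POI 80, assumed to lie inside the roadway.
x0, y0 = 300.0, 360.0
line = RoadwayLine(a=0.8, b=160.0)  # illustrative right-hand roadway line
for process in (perpendicular_foot, horizontal_intersection, vertical_intersection):
    print(process.__name__, process(line, x0, y0))
```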
  • FIG. 7 is a flowchart indicating an example of the display control method according to the embodiment. This flowchart is repeatedly executed by the control unit 4 in the head-up display 2.
  • At step S101, the captured image captured by the camera 6 is stored on the frame memory 4d in the control unit 4. Then, the process goes to step S102.
  • At step S102, the roadway inside/outside distinction unit 4a in the control unit 4 makes the roadway inside/outside distinction by using the captured image. Namely, through image processing of the captured image, the roadway inside/outside distinction unit 4a distinguishes between the range corresponding to the inside of the roadway and the range corresponding to the outside of the roadway in the captured image. Any known approach can be applied to the image processing executed herein. Then, the process goes to step S103.
  • the rendering process unit 4b in the control unit 4 obtains information to be displayed from the navigation device 1.
  • the rendering process unit 4b obtains navigation information in accordance with the own position and information (hereinafter simply referred to as "POI information") on POI indicating a facility in the vicinity of the own position. Then, the process goes to step S104.
  • At step S104, the rendering process unit 4b in the control unit 4 determines whether or not the information obtained at step S103 is information to be displayed inside the roadway. Concretely, the rendering process unit 4b makes such a determination by referring to the display category information registered in the display category information database 4c. In this case, when the target information of the determination is categorized into information to be displayed inside the roadway according to the display category information, the rendering process unit 4b determines that the target information should be displayed inside the roadway. In contrast, when the target information of the determination is categorized into information to be displayed outside the roadway according to the display category information, the rendering process unit 4b determines that the information should not be displayed inside the roadway. For example, when the target information of the determination is POI information, the rendering process unit 4b determines that the information should not be displayed inside the roadway.
  • When the information is to be displayed inside the roadway (step S104: Yes), the process goes to step S105, and the rendering process unit 4b generates such an image (graphic) that the display range of the information obtained at step S103 is inside the roadway.
  • the rendering process unit 4b calculates the display position (first display position) of the information based on the positional relationship (e.g., relationship with respect to the latitude and the longitude or the bearing) between the position of the object corresponding to the information to be displayed and the own position, and thereafter displays the image corresponding to the information at the first display position. Then, the process goes to step S110.
  • In contrast, when the information obtained at step S103 is not to be displayed inside the roadway (step S104: No), i.e., when it is to be displayed outside the roadway, the process goes to step S106.
  • the following describes, as an example, the process for the case where the information to be displayed outside the roadway is POI information.
  • the rendering process unit 4b calculates the position of the POI in the captured image. Concretely, the rendering process unit 4b calculates the display position (first display position) of the POI based on the positional relationship (e.g., relationship with respect to the latitude and the longitude or the bearing) between the position of the facility corresponding to the POI and the own position, and thereafter calculates the position corresponding to the first display position in the captured image. Then, the process goes to step S107.
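
The calculation of the first display position from the positional relationship is not spelled out beyond latitude, longitude and bearing. As a rough, hedged sketch of one way such a projection could look (flat-earth approximation, simple pin-hole-style mapping; every constant and function name below is an assumption, not part of the disclosure):

```python
import math

EARTH_RADIUS_M = 6371000.0

def relative_bearing_and_distance(own_lat, own_lon, own_heading_deg, obj_lat, obj_lon):
    """Approximate bearing (degrees, relative to the vehicle heading) and distance (m)
    from the own position to the object, using a local flat-earth approximation."""
    north = math.radians(obj_lat - own_lat) * EARTH_RADIUS_M
    east = math.radians(obj_lon - own_lon) * EARTH_RADIUS_M * math.cos(math.radians(own_lat))
    distance = math.hypot(north, east)
    bearing = math.degrees(math.atan2(east, north)) - own_heading_deg
    bearing = (bearing + 180.0) % 360.0 - 180.0  # normalise to [-180, 180)
    return bearing, distance

def first_display_position(bearing_deg, distance_m, image_width=640, image_height=480,
                           horizontal_fov_deg=40.0, horizon_y=200.0, max_range_m=300.0):
    """Map the relative bearing/distance to image coordinates: the bearing sets the
    horizontal coordinate, the distance pushes the point toward the horizon line."""
    x = image_width / 2.0 + (bearing_deg / (horizontal_fov_deg / 2.0)) * (image_width / 2.0)
    y = image_height - (image_height - horizon_y) * min(1.0, distance_m / max_range_m)
    return x, y

bearing, distance = relative_bearing_and_distance(35.0000, 139.0000, 0.0, 35.0008, 139.0003)
print(first_display_position(bearing, distance))  # approximate (x, y) of the POI in the image
```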
  • the rendering process unit 4b determines whether or not to draw the POI information into the outside of the roadway, i.e., whether or not to perform the outside roadway drawing process. In this case, the rendering process unit 4b makes a determination at step S107 by determining whether or not the position of the POI in the captured image calculated at step S106 is within the range corresponding to the inside of the roadway distinguished in the captured image at step S102.
  • When the rendering process unit 4b determines to draw the POI information into the outside of the roadway (step S107: Yes), i.e., when the position of the POI in the captured image is inside the roadway, the process goes to step S108.
  • the rendering process unit 4b performs the outside roadway drawing process for changing the display position of the POI information to the outward direction of the roadway. Concretely, by performing any one of the above-mentioned first to third processes as illustrated in FIGS. 6A to 6C , the rendering process unit 4b calculates the second display position that is an outside position of the roadway shifted from the first display position. Then, the process goes to step S109.
  • When the POI information is not to be drawn into the outside of the roadway (step S107: No), i.e., when the position of the POI in the captured image is not inside the roadway, the process goes to step S109. In this case, the outside roadway drawing process executed at step S108 is not performed.
  • the rendering process unit 4b generates such an image (graphic) that the information obtained at step S103 is displayed outside the roadway.
  • when the outside roadway drawing process has not been performed, the rendering process unit 4b calculates the display position (first display position) of the POI information and displays the image corresponding to the POI information at the first display position.
  • the rendering process unit 4b displays the image corresponding to the POI information at the second display position calculated at step S108. Then, the process goes to step S110.
  • the rendering process unit 4b determines whether or not the process for all information to be displayed has finished. Namely, the rendering process unit 4b determines whether or not the process for each piece of information to be displayed which is obtained at step S103 has finished.
  • When the process for all information to be displayed has finished (step S110: Yes), the rendering process unit 4b generates final images for display and stores the images on the frame memory 4e (step S111). Then, the images stored on the frame memory 4e are supplied to the display output unit 4f and the display output unit 4f outputs the images stored on the frame memory 4e to the display unit (step S112). Thereafter, the process ends.
  • When the process for all information to be displayed has not finished (step S110: No), the process goes back to step S104. In this case, the process at and after step S104 is repeatedly executed.
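
Taken together, steps S101 to S112 amount to the control cycle sketched below. This is a hedged paraphrase of the flowchart in code form: the collaborator objects and method names are placeholders standing in for the camera 6, the roadway inside/outside distinction unit 4a, the rendering process unit 4b and the display output unit 4f, not identifiers from the patent.

```python
def display_control_cycle(camera, navigation, distinction_unit, renderer, display_output):
    """One cycle of the FIG. 7 flowchart (steps S101-S112), with placeholder collaborators."""
    captured = camera.capture()                                  # S101: store the captured image
    roadway_range = distinction_unit.distinguish(captured)       # S102: roadway inside/outside distinction
    items = navigation.information_to_display()                  # S103: obtain information to display

    graphics = []
    for item in items:                                           # S104-S110: process every item
        if renderer.display_inside_roadway(item):                # S104: category check
            graphics.append(renderer.draw_inside_roadway(item))  # S105: draw inside the roadway
            continue
        pos = renderer.first_display_position(item, captured)    # S106: position in the captured image
        if roadway_range.is_inside(*pos):                        # S107: would it appear inside the roadway?
            pos = renderer.outside_roadway_drawing(pos, roadway_range)  # S108: shift outward
        graphics.append(renderer.draw_outside_roadway(item, pos))       # S109: draw outside the roadway

    frame = renderer.compose(graphics)                           # S111: generate the final image
    display_output.output(frame)                                 # S112: output to the display unit
```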
  • the information is displayed either inside the roadway or outside the roadway depending on the category of the information to be displayed.
  • the information is categorized into information to be displayed inside the roadway and information to be displayed outside the roadway.
  • the display range of the head-up display 2 tends to be narrower than the display range of a typical navigation device.
  • according to the embodiment, it is also possible to display information in accordance with a priority depending on whether the display position is inside the roadway or outside the roadway. For example, when the amount of information to be displayed has increased at a certain traveling position, it is possible to restrict the display of the information according to the priority. For example, the information to be displayed inside the roadway is preferentially displayed since it is directly relevant to the driving operation. In contrast, as for the information to be displayed outside the roadway, the display is restricted since the information is secondary information.
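
A minimal sketch of such a priority-based restriction, assuming each display item is simply tagged with the side on which it is to be displayed (the data layout and the cut-off rule are assumptions):

```python
def restrict_by_priority(items, max_items):
    """Keep information to be displayed inside the roadway (directly relevant to
    driving) first, and drop information to be displayed outside the roadway
    (secondary information) once the display budget is exhausted."""
    inside = [item for item in items if item["side"] == "inside"]
    outside = [item for item in items if item["side"] == "outside"]
    return (inside + outside)[:max_items]

items = [{"name": "lane guide", "side": "inside"},
         {"name": "POI: nearby cafe", "side": "outside"},
         {"name": "speed limit", "side": "inside"}]
print(restrict_by_priority(items, 2))  # the POI is the first to be restricted
```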
  • when the information to be displayed outside the roadway is estimated to be displayed inside the roadway, the information is displayed at the second display position, i.e., a position shifted from the first display position by the outside roadway drawing process, instead of at the first display position calculated based on the positional relationship with respect to the own position.
  • in the embodiment, any one of the first to third processes is executed as the outside roadway drawing process; however, the approach to which the present invention can be applied is not limited to these processes.
  • the most suitable display position may be adopted out of the second display positions calculated through the first process to the third process.
  • the most nearest position to the actual position i.e., the position of the facility in the captured image
  • the facility corresponding to the POI can be adopted out of the second display positions calculated through the first process to the third process.
  • all the information categorized into the information to be displayed outside the roadway according to the display category information is displayed outside the roadway.
  • the information may be displayed inside the roadway even if the information is categorized into the information to be displayed outside the roadway. For example, if the information cannot be displayed at any position other than the inside of the roadway due to the degree of the curve of the road in the travelling direction, the information may be displayed inside the roadway in order to prioritize the display of the information.
  • the outside roadway drawing process is performed in the case that the information to be displayed outside the roadway is estimated to be displayed inside the roadway. Instead, in a case that a small fraction of the information to be displayed outside the roadway is inside the roadway, the outside roadway drawing process may not be executed. Concretely, in a case that the first display position of the information to be displayed outside the roadway is inside the roadway, if the distance between the first display position and the roadway line is shorter than a predetermined value, the information may be displayed at the first display position without executing the outside roadway drawing process.
  • the front window may be used as the transparent member in place of the combiner 9.
  • the light source unit 3 may be provided inside the dashboard 29 instead of the ceiling board 27 whereas the light source unit 3 is provided on the ceiling board 27 in the embodiment.
  • the present invention is applied to the system including the navigation device 1 and the head-up display 2 in which the camera 6 is incorporated, but the configuration to which the present invention can be applied is not limited to the configuration. Instead, the present invention can be applied to a device in which the navigation device, the head-up display and the camera are integrally configured. The present invention can be also applied to a system in which the navigation device, the head-up display and the camera are configured separately. The present invention can be also applied to a system including a head-up display and a navigation device in which the camera is incorporated.
  • the display control is performed based on the captured image captured by the camera 6 according to the embodiment
  • the display control may be performed without using the captured image.
  • the present invention can be also applied to a system or a device which does not have the camera 6. Concretely, in an alternative example, firstly it distinguishes between the range corresponding to the inside of the roadway and the range corresponding to the outside of the roadway in the display range of the head-up display 2 on the basis of information on the number of the lane and information on the width of a typical road, and thereafter performs the same control as the above-mentioned display control based on the distinguished ranges of the inside and the outside of the roadway.
  • the present invention can be also applied to a device which superimposes, on an actual image obtained by a camera capturing the scenery in the traveling direction of the vehicle, information on an object existing in the actual image .
  • the present invention can be applied to a smart phone which guides the user by the AR display using an actual image captured by the incorporated camera. Even if the present invention is applied to such a device, the above-mentioned outside roadway drawing process is performed in the same way at the time when the display position of the information on the object in accordance with the own position is inside the roadway of the road in the actual image.
  • the present invention can be applied not only to the head-up display 2 but also to a navigation device using a transparent display and a transparent head mounted display.
  • this invention can be applied to a head-up display and a navigation device (including a cell phone such as a smart phone).
  • a navigation device including a cell phone such as a smart phone.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Transportation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)

Description

    TECHNICAL FIELD
  • The present invention relates to a technology for displaying information.
  • BACKGROUND TECHNIQUE
  • Conventionally, regarding a navigation device, there is proposed a technique for displaying information on a nearby facility over a map image without superposing the information on any road in the map image. For example, Patent Reference-1 discloses a technique for making map data so that landmarks do not overlap with any road in the map data. Patent Reference-2 discloses the preamble of claim 1.
    • Patent Reference-1: Japanese Patent Application Laid-open under No. 2001-307121
    • Patent Reference-2: DE 10 2004 060380 A1
    DISCLOSURE OF INVENTION PROBLEM TO BE SOLVED BY THE INVENTION
  • In recent years, there is proposed a technique relating to AR (Augmented Reality) for displaying guide information and information on driving operation over the front scenery of a moving body by using a display device such as a head-up display. In case of such an AR display, it is preferable to display information on a nearby facility without superposing it on any road.
  • The above is an example of the problem to be solved by the present invention. An object of the present invention is to provide a display device and a display method thereof capable of appropriately displaying information on an object contained in the front scenery of the moving body without superposing it on any road.
  • MEANS FOR SOLVING THE PROBLEM
  • One invention is a display device according to the subject-matter of claim 1.
  • Another invention is a display method executed by a display device, according to the subject-matter of claim 7.
  • BRIEF DESCRIPTION OF THE DRAWINGS
    • FIG. 1 illustrates an example of the configuration of a system according to an embodiment.
    • FIG. 2 illustrates the schematic configuration of a navigation device.
    • FIG. 3 illustrates the schematic configuration of a head-up display.
    • FIG. 4 illustrates the schematic configuration of a control unit in a light source unit.
    • FIG. 5 illustrates an example corresponding to a case that information to be displayed outside a roadway is actually displayed inside the roadway.
    • FIGS. 6A to 6C are drawings for explaining concrete examples of an outside roadway drawing process.
    • FIG. 7 is a flowchart indicating an example of a display control method according to the embodiment.
    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • According to a preferable embodiment of the present invention, there is provided a display device including: a present position information obtaining unit configured to obtain present position information of a moving body; and a display control unit configured to display information on an object through a transparent member based on the present position information, the object being contained in scenery in traveling direction of the moving body, the transparent member being positioned between a viewpoint of a traveler in the moving body and the scenery, wherein the display control unit changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the transparent member.
  • Preferably, the display device is used for displaying information on an object contained in scenery through a transparent member based on the present position information of a moving body. The above-mentioned transparent member is positioned between a viewpoint of a traveler in the moving body and the scenery in the traveling direction of the moving body. Concretely, the display control unit in the display device changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, wherein the road is contained in the scenery seen through the transparent member. In other words, the display control unit adjusts the display position of the information on the object to a position outside the range of the road when the display position of the information on the object is positioned within the range of the road. Thereby, it is possible to properly display the information on the object contained in the scenery in front of the moving body without superposing it on the road. For example, it is possible to display information which is not supposed to be displayed inside the road properly on a position outside the road thereby to prevent the traveler from having misunderstandings.
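  • For illustration only, the core condition described above can be sketched as a small function. This is a minimal sketch assuming that the road at a given display row is represented by the x-coordinates of its left and right roadway lines; the data type, the criterion value and the shift strategy below are assumptions and are not taken from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class RoadwayRange:
    """Horizontal extent of the road at one display row (assumed model)."""
    left: float   # x-coordinate of the left roadway line
    right: float  # x-coordinate of the right roadway line

def adjust_display_position(x, roadway, criterion_distance=20.0):
    """Shift the display position in the outward direction of the road only
    when it lies inside the road by at least the criterion distance."""
    inside_by = min(x - roadway.left, roadway.right - x)
    if inside_by < criterion_distance:      # outside the road, or only barely inside
        return x
    # Move toward the nearer roadway line, i.e. the outward direction of the road.
    return roadway.left if x - roadway.left <= roadway.right - x else roadway.right

# Example: a position 150 px inside a road spanning 200-600 px is pushed to the boundary.
print(adjust_display_position(350.0, RoadwayRange(left=200.0, right=600.0)))  # 200.0
```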
  • In one mode of the display device, the display device further includes an object information obtaining unit configured to obtain category information on a category of the object, and the display control unit determines whether or not to change the display position of the information on the object based on the category information. Thereby, it is possible to display the information on the object inside or outside the road depending on the category of the object. Thus, it is possible to properly arrange the display information in the display range while letting the traveler easily recognize the category of the display information.
  • In another mode of the display device, the display control unit decides to change the display position of the information on the object to the outward direction of the road in a case that the category of the object according to the category information is relevant to a facility, and decides not to change the display position of the information on the object to the outward direction of the road in a case that the category of the object according to the category information is relevant to a road sign. Thereby, it is possible to display information on a facility outside the road while displaying information on a road sign inside the road.
  • In still another mode of the display device, the display control unit changes the display position of the information on the object to the outward direction of the road in a case that the object is a facility existing outside the road, and does not change the display position of the information on the object to the outward direction of the road in a case that the object is a facility existing inside the road. Thereby, it is possible to properly display information on a facility existing inside the road in accordance with the actual position of the facility.
  • According to another preferable embodiment of the present invention, there is provided a display method executed by a display device, including: a present position information obtaining process for obtaining present position information of a moving body; and a display control process for displaying information on an object through a transparent member based on the present position information, the object being contained in scenery in traveling direction of the moving body, the transparent member being positioned between a viewpoint of a traveler in the moving body and the scenery, wherein in the display control process, a display position of the information on the object is changed to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the transparent member.
  • According to still another preferable embodiment of the present invention, there is provided a display device including: a present position information obtaining unit configured to obtain present position information of a moving body; and an image obtaining unit configured to obtain an actual image of a travelling direction of the moving body; and a display control unit configured to display information on an object contained in the actual image on the actual image based on the present position information, wherein the display control unit changes a display position of the information on the object to an outward direction of a road contained in the actual image in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance.
  • According to still another preferable embodiment of the present invention, there is provided a display device including: a display unit configured to let a traveler in a moving body visually recognize scenery in front of the moving body and information on an object contained in the scenery and a display control unit configured to change a display position of the information on the object to an outward direction of a road contained in the scenery in order to prevent the display position of the information on the object from being a position inside the road by at least a predetermined criterion distance.
  • Preferably, the display device further includes a present position information obtaining unit configured to obtain present position information of the moving body, and the display control unit changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the display unit.
  • According to still another preferable embodiment of the present invention, there is provided a display device including: a present position information obtaining unit configured to obtain present position information of a moving body; and a display control unit configured to display information on an object through a transparent member based on the present position information, the object being contained in scenery in traveling direction of the moving body, the scenery being seen through the transparent member, wherein the display control unit changes a display position of the information on the object to an outward direction of a road in a case that the display position of the information on the object based on the present position information is inside the road by at least a predetermined criterion distance, the road being contained in the scenery seen through the transparent member.
  • EMBODIMENT
  • Now, a preferred embodiment of the present invention will be described below with reference to the attached drawings.
  • [System Configuration]
  • FIG. 1 illustrates an example of the configuration of a system according to the embodiment. As illustrated in FIG. 1, the system includes a navigation device 1 and a head-up display 2. The system is mounted on a vehicle.
  • The navigation device 1 has a guide function of a route from a departure place to a destination. Examples of the navigation device 1 include a stationary navigation device set in a vehicle, a PND (Portable Navigation Device) and a cell phone such as a smart phone.
  • The head-up display 2 generates an image indicating information for assisting the driving operation such as map information indicating the present position, route guide information and a running speed, and lets the driver visually recognize the image as a virtual image from the position of the eye (eye point) of the driver. The navigation device 1 supplies the head-up display 2 with various kinds of information used for the navigation processing such as the position of the vehicle, the running speed of the vehicle, map information and facility data. The head-up display 2 is an example of "the display unit" in the present invention.
  • It is noted that the navigation device 1 may be held by a cradle if the navigation device 1 is a cell phone such as a smart phone. In this case, the navigation device 1 may exchange the information with the head-up display 2 via the cradle.
  • [Configuration of Navigation Device]
  • FIG. 2 shows the configuration of the navigation device 1. As shown in FIG. 2, the navigation device 1 includes a stand-alone position measurement device 10, a GPS receiver 18, a system controller 20, a disc drive 31, a data storage unit 36, a communication interface 37, a communication device 38, a display unit 40, a sound output unit 50, and an input device 60.
  • The stand-alone position measurement device 10 includes an acceleration sensor 11, an angular velocity sensor 12 and a distance sensor 13. The acceleration sensor 11 includes a piezoelectric element, for example, and detects the acceleration degree of the vehicle and outputs the acceleration data. The angular velocity sensor 12 includes a vibration gyroscope, for example, and detects the angular velocity of the vehicle at the time of changing the direction of the vehicle and outputs the angular velocity data and the relative direction data. The distance sensor 13 measures vehicle speed pulses including a pulse signal generated with the wheel rotation of the vehicle.
  • The GPS receiver 18 receives an electric wave 19 for transmitting downlink data including position measurement data from plural GPS satellites. The position measurement data is used for detecting the absolute position (hereinafter referred to as "present position" or "own position") of the vehicle from longitude and latitude information.
  • The system controller 20 includes an interface 21, a CPU (Central Processing Unit) 22, a ROM (Read Only Memory) 23 and a RAM (Random Access Memory) 24, and is configured to control the entire navigation device 1.
  • The interface 21 executes the interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13 and the GPS receiver 18. Then, the interface 21 inputs the vehicle speed pulse, the acceleration data, the relative direction data, the angular velocity data, the GPS measurement data and the absolute direction data into the system controller 20. The CPU 22 controls the entire system controller 20. The ROM 23 includes a non-volatile memory (not shown) in which a control program for controlling the system controller 20 is stored. The RAM 24 readably stores various kinds of data such as route data preset by the user via the input device 60, and supplies a working area to the CPU 22.
  • The system controller 20, the disc drive 31 such as a CD-ROM drive or a DVD-ROM drive, the data storage unit 36, the communication interface 37, the display unit 40, the sound output unit 50 and the input device 60 are connected to each other via a bus line 30.
  • Under the control of the system controller 20, the disc drive 31 reads contents data such as sound data and video data from a disc 33 such as a CD and a DVD to output the contents data. The disc drive 31 may be the CD-ROM drive or the DVD-ROM drive, or may be a drive compatible between the CD and the DVD. The data storage unit 36 includes a HDD, for example, and stores various kinds of data used for a navigation process such as map data. The communication device 38 includes an FM tuner or a beacon receiver, a mobile phone and a dedicated communication card for example, and obtains information (hereinafter referred to as "VICS information"; "VICS" is a registered trademark) delivered from a VICS (Vehicle Information Communication System) center by the electric wave 39. The communication interface 37 executes the interface operation of the communication device 38 to input the VICS information into the system controller 20. The communication device 38 sends information on the present position obtained from the GPS receiver 18 to the head-up display 2.
  • The display unit 40 displays various kinds of display data on a display screen of a display 44 under the control of the system controller 20. Concretely, the system controller 20 reads the map data from the data storage unit 36, and the display unit 40 displays, on its display screen, the map data read from the data storage unit 36 by the system controller 20. The display unit 40 includes a graphic controller 41 for controlling the entire display unit 40 on the basis of the control data transmitted from the CPU 22 via the bus line 30, a buffer memory 42 having a memory such as a VRAM (Video RAM) for temporarily storing immediately displayable image information, a display control unit 43 for controlling a display 44 such as a liquid crystal display or a CRT (Cathode Ray Tube) on the basis of the image data outputted from the graphic controller 41, and the display 44. The display 44 is formed by a liquid crystal display device with a diagonal size of 5 to 10 inches, and is mounted in the vicinity of a front panel of the vehicle.
  • The sound output unit 50 includes a D/A converter 51 for executing D/A (Digital to Analog) conversion of the sound digital data transmitted from the CD-ROM drive 31, a DVD-ROM 32 or the RAM 24 via the bus line 30 under the control of the system controller 20, an amplifier (AMP) 52 for amplifying a sound analog signal outputted from the D/A converter 51, and a speaker 53 for converting the amplified sound analog signal into the sound and outputting it to the vehicle compartment.
  • The input device 60 includes keys, switches, buttons, a remote controller and a sound input device, which are used for inputting various kinds of commands and data. The input device 60 is arranged in the vicinity of the display 44 and a front panel of a main body of an on-vehicle electric system loaded on the vehicle. Additionally, in such a case that the display 44 is in a touch panel system, a touch panel provided on the display screen of the display 44 also functions as the input device 60.
  • [Configuration of Head-up Display]
  • FIG. 3 illustrates the schematic configuration of the head-up display 2. As illustrated in FIG. 3, the head-up display 2 according to the embodiment includes a light source unit 3, a camera 6 and a combiner 9, and is installed in a vehicle having a front window 25, a ceiling board 27, a hood 28 and a dashboard 29.
  • The light source unit 3 is provided on the ceiling board 27 in the vehicle interior through the support parts 5a and 5b, and emits the light for displaying an image illustrating driver assist information towards the combiner 9. In particular, under the control of the control unit 4, the light source unit 3 generates an original image (real image) of the display image in the light source unit 3, and emits the light for displaying the image towards the combiner 9 thereby to let the driver visually recognize the virtual image "Iv" via the combiner 9. For example, the light source unit 3 has a display unit such as a laser light source and a LCD light source, and emits the light by means of such a display unit.
  • The display image emitted from the light source unit 3 is projected onto the combiner 9, and the combiner 9 shows the display image as the virtual image Iv by reflecting the display image towards the eye point "Pe" of the driver. The combiner 9 has a support shaft 8 provided on the ceiling board 27 and rotates on the support shaft 8. The support shaft 8 is provided on the ceiling board 27 near the top edge of the front window 25, i.e., near the position of a sun visor (not shown) for the driver. It is noted that the support shaft 8 may be provided in place of the above-mentioned sun visor. The combiner 9 is an example of "the transparent member" and "the display unit" in the present invention.
  • The control unit 4 includes a CPU, a RAM and a ROM which are not shown, and controls the entire head-up display 2. The control unit 4 is capable of communicating with the navigation device 1 and receives various kinds of information used for the navigation processing from the navigation device 1. In particular, according to the embodiment, the control unit 4 receives present position information indicating the own position and information for display from the navigation device 1, and thereby performs the control of displaying the information while superposing the information on the scenery seen through the combiner 9. Namely, the control unit 4 performs a control for the AR display. Concretely, the control unit 4 receives information on an object existing in the scenery seen through the combiner 9 and performs such a control that the virtual image corresponding to the information on the object is seen overlapping with the object in the scenery. The control unit 4 is an example of "the present position information obtaining unit", "display control unit" and "object information obtaining unit" according to the present invention.
  • The camera 6 captures the scenery in the travelling direction of the vehicle. Concretely, the camera 6 captures the scenery within the display range of the virtual image Iv. Then, the camera 6 supplies the captured image to the control unit 4 in the light source unit 3.
  • [Configuration of Control Unit]
  • Next, with reference to FIG. 4, a description will be given of the configuration of the control unit 4 in the light source unit 3 in the head-up display 2. Hereinafter, an overview of the control performed by the control unit 4 will be described below.
  • As illustrated in FIG. 4, the control unit 4 includes a roadway inside/outside distinction unit 4a, a rendering process unit 4b, a display category information database 4c, frame memories 4d and 4e, and a display output unit 4f.
  • The frame memory 4d is supplied with the captured image from the camera 6, and stores the captured image. The captured image illustrates the scenery (i.e., scenery seen through the combiner 9 in the traveling direction of the vehicle) within the display range of the virtual image Iv. The captured image stored on the frame memory 4d is supplied to the roadway inside/outside distinction unit 4a.
  • The roadway inside/outside distinction unit 4a performs a process (hereinafter referred to as "roadway inside/outside distinction") for distinguishing between the range corresponding to the inside of a roadway in the captured image and the range corresponding to the outside of a roadway in the captured image by applying image processing to the captured image. Then, the roadway inside/outside distinction unit 4a supplies the rendering process unit 4b with information on the range corresponding to the inside of the roadway in the captured image and the range corresponding to the outside of the roadway in the captured image.
  • The term "inside of the roadway" herein indicates the range of the road where the own vehicle is running. In a case that the road where the own vehicle is running has a plurality of lanes, the term "inside of the roadway" indicates the range of the road including the plurality of lanes. On the other hand, the term "outside of the roadway" herein indicates the range outside the above-mentioned range of the road. In other words, the term indicates the range other than the road where the own vehicle is running. It is noted that the term "inside of the roadway" may include not only the road where the own vehicle is running but also the road corresponding to the oncoming lane(s).
  • The rendering process unit 4b receives information on the range corresponding to the inside of the roadway in the captured image and the range corresponding to the outside of the roadway in the captured image from the roadway inside/outside distinction unit 4a. The rendering process unit 4b also receives information on the present position of the vehicle (own position) and information to be displayed from the navigation device 1. Examples of the information to be displayed include navigation information, information on the driving operation and information on a nearby facility. Then, the rendering process unit 4b determines whether to display the target information inside the roadway or to display it outside the roadway. The rendering process unit 4b makes such a determination by referring to the display category information database 4c including information (hereinafter referred to as "display category information") which associates each kind of information to be displayed with information on whether to display the information inside the roadway or to display it outside the roadway. The display category information registered in the display category information database 4c is an example of "the category information on the category of the object" in the present invention.
  • The rendering process unit 4b generates an image whose display range overlaps with the range of the roadway when it determines, on the basis of the information on the range corresponding to the inside of the roadway and the range corresponding to the outside of the roadway supplied from the roadway inside/outside distinction unit 4a, that the information is to be displayed inside the roadway. In contrast, when the rendering process unit 4b determines that the information is to be displayed outside the roadway, it generates an image whose display position is outside the range of the roadway. Concretely, in order to realize the above-mentioned AR display, the rendering process unit 4b calculates the display position (hereinafter arbitrarily referred to as "first display position") of the target information based on the positional relationship between the position of the object corresponding to the target information to be displayed and the own position, and thereafter displays the target information on the first display position. In this case, when the first display position calculated as to the information determined to be displayed outside the roadway exists within the roadway, the rendering process unit 4b performs a process for changing the display position of the information to the outward direction, i.e., a process ("outside roadway drawing process") for drawing the information into a position outside the range of the roadway. Hereinafter, the display position after changing the first display position by the outside roadway drawing process is arbitrarily referred to as "second display position".
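  • The decision made by the rendering process unit 4b can be summarized by the following sketch. The function name and the way the roadway test and the drawing process are passed in are assumptions used only to restate the logic of the preceding paragraph.

```python
def choose_display_position(category, first_position,
                            is_inside_roadway, outside_roadway_drawing_process):
    """Return the position at which the information is finally drawn.

    category                               -- 'inside' or 'outside', from the display
                                              category information database 4c
    first_position                         -- position derived from the positional relationship
                                              between the object and the own position
    is_inside_roadway(pos)                 -- True when pos falls within the roadway range
    outside_roadway_drawing_process(pos)   -- computes the second display position
    """
    if category == "outside" and is_inside_roadway(first_position):
        return outside_roadway_drawing_process(first_position)  # second display position
    return first_position                                       # first display position
```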
  • Thereafter, the rendering process unit 4b stores the image generated as mentioned above on the frame memory 4e. The image stored on the frame memory 4e is supplied to the display output unit 4f, and the display output unit 4f outputs the image stored on the frame memory 4e to the display unit.
  • [Display Control Method]
  • Next, a description will be given of the display control method executed by the above-mentioned rendering process unit 4b in the control unit 4 according to the embodiment.
  • Basically, the rendering process unit 4b performs the control of superimposing and displaying the target information obtained from the navigation device 1 on the position corresponding to the information in the scenery seen through the combiner 9. Namely, the rendering process unit 4b performs the control for the AR display. As an approach to display the augmented reality, it can be considered to calculate the display position (first display position) of the information based on the positional relationship between the own position and the position of the object corresponding to the information to be displayed and thereafter to display the information on the position. However, merely according to this approach, the information could be displayed inside the roadway.
  • FIG. 5 illustrates an example corresponding to a case that information to be displayed outside the roadway is actually displayed inside the roadway. FIG. 5 illustrates an example of an image seen through the combiner 9 by the driver. Hereinafter, information on a POI (Point of Interest) corresponding to a nearby facility will be used as information to be displayed outside the roadway. The term "POI" herein indicates a facility or a place registered in the map data stored on the navigation device 1 or the head-up display 2, or the user-specified facility or the user-specified place specified out of them. Basically, the POI indicates a nearby facility existing near a road (i.e., outside the road), and is treated as information to be displayed outside the roadway.
  • According to the example illustrated in FIG. 5, there exist on the roadway 71 a mark indicating a POI, denoted by the symbol 70, and other displayed information related to the POI (hereinafter collectively referred to as "POI display" or simply "POI"). In this case, the display position of the POI 70 corresponds to the first display position calculated based on the positional relationship between the own position and the position of the facility corresponding to the POI 70. In this way, when the information to be displayed outside the roadway is actually displayed inside the roadway, it could cause misunderstandings to the driver and possibly too much information could be displayed inside the roadway.
  • According to the embodiment, in order to solve the above-mentioned problems, regarding the information to be displayed, the rendering process unit 4b determines whether to display it inside the roadway or to display it outside the roadway, and as for the information determined to be displayed outside the roadway, the rendering process unit 4b performs the control for displaying it within the range outside the roadway. In this case, when the information determined to be displayed outside the roadway is considered to be actually displayed inside the roadway, the rendering process unit 4b performs the process (i.e., outside roadway drawing process) for changing the display position of the information to the outward direction of the roadway. Concretely, in the case that the calculated first display position relating to the target information determined to be displayed outside the roadway is within the range corresponding to the inside of the roadway distinguished by the roadway inside/outside distinction unit 4a, the rendering process unit 4b performs the process for calculating the second display position obtained by shifting the first display position towards the outward direction of the roadway according to the outside roadway drawing process. In contrast, regarding the information determined to be displayed inside the roadway and the information which is to be displayed outside the roadway and actually displayed outside the roadway, the rendering process unit 4b displays the information at the first display position.
  • Here, on the basis of the category of the information to be displayed, the rendering process unit 4b determines whether to display the information inside the roadway or to display the information outside the roadway. Concretely, the rendering process unit 4b makes such a determination by referring to the display category information registered in the display category information database 4c. On the display category information database 4c, there is stored the display category information indicating whether to display the information inside the roadway or to display the information outside the roadway with respect to each kind of information targeted for display.
  • Examples of information to be displayed inside the roadway include information directly associated with the driving operation, information associated with the actual road, and information for enhancing the safety of the driver. Examples categorized into the information to be displayed inside the roadway include route guide information (lane information), information on traffic regulations and traffic rules, information for displaying a direction sign in close-up, and information on the destination. On the other hand, examples of information to be displayed outside the roadway include additional information such as information directly irrelevant to the driving operation and information directly irrelevant to the running road (i.e., roadway). Examples categorized into the information to be displayed outside the roadway include a POI indicating a nearby facility.
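  • A minimal sketch of the display category information might look like the table below; the keys and the default value are illustrative assumptions, and only the broad split given in the preceding paragraph is taken from the embodiment.

```python
# Illustrative display category information: inside vs. outside the roadway.
DISPLAY_CATEGORY_INFORMATION = {
    "route_guide_lane":       "inside",   # route guide (lane) information
    "traffic_regulation":     "inside",   # traffic regulations and traffic rules
    "direction_sign_closeup": "inside",   # close-up display of a direction sign
    "destination":            "inside",   # information on the destination
    "poi_nearby_facility":    "outside",  # POI indicating a nearby facility
}

def is_displayed_inside_roadway(kind):
    """Decide the display side from the category of the information.
    Unknown kinds default to 'outside' here (an assumption)."""
    return DISPLAY_CATEGORY_INFORMATION.get(kind, "outside") == "inside"

print(is_displayed_inside_roadway("route_guide_lane"))     # True
print(is_displayed_inside_roadway("poi_nearby_facility"))  # False
```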
  • Next, with reference to FIGS. 6A to 6C and FIG. 7, a description will be given of concrete examples of the display control method according to the embodiment.
  • FIGS. 6A to 6C are drawings for explaining concrete examples of the outside roadway drawing process executed by the rendering process unit 4b. FIGS. 6A to 6C each indicates an example for calculating the second display position through the outside roadway drawing process in a case that the first display position of the POI 80 indicating a nearby facility is inside the roadway 81. In this case, the rendering process unit 4b firstly calculates the coordinates corresponding to the first display position of the POI 80 in the captured image and thereafter performs the above-mentioned outside roadway drawing process based on the coordinates and the range (range of coordinates) of the roadway 81 distinguished in the captured image by the roadway inside/outside distinction unit 4a.
  • Hereinafter, the term "first display position" indicates not only the first display position calculated as mentioned before but also the coordinates corresponding to the first display position in the captured image. The position of the bottom edge of the POI 80 is herein used as the first display position and the second display position.
  • FIG. 6A illustrates an example (hereinafter referred to as "first process") of the outside roadway drawing process. FIG. 6A illustrates the first display position of the POI 80 on the left side and the second display position of the POI 80 after the first process (i.e., the outside roadway drawing process) on the right side. In the first process, the rendering process unit 4b draws the perpendicular line 84 from the first display position of the POI 80 to the roadway line 82, and adopts the contact point between the perpendicular line 84 and the roadway line 82 as the second display position.
  • FIG. 6B illustrates another example (hereinafter referred to as "second process") of the outside roadway drawing process. FIG. 6B illustrates the first display position of the POI 80 on the left side and the second display position of the POI 80 after the second process (i.e., the outside roadway drawing process) on the right side. In the second process, the rendering process unit 4b draws the horizontal line 85 (i.e., a straight line parallel to the horizon in the captured image) from the first display position of the POI 80, and adopts the contact point between the horizontal line 85 and the roadway line 82 as the second display position.
  • FIG. 6C illustrates still another example (hereinafter referred to as "third process") of the outside roadway drawing process. FIG. 6C illustrates the first display position of the POI 80 on the left side and the second display position of the POI 80 after the third process (i.e., the outside roadway drawing process) on the right side. In the third process, the rendering process unit 4b draws the straight line 86 vertically extending from the first display position of the POI 80 towards the ground, and adopts the contact point between the straight line 86 and the roadway line 82 as the second display position. The left drawing in FIG. 6C indicates the straight line 86 by the dashed line since the straight line 86 overlaps with the POI 80.
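  • The three processes of FIGS. 6A to 6C can be restated as simple line intersections. In the sketch below the roadway line 82 is modelled as a straight line through two image points; the coordinates, names and example values are assumptions, and which roadway line is intersected in practice follows from the figures rather than from this code.

```python
def perpendicular_foot(p, a, b):
    """First process: foot of the perpendicular dropped from p onto line a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)

def horizontal_intersection(p, a, b):
    """Second process: intersection of the horizontal line through p with line a-b."""
    ax, ay = a; bx, by = b
    t = (p[1] - ay) / (by - ay)   # assumes the roadway line is not horizontal
    return (ax + t * (bx - ax), p[1])

def vertical_intersection(p, a, b):
    """Third process: intersection of the vertical line through p with line a-b."""
    ax, ay = a; bx, by = b
    t = (p[0] - ax) / (bx - ax)   # assumes the roadway line is not vertical
    return (p[0], ay + t * (by - ay))

# Example: an assumed first display position of the POI 80 and roadway line 82.
poi_first = (300.0, 350.0)
roadway_line = ((320.0, 200.0), (40.0, 480.0))
print(perpendicular_foot(poi_first, *roadway_line))       # candidate per FIG. 6A
print(horizontal_intersection(poi_first, *roadway_line))  # candidate per FIG. 6B
print(vertical_intersection(poi_first, *roadway_line))    # candidate per FIG. 6C
```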
  • FIG. 7 is a flowchart indicating an example of the display control method according to the embodiment. This flowchart is repeatedly executed by the control unit 4 in the head-up display 2.
  • Firstly, at step S101, the captured image captured by the camera 6 is stored on the frame memory 4d in the control unit 4. Then, the process goes to step S102.
  • At step S102, the roadway inside/outside distinction unit 4a in the control unit 4 makes the roadway inside/outside distinction by using the captured image. Namely, through image processing for the captured image, the roadway inside/outside distinction unit 4a distinguishes between the range corresponding to the inside of the roadway in the captured image and the range corresponding to the outside of the roadway in the captured image. Any known approach can be applied to the image processing executed herein. Then, the process goes to step S103.
  • At step S103, the rendering process unit 4b in the control unit 4 obtains information to be displayed from the navigation device 1. For example, the rendering process unit 4b obtains navigation information in accordance with the own position and information (hereinafter simply referred to as "POI information") on POI indicating a facility in the vicinity of the own position. Then, the process goes to step S104.
  • At step S104, the rendering process unit 4b in the control unit 4 determines whether or not the information obtained at step S103 is information to be displayed inside the roadway. Concretely, the rendering process unit 4b makes such a determination by referring to the display category information registered in the display category information database 4c. In this case, when the target information of the determination is categorized into information to be displayed inside the roadway according to the display category information, the rendering process unit 4b determines that the target information should be displayed inside the roadway. In contrast, when the target information of the determination is categorized into information to be displayed outside the roadway according to the display category information, the rendering process unit 4b determines that the information should not be displayed inside the roadway. For example, when the target information of the determination is POI information, the rendering process unit 4b determines that the information should not be displayed inside the roadway.
  • When the information obtained at step S103 is to be displayed inside the roadway (step S104: Yes), the process goes to step S105. At step S105, the rendering process unit 4b generates such an image (graphic) that the display range of the information obtained at step S103 is inside the roadway. In this case, in order to realize the above-mentioned AR display, the rendering process unit 4b calculates the display position (first display position) of the information based on the positional relationship (e.g., relationship with respect to the latitude and the longitude or the bearing) between the position of the object corresponding to the information to be displayed and the own position, and thereafter displays the image corresponding to the information at the first display position. Then, the process goes to step S110.
  • In contrast, when the information obtained at step S103 is not to be displayed inside the roadway (step S104: No), i.e., it is to be displayed outside the roadway, the process goes to step S106. Hereinafter, a description will be given of the process at the time when the information to be displayed outside the roadway is POI information, for example.
  • At step S106, the rendering process unit 4b calculates the position of the POI in the captured image. Concretely, the rendering process unit 4b calculates the display position (first display position) of the POI based on the positional relationship (e.g., relationship with respect to the latitude and the longitude or the bearing) between the position of the facility corresponding to the POI and the own position, and thereafter calculates the position corresponding to the first display position in the captured image. Then, the process goes to step S107.
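  • The embodiment does not specify how the first display position is mapped into captured-image coordinates. One possible sketch, under the assumption of a flat-earth offset and a simple pinhole camera, is the following; the focal length, camera height, image size and helper name are illustrative assumptions and do not come from the embodiment.

```python
import math

def poi_to_image_coords(own_lat, own_lon, heading_deg, poi_lat, poi_lon,
                        focal_px=800.0, cam_height_m=1.3, img_w=640, img_h=480):
    """Rough projection of a facility position into captured-image pixels."""
    # Flat-earth offsets (metres) from the own position to the facility.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(own_lat))
    north = (poi_lat - own_lat) * m_per_deg_lat
    east = (poi_lon - own_lon) * m_per_deg_lon
    # Rotate into the vehicle frame: 'ahead' along the heading, 'right' across it.
    h = math.radians(heading_deg)
    ahead = north * math.cos(h) + east * math.sin(h)
    right = east * math.cos(h) - north * math.sin(h)
    if ahead <= 0:
        return None  # behind the vehicle; nothing to draw
    # Pinhole projection of a ground-level point onto the image plane.
    u = img_w / 2 + focal_px * right / ahead
    v = img_h / 2 + focal_px * cam_height_m / ahead
    return (u, v)
```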
  • At step S107, the rendering process unit 4b determines whether or not to draw the POI information into the outside of the roadway, i.e., whether or not to perform the outside roadway drawing process. In this case, the rendering process unit 4b makes a determination at step S107 by determining whether or not the position of the POI in the captured image calculated at step S106 is within the range corresponding to the inside of the roadway distinguished in the captured image at step S102.
  • When the rendering process unit 4b draws the POI information into the outside of the roadway (step S107: Yes), i.e., the position of the POI in the captured image is inside the roadway, the process goes to step S108. At step S108, the rendering process unit 4b performs the outside roadway drawing process for changing the display position of the POI information to the outward direction of the roadway. Concretely, by performing any one of the above-mentioned first to third processes as illustrated in FIGS. 6A to 6C, the rendering process unit 4b calculates the second display position that is an outside position of the roadway shifted from the first display position. Then, the process goes to step S109.
  • When the POI information is not drawn into the outside of the roadway (step S107: No), i.e., when the position of the POI in the captured image is not inside the roadway, the process goes to step S109. In this case, the outside roadway drawing process executed at step S108 is not performed.
  • At step S109, the rendering process unit 4b generates such an image (graphic) that the information obtained at step S103 is displayed outside the roadway. Concretely, as for the POI information whose display position is not inside the roadway, on the basis of the positional relationship (e.g., relationship with respect to the latitude and the longitude or the bearing) between the own position and the position of the facility corresponding to the POI information, the rendering process unit 4b calculates the display position (first display position) of the POI information thereby to display the image corresponding to the POI information at the first display position. In contrast, as for the POI information whose display position is inside the roadway, the rendering process unit 4b displays the image corresponding to the POI information at the second display position calculated at step S108. Then, the process goes to step S110.
  • At step S110, the rendering process unit 4b determines whether or not the process for all information to be displayed has finished. Namely, the rendering process unit 4b determines whether or not the process for each piece of information to be displayed which is obtained at step S103 has finished. When the process for all information to be displayed has finished (step S110: Yes), the rendering process unit 4b generates final images for display and stores the images on the frame memory 4e (step S111). Then, the images stored on the frame memory 4e are supplied to the display output unit 4f and the display output unit 4f outputs the images stored on the frame memory 4e to the display unit (step S112). Thereafter, the process ends. In contrast, when the process for all information to be displayed has not finished yet (step S110: No), the process goes to step S104. In this case, the process at and after step S104 is repeatedly executed.
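  • Read as a whole, steps S101 to S112 amount to the loop sketched below. The camera, the frame memories and the navigation link are replaced by plain arguments, and the helper callables stand in for the processes described above; this is a restatement for readability, not the actual implementation.

```python
def display_control_cycle(captured_image, items_to_display,
                          distinguish_roadway, is_inside_category,
                          first_display_position, position_in_image,
                          outside_roadway_drawing_process, draw):
    roadway_range = distinguish_roadway(captured_image)                # S101-S102
    graphics = []
    for item in items_to_display:                                      # S103, S110
        if is_inside_category(item):                                   # S104: Yes
            graphics.append(draw(item, first_display_position(item)))  # S105
            continue
        pos = position_in_image(first_display_position(item))          # S106
        if roadway_range.contains(pos):                                # S107: Yes
            pos = outside_roadway_drawing_process(pos, roadway_range)  # S108
        graphics.append(draw(item, pos))                               # S109
    return graphics                                                    # S111-S112: output
```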
  • [Effect of Embodiment]
  • As described above, according to the embodiment, in the case of the AR display, the information is displayed either inside the roadway or outside the roadway depending on the category of the information to be displayed. The information is categorized into information to be displayed inside the roadway and information to be displayed outside the roadway. Thereby, it is possible to properly arrange display information in the display range of the head-up display 2 and to let the driver easily recognize the category of the displayed information. It is noted that the display range of the head-up display 2 tends to be narrower than the display range of a typical navigation device. According to the embodiment, it is also possible to display the information, which is inappropriate for displaying inside the roadway, at an outside position of the roadway, and therefore misunderstandings to the driver can be suppressed.
  • According to the embodiment, it is also possible to display information in accordance with the priority depending on whether the display position is inside the roadway or outside the roadway. For example, when the amount of information to be displayed has increased at a certain traveling position, it is possible to restrict the display of the information according to the priority. For example, as for the information to be displayed inside the roadway, it is preferentially displayed since it is directly relevant to the driving operation. In contrast, as for the information to be displayed outside the roadway, the display is restricted since the information is secondary information.
  • Additionally, according to the embodiment, when the information to be displayed outside the roadway is estimated to be displayed inside the roadway, the information is displayed at the second display position that is a position shifted from the first display position by the outside roadway drawing process, instead of the first display position calculated based on the positional relationship with respect to the own position. Thereby, it is possible to effectively suppress the POI information from being displayed inside the roadway and enhance the display accuracy of the AR display.
  • [Modification]
  • Hereinafter, preferred modifications of the above-mentioned embodiment will be described below. Each modification mentioned later can be applied to the above-mentioned embodiment in combination.
  • (First Modification)
  • According to the above-mentioned embodiment, any one of the first process to the third process is executed as the outside roadway drawing process; however, the approach to which the present invention can be applied is not limited to this. In another example, after all of the first to third processes are executed, the most suitable display position may be adopted out of the second display positions calculated through the first to third processes. Concretely, the position nearest to the actual position (i.e., the position of the facility in the captured image) of the facility corresponding to the POI can be adopted out of the second display positions calculated through the first to third processes. Thereby, it is possible to further enhance the display accuracy of the AR display.
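  • A minimal sketch of this modification, reusing the three process functions sketched for FIGS. 6A to 6C as candidates; the function signature and the roadway-line representation are assumptions.

```python
import math

def best_second_position(first_pos, facility_image_pos, roadway_line, processes):
    """Run every outside roadway drawing process and keep the candidate second
    display position closest to the facility's actual position in the image."""
    candidates = [process(first_pos, *roadway_line) for process in processes]
    return min(candidates,
               key=lambda c: math.hypot(c[0] - facility_image_pos[0],
                                        c[1] - facility_image_pos[1]))

# e.g. best_second_position(poi_first, facility_pos, roadway_line,
#          [perpendicular_foot, horizontal_intersection, vertical_intersection])
```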
  • (Second Modification)
  • According to the above-mentioned embodiment, all the information categorized into the information to be displayed outside the roadway according to the display category information is displayed outside the roadway. Instead, in a case that the facility corresponding to the information exists inside the roadway, the information may be displayed inside the roadway even if the information is categorized into the information to be displayed outside the roadway. For example, if the information cannot be displayed at any position other than the inside of the roadway due to the degree of the curve of the road in the travelling direction, the information may be displayed inside the roadway in order to prioritize the display of the information.
  • (Third Modification)
  • According to the above-mentioned embodiment, the outside roadway drawing process is performed in the case that the information to be displayed outside the roadway is estimated to be displayed inside the roadway. Instead, in a case that the information to be displayed outside the roadway is only slightly inside the roadway, the outside roadway drawing process may not be executed. Concretely, in a case that the first display position of the information to be displayed outside the roadway is inside the roadway, if the distance between the first display position and the roadway line is shorter than a predetermined value, the information may be displayed at the first display position without executing the outside roadway drawing process.
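  • Sketched as a predicate, with the threshold value and helper names purely illustrative, this modification reads as follows.

```python
def needs_outside_roadway_drawing(first_pos, is_inside_roadway,
                                  distance_to_roadway_line, threshold_px=10.0):
    """Perform the outside roadway drawing process only when the first display
    position is inside the roadway AND not within the predetermined value
    (threshold) of the nearest roadway line."""
    if not is_inside_roadway(first_pos):
        return False
    return distance_to_roadway_line(first_pos) >= threshold_px
```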
  • (Fourth Modification)
  • Although the combiner 9 is used as the transparent member in the above-mentioned embodiment, the front window may be used as the transparent member in place of the combiner 9. The light source unit 3 may be provided inside the dashboard 29 instead of on the ceiling board 27, whereas the light source unit 3 is provided on the ceiling board 27 in the embodiment.
  • (Fifth Modification)
  • According to the above-mentioned embodiment, the present invention is applied to the system including the navigation device 1 and the head-up display 2 in which the camera 6 is incorporated, but the configuration to which the present invention can be applied is not limited to this configuration. Instead, the present invention can be applied to a device in which the navigation device, the head-up display and the camera are integrally configured. The present invention can also be applied to a system in which the navigation device, the head-up display and the camera are configured separately. The present invention can also be applied to a system including a head-up display and a navigation device in which the camera is incorporated.
  • (Sixth Modification)
  • Instead of the above-mentioned example in which the display control is performed based on the captured image captured by the camera 6, the display control may be performed without using the captured image. Namely, the present invention can also be applied to a system or a device which does not have the camera 6. Concretely, in an alternative example, the device firstly distinguishes between the range corresponding to the inside of the roadway and the range corresponding to the outside of the roadway within the display range of the head-up display 2 on the basis of information on the number of lanes and information on the width of a typical road, and thereafter performs the same control as the above-mentioned display control based on the distinguished ranges of the inside and the outside of the roadway.
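  • A camera-less sketch of this modification might estimate the roadway range from the number of lanes and a typical lane width, as below; the lane width, focal length, display width and the centred-vehicle assumption are illustrative and not part of the embodiment.

```python
def roadway_half_width_px(num_lanes, distance_ahead_m,
                          lane_width_m=3.5, focal_px=800.0):
    """Half-width of the roadway, in display pixels, at a given distance ahead."""
    return focal_px * (num_lanes * lane_width_m / 2.0) / distance_ahead_m

def roadway_range_at_row(num_lanes, distance_ahead_m, display_width_px=800):
    """Approximate [left, right] pixel range of the roadway at one display row,
    assuming the own vehicle is centred on the road."""
    half = roadway_half_width_px(num_lanes, distance_ahead_m)
    centre = display_width_px / 2.0
    return (centre - half, centre + half)

print(roadway_range_at_row(num_lanes=2, distance_ahead_m=30.0))  # ≈ (306.7, 493.3)
```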
  • (Seventh Modification)
  • The present invention can also be applied to a device which superimposes, on an actual image obtained by a camera capturing the scenery in the traveling direction of the vehicle, information on an object existing in the actual image. For example, the present invention can be applied to a smart phone which guides the user by the AR display using an actual image captured by the incorporated camera. Even if the present invention is applied to such a device, the above-mentioned outside roadway drawing process is performed in the same way at the time when the display position of the information on the object in accordance with the own position is inside the roadway of the road in the actual image.
  • (Eighth Modification)
  • The present invention can be applied not only to the head-up display 2 but also to a navigation device using a transparent display or to a transparent head-mounted display.
  • INDUSTRIAL APPLICABILITY
  • Preferably, this invention can be applied to a head-up display and a navigation device (including a cell phone such as a smart phone).
  • BRIEF DESCRIPTION OF REFERENCE NUMBERS
  • 1 Navigation device
  • 2 Head-up display
  • 3 Light source unit
  • 4 Control unit
  • 4a Roadway inside/outside distinction unit
  • 4b Rendering process unit
  • 4c Display category information database
  • 6 Camera
  • 9 Combiner

Claims (7)

  1. A display device (2) comprising:
    a position information obtaining unit (4b) configured to obtain position information of a moving body; and
    a display control unit (4) configured to display information on an object by superposing the information on scenery in front of the moving body or on an actual image of the scenery based on the position information, the object being contained in the scenery,
    the device being characterised in that the display control unit (4) changes a display position of the information on the object to an outward direction of a road (71) in a case that the display position of the information on the object based on the position information is inside the road (71), the road being contained in the scenery.
  2. The display device (2) according to claim 1,
    wherein the display control unit (4) displays the information through a transparent member (9) positioned between a viewpoint of a traveler in the moving body and the scenery, and
    wherein the display control unit (4) changes the display position of the information to the outward direction in a case that the display position of the information is inside the road (71) contained in the scenery seen through the transparent member (9).
  3. The display device (2) according to claim 1,
    wherein the display control unit (4) changes the display position of the information to the outward direction in a case that the display position of the information is inside the road (71) contained in the actual image of the scenery.
  4. The display device (2) according to any one of claims 1 to 3, further comprising
    an object information obtaining unit (4) configured to obtain category information on a category of the object,
    wherein the display control unit (4) determines whether or not to change the display position of the information on the object based on the category information.
  5. The display device (2) according to claim 4,
    wherein the display control unit (4) decides to change the display position of the information on the object to the outward direction of the road (71) in a case that the category of the object according to the category information is relevant to a facility, and decides not to change the display position of the information on the object to the outward direction of the road (71) in a case that the category of the object according to the category information is relevant to a road sign.
  6. The display device (2) according to any one of claims 1 to 5,
    wherein the display control unit (4) changes the display position of the information on the object to the outward direction of the road (71) in a case that the object is a facility existing outside the road (71), and does not change the display position of the information on the object to the outward direction of the road in a case that the object is a facility existing inside the road (71).
  7. A display method executed by a display device (2), comprising:
    a position information obtaining process for obtaining position information of a moving body; and
    a display control process for displaying information on an object by superposing the information on scenery in front of the moving body or on an actual image of the scenery based on the position information, the object being contained in the scenery,
    wherein in the display control process, a display position of the information on the object is changed to an outward direction of a road (71) in a case that the display position of the information on the object based on the position information is inside the road (71), the road (71) being contained in the scenery.
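The decision described in claims 4 to 6 can be illustrated by the following Python sketch; the names Category, ObjectInfo and decide_change, and the representation of the object's location as a single boolean, are assumptions made for this illustration only:

    # Illustrative sketch of the category-based decision: facility information is
    # moved outward unless the facility itself is inside the road; road-sign
    # information is never moved.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Category(Enum):
        FACILITY = auto()
        ROAD_SIGN = auto()

    @dataclass
    class ObjectInfo:
        category: Category
        inside_road: bool  # True if the object itself exists inside the road

    def decide_change(obj: ObjectInfo) -> bool:
        """Return True if the display position of the information should be
        changed to the outward direction of the road."""
        if obj.category is Category.ROAD_SIGN:
            return False                # road-sign information stays in place
        if obj.category is Category.FACILITY:
            return not obj.inside_road  # facilities inside the road stay in place
        return False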
EP11877396.9A 2011-12-15 2011-12-15 Display device and display method Active EP2793193B1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/079083 WO2013088557A1 (en) 2011-12-15 2011-12-15 Display device and display method

Publications (3)

Publication Number Publication Date
EP2793193A1 EP2793193A1 (en) 2014-10-22
EP2793193A4 EP2793193A4 (en) 2016-12-14
EP2793193B1 true EP2793193B1 (en) 2019-03-27

Family

ID=48612041

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11877396.9A Active EP2793193B1 (en) 2011-12-15 2011-12-15 Display device and display method

Country Status (3)

Country Link
EP (1) EP2793193B1 (en)
JP (1) JP5735658B2 (en)
WO (1) WO2013088557A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013088557A1 (en) 2011-12-15 2013-06-20 パイオニア株式会社 Display device and display method
JP5884816B2 (en) * 2013-12-16 2016-03-15 コニカミノルタ株式会社 Information display system having transmissive HMD and display control program
DE102015003948B4 (en) * 2015-03-26 2022-08-11 Audi Ag Method for operating virtual reality glasses and a virtual reality system arranged in a motor vehicle
JP6443236B2 (en) * 2015-06-16 2018-12-26 株式会社Jvcケンウッド Virtual image presentation system, image projection apparatus, and virtual image presentation method
EP3343178B1 (en) * 2016-12-27 2019-08-07 Volkswagen AG Driver assistance system, computer program product, signal sequence, vehicle and method for providing information to a user of a vehicle
JP7266257B2 (en) * 2017-06-30 2023-04-28 パナソニックIpマネジメント株式会社 DISPLAY SYSTEM AND METHOD OF CONTROLLING DISPLAY SYSTEM
JP7384014B2 (en) * 2019-12-06 2023-11-21 トヨタ自動車株式会社 display system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001307121A (en) 2000-02-14 2001-11-02 Matsushita Electric Ind Co Ltd Map information correction device and its method
US20040193331A1 (en) 2003-03-28 2004-09-30 Denso Corporation Display method and apparatus for changing display position based on external environment
US20050154505A1 (en) 2003-12-17 2005-07-14 Koji Nakamura Vehicle information display system
US20100226535A1 (en) 2009-03-05 2010-09-09 Microsoft Corporation Augmenting a field of view in connection with vision-tracking
US20100253602A1 (en) 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Dynamic vehicle system information on full windshield head-up display
WO2013088557A1 (en) 2011-12-15 2013-06-20 パイオニア株式会社 Display device and display method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3931339B2 (en) * 2003-09-30 2007-06-13 マツダ株式会社 Vehicle information providing device
JP4696248B2 (en) * 2004-09-28 2011-06-08 国立大学法人 熊本大学 MOBILE NAVIGATION INFORMATION DISPLAY METHOD AND MOBILE NAVIGATION INFORMATION DISPLAY DEVICE
JP2006162442A (en) * 2004-12-07 2006-06-22 Matsushita Electric Ind Co Ltd Navigation system and navigation method
JP2010066042A (en) * 2008-09-09 2010-03-25 Toshiba Corp Image irradiating system and image irradiating method
JP5346650B2 (en) * 2009-03-31 2013-11-20 株式会社エクォス・リサーチ Information display device
JP5555526B2 (en) * 2010-04-06 2014-07-23 東芝アルパイン・オートモティブテクノロジー株式会社 Vehicle display device
JP5728775B2 (en) * 2011-03-22 2015-06-03 株式会社Where Information processing apparatus and information processing method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001307121A (en) 2000-02-14 2001-11-02 Matsushita Electric Ind Co Ltd Map information correction device and its method
US20040193331A1 (en) 2003-03-28 2004-09-30 Denso Corporation Display method and apparatus for changing display position based on external environment
US20050154505A1 (en) 2003-12-17 2005-07-14 Koji Nakamura Vehicle information display system
DE102004060380A1 (en) 2003-12-17 2005-07-14 Denso Corp., Kariya Vehicle information display system
US20100226535A1 (en) 2009-03-05 2010-09-09 Microsoft Corporation Augmenting a field of view in connection with vision-tracking
US20100253602A1 (en) 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Dynamic vehicle system information on full windshield head-up display
WO2013088557A1 (en) 2011-12-15 2013-06-20 パイオニア株式会社 Display device and display method
EP2793193A1 (en) 2011-12-15 2014-10-22 Pioneer Corporation Display device and display method

Also Published As

Publication number Publication date
JPWO2013088557A1 (en) 2015-04-27
EP2793193A4 (en) 2016-12-14
WO2013088557A1 (en) 2013-06-20
JP5735658B2 (en) 2015-06-17
EP2793193A1 (en) 2014-10-22

Similar Documents

Publication Publication Date Title
EP2793193B1 (en) Display device and display method
US10147165B2 (en) Display device, control method, program and recording medium
EP2787324B1 (en) Display device and control method
JP5735657B2 (en) Display device and display method
JP5964332B2 (en) Image display device, image display method, and image display program
JP4975889B1 (en) Head-up display, control method, and display device
US20100023255A1 (en) Navigation apparatus, map display method and map display program
JP5545109B2 (en) Driving support device, information distribution device, driving support method, information distribution method, and computer program
JP2015128956A (en) Head-up display, control method, program and storage medium
JP2015172548A (en) Display control device, control method, program, and recording medium
US20150029214A1 (en) Display device, control method, program and storage medium
JP2001027535A (en) Map display device
WO2013046424A1 (en) Head-up display, control method, and display device
EP2923876B1 (en) Display device, control method, program, and storage medium
WO2015114807A1 (en) Virtual image display, control method, program, and storage medium
JP5702476B2 (en) Display device, control method, program, storage medium
WO2013088512A1 (en) Display device and display method
WO2013046426A1 (en) Head-up display, image display method, image display program, and display device
US20180339590A1 (en) Virtual image display device, control method, program, and recording medium
JP2011179854A (en) Device, method and program for map display
EP2982935B1 (en) Display device, control method, program, and storage medium
WO2013046425A1 (en) Head-up display, control method, and display device
WO2013046423A1 (en) Head-up display, control method, and display device
JP2018158725A (en) Head-up display, control method, program and storage medium
JP2013257269A (en) Display device, head-up display, control method, program, and storage medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140624

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20161115

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 19/00 20110101AFI20161109BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: PIONEER CORPORATION

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20171221

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20181018

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R084

Ref document number: 602011057653

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1113965

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190415

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602011057653

Country of ref document: DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: 746

Effective date: 20190625

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190627

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190627

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190628

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1113965

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190727

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

REG Reference to a national code

Ref country code: DE

Ref legal event code: R026

Ref document number: 602011057653

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190727

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PLBI Opposition filed

Free format text: ORIGINAL CODE: 0009260

PLAX Notice of opposition and request to file observation + time limit sent

Free format text: ORIGINAL CODE: EPIDOSNOBS2

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

26 Opposition filed

Opponent name: TOMTOM INTERNATIONAL B.V.

Effective date: 20191220

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PLBB Reply of patent proprietor to notice(s) of opposition received

Free format text: ORIGINAL CODE: EPIDOSNOBS3

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20191231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191215

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191215

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191231

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191231

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20111215

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

APBM Appeal reference recorded

Free format text: ORIGINAL CODE: EPIDOSNREFNO

APBP Date of receipt of notice of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA2O

APAH Appeal reference modified

Free format text: ORIGINAL CODE: EPIDOSCREFNO

APBQ Date of receipt of statement of grounds of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA3O

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190327

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231102

Year of fee payment: 13

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231108

Year of fee payment: 13

Ref country code: DE

Payment date: 20231031

Year of fee payment: 13