WO2013111302A1 - Display device, control method, program and storage medium - Google Patents

Display device, control method, program and storage medium

Info

Publication number
WO2013111302A1
Authority
WO
WIPO (PCT)
Prior art keywords
building
image
information
display device
camera
Prior art date
Application number
PCT/JP2012/051679
Other languages
English (en)
Japanese (ja)
Inventor
俊一 熊谷
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社 filed Critical パイオニア株式会社
Priority to PCT/JP2012/051679 priority Critical patent/WO2013111302A1/fr
Priority to JP2013555061A priority patent/JP5702476B2/ja
Priority to US14/374,232 priority patent/US20150029214A1/en
Publication of WO2013111302A1 publication Critical patent/WO2013111302A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G01C21/3638 Guidance using 3D or perspective road maps including 3D objects and buildings

Definitions

  • the present invention relates to a technique for displaying information.
  • Patent Document 1 discloses a technique for displaying navigation information (guidance information) in a superimposed manner on a landscape image in front of the vehicle.
  • in such a technique, a CG (Computer Graphics) display corresponding to a road or the like that is actually blocked by a building and is not visible is superimposed on the live-action image.
  • in that case, the sense of depth is lost by superimposing the CG image on the live-action image, and the user has difficulty grasping the sense of distance between the building and the road corresponding to the CG image.
  • the present invention has been made to solve the above-described problems, and its main purpose is to provide a display device capable of appropriately maintaining a sense of depth even when guidance information is superimposed and displayed on a live-action image, together with its control method and program.
  • one aspect of the invention is a display device that superimposes and displays guidance information on a live-action image photographed by a camera, comprising: specifying means for specifying an overlapping portion between the guidance information and a building in the live-action image, based on the photographing position, the position information and house shape information of the building existing in the photographing range of the camera, and the position information of the facility or road corresponding to the guidance information; and display control means for displaying the guidance information, excluding the shielded portion of the overlapping portion where the building should be displayed in front of the guidance information, superimposed on the live-action image.
  • the invention according to claim 8 is a display device that superimposes and displays guidance information on a live-action image taken by a camera, wherein the live-action image includes a first building image and a second building image that is an image of a building located farther from the camera than the first building image, and the display device has display control means that, when displaying the guidance information indicating a route between the first building image and the second building image in the live-action image, displays the guidance information on the camera side with respect to the second building image and displays the first building image on the camera side while blocking a part of the guidance information.
  • the invention according to claim 10 is a control method executed by a display device that superimposes and displays guidance information on a live-action image taken by a camera, the method including: a specifying step of specifying an overlapping portion between the guidance information and a building in the live-action image, based on the photographing position, the position information and house shape information of the building existing in the photographing range of the camera, and the position information of the facility or road corresponding to the guidance information; and a display control step of displaying the guidance information, excluding the shielded portion where the building should be displayed in front of the guidance information, superimposed on the live-action image.
  • the invention according to claim 11 is a display method for superimposing and displaying guidance information on a live-action image taken by a camera, wherein the live-action image includes a first building image and a second building image that is an image of a building located farther from the camera than the first building image, and wherein, when the guidance information indicating a route between the first building image and the second building image in the live-action image is displayed, the guidance information is displayed on the camera side with respect to the second building image, and the first building image is displayed on the camera side while blocking a part of the guidance information.
  • the invention according to claim 12 is a program executed by a display device that superimposes and displays guidance information on a live-action image photographed by a camera, the program causing the display device to function as: specifying means for specifying an overlapping portion between the guidance information and a building in the live-action image, based on the photographing position, the position information and house shape information of the building existing in the photographing range of the camera, and the position information of the facility or road corresponding to the guidance information; and display control means for displaying the guidance information, excluding the shielded portion where the building should be displayed in front of the guidance information, superimposed on the live-action image.
  • FIG. 3A is an example of a live-action image captured by the camera.
  • FIG. 3B is an example in which a guide route is superimposed and displayed on the live-action image. FIG. 4 is a display example according to a comparative example. FIG. 5 is a display example according to a modification.
  • in one embodiment of the present invention, a display device that superimposes and displays guidance information on a live-action image photographed by a camera includes: specifying means for specifying an overlapping portion between the guidance information and a building in the live-action image, based on the photographing position, the position information and house shape information of the building existing in the photographing range of the camera, and the position information of the facility or road corresponding to the guidance information; and display control means for displaying the guidance information, excluding the shielded portion of the overlapping portion where the building should be displayed in front of the guidance information, superimposed on the live-action image.
  • the display device is, for example, a navigation device, and displays guidance information superimposed on a real image taken by a camera.
  • the display device includes specifying means and display control means. the specifying means specifies the overlapping portion between the guidance information and the building in the live-action image, based on the shooting position, the position information and house shape information of the building existing in the shooting range of the camera, and the position information of the facility or road corresponding to the guidance information.
  • the display control means superimposes the guidance information on the live-action image and displays it, excluding the shielded portion of the overlapping portion where the building should be displayed in front of the guidance information.
  • in general, if guidance information corresponding to a road or facility that is actually shielded by a building and is not visible is displayed as it is, the sense of depth is lost.
  • in contrast, the above-described display device superimposes the guidance information on the live-action image while excluding the shielded portion where the building should be displayed in front of the guidance information. in this way, the display device can maintain a sense of depth even when the live-action image and the guidance information are superimposed.
  • in one aspect of the above display device, the specifying means specifies the overlapping portion by drawing the building substantially transparently based on the position information and house shape information of the building and drawing the guidance information based on the position information of the facility or road.
  • the display control means then generates a composite image combining the substantially transparently drawn building and the guidance information excluding the shielded portion, and superimposes the composite image on the live-action image.
  • the display device can preferably specify the overlapping portion between the photographed image and the guidance information.
  • further, because the display device draws the building substantially transparently, it can prevent the display of the building, drawn in order to exclude the shielded portion of the guidance information, from remaining visible when the composite image is superimposed on the live-action image.
  • in another aspect of the display device, when generating the composite image, the display control means draws the substantially transparent building prior to the guidance information, and then performs color mixing processing of the substantially transparently drawn building and the guidance information for the part of the overlapping portion other than the shielded portion.
  • the display device can preferably generate a composite image representing the guidance information excluding the shielding portion.
  • the display control unit displays a route on which the moving body should travel as being superimposed on the photographed image as the guidance information.
  • the display device can preferably hide the portion of the guide route to be displayed that is blocked by the building, and can maintain a sense of depth even when the guide information is superimposed on the photographed image.
  • in another aspect of the display device, the display control means displays the route vertically inverted at a position above the road corresponding to the route.
  • also in this aspect, the display device can suitably hide the portion of the guide route to be displayed that is blocked by a building, and can maintain a sense of depth even when the guidance information is superimposed on the live-action image.
  • in another aspect of the display device, the display control means displays, as the guidance information, a mark indicating a facility at the position corresponding to that facility in the live-action image, superimposed on the live-action image. according to this aspect, even for a mark representing a facility, the display device can suitably hide the portion shielded by a building existing in front of the facility.
  • in another aspect of the display device, the display control means excludes the shielded portion only for marks attached to facilities in the live-action image that serve as landmarks for route guidance. by doing so, the user can reliably grasp the facilities that serve as landmarks for route guidance.
  • in another embodiment of the present invention, a display device that displays guidance information superimposed on a live-action image photographed by a camera, wherein the live-action image includes a first building image and a second building image that is an image of a building located farther from the camera than the first building image, has display control means that, when displaying the guidance information indicating a route between the first building image and the second building image in the live-action image, displays the guidance information on the camera side with respect to the second building image and displays the first building image on the camera side while blocking part of the guidance information.
  • the display device can suitably maintain the sense of depth when the photographed image and the guidance information are superimposed.
  • the display control means causes the first building image to be displayed superimposed on the guidance information.
  • the display device can display the first building image in front of the guidance information, and can preferably maintain a feeling of depth.
  • in another embodiment of the present invention, a control method executed by a display device that superimposes and displays guidance information on a live-action image taken by a camera includes: a specifying step of specifying an overlapping portion between the guidance information and a building in the live-action image, based on the shooting position, the position information and house shape information of the building existing in the shooting range of the camera, and the position information of the facility or road corresponding to the guidance information; and a display control step of displaying the guidance information, excluding the shielded portion where the building should be displayed in front of the guidance information, superimposed on the live-action image.
  • the display device can favorably maintain the sense of depth even when the photographed image and the guidance information are superimposed.
  • in another embodiment of the present invention, in a display method for displaying guidance information superimposed on a live-action image taken by a camera, the live-action image includes a first building image and a second building image that is an image of a building located farther from the camera than the first building image, and when the guidance information indicating a route between the first building image and the second building image in the live-action image is displayed, the guidance information is displayed on the camera side with respect to the second building image, and the first building image is displayed on the camera side while blocking part of the guidance information.
  • in another embodiment of the present invention, a program executed by a display device that displays guidance information superimposed on a live-action image taken by a camera causes the display device to function as: specifying means for specifying an overlapping portion between the guidance information and a building in the live-action image, based on the shooting position, the position information and house shape information of the building existing in the shooting range of the camera, and the position information of the facility or road corresponding to the guidance information; and display control means for displaying the guidance information, excluding the shielded portion where the building should be displayed in front of the guidance information, superimposed on the live-action image.
  • the display device can preferably maintain a sense of depth even when the photographed image and the guidance information are superimposed.
  • the program is stored in a storage medium.
  • FIG. 1 shows the configuration of the navigation device 1.
  • the navigation device 1 includes a self-supporting positioning device 10, a GPS receiver 18, a system controller 20, a disk drive 31, a data storage unit 36, a communication interface 37, a communication device 38, an interface 39, and a display unit 40.
  • the navigation device 1 superimposes and displays, on the live-action image acquired from the camera 61, a guidance route for arriving at the destination set by the driver of the vehicle.
  • the self-supporting positioning device 10 includes an acceleration sensor 11, an angular velocity sensor 12, and a distance sensor 13.
  • the acceleration sensor 11 is made of, for example, a piezoelectric element, detects vehicle acceleration, and outputs acceleration data.
  • the angular velocity sensor 12 is composed of, for example, a vibrating gyroscope, detects the angular velocity of the vehicle when the direction of the vehicle is changed, and outputs angular velocity data and relative azimuth data.
  • the distance sensor 13 measures a vehicle speed pulse composed of a pulse signal generated with the rotation of the vehicle wheel.
  • the GPS receiver 18 receives radio waves 19 carrying downlink data including positioning data from a plurality of GPS satellites.
  • the positioning data is used to detect the absolute position of the vehicle (hereinafter also referred to as “current position”) from latitude and longitude information.
  • the system controller 20 includes an interface 21, a CPU (Central Processing Unit) 22, a ROM (Read Only Memory) 23, and a RAM (Random Access Memory) 24, and controls the entire navigation device 1.
  • the interface 21 performs an interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13, and the GPS receiver 18. From these, vehicle speed pulses, acceleration data, relative azimuth data, angular velocity data, GPS positioning data, absolute azimuth data, and the like are input to the system controller 20.
  • the CPU 22 controls the entire system controller 20.
  • the ROM 23 includes a nonvolatile memory (not shown) in which a control program for controlling the system controller 20 is stored.
  • the RAM 24 stores various data such as route data preset by the user via the input device 60 so as to be readable, and provides a working area to the CPU 22.
  • the system controller 20, a disk drive 31 such as a CD-ROM drive or a DVD-ROM drive, the data storage unit 36, the communication interface 37, the display unit 40, an audio output unit 50, and an input device 60 are connected to one another via a bus line 30.
  • the disk drive 31 reads and outputs content data such as music data and video data from a disk 33 such as a CD or DVD under the control of the system controller 20.
  • the disk drive 31 may be either a CD-ROM drive or a DVD-ROM drive, or may be a CD and DVD compatible drive.
  • the data storage unit 36 is configured by, for example, an HDD or the like, and stores various data used for navigation processing such as map data.
  • the map data includes road data and facility information.
  • the facility information includes, in addition to the name of each facility and the position information of each facility, information related to the shape of the building (so-called house shape information) when the facility is a building.
  • the house shape information includes, for example, information such as a building location range and height.
  • the house shape information is used for displaying a city map or the like as a CG image, and is also used for the depth determination between buildings and the guidance route described later.
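  • the patent does not specify a concrete data format for the house shape information; the following is a minimal Python sketch, under assumed names, of how a record holding a building's ground footprint and height might be represented and extruded into the 3D prism used for the depth determination described later.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HouseShape:
    """Hypothetical house-shape record: ground footprint plus building height."""
    footprint: List[Tuple[float, float]]  # building location range (x, y in metres)
    height: float                         # building height in metres

def extrude_to_prism(shape: HouseShape) -> List[Tuple[float, float, float]]:
    """Extrude the 2D footprint into the 3D vertices of a prism (base + roof)."""
    base = [(x, y, 0.0) for x, y in shape.footprint]
    roof = [(x, y, shape.height) for x, y in shape.footprint]
    return base + roof

# Example: a 10 m x 20 m building that is 15 m tall.
building = HouseShape(footprint=[(0, 0), (10, 0), (10, 20), (0, 20)], height=15.0)
print(extrude_to_prism(building))
```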
  • the communication device 38 includes, for example, an FM tuner, a beacon receiver, a mobile phone, a dedicated communication card, and the like, and acquires, from radio waves 39, information distributed from a VICS (Vehicle Information Communication System, registered trademark) center or the like (hereinafter referred to as "VICS information").
  • the communication interface 37 performs an interface operation for the communication device 38 and inputs the VICS information and the like to the system controller 20.
  • the display unit 40 displays various display data on a display device such as a display under the control of the system controller 20.
  • the system controller 20 reads map data from the data storage unit 36.
  • the display unit 40 displays the map data read from the data storage unit 36 by the system controller 20 on the display screen.
  • the display unit 40 includes: a graphic controller 41 that controls the entire display unit 40 based on control data sent from the CPU 22 via the bus line 30; a buffer memory 42, composed of a memory such as a VRAM (Video RAM), that temporarily stores image data that can be displayed immediately; a display control unit 43 that controls the display of a display 44, such as a liquid crystal display or a CRT (Cathode Ray Tube), based on image data output from the graphic controller 41; and the display 44.
  • the display 44 functions as an image display unit, and includes, for example, a liquid crystal display device having a diagonal size of about 5 to 10 inches and is mounted near the front panel in the vehicle.
  • the audio output unit 50 includes: a D/A converter 51 that performs D/A (Digital to Analog) conversion of audio digital data sent from the CD-ROM drive 31, the DVD-ROM 32, the RAM 24, or the like via the bus line 30 under the control of the system controller 20; an amplifier (AMP) 52 that amplifies the audio analog signal output from the D/A converter 51; and a speaker 53 that converts the amplified audio analog signal into sound and outputs it into the vehicle.
  • the input device 60 includes keys, switches, buttons, a remote controller, a voice input device, and the like for inputting various commands and data.
  • the input device 60 is disposed around the front panel and the display 44 of the main body of the in-vehicle electronic system mounted in the vehicle.
  • when the display 44 is of a touch panel type, the touch panel provided on the display screen of the display 44 also functions as the input device 60.
  • the camera 61 is an optical device that has a certain angle of view and photographs a subject within that angle of view.
  • the camera 61 is directed toward the front of the vehicle and is installed at a position from which the road on which the vehicle travels can be imaged. the camera 61 generates an image (referred to as a "live-action image") at predetermined intervals and supplies it to the system controller 20.
  • when superimposing the guidance route on the live-action image, the system controller 20 displays the guidance route excluding the portion (also referred to as the "shielded portion") that is shielded by a building in front of the guidance route and is therefore not visible to the driver.
  • thereby, the system controller 20 maintains the sense of depth even when the guidance route is superimposed on the live-action image, and allows the driver to accurately grasp the sense of distance.
  • FIG. 2 is an example of a flowchart illustrating a processing procedure according to the present embodiment.
  • the system controller 20 repeatedly executes the processing of the flowchart shown in FIG. 2 every time a photographed image is received from the camera 61, for example.
  • the processing procedure shown in FIG. 2 conforms to the processing procedure of general computer graphics software such as OpenGL (registered trademark) and DirectX (registered trademark), and can be suitably executed using such software.
  • here, "mixing (blending) processing" refers to processing for mixing the color of the pixel to be drawn with the color of the pixel that has already been drawn.
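  • the patent only defines the blending as a colour mixture of the incoming pixel with the pixel already drawn; a standard "source over destination" alpha blend is one common way to realise this, shown below as an illustrative sketch (the formula is an assumption, not taken from the source).

```python
def blend_over(src_rgb, src_alpha, dst_rgb):
    """Mix the colour of the pixel to be drawn (src) with the pixel already
    drawn (dst) using a conventional alpha 'over' blend."""
    return tuple(src_alpha * s + (1.0 - src_alpha) * d
                 for s, d in zip(src_rgb, dst_rgb))

# A fully transparent building pixel leaves the underlying pixel unchanged,
# which is why drawing transparent building polygons never shows up visually:
print(blend_over((80, 80, 80), 0.0, (200, 180, 160)))   # (200.0, 180.0, 160.0)
# An opaque guide-route pixel simply replaces the underlying pixel:
print(blend_over((0, 120, 255), 1.0, (200, 180, 160)))  # (0.0, 120.0, 255.0)
```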
  • the system controller 20 reads house shape information and the like from the data storage unit 36 (step S101). Specifically, first, the system controller 20 specifies the shooting range of the camera 61 based on the current position recognized by the GPS receiver 18. At this time, for example, the system controller 20 specifies a predetermined range as the imaging range from the current position toward the traveling direction of the vehicle. The predetermined range is determined in advance in consideration of, for example, the installation position, installation direction, angle of view, and the like of the camera 61. Next, the system controller 20 refers to the map data to identify a building that exists within the shooting range, and reads the house shape information and position information of the identified building from the map data.
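  • as an illustration of step S101, the sketch below models the shooting range as a simple distance plus field-of-view cone ahead of the current position and filters the map-data buildings accordingly; the range model, thresholds, and record layout are assumptions made for the example, not values from the source.

```python
import math

def in_shooting_range(current_pos, heading_deg, building_pos,
                      max_dist=200.0, half_fov_deg=30.0):
    """Return True if a building position falls inside an assumed
    distance / field-of-view approximation of the camera's shooting range."""
    dx = building_pos[0] - current_pos[0]
    dy = building_pos[1] - current_pos[1]
    if math.hypot(dx, dy) > max_dist:
        return False
    bearing = math.degrees(math.atan2(dx, dy))              # 0 deg = +y direction
    diff = (bearing - heading_deg + 180.0) % 360.0 - 180.0   # signed angle difference
    return abs(diff) <= half_fov_deg

def buildings_in_range(current_pos, heading_deg, building_records):
    """Filter map-data records (each assumed to carry a 'position' key)."""
    return [b for b in building_records
            if in_shooting_range(current_pos, heading_deg, b["position"])]

# A building straight ahead is kept; one behind the vehicle is discarded.
records = [{"name": "ahead", "position": (10, 100)},
           {"name": "behind", "position": (0, -50)}]
print(buildings_in_range((0, 0), 0.0, records))   # only the "ahead" record
```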
  • next, the system controller 20 draws transparent polygons representing the buildings in a three-dimensional coordinate space, and then draws the guidance route (step S102). specifically, the system controller 20 first generates a three-dimensional coordinate space corresponding to the shooting range with the own vehicle position as the viewpoint, and arranges the polygons generated from the house shape information in that space. thereafter, the system controller 20 draws the guidance route at a position overlapping the road on the guidance route. in other words, the system controller 20 draws the guidance route at the same depth as the road corresponding to it, so that whether the route lies in front of or behind any given building is determined in the same way as for the road itself.
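  • the sketch below illustrates the idea of step S102 under assumed data structures: the building prisms are given a fully transparent colour, and the guide route is expanded into a ribbon lying on its road (z = 0) so that it shares the road's depth; the ribbon construction is deliberately simplified.

```python
TRANSPARENT = (0, 0, 0, 0.0)      # RGBA with alpha 0: invisible after blending
ROUTE_COLOR = (0, 120, 255, 0.9)  # semi-opaque colour for the guide route

def build_scene(building_prisms, route_centerline, road_width):
    """Assemble a draw list: transparent building prisms first, then a
    guide-route ribbon drawn at ground level (same depth as its road)."""
    scene = [{"vertices": prism, "color": TRANSPARENT}
             for prism in building_prisms]
    half = road_width / 2.0
    # Simplified ribbon: offset the centerline sideways in x only.  A real
    # implementation would offset perpendicular to each segment.
    left = [(x - half, y, 0.0) for x, y in route_centerline]
    right = [(x + half, y, 0.0) for x, y in reversed(route_centerline)]
    scene.append({"vertices": left + right, "color": ROUTE_COLOR})
    return scene

# One building prism (as produced by extrude_to_prism) and a straight route.
prism = [(20, 30, 0), (30, 30, 0), (30, 50, 0), (20, 50, 0),
         (20, 30, 15), (30, 30, 15), (30, 50, 15), (20, 50, 15)]
print(len(build_scene([prism], [(0, 0), (0, 100)], road_width=6.0)))  # 2 items
```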
  • next, the system controller 20 performs processing (i.e., rasterization) for converting the buildings and the guidance route drawn in the three-dimensional coordinate space into position information and color information in units of pixels (step S103). specifically, the system controller 20 generates a raster image in which the three-dimensional coordinate space containing the building polygons and the guidance route is projected from the position of the camera 61 in the shooting direction. at this time, for the parts where a building polygon and the guidance route overlap, the system controller 20 first draws the building, and then judges in the depth determination process of step S104, described later, whether the guidance route should additionally be drawn. if the depth determination process determines that the guidance route should be drawn, the system controller 20 performs the mixing process of step S105.
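  • a minimal sketch of the projection underlying step S103, assuming a pinhole camera model: it converts a point given in camera coordinates into pixel coordinates while keeping its depth, which is the per-pixel information the later depth test needs (focal length and image size are illustrative values).

```python
def project_point(point_cam, focal_px=800.0, width=1280, height=720):
    """Project a camera-space point (x right, y up, z forward) to pixel
    coordinates, returning (u, v, depth) or None if it cannot be rasterized."""
    x, y, z = point_cam
    if z <= 0.0:
        return None                        # behind the camera
    u = int(width / 2 + focal_px * x / z)
    v = int(height / 2 - focal_px * y / z)
    if 0 <= u < width and 0 <= v < height:
        return (u, v, z)                   # z is kept as the pixel's depth
    return None

# A point 2 m to the right of the optical axis and 20 m ahead lands to the
# right of the image centre; a point behind the camera is discarded.
print(project_point((2.0, 0.0, 20.0)))   # (720, 360, 20.0)
print(project_point((0.0, 0.0, -5.0)))   # None
```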
  • next, the system controller 20 performs a depth determination process (a so-called depth test) for determining which portions of the guidance route are to be drawn and which are not (step S104). specifically, the system controller 20 determines, for each rasterized pixel, whether the display of a building and the display of the guidance route overlap. when they do not overlap, the system controller 20 sets the guidance route as the drawing target for that pixel.
  • when the display of a building and the display of the guidance route do overlap, the system controller 20 determines whether the building is behind the guidance route. when the building is behind the guidance route, the system controller 20 sets the guidance route as the drawing target for that pixel. on the other hand, when the building is in front of the guidance route, the system controller 20 excludes the guidance route from the drawing target for that pixel. as a result, the portion of the guidance route shielded by a building is excluded from the drawing target.
  • by performing the depth determination process before the mixing process of step S105 described later, the system controller 20 can suitably exclude from the drawing target whatever lies behind a building polygon, even though that polygon is drawn transparently.
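  • the following sketch shows the per-pixel depth determination of step S104 under assumed structures: a depth buffer filled when the (transparent) building polygons were rasterized first, against which each guide-route fragment is tested; only fragments that are not behind a nearer building remain drawing targets.

```python
import math

def depth_test(building_depth, route_fragments):
    """Keep only the guide-route fragments not shielded by a building.

    building_depth maps (u, v) pixels to the nearest building depth at that
    pixel (absent pixels mean no building was rasterized there); each route
    fragment is a (u, v, depth, color) tuple."""
    visible = []
    for u, v, depth, color in route_fragments:
        nearest = building_depth.get((u, v), math.inf)
        if depth <= nearest:          # building is behind the route, or absent
            visible.append((u, v, depth, color))
        # otherwise the building is in front: this pixel of the route is shielded
    return visible

buffer = {(100, 200): 15.0}                                     # building 15 m away
fragments = [(100, 200, 30.0, "route"), (101, 200, 30.0, "route")]
print(depth_test(buffer, fragments))   # only the (101, 200, ...) fragment survives
```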
  • next, the system controller 20 performs the mixing (blending) process (step S105). specifically, the system controller 20 performs the mixing process of the drawing-target portion of the guidance route, designated as such in the depth determination process, onto the image on which the transparent buildings have already been drawn. since the display color of the buildings is designated as transparent, the system controller 20 as a result generates an image in which only the drawing-target portion of the guidance route is displayed.
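  • as an illustration of step S105, the sketch below builds the composite CG layer: the layer starts fully transparent, the transparent building pixels leave it unchanged, and only the route fragments that passed the depth test are blended in (the sparse-image representation and names are assumptions).

```python
def compose_cg_layer(visible_route_fragments, route_rgba=(0, 120, 255, 230)):
    """Produce the CG layer of step S105 as a sparse {(u, v): RGBA} image.

    Pixels covered only by transparent building polygons stay absent
    (i.e. transparent), so only the unshielded route ends up visible."""
    layer = {}
    for u, v, _depth, _color in visible_route_fragments:
        layer[(u, v)] = route_rgba
    return layer

# The shielded fragment at (100, 200) was dropped by the depth test, so the
# composite contains only the visible route pixel.
print(compose_cg_layer([(101, 200, 30.0, "route")]))
# {(101, 200): (0, 120, 255, 230)}
```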
  • finally, the system controller 20 superimposes the CG image (composite image) obtained by the mixing process on the live-action image and displays it on the display 44 (step S106).
  • in other words, the system controller 20 displays the CG image on the display 44 with the live-action image as the background image.
  • in this way, the system controller 20 can suitably hide the portion of the guidance route that is blocked by a building in front of it, maintains a sense of depth even when the CG image and the live-action image are superimposed, and allows the driver to accurately grasp the sense of distance.
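  • step S106 amounts to alpha-compositing the CG layer over the camera frame used as the background; a minimal sketch of that compositing is shown below (the frame representation and names are illustrative assumptions).

```python
def composite_over_frame(frame_pixels, cg_layer):
    """Overlay a sparse RGBA CG layer onto a {(u, v): (r, g, b)} camera frame.

    Where the CG layer is transparent (no entry), the live-action pixel shows
    through unchanged; where the route was drawn, its colour is blended in."""
    out = dict(frame_pixels)
    for (u, v), (r, g, b, a) in cg_layer.items():
        if (u, v) in out:
            alpha = a / 255.0
            fr, fg, fb = out[(u, v)]
            out[(u, v)] = (round(alpha * r + (1 - alpha) * fr),
                           round(alpha * g + (1 - alpha) * fg),
                           round(alpha * b + (1 - alpha) * fb))
    return out

frame = {(100, 200): (180, 170, 160), (101, 200): (180, 170, 160)}
cg = {(101, 200): (0, 120, 255, 230)}
print(composite_over_frame(frame, cg))
# pixel (100, 200) is untouched; pixel (101, 200) now shows the route colour
```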
  • FIG. 3A shows a real image captured by the camera 61 while the vehicle is traveling
  • FIG. 3B shows an image obtained by superimposing a guide route 46 that is a CG image on the real image.
  • the system controller 20 displays on the display 44 a guide route 46 indicating that the vehicle should turn left at the intersection 47. Specifically, the system controller 20 draws a curve having a thickness corresponding to the road width as a guide route 46 at a position overlapping with the road to be traveled. At this time, the system controller 20 excludes the shielding portions by the polygons of the buildings 45A to 45C according to the processing shown in FIG. 2 based on the house shape information and position information of the buildings 45A to 45C, the position information of the road 48, and the like. The guide route 46 is superimposed on the photographed image.
  • the road 48 on the guide route on which the vehicle travels after passing through the intersection 47 exists behind the buildings 45A to 45C with respect to the viewpoint of the camera 61.
  • a part of the road 48 is hidden by the buildings 45A to 45C and is not displayed.
  • likewise, the portions of the guide route 46 that overlap the buildings 45A to 45C are not displayed, just as the corresponding part of the road 48 is hidden in the live-action image.
  • thereby, the user can easily grasp that the vehicle should turn left onto the road behind the building 45A and the like, and can accurately grasp the positional relationship between the road to be turned onto and the adjacent buildings.
  • FIG. 4 shows a display according to a comparative example in which a CG image of a guide route is generated and superimposed on a live-action image without considering the positional relationship between the building and the guide route.
  • in the example of FIG. 4, parts of the buildings 45A to 45C are covered by the guide route 46x.
  • in this case, the user cannot intuitively grasp whether the left-turn road 48B exists on the near side or the far side of the building 45A and the like.
  • as a result, the sense of depth is lost when the CG image and the live-action image are superimposed.
  • in view of this, in the present embodiment, the system controller 20 performs the depth determination between the polygons that virtually represent each building and the display of the guide route, and does not display the portion of the guide route shielded by a building polygon. thereby, the system controller 20 suitably prevents the sense of depth from being lost even when the CG image and the live-action image are superimposed.
  • the display mode of the guidance route to which the present invention is applicable is not limited to the mode in which the route is superimposed on the road scheduled to be traveled, as shown in FIG. 3B. instead, the system controller 20 may display the guidance route at a different position that does not overlap the road corresponding to it.
  • FIG. 5 shows a display example of the guidance route according to the modification.
  • the system controller 20 displays the guide route 46y by turning it upside down at a position above the planned road.
  • the guidance route 46y is drawn at a position a predetermined distance above the road corresponding to it, and at the same depth as that road.
  • also in this case, the system controller 20 displays the guide route 46y so that the portion overlapping the building 45A existing in front of the road 48 is hidden.
  • thereby, the user can easily recognize that the vehicle should turn left onto the road behind the building 45A, and can accurately grasp the positional relationship between the road to be turned onto and the buildings.
  • the target for which the portion shielded by a building is hidden is not limited to the guidance route. instead of, or in addition to, this, the navigation device 1 may also hide the shielded portion for guidance information other than the guidance route.
  • here, guidance information refers to information to be visually recognized by the driver to assist driving, for example, a mark indicating a facility attached to the position corresponding to that facility (also referred to as a "facility mark"), marks for facilities such as towers that serve as landmarks, and traffic congestion lines displayed along congested roads.
  • the system controller 20 executes the processing from step S101 to step S106 in FIG. 2 for the facility mark.
  • in this case, in step S102, the system controller 20 arranges the building polygons in the three-dimensional coordinate space and then arranges each facility mark at a display position having the same depth as the corresponding facility. in this way, the system controller 20 prevents a facility mark from being displayed over a building that stands in front of the corresponding facility, and can reliably keep the user from misidentifying the correspondence between facilities and facility marks.
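  • the same mechanism extends to facility marks: placing each mark at the 3D position (and hence depth) of its facility lets the existing depth test hide any mark whose facility is behind a nearer building, as the hypothetical sketch below illustrates (record layout and the toy projection are assumptions).

```python
import math

def place_facility_marks(facilities, building_depth, project):
    """Project each facility's 3D position and keep its mark only when no
    building is nearer at that pixel (same depth test as for the route).

    facilities: list of {"name": str, "position": (x, y, z)} records;
    building_depth: {(u, v): nearest building depth};
    project: function mapping a 3D point to (u, v, depth) or None."""
    marks = []
    for f in facilities:
        hit = project(f["position"])
        if hit is None:
            continue
        u, v, depth = hit
        if depth <= building_depth.get((u, v), math.inf):
            marks.append((u, v, f["name"]))   # mark is visible at this pixel
        # otherwise a building in front of the facility shields the mark
    return marks

# Toy projection: x -> u, y -> v, z -> depth (purely for demonstration).
toy_project = lambda p: (int(p[0]), int(p[1]), p[2])
depths = {(10, 20): 12.0}
facilities = [{"name": "shielded tower", "position": (10, 20, 40.0)},
              {"name": "visible shop", "position": (30, 40, 25.0)}]
print(place_facility_marks(facilities, depths, toy_project))
# only the "visible shop" mark survives; the tower's mark is hidden
```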
  • the system controller 20 may perform the non-display processing of the shielded portion only for facility marks that are to be displayed as landmarks when driving according to the guidance route.
  • thereby, the system controller 20 can allow the user to appropriately recognize the position of a facility that serves as a landmark for driving.
  • the system controller 20 displays other facility marks regardless of the presence or absence of a shielding part, so that the user can easily find a target facility when searching for a facility to stop by.
  • the processing procedure of the flowchart of FIG. 2 is an example, and the processing procedure to which the present invention is applicable is not limited to this.
  • the navigation device 1 identifies a portion of the guide route that overlaps the building based on the house shape information, the location information, the road location information on the guide route, etc. without generating a transparent polygon of the building. Then, an image of the guide route in which the portion is not displayed may be displayed superimposed on the photographed image.
  • the system controller 20 may also change the order in which parts of the processing are executed, as appropriate.
  • the system controller 20 may draw a substantially transparent polygon instead of drawing a completely transparent polygon when drawing a polygon representing a building.
  • the system controller 20 may draw a polygon having a high transmittance so that the polygon representing the building is not conspicuous when the photographed image and the CG image are superimposed. Even in this case, similarly to the embodiment, it is possible to output a display with a sense of depth while hiding the shielding portion of the building on the guide route.
  • in step S102 of FIG. 2, the system controller 20 drew polygons for all buildings existing within the shooting range of the camera 61. instead, the system controller 20 may draw polygons for only some of the buildings within the shooting range. specifically, the system controller 20 may generate polygons only for buildings existing in front of the road corresponding to the guidance route in the traveling direction of the vehicle. this also allows the system controller 20 to output a display with a sense of depth while hiding the portion of the guidance route shielded by a building.
  • the present invention can be suitably applied to an apparatus that performs guidance display based on a photographed image taken by a camera.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)

Abstract

The present invention relates to a display device by which guidance information is superimposed and displayed on a real image photographed by a camera. According to the invention, the display device comprises a specifying means and a display control means. The specifying means specifies an overlapping portion between the guidance information and buildings in the real image on the basis of the photographing position, the position information and house shape information of the buildings within the photographing range of the camera, and the position information of facilities or roads corresponding to the guidance information. The display control means superimposes on the real image, and displays, the guidance information from which the shielded portion within the overlapping portion, where a building should be displayed in front of the guidance information, has been excluded.
PCT/JP2012/051679 2012-01-19 2012-01-26 Dispositif d'affichage, procédé de commande, programme et support de stockage WO2013111302A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2012/051679 WO2013111302A1 (fr) 2012-01-26 2012-01-26 Dispositif d'affichage, procédé de commande, programme et support de stockage
JP2013555061A JP5702476B2 (ja) 2012-01-26 2012-01-26 表示装置、制御方法、プログラム、記憶媒体
US14/374,232 US20150029214A1 (en) 2012-01-19 2012-01-26 Display device, control method, program and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/051679 WO2013111302A1 (fr) 2012-01-26 2012-01-26 Dispositif d'affichage, procédé de commande, programme et support de stockage

Publications (1)

Publication Number Publication Date
WO2013111302A1 (fr) 2013-08-01

Family

ID=48873070

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/051679 WO2013111302A1 (fr) 2012-01-19 2012-01-26 Dispositif d'affichage, procédé de commande, programme et support de stockage

Country Status (2)

Country Link
JP (1) JP5702476B2 (fr)
WO (1) WO2013111302A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003269972A (ja) * 1999-05-14 2003-09-25 Denso Corp 地図表示装置
JP3943346B2 (ja) * 2001-04-26 2007-07-11 トヨタ自動車株式会社 ナビゲーション装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008128827A (ja) * 2006-11-21 2008-06-05 Matsushita Electric Ind Co Ltd ナビゲーション装置およびナビゲーション方法ならびにそのプログラム
WO2009084134A1 (fr) * 2007-12-28 2009-07-09 Mitsubishi Electric Corporation Dispositif de navigation
JP2011047649A (ja) * 2007-12-28 2011-03-10 Mitsubishi Electric Corp ナビゲーション装置
JP2011529569A (ja) * 2008-07-31 2011-12-08 テレ アトラス ベスローテン フエンノートシャップ ナビゲーションデータを三次元で表示するコンピュータ装置および方法

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015152467A (ja) * 2014-02-17 2015-08-24 パイオニア株式会社 表示制御装置、制御方法、プログラム、及び記憶媒体
DE112016006725T5 (de) 2016-05-17 2018-12-27 Mitsubishi Electric Corporation Bildanzeigevorrichtung, bildanzeigeverfahren und bildanzeigeprogramm
WO2018009109A1 (fr) * 2016-07-07 2018-01-11 Saab Ab Système et procédé d'affichage destinés à l'affichage d'une vue en perspective de l'environnement d'un aéronef, dans un aéronef
US10982970B2 (en) 2016-07-07 2021-04-20 Saab Ab Displaying system and method for displaying a perspective view of the surrounding of an aircraft in an aircraft
JP2019095213A (ja) * 2017-11-17 2019-06-20 アイシン・エィ・ダブリュ株式会社 重畳画像表示装置及びコンピュータプログラム
US11535155B2 (en) 2017-11-17 2022-12-27 Aisin Corporation Superimposed-image display device and computer program

Also Published As

Publication number Publication date
JPWO2013111302A1 (ja) 2015-05-11
JP5702476B2 (ja) 2015-04-15

Similar Documents

Publication Publication Date Title
EP2724896B1 (fr) Dispositif d'aide au stationnement
JP2015172548A (ja) 表示制御装置、制御方法、プログラム、及び記憶媒体
JP5735658B2 (ja) 表示装置及び表示方法
JPWO2013114617A1 (ja) 画像表示装置、画像表示方法及び画像表示プログラム
JP5795386B2 (ja) 表示装置及び制御方法
JP2009236843A (ja) ナビゲーション装置、ナビゲーション方法、およびナビゲーションプログラム
JP2008014754A (ja) ナビゲーション装置
US20150029214A1 (en) Display device, control method, program and storage medium
JP5702476B2 (ja) 表示装置、制御方法、プログラム、記憶媒体
JP2018128466A (ja) ナビゲーション装置、ヘッドアップディスプレイ、制御方法、プログラム、及び記憶媒体
JP3642776B2 (ja) ナビゲーション装置の地図表示方法およびナビゲーション装置
WO2011135660A1 (fr) Système de navigation, procédé de navigation, programme de navigation, et support de stockage
JP2015105903A (ja) ナビゲーション装置、ヘッドアップディスプレイ、制御方法、プログラム、及び記憶媒体
JP2015141155A (ja) 虚像表示装置、制御方法、プログラム、及び記憶媒体
JP6401925B2 (ja) 虚像表示装置、制御方法、プログラム、及び記憶媒体
JP2015152467A (ja) 表示制御装置、制御方法、プログラム、及び記憶媒体
WO2011121788A1 (fr) Dispositif de navigation, dispositif d'affichage d'informations, procédé de navigation, programme de navigation et support d'enregistrement
JP4917191B1 (ja) 画像制御装置及び画像制御方法
WO2013046426A1 (fr) Affichage tête haute, procédé d'affichage d'image, programme d'affichage d'image et dispositif d'affichage
WO2013088512A1 (fr) Dispositif d'affichage et procédé d'affichage
JP5438172B2 (ja) 情報表示装置、情報表示方法、情報表示プログラムおよび記録媒体
JP3790011B2 (ja) ナビゲーション装置における地図情報表示装置及び地図情報表示方法並びにナビゲーション装置における地図情報表示制御プログラムが記録されたコンピュータ読み取り可能な記録媒体
WO2014002167A1 (fr) Dispositif d'affichage d'informations, procédé d'affichage d'informations, programme d'affichage d'informations et support d'enregistrement
WO2013046425A1 (fr) Affichage tête haute, procédé de commande et dispositif d'affichage
WO2013046423A1 (fr) Affichage tête haute, procédé de commande et dispositif d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 12866929; Country of ref document: EP; Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2013555061; Country of ref document: JP; Kind code of ref document: A
WWE Wipo information: entry into national phase
    Ref document number: 14374232; Country of ref document: US
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 12866929; Country of ref document: EP; Kind code of ref document: A1