KR20170002087A - Display Apparatus and Vehicle Having The Same - Google Patents

Display Apparatus and Vehicle Having The Same

Info

Publication number
KR20170002087A
Authority
KR
South Korea
Prior art keywords
stop position
vehicle
information
processor
display device
Prior art date
Application number
KR1020150092039A
Other languages
Korean (ko)
Inventor
박상하
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020150092039A priority Critical patent/KR20170002087A/en
Publication of KR20170002087A publication Critical patent/KR20170002087A/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3605: Destination input or retrieval
    • G01C21/3617: Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
    • G01C21/3697: Output of additional, non-guidance related information, e.g. low fuel level

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Navigation (AREA)

Abstract

The display device according to the embodiment includes: a storage unit that stores a stop position of the vehicle, a movement record prepared on the basis of the stop position, and at least one piece of additional information on the stop position, the stop position being stored classified by category; an input unit for receiving a search input of a user; a processor for extracting, from the storage unit, a stop position corresponding to the search input of the user, together with the movement record and the additional information corresponding to the stop position; and a display unit for displaying at least one of the stop position extracted by the processor, the movement record corresponding to the stop position, and the additional information.


Description

Display Apparatus and Vehicle Having The Same

The present invention relates to a display device provided in a vehicle and a vehicle including the same.

A vehicle is an apparatus that transports a riding user in a desired direction. A representative example is an automobile.

For the convenience of users, a vehicle is equipped with various sensors and electronic devices. In particular, various devices for the user's driving convenience have been developed.

On the other hand, a driver may sometimes experience difficulty because he or she cannot remember a previously visited location or a previously traveled route. In particular, it is often difficult to find the location of a place visited before. The driver may remember only some specific information about that location, but with only partial information it may be difficult to find the exact location, and the driver may even end up at the wrong place.

Existing navigation apparatuses provide only the destination search history or the travel route history entered by the driver, and thus do not adequately provide the information the driver needs in a given situation.

In order to solve the above-described problems, the present invention proposes a display device capable of storing the user's driving history on the basis of stop positions and appropriately providing the stored information according to the user's needs.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

The display device according to the embodiment includes: a storage unit that stores a stop position of the vehicle, a movement record prepared on the basis of the stop position, and at least one piece of additional information on the stop position, the stop position being stored classified by category; an input unit for receiving a search input of a user; a processor for extracting, from the storage unit, a stop position corresponding to the search input of the user, together with the movement record and the additional information corresponding to the stop position; and a display unit for displaying at least one of the stop position extracted by the processor, the movement record corresponding to the stop position, and the additional information.

Further, the vehicle according to the embodiment includes the display device according to the above-described embodiment.

The display device according to the embodiment may store the movement record of the vehicle on the basis of the stop position, and may store the additional information about the stop position without separate user input.

The stop position can be searched for by the movement record or the additional information. At this time, the movement record and additional information matched with the searched stop position are provided, so that the driver can be given the information he or she needs.

The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.

FIG. 1 is a view showing the appearance of a vehicle including a display device according to an embodiment of the present invention.
FIG. 2 is a view showing the interior of a vehicle equipped with a display device according to an embodiment of the present invention.
FIG. 3 shows a block diagram of a display device according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a process of generating and storing stop-position-based information according to an embodiment of the present invention.
FIG. 5 is a diagram showing the movement record of the vehicle.
FIG. 6 shows a stop-position-related image photographed by a camera.
FIG. 7 shows a driver image photographed by the monitoring unit.
FIGS. 8 to 11 show examples of a display screen for displaying the stop-position-based information according to the embodiment of the present invention.
FIG. 12 is a flowchart illustrating a stop-position-based information display process according to a period search according to the embodiment of the present invention.
FIG. 13 shows a display screen according to the period search according to the embodiment of the present invention.
FIG. 14 is a flowchart illustrating a stop-position-based information display process according to a position search according to an embodiment of the present invention.
FIG. 15 shows a display screen according to the position search according to the embodiment of the present invention.
FIG. 16 is a flowchart illustrating a stop-position-based information display process according to a keyword search according to an embodiment of the present invention.
FIG. 17 shows a display screen according to the keyword search according to the embodiment of the present invention.
FIGS. 18 and 19 are diagrams for explaining real-time stop-position-based information display and utilization of a display device according to another embodiment of the present invention.
FIG. 20 is an example of an internal block diagram of the vehicle of FIG. 1.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals designate identical or similar elements and redundant descriptions thereof are omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known art are omitted where it is determined that they would obscure the gist of the embodiments. The accompanying drawings are intended only to aid understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the drawings, and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited by these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The vehicle described herein may encompass an automobile and a motorcycle. Hereinafter, the description will be given mainly with respect to an automobile.

The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.

In the following description, the left side of the vehicle means the left side in the running direction of the vehicle, and the right side of the vehicle means the right side in the running direction of the vehicle.

Unless otherwise mentioned, the following description assumes a left-hand-drive (LHD) vehicle.

FIG. 1 is a view showing the appearance of a vehicle including a display device according to an embodiment of the present invention. FIG. 2 is a view showing the interior of a vehicle equipped with a display device according to an embodiment of the present invention. FIG. 3 shows a block diagram of a display device according to an embodiment of the present invention.

Referring to FIGS. 1 to 3, the vehicle may include wheels 13FL and 13FR rotated by a power source, a steering input means 721A for adjusting the traveling direction of the vehicle, and a display device 100.

The display apparatus 100 according to the present embodiment may include a memory 140 for storing a stop position and a display unit 180 for displaying a stop position.

The driver may sometimes experience difficulty because he or she cannot remember a previously visited location or a previously traveled route. In particular, it is often difficult to find the location of a place visited before. The driver may remember only some specific information about that location, but with only partial information it may be difficult to find the exact location, and the driver may even end up at the wrong place.

Existing navigation apparatuses provide only the destination search history or the travel route history entered by the driver, and thus do not adequately provide the information the driver needs in a given situation.

In order to solve the above-described problems, the display device 100 according to the embodiment may store the movement record of the vehicle on the basis of the stop position, and may store additional information on the stop position without requiring additional user input. Further, a stop position can be searched for by the movement record or the additional information. At this time, the movement record and additional information matched with the searched stop position are provided, so that the driver can be given the information he or she needs.

The display device 100 according to this embodiment may include an input unit 110, a communication unit 120, an interface unit 130, a memory 140, a monitoring unit 150, a camera 160, a processor 170, a display unit 180, an audio output unit 185, and a power supply unit 190, as shown in FIG. 3.

First, the input unit 110 may be a plurality of buttons disposed in the vehicle or a touch screen of the display unit 180. The display device 100 can be turned on and operated via the buttons or the touch screen.

In addition, the driver can enter a stop-position search through the input unit 110. For example, the driver can enter a specific period to search for the stop positions within that period. Further, the driver can input a specific position and perform a position-based search. Further, the driver can input a specific keyword to search for the stop positions related to that keyword.

Next, the communication unit 120 can exchange data wirelessly with the mobile terminal 600 or the server 500. In particular, the communication unit 120 can wirelessly exchange data with the mobile terminal of the vehicle driver. Wireless data communication methods include Bluetooth, Wi-Fi, Wi-Fi Direct, APiX, and NFC.

The communication unit 120 may receive position information, weather information, and road traffic condition information (for example, Transport Protocol Expert Group (TPEG) information) from the mobile terminal 600 or the server 500. The communication unit 120 may also receive the navigation information described below from the mobile terminal 600 when the mobile terminal 600 is used as a navigation system.

The communication unit 120 can also transmit data to and receive data from the mobile terminal 600, so that the stop-position-based information displayed on the display unit 180 can likewise be checked on the mobile terminal 600.

Also, the communication unit 120 may transmit the stop position, the movement record, and the additional information to the server for storage, and may receive the necessary information according to the driver's search.

However, in the embodiment described below, the display device 100 includes the memory 140, stores information such as the stop position in the memory 140, and searches and uses the stored information.

Alternatively, the server may store information such as the stop position and exchange it through the communication unit 120; the communication unit 120 and the memory 140 may also be used together to store data efficiently. When information is stored on the server, connecting to the server for data access may require an additional authentication procedure such as an ID login.

In addition, when the user boards the vehicle, the user's mobile terminal 600 and the display device 100 may be paired with each other automatically or by execution of the user's application.

Also, the communication unit 120 can receive traffic light change information from the external server 500. Here, the server 500 may be a server located in a traffic control center that controls traffic.

Next, the interface unit 130 may receive the vehicle-related data or transmit the signal processed or generated by the processor 170 to the outside. To this end, the interface unit 130 can perform data communication with the control unit 770, the AVN (Audio Video Navigation) apparatus 400, the sensor unit 760 and the like in the vehicle by wire communication or wireless communication.

The interface unit 130 may receive the navigation information through the data communication with the controller 770, the AVN apparatus 400, or a separate navigation device. Here, the navigation information may include the search word information, the set destination information, the route information according to the destination, the map information related to the driving of the vehicle, the current position information of the vehicle, and the time information according to the position. Further, the navigation information may include position information of the vehicle on the road.

Also, the interface unit 130 may receive the sensor information from the control unit 770 or the sensor unit 760.

Here, the sensor information may include at least one of ignition on/off information, fuel-cap open/close information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.

In addition, the sensor information may be acquired from at least one of a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, and a vehicle interior humidity sensor. The position module may include a GPS module for receiving GPS information.

Such sensor information may be processed by the processor 170 to generate the stop position, the movement record, and the additional information. For example, whether the vehicle has stopped can be determined from the ignition on/off information. In addition, the fuel information can be used to determine whether refueling took place, the amount of fuel supplied, fuel economy information, and the like.

The interface unit 130 may receive user input received through the user input unit 724 of the vehicle. The interface unit 130 may receive the user input from the input unit 720 of the vehicle or via the control unit 770.

The interface unit 130 may receive information obtained from the server 500. The server 500 may be a server located in a traffic control center that controls traffic. For example, when traffic light change information is received from the server 500 through the communication unit 710 of the vehicle, the interface unit 130 may receive the traffic light change information from the control unit 770.

Next, the memory 140 may store various data for the overall operation of the display device 100, such as programs for processing or control by the processor 170.

In detail, the memory 140 may store the stop position classified by category so as to be searchable by the movement record and additional information.

Here, the stop position means a position where the vehicle has stopped for a predetermined time or longer.

Further, the movement record may include at least one of the stop time and the coordinates of the stop position, the movement path between stop positions, the movement time, and the movement distance.

Further, the additional information may include at least one of an attribute of the stop position (including at least one of a place name, a business type, and a business name), an address of the stop position, a stop-position-related image, a destination search term, and driver information.
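As an illustration only (the patent does not prescribe a data schema), a stop-position record combining the movement record and the additional information described above could be sketched as follows; all field names here are assumptions:

```python
# Illustrative only: assumed field names for a stop-position record; the
# description above lists the contents but not a concrete schema.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class MovementRecord:
    stop_time: str                        # when the vehicle stopped (ISO timestamp)
    coordinates: Tuple[float, float]      # (latitude, longitude) of the stop position
    path_from_previous: List[Tuple[float, float]]  # GPS samples since the prior stop
    travel_time_min: float                # driving time from the previous stop
    travel_distance_km: float             # driving distance from the previous stop

@dataclass
class AdditionalInfo:
    attribute: Optional[str] = None       # place name, business type, or business name
    address: Optional[str] = None         # reverse-geocoded address of the stop
    related_images: List[str] = field(default_factory=list)  # camera snapshot paths
    destination_query: Optional[str] = None  # search term entered into navigation
    driver: Optional[str] = None          # driver identified by the monitoring unit

@dataclass
class StopRecord:
    movement: MovementRecord              # stored classified by category for search
    info: AdditionalInfo
```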

In addition, the memory 140 may store data for object identification. For example, when a predetermined object is detected from the image obtained through the camera 160, the memory 140 may store data for checking what the object corresponds to according to a predetermined algorithm.

The memory 140 may also store data on traffic information. For example, when predetermined traffic information is detected from the image obtained through the camera 160, the memory 140 may store data for determining what that traffic information corresponds to according to a predetermined algorithm.

In addition, in hardware, the memory 140 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive.

Meanwhile, the vehicle may include a camera 160 for photographing a vehicle front image or a vehicle periphery image, and a monitoring unit 150 for photographing an in-vehicle image.

In detail, the monitoring unit 150 can acquire an image of the occupant. The monitoring unit 150 may obtain an image for biometric recognition of the occupant. That is, the monitoring unit 150 may be an image acquisition module disposed inside the vehicle. The in-vehicle image thus obtained is processed by the processor 170, so that occupant information (in particular, driver information) at the stop position can be generated.

In addition, the camera 160 can photograph the surroundings of the vehicle. The vehicle peripheral image photographed by the camera 160 may be stored and displayed as a stop position related image.

For example, the camera 160 captures the forward image of the vehicle at the stop position to acquire the forward image, and the processor 170 may store the forward image in the memory 140 as the stop position related image.

In addition, the camera 160 may continuously capture images before the vehicle stops, and the processor 170 may store a plurality of continuously captured images in the memory 140 as stop-position-related images.

The display device 100 according to the embodiment can display the stop-position-related image together with the stop position to help the driver recall the stop position.

The camera 160 may include a plurality of cameras 160.

The plurality of cameras 160 may be disposed on at least one of left, rear, right, and front of the vehicle, respectively.

The left camera 160 may be disposed in a case surrounding the left side mirror. Alternatively, the left camera 160 may be disposed outside the case surrounding the left side mirror. Alternatively, the left camera 160 may be disposed in one area outside the left front door, the left rear door, or the left fender.

The right camera 160 may be disposed in a case surrounding the right side mirror. Or the right camera 160 may be disposed outside the case surrounding the right side mirror. Alternatively, the right camera 160 may be disposed in one area outside the right front door, the right rear door, or the right fender.

Further, the rear camera 160 can be disposed in the vicinity of a rear license plate or a trunk switch. The front camera 160 may be disposed in the vicinity of the emblem or the radiator grille.

As described above, an image photographed from at least one of the four sides of the vehicle can be stored as a stop-position-related image.

The camera 160 may include an image sensor and an image processing module. The camera 160 may process still images or moving images obtained by the image sensor (e.g., CMOS or CCD). The image processing module processes the still image or moving image obtained through the image sensor, extracts necessary information, and transmits the extracted information to the processor 170.

Next, the display unit 180 may display at least one of the stop position, the movement record corresponding to the stop position, and the additional information.

The display unit 180 may be a display installed inside the vehicle, and may include a cluster or a head-up display (HUD) at the front of the vehicle interior. When the display unit 180 is an HUD, it may include a projection module that projects an image onto the windshield glass of the vehicle.

In addition, the display device 100 may further include an audio output unit 185 and a power supply unit 190.

Next, the processor 170 controls the overall operation of each unit in the display device 100.

The processor 170 may generate the stop position, the movement record related to the stop position, and the additional information, and store them in the memory 140.

The processor 170 may search the memory 140 for a stop position according to a user input, and extract the movement record and additional information corresponding to the retrieved stop position.

Thereafter, the processor 170 displays the retrieved and extracted stop position, movement record, and additional information on the display unit 180, thereby providing the user with the desired vehicle movement history.

More specifically, the processor 170 may determine whether the vehicle is stopped or not, and may generate the stop position in consideration of the position information when it is determined that the vehicle is stationary.

For example, when the processor 170 receives an ignition-off signal through the interface unit 130, the processor 170 may determine the position of the vehicle at the time the signal is received as the stop position.

In addition, the processor 170 may determine the position as the stop position if the position of the vehicle, received through the interface unit 130 or the communication unit 120, does not change for a predetermined time or longer.

When the processor 170 receives a fuel-cap opening signal through the interface unit 130, the processor 170 determines the position of the vehicle at the time the signal is received as the stop position, and generates additional information indicating that the attribute of the stop position is a gas station. At this time, the processor 170 may also generate fuel economy information according to the travel distance, the amount of fuel supplied, the amount of fuel consumed, and the like as additional information. Accordingly, the user can check accurate fuel economy information for each gas station, and fuel consumption can be predicted from the actual driving record rather than from the travel distance alone.
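The three stop-determination triggers described above (ignition-off, no position change for a predetermined time, and fuel-cap opening, which also tags the stop as a gas station) could be sketched as follows; the signal names and the timeout value are illustrative assumptions, not values from the patent:

```python
# Sketch of the three stop-determination triggers described above. The signal
# names and the 300-second timeout are assumptions for illustration only.
STOP_TIMEOUT_S = 300  # the "predetermined time" with no position change

def detect_stop(signal, position, stationary_seconds):
    """Return (stop_position, attribute_hint) if a stop is determined, else None."""
    if signal == "ignition_off":
        return position, None              # ignition-off signal via the interface unit
    if signal == "fuel_cap_open":
        return position, "gas station"     # fuel-cap signal also yields an attribute
    if stationary_seconds >= STOP_TIMEOUT_S:
        return position, None              # no position change for the timeout period
    return None                            # still moving; no stop position generated

print(detect_stop("fuel_cap_open", (37.56, 126.97), 0))
# -> ((37.56, 126.97), 'gas station')
```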

After determining the stop position, the processor 170 may generate the movement record and additional information matching the stop position even if there is no additional user input.

In detail, the processor 170 may generate the movement record and the additional information from at least one of the sensor information, the navigation information, and the stop-position-related images.

For example, the processor 170 may calculate the stop time, the stop position coordinates, the movement path between stop positions, and the movement distance from the vehicle position information and the time information among the sensor information.

In addition, the processor 170 may search the Internet through the communication unit 120 using the coordinates of the stop position, and generate the address and the attribute of the stop position as additional information.

In addition, the processor 170 may take the destination search term from the navigation information as additional information, and may process the search term to determine the business type related to it as a further piece of additional information.

Also, the processor 170 can identify the driver from the image photographed by the monitoring unit 150 and generate the driver information as additional information. That is, the additional information can also be created based on who the driver is.

The processor 170 may also generate additional information from the stop-position-related image. In detail, the processor 170 can detect a signboard in the stop-position-related image and generate additional information from the detected signboard information.

That is, the processor 170 may process the image acquired by the camera 160. In particular, the processor 170 performs signal processing based on computer vision. Accordingly, the processor 170 can acquire images of the area in front of or around the vehicle from the camera 160, and perform object detection and object tracking based on those images. In particular, the processor 170 can perform signboard detection, lane detection (LD), vehicle detection (VD), pedestrian detection (PD), bright-spot detection (BD), traffic sign recognition (TSR), road surface detection, and the like.

The processor 170 may detect information from the vehicle front image or the vehicle surroundings image obtained by the camera 160.

Here, the information may be additional information related to the stop position. For example, the processor 170 may extract the signboard from the stop-position-related image, and then determine the attribute of the stop position from the text or graphics written on the signboard.

The processor 170 may compare the detected information with the information stored in the memory 140 to verify the information.

For example, the processor 170 detects an object included in the acquired image, such as a pattern or text indicating a parking lot. Here, the object may be a traffic sign or a parking line. The processor 170 can compare the detected pattern or text with the traffic information stored in the memory 140 to confirm the parking indication.
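A minimal sketch of this step might match text read from a detected signboard against category keywords held in memory; the keyword table and function are illustrative assumptions, and a real implementation would rely on trained sign-detection and OCR models together with the data stored in the memory 140:

```python
# Sketch of matching detected signboard text to a stop-position attribute.
# The keyword table is an illustrative assumption.
CATEGORY_KEYWORDS = {
    "gas station": ["gas", "fuel", "oil"],
    "mart":        ["mart", "market", "store"],
    "repair shop": ["repair", "garage", "service"],
}

def classify_sign_text(detected_text):
    """Map text read from a signboard to a stop-position attribute, if any."""
    words = detected_text.lower().split()
    for attribute, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in words for keyword in keywords):
            return attribute
    return None

print(classify_sign_text("HAPPY MART 24h"))  # -> "mart"
```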

In this way, the processor 170 directly determines the stop position, generates the movement record and the additional information according to the stop position, and stores them in the memory 140 so that they can later be searched.

Thereafter, when the processor 170 receives the user's input, it displays the retrieved stop position, movement record, and additional information on the display unit 180, thereby providing the user with the desired vehicle movement history.

That is, the processor 170 not only displays the searched stop position of the vehicle, but also displays at least one of the stop time, the attribute, the occupant, the destination search term, and the related image for that stop position on the display unit 180, which can help the user recall the desired location.

The processor 170 may also perform additional operations using the retrieved information. For example, the processor 170 may instruct the navigation device to provide route guidance when it receives an input setting the retrieved stop position as a destination. Alternatively, when it receives an input setting the stop position as the destination, the vehicle may be moved to the destination by autonomous driving.

FIG. 5 is a diagram showing the movement record of the vehicle. FIG. 6 shows a stop-position-related image photographed by the camera 160. FIG. 7 shows a driver image photographed by the monitoring unit 150.

Hereinafter, the process by which the processor 170 generates and stores the stop-position-based information will be described in more detail with reference to FIGS. 4 to 7.

First, the processor 170 may determine whether the vehicle is stopped. (S101)

More specifically, the processor 170 may determine whether the vehicle is stopped using sensor information or navigation information.

For example, when the processor 170 receives an ignition-off signal through the interface unit 130, the processor 170 may determine the position of the vehicle at the time the signal is received as the stop position.

In addition, the processor 170 may determine the position as the stop position if the position information of the vehicle received through the interface unit 130 or the communication unit 120 does not change for a predetermined time.

When the processor 170 receives a fuel-cap opening signal through the interface unit 130, the processor 170 can determine the position of the vehicle at the time the signal is received as the stop position.

If it is determined that the vehicle is stationary, the processor 170 may determine the stop position. (S103)

In detail, the processor 170 may obtain the coordinates of the stop position of the vehicle from the sensor information or the navigation information, and store the coordinates as the stop position.

Next, the processor 170 may generate a movement record based on the stop position. (S105)

In detail, the processor 170 can calculate the stop time, the stop position coordinates, the movement path between stop positions, and the movement distance from the vehicle position information and the time information in the sensor information.
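For illustration, the movement record between two consecutive stops could be derived from timestamped GPS samples roughly as follows; the haversine great-circle distance and the sample format are assumptions, as the patent does not specify a distance formula:

```python
# Sketch of deriving a movement record between two consecutive stops from
# timestamped GPS samples of the form (unix_time_s, (lat, lon)).
import math

def haversine_km(p1, p2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def movement_record(samples):
    """samples: list of (unix_time_s, (lat, lon)) collected between two stops."""
    distance_km = sum(haversine_km(samples[i][1], samples[i + 1][1])
                      for i in range(len(samples) - 1))
    travel_time_min = (samples[-1][0] - samples[0][0]) / 60.0
    return {"path": [point for _, point in samples],
            "travel_distance_km": round(distance_km, 2),
            "travel_time_min": round(travel_time_min, 1)}
```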

Next, the processor 170 may generate additional information based on the stop position. (S107)

In addition, the processor 170 may search the Internet through the communication unit 120 using the coordinates of the stop position, and generate the address and the attribute of the stop position as additional information.

In addition, the processor 170 may take the destination search term from the navigation information as additional information. At this time, the processor 170 may process the search term to determine the business type related to it as further additional information. For example, if the destination search term is a specific business name, the processor 170 can designate the business type associated with that name as the attribute.

Referring to FIG. 5, the stop positions are stored as the nodes N1, N2, and N3 on the map, the vehicle travel path 10 between the nodes N1, N2, and N3 is stored in the movement record, and the attributes of the nodes N1, N2, and N3 are stored so that they can be displayed as icons on the map.

The processor 170 may also generate additional information from the stop-position-related image. In detail, the processor 170 can detect a signboard in the stop-position-related image and generate additional information from the detected signboard information.

Referring to FIG. 6, when the processor 170 recognizes the parking indication 21 in an image photographed by the camera 160, the processor 170 may store that image as a stop-position-related image. Since an image photographed together with the parking indication 21 captures the situation adjacent to the stopped position, it can serve as particularly useful video information for the user.

The processor 170 may then detect additional information in the image captured by the camera 160. For example, the processor 170 extracts the signboard 22 from the stop-position-related image, and then recognizes the attribute of the stop position from the text or figure written on the signboard. In FIG. 6, it can be seen from the signboard 22 that the attribute of the stop position is a mart.

To this end, the processor 170 may compare the detected information with the information stored in the memory 140 to verify the information.

For example, the processor 170 detects an object included in the acquired image, such as a pattern or text indicating a parking lot. Here, the object may be a traffic sign or a parking line. The processor 170 can compare the detected pattern or text with the traffic information stored in the memory 140 to confirm the parking indication.

Also, the processor 170 can identify the driver from the image photographed by the monitoring unit 150 and generate the driver information as additional information. That is, the additional information can also be created based on who the driver is.

Referring to FIG. 7, the image captured by the monitoring unit 150 includes the driver D, and the processor 170 processes it to store the driver D together with the stop position.

The processor 170 may store the stop position in the memory 140 so that it can be classified and searched according to the movement record and the additional information generated as described above. (S109)

FIGS. 8 to 11 illustrate examples of a screen of the display unit 180 displaying the stop-position-based information according to the embodiment of the present invention.

The display unit 180 may display the stop positions extracted by the user's search in time series, together with the movement record and additional information corresponding to each stop position.

Referring to FIG. 8, when a user searches for the gas station business type through a keyword search, the corresponding stop positions, movement records, and additional information are displayed in time series.

Without any additional input, the user can store and later check the refueling time 31, the route information 32, the gas station name 33, and the fuel information 34, so that the accurate fuel consumption according to the travel distance of the vehicle can be known.

In other words, the processor 170 can automatically create a refueling log and provide it to the user.
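As a worked example of such a refueling log (with assumed numbers), the fuel economy over an interval can be computed from the distance driven between two consecutive refueling stops and the fuel added, under a fill-to-full assumption:

```python
# Worked example with assumed numbers: fuel economy over an interval is the
# distance driven between two consecutive refueling stops divided by the
# fuel added at the later stop (a fill-to-full assumption for illustration).
refuel_log = [
    {"odometer_km": 12000.0, "fuel_added_l": 40.0},  # earlier gas-station stop
    {"odometer_km": 12560.0, "fuel_added_l": 40.0},  # later gas-station stop
]

distance_km = refuel_log[1]["odometer_km"] - refuel_log[0]["odometer_km"]  # 560 km
economy = distance_km / refuel_log[1]["fuel_added_l"]                      # 14.0 km/l
print(f"{economy:.1f} km per liter over {distance_km:.0f} km")
```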

Referring to FIG. 9, the stop positions extracted by the user's period search are displayed on a map together with the movement record and additional information corresponding to each stop position.

Each of the nodes N1, N2, and N3 is a stop position, and each node can be represented by an icon representing its attribute. Alternatively, each node may be displayed as a thumbnail of the corresponding stop-position-related image.

More specifically, the nodes N1, N2, and N3 indicated by the icons represent the stop positions of the vehicle, and the lines connecting the nodes N1, N2, and N3 represent the movement route 10. It can be seen that the vehicle stopped at the first node N1, whose attribute is a restaurant. It can also be seen that the vehicle moved from the first node N1 to the second node N2 along the known travel route 10, and that the attribute of the second node N2 is a gas station. Finally, it can be seen that the vehicle moved from the second node N2 to the third node N3 along the known travel route 10, and that the attribute of the third node N3 is a repair shop.

At this time, when displaying the stop positions on the map, the processor 170 may first calculate the area of the map that includes the extracted stop positions, and adjust the amount of movement record and additional information displayed according to that map area.

For example, as shown in FIG. 10, when the area of the map including the stop position is wide, the nodes N4, N5, N6, and N7 can be represented by simple dots.

On the other hand, when the user selects a stop position, the processor 170 may further display the movement record and additional information for that stop position.

Referring to FIG. 11, when the user selects a specific stop position, the processor 170 switches the screen of the display unit 180 to display the attribute 41, the address 42, the business type 43, the occupant information 44, the map information 48, the stop-position-related image 45, and the movement record.

In this way, the display device 100 according to the embodiment can provide information on the movement history and the stop position of the vehicle necessary for the user in accordance with the search of the user, thereby improving the convenience of the user.

FIG. 12 is a flowchart illustrating a stop-position-based information display process according to a period search according to an embodiment of the present invention, and FIG. 13 illustrates a display screen according to the period search.

Referring to FIGS. 12 and 13, the processor 170 may receive a user's period search through the input unit 110. (S301)

The user can search through the input unit 110 for the stop positions included in the specific period.

The processor 170 may search for the stop positions of the vehicle within a specific time period when a specific time period is input. (S303)

For example, when the user inputs a specific start point and end point, the processor 170 can search the memory 140 for the positions where the vehicle stopped between the start point and the end point.

Referring to FIG. 13, the first node N1 to the third node N3 correspond to positions where the vehicle stopped within the given period.
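A minimal sketch of the period search in step S303, filtering stored stop records by the entered start and end points (record fields follow the assumed schema sketched earlier):

```python
# Sketch of the period search (S303): keep only the stops whose stop time
# falls between the entered start and end points. ISO-format timestamps
# compare correctly as strings; field names are assumptions.
def search_by_period(records, start, end):
    return [r for r in records if start <= r["stop_time"] <= end]

stops = [{"stop_time": "2015-06-01T12:00", "attribute": "restaurant"},
         {"stop_time": "2015-06-03T09:30", "attribute": "gas station"}]
print(search_by_period(stops, "2015-06-01T00:00", "2015-06-02T00:00"))
# -> only the restaurant stop on 2015-06-01
```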

Next, the processor 170 may extract from the memory 140 the movement record and the additional information associated with the retrieved stop positions. (S305)

For example, the processor 170 may extract the movement record and the additional information of each of the first node N1 to the third node N3.

The processor 170 may then display the retrieved stop positions and the extracted movement records and additional information on the display unit 180. (S307)

For example, the processor 170 may display the retrieved first node N1 through the third node N3, extracted additional information, and movement records through a map, and provide the user with the map.

Then, the processor 170 may perform an operation according to the additional input of the user. (S309)

For example, when the user selects a specific stop position, the processor 170 may display the movement record and additional information for the selected stop position in a more detailed manner by switching the screen.

In addition, when the processor 170 receives an input for determining the searched stop position as a destination, the processor 170 can instruct the navigation device to guide the route. Alternatively, the processor 170 may move the vehicle to the destination by autonomous travel, when receiving an input that defines the stop position as the destination.

FIG. 14 is a flowchart illustrating a stop-position-based information display process according to a position search according to an embodiment of the present invention, and FIG. 15 illustrates a display screen according to the position search.

Referring to FIGS. 14 and 15, the processor 170 may receive a user's position search through the input unit 110. (S501)

For example, the user can input a specific point 51 and a threshold distance R around the specific point 51 via the input unit 110.

When a specific point is entered, the processor 170 can retrieve the vehicle's stop positions within the threshold distance R set or entered for that point. (S503)

For example, the processor 170 may retrieve the stop positions where the vehicle stopped within the radius 52 defined by the predetermined threshold distance R around the specific point 51 entered by the user.

Referring to FIG. 15, the first node N1 to the third node N3 correspond to the positions at which the vehicle stopped within the threshold distance R of the specific point 51.
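A minimal sketch of the position search in step S503, keeping only stops within the threshold distance R of the entered point (the great-circle helper is repeated so the sketch stays self-contained; field names are assumptions):

```python
# Sketch of the position search (S503): return stops within the threshold
# distance R of the entered point 51.
import math

def haversine_km(p1, p2):
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def search_by_position(records, point, threshold_km):
    """records: dicts with 'coordinates' stored as (lat, lon)."""
    return [r for r in records
            if haversine_km(r["coordinates"], point) <= threshold_km]
```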

Next, the processor 170 may extract from the memory 140 the movement record and the additional information associated with the retrieved stop positions. (S505)

For example, the processor 170 may extract the movement record and the additional information of each of the first node N1 to the third node N3.

The processor 170 may then display the retrieved stop positions and the extracted movement records and additional information on the display unit 180. (S507)

For example, the processor 170 may display the retrieved first node N1 through the third node N3, extracted additional information, and movement records through a map, and provide the user with the map.

Then, the processor 170 may perform an operation according to the additional input of the user. (S509)

For example, when the user selects a specific stop position, the processor 170 may display the movement record and additional information for the selected stop position in a more detailed manner by switching the screen.

In addition, when the processor 170 receives an input for determining the searched stop position as a destination, the processor 170 can instruct the navigation device to guide the route. Alternatively, the processor 170 may move the vehicle to the destination by autonomous travel, when receiving an input that defines the stop position as the destination.

FIG. 16 is a flowchart illustrating a stop-position-based information display process according to a keyword search according to an embodiment of the present invention. FIG. 17 shows a display screen according to the keyword search according to the embodiment of the present invention.

Referring to FIGS. 16 and 17, the processor 170 may receive a user's keyword search through the input unit 110. (S701)

Through the input unit 110, the user can input a keyword indicating at least one item of the additional information of a stop position.

For example, the user can input a business type, an address, and the like, and retrieve the stop positions related to the entered business type among the stop positions of the vehicle.

FIG. 17 shows the screen displayed when a user inputs "gas station" as a keyword.

When the keyword is input, the processor 170 can search for a stop position that matches the keyword. (S703)

For example, the processor 170 may extract only the stop positions whose additional information matches the keyword.
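A minimal sketch of the keyword search in step S703: a stop matches when the keyword appears in any of its additional-information fields (field names follow the assumed schema):

```python
# Sketch of the keyword search (S703): a stop matches when the keyword
# appears in any additional-information field. Field names are assumptions.
def search_by_keyword(records, keyword):
    needle = keyword.lower()
    fields = ("attribute", "address", "destination_query")
    return [r for r in records
            if any(needle in str(r.get(f, "")).lower() for f in fields)]

stops = [{"attribute": "gas station", "address": "12 Teheran-ro"},
         {"attribute": "mart", "address": "3 Gangnam-daero"}]
print(search_by_keyword(stops, "gas station"))  # -> the first record only
```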

Next, the processor 170 may extract from the memory 140 the movement record and the additional information associated with the retrieved stop positions. (S705)

For example, the processor 170 may extract from the memory 140 the movement record and additional information of each position where the vehicle stopped at a gas station.

The processor 170 may then display the retrieved stop positions and the extracted movement records and additional information on the display unit 180. (S707)

The processor 170 may display the stop position extracted by the user search through the display unit 180 in a time series together with the movement record and additional information corresponding to the stop position.

Referring to FIG. 17, when a user searches for the gas station business type through a keyword search, the corresponding stop positions, movement records, and additional information are displayed in time series.

Therefore, without any additional input, the user can store and later check the refueling time 31, the route information 32, the gas station name, and the fuel information, so that the accurate fuel consumption according to the travel distance of the vehicle can be known.

Then, the processor 170 may perform an operation according to the additional input of the user. (S709)

For example, when the user selects a specific stop position, the processor 170 may display the movement record and additional information for the selected stop position in a more detailed manner by switching the screen.

In addition, when the processor 170 receives an input for determining the searched stop position as a destination, the processor 170 can instruct the navigation device to guide the route. Alternatively, the processor 170 may move the vehicle to the destination by autonomous travel, when receiving an input that defines the stop position as the destination.

FIGS. 18 and 19 are diagrams for explaining real-time stop-position-based information display and utilization of the display device 100 according to another embodiment of the present invention.

The display device 100 according to another embodiment has the same configuration as that of the above-described embodiment and differs only in the method of operation; it should be understood to include all of the features described above.

In this embodiment, the processor 170 may automatically search the memory 140 in a specific situation and retrieve a stop position suited to that situation.

For example, when the low-fuel warning light turns on, the processor 170 may search for a stop position where the vehicle previously stopped for refueling, and may suggest traveling to that stop position or provide route guidance.

Referring to FIG. 18, when the low-fuel warning light turns on, the processor 170 can search for a previously used refueling stop position that is nearby or favorable in fuel cost, present it on the HUD display unit 180 of the vehicle, and ask the user whether to start guidance.

This has the advantage that the driver can be guided to a desired refueling stop position through the HUD display unit 180 without losing frontal attention.
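A minimal sketch of this real-time behavior: on a low-fuel warning, search the stored stops for previously used gas stations near the current position and propose the closest one; the signal name, record fields, and injected distance function are illustrative assumptions:

```python
# Sketch of the real-time suggestion: on a low-fuel warning, look up stored
# stops whose attribute is "gas station" near the current position and
# propose the closest one for HUD display.
def suggest_refuel_stop(records, current_pos, distance_fn, within_km=10.0):
    candidates = [r for r in records
                  if r.get("attribute") == "gas station"
                  and distance_fn(r["coordinates"], current_pos) <= within_km]
    return min(candidates,
               key=lambda r: distance_fn(r["coordinates"], current_pos),
               default=None)

def on_vehicle_signal(signal, records, current_pos, distance_fn):
    """Triggered by sensor signals; returns a HUD prompt or None."""
    if signal == "low_fuel_warning":
        stop = suggest_refuel_stop(records, current_pos, distance_fn)
        if stop is not None:
            return f"Guide to previously used gas station at {stop['coordinates']}?"
    return None
```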

Referring to FIGS. 18 and 19, the user can input whether or not to be guided to the stop position by using the steering input means 721A.

For example, as shown in FIG. 18, when the driver inputs a gesture with both hands on the steering input means 721A, the suggestion prompt disappears and guidance to the gas station can begin.

Also, as shown in FIG. 19, when the driver keeps the left hand fixed and sweeps the right hand clockwise, the gas station guidance message may be dismissed.

The display device 100 according to another embodiment may provide the driver with the stop position based information in real time so that the convenience of the driver can be further enhanced.

FIG. 20 is an example of an internal block diagram of the vehicle of FIG. 1.

Referring to FIG. 20, the vehicle may include a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driving unit 750, a memory 730, an interface unit 780, a control unit 770, a power supply unit 790, a display device 100, and an AVN device 400.

The communication unit 710 may include one or more modules that enable wireless communication between the vehicle and the mobile terminal 600, between the vehicle and the external server 500, or between the vehicle and another vehicle 520. In addition, the communication unit 710 may include one or more modules that connect the vehicle to one or more networks.

The communication unit 710 may include a broadcast receiving module 711, a wireless Internet module 712, a local area communication module 713, a location information module 714, and an optical communication module 715.

The broadcast receiving module 711 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 712 is a module for wireless Internet access, and can be built in or externally mounted in a vehicle. The wireless Internet module 712 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 712 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 712 can exchange data wirelessly with the external server 500, and can receive weather information and road traffic situation information (for example, Transport Protocol Expert Group (TPEG) information) from the external server 500.

The short-range communication module 713 is for short-range communication and can support it using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.

The short-range communication module 713 may form short-range wireless area networks to perform short-range communication between the vehicle and at least one external device. For example, the short-range communication module 713 can exchange data wirelessly with the mobile terminal 600, and can receive weather information and road traffic situation information (for example, Transport Protocol Expert Group (TPEG) information) from the mobile terminal 600. For example, when the user boards the vehicle, the user's mobile terminal 600 and the vehicle can be paired with each other automatically or by execution of the user's application.

The position information module 714 is a module for acquiring the position of the vehicle, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from the GPS satellite.

The optical communication module 715 may include a light emitting portion and a light receiving portion.

The light receiving section can convert the light signal into an electric signal and receive the information. The light receiving unit may include a photodiode (PD) for receiving light. Photodiodes can convert light into electrical signals. For example, the light receiving section can receive information of the front vehicle through light emitted from the light source included in the front vehicle.

The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably a light emitting diode (LED). The light emitting unit converts the electric signal into an optical signal and emits it to the outside. For example, the light emitting unit can emit the optical signal to the outside by blinking the light emitting element at a predetermined frequency. According to an embodiment, the light emitting unit may include a plurality of light emitting element arrays. According to an embodiment, the light emitting unit may be integrated with a lamp provided in the vehicle. For example, the light emitting unit may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a position lamp. For example, the optical communication module 715 can exchange data with another vehicle 520 via optical communication.

The input unit 720 may include a driving operation unit 721, a camera 160, a microphone 723, and a user input unit 724.

The driving operation means 721 receives a user input for driving the vehicle. The driving operation means 721 may include a steering input means 721A, a shift input means 721B, an acceleration input means 721C, and a brake input means 721D.

The steering input means 721A receives an input of the traveling direction of the vehicle from the user. The steering input means 721A is preferably formed in a wheel shape so that steering input is possible by rotation. According to an embodiment, the steering input means 721A may be formed of a touch screen, a touch pad, or a button.

The shift input means 721B receives inputs of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle from the user. The shift input means 721B is preferably formed in a lever shape. According to an embodiment, the shift input means 721B may be formed of a touch screen, a touch pad, or a button.

The acceleration input means 721C receives an input for acceleration of the vehicle from the user. The brake input means 721D receives an input for deceleration of the vehicle from the user. The acceleration input means 721C and the brake input means 721D are preferably formed in the form of a pedal. According to an embodiment, the acceleration input means 721C or the brake input means 721D may be formed of a touch screen, a touch pad, or a button.

The camera 160 may include an image sensor and an image processing module. The camera 160 may process still images or moving images obtained by an image sensor (e.g., CMOS or CCD). The image processing module processes the still image or moving image obtained through the image sensor, extracts necessary information, and transmits the extracted information to the control unit 770. Meanwhile, the vehicle may include a camera 160 for photographing a vehicle front image or a vehicle periphery image, and a monitoring unit 150 for photographing an in-vehicle image.

The monitoring unit 150 may acquire an image of a passenger. The monitoring unit 150 may acquire an image for biometric recognition of the passenger.

In FIG. 20, the monitoring unit 150 and the camera 160 are included in the input unit 720; however, the camera 160 may also be described as a component included in the display device 100, as described above.

The microphone 723 can process an external sound signal into electrical data. The processed data can be used in various ways depending on the function being performed in the vehicle. The microphone 723 can convert a voice command of the user into electrical data, and the converted electrical data can be transmitted to the control unit 770.

According to an embodiment, the camera 160 or the microphone 723 may be a component included in the sensing unit 760 rather than a component included in the input unit 720.

The user input unit 724 receives information from a user. When information is input through the user input unit 724, the control unit 770 can control the operation of the vehicle to correspond to the input information. The user input unit 724 may include touch input means or mechanical input means. According to an embodiment, the user input unit 724 may be located in one area of the steering wheel. In this case, the driver can operate the user input unit 724 with his or her fingers while holding the steering wheel.

The sensing unit 760 senses signals related to the running of the vehicle and the like. To this end, the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, a radar, and the like.

Thereby, the sensing unit 760 can acquire vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, a steering wheel rotation angle, and the like.

In addition, the sensing unit 760 may include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 760 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires biometric information of a passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information; here, the monitoring unit 150 and the microphone 723 may operate as such sensors. The biometric information sensing unit can acquire hand geometry information and facial recognition information through the monitoring unit 150.
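The sensing signals listed above could plausibly be handed to the control unit 770 as one structured record, as in the following sketch; every field name and unit is an assumption for illustration:

```python
# Hypothetical snapshot of what the sensing unit 760 might report.
from dataclasses import dataclass, asdict


@dataclass
class SensingRecord:
    gps: tuple[float, float]    # vehicle position information (lat, lon)
    heading_deg: float          # vehicle direction information
    speed_kmh: float            # vehicle speed information
    accel_ms2: float            # vehicle acceleration information
    tilt_deg: float             # vehicle tilt information
    fuel_level_pct: float       # fuel information
    cabin_temp_c: float         # vehicle interior temperature information
    cabin_humidity_pct: float   # vehicle interior humidity information
    steering_angle_deg: float   # steering wheel rotation angle


record = SensingRecord((37.57, 126.98), 92.0, 54.3, 0.4, 1.2,
                       62.0, 22.5, 41.0, -3.5)
print(asdict(record))  # e.g. passed on to the control unit 770
```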

The output unit 740 outputs information processed by the control unit 770 and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.

The display unit 741 can display information processed by the control unit 770. For example, the display unit 741 can display vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for guiding the vehicle driver. The vehicle-related information may also include vehicle state information indicating the current state of the vehicle, or vehicle driving information related to the driving of the vehicle.

The display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display unit 741 may form a layered structure with a touch sensor, or may be formed integrally with one, to realize a touch screen. Such a touch screen can function as the user input unit 724 that provides an input interface between the vehicle and the user, while also providing an output interface between the vehicle and the user. In this case, the display unit 741 may include a touch sensor that senses a touch on the display unit 741 so that a control command can be received by a touch method. When the display unit 741 is touched, the touch sensor senses the touch, and the control unit 770 generates a control command corresponding to that touch. The content input by the touch method may be letters or numbers, instructions in various modes, designatable menu items, and the like.
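A minimal sketch of the touch path just described, in which a sensed touch coordinate is mapped to the control command of whatever menu item is laid out at that point; the layout regions and command names are hypothetical:

```python
# Hypothetical menu layout on display unit 741:
# (x0, y0, x1, y1) screen region -> control command name.
MENU_LAYOUT = {
    (0, 0, 200, 100): "open_navigation",
    (0, 100, 200, 200): "search_stop_positions",
}


def command_for_touch(x: int, y: int) -> str | None:
    """Map a sensed touch coordinate to the command at that region."""
    for (x0, y0, x1, y1), command in MENU_LAYOUT.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return command
    return None  # touch landed outside any designated menu item


print(command_for_touch(50, 150))  # -> "search_stop_positions"
```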

Meanwhile, the display unit 741 may include a cluster so that the driver can check vehicle state information or vehicle driving information while driving. The cluster can be located on the dashboard. In this case, the driver can check the information displayed on the cluster while keeping his or her line of sight ahead of the vehicle.

Meanwhile, according to an embodiment, the display unit 741 may be implemented as a Head Up Display (HUD). When the display unit 741 is implemented as a HUD, information can be output through a transparent display provided in the windshield. Alternatively, the display unit 741 may include a projection module and output information through an image projected onto the windshield.

The sound output unit 742 converts an electric signal from the control unit 770 into an audio signal and outputs it. For this purpose, the sound output unit 742 may include a speaker or the like. The sound output unit 742 can also output a sound corresponding to the operation of the user input unit 724.

The haptic output unit 743 generates a tactile output. For example, the haptic output unit 743 can vibrate the steering wheel, the seat belt, or the seat so that the user can recognize the output.

The vehicle drive unit 750 can control the operation of various devices of the vehicle. The vehicle drive unit 750 may include a power source driving unit 751, a steering driving unit 752, a brake driving unit 753, a lamp driving unit 754, an air conditioning driving unit 755, a window driving unit 756, an airbag driving unit 757, a sunroof driving unit 758, and a suspension driving unit 759.

The power source driving unit 751 can perform electronic control of the power source in the vehicle.

For example, when a fossil-fuel-based engine (not shown) is the power source, the power source driving unit 751 can perform electronic control of the engine, and the output torque of the engine and the like can thereby be controlled. When the power source is an engine, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 770.

As another example, when an electric motor (not shown) is the power source, the power source driving unit 751 can perform control of the motor, and the rotation speed, torque, and the like of the motor can thereby be controlled.
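One purely illustrative form of the speed-limiting behavior described above, in which the torque command is scaled down once the vehicle speed exceeds a limit; the proportional cut and all constants are assumptions:

```python
# Hypothetical torque limiter; constants and the linear cut are assumed.
MAX_TORQUE_NM = 300.0


def limited_torque(requested_nm: float, speed_kmh: float,
                   speed_limit_kmh: float = 110.0) -> float:
    """Reduce the torque command as speed exceeds the configured limit."""
    base = min(requested_nm, MAX_TORQUE_NM)
    if speed_kmh < speed_limit_kmh:
        return base
    # Past the limit: scale torque down toward zero over a 10 km/h band.
    overshoot = speed_kmh - speed_limit_kmh
    return base * max(0.0, 1.0 - overshoot / 10.0)


print(limited_torque(250.0, 105.0))  # below the limit: request passes through
print(limited_torque(250.0, 115.0))  # 5 km/h over: torque halved to 125.0
```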

The steering driver 752 may perform electronic control of a steering apparatus in the vehicle. Thus, the traveling direction of the vehicle can be changed.

The brake driver 753 can perform electronic control of a brake apparatus (not shown) in the vehicle. For example, it can reduce the speed of the vehicle by controlling the operation of the brakes disposed on the wheels. As another example, it can adjust the traveling direction of the vehicle to the left or right by operating the brakes disposed on the left and right wheels differently.
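A toy sketch of the differential-braking example above; the dead-band threshold and the qualitative yaw description are assumptions:

```python
# Hypothetical qualitative model: unequal left/right brake pressure
# biases the traveling direction toward the harder-braked side.
def brake_yaw_bias(left_pressure: float, right_pressure: float) -> str:
    diff = left_pressure - right_pressure
    if abs(diff) < 0.05:           # assumed dead band for "even" braking
        return "straight (even braking)"
    return "veers left" if diff > 0 else "veers right"


print(brake_yaw_bias(0.6, 0.3))  # -> "veers left"
print(brake_yaw_bias(0.4, 0.4))  # -> "straight (even braking)"
```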

The lamp driver 754 can control the turn-on/turn-off of the lamps disposed inside and outside the vehicle, and can also control the intensity, direction, and the like of the light from the lamps. For example, it can control the turn signal lamps, the brake lamps, and the like.

The air conditioning driving unit 755 can perform electronic control of an air conditioner (not shown) in the vehicle. For example, when the temperature inside the vehicle is high, the air conditioner can be operated so that cool air is supplied to the inside of the vehicle.

The window driving unit 756 can perform electronic control of a window apparatus in the vehicle. For example, it can control the opening or closing of the left and right windows on the sides of the vehicle.

The airbag driving unit 757 can perform electronic control of an airbag apparatus in the vehicle. For example, in case of danger, the airbag can be controlled to deploy.

The sunroof driving unit 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 759 can perform electronic control of a suspension apparatus (not shown) in the vehicle. For example, when the road surface is uneven, the suspension apparatus can be controlled to reduce the vibration of the vehicle.

The memory 730 is electrically connected to the control unit 770. The memory 730 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 730 may be any of various storage devices such as a ROM, RAM, EPROM, flash drive, or hard drive. The memory 730 may store various data for the operation of the entire vehicle, such as programs for the processing or control of the control unit 770.

The interface unit 780 can serve as a pathway to various kinds of external devices connected to the vehicle. For example, the interface unit 780 may include a port that can be connected to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 can exchange data with the mobile terminal 600.

Meanwhile, the interface unit 780 may serve as a channel for supplying electrical energy to a connected mobile terminal 600. When the mobile terminal 600 is electrically connected to the interface unit 780, the interface unit 780 provides the electrical energy supplied from the power supply unit 790 to the mobile terminal 600 under the control of the control unit 770.
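A small sketch of this charging path, with hypothetical stand-in classes: the control unit enables the supply of energy to the terminal only while a connection is detected on the interface port:

```python
# All class and method names here are illustrative stand-ins.
class InterfaceUnit:
    def __init__(self) -> None:
        self.terminal_connected = False  # port state for mobile terminal 600


class PowerSupplyUnit:
    def supply_to_terminal(self, enabled: bool) -> str:
        return "charging mobile terminal 600" if enabled else "port idle"


def control_charging(interface: InterfaceUnit, power: PowerSupplyUnit) -> str:
    """Control unit 770 role: charge only while a terminal is connected."""
    return power.supply_to_terminal(interface.terminal_connected)


iface, psu = InterfaceUnit(), PowerSupplyUnit()
iface.terminal_connected = True
print(control_charging(iface, psu))  # -> "charging mobile terminal 600"
```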

The control unit 770 can control the overall operation of each unit in the vehicle. The control unit 770 may be referred to as an ECU (Electronic Control Unit).

In hardware, the control unit 770 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.

The control unit 770 can take over the role of the processor 170 described above. That is, the processor 170 of the display device 100 may be implemented directly by the control unit 770 of the vehicle. In this embodiment, the display device 100 may be understood as referring to a combination of certain components of the vehicle.

Alternatively, the control unit 770 may control the respective components so as to transmit the information requested by the processor 170.

The power supply unit 790 can supply the power necessary for the operation of each component under the control of the control unit 770. In particular, the power supply unit 790 can receive power from a battery (not shown) in the vehicle.

The AVN (Audio Video Navigation) device 400 can exchange data with the control unit 770. The control unit 770 can receive navigation information from the AVN device 400 or from a separate navigation device (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information related to vehicle driving, or vehicle location information.
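The four kinds of navigation information named above might be grouped into one record, as in this sketch; the field names and sample values are illustrative assumptions:

```python
# Hypothetical grouping of the navigation information from AVN device 400.
from dataclasses import dataclass


@dataclass
class NavigationInfo:
    destination: str                       # set destination information
    route: list[tuple[float, float]]       # route information to destination
    map_region: str                        # map information for vehicle driving
    vehicle_position: tuple[float, float]  # vehicle location (GPS) information


info = NavigationInfo(
    destination="Seoul Station",
    route=[(37.554, 126.970), (37.556, 126.972)],
    map_region="Jung-gu, Seoul",
    vehicle_position=(37.551, 126.988),
)
print(info.destination, len(info.route), "waypoints")
```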

The features, structures, effects, and the like described in the above embodiments are included in at least one embodiment of the present invention and are not necessarily limited to only one embodiment. Further, the features, structures, and effects illustrated in each embodiment can be combined or modified for other embodiments by those skilled in the art to which the embodiments belong. Accordingly, contents relating to such combinations and modifications should be construed as being included in the scope of the present invention.

While embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described, and various modifications and applications not illustrated above can be devised by those skilled in the art without departing from the essential characteristics of the embodiments. For example, each component specifically shown in the embodiments can be modified and implemented. It should be understood that differences relating to such modifications and applications are included within the scope of the present invention.

Claims (20)

A storage unit storing a stop position of a vehicle, a movement record prepared on the basis of the stop position, and at least one piece of additional information on the stop position, and storing the stop position classified by category according to the movement record and the additional information;
An input unit for receiving a search input of a user;
A processor for extracting, from the storage unit, a stop position corresponding to the search input of the user, a movement record corresponding to the stop position, and the additional information; And
A display unit for displaying at least one of the stop position extracted by the processor, the movement record corresponding to the stop position, and the additional information
Display device.
The display device according to claim 1,
Wherein the movement record includes a stop time and a departure time with respect to the stop position, and a movement distance between stop positions,
Wherein the additional information includes at least one of an address of the stop position, a name, a business type, a trade name, a stop-position-related image, a destination search word, and driver information
Display device.
The display device according to claim 1,
Further comprising an interface unit for receiving sensor information measured by a sensor of the vehicle,
Wherein the processor:
Determines the stop time and the departure time of the vehicle from ignition-on and ignition-off information among the sensor information
Display device.
The display device according to claim 1,
Further comprising an interface unit for receiving sensor information measured by a sensor of the vehicle,
Wherein the processor:
Calculates additional information including a refueling stop position and a refueling amount from fuel information among the sensor information
Display device.
The display device according to claim 1,
Further comprising at least one camera for photographing the outside of the vehicle to acquire an image,
Wherein the processor:
Processes the photographed image to detect a signboard, and extracts the additional information from the detected signboard information
Display device.
The display device according to claim 5,
Further comprising at least one camera for photographing the outside of the vehicle to acquire an image,
Wherein the image in which the signboard is detected, an image captured at the stop position, and an image captured a preset time before the stop time are stored as the stop-position-related images
Display device.
The display device according to claim 1,
Further comprising an interface unit for receiving a destination search word and location information (GPS information) of the vehicle,
Wherein the processor:
Extracts at least one piece of additional information among the address, the name, the business type, and the trade name from the destination search word and the location information of the vehicle
Display device.
The display device according to claim 1,
Further comprising a monitoring unit for photographing a driver inside the vehicle to acquire a driver image,
Wherein the processor:
Acquires driver information corresponding to the movement record from the driver image
Display device.
The display device according to claim 1,
Wherein the display unit:
Displays the extracted stop position in time series together with the movement record and the additional information corresponding to the stop position
Display device.
The display device according to claim 1,
Wherein the processor:
Calculates an area of a map including the extracted stop position,
And wherein the display unit:
Displays a map of the calculated area,
And displays the stop position on the map as a node
Display device.
The display device according to claim 10,
Wherein the processor:
Upon receiving a selection input for the node from the input unit, controls the display unit to display additional information on the stop position of the selected node
Display device.
The display device according to claim 10,
Wherein the node:
Is displayed as a thumbnail of the stop-position-related image or as an icon representing the business type
Display device.
The display device according to claim 10,
Wherein the processor:
Adjusts the amount of additional information to be displayed according to the area of the map
Display device.
The display device according to claim 1,
Wherein the processor:
Upon receiving a period search input of the user from the input unit, extracts the stop positions within the period input by the user and the movement records and the additional information corresponding to the stop positions within the period
Display device.
The display device according to claim 14,
Wherein the processor:
Controls the display unit to display the extracted stop positions and a movement path between the stop positions
Display device.
The display device according to claim 1,
Wherein the processor:
Upon receiving a search input for a specific position from the input unit, extracts a stop position within a predetermined threshold distance from the specific position and additional information corresponding to the stop position within the threshold distance
Display device.
The display device according to claim 16,
Wherein the processor:
Calculates an area of a map including a radius indicating the threshold distance,
And controls the display unit to display a map of the calculated area, the radius indicating the threshold distance, and the extracted stop position
Display device.
The display device according to claim 1,
Wherein the processor:
Upon receiving a keyword search input from the input unit, extracts additional information related to the keyword and a stop position corresponding to the keyword-related additional information
Display device.
The display device according to claim 18,
Wherein the processor:
Calculates an area of a map including the extracted stop position,
And controls the display unit to display the map of the calculated area and to display the stop position on the map as a node
Display device.
The display device according to claim 4,
Wherein the processor:
Upon receiving a refueling record search input from the input unit, extracts a refueling stop position among the stop positions,
And wherein the additional information on the refueling stop position includes a gas station name, a refueling amount, a refueling cost, a travel distance since the immediately preceding refueling, and a fuel cost since the immediately preceding refueling
Display device.
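As a minimal illustrative sketch of the flow recited in claim 1, with hypothetical field names and a simple substring-matching rule that the claims do not specify, stored stop positions might be filtered by a user's search input and handed to the display unit as follows:

```python
# Hypothetical sketch of the claim-1 pipeline; not part of the claims.
from dataclasses import dataclass, field


@dataclass
class StopRecord:
    position: tuple[float, float]  # stop position (lat, lon)
    category: str                  # category assigned in the storage unit
    movement: dict = field(default_factory=dict)  # movement record
    extra: dict = field(default_factory=dict)     # additional information


def search_stops(storage: list[StopRecord], query: str) -> list[StopRecord]:
    """Processor role: extract records whose category or additional
    information matches the user's search input (assumed substring rule)."""
    q = query.lower()
    return [r for r in storage
            if q in r.category.lower()
            or any(q in str(v).lower() for v in r.extra.values())]


storage = [
    StopRecord((37.50, 127.03), "gas station",
               {"stopped": "2015-06-29T09:10"}, {"name": "GS Gangnam"}),
    StopRecord((37.48, 126.90), "restaurant",
               {"stopped": "2015-06-28T12:40"}, {"name": "Hanok Table"}),
]
for record in search_stops(storage, "gas"):
    print(record.position, record.extra["name"])  # display unit role
```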
KR1020150092039A 2015-06-29 2015-06-29 Display Apparatus and Vehicle Having The Same KR20170002087A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150092039A KR20170002087A (en) 2015-06-29 2015-06-29 Display Apparatus and Vehicle Having The Same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150092039A KR20170002087A (en) 2015-06-29 2015-06-29 Display Apparatus and Vehicle Having The Same

Publications (1)

Publication Number Publication Date
KR20170002087A true KR20170002087A (en) 2017-01-06

Family

ID=57832502

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150092039A KR20170002087A (en) 2015-06-29 2015-06-29 Display Apparatus and Vehicle Having The Same

Country Status (1)

Country Link
KR (1) KR20170002087A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108200551A (en) * 2017-12-22 2018-06-22 深圳市轱辘车联数据技术有限公司 A kind of recording method of oil-filling data, device, terminal device and server
KR20190027644A (en) * 2017-09-07 2019-03-15 엘지전자 주식회사 Error detection IC for Audio Visual system of vehicle
CN110187651A (en) * 2019-06-21 2019-08-30 吉林大学 A kind of vehicle man machine's interactive system, method, apparatus, equipment and storage medium
KR20200023671A (en) * 2018-08-14 2020-03-06 엘지전자 주식회사 Robot for vehicle mounted on the vehcile and method for controlling the robot for vehicle


Similar Documents

Publication Publication Date Title
CN106470274B (en) Apparatus and method for controlling portable device in vehicle
KR101732983B1 (en) Rear combination lamp for vehicle and Vehicle including the same
KR101708657B1 (en) Vehicle and control method for the same
KR101942793B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101750159B1 (en) Assistance Apparatus for Driving of a Vehicle, Method thereof, and Vehicle having the same
KR101860615B1 (en) Vehicle Assistance Apparatus and Vehicle Having The Same
KR20170016177A (en) Vehicle and control method for the same
KR101762805B1 (en) Vehicle and control method for the same
KR101823230B1 (en) External modules and vehicles connected to the same
KR101917412B1 (en) Apparatus for providing emergency call service using terminal in the vehicle and Vehicle having the same
KR20160147557A (en) Automatic parking apparatus for vehicle and Vehicle
KR101750875B1 (en) Diagnostic apparatus for vehicle and operating method for the same
KR20170054849A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101732263B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR20170002087A (en) Display Apparatus and Vehicle Having The Same
KR20170053880A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR20170101874A (en) Vehicle and control method for the same
KR20170005663A (en) Display control apparatus for vehicle and operating method for the same
KR101767507B1 (en) Display apparatus for a vehicle, and control method for the same
KR20170041418A (en) Display apparatus for vehicle and control method for the same
KR20170041072A (en) Detecting device for monitoring noise in vehicle and vehicle having the same
KR101897350B1 (en) Driver Assistance Apparatus
KR20180069646A (en) Driver assistance apparatus
KR101985496B1 (en) Driving assistance apparatus and vehicle having the same
KR101705454B1 (en) Driver Assistance Apparatus, Vehicle having the same