KR101929303B1 - Driver assistance apparatus and method having the same - Google Patents

Driver assistance apparatus and method having the same Download PDF

Info

Publication number
KR101929303B1
KR101929303B1 KR1020160005959A KR20160005959A
Authority
KR
South Korea
Prior art keywords
vehicle
information
building
parking
map data
Prior art date
Application number
KR1020160005959A
Other languages
Korean (ko)
Other versions
KR20170086293A (en)
Inventor
조태훈
설지이
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020160005959A priority Critical patent/KR101929303B1/en
Publication of KR20170086293A publication Critical patent/KR20170086293A/en
Application granted granted Critical
Publication of KR101929303B1 publication Critical patent/KR101929303B1/en


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/141 Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
    • G08G1/143 Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces inside the vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/027 Parking aids, e.g. instruction means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3658 Lane guidance
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3679 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G01C21/3685 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities the POI's being parking facilities
    • G06F17/30241
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Y INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00 Special features of vehicle units
    • B60Y2400/30 Sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mathematical Physics (AREA)
  • Navigation (AREA)

Abstract

A method of operating a vehicle driving assistance apparatus according to an embodiment of the present invention includes: sensing whether a vehicle has entered a building; outputting indoor map data corresponding to the building into which the vehicle has entered; receiving sensor information of the vehicle; and outputting driving information to a specific scheduled parking position among the parking spaces in the building, using the sensor information of the vehicle and the indoor map data.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a vehicle driving assistance apparatus and, more particularly, to a driving assistance apparatus and an operating method thereof that provide a customized parking service capable of guiding a vehicle to a parking space in an indoor parking lot.

A vehicle is an apparatus that transports a riding user in a desired direction. A typical example is an automobile.

Depending on the prime mover used, automobiles are classified into internal combustion engine vehicles, external combustion engine vehicles, gas turbine vehicles, and electric vehicles.

An electric vehicle drives an electric motor using electric energy. Electric vehicles include pure electric vehicles, hybrid electric vehicles (HEV), plug-in hybrid electric vehicles (PHEV), and hydrogen fuel cell electric vehicles (FCEV).

Meanwhile, in recent years, intelligent vehicles (smart vehicles) have been actively developed for the safety and convenience of drivers, pedestrians, and the like.

An intelligent vehicle, also called a smart vehicle, is a cutting-edge vehicle that incorporates information technology (IT). Intelligent vehicles provide optimal transportation efficiency not only by introducing advanced systems into the vehicle itself but also through interworking with intelligent transportation systems (ITS).

For example, intelligent vehicles maximize the safety of occupants as well as pedestrians by advancing key safety technologies such as adaptive cruise control (ACC), obstacle detection, and collision detection or mitigation.

In addition, vehicle-to-vehicle (V2V) technology, in which nearby vehicles exchange information over wireless communication while driving, has recently attracted attention. With V2V technology, vehicles can maintain a constant distance from one another on the road. In particular, sharing the position and speed of nearby vehicles in real time helps prevent sudden traffic accidents.
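As a toy illustration of what shared speed information enables, a following vehicle could hold a constant time headway behind the vehicle ahead. The 1.5 s headway and 2 m standstill gap below are assumed values for illustration, not figures from the patent:

```python
# Constant time-headway gap policy: the desired gap grows linearly with
# the lead vehicle's speed, which V2V communication makes available in
# real time. All constants here are hypothetical.

def target_gap_m(lead_speed_mps: float, time_headway_s: float = 1.5,
                 standstill_gap_m: float = 2.0) -> float:
    """Desired inter-vehicle gap in metres for a given lead-vehicle speed."""
    return standstill_gap_m + time_headway_s * lead_speed_mps

print(target_gap_m(20.0))  # 32.0 (metres, at 72 km/h)
```

A real controller would combine this target gap with the shared position to regulate acceleration; this sketch only shows the gap computation itself.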

Meanwhile, owing to the development of GPS (Global Positioning System) devices, the vehicle described above can provide guidance to a destination desired by the user.

However, the route guidance function according to the related art is provided mainly outdoors, so it is automatically terminated when the user arrives in the vicinity of the specific building corresponding to the destination.

In practice, however, the user's actual destination is a parking lot located underground or above ground within that building, and depending on the complexity of the building's interior it can be quite difficult to reach the desired parking area.

The embodiments of the present invention provide a vehicle driving assistance apparatus, and an operating method thereof, that can provide a route guidance function inside a building even without a sensor such as GPS.

The technical objectives to be achieved by the embodiments are not limited to those mentioned above; other technical objectives not mentioned here will be clearly understood from the following description by those of ordinary skill in the art to which the embodiments belong.

A method of operating a vehicle driving assistance apparatus according to an embodiment of the present invention includes: sensing whether a vehicle has entered a building; outputting indoor map data corresponding to the building into which the vehicle has entered; receiving sensor information of the vehicle; and outputting driving information to a specific scheduled parking position among the parking spaces in the building, using the sensor information of the vehicle and the indoor map data.
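The four steps above can be sketched as a minimal control flow. Everything here (the class names, the offset-based output format) is a hypothetical illustration of the claimed steps, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class SensorInfo:
    position: tuple   # (x, y) on the indoor map, in metres (assumed format)
    heading: float    # heading in degrees

@dataclass
class IndoorMap:
    building_id: str
    scheduled_spot: tuple  # scheduled parking position on the map

def plan_driving_info(entered_building, indoor_map, sensors):
    """If the vehicle has entered a building, combine its sensor information
    with the building's indoor map data to produce driving information
    toward the scheduled parking position (steps 1-4 of the method)."""
    if not entered_building:                  # step 1: sense building entry
        return None                           # outdoors: ordinary route guidance
    # steps 2-3 are represented by the indoor_map and sensors arguments;
    # step 4: compute the driving information to output
    dx = indoor_map.scheduled_spot[0] - sensors.position[0]
    dy = indoor_map.scheduled_spot[1] - sensors.position[1]
    return {
        "building": indoor_map.building_id,
        "offset_to_spot_m": (dx, dy),
        "heading_deg": sensors.heading,
    }

info = plan_driving_info(True, IndoorMap("B1", (40.0, 12.0)),
                         SensorInfo((10.0, 2.0), 90.0))
print(info["offset_to_spot_m"])  # (30.0, 10.0)
```

In the patent, the driving information is displayed on the indoor map rather than returned as a dictionary; the dictionary merely stands in for that output here.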

Further, a vehicle driving assistance apparatus according to an embodiment includes: an interface unit that communicates with the vehicle and receives sensor information of the vehicle; a communication unit that communicates with a server; a processor that, when the vehicle enters a building, outputs driving information of the vehicle using the sensor information and the indoor map data of the entered building; and an output unit that outputs the driving information of the vehicle. The processor detects whether the vehicle has entered the building and, if it has, outputs the indoor map data of that building, acquires driving information to a scheduled parking position among the parking spaces in the building using the received sensor information and the output indoor map data, and outputs the acquired driving information through the output unit.

According to the embodiment of the present invention, by downloading indoor map data and using the downloaded indoor map data to guide the vehicle to a parking space inside the building, the time required to park can be reduced dramatically.

In addition, according to the embodiment of the present invention, by assigning a specific parking zone among a plurality of parking zones according to a user registration setting, user satisfaction can be improved.

In addition, according to the embodiment of the present invention, since the parking fee is notified in real time and various linked services associated with parking are provided, the convenience of using the various services offered inside the building can be improved.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a view showing the exterior of a vehicle provided with a vehicle driving assistance apparatus according to an embodiment of the present invention.
FIG. 2 is a view showing the interior of a vehicle provided with a vehicle driving assistance apparatus according to an embodiment of the present invention.
FIG. 3 is a block diagram of a vehicle driving assistance apparatus according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of operating a vehicle driving assistance apparatus according to an embodiment of the present invention.
FIGS. 5 to 7 are flowcharts explaining how, in FIG. 4, the vehicle's entry into a building is sensed and information on the entered building is obtained.
FIGS. 8 to 11 are diagrams explaining a method of outputting indoor map data according to a first embodiment of the present invention.
FIGS. 12 to 15 are flowcharts explaining a method of outputting indoor map data according to a second embodiment of the present invention.
FIG. 16 is a view showing a scheduled parking position setting screen according to an embodiment of the present invention.
FIG. 17 is a diagram illustrating a user information setting screen according to an embodiment of the present invention.
FIGS. 18 and 19 are diagrams showing indoor map data displayed according to an embodiment of the present invention.
FIG. 20 is a view illustrating a route guidance screen according to an embodiment of the present invention.
FIGS. 21 to 27 are views explaining a scheduled parking position setting process according to an embodiment of the present invention.
FIGS. 28 to 32 are views explaining a process of displaying driving information of a vehicle according to an embodiment of the present invention.
FIGS. 33 to 35 are views explaining a parking method according to an embodiment of the present invention.
FIGS. 36 to 44 are views explaining a parking-linked service according to an embodiment of the present invention.
FIG. 45 is an example of an internal block diagram of the vehicle of FIG. 1.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Identical or similar elements are given the same reference numerals regardless of figure number, and redundant descriptions thereof are omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of drafting the specification and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known technologies are omitted where they would obscure the gist of the embodiments. The accompanying drawings are intended only to aid understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by them and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Terms including ordinals, such as first and second, may be used to describe various elements, but the elements are not limited by these terms. These terms are used only to distinguish one element from another.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to that other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are intended to specify the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The vehicle described herein may be a concept including an automobile and a motorcycle. Hereinafter, the description will be given mainly with respect to the automobile.

The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.

In the following description, the left side of the vehicle means the left side in the running direction of the vehicle, and the right side of the vehicle means the right side in the running direction of the vehicle.

Unless otherwise mentioned in the following description, the LHD (Left Hand Drive) vehicle will be mainly described.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, a vehicle driving assistance apparatus according to an embodiment will be described in detail with reference to the drawings.

FIG. 1 is a view showing the exterior of a vehicle provided with a vehicle driving assistance apparatus according to an embodiment of the present invention, FIG. 2 is a view showing the interior of a vehicle provided with a vehicle driving assistance apparatus according to an embodiment of the present invention, and FIG. 3 is a block diagram of a vehicle driving assistance apparatus according to an embodiment of the present invention.

Referring to FIGS. 1 to 3, a vehicle 700 according to the embodiment includes wheels 13FL and 13FR rotated by a power source, driving operation means 721A, 721B, 721C, and 721D for controlling the running of the vehicle, and a vehicle driving assistance apparatus 100.

Here, the vehicle driving assistance apparatus 100 may be a separate apparatus that performs a driving assistance function while receiving necessary information through data communication with the vehicle 700, or a set of some of the units of the vehicle 700 itself may be defined as the vehicle driving assistance apparatus 100.

Some of the units of the vehicle driving assistance apparatus 100 may not be included in the apparatus itself but may instead be units of the vehicle 700 or of other apparatuses mounted on the vehicle. Such units can be understood as being included in the vehicle driving assistance apparatus 100 in that they transmit and receive data through the interface unit of the vehicle driving assistance apparatus 100.

Although the vehicle driving assistance apparatus 100 according to the embodiment is described as including the respective units shown in FIG. 3, it may also use units installed directly in the vehicle 700 through the interface unit 130, or may be implemented as a combination of units installed directly in the vehicle 700.

The vehicle driving assistance apparatus 100 may be an idling restriction device that turns the engine off when the vehicle stops; however, the following description focuses mainly on the case in which the vehicle driving assistance apparatus 100 is turned on.

Meanwhile, the vehicle driving assistant device 100 may be a mobile terminal.

As shown in FIG. 3, the vehicle driving assistance apparatus 100 may include an input unit 110, a communication unit 120, an interface unit 130, a memory 140, a monitoring unit 150, a camera 160, a processor 170, a display unit 180, an audio output unit 185, and a power supply unit 190.

First, the vehicle driving assistance apparatus 100 may include an input unit 110 for sensing a user's input. The user can input an execution input to turn on / off the driving assistant function through the input unit 110 or to turn on / off the power of the driving assistant apparatus 100.

The input unit 110 may include at least one of a gesture input unit for sensing a user gesture, a touch input unit for sensing a touch, and a microphone for sensing a voice input.

Next, the vehicle driving assistance apparatus 100 may include a communication unit 120 that communicates with another vehicle 510, a terminal 600, a server 500, and the like. The vehicle driving assistance apparatus 100 may receive navigation information and/or traffic information through the communication unit 120. In addition, the communication unit 120 can receive parking status information about the parking space inside a building, transmitted from an external building management server (not shown).

In detail, the communication unit 120 can exchange data with the mobile terminal 600 or the server 500 in a wireless manner. In particular, the communication unit 120 may communicate with the vehicle 700 equipped with the vehicle driving assistance device to exchange data.

Wireless data communication methods include Bluetooth, WiFi, Direct WiFi, APiX, and NFC.

When parking status information is received from the server 500, the communication unit 120 may transmit information on the scheduled parking position of the vehicle 700, determined based on that parking status information, to the server 500.

The parking status information may include information on each parking zone of the parking lot in the building into which the vehicle 700 has entered, such as information on parking spaces occupied by other vehicles and information on empty spaces.
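One way such parking status information could be used to choose a scheduled parking position is sketched below. The dictionary format (spot ID mapped to an occupied flag) and the zone-preference rule are assumptions for illustration; the patent does not specify a data format:

```python
def pick_scheduled_spot(parking_status, preferred_zones=()):
    """Pick an empty spot, preferring zones from the user registration setting."""
    empty = [spot for spot, occupied in parking_status.items() if not occupied]
    for zone in preferred_zones:              # user-registered parking zones first
        in_zone = [s for s in empty if s.startswith(zone)]
        if in_zone:
            return min(in_zone)               # deterministic choice within a zone
    return min(empty) if empty else None      # otherwise any empty spot

status = {"B1-01": True, "B1-02": False, "B2-07": False}  # True = occupied
print(pick_scheduled_spot(status, preferred_zones=("B2",)))  # B2-07
```

The zone preference corresponds to the user registration setting mentioned later in the summary; a real system would also weigh distance to elevators, spot size, and similar factors.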

Next, the vehicle driving assistance apparatus 100 may include an interface unit 130 that receives vehicle-related data or transmits a signal processed or generated by the processor 170 to the outside.

More specifically, the vehicle driving assistance apparatus 100 may receive navigation information and/or sensor information via the interface unit 130.

The interface unit 130 can perform data communication with a control unit 770, an AVN (Audio Video Navigation) device 400, a sensor unit 760, and the like in the vehicle by a wired communication or a wireless communication method.

The interface unit 130 may receive the navigation information by the data communication with the controller 770, the AVN apparatus 400, and / or the separate navigation device.

Also, the interface unit 130 may receive the sensor information from the control unit 770 or the sensor unit 760.

Here, the sensor information may include at least one of: direction information of the vehicle 700, position information, vehicle speed information, acceleration information, tilt information, forward/backward movement information, fuel information, distance information to the front and rear vehicles, and distance information between the vehicle and the lane.

Such sensor information may be acquired from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, and the like. The position module may include a GPS module for receiving GPS information.

The interface unit 130 may receive a user input entered through the user input unit 110 of the vehicle 700. The interface unit 130 may receive the user input from the input unit 720 of the vehicle 700 or via the control unit 770. That is, when the input unit is arranged in the vehicle itself, the user input may be received through the interface unit 130.

The user input may include user registration information for determining a scheduled parking position, information for determining whether to execute the indoor route guidance function, information for determining whether the vehicle 700 is to be actually parked, and the like.

Next, the memory 140 may store various data for the operation of the vehicle driving assistance apparatus 100, such as a program for the processing or control of the processor 170.

In detail, the memory 140 may store indoor map data for a building into which the vehicle 700 has entered.

The memory 140 may be a variety of storage devices such as ROM, RAM, EPROM, flash drive, hard drive, and the like.

Next, the vehicle driving assistant apparatus 100 may include a monitoring unit 150 that photographs an in-vehicle image.

In detail, the monitoring unit 150 can detect and acquire biometric information of the user.

Such biometric information may include image information of the user, fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. That is, the monitoring unit 150 may include a sensor for sensing the biometric information of the user.

In addition, the monitoring unit 150 may acquire an image for biometrics of the user. That is, the monitoring unit 150 may include an image acquisition module disposed inside the vehicle.

Next, the vehicle driving assistant device 100 may include a camera 160 for acquiring the vehicle surroundings image. The obtained vehicle surroundings image can be processed by the processor 170 and used to generate image information.

Here, the image information may include at least one of the photographed object, the type of the object, the traffic signal information displayed by the object, the distance between the object and the vehicle, and the position of the object.

The processor 170 controls the overall operation of the vehicle driving assist system.

In particular, the processor 170 determines the state of the vehicle 700 equipped with the vehicle driving assistance apparatus and controls the operation for providing the driving information of the vehicle 700.

The camera 160 may include an interior camera that captures an image of the vehicle's surroundings from inside the vehicle.

In addition, the camera 160 may be provided at various positions outside the vehicle.

The plurality of cameras 160 may be disposed on at least one of the left, rear, right, front, and ceiling of the vehicle, respectively.

The left camera may be disposed in a case surrounding the left side mirror. Alternatively, the left camera may be disposed outside the case surrounding the left side mirror. Alternatively, the left camera may be disposed in one area outside the left front door, the left rear door, or the left fender.

The right camera can be disposed in a case surrounding the right side mirror. Or the right camera may be disposed outside the case surrounding the right side mirror. Alternatively, the right camera may be disposed in one area outside the right front door, the right rear door, or the right fender.

Further, the rear camera may be disposed near the rear license plate or the trunk switch. The front camera may be disposed near the emblem or the radiator grille.

The processor 170 may synthesize the images photographed in all directions and provide an around view image of the vehicle viewed from above (a top view). When the around view image is generated, boundary portions occur between the image regions. These boundary portions can be displayed naturally by image blending processing.
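The patent says only that the boundary portions are smoothed by "image blending processing"; one common choice is a linear alpha ramp across the overlap between two adjacent camera images, sketched here in pure Python on toy pixel columns:

```python
# Linear alpha blending across the overlap of two image strips: the weight
# ramps from 0 to 1 over the overlap width, so the seam fades from the left
# strip into the right strip. Pixel columns are plain lists for illustration.

def blend_boundary(left_col, right_col, width):
    """Blend two overlapping pixel columns over `width` interpolation steps."""
    out = []
    for i in range(width):
        alpha = i / (width - 1)      # 0 at the left edge, 1 at the right edge
        out.append([(1 - alpha) * l + alpha * r
                    for l, r in zip(left_col, right_col)])
    return out

strip = blend_boundary([100, 100], [200, 200], 5)
print(strip[2])  # midway through the seam: [150.0, 150.0]
```

A production around-view system would blend full 2D overlap regions (and usually after lens undistortion and ground-plane warping), but the per-pixel weighting is the same idea.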

Further, the ceiling camera may be disposed on the ceiling of the vehicle to photograph all the front, rear, left, and right sides of the vehicle.

Such a camera 160 may directly include an image sensor and an image processing module. The camera 160 may process still images or moving images obtained by an image sensor (e.g., CMOS or CCD). In addition, the image processing module may process the still image or the moving image obtained through the image sensor, extract required image information, and transmit the extracted image information to the processor 170.

The display unit 180 may have a mutual layer structure with the touch sensor or may be integrally formed to realize a touch screen. The touch screen may function as a user input for providing an input interface between the vehicle driving assistance device and the user and may provide an output interface between the vehicle driving assistance device and the user.

The display unit 180 displays (outputs) information processed by the vehicle driving assistance apparatus. For example, when the parking assistance function of the vehicle driving assistance apparatus is executed, a UI (User Interface) or GUI (Graphic User Interface) related to the indoor map data of the building into which the vehicle 700 has entered is displayed. The display unit 180 also displays captured and/or received images, or the corresponding UI and GUI.

The display unit 180 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be transparent or light-transmissive so that the outside can be seen through them. Such a display may be called a transparent display, and a typical example is the transparent OLED (TOLED). The rear structure of the display unit 180 may also be light-transmissive. With this structure, the user can see an object located behind the body of the vehicle driving assistance apparatus through the area occupied by the display unit 180.

There may be two or more display units 180 depending on the implementation of the vehicle driving assistance apparatus 100. For example, a plurality of display units may be spaced apart from one another or disposed integrally on one surface of the vehicle driving assistance apparatus 100, or may be disposed on different surfaces.

When the display unit 180 and a sensor for sensing a touch operation (hereinafter referred to as a 'touch sensor') form a mutual layer structure (hereinafter referred to as a 'touch screen'), the display unit 180 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in the pressure applied to a specific portion of the display unit 180, or a change in the capacitance generated at a specific portion of the display unit 180, into an electrical input signal. The touch sensor may be configured to detect not only the touched position and area but also the pressure and capacitance at the time of the touch.

When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the processor 170. In this way, the processor 170 can determine which area of the display unit 180 has been touched.

In addition, the processor 170 can determine the type of the user's touch input based on the area, pressure, and capacitance at the time of the touch. Accordingly, the processor 170 can distinguish between a finger touch, a nail touch, and a multi-touch using a plurality of fingers.
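A toy classifier along the lines just described might threshold the reported contact area and contact count. The 10 mm² threshold and the feature set are hypothetical; a real touch controller would use tuned, panel-specific parameters:

```python
# Hypothetical touch-type classification from contact area and the number
# of simultaneous contacts, mirroring the distinction drawn in the text.

def classify_touch(area_mm2: float, contacts: int) -> str:
    """Classify a touch event as multi-touch, nail touch, or finger touch."""
    if contacts > 1:
        return "multi-touch"                  # several fingers on the screen
    # a nail presents a much smaller contact area than a fingertip
    return "nail touch" if area_mm2 < 10.0 else "finger touch"

print(classify_touch(25.0, 1))  # finger touch
```

Pressure and capacitance, also mentioned in the text, could be added as further features in the same way.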

A proximity sensor may be disposed in an inner region of the apparatus covered by the touch screen, or in the vicinity of the touch screen. The proximity sensor is a sensor that detects, without mechanical contact and using the force of an electromagnetic field or infrared rays, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby. The proximity sensor has a longer lifespan than a contact-type sensor, and its utility is also higher.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is of the electrostatic type, it is configured to detect the proximity of a pointer by the change in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that a pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position of a proximity touch on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the proximity touch is made.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output unit 185 can output audio data received from the outside through the communication unit 120 or stored in the memory 140. The audio output unit 185 also outputs an acoustic signal related to a function (e.g., the navigation function) performed in the vehicle driving assistance apparatus. The audio output unit 185 may include a receiver, a speaker, a buzzer, and the like.

In addition, the power supply unit 190 may receive external power and internal power under the control of the processor 170 to supply power required for operation of the respective components.

Finally, the vehicle driving assistance apparatus 100 may include a processor 170 that controls the overall operation of each unit in the vehicle driving assistance apparatus 100.

In addition, the processor 170 may control at least some of the components discussed with reference to FIG. 2 in order to drive an application program. Further, the processor 170 may operate at least two of the components included in the vehicle driving assistance apparatus 100 in combination with each other to drive the application program.

Such a processor 170 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.

In particular, the processor 170 detects whether the vehicle 700 has entered the building, and, if the vehicle has entered the building, outputs the indoor map data corresponding to the building into which the vehicle has entered through the display unit 180.

The processor 170 receives sensor information of the vehicle through the interface unit.

At this time, the received sensor information is, unlike GPS information, information received through sensors that can operate indoors. The processor 170 then acquires the running information of the vehicle using the sensor information. The running information may be vehicle position information on the indoor map data.

Here, the sensor information may include steering sensor information, vehicle speed sensor information, yaw sensor information, gyro sensor information, wheel sensor information, and tilt sensor information of the vehicle.

That is, the processor 170 acquires the sensor information described above in order to track the position and obtain the running information of the vehicle 700 even when the GPS module is disconnected.

Since the sensor information includes the steering sensor information, vehicle speed sensor information, yaw sensor information, gyro sensor information, wheel sensor information, and tilt sensor information of the vehicle, the processor 170 can easily determine the current position of the vehicle on the indoor map data from the sensor information even when the connection of the GPS module is released.

In addition, the processor 170 may track the vehicle position in the indoor map data using the image captured through the camera.

The processor 170 outputs, on the display unit 180, driving information for reaching the scheduled parking position in the parking space, using the sensor information of the vehicle and the indoor map data.

Here, to detect whether the vehicle 700 has entered a building, the processor 170 periodically checks the connection status of the GPS (Global Positioning System) module of the vehicle: when the GPS module is connected, it is determined that the vehicle is outdoors, and when the connection is released, it is determined that the vehicle has entered a building.
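The periodic check above amounts to edge detection on the GPS link state. A minimal sketch, assuming a polling interface that simply reports whether a fix is currently available (the `GpsMonitor` class and its `poll` method are illustrative, not a real vehicle API):

```python
# Hypothetical sketch: detect building entry as a transition of the GPS
# link from connected to disconnected during periodic polling.
class GpsMonitor:
    def __init__(self):
        self._was_connected = False

    def poll(self, connected: bool) -> bool:
        """Return True exactly when a connected -> disconnected edge occurs,
        i.e., the moment the vehicle is judged to have entered a building."""
        entered = self._was_connected and not connected
        self._was_connected = connected
        return entered
```

Starting from the disconnected state avoids a spurious "entry" event before the first fix is ever obtained.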

In addition, the processor 170 determines building information about the building into which the vehicle 700 has entered in order to output the indoor map data. That is, the processor 170 determines the building information based on at least one of the position information at the time when the GPS (Global Positioning System) module connection was released, the destination information predetermined by the user, and the location information of a connected access point.

In other words, when the vehicle 700 enters the parking lot inside a building, the GPS connection is released, so the processor 170 records the final position at which the GPS connection was released and identifies the building information corresponding to that position.

Also, the vehicle 700 can move to a destination set by the user. The destination may be set by a specific address, a building name, and the like.

A general navigation system automatically terminates the route guidance function when the vehicle 700 approaches the building, even if the specific address is inside the building. Accordingly, the processor 170 of the present invention can identify the building into which the vehicle 700 has entered based on the destination toward which the vehicle 700 was moving under the route guidance function.

In addition, the processor 170 periodically attempts to connect to a peripheral access point (AP) through the communication unit, and when the communication unit is connected to a specific access point, the processor 170 can identify the building information based on the installation location of that access point.
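The three identification cues described above (last GPS fix, preset destination, connected access point) can be combined in a simple fallback chain. The lookup dictionaries below stand in for map-database queries and are purely illustrative:

```python
# Hypothetical sketch: identify the entered building from, in order of
# preference, the last GPS fix, the navigation destination, and the
# installation location of the connected access point.
def identify_building(last_gps_fix=None, destination=None, connected_ap=None,
                      buildings_by_location=None, buildings_by_ap=None):
    buildings_by_location = buildings_by_location or {}
    buildings_by_ap = buildings_by_ap or {}
    # 1) The final position before the GPS connection was released.
    if last_gps_fix is not None and last_gps_fix in buildings_by_location:
        return buildings_by_location[last_gps_fix]
    # 2) A destination set by the user already names a building.
    if destination is not None:
        return destination
    # 3) The building where the connected access point is installed.
    if connected_ap is not None and connected_ap in buildings_by_ap:
        return buildings_by_ap[connected_ap]
    return None
```

The ordering of the three cues is an assumption; the specification presents them as alternatives.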

When the vehicle 700 enters a building, the processor 170 determines whether the indoor map data corresponding to that building is stored in the memory 140. If the indoor map data of the building is not stored in the memory 140, the processor 170 downloads the indoor map data of the building from the server 500.

At this time, the processor 170 may download the indoor map data of the building corresponding to the destination at the time when the destination is set. Alternatively, the processor 170 may download the indoor map data at the time when the vehicle 700 enters the building.
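The memory-first policy above is a plain cache-or-download step. A minimal sketch, where the `fetch` callback stands in for the download from the server 500:

```python
# Hypothetical sketch: serve indoor map data from the local memory (cache)
# when available, and download and store it on a miss.
def get_indoor_map(building_id, cache, fetch):
    """Return indoor map data for a building, downloading it on a cache miss."""
    if building_id not in cache:
        cache[building_id] = fetch(building_id)  # download via server, then store
    return cache[building_id]
```

Whether the fetch happens at destination-setting time or at entry time only changes when this function is first called for a given building.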

In addition, the processor 170 may receive location information, parking charge information, and event information for each service area of the building along with the indoor map data and store the same in the memory 140.

Meanwhile, the processor 170 can automatically receive the scheduled parking position of the vehicle 700 at the time when the indoor map data is output. Alternatively, a specific parking position among the plurality of parking spaces may be input by the user and set as the scheduled parking position, or the scheduled parking position may be set based on previously registered user information.

To this end, the processor 170 receives parking status information for the parking space in the building from the server 500 through the communication unit 120. The parking status information may include information on the parking spaces already occupied by other vehicles in the parking lot of the building and information on the empty parking spaces.

Then, the server 500 allocates the scheduled parking position of the vehicle 700 among the vacant parking spaces using the user registration information, and transmits information on the allocated scheduled parking position to the communication unit 120.
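An illustrative allocator for the server-side assignment just described: given the occupancy information and the user's registration category, pick the first vacant space in a matching dedicated zone, falling back to any vacant general space. The zone names and data shapes are assumptions for the sketch:

```python
# Hypothetical sketch of the server-side allocation of a scheduled
# parking position among vacant spaces, using the user's category.
def allocate_space(spaces, occupied, category):
    """spaces: {space_id: zone_name}; occupied: set of occupied space_ids."""
    vacant = [s for s in spaces if s not in occupied]
    # Prefer a vacant space in the zone matching the user registration.
    for s in vacant:
        if spaces[s] == category:
            return s
    # Otherwise fall back to a vacant general space.
    for s in vacant:
        if spaces[s] == "general":
            return s
    return None  # parking lot full for this user
```

If the allocated space later becomes occupied, the same routine can simply be re-run with the updated occupancy set, matching the re-allocation flow described below.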

Alternatively, the processor 170 may display the parking status information transmitted from the server 500, and may receive a selection of a specific space among the empty parking spaces from the user to set the scheduled parking position.

 When the parking status of the predetermined scheduled parking position is changed, the server 500 transmits the changed information to the communication unit 120.

When the change information is received, the processor 170 displays the received change information through the display unit 180, thereby resetting the scheduled parking position of the vehicle 700.

Then, the processor 170 confirms the vehicle position in the indoor map data using the sensor information, and outputs driving information for guiding the vehicle to the predetermined parking position.

The driving information may include at least one of current position information of the vehicle 700, current traveling speed information, direction indicating information, lane information, and distance information to a direction changing point.

That is, the indoor map data is provided at the actual scale of the parking lot, and the sensor information is on-board diagnostics (OBD) information of the vehicle 700. Accordingly, the processor 170 can accurately determine from the OBD information the moving direction and how far the vehicle has moved forward, backward, left, or right, and can therefore provide indoor route guidance even in a building where no separate indoor positioning equipment is installed.
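The OBD-based tracking above is dead reckoning: wheel-speed information gives the distance travelled and the yaw-rate (gyro) information gives the heading change, so the pose on the real-scale indoor map can be advanced without GPS. A minimal sketch, assuming a simple Euler integration step per sensor sample:

```python
# Hypothetical dead-reckoning step: advance the vehicle pose on the
# indoor map from one OBD sensor sample (speed and yaw rate).
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_rps, dt):
    """Advance the pose (x, y, heading) by one sample of duration dt seconds."""
    heading_rad += yaw_rate_rps * dt          # gyro/yaw sensor: heading change
    x += speed_mps * dt * math.cos(heading_rad)  # wheel/speed sensor: distance
    y += speed_mps * dt * math.sin(heading_rad)
    return x, y, heading_rad
```

A production system would integrate more carefully (e.g., midpoint heading) and fuse the steering, wheel, and tilt sensors listed above; this only shows the principle.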

Meanwhile, when the departure of the vehicle parked at the scheduled parking position is detected, the processor 170 outputs driving information to the exit of the building using the sensor information and the indoor map data.

At this time, when a re-parking command is input during the departure of the vehicle, the processor 170 outputs driving information to the parking position where the vehicle 700 was previously parked. The previous parking position may, however, already be occupied by another vehicle. In this case, the processor 170 may automatically set another scheduled parking position or receive a new scheduled parking position from the user.

In addition, when the parking of the vehicle is completed, the processor 170 executes an indoor positioning system (IPS) and outputs movement guide information for the user to move to a specific service area in the building.

In addition, when the parking of the vehicle is completed, the processor 170 outputs parking-linked service information according to the use of the services provided in the building. The parking-linked services may include a parking fee guidance service, a service for delivering goods to the parking position of the vehicle, and a route guidance service to the parking position of the vehicle.

According to the embodiment of the present invention, by downloading the indoor map data and using the downloaded indoor map data to guide the vehicle to a parking space inside the building, the time required to find a parking space can be reduced dramatically.

In addition, according to the embodiment of the present invention, by assigning a specific parking zone among a plurality of parking zones according to a user registration setting, user satisfaction can be improved.

In addition, according to the embodiment of the present invention, since the parking fee is announced in real time and various services linked with parking are provided, the convenience of using the various services provided inside the building can be improved.

Meanwhile, the processor 170 may be controlled by the control unit or may control various functions of the vehicle through the control unit.

Hereinafter, the process by which the processor 170 controls the components to perform the route guidance function will be described in detail with reference to FIGS. 4 to 45.

FIG. 4 is a flowchart illustrating a method of operating a vehicle driving assist system according to an embodiment of the present invention.

Referring to FIG. 4, the processor 170 periodically senses the location of the vehicle 700 and determines whether the vehicle 700 has entered the interior of a specific building (e.g., the parking lot of a particular building) rather than remaining outdoors (step 110).

If the vehicle enters the inside of the building, the processor 170 recognizes the building information of the building in which the vehicle has entered (operation 120).

If the building information is recognized, the processor 170 displays the indoor map data corresponding to the confirmed building information (step 130). At this time, the indoor map data may have been previously downloaded and stored in the memory 140; if the indoor map data is not stored in the memory 140, the indoor map data corresponding to the building information can be downloaded.

Then, the processor 170 sets user information for selecting a parking position of the vehicle 700 among the plurality of parking spaces existing on the indoor map data (step 140). Here, the user information setting step is not an essential step, and the user information may have already been set.

Then, the processor 170 sets the scheduled parking position at which the vehicle 700 is to be parked among the plurality of parking spaces (step 150). Here, the scheduled parking position can be set by the processor 170 using the user information, can be directly input by the user, or can be received from the server 500 and set.

When the scheduled parking position is set, the processor 170 uses the sensor information of the vehicle 700, that is, the OBD information, together with the indoor map data provided at actual scale, to output driving information (route guidance information) for moving the vehicle to the scheduled parking position (step 160).

The driving information for moving the vehicle 700 to the scheduled parking position may include current position information, current traveling speed information, direction indication information, lane information, and distance information to a direction-change point.
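The driving-information fields listed above can be grouped in one record for output on the display unit. The field names and the warning threshold below are illustrative assumptions; the specification does not prescribe a data layout:

```python
# Hypothetical record grouping the driving-information fields listed above.
from dataclasses import dataclass

@dataclass
class DrivingInfo:
    position: tuple        # current position on the indoor map, (x, y)
    speed_kmh: float       # current traveling speed
    direction: str         # direction indication, e.g. "left" or "right"
    lane: int              # current lane
    dist_to_turn_m: float  # distance to the next direction-change point

    def near_turn(self, threshold_m: float = 30.0) -> bool:
        """True when the coming direction change should be announced."""
        return self.dist_to_turn_m <= threshold_m
```

A display routine can then render one `DrivingInfo` per guidance update.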

FIGS. 5 to 7 are flowcharts for explaining how it is detected that the vehicle has entered a building and how information on the entered building is determined.

Referring to FIG. 5, the processor 170 is connected to the vehicle 700 and periodically measures the connection state of the GPS module present in the vehicle 700 or in the vehicle driving assistance apparatus (step 210).

Then, the processor 170 determines whether the connection of the GPS module is released (step 220). In other words, the GPS module operates in a connected state while the vehicle is outside a building, and becomes disconnected when the vehicle enters the inside of a building. Accordingly, the processor 170 detects whether the vehicle 700 has entered a building based on the connection state of the GPS module.

When the GPS module is disconnected, the processor 170 confirms the position of the vehicle 700 at the time when the GPS connection is released (step 230). That is, the location information of the vehicle 700 can be checked periodically before the GPS connection is released. Accordingly, the position of the vehicle 700 finally obtained before the connection of the GPS is released is confirmed.

Then, the processor 170 confirms the building information at the identified location (step 240).

In step 250, the processor 170 determines the building information as information on the building into which the vehicle 700 has entered.

That is, when the vehicle 700 enters the inside of a building, the processor 170 uses the point at which the connection of the GPS module was released to determine the building existing at that point as the building into which the vehicle has entered.

Also, referring to FIG. 6, the processor 170 receives destination information through communication with a first device (step 310). Here, the first device may be a navigation device present in the vehicle 700. In addition, if the vehicle driving assistance apparatus itself provides a navigation function, the first device may be the vehicle driving assistance apparatus itself.

On the other hand, the user can set the destination he / she wants to go through the navigation or the vehicle driving assistant. The destination may be address information or may be specific building information.

At this time, the general route guidance function performs the route guidance function only to the periphery of the set destination, and does not provide the route guidance function to the inside of the building when the destination is a building.

On the other hand, if the destination is set and the destination is a specific building, the vehicle 700 is highly likely to enter the building.

Accordingly, the processor 170 receives the set destination information if the destination of the vehicle 700 is set.

Then, the processor 170 determines whether the received destination information is a specific building (step 320).

If the destination is a specific building, the processor 170 determines that the vehicle 700 has entered the building corresponding to the set destination (operation 330).

Accordingly, when the vehicle 700 moves according to a destination setting, the processor 170 can detect in advance that the vehicle 700 is going to enter the building corresponding to the destination, and can thereby determine the building into which the vehicle 700 enters.

Referring to FIG. 7, in step 410, the processor 170 periodically attempts to establish a connection with neighboring access points in order to identify the building into which the vehicle 700 enters.

Then, the processor 170 determines whether the communication unit 120 has successfully connected to the specific access point (step 420).

Then, if the communication unit 120 has successfully connected to the specific access point, the processor 170 determines a building existing at a location where the connected access point is installed (operation 430).

Then, the processor 170 determines that the identified building is the building into which the vehicle has entered (step 440).

As described above, when the vehicle 700 enters a specific building, the access point existing in that building and the communication unit 120 will communicate with each other. Accordingly, the processor 170 may determine the building into which the vehicle 700 has entered based on the installation position of the access point to which the communication unit 120 is connected.

When the vehicle 700 enters the inside of a specific building, the processor 170 outputs the indoor map data of that building on the display unit 180, as described above. To output the indoor map data, the data must be stored in the memory 140; however, when the vehicle enters the building for the first time, there is a high possibility that the indoor map data is not yet stored in the memory 140.

Accordingly, the processor 170 downloads the indoor map data of the building in which the vehicle 700 enters.

At this time, the downloading of the indoor map data may proceed at different points of time as follows.

FIGS. 8 to 11 are diagrams for explaining a method of outputting indoor map data according to the first embodiment of the present invention.

Referring to FIG. 8, if the destination of the vehicle 700 is set by the user (step 510), the processor 170 determines whether the destination is a building (operation 520).

If the set destination is the specific building, the processor 170 accesses the server 500 to download the indoor map data of the building corresponding to the destination, and stores the indoor map data in the memory 140 in step 530.

Referring to FIG. 9, the destination of the vehicle 700 may be set as an LG department store. At this time, the vehicle 700 is highly likely to enter the interior of the LG department store later. Accordingly, the processor 170 outputs a screen 900 asking whether to download the indoor map data of the interior of the building of the LG department store via the display unit 180.

Referring to FIG. 10, when a command to execute the indoor map data download is input on the screen 900, the processor 170 downloads the indoor map data from the server 500 and displays a download screen 1000 on the display unit 180.

Referring to FIG. 11, when the download is completed, the processor 170 may display the downloaded indoor map data on the display unit 180 with priority.

At this time, the screen 1100 may include information on the parking spaces existing in the building. That is, the parking space may include a disabled parking area 1110 for vehicles of disabled persons, a VIP-exclusive parking area 1120 for vehicles of VIP customers, an infant companion area for customer vehicles accompanied by an infant, an elderly/pregnant parking area 1140 for vehicles of elderly persons or pregnant women, and a general parking area 1150 for vehicles of general customers.

Accordingly, using the indoor map data, the user can grasp at a glance the parking spaces in the building into which the vehicle has entered or is about to enter.

FIGS. 12 to 15 are diagrams for explaining a method of outputting indoor map data according to a second embodiment of the present invention.

Referring to FIG. 12, the processor 170 determines the position of the vehicle 700 (step 610) and determines whether the vehicle 700 has entered the building (operation 620).

If the vehicle 700 enters the inside of the building, the processor 170 identifies the building into which the vehicle 700 entered (operation 630).

If the building is confirmed, the processor 170 downloads the indoor map data of the identified building through the server and stores it in the memory 140 (operation 640).

Referring to FIG. 13, if the vehicle 700 enters the inside of a specific building, the processor 170 displays a screen 1300 indicating that the vehicle 700 has entered the specific building and asking whether to download the indoor map data.

When a download execution command is input on the screen 1300, the processor 170 downloads the indoor map data of the building from the server 500 and stores the downloaded indoor map data in the memory 140.

Referring to FIG. 14, when the download of the indoor map data is completed, the processor 170 displays a screen 1400 asking whether to execute the parking guide function using the downloaded indoor map data.

At this time, on the screen 1400, a shortcut menu 1410 for directly executing the parking guide function is included.

Meanwhile, when the vehicle 700 enters a specific building that it has entered before, the indoor map data for that building may already be stored in the memory 140.

Referring to FIG. 15, if the indoor map data for the building into which the vehicle 700 has entered is already stored in the memory 140, the processor 170 displays a screen 1500 informing that the indoor map data of the entered building is already stored in the memory 140 and that the parking guide function in the building is executed using the stored indoor map data.

FIG. 16 is a view showing a scheduled parking position setting screen according to an embodiment of the present invention.

Referring to FIG. 16, the scheduled parking position of the vehicle 700 must be set first when the parking guide function is executed. At this time, the scheduled parking position can be arbitrarily assigned by the server, or the user can set it manually.

When the scheduled parking position of the vehicle 700 is set, a screen 1600 including information on the predetermined scheduled parking position is displayed.

The screen 1600 includes scheduled parking position information 1610 for the parking space where the vehicle 700 is to be parked.

FIG. 17 is a diagram illustrating a user information setting screen according to an embodiment of the present invention.

Referring to FIG. 17, in order to set the scheduled parking position, the processor 170 displays a screen 1700 for setting user information.

The user information setting screen 1700 may include an item for setting the type of the vehicle 700, an item for setting the gender of the driver, and items for entering information such as whether the driver is an elderly person, a disabled person, or a pregnant woman.

In addition, the user information setting screen 1700 may include additional setting items such as a parking guidance setting item, an automatic download setting item, an automatic updating setting item, and a setting item for parking only in the indoor parking lot.

The processor 170 displays the screen 1700 described above at the time of initial use of the parking guide function so that the user information can be set. The processor 170 applies parking position recommendations differently according to the driver; for example, in the case of an elderly person, a disabled person, a pregnant woman, or a female driver, the processor 170 sets the scheduled parking position of the vehicle 700 accordingly.

In addition, the user registration information is activated by collating it with the user information of the vehicle driving assistance device (e.g., a mobile terminal) that receives the indoor map data, and an ARS identity verification service and the like can further be provided.

In addition, according to the set items, the processor 170 automatically downloads the indoor map data of the building when the vehicle 700 enters another parking lot. The processor 170 can also selectively download only indoor parking lot data according to the setting item.

Also, if the automatic execution item is checked, the processor 170 activates the parking guide function immediately when the vehicle 700 enters the inside of a building, and if the indoor map data of the entered building is not stored, the indoor map data can be automatically downloaded.

When the user information is set, the same set value may be applied to each building in which the vehicle 700 enters.

On the other hand, the scheduled parking position setting may be different according to the driver of the vehicle 700 and the passenger.

That is, the processor 170 assigns priorities according to the numbers of elderly persons, disabled persons, pregnant women, passengers, and accompanying children, and sets the scheduled parking position (e.g., a dedicated parking area) according to the priority.
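The occupant-based prioritization above can be sketched as a simple weighted scoring over occupant categories, with the highest-scoring category determining the recommended dedicated zone. The weights and zone names are assumptions invented for the sketch; the specification only states that priorities are assigned:

```python
# Hypothetical priority scoring: map occupant counts to a recommended
# dedicated parking zone. Weights and zone names are illustrative.
def recommend_zone(occupants):
    """occupants: dict with counts for 'elderly', 'disabled', 'pregnant', 'children'."""
    zone_for = {"disabled": "disabled", "pregnant": "pregnant/elderly",
                "elderly": "pregnant/elderly", "children": "infant-companion"}
    weights = {"disabled": 4, "pregnant": 3, "elderly": 2, "children": 1}
    best, best_score = "general", 0
    for kind, count in occupants.items():
        score = weights.get(kind, 0) * count
        if score > best_score:
            best, best_score = zone_for[kind], score
    return best
```

With no prioritized occupants the recommendation falls back to the general parking area.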

Upon entering a building for the first time, the processor 170 outputs a screen (not shown) for selectively inputting driver- and passenger-related information, and guides the vehicle to a scheduled parking position according to the priority analysis.

Then, the processor 170 automatically recognizes a vehicle bearing a registered disabled-person sign without user input, so that the scheduled parking position of the vehicle 700 can be set to a disabled parking space.

Further, in the embodiment, in the case of a disabled person's vehicle for which a disabled-person sign has not been issued, the registered vehicle of the disabled person can be automatically recognized in association with the Ministry of Health and Welfare, so that the scheduled parking position is set to a disabled parking space.

In addition, for a customer or internal person designated as a VIP of the building, the processor 170 automatically associates with the internal system of the building so that the scheduled parking position of the vehicle can be set as a VIP-exclusive parking space without user input.

In addition, the child companion vehicle can be automatically recognized according to the installation of the car seat and whether or not the child is aboard.

FIG. 18 is a diagram showing indoor map data displayed according to an embodiment of the present invention.

Referring to FIG. 18, the processor 170 displays the indoor map data 1800 of the building into which the vehicle has entered. At this time, the indoor map data may include parking target information for each zone as described above, and may include parking charge information for each parking zone.

As shown in FIG. 18, the processor 170 may identify favorable parking positions (for example, in the case of a complex above-ground/underground parking lot, a location near the first floor or a building entrance) at which the user can readily receive the services of the building.

At this time, the parking fee may be charged differently according to the floor and the parking position within each zone. In the case of a general building, a more expensive parking charge can be allocated to a floor from which the building is easy to enter, or to a position close to the entrance.
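The floor- and distance-dependent pricing above can be sketched as a small fee schedule. The base rate and surcharges below are invented for the sketch; the specification only states that accessible floors and entrance-adjacent positions may be priced higher:

```python
# Hypothetical fee schedule: spaces on the easily accessible floor and
# close to the entrance receive a surcharge over the base hourly rate.
def hourly_fee(floor: int, dist_to_entrance_m: float,
               base: float = 1000.0) -> float:
    """floor: 0 = ground floor, negative values = basement levels."""
    fee = base
    if floor == 0:
        fee += 500.0   # ground floor is easiest to enter from
    if dist_to_entrance_m < 30:
        fee += 300.0   # premium for spaces near the entrance
    return fee
```

Such a schedule would feed the per-zone parking charge information shown on the indoor map data.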

In the present invention, a first-time visitor to the building can select a desired parking fee plan when downloading the indoor map data. A vehicle that has visited the building multiple times can be automatically guided to a previously used parking location, and a screen for reselecting a location for each parking fee plan can be separately displayed.

When a specific parking area is selected by the user in FIG. 18, a screen 1900 including information on the selected parking area is displayed as shown in FIG.

At this time, detailed information 1910 on the parking zone selected by the user (parking fee, parking zone, distance from the entrance, and the like) can be output together with the screen 1900. The detailed information may be displayed as an image, or alternatively it may be output as a voice.

FIG. 20 is a view illustrating a route guidance screen according to an embodiment of the present invention.

Referring to FIG. 20, when the setting of the scheduled parking position is completed, the processor 170 displays a route guidance screen 2000.

The route guidance screen 2000 basically includes the indoor map data, and includes a mini-map menu 2010 for viewing mini-map data and information 2020 on the current position of the vehicle 700.

The route guidance screen 2000 includes information on a direction in which the vehicle 700 should travel, and the mini-map may be displayed through the mini-map menu 2010 as described above. The mini-map will be described in more detail below.

Hereinafter, a process of setting the scheduled parking position by the user will be described.

FIGS. 21 to 27 are views for explaining a scheduled parking position setting process according to an embodiment of the present invention.

First, referring to FIG. 21, a screen 2100 including parking space information for a floor nearest to the ground is displayed to set a scheduled parking position.

For example, the screen 2100 may include parking space information for the first-floor parking lot. The parking space information may include location information of each parking zone of the first-floor parking lot and parking charge information for each parking zone.

In the state where the screen 2100 is displayed, the user can select a desired parking position by selecting a specific parking area. At this time, if the desired parking space does not exist in the parking space displayed on the screen 2100, the parking space of another floor can be displayed.

That is, in the state that the one-storey car park is displayed, the user can perform touch movement while selecting a specific area of the screen.

That is, as shown in FIG. 21, when a downward touch movement is performed, the screen 2110 of the parking space one floor below the currently displayed parking lot is displayed. For example, when the downward touch movement is performed while the first-floor parking lot is displayed, the parking space screen 2110 of the floor immediately below is displayed.

Also, referring to FIG. 22, the touch movement may be performed from the lower side to the upper side. When such an upward touch movement is performed, a parking space screen for the floor above the currently displayed parking lot is displayed.

That is, if an upward touch movement is performed while the screen 2200 for the parking space of the basement parking lot is displayed, the screen 2210 for the parking space of the first-floor parking lot is displayed.

Meanwhile, as described above, the user can move through the indoor map data by touch movements in the vertical direction to move between floors, and can select the floor on which he or she wishes to park the vehicle.

In addition, the user can move the indoor map data to a desired area within the same floor.

That is, referring to FIG. 23, the user can perform a touch movement in the left-right direction instead of the vertical direction described above. If a touch movement from left to right is performed while the screen 2300 for the parking space in the central area of the first-floor parking lot is displayed, the screen 2310 for the parking space in the right area of the same floor is displayed.

Likewise, referring to FIG. 24, if a touch movement is made from right to left while the screen 2400 for the parking space in the central area of the first-floor parking lot is displayed, a screen 2410 for the parking space in the left area is displayed.
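The floor and area navigation described above can be sketched as a simple mapping from touch-movement direction to the displayed portion of the indoor map. The sketch below is illustrative only; the function name, floor labels, and area labels are assumptions, not part of the disclosure.

```python
# Illustrative sketch (not from the disclosure): a downward swipe shows the
# floor below, an upward swipe the floor above; left/right swipes pan
# between parking areas on the same floor, clamped at the map edges.

FLOORS = ["B2", "B1", "1F", "2F"]      # hypothetical floor order, bottom to top
AREAS = ["left", "center", "right"]    # hypothetical areas per floor

def navigate(floor: str, area: str, swipe: str) -> tuple[str, str]:
    f, a = FLOORS.index(floor), AREAS.index(area)
    if swipe == "down":                 # downward touch movement -> lower floor
        f = max(f - 1, 0)
    elif swipe == "up":                 # upward touch movement -> upper floor
        f = min(f + 1, len(FLOORS) - 1)
    elif swipe == "left_to_right":      # pan to the area on the right
        a = min(a + 1, len(AREAS) - 1)
    elif swipe == "right_to_left":      # pan to the area on the left
        a = max(a - 1, 0)
    return FLOORS[f], AREAS[a]
```

The same direction convention also matches the mini-map behavior: vertical movement changes the floor, horizontal movement changes the zone within a floor.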

Also, as described above, the user can move the indoor map data to the desired floor and the desired area within the floor, so that the user can select the area to be actually parked.

If a specific zone on a specific floor is selected, the display unit 180 displays detailed information of the selected zone.

Referring to FIG. 25, when the first basement floor is selected and the B zone is selected on that floor, a screen 2500 including detailed parking space information on the B zone is displayed.

The screen 2500 includes information 2510 on available parking positions for all the parking spaces existing within the B zone of that floor. Accordingly, the user can select a desired scheduled parking position by selecting any one item of the information 2510.

On the other hand, the scheduled parking position selection described above can also be performed on the mini-map.

FIG. 26 shows the initially displayed mini-map menu. In the mini-map 2600, floor icons 2610 are displayed in the vertical direction, and parking zone icons 2620 within a floor are displayed in the horizontal direction. At this time, a specific parking zone is selected by default. For example, as shown in FIG. 26, the A zone of the B1 floor can be selected and displayed by default.

When a left-to-right touch movement is performed in the state where the mini-map is displayed as described above, a horizontal movement is performed, so a mini-map 2630 in which the parking zone has been moved in the touch-movement direction within the same floor is displayed.

That is, when the touch movement is performed from left to right as described above, the zone to the right of the currently selected zone is selected; in the state where the A zone of the B1 floor is selected as shown in the drawing, a mini-map 2630 in which the zone to the right of the A zone is selected may be displayed.

Referring to FIG. 27, when a touch movement is made from the lower side to the upper side in the state where the mini-map 2700 with the A zone of the B1 floor selected is displayed, a vertical movement is performed, so a mini-map 2710 in which floor movement, rather than zone movement, has been performed is displayed.

That is, when the upward touch movement is performed as described above, the floor above the currently selected floor is selected; in the state where the A zone of the first basement floor is selected as shown in the drawing, the A zone of the floor above may be selected.

FIGS. 28 to 32 are views for explaining a process of displaying driving information of a vehicle according to an embodiment of the present invention.

Referring to FIG. 28, when the expected parking position is set as described above, the processor 170 grasps the real-time movement information of the vehicle using the sensor information and the indoor map data, and determines the position of the vehicle accordingly.
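The real-time position determination described above can be sketched as dead reckoning from the vehicle's speed and heading sensor readings, since GPS is typically unreliable inside a parking structure. The function name and the simple update model below are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch (assumption): advance the vehicle's indoor-map
# position by one sensor interval using speed and heading readings.
import math

def update_position(x, y, speed_mps, heading_deg, dt):
    """Return the new (x, y) map position after dt seconds of travel."""
    h = math.radians(heading_deg)       # heading measured from the map x-axis
    return (x + speed_mps * dt * math.cos(h),
            y + speed_mps * dt * math.sin(h))
```

In practice such an estimate would be periodically corrected against the indoor map data, but the correction step is omitted here for brevity.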

The display unit 180 displays a driving information screen 2800 for moving the vehicle to the scheduled parking position under the control of the processor 170.

The driving information screen 2800 displays the current position information 2810 of the vehicle, the whole map information 2820 including the current position of the vehicle and the expected parking position, the distance information 2830 to the next direction-changing point, and the direction instruction information 2840.

As described above, the user can move the vehicle to the scheduled parking position while viewing the driving information screen 2800.

That is, when the predetermined or allocated scheduled parking position is underground, the driving information screen 2800 can provide, through the whole map, information on the position from which the vehicle will later exit the parking lot. On the other hand, when the user wishes to reset the scheduled parking position, a new parking position can be set again through a separate position-selection button.

When the scheduled parking position is finally set, guidance to the scheduled parking position is started. Information on the moving route is displayed in the floor area of the map in the driving information screen 2800 as described above, and information on speed can be additionally displayed. The driving information screen 2800 may also be displayed on a Head-Up Display (HUD).

When the vehicle moves down to a lower floor, an enlarged moving path can be displayed through the 3D map view (or mini-map), and when the vehicle reaches the lower floor, the display can switch from the 3D top view to the camera view to show information on the moving direction. In addition, when the route guidance starts, communication with the vehicle is started and the emergency blinking lamp of the vehicle is automatically operated so that vehicles entering to park can be easily distinguished from vehicles leaving.

On the other hand, the movement path can be displayed so that the user can intuitively confirm it. That is, a general parking lot has limited turning sections, or information on the turning sections is not clearly provided, so the user often passes a left-turn or right-turn point.

Thus, as shown in FIG. 29 (a), a blinking pattern is displayed in the left area of the display area for a left-turn point of the vehicle, thereby accurately guiding the left turn. Also, as shown in FIG. 29 (b), a blinking pattern is displayed in the right area of the display area for a right-turn point. Further, as shown in FIG. 29 (c), a blinking pattern is displayed in both the left and right areas of the display area when approaching the predetermined scheduled parking position, thereby guiding the vehicle not to overshoot the scheduled parking position.
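The edge-blinking behavior of FIG. 29 can be sketched as a small lookup from the upcoming guidance event to the display edges that should blink. The event names and function name are assumptions introduced only for illustration.

```python
# Illustrative sketch (names are assumptions): choose which edges of the
# display area blink for each guidance event, per FIG. 29 (a)-(c).
def blink_regions(event: str) -> set[str]:
    if event == "left_turn":
        return {"left"}              # FIG. 29 (a): blink the left edge
    if event == "right_turn":
        return {"right"}             # FIG. 29 (b): blink the right edge
    if event == "arriving":          # FIG. 29 (c): near the scheduled position
        return {"left", "right"}     # blink both edges so the driver stops in time
    return set()                     # no guidance event: no blinking
```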

Meanwhile, the above driving information may be provided through Augmented Reality (AR).

Referring to FIG. 30, the processor 170 displays the destination (scheduled parking position) of the vehicle as an arrow on the ground, and provides information on the route, including forward movement and left and right turns of the vehicle, as augmented reality. In the embodiment, the parking facility can be operated flexibly according to the number of vehicles entering the parking lot. For example, if many vehicles are attempting to park and the disabled zone is filled earlier than usual, a general parking zone can be utilized in its place.

Accordingly, as shown in FIG. 31, a parking space in the general parking zone can be re-designated, through augmented reality, to serve as a disabled zone.

On the other hand, when the scheduled parking position is set, the processor 170 guides the vehicle to park at the scheduled parking position. As shown in FIG. 32, when the vehicle is parked at a position other than the scheduled parking position, a warning screen 3200 is displayed so that the vehicle is moved and re-parked at the scheduled parking position.

In addition, the processor 170 provides penalty information if the vehicle is not re-parked at the scheduled position. At this time, the position where the vehicle actually parked may already have been set as the scheduled parking position of another vehicle. Accordingly, the processor 170 provides the server with information on the position where the vehicle actually parked so that the scheduled parking position of the other vehicle can be reset.

FIGS. 33 to 35 are views for explaining a parking method according to an embodiment of the present invention.

First, as shown in FIG. 33, when the parked vehicle leaves, the processor 170 searches for an exit in the indoor map data and accordingly obtains and provides travel route information 3300 to the exit.

At this time, the movement route information 3300 includes a re-parking menu 3310, thereby easily providing information for re-parking even while the vehicle is leaving. That is, when the user wishes to park again, for example because of a forgotten item, the processor 170 guides the vehicle to a parking position in consideration of the current position and direction of the vehicle.

At this time, the re-parking position can be guided based on the previously parked position, or the user can instead be guided to a different, new position.

That is, as shown in FIG. 34, when the previous parking position of the vehicle is not occupied by another vehicle, the processor 170 displays a guidance screen 3400 for guiding the vehicle back to that position. To this end, in the embodiment, other vehicles are prevented from parking at the previous parking position of the vehicle for a certain period of time after the vehicle leaves.

Also, as shown in FIG. 35, when the previous parking position of the vehicle has already been taken by another vehicle, the processor 170 displays a guide screen 3500 for guiding the vehicle to a new parking position adjacent to the previous one.
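The re-parking choice of FIGS. 34 and 35 can be sketched as a simple selection rule: return to the previous spot if it is still free, otherwise pick the free spot nearest to it. The function name and the numeric spot identifiers in the test are assumptions for illustration only.

```python
# Illustrative sketch (assumption): select a re-parking target.
def repark_target(previous, free_spots, distance):
    """previous: id of the previously used spot; free_spots: iterable of
    currently free spot ids; distance: function giving the distance between
    two spots on the indoor map."""
    free = set(free_spots)
    if previous in free:
        return previous                              # FIG. 34: previous spot held free
    # FIG. 35: previous spot preempted -> nearest free spot to it
    return min(free, key=lambda s: distance(previous, s))
```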

Meanwhile, in the embodiment, various parking link services connected with parking of the vehicle are provided. The parking link service may include at least one of a parking fee guidance service, a route guidance service to the parking position of the vehicle, a route guidance service to a specific service area according to user registration information, and a delivery service for delivering purchased goods to the parking position of the vehicle.

FIGS. 36 to 44 are views for explaining a parking link service according to an embodiment of the present invention.

Referring to FIG. 36, the parking link service may be a parking fee guidance service. The processor 170 displays a fee guidance screen 3600 including the parking position information of the vehicle and parking fee information calculated on the basis of the parking fee per hour.

At this time, the fare guidance screen 3600 includes a guidance menu 3610 for providing route guidance information to the parking position of the vehicle.

In addition, various service areas may be included in the building where the vehicle is parked, and the parking fee may be discounted according to the expenses incurred in using the services in those service areas.

Referring to FIG. 37, the processor 170 periodically receives information on the expenses incurred by the user in using the services, and displays a guide screen 3700 including parking discount information according to the received information. At this time, the guidance screen 3700 also includes a guidance menu 3710 for providing guidance information to the parking position of the vehicle.

The parking link service may be a route guidance service to the parking position of the vehicle. Referring to FIG. 38, when the guidance menu 3610 or 3710 is selected, the processor 170 displays a route guidance screen 3800 for guiding the user to the parking position of the vehicle on the basis of the current position of the user.

At this time, the route guidance screen 3800 can toggle between the augmented reality view and the map view 3810.

In other words, parking fee information of the vehicle is received in real time while the user remains in the building after parking the vehicle. In the embodiment, a service for guiding the route to the parking position of the vehicle together with the parking fee information guidance is provided.

As shown in FIG. 39, the processor 170 provides information on whether or not the service is used and the parking fee according to the elapsed time of the parking time.

That is, as shown in FIG. 39 (a), when the user uses a specific service, the information 3900 on the free parking time can be displayed according to the usage amount of the service. The free parking time is provided in the form of a bar, and may increase as the service usage amount increases and decrease as the parking time elapses.

As shown in FIG. 39 (b), the processor 170 decreases the free parking time shown on the bar in accordance with the elapse of the parking time, and displays the information 3910 regarding the changed free parking time accordingly.

In addition, as shown in FIG. 39 (c), when the free parking time has elapsed, the processor 170 displays information 3920 including the elapsed time and the additional consumption amount.

In addition, as shown in FIG. 39 (d), additional services such as associated coupons and store guidance can be provided to encourage the user to increase consumption so as to make the parking fee free, and the final parking information 3930 may be provided.
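The free-parking-time "bar" of FIG. 39 can be sketched as a small calculation: spending in the building earns free minutes, elapsed parking time consumes them, and time beyond the earned allowance is charged. The rates, function name, and currency amounts below are assumed policy values invented for the example, not figures from the disclosure.

```python
# Illustrative sketch (rates are assumptions, not from the disclosure).
FREE_MIN_PER_10000_WON = 30   # assumed: 30 free minutes per 10,000 won spent
FEE_PER_MIN = 50              # assumed fee in won per minute past the allowance

def parking_status(spent_won: int, elapsed_min: int) -> dict:
    earned = spent_won // 10000 * FREE_MIN_PER_10000_WON
    remaining = max(earned - elapsed_min, 0)   # FIG. 39 (b): bar shrinks with time
    excess = max(elapsed_min - earned, 0)      # FIG. 39 (c): time past the allowance
    return {"free_min_left": remaining, "fee": excess * FEE_PER_MIN}
```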

Meanwhile, the present invention can provide a route guidance service to a specific service area according to user registration information.

That is, as shown in FIG. 40, the user registration information may be a to-do list 4000, and the processor 170 may provide, using the information included in the to-do list 4000, directions for moving to the corresponding service areas.

Referring to FIG. 41, the processor 170 displays a guidance screen 4100 asking whether or not to provide the guidance service linked to the to-do list 4000.

When a guidance command is input on the guidance screen 4100, the guidance information linked to the to-do list 4000 is provided.

That is, as shown in FIG. 42, the to-do list 4000 may be information on items to be bought, and the processor 170 may generate and provide route guidance information 4200 for moving to the service area in which the goods are sold. For example, the item may be a TV, in which case the route guidance information 4200 includes travel route information to a home appliance store for purchasing the TV.
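The item-to-destination lookup of FIG. 42 can be sketched as a directory mapping purchasable items to service areas in the building. The store directory, its entries, and the function name are invented for the example; items without a matching store are simply skipped.

```python
# Illustrative sketch (directory contents are assumptions): map to-do-list
# items to in-building destinations for route guidance.
STORE_DIRECTORY = {                     # hypothetical item -> store location
    "TV": "home appliance store, 3F",
    "groceries": "supermarket, B1",
}

def destinations(todo_items):
    """Return a destination for each to-do item sold in the building."""
    return {item: STORE_DIRECTORY[item]
            for item in todo_items if item in STORE_DIRECTORY}
```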

Meanwhile, the user registration information may be characteristic information of the driver and the passenger. In an embodiment, the customized service is provided according to the characteristic information.

That is, as shown in FIG. 43, a screen 4300 asking whether a child is included among the passengers can be displayed. Accordingly, when a child is included, a selection screen 4310 of services suitable for children may be provided.

In response to the selection of a specific service on the selection screen 4310, the mobile terminal provides movement route information 4320 for moving to a service area providing the selected service.

Meanwhile, the parking link service may be a goods delivery service.

Referring to FIG. 44, when the user purchases an item in the building and then moves to the parking position, a guide screen 4400 for a delivery service that transfers the purchased item to the parking position is provided.

According to the embodiment of the present invention, by downloading the indoor map data and using the downloaded indoor map data to guide the vehicle to a parking space inside the building, the time required for parking can be saved dramatically.

In addition, according to the embodiment of the present invention, by assigning a specific parking zone among a plurality of parking zones according to a user registration setting, user satisfaction can be improved.

In addition, according to the embodiment of the present invention, since the parking fee is not only notified in real time but various linked services connected with parking are also provided, convenience in using the various services provided inside the building can be improved.

FIG. 45 is an example of an internal block diagram of the above-described vehicle.

The vehicle auxiliary apparatus 100 described above may be included in the vehicle.

The vehicle includes a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driving unit 750, a memory 730, an interface unit 780, a control unit 770, a power source unit 790, the vehicle auxiliary device 100, and an AVN device 400.

The communication unit 710 may include one or more modules that enable wireless communication between the vehicle and the mobile terminal 600, between the vehicle and the external server 510, or between the vehicle and another vehicle 520. In addition, the communication unit 710 may include one or more modules that connect the vehicle to one or more networks.

The communication unit 710 may include a broadcast receiving module 711, a wireless Internet module 712, a local area communication module 713, a location information module 714, and an optical communication module 715.

The broadcast receiving module 711 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 712 is a module for wireless Internet access, and can be built in or externally mounted in a vehicle. The wireless Internet module 712 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 712 transmits and receives data according to at least one wireless Internet technology in a range that also includes Internet technologies not listed above. For example, the wireless Internet module 712 can exchange data with the external server 510 wirelessly. The wireless Internet module 712 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group) information) from the external server 510.

The short-range communication module 713 is for short-range communication, and can support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), NFC (Near Field Communication), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.

The short range communication module 713 may form short range wireless communication networks (Wireless Area Networks) to perform short range communication between the vehicle and at least one external device. For example, the short-range communication module 713 can exchange data with the mobile terminal 600 wirelessly. The short distance communication module 713 can receive weather information and traffic situation information of the road (for example, TPEG (Transport Protocol Expert Group)) from the mobile terminal 600. For example, when the user has boarded the vehicle, the user's mobile terminal 600 and the vehicle can perform pairing with each other automatically or by execution of the user's application.

The position information module 714 is a module for acquiring the position of the vehicle, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from the GPS satellite.

The optical communication module 715 may include a light emitting portion and a light receiving portion.

The light receiving section can convert the light signal into an electric signal and receive the information. The light receiving unit may include a photodiode (PD) for receiving light. Photodiodes can convert light into electrical signals. For example, the light receiving section can receive information of the front vehicle through light emitted from the light source included in the front vehicle.

The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The light emitting unit converts the electric signal into an optical signal and transmits it to the outside. For example, the light emitting unit can emit the optical signal to the outside through blinking of the light emitting element at a predetermined frequency. According to an embodiment, the light emitting unit may include a plurality of light emitting element arrays. According to the embodiment, the light emitting unit can be integrated with a lamp provided in the vehicle. For example, the light emitting unit may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a side marker lamp. For example, the optical communication module 715 can exchange data with another vehicle 520 via optical communication.

The input unit 720 may include a driving operation unit 721, a camera 195, a microphone 723, and a user input unit 724.

The driving operation means 721 receives a user input for driving the vehicle. The driving operation means 721 may include a steering input means 721A, a shift input means 721D, an acceleration input means 721C, and a brake input means 721B.

The steering input means 721A receives the input of the traveling direction of the vehicle from the user. The steering input means 721A is preferably formed in a wheel shape so that steering input is possible by rotation. According to the embodiment, the steering input means 721A may be formed of a touch screen, a touch pad, or a button.

The shift input means 721D receives input of parking (P), drive (D), neutral (N), and reverse (R) of the vehicle from the user. The shift input means 721D is preferably formed in a lever shape. According to an embodiment, the shift input means 721D may be formed of a touch screen, a touch pad, or a button.

The acceleration input means 721C receives an input for acceleration of the vehicle from the user. The brake inputting means 721B receives an input for decelerating the vehicle from the user. The acceleration input means 721C and the brake input means 721B are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means 721C or the brake input means 721B may be formed of a touch screen, a touch pad, or a button.

The camera 722 may include an image sensor and an image processing module. The camera 722 may process still images or moving images obtained by an image sensor (e.g., CMOS or CCD). The image processing module processes the still image or moving image obtained through the image sensor, extracts necessary information, and transmits the extracted information to the control unit 770. Meanwhile, the vehicle may include a camera 722 for photographing a vehicle front image or a vehicle periphery image, and a monitoring unit for photographing an inside image of the vehicle.

The monitoring unit may obtain an image of the occupant. The monitoring unit can acquire an image for biometrics of the passenger.

Although the monitoring unit and the camera 722 are illustrated in FIG. 45 as being included in the input unit 720, the camera 722 may instead be described as a component included in the vehicle auxiliary device, as described above.

The microphone 723 can process an external sound signal as electrical data. The processed data can be used variously depending on the function being performed in the vehicle. The microphone 723 can convert the voice command of the user into electrical data. The converted electrical data can be transmitted to the control unit 770.

The camera 722 or the microphone 723 may be a component included in the sensing unit 760 rather than a component included in the input unit 720.

The user input unit 724 is for receiving information from a user. When information is input through the user input unit 724, the control unit 770 can control the operation of the vehicle to correspond to the input information. The user input unit 724 may include touch input means or mechanical input means. According to an embodiment, the user input 724 may be located in one area of the steering wheel. In this case, the driver can operate the user input portion 724 with his / her finger while holding the steering wheel.

The sensing unit 760 senses signals related to the running of the vehicle and the like. To this end, the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, a radar, a lidar, and the like.

Thereby, the sensing unit 760 can acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, and the like.

In addition, the sensing unit 760 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 760 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires biometric information of the passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information. Here, the monitoring unit and the microphone 723 can operate as such sensors. The biometric information sensing unit can acquire hand geometry information and facial recognition information through the monitoring unit.

The output unit 740 is for outputting information processed by the control unit 770, and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.

The display unit 741 can display information processed in the control unit 770. For example, the display unit 741 can display the vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.

The display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display unit 741 may have a mutual layer structure with the touch sensor or may be integrally formed to realize a touch screen. This touch screen may function as a user input 724 that provides an input interface between the vehicle and the user, while providing an output interface between the vehicle and the user. In this case, the display unit 741 may include a touch sensor that senses a touch with respect to the display unit 741 so that a control command can be received by a touch method. When a touch is made to the display unit 741, the touch sensor senses the touch, and the control unit 770 generates a control command corresponding to the touch based on the touch. The content input by the touch method may be a letter or a number, an instruction in various modes, a menu item which can be designated, and the like.

Meanwhile, the display unit 741 may include a cluster so that the driver can check the vehicle state information or the vehicle driving information while driving. Clusters can be located on the dashboard. In this case, the driver can confirm the information displayed in the cluster while keeping the line of sight ahead of the vehicle.

Meanwhile, according to the embodiment, the display unit 741 may be implemented as a Head Up Display (HUD). When the display unit 741 is implemented as a HUD, information can be output through a transparent display provided in the windshield. Alternatively, the display unit 741 may include a projection module to output information through an image projected on the windshield.

The sound output unit 742 converts an electric signal from the control unit 770 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 742 may include a speaker or the like. The sound output unit 742 can also output a sound corresponding to the operation of the user input unit 724.

The haptic output unit 743 generates a tactile output. For example, the haptic output section 743 may operate to vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.

The vehicle driving unit 750 can control the operation of various devices of the vehicle. The vehicle driving unit 750 may include a power source driving unit 751, a steering driving unit 752, a brake driving unit 753, a lamp driving unit 754, an air conditioning driving unit 755, a window driving unit 756, an airbag driving unit 757, a sunroof driving unit 758, and a suspension driving unit 759.

The power source drive section 751 can perform electronic control of the power source in the vehicle.

For example, when a fossil-fuel-based engine (not shown) is the power source, the power source driving unit 751 can perform electronic control of the engine, whereby the output torque of the engine and the like can be controlled. When the power source is an engine, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 770.

As another example, when the electric motor (not shown) is a power source, the power source driving unit 751 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The steering driver 752 may perform electronic control of a steering apparatus in the vehicle. Thus, the traveling direction of the vehicle can be changed.

The brake driver 753 can perform electronic control of a brake apparatus (not shown) in the vehicle. For example, it is possible to reduce the speed of the vehicle by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle to the left or right by differently operating the brakes respectively disposed on the left wheel and the right wheel.

The lamp driver 754 can control the turn-on / turn-off of the lamps disposed inside and outside the vehicle. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.

The air conditioning driving unit 755 can perform electronic control on an air conditioner (not shown) in the vehicle. For example, when the temperature inside the vehicle is high, the air conditioner can be operated to control the cool air to be supplied to the inside of the vehicle.

The window driving unit 756 may perform electronic control of a window apparatus in the vehicle. For example, it is possible to control the opening or closing of the side of the vehicle with respect to the left and right windows.

The airbag driving unit 757 can perform electronic control of the airbag apparatus in the vehicle. For example, in case of danger, the airbag can be controlled to fire.

The sunroof driving unit 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle. For example, the opening or closing of the sunroof can be controlled.

The suspension driving unit 759 can perform electronic control of a suspension apparatus (not shown) in the vehicle. For example, when there is a curvature on the road surface, it is possible to control the suspension device so as to reduce the vibration of the vehicle.

The memory 730 is electrically connected to the control unit 770. The memory 730 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 730 can be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 730 may store various data for the operation of the entire vehicle, such as a program for the processing or control of the control unit 770.

The interface unit 780 can serve as a pathway to various kinds of external devices connected to the vehicle. For example, the interface unit 780 may include a port that can be connected to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 can exchange data with the mobile terminal 600.

Meanwhile, the interface unit 780 may serve as a channel for supplying electrical energy to the connected mobile terminal 600. When the mobile terminal 600 is electrically connected to the interface unit 780, the interface unit 780 provides the electric energy supplied from the power supply unit 790 to the mobile terminal 600 under the control of the control unit 770.

The control unit 770 can control the overall operation of each unit in the vehicle. The control unit 770 may be referred to as an ECU (Electronic Control Unit).

When the driver assistance apparatus transmits an execution signal, the control unit 770 can perform the function corresponding to the transmitted signal.

The controller 770 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing the described functions.

The control unit 770 can take over the role of the processor 170 described above. That is, the processor 170 of the driver assistance apparatus may be implemented directly in the control unit 770 of the vehicle. In this case, the driver assistance apparatus may be understood to refer to a combination of some components of the vehicle.

Alternatively, the control unit 770 may control each component so as to transmit the information requested by the processor 170.

The power supply unit 790 can supply the power necessary for the operation of each component under the control of the control unit 770. In particular, the power supply unit 790 can receive power from a battery (not shown) in the vehicle.

The AVN (Audio Video Navigation) apparatus 400 can exchange data with the control unit 770. The control unit 770 can receive navigation information from the AVN apparatus 400 or from a separate navigation apparatus (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information related to vehicle travel, or vehicle location information.

The features, structures, effects, and the like described in the foregoing embodiments are included in at least one embodiment of the present invention and are not necessarily limited to one embodiment. Further, the features, structures, effects, and the like illustrated in each embodiment may be combined or modified for other embodiments by those skilled in the art to which the embodiments belong. Accordingly, contents related to such combinations and modifications should be construed as being included within the scope of the present invention.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, these embodiments are by way of illustration and example only and are not to be construed as limiting the scope of the present invention; those skilled in the art will appreciate that various modifications and applications are possible. For example, each component specifically shown in the embodiments may be modified and implemented. It is to be understood that the present invention may be embodied in many other specific forms without departing from the spirit or essential characteristics thereof.
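As an illustration of the disclosed flow — inferring building entry from loss of the GPS connection and then guiding the vehicle with pre-downloaded indoor map data — the following sketch organizes that logic in a few functions. It is not the disclosed implementation: every name and data shape here (`detect_building_entry`, `download_indoor_map`, the map dictionary) is hypothetical.

```python
# Illustrative sketch only: infers building entry from GPS-signal loss and
# serves pre-downloaded indoor map data. All names are hypothetical and do
# not appear in the disclosure.

def gps_connected(gps_fix):
    """Treat a fix of None as a lost GPS connection."""
    return gps_fix is not None

def detect_building_entry(gps_fix, destination_is_building):
    # The disclosure infers building entry when the GPS connection is
    # released; a preset building destination supports that inference.
    return (not gps_connected(gps_fix)) and destination_is_building

def download_indoor_map(building_id, cache):
    # Indoor map data may be pre-fetched once the destination is known,
    # so it is already available when the GPS signal disappears.
    if building_id not in cache:
        cache[building_id] = {"building": building_id, "floors": ["B1", "B2"]}
    return cache[building_id]

cache = {}
indoor_map = download_indoor_map("bldg-001", cache)  # pre-fetch at destination setting
entered = detect_building_entry(None, destination_is_building=True)
print(entered)  # True: GPS lost while heading to a building destination
```

In a real system the cache would be backed by a server download keyed on the building identified from the last GPS fix, the preset destination, or a nearby access point, as the claims suggest.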

Claims (21)

A method of operating a vehicle driving assist system,
Detecting whether the vehicle has entered the building;
Outputting indoor map data corresponding to the building if the vehicle enters the building;
Receiving sensor information of the vehicle; And
And outputting driving information to a scheduled parking position of a parking space in the building using sensor information of the vehicle and the indoor map data,
The operating method comprises:
Downloading the indoor map data corresponding to the building;
Receiving parking status information on a parking space in the building; And
Further comprising setting the scheduled parking position based on the received parking status information,
Wherein the downloading comprises:
Setting a destination of the vehicle;
Determining whether the set destination is a building; And
And downloading the indoor map data corresponding to the building if the set destination is the building,
Wherein the scheduled parking position is further set based on information of at least one of the driver and the passenger of the vehicle
A method of operating a vehicle driving assist system.
The method according to claim 1,
Wherein the step of receiving the sensor information comprises:
Further comprising acquiring travel information of the vehicle using the sensor information,
The running information includes:
And the vehicle position information on the indoor map data
A method of operating a vehicle driving assist system.
3. The method of claim 2,
The sensor information includes:
OBD (On-Board Diagnostics) information including at least one of steering sensor information, vehicle speed sensor information, yaw sensor information, gyro sensor information, wheel sensor information, and tilt sensor information of the vehicle
A method of operating a vehicle driving assist system.
The method according to claim 1,
Wherein the sensing comprises:
Confirming a connection state of a GPS (Global Positioning System) module of the vehicle,
And determining that the vehicle has entered the building if the connection state of the GPS module is the disconnected state
A method of operating a vehicle driving assist system.
The method according to claim 1,
Further comprising the step of determining building information for a building in which the vehicle has entered,
Wherein the building information is determined based on at least one of the location information of the GPS (Global Positioning System) module at the time of disconnection, the preset destination information, and the location information of an access point (AP) located in the vicinity of the vehicle
A method of operating a vehicle driving assist system.
6. The method of claim 5,
Wherein the downloading of the indoor map data comprises:
The indoor map data is downloaded at the time when the vehicle enters the building
A method of operating a vehicle driving assist system.
The method according to claim 1,
Wherein the information of at least one of the driver and the passenger includes at least one of the sex of the driver or passenger, whether the person is elderly, disabled, or pregnant, whether the person is a resident, VIP status, the number of passengers, and whether a child is aboard.
The method according to claim 1,
Further comprising resetting the scheduled parking position when change information of the parking state for the set scheduled parking position is received from the server
A method of operating a vehicle driving assist system.
The method according to claim 1,
Wherein the driving information includes:
At least one of current position information, current traveling speed information, direction indicating information, lane information, and distance information to a direction change point
A method of operating a vehicle driving assist system.
The method according to claim 1,
Further comprising outputting driving information to an exit of the building using the sensor information and the indoor map data when departure of the vehicle parked at the scheduled parking position is detected
A method of operating a vehicle driving assist system.
11. The method of claim 10,
Further comprising outputting driving information to the previous parking position when a re-parking command is input during departure of the vehicle
A method of operating a vehicle driving assist system.
The method according to claim 1,
Further comprising, when the parking of the vehicle is completed, outputting movement guidance information to a specific service area among the service areas in the building
A method of operating a vehicle driving assist system.
The method according to claim 1,
Further comprising the step of, when the parking of the vehicle is completed, outputting the parking linkage service information according to the use of the service provided in the building,
Wherein the parking linkage service includes at least one of:
A parking fee guidance service, a route guidance service to a specific service area according to user registration information, a delivery service that delivers goods resulting from the use of the service to the parking position of the vehicle, and a route guidance service to the parking position of the vehicle
A method of operating a vehicle driving assist system.
An interface for communicating with a vehicle and receiving sensor information of the vehicle;
A communication unit for communicating with the server;
A processor for outputting the driving information of the vehicle using the sensor information and the indoor map data of the entered building when the vehicle enters the building; And
And an output unit for outputting driving information of the vehicle,
The processor comprising:
Detects whether the vehicle has entered the building,
And outputs the indoor map data corresponding to the building into which the vehicle has entered, onto the output unit when the vehicle has entered the building,
Acquiring driving information to a scheduled parking position in the parking space in the building using the received sensor information and the outputted indoor map data,
Outputting the obtained driving information onto the output unit,
The processor comprising:
Downloads the indoor map data corresponding to the building from the server and stores it in a memory when the destination of the vehicle is the building,
Receives, from the server, parking status information on a parking space in the building, and
Sets the scheduled parking position based on the received parking status information,
Wherein the scheduled parking position is further set based on information of at least one of the driver and the passenger of the vehicle
Vehicle driving assistance device.
15. The vehicle driving assistance device of claim 14,
The processor comprising:
Acquiring vehicle running information including vehicle position information on the indoor map data using the received sensor information,
And outputs the driving information on the basis of the obtained vehicle running information
Vehicle driving assistance device.
16. The vehicle driving assistance device of claim 15,
The sensor information includes:
OBD (On-Board Diagnostics) information including at least one of steering sensor information, vehicle speed sensor information, yaw sensor information, gyro sensor information, wheel sensor information, and tilt sensor information of the vehicle
Vehicle driving assistance device.
17. The vehicle driving assistance device of claim 14,
The processor comprising:
Confirms a connection state of a GPS (Global Positioning System) module of the vehicle, and determines that the vehicle has entered the building when the connection of the GPS module is released
Vehicle driving assistance device.
18. The vehicle driving assistance device of claim 14,
The processor comprising:
Determines the building information on the building into which the vehicle has entered based on at least one of the location information of the GPS (Global Positioning System) module at the time of disconnection, the preset destination information, and the location information of an access point (AP) located in the vicinity of the vehicle
Vehicle driving assistance device.
19. The vehicle driving assistance device of claim 18,
The processor comprising:
Downloads, at the time when the vehicle enters the building, the indoor map data corresponding to the building into which the vehicle has entered from the server, and stores the indoor map data in the storage unit of the vehicle driving assistance device
Vehicle driving assistance device.
20. The vehicle driving assistance device of claim 14,
The information of at least one of the driver and the passenger may be,
Includes at least one of the sex of the driver or passenger, whether the person is elderly, disabled, or pregnant, whether the person is a resident, VIP status, the number of passengers, and whether a child is aboard
Vehicle driving assistance device.
21. The vehicle driving assistance device of claim 14,
The processor comprising:
When the vehicle is parked at the scheduled parking position, outputs parking linkage service information according to use of a service provided in the building on the output unit,
Wherein the parking linkage service includes at least one of:
A parking fee guidance service, a route guidance service to a specific service area according to user registration information, a delivery service that delivers goods resulting from the use of the service to the parking position of the vehicle, and a route guidance service to the parking position of the vehicle
Vehicle driving assistance device.
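Claims 3 and 16 recite that the vehicle's position on the indoor map data is obtained from OBD sensor information (steering, vehicle speed, yaw, gyro, wheel, tilt) once GPS is unavailable. A planar dead-reckoning integrator is one conventional way to realize this; the sketch below is illustrative only, with hypothetical names and a deliberately simplified motion model.

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance a planar pose one step from speed- and yaw-sensor samples.

    Illustrative only: heading is integrated from the yaw-rate sensor and
    displacement from the vehicle-speed sensor. A production system would
    also fuse wheel, steering, and tilt sensors and correct drift against
    indoor-map landmarks.
    """
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Drive straight (zero yaw rate) at 2 m/s for five one-second samples.
pose = (0.0, 0.0, 0.0)
for _ in range(5):
    pose = dead_reckon(*pose, speed=2.0, yaw_rate=0.0, dt=1.0)
print(round(pose[0], 1))  # 10.0 metres travelled along the initial heading
```

Plotting such a pose track onto the downloaded indoor map data would yield the vehicle position information recited in claim 2.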
KR1020160005959A 2016-01-18 2016-01-18 Driver assistance apparatus and method having the same KR101929303B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160005959A KR101929303B1 (en) 2016-01-18 2016-01-18 Driver assistance apparatus and method having the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160005959A KR101929303B1 (en) 2016-01-18 2016-01-18 Driver assistance apparatus and method having the same

Publications (2)

Publication Number Publication Date
KR20170086293A KR20170086293A (en) 2017-07-26
KR101929303B1 true KR101929303B1 (en) 2018-12-17

Family

ID=59427169

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160005959A KR101929303B1 (en) 2016-01-18 2016-01-18 Driver assistance apparatus and method having the same

Country Status (1)

Country Link
KR (1) KR101929303B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102083571B1 (en) * 2018-12-18 2020-03-02 박주환 Method for analyzing location of vehicle and navigation device
KR20210082966A (en) * 2019-12-26 2021-07-06 현대자동차주식회사 Apparatus and method for contorlling driving of vehicle
US20220083631A1 (en) * 2020-09-15 2022-03-17 Facebook Technologies, Llc Systems and methods for facilitating access to distributed reconstructed 3d maps

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4331175B2 (en) * 2006-02-23 2009-09-16 三菱電機インフォメーションシステムズ株式会社 Map distribution server and program
JP2016006605A * 2014-06-20 2016-01-14 住友電気工業株式会社 Parking management device, computer program, and parking management method


Also Published As

Publication number Publication date
KR20170086293A (en) 2017-07-26

Similar Documents

Publication Publication Date Title
KR101990547B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101979694B1 (en) Vehicle control device mounted at vehicle and method for controlling the vehicle
KR102014261B1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
KR102562786B1 (en) Driver assistance apparatus and parking control system comprising same
KR101942793B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101832466B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101891599B1 (en) Control method of Autonomous vehicle and Server
US9854085B2 (en) Apparatus and method for controlling portable device in vehicle
KR101826408B1 (en) Display Apparatus and Vehicle Having The Same
KR20190005442A (en) Driving system for vehicle and Vehicle
KR20170027635A (en) Method and apparatus for providing stopping movement mode and vehicle having the same
KR20190065042A (en) Driving system for vehicle
KR20190086601A (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
KR101843538B1 (en) Driver assistance appratus and method thereof
KR101979268B1 (en) Autonomous drive system
KR20190041172A (en) Autonomous vehicle and method of controlling the same
KR20190031050A (en) Vehicle control device and vehicle comprising the same
KR101917412B1 (en) Apparatus for providing emergency call service using terminal in the vehicle and Vehicle having the same
KR20160147557A (en) Automatic parking apparatus for vehicle and Vehicle
KR101929303B1 (en) Driver assistance apparatus and method having the same
KR102420922B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101972352B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR20170041418A (en) Display apparatus for vehicle and control method for the same
KR20180110943A (en) Vehicle controlling device mounted at vehicle and method for controlling the vehicle
KR101807788B1 (en) Display apparatus for vehicle and control method for the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
AMND Amendment
E90F Notification of reason for final refusal
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant