KR20100072971A - Navigation terminal and method for guiding route thereof - Google Patents

Navigation terminal and method for guiding route thereof

Info

Publication number
KR20100072971A
Authority
KR
South Korea
Prior art keywords
point
interest
guide
navigation terminal
moving object
Prior art date
Application number
KR1020080131544A
Other languages
Korean (ko)
Inventor
최현우
Original Assignee
엘지전자 주식회사
Priority date
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020080131544A
Publication of KR20100072971A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3453 - Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461 - Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3453 - Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3476 - Special cost functions, i.e. other than distance or default speed limit of road segments, using point of interest [POI] information, e.g. a route passing visible POIs
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3605 - Destination input or retrieval
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/3629 - Guidance using speech or audio output, e.g. text-to-speech
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/3647 - Guidance involving output of stored or live camera images or video streams
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 - Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096877 - Systems involving transmission of navigation instructions to the vehicle where the input to the navigation device is provided by a suitable I/O arrangement
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 - Services making use of location information
    • H04W4/024 - Guidance services

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to a navigation device including: a user input unit for receiving an input for setting a destination; a location information module for detecting the position of a moving object; a controller configured to search for a route to the set destination, set one or more guide points on the searched route, search for points of interest associated with the guide points, and select at least one point of interest from among the found points of interest in consideration of the set route of the moving object; and an output unit for outputting information related to the selected point of interest and performing route guidance when the moving object enters an area associated with a guide point.

Description

NAVIGATION TERMINAL AND METHOD FOR GUIDING ROUTE THEREOF

The present invention relates to a navigation terminal for guiding a route to a set destination and a route guidance method of the navigation terminal.

In general, a navigation terminal is a device that provides a user's location and route guidance from a departure point to a destination, and includes vehicle, ship, aircraft, and portable navigation terminals.

Among these, a vehicle navigation terminal is fixedly installed in a vehicle or mounted so as to be removable, and provides the information a user needs to drive the moving object, that is, the location of the moving object, road information (the departure point, the location of the destination, and route information from the departure point to the destination), and various additional information related to the location of the moving object (including information on dangerous areas, accident-prone areas, enforcement sections, and surrounding facilities), so that the user can arrive at the destination more quickly and safely.

Owing to the steady development of technology, vehicle navigation terminals now provide various multimedia and additional functions such as DMB (Digital Multimedia Broadcasting), MP3 playback, karaoke, video playback, and games, and have come to provide so-called telematics services that combine mobile communication technology with GPS (Global Positioning System) technology.

In addition to the vehicle navigation terminal, there is also a portable navigation terminal, which is manufactured to be carried by the user and provides information necessary for the user's walking.

When a moving object being driven along a set route approaches a point where the route can change, such as an intersection or an interchange, it is worth considering guiding the route using information on a point of interest that the user can easily recognize.

The present invention aims to guide the route effectively and accurately when the moving object enters an intersection or an interchange.

A navigation terminal according to an embodiment of the present invention for realizing the above object includes: a user input unit for receiving an input for setting a destination; a location information module for detecting the position of a moving object; a controller configured to search for a route to the set destination, set one or more guide points on the searched route, search for points of interest associated with the guide points, and select at least one point of interest from among the found points of interest in consideration of the set route of the moving object; and an output unit configured to output information related to the selected point of interest and perform route guidance when the moving object enters an area associated with a guide point.

In one aspect of the invention, the output unit may include a sound output module that performs route guidance by outputting a voice signal containing the name of the selected point of interest or a description of its appearance.

In another aspect of the present invention, the sound output module may perform route guidance by outputting a voice signal that guides the driving direction of the moving object based on the selected point of interest.
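
As an illustration of this kind of POI-referenced voice prompt, the following is a minimal sketch; the `Maneuver` type, the phrase template, and the example names are assumptions for illustration, not taken from the patent:

```python
from enum import Enum

class Maneuver(Enum):
    LEFT = "turn left"
    RIGHT = "turn right"
    STRAIGHT = "go straight"

def guidance_phrase(poi_name: str, maneuver: Maneuver, distance_m: int) -> str:
    """Build the sentence a TTS engine would speak, referencing the selected POI."""
    return f"In {distance_m} meters, {maneuver.value} at {poi_name}."

print(guidance_phrase("City Hall", Maneuver.RIGHT, 300))
# prints "In 300 meters, turn right at City Hall."
```

The resulting string would then be handed to the sound output module's text-to-speech path.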

In another aspect of the present invention, the output unit includes a display unit that performs route guidance by outputting visual information associated with the selected point of interest.

In another aspect of the present invention, the visual information may include a name of the point of interest.

In another aspect of the present invention, the visual information may include still image or video information of a point of interest.

In another aspect of the present invention, the controller may set, as a guide point, at least one point that is located on the searched route and at which the moving object may travel away from the searched route.

In another aspect of the invention, the region associated with the guide point may include a region located within a predetermined distance from the guide point.

In another aspect of the invention, the region associated with the guide point may include a region located within a predetermined time distance from the guide point.
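
The two region definitions above (a fixed distance and a fixed travel time from the guide point) can be written as simple predicates. This sketch assumes flat 2-D coordinates in meters and speed in meters per second; the function names and thresholds are illustrative, not from the patent:

```python
import math

def within_distance(pos, guide_point, radius_m):
    """Spatial region: the moving object is within radius_m meters of the guide point."""
    return math.hypot(pos[0] - guide_point[0], pos[1] - guide_point[1]) <= radius_m

def within_time(pos, guide_point, speed_mps, horizon_s):
    """Temporal region: at the current speed, the guide point would be
    reached within horizon_s seconds."""
    if speed_mps <= 0:
        return False  # a stationary object is not approaching the guide point
    dist = math.hypot(pos[0] - guide_point[0], pos[1] - guide_point[1])
    return dist / speed_mps <= horizon_s
```

For example, a moving object 500 m from the guide point travelling at 20 m/s satisfies `within_time` for a 30-second horizon but not for a 20-second one.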

In another aspect of the present invention, the point of interest associated with the guide point may be a point of interest located in an area associated with the guide point, or located within a predetermined distance from the searched route.

In accordance with another aspect of the present invention, there is provided a route guidance method of a navigation terminal, the method comprising: searching for a route to a set destination; setting one or more guide points on the found route; searching for points of interest associated with the guide points; selecting at least one point of interest from among the found points of interest in consideration of the set route of the moving object; and, when the moving object enters an area associated with a guide point, outputting information related to the selected point of interest to perform route guidance.
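
The steps of the claimed method can be lined up as a small pipeline. The route representation, the 50 m association radius, and the rule of preferring the candidate that lies toward the next route node are all illustrative assumptions:

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def set_guide_points(route, junctions):
    """Guide points: route nodes at which the moving object could leave the route."""
    return [p for p in route if p in junctions]

def pois_near(point, pois, radius_m):
    """Points of interest associated with a guide point (within radius_m of it)."""
    return [p for p in pois if dist(p["pos"], point) <= radius_m]

def select_poi(next_node, candidates):
    """Prefer the candidate lying toward the next node of the set route."""
    return min(candidates, key=lambda p: dist(p["pos"], next_node), default=None)

route = [(0, 0), (100, 0), (200, 0)]      # searched route
junctions = {(100, 0)}                    # nodes where deviation is possible
pois = [{"name": "Gas Station", "pos": (110, 10)},
        {"name": "Bakery", "pos": (90, -10)}]

guide_points = set_guide_points(route, junctions)        # [(100, 0)]
candidates = pois_near(guide_points[0], pois, 50)        # both POIs qualify
best = select_poi((200, 0), candidates)
print(best["name"])  # prints "Gas Station"
```

The selected POI would then feed the output step, e.g. the voice prompt or the on-screen image, once the moving object enters the guide-point region.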

In an aspect of the present invention, the information related to the point of interest may include a voice signal including the name of the selected point of interest.

In another aspect of the invention, the information related to the point of interest may include a voice signal for guiding the driving direction of the moving object based on the selected point of interest.

In another aspect of the present invention, the information related to the point of interest may include visual information related to the selected point of interest.

In another aspect of the present invention, the setting step may set, as a guide point, at least one point that is located on the searched route and at which the moving object may travel off the searched route.

A navigation terminal according to at least one embodiment of the present invention configured as described above provides visual and audio information based on a point of interest when the moving object enters an intersection or an interchange, so that the user of the navigation terminal can easily recognize the route.

Hereinafter, a navigation terminal according to the present invention will be described in detail with reference to the accompanying drawings. The suffixes "module" and "unit" for the components used in the following description are given or used only for ease of writing the specification, and do not in themselves have distinct meanings or roles.

The navigation terminal or another terminal described herein may be implemented in the form of a vehicle navigation terminal mounted on a vehicle to form a car navigation system (CNS).

The moving object described herein may mean the user of the navigation terminal 100 as well as a vehicle, a ship, an aircraft, and the like. The position of the moving object may be recognized as the position of the navigation terminal 100.

Alternatively, the navigation terminal or another terminal may be implemented in the form of a mobile phone equipped with a location information module, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like.

FIG. 1 is a block diagram of a navigation terminal according to an embodiment of the present invention.

The navigation terminal 100 may include a wireless communication unit 110, a location information module 120, an A/V input unit 130, a user input unit 140, a sensing unit 150, an output unit 160, a memory 170, an interface unit 180, a controller 190, a power supply unit 200, and the like. The components shown in FIG. 1 are not essential, so a navigation terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the navigation terminal 100 and the wireless communication system or between the navigation terminal 100 and a network in which the navigation terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, and the like.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may mean a server that generates and transmits a broadcast signal and/or broadcast related information, or a server that receives a previously generated broadcast signal and/or broadcast related information and transmits it to a terminal. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be configured to be suitable not only for the above-described digital broadcasting systems but also for other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 170.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The wireless signals may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless internet module 113 refers to a module for wireless internet access and may be built into or external to the navigation terminal 100. Wireless internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like may be used.

The location information module 120 is a module for obtaining the location of the terminal, and a representative example thereof is a Global Positioning System (GPS) module.

The GPS module receives signals including time information from at least three navigation satellites and calculates the distance to each satellite using those signals. Position information may be obtained by applying triangulation to the calculated distances. The GPS module may increase the accuracy of the calculated location information by further applying techniques such as map matching and dead reckoning to the location information obtained by triangulation.
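
The distance-to-position step can be illustrated with a toy 2-D version of this calculation. The sketch below ignores the receiver clock bias and altitude that a real GPS solution also solves for; subtracting the first range equation from the other two turns the problem into a 2x2 linear system:

```python
import math

def trilaterate_2d(anchors, ranges):
    """Estimate (x, y) from three known anchor positions and measured distances.

    Each range equation is (x - xi)^2 + (y - yi)^2 = ri^2; subtracting the
    first from the second and third cancels the quadratic terms, leaving a
    linear system A @ [x, y] = b, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Receiver actually at (30, 40); ranges measured from three anchors.
anchors = [(0, 0), (100, 0), (0, 100)]
ranges = [math.hypot(30, 40), math.hypot(70, 40), math.hypot(30, 60)]
x, y = trilaterate_2d(anchors, ranges)
print(round(x, 3), round(y, 3))  # prints "30.0 40.0"
```

Map matching and dead reckoning, mentioned above, would then refine this raw fix against the road network and motion sensors.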

The location information module 120 may also obtain location information by using various technologies such as cell tower signals, wireless internet signals, and Bluetooth sensors in addition to the GPS module. This technique is called a hybrid positioning system.

The location of the navigation terminal 100 sensed by the location information module 120 may mean the location of the moving object.

Referring to FIG. 1, the A / V input unit 130 is for inputting an audio signal or a video signal, and may include a camera 131 and a microphone 132. The camera 131 processes image frames such as still images or moving images obtained by the image sensor in the photographing mode. The processed image frame may be displayed on the display unit 161.

The image frame processed by the camera 131 may be stored in the memory 170 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 131 may be provided according to a usage environment.

The microphone 132 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. The microphone 132 may implement various noise removal algorithms for removing noise generated in the process of receiving the external sound signal.

The user input unit 140 generates input data for the user to control the operation of the navigation terminal. The user input unit 140 may include a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

When the navigation terminal 100 is a vehicle navigation terminal 100, a steering wheel, an accelerator pedal, a brake pedal, a gear shift lever, and the like mounted on the vehicle may constitute the user input unit 140.

The sensing unit 150 may sense whether the user is in contact with the navigation terminal 100, whether the power supply unit 200 is supplying power, and whether the interface unit 180 is coupled with an external device or a vehicle component. The sensing unit 150 may include a proximity sensor 151.

Meanwhile, the sensing unit 150 may include sensors for detecting the movement of the moving object, for example, a speed sensor for detecting the moving speed of the moving object, an acceleration sensor (G-sensor) for detecting its acceleration, or a gyro sensor for detecting its rotational angular velocity or rotational angular acceleration. The sensors for detecting the movement of the moving object may be used as dead reckoning (DR) sensors. A DR sensor may be included in the location information module, or may be used to determine the location of the moving object in cooperation with the location information module.

When the navigation terminal 100 is a vehicle navigation terminal 100, the sensing unit 150 may detect the current state of the vehicle or the navigation terminal 100, such as the opening or closing of a vehicle door or window, whether a seat belt is worn, the operating state of the driver's steering wheel, accelerator pedal, brake pedal, and gear shift lever, the temperature inside or outside the vehicle, whether the vehicle has collided with another object and the strength of any collision, the distance between the vehicle and other objects, the state of parts mounted on the vehicle, the flashing state or brightness of lamps mounted inside or outside the vehicle, and whether a vehicle occupant is seated, and may generate a sensing signal for controlling the operation of the navigation terminal 100 or the vehicle. For example, the opening of a vehicle door or the seating of a vehicle occupant may be detected by using a pressure sensor that senses the pressure applied to the seat.

The output unit 160 is used to generate output related to sight, hearing, or touch, and may include a display unit 161, a sound output module 162, an alarm unit 163, and a haptic module 164.

The display unit 161 displays (outputs) information processed by the navigation terminal 100. For example, when the navigation terminal is in the route search mode, it displays a UI (User Interface) or GUI (Graphic User Interface) related to the route search. When the navigation terminal 100 is in a video call mode or a photographing mode, it displays photographed and/or received images, a UI, or a GUI.

The display unit 161 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays can be configured as a transparent or light-transmissive type so that the outside can be seen through them. Such a display may be referred to as a transparent display, a representative example of which is the transparent OLED (TOLED).

The display unit 161 may be implemented in the form of a head-up display (HUD). For example, the display unit 161 may be implemented on the windshield or on a window provided in a door of the vehicle. In this case, the display unit 161 may be configured as a transparent or light-transmissive type.

According to the implementation form of the navigation terminal 100, two or more display units 161 may exist.

When the display unit 161 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 161 may be used as an input device in addition to an output device. The touch sensor may have, for example, the form of a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 161 or capacitance generated at a specific portion of the display unit 161 into an electrical input signal. The touch sensor may be configured to detect not only the position and area of the touch but also the pressure at the touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 190. As a result, the controller 190 can determine which area of the display unit 161 was touched.

Referring to FIG. 1, a proximity sensor 151 may be disposed in an inner region of the navigation terminal covered by the touch screen, or near the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity, by using electromagnetic force or infrared rays. Proximity sensors have a longer lifespan and higher utilization than touch sensors.

Examples of the proximity sensor include a transmissive photoelectric sensor, a direct-reflective photoelectric sensor, a mirror-reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer through changes in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing the pointer close to the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch by the pointer on the touch screen is the position at which the pointer is perpendicular to the touch screen when the proximity touch is made.

The proximity sensor detects a proximity touch and a proximity touch pattern (for example, proximity touch distance, direction, speed, time, position, and movement state). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 162 may output audio data received from the wireless communication unit 110 or stored in the memory 170 upon call signal reception or in a call mode, recording mode, voice recognition mode, broadcast reception mode, route search mode, and the like. The sound output module 162 may also output sound signals related to functions performed by the navigation terminal 100 (for example, a call signal reception sound, a message reception sound, or a route guidance voice). The sound output module 162 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 163 outputs a signal for notifying the occurrence of an event of the navigation terminal 100. Examples of events occurring in the navigation terminal include call signal reception, message reception, touch input, abnormalities in components mounted on the vehicle, and abnormal opening and closing of vehicle doors and windows. The alarm unit 163 may output a signal notifying the occurrence of an event by vibration, in addition to a video or audio signal. The video or audio signal may also be output through the display unit 161 or the sound output module 162, so the display unit 161 and the sound output module 162 may be classified as part of the alarm unit 163.

The haptic module 164 generates various tactile effects that a user can feel. Vibration is a representative example of the haptic effect generated by the haptic module 164. The intensity and pattern of vibration generated by the haptic module 164 can be controlled. For example, different vibrations may be synthesized and output or may be sequentially output.

In addition to vibration, the haptic module 164 may generate various tactile effects, such as the effect of a pin array moving vertically against the contacted skin surface, the jetting or suction force of air through a jet or suction port, grazing against the skin surface, contact of an electrode, electrostatic force, and the reproduction of a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 164 may not only deliver a haptic effect through direct contact, but may also be implemented so that the user can feel the haptic effect through the kinesthetic sense of a finger or an arm. Two or more haptic modules 164 may be provided depending on the configuration of the navigation terminal 100.

When the navigation terminal 100 is a vehicle navigation terminal 100, the haptic module 164 may be provided where the user makes frequent contact inside the vehicle, for example, in the steering wheel, the gear shift lever, a seat, and the like.

The memory 170 may store a program for the operation of the controller 190, and may temporarily store input / output data (for example, music, still image, video, map data, etc.). The memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.

The memory 170 may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (for example, SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, and optical disk. The navigation terminal 100 may operate in association with web storage that performs the storage function of the memory 170 over the Internet.

The interface unit 180 serves as a path to all external devices connected to the navigation terminal 100. The interface unit 180 receives data or power from an external device and transfers it to each component inside the navigation terminal 100, or transmits data from inside the navigation terminal 100 to an external device. For example, the interface unit 180 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.

When the navigation terminal 100 is a vehicle navigation terminal 100, the interface unit 180 may be connected to other devices mounted on the vehicle through a controller area network (CAN), a local interconnect network (LIN), FlexRay, media oriented systems transport (MOST), or the like.

The identification module is a chip that stores various types of information for authenticating the use authority of the navigation terminal 100. The identification module may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format; accordingly, the identification device may be connected to the terminal 100 through a port. Alternatively, the identification device may be manufactured in the form of a vehicle key.

The controller 190 typically controls the overall operation of the navigation terminal. For example, it performs related control and processing for data communication, video call, route search, vehicle control and the like. The controller 190 may include a multimedia module 191 for playing multimedia.

When the navigation terminal 100 is a vehicle navigation terminal 100, the controller 190 may further include an airbag controller 192 for controlling an airbag mounted on the vehicle and an emergency battery controller 193 for controlling an emergency battery mounted on the vehicle.

The multimedia module 191, the airbag controller 192, and the emergency battery controller 193 may be implemented within the controller 190 or may be implemented separately from the controller 190.

When the navigation terminal 100 is a vehicle navigation terminal 100 implemented with mobile communication technology, it is also called a telematics terminal. In this case, the controller 190 serves as a telematics control unit (TCU).

The controller 190 may perform a pattern recognition process for recognizing a handwriting input or a drawing input performed on the touch screen as text and images, respectively.

The power supply unit 200 receives external power and internal power under the control of the controller 190 and supplies the power required for the operation of each component.

The navigation terminal 100 may be implemented as a portable navigation terminal 100 provided with some or all of the above components and manufactured to be portable. Alternatively, the navigation terminal 100 may be implemented as a vehicle navigation terminal 100 that includes some or all of the components and is integrated with or detachable from the vehicle.


Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the described embodiments may be implemented by the controller 190 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more functions and operations described herein. Software code may be implemented in software applications written in a suitable programming language. The software code may be stored in the memory 170 and executed by the controller 190.

As illustrated in FIG. 1, the user input unit 140 may receive an input for setting a destination from a user.

As illustrated in FIG. 1, the controller 190 may search for a route to the set destination. One or more routes may be found. When a plurality of routes are found, the controller 190 may select one or more of them based on a preset criterion. The preset criterion may take into account the distance from the starting point to the destination, the travel time, the traffic volume, tolls, whether the road is drivable, the weather, accidents, and other natural disasters. The controller 190 may set the found route, or the selected route, as the route along which to guide the user.
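The multi-criteria selection described here can be sketched as a weighted cost over the candidate routes. The criteria names, values, and weights below are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch: choosing among several found routes by a weighted
# cost over criteria such as distance, travel time, and tolls.
# All criteria values and weights are hypothetical.

def route_cost(route, weights):
    """Weighted sum of a route's criteria; lower is better."""
    return sum(weights[k] * route[k] for k in weights)

def select_route(routes, weights):
    """Pick the candidate route with the lowest weighted cost."""
    return min(routes, key=lambda r: route_cost(r, weights))

routes = [
    {"name": "A", "distance_km": 12.0, "time_min": 25, "toll": 0.0},
    {"name": "B", "distance_km": 9.5, "time_min": 35, "toll": 2.0},
]
weights = {"distance_km": 1.0, "time_min": 0.5, "toll": 3.0}

best = select_route(routes, weights)  # route "A" under these weights
```

In practice the weights would reflect user preference (shortest, fastest, toll-free), but the cost-minimization shape stays the same.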

The controller 190 may set one or more guide points on the searched route. A guide point may be any point that is located on the searched route and at which the moving object may depart from the route. For example, a point at which the moving object can change its path, such as a crossroad, an interchange, or an intersection, may be set as a guide point.
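A minimal sketch of this guide-point setting step, assuming route nodes carry a type field marking crossroads, interchanges, and intersections (the node structure is hypothetical):

```python
# Sketch: marking guide points on a found route. A node is treated as a
# guide point when the moving object could leave the route there
# (crossroads, interchanges, intersections). Node fields are assumed.

DEPARTURE_TYPES = {"crossroad", "interchange", "intersection"}

def set_guide_points(route_nodes):
    """Return the route nodes where the moving object may depart."""
    return [n for n in route_nodes if n["type"] in DEPARTURE_TYPES]

route = [
    {"id": 1, "type": "straight"},
    {"id": 2, "type": "crossroad"},
    {"id": 3, "type": "straight"},
    {"id": 4, "type": "interchange"},
]
guide_points = set_guide_points(route)  # nodes 2 and 4
```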

The controller 190 may search for a point of interest related to the guide point. The point of interest may include buildings, facilities, structures, roads, sculptures, and the like that are easy to recognize visually. The point of interest may be registered by the user, preset when the navigation terminal 100 is manufactured, or updated with data received from a server.

A point of interest associated with the guide point means a point of interest located in an area related to the guide point, or a point of interest located within a predetermined distance from the searched route.

The area associated with the guide point may include an area located within a predetermined physical or temporal distance from the guide point.

The controller 190 may select at least one point of interest from the searched points of interest. The controller 190 may select at least one point of interest from among the searched points of interest, in consideration of the set path of the moving object.

For example, when selecting the point of interest in consideration of the set path of the moving object, the controller 190 may select a point of interest that is located near the guide point and is easy to observe visually. Alternatively, the controller 190 may select a point of interest located in the direction in which the moving object is to proceed.
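This selection step might be sketched as follows, scoring candidate points of interest by whether they lie roughly in the direction of travel and preferring nearby, easily observed ones. The bearing, distance, and visibility fields are illustrative assumptions:

```python
# Sketch: choosing a POI near a guide point that lies in the direction
# of travel. Bearings are in degrees; the "visibility" field is a
# stand-in for how easily the landmark can be seen.

def angular_diff(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def select_poi(pois, heading_deg, max_off_deg=60.0):
    """Among POIs roughly ahead of the moving object, prefer the
    closest and most visible one."""
    ahead = [p for p in pois
             if angular_diff(p["bearing_deg"], heading_deg) <= max_off_deg]
    if not ahead:
        return None
    return min(ahead, key=lambda p: (p["dist_m"], -p["visibility"]))

pois = [
    {"name": "XX Department Store", "bearing_deg": 10, "dist_m": 80, "visibility": 0.9},
    {"name": "Rear Plaza", "bearing_deg": 175, "dist_m": 40, "visibility": 0.8},
]
choice = select_poi(pois, heading_deg=0)  # the POI ahead, not behind
```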

The controller 190 may determine whether the moving object has entered the area associated with the guide point using the location of the moving object detected by the location information module 120.
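One way to sketch this entry test, assuming GPS coordinates from the location information module and an illustrative 300 m radius for the guide-point area:

```python
import math

# Sketch: deciding whether the moving object has entered the area
# associated with a guide point, using great-circle distance from the
# detected position. The 300 m radius is an illustrative threshold.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def entered_guide_area(pos, guide_point, radius_m=300.0):
    """True when the detected position lies inside the guide area."""
    return haversine_m(pos[0], pos[1], guide_point[0], guide_point[1]) <= radius_m

guide = (37.5172, 127.0473)
near = (37.5180, 127.0470)   # roughly 90 m from the guide point
far = (37.5300, 127.0470)    # roughly 1.4 km from the guide point
```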

As shown in FIG. 1, when the moving object enters an area related to the guide point, the output unit 160 may output information related to the selected point of interest.

The output unit 160 may include a sound output module 162 for outputting a voice signal including the name of the selected point of interest or a voice signal depicting the appearance of the selected point of interest to perform route guidance. The sound output module 162 may output a voice signal for guiding a driving direction of the moving object based on the selected point of interest.

For example, assume that the name of the selected point of interest is "XX department store". The sound output module 162 may output a voice signal including the name of the point of interest, such as "Turn right at XX department store on the right".

Or, for example, assume that the selected point of interest is a 20-story building whose outer wall is blue. The sound output module 162 may output a voice signal depicting the appearance of the point of interest, such as "Turn right at the 20-story blue building on the right".
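The two prompt styles above, by name when one is available and by appearance otherwise, could be composed as follows; the sentence templates and POI fields are illustrative assumptions:

```python
# Sketch: composing the guidance sentence from a POI, mirroring the two
# examples above. When the POI has a name, guide by name; otherwise,
# guide by a description of its appearance. Templates are illustrative.

def guidance_sentence(poi, maneuver="turn right"):
    """Build a spoken guidance sentence anchored on the POI."""
    if poi.get("name"):
        landmark = poi["name"]
    else:
        landmark = f"the {poi['floors']}-story {poi['color']} building"
    return f"{maneuver.capitalize()} at {landmark} on the right."

named = {"name": "XX Department Store"}
unnamed = {"name": None, "floors": 20, "color": "blue"}
```

The resulting strings would then be passed to a text-to-speech engine by the sound output module 162.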

The output unit 160 may include a display unit 161 for outputting visual information related to the selected point of interest and performing route guidance.

The visual information may include a name of the point of interest, a still image or a video photographing the point of interest.

Hereinafter, a route guidance method of a navigation terminal 100 according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 2 is a flowchart illustrating a route guidance method of a navigation terminal according to an embodiment of the present invention.

As shown in FIG. 2, the navigation terminal 100 may receive an input for setting a destination (S10). The input for setting the destination may be received from the user by the user input unit 140 or from a server or another terminal.

In response to the input for setting the destination, the navigation terminal 100 may search for a route to the set destination (S20). More than one route to the set destination may be found. In this case, the navigation terminal 100 may select one route based on a predetermined criterion or the user's selection.

The navigation terminal 100 may set one or a plurality of guide points on the found route (S30). The guide point may mean a section or a point at which the user may deviate from the set path.

The navigation terminal 100 may search for a point of interest related to the guide point (S40). Information on points of interest may be stored in the navigation terminal 100 or received from a server or another terminal.

The navigation terminal 100 may select at least one point of interest from among searched points of interest in consideration of the set path of the moving object (S50). For example, the navigation terminal 100 may select a point of interest existing in a traveling direction on a set path of the moving object.

The navigation terminal 100 may detect the position of the moving object and determine whether the moving object has entered an area associated with the guide point (S60). The area associated with the guide point means an area located within a predetermined physical or temporal distance from the guide point.

When the moving object enters the area related to the guide point, the navigation terminal 100 may output the information related to the selected point of interest and perform route guidance (S70). When the navigation terminal 100 outputs information related to a point of interest, the navigation terminal 100 may output visual information related to the point of interest or output a sound signal related to the point of interest. The sound signal may include the name of the point of interest.
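The steps S10 to S70 above can be tied together as a single flow. The helper callables below are stubs standing in for the terminal's actual modules (route search, POI database, position sensor, output unit):

```python
# Sketch tying together steps S10-S70. Each callable is a hypothetical
# stand-in for one of the navigation terminal's modules.

def guide_route(destination, search_route, set_guide_points, find_pois,
                select_poi, position_stream, in_area, announce):
    route = search_route(destination)              # S20: search route
    guide_points = set_guide_points(route)         # S30: set guide points
    chosen = {g: select_poi(find_pois(g), route)   # S40-S50: search/select POI
              for g in guide_points}
    announcements = []
    for pos in position_stream:                    # S60: track position
        for g, poi in chosen.items():
            if in_area(pos, g):                    # S70: guide on area entry
                announcements.append(announce(poi))
    return announcements

# Toy run with stub modules.
announcements = guide_route(
    destination="dest",
    search_route=lambda d: ["a", "g1", "b"],
    set_guide_points=lambda route: ["g1"],
    find_pois=lambda g: ["tall tower"],
    select_poi=lambda pois, route: pois[0],
    position_stream=["far", "g1"],
    in_area=lambda pos, g: pos == g,
    announce=lambda poi: f"Turn at {poi}",
)
```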

FIGS. 3 and 4 are views showing a navigation terminal performing route guidance according to an embodiment of the present invention.

As illustrated in FIG. 3, the display unit 161 may display an icon I1 indicating the orientation of the displayed map in one area of the screen. The display unit 161 may display the map so that a specific direction (for example, true north), the moving direction of the moving object, or the direction in which the destination is located is fixed toward the top of the screen.

The display unit 161 may display an icon I2 indicating whether the sound output module 162 is activated and a volume setting in one area of the screen. The user may activate or deactivate the sound output module 162 or adjust the volume by applying a touch input to the icon I2.

The display unit 161 may display an icon I3 indicating whether a route search function using TPEG (Transport Protocol Experts Group) is activated in one area of the screen. TPEG was originally established in 1997 by the European Broadcasting Union as a group for establishing traffic information protocols; in navigation systems, it is widely known as a route guidance function using real-time traffic information.

The display unit 161 may display an icon I4 displaying a scale of the map data in one area of the screen.

The display unit 161 may display an icon I5 displaying a current time in one area of the screen. In addition, the display unit 161 may display an icon I6 indicating an estimated time to reach a preset destination in one region of the screen. In addition, an icon indicating a time required to reach a predetermined destination may be displayed.

The display unit 161 may display an icon I7 displaying a distance remaining to a preset destination in one region of the screen.

The display unit 161 may display, in one area of the screen, an icon I8 for reducing and an icon I8' for enlarging the displayed map.

The display unit 161 may display an icon I9 indicating the position and moving direction of the moving object in one area of the screen. The icon I9 may be displayed at the point on the map corresponding to the current location of the moving object. Further, the moving direction of the moving object may be indicated by the direction of the arrowhead in the icon I9.

The display unit 161 may display an icon I10 displaying a place name of an area where the moving object is located in one area of the screen.

The display unit 161 may display, in one area of the screen, an icon I11 indicating the lane configuration of the road when the moving object travels on a road.

As illustrated in FIG. 4, the display unit 161 may display a path required to reach the preset destination I12. The route may not be displayed if the destination of the moving object is not set.

The route set from the starting point to the destination I12 may include points I13 and I13 'where the moving object may travel away from the set route.

The points I13 and I13' at which the moving object may depart from the set route may include a crossroad, an interchange, an intersection, and the like.

As illustrated in FIG. 4, the controller 190 may set the points I13 and I13', at which the moving object is likely to depart from the set route, as guide points.

The controller 190 may set all of the points I13 and I13' at which the moving object may depart from the set route as guide points, or may set only some of them as guide points.

The navigation terminal 100 may provide information to help the vehicle move along a preset route at the guide point when the vehicle enters an area related to the guide point.

The display unit 161 may display the location of the guide point on a map displayed on the screen.

FIG. 5 is a view illustrating a region associated with a guide point in a navigation terminal according to an embodiment of the present invention.

As illustrated in FIG. 5, the navigation terminal 100 may set regions R1 and R1' associated with the guide points. The display unit 161 may display the regions R1 and R1' on a map displayed on the screen.

The regions R1 and R1' associated with the guide points may include an area located within a predetermined temporal or physical distance from the guide point.
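A minimal sketch of a region bounded by either a physical or a temporal distance, converting the temporal bound with the current speed; both thresholds are illustrative assumptions:

```python
# Sketch: a guide-point area bounded by either a physical distance or a
# temporal distance (time to reach at the current speed). The 300 m and
# 20 s thresholds are illustrative.

def in_guide_region(dist_m, speed_mps, max_dist_m=300.0, max_time_s=20.0):
    """True when the guide point is within the physical OR temporal bound."""
    within_distance = dist_m <= max_dist_m
    within_time = speed_mps > 0 and dist_m / speed_mps <= max_time_s
    return within_distance or within_time
```

A temporal bound keeps the announcement timing useful at highway speeds, where a fixed physical radius would fire too late.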

FIG. 6 is a diagram illustrating a point of interest selected in a navigation terminal according to an embodiment of the present invention.

As illustrated in FIG. 6, one or more points of interest may exist in regions R1 and R1 ′ associated with the guide point.

The point of interest may include buildings, facilities, structures, roads, workpieces, and the like that are easily visually recognized. The point of interest may be registered by the user, preset when the navigation terminal 100 is manufactured, or updated by data received from a server.

For example, assume that there is a tall building P1 in the area R1 associated with the guide point. The navigation terminal 100 may select the high-rise building P1 from a plurality of points of interest including the high-rise building P1.

FIG. 7 is a diagram illustrating guidance using a voice signal in a navigation terminal according to an embodiment of the present invention.

As illustrated in FIG. 7, when the moving object enters the region R1 associated with the guide point, the navigation terminal 100 may detect this and display it on a map.

The navigation terminal 100 may provide information to the user so that the moving object can travel along the predetermined route.

The sound output module 162 may output a voice signal including the name of the point of interest to guide the path of the moving object.

For example, when the name of the high-rise building P1 is "URBAN HIVE", the sound output module 162 may output a voice signal including "URBAN HIVE", the name of the high-rise building P1. The voice signal may include content guiding a direction based on the point of interest.

For example, a voice signal including "URBAN HIVE", the name of the high-rise building P1, such as "Go straight 30 meters, then turn left 50 meters past URBAN HIVE", may be output to guide in which direction the moving object should travel relative to the high-rise building P1.

FIG. 8 is a diagram illustrating guidance using a voice signal including content describing a point of interest in a navigation terminal according to an embodiment of the present invention.

As illustrated in FIG. 8, when the moving object enters an area R1 associated with the guide point, the navigation terminal 100 may detect this and display it on a map.

The navigation terminal 100 may provide information to the user so that the moving object can travel along the predetermined route.

The sound output module 162 may guide the path of the moving object by outputting a voice signal including content describing the point of interest.

For example, if the exterior of the high-rise building P1 has circular holes arranged on a white outer wall, a voice signal describing the appearance of the high-rise building P1, such as "Go straight 30 meters, then turn left 50 meters past the white 20-story building with circular holes in its outer wall", may be output to guide the driving direction based on the building.

Even if the user does not know the name of the point of interest, the user may recognize it through the voice description, visually identify the corresponding point of interest, and then travel based on it.

FIG. 9 is a diagram illustrating a state in which a navigation terminal displays visual information and performs guidance in accordance with an embodiment of the present invention.

As illustrated in FIG. 9, when the moving object enters an area R1 associated with the guide point, the navigation terminal 100 may detect this and display it on a map.

The navigation terminal 100 may provide information to the user so that the moving object can travel along the predetermined route.

As shown in FIG. 9, the display unit 161 may display an appearance of the point of interest.

The appearance of the point of interest may include a still image or a moving picture of the point of interest.

Even if the user does not know the appearance of the point of interest, the user can check its appearance through the navigation terminal 100 and be guided in the driving direction of the moving object based on the point of interest, so that the user can easily recognize the route.

The display unit 161 and the sound output module 162 may be separately activated to perform route guidance using the point of interest.

Alternatively, the display unit 161 and the sound output module 162 may be activated together to provide audiovisual information related to the point of interest, thereby performing route guidance.

In addition, according to an embodiment of the present invention, the above-described method may be implemented as processor-readable code on a medium in which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the code may also be implemented in the form of a carrier wave (for example, transmission over the Internet).

The navigation terminal implementing the above-described route guidance method is not limited to the configurations and methods of the embodiments described above; all or part of each embodiment may be selectively combined so that various modifications can be made.

FIG. 1 is a block diagram of a navigation terminal according to an embodiment of the present invention.

FIG. 2 is a flowchart illustrating a route guidance method of a navigation terminal according to an embodiment of the present invention.

FIGS. 3 and 4 are views showing a navigation terminal performing route guidance according to an embodiment of the present invention.

FIG. 5 is a view illustrating a region associated with a guide point in a navigation terminal according to an embodiment of the present invention.

FIG. 6 is a view showing a point of interest selected in a navigation terminal according to an embodiment of the present invention.

FIG. 7 is a view illustrating guidance using a voice signal in a navigation terminal according to an embodiment of the present invention.

FIG. 8 is a diagram illustrating guidance using a voice signal including content describing a point of interest in a navigation terminal according to an embodiment of the present invention.

FIG. 9 is a view illustrating a navigation terminal displaying visual information to perform guidance according to an embodiment of the present invention.

Claims (15)

1. A navigation terminal comprising: a user input unit for receiving an input for setting a destination; a location information module for detecting a location of a moving object; a controller for searching for a route to the set destination, setting one or a plurality of guide points on the found route, searching for a point of interest associated with the guide point, and selecting at least one point of interest from among the searched points of interest in consideration of the set path of the moving object; and an output unit for outputting information related to the selected point of interest and performing route guidance when the moving object enters an area related to the guide point.

2. The navigation terminal of claim 1, wherein the output unit comprises a sound output module configured to output a voice signal including the name of the selected point of interest, or a voice signal depicting the appearance of the selected point of interest, to perform route guidance.

3. The navigation terminal of claim 2, wherein the sound output module performs route guidance by outputting a voice signal guiding the driving direction of the moving object based on the selected point of interest.

4. The navigation terminal of claim 1, wherein the output unit comprises a display unit configured to output visual information related to the selected point of interest and perform route guidance.

5. The navigation terminal of claim 4, wherein the visual information includes the name of the point of interest.

6. The navigation terminal of claim 4, wherein the visual information includes still image or video information of the point of interest.

7. The navigation terminal of claim 1, wherein the controller sets, as a guide point, at least one point that is located on the searched route and at which the moving object can travel away from the searched route.

8. The navigation terminal of claim 1, wherein the area associated with the guide point includes a region located within a predetermined distance from the guide point.

9. The navigation terminal of claim 1, wherein the area associated with the guide point includes a region located within a preset temporal distance from the guide point.

10. The navigation terminal of claim 1, wherein the point of interest associated with the guide point is a point of interest located in an area associated with the guide point, or a point of interest located within a predetermined distance from the searched route.

11. A route guidance method of a navigation terminal, comprising: searching for a route to a set destination and setting one or a plurality of guide points on the found route; searching for a point of interest associated with the guide point and selecting at least one point of interest from among the searched points of interest in consideration of the set path of a moving object; and outputting information related to the selected point of interest and performing route guidance when the moving object enters an area related to the guide point.

12. The method of claim 11, wherein the information related to the point of interest includes a voice signal including the name of the selected point of interest.

13. The method of claim 12, wherein the information related to the point of interest includes a voice signal guiding the driving direction of the moving object based on the selected point of interest.

14. The method of claim 11, wherein the information related to the point of interest includes visual information associated with the selected point of interest.

15. The method of claim 11, wherein the setting step sets, as a guide point, at least one point that is located on the searched route and at which the moving object can travel away from the searched route.
KR1020080131544A 2008-12-22 2008-12-22 Navigation termninal and method for guiding route thereof KR20100072971A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080131544A KR20100072971A (en) 2008-12-22 2008-12-22 Navigation termninal and method for guiding route thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020080131544A KR20100072971A (en) 2008-12-22 2008-12-22 Navigation termninal and method for guiding route thereof

Publications (1)

Publication Number Publication Date
KR20100072971A true KR20100072971A (en) 2010-07-01

Family

ID=42636040

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080131544A KR20100072971A (en) 2008-12-22 2008-12-22 Navigation termninal and method for guiding route thereof

Country Status (1)

Country Link
KR (1) KR20100072971A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101246589B1 (en) * 2010-07-08 2013-03-25 에스케이플래닛 주식회사 Method And Apparatus Providing Customized Moving Path
KR20160060278A (en) * 2014-11-20 2016-05-30 현대엠엔소프트 주식회사 Apparatus and method for displaying buliding data around the junction when driving in an alley
KR20200042443A (en) * 2017-12-14 2020-04-23 구글 엘엘씨 Systems and methods for selecting points of interest (POIs) to associate with navigation controls
KR20220066987A (en) * 2017-12-14 2022-05-24 구글 엘엘씨 Systems and methods for selecting a poi to associate with a navigation maneuver
KR20200069076A (en) * 2018-12-06 2020-06-16 한국전자통신연구원 Driving Guide Apparatus and System for Providing with a Language Description of Image Characteristics
EP3770557A1 (en) * 2019-07-22 2021-01-27 Bayerische Motoren Werke Aktiengesellschaft Recommendation engine for sights, events and places along route from a to b


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
J201 Request for trial against refusal decision
B601 Maintenance of original decision after re-examination before a trial
J301 Trial decision

Free format text: TRIAL NUMBER: 2015101004535; TRIAL DECISION FOR APPEAL AGAINST DECISION TO DECLINE REFUSAL REQUESTED 20150803

Effective date: 20170203