KR101736820B1 - Mobile terminal and method for controlling the same - Google Patents


Info

Publication number
KR101736820B1
Authority
KR
South Korea
Prior art keywords
mobile terminal
mirroring
screen
display
vehicle
Prior art date
Application number
KR1020150131803A
Other languages
Korean (ko)
Other versions
KR20170033699A (en)
Inventor
조동준
조택일
김민구
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020150131803A
Publication of KR20170033699A
Application granted
Publication of KR101736820B1

Classifications

    • H04M1/72522
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/42Graphical user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/74Details of telephonic subscriber devices with voice recognition means

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

According to an embodiment of the present invention, there are provided a mobile terminal and a control method thereof. The mobile terminal includes a display unit, a communication unit that communicates with a display device of a vehicle, and a control unit that divides the screen of the display unit into a mirroring screen and a control screen. The control unit outputs a user interface for controlling the mirroring screen to the control screen and, based on user input to the user interface, mirrors the image displayed on the mirroring screen to the display device through the communication unit.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a mobile terminal and a control method thereof, and more particularly, to a mobile terminal capable of mirroring its screen to a display device of a vehicle and a control method thereof.

A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.

As the functions of the mobile terminal become increasingly diversified, the terminal supports, for example, data and voice communication, photographing and video recording through a camera, voice recording, music file playback through a speaker system, and output of images or video to a display unit. Some terminals are additionally equipped with an electronic game function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, and television programs.

As the functions of the mobile terminal are diversified in this way, the terminal is implemented in the form of a multimedia device having complex functions such as capturing still images or video, playing music or video files, playing games, and receiving broadcasts.

In order to support and enhance such functions of the mobile terminal, improvement of the structural and/or software aspects of the terminal may be considered.

Due to the limitation of its screen size, the mobile terminal is limited in outputting various information at the same time or displaying an image at a large size. To address this problem, mirroring techniques have recently been introduced, in which video data output on the screen of a mobile terminal can be shared with an electronic device (e.g., a TV) having a relatively large screen.

On the other hand, a vehicle is a device that drives wheels to transport persons or cargo from one place to another. To increase the safety and convenience of users (e.g., a driver or passengers), the development of technologies for grafting various sensors and electronic devices onto vehicles is accelerating. In addition, various types of displays are provided in the vehicle to present to the user various information processed in the vehicle or received from the outside. However, mirroring between a mobile terminal and a vehicle remains at the level of directly outputting the screen currently displayed on the mobile terminal to the display of the vehicle.

The present invention is directed to solving the above-mentioned problems and other problems. An object of the present invention is to provide a mobile terminal, and a control method thereof, which, when an image displayed on the mobile terminal is mirrored on a display of a vehicle, can output the mirrored image and a user interface for controlling the mirrored image on different screen areas of the mobile terminal, and which allows an application to be mirrored on the display of the vehicle to be easily selected from among the applications installed on the mobile terminal.

According to an aspect of the present invention, there is provided a mobile terminal including a display unit, a communication unit for communicating with a display device of a vehicle, and a control unit for dividing the screen of the display unit into a plurality of screens including a mirroring screen and a control screen, wherein the control unit outputs a user interface for controlling the mirroring screen to the control screen and, based on a user input to the user interface, mirrors the image displayed on the mirroring screen to the display device through the communication unit.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method including: detecting occurrence of a preset event; entering a mirroring mode for a display device of a vehicle in response to the occurrence of the event; dividing the screen of the mobile terminal into a plurality of screens including a mirroring screen and a control screen; outputting a user interface for controlling the mirroring screen to the control screen; and, based on a user input to the user interface, mirroring the image displayed on the mirroring screen to the display device.
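The claimed control flow (detect event, enter mirroring mode, divide the screen, show the control UI, then mirror on user input) can be sketched in plain Python. This is only an illustrative model of the claim's steps; all class, method, and event names here are hypothetical and do not reflect an actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class MirroringController:
    """Toy model of the claimed control method (names are hypothetical)."""
    in_mirroring_mode: bool = False
    screens: list = field(default_factory=list)          # screen regions after the split
    mirrored_images: list = field(default_factory=list)  # images sent to the vehicle display

    def on_event(self, event: str) -> None:
        # Steps 1-2: a preset event (e.g. connecting to the vehicle's
        # display device) triggers entry into the mirroring mode.
        if event == "vehicle_display_connected":
            self.in_mirroring_mode = True
            self.divide_screen()

    def divide_screen(self) -> None:
        # Steps 3-4: split the display into a mirroring screen and a
        # control screen; the control UI is output on the control screen.
        self.screens = ["mirroring_screen", "control_screen"]

    def on_user_input(self, app_image: str) -> None:
        # Step 5: on user input to the control UI, the image shown on the
        # mirroring screen is mirrored to the vehicle's display device.
        if self.in_mirroring_mode:
            self.mirrored_images.append(app_image)

ctrl = MirroringController()
ctrl.on_event("vehicle_display_connected")
ctrl.on_user_input("navigation_app")
print(ctrl.screens, ctrl.mirrored_images)
```

Note that mirroring is gated on `in_mirroring_mode`: before the preset event occurs, user input produces no mirroring, which matches the ordering of the claimed steps.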

The details of other embodiments are included in the detailed description and drawings.

Effects of the mobile terminal and the control method according to the present invention will be described as follows.

According to at least one embodiment of the present invention, when the image displayed on the mobile terminal is mirrored on the display of the vehicle, the mirrored image and a user interface for controlling it are output on different screen areas of the mobile terminal, so that an application to be mirrored on the display of the vehicle can easily be selected from among the applications installed on the mobile terminal.

In addition, according to at least one embodiment of the present invention, when a specific application selected through the user interface provided by the mobile terminal is subject to traffic regulation, mirroring of that application is blocked, so that images unnecessary for the driver are not mirrored on the display of the vehicle and the driver's concentration is not lowered.
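The blocking behavior described above amounts to a gate on each mirroring request. A minimal sketch follows; the application names, the regulated-app set, and the driving condition are all invented for illustration and are not specified by the patent:

```python
# Hypothetical set of application types subject to traffic regulation.
REGULATED_APPS = {"video_player", "game", "tv"}

def may_mirror(app: str, vehicle_moving: bool) -> bool:
    """Return True if `app` may be mirrored to the vehicle display.

    Mirroring of a regulated application is blocked while the vehicle
    is moving, so that the driver's concentration is not disturbed.
    """
    if vehicle_moving and app in REGULATED_APPS:
        return False
    return True

print(may_mirror("navigation", vehicle_moving=True))    # allowed while driving
print(may_mirror("video_player", vehicle_moving=True))  # blocked while driving
```

A real implementation would presumably derive the driving state from the vehicle side of the mirroring link rather than from a boolean argument.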

Further, according to at least one embodiment of the present invention, a display on which to mirror the image of the mobile terminal can easily be selected, through the user interface, from among a plurality of displays of the vehicle. In particular, by automatically changing the shape and position of the screen area in which the mirrored image of the mobile terminal is displayed, according to the shape and position of the selected vehicle display, the disparity between the image displayed on the mobile terminal and the image displayed on the vehicle display can be eliminated.
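The automatic reshaping of the mirroring area to match the selected vehicle display can be sketched as a simple aspect-ratio fit. The function and the example dimensions are assumptions for illustration; the patent does not specify this computation:

```python
def fit_mirroring_area(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Scale a source screen area to fit inside the target display while
    preserving its aspect ratio, centering the result (letterboxing)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Center the scaled area on the target display.
    x = (dst_w - out_w) // 2
    y = (dst_h - out_h) // 2
    return x, y, out_w, out_h

# Mirror a 1080x1920 portrait phone screen onto a 1280x720 in-dash display.
print(fit_mirroring_area(1080, 1920, 1280, 720))  # → (437, 0, 405, 720)
```

Matching the mirroring area's geometry on the phone to the geometry computed for the vehicle display is one way to remove the disparity between the two images.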

The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.

FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
FIG. 2 shows a block diagram of a vehicle according to an embodiment of the present invention.
FIG. 3 shows a flow chart of an exemplary process executed by a mobile terminal during mirroring between the mobile terminal and a display device of a vehicle, in accordance with an embodiment of the present invention.
FIG. 4 is a view for explaining mirroring between a mobile terminal and a display device of a vehicle according to an embodiment of the present invention.
FIG. 5 is a view for explaining mirroring between a mobile terminal and a display device of a vehicle according to an embodiment of the present invention.
FIG. 6 is a view for explaining an exemplary operation in which a mobile terminal according to an embodiment of the present invention selects a display of the display device for mirroring an image.
FIG. 7 illustrates an exemplary operation of dividing the screen when the mobile terminal enters the mirroring mode according to an embodiment of the present invention.
FIG. 8 illustrates an exemplary operation in which the mobile terminal according to an embodiment of the present invention changes the position and size of screens according to the orientation of its main body.
FIGS. 9A and 9B illustrate an exemplary operation in which the mobile terminal according to an embodiment of the present invention mirrors a predetermined image to the display device based on a user input.
FIGS. 10A and 10B show an exemplary operation in which the mobile terminal related to FIG. 9B additionally mirrors a predetermined image to the display device based on user input.
FIGS. 11A and 11B show an exemplary operation in which the mobile terminal related to FIG. 10B changes the state of display areas included in the mirroring screen based on user input.
FIGS. 12A and 12B illustrate an exemplary operation in which the mobile terminal blocks mirroring of an application that is subject to traffic regulation, according to an embodiment of the present invention.
FIGS. 13A and 13B illustrate an exemplary operation in which the mobile terminal changes the display positions of the mirroring screen and the control screen based on a user input, according to an embodiment of the present invention.
FIGS. 14A and 14B illustrate an exemplary operation of changing the size of an image mirrored on the display device based on a user input on the mirroring screen, according to an embodiment of the present invention.
FIGS. 15A and 15B illustrate an exemplary operation in which a mobile terminal according to an embodiment of the present invention mirrors images on two or more different displays included in the display device.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not themselves have distinct meanings or roles. In the following description, a detailed description of related known art is omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings but covers all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage, except where a configuration is applicable only to mobile terminals.

FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention, viewed from different directions.

The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 1A are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include a camera 121 or an image input unit for inputting a video signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. Voice data or image data collected by the input unit 120 may be analyzed and processed according to a user's control command.

The sensing unit 140 may include one or more sensors for sensing at least one of information within the mobile terminal, surrounding environment information of the mobile terminal, and user information. For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint (finger scan) sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (see the microphone 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the mobile terminal disclosed herein may combine and utilize information sensed by at least two of these sensors.

The output unit 150 is for generating output related to the senses of sight, hearing, or touch, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and a light output unit 154. The display unit 151 may form a mutual layer structure with a touch sensor, or may be formed integrally therewith, to implement a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I/O port, a video I/O port, and an earphone port. In the mobile terminal 100, appropriate control related to a connected external device may be performed in response to the external device being connected to the interface unit 160.

In addition, the memory 170 stores data supporting various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs or applications running on the mobile terminal 100, data for operation of the mobile terminal 100, and commands. At least some of these applications may be downloaded from an external server via wireless communication. Also, at least a part of these application programs may exist on the mobile terminal 100 from the time of shipment for the basic functions (e.g., telephone call receiving function, message receiving function, and calling function) of the mobile terminal 100. Meanwhile, the application program may be stored in the memory 170, installed on the mobile terminal 100, and may be operated by the control unit 180 to perform the operation (or function) of the mobile terminal.

In addition to the operations related to the application program, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may process or process signals, data, information, and the like input or output through the above-mentioned components, or may drive an application program stored in the memory 170 to provide or process appropriate information or functions to the user.

In addition, the controller 180 may control at least some of the components illustrated in FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other in order to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with one another to implement the operation, control, or control method of a mobile terminal according to the various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.

Hereinafter, the various components of the mobile terminal 100 will be described in detail with reference to FIG. 1A.

First, referring to the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).

The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A). The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 performing such wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 112.

The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies. The short-range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located. The short-range wireless networks may be wireless personal area networks.

Here, the other mobile terminal 100 may be a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with (or interworking with) the mobile terminal 100 according to the present invention. The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Further, if the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the control unit 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can conduct the telephone conversation through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.

The location information module 115 is a module for obtaining the position (or current position) of the mobile terminal; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire its position by using signals transmitted from GPS satellites. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire its position based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. If necessary, the location information module 115 may additionally or alternatively perform any function of the other modules of the wireless communication unit 110 to obtain data on the position of the mobile terminal. The location information module 115 is a module used to obtain the position (or current position) of the mobile terminal, and is not limited to a module that directly calculates or obtains that position.

Next, the input unit 120 is for inputting image information (or signals), audio information (or signals), data, or information input from a user. For input of image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Meanwhile, the plurality of cameras 121 provided in the mobile terminal 100 may be arranged in a matrix structure, and through the cameras 121 in the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire left and right images for realizing a stereoscopic image.

The microphone 122 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 100. Meanwhile, the microphone 122 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include mechanical input means (or mechanical keys, e.g., buttons located on the front, rear, or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and touch-type input means. As an example, the touch-type input means may consist of a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed at a portion other than the touch screen. The virtual key or the visual key can be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a corresponding sensing signal. The control unit 180 may control the driving or operation of the mobile terminal 100 or may perform data processing, function or operation related to the application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object existing in the vicinity thereof, by using an electromagnetic field or infrared rays. The proximity sensor 141 may be disposed in an inner area of the mobile terminal covered by the touch screen, or in the vicinity of the touch screen.

Examples of the proximity sensor 141 include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the proximity sensor 141 can be configured to detect the proximity of a conductive object by the change of the electric field according to that object's approach. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

On the other hand, for convenience of explanation, the act of bringing an object close to the touch screen without contact so that the object is recognized as being located on the touch screen is referred to as a "proximity touch," and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object corresponds vertically to the touch screen when the object is proximity-touched. The proximity sensor 141 can detect a proximity touch and proximity touch patterns (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Meanwhile, the control unit 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 141 as described above, and can further output visual information corresponding to the processed data on the touch screen. Furthermore, the control unit 180 can control the mobile terminal 100 such that different operations or data (or information) are processed according to whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
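As a rough illustration of that last point, a controller might branch on the touch type as follows. All names, thresholds, and actions in this sketch are invented for illustration and are not taken from the patent:

```python
# Hedged sketch: dispatching different operations depending on whether the
# same point on the touch screen is proximity-touched or contact-touched.
# The 20 mm proximity-detection range and the actions are assumptions.

def handle_touch(x: int, y: int, distance_mm: float) -> str:
    """distance_mm == 0 means the object actually contacts the screen."""
    if distance_mm == 0:
        return f"contact touch at ({x}, {y}): select item"
    elif distance_mm <= 20:  # within assumed proximity-detection range
        return f"proximity touch at ({x}, {y}): show preview"
    return "no touch"

print(handle_touch(120, 80, 0))    # contact touch on the point
print(handle_touch(120, 80, 5.0))  # proximity touch over the same point
```

The same coordinates thus trigger different processing according to the touch type, as the paragraph above describes.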

The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen is touched on the touch sensor, as well as the pressure and capacitance at the time of the touch. Here, the touch object is an object that applies a touch to the touch sensor, such as a finger, a touch pen, a stylus pen, or a pointer.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 180. In this way, the control unit 180 can determine which area of the display unit 151 has been touched. Here, the touch controller may be a component separate from the control unit 180, or may be the control unit 180 itself.
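The signal path described above can be sketched as follows. The class and method names are invented for illustration; the patent does not specify this interface:

```python
# Hedged sketch of the touch signal path: the touch sensor raises raw
# signals, a touch controller converts them into coordinate data, and the
# control unit 180 determines which area of the display was touched.

class TouchController:
    """Processes raw sensor signals into (x, y) coordinate data."""
    def process(self, raw_signal: dict) -> tuple[int, int]:
        # e.g. map an electrode grid cell to screen coordinates (assumed
        # 10-pixel electrode pitch, purely illustrative)
        return raw_signal["col"] * 10, raw_signal["row"] * 10

class ControlUnit180:
    """Receives the processed data and identifies the touched area."""
    def on_touch(self, x: int, y: int) -> str:
        return "upper half" if y < 100 else "lower half"

touch_controller = TouchController()
control_unit = ControlUnit180()

x, y = touch_controller.process({"col": 3, "row": 4})
print(control_unit.on_touch(x, y))
```

As the text notes, the touch controller may also be the control unit 180 itself, in which case the two classes would collapse into one.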

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of an object to be sensed using ultrasonic waves. Meanwhile, the control unit 180 can calculate the position of a wave generating source from information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave generating source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave generating source can be calculated from the arrival-time difference of the ultrasonic wave, using the light as a reference signal.
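The distance calculation implied above can be sketched as follows; the constant and function name are illustrative assumptions, not from the patent:

```python
# Illustrative sketch: estimating the distance to a wave source from the
# arrival-time difference between light and ultrasound. Light arrives
# almost instantly, so it serves as the t = 0 reference signal.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def distance_from_delay(t_light: float, t_ultrasound: float) -> float:
    """Distance (m) to the source, given the arrival times (s) at the
    optical sensor and at one ultrasonic sensor."""
    delay = t_ultrasound - t_light  # light is treated as instantaneous
    if delay < 0:
        raise ValueError("ultrasound cannot arrive before light")
    return SPEED_OF_SOUND * delay

print(distance_from_delay(0.0, 0.01))  # 10 ms delay -> about 3.43 m
```

With distances to two or more ultrasonic sensors at known positions, the two-dimensional source position can then be found by intersecting the corresponding circles, which is why the text refers to a plurality of ultrasonic sensors.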

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of a sensing object with respect to a three-dimensional stereoscopic image. The photosensor may be laminated on the display element and is configured to scan the movement of an object to be sensed in proximity to the touch screen. More specifically, the photosensor mounts photodiodes and transistors (TRs) in rows and columns, and scans the content placed on the photosensor using an electrical signal that varies according to the amount of light applied to the photodiodes. That is, the photosensor calculates the coordinates of the object to be sensed according to the amount of change of light, and position information of the object can be obtained through this calculation.
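One plausible form of that coordinate calculation is an intensity-weighted centroid over the grid of light changes. This is a hedged sketch under that assumption; the patent does not specify the algorithm:

```python
# Hedged sketch: recovering object coordinates from a photosensor grid.
# Each cell holds the change in light level at one photodiode; the object
# position is taken as the intensity-weighted centroid of those changes.

def sensed_position(delta_light: list[list[float]]) -> tuple[float, float]:
    total = sum(sum(row) for row in delta_light)
    if total == 0:
        raise ValueError("no light change detected")
    x = sum(c * v for row in delta_light for c, v in enumerate(row)) / total
    y = sum(r * v for r, row in enumerate(delta_light) for v in row) / total
    return x, y

grid = [
    [0.0, 0.0, 0.0],
    [0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0],
]
print(sensed_position(grid))  # centroid between columns 1 and 2 of row 1
```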

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven by the mobile terminal 100 or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information .

Also, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 can generate various other tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection or suction port, a graze against the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sensation of cold or warmth using an endothermic or exothermic element.

The haptic module 153 can deliver a tactile effect through direct contact, and can also be implemented so that the user feels the tactile effect through the muscular sense of a finger or arm. Two or more haptic modules 153 may be provided depending on the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

A signal output by the light output unit 154 is implemented as the mobile terminal emitting light of a single color or a plurality of colors from its front or rear surface. The signal output may be terminated when the mobile terminal detects that the user has confirmed the event.

The interface unit 160 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, receives power and delivers it to each component in the mobile terminal 100, or transmits data from the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various information for authenticating the usage right of the mobile terminal 100, and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 160.

The interface unit 160 may serve as a path through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as a signal for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 170 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 170 on the Internet.

Meanwhile, as described above, the control unit 180 controls operations related to application programs and, typically, the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the control unit 180 may set or release a lock state that restricts input of a user's control commands to applications.

In addition, the control unit 180 performs control and processing related to voice communication, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the control unit 180 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the control unit 180 and supplies the power necessary for the operation of each component. The power supply unit 190 includes a battery; the battery may be an internal battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging and the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger for supplying power to charge the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery wirelessly, without using the connection port. In this case, the power supply unit 190 can receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Referring to FIGS. 1B and 1C, the disclosed mobile terminal 100 includes a bar-shaped terminal body. However, the present invention is not limited thereto and can be applied to various structures, such as a watch type, a clip type, a glasses type, or a folder type, flip type, slide type, swing type, or swivel type in which two or more bodies are coupled so as to be movable relative to each other. Although a description may relate to a particular type of mobile terminal, it is generally applicable to other types of mobile terminals as well.

Here, the terminal body can be understood as a concept of referring to the mobile terminal 100 as at least one aggregate.

The mobile terminal 100 includes a case (for example, a frame, a housing, a cover, and the like) that forms an appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in the inner space formed by the combination of the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.

A display unit 151 is disposed on the front surface of the terminal body to output information. The window 151a of the display unit 151 may be mounted on the front case 101 to form the front surface of the terminal body together with the front case 101.

In some cases, electronic components may also be mounted on the rear case 102. Electronic parts that can be mounted on the rear case 102 include detachable batteries, an identification module, a memory card, and the like. In this case, a rear cover 103 for covering the mounted electronic components can be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is separated from the rear case 102, the electronic parts mounted on the rear case 102 are exposed to the outside.

As shown, when the rear cover 103 is coupled to the rear case 102, a side portion of the rear case 102 can be exposed. In some cases, the rear case 102 may be completely covered by the rear cover 103 during the engagement. Meanwhile, the rear cover 103 may be provided with an opening for exposing the camera 121b and the sound output unit 152b to the outside.

These cases 101, 102, and 103 may be formed by injection molding of synthetic resin or may be formed of metal such as stainless steel (STS), aluminum (Al), titanium (Ti), or the like.

Unlike the above example, in which a plurality of cases provide an internal space for accommodating various electronic components, the mobile terminal 100 may be configured such that one case provides the internal space. In this case, a uni-body mobile terminal 100, in which synthetic resin or metal extends from the side surface to the rear surface, can be realized.

Meanwhile, the mobile terminal 100 may include a waterproof unit (not shown) for preventing water from penetrating into the terminal body. For example, the waterproof unit may include a waterproof member provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, sealing the internal space when those parts are coupled.

The mobile terminal 100 is provided with a display unit 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output unit 154, first and second cameras 121a and 121b, first and second operation units 123a and 123b, a microphone 122, an interface unit 160, and the like.

As shown in FIGS. 1B and 1C, the following description takes as an example a mobile terminal 100 in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first operation unit 123a are disposed on the front surface of the terminal body; the second operation unit 123b, the microphone 122, and the interface unit 160 are disposed on a side surface of the terminal body; and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body.

However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or placed on different planes. For example, the first operation unit 123a may not be provided on the front surface of the terminal body, and the second sound output unit 152b may be provided on the side surface of the terminal body rather than the rear surface of the terminal body.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven by the mobile terminal 100 or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information .

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3-dimensional (3D) display, and an e-ink display.

In addition, two or more display units 151 may be present depending on the implementation form of the mobile terminal 100. In this case, the plurality of display units may be spaced apart from one another or disposed integrally on one surface of the mobile terminal 100, or may be disposed on different surfaces.

The display unit 151 may include a touch sensor that senses a touch on the display unit 151 so that a control command can be received by a touch method. When the display unit 151 is touched, the touch sensor senses the touch, and the control unit 180 generates a control command corresponding to the touch. The content input by the touch method may be letters or numbers, or instructions or designatable menu items in various modes.

The touch sensor may be configured as a film having a touch pattern, disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or as a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or inside the display.

In this way, the display unit 151 can form a touch screen together with the touch sensor. In this case, the touch screen can function as a user input unit 123 (see FIG. 1A). In some cases, the touch screen may replace at least some functions of the first operation unit 123a.

The first sound output unit 152a may be implemented as a receiver that delivers a call sound to the user's ear, and the second sound output unit 152b may be implemented as a loud speaker that outputs various alarm sounds or multimedia playback sounds.

The window 151a of the display unit 151 may be provided with an acoustic hole for emitting the sound generated by the first sound output unit 152a. However, the present invention is not limited to this, and the sound may be configured to be emitted along an assembly gap between structures (for example, a gap between the window 151a and the front case 101). In this case, the hole independently formed for sound output is not visible or is otherwise hidden, so the appearance of the mobile terminal 100 can be made simpler.

The optical output unit 154 is configured to output light for notifying the occurrence of an event. Examples of the event include a message reception, a call signal reception, a missed call, an alarm, a schedule notification, an email reception, and reception of information through an application. The control unit 180 may control the light output unit 154 to start or stop the output of light when the event confirmation of the user is detected.

The first camera 121a processes an image frame of a still image or a moving image obtained by the image sensor in the photographing mode or the video communication mode. The processed image frame can be displayed on the display unit 151 and can be stored in the memory 170.

The first and second operation units 123a and 123b are examples of the user input unit 123, operated to receive commands for controlling the operation of the mobile terminal 100, and may be collectively referred to as a manipulating portion. The first and second operation units 123a and 123b can employ any manner of operation as long as the user operates them with a tactile feeling, such as by touching, pushing, or scrolling. The first and second operation units 123a and 123b may also employ a manner in which the user operates them without a tactile feeling, through a proximity touch, a hovering touch, or the like.

In this figure, the first operation unit 123a is illustrated as a touch key, but the present invention is not limited thereto. For example, the first operation unit 123a may be a mechanical key, or a combination of a touch key and a mechanical key.

The content input by the first and second operation units 123a and 123b can be set in various ways. For example, the first operation unit 123a may receive commands such as menu, home key, cancel, and search, and the second operation unit 123b may receive commands such as adjusting the volume of the sound output from the first or second sound output unit 152a or 152b and switching the display unit 151 to a touch recognition mode.

On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the user input unit 123. The rear input unit is operated to receive commands for controlling the operation of the mobile terminal 100, and the input content may be set in various ways. For example, commands such as power on/off, start, end, and scrolling, adjustment of the volume of the sound output from the first and second sound output units 152a and 152b, and switching of the display unit 151 to a touch recognition mode can be input. The rear input unit may be implemented in a form allowing touch input, push input, or a combination thereof.

The rear input unit may be disposed so as to overlap with the front display unit 151 in the thickness direction of the terminal body. For example, the rear input unit may be disposed at the rear upper end of the terminal body such that when the user holds the terminal body with one hand, the rear input unit can be easily manipulated using the index finger. However, the present invention is not limited thereto, and the position of the rear input unit may be changed.

When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using it can be realized. In addition, when the touch screen or rear input unit described above replaces at least some functions of the first operation unit 123a provided on the front surface of the terminal body, so that the first operation unit 123a is not disposed on the front surface, the display unit 151 can be configured with a larger screen.

Meanwhile, the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing the fingerprint of the user, and the controller 180 may use the fingerprint information sensed through the fingerprint recognition sensor as authentication means. The fingerprint recognition sensor may be embedded in the display unit 151 or the user input unit 123.

The microphone 122 is configured to receive the user's voice, other sounds, and the like. The microphone 122 may be provided at a plurality of locations to receive stereophonic sound.

The interface unit 160 is a path through which the mobile terminal 100 can be connected to external devices. For example, the interface unit 160 may be at least one of a connection terminal for connecting to another device (for example, an earphone or an external speaker), a port for short-range communication (for example, an IrDA port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented as a socket for accommodating an external card, such as a SIM (Subscriber Identification Module), a UIM (User Identity Module), or a memory card for storing information.

A second camera 121b may be disposed on the rear surface of the terminal body. In this case, the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a.

The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may be arranged in a matrix form. Such a camera can be named an 'array camera'. When the second camera 121b is configured as an array camera, images can be taken in various ways using a plurality of lenses, and a better quality image can be obtained.

The flash 124 may be disposed adjacent to the second camera 121b. The flash 124 shines light toward the subject when the subject is photographed by the second camera 121b.

A second sound output unit 152b may be additionally disposed in the terminal body. The second sound output unit 152b may implement a stereo function together with the first sound output unit 152a, and may be used to implement a speakerphone mode during a call.

The body of the mobile terminal 100 may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the rear cover 103, or a case including a conductive material may be configured to function as an antenna.

The body of the mobile terminal 100 is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100. The power supply unit 190 may include a battery 191 built in the terminal body or detachable from the outside of the terminal body.

The battery 191 may be configured to receive power through a power cable connected to the interface unit 160. In addition, the battery 191 may be configured to be wirelessly chargeable through a wireless charger. The wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).

The rear cover 103 is coupled to the rear case 102 so as to cover the battery 191, restricting release of the battery 191 and protecting the battery 191 from external impact and foreign matter. When the battery 191 is configured to be detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear case 102.

The mobile terminal 100 may be provided with an accessory that protects its appearance or supports or extends its functions. One example of such an accessory is a cover or pouch that covers or accommodates at least one surface of the mobile terminal 100. The cover or pouch may be configured to interlock with the display unit 151 to extend the functions of the mobile terminal 100. Another example of an accessory is a touch pen for supplementing or extending touch input to the touch screen.

Hereinafter, embodiments related to a control method that can be implemented in the mobile terminal 100 configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIG. 2 shows a block diagram of a vehicle 200 according to an embodiment of the present invention.

The vehicle 200 may include a communication unit 210, an input unit 220, a memory 230, an output unit 240, a vehicle driving unit 250, a sensing unit 260, a control unit 270, an interface unit 280, and a power supply unit 290.

The communication unit 210 may include one or more modules that enable wireless communication between the vehicle 200 and an external device (e.g., portable terminal, external server, other vehicle). In addition, the communication unit 210 may include one or more modules that connect the vehicle 200 to one or more networks.

The communication unit 210 may include a broadcast receiving module 211, a wireless Internet module 212, a local area communication module 213, a location information module 214, and an optical communication module 215.

The broadcast receiving module 211 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 212 is a module for wireless Internet access, and may be embedded in the vehicle 200 or externally. The wireless Internet module 212 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 212 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 212 can exchange data wirelessly with an external server. The wireless Internet module 212 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group) information) from an external server.

The short-range communication module 213 is for short-range communication, and can support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.

The short-range communication module 213 may form short-range wireless communication networks to perform short-range communication between the vehicle 200 and at least one external device. For example, the short range communication module 213 can exchange data wirelessly with the passenger's portable terminal. The short-range communication module 213 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group)) from a portable terminal or an external server. For example, when the user has boarded the vehicle 200, the portable terminal of the user and the vehicle 200 can perform pairing with each other automatically or by execution of the user's application.

The position information module 214 is a module for obtaining the position of the vehicle 200, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from the GPS satellite.

The optical communication module 215 may include a light emitting portion and a light receiving portion.

The light receiving portion can receive information by converting a light signal into an electric signal. The light receiving portion may include a photodiode (PD) for receiving light. The photodiode converts light into an electric signal. For example, the light receiving portion can receive information of a front vehicle through light emitted from a light source included in the front vehicle.

The light emitting portion may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The light emitting portion converts an electric signal into an optical signal and transmits it to the outside. For example, the light emitting portion can emit an optical signal to the outside by blinking the light emitting element at a predetermined frequency. According to an embodiment, the light emitting portion may include a plurality of light emitting element arrays. According to an embodiment, the light emitting portion can be integrated with a lamp provided in the vehicle 200. For example, the light emitting portion may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a car light. For example, the optical communication module 215 can exchange data with another vehicle through optical communication.
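The blinking-based transmission described above can be sketched as a toy on-off-keying scheme. The function name, the bit-per-period encoding, and the 10 ms period are illustrative assumptions for this sketch, not details taken from this disclosure.

```python
# Toy sketch of the light emitting portion's blinking transmission:
# each bit of the electrical signal maps to the LED being on or off
# for one period of an assumed blink frequency (hypothetical scheme).
def to_blink_pattern(bits, period_ms=10):  # period_ms is an assumed parameter
    """Return (led_on, duration_ms) pairs that would drive the LED."""
    return [(bit == 1, period_ms) for bit in bits]

print(to_blink_pattern([1, 0, 1]))  # [(True, 10), (False, 10), (True, 10)]
```

A receiver would invert this mapping by sampling the photodiode at the same assumed period.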

The input unit 220 may include a driving operation unit 221, a camera 222, a microphone 223, and a user input unit 224.

The driving operation means 221 receives a user input for driving the vehicle 200. The driving operation means 221 may include a steering input means 221a, a shift input means 221b, an acceleration input means 221c and a brake input means 221d.

The steering input means 221a receives a forward direction input of the vehicle 200 from the user. The steering input means 221a may include a steering wheel. According to the embodiment, the steering input means 221a may be formed of a touch screen, a touch pad, or a button.

The shift input means 221b receives the input of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle 200 from the user. The shift input means 221b is preferably formed in a lever shape. According to the embodiment, the shift input means 221b may be formed of a touch screen, a touch pad, or a button.

The acceleration input means 221c receives an input for acceleration of the vehicle 200 from the user. The brake input means 221d receives an input for deceleration of the vehicle 200 from the user. The acceleration input means 221c and the brake input means 221d are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means 221c or the brake input means 221d may be formed of a touch screen, a touch pad or a button.

The camera 222 is disposed at one side of the interior of the vehicle 200 and generates an indoor image of the vehicle 200. For example, the camera 222 may be disposed at various positions of the vehicle 200, such as the dashboard surface, the roof surface, or the rear view mirror, to photograph a passenger of the vehicle 200. In this case, the camera 222 can generate an indoor image of an area including the driver's seat of the vehicle 200. The camera 222 can also generate an indoor image of an area including the driver's seat and the front passenger seat of the vehicle 200. The indoor image generated by the camera 222 may be a two-dimensional image and/or a three-dimensional image. To generate a three-dimensional image, the camera 222 may include at least one of a stereo camera, a depth camera, and a three-dimensional laser scanner. The camera 222 can provide the generated indoor image to the control unit 270, which is functionally coupled to the camera 222.

The control unit 270 can analyze the indoor image provided from the camera 222 and detect various objects. For example, the control unit 270 can detect the sight line and / or the gesture of the driver from the portion corresponding to the driver's seat area in the indoor image. As another example, the control unit 270 can detect the sight line and / or the gesture of the passenger from the portion corresponding to the indoor area excluding the driver's seat area in the indoor image. Of course, the sight line and / or the gesture of the driver and the passenger may be detected at the same time.

The microphone 223 can process an external acoustic signal into electrical data. The processed data can be utilized variously according to functions performed in the vehicle 200. The microphone 223 can convert the voice command of the user into electrical data. The converted electrical data may be transmitted to the control unit 270.

The camera 222 or the microphone 223 may be a component included in the sensing unit 260, not a component included in the input unit 220.

The user input unit 224 is for receiving information from a user. When information is input through the user input unit 224, the control unit 270 can control the operation of the vehicle 200 to correspond to the input information. The user input unit 224 may include touch input means or mechanical input means. According to an embodiment, the user input unit 224 may be located in one area of the steering wheel. In this case, the driver can operate the user input unit 224 with a finger while holding the steering wheel.

The input unit 220 may include a plurality of buttons or a touch sensor. It is also possible to perform various input operations through a plurality of buttons or touch sensors.

The sensing unit 260 senses signals related to the running of the vehicle 200 and the like. To this end, the sensing unit 260 may include a collision sensor, a steering sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, an infrared sensor, a radar, a lidar, and the like.

Accordingly, the sensing unit 260 can acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, and the like. The control unit 270 can generate a control signal for acceleration, deceleration, direction change, and the like of the vehicle 200 based on external environment information obtained by at least one of the camera, the ultrasonic sensor, the infrared sensor, the radar, and the lidar provided in the vehicle 200. Here, the external environment information may be information related to various objects located within a predetermined distance from the vehicle 200 in motion. For example, the external environment information may include information on the number of obstacles located within a distance of 100 m from the vehicle 200, the distance to each obstacle, the size of each obstacle, the type of each obstacle, and the like.

In addition, the sensing unit 260 may include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 260 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires biometric information of a passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information. Here, the camera 222 and the microphone 223 can operate as such sensors. The biometric information sensing unit can acquire hand geometry information and facial recognition information through the camera 222.

The sensing unit 260 may include at least one camera 261 that photographs the outside of the vehicle 200. The camera 261 may be referred to as an external camera. For example, the sensing unit 260 may include a plurality of cameras 261 disposed at different positions of the vehicle exterior. The camera 261 may include an image sensor and an image processing module. The camera 261 can process a still image or a moving image obtained by an image sensor (for example, CMOS or CCD). The image processing module processes the still image or moving image obtained through the image sensor, extracts necessary information, and transmits the extracted information to the controller 270.

The camera 261 may include an image sensor (e.g., CMOS or CCD) and an image processing module, and can process still images or moving images obtained by the image sensor. The image processing module can process the still image or moving image obtained through the image sensor. In addition, the camera 261 may acquire an image including at least one of a traffic light, a traffic sign, a pedestrian, another vehicle, and a road surface.

The output unit 240 may include a display device 241, an audio output unit 242, and a haptic output unit 243 for outputting information processed by the control unit 270.

The display device 241 can display information processed by the control unit 270. For example, the display device 241 can display vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for guiding the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.

The display device 241 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display device 241 may form a mutual layer structure with a touch sensor or may be formed integrally with one, thereby realizing a touch screen. Such a touch screen may function as the user input unit 224 that provides an input interface between the vehicle 200 and a user, and may also provide an output interface between the vehicle 200 and the user. In this case, the display device 241 may include a touch sensor that senses a touch on the display device 241 so as to receive a control command by a touch method. When the display device 241 is touched, the touch sensor senses the touch, and the control unit 270 generates a control command corresponding to the touch. The content input by the touch method may be a letter or a number, or an instruction or designatable menu item in various modes.

Meanwhile, the display device 241 may include a cluster so that the driver can check vehicle state information or vehicle driving information while driving. The cluster can be located on the dashboard. In this case, the driver can check the information displayed on the cluster while keeping his or her gaze ahead of the vehicle.

Meanwhile, according to an embodiment, the display device 241 may include at least one of a transparent display and a head-up display. A transparent display has a transmissivity above a certain level; like common glass, it allows a user to perceive an object located on the opposite side of the display. When such a transparent display is disposed in the windshield of the vehicle 200, the user's front view is not obstructed, and the user can check various information related to the vehicle 200 while looking ahead.

The head-up display can output various images to the windshield; in this case, the windshield can function as a projection surface. To this end, the head-up display may include a projector. Since the head-up display can output various information through well-known methods, a detailed description thereof will be omitted.

The sound output unit 242 converts an electric signal from the control unit 270 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 242 may include a speaker or the like. The sound output unit 242 can also output a sound corresponding to the operation of the user input unit 224.

The haptic output unit 243 generates a tactile output. For example, the haptic output unit 243 can vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.

The vehicle drive unit 250 can control the operation of various devices of the vehicle. The vehicle drive unit 250 may include a power source driving unit 251, a steering driving unit 252, a brake driving unit 253, a lamp driving unit 254, an air conditioning driving unit 255, a window driving unit 256, an airbag driving unit 257, a sunroof driving unit 258, and a wiper driving unit 259.

The power source driving unit 251 may perform electronic control of the power source in the vehicle 200. The power source driving unit 251 may include an accelerating device for increasing the speed of the vehicle 200 and a decelerating device for decreasing the speed of the vehicle 200.

For example, when a fossil fuel-based engine (not shown) is the power source, the power source driving unit 251 can perform electronic control of the engine, thereby controlling the output torque of the engine and the like. When the power source is an engine, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 270.

As another example, when the electric motor (not shown) is a power source, the power source driving unit 251 can perform control on the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.

The steering driving unit 252 may include a steering apparatus, and can thus perform electronic control of the steering apparatus in the vehicle 200. For example, the steering driving unit 252 may be provided with a steering torque sensor, a steering angle sensor, and a steering motor, and the steering torque applied by the driver to the steering wheel may be sensed by the steering torque sensor. The steering driving unit 252 can control the steering force and the steering angle by changing the magnitude and direction of the current applied to the steering motor based on the speed of the vehicle 200, the steering torque, and the like. The steering driving unit 252 can also determine whether the traveling direction of the vehicle 200 is being adjusted properly based on the steering angle information obtained by the steering angle sensor. Thereby, the traveling direction of the vehicle can be changed. In addition, the steering driving unit 252 can lighten the feel of the steering wheel by increasing the steering force of the steering motor when the vehicle 200 travels at a low speed, and make the steering wheel feel heavier by decreasing the steering force of the steering motor when the vehicle 200 travels at a high speed. When the autonomous driving function of the vehicle 200 is executed, the steering driving unit 252 may control the steering motor to generate an appropriate steering force based on a sensing signal output by the sensing unit 260 or a control signal provided by the control unit 270, even in a situation where the driver does not operate the steering wheel.

The brake driving unit 253 may perform electronic control of a brake apparatus (not shown) in the vehicle 200. For example, the speed of the vehicle 200 can be reduced by controlling the operation of the brakes disposed on the wheels. As another example, the traveling direction of the vehicle 200 can be adjusted to the left or right by operating the brakes disposed on the left and right wheels differently.

The lamp driving unit 254 can control the turn-on/turn-off of at least one lamp disposed inside or outside the vehicle. The lamp driving unit 254 may include a lighting device. Further, the lamp driving unit 254 can control the intensity, direction, and the like of the light output from each lamp included in the lighting device. For example, it can perform control of a turn signal lamp, a head lamp, a brake lamp, and the like.

The air conditioning driving unit 255 may perform electronic control on an air conditioner (not shown) in the vehicle 200. For example, when the temperature inside the vehicle is high, the air conditioner can be operated to control the cool air to be supplied to the inside of the vehicle.

The window driving unit 256 may perform electronic control of a window apparatus in the vehicle 200. For example, it can control the opening or closing of the left and right windows on the sides of the vehicle.

The airbag driving unit 257 may perform electronic control of an airbag apparatus in the vehicle 200. For example, in case of danger, the airbag can be controlled to deploy.

The sunroof driving unit 258 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 200. For example, the opening or closing of the sunroof can be controlled.

The wiper driving unit 259 may control the wipers 14a and 14b provided in the vehicle 200. For example, upon receiving a user input commanding the wipers to be driven through the user input unit 224, the wiper driving unit 259 may perform electronic control of the number of driving operations, the driving speed, and the like of the wipers 14a and 14b in response to the user input. As another example, the wiper driving unit 259 may determine the amount or intensity of rainwater based on a sensing signal of a rain sensor included in the sensing unit 260, and may automatically drive the wipers 14a and 14b without user input.

Meanwhile, the vehicle drive unit 250 may further include a suspension driving unit (not shown). The suspension driving unit may perform electronic control of a suspension apparatus (not shown) in the vehicle 200. For example, when the road surface is uneven, the suspension apparatus can be controlled so as to reduce the vibration of the vehicle 200.

The memory 230 is electrically connected to the control unit 270. The memory 230 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 230 can be any of various storage devices such as a ROM, RAM, EPROM, flash drive, or hard drive. The memory 230 may store various data for the overall operation of the vehicle 200, such as a program for processing or control by the control unit 270.

The interface unit 280 may serve as a pathway to various kinds of external devices connected to the vehicle 200. For example, the interface unit 280 may include a port that can be connected to the portable terminal, and may be connected to the portable terminal through the port. In this case, the interface unit 280 can exchange data with the portable terminal.

The interface unit 280 may receive turn signal information. Here, the turn signal information may be a turn-on signal of the turn signal lamp for a left turn or a right turn input by the user. When a left or right turn signal turn-on input is received through the user input unit (724 in Fig. 6) of the vehicle, the interface unit 280 can receive the left turn signal information or the right turn signal information.

The interface unit 280 may receive vehicle speed information, rotation angle information of the steering wheel, or gear shift information. The interface unit 280 may receive the vehicle speed information, steering wheel rotation angle information, or gear shift information sensed through the sensing unit 260 of the vehicle. Alternatively, the interface unit 280 may receive the vehicle speed information, the steering wheel rotation angle information, or the gear shift information from the control unit 270 of the vehicle. Here, the gear shift information may be information on which state the shift lever of the vehicle is in. For example, the gear shift information may be information on which of the parking (P), reverse (R), neutral (N), and drive (D) states the shift lever is in.

The interface unit 280 may receive user input received via the user input unit 224 of the vehicle 200. The interface unit 280 may receive the user input from the input unit 220 of the vehicle 200 or via the control unit 270.

The interface unit 280 can receive information obtained from an external device. For example, when traffic light change information is received from an external server through the communication unit 210 of the vehicle 200, the interface unit 280 can receive the traffic light change information from the control unit 270.

The control unit 270 can control the overall operation of each unit in the vehicle 200. The control unit 270 may be referred to as an ECU (Electronic Control Unit).

The control unit 270 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions.

The power supply unit 290 can supply the power necessary for the operation of each component under the control of the control unit 270. In particular, the power supply unit 290 can receive power from a battery (not shown) in the vehicle.

The AVN (Audio Video Navigation) device 400 can exchange data with the control unit 270. The control unit 270 may receive navigation information from the AVN apparatus or a separate navigation apparatus (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information about the vehicle driving, or vehicle location information.

On the other hand, some of the components shown in Fig. 2 may not be essential to the implementation of the vehicle 200. Thus, the vehicle 200 described herein may have more or fewer components than those listed above.

The mirroring between the mobile terminal 100 and the display device 241 of the vehicle 200 according to the embodiments of the present invention is performed between the mobile terminal 100, which transmits data of an image to be mirrored to the display device 241 of the vehicle 200, and the display device 241, which displays an image corresponding to the data transmitted from the mobile terminal 100. That is, the mobile terminal 100 provides the image to the display device 241, and the display device 241 receives the image from the mobile terminal 100.

When a plurality of displays are included in the display device 241, the mobile terminal 100 may be connected 1:1 to one of the plurality of displays to perform mirroring, or may be connected 1:N (where N is an integer of 2 or greater) to two or more of the displays to perform mirroring. In the latter case, when the mobile terminal 100 is connected 1:N to two or more displays included in the display device 241 and performs mirroring, the image mirrored on one display may be different from the image mirrored on another display. That is, the mobile terminal 100 can mirror different images on different displays of the display device 241.
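As a minimal illustration of the 1:N case, each selected display can be paired with its own image source. The function, the display identifiers, and the image-source names below are hypothetical, used only to show the one-image-per-display mapping.

```python
# Hypothetical sketch of 1:N mirroring: each display gets a distinct image.
def assign_mirroring_images(display_ids, image_sources):
    """Pair each selected display with its own image source, in order."""
    if len(display_ids) != len(image_sources):
        raise ValueError("one image source per selected display")
    return dict(zip(display_ids, image_sources))

# e.g. a map on the navigation display, a media UI on the dashboard display
print(assign_mirroring_images(["241a", "241b"], ["map_app", "media_app"]))
```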

In the following embodiments, a statement that the mobile terminal 100 performs an operation may mean that the control unit 180 of the mobile terminal 100 performs the corresponding operation using the other components of the mobile terminal 100. For example, the data transmission/reception operation with the display device 241 may be performed by the control unit 180 using the communication unit 110 or the interface unit 160. As another example, the display operation of certain information may be performed by the control unit 180 using the display unit 151.

FIG. 3 is a flowchart of an exemplary process S300 executed by the mobile terminal 100 for mirroring between the mobile terminal 100 and the display device 241 of the vehicle 200 according to an embodiment of the present invention.

In step S310, the mobile terminal 100 may determine whether a predetermined event has occurred. Here, the predetermined event may be, for example, (i) entry of the mobile terminal 100 into the vehicle 200, (ii) tagging of the mobile terminal 100 to an NFC module included in the vehicle 200, (iii) connection of the mobile terminal 100 to a USB port of the vehicle 200, (iv) a voice command to enter the mirroring mode, or (v) execution of a mirroring application.

Specifically, the occurrence of an entry event of the mobile terminal 100 into the vehicle 200 can be detected by automatic vehicle search by the mobile terminal 100. For example, when the signal strength of a wireless signal (e.g., a Bluetooth signal or a Wi-Fi signal) containing the unique identifier of the vehicle 200 and transmitted by the vehicle 200 is equal to or greater than a threshold value, the mobile terminal 100 can determine that it is currently located in the vehicle 200. In addition, the NFC module included in the vehicle 200 may hold data including an instruction to enter the mirroring mode, and the mobile terminal 100 may determine whether to enter the mirroring mode based on the data read from the NFC module.
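The automatic vehicle search above reduces to a threshold test on received signal strength. The sketch below assumes a scan result given as (identifier, RSSI) pairs and a -60 dBm threshold; both the function name and the threshold value are illustrative assumptions, since the disclosure only requires "a threshold value."

```python
RSSI_THRESHOLD_DBM = -60  # assumed value; the disclosure only says "a threshold"

def detect_vehicle_entry(scanned_beacons, vehicle_id, threshold=RSSI_THRESHOLD_DBM):
    """scanned_beacons: (identifier, rssi_dbm) pairs from a Bluetooth/Wi-Fi scan.

    Returns True when a signal carrying the vehicle's unique identifier is
    received at or above the threshold signal strength."""
    return any(ident == vehicle_id and rssi >= threshold
               for ident, rssi in scanned_beacons)

beacons = [("HOME-AP", -40), ("VEHICLE-200", -55)]
print(detect_vehicle_entry(beacons, "VEHICLE-200"))  # True: -55 dBm >= -60 dBm
```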

The mobile terminal 100 may perform step S320 upon occurrence of a predetermined event.

In step S320, the mobile terminal 100 may enter the mirroring mode in response to the occurrence of the predetermined event. The mirroring mode may be a mode for forming a wired and/or wireless communication network for image sharing between the mobile terminal 100 and the display device 241 and for control of the shared image. For example, the wired communication connection between the mobile terminal 100 and the display device 241 can be established using the interface unit 160, which supports High-Definition Multimedia Interface (HDMI). As another example, the wireless communication connection between the mobile terminal 100 and the display device 241 may be implemented using any one or more of the following technologies: Wi-Fi (Wireless Fidelity), Wi-Fi Direct, infrared, ZigBee, Near Field Communication (NFC), RFID (Radio Frequency IDentification), Bluetooth, UltraWideBand (UWB), and the like.

Through the wired and/or wireless communication network with the display device 241 formed upon entry into the mirroring mode, the mobile terminal 100 can receive from the display device 241 the characteristic information of each display included in the display device 241. For example, the characteristic information may include the size, aspect ratio, resolution, position, shape, and the like of the display.

In step S330, the mobile terminal 100 may divide the entire screen of the display unit 151 into a plurality of screens including a mirroring screen and a control screen. For example, the mirroring screen and the control screen may be displayed in a form overlaid on the entire screen of the display unit 151.

At this time, the mobile terminal 100 may determine the shape and display position of the mirroring screen according to the shape and position of the display selected by the user from among the at least one display included in the display device 241. For example, the aspect ratio of the mirroring screen can be determined based on the aspect ratio of the selected display. As another example, based on the position of the selected display in the vehicle 200, the mobile terminal 100 can determine at which position of the display unit 151 the mirroring screen is to be displayed.
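One way to read this step in code is to derive the mirroring screen's rectangle from the selected display's characteristic information. The half-height rule and the 'left'/'right' position vocabulary are assumptions made for this sketch; the disclosure only says that the aspect ratio and the position of the selected display are taken into account.

```python
def mirroring_screen_rect(terminal_w, terminal_h, display_info):
    """Return (x, y, w, h) of the mirroring screen on the display unit 151.

    display_info models characteristic information received from the display
    device 241; only 'aspect' (width/height) and 'position' are used here."""
    h = terminal_h // 2                          # assumed: half the terminal height
    w = min(int(h * display_info["aspect"]), terminal_w)
    x = 0 if display_info["position"] == "left" else terminal_w - w
    return (x, 0, w, h)

# A wide display mounted toward the driver's (left) side of the dashboard:
print(mirroring_screen_rect(1080, 1920, {"aspect": 4.0, "position": "left"}))
```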

When two or more displays are selected by the user from among the plurality of displays included in the display device 241, the mobile terminal 100 can change the number of mirroring screens according to the number of selected displays. For example, when three of the displays included in the display device 241 are selected, the mobile terminal 100 may display three different mirroring screens on the display unit 151.

In step S340, the mobile terminal 100 may output a user interface for controlling the mirroring screen to the control screen. According to one embodiment, the user interface output on the control screen may include a plurality of icons each representing one of a plurality of applications installed in the mobile terminal 100. That is, through the user interface, a list of the applications installed in the mobile terminal 100 can be provided to the user.

In step S350, the mobile terminal 100 may display, on the mirroring screen, the execution image of the application corresponding to a user input on the user interface displayed on the control screen. Specifically, the mobile terminal 100 may execute the application represented by a selected icon in response to a user input selecting at least one of the plurality of icons included in the user interface. In one embodiment, in response to one of the plurality of icons included in the user interface being dragged and dropped onto the mirroring screen, the mobile terminal 100 may execute the application indicated by the dragged and dropped icon and display its execution image on at least part of the mirroring screen. For example, when the application represented by the dragged and dropped icon is a map application, a map image may be displayed on the mirroring screen.

In one embodiment, the mobile terminal 100 may split the mirroring screen into a plurality of display areas corresponding to each of two or more icons, in response to two or more of the plurality of icons included in the user interface being selected. For example, when an icon is dragged and dropped onto the mirroring screen and then another icon is dragged and dropped onto the mirroring screen, the mobile terminal 100 may divide the mirroring screen into two display areas and respectively display the execution images of the two applications represented by the two icons in the two display areas.
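The split described above can be sketched as dividing the mirroring screen into equal side-by-side strips, one per dropped icon. Equal-width strips are an assumption of this sketch; the disclosure does not fix the layout of the display areas.

```python
def split_mirroring_screen(width, height, num_apps):
    """Divide the mirroring screen into num_apps equal side-by-side areas.

    Returns a list of (x, y, w, h) rectangles, one per application."""
    if num_apps < 1:
        raise ValueError("at least one application is required")
    strip_w = width // num_apps
    return [(i * strip_w, 0, strip_w, height) for i in range(num_apps)]

# Two icons dropped onto a 1000x500 mirroring screen:
print(split_mirroring_screen(1000, 500, 2))  # [(0, 0, 500, 500), (500, 0, 500, 500)]
```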

In step S360, the mobile terminal 100 may determine whether the application indicated by the selected icon corresponds to a preset driving regulation target. Here, the driving regulation target may be an application, or a part of an application, whose mirroring to the display device 241 is restricted under a specific condition. The driving regulation target may be set as a default at the time of shipment of the mobile terminal 100, or may be set by the user, and may be stored in the memory 130 as setting information for the mirroring mode.

If the application indicated by the icon selected in the user interface corresponds to the preset driving regulation target, the mobile terminal 100 may block mirroring to the display device 241 and terminate the process S300. If the application indicated by the icon selected in the user interface does not correspond to the preset driving regulation target, the mobile terminal 100 may perform step S370.
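Step S360 amounts to a membership gate before transmission. The application names, the set of regulated targets, and the callback below are hypothetical; the disclosure only states that mirroring of a driving regulation target is blocked.

```python
DRIVING_REGULATION_TARGETS = {"video_player", "game"}  # assumed example targets

def mirror_if_permitted(app_name, send_to_display):
    """Block mirroring for regulated apps (S360); otherwise proceed to S370."""
    if app_name in DRIVING_REGULATION_TARGETS:
        return False  # mirroring blocked; process S300 ends here
    send_to_display(app_name)
    return True

sent = []
print(mirror_if_permitted("map_app", sent.append))       # True: not regulated
print(mirror_if_permitted("video_player", sent.append))  # False: blocked
```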

In step S370, the image displayed on the mirroring screen (i.e., the execution image of the application indicated by the selected icon) can be mirrored on the display device 241. In this case, the mobile terminal 100 may convert the image displayed on the mirroring screen using the characteristic information of the display device 241 described above, and mirror the converted image on the display device 241.
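The conversion in step S370 could be as simple as rescaling the mirroring-screen image to the target display's resolution taken from its characteristic information. The nearest-neighbour approach below is one assumed realization, with a frame modeled as a nested list of pixel values.

```python
def convert_for_display(frame, display_w, display_h):
    """Nearest-neighbour rescale of a frame (rows of pixel values) so that
    the mirrored image matches the target display's resolution."""
    src_h, src_w = len(frame), len(frame[0])
    return [[frame[y * src_h // display_h][x * src_w // display_w]
             for x in range(display_w)]
            for y in range(display_h)]

# Upscale a 2x2 mirroring-screen frame to a 4x4 display:
for row in convert_for_display([[1, 2], [3, 4]], 4, 4):
    print(row)
```

A production implementation would instead hand the frame to a hardware scaler or video encoder, but the resolution mapping is the same.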

FIG. 4 is a view for explaining mirroring between the mobile terminal 100 and the display device 241 of the vehicle 200 according to an embodiment of the present invention.

Referring to FIG. 4, the display device 241 of the vehicle 200 may include at least one display. In one embodiment, the display device 241 may include first through fifth displays 241a, 241b, 241c, 241d-1 and 241d-2, as shown.

Specifically, the first display 241a may be a navigation display disposed in the center fascia of the vehicle 200. The second display 241b is a display disposed in one area of the dashboard of the vehicle 200 and may have an aspect ratio of N:1 (N being larger than 1). That is, the second display 241b may be a display whose horizontal length is longer than its vertical length, and it may extend from one position in front of the driver's seat to one position in front of the passenger's seat. For example, the second display 241b may have a horizontal length corresponding to about 7/10 of the distance between the left and right doors of the vehicle 200. In one embodiment, a forward image provided from the camera 261 may be displayed on the second display 241b. The forward image may be an image of a region of the forward view that is hidden by the bonnet of the vehicle 200 or the like and thus is not visible to the driver. The third display 241c may be disposed in the windshield of the vehicle 200 or may be a transparent display replacing the windshield. The user can check the information displayed on the third display 241c together with the actual view ahead of the vehicle 200, and thus can concentrate more on the forward view. The fourth and fifth displays 241d-1 and 241d-2 may be displays disposed on the left A-pillar and the right A-pillar of the vehicle 200, respectively. However, it is apparent to those skilled in the art that the displays included in the display device 241 are not limited to the above-described types and positions and that various other modifications are possible.

The mobile terminal 100 located in the interior of the vehicle 200 may form a wired or wireless network with the vehicle 200 so that a specific image displayed on the mobile terminal 100 is displayed on at least one of the first to fifth displays 241a, 241b, 241c, 241d-1, and 241d-2. This will be described in more detail with reference to FIGS. 6 to 15B.

FIG. 5 is a view for explaining mirroring between the mobile terminal 100 and the display device 241 of the vehicle 200 according to an embodiment of the present invention.

Referring to FIG. 5, the vehicle 200 may include at least one NFC module 213a-1, 213a-2, 213a-3, and 213a-4 and at least one USB port 280a-1, 280a-2, 280a-3, and 280a-4. The NFC modules 213a-1, 213a-2, 213a-3, and 213a-4 may be included in the short range communication module 213 shown in FIG. 2, and the USB ports 280a-1, 280a-2, 280a-3, and 280a-4 may be included in the interface unit 280 shown in FIG. 2.

In one embodiment, when a tagging event occurs between the mobile terminal 100 and at least one of the NFC modules 213a-1, 213a-2, 213a-3, and 213a-4, the mobile terminal 100 may enter the mirroring mode. A mirroring execution command may be recorded in the NFC modules 213a-1, 213a-2, 213a-3, and 213a-4, and when the short range communication module 114 of the mobile terminal 100 tags one of the NFC modules 213a-1, 213a-2, 213a-3, and 213a-4, the mobile terminal 100 may read the mirroring execution command recorded in the tagged NFC module and enter the mirroring mode.
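The tagging flow above can be sketched as follows. The command string `"MIRRORING_EXECUTE"` and the state dictionary are assumptions made for the sketch; the disclosure only says a mirroring execution command is recorded in the vehicle-side NFC module and read on tagging.

```python
class NfcModule:
    """Vehicle-side NFC module holding a recorded command (sketch)."""
    def __init__(self, record):
        self.record = record  # e.g. a mirroring execution command


def on_tagging_event(terminal_state, nfc_module):
    """Read the command recorded in the tagged NFC module; enter the
    mirroring mode only if it is a mirroring execution command."""
    if nfc_module.record == "MIRRORING_EXECUTE":
        terminal_state["mode"] = "mirroring"
    return terminal_state


print(on_tagging_event({"mode": "normal"}, NfcModule("MIRRORING_EXECUTE")))
```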

In one embodiment, when a connection event occurs between the mobile terminal 100 and at least one of the USB ports 280a-1, 280a-2, 280a-3, and 280a-4, the mobile terminal 100 may enter the mirroring mode. The vehicle 200 may determine, based on the characteristic information of the mobile terminal 100 provided through the USB port connected to the mobile terminal 100, whether the mobile terminal 100 is a device supporting the mirroring function. If the mobile terminal 100 is a device supporting the mirroring function, the vehicle 200 may transmit a mirroring execution command to the mobile terminal 100 through the connected USB port. The mobile terminal 100 may enter the mirroring mode upon receiving the mirroring execution command through the USB port.

FIG. 6 is a view for explaining an exemplary operation in which the mobile terminal 100 selects a display of the display device 241 to which an image is to be mirrored, according to an embodiment of the present invention.

Referring to FIG. 6, when the mobile terminal 100 enters the mirroring mode, the mobile terminal 100 may display, on the display unit 151, an image 610 for guiding the plurality of displays 241a, 241b, 241c, 241d-1, and 241d-2 included in the display device 241, together with icons 611, 612, 613, 614, and 615 corresponding to the plurality of displays 241a, 241b, 241c, 241d-1, and 241d-2.

For example, as shown in the figure, different pieces of identification information ('①', '②', '③', '④', '⑤') may be displayed in the image 610. Namely, the five pieces of identification information ('①', '②', '③', '④', '⑤') may be sequentially assigned to the first to fifth displays 241a, 241b, 241c, 241d-1, and 241d-2, respectively.

The mobile terminal 100 may display the icons 611, 612, 613, 614, and 615 corresponding to the plurality of displays 241a, 241b, 241c, 241d-1, and 241d-2 in an area separate from the area in which the image 610 is displayed. The user may touch the icons 611, 612, 613, 614, and 615 to select, among the plurality of displays 241a, 241b, 241c, 241d-1, and 241d-2, the display(s) to be mirrored with the mobile terminal 100. For example, if the user touches the icon 613 and then presses the completion button 620, the mobile terminal 100 may mirror a predetermined image only to the third display 241c. As another example, if the user touches the two icons 611 and 612 and then presses the completion button 620, the mobile terminal 100 may mirror a predetermined image only to the first and second displays 241a and 241b. That is, the mobile terminal 100 can mirror a predetermined image to two or more displays included in the display device 241 at the same time.

FIG. 7 shows an exemplary operation of dividing a screen when the mobile terminal 100 enters the mirroring mode according to an embodiment of the present invention. For convenience of explanation, it is assumed that the mobile terminal 100 mirrors the image to the second display 241b.

Referring to FIG. 7, the mobile terminal 100 may divide the entire area of the display unit 151 into a plurality of screens. For example, as shown, the entire area of the display unit 151 may include a mirroring screen S1 and a control screen S2. The icons 611, 612, 613, 614, and 615 corresponding to the plurality of displays 241a, 241b, 241c, 241d-1, and 241d-2 may be displayed in an empty space other than the mirroring screen S1 and the control screen S2. In this case, since the mobile terminal 100 is currently performing mirroring with the second display 241b, the icon 612 corresponding to the second display 241b may be displayed so as to be distinguished from the other icons 611, 613, 614, and 615. For example, the icon 612 and the remaining icons 611, 613, 614, and 615 may be displayed differently in color, shade, highlight, border thickness, blink period, and the like.

Meanwhile, the mobile terminal 100 may determine the shape of the mirroring screen S1 according to the shape of the second display 241b. For example, the shape of the mirroring screen S1 may be transformed so as to have the same aspect ratio as that of the second display 241b. In addition, the mobile terminal 100 may determine the display position of the mirroring screen S1 according to the arrangement position of the second display 241b in the vehicle 200. For example, when the second display 241b is positioned symmetrically with respect to the center of the vehicle 200, the mirroring screen S1 may also be displayed symmetrically with respect to the center of the display unit 151.
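The shape determination above, matching the mirroring screen's aspect ratio to that of the target display within the space available on the display unit 151, might be sketched as follows. The function name, the bound on height, and the example ratio of 7:1 are assumptions for illustration only.

```python
def fit_mirroring_screen(unit_width, display_aspect, max_height):
    """Give the mirroring screen the same aspect ratio (width:height) as the
    target display, bounded by the terminal's width and an allotted height."""
    height = min(max_height, round(unit_width / display_aspect))
    width = round(height * display_aspect)
    return width, height


# A wide N:1 dashboard display (here N = 7, an assumed value) yields a
# short, wide mirroring screen on a 1080-px-wide display unit.
print(fit_mirroring_screen(1080, 7.0, 400))
```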

The mobile terminal 100 may output, on the control screen S2, a user interface including a plurality of icons 730a to 730j representing a plurality of applications installed on the mobile terminal 100. For example, the plurality of icons 730a to 730j may represent, in order, a gallery application, a camera application, a calendar application, a map application, an SNS application, an Internet application, a message application, a telephone application, a music application, and a weather application.

Meanwhile, the number of icons displayed at a time on the control screen S2 may be limited to a predetermined number. For example, as shown in the figure, the number of icons that can be simultaneously displayed on the control screen S2 may be limited to ten. If the number of applications installed in the mobile terminal 100 exceeds ten, the mobile terminal 100 may display, on the control screen S2, objects 741 and 742 for switching the current user interface to another user interface including the icons of the remaining applications.
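The icon paging above amounts to splitting the installed-application icons into fixed-size pages, with the switching objects 741 and 742 moving between pages. A minimal sketch, assuming a per-page limit of ten as in the example:

```python
def paginate_icons(icons, per_page=10):
    """Split the installed-app icons into pages of at most `per_page` icons;
    pages beyond the first are reached via switching objects such as 741/742."""
    return [icons[i:i + per_page] for i in range(0, len(icons), per_page)]


# Thirteen installed applications -> a full first page and a second page of three.
pages = paginate_icons(list(range(13)))
print(len(pages), len(pages[0]), len(pages[1]))
```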

The mobile terminal 100 may display a predetermined image on at least a part of the mirroring screen S1 in response to a user input on the user interface displayed on the control screen S2 and, at the same time, mirror the displayed image to the display device 241. This will be described in detail below.

FIG. 8 shows an exemplary operation in which the mobile terminal 100 according to an embodiment of the present invention changes the position and size of the screen according to the body direction.

Referring to FIG. 8, the mobile terminal 100 may sense the direction of the main body using the sensing unit 140 provided therein. Here, the sensing unit 140 may include at least one of the acceleration sensor, the magnetic sensor, the gravity sensor, and the gyroscope sensor described above with reference to FIG.

The mobile terminal 100 can change at least one of the display position, shape and size of at least one of the plurality of screens displayed on the display unit 151 based on the change of the body direction.

For example, when the main body of the mobile terminal 100 is rotated by 90 degrees clockwise from the landscape orientation shown in FIG. 7 to the portrait orientation shown in FIG. 8, at least one of the display position, size, and shape of the mirroring screen S1 and the control screen S2 may be changed as shown in FIG. 8, depending on the angle through which the main body is rotated. For example, the mirroring screen S1 shown in FIG. 7 is adjacent to the right bezel of the mobile terminal 100, whereas the mirroring screen S1 shown in FIG. 8 is adjacent to the upper bezel of the mobile terminal 100 and may be displayed in a relatively small size. Also, while the icons 730a to 730j of the control screen S2 shown in FIG. 7 are arranged in a 2x5 layout, the icons 730a to 730j of the control screen S2 shown in FIG. 8 may be arranged in a 4x3 layout according to the changed shape or size of the control screen S2.
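The grid rearrangement described above can be sketched as choosing a row/column layout from the sensed orientation and filling it with icon indices. The orientation strings and grid sizes reflect the 2x5 and 4x3 examples in the text; everything else is an illustrative assumption.

```python
def icon_grid(orientation, n_icons=10):
    """Choose the control-screen icon grid by body orientation:
    2 rows x 5 columns in landscape, 4 rows x 3 columns in portrait."""
    rows, cols = (2, 5) if orientation == "landscape" else (4, 3)
    grid = [[None] * cols for _ in range(rows)]
    for i in range(min(n_icons, rows * cols)):
        grid[i // cols][i % cols] = i  # place icon i row by row
    return grid


print(icon_grid("landscape"))
print(icon_grid("portrait"))
```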

9A and 9B show an exemplary operation in which the mobile terminal 100 according to the embodiment of the present invention mirrors a predetermined image to the display device 241 based on a user input.

Referring to FIG. 9A, the mobile terminal 100 may receive a user input 900 for selecting an icon included in the user interface provided by the control screen S2. For example, the user input 900 may be a touch input that drags and drops a particular icon 730d onto the mirroring screen S1.

The mobile terminal 100 may execute the application corresponding to the icon 730d in response to the user input 900 dragging and dropping the specific icon 730d onto the mirroring screen S1, and may display the execution image of the executed application on the mirroring screen S1. As shown in FIG. 9B, when the application corresponding to the selected icon 730d is a map application, the execution image of the map application, that is, the map image 911, may be displayed on the mirroring screen S1.

The mobile terminal 100 may mirror the same image on the display device 241 at the same time as, or after, displaying the map image 911 on the mirroring screen S1. When the second display 241b is mirrored with the mobile terminal 100, an image 912 corresponding to the map image 911 displayed on the mirroring screen S1 may be displayed on the second display 241b.

According to the present embodiment, when it is difficult for the driver to execute the map application to check the route to the destination, a passenger may operate the mobile terminal 100 instead so that the map image is displayed on the display device 241.

10A and 10B show an exemplary operation in which the mobile terminal 100 related to FIG. 9B additionally mirrors a predetermined image to the display device 241 based on a user input.

The control unit 180 may divide the mirroring screen S1 into a plurality of display areas according to the number of icons to be dragged and dropped onto the mirroring screen S1 among the plurality of icons.

Referring to FIG. 10A, in a state in which the map image 911 is displayed on the mirroring screen S1, the mobile terminal 100 may receive a user input 1000 for selecting another icon included in the user interface provided by the control screen S2. For example, the user input 1000 may be a touch input for dragging and dropping a specific icon 730j onto the mirroring screen S1.

The mobile terminal 100 may execute the application corresponding to the icon 730j in response to the user input 1000 additionally dragging and dropping the specific icon 730j onto the mirroring screen S1, and may display the execution image of the executed application together with the map image 911 on the mirroring screen S1. In this case, the controller 180 may divide the mirroring screen S1 into a plurality of display areas according to the number of icons dragged and dropped onto the mirroring screen S1 and display a different image in each of the divided display areas.

As shown in FIG. 10B, when the application corresponding to the selected icon 730j is a weather application, the mobile terminal 100 may divide the mirroring screen S1 into two different display areas, display the map image 911 in one display area, and display the weather guide image 1011 in the other display area.

The mobile terminal 100 may mirror the same image on the display device 241 at the same time as, or after, displaying the weather guide image 1011 on the mirroring screen S1. When the second display 241b is mirrored with the mobile terminal 100, the second display 241b may display an image 1012 corresponding to the weather guide image 1011 displayed on the mirroring screen S1.

FIGS. 11A and 11B show an exemplary operation in which the mobile terminal 100 related to FIG. 10B changes the state of the display areas included in the mirroring screen S1 based on user input.

Referring to FIG. 11A, when a user input 1100 dragged from one display area of the mirroring screen S1 to another display area is received, the mobile terminal 100 may change, in response to the user input 1100, the arrangement order between the display areas included in the mirroring screen S1. For example, the user input 1100 may be a hand gesture of the user dragged from the map image 911 of the mirroring screen S1 to the weather guide image 1011.

As shown in FIG. 11B, according to the user input 1100, the mobile terminal 100 may change the arrangement order between the left display area in which the map image 911 is displayed and the right display area in which the weather guide image 1011 is displayed.

According to the present embodiment, when a plurality of images are mirrored on the display device 241, it is possible to move a specific image among a plurality of images to a desired position by the user, thereby improving user convenience.

Although only the operation of changing the arrangement order between the display areas included in the mirroring screen S1 based on user input has been described with reference to FIGS. 11A and 11B, other operations are also possible. For example, the mobile terminal 100 may remove a long-touched display area from the mirroring screen S1 in response to a long touch on any one of the display areas included in the mirroring screen S1. As another example, the mobile terminal 100 may reduce or enlarge, by a predetermined ratio, any one of the display areas included in the mirroring screen S1 in response to a pinch-in or pinch-out touch on that display area.
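The reorder and remove operations above reduce to simple list manipulations over the ordered display areas. A minimal sketch; the function names and the use of plain lists are assumptions made for illustration.

```python
def reorder_areas(areas, i, j):
    """Swap two display areas of the mirroring screen (drag gesture)."""
    areas = list(areas)  # copy so the original layout is untouched
    areas[i], areas[j] = areas[j], areas[i]
    return areas


def remove_area(areas, i):
    """Remove a long-touched display area from the mirroring screen."""
    return [a for k, a in enumerate(areas) if k != i]


# Dragging from the map area to the weather area swaps their order (FIG. 11B).
print(reorder_areas(["map", "weather"], 0, 1))
# A long touch on the map area removes it.
print(remove_area(["map", "weather"], 0))
```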

FIGS. 12A and 12B illustrate an exemplary operation in which the mobile terminal 100 according to an embodiment of the present invention blocks mirroring of an application that is a driving regulation object.

Referring first to FIG. 12A, the mobile terminal 100 may receive a user input 1200 for selecting an icon included in the user interface provided by the control screen S2. For example, the user input 1200 may be a touch input for dragging and dropping a specific icon 730g onto the mirroring screen S1.

In this case, the mobile terminal 100 may determine whether or not the application indicated by the specific icon 730g corresponds to the preset driving regulation object. The driving regulation object may be an application set as a default at the time of shipment of the mobile terminal 100 or an application specified by the user.

In addition, when the mobile terminal 100 downloads a new application from the outside and installs it, the mobile terminal 100 may provide the user with an object for selecting whether to set the application as a driving regulation object, and may set or not set the newly downloaded application as a driving regulation object based on the user input.

The process in which the mobile terminal 100 determines whether the application indicated by the specific icon 730g corresponds to the preset driving regulation object may be performed only when the vehicle 200 is in a specific running state. In one embodiment, the mobile terminal 100 may receive the running state information of the vehicle 200 from the vehicle 200 using the communication unit and start the process of determining whether the application indicated by the specific icon 730g corresponds to the preset driving regulation object only when the received running state information corresponds to the predetermined running state.

For example, when the speed of the vehicle 200 exceeds a predetermined reference speed, or when the position of the vehicle 200 is on a specific type of road (e.g., a construction section, a sharp curve, a school zone, or an intersection), the mobile terminal 100 may determine whether or not the application indicated by the specific icon 730g corresponds to the preset driving regulation object. On the other hand, when the speed of the vehicle 200 is equal to or lower than the reference speed, or when the vehicle is parked, the mobile terminal 100 may omit the process of determining whether the application indicated by the specific icon 730g corresponds to the preset driving regulation object.
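The running-state gate described above can be sketched as a predicate over the received running state information. The reference speed of 20 km/h and the road-type strings are assumed example values; the disclosure leaves both unspecified.

```python
SPECIAL_ROADS = ("construction", "sharp_curve", "school_zone", "intersection")


def should_check_regulation(speed_kmh, road_type, parked,
                            reference_speed=20, special_roads=SPECIAL_ROADS):
    """Run the driving-regulation check only in the specific running states
    described above; skip it when parked or driving slowly on a normal road."""
    if parked:
        return False
    return speed_kmh > reference_speed or road_type in special_roads


print(should_check_regulation(60, "highway", parked=False))      # fast: check
print(should_check_regulation(10, "highway", parked=False))      # slow: skip
print(should_check_regulation(10, "school_zone", parked=False))  # special road: check
```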

FIG. 12B illustrates a case where the application indicated by the specific icon 730g dragged and dropped from the control screen S2 onto the mirroring screen S1, as shown in FIG. 12A, is a preset driving regulation object.

For example, assuming that the application indicated by the specific icon 730g is a message application, the mobile terminal 100 may not mirror the execution image of the message application, that is, the message confirmation image, to the display device 241. Specifically, the mobile terminal 100 may execute the message application and display the message confirmation image only on the mirroring screen S1 while blocking the mirroring to the display device 241, or may block the execution of the message application from the beginning.

In this case, as shown in the figure, the mobile terminal 100 may display, on the mirroring screen S1, a guidance window 1211 indicating that the mirroring of the message confirmation image is blocked. In this case, the second display 241b may display an image 1212 corresponding to the guidance window 1211.

13A and 13B show an exemplary operation in which the mobile terminal 100 according to the embodiment of the present invention changes the display position of the mirroring screen S1 and the control screen S2 based on user input.

Referring to FIG. 13A, when a user input 1300 dragged from a point on the mirroring screen S1 to an empty area of the display unit 151 is received, the mobile terminal 100 may change, in response to the user input 1300, the display position of the mirroring screen S1 within the entire area of the display unit 151.

FIG. 13B illustrates a state in which the display position of the mirroring screen S1 has been changed according to the user input 1300. The mirroring screen S1 may move to the lower end of the display unit 151 according to the user input 1300 released at the lower end of the display unit 151, as shown in FIG. 13A. At the same time, the control screen S2 may move toward the upper end of the display unit 151 by the vertical length of the mirroring screen S1.

FIGS. 14A and 14B illustrate an exemplary operation of changing the size of an image mirrored on the display device 241 based on a user input to the mirroring screen S1, according to an embodiment of the present invention.

Referring to FIG. 14A, the mobile terminal 100 may divide the entire area of the display unit 151 into a plurality of screens including a mirroring screen S1 and a control screen S2. A user interface including the icons 730a to 730j of the applications that can be displayed through the mirroring screen S1 may be displayed on the control screen S2. Unlike in FIG. 7, the mirroring screen S1 may be interlocked with the third display 241c included in the display device 241. That is, the mobile terminal 100 may mirror the image displayed on the mirroring screen S1 to the third display 241c. The third display 241c may be a transparent display that is disposed in the windshield or replaces the windshield, as described above. For example, five pictures may be displayed on the mirroring screen S1 as an execution image of the gallery application, and an image corresponding to the execution image may be displayed in the display area 1440 of the third display 241c. At this time, the mobile terminal 100 may display the object 613 corresponding to the third display 241c so as to be distinguished from the other objects 611, 612, 614, and 615.

The mobile terminal 100 may change the size of the mirroring screen S1 in response to a user input 1400 to the mirroring screen S1. For example, the mobile terminal 100 may reduce the size of the mirroring screen S1 when the user input 1400 is a pinch-in touch, and may enlarge the size of the mirroring screen S1 when the user input 1400 is a pinch-out touch. In addition, the mobile terminal 100 may change the size of the image mirrored on the third display 241c linked to the mirroring screen S1 in response to the change of the size of the mirroring screen S1.

FIG. 14B illustrates a mirroring operation when the user input 1400 is a pinch-out touch. The mobile terminal 100 may enlarge the size of the mirroring screen S1 in response to the user input 1400. For example, if the user input 1400 is a pinch-out touch in which two touch points move vertically apart, as shown, the mobile terminal 100 may increase the vertical length of the mirroring screen S1 based on the pinch-out distance. Accordingly, the vertical length H2 of the mirroring screen S1 after the user input 1400 may be longer than the vertical length H1 before the user input 1400. At the same time, the vertical length H12 of the display area 1440 of the third display 241c linked to the mirroring screen S1 may be increased, in proportion to the vertical length of the mirroring screen S1, from the vertical length H11 before the user input 1400. For example, H1:H11 = H2:H12.
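The proportion H1:H11 = H2:H12 determines the new mirrored height directly. A one-line sketch (function name assumed for illustration):

```python
def mirrored_height(h1, h11, h2):
    """Keep the mirrored display area proportional to the mirroring screen:
    from H1 : H11 = H2 : H12 it follows that H12 = H11 * H2 / H1."""
    return h11 * h2 / h1


# If the mirroring screen grows from H1 = 300 to H2 = 450 while the display
# area started at H11 = 600, the mirrored area grows to H12 = 900.
print(mirrored_height(300, 600, 450))
```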

As shown in FIG. 14B, as the vertical length H2 of the mirroring screen S1 becomes longer than the previous vertical length H1, a relatively larger number of pictures can be displayed on the mirroring screen S1.

On the other hand, as the size of the mirroring screen S1 increases, the size of the control screen S2 can be relatively reduced. Accordingly, the sizes of the icons 730a to 730j displayed on the control screen S2 can be reduced as much as the control screen S2 is reduced.

The mobile terminal 100 changes the size of the image mirrored on the display device 241 according to the size of the mirroring screen S1, so that the sense of mismatch between the image of the mirroring screen S1 and the image of the display device 241 can be reduced.

FIGS. 15A and 15B illustrate an exemplary operation in which the mobile terminal 100 according to an embodiment of the present invention mirrors images on two or more different displays included in the display device 241.

Referring to FIG. 15A, when two or more objects 614 and 615 are selected and the completion button 620 is touched in FIG. 6, the mobile terminal 100 may display the same number of mirroring screens S1-1 and S1-2 as the selected two or more objects 614 and 615, together with the control screen S2, in different areas of the display unit 151.

In this case, the shape and display position of the first mirroring screen S1-1 may correspond to the shape and arrangement position of the fourth display 241d-1, and the shape and display position of the second mirroring screen S1-2 may correspond to the shape and arrangement position of the fifth display 241d-2.

For example, as shown in the drawing, the mobile terminal 100 may display the first mirroring screen S1-1, which is linked to the fourth display 241d-1 corresponding to the object 614, in the left area of the display unit 151. At this time, the aspect ratio of the first mirroring screen S1-1 may be substantially the same as that of the fourth display 241d-1. The second mirroring screen S1-2, which is interlocked with the fifth display 241d-2 corresponding to the object 615, may be displayed in the right area of the display unit 151. At this time, the aspect ratio of the second mirroring screen S1-2 may be substantially the same as that of the fifth display 241d-2. In addition, the mobile terminal 100 may display the control screen S2 between the first mirroring screen S1-1 and the second mirroring screen S1-2.

In addition, the mobile terminal 100 may display the selected two objects 614 and 615 so as to be distinguished from the remaining objects 611, 612, and 613.

The mobile terminal 100 may receive a user input 1501 for selecting one icon 730c included in the user interface provided by the control screen S2 and a user input 1502 for selecting another icon 730h. For example, the user input 1501 may be a touch input for dragging and dropping the icon 730c onto the first mirroring screen S1-1, and the user input 1502 may be a touch input for dragging and dropping the icon 730h onto the second mirroring screen S1-2. At this time, the user input 1501 and the user input 1502 may be received simultaneously or sequentially.

The mobile terminal 100 may execute the two applications corresponding to the two icons 730c and 730h, respectively, in response to the user input 1501 and the user input 1502, and may display the execution images of the executed applications on the first and second mirroring screens S1-1 and S1-2.

Referring to FIG. 15B, the mobile terminal 100 may execute a schedule application corresponding to the icon 730c to display a schedule guide image on the first mirroring screen S1-1. At this time, the mobile terminal 100 may mirror the schedule guide image on the fourth display 241d-1 at the same time as, or after, displaying it on the first mirroring screen S1-1. In addition, the mobile terminal 100 may execute a telephone application corresponding to the icon 730h to display a contact guide image on the second mirroring screen S1-2. At this time, the mobile terminal 100 may mirror the contact guide image on the fifth display 241d-2 at the same time as, or after, displaying it on the second mirroring screen S1-2.

The embodiments of the present invention described above may be implemented not only by the apparatus and method but also through a program realizing functions corresponding to the configurations of the embodiments or a recording medium on which the program is recorded, and such implementation can easily be carried out by those skilled in the art from the description of the embodiments above. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the program may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

While the present invention has been described with reference to exemplary embodiments and the accompanying drawings, the present invention is not limited to those embodiments, and various modifications may be made by those of ordinary skill in the art to which the present invention pertains. Furthermore, all or some of the embodiments may be selectively combined so that various modifications may be made.

100: mobile terminal
200: vehicle

Claims (20)

A display unit;
A communication unit for performing communication with a display device of the vehicle; And
Upon entering the mirroring mode,
Wherein the controller divides the entire screen of the display unit into a mirroring screen and a control screen including a user interface for controlling the mirroring screen,
A controller for transmitting an image displayed on the mirroring screen to the display device of the vehicle.
The mobile terminal according to claim 1,
Wherein,
Upon occurrence of a preset event, the mirroring mode is entered,
The preset event may include:
At least one of (i) tagging of the mobile terminal with respect to an NFC module included in the vehicle, (ii) connection of the mobile terminal to a USB port provided in the vehicle, (iii) entry of the mobile terminal into the vehicle, (iv) a voice command indicating entry into the mirroring mode, and (v) execution of a mirroring application.
The mobile terminal according to claim 1,
Wherein the user interface comprises:
And a plurality of icons representing each of a plurality of applications installed in the mobile terminal.
The mobile terminal of claim 3,
Wherein,
Executing a first application represented by the first icon in response to a first one of the plurality of icons being dragged and dropped onto the mirroring screen,
And displays an execution image of the first application on the mirroring screen.
5. The mobile terminal of claim 4,
Wherein,
And mirrors the execution image of the first application to the display device through the communication unit.
6. The mobile terminal of claim 4,
Wherein,
And blocks mirroring of an execution image of the first application to the display device when the first application is a predetermined driving regulation object.
7. The mobile terminal of claim 4,
Wherein,
Executing a second application represented by the second icon in response to a second one of the plurality of icons being dragged and dropped onto the mirroring screen,
And displays the execution image of the second application together with the execution image of the first application on the mirroring screen.
The mobile terminal of claim 3,
Wherein,
And divides the mirroring screen into a plurality of display areas according to the number of icons to be dragged and dropped onto the mirroring screen among the plurality of icons.
The mobile terminal according to claim 1,
Wherein,
When the mirroring screen is divided into a plurality of display areas, performs at least one of (i) changing the arrangement order among the plurality of display areas, (ii) removing at least one of the plurality of display areas, and (iii) changing the size of at least one of the plurality of display areas.
The mobile terminal according to claim 1,
wherein the controller changes the size of the mirroring screen in response to a user input to the mirroring screen.
The mobile terminal according to claim 10,
wherein the display device includes a transparent display disposed on a windshield of the vehicle, and
wherein the controller changes the size of the image mirrored on the transparent display according to the changed size of the mirroring screen.
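The coupling between the mirroring screen's size and the size of the image shown on the transparent display can be sketched as a scale mapping. The linear scale factor is an assumption; the claims only require that the mirrored size track the changed mirroring-screen size.

```python
def mirrored_size(screen_w, screen_h, base_screen_w, base_screen_h,
                  base_display_w, base_display_h):
    """Scale the image on the vehicle display by the same factor the user
    applied to the mirroring screen on the terminal."""
    sx = screen_w / base_screen_w
    sy = screen_h / base_screen_h
    return (round(base_display_w * sx), round(base_display_h * sy))
```

Doubling the mirroring screen from 300x200 to 600x400 would thus double a 900x600 windshield image to 1800x1200 under this sketch.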
The mobile terminal according to claim 1,
wherein the controller selects one of a plurality of displays included in the display device of the vehicle, mirrors the image of the mirroring screen on the selected display,
and determines the shape of the mirroring screen according to the shape of the selected display.
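Matching the mirroring screen's shape to the selected in-vehicle display can be sketched by copying the display's aspect ratio. The display table and names are hypothetical; the patent does not enumerate specific displays.

```python
# Hypothetical in-vehicle displays with their native resolutions.
DISPLAYS = {
    "center_fascia": {"w": 1280, "h": 720},
    "cluster":       {"w": 800,  "h": 480},
}

def mirroring_screen_shape(selected, screen_w):
    """Give the mirroring screen the aspect ratio of the selected display."""
    d = DISPLAYS[selected]
    return (screen_w, round(screen_w * d["h"] / d["w"]))
```

Selecting the 16:9 center fascia display yields a 16:9 mirroring screen on the terminal, while selecting the 5:3 cluster yields a 5:3 one.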
The mobile terminal according to claim 1, further comprising a sensing unit for sensing a direction of a main body of the mobile terminal,
wherein the controller changes at least one of a display position, a shape, and a size of at least one of the mirroring screen and the control screen based on a change in the direction of the main body.
A method for controlling a mobile terminal, the method comprising:
detecting an occurrence of a predetermined event;
entering a mirroring mode for a display device of a vehicle in response to the occurrence of the event;
dividing the entire screen of the mobile terminal into a mirroring screen and a control screen upon entering the mirroring mode;
outputting, to the control screen, a user interface for controlling the mirroring screen; and
mirroring an image displayed on the mirroring screen to the display device based on a user input to the user interface.
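The five steps of the method claim above can be sketched as a small handler. The event names and the 60/40 screen split are illustrative assumptions, not values from the patent.

```python
# Hypothetical triggering events for the mirroring mode.
MIRRORING_EVENTS = {"bt_paired", "nfc_tagged", "usb_connected", "voice_command"}

def handle_event(event, screen_w, screen_h):
    if event not in MIRRORING_EVENTS:        # step 1: detect a predetermined event
        return None
    state = {"mode": "mirroring"}            # step 2: enter mirroring mode
    split = round(screen_h * 0.6)            # step 3: divide the full screen
    state["mirroring_screen"] = (0, 0, screen_w, split)
    state["control_screen"] = (0, split, screen_w, screen_h - split)
    state["ui"] = "app_icons"                # step 4: output the control UI
    return state                             # step 5: mirroring is then driven
                                             # by input to that UI
```

Events outside the predetermined set leave the terminal in its normal mode.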
The method according to claim 14,
wherein the user interface comprises a plurality of icons, each representing one of a plurality of applications installed in the mobile terminal.
The method according to claim 15, further comprising:
displaying, in response to an icon among the plurality of icons being dragged and dropped onto the mirroring screen, an execution image of the application represented by the icon on the mirroring screen; and
mirroring the execution image of the application to the display device.
The method according to claim 16, further comprising:
blocking mirroring of the execution image of the application to the display device when the application is subject to a predetermined driving regulation.
The method according to claim 14, further comprising:
changing at least one of a size and a display position of the mirroring screen in response to a user input to the mirroring screen.
The method according to claim 18, further comprising:
changing the size of the image mirrored on the display device according to the changed size of the mirroring screen.
The method according to claim 14, further comprising:
changing at least one of a display position, a shape, and a size of at least one of the mirroring screen and the control screen based on a change in a direction of a main body of the mobile terminal.
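The orientation-dependent layout change in the claim above can be sketched as follows: portrait stacks the mirroring and control screens vertically, landscape places them side by side. The 2:1 split ratio is an assumption for illustration.

```python
def layout_for_orientation(orientation, w, h):
    """Recompute mirroring/control screen rectangles when the body direction changes."""
    if orientation == "portrait":
        split = h * 2 // 3  # mirroring screen on top, control screen below
        return {"mirroring": (0, 0, w, split), "control": (0, split, w, h - split)}
    split = w * 2 // 3      # landscape: mirroring screen left, control screen right
    return {"mirroring": (0, 0, split, h), "control": (split, 0, w - split, h)}
```

Rotating the terminal therefore changes both the position and the shape of each region, as the claim requires.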
KR1020150131803A 2015-09-17 2015-09-17 Mobile terminal and method for controlling the same KR101736820B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150131803A KR101736820B1 (en) 2015-09-17 2015-09-17 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150131803A KR101736820B1 (en) 2015-09-17 2015-09-17 Mobile terminal and method for controlling the same

Publications (2)

Publication Number Publication Date
KR20170033699A KR20170033699A (en) 2017-03-27
KR101736820B1 true KR101736820B1 (en) 2017-05-17

Family

ID=58496813

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150131803A KR101736820B1 (en) 2015-09-17 2015-09-17 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR101736820B1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112631533A (en) * 2019-10-09 2021-04-09 上海博泰悦臻电子设备制造有限公司 Information sharing method, information sharing device, in-vehicle terminal, and storage medium
KR20220087815A (en) * 2020-12-18 2022-06-27 삼성전자주식회사 Electronic device and control method thereof
KR102348321B1 (en) * 2021-07-07 2022-01-06 유석영 Method and device for mirroring
KR102581600B1 (en) * 2022-01-13 2023-09-21 엘지전자 주식회사 Signal processing device and vehicle display device comprising the same

Also Published As

Publication number Publication date
KR20170033699A (en) 2017-03-27

Similar Documents

Publication Publication Date Title
KR101730315B1 (en) Electronic device and method for image sharing
US10431086B2 (en) Vehicle, mobile terminal and method for controlling the same
KR101822945B1 (en) Mobile terminal
US10149132B2 (en) Pedestrial crash prevention system and operation method thereof
CN110099836B (en) Vehicle and method of controlling display therein
KR101711835B1 (en) Vehicle, Vehicle operating method and wearable device operating method
US10977689B2 (en) Mobile terminal and method for controlling same
KR101716145B1 (en) Mobile terminal, vehicle and mobile terminal link system
KR101878811B1 (en) V2x communication system for generating real-time map and method for controlling the same
KR101924059B1 (en) Display apparatus for vehicle and Vehicle including the same
KR20170007980A (en) Mobile terminal and method for controlling the same
KR101736820B1 (en) Mobile terminal and method for controlling the same
KR101828400B1 (en) Portable v2x terminal and method for controlling the same
KR20160114486A (en) Mobile terminal and method for controlling the same
KR101841501B1 (en) Mobile device for car sharing and method for car sharing system
KR101859043B1 (en) Mobile terminal, vehicle and mobile terminal link system
KR20170041418A (en) Display apparatus for vehicle and control method for the same
KR101916425B1 (en) Vehicle interface device, vehicle and mobile terminal link system
KR101807788B1 (en) Display apparatus for vehicle and control method for the same
EP4191204A1 (en) Route guidance device and route guidance method thereof
KR20170009558A (en) Navigation terminal device for sharing intervehicle black box image
KR101892498B1 (en) Vehicle interface device, vehicle and mobile terminal link system
KR101661970B1 (en) Mobile terminal and method for controlling the same
CN115583202A (en) Display device, vehicle, display control method and device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant