WO2017183743A1 - Mobile terminal, stylus pen and associated control method - Google Patents


Info

Publication number
WO2017183743A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile terminal
stylus pen
information
input
content
Prior art date
Application number
PCT/KR2016/004063
Other languages
English (en)
Korean (ko)
Inventor
조태훈
김은영
김수민
최진해
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to PCT/KR2016/004063
Publication of WO2017183743A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725 Cordless telephones

Definitions

  • the present invention relates to a method for providing a user interface for controlling various contents on a mobile terminal using a stylus pen paired with the mobile terminal.
  • Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
  • the mobile terminal may be further classified into a handheld terminal and a vehicle mounted terminal according to whether a user can directly carry it.
  • the functions of mobile terminals are diversifying; examples include data and voice communication, taking pictures and video with a camera, recording voice, playing music files through a speaker system, and outputting images or video to a display unit.
  • Some terminals have an electronic game play function or a multimedia player function.
  • recent mobile terminals may receive multicast signals that provide visual content such as broadcasting, video, and television programs.
  • such terminals are being implemented as multimedia players with complex functions such as taking pictures or video, playing music or video files, playing games, and receiving broadcasts.
  • the conventional stylus pen was merely used for touching the touch screen and was not utilized for various interactions with the mobile terminal. Therefore, in addition to applying a touch input to the mobile terminal using a stylus pen, it is necessary to increase user convenience through interaction between the stylus pen and the mobile terminal.
  • further, since a conventional stylus pen can be recognized by the mobile terminal only when located in close proximity to it, usability needs to be improved for the sake of user convenience.
  • Another object is to provide a mobile terminal and a stylus pen with improved user convenience.
  • an object of the present invention is to provide data copying between a plurality of mobile terminals using a stylus pen.
  • an object of the present invention is to provide a method for easily performing payment or bank transfer using a stylus pen and a mobile terminal.
  • Another object of the present invention is to provide a method for using drawing content and document content on a mobile terminal using a stylus pen.
  • a wireless communication unit for transmitting / receiving data with at least one mobile terminal;
  • a sensing unit sensing an input signal;
  • a control unit for controlling to transmit a control signal to the at least one mobile terminal in response to the input signal.
  • the control unit senses the input signal in a state where first content is output on the at least one mobile terminal and, corresponding to the input signal, controls outputting second content, different from the first content, to the at least one mobile terminal; a stylus pen so configured is thereby provided.
  • in one embodiment, the at least one mobile terminal comprises a first mobile terminal and a second mobile terminal, and the input signal comprises a first input signal and a second input signal.
  • the control unit, in response to the first input signal, receives information about the first content from the first mobile terminal and stores it, and, in response to the second input signal, controls transmitting the information about the first content to the second mobile terminal.
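The copy-and-paste flow just described can be sketched as follows. The class names, handler methods, and content strings are illustrative assumptions, not terms from the application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MobileTerminal:
    # Hypothetical terminal exposing its currently displayed content.
    name: str
    displayed_content: str = ""

@dataclass
class StylusPen:
    # Sketch of the claimed flow: a first input stores content information
    # from one terminal; a second input transmits it to another terminal.
    stored_content: Optional[str] = None

    def on_first_input(self, source: MobileTerminal) -> None:
        # Receive and store information about the first content.
        self.stored_content = source.displayed_content

    def on_second_input(self, target: MobileTerminal) -> None:
        # Transmit the stored information to the second terminal.
        if self.stored_content is not None:
            target.displayed_content = self.stored_content

phone_a = MobileTerminal("A", displayed_content="photo_001.jpg")
phone_b = MobileTerminal("B")
pen = StylusPen()
pen.on_first_input(phone_a)   # "copy" gesture on terminal A
pen.on_second_input(phone_b)  # "paste" gesture on terminal B
```

In the real system the two steps would travel over the wireless communication unit rather than direct method calls; the sketch only shows the store-then-forward ordering.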
  • the sensing unit further senses fingerprint information of the user and movement information of the stylus pen.
  • the controller performs a first user authentication when the sensed fingerprint information matches registered fingerprint information, performs a second user authentication when the sensed movement information matches registered movement information, and, when both the first and second user authentications are completed, transmits payment information to an external device.
  • the movement information of the stylus pen includes at least one of the pressure sensed by the stylus pen, the angle of the stylus pen, the movement pattern of the stylus pen, the rotation pattern of the stylus pen, and handwritten content made with the stylus pen.
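The two-step authentication gate can be sketched as a pair of comparisons that must both succeed before payment information is released. The enrolled templates and string matching below are stand-ins for real fingerprint and motion matching:

```python
# Hypothetical enrolled templates (real systems would store biometric
# templates and motion signatures, not plain strings).
REGISTERED_FINGERPRINT = "fp-user-01"
REGISTERED_MOTION = "double-circle"

def authenticate(sensed_fingerprint: str, sensed_motion: str) -> bool:
    """First authentication: fingerprint match; second: stylus-motion match."""
    first_ok = sensed_fingerprint == REGISTERED_FINGERPRINT
    second_ok = sensed_motion == REGISTERED_MOTION
    return first_ok and second_ok

def transmit_payment(sensed_fingerprint: str, sensed_motion: str) -> str:
    # Payment information leaves the device only when both checks pass.
    if authenticate(sensed_fingerprint, sensed_motion):
        return "payment information sent to external device"
    return "authentication failed"
```

The point of the design is defense in depth: a stolen pen fails the fingerprint check, and a lifted fingerprint fails the secret-motion check.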
  • a mobile terminal is also provided, including: a touch screen; a wireless communication unit for transmitting/receiving data with a stylus pen; and a controller configured to sense a first touch input by the stylus pen while first content is output on the touch screen, and to output second content on the touch screen in response to the sensed first touch input.
  • in one embodiment, the first content is video content and the second content is handwritten content for the video.
  • the control unit senses the first touch input while the video content is playing and, in response to the first touch input, stores video playback information and outputs the handwritten content.
  • the video playback information includes information on the specific point in time at which the first touch input was sensed; the controller further senses a second touch input on that time-point information within the handwritten content and, in response to the second touch input, controls outputting a pop-up message indicating the playback information.
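This note-to-timestamp linkage can be sketched as a small mapping: a pen touch during playback records the current position with the note, and a later tap on the note recalls it. Class, method, and field names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class VideoNotepad:
    # Maps each handwritten note to the playback position (in seconds)
    # at which the pen touch was sensed.
    notes: Dict[str, float] = field(default_factory=dict)

    def on_pen_touch(self, note_text: str, playback_position_s: float) -> None:
        # Store the playback information captured at the moment of the touch.
        self.notes[note_text] = playback_position_s

    def on_note_tap(self, note_text: str) -> str:
        # Second touch on the note: pop up the stored playback position.
        pos = self.notes[note_text]
        return f"noted at {pos:.0f}s"

pad = VideoNotepad()
pad.on_pen_touch("key formula", playback_position_s=754.0)
popup = pad.on_note_tap("key formula")
```

A production version would also store the video identifier so the pop-up could offer to resume playback at that point.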
  • the video content is output in a first display area of the touch screen, and the handwritten content for the video is output in a second display area of the touch screen.
  • in another embodiment, the first content corresponds to payment target information and the second content corresponds to signature information.
  • when the first touch input by the stylus pen is sensed while the payment target information is output, the controller outputs the signature information and, if the signature information matches registered signature information, controls transmitting payment completion information to an external device.
  • the control unit further enters a payment mode if a second touch input is sensed before the first touch input is sensed.
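The mode-then-signature ordering above can be sketched as follows; the string-based signature "matching" is a deliberate simplification of real stroke comparison, and every name is an invented placeholder:

```python
def normalized(sig: str) -> str:
    # Crude stand-in for real signature matching (stroke shape, pressure,
    # timing comparison would be used in practice).
    return sig.strip().lower()

REGISTERED_SIGNATURE = "j. doe"  # hypothetical enrolled signature

def handle_payment(payment_mode: bool, drawn_signature: str) -> str:
    """Sketch of the claimed flow: payment mode must be entered first,
    then the stylus-drawn signature is checked against the registered one
    before completion info is sent."""
    if not payment_mode:
        return "ignored: not in payment mode"
    if normalized(drawn_signature) == normalized(REGISTERED_SIGNATURE):
        return "payment completion info sent to external device"
    return "signature mismatch"

ok = handle_payment(True, "J. Doe ")
```

Requiring an explicit payment mode prevents an ordinary annotation stroke from being misread as a payment signature.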
  • a control method of a system comprising a mobile terminal and a stylus pen, the method comprising: outputting first content on a touch screen of the mobile terminal; Sensing an input signal at the stylus pen; Transmitting a control signal of the first content to the mobile terminal in response to the input signal; And outputting second content to a touch screen of the mobile terminal in response to a control signal of the first content.
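One round of this four-step method (output first content, sense a pen input, send a control signal, output second content) can be sketched as a single dispatch step. The signal names and content labels below are illustrative assumptions:

```python
# Map stylus input signals to the content the terminal should output next
# (signal names and contents are invented for illustration).
CONTROL_TABLE = {
    "button_press": "playback_controls",
    "double_tap": "handwriting_pad",
}

def control_step(current_content: str, input_signal: str) -> str:
    """The stylus senses an input signal and sends a control signal;
    the terminal swaps the first content for the second accordingly.
    Unrecognized signals leave the current content unchanged."""
    return CONTROL_TABLE.get(input_signal, current_content)

shown = control_step("video", "double_tap")
```

In the claimed system the lookup would happen on the terminal after receiving the control signal over the wireless link; the table form just makes the signal-to-content mapping explicit.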
  • the mobile terminal can be easily controlled using a stylus pen.
  • FIG. 1A is a block diagram illustrating a mobile terminal related to the present invention.
  • FIGS. 1B and 1C are conceptual views of one example of a mobile terminal, viewed from different directions.
  • FIG. 2 is a view showing another example of a mobile terminal according to the present invention.
  • FIG. 3 is a block diagram of a stylus pen according to an embodiment of the present invention.
  • FIG. 4 is a perspective view of a stylus pen according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a method of controlling a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of controlling a mobile terminal according to an input signal to a stylus pen.
  • FIG. 7 is a diagram illustrating another example of controlling a mobile terminal according to an input signal to a stylus pen.
  • FIG. 8 is a diagram illustrating another example of controlling a mobile terminal according to an input signal to a stylus pen.
  • FIG. 9 is a flowchart illustrating a method of controlling a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 10 is a diagram illustrating an example of controlling a plurality of mobile terminals by a stylus pen according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a method of controlling a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 12 is a diagram illustrating an example of receiving an input by a stylus pen simultaneously with playing a video on a mobile terminal.
  • FIG. 13 is a flowchart illustrating a payment method using a stylus pen according to an embodiment of the present invention.
  • FIG. 14 is a diagram illustrating an example of making a payment using a stylus pen according to an embodiment of the present invention.
  • FIG. 15 is a diagram illustrating in detail a first user authentication and a second user authentication process using a stylus pen according to an embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating a payment method of a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 17 is a diagram illustrating an example of a payment method using a mobile terminal and a stylus pen according to an embodiment of the present invention.
  • FIG. 18 is a diagram illustrating another example of a payment method using a mobile terminal and a stylus pen according to an embodiment of the present invention.
  • FIG. 19 is a diagram illustrating an example of performing an account transfer in a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 20 is a diagram illustrating an example of remittance from a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 21 is a diagram illustrating an example of remittance while an address book is output on a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 22 is a diagram illustrating an example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • FIG. 23 is a diagram illustrating another example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • FIG. 24 illustrates another example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • FIG. 25 is a diagram illustrating another example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • FIG. 26 illustrates an example of controlling book contents using a stylus pen according to an embodiment of the present invention.
  • FIG. 27 is a diagram illustrating an example of providing content using a stylus pen according to an embodiment of the present invention.
  • FIG. 28 is a diagram illustrating an example of controlling photo content using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal described herein may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
  • FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention
  • FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
  • the mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190.
  • the components shown in FIG. 1A are not essential to implementing a mobile terminal, so that the mobile terminal described herein may have more or fewer components than those listed above.
  • among these components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server.
  • the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
  • the wireless communication unit 110 may include at least one of the broadcast receiving module 111, the mobile communication module 112, the wireless internet module 113, the short range communication module 114, and the location information module 115.
  • the input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., touch keys, mechanical keys, and the like) for receiving information from a user.
  • the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
  • the sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information.
  • the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, optical sensors (e.g., camera 121), microphones (see 122), battery gauges, environmental sensors, and the like.
  • the mobile terminal disclosed herein may use a combination of information sensed by at least two or more of these sensors.
  • the output unit 150 generates output related to sight, hearing, or touch, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and a light output unit 154.
  • the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
  • the touch screen may function as a user input unit 123 that provides an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.
  • the interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100.
  • the interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
  • the memory 170 stores data supporting various functions of the mobile terminal 100.
  • the memory 170 may store a plurality of application programs or applications driven in the mobile terminal 100, data for operating the mobile terminal 100, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
  • at least some of these application programs may exist on the mobile terminal 100 from the time of shipment for basic functions of the mobile terminal 100 (for example, receiving and placing calls, and receiving and sending messages).
  • the application program may be stored in the memory 170 and installed on the mobile terminal 100 to be driven by the controller 180 to perform an operation (or function) of the mobile terminal.
  • in addition to operations related to application programs, the controller 180 typically controls the overall operation of the mobile terminal 100.
  • the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or by driving an application program stored in the memory 170.
  • controller 180 may control at least some of the components described with reference to FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 to drive the application program.
  • the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the mobile terminal 100.
  • the power supply unit 190 includes a battery, which may be a built-in battery or a replaceable battery.
  • At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the mobile terminal according to various embodiments described below.
  • the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
  • the broadcast receiving module 111 of the wireless communication unit 110 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or switching of broadcast channels for at least two broadcast channels.
  • the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
  • the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
  • the broadcast signal may be encoded according to at least one of the technical standards (or broadcast methods, for example, ISO, IEC, DVB, ATSC, etc.) for transmitting and receiving digital broadcast signals, and the broadcast receiving module 111 may receive the digital broadcast signal using a method suitable for the technical standard in use.
  • the broadcast associated information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.
  • the broadcast related information may exist in various forms such as an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 170.
  • the mobile communication module 112 transmits and receives radio signals with at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).
  • the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
  • the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
  • the wireless internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies.
  • such wireless Internet technologies include Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), and the like.
  • from this viewpoint, the wireless Internet module 113, which performs wireless Internet access through such a mobile communication network, may be understood as a kind of mobile communication module 112.
  • the short range communication module 114 is for short range communication and may support it using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • the short-range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located. The short range wireless communication network may be a short range wireless personal area network.
  • here, the other mobile terminal 100 may be a wearable device capable of exchanging (or interworking) data with the mobile terminal 100 according to the present invention, for example a smartwatch, smart glasses, or a head mounted display (HMD).
  • the short range communication module 114 may sense (or recognize) a wearable device that can communicate with the mobile terminal 100, around the mobile terminal 100.
  • the controller 180 may transmit at least a portion of the data processed by the mobile terminal 100 to the wearable device through the short range communication module 114; the user of the wearable device can thus use data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can take the call through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.
  • the location information module 115 is a module for obtaining a location (or current location) of a mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
  • the mobile terminal may acquire the location of the mobile terminal using a signal transmitted from a GPS satellite.
  • as another example, when the mobile terminal utilizes the Wi-Fi module, it may acquire its location based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module.
  • the location information module 115 may perform any function of other modules of the wireless communication unit 110 to substitute or additionally obtain data regarding the location of the mobile terminal.
  • the location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
  • the input unit 120 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
  • for inputting image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121.
  • the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
  • the processed image frame may be displayed on the display unit 151 or stored in the memory 170.
  • the plurality of cameras 121 provided in the mobile terminal 100 may be arranged to form a matrix structure; through cameras 121 forming such a matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100.
  • the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for implementing a stereoscopic image.
  • the microphone 122 processes external sound signals into electrical voice data.
  • the processed voice data may be variously used according to a function (or an application program being executed) performed by the mobile terminal 100. Meanwhile, various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in the process of receiving an external sound signal.
  • the user input unit 123 is for receiving information from a user; when information is input through the user input unit 123, the controller 180 may control an operation of the mobile terminal 100 to correspond to the input information.
  • the user input unit 123 may include a mechanical input means (or mechanical keys, for example, a button, dome switch, jog wheel, or jog switch located on the front, rear, or side surface of the mobile terminal 100) and a touch input means.
  • as an example, the touch input means may consist of a virtual key, soft key, or visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen.
  • the virtual key or visual key may be displayed on the touch screen in various forms, for example made of graphics, text, icons, video, or a combination thereof.
  • the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
  • the controller 180 may control driving or operation of the mobile terminal 100 or perform data processing, function or operation related to an application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.
  • the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
  • the proximity sensor 141 may be disposed in an inner region of the mobile terminal covered by the touch screen described above or near the touch screen.
  • examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • the proximity sensor 141 may be configured to detect the proximity of the object by the change of the electric field according to the proximity of the conductive object.
  • the touch screen (or touch sensor) itself may be classified as a proximity sensor.
  • the proximity sensor 141 may detect a proximity touch and a proximity touch pattern (for example, proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, and proximity touch movement state).
  • the controller 180 processes data (or information) corresponding to the proximity touch operation and proximity touch pattern detected through the proximity sensor 141 and, furthermore, may output visual information corresponding to the processed data on the touch screen. The controller 180 may also control the mobile terminal 100 to process different operations or data (or information) depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
• the touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive-film method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic-field method.
  • the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or capacitance generated at the specific portion into an electrical input signal.
• the touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, as well as the pressure and the capacitance at the time of the touch.
  • the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
• When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller; the touch controller processes the signal(s) and then transmits the corresponding data to the controller 180.
• Thereby, the controller 180 can know which area of the display unit 151 has been touched.
  • the touch controller may be a separate component from the controller 180 or may be the controller 180 itself.
  • the controller 180 may perform different control or perform the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of touch object may be determined according to the operation state of the mobile terminal 100 or an application program being executed.
• the touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
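The touch types listed above are typically distinguished from low-level samples by their duration and movement. The following minimal classifier is a sketch; the threshold values are illustrative assumptions, not values from the patent.

```python
import math

# Illustrative thresholds (assumed for this sketch, not from the patent)
TAP_MAX_DURATION = 0.3        # seconds
LONG_TOUCH_MIN_DURATION = 0.5  # seconds
DRAG_MIN_DISTANCE = 20.0      # pixels

def classify_touch(start_pos, end_pos, duration):
    """Classify a single-finger touch as tap, long touch, or drag."""
    distance = math.dist(start_pos, end_pos)
    if distance >= DRAG_MIN_DISTANCE:
        return "drag"
    if duration >= LONG_TOUCH_MIN_DURATION:
        return "long"
    if duration <= TAP_MAX_DURATION:
        return "tap"
    return "undetermined"
```

A real implementation would also track intermediate samples (for flicks and pinches) and multiple pointers (for multi touch), but the duration/movement test above is the common core.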
  • the ultrasonic sensor may recognize location information of a sensing object using ultrasonic waves.
  • the controller 180 can calculate the position of the wave generation source through the information detected from the optical sensor and the plurality of ultrasonic sensors.
• the position of the wave generation source can be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generation source may be calculated from the time difference between the arrival of the ultrasonic wave and the arrival of the light, using the light as a reference signal.
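The calculation described above can be sketched as follows: treating the light's arrival as the time reference, each ultrasonic delay yields a distance, and distances from three sensors yield a 2-D position by trilateration. The sensor layout, timestamps, and speed of sound are illustrative assumptions.

```python
# Sketch: locating an ultrasonic wave source from hypothetical arrival times.
# Light reaches the optical sensor almost instantly, so it serves as t = 0
# and each ultrasonic time-of-flight is simply the measured delay.
SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

def distance_from_delay(t_ultrasound, t_light):
    # Distance from the wave source to a sensor, using light as the reference.
    return SPEED_OF_SOUND * (t_ultrasound - t_light)

def trilaterate(p1, r1, p2, r2, p3, r3):
    # 2-D position from three sensor positions and three distances,
    # obtained by linearizing (subtracting) the circle equations.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```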
• the camera 121, which has been described above as a component of the input unit 120, includes at least one of a camera sensor (e.g., CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
  • the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
• the photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor has photodiodes and transistors (TRs) mounted in rows and columns, and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to each photodiode. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and position information of the sensing object can thereby be obtained.
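The coordinate calculation from light change can be sketched as a centroid over the per-photodiode differences. The frame representation and noise threshold are illustrative assumptions.

```python
# Sketch: deriving object coordinates from a photo-sensor light-change map.
# `frame_before` and `frame_after` are hypothetical 2-D grids of photodiode
# readings; the object is located at the centroid of the change in light.
def locate_object(frame_before, frame_after, threshold=10):
    total = sx = sy = 0
    for y, (row_b, row_a) in enumerate(zip(frame_before, frame_after)):
        for x, (b, a) in enumerate(zip(row_b, row_a)):
            delta = abs(a - b)
            if delta >= threshold:  # ignore sensor noise below the threshold
                total += delta
                sx += delta * x
                sy += delta * y
    if total == 0:
        return None  # no object detected
    return (sx / total, sy / total)
```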
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
• the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information according to the execution screen information.
  • the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
• the stereoscopic display unit may employ a three-dimensional display scheme such as a stereoscopic scheme (glasses type), an auto-stereoscopic scheme (glasses-free type), or a projection scheme (holographic type).
  • a 3D stereoscopic image is composed of a left image (left eye image) and a right image (right eye image).
• Methods of merging the left and right images into a 3D stereoscopic image include a top-down method in which the left and right images are arranged up and down in one frame, an L-to-R (left-to-right, side-by-side) method in which they are arranged left and right in one frame, a checker-board method in which pieces of the left and right images are arranged in tile form, an interlaced method in which the left and right images are alternately arranged in column or row units, and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed over time.
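Two of the merging methods above can be sketched directly on small pixel grids. The list-of-rows image representation is an illustrative assumption.

```python
# Sketch: merging hypothetical left/right images (2-D pixel grids, stored as
# lists of rows) into a single frame, illustrating two of the packing methods
# described above.
def pack_top_down(left, right):
    # left image in the upper half, right image in the lower half
    return left + right

def pack_side_by_side(left, right):
    # left image on the left, right image on the right (L-to-R method)
    return [l_row + r_row for l_row, r_row in zip(left, right)]
```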
• a 3D thumbnail image may be generated by producing a left-image thumbnail and a right-image thumbnail from the left image and the right image of the original image frame, respectively, and combining them into one image.
  • a thumbnail refers to a reduced image or a reduced still image.
• the left-image thumbnail and the right-image thumbnail generated in this way are displayed with a horizontal offset on the screen corresponding to the parallax between the left image and the right image, thereby producing a sense of three-dimensional depth.
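The horizontal offset can be illustrated numerically, assuming a hypothetical linear mapping from depth to pixel disparity (the scale factor below is not from the patent).

```python
# Sketch: computing the on-screen x-positions of the left- and right-image
# thumbnails so that their horizontal separation corresponds to the desired
# depth. The scale factor is an illustrative assumption.
PIXELS_PER_DEPTH_UNIT = 2.0  # hypothetical disparity scale

def thumbnail_positions(center_x, depth):
    # Positive depth pushes the thumbnails apart; zero depth overlaps them.
    disparity = PIXELS_PER_DEPTH_UNIT * depth
    left_x = center_x - disparity / 2
    right_x = center_x + disparity / 2
    return left_x, right_x
```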
  • the left image and the right image necessary for implementing the 3D stereoscopic image may be displayed on the stereoscopic display by the stereoscopic processing unit.
  • the stereoscopic processing unit receives 3D images (images of the base view and images of the extended view) and sets left and right images therefrom, or receives 2D images and converts them into left and right images.
• the sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in call signal reception, call mode, recording mode, voice recognition mode, broadcast reception mode, and the like.
  • the sound output unit 152 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
  • the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
  • the haptic module 153 generates various haptic effects that a user can feel.
  • a representative example of the tactile effect generated by the haptic module 153 may be vibration.
  • the intensity and pattern of vibration generated by the haptic module 153 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 153 may synthesize different vibrations and output or sequentially output them.
• In addition to vibration, the haptic module 153 can generate various tactile effects, such as a pin arrangement moving vertically against the contacted skin surface, a jetting or suction force of air through a jet or suction port, grazing of the skin surface, contact with an electrode, an electrostatic force, and the reproduction of a sense of warmth or coolness using an element capable of generating or absorbing heat.
  • the haptic module 153 may not only deliver a tactile effect through direct contact, but also may allow a user to feel the tactile effect through a muscle sense such as a finger or an arm. Two or more haptic modules 153 may be provided according to a configuration aspect of the mobile terminal 100.
  • the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the mobile terminal 100.
  • Examples of events occurring in the mobile terminal 100 may be message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
• a signal output by the light output unit 154 is implemented as the mobile terminal emitting light of a single color or a plurality of colors toward the front or rear surface.
• the signal output may be terminated when the mobile terminal detects the user's confirmation of the event.
  • the interface unit 160 serves as a path to all external devices connected to the mobile terminal 100.
  • the interface unit 160 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
• For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like.
• the identification module is a chip that stores various kinds of information for authenticating usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 160.
• When the mobile terminal 100 is connected to an external cradle, the interface unit 160 may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
  • the memory 170 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
  • the memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
• the memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 170 on the Internet.
  • the controller 180 controls the operation related to the application program, and generally the overall operation of the mobile terminal 100. For example, if the state of the mobile terminal satisfies a set condition, the controller 180 may execute or release a lock state that restricts input of a user's control command to applications.
• the controller 180 may perform control and processing related to a voice call, data communication, a video call, or the like, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as text and images, respectively. Furthermore, the controller 180 may control any one or a combination of the components described above in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.
  • the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
  • the power supply unit 190 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging.
• the power supply unit 190 may be provided with a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.
  • the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
• In this case, the power supply unit 190 may receive power delivered from an external wireless power transmitter using one or more of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
  • various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • the disclosed mobile terminal 100 includes a terminal body in the form of a bar.
• However, the present invention is not limited thereto, and can be applied to various structures such as a watch type, a clip type, a glasses type, a folder type, a flip type, a slide type, a swing type, and a swivel type, in which two or more bodies are coupled so as to be movable relative to each other.
  • a description of a particular type of mobile terminal may generally apply to other types of mobile terminals.
  • the terminal body may be understood as a concept that refers to the mobile terminal 100 as at least one aggregate.
• the mobile terminal 100 includes a case (e.g., a frame, a housing, a cover, etc.) forming its external appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in the internal space formed by the combination of the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.
  • the display unit 151 may be disposed in front of the terminal body to output information. As shown, the window 151a of the display unit 151 may be mounted to the front case 101 to form a front surface of the terminal body together with the front case 101.
  • an electronic component may be mounted on the rear case 102.
  • Electronic components attachable to the rear case 102 include a removable battery, an identification module, a memory card, and the like.
• the rear cover 103 may be detachably coupled to the rear case 102 to cover the mounted electronic components. Therefore, when the rear cover 103 is separated from the rear case 102, the electronic components mounted on the rear case 102 are exposed to the outside.
• When the rear cover 103 is coupled to the rear case 102, a portion of the side surface of the rear case 102 may be exposed. In some cases, the rear case 102 may be completely covered by the rear cover 103 when coupled. Meanwhile, the rear cover 103 may be provided with an opening for exposing the camera 121b or the sound output unit 152b to the outside.
• the cases 101, 102, and 103 may be formed by injection-molding a synthetic resin, or may be formed of a metal such as stainless steel (STS), aluminum (Al), or titanium (Ti).
• Unlike the above example, in which a plurality of cases provide an internal space accommodating various electronic components, the mobile terminal 100 may be configured such that one case provides the internal space. In this case, a unibody mobile terminal 100 in which synthetic resin or metal extends from the side surface to the rear surface may be implemented.
  • the mobile terminal 100 may be provided with a waterproof portion (not shown) to prevent water from seeping into the terminal body.
• the waterproof portion may include a waterproof member provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, to seal the internal space when these parts are coupled.
• the mobile terminal 100 may be provided with a display unit 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output unit 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.
• Hereinafter, as an example, a mobile terminal 100 will be described in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first manipulation unit 123a are disposed on the front surface of the terminal body, the second manipulation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side surface of the terminal body, and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body.
• For example, the first manipulation unit 123a may not be provided on the front surface of the terminal body, and the second sound output unit 152b may be provided on the side surface of the terminal body rather than on its rear surface.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100.
• the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information according to the execution screen information.
• the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • two or more display units 151 may exist according to an implementation form of the mobile terminal 100.
  • the plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces.
  • the display unit 151 may include a touch sensor that senses a touch on the display unit 151 so as to receive a control command by a touch method.
• When a touch is applied to the display unit 151, the touch sensor may sense the touch, and the controller 180 may generate a control command corresponding to the touch based on the sensed touch.
  • the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
• the touch sensor may be formed as a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or may be patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or provided inside the display.
  • the display unit 151 may form a touch screen together with the touch sensor.
  • the touch screen may function as the user input unit 123 (see FIG. 1A).
  • the touch screen may replace at least some functions of the first manipulation unit 123a.
• the first sound output unit 152a may be implemented as a receiver that transmits call sound to the user's ear, and the second sound output unit 152b may be implemented in the form of a loud speaker that outputs various alarm sounds or multimedia playback sounds.
  • a sound hole for emitting sound generated from the first sound output unit 152a may be formed in the window 151a of the display unit 151.
  • the present invention is not limited thereto, and the sound may be configured to be emitted along an assembly gap between the structures (for example, a gap between the window 151a and the front case 101).
• In this case, the hole formed independently for sound output is invisible or hidden from the outside, further simplifying the appearance of the mobile terminal 100.
  • the light output unit 154 is configured to output light for notifying when an event occurs. Examples of the event may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
• When the user's confirmation of the event is detected, the controller 180 may control the light output unit 154 to end the light output.
  • the first camera 121a processes an image frame of a still image or a moving image obtained by the image sensor in a shooting mode or a video call mode.
  • the processed image frame may be displayed on the display unit 151 and stored in the memory 170.
• the first and second manipulation units 123a and 123b are examples of the user input unit 123, manipulated to receive commands for controlling the operation of the mobile terminal 100, and may be collectively referred to as a manipulating portion.
• the first and second manipulation units 123a and 123b may adopt any tactile manner in which the user operates them while receiving a tactile feeling, such as touch, push, or scroll. Alternatively, the first and second manipulation units 123a and 123b may adopt a manner in which they are operated without the user's tactile feeling, such as a proximity touch or a hovering touch.
• In the drawing, the first manipulation unit 123a is illustrated as a touch key, but the present invention is not limited thereto.
  • the first manipulation unit 123a may be a mechanical key or a combination of a touch key and a push key.
  • the contents input by the first and second manipulation units 123a and 123b may be variously set.
• For example, the first manipulation unit 123a may receive commands such as menu, home key, cancel, and search, and the second manipulation unit 123b may receive commands such as adjusting the volume of sound output from the first or second sound output units 152a and 152b and switching the display unit 151 to a touch recognition mode.
  • a rear input unit (not shown) may be provided on the rear surface of the terminal body.
• the rear input unit is manipulated to receive commands for controlling the operation of the mobile terminal 100, and the input contents may be variously set. For example, it may receive commands such as power on/off, start, end, and scroll, commands for adjusting the volume of sound output from the first and second sound output units 152a and 152b, and commands for switching the display unit 151 to a touch recognition mode.
  • the rear input unit may be implemented in a form capable of input by touch input, push input, or a combination thereof.
  • the rear input unit may be disposed to overlap the front display unit 151 in the thickness direction of the terminal body.
• the rear input unit may be disposed at the upper end of the rear surface of the terminal body so that the user can easily manipulate it with the index finger when gripping the terminal body with one hand.
  • the present invention is not necessarily limited thereto, and the position of the rear input unit may be changed.
• As such, when the rear input unit is provided on the rear surface of the terminal body, a new type of user interface using it may be implemented.
• In addition, when the touch screen or the rear input unit described above replaces at least some functions of the first manipulation unit 123a provided on the front surface of the terminal body, so that the first manipulation unit 123a is not disposed on the front surface of the terminal body, the display unit 151 may be configured with a larger screen.
  • the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing a user's fingerprint, and the controller 180 may use fingerprint information detected through the fingerprint recognition sensor as an authentication means.
  • the fingerprint recognition sensor may be embedded in the display unit 151 or the user input unit 123.
  • the microphone 122 is configured to receive a user's voice, other sounds, and the like.
  • the microphone 122 may be provided at a plurality of locations and configured to receive stereo sound.
  • the interface unit 160 serves as a path for connecting the mobile terminal 100 to an external device.
• the interface unit 160 may be at least one of a connection terminal for connecting to another device (e.g., an earphone or an external speaker), a port for short-range communication (e.g., an infrared (IrDA) port, a Bluetooth port, a wireless LAN port, etc.), and a power supply terminal for supplying power to the mobile terminal 100.
• the interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identity module (SIM), a user identity module (UIM), or a memory card for storing information.
  • the second camera 121b may be disposed on the rear surface of the terminal body. In this case, the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a.
  • the second camera 121b may include a plurality of lenses arranged along at least one line.
  • the plurality of lenses may be arranged in a matrix format.
  • Such a camera may be referred to as an 'array camera'.
• When the second camera 121b is configured as an array camera, images may be captured in various ways using the plurality of lenses, and images of better quality may be obtained.
  • the flash 124 may be disposed adjacent to the second camera 121b.
  • the flash 124 shines light toward the subject when the subject is photographed by the second camera 121b.
  • the second sound output unit 152b may be additionally disposed on the terminal body.
  • the second sound output unit 152b may implement a stereo function together with the first sound output unit 152a and may be used to implement a speakerphone mode during a call.
  • the terminal body may be provided with at least one antenna for wireless communication.
  • the antenna may be built in the terminal body or formed in the case.
  • an antenna that forms part of the broadcast receiving module 111 (refer to FIG. 1A) may be configured to be pulled out from the terminal body.
  • the antenna may be formed in a film type and attached to the inner side of the rear cover 103, or may be configured such that a case including a conductive material functions as an antenna.
  • the terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
  • the power supply unit 190 may include a battery 191 embedded in the terminal body or detachably configured from the outside of the terminal body.
  • the battery 191 may be configured to receive power through a power cable connected to the interface unit 160.
  • the battery 191 may be configured to enable wireless charging through a wireless charger.
  • the wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).
  • the rear cover 103 is coupled to the rear case 102 to cover the battery 191 to limit the detachment of the battery 191 and to protect the battery 191 from external shock and foreign matter.
  • the rear cover 103 may be detachably coupled to the rear case 102.
  • An accessory may be added to the mobile terminal 100 to protect the appearance or to assist or expand the function of the mobile terminal 100.
  • An example of such an accessory may be a cover or pouch that covers or accommodates at least one surface of the mobile terminal 100.
  • the cover or pouch may be configured to be linked with the display unit 151 to expand the function of the mobile terminal 100.
  • Another example of the accessory may be a touch pen for assisting or extending a touch input to a touch screen.
  • FIG. 2 is a view showing another example of a mobile terminal according to the present invention.
  • the mobile terminal may correspond to the two-folder display device 200 as shown in FIG. 2, but may include various foldable devices such as a three-folder display device.
  • the two-folder display device 200 may include a first body and a second body.
• a folding unit 215 is provided between the first body and the second body, and the two-folder display device 200 can fold the first body and the second body in both directions based on the folding unit 215.
  • the two-folder display device 200 may include a first display area 210 in the first body and a second display area 220 in the second body.
• the first display area 210 and the second display area 220 may correspond to different displays or may correspond to the same single display.
  • the two-folder display device 200 described in the present invention includes at least one of the components shown in FIG. 1.
  • the mobile terminal 100 collectively refers to the mobile terminals of FIGS. 1 and 2.
  • FIG. 3 is a block diagram of a stylus pen according to an embodiment of the present invention.
• Referring to FIG. 3, the stylus pen may include a wireless communication unit 310 for performing wireless communication with the mobile terminal, a user input unit 320 for sensing an input by the user, a sensing unit 330 for sensing whether the stylus pen is in contact with the mobile terminal and for sensing the surrounding environment of the stylus pen, a memory 340 for storing data received from the mobile terminal, and a controller 350 for controlling the overall operation of the stylus pen.
  • the stylus pen may further include a power supply unit for supplying power required for the operation of each component of the stylus pen, and a display unit for displaying an operation state of the stylus pen.
  • the wireless communication unit 310 is for performing wireless communication with the mobile terminal.
• the mobile terminal may include not only the mobile terminal shown in FIG. 1 but also the foldable device shown in FIG. 2.
• Wireless communication between the stylus pen and the mobile terminal may be performed through short-range communication, and the short-range communication techniques may include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Near Field Communication (NFC), as in the example described above with reference to FIG. 1.
  • the user input unit 320 serves to generate input data for controlling the operation of the stylus pen.
  • the user input unit 320 may include a button or a touch sensor exposed to the outside of the stylus pen.
  • the sensing unit 330 detects the contact between the stylus pen and the mobile terminal, specifically, the touch screen of the mobile terminal, and generates a sensing signal for controlling the operation of the stylus pen.
  • the sensing unit 330 may sense a surrounding environment of the stylus pen and generate a sensing signal for controlling the operation of the stylus pen.
• the sensing unit 330 for detecting contact with the touch screen may include at least one of a pressure sensor and a magnetic sensor.
• In addition, the sensing unit 330 may include a gyro sensor and an acceleration sensor.
• the pressure sensor is for measuring the strength of the pressure at the point of contact when the stylus pen 50 is in contact with the touch screen.
  • the magnetic sensor is for detecting a contact touch and a proximity touch between the stylus pen 50 and the touch screen.
  • the magnetic sensor may distinguish whether the stylus pen 50 is in contact or proximity touch with the touch screen based on the change in the magnetic field.
• as described above, the sensing unit 330 may include at least one of a pressure sensor and a magnetic sensor, and these may of course also be used for sensing a proximity touch or the like.
  • the sensing unit 330 may include an RGB sensor for detecting a color of an object, a fingerprint sensor for recognizing fingerprint information, and the like.
  • the memory 340 stores various data necessary for the operation of the stylus pen 50, and the memory 340 may include a control area and a data area.
  • the control area may be an area for storing a program for operating the stylus pen 50
  • the data area may be an area for storing data received from the outside.
  • the controller 350 controls the overall operation of the stylus pen 50.
  • the controller 350 may perform control and processing for wireless communication with the mobile terminal 100.
  • the controller 350 may generate a control signal of the paired mobile terminal and control the signal to be transmitted to the mobile terminal.
  • the controller 350 may control to store information of content being output to the mobile terminal.
• the controller 350 may control to perform first user authentication when fingerprint information is sensed through the stylus pen and to perform second user authentication when motion information of the stylus pen is sensed. This will be described later in detail for each embodiment.
  • FIG. 4 is a perspective view of a stylus pen according to an embodiment of the present invention.
  • the stylus pen 50 may include a head portion 410 and a body portion 420.
  • the head portion 410 is a portion for contacting the touch screen of the mobile terminal 100
  • the body portion 420 is a portion for the user to hold the stylus pen 50.
  • Electronic components constituting the stylus pen 50 may be mounted on the body portion 420 of the stylus pen 50.
• a first input unit 51 may be positioned at the end of the head unit 410, and the first input unit 51 may include the sensing unit 330. In this case, the first input unit 51 may include an RGB sensor.
  • the body 420 may include a second input unit 52 and a third input unit 53 for applying a user input to the stylus pen 50.
  • the second input unit 52 may be implemented in the form of a touch sensor on the surface
  • the third input unit 53 may be implemented in the form of a push button.
  • the second input unit 52 may include a fingerprint sensor.
  • the second input unit 52 and the third input unit 53 may be implemented differently from the example shown in FIG. 4.
  • the mobile terminal and the stylus pen described in the present invention include at least one of the components shown in FIG. 1A.
• since the mobile terminal according to the present invention may be more easily implemented when the display unit 151 is a touch screen, it is assumed below that the display unit 151 is a touch screen.
• the expression that the stylus pen 50 'touches' or 'contacts' the mobile terminal 100 may include not only a case where the stylus pen 50 and the mobile terminal 100 are in direct contact but also a proximity touch.
  • operations performed by the mobile terminal 100 or the stylus pen 50 may be controlled by respective controllers.
• for convenience, the drawings and the following description collectively describe these operations as being performed/controlled by the mobile terminal or the stylus pen.
  • the stylus pen 50 and the mobile terminal 100 are in a paired state, and the mobile terminal 100 is in a state capable of receiving a control signal from the stylus pen 50.
  • FIG. 5 is a flowchart illustrating a method of controlling a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal 100 may output the first content on the touch screen (S510).
• the first content may include various applications, videos, documents, images, etc. executable in the mobile terminal 100. That is, this may correspond to a state in which the user is using the first content through the mobile terminal 100.
• step S510 may correspond not only to a state in which the first content is output on the touch screen but also to a state in which the touch screen is deactivated.
  • the stylus pen 50 may sense the first input signal (S520).
• the first input signal corresponds to an input signal sensed from the user in a state where the first input unit 51 of the stylus pen is not touching the touch screen.
  • the first input signal may correspond to a touch input through the second input unit 52.
  • the first input signal may correspond to a push input through the third input unit 53.
  • the stylus pen 50 may generate a control signal of the first content in response to the first input signal and transmit the generated control signal to the mobile terminal 100 (S530). That is, unlike the conventional stylus pen, even if the mobile terminal 100 and the stylus pen 50 are not in close proximity, data transmission / reception is possible.
• the mobile terminal 100 may receive the control signal of the first content and control to output the second content to the touch screen (S540). That is, the user may easily control the mobile terminal 100 by applying an input to the stylus pen 50 itself, rather than a touch input to the touch screen of the mobile terminal 100 using the stylus pen 50.
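The S510 to S540 flow above can be summarized as a minimal, hypothetical Python sketch; the class and method names (and the memo-app target) are illustrative assumptions, not part of the disclosed embodiment:

```python
# Hypothetical sketch of the S510-S540 flow; all names are illustrative,
# not part of the disclosed embodiment.

class MobileTerminal:
    def __init__(self):
        self.screen_content = None

    def output_content(self, content):
        # S510 / S540: output content on the touch screen.
        self.screen_content = content

    def on_control_signal(self, signal):
        # S540: swap the displayed content in response to the stylus signal.
        if signal["action"] == "open":
            self.output_content(signal["target"])


class StylusPen:
    def __init__(self, terminal):
        # Assumes the pen and terminal are already paired over a
        # short-range link (e.g. Bluetooth).
        self.terminal = terminal

    def sense_input(self, input_unit):
        # S520-S530: an input on the pen body (e.g. push button 53) is
        # sensed even when the pen is not touching the touch screen,
        # and a control signal is generated and transmitted.
        signal = {"action": "open", "target": "memo_app"}
        self.terminal.on_control_signal(signal)


terminal = MobileTerminal()
terminal.output_content("video_player")   # first content (S510)
pen = StylusPen(terminal)
pen.sense_input("third_input_unit")       # S520-S530
print(terminal.screen_content)            # memo_app (second content, S540)
```

The key point of the flow is that the control signal originates from the pen body itself, not from a touch on the screen.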
• FIGS. 6 to 8 illustrate various embodiments of the method for controlling the mobile terminal 100 by the stylus pen 50 described above.
  • FIG. 6 is a diagram illustrating an example of controlling a mobile terminal according to an input signal to a stylus pen.
  • the stylus pen 50 may sense an input signal through the third input unit 53.
  • the third input unit 53 may sense a push input by the user.
  • the stylus pen 50 may generate a control signal of the mobile terminal 100 and transmit a control signal to the mobile terminal 100 based on the sensed input signal.
  • the mobile terminal 100 may control to output the preset content 620 on the touch screen in response to a control signal from the stylus pen 50.
  • the preset content 620 corresponds to a memo application.
• the preset content 620 may correspond to various contents, such as contents set by the user or contents preset by the mobile terminal. That is, through this, when the user wants to take a memo or write while using other contents of the mobile terminal 100 or while not using it, the user can apply an input to the stylus pen 50 and immediately write on the mobile terminal 100.
• in FIG. 6, it is assumed that the touch screen of the mobile terminal 100 is deactivated; however, the present invention is not limited thereto, and other contents may be being output on the touch screen.
• in FIG. 6, an input signal through the third input unit 53 of the stylus pen 50 is taken as an example, but it may also be an input signal through another input unit of the stylus pen 50.
  • FIG. 7 is a diagram illustrating another example of controlling a mobile terminal according to an input signal to a stylus pen.
  • the stylus pen 50 may sense the first input signal 740a.
  • the first input signal 740a corresponds to a push input to the third input unit 53, but is not limited thereto.
  • the first input signal 740a corresponds to a signal sensed when the stylus pen 50 is not in contact with the mobile terminal 100.
  • the push input to the third input unit 53 may correspond to the long press input, unlike the embodiment of FIG. 6.
  • the stylus pen 50 may generate a control signal of the mobile terminal 100 in response to the first input signal 740a and transmit the generated control signal to the mobile terminal 100.
  • the mobile terminal may perform a screenshot of the content 710 being output on the touch screen according to the received control signal.
  • the mobile terminal 100 may sense the second input signal 740b while the screenshot image 720a is output.
  • the second input signal 740b may correspond to a drag touch input on the touch screen through the first input unit 51 of the stylus pen 50.
  • the second input signal 740b corresponds to a signal for cropping the partial region 720b of the screenshot image 720a.
  • the mobile terminal may control to enlarge and output the cropped region 720b in response to the second input signal 740b.
  • writing with the stylus pen 50 may be possible while the screenshot image 720a is output to the mobile terminal in response to the first input signal 740a.
  • FIG. 8 is a diagram illustrating another example of controlling a mobile terminal according to an input signal to a stylus pen.
• in FIG. 8, it is assumed that the mobile terminal is the foldable display device described above with reference to FIG. 2.
  • the mobile terminal is not limited thereto and may include various display devices that can distinguish and use a plurality of display areas.
  • the mobile terminal 200 may control to output a palette capable of color selection and brush selection on the first display area and output a sketchbook on the second area.
• the mobile terminal 200 may control to output a drawing according to a touch input of the stylus pen 50 to the first region 210 or the second region 220, changing the color, the thickness, etc. according to the touch input.
  • the user may want to change the color, thickness, etc. currently being output according to the drag touch.
  • the stylus pen 50 may sense the first input signal 810a.
  • the first input signal 810a may correspond to an input for rotating the stylus pen 50.
  • the first input signal 810a may correspond to an input for changing the color of the pen or brush being used on the drawing content.
  • the stylus pen 50 may generate a control signal in response to the first input signal 810a and transmit the generated control signal to the mobile terminal 100.
• the first input signal 810a corresponds to a rotational input sensed by the stylus pen 50 regardless of whether the stylus pen 50 is in contact with the mobile terminal 100.
  • the mobile terminal may control to output the indicator 830 on the touch screen in response to the received control signal.
  • the user may easily recognize the color of the brush changed through the first input signal 810a.
• if the color output to the indicator 830 is not the color desired by the user, the color may be changed several times by again applying an input of rotating the stylus pen 50.
• the mobile terminal 200 may change the thickness of the brush rather than the color according to the first input signal 810a. In this case, it is possible to set the thickness to increase or decrease in accordance with the rotation direction of the stylus pen 50.
• when the mobile terminal senses the second input signal 810b on the touch screen, the mobile terminal may control to output a drawing using a brush or pen of the color changed in correspondence to the first input signal 810a.
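One plausible reading of the rotation-input behavior of FIG. 8 can be sketched as follows; the palette, the cycling rule, and all names are illustrative assumptions rather than the disclosed implementation:

```python
# Hypothetical sketch of FIG. 8: a rotation input on the stylus cycles the
# brush color (or, alternatively, could adjust thickness by rotation
# direction, as the embodiment also suggests). All names are illustrative.

PALETTE = ["black", "red", "green", "blue"]

class BrushState:
    def __init__(self):
        self.color_index = 0
        self.thickness = 3

    def on_rotation(self, direction):
        # One sensed rotation steps through the palette; the direction
        # determines whether we step forward or backward.
        step = 1 if direction == "clockwise" else -1
        self.color_index = (self.color_index + step) % len(PALETTE)
        return PALETTE[self.color_index]   # shown on indicator 830

brush = BrushState()
print(brush.on_rotation("clockwise"))         # red
print(brush.on_rotation("clockwise"))         # green
print(brush.on_rotation("counterclockwise"))  # red
```

Repeated rotations model the user cycling until the indicator 830 shows the desired color.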
  • FIG. 9 is a flowchart illustrating a method of controlling a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • the first mobile terminal is assumed to be the mobile terminal 100 of FIG. 1, and the second mobile terminal is assumed to be the mobile terminal 200 of FIG. 2.
  • the first mobile terminal or the second mobile terminal may be implemented in reverse.
  • the first mobile terminal 100 may control to output the first content on the touch screen (S910).
  • the first content may correspond to content that is visually provided, such as an image, a web page address, and text.
  • the first content may correspond to voice content, tactile content, etc. in addition to the visually provided content.
  • the stylus pen 50 may sense the first input signal (S920). More specifically, the stylus pen 50 may sense the first input signal while the stylus pen 50 is in contact with the first content output on the touch screen of the first mobile terminal 100.
  • the first input signal may correspond to a user input to the second input unit or the third input unit described above with reference to FIG. 3.
  • the state in which the stylus pen 50 is in contact with the first content output on the touch screen of the first mobile terminal 100 may correspond to specifying the first content.
  • the stylus pen 50 may receive information on the first content from the first mobile terminal 100 and store information on the first content in response to the first input signal (S930).
  • the stylus pen 50 may sense the second input signal (S940).
  • the second input signal may correspond to a user's input to the second input unit or the third input unit described above with reference to FIG. 3.
• the position where the stylus pen 50 contacts the touch screen of the second mobile terminal 200 may correspond to specifying the output position of the first content.
  • the stylus pen 50 may transmit information about the first content to the second mobile terminal 200 in response to the second input signal.
  • the second mobile terminal 200 may output the first content on the touch screen (S950).
  • the user can easily move the content being used in one terminal to another terminal using a stylus pen while using a plurality of terminals at the same time.
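The S910 to S950 copy-and-paste flow above can be modeled as a small, hypothetical sketch in which the stylus memory acts as a clipboard; all class and method names are illustrative assumptions:

```python
# Hypothetical sketch of the S910-S950 flow: the stylus stores content
# information from one terminal and pastes it into another at the contact
# position. All names are illustrative.

class Terminal:
    def __init__(self, name):
        self.name = name
        self.screen = {}                       # position -> content

    def content_at(self, pos):
        return self.screen.get(pos)

    def paste(self, pos, content):             # S950: output at position
        self.screen[pos] = content

class Stylus:
    def __init__(self):
        self.clipboard = None                  # memory 340, data area

    def first_input(self, terminal, pos):
        # S920-S930: first input while contacting the first content
        # copies its information into the pen's memory.
        self.clipboard = terminal.content_at(pos)

    def second_input(self, terminal, pos):
        # S940-S950: second input while contacting the second terminal
        # transmits the stored content to the contact position.
        if self.clipboard is not None:
            terminal.paste(pos, self.clipboard)

a, b = Terminal("first"), Terminal("second")
a.paste((0, 0), "image_1010a")
pen = Stylus()
pen.first_input(a, (0, 0))
pen.second_input(b, (5, 5))
print(b.content_at((5, 5)))                    # image_1010a
```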
  • FIG. 10 is a diagram illustrating an example of controlling a plurality of mobile terminals by a stylus pen according to an embodiment of the present invention.
  • the embodiment of FIG. 10 illustrates a method of outputting image content being output from the first mobile terminal to the second mobile terminal under the control of the stylus pen.
  • the stylus pen 50 and the first mobile terminal 100 and the stylus pen 50 and the second mobile terminal 200 are paired and capable of data transmission / reception.
  • the first mobile terminal 100 is the mobile terminal of FIG. 1A
  • the second mobile terminal 200 is the mobile terminal of FIG. 2.
  • the type of the first and second mobile terminals is not limited thereto, and may include various devices provided with a display unit.
  • the first mobile terminal 100 may control to output image content to the touch screen.
  • the user may want to transfer an image, which is viewed as a small screen of the smartphone, to a display device having a large screen.
  • the first mobile terminal 100 may sense the contact of the stylus pen 50 on the touch screen. More specifically, as illustrated in FIG. 10A, a contact of the stylus pen 50 with respect to the image content 1010a output on the touch screen may be sensed. In this case, the first mobile terminal 100 may perform an operation associated with the image content touched by the stylus pen 50.
  • the stylus pen 50 may sense the first input signal 1020a while the first input unit 51 is in contact with the touch screen.
  • the first input signal 1020a may correspond to a user input to the second input unit 52.
  • the stylus pen 50 may transmit a control signal to the first mobile terminal 100, and the first mobile terminal 100 may transmit information of the image content 1010a to the stylus pen 50.
  • the stylus pen 50 may control to store information of the image content 1010a in a memory.
  • the second mobile terminal 200 may sense a contact of the stylus pen 50 on the touch screen.
• the stylus pen 50 may sense the second input signal 1020b while the first input unit 51 is in contact with the touch screen of the second mobile terminal 200.
• the second input signal 1020b may correspond to a user input to the second input unit 52 and may correspond to the same input as the first input signal 1020a.
  • the stylus pen 50 may transmit information of the image content 1010a to the second mobile terminal 200 in response to the second input signal 1020b.
  • the second mobile terminal 200 may control to output the image content 1010b to the touch screen.
• the second mobile terminal 200 may control to output the image content 1010b at the position where the first input unit 51 of the stylus pen 50 is in contact.
  • the image content 1010a output to the first mobile terminal 100 and the image content 1010b output to the second mobile terminal 200 may correspond to the same image.
  • the second mobile terminal 200 may control the image content 1010b to be enlarged and output to fit the size of the touch screen.
• although FIG. 10 has described a method of copying an image and pasting it as it is, a method of cutting the image content 1010a output to the first mobile terminal 100 and pasting it to the second mobile terminal 200 may also be used.
• the second mobile terminal 200 may control to enlarge or reduce the image content 1010b according to the signal strength from the first input unit 51 of the stylus pen 50.
• in FIGS. 11 and 12, a method of controlling content of a mobile terminal when video playback and handwriting using a stylus pen are simultaneously performed in the mobile terminal will be described.
• in FIGS. 11 and 12, it is assumed that the stylus pen and the mobile terminal are in a paired state, and the mobile terminal can transmit/receive a control signal from the stylus pen.
• in FIGS. 11 and 12, it is assumed that the mobile terminal includes a first area and a second area on the touch screen.
  • FIG. 11 is a flowchart illustrating a method of controlling a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal may sense the first input by the stylus pen in the second area while playing the video in the first area (S1110).
  • the first input may correspond to a handwriting input of information related to a video being played in the first area.
• the mobile terminal may control to store the video playback information at the time when the handwriting input is sensed in the memory (S1120).
  • the video playback information may correspond to time information, content information, etc. at the time when the first input is sensed in the video being played.
  • the mobile terminal may sense a second input by the stylus pen in a state in which handwritten content is output to the second area (S1130).
  • the second input may correspond to a long press touch input as a touch input by the stylus pen.
  • the mobile terminal may control to output video information corresponding to the writing at the point where the touch input is sensed as a pop-up message (S1140).
  • the mobile terminal may provide a pop-up message with information corresponding to the point where the second input is sensed among the written information.
• through this, when using video content, the user can later easily check or use the video information related to the handwritten content. This will be described in detail with reference to FIG. 12.
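The S1110 to S1140 flow above amounts to bookmarking: each handwriting position is stored together with the video playback information at that moment, so a later long-press on the note pops up the matching playback time. A minimal, hypothetical sketch (all names and sample values are illustrative):

```python
# Hypothetical sketch of S1110-S1140: handwriting in the second area is
# stored with the playback time of the video in the first area, and a
# long-press later returns a pop-up with the matching information.

class LectureNotes:
    def __init__(self):
        self.bookmarks = {}                    # note position -> playback info

    def on_handwriting(self, pos, playback_time, topic):
        # S1110-S1120: store playback info at the moment of writing.
        self.bookmarks[pos] = {"time": playback_time, "topic": topic}

    def on_long_press(self, pos):
        # S1130-S1140: output the matching video info as a pop-up message.
        info = self.bookmarks.get(pos)
        if info:
            return f"popup: {info['topic']} @ {info['time']}s"
        return None

notes = LectureNotes()
notes.on_handwriting((10, 20), 754, "Ohm's law")
print(notes.on_long_press((10, 20)))           # popup: Ohm's law @ 754s
```

The stored time could also serve as the playback start point when the pop-up is tapped, as in the FIG. 12 embodiment.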
  • FIG. 12 is a diagram illustrating an example of receiving an input by a stylus pen simultaneously with playing a video on a mobile terminal.
• in FIG. 12, it is assumed that the mobile terminal is the foldable display device described above with reference to FIG. 2.
  • the embodiment of FIG. 12 assumes that writing is related to learning while playing a learning video.
  • the mobile terminal 200 receives a writing input 1210a by the stylus pen 50 in the second area 220 while playing the training video in the first area 210. You can sense it.
  • the mobile terminal 200 may control to store the playback information of the training video in the memory.
• the video playback information may include time information, content information, etc. at the time when a handwriting input is sensed in the video being played. That is, the mobile terminal 200 may control to store the handwriting input sensed during video playback in the memory in association with the video.
  • the mobile terminal 200 may sense the touch input 1210b by the stylus pen 50 while the writing information is output.
  • the touch input by the stylus pen 50 may correspond to the long press touch, but is not limited thereto.
  • the mobile terminal 200 may output information of a video corresponding to the area where the touch input 1210b is sensed among the handwriting information in the form of a pop-up message 1220 in response to the touch input 1210b.
  • the popup message 1220 may correspond to time information or content information of a video corresponding to an area where the touch input 1210b is sensed. In this way, the user can easily recognize the video playback time associated with the handwriting.
• when an additional input for the pop-up 1220 is sensed while the pop-up 1220 is output, the mobile terminal 200 may play the video through the first area 210 and the second area 220. At this time, the playback start time of the video may correspond to the time corresponding to the handwriting information.
• in FIGS. 13 to 21, cases in which the stylus pen is used for payment will be described.
  • the stylus pen and the mobile terminal are in a paired state, so that a signal transmission / reception between the mobile terminal and the stylus pen is possible.
• in FIGS. 13 to 15, a method of performing payment using only a stylus pen without using a mobile terminal will be described.
• in the embodiments thereafter, a payment is performed using a mobile terminal as well as a stylus pen.
  • FIG. 13 is a flowchart illustrating a payment method using a stylus pen according to an embodiment of the present invention.
• the stylus pen may be provided with a wireless communication unit so that the stylus pen itself can transmit/receive information related to payment.
  • additional authentication may be required in addition to the signature to enhance security.
  • the stylus pen may enter the payment mode (S1310). For example, when the stylus pen receives payment related information from the mobile terminal, the stylus pen may enter the payment mode. Also, for example, when the stylus pen receives an input signal from the user, the stylus pen may enter the payment mode.
  • the stylus pen may sense the fingerprint information of the user (S1320). More specifically, the fingerprint information of the user may be sensed through the fingerprint sensing unit provided in the second input unit of the stylus pen. In this case, the stylus pen may determine whether the user's fingerprint information and the registered fingerprint information match, and perform a first user authentication (S1330).
  • the stylus pen may transmit the fingerprint information sensed to the mobile terminal, determine whether it is the same as the fingerprint information of the user registered in the mobile terminal, and perform the first user authentication. As another example, the stylus pen may determine whether it is identical to fingerprint information of a user registered by itself, and perform first user authentication.
  • the stylus pen may sense a signature input of the user (S1340).
  • the signature input of the user may correspond to a signature input in a state in which the first input unit of the stylus pen contacts the display unit of the POS for payment.
  • the stylus pen may perform second user authentication (S1350).
• the movement information of the stylus pen may include the contents of the signature input sensed through various sensors for sensing the movement of the stylus pen, the pressure of the signature input, the angle of the stylus pen when the signature is input, the movement pattern of the stylus pen, and the like.
• the first user authentication through the fingerprint information and the second user authentication through the signature input may be performed at the same time, or may be performed sequentially.
  • the stylus pen may transmit payment information to the external device (S1360).
• the external device may correspond to a POS terminal.
  • the user can easily perform security-enhanced user authentication only with the stylus pen.
  • the embodiment of FIG. 13 may be performed in the same manner for user authentication of secure content in addition to user authentication for payment.
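The S1310 to S1360 two-step authentication above can be sketched as below. The matching rules, tolerances, and registered values are deliberately naive placeholders for illustration, not the disclosed matching method:

```python
# Hypothetical sketch of S1310-S1360: first the fingerprint is matched,
# then the signature motion profile; payment info is released only when
# both checks pass. Registered values and tolerances are illustrative.

REGISTERED = {
    "fingerprint": "fp_hash_abc",
    "motion": {"pattern": "JHLee", "avg_pressure": 0.6, "avg_angle": 48},
}

def first_auth(sensed_fingerprint):
    # S1320-S1330: fingerprint sensed on the second input unit.
    return sensed_fingerprint == REGISTERED["fingerprint"]

def second_auth(sensed_motion, pressure_tol=0.15, angle_tol=10):
    # S1340-S1350: signature content plus motion profile (pressure, angle).
    ref = REGISTERED["motion"]
    return (sensed_motion["pattern"] == ref["pattern"]
            and abs(sensed_motion["avg_pressure"] - ref["avg_pressure"]) <= pressure_tol
            and abs(sensed_motion["avg_angle"] - ref["avg_angle"]) <= angle_tol)

def pay(fingerprint, motion):
    # S1360: transmit payment information only after both authentications.
    if first_auth(fingerprint) and second_auth(motion):
        return "payment info sent to POS"
    return "authentication failed"

print(pay("fp_hash_abc",
          {"pattern": "JHLee", "avg_pressure": 0.55, "avg_angle": 50}))
```

As the text notes, the same two-step check could equally gate access to secure content rather than a payment.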
  • FIG. 14 is a diagram illustrating an example of making a payment using a stylus pen according to an embodiment of the present invention.
• at the POS 1400, a step of tagging a product for payment or inputting the payment amount may be performed.
• the payment amount may be output to the display unit 1410 of the POS 1400 as shown in FIG. 14A.
  • the user may wish to proceed with the payment using the stylus pen 50.
• the stylus pen 50 may sense the signature input 1420 while being in contact with the POS 1400.
  • the stylus pen 50 may sense the fingerprint information of the user.
  • the fingerprint information of the user matches the registered fingerprint information, the first user authentication may be performed.
  • the stylus pen 50 may perform second user authentication based on the motion information sensed by the signature input 1420 when it matches the registered motion information.
• the motion information may include a signature pattern sensed through various sensors included in the stylus pen 50, an angle of the stylus pen 50, a pressure sensed by the stylus pen 50, a speed of the signature input, and the like.
• the stylus pen 50 may transmit payment information to the POS 1400.
• the stylus pen 50 may include a wireless communication unit using a communication method such as Magnetic Secure Transmission (MST), Near Field Communication (NFC), Radio Frequency Identification (RFID), Bluetooth Low Energy (BLE), or Bluetooth (BT).
• through this wireless communication unit, the payment information may be controlled to be transmitted to the POS 1400.
• when entering the payment mode, the stylus pen 50 may perform the payment with a default card. In this case, when a signal of rotating the stylus pen 50 is sensed, the payment card may be changed from the default card to another card held by the user.
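The default-card behavior described above can be sketched as a hypothetical card selector in which each sensed rotation advances to the next registered card; the card names and cycling rule are illustrative assumptions:

```python
# Hypothetical sketch: entering payment mode selects a default card, and
# each sensed rotation of the pen switches to the next registered card.
# Card names are illustrative.

CARDS = ["default_card", "credit_card_A", "check_card_B"]

class PaymentMode:
    def __init__(self):
        self.index = 0                         # start with the default card

    @property
    def selected_card(self):
        return CARDS[self.index]

    def on_rotation(self):
        # A sensed rotation changes to another card held by the user.
        self.index = (self.index + 1) % len(CARDS)
        return self.selected_card

mode = PaymentMode()
print(mode.selected_card)                      # default_card
print(mode.on_rotation())                      # credit_card_A
```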
  • FIG. 15 is a diagram illustrating in detail a first user authentication and a second user authentication process using a stylus pen according to an embodiment of the present invention.
  • the stylus pen 50 has entered a payment mode.
• FIG. 15A shows the first authentication through the fingerprint.
• the second input unit 52 of the stylus pen 50 may sense the fingerprint of the user's thumb or index finger.
  • the stylus pen 50 may determine whether the sensed fingerprint information matches the registered fingerprint information and perform primary authentication.
  • the first authentication through the fingerprint may be performed by the stylus pen 50 itself, or by transmitting the sensed fingerprint information to the mobile terminal, the first authentication may be performed on the mobile terminal.
• FIG. 15B shows the second authentication through the signature.
  • the stylus pen 50 may sense the signature input 1520.
• the mobile terminal 100 may determine whether the sensed signature input 1520 is the same as the registered signature input of the user and perform secondary authentication.
• alternatively, the stylus pen 50 may itself check whether the motion information of the signature input 1520 and the registered user's motion information are the same and perform the second authentication.
• the motion information of the stylus pen 50 may include the contents of the signature input sensed by various sensors for sensing the movement of the stylus pen 50, the pressure of the signature input, the angle of the stylus pen 50 when the signature is input, and the rotation pattern of the stylus pen 50 during the signature input.
  • the angle of the stylus pen 50 may correspond to a pattern in which the angle of the stylus pen 50 changes when the user generally signs.
• various types of authentication may be performed in addition to authentication through a fingerprint and authentication through a signature. For example, when the user's voice is sensed while the user is holding the stylus pen 50, it may be determined whether the user corresponds to a registered user through the voice transmitted by bone conduction through the skin and bone.
  • FIG. 16 is a flowchart illustrating a payment method of a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • this corresponds to a case where both a stylus pen and a mobile terminal are used for payment.
• the mobile terminal can enter a payment mode (S1610).
  • the stylus pen paired with the mobile terminal may also enter the payment mode.
  • the mobile terminal may sense a signature input while the stylus pen is in contact with the touch screen (S1620). More specifically, the mobile terminal can sense a signature input in a state where payment target information is output on the touch screen. In addition, when the signature input by the stylus pen and the signature information registered in the mobile terminal are the same, the mobile terminal may complete user authentication (S1630). Meanwhile, the mobile terminal may additionally sense the fingerprint information of the user in the stylus pen to perform additional user authentication.
  • the mobile terminal can transmit payment completion information to the external device through the wireless communication unit (S1640).
  • the external device may correspond to a POS as a payment terminal.
  • the mobile terminal and the stylus pen may end the payment mode.
• through this, the user can sign on his or her own mobile terminal with the stylus pen connected to it, without signing on the POS through a separate signing tool, thereby more easily completing user authentication and payment.
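The terminal-side S1610 to S1640 flow above can be sketched as follows, with a fingerprint check on the pen as the additional fallback when the signature does not match; the templates and function names are illustrative assumptions:

```python
# Hypothetical sketch of S1610-S1640 on the terminal side: the signature
# drawn on the terminal's touch screen is checked first; a fingerprint
# sensed on the stylus serves as an additional authentication fallback.
# Registered templates are illustrative.

REGISTERED_SIGNATURE = "sig_template_1"
REGISTERED_FINGERPRINT = "fp_template_1"

def authenticate(signature, fingerprint=None):
    # S1620-S1630: compare the sensed signature with the registered one.
    if signature == REGISTERED_SIGNATURE:
        return True
    # Signature mismatch: fall back to the fingerprint sensed on the pen.
    return fingerprint == REGISTERED_FINGERPRINT

def complete_payment(signature, fingerprint=None):
    # S1640: transmit payment completion info to the POS on success.
    if authenticate(signature, fingerprint):
        return "payment completion sent to POS"
    return "payment rejected"

print(complete_payment("sig_template_1"))
print(complete_payment("bad_sig", "fp_template_1"))
```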
• FIG. 17 is a diagram illustrating an example of a payment method using a mobile terminal and a stylus pen according to an embodiment of the present invention.
• when a product is tagged for payment or a payment amount is input to the POS 1700, the POS 1700 may output payment amount information to the display (1710a) or by voice (1710b).
  • the user may execute a payment application on the mobile terminal 100 for payment.
• the mobile terminal 100 may receive information related to the payment amount from the POS 1700 and control to output the payment target information 1720 on the touch screen.
  • the user may sign the mobile terminal using the stylus pen 50. That is, the mobile terminal 100 may sense the signature input 1730 by the stylus pen 50. In more detail, the mobile terminal 100 may sense the signature input 1730 through the first input unit 51 of the stylus pen 50.
  • the mobile terminal may determine whether the signature input 1730 is the same as the registered signature information. When the signature input 1730 and the registered signature information are the same, the mobile terminal may complete authentication of the user. However, when the signature input 1730 and registered signature information are not the same, the mobile terminal may additionally perform a user authentication procedure other than the signature input.
• the additional authentication procedure may correspond to fingerprint authentication through the stylus pen 50, but is not limited thereto.
  • the mobile terminal 100 may transmit payment completion information to the POS 1700.
  • the payment mode of the mobile terminal 100 and the stylus pen 50 may be terminated.
  • FIG. 18 is a diagram illustrating another example of a payment method using a mobile terminal and a stylus pen according to an embodiment of the present invention.
  • the touch input 1810 may be sensed by the stylus pen 50.
  • the touch input 1810 may correspond to a double tap input by the first input unit 51 of the stylus pen 50.
  • the touch input 1810 may also correspond to a plurality of consecutive touch inputs made other than by the stylus pen 50.
  • the mobile terminal may enter the payment mode.
  • the touch screen of the mobile terminal may still correspond to an inactive state in the payment mode.
  • the mobile terminal 100 may not output any payment target information on the touch screen.
  • the mobile terminal 100 may indicate that the mobile terminal 100 enters a payment mode in various ways.
  • for example, the LED on the front of the mobile terminal may blink to indicate that the payment mode has been entered.
  • the LED of the stylus pen 50 may blink to indicate that the payment mode is entered.
  • the mobile terminal 100 or the stylus pen 50 may provide a sound indicator, a vibration indicator, or the like to indicate that the mobile terminal 100 enters a payment mode.
  • the mobile terminal 100 may sense the signature input 1820 by the stylus pen 50.
  • the signature input 1820 may correspond to an input configured as a drag touch on the touch screen of the mobile terminal 100.
  • the signature input 1820 may also correspond to an input performed by the stylus pen 50 on a table located near the mobile terminal 100 or on the palm of the user. In this case, the stylus pen 50 may transmit information on the signature input to the mobile terminal.
  • the mobile terminal 100 may determine whether the sensed signature input 1820 is identical to the registered user's signature. In this case, although not shown in FIG. 18, when the sensed signature input 1820 is the same as the signature of the user, the mobile terminal 100 may indicate that user authentication is completed through the vibration indicator. In addition, it may indicate that user authentication has been completed in various ways.
  • the mobile terminal 100 may complete the payment by transmitting payment completion information to the POS 1800. In this case, the mobile terminal 100 may end the payment mode.
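The FIG. 18 flow (double tap enters payment mode while the screen stays inactive, a matching signature completes payment) can be sketched as a minimal state machine; the state names and the exact-match check are assumptions, not part of the disclosure.

```python
class PaymentSession:
    """Illustrative state machine: idle -> payment_mode -> completed."""

    def __init__(self, registered_signature: str):
        self.registered = registered_signature
        self.state = "idle"

    def on_double_tap(self) -> str:
        if self.state == "idle":
            # Entry can be signaled by LED blink, sound, or vibration;
            # the touch screen may remain inactive.
            self.state = "payment_mode"
        return self.state

    def on_signature(self, signature: str) -> str:
        if self.state != "payment_mode":
            return self.state  # signatures outside payment mode are ignored
        if signature == self.registered:
            # Payment completion information would now be sent to the POS.
            self.state = "completed"
        return self.state
```

A signature arriving before the double tap leaves the session idle, mirroring the described ordering of the inputs.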
  • referring to FIGS. 19 to 21, a method of account transfer using a stylus pen and a mobile terminal will be described.
  • conventionally, since an account transfer had to be performed through an application supporting a transfer function, such as a bank application on a mobile terminal, a method for simplifying this process has been required.
  • FIG. 19 is a diagram illustrating an example of performing an account transfer in a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal 100 may enter a bank transfer (transfer) mode. For example, when a handwriting input of a predetermined word such as 'finance' is sensed on the touch screen through the stylus pen 50, the mobile terminal 100 may enter a transfer mode. Also, for example, when a touch input of a preset pattern is sensed, the mobile terminal 100 may enter a transfer mode.
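The two triggers just described, a predetermined handwritten word or a preset touch pattern, can be sketched as below. The word 'finance' comes from the description; the pattern encoding is purely hypothetical.

```python
# Triggers for entering the transfer mode; both sets are illustrative.
TRIGGER_WORDS = {"finance"}
TRIGGER_PATTERNS = {("tap", "tap", "drag")}

def should_enter_transfer_mode(handwriting=None, pattern=None) -> bool:
    """Return True when a recognized handwritten word or a preset
    touch-input pattern matches a registered trigger."""
    if handwriting is not None and handwriting.strip().lower() in TRIGGER_WORDS:
        return True
    if pattern is not None and tuple(pattern) in TRIGGER_PATTERNS:
        return True
    return False
```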
  • when entering the remittance mode, the mobile terminal 100 may provide the touch screen in the form of a notepad to sense remittance information.
  • the user may write remittance information on the touch screen using the stylus pen 50. That is, referring to FIG. 19A, the mobile terminal 100 may sense an input of remittance information 1910 on the touch screen.
  • the remittance information may include information such as a recipient account and a remittance amount.
  • the mobile terminal may extract characters from the input remittance information 1910 through optical character recognition (OCR).
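Once OCR has turned the handwriting into text, the remittance fields still have to be pulled out of it. A minimal parser, assuming a "<bank> <account-number> <amount>" layout (the layout is an assumption; the description does not specify one):

```python
import re

def parse_remittance(text: str):
    """Extract bank, account number, and amount from OCR output text.
    Returns None when the assumed layout is not found."""
    match = re.search(r"(?P<bank>[A-Za-z]+)\s+(?P<account>[\d-]+)\s+(?P<amount>\d+)",
                      text)
    if match is None:
        return None
    return {
        "bank": match.group("bank"),
        "account": match.group("account"),
        "amount": int(match.group("amount")),
    }
```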
  • the mobile terminal 100 may sense the signature input 1920 after the remittance information 1910 is input. In this case, the mobile terminal can determine whether the sensed signature input 1920 is the same as the registered signature.
  • the mobile terminal 100 may output, on the touch screen, a remittance popup message 1930 for confirming whether to transfer the money. Meanwhile, the mobile terminal 100 may perform additional user authentication for enhanced security in addition to the user authentication through the signature input 1920.
  • the additional user authentication may correspond to authentication through fingerprint information of the user sensed through the stylus pen 50, but is not limited thereto.
  • the mobile terminal 100 may transmit the transfer information to an external device.
  • the external device may correspond to a bank server.
  • the mobile terminal 100 may output the transfer result on the touch screen.
  • the remittance result may include an account balance after the remittance, a remittance account, a remittance time, and the like.
  • FIG. 20 is a diagram illustrating an example of a method for transmitting money in a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • FIG. 20 illustrates a method of changing a part of remittance information in the embodiment of FIG. 19.
  • description of content overlapping with the above description will be omitted.
  • the mobile terminal 100 may sense the writing signal 2030a by the stylus pen 50.
  • the writing signal 2030a may correspond to writing by the stylus pen 50 for inputting the remittance information 2010.
  • the mobile terminal 100 may recognize the remittance information by extracting characters from the remittance information input 2010 through OCR.
  • the mobile terminal 100 may sense the first input signal 2030b by the stylus pen 50.
  • the first input signal 2030b may correspond to a drag touch input for modifying part of the remittance information.
  • for example, the first input signal 2030b may change the remittance bank.
  • the mobile terminal 100 may output the remittance bank 2020 that is changed in response to the first input signal 2030b.
  • the mobile terminal 100 may determine the bank to which the remittance is directed.
  • the second input signal 2030c may correspond to a touch input by the stylus pen 50.
  • the mobile terminal 100 may sense the third input signal 2030e for the final remittance.
  • the third input signal 2030e may correspond to a voice input sensed while the stylus pen 50 is in contact with the touch screen (2030d).
  • the voice input 2030e may include remittance information.
  • the mobile terminal 100 may perform speech-to-text (STT) to recognize the user's voice input 2030e.
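The final remittance of FIG. 20 only accepts voice while the stylus pen touches the screen (2030d). That gating can be sketched as follows, with the STT stage replaced by an already-transcribed string (the function name and contact flag are assumptions):

```python
def accept_voice_remittance(pen_in_contact: bool, transcription: str):
    """Return the cleaned transcription only while the pen-contact
    condition holds; otherwise the voice input is ignored."""
    if not pen_in_contact:
        return None
    # Empty or whitespace-only transcriptions are treated as no input.
    return transcription.strip() or None
```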
  • the mobile terminal 100 may transmit the remittance information to the external device.
  • the external device may correspond to a bank server.
  • FIG. 21 is a diagram illustrating an example of remittance while an address book is output on a mobile terminal using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal may sense a first input signal 2110a for a specific person included in the address book 2100.
  • the first input signal 2110a may correspond to a long press touch input by the stylus pen 50.
  • the mobile terminal 100 may control to output a menu option 2120 for a specific person.
  • the menu option 2120 may include various menus such as making a phone call, sending a message, and making a bank transfer.
  • the mobile terminal 100 may sense a second input signal 2110b for an account transfer option among the menu options 2120.
  • the mobile terminal 100 may perform bank transfer in various ways. As an example, as illustrated in FIG. 21C, the mobile terminal 100 may output a memo interface 2130 for sensing a handwriting input for the account transfer history. In addition, the mobile terminal can sense the handwriting input 2110c for account transfer. In this case, as described above with reference to FIGS. 19 and 20, the mobile terminal 100 may recognize the contents of the handwriting input through the OCR, and perform the account transfer.
  • the mobile terminal 100 may execute an account transfer application.
  • the mobile terminal 100 may receive the account transfer information on the application, and perform the account transfer.
  • the mobile terminal 100 may output a pop-up message to indicate that the transfer is completed.
  • the mobile terminal 100 may output the address book 2100 again.
  • referring to FIGS. 22 to 28, a user interface for providing various experiences in a mobile terminal using a stylus pen will be described, in addition to the above-described embodiments.
  • it is assumed that the stylus pen 50 and the mobile terminals 100 and 200 are paired and capable of data transmission/reception.
  • FIG. 22 is a diagram illustrating an example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • conventionally, the color of a pen or brush is changed through a color picker.
  • however, since the selection is limited to the colors included in the color picker, it may be difficult for the user to draw with the realistic color of a real object.
  • in contrast, when the stylus pen 50 is provided with an RGB color sensor, the user can draw in the same color as the actual color of a real object.
  • FIG. 22 corresponds to a case in which the user wants to draw on the mobile terminal 100 in the same color as an actual apple.
  • the stylus pen 50 may recognize the color of the object located in front of the RGB sensor provided in the first input unit 51.
  • the stylus pen 50 may sense the input signal 2210.
  • the input signal 2210 may correspond to an input to the second input unit 52 in a state where the stylus pen 50 is in contact with the touch screen of the mobile terminal 100.
  • the stylus pen 50 may transmit color information recognized through the RGB sensor to the mobile terminal 100 in response to the sensed input signal 2210.
  • the mobile terminal 100 may perform drawing of the same color as the color recognized by the RGB sensor of the stylus pen 50 on the drawing content.
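The sensor-to-brush step above reduces to turning an (R, G, B) reading into the drawing color. A sketch, assuming 8-bit channels and a hex color string as the brush format (both are assumptions; the disclosure does not specify a representation):

```python
def rgb_to_brush_color(reading) -> str:
    """Clamp a raw (R, G, B) sensor reading to 0..255 per channel and
    format it as a hex brush color string."""
    r, g, b = (max(0, min(255, int(c))) for c in reading)
    return "#{:02x}{:02x}{:02x}".format(r, g, b)
```

Clamping guards against sensor noise producing out-of-range channel values.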
  • FIG. 23 is a diagram illustrating another example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal 100 may output various shape guides on one side of the touch screen while the drawing application is executed. For example, in a state in which a straight line is selected among the shape guides, the mobile terminal may sense the first input signal 2310 and the second input signal 2320.
  • the first input signal 2310 may correspond to a state in which a finger touches the touch screen.
  • the second input signal 2320 may correspond to a drag touch input by the stylus pen 50.
  • the mobile terminal 100 may output a straight line according to the second input signal 2320 based on the position where the first input signal 2310 is sensed. That is, the mobile terminal 100 may provide a drawing result in a form desired by the user through the first input signal 2310 and the second input signal 2320.
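Geometrically, the finger fixes an anchor point and the pen drag picks the other endpoint; the drawn stroke is the straight line through both. A toolkit-free sketch of that snapping:

```python
def snap_to_line(anchor, drag_end, t):
    """Point at parameter t (0..1) on the straight line from the finger
    anchor to the stylus drag endpoint."""
    (x0, y0), (x1, y1) = anchor, drag_end
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
```

Rendering the stroke then means sampling t over 0..1 instead of using the raw drag path.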
  • FIG. 24 illustrates another example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal 100 may sense the first input signal 2410 by the stylus pen 50 while the drawing application is executed.
  • the first input signal 2410 may correspond to a drag touch input by the stylus pen 50.
  • the mobile terminal may output a star image as illustrated in FIG. 24 (a).
  • the mobile terminal may sense the second input signal 2420 for the star-shaped image.
  • the second input signal 2420 may correspond to a drag touch input by a user's thumb.
  • the mobile terminal 100 may erase the image output at the position where the second input signal 2420 is sensed.
  • the mobile terminal 100 can provide a user interface that allows the user to easily draw and erase an image by distinguishing a drag touch input by the stylus pen 50 from a drag touch input by a finger.
  • the mobile terminal can distinguish a drag touch input by the stylus pen 50 from one by a finger through a pressure difference, a contact signal difference, and the like.
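That discrimination can be sketched as a small classifier. The pressure threshold is hypothetical; the description only says pressure or contact-signal differences are used.

```python
STYLUS_PRESSURE_MIN = 0.6  # hypothetical normalized pressure threshold

def classify_drag(pressure: float, has_pen_signal: bool) -> str:
    """Map a drag input to the draw or erase action: a pen contact
    signal or high pressure is treated as the stylus (draw), anything
    else as a finger (erase)."""
    if has_pen_signal or pressure >= STYLUS_PRESSURE_MIN:
        return "draw"    # stylus pen drag
    return "erase"       # finger drag
```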
  • FIG. 25 is a diagram illustrating another example of providing drawing content using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal 100 may capture an object 2510 in the field of view area through a camera.
  • the mobile terminal 100 may execute a drawing application to place the captured object 2510 on a lowermost layer and provide it as a drawing guide.
  • the mobile terminal 100 may crop only the area of the object 2510 focused in the captured image and provide it as a drawing guide.
  • the user may perform the drawing operation 2520 by referring to the captured object 2510 as a drawing guide using the stylus pen 50.
  • the user can draw a drawing result with a high similarity to the real object.
  • FIG. 26 illustrates an example of controlling book content using a stylus pen according to an embodiment of the present invention. It is assumed in the embodiment of FIG. 26 that the book content is output on the mobile terminal 200 including a first area and a second area.
  • the mobile terminal 200 may sense the first input signal 2610a while outputting book content.
  • the first input signal 2610a may correspond to a drag touch input by the stylus pen 50.
  • the mobile terminal 200 may designate the area 2620 where the first input signal 2610a is sensed as a storage area.
  • the mobile terminal 200 may sense the second input signal 2610b.
  • the second input signal 2610b may correspond to a drag touch input in a right direction by the stylus pen 50.
  • the mobile terminal 200 may control to output the clipboard 2630 in response to the second input signal 2610b and to add the storage area to the clipboard 2630.
  • FIG. 27 is a diagram illustrating an example of providing content using a stylus pen according to an embodiment of the present invention.
  • the embodiment of FIG. 27 is assumed to be implemented on a mobile terminal 200 including a first area and a second area.
  • the mobile terminal 200 may output content 2710 to at least one of the first area 210 and the second area 220.
  • the content 2710 may include various contents such as document content, image content, book content, and presentation content.
  • the user may want to organize or write related data in the second area 220 based on the content output to the first area 210.
  • the mobile terminal 200 may sense a folding signal.
  • the folding signal corresponds to a signal for folding the mobile terminal 200 so that the first area 210 and the second area 220 face each other.
  • in response to the folding signal, the mobile terminal 200 may output, in the second area 220, the content 2710b corresponding to the content 2710a output in the first area 210.
  • the user may write necessary information on the content 2710b output in the second area 220 while referring to the content 2710a output in the first area 210.
  • accordingly, the modified information, including the original information and the handwriting, can be conveniently used.
  • FIG. 28 is a diagram illustrating an example of controlling photo content using a stylus pen according to an embodiment of the present invention.
  • the mobile terminal 100 may execute a gallery application including at least one thumbnail image 2810.
  • the mobile terminal may sense an input signal 2820 that selects at least one thumbnail by the stylus pen 50.
  • the input signal 2820 may correspond to a drag touch input.
  • the mobile terminal 100 may control to output the thumbnail images as a first group 2830a and a second group 2830b in response to the input signal 2820.
  • the input signal 2820 may include various touch inputs that can distinguish regions in addition to one drag touch input.
  • the user can easily group only the desired picture among the plurality of pictures output in the gallery by using the stylus pen.
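The grouping of FIG. 28 can be sketched by reducing the drag to a dividing coordinate and splitting the thumbnails around it. The single vertical divider and the (name, x-center) layout are assumptions made for illustration:

```python
def group_thumbnails(thumbnails, divider_x):
    """thumbnails: list of (name, x_center) pairs in grid coordinates.
    Returns (left_group, right_group) split by the drag's divider line."""
    left = [name for name, x in thumbnails if x < divider_x]
    right = [name for name, x in thumbnails if x >= divider_x]
    return left, right
```

Other region-distinguishing touch inputs would map to the same split once reduced to a boundary.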
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (e.g., transmission over the Internet).
  • the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
  • the present invention has industrial applicability in mobile terminals and stylus pens, and can be repeatedly applied.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method for providing a user interface that controls various contents in a mobile terminal by using a stylus pen paired with the mobile terminal. In order to achieve the above-mentioned or other objects, one aspect of the invention provides a stylus pen comprising: a wireless communication unit for transmitting/receiving data to/from at least one mobile terminal; a sensing unit for sensing an input signal; and a control unit for performing control so as to transmit a control signal to the at least one mobile terminal in response to the input signal.
PCT/KR2016/004063 2016-04-19 2016-04-19 Terminal mobile, stylet et procédé de commande associé WO2017183743A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2016/004063 WO2017183743A1 (fr) 2016-04-19 2016-04-19 Terminal mobile, stylet et procédé de commande associé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2016/004063 WO2017183743A1 (fr) 2016-04-19 2016-04-19 Terminal mobile, stylet et procédé de commande associé

Publications (1)

Publication Number Publication Date
WO2017183743A1 true WO2017183743A1 (fr) 2017-10-26

Family

ID=60116190

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/004063 WO2017183743A1 (fr) 2016-04-19 2016-04-19 Terminal mobile, stylet et procédé de commande associé

Country Status (1)

Country Link
WO (1) WO2017183743A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050122209A1 (en) * 2003-12-03 2005-06-09 Black Gerald R. Security authentication method and system
US20100221999A1 (en) * 2009-03-02 2010-09-02 Motorola, Inc. Method for selecting content for transfer or synchronization between devices
US20130091238A1 (en) * 2011-10-06 2013-04-11 Eric Liu Pen-based content transfer system and method thereof
US20130154956A1 (en) * 2011-12-20 2013-06-20 Htc Corporation Stylus Device
KR20150100470A (ko) * 2014-02-24 2015-09-02 삼성전자주식회사 컨텐츠 표시 방법 및 장치


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190101039A (ko) * 2018-02-22 2019-08-30 삼성전자주식회사 전자 펜을 통해 데이터를 전송하는 디스플레이 장치 및 이의 제어 방법
WO2019164143A1 (fr) * 2018-02-22 2019-08-29 삼성전자주식회사 Appareil d'affichage pour transmettre des données par l'intermédiaire d'un stylo électronique et son procédé de commande
US11194409B2 (en) 2018-02-22 2021-12-07 Samsung Electronics Co., Ltd. Display apparatus for transmitting data through electronic pen and control method thereof
KR102397892B1 (ko) * 2018-02-22 2022-05-13 삼성전자주식회사 전자 펜을 통해 데이터를 전송하는 디스플레이 장치 및 이의 제어 방법
EP3605284A1 (fr) * 2018-07-30 2020-02-05 Samsung Electronics Co., Ltd. Dispositif électronique comprenant un stylo numérique
US10990199B2 (en) 2018-07-30 2021-04-27 Samsung Electronics Co., Ltd. Electronic device including digital pen
CN112740152B (zh) * 2018-09-30 2023-09-08 华为技术有限公司 手写笔检测方法、***及相关装置
CN112740152A (zh) * 2018-09-30 2021-04-30 华为技术有限公司 手写笔检测方法、***及相关装置
US11899879B2 (en) 2018-09-30 2024-02-13 Huawei Technologies Co., Ltd. Stylus detection method, system, and related apparatus for switching frequencies for detecting input signals
WO2022065844A1 (fr) * 2020-09-24 2022-03-31 삼성전자 주식회사 Procédé d'affichage d'image de prévisualisation et appareil électronique le prenant en charge
CN112667126A (zh) * 2021-01-22 2021-04-16 深圳市绘王动漫科技有限公司 手写屏及其调节关闭屏幕菜单的方法
TWI831082B (zh) * 2021-11-11 2024-02-01 王士華 生物簽章驗證系統與生物簽章驗證方法
EP4180999A1 (fr) * 2021-11-11 2023-05-17 William Wang Procédé d'authentification d'un utilisateur par biométrie et signature manuscrite numérisée, et système mettant en oeuvre ce procédé

Similar Documents

Publication Publication Date Title
WO2017057803A1 (fr) Terminal mobile et son procédé de commande
WO2017030223A1 (fr) Terminal mobile à unité de carte et son procédé de commande
WO2020171287A1 (fr) Terminal mobile et dispositif électronique comportant un terminal mobile
WO2017082508A1 (fr) Terminal de type montre, et procédé de commande associé
WO2017047854A1 (fr) Terminal mobile et son procédé de commande
WO2015199270A1 (fr) Terminal mobile, et procédé de commande correspondant
WO2017090823A1 (fr) Terminal mobile enroulable et son procédé de commande
WO2017090826A1 (fr) Terminal mobile, et procédé de commande associé
WO2018105806A1 (fr) Terminal mobile et procédé de commande associé
WO2016035921A1 (fr) Terminal mobile et son procédé de commande
WO2017104860A1 (fr) Terminal mobile enroulable
WO2017099276A1 (fr) Terminal mobile enroulable et son procédé de commande
WO2017119529A1 (fr) Terminal mobile
WO2016032045A1 (fr) Terminal mobile et son procédé de commande
WO2017007064A1 (fr) Terminal mobile, et son procédé de commande
WO2017183743A1 (fr) Terminal mobile, stylet et procédé de commande associé
WO2017007045A1 (fr) Drone, terminal mobile, et procédé de commande de drone et de terminal mobile
WO2015133658A1 (fr) Dispositif mobile et son procédé de commande
WO2016039498A1 (fr) Terminal mobile et son procédé de commande
WO2016129778A1 (fr) Terminal mobile et procédé de commande associé
WO2018124343A1 (fr) Dispositif électronique
WO2017051959A1 (fr) Appareil de terminal et procédé de commande pour appareil de terminal
WO2015194694A1 (fr) Terminal mobile
WO2018030619A1 (fr) Terminal mobile
WO2016190484A1 (fr) Terminal mobile et procédé de commande associé

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16899512

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16899512

Country of ref document: EP

Kind code of ref document: A1