KR20150026693A - Method and system for controlling device - Google Patents

Method and system for controlling device

Info

Publication number
KR20150026693A
Authority
KR
South Korea
Prior art keywords
input
user
touch screen
window
mode
Prior art date
Application number
KR20130126674A
Other languages
Korean (ko)
Inventor
이성해
Original Assignee
이성해
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이성해 filed Critical 이성해
Publication of KR20150026693A publication Critical patent/KR20150026693A/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a terminal control method and a terminal control system. According to a first embodiment of the present invention, a method for controlling a terminal having a touch screen includes a step of sensing the setting of a one-finger mode, and a step of controlling a window displayed on the touch screen according to a touch input of the user when the mode of the terminal is set to the one-finger mode.

Description

METHOD AND SYSTEM FOR CONTROLLING DEVICE

The present invention relates to a terminal control method and a terminal control system, and more particularly, to a method and system for controlling a terminal having a touch screen.

2. Description of the Related Art

Recently, various types of electronic terminals have become popular, and a wide variety of content is provided to users through them.

Such electronic terminals commonly provide a touch screen as the user interface, which is now widely used in various terminals as an input method in place of conventional keypads, keyboards, and the like.

The touch screen is an advanced user interface in that the display screen, an output device, can also be used as an input device, allowing input to be performed more intuitively without a separate mechanical input button.

A touch screen is a screen equipped with a touch sensor panel so that it can recognize the position where a finger or a tool touches the screen. Although there are various technical implementations, recognition precision and response speed have gradually improved; in recent years, practical features such as handwriting recognition and multi-touch have been added, and demand for touch screens in personal terminals and mobile phones has grown rapidly.

Meanwhile, electronic terminals have recently tended to grow in size. Whereas the earlier trend was to make electronic terminals small enough to fit easily in one hand, screen sizes have increased so that content can be enjoyed more conveniently, and terminal sizes have increased with them. Because an electronic terminal is a device designed for mobility, a user must control it while gripping it. When the terminal is controlled with two hands, that is, held in one hand and operated with the other, its size is not a problem. However, when the terminal must be held and operated with a single hand, it is difficult to control a large terminal with the thumb alone, and control is especially difficult for users with small hands.

Korean Patent Application No. 2010-0128453, a prior art document, discloses a method and apparatus for controlling a zoom function in which a portable terminal adaptively toggles enlargement and reduction of screen data through user interaction and provides the zoom function in real time in a zoom mode activated by user interaction. That is, although this prior art discloses a method of controlling the zoom function of a portable terminal through touch input, it does not solve the above-mentioned problem of freely controlling the portable terminal with one hand while holding it with that same hand.

Therefore, a technique for solving the above-described problems is required.

Meanwhile, the background art described above is technical information that the inventor possessed for, or acquired in the course of, deriving the present invention, and is not necessarily a known technology disclosed to the general public before the filing of the present application.

An object of an embodiment of the present invention is to provide a terminal control method and a terminal control system.

It is another object of the present invention to provide a method and system for controlling a terminal having a touch screen.

According to a first aspect of the present invention, there is provided a method for controlling a terminal having a touch screen, the method comprising the steps of: detecting the setting of a one-finger mode; and controlling a window displayed on the touch screen according to a touch input of the user when the one-finger mode is set.

According to a second aspect of the present invention, there is provided a computer-readable recording medium on which a program for performing a method for controlling a terminal having a touch screen is recorded, the method comprising: detecting the setting of a one-finger mode; and controlling a window displayed on the touch screen according to a touch input of the user when the mode of the terminal is set to the one-finger mode.

According to a third aspect of the present invention, there is provided a terminal control system for controlling a terminal having a touch screen, the terminal control system comprising a mode setting unit configured to detect the setting of a one-finger mode and to control a window displayed on the touch screen according to a touch input of the user when the mode of the terminal is set to the one-finger mode.

According to any one of the above aspects of the present invention, an embodiment of the present invention can provide a terminal control method and a terminal control system.

Further, according to any one of the above aspects of the present invention, a method and system for controlling a terminal having a touch screen can be presented.

According to a preferred embodiment of the present invention, there are provided a terminal control method and a terminal control system by which input at a desired position of a window can be performed easily through movement of a cursor or of the window itself, without requiring a touch input directly at that position.

The effects obtainable from the present invention are not limited to those mentioned above, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.

FIG. 1 is a block diagram illustrating a terminal control system according to an embodiment of the present invention.
FIG. 2 is a flowchart for explaining a terminal control method according to an embodiment of the present invention.
FIGS. 3 to 7 are exemplary diagrams for explaining a terminal control method according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can readily carry them out. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly describe the present invention, parts not related to the description are omitted, and like parts are denoted by like reference characters throughout the specification.

Throughout the specification, when a part is said to be "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between. Also, when a part is said to "comprise" an element, this means that it may include other elements as well, rather than excluding them, unless specifically stated otherwise.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

In this specification, the 'tap' operation refers to pressing and releasing a specific point on the screen, and the 'drag' operation refers to moving to another point while remaining in contact with the screen after touching a specific point.

FIG. 1 is a block diagram illustrating a terminal control system 100 according to an embodiment of the present invention.

The terminal control system 100 is a system capable of controlling a terminal equipped with a touch screen.

The terminal control system 100 may be implemented as an electronic terminal. Here, the electronic terminal may be implemented as a portable terminal that can be connected to a remote server, or to other terminals and servers, through a network N. The portable terminal is, for example, a portable communication device with guaranteed portability and mobility, such as a PCS (Personal Communication System), GSM (Global System for Mobile communications), PDC (Personal Digital Cellular), PDA (Personal Digital Assistant), IMT (International Mobile Telecommunication)-2000, CDMA (Code Division Multiple Access)-2000, W-CDMA (Wideband CDMA), or WiBro (Wireless Broadband Internet) terminal, or a smart phone.

The terminal control system 100 may include an interface unit (not shown) for receiving an input from a user and displaying information to the user. At this time, when the terminal control system is implemented as an electronic terminal, the interface unit can be implemented with a touch screen.

The terminal control system 100 may also include a control unit (not shown) configured to control the operation of each of the above-described components. The control unit includes a series of electric circuits and the software that operates them, such as an operating system and application programs.

The terminal control system 100 may include a mode setting unit 110, a first mode processing unit 120, a second mode processing unit 130, a final input processing unit 140, and a content processing unit 150.

The mode setting unit 110 detects the setting of the one-finger mode and, when the mode of the terminal is set to the one-finger mode according to the detection result, controls the window displayed on the touch screen according to the touch input of the user.

Here, the 'one-finger mode' is the mode in which the terminal control method of the present invention is carried out. When the one-finger mode is set, the user's input can be processed according to the first mode or the second mode described below. Before the one-finger mode is set, the terminal may operate in a normal mode (i.e., a mode in which the user determines an input by tapping directly on the position to be tapped), as in the related art, and the terminal control method of the present invention may be carried out once the terminal is switched from the normal mode to the one-finger mode.

The setting of the one-finger mode can be detected through installation of an application, a touch input at a specific position, a setting of the electronic terminal, and the like. For example, when an application capable of setting the electronic terminal to the one-finger mode is installed on the electronic terminal, the electronic terminal can determine that the one-finger mode is set. Alternatively, for example, when a touch input occurs on at least a part of the boundary region of a window displayed on the touch screen of the electronic terminal, that is, an area including the line separating the inside of the window from the outside of the window, it can be determined that the one-finger mode is set. Alternatively, for example, it can be determined that the one-finger mode is set according to a setting of the electronic terminal or a setting made by the user.
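As a concrete illustration of the boundary-region check described above, the following Kotlin sketch decides whether a touch falls on the edge area of a window. The rectangle type, the 24-pixel border width, and the function names are illustrative assumptions and are not specified in the application.

```kotlin
// Minimal sketch: decide whether a tap should be treated as a boundary-region
// touch (which, per the description, can be used to enable one-finger mode).
// All names and the border thickness are illustrative assumptions.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

/** True if (x, y) lies inside the window and within `border` pixels of its edge. */
fun isOnWindowBoundary(window: Rect, x: Float, y: Float, border: Float = 24f): Boolean {
    val inside = x in window.left..window.right && y in window.top..window.bottom
    if (!inside) return false
    return x - window.left < border || window.right - x < border ||
           y - window.top < border || window.bottom - y < border
}

fun main() {
    val window = Rect(0f, 0f, 1080f, 1920f)
    println(isOnWindowBoundary(window, 10f, 500f))   // true: 10 px from the left edge
    println(isOnWindowBoundary(window, 540f, 960f))  // false: well inside the window
}
```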

When the one-finger mode is set as described above, the mode setting unit 110 can process the user's input according to at least one of the first mode and the second mode.

The first mode processing unit 120 is a module for processing a user input according to the first mode.

The first mode is a mode for providing a cursor according to a touch input of a user and determining the position of the cursor as a final input position of the user.

Here, the 'cursor' may be provided as a mark indicating the input position on the touch screen.

Unlike a typical touch screen input method, in which the cursor appears at the point where the user's input occurs, in the terminal control method according to an embodiment of the present invention the cursor can be displayed at an arbitrary point.

At this time, the cursor may be provided at the position where the user's input occurred, at a position predetermined by the electronic terminal (or the user), or at a position predetermined by the window (or application) to be controlled by the user. The cursor may also be provided at the location where the user has most frequently positioned it, or the terminal may communicate with a server located outside the electronic terminal to determine the location at which the cursor has most frequently been positioned.
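As one illustration of the frequency-based placement mentioned above, the following Kotlin sketch picks the cursor position the user has chosen most often from a locally kept history; the data type and function name are assumptions, and the server-assisted variant described above is not shown.

```kotlin
// Sketch: choose the initial cursor position as the point the user has most
// frequently placed the cursor at, based on a locally stored history.
// The Point type and function name are illustrative assumptions.

data class Point(val x: Float, val y: Float)

fun mostFrequentCursorPosition(history: List<Point>, fallback: Point): Point =
    history.groupingBy { it }          // count how often each position occurs
        .eachCount()
        .maxByOrNull { it.value }      // pick the most frequent one
        ?.key ?: fallback              // fall back to a predetermined point

fun main() {
    val history = listOf(Point(100f, 200f), Point(100f, 200f), Point(300f, 400f))
    println(mostFrequentCursorPosition(history, fallback = Point(0f, 0f)))
    // Point(x=100.0, y=200.0)
}
```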

A 'window' provides a user interface for an application executed on the electronic terminal and can be displayed through the touch screen. For example, the screen displayed by a mobile web browser when the browser is launched is a window.

The first mode processing unit 120 can recognize a tap as the first input of the user when the user taps a part of the window boundary or the boundary of the touch screen (for example, the line separating the electronic terminal body from the touch screen). Upon sensing such a first input, the first mode processing unit 120 may provide a cursor at any point on the touch screen and may move the cursor in accordance with a drag that continues from the first input. When a second input, which is an input ending the drag, is sensed, the position corresponding to the cursor can be determined as the final input position of the user.
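The first-mode flow just described (a boundary tap as the first input, a drag that moves a separately displayed cursor, and a release as the second input that commits the cursor position) could be organized roughly as in the following Kotlin sketch; the class name, event-handler names, and callback are assumptions made only for illustration.

```kotlin
// Sketch of the first-mode flow: a tap on the window/screen boundary is the
// first input, a drag moves an on-screen cursor, and lifting the finger
// (the second input) commits the cursor position as the final input position.

data class Point(val x: Float, val y: Float)

class FirstModeProcessor(
    initialCursor: Point,                       // e.g. a predetermined point of the window
    private val onFinalInput: (Point) -> Unit   // command dispatch at the final input position
) {
    private var cursor = initialCursor
    private var lastTouch: Point? = null

    fun onFirstInput(touch: Point) {            // tap on a boundary region
        lastTouch = touch                       // the cursor is shown at `cursor`, not at `touch`
    }

    fun onDrag(touch: Point) {                  // continuous drag following the first input
        val prev = lastTouch ?: return
        cursor = Point(cursor.x + (touch.x - prev.x), cursor.y + (touch.y - prev.y))
        lastTouch = touch
    }

    fun onSecondInput() {                       // drag ends: finger lifted
        lastTouch = null
        onFinalInput(cursor)                    // cursor position becomes the final input
    }
}

fun main() {
    val processor = FirstModeProcessor(Point(200f, 300f)) { println("final input at $it") }
    processor.onFirstInput(Point(60f, 1800f))   // thumb taps near the bottom edge
    processor.onDrag(Point(80f, 1750f))         // cursor follows the relative motion
    processor.onSecondInput()                   // prints: final input at Point(x=220.0, y=250.0)
}
```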

On the other hand, the second mode processing unit 130 is a module that processes a user input according to the second mode.

The second mode is a mode for moving the window according to the touch input of the user and determining the touch input position of the user with respect to the moved window as the final input position of the user.

When the second mode processing unit 130 detects the first input on at least a part of the touch screen or the window, it can move the window according to the amount of change of the drag position from the first input position. That is, the first input may be an input by a tap operation, and the drag operation may be an operation of moving to another point while remaining in contact with the screen, where the starting point of the drag is the first input point.

Here, when the user drags, the window can be moved in the direction opposite to the direction of the drag. In addition, when the user drags from point A to point B, the window can be moved by a distance obtained by applying predetermined multiples to the x-axis and y-axis differences between point A and point B. For example, if the upper-left coordinate of the touch screen is (0, 0), point A is (8, 0), and point B is (5, 2), the x-axis difference between A and B is 3 and the y-axis difference is 2. Applying a multiple of 2 to the x-axis difference gives 6, and applying a multiple of 0.5 to the y-axis difference gives 1, so the window is moved by 6 along the x-axis and 1 along the y-axis in the direction opposite to the drag; a point of the window originally at (0, 0) would thus move to (6, -1).
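A minimal Kotlin sketch of this displacement rule, using the same numbers as the worked example above; the multiples of 2 and 0.5 are taken from that example and are not fixed by the description.

```kotlin
// Sketch of the second-mode window displacement: the window moves opposite to
// the drag, with separate multiples applied to the x- and y-axis differences.
// The multiples are illustrative defaults borrowed from the worked example.

data class Offset(val dx: Float, val dy: Float)

fun windowOffset(
    ax: Float, ay: Float,        // first input point A
    bx: Float, by: Float,        // current drag point B
    xMultiple: Float = 2f,
    yMultiple: Float = 0.5f
): Offset {
    // Drag change is B - A; the window moves in the opposite direction, scaled.
    val dragDx = bx - ax
    val dragDy = by - ay
    return Offset(-dragDx * xMultiple, -dragDy * yMultiple)
}

fun main() {
    // A = (8, 0), B = (5, 2): the drag change is (-3, +2),
    // so the window is moved by (+6, -1), matching the example in the text.
    println(windowOffset(8f, 0f, 5f, 2f))   // Offset(dx=6.0, dy=-1.0)
}
```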

When the second mode processing unit 130 detects the second input, it may determine the position corresponding to the second input as the final input position of the user.

Accordingly, when the user taps a part of the window boundary or the boundary of the touch screen (for example, the line separating the electronic terminal body from the touch screen), the tap can be recognized as the first input of the user. When the second mode processing unit 130 recognizes a drag that continues from the first input, it can move the window according to the drag, and when it detects the second input, it can determine the position where the second input occurred as the final input position of the user.

On the other hand, the final input processing unit 140 may be configured to execute a command corresponding to a position where the final input of the user has been performed.

That is, whereas in the normal mode a command corresponding to the point tapped by the user is executed, in the first mode the final input processing unit 140 executes the command corresponding to the point at which the cursor is positioned, and in the second mode it executes the command corresponding to the point at which the second input occurred.
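One possible way to express this per-mode resolution is sketched below in Kotlin; the enum, the function, and its parameters are assumptions about how such a dispatch might be structured, not a form prescribed by the application.

```kotlin
// Sketch: resolve the final input position depending on the active mode,
// before the command at that position is executed.

enum class InputMode { NORMAL, FIRST, SECOND }

data class Point(val x: Float, val y: Float)

fun resolveFinalInput(mode: InputMode, tapPoint: Point, cursorPoint: Point, secondInputPoint: Point): Point =
    when (mode) {
        InputMode.NORMAL -> tapPoint          // command runs where the user tapped
        InputMode.FIRST  -> cursorPoint       // command runs where the cursor ended up
        InputMode.SECOND -> secondInputPoint  // command runs where the second input occurred
    }

fun main() {
    val p = resolveFinalInput(InputMode.FIRST, Point(10f, 10f), Point(300f, 420f), Point(50f, 60f))
    println(p)  // Point(x=300.0, y=420.0): in the first mode the cursor position wins
}
```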

Meanwhile, the content processing unit 150 may be configured to provide the content through at least a part of the margin appearing on the touch screen as the window moves.

Here, the 'margin' means the blank area that appears on the touch screen, at the position where the window was originally located, as the window moves. The 'content' may be any kind of information that can be provided through the touch screen, for example, a moving image, an image, or text.

The content processing unit 150 may treat the window as one layer and position a content layer containing the content beneath the window layer (i.e., overlap the window layer on the content layer); accordingly, as the window moves, the content layer can be displayed in the margin of the touch screen.

In addition, the content processing unit 150 may detect that a margin is generated according to the movement of the window, and may generate and provide a content layer that can be displayed through the margin.
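The layering idea can be pictured with a small Kotlin sketch: the window and the screen are treated as rectangles stacked in layers, and the content layer is considered visible exactly where the shifted window no longer covers the screen. The rectangle model and all names are assumptions for illustration.

```kotlin
// Sketch: the window is drawn as a layer above a content layer; whatever part
// of the screen the shifted window no longer covers is the margin, where the
// content layer becomes visible.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun shifted(dx: Float, dy: Float) = Rect(left + dx, top + dy, right + dx, bottom + dy)
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

/** True if screen point (x, y) falls in the margin, i.e. the content layer shows through there. */
fun contentVisibleAt(screen: Rect, window: Rect, dx: Float, dy: Float, x: Float, y: Float): Boolean {
    val movedWindow = window.shifted(dx, dy)
    return screen.contains(x, y) && !movedWindow.contains(x, y)
}

fun main() {
    val screen = Rect(0f, 0f, 1080f, 1920f)
    val window = Rect(0f, 0f, 1080f, 1920f)          // window initially fills the screen
    // After the window is dragged right and up by (200, -400), the strip it
    // used to cover on the left and at the bottom becomes margin.
    println(contentVisibleAt(screen, window, 200f, -400f, 50f, 1900f))  // true: margin
    println(contentVisibleAt(screen, window, 200f, -400f, 600f, 800f))  // false: still window
}
```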

The terminal control method according to the embodiment shown in FIG. 2 includes steps processed in time series by the terminal control system 100 shown in FIG. 1. Therefore, the description given above of the terminal control system 100 shown in FIG. 1 also applies to the terminal control method according to the embodiment shown in FIG. 2, even where it is omitted below.

FIGS. 3 to 7 illustrate examples of the touch screen of an electronic terminal 10 in which the terminal control system 100 according to an embodiment of the present invention is implemented.

First, when the setting of the one-finger mode is sensed (S210), the terminal control system 100 can set the mode for terminal control to the one-finger mode.

When the mode of the terminal is set to the one-finger mode, and in particular when it is set to the first mode, the terminal control system 100 can process the user's touch input according to the first mode (S220).

The terminal control system 100 may sense a first input on at least a portion of a window or touch screen (S230) and may provide a cursor on any point of the window (S231).

That is, as shown in FIG. 3, the window 300 may be displayed through the touch screen 11 of the terminal 10. And, as shown in FIG. 4, upon sensing the first input by the user's thumb 12, the cursor 400 can be provided at any point in the window.

When the user performs a continuous drag operation with the first input, the terminal control system 100 can move the cursor (S232).

That is, as shown in FIG. 5, the cursor on the window can be moved according to the drag input of the user, and when the second input is detected, that is, when the end of the drag is detected, the position corresponding to the cursor can be processed as the final input position of the user.

Meanwhile, when the mode of the terminal is set to the one-finger mode, and in particular when it is set to the second mode, the window displayed on the touch screen can be controlled according to the touch input of the user (S220).

A first input on the window or at least a portion of the touch screen may be sensed (S240).

Then, the window can be moved according to a continuous drag with the first input (S241).

That is, as shown in FIG. 3, the window 300 can be displayed through the touch screen 11 of the terminal 10. When a first input by the user is detected and a drag continues from the first input, the terminal control system 100 may move the window 600, as shown in FIG. 6.

Content 720 may also be displayed on at least a portion of the margin 710 that appears as the window 700 is moved according to the drag.

As shown in FIGS. 6 and 7, the window moves as the user performs a drag continuing from the first input. When the user's thumb reaches the position to be tapped, the user can lift the thumb. The input at the moment the thumb is released can be detected as the second input, and the position corresponding to the thumb can be determined as the final input position of the user (S242).

When the final input position of the user is determined according to the above-described method, an instruction corresponding to the final input position may be executed (S250). That is, the terminal control system 100 can process the instruction as if the user had directly tapped at the final input position.

For example, as shown in FIG. 7, when the final input position of the user falls on an article 730 included in the news category, the terminal control system 100 judges that the user has tapped the article 730 directly and executes the corresponding command (such as loading the content of the article 730).

The terminal control method according to the embodiment described with reference to FIG. 2 can also be implemented in the form of a recording medium including instructions executable by a computer such as a program module executed by a computer. Computer readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media. In addition, the computer-readable medium may include both computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Communication media typically includes any information delivery media, including computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transport mechanism.

It will be understood by those skilled in the art that the foregoing description of the present invention is for illustrative purposes only, and that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention. It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive. For example, each component described as a single entity may be implemented in a distributed manner, and components described as distributed may be implemented in combined form.

The scope of the present invention is defined by the appended claims rather than by the detailed description, and all changes or modifications derived from the meaning and scope of the claims and their equivalents are to be construed as being included within the scope of the present invention.

100: terminal control system
110: mode setting unit
120: first mode processing unit
130: second mode processing unit
140: final input processing unit
150: content processing unit

Claims (15)

A method for controlling a terminal having a touch screen,
Detecting a setting of the one-finger mode; and
Controlling a window displayed on the touch screen according to a touch input of the user when the mode of the terminal is set to the one-finger mode.
The method according to claim 1,
The step of controlling the window comprises:
Sensing a user's first input on at least a portion of the touch screen;
Providing a cursor on any point of the touch screen;
Moving the cursor on the touch screen in accordance with a continuous drag with the first input; and
When the second input of the user is sensed, processing the position corresponding to the cursor as the final input position of the user.
The method according to claim 1,
The step of controlling the window comprises:
Sensing a user's first input on at least a portion of the touch screen;
Moving the window according to a continuous drag with the first input; and
When the second input of the user is sensed, processing the position corresponding to the second input as the final input position of the user.
The method of claim 3,
The step of moving the window comprises:
Moving the window in a direction symmetrical to the drag direction.
The method of claim 3,
The step of moving the window comprises:
When the input of the user moves from the first input point A to a point B in accordance with the drag, moving the window in accordance with an x-axis variation amount and a y-axis variation amount obtained by applying a predetermined multiple to each of the x-axis variation amount and the y-axis variation amount between the point A and the point B.
The method of claim 3,
Further comprising providing content through at least a portion of a margin that appears on the touch screen as the window is moved.
The method according to claim 6,
Wherein the providing of the content comprises:
Placing a content layer including the content beneath the window; and
Displaying the content layer through a margin appearing on the touch screen as the window is moved.
The method according to claim 6,
Wherein the providing of the content comprises:
Generating a content layer including content based on a margin appearing on the touch screen as the window is moved, and displaying the content layer through the margin.
The method according to claim 2 or 3,
Wherein sensing a user's first input on at least a portion of the touch screen comprises:
Sensing a user's first input on a boundary of the touch screen or on a boundary of the window.
The method according to claim 2 or 3,
Wherein the second input is an input sensing that the drag operation has been terminated.
A computer-readable recording medium on which a program for carrying out the method according to claim 1 is recorded.
A terminal control system for controlling a terminal having a touch screen,
Comprising a mode setting unit configured to detect a setting of the one-finger mode and to control a window displayed on the touch screen according to a touch input of the user when the mode of the terminal is set to the one-finger mode.
The terminal control system of claim 12,
Further comprising a first mode processing unit configured to sense a first input of the user on at least a portion of the touch screen, provide a cursor at any point on the touch screen, move the cursor on the touch screen in accordance with a continuous drag with the first input, and, upon sensing a second input of the user, process the position corresponding to the cursor as the final input position of the user.
The terminal control system of claim 12,
Further comprising a second mode processing unit configured to sense a first input of the user on at least a portion of the touch screen, move the window in accordance with a continuous drag with the first input, and, upon sensing a second input of the user, process the position corresponding to the second input as the final input position of the user.
The terminal control system of claim 12,
Further comprising a content processor configured to provide content over at least a portion of a margin appearing on the touch screen as the window is moved.

KR20130126674A 2013-09-03 2013-10-23 Method and system for controlling device KR20150026693A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130105179 2013-09-03
KR20130105179 2013-09-03

Publications (1)

Publication Number Publication Date
KR20150026693A true KR20150026693A (en) 2015-03-11

Family

ID=53022569

Family Applications (1)

Application Number Title Priority Date Filing Date
KR20130126674A KR20150026693A (en) 2013-09-03 2013-10-23 Method and system for controlling device

Country Status (1)

Country Link
KR (1) KR20150026693A (en)


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E601 Decision to refuse application