WO2016158125A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2016158125A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
key
series
display
input
Prior art date
Application number
PCT/JP2016/055813
Other languages
French (fr)
Japanese (ja)
Inventor
Kyohei Eguchi (恭平 江口)
Original Assignee
Sharp Corporation (シャープ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corporation (シャープ株式会社)
Publication of WO2016158125A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to an electronic device that accepts a user operation.
  • the user can perform intuitive operations (tap operation, drag operation, pinch operation, etc.) by operating the touch panel.
  • to make an electronic device that has both a touch panel and hard keys perform a desired process, the user must combine touch panel operations with hard key operations, which is cumbersome. Accordingly, there is a problem that the convenience of an electronic device having a touch panel is reduced.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2014-85858
  • in Patent Document 1, the user controls a display image by performing an operation on one of a plurality of operation detection units while simultaneously performing an operation on another operation detection unit.
  • the present disclosure has been made in view of the above-described problems, and an object thereof is to provide an electronic device having excellent operability.
  • the electronic device includes an operation receiving unit for receiving an operation for inputting information to the electronic device and a control unit for controlling the electronic device.
  • the operation reception unit includes a plurality of operation keys and a touch pad that is provided over the plurality of operation keys and is operated by an object.
  • the control unit executes a process based on the content of a series of input operations when the operation content on the touch pad received by the operation receiving unit indicates such a series, consisting of a depression of at least one of the plurality of operation keys followed by a swipe operation by the object performed while the object remains in contact with the input unit.
  • FIG. 1 is a perspective view showing the appearance of the mobile phone 1 according to Embodiment 1.
  • FIG. 2 is a diagram for explaining the hardware configuration of the mobile phone 1 of FIG. 1.
  • FIG. 3 is a diagram schematically showing the functional configuration according to Embodiment 1.
  • FIG. 4 is a flowchart for explaining the pinch-operation image display process according to Embodiment 1.
  • FIG. 5 is a flowchart for explaining the window movement display process according to Embodiment 1.
  • FIG. 1 is a perspective view showing the appearance of the mobile phone 1.
  • the mobile phone 1 includes an upper housing 101, a lower housing 102, and a hinge portion 103.
  • the upper housing 101 and the lower housing 102 are connected by a hinge portion 103 so as to be foldable.
  • Each of the upper housing 101 and the lower housing 102 has a substantially rectangular parallelepiped shape.
  • the upper housing 101 includes a display 160 made of liquid crystal or the like.
  • the lower housing 102 includes an operation accepting unit 130 for accepting an information input operation on the mobile phone 1 by the user.
  • in Embodiment 1, the operation reception unit 130 includes electrostatic keys.
  • the electronic device is not limited to the mobile phone 1 and may be, for example, a tablet terminal, a personal computer, a digital camera, an audio player, a smartphone, a wearable terminal, or the like.
  • a virtual axis extending in the longitudinal direction of each of the upper casing 101 and the lower casing 102 is referred to as a Y axis.
  • An imaginary axis extending along the short direction of each of the upper casing 101 and the lower casing 102 and orthogonal to the Y axis is referred to as an X axis.
  • An intersection between the X axis and the Y axis is referred to as a point O.
  • the direction along the Y axis on the side where the upper housing 101 is located is referred to as the "upward" direction, and the opposite direction along the Y axis is referred to as the "downward" direction.
  • the direction in which the X axis extends and the point O (see FIG. 1) is located is referred to as “left direction”, and the direction opposite to “left direction” is referred to as “right direction”.
  • the surface of the lower housing 102 that faces the display surface of the display 160 when the upper housing 101 and the lower housing 102 are folded is referred to as a main surface.
  • the operation receiving unit 130 receives a user operation for inputting information to the mobile phone 1.
  • the operation accepting unit 130 includes a numeric keypad 300 consisting of a plurality of keys arranged two-dimensionally (in a grid pattern), a touch pad 350 (the shaded portion in the drawing), and other operation keys such as a cross key.
  • the touch pad 350 is provided so as to overlap the numeric keypad 300 and other keys.
  • a touch pad 350 is provided on the exposed surface side (front surface side) of the numeric keypad 300 and other keys. In other words, the numeric keypad 300 and other keys are covered with the touch pad 350.
  • the touch pad 350 may be installed on the back side of the numeric keypad 300 and other keys.
  • the operation reception unit 130 detects operations on the touch pad 350, such as depressions of the numeric keypad 300 and other keys, by an object such as a finger or a stylus pen.
  • an operation on the operation receiving unit 130 will be described as an operation with a finger among the above objects.
  • the position of each key in the two-dimensional array is indicated by (X, Y) coordinates.
  • This (X, Y) coordinate is hereinafter referred to as a key position.
  • the touch pad 350 has a pattern in which the electrodes of a plurality of touch sensors are arranged in a matrix. Each key of the numeric keypad 300 is mounted so as to overlap its corresponding touch sensor of the touch pad 350.
  • the detection method of each touch sensor is a method of detecting a change in electrostatic capacitance between electrodes caused by the approach of a finger.
  • when a touch sensor is operated by a finger, it detects the change in capacitance between the electrodes caused by the approach of the finger.
  • the operation reception unit 130 determines the position (X, Y) on the matrix of the operated touch sensor based on the detection signal.
  • the position (X, Y) based on the detection signal when the key of the numeric keypad 300 is operated corresponds to the key position (X, Y) of the operated key.
  • the operation reception unit 130 determines the type of operation based on the magnitude of the change in capacitance indicated by the detection signal.
  • the type of operation is either a key pressing operation (click operation) or a touch operation.
  • the operation reception unit 130 outputs operation type data 131 indicating the determined type of operation and position data 132 indicating the matrix position (X, Y) of the operated touch sensor to the operation key driver 140 (see FIG. 2).
  • regarding key operation on the numeric keypad 300, the user can instruct the mobile phone 1 to execute a process based on the content of a series of input operations by performing a series of operations consisting of a key depression (corresponding to a click operation) followed by a swipe operation.
  • the swipe operation is an operation in which the finger slides over a region of the surface of the touch pad 350 of the operation reception unit 130 while remaining in contact with that surface (touch operation).
  • FIG. 2 is a diagram for explaining a hardware configuration of the mobile phone 1.
  • the mobile phone 1 includes a CPU (Central Processing Unit) 110 for controlling the mobile phone 1, a memory 120, the operation receiving unit 130, an operation key driver 140, a display driver 150, and a display 160.
  • Display driver 150 controls display 160 to display information.
  • the operation reception unit 130 includes at least the numeric keypad 300 including a plurality of keys 301, 302, and the touch pad 350.
  • the memory 120 stores various data and programs.
  • the operation key driver 140 is a driver circuit for the operation receiving unit 130.
  • the operation key driver 140 controls the operation of the touch pad 350 of the operation receiving unit 130 and receives from the operation receiving unit 130 the operation type data 131 indicating the type of operation on the touch pad 350 and the position data 132 indicating the operated position (X, Y).
  • the operation key driver 140 outputs a set of the operation type data 131 and the position data 132 received together with the operation type data 131 to the CPU 110.
  • when the CPU 110 receives a set of operation type data 131 and position data 132 from the operation key driver 140, it executes a predetermined process based on the set.
  • FIG. 3 is a diagram schematically illustrating a configuration of functions according to the first embodiment.
  • CPU 110 includes an operation detection unit 111 and a process determination unit 113 as functions.
  • the operation detection unit 111 receives a set of operation type data 131 and position data 132 from the operation key driver 140.
  • the operation detection unit 111 determines whether the operation is a push-down operation on at least one operation key of the plurality of operation keys of the numeric keypad 300 or a swipe operation on the touch pad 350 based on the received set.
  • when the determination result from the operation detection unit 111 indicates that a series of input operations consisting of a key depression followed by a swipe operation has been received, the processing determination unit 113 executes a process based on the content of that series of operations. Specifically, the process determining unit 113 generates a display control signal and outputs it to the display driver 150 so that the display process corresponding to the series of input operations is executed.
  • the display driver 150 controls the display 160 based on the display control signal.
  • FIG. 4 is a flowchart for explaining the image display processing of the pinch operation according to the first embodiment.
  • a program according to this flowchart is stored in the memory 120.
  • the CPU 110 reads the program from the memory 120 and executes it, thereby realizing the process. The program is executed repeatedly at regular intervals. A specific process for the user's input operations is described with reference to the flowchart of FIG. 4.
  • the operation detection unit 111 determines whether or not a touch operation has been performed on the touch pad 350 based on an input from the operation key driver 140 (a set of operation type data 131 and position data 132) (step S1).
  • when the operation detection unit 111 determines that the input operation is a touch operation on the touch pad 350 (YES in step S1), the CPU 110 executes the predetermined process for a normal touch panel operation (step S3). Thereafter, the process ends.
  • when the operation detection unit 111 determines that the input operation is not a touch operation on the touch pad 350 (NO in step S1), that is, that it is a key depression, the operation detection unit 111 outputs the input from the operation key driver 140 (the set of operation type data 131 and position data 132) to the process determining unit 113.
  • the processing determination unit 113 determines whether or not the input indicates a depression of the "9" key 301, based on whether the position data 132 of the input from the operation detection unit 111 (the set of operation type data 131 and position data 132) indicates the (X, Y) position of the "9" key 301 (step S2). If the input is determined not to indicate a depression of the "9" key 301 (NO in step S2), the process returns to step S1.
  • on the other hand, when the process determining unit 113 determines that the input from the operation detecting unit 111 indicates a depression of the "9" key 301 (YES in step S2), it then determines, based on the input from the operation key driver 140 received via the operation detecting unit 111 (the set of operation type data 131 and position data 132), whether or not the finger has left the touch pad 350 (step S4).
  • specifically, the process determining unit 113 analyzes the time-series input from the operation key driver 140 (sets of operation type data 131 and position data 132) over a predetermined time and determines from the analysis result whether the finger has left the touch pad 350 (step S4). The finger is judged to have left when the time-series input satisfies the condition that the operation type data 131 indicates neither a "touch operation" nor a "depression operation" and the position data 132 shows an indefinite value.
  • when the process determining unit 113 determines that the time-series input from the operation key driver 140 satisfies the above condition, it determines that the finger has left the touch pad 350 (YES in step S4). In that case, the CPU 110 executes the normal process corresponding to the depression of the "9" key 301 (step S5), and the process ends.
  • on the other hand, when the process determining unit 113 determines that the time-series input from the operation key driver 140 does not satisfy the above condition, it determines that the finger has not left the touch pad 350 (NO in step S4). In that case, the process determining unit 113 determines, based on the time-series input from the operation key driver 140 (sets of operation type data 131 and position data 132), whether or not the user has performed a swipe operation (step S6).
  • specifically, when the operation type data 131 of the time-series input indicates a "touch operation" and the positions (X, Y) indicated by the time-series position data 132 are changing, the processing determining unit 113 determines that the user has performed a swipe operation (YES in step S6). If the time-series input is not determined to be a swipe operation (NO in step S6), the process ends.
  • when the time-series input is determined to be a swipe operation (YES in step S6), the process determining unit 113 determines that the series of input operations consisting of the depression of the "9" key 301 and the subsequent swipe operation constitutes a "pinch operation". It then generates a display control signal corresponding to the pinch operation and outputs the generated display control signal to the display driver 150 (step S7).
  • the pinch operation corresponds to an operation for changing the enlargement ratio of the display image on the display 160, and the display control signal is a signal for controlling the display so that the enlargement ratio of the image changes.
  • the display driver 150 controls the display 160 based on the display control signal so that the enlargement ratio of the display image changes.
  • in this way, when the CPU 110 receives a series of input operations consisting of a depression of the "9" key 301 and a swipe operation following the depression, it performs the pinch-operation process, that is, the process of changing the enlargement ratio of the display image, based on the content of that series of input operations.
  • the content of the image enlargement process described above, that is, the parameter that determines the enlargement ratio, is changed based on the operation amount of the swipe operation.
  • when the process determining unit 113 receives the above series of input operations, it detects the operation amount of the swipe operation and sets, based on the detected amount, the enlargement-ratio parameter of the image enlargement process carried in the display control signal.
  • the operation amount includes a finger movement direction and a movement distance in the swipe operation.
  • based on the time-series input of the swipe operation (sets of operation type data 131 and position data 132), the process determining unit 113 determines whether the touch position has moved from the position (X, Y) of the "9" key 301 detected in step S2 to a position in the "upward" or "downward" direction while the touch operation was maintained.
  • when it determines that the touch position has moved "upward", the process determining unit 113 generates a display control signal whose enlargement-ratio parameter increases the enlargement ratio. On the other hand, when it determines that the touch position has moved "downward" from the "9" key position detected in step S2, the process determining unit 113 generates a display control signal whose parameter decreases the enlargement ratio.
  • further, based on the time-series input of the swipe operation (sets of operation type data 131 and position data 132), the process determination unit 113 calculates the distance (the movement amount of the finger) from the first key position in the time series (the key position (X, Y) of "9") to the last touch position (X, Y) in the time series. When the enlargement ratio is to be increased, the parameter is changed so that the enlargement ratio becomes larger as the calculated distance becomes longer; when the enlargement ratio is to be decreased, the parameter is changed so that the enlargement ratio becomes smaller (the image is reduced) as the distance becomes shorter.
  • the display driver 150 controls the display operation for image enlargement of the display 160 based on the above parameters included in the display control signal.
  • FIG. 5 is a flowchart for explaining window moving display processing according to the first embodiment.
  • a program according to this flowchart is stored in the memory 120.
  • the CPU 110 reads the program from the memory 120 and executes the read program, thereby realizing processing. This program is repeatedly executed periodically.
  • compared with the flowchart of FIG. 4, the flowchart of FIG. 5 differs in steps S2a, S5a, and S7a; the other steps of FIG. 5 are the same as those of FIG. 4. The description of FIG. 5 therefore focuses on the differing steps.
  • when the operation detection unit 111 determines, based on the operation type data 131 of the input from the operation key driver 140 (the set of operation type data 131 and position data 132), that the input operation is not a touch operation on the touch pad 350 (NO in step S1), that is, that it is a key depression, the operation detection unit 111 outputs the input from the operation key driver 140 to the process determination unit 113.
  • the process determination unit 113 analyzes the input from the operation detection unit 111 (the set of operation type data 131 and position data 132) and determines from the analysis result whether or not the input indicates a depression of the "5" key 302 (step S2a). If the input is determined not to indicate a depression of the "5" key 302 (NO in step S2a), the process returns to step S1.
  • on the other hand, when the process determining unit 113 determines from the above analysis result that the input indicates a depression of the "5" key 302 (YES in step S2a), it then determines, based on the input from the operation key driver 140 received via the operation detecting unit 111 (the set of operation type data 131 and position data 132), whether or not the finger has left the touch pad 350 (step S4). When the process determining unit 113 determines that the finger has left the touch pad 350 (YES in step S4), the CPU 110 executes the normal process corresponding to the depression of the "5" key 302 (step S5a), and the process ends.
  • otherwise, the process determining unit 113 determines whether or not the user has performed a swipe operation (step S6). When it determines that the user has performed a swipe operation (YES in step S6), the process determining unit 113 determines that the series of input operations consisting of the depression of the "5" key 302 and the subsequent swipe operation constitutes a "window moving operation" by dragging.
  • the process determining unit 113 generates a display control signal for the window moving operation, and outputs the generated display control signal to the display driver 150 (step S7a).
  • the window moving operation corresponds to an operation for changing the display position of the window image displayed on the display 160.
  • the display control signal described above indicates a signal for controlling the display of the display 160 so that the display position of the window image is changed.
  • the display driver 150 controls the display 160 based on the display control signal so that the window image being displayed is moved.
  • the content of the window image moving process, that is, the parameter that determines the movement amount (movement direction and movement distance), is changed based on the operation amount of the swipe operation.
  • when the process determining unit 113 receives the above series of input operations, it detects the operation amount of the swipe operation and sets, based on the detected amount, the values of the parameters in the display control signal that specify the movement direction and movement distance of the window image.
  • the operation amount includes a finger movement direction and a movement distance in the swipe operation.
  • based on the time-series input of the swipe operation (sets of operation type data 131 and position data 132), the process determining unit 113 determines in which of the "upward", "downward", "rightward", or "leftward" directions the touch position has moved from the position (X, Y) of the "5" key 302 detected in step S2a while the touch operation was maintained.
  • when it determines that the touch position has moved "upward", the process determining unit 113 generates a display control signal having a parameter for moving the window image "upward" from its current position. Likewise, when it determines that the touch position has moved "downward", "rightward", or "leftward", the process determining unit 113 generates a display control signal having a parameter for moving the window image in the "downward", "rightward", or "leftward" direction, respectively, from its current position.
  • further, based on the time-series input of the swipe operation (sets of operation type data 131 and position data 132), the process determining unit 113 calculates the distance (the movement distance of the finger) from the first key position in the time series (the position (X, Y) of the "5" key 302) to the last touch position (X, Y) in the time series. The longer the calculated distance, the farther from its current position the generated display control signal moves the window image; the shorter the distance, the nearer to its current position it is moved.
  • the display driver 150 controls the display operation for moving the window image of the display 160 based on the above parameters included in the display control signal.
  • in this way, which touch panel operation (pinch, drag, etc.) a series of operations is interpreted as can be changed depending on the type of the depressed key.
  • the touch panel operation that can be performed is not limited to the pinch operation and the drag operation.
  • the detection method using the touch sensor of the touch pad 350 is a method for detecting a change in capacitance, but is not limited to this detection method.
  • even if the touch pad 350 is of a type that detects a change in pressure in response to a touch operation or a depression operation, the configuration and functions of Embodiment 1 can be realized.
  • the program according to the flowcharts can be recorded on a recording medium readable by the CPU 110 via a memory driver (not shown), such as a memory card (not shown) attached to the mobile phone 1, and provided to the mobile phone 1 as a program product. Alternatively, the program can be provided by being received via an antenna (not shown) of the mobile phone 1 over a network and downloaded into the storage area of the memory 120.
  • the provided program product includes the program itself and a recording medium on which the program is non-transitorily recorded.
  • as described above, by performing the series of operations on the touch pad 350 alone, the user can give the mobile phone 1 commands (information) equivalent to a pinch operation, a drag operation, and the like. Further, by changing the operation amount of the series of operations, the user can give the mobile phone 1 commands equivalent to changing the operation amount of a pinch or drag operation. Moreover, since the series of operations is performed on the touch pad 350, the user can perform it with one finger while holding the mobile phone 1 in one hand. The above embodiment therefore provides a mobile phone 1 with excellent operability.
  • 1 mobile phone, 111 operation detection unit, 113 process determination unit, 120 memory, 130 operation reception unit, 131 operation type data, 132 position data, 140 operation key driver, 150 display driver, 160 display, 300 numeric keypad, 350 touch pad.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This electronic device is provided with an operation acceptance unit (130) for accepting operations that input information to the electronic device, and a control unit for controlling the electronic device. The operation acceptance unit includes a plurality of operation keys and a touch pad (350) provided on top of the plurality of operation keys and operated by an object. When the content of an operation on the touch pad accepted by the operation acceptance unit indicates a series of input operations consisting of a depressing operation on at least one of the plurality of operation keys and a subsequent swipe operation performed while the object remains in contact with the input unit, the control unit executes a process on the basis of the content of the series of input operations.

Description

Electronic device
The present invention relates to an electronic device that accepts user operations.
In an electronic device equipped with a touch panel, the user can perform intuitive operations (tap, drag, pinch, and the like) by operating the touch panel.
However, it is difficult to make an electronic device perform all of its processes (functions) through touch panel operations alone. To cope with this difficulty, electronic devices often include hard keys together with a touch panel. By performing key operations such as pressing a hard key, the user can make the electronic device perform processes that cannot be performed by touch panel operation.
To make an electronic device that has both a touch panel and hard keys perform a desired process, the user must therefore combine touch panel operations with hard key operations, which is cumbersome. Accordingly, there is a problem that the convenience of an electronic device having a touch panel is reduced.
In relation to this problem, the electronic device disclosed in Patent Document 1 (Japanese Patent Application Laid-Open No. 2014-85858) includes a plurality of operation detection units arranged at different positions. In Patent Document 1, the user controls a display image by performing an operation on one of the operation detection units while simultaneously performing an operation on another.
Patent Document 1: Japanese Patent Application Laid-Open No. 2014-85858
However, with the electronic device of Patent Document 1, the user must simultaneously operate two operation detection units arranged at different positions. The device therefore has to be operated with multiple fingers and, depending on its shape, with both hands, which is inconvenient. Thus, even the electronic device of Patent Document 1 cannot solve the problem of reduced convenience described above.
The present disclosure has been made in view of the above problems, and its object is to provide an electronic device with excellent operability.
According to one embodiment, the electronic device includes an operation receiving unit for receiving operations that input information to the electronic device and a control unit for controlling the electronic device. The operation receiving unit includes a plurality of operation keys and a touch pad that is provided over the plurality of operation keys and is operated by an object. When the operation content on the touch pad received by the operation receiving unit indicates a series of input operations consisting of a depression of at least one of the plurality of operation keys followed by a swipe operation by the object performed while the object remains in contact with the input unit, the control unit executes a process based on the content of that series of input operations.
FIG. 1 is a perspective view showing the appearance of the mobile phone 1 according to Embodiment 1. FIG. 2 is a diagram for explaining the hardware configuration of the mobile phone 1 of FIG. 1. FIG. 3 is a diagram schematically showing the functional configuration according to Embodiment 1. FIG. 4 is a flowchart for explaining the pinch-operation image display process according to Embodiment 1. FIG. 5 is a flowchart for explaining the window movement display process according to Embodiment 1.
Hereinafter, a mobile phone will be described as an example of an electronic device with reference to the drawings.
[Embodiment 1]
FIG. 1 is a perspective view showing the appearance of the mobile phone 1. Referring to FIG. 1, the mobile phone 1 includes an upper housing 101, a lower housing 102, and a hinge portion 103. The upper housing 101 and the lower housing 102 are connected by the hinge portion 103 so as to be foldable. Each of the upper housing 101 and the lower housing 102 has a substantially rectangular parallelepiped shape. The upper housing 101 includes a display 160 made of liquid crystal or the like. The lower housing 102 includes an operation receiving unit 130 for accepting the user's information input operations on the mobile phone 1. In Embodiment 1, the operation receiving unit 130 includes electrostatic keys.
Although the mobile phone 1 of FIG. 1 is a folding type, it may be a flat type that cannot be folded. Further, the electronic device is not limited to the mobile phone 1 and may be, for example, a tablet terminal, a personal computer, a digital camera, an audio player, a smartphone, or a wearable terminal.
In Embodiment 1, a virtual axis extending along the longitudinal direction of each of the upper housing 101 and the lower housing 102 is called the Y axis. A virtual axis extending along the short direction of each housing and orthogonal to the Y axis is called the X axis. The intersection of the X axis and the Y axis is called point O.
The direction along the Y axis on the side where the upper housing 101 is located is called the "upward" direction, and the opposite direction along the Y axis is called the "downward" direction. The direction along the X axis on the side where point O (see FIG. 1) is located is called the "left" direction, and the opposite direction is called the "right" direction.
The surface of the lower housing 102 that faces the display surface of the display 160 when the upper housing 101 and the lower housing 102 are folded is called the main surface.
(Configuration of the operation receiving unit 130)
The operation receiving unit 130 receives user operations for inputting information to the mobile phone 1. Specifically, the operation receiving unit 130 includes a numeric keypad 300 consisting of a plurality of keys arranged two-dimensionally (in a grid pattern), a touch pad 350 (the shaded portion in the drawing), and other operation keys such as a cross key. The touch pad 350 is provided so as to overlap the numeric keypad 300 and the other keys. Typically, the touch pad 350 is installed on the exposed (front) side of the numeric keypad 300 and the other keys; in other words, the keys are covered by the touch pad 350. The touch pad 350 may instead be installed on the back side of the numeric keypad 300 and the other keys.
The operation receiving unit 130 detects operations on the touch pad 350, such as depressions of the numeric keypad 300 and the other keys, by an object such as a finger or a stylus pen. In the following, operations on the operation receiving unit 130 are described as operations with a finger.
On the numeric keypad 300, the position of each key in the two-dimensional array is expressed by (X, Y) coordinates. These (X, Y) coordinates are hereinafter called the key position.
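For illustration only (the patent itself contains no code), a key layout of this kind can be modeled as a mapping from matrix coordinates to key labels. The Python sketch below assumes a hypothetical 3x4 grid with Y increasing "upward"; the patent does not give concrete coordinates.

```python
from typing import Optional, Tuple

# Hypothetical 3x4 layout; the actual coordinates of the numeric keypad 300
# are not specified in the patent.
KEY_POSITIONS = {
    (0, 3): "1", (1, 3): "2", (2, 3): "3",
    (0, 2): "4", (1, 2): "5", (2, 2): "6",
    (0, 1): "7", (1, 1): "8", (2, 1): "9",
    (0, 0): "*", (1, 0): "0", (2, 0): "#",
}

def key_at(position: Tuple[int, int]) -> Optional[str]:
    """Return the key label at key position (X, Y), or None if no key is there."""
    return KEY_POSITIONS.get(position)
```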
In the operation receiving unit 130, the touch pad 350 has a pattern in which the electrodes of a plurality of touch sensors are arranged in a matrix. Each key of the numeric keypad 300 is mounted so as to overlap its corresponding touch sensor of the touch pad 350. In Embodiment 1, each touch sensor detects the change in electrostatic capacitance between its electrodes caused by the approach of a finger.
When a touch sensor is operated by a finger, the touch sensor detects the change in capacitance between the electrodes caused by the approach of the finger.
When a detection signal is input from a touch sensor, the operation receiving unit 130 determines the matrix position (X, Y) of the operated touch sensor based on the detection signal. The position (X, Y) obtained when a key of the numeric keypad 300 is operated corresponds to the key position (X, Y) of the operated key. The operation receiving unit 130 also determines the type of operation based on the magnitude of the capacitance change indicated by the detection signal. Here, to simplify the description, the type of operation is assumed to be either a key depression (click operation) or a touch operation. The operation receiving unit 130 outputs operation type data 131 indicating the determined type of operation and position data 132 indicating the matrix position (X, Y) of the operated touch sensor to the operation key driver 140 (see FIG. 2).
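A minimal sketch of this determination is shown below. The threshold values are assumptions invented for illustration; the patent says only that the type is judged from the magnitude of the capacitance change. The returned pair corresponds to the operation type data 131 and the position data 132.

```python
from typing import Optional, Tuple

# Hypothetical thresholds; a light touch changes the capacitance less than
# a full key depression does.
TOUCH_THRESHOLD = 0.2
PRESS_THRESHOLD = 0.8

def classify_signal(pos: Tuple[int, int],
                    delta_c: float) -> Tuple[str, Optional[Tuple[int, int]]]:
    """Map one touch-sensor detection signal to a pair corresponding to
    (operation type data 131, position data 132)."""
    if delta_c >= PRESS_THRESHOLD:
        return ("press", pos)    # key depression (click operation)
    if delta_c >= TOUCH_THRESHOLD:
        return ("touch", pos)    # touch operation
    return ("none", None)        # no finger: the position is indefinite
```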
In Embodiment 1, regarding key operation on the numeric keypad 300, the user can instruct the mobile phone 1 to execute a process based on the content of a series of input operations by performing a series of operations consisting of a key depression (corresponding to a click operation) and a swipe operation. In Embodiment 1, a swipe operation is an operation in which the finger slides over a region of the surface of the touch pad 350 of the operation receiving unit 130 while remaining in contact with that surface (touch operation).
FIG. 2 is a diagram for explaining the hardware configuration of the mobile phone 1. Referring to FIG. 2, the mobile phone 1 includes a CPU (Central Processing Unit) 110 for controlling the mobile phone 1, a memory 120, the operation receiving unit 130, an operation key driver 140, a display driver 150, and a display 160. The display driver 150 controls the display 160 to display information.
As described above, the operation receiving unit 130 includes at least the numeric keypad 300 consisting of a plurality of keys 301, 302, and so on, and the touch pad 350. The memory 120 stores various data and programs.
The operation key driver 140 is a driver circuit for the operation receiving unit 130. The operation key driver 140 controls the operation of the touch pad 350 of the operation receiving unit 130 and receives from the operation receiving unit 130 the operation type data 131 indicating the type of operation on the touch pad 350 and the position data 132 indicating the operated position (X, Y). The operation key driver 140 outputs each set consisting of operation type data 131 and the position data 132 received with it to the CPU 110.
When the CPU 110 receives a set of operation type data 131 and position data 132 from the operation key driver 140, the CPU 110 executes a predetermined process based on the set.
FIG. 3 is a diagram schematically showing the functional configuration according to Embodiment 1. Referring to FIG. 3, the CPU 110 includes, as functions, an operation detection unit 111 and a process determination unit 113. The operation detection unit 111 receives sets of operation type data 131 and position data 132 from the operation key driver 140 and determines from each received set whether the operation is a depression of at least one of the operation keys of the numeric keypad 300 or a swipe operation on the touch pad 350.
When the determination result from the operation detection unit 111 indicates that a series of input operations consisting of a key depression followed by a swipe operation has been received, the process determination unit 113 executes a process based on the content of that series of input operations. Specifically, the process determination unit 113 generates a display control signal and outputs it to the display driver 150 so that the display process corresponding to the series of input operations is executed. The display driver 150 controls the display 160 based on the display control signal.
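This division of labor can be sketched as follows, reusing the (operation type, position) pairs of the previous sketch. The class names and handler bodies are placeholders for illustration, not the patent's actual processes.

```python
def normal_touch_process(event):
    """Placeholder for the predetermined process for a normal touch operation."""
    print("normal touch handling at", event[1])

class ProcessDetermination:
    """Rough analogue of the process determination unit 113."""
    def handle_press(self, event):
        # The series-of-operations logic of FIG. 4 would start from here.
        print("key depression received at", event[1])

class OperationDetection:
    """Rough analogue of the operation detection unit 111: route each set of
    (operation type data 131, position data 132) received from the driver."""
    def __init__(self, determination: ProcessDetermination):
        self.determination = determination

    def on_driver_input(self, event):
        op_type, _position = event
        if op_type == "touch":
            normal_touch_process(event)              # ordinary touch operation
        elif op_type == "press":
            self.determination.handle_press(event)   # may begin a series
```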
(Pinch-operation image display process)
FIG. 4 is a flowchart for explaining the pinch-operation image display process according to Embodiment 1. A program according to this flowchart is stored in the memory 120. The CPU 110 reads the program from the memory 120 and executes it, thereby realizing the process. The program is executed repeatedly at regular intervals. A specific process for the user's input operations is described below following the flowchart of FIG. 4.
First, the operation detection unit 111 determines, based on the input from the operation key driver 140 (the set of operation type data 131 and position data 132), whether or not a touch operation has been performed on the touch pad 350 (step S1). When the operation detection unit 111 determines that the input operation is a touch operation on the touch pad 350 (YES in step S1), the CPU 110 executes the predetermined process for a normal touch panel operation (step S3). Thereafter, the process ends.
On the other hand, when the operation detection unit 111 determines that the input operation is not a touch operation on the touch pad 350 (NO in step S1), that is, that it is a key depression, the operation detection unit 111 outputs the input from the operation key driver 140 (the set of operation type data 131 and position data 132) to the process determination unit 113.
The process determination unit 113 determines whether or not the input indicates a depression of the "9" key 301, based on whether the position data 132 of the input from the operation detection unit 111 (the set of operation type data 131 and position data 132) indicates the (X, Y) position of the "9" key 301 (step S2). If the input is determined not to indicate a depression of the "9" key 301 (NO in step S2), the process returns to step S1.
On the other hand, when the process determination unit 113 determines that the input from the operation detection unit 111 indicates a depression of the "9" key 301 (YES in step S2), it then determines, based on the input from the operation key driver 140 received via the operation detection unit 111 (the set of operation type data 131 and position data 132), whether or not the finger has left the touch pad 350 (step S4).
Specifically, the process determination unit 113 analyzes the time-series input from the operation key driver 140 (sets of operation type data 131 and position data 132) over a predetermined time and determines from the analysis result whether the finger has left the touch pad 350 (step S4). The finger is judged to have left when the time-series input satisfies the condition that the operation type data 131 indicates neither a "touch operation" nor a "depression operation" and the position data 132 shows an indefinite value.
When the process determination unit 113 determines that the time-series input from the operation key driver 140 satisfies the above condition, it determines that the finger has left the touch pad 350 (YES in step S4). In that case, the CPU 110 executes the normal process corresponding to the depression of the "9" key 301 (step S5). Thereafter, the process ends.
On the other hand, when the process determination unit 113 determines that the time-series input from the operation key driver 140 does not satisfy the above condition, it determines that the finger has not left the touch pad 350 (NO in step S4). In that case, the process determination unit 113 determines, based on the time-series input from the operation key driver 140 (sets of operation type data 131 and position data 132), whether or not the user has performed a swipe operation (step S6).
Specifically, when the operation type data 131 of the time-series input indicates a "touch operation" and the positions (X, Y) indicated by the time-series position data 132 are changing, the process determination unit 113 determines that the user has performed a swipe operation (YES in step S6). If the time-series input is not determined to be a swipe operation (NO in step S6), the process ends.
On the other hand, when the time-series input is determined to be a swipe operation (YES in step S6), the process determination unit 113 determines that the series of input operations consisting of the depression of the "9" key 301 and the subsequent swipe operation constitutes a "pinch operation". It then generates a display control signal corresponding to the pinch operation and outputs the generated display control signal to the display driver 150 (step S7). In Embodiment 1, the pinch operation corresponds to an operation for changing the enlargement ratio of the display image on the display 160, and the display control signal is a signal for controlling the display so that the enlargement ratio of the image changes.
The display driver 150 controls the display 160 based on the display control signal so that the enlargement ratio of the display image changes.
According to the flowchart of FIG. 4, when the CPU 110 receives a series of input operations consisting of a depression of the "9" key 301 and a swipe operation following the depression, it performs the pinch-operation process, that is, the process of changing the enlargement ratio of the display image, based on the content of that series of input operations.
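The branching of FIG. 4 can be condensed into code. The sketch below is a simplified, hypothetical rendering of steps S1 to S7 over a recorded time series of (operation type, position) pairs; the real program is executed periodically against live driver input, which is omitted here, and the coordinate of the "9" key is an assumption carried over from the earlier layout sketch.

```python
NINE_KEY_POS = (2, 1)  # assumed key position of the "9" key 301

def pinch_display_process(events):
    """Steps S1-S7 of FIG. 4 over a time series of (operation type, position)
    pairs; returns a label for the action taken."""
    if not events:
        return "no input"
    first_type, first_pos = events[0]
    if first_type == "touch":                          # S1: YES
        return "normal touch-panel process (S3)"
    if first_pos != NINE_KEY_POS:                      # S2: NO
        return "not a '9' depression: back to S1"
    tail = events[1:]
    # S4: the finger has left when neither a touch nor a depression is seen
    # and the position is indefinite.
    if all(t not in ("touch", "press") and p is None for t, p in tail):
        return "normal '9' key process (S5)"
    # S6: a swipe is a maintained touch whose position changes over time.
    touches = [p for t, p in tail if t == "touch"]
    if len(touches) >= 2 and touches[0] != touches[-1]:
        return "pinch operation: emit display control signal (S7)"
    return "no swipe detected: end"
```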
<Changing the enlargement ratio>
In Embodiment 1, the content of the image enlargement process described above, that is, the parameter that determines the enlargement ratio, is changed based on the operation amount of the swipe operation. When the process determination unit 113 receives the above series of input operations, it detects the operation amount of the swipe operation and sets, based on the detected amount, the enlargement-ratio parameter of the image enlargement process carried in the display control signal. The operation amount includes the movement direction and the movement distance of the finger in the swipe operation.
 Based on the time-series input of the swipe operation (pairs of operation type data 131 and position data 132), the process determining unit 113 determines whether the user, while maintaining the touch operation, has moved from the position (X, Y) of the "9" key 301 detected in step S2 to a position (X, Y) in the "upward" or "downward" direction.
 When it is determined that the touch position has moved "upward," the process determining unit 113 generates a display control signal having a magnification parameter that increases the magnification value. On the other hand, when it is determined that the touch position has moved "downward" from the position of the "9" key detected in step S2, the process determining unit 113 generates a display control signal having a magnification parameter that decreases the magnification value.
 The process determining unit 113 also calculates, based on the time-series input of the swipe operation (pairs of operation type data 131 and position data 132), the distance (the amount of finger movement) from the first key position in the time series (the position (X, Y) of the "9" key) to the last touch position (X, Y) in the time series. When the magnification is to be increased, the parameter is changed so that the longer the calculated distance, the larger the magnification; when the magnification is to be decreased, the parameter is changed so that the longer the distance, the smaller the magnification (the greater the reduction).
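 One way to realize the direction-and-distance rule above is sketched below, reusing the Sample record from the earlier sketch; the scale constant ZOOM_PER_UNIT and the floor of 0.1 on the magnification are illustrative assumptions, not values given in the embodiment.

```python
ZOOM_PER_UNIT = 0.01  # assumed magnification change per unit of finger travel

def zoom_parameter(samples: list[Sample]) -> float:
    """Upward swipe from the '9' key raises the magnification; downward lowers it."""
    x0, y0 = samples[0].pos     # first position in the time series (the "9" key)
    x1, y1 = samples[-1].pos    # last touch position in the time series
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if y1 < y0:                 # smaller Y taken as "upward" in screen coordinates
        return 1.0 + distance * ZOOM_PER_UNIT           # longer swipe -> larger magnification
    return max(0.1, 1.0 - distance * ZOOM_PER_UNIT)     # longer swipe -> greater reduction
```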
 The display driver 150 controls the display operation of the display 160 for image enlargement based on the above parameters included in the display control signal.
 (Display Processing for the Window Moving Operation)
 FIG. 5 is a flowchart for explaining the window moving display processing according to the first embodiment. A program according to this flowchart is stored in the memory 120. The processing is realized by the CPU 110 reading the program from the memory 120 and executing the read program. This program is executed repeatedly at regular intervals.
 Compared with the flowchart of FIG. 4, the flowchart of FIG. 5 differs in steps S2a, S5a, and S7a. The other processing in FIG. 5 is the same as in FIG. 4. Accordingly, the description of FIG. 5 focuses on the differing processing.
 First, when the operation detection unit 111 determines, based on the operation type data 131 of the input from the operation key driver 140 (a pair of operation type data 131 and position data 132), that the input operation is not a touch operation on the touch pad 350 (NO in step S1), that is, that it is a key depression operation, the operation detection unit 111 outputs the input from the operation key driver 140 (the pair of operation type data 131 and position data 132) to the process determining unit 113.
 The process determining unit 113 analyzes the input from the operation detection unit 111 (the pair of operation type data 131 and position data 132) and, based on the analysis result, determines whether the input indicates a depression of the "5" key 302 (step S2a). When it determines that the input does not indicate a depression of the "5" key 302 (NO in step S2a), the process returns to step S1.
 On the other hand, when the process determining unit 113 determines, based on the above analysis result, that the input indicates a depression of the "5" key 302 (YES in step S2a), it subsequently determines, based on the input from the operation key driver 140 via the operation detection unit 111 (pairs of operation type data 131 and position data 132), whether the finger has left the touch pad 350 (step S4). Since this determination method is the same as in FIG. 4, its description is not repeated.
 When the process determining unit 113 determines that the finger has left the touch pad 350 (YES in step S4), the CPU 110 executes the normal processing corresponding to the depression of the "5" key 302 (step S5a). The process then ends.
 On the other hand, when the process determining unit 113 determines that the finger has not left the touch pad 350 (NO in step S4), it determines whether the user has performed a swipe operation (step S6). When it is determined that the user has performed a swipe operation (YES in step S6), the process determining unit 113 determines that the content of the series of input operations, consisting of the depression of the "5" key 302 followed by the swipe operation, is a "window moving operation" by dragging.
 The process determining unit 113 generates a display control signal for the window moving operation and outputs the generated display control signal to the display driver 150 (step S7a).
 In the first embodiment, the window moving operation corresponds to an operation for changing the display position of a window image displayed on the display 160. The display control signal described above is a signal for controlling the display of the display 160 so that the display position of the window image is changed.
 The display driver 150 controls the display 160 based on the display control signal so that the window image being displayed is moved.
 According to the flowchart of FIG. 5, when the CPU 110 accepts a series of input operations including the depression of the "5" key 302 and the swipe operation following the depression, it executes the window image moving process based on the content of the series of input operations.
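 Since the FIG. 5 flow differs from the pinch flow only in the key tested and the signal emitted, a sketch under the same assumptions as before (hypothetical key_at, finger_released, execute_normal_key_process, and make_window_move_signal helpers) is nearly identical:

```python
def handle_window_move_flow(samples: list[Sample]) -> None:
    """Sketch of the FIG. 5 flow: depressing the '5' key and then swiping drags the window."""
    if key_at(samples[0].pos) != "5":        # step S2a: depression is not on the "5" key
        return
    if finger_released(samples):             # step S4: the finger left the touch pad
        execute_normal_key_process("5")      # step S5a: ordinary processing for the key
        return
    if is_swipe(samples[1:]):                # step S6: a swipe follows the depression
        display_driver.send(make_window_move_signal(samples))  # step S7a
```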
 <Changing the Movement Amount of the Window Image>
 In the first embodiment, the content of the above window image moving process, that is, the parameters that determine the movement amount (movement direction and movement distance), is changed based on the operation amount of the swipe operation. When the process determining unit 113 accepts the above series of input operations, it detects the operation amount of the swipe operation and, based on the detected operation amount, sets the values of the parameters in the display control signal that specify the movement direction and movement distance of the window image. The operation amount includes the movement direction and movement distance of the finger in the swipe operation.
 Based on the time-series input of the swipe operation (pairs of operation type data 131 and position data 132), the process determining unit 113 determines whether the user, while maintaining the touch operation, has moved the touch position from the position (X, Y) of the "5" key 302 detected in step S2a to a position (X, Y) in the "upward," "downward," "rightward," or "leftward" direction.
 When it is determined that the touch position (X, Y) has moved "upward" from the key position (X, Y) of the "5" key 302, the process determining unit 113 generates a display control signal having a parameter for moving the window image "upward" from its current position. Similarly, when it is determined that the touch position has moved "downward," the process determining unit 113 generates a display control signal having a parameter for moving the window image "downward" from its current position; when it has moved "rightward," a signal having a parameter for moving the window image "rightward"; and when it has moved "leftward," a signal having a parameter for moving the window image "leftward."
 The process determining unit 113 also calculates, based on the time-series input of the swipe operation (pairs of operation type data 131 and position data 132), the distance (the movement distance of the finger) from the first key position (X, Y) in the time series (the position (X, Y) of the "5" key 302) to the position (X, Y) of the last touch operation in the time series. When moving the window image in any of the up, down, left, or right directions, the longer the calculated distance, the farther from its current position the window image is moved by the parameter of the generated display control signal; the shorter the calculated distance, the nearer to its current position the window image is moved.
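 The movement vector can then be derived as sketched below, again reusing the Sample record; choosing the direction by the dominant axis and the scale constant MOVE_PER_UNIT are illustrative assumptions.

```python
MOVE_PER_UNIT = 1.5  # assumed ratio of window movement to finger travel

def window_move_vector(samples: list[Sample]) -> tuple[int, int]:
    """Choose up/down/left/right from the dominant axis; scale by the swipe length."""
    x0, y0 = samples[0].pos     # key position of the "5" key
    x1, y1 = samples[-1].pos    # last touch position in the time series
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):      # horizontal movement dominates
        step = round(abs(dx) * MOVE_PER_UNIT)
        return (step, 0) if dx > 0 else (-step, 0)     # rightward or leftward
    step = round(abs(dy) * MOVE_PER_UNIT)
    return (0, step) if dy > 0 else (0, -step)         # downward or upward (screen coordinates)
```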
 The display driver 150 controls the display operation of the display 160 for moving the window image based on the above parameters included in the display control signal.
 In the above embodiment, when a series of input operations including a depression of one of the numeric keys and a swipe operation following the depression is accepted, either a touch-panel pinch operation or a window image drag operation is performed depending on the type of the depressed key; however, the touch panel operations that can be performed are not limited to the pinch operation and the drag operation.
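 Because the key depressed at the start of the series selects the gesture, the key-to-gesture mapping can be held in a simple table, which also shows where operations other than the pinch and drag of this embodiment could be added; entries beyond the "9" and "5" assignments are left hypothetical.

```python
# Key depressed at the start of the series -> handler for the swipe that follows.
# "9" -> pinch (zoom) and "5" -> window drag are the assignments of Embodiment 1;
# further entries would be the extensions the text contemplates.
GESTURE_TABLE = {
    "9": lambda samples: display_driver.send(make_zoom_signal(samples)),
    "5": lambda samples: display_driver.send(make_window_move_signal(samples)),
}

def dispatch_series(key: str, samples: list[Sample]) -> None:
    handler = GESTURE_TABLE.get(key)
    if handler is not None:     # unmapped keys fall through to normal key processing
        handler(samples)
```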
 In the above embodiment, the detection method of the touch sensor of the touch pad 350 detects a change in electrostatic capacitance, but the detection method is not limited to this. For example, even if the touch pad 350 uses a method that detects a change in pressure according to a touch operation or a depression operation, the configuration and functions of the first embodiment described above can be realized.
 [Embodiment 2]
 The program according to the flowcharts of the first embodiment described above can also be provided to the mobile phone 1 as a program product by recording it on a recording medium readable by the CPU 110 via a memory driver (not shown), such as a memory card (not shown) attached to the mobile phone 1. Alternatively, the program can be provided by receiving it via a network through an antenna (not shown) of the mobile phone 1 and downloading it to a storage area of the memory 120.
 The provided program product includes the program itself and a recording medium on which the program is non-transitorily recorded.
 [Effects of the Embodiments]
 According to the embodiments, the user can give the mobile phone 1 commands (information) equivalent to a pinch operation, a drag operation, and the like by performing the above series of operations on the touch pad 350 alone. Further, by changing the operation amount of the above series of operations, the user can give the mobile phone 1 commands equivalent to changing the operation amount of a pinch operation or a drag operation. Moreover, since the above series of operations is performed on the touch pad 350, the user can perform it with a single finger while holding the mobile phone 1 in one hand. Accordingly, the above embodiments provide a mobile phone 1 with excellent operability.
 The embodiments disclosed herein are illustrative and not restrictive. The scope of the present invention is defined by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 1 mobile phone, 111 operation detection unit, 113 process determination unit, 120 memory, 130 operation reception unit, 131 operation type data, 132 position data, 140 operation key driver, 150 display driver, 160 display, 300 numeric keypad, 350 touch pad.

Claims (5)

  1.  An electronic device comprising:
     an operation accepting unit for accepting an operation for inputting information to the electronic device; and
     a control unit for controlling the electronic device, wherein
     the operation accepting unit includes a plurality of operation keys and a touch pad that is provided over the plurality of operation keys and is operated by an object, and
     when the operation content on the touch pad accepted by the operation accepting unit indicates a series of input operations including a depression operation on at least one operation key of the plurality of operation keys and, following the depression operation, a swipe operation by the object performed while the contact of the object with the touch pad is maintained, the control unit executes processing based on the content of the series of input operations.
  2.  The electronic device according to claim 1, wherein
     the processing includes processing corresponding to each of the plurality of operation keys, and
     when the series of input operations is accepted, the control unit executes the processing corresponding to the operation key, among the plurality of operation keys, on which the depression operation was performed.
  3.  The electronic device according to claim 1 or 2, wherein,
     when the series of input operations is accepted, the control unit detects the operation amount of the swipe operation and changes the content of the processing based on the detected operation amount.
  4.  The electronic device according to claim 3, wherein the operation amount includes a movement direction or a movement amount of the object in the swipe operation.
  5.  The electronic device according to any one of claims 1 to 4, further comprising a display for displaying information, wherein the processing includes processing for displaying information on the display.