WO2012111227A1 - Touch input device, electronic apparatus, and input method - Google Patents
Touch input device, electronic apparatus, and input method
- Publication number
- WO2012111227A1 (PCT/JP2011/078666)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- contact
- operation surface
- contact position
- input device
- detected
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to a touch-type input device, an electronic device, and an input method capable of performing input by touching an operation surface.
- Some electronic devices such as mobile phone terminals employ a touch input device such as a touch panel as an input device.
- the user can perform input by touching the operation surface of the touch input device with a finger or the like.
- in some touch input devices, the input content changes depending on the operation details, such as the contact position on the operation surface, the relative positional relationship between the position touched first and the position touched next, the contact time on the operation surface, and the moving distance of the contact position on the operation surface.
- such touch input devices have the problem that the user can choose from only a small number of operations.
- Patent Document 1 describes an information input device that identifies which of the user's fingers is touching the operation surface, based on the fingerprint of the finger or on the arrangement, shape, or size of the fingers, and executes a command registered in association with the identified finger.
- with a pointing device such as a mouse, a user generally performs input by placing a cursor on an image such as an icon on the screen and pressing a button provided on the pointing device.
- when the pointing device includes a plurality of buttons, the user can change the input content by selecting which button to press.
- although the input content changes depending on the pressed button regardless of which finger presses it, the user can intuitively select the input content through the finger used to press the button.
- An object of the present invention is to provide a touch input device, an electronic device, and an input method capable of changing input contents by intuitive finger selection.
- the touch-type input device according to the present invention includes an operation surface and a control unit that detects a plurality of non-contact positions, which are the positions of a plurality of detected objects that have approached within a predetermined distance of the operation surface, and a contact position, which is the position on the operation surface of the detected object that is in contact with the operation surface among the plurality of detected objects, and that outputs a detection signal corresponding to each non-contact position and the contact position.
- the electronic apparatus according to the present invention includes the above touch input device and an information processing unit that performs information processing according to the detection signal output from the touch input device.
- the input method according to the present invention is an input method using a touch-type input device having an operation surface, in which a plurality of non-contact positions, which are the positions of a plurality of detected objects that have approached within a predetermined distance of the operation surface, are detected, a contact position, which is the position on the operation surface of the detected object that is in contact with the operation surface among the plurality of detected objects, is detected, and a detection signal corresponding to each non-contact position and the contact position is output.
- FIG. 1 is a front view of a mobile terminal according to an embodiment of the present invention.
- the mobile terminal 1 has an operation surface 11 for a user to operate the mobile terminal 1.
- the user performs input to the mobile terminal 1 by bringing a plurality of detected objects close to the operation surface 11 and then bringing one of the detected objects into contact with the operation surface 11.
- the user's fingers are convenient as detected objects, but a stylus or the like may also be used.
- FIG. 2 is a block diagram showing a functional configuration of the mobile terminal 1.
- the mobile terminal 1 includes a touch panel 10 and an information processing unit 20.
- the touch panel 10 is a touch input device that includes an operation surface 11 and a control unit 12 that detects an operation on the operation surface 11.
- the control unit 12 detects a plurality of non-contact positions, which are the positions of the detected objects that have approached within a predetermined distance of the operation surface 11, and a contact position, which is the position on the operation surface 11 of the detected object that is in contact with the operation surface 11 among the plurality of detected objects.
- the control unit 12 outputs a detection signal corresponding to the detected non-contact positions and contact position to the information processing unit 20.
- the non-contact position includes at least the projected position of the detected object on the operation surface 11, and may be a three-dimensional position that adds the distance from the operation surface 11 to this projected position.
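To make the geometry concrete, here is a minimal illustrative sketch (not part of the patent; the class and field names are hypothetical) of a three-dimensional non-contact position that combines the projected position on the operation surface with the distance from the surface:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NonContactPosition:
    x: float  # projected position on the operation surface (horizontal)
    y: float  # projected position on the operation surface (vertical)
    z: float  # distance from the operation surface; approaches 0 near contact

hover = NonContactPosition(x=12.0, y=34.0, z=3.5)
```

In this reading, the 2D pair (x, y) alone is the minimum the control unit must hold, and z is the optional third coordinate the patent allows.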
- control unit 12 includes a detection unit 13 and an output control unit 14.
- the detection unit 13 outputs, to the output control unit 14, an operation signal corresponding to the positions of the detected objects that have approached within the predetermined distance of the operation surface 11 and the position of the detected object in contact with the operation surface 11.
- when the output control unit 14 receives the operation signal, it detects the non-contact positions and the contact position based on the operation signal and outputs a detection signal corresponding to them to the information processing unit 20.
- in this embodiment, the detection unit 13 serves as a capacitance detection unit that detects the amount of change in the capacitance of the operation surface 11 for each position on the operation surface 11 and outputs an operation signal indicating the amount of change and the position to the output control unit 14.
- the predetermined distance is a distance at which the detection unit 13 can detect a change in the capacitance of the operation surface 11 due to the influence of the detection target.
- the output control unit 14 detects a position where the amount of change in capacitance indicated by the operation signal falls within a predetermined range as a non-contact position, and detects a position where the amount of change exceeds the predetermined range as a contact position.
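As a rough sketch of this thresholding rule (the numeric thresholds and the function name are hypothetical assumptions for illustration; the patent only speaks of a "predetermined range"):

```python
# Hypothetical thresholds: a change within (HOVER_MIN, TOUCH_MIN] is read as a
# hovering (non-contact) finger; a change above TOUCH_MIN is read as contact.
HOVER_MIN = 5.0   # smallest capacitance change attributed to a nearby finger
TOUCH_MIN = 50.0  # changes above this are attributed to actual contact

def classify(delta: float) -> str:
    """Classify one position on the operation surface by its capacitance change."""
    if delta > TOUCH_MIN:
        return "contact"
    if delta > HOVER_MIN:
        return "non-contact"
    return "none"

# Two fingers near the surface, the second one pressed down:
print(classify(12.0), classify(80.0))  # → non-contact contact
```

A real controller would apply this per sensing cell and debounce the result, but the two-band rule is the core of the classification the text describes.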
- the information processing unit 20 receives a detection signal from the control unit 12 and performs information processing according to the detection signal.
- FIG. 3 is a flowchart illustrating an example of the operation of the mobile terminal 1. As shown in FIG. 4, assume that the user brings two fingers 31 and 32 close to the operation surface 11 as detected objects and then brings the finger 32 into contact with the operation surface 11.
- the touch panel 10 is assumed to be a capacitive touch panel.
- the detection unit 13 detects the amount of change in capacitance caused by each of the fingers 31 and 32, and outputs to the output control unit 14 an operation signal indicating, for each amount of change, the position on the operation surface 11 at which the capacitance changed (step S301).
- the output control unit 14 determines whether or not each change amount of the capacitance indicated by the operation signal is included in the predetermined range (step S302).
- when each amount of change falls within the predetermined range, the output control unit 14 detects each of the two positions indicated by the operation signal as a non-contact position and holds these non-contact positions (step S303).
- in step S303, the output control unit 14 holds the latest non-contact positions.
- suppose that in step S302 the output control unit 14 determines that one of the amounts of change in capacitance exceeds the predetermined range; in this case, the output control unit 14 detects the position where the amount of change exceeds the predetermined range as a contact position (step S304).
- the output control unit 14 outputs a detection signal corresponding to the contact position detected in step S304 and the non-contact positions held in step S303 to the information processing unit 20.
- the information processing unit 20 executes information processing according to the detection signal (step S305).
- based on the contact position and the non-contact positions, the output control unit 14 determines the relative positional relationship, with respect to the contact body (the finger that has contacted the operation surface 11), of the non-contact body (a finger different from the contact body), and outputs a detection signal corresponding to that positional relationship.
- the output control unit 14 treats the finger at the contact position as the contact body, and treats, of the two non-contact positions, the finger at the position that differs from or is farther from the contact position as the non-contact body; it then determines, as the relative positional relationship, whether the contact body is located on the right side or the left side of the non-contact body, and outputs a detection signal corresponding to that positional relationship.
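A minimal sketch of this left/right decision (the function name and coordinate conventions are hypothetical; the patent does not specify an algorithm at this level of detail):

```python
import math

def relative_side(contact, hover_positions):
    """Return 'right' or 'left': the side of the non-contact body on which
    the contact body lies, mimicking a mouse's right/left buttons.

    contact: (x, y) contact position on the operation surface
    hover_positions: the held non-contact positions, projected to 2D
    """
    # Take the held hover position farthest from the contact point as the
    # non-contact body (the finger that stayed off the surface).
    non_contact = max(hover_positions, key=lambda p: math.dist(p, contact))
    return "right" if contact[0] > non_contact[0] else "left"

# Middle finger touched at x=60 while the index finger hovers at x=30:
print(relative_side((60, 10), [(30, 12), (61, 11)]))  # → right
```

The farthest-hover heuristic reflects that the finger which came down leaves a hover sample nearly coincident with the contact point, so the other sample must belong to the finger still in the air.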
- the user can thus cause the mobile terminal 1 to execute different information processing simply by selecting the finger to touch the operation surface 11, much as one selects between the right-click and left-click operations of a mouse.
- in this example, two fingers are used as detected objects, but three or more fingers may be used; in that case, detection signals corresponding to the plural non-contact positions and the contact position are output.
- since the user can change the operation content on the operation surface 11 simply by bringing a plurality of fingers close to the operation surface 11 and selecting which of those fingers to bring into contact with it, the input content can be changed by intuitive finger selection.
- in addition, the input content can be changed with the same feeling as when using a general pointing device.
- the illustrated configuration is merely an example, and the present invention is not limited to the configuration.
- in the embodiment above, both the non-contact positions and the contact position are detected using a single capacitance detection unit; however, a means for detecting the non-contact positions, such as a proximity sensor using infrared rays, may be provided separately from the means for detecting the contact position.
- the touch input device may be provided in an electronic device (for example, a game machine) other than the mobile terminal.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
11 Operation surface
12 Control unit
13 Detection unit
14 Output control unit
20 Information processing unit
Claims (6)
- An operation surface; and
a control unit that detects a plurality of non-contact positions, which are the positions of a plurality of detected objects that have approached within a predetermined distance of the operation surface, and a contact position, which is the position on the operation surface of the detected object that is in contact with the operation surface among the plurality of detected objects, and that outputs a detection signal corresponding to each non-contact position and the contact position: a touch-type input device comprising the above.
- The touch-type input device according to claim 1, wherein the control unit determines, based on each non-contact position and the contact position, the relative positional relationship of a non-contact body, which is a detected object other than the contact body, with respect to the contact body, which is the detected object in contact with the operation surface, and outputs the detection signal corresponding to that positional relationship.
- The touch-type input device according to claim 3, wherein the control unit detects two non-contact positions and determines, as the positional relationship, whether the contact body is on the right side or the left side of the non-contact body.
- The touch-type input device according to claim 1 or 2, wherein the control unit comprises:
a capacitance detection unit that detects an amount of change in the capacitance of the operation surface for each position on the operation surface; and
an output control unit that detects a position where the amount of change falls within a predetermined range as the non-contact position, detects a position where the amount of change exceeds the predetermined range as the contact position, and outputs a detection signal corresponding to each non-contact position and the contact position.
- An electronic apparatus comprising: the touch-type input device according to any one of claims 1 to 4; and an information processing unit that performs information processing according to the detection signal output from the touch-type input device.
- An input method using a touch-type input device having an operation surface, comprising:
detecting a plurality of non-contact positions, which are the positions of a plurality of detected objects that have approached within a predetermined distance of the operation surface;
detecting a contact position, which is the position on the operation surface of the detected object that is in contact with the operation surface among the plurality of detected objects; and
outputting a detection signal corresponding to each non-contact position and the contact position.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11858810.2A EP2677404A4 (en) | 2011-02-16 | 2011-12-12 | Touch input device, electronic apparatus, and input method |
JP2012557793A JPWO2012111227A1 (ja) | 2011-02-16 | 2011-12-12 | Touch input device, electronic apparatus, and input method |
US14/000,049 US20130321320A1 (en) | 2011-02-16 | 2011-12-12 | Touch input device, electronic apparatus, and input method |
CN2011800677774A CN103370680A (zh) | 2011-02-16 | 2011-12-12 | 触摸输入装置、电子设备以及输入方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-030968 | 2011-02-16 | ||
JP2011030968 | 2011-02-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012111227A1 true WO2012111227A1 (ja) | 2012-08-23 |
Family
ID=46672176
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/078666 WO2012111227A1 (ja) | 2011-02-16 | 2011-12-12 | Touch input device, electronic apparatus, and input method |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130321320A1 (ja) |
EP (1) | EP2677404A4 (ja) |
JP (1) | JPWO2012111227A1 (ja) |
CN (1) | CN103370680A (ja) |
WO (1) | WO2012111227A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104102448A (zh) * | 2013-04-02 | 2014-10-15 | Elan Microelectronics Corp | Method for identifying hovering object |
JP2014199492A (ja) * | 2013-03-29 | 2014-10-23 | Japan Display Inc | Electronic device and method for controlling electronic device |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106406505A (zh) * | 2015-07-28 | 2017-02-15 | Beijing Kingsoft Internet Security Software Co Ltd | Method for editing picture filter effect and *** thereof |
JP2017073128A (ja) * | 2015-10-08 | 2017-04-13 | Funai Electric Co Ltd | Spatial input device |
CN106200989B (zh) * | 2016-09-08 | 2019-11-12 | Guangdong Genius Technology Co Ltd | Method and device for lighting up screen of mobile terminal |
KR102469722B1 (ko) * | 2018-09-21 | 2022-11-22 | Samsung Electronics Co Ltd | Display apparatus and control method thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008097172A (ja) * | 2006-10-10 | 2008-04-24 | Sony Corp | Display device and display method |
JP2008123032A (ja) | 2006-11-08 | 2008-05-29 | Toyota Motor Corp | Information input device |
JP2009146435A (ja) * | 2003-09-16 | 2009-07-02 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
JP2010521732A (ja) * | 2007-03-14 | 2010-06-24 | Power2B, Inc. | Display device and information input device |
JP2010244302A (ja) * | 2009-04-06 | 2010-10-28 | Sony Corp | Input device and input processing method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7653883B2 (en) * | 2004-07-30 | 2010-01-26 | Apple Inc. | Proximity detector in handheld device |
US20100117970A1 (en) * | 2008-11-11 | 2010-05-13 | Sony Ericsson Mobile Communications Ab | Methods of Operating Electronic Devices Using Touch Sensitive Interfaces with Contact and Proximity Detection and Related Devices and Computer Program Products |
-
2011
- 2011-12-12 WO PCT/JP2011/078666 patent/WO2012111227A1/ja active Application Filing
- 2011-12-12 US US14/000,049 patent/US20130321320A1/en not_active Abandoned
- 2011-12-12 EP EP11858810.2A patent/EP2677404A4/en not_active Withdrawn
- 2011-12-12 JP JP2012557793A patent/JPWO2012111227A1/ja active Pending
- 2011-12-12 CN CN2011800677774A patent/CN103370680A/zh active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009146435A (ja) * | 2003-09-16 | 2009-07-02 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
JP2008097172A (ja) * | 2006-10-10 | 2008-04-24 | Sony Corp | Display device and display method |
JP2008123032A (ja) | 2006-11-08 | 2008-05-29 | Toyota Motor Corp | Information input device |
JP2010521732A (ja) * | 2007-03-14 | 2010-06-24 | Power2B, Inc. | Display device and information input device |
JP2010244302A (ja) * | 2009-04-06 | 2010-10-28 | Sony Corp | Input device and input processing method |
Non-Patent Citations (1)
Title |
---|
See also references of EP2677404A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014199492A (ja) * | 2013-03-29 | 2014-10-23 | Japan Display Inc | Electronic device and method for controlling electronic device |
CN104102448A (zh) * | 2013-04-02 | 2014-10-15 | Elan Microelectronics Corp | Method for identifying hovering object |
TWI490748B (zh) * | 2013-04-02 | 2015-07-01 | Elan Microelectronics Corp | Hovering object identification method |
Also Published As
Publication number | Publication date |
---|---|
CN103370680A (zh) | 2013-10-23 |
EP2677404A4 (en) | 2017-09-27 |
JPWO2012111227A1 (ja) | 2014-07-03 |
EP2677404A1 (en) | 2013-12-25 |
US20130321320A1 (en) | 2013-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101766187B1 (ko) | Method and apparatus for changing operation modes | |
CN205485930U (zh) | Input device and keyboard | |
JP5640486B2 (ja) | Information display device | |
JP2010244132A (ja) | User interface device with touch panel, user interface control method, and user interface control program | |
WO2012070682A1 (ja) | Input device and method for controlling input device | |
WO2012111227A1 (ja) | Touch input device, electronic apparatus, and input method | |
JP5485154B2 (ja) | Input device, in particular a computer mouse | |
JP2009536385A (ja) | Multifunction key with scrolling | |
JP5542224B1 (ja) | Electronic device and coordinate detection method | |
CN104423697A (zh) | Display control device, display control method, and program | |
US10126843B2 (en) | Touch control method and electronic device | |
US20170192465A1 (en) | Apparatus and method for disambiguating information input to a portable electronic device | |
US20150009136A1 (en) | Operation input device and input operation processing method | |
US20140359541A1 (en) | Terminal and method for controlling multi-touch operation in the same | |
JP6154148B2 (ja) | Input operation device, display device, and command selection method | |
TWI480792B (zh) | Operating method of electronic device | |
US10338692B1 (en) | Dual touchpad system | |
JP2013114645A (ja) | Compact information device | |
US9377911B2 (en) | Input device | |
JP4080498B2 (ja) | Control method for intelligent movement of a touch panel | |
KR20160000534U (ko) | Smartphone equipped with a touchpad | |
US20130300685A1 (en) | Operation method of touch panel | |
US20130063347A1 (en) | Method of processing signal of portable computer and portable computer using the method | |
US20150138102A1 (en) | Inputting mode switching method and system utilizing the same | |
JP5777934B2 (ja) | Information processing device, control method of information processing device, and control program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11858810 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011858810 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2012557793 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14000049 Country of ref document: US |