JP5749805B2 - Control of virtual object using device touch interface functionality - Google Patents
- Publication number
- JP5749805B2 (Application JP2013537666A)
- Authority
- JP
- Japan
- Prior art keywords
- touch
- touch interface
- interface
- virtual object
- input
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04186—Touch location disambiguation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04802—3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Description
Claims (16)
- A method of controlling a virtual object, comprising:
a) determining a position of a first touch input on a first touch interface;
b) determining a position of a second touch input on a second touch interface;
c) generating a three-dimensional line segment using the position of the first touch input, the position of the second touch input, and a predetermined spatial relationship between the first touch interface and the second touch interface;
d) manipulating a virtual object using the three-dimensional line segment of c) as a control input; and
e) displaying the manipulated virtual object on a display device.
- The method of claim 1, wherein the first touch interface is a touch screen.
- The method of claim 2, wherein the touch interface is a touch pad.
- The method of claim 1, wherein determining the position of the first touch input includes using a pre-touch for determining the position of the first touch input on the first touch interface and a perpendicular distance between that position and the point of application of the touch.
- The method of claim 1, wherein determining the position of the second touch input includes using a pre-touch for determining the position of the second touch input on the second touch interface and a perpendicular distance between that position and the point of application of the touch.
- The method of claim 1, wherein e) includes displaying the virtual object on a display device that is remote from the device containing the first touch interface and the device containing the second touch interface.
- The method of claim 1, wherein e) displays the virtual object on a first touch screen.
- The method of claim 7, wherein e) also displays the virtual object on a second touch screen.
- An apparatus for controlling a virtual object, comprising:
a first touch interface;
a second touch interface;
a processor operably connected to the first touch interface; and
instructions executable by the processor, the instructions causing the processor to perform:
a) determining a position of a first touch input on the first touch interface;
b) determining a position of a second touch input on the second touch interface;
c) generating a three-dimensional line segment using the position of the first touch input, the position of the second touch input, and a predetermined spatial relationship between the first touch interface and the second touch interface;
d) manipulating a virtual object using the three-dimensional line segment of c) as a control input; and
e) displaying the manipulated virtual object on a display device.
- The apparatus of claim 9, wherein the processor is further operably connected to the second touch interface, and the first touch interface and the second touch interface are both located on a case having first and second major surfaces.
- The apparatus of claim 10, wherein the first touch interface is a touch screen located on the first major surface.
- The apparatus of claim 11, wherein the second touch interface is a touch pad located on the second major surface.
- The apparatus of claim 9, wherein the first touch interface of a) is located on a first device and the second touch interface of b) is located on a second device.
- The apparatus of claim 13, wherein the first device and the second device are connected by a wireless network.
- The apparatus of claim 9, further comprising a visual display device that is remote from the device containing the first touch interface and the device containing the second touch interface.
- A computer program for controlling a virtual object using two touch interfaces, the program causing a computer to realize:
a) a function of determining a position of a first touch input on a first touch interface;
b) a function of determining a position of a second touch input on a second touch interface;
c) a function of generating a three-dimensional line segment using the position of the first touch input, the position of the second touch input, and a predetermined spatial relationship between the first touch interface and the second touch interface;
d) a function of manipulating a virtual object using the three-dimensional line segment of c) as a control input; and
e) a function of displaying the manipulated virtual object on a display device.
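Steps a) through d) of claim 1 amount to a small geometric computation: two 2D touch positions plus the known spatial relationship between the two surfaces define a segment in 3D space, and that segment serves as the control input. The sketch below is for illustration only, not the patented implementation: it assumes the two touch surfaces are parallel planes separated by the device thickness, and the function names, the thickness model, and the choice of a unit direction vector as the control quantity are all hypothetical.

```python
import math

def control_segment(front_touch, back_touch, thickness):
    """Steps a)-c): build the 3D line segment from two touch positions.

    front_touch and back_touch are (x, y) positions on the front and rear
    touch surfaces; the predetermined spatial relationship is modeled here
    simply as the thickness separating two parallel planes (an assumption
    made for this sketch).
    """
    x1, y1 = front_touch
    x2, y2 = back_touch
    p1 = (x1, y1, 0.0)           # point on the front surface (z = 0)
    p2 = (x2, y2, -thickness)    # point on the rear surface (z = -thickness)
    return p1, p2

def orient_object(segment):
    """Step d): derive a control input from the segment.

    Here the control input is the segment's unit direction vector, which
    could, for example, set a virtual object's orientation.
    """
    (x1, y1, z1), (x2, y2, z2) = segment
    dx, dy, dz = x2 - x1, y2 - y1, z2 - z1
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# Front and back touches directly opposite each other:
seg = control_segment((10.0, 20.0), (10.0, 20.0), 8.0)
print(orient_object(seg))  # -> (0.0, 0.0, -1.0)
```

With the two touches aligned, the segment points straight through the device, so the derived direction is the surface normal; sliding either finger tilts the segment, and the manipulated object can follow that tilt.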
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/917,362 US9092135B2 (en) | 2010-11-01 | 2010-11-01 | Control of virtual object using device touch interface functionality |
US12/917,362 | 2010-11-01 | ||
PCT/US2011/048507 WO2012060919A2 (en) | 2010-11-01 | 2011-08-19 | Control of virtual object using device touch interface functionality |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2014501001A (ja) | 2014-01-16 |
JP5749805B2 (ja) | 2015-07-15 |
Family
ID=45998034
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2013537666A Active JP5749805B2 (ja) | Control of virtual object using device touch interface functionality
Country Status (5)
Country | Link |
---|---|
US (3) | US9092135B2 (ja) |
EP (1) | EP2635955B1 (ja) |
JP (1) | JP5749805B2 (ja) |
CN (1) | CN103403646B (ja) |
WO (1) | WO2012060919A2 (ja) |
Families Citing this family (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SE533704C2 (sv) | 2008-12-05 | 2010-12-07 | Flatfrog Lab Ab | Touch-sensitive apparatus and method for operating the same |
KR101123005B1 (ko) * | 2010-06-14 | 2012-03-12 | Alpinion Medical Systems Co., Ltd. | Ultrasonic diagnostic apparatus, graphic environment control apparatus used therein, and control method thereof |
US9092135B2 (en) | 2010-11-01 | 2015-07-28 | Sony Computer Entertainment Inc. | Control of virtual object using device touch interface functionality |
US10168835B2 (en) | 2012-05-23 | 2019-01-01 | Flatfrog Laboratories Ab | Spatial resolution in touch displays |
KR20140010616A (ko) * | 2012-07-16 | 2014-01-27 | Electronics and Telecommunications Research Institute | Apparatus and method for processing manipulation of a 3D virtual object |
US9733667B2 (en) * | 2012-10-01 | 2017-08-15 | Nec Corporation | Information processing device, information processing method and recording medium |
US20140237408A1 (en) * | 2013-02-15 | 2014-08-21 | Flatfrog Laboratories Ab | Interpretation of pressure based gesture |
US10019113B2 (en) | 2013-04-11 | 2018-07-10 | Flatfrog Laboratories Ab | Tomographic processing for touch detection |
US9874978B2 (en) | 2013-07-12 | 2018-01-23 | Flatfrog Laboratories Ab | Partial detect mode |
US10013092B2 (en) | 2013-09-27 | 2018-07-03 | Sensel, Inc. | Tactile touch sensor system and method |
WO2015048582A1 (en) | 2013-09-27 | 2015-04-02 | Sensel, Inc. | Resistive touch sensor system and method |
US11221706B2 (en) | 2013-09-27 | 2022-01-11 | Sensel, Inc. | Tactile touch sensor system and method |
KR102165445B1 (ko) * | 2013-09-30 | 2020-10-14 | LG Electronics Inc. | Digital device and control method thereof |
US10168873B1 (en) | 2013-10-29 | 2019-01-01 | Leap Motion, Inc. | Virtual interactions for machine control |
US9996797B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Interactions with virtual objects for machine control |
US10416834B1 (en) * | 2013-11-15 | 2019-09-17 | Leap Motion, Inc. | Interaction strength using virtual objects for machine control |
US10146376B2 (en) | 2014-01-16 | 2018-12-04 | Flatfrog Laboratories Ab | Light coupling in TIR-based optical touch systems |
WO2015108480A1 (en) | 2014-01-16 | 2015-07-23 | Flatfrog Laboratories Ab | Improvements in tir-based optical touch systems of projection-type |
US10161886B2 (en) | 2014-06-27 | 2018-12-25 | Flatfrog Laboratories Ab | Detection of surface contamination |
US10127927B2 (en) | 2014-07-28 | 2018-11-13 | Sony Interactive Entertainment Inc. | Emotional speech processing |
TWI536239B (zh) * | 2014-10-27 | 2016-06-01 | Wistron Corporation | Touch device and touch method |
EP3250993B1 (en) | 2015-01-28 | 2019-09-04 | FlatFrog Laboratories AB | Dynamic touch quarantine frames |
US10318074B2 (en) | 2015-01-30 | 2019-06-11 | Flatfrog Laboratories Ab | Touch-sensing OLED display with tilted emitters |
US20160224203A1 (en) * | 2015-02-02 | 2016-08-04 | Cirque Corporation | Using a perpendicular bisector of a multi-finger gesture to control motion of objects shown in a multi-dimensional environment on a display |
WO2016130074A1 (en) | 2015-02-09 | 2016-08-18 | Flatfrog Laboratories Ab | Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel |
CN107250855A (zh) | 2015-03-02 | 2017-10-13 | FlatFrog Laboratories AB | Optical component for light coupling |
CN108140360B (zh) * | 2015-07-29 | 2020-12-04 | Sensel Inc. | Systems and methods for manipulating a virtual environment |
US10739968B2 (en) | 2015-11-23 | 2020-08-11 | Samsung Electronics Co., Ltd. | Apparatus and method for rotating 3D objects on a mobile device screen |
KR102400705B1 (ko) | 2015-12-09 | 2022-05-23 | FlatFrog Laboratories AB | Improved stylus identification |
WO2017165894A1 (en) | 2016-03-25 | 2017-09-28 | Sensel Inc. | System and method for detecting and characterizing force inputs on a surface |
CN105892754A (zh) * | 2016-05-27 | 2016-08-24 | Beijing Pico Technology Co., Ltd. | A finger motion recognition method and system |
US20180143693A1 (en) * | 2016-11-21 | 2018-05-24 | David J. Calabrese | Virtual object manipulation |
EP3545392A4 (en) | 2016-11-24 | 2020-07-29 | FlatFrog Laboratories AB | AUTOMATIC TACTILE SIGNAL OPTIMIZATION |
KR102629629B1 (ko) | 2016-12-07 | 2024-01-29 | FlatFrog Laboratories AB | Improved touch device |
EP3458946B1 (en) | 2017-02-06 | 2020-10-21 | FlatFrog Laboratories AB | Optical coupling in touch-sensing systems |
WO2018174788A1 (en) | 2017-03-22 | 2018-09-27 | Flatfrog Laboratories | Object characterisation for touch displays |
EP4036697A1 (en) | 2017-03-28 | 2022-08-03 | FlatFrog Laboratories AB | Optical touch sensing apparatus |
CN117311543A (zh) | 2017-09-01 | 2023-12-29 | FlatFrog Laboratories AB | Touch-sensing apparatus |
US10671238B2 (en) * | 2017-11-17 | 2020-06-02 | Adobe Inc. | Position-dependent modification of descriptive content in a virtual reality environment |
WO2019143364A1 (en) * | 2018-01-19 | 2019-07-25 | John Alex Souppa | Touch screen interface for audio signal processing in an electronic musical-effects unit |
US11567610B2 (en) | 2018-03-05 | 2023-01-31 | Flatfrog Laboratories Ab | Detection line broadening |
US11943563B2 (en) | 2019-01-25 | 2024-03-26 | FlatFrog Laboratories, AB | Videoconferencing terminal and method of operating the same |
US11030940B2 (en) * | 2019-05-03 | 2021-06-08 | X Development Llc | Display array with distributed audio |
CN115039063A (zh) | 2020-02-10 | 2022-09-09 | FlatFrog Laboratories AB | Improved touch-sensing apparatus |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6597347B1 (en) * | 1991-11-26 | 2003-07-22 | Itu Research Inc. | Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US6239389B1 (en) * | 1992-06-08 | 2001-05-29 | Synaptics, Inc. | Object position detection system and method |
US6139433A (en) * | 1995-11-22 | 2000-10-31 | Nintendo Co., Ltd. | Video game system and method with enhanced three-dimensional character and background control due to environmental conditions |
US6278418B1 (en) * | 1995-12-29 | 2001-08-21 | Kabushiki Kaisha Sega Enterprises | Three-dimensional imaging system, game device, method for same and recording medium |
US6437777B1 (en) * | 1996-09-30 | 2002-08-20 | Sony Corporation | Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium |
KR100537880B1 (ko) * | 1997-08-08 | 2005-12-21 | Kabushiki Kaisha Sega | Game device and game system |
US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
TW469379B (en) * | 1998-02-16 | 2001-12-21 | Sony Computer Entertainment Inc | Portable electronic device |
US6215498B1 (en) * | 1998-09-10 | 2001-04-10 | Lionhearth Technologies, Inc. | Virtual command post |
JP2000116940A (ja) * | 1998-10-15 | 2000-04-25 | Seta Corp | Two-way communication game system |
US7020326B1 (en) * | 1998-11-13 | 2006-03-28 | Hsu Shin-Yi | System for guiding users to formulate and use object extraction rules |
US6222465B1 (en) * | 1998-12-09 | 2001-04-24 | Lucent Technologies Inc. | Gesture-based computer interface |
JP2000293280A (ja) | 1999-04-07 | 2000-10-20 | Sharp Corp | Information input device |
US6539421B1 (en) * | 1999-09-24 | 2003-03-25 | America Online, Inc. | Messaging application user interface |
DE10045117C2 (de) * | 2000-09-13 | 2002-12-12 | Bernd Von Prittwitz | Method and device for real-time geometry control |
US7445549B1 (en) * | 2001-05-10 | 2008-11-04 | Best Robert M | Networked portable and console game systems |
US20030050116A1 (en) * | 2001-09-10 | 2003-03-13 | William Chen | Picture video signal converting/processing circuit for GBA (gameboy advance) or GBC (gameboy color) |
DE10144634A1 (de) * | 2001-09-11 | 2003-04-10 | Trw Automotive Electron & Comp | Operator control system |
DE10146471A1 (de) * | 2001-09-21 | 2003-04-17 | 3Dconnexion Gmbh | 3D input device with integrated touch screen |
JP4054585B2 (ja) * | 2002-02-18 | 2008-02-27 | Canon Inc. | Information processing apparatus and method |
US20040233223A1 (en) * | 2003-05-22 | 2004-11-25 | Steven Schkolne | Physical/digital input methodologies for spatial manipulations and entertainment |
JP2006018727A (ja) * | 2004-07-05 | 2006-01-19 | Funai Electric Co Ltd | Three-dimensional coordinate input device |
CN103365595B (zh) | 2004-07-30 | 2017-03-01 | Apple Inc. | Gestures for touch-sensitive input devices |
US20070255468A1 (en) * | 2006-04-26 | 2007-11-01 | Alps Automotive, Inc. | Vehicle window control system |
US20080220878A1 (en) * | 2007-02-23 | 2008-09-11 | Oliver Michaelis | Method and Apparatus to Create or Join Gaming Sessions Based on Proximity |
JP2009116583A (ja) * | 2007-11-06 | 2009-05-28 | Ricoh Co Ltd | Input control device and input control method |
JP2009187290A (ja) * | 2008-02-06 | 2009-08-20 | Yamaha Corp | Control device with touch panel, and program |
KR100963238B1 (ko) * | 2008-02-12 | 2010-06-10 | Gwangju Institute of Science and Technology | Tabletop-mobile augmented reality system for personalization and collaboration, and interaction method using augmented reality |
US20090256809A1 (en) * | 2008-04-14 | 2009-10-15 | Sony Ericsson Mobile Communications Ab | Three-dimensional touch interface |
KR20100041006A (ko) * | 2008-10-13 | 2010-04-22 | LG Electronics Inc. | User interface control method using three-dimensional multi-touch |
KR101544364B1 (ko) * | 2009-01-23 | 2015-08-17 | Samsung Electronics Co., Ltd. | Portable terminal having dual touch screens and content control method thereof |
JP5233708B2 (ja) * | 2009-02-04 | 2013-07-10 | Sony Corporation | Information processing apparatus, information processing method, and program |
US8289316B1 (en) * | 2009-04-01 | 2012-10-16 | Perceptive Pixel Inc. | Controlling distribution of error in 2D and 3D manipulation |
US10198854B2 (en) * | 2009-08-14 | 2019-02-05 | Microsoft Technology Licensing, Llc | Manipulation of 3-dimensional graphical objects for view in a multi-touch display |
US8400398B2 (en) * | 2009-08-27 | 2013-03-19 | Schlumberger Technology Corporation | Visualization controls |
US20120066648A1 (en) * | 2010-09-14 | 2012-03-15 | Xerox Corporation | Move and turn touch screen interface for manipulating objects in a 3d scene |
US9092135B2 (en) | 2010-11-01 | 2015-07-28 | Sony Computer Entertainment Inc. | Control of virtual object using device touch interface functionality |
-
2010
- 2010-11-01 US US12/917,362 patent/US9092135B2/en active Active
-
2011
- 2011-08-19 JP JP2013537666A patent/JP5749805B2/ja active Active
- 2011-08-19 EP EP11838392.6A patent/EP2635955B1/en active Active
- 2011-08-19 WO PCT/US2011/048507 patent/WO2012060919A2/en active Application Filing
- 2011-08-19 CN CN201180060188.3A patent/CN103403646B/zh active Active
-
2015
- 2015-07-27 US US14/809,974 patent/US9372624B2/en active Active
-
2016
- 2016-06-21 US US15/188,887 patent/US9575594B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US9092135B2 (en) | 2015-07-28 |
US20150331577A1 (en) | 2015-11-19 |
CN103403646B (zh) | 2016-10-05 |
EP2635955A2 (en) | 2013-09-11 |
EP2635955A4 (en) | 2017-04-05 |
JP2014501001A (ja) | 2014-01-16 |
US20120110447A1 (en) | 2012-05-03 |
WO2012060919A3 (en) | 2013-08-29 |
US9372624B2 (en) | 2016-06-21 |
WO2012060919A2 (en) | 2012-05-10 |
US9575594B2 (en) | 2017-02-21 |
US20160299624A1 (en) | 2016-10-13 |
EP2635955B1 (en) | 2018-07-25 |
CN103403646A (zh) | 2013-11-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5749805B2 (ja) | Control of virtual object using device touch interface functionality | |
JP5801931B2 (ja) | Touch screen disambiguation based on a preceding ancillary touch input | |
KR101180218B1 (ko) | Portable device with a touch screen and digital tactile pixels |
US10401964B2 (en) | Mobile terminal and method for controlling haptic feedback | |
KR102118091B1 (ko) | Mobile device having a pre-execution function for an object and control method thereof |
KR102121533B1 (ko) | Display device having a transparent display and method of controlling the display device |
TW201316208 (zh) | Image display method of a touch panel and touch panel |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20140220 |
|
A977 | Report on retrieval |
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20141009 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20141021 |
|
TRDD | Decision of grant or rejection written | ||
A01 | Written decision to grant a patent or to grant a registration (utility model) |
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20150512 |
|
A61 | First payment of annual fees (during grant procedure) |
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20150514 |
|
R150 | Certificate of patent or registration of utility model |
Ref document number: 5749805 Country of ref document: JP Free format text: JAPANESE INTERMEDIATE CODE: R150 |
|
R250 | Receipt of annual fees |
Free format text: JAPANESE INTERMEDIATE CODE: R250 |
|