WO2014184902A1 - User interface device - Google Patents

User interface device Download PDF

Info

Publication number
WO2014184902A1
Authority
WO
WIPO (PCT)
Prior art keywords
control unit
button
finger
touch
fingers
Prior art date
Application number
PCT/JP2013/063570
Other languages
French (fr)
Japanese (ja)
Inventor
良弘 中井
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2013/063570 priority Critical patent/WO2014184902A1/en
Publication of WO2014184902A1 publication Critical patent/WO2014184902A1/en

Links

Images

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 — Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 — Indexing scheme relating to G06F3/048
    • G06F 2203/04808 — Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention relates to a user interface device using a touch pad.
  • In Patent Document 1, a separate panel unit (touch pad) configured separately from the display unit is attached to the steering wheel, and the driver touches the separate panel unit with different numbers of fingertips.
  • A different input screen corresponding to the number of contacts is displayed on the display unit, and operations on that input screen can then be performed.
  • In Patent Document 2, the operator's hand shape and fingertip positions on the touch panel display are detected, and the display position of each user interface component image is made to follow the movement of the corresponding fingertip.
  • A user interface component image is thus assigned to every finger, and the operation corresponding to each component image can be performed.
  • In the case of Patent Document 1, since the number of touched fingers is uniquely associated with an operation, there is a problem that the functions executable by this operation are limited by the number of fingers. For this reason, to execute a function at a level deeper than the input screen displayed by this operation, the user has to operate while viewing that input screen. Moreover, in the case of Patent Document 2, a camera for detecting the hand shape and fingertip positions, together with recognition units for them, must be added, so the configuration becomes complicated.
  • The present invention has been made to solve the above problems, and its object is to provide a user interface device that allows a user to perform touch operations without diverting his or her line of sight, using the existing configuration.
  • The user interface device of the invention includes: a touch pad that detects finger contact and outputs the number of touching fingers and their positions; a storage unit that holds, for each number of fingers, an operation table in which a command group for operating the operation target device is set; and a control unit that recognizes, based on the output of the touch pad, the number of touching fingers and the touch operation (tap or swipe) performed with them, acquires from the storage unit the operation table corresponding to that number of fingers, and selects, from the command group set in the operation table, the command corresponding to the executed touch operation.
  • FIG. 3 is a flowchart showing processing at the time of finger contact in the user interface device according to Embodiment 1.
  • FIG. 4 is a flowchart showing processing when the finger contact state changes in the user interface device according to Embodiment 1.
  • FIG. 5 is a diagram showing an example of an operation table stored in the storage unit of the user interface device according to Embodiment 1.
  • FIG. 6 is a block diagram showing the configuration of the user interface device according to Embodiment 2 of the invention.
  • FIG. 1 is a block diagram showing the configuration of the user interface device according to the first embodiment.
  • This user interface device receives from the user operations for controlling devices other than itself (for example, an operation target device 100 such as an audio device, a television tuner, a disc playback device, or a navigation device), and is composed of the touch display 1 and the operation control device 2.
  • the touch display 1 is configured integrally with a touch pad that can detect a plurality of touches of a user's finger and a display (for example, a liquid crystal display) that displays the output of the operation target device 100 on the screen.
  • However, if the operation target device 100 has a display with the same function as the touch display 1, the display of the touch display 1 can be omitted.
  • The operation control device 2 includes: a display control unit 3 connected to the touch display 1 to control the display; a touch pad control unit 4 connected to the touch display 1 to control the touch pad; a storage unit 5 composed of a memory, a hard disk, and the like; a communication unit 6 that communicates with the operation target device 100; an audio output unit 7 including a speaker; and a control unit 8 including a memory that stores a program describing the processing detailed later and a CPU (Central Processing Unit) that executes the program.
  • the display control unit 3 displays (draws) images, characters, and the like output from the operation target device 100 on the touch display 1 under the control of the control unit 8.
  • the touch pad control unit 4 acquires finger contact information from the touch display 1, converts it into position information, and outputs the position information to the control unit 8.
  • the storage unit 5 stores an operation table in which a command group for operating the operation target device 100 is set for each number of fingers touching the touch display 1.
  • The communication unit 6, under the control of the control unit 8, communicates with the operation target device 100 to transmit commands for operating it and to receive information to be displayed on the touch display 1. The communication may be wireless or wired.
  • The audio output unit 7 sounds a buzzer or other audio under the control of the control unit 8.
  • The control unit 8 mainly acquires the number of contacting fingers and their contact positions from the touch pad control unit 4, selects from the storage unit 5 a command for operating the operation target device 100 based on the acquired information, and transmits the selected command from the communication unit 6 to the operation target device 100.
  • FIG. 2 shows the relationship between the touch display 1 and the user's hand 10 during a touch operation.
  • In this example, the user interface device is applied to an in-vehicle audio device (operation target device 100), and the touch display 1 is installed, for example, on a dashboard in the vehicle interior.
  • A driver (user) is on the right side of the touch display 1 and performs touch operations with the left hand 10.
  • the user operates the operation target device 100 by bringing a finger into contact with the touch display 1 and changing the contact state of the finger, that is, by performing a touch operation.
  • FIG. 3 is a flowchart showing processing when the finger touches the touch display 1
  • FIG. 4 is a flowchart showing processing when the finger contact state changes.
  • the user interface device waits for any of the fingers 11 to 15 to touch the touch display 1 (step ST1).
  • When a finger touches, the touch display 1 detects the contact (step ST1 “YES”), and the touch pad control unit 4 notifies the control unit 8 of the contact information (for example, the number of fingers and their positions) (step ST2).
  • The control unit 8 recognizes the number of contacting fingers from the information received from the touch pad control unit 4 (step ST3), transitions to one of steps ST4 to ST9, and acquires from the storage unit 5 the operation table corresponding to the number of contacting fingers.
  • The control unit 8 also identifies each contacting finger, assigning it an identification number.
  • FIG. 5 shows an example of the operation table stored in the storage unit 5.
  • An operation table is set for each number of fingers touching the touch display 1, and a command group for operating the operation target device 100 is set in each operation table.
  • For example, when three fingers are in contact, the control unit 8 transitions to step ST6 and acquires from the storage unit 5 the operation table for three fingers, surrounded by a broken line in FIG. 5.
  • In the operation table for three fingers, a command group “volume up, volume down, MUTE, continuous volume up, continuous volume down” for volume control of the in-vehicle audio device is set.
  • Similarly, in the operation table for four fingers, a command group “Forward, Back, Fast Forward, Fast Back” for channel change operations of the TV tuner is set.
  • “-” in FIG. 5 indicates that no command can be set there because the corresponding touch operation cannot be performed with the contacting fingers alone.
  • “Not set” indicates that no command is assigned in this example.
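The operation tables described above (FIG. 5) can be pictured as simple lookup structures keyed by the number of contacting fingers. The sketch below is illustrative only: the table layout, command keys, and helper function are assumptions based on the volume and channel examples in the text, not part of the patent.

```python
# Hypothetical sketch of the per-finger-count operation tables (FIG. 5).
# Each table maps (touch operation, finger identification number) to a
# command for the operation target device 100.

OPERATION_TABLES = {
    # Three contacting fingers: volume control of the in-vehicle audio device.
    3: {
        ("tap", 1): "volume up",
        ("tap", 2): "volume down",
        ("tap", 3): "MUTE",
        ("swipe", 1): "continuous volume up",
        ("swipe", 2): "continuous volume down",
    },
    # Four contacting fingers: channel operations of the TV tuner.
    4: {
        ("tap", 1): "Forward",
        ("tap", 2): "Back",
        ("swipe", 1): "Fast Forward",
        ("swipe", 2): "Fast Back",
    },
}

def get_operation_table(finger_count):
    """Control unit 8 acquiring a table from storage unit 5 (steps ST4-ST8).

    Returns None when no table exists for the recognized count, which
    corresponds to the invalid-operation branch (step ST9).
    """
    return OPERATION_TABLES.get(finger_count)
```

Unset entries (“-” or “Not set” in FIG. 5) are simply absent keys, so lookups for them fail cleanly.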
  • In step ST3, when the recognized number of contacting fingers exceeds the designed number (here, six or more), the process proceeds to step ST9.
  • The processing in step ST9 is not particularly specified.
  • For example, in order to notify the user that the operation is invalid, the control unit 8 may instruct the audio output unit 7 to sound a buzzer or the like.
  • Alternatively, the control unit 8 may request ringing (or a screen display) from the operation target device 100 via the communication unit 6. In this way, the user can be made aware that an abnormal touch operation was performed on the touch display 1 and be prompted to redo the touch operation.
  • Since fingers are still in contact at this point, in order to eliminate any influence on the next touch operation, the process may, for example, return to step ST1 only after it is detected in step ST9 that all contacting fingers have been removed.
  • After acquiring the operation table, the control unit 8 shifts to the contact-finger state change detection process (step ST10).
  • In step ST11 shown in FIG. 4, the control unit 8 starts a timer that counts the valid time (for example, 5 seconds) for detecting a change in the state of the contacting fingers. This is because it is not realistic to wait indefinitely for the user's fingers to perform a touch operation after contact.
  • the control unit 8 determines whether or not there is a change in the state of the contact finger based on the contact information input from the touch pad control unit 4 (step ST12). If there is no change, the process proceeds to step ST13, and if there is a change, the process proceeds to step ST14.
  • step ST13 the control unit 8 confirms the value of the timer, and if timed out, the process ends.
  • The state change referred to here is a touch operation: for example, a tap by an arbitrary one of the contacting fingers (the finger is lifted from the touch display 1 once and touches it again within a short time, for example within 0.5 seconds), or a swipe (sliding in an arbitrary direction while keeping the finger in contact with the touch display 1). The remaining contacting fingers are kept in contact without being moved.
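The tap and swipe definitions above can be sketched as a small classifier over touch events. This is a hedged illustration, not the patent's implementation: the event format and the swipe distance threshold are assumptions; only the 0.5-second re-touch window comes from the text.

```python
TAP_RETOUCH_WINDOW_S = 0.5    # re-touch window for a tap (from the text)
SWIPE_MIN_DISTANCE_PX = 10    # assumed minimum slide distance for a swipe

def classify_state_change(events):
    """Classify one finger's state change as 'tap', 'swipe', or None.

    events: chronological list of (time_s, kind, x, y) tuples, with kind
    in {'down', 'move', 'up'}, as a touch pad control unit might report.
    """
    lift_time = None
    start = None
    for t, kind, x, y in events:
        if kind == "down":
            # A lift followed by a quick re-touch counts as a tap.
            if lift_time is not None and t - lift_time <= TAP_RETOUCH_WINDOW_S:
                return "tap"
            start = (x, y)
        elif kind == "up":
            lift_time = t
        elif kind == "move" and start is not None:
            # Sliding while staying in contact counts as a swipe.
            dx, dy = x - start[0], y - start[1]
            if (dx * dx + dy * dy) ** 0.5 >= SWIPE_MIN_DISTANCE_PX:
                return "swipe"
    return None  # no recognized state change yet (the step ST13 timer may expire)
```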
  • The control unit 8 detects a change in the state of the contacting fingers from the contact information supplied by the touch pad control unit 4 (step ST14), and determines whether the number of contacting fingers changed before and after the detection (step ST15). If the number of fingers in contact with the touch display 1 has changed (step ST15 “YES”), the control unit 8 judges that the user intends a different operation and, in order to re-acquire the operation table, the process proceeds to step ST2 in FIG. 3. On the other hand, if the number of contacting fingers is the same before and after the state change (step ST15 “NO”), the control unit 8 transitions to step ST16 and determines whether the state change detected in step ST14 is a tap or a swipe.
  • When the state change of the contacting finger is a tap (step ST16 “YES”), the control unit 8 selects the command corresponding to the tapped finger from the acquired operation table and requests the communication unit 6 to transmit the selected command to the corresponding operation target device 100 (step ST17). For example, when the finger 13 with identification number 2 in FIG. 2 taps, the control unit 8 selects the “tap 2” command “volume down” from the three-finger operation table shown in FIG. 5 and performs the volume operation of the in-vehicle audio device.
  • When the state change is a swipe (step ST16 “NO”), the control unit 8 selects the command corresponding to the swiping finger from the acquired operation table and requests the communication unit 6 to transmit the selected command to the corresponding operation target device 100 (step ST18). For example, when the finger with identification number 2 swipes, the control unit 8 selects the “swipe 2” command “continuous volume down” from the three-finger operation table shown in FIG. 5 and performs the volume control of the in-vehicle audio device.
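The selection in steps ST16 to ST18 amounts to a lookup keyed by the detected touch operation and the identification number of the finger that performed it. A minimal, self-contained sketch with assumed names follows; the three-finger table is abbreviated.

```python
# Abbreviated three-finger table; "-" / "Not set" entries have no key.
THREE_FINGER_TABLE = {
    ("tap", 2): "volume down",
    ("swipe", 2): "continuous volume down",
}

def select_command(table, touch_operation, finger_id):
    """Steps ST17/ST18: pick the command for the finger that tapped or swiped.

    Returns None for unset entries, in which case nothing is transmitted
    to the operation target device.
    """
    return table.get((touch_operation, finger_id))

# The example from the text: the finger with identification number 2
# swipes while three fingers are in contact.
command = select_command(THREE_FINGER_TABLE, "swipe", 2)
```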
  • In the above example, the volume operation changes depending on which contacting finger swipes, but the present invention is not limited to this; for example, the volume operation may change according to the swipe direction, or the volume may be changed continuously according to the swipe amount.
  • “Tap 1, 2” in FIG. 5 indicates that both contacting fingers with identification numbers 1 and 2 are tapped.
  • The operation table stored in the storage unit 5 may have its commands changed according to the user's preference.
  • The commands may also be customized by voice recognition, for example.
  • In this example, identification numbers are assigned to contacting fingers in order from the bottom of the touch display 1, but they may conversely be assigned in order from the top, or settings such as assigning identification numbers from the right to the left of the touch display 1 are also possible.
  • As described above, according to Embodiment 1, the user interface device includes the touch display 1, which detects finger contact and outputs the number of touching fingers and their positions; the storage unit 5, which stores, for each number of fingers, an operation table in which a command group for operating the operation target device 100 is set; and the control unit 8, which recognizes, based on the output of the touch display 1, the number of touching fingers and the touch operation performed with them, acquires from the storage unit 5 the operation table corresponding to that number of fingers, and selects, from the command group set in the operation table, the command corresponding to the executed touch operation. By combining the action of touching with a number of fingers and the action of performing a touch operation with any of those fingers, more commands than the number of fingers can be selected. It is therefore possible to provide a user interface device that allows the user to perform touch operations without diverting his or her line of sight, using the existing configuration.
  • Since the control unit 8 recognizes a tap or a swipe of a finger in contact with the touch display 1 as the touch operation, the command can be changed according to whether a tap or a swipe is performed, making it possible to select more types of commands.
  • Although tap and swipe touch operations are illustrated here, other touch operations can also be recognized, and still more types of commands can be selected by combining different touch operations.
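A quick count shows why this combination yields more commands than fingers alone: with n contacting fingers and g recognized touch operations, up to n × g single-finger commands are addressable per table, before even counting multi-finger taps such as “tap 1, 2”. The function below is purely illustrative and not from the patent.

```python
def single_finger_command_slots(n_fingers, n_gestures=2):
    """Upper bound on single-finger command slots per operation table
    (e.g. n_gestures=2 for tap and swipe)."""
    return n_fingers * n_gestures

# Three contacting fingers with tap and swipe already give six slots,
# versus three if only the finger count were used.
```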
  • FIG. 6 is a block diagram showing the configuration of the user interface device according to Embodiment 2; parts that are the same as or correspond to those in FIG. 1 are given the same reference numerals, and their description is omitted.
  • The user interface device according to Embodiment 2 is mounted on a vehicle and additionally includes a vehicle speed acquisition unit 30 that acquires the vehicle speed from the vehicle side, a mode switching unit 31 that switches the touch operation mode, and a button operation mode control unit 32 that displays buttons on the touch display 1.
  • When the vehicle is stopped, the mode switching unit 31 switches to the button operation mode of the button operation mode control unit 32, providing a user interface in which the user operates the operation target device 100 (shown in FIG. 1) by touch operations on the buttons displayed on the touch display 1.
  • For example, the mode switching unit 31 may acquire parking brake information and determine that the vehicle is stopped when the parking brake is applied and the vehicle speed is less than 10 km/h.
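The stop determination can be sketched as a small predicate. The 10 km/h threshold and the parking brake condition come from the text; the function and constant names are assumptions.

```python
SPEED_THRESHOLD_KMH = 10.0  # threshold mentioned in the text

def vehicle_is_stopped(speed_kmh, parking_brake_on):
    """Mode switching unit 31's example stop check: parking brake applied
    and vehicle speed below 10 km/h."""
    return parking_brake_on and speed_kmh < SPEED_THRESHOLD_KMH
```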
  • A command for operating the operation target device 100 and its button are stored in the storage unit 5 in association with each other; when the mode is switched to the button operation mode, the button operation mode control unit 32 reads the button data and outputs it to the display control unit 3.
  • The display control unit 3 displays the buttons on the touch display 1, and the touch pad control unit 4 detects the contact information for a touch operation on a button and notifies the button operation mode control unit 32.
  • The button operation mode control unit 32 then selects the command corresponding to the button from the storage unit 5 and transmits it to the operation target device 100 via the communication unit 6.
  • In the button operation mode, not only a button operation with one finger but also, for example, an enlargement/reduction operation with two fingers may be accepted (enlarging an image by sliding two fingers touching the touch display 1 apart, or reducing the image by sliding them toward each other).
  • Otherwise, the mode switching unit 31 switches to the operation mode of the control unit 8 (hereinafter referred to as the blind operation mode), and the user interface of Embodiment 1 is used.
  • Switching between the blind operation mode and the button operation mode may also be performed by a method other than vehicle-speed-based switching.
  • a photoelectric sensor 33 that emits infrared rays or the like is provided at the right end of the touch display 1 so that it can be determined whether or not the user on the left side (passenger seat) of the touch display 1 is performing a touch operation.
  • The photoelectric sensor 33 detects the reflected infrared rays, determines that a touch operation by the passenger-seat user is being performed, and notifies the mode switching unit 31.
  • According to the determination result of the photoelectric sensor 33, the mode switching unit 31 switches to the button operation mode of the button operation mode control unit 32 when the passenger-seat user is performing a touch operation, and otherwise (for example, when the driver-seat user is performing a touch operation) switches to the blind operation mode of the control unit 8.
  • the passenger in the front passenger seat can perform a touch operation while visually observing the buttons displayed on the touch display 1, so that an efficient operation is possible.
  • the user at the driver's seat can perform a touch operation without observing the touch display 1, which is highly convenient.
  • The user determination unit, which determines whether a touch operation is being performed by the passenger-seat user or by the driver-seat user, may be configured with something other than the photoelectric sensor 33.
  • a hard button may be provided on the touch display 1 and the user may be determined depending on whether or not the button is pressed.
  • Alternatively, when the control unit 8 acquires an operation table corresponding to the number of contacting fingers (step ST3 in FIG. 3), the mode may be determined by that number: with three to five contacting fingers, the process proceeds to steps ST4 to ST6 and the blind operation mode is performed, while with one or two contacting fingers the mode is switched to the button operation mode of the button operation mode control unit 32.
  • In this case, the switching from the blind operation mode to the button operation mode is set at two or fewer contacting fingers.
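The finger-count-based switching just described can be sketched as follows; the mode strings and boundary handling are assumptions consistent with the text (one or two fingers select the button operation mode, three to five select the blind operation mode of Embodiment 1).

```python
def select_mode_by_finger_count(n_fingers):
    """Mode choice at step ST3 in the variant described above."""
    if 1 <= n_fingers <= 2:
        return "button operation mode"
    if 3 <= n_fingers <= 5:
        return "blind operation mode"
    return None  # outside the designed range (cf. step ST9)
```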
  • Within the scope of the present invention, the embodiments may be freely combined, and any component of each embodiment may be modified or omitted.
  • As described above, since the user can perform a touch operation without directing his or her line of sight, the user interface device according to the present invention is suitable for operating in-vehicle operation target devices.
  • 1 touch display, 2 operation control device, 3 display control unit, 4 touch pad control unit, 5 storage unit, 6 communication unit, 7 audio output unit, 8 control unit, 10 hand, 11-15 fingers, 21-23 regions, 30 vehicle speed acquisition unit, 31 mode switching unit, 32 button operation mode control unit, 33 photoelectric sensor (user determination unit), 100 operation target device.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

In the present invention, a user touches a touch display (1) with fingers and operates an operation target device (100) by executing a touch operation with the touching fingers. A storage unit (5) holds, for each number of fingers touching the touch display (1), an operation table in which a command group for operating the operation target device (100) is set. A control unit (8) acquires from the storage unit (5) the operation table corresponding to the number of fingers touching the touch display (1), selects from the command group in the operation table the command corresponding to the touch operation executed by the touching fingers, and transmits the command to the operation target device (100) via a communication unit (6).

Description

User interface device
The present invention relates to a user interface device using a touch pad.
For example, in the vehicle information display device of Patent Document 1, a separate panel unit (touch pad) configured separately from the display unit is attached to the steering wheel, and the driver touches the separate panel unit with different numbers of fingertips, whereby a different input screen corresponding to the number of contacts is displayed on the display unit and operations on that input screen can be performed.
Also, for example, in the user interface device of Patent Document 2, the operator's hand shape and fingertip positions on the touch panel display are detected, and the display position of each user interface component image is made to follow the movement of the corresponding fingertip, whereby a user interface component image is assigned to every finger and the operation corresponding to each component image can be performed.
JP 2011-149749 A; JP 2010-9311 A
In the case of Patent Document 1, since the number of touched fingers is uniquely associated with an operation, there is a problem that the functions executable by this operation are limited by the number of fingers. For this reason, to execute a function at a level deeper than the input screen displayed by this operation, the user has to operate while viewing that input screen.
Moreover, in the case of Patent Document 2, a camera for detecting the hand shape and fingertip positions, together with recognition units for them, must be added, so the configuration becomes complicated.
The present invention has been made to solve the above problems, and its object is to provide a user interface device that allows a user to perform touch operations without diverting his or her line of sight, using the existing configuration.
The user interface device of the present invention includes: a touch pad that detects finger contact and outputs the number of touching fingers and their positions; a storage unit that holds, for each number of fingers, an operation table in which a command group for operating the operation target device is set; and a control unit that recognizes, based on the output of the touch pad, the number of touching fingers and the touch operation (tap or swipe) performed with them, acquires from the storage unit the operation table corresponding to that number of fingers, and selects, from the command group set in the operation table, the command corresponding to the touch operation executed with the touching fingers.
According to the present invention, by combining the action of touching the touch pad with a number of fingers and the action of executing a touch operation with the touching fingers, more commands than the number of fingers can be selected. It is therefore possible to provide a user interface device that allows the user to perform touch operations without directing his or her line of sight to the touch pad, using the existing configuration.
FIG. 1 is a block diagram showing the configuration of the user interface device according to Embodiment 1 of the invention. FIG. 2 is a diagram showing the relationship between the touch display and the user's hand during a touch operation in the user interface device according to Embodiment 1. FIG. 3 is a flowchart showing processing at the time of finger contact in the user interface device according to Embodiment 1. FIG. 4 is a flowchart showing processing when the finger contact state changes in the user interface device according to Embodiment 1. FIG. 5 is a diagram showing an example of an operation table stored in the storage unit of the user interface device according to Embodiment 1. FIG. 6 is a block diagram showing the configuration of the user interface device according to Embodiment 2 of the invention.
Hereinafter, in order to explain the present invention in more detail, modes for carrying out the invention will be described with reference to the accompanying drawings.
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of the user interface device according to Embodiment 1. This user interface device receives from the user operations for controlling devices other than itself (for example, an operation target device 100 such as an audio device, a television tuner, a disc playback device, or a navigation device), and is composed of the touch display 1 and the operation control device 2.
The touch display 1 integrates a touch pad that can detect contact of the user's fingers at multiple points with a display (for example, a liquid crystal display) that shows the output of the operation target device 100 on its screen. However, if the operation target device 100 has a display with the same function as the touch display 1, the display of the touch display 1 can be omitted.
The operation control device 2 includes: a display control unit 3 connected to the touch display 1 to control the display; a touch pad control unit 4 connected to the touch display 1 to control the touch pad; a storage unit 5 composed of a memory, a hard disk, and the like; a communication unit 6 that communicates with the operation target device 100; an audio output unit 7 including a speaker; and a control unit 8 including a memory that stores a program describing the processing detailed later and a CPU (Central Processing Unit) that executes the program.
The display control unit 3 displays (draws) images, characters, and the like output from the operation target device 100 on the touch display 1 under the control of the control unit 8. The touch pad control unit 4 acquires finger contact information from the touch display 1, converts it into position information, and outputs the position information to the control unit 8.
The storage unit 5 stores an operation table in which a command group for operating the operation target device 100 is set for each number of fingers touching the touch display 1.
The communication unit 6, under the control of the control unit 8, communicates with the operation target device 100, transmitting commands for operating it and receiving information to be displayed on the touch display 1. The communication may be either wireless or wired.
The sound output unit 7 emits sounds such as a buzzer tone under the control of the control unit 8.
The control unit 8 mainly acquires the number of contacting fingers and their contact positions from the touch pad control unit 4, selects from the storage unit 5 a command for operating the operation target device 100 based on the acquired information, and transmits the selected command to the operation target device 100 via the communication unit 6.
Next, the basic operation of the user interface device will be described.
FIG. 2 shows the relationship between the touch display 1 and the user's hand 10 during a touch operation. This example considers the user interface device applied to an in-vehicle audio device (operation target device 100), with the touch display 1 installed, for example, in the dashboard of the vehicle cabin. The driver (user) sits to the right of the touch display 1 and performs touch operations with the left hand 10. The user operates the operation target device 100 by bringing fingers into contact with the touch display 1 and changing their contact state, that is, by performing a touch operation.
FIG. 3 is a flowchart showing processing when the finger touches the touch display 1, and FIG. 4 is a flowchart showing processing when the finger contact state changes.
The user interface device waits for any of the fingers 11 to 15 to touch the touch display 1 (step ST1). When any of the fingers 11 to 15 touches the touch display 1, the touch display 1 detects the contact (step ST1 "YES"), and the touch pad control unit 4 notifies the control unit 8 of the contact information (for example, the number of fingers and their positions) (step ST2). For example, in FIG. 2, three fingers 12 to 14 are touching the touch display 1 in the areas 21 to 23.
 Based on the information received from the touch pad control unit 4, the control unit 8 recognizes the number of contacting fingers (step ST3), proceeds to one of steps ST4 to ST9, and acquires from the storage unit 5 the operation table corresponding to that number. The control unit 8 also identifies each finger from the relative positions of the contacting fingers and assigns it an identification number. For example, in FIG. 2, the user on the right side of the touch display 1 extends the left hand 10 to perform the touch operation, so in most cases the fingers 11 to 15 contact the touch display 1 arranged in the vertical direction. Accordingly, the control unit 8 assigns, from the bottom upward, identification number 1 to the finger 14 (area 21), identification number 2 to the finger 13 (area 22), and identification number 3 to the finger 12 (area 23).
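The identification-number assignment described above amounts to sorting the contact points vertically. A minimal sketch, assuming ordinary screen coordinates in which y increases downward (so the lowest finger on the display has the largest y); the function name and interface are assumptions of this sketch, not part of the publication:

```python
def assign_ids(contacts):
    """Assign identification numbers to contact points, lowest finger first.

    contacts: list of (x, y) tuples; screen y is assumed to increase
    downward, so the lowest finger on the display has the largest y.
    Returns a dict mapping identification number (1, 2, ...) to point.
    """
    ordered = sorted(contacts, key=lambda p: p[1], reverse=True)
    return {i + 1: point for i, point in enumerate(ordered)}

# Fingers 14, 13, 12 touching areas 21-23 of FIG. 2, bottom to top:
ids = assign_ids([(40, 120), (42, 200), (38, 280)])
```

Assigning from the top instead, as the text later allows, would simply drop `reverse=True`.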
Here, FIG. 5 shows an example of the operation tables stored in the storage unit 5. An operation table is set for each number of fingers touching the touch display 1, and each operation table holds a group of commands for operating the operation target device 100. For example, since three fingers are in contact in FIG. 2, the control unit 8 proceeds to step ST6 and acquires from the storage unit 5 the three-finger operation table enclosed by the broken line in FIG. 5. The three-finger table holds, for example, the command group "volume up, volume down, MUTE, continuous volume up, continuous volume down" for volume operation of the in-vehicle audio device.
Similarly, the four-finger operation table holds, for example, the command group "Forward, Back, Fast Forward, Fast Back" for channel change operations of a television tuner.
Further, "-" in FIG. 5 indicates that no command can be set there, because the corresponding touch operation cannot be performed with that number of contacting fingers. "Not set" indicates that no command happens to be assigned in this example.
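The per-finger-count operation tables of FIG. 5 can be modelled as nested dictionaries. This is a minimal sketch: only the entries named in the text are reproduced, and the gesture key names ("tap2", "swipe2", ...) as well as which gesture maps to which command (beyond tap 2 and swipe 2) are assumptions:

```python
# One operation table per number of contacting fingers, as in FIG. 5.
# Gesture key names and most gesture-to-command assignments are assumed.
OPERATION_TABLES = {
    3: {  # volume operation of the in-vehicle audio device
        "tap1": "volume up",
        "tap2": "volume down",
        "tap1,2": "MUTE",
        "swipe1": "continuous volume up",
        "swipe2": "continuous volume down",
    },
    4: {  # channel change operation of the television tuner
        "tap1": "Forward",
        "tap2": "Back",
        "swipe1": "Fast Forward",
        "swipe2": "Fast Back",
    },
}

def select_command(finger_count, gesture):
    """Return the command for a gesture, or None if unset ("-"/"Not set")."""
    return OPERATION_TABLES.get(finger_count, {}).get(gesture)
```

A missing table or a missing gesture entry both come back as `None`, mirroring the "-" and "Not set" cells of FIG. 5.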
In step ST3, if more fingers than the designed maximum are recognized (here, six or more), the process proceeds to step ST9.
The processing in step ST9 is not specifically prescribed; for example, to notify the user that the operation is invalid, the control unit 8 may instruct the sound output unit 7 to sound a buzzer or the like. Alternatively, the control unit 8 may request the operation target device 100 or another device, via the communication unit 6, to sound an alert (or show a screen indication). In this way the user can be expected to recognize that an abnormal touch operation was performed on the touch display 1 and to be prompted to redo it.
 In this case the fingers are still in contact, so to eliminate any influence on the next touch operation, the flow may be modified to return to step ST1 only after detecting in step ST9 that all contacting fingers have been removed.
 After acquiring the operation table, the control unit 8 proceeds to the contact-finger state change detection process (step ST10).
 In step ST11 shown in FIG. 4, the control unit 8 uses a state-change-wait timer to measure the valid period (for example, 5 seconds) for detecting a change in the state of the contacting fingers. This is because it is not realistic to wait indefinitely for the user to perform a touch operation after contact. After the timer starts, the control unit 8 determines, based on the contact information input from the touch pad control unit 4, whether the state of the contacting fingers has changed (step ST12). If there is no change, the process proceeds to step ST13; if there is a change, it proceeds to step ST14. In step ST13, the control unit 8 checks the timer value and, if it has timed out, ends the process.
 The state change referred to here is a touch operation: for example, a tap by any of the contacting fingers (the finger is lifted from the touch display 1 and then touched again within a short time, for example within 0.5 seconds) or a swipe (the finger slides in an arbitrary direction while remaining in contact with the touch display 1). The remaining contacting fingers are kept in contact without moving.
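A minimal sketch of the tap/swipe discrimination described above. The 0.5-second tap window comes from the text; the 10-pixel swipe threshold, the function name, and the interface are assumptions:

```python
TAP_MAX_INTERVAL = 0.5   # seconds a finger may stay lifted to count as a tap
SWIPE_MIN_DISTANCE = 10  # pixels of slide to count as a swipe (assumed value)

def classify_change(lifted_time, slide_distance):
    """Classify a single finger's state change as 'tap', 'swipe', or None.

    lifted_time: seconds between lift-off and re-contact
                 (None if the finger never left the surface)
    slide_distance: pixels moved while staying in contact
    """
    if lifted_time is not None and lifted_time <= TAP_MAX_INTERVAL:
        return "tap"
    if lifted_time is None and slide_distance >= SWIPE_MIN_DISTANCE:
        return "swipe"
    return None  # e.g. re-contact too late, or negligible movement
```

A finger that re-contacts after more than 0.5 seconds, or that barely moves, produces no gesture and the timer loop of steps ST12/ST13 simply continues.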
 Based on the contact information from the touch pad control unit 4, the control unit 8 detects a change in the state of the contacting fingers (step ST14) and determines whether the number of contacting fingers changed before and after the detection (step ST15). If the number of fingers in contact with the touch display 1 has changed (step ST15 "YES"), the control unit 8 judges that the user intends a different operation and returns to step ST2 in FIG. 3 to reacquire the operation table. If, on the other hand, the number of contacting fingers is the same before and after the state change (step ST15 "NO"), the control unit 8 proceeds to step ST16 and determines whether the state change detected in step ST14 is a tap or a swipe.
When the state change of the contacting finger is a tap (step ST16 "YES"), the control unit 8 selects from the acquired operation table the command corresponding to the tapped finger and requests the communication unit 6 to transmit the selected command to the corresponding operation target device 100 (step ST17).
For example, when the finger 13 with identification number 2 in FIG. 2 taps, the control unit 8 selects the "tap 2" command "volume down" from the three-finger operation table shown in FIG. 5 and performs the volume operation of the in-vehicle audio device.
When the state change of the contacting finger is a swipe (step ST16 "NO"), the control unit 8 selects from the acquired operation table the command corresponding to the swiped finger and requests the communication unit 6 to transmit the selected command to the corresponding operation target device 100 (step ST18).
For example, when the finger 13 with identification number 2 in FIG. 2 swipes, the control unit 8 selects the "swipe 2" command "continuous volume down" from the three-finger operation table shown in FIG. 5 and performs the volume operation of the in-vehicle audio device. In the example of FIG. 5, which volume command is issued depends on which finger swipes, but the invention is not limited to this; for example, the direction of the swipe may determine whether the volume goes up or down, or the amount of the swipe may change the volume continuously.
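The variant mentioned above, in which the swipe direction chooses up/down and the swipe amount changes the volume continuously, could look like the following sketch. The scaling constant, the function name, and the screen-coordinate convention (y increasing downward, so an upward swipe has negative dy) are all assumptions:

```python
def volume_delta(swipe_dy, step_per_pixel=0.2):
    """Map a vertical swipe to a volume change.

    swipe_dy: vertical displacement in pixels; negative means an upward
    swipe (screen y is assumed to increase downward). An upward swipe
    raises the volume, a downward swipe lowers it, and the magnitude
    scales with the swipe distance. step_per_pixel is an assumed
    tuning constant.
    """
    return -swipe_dy * step_per_pixel

# a 50-pixel upward swipe raises the volume; a downward swipe lowers it
```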
 Here, tapping or swiping with only one contacting finger was illustrated, but multiple contacting fingers may tap or swipe together. For example, "tap 1, 2" in FIG. 5 indicates that the fingers with identification numbers 1 and 2 were both tapped.
 In this way, by combining the act of placing fingers on the display with the act of tapping or swiping some of those fingers, more commands can be selected than there are fingers. With the settings in FIG. 5, 31 kinds of commands can be selected with the five fingers of one hand, and logically up to 104 kinds of commands can be assigned to the five fingers of one hand. Moreover, because the user does not operate buttons displayed on the touch display 1 while looking at it, as in conventional devices, no buttons need to be displayed. This eliminates the complex processing and complex system configuration of Patent Document 2, in which buttons follow the position of the user's hand, and so reduces the cost of the interface device.
 Note that the commands in the operation tables stored in the storage unit 5 can be changed according to the user's preference; for example, commands may be customized by voice recognition. Also, in the description above, identification numbers were assigned starting from the lowest contacting finger on the touch display 1, but they may instead be assigned starting from the top. Furthermore, depending on the positional relationship between the touch display 1 and the user, identification numbers may be assigned from the right of the touch display 1 toward the left, and so on.
 As described above, according to Embodiment 1, the user interface device comprises: the touch display 1, which detects finger contact and outputs the number of contacting fingers and their positions; the storage unit 5, which holds, for each number of fingers, an operation table in which a group of commands for operating the operation target device 100 is set; and the control unit 8, which recognizes the number of contacting fingers and the touch operation performed by those fingers based on the output of the touch display 1, acquires from the storage unit 5 the operation table corresponding to that number, and selects, from the command group set in that table, the command corresponding to the touch operation performed. By combining the act of placing fingers on the display with the act of performing a touch operation with some of them, more commands can be selected than there are fingers. Therefore, a user interface device that lets the user perform touch operations without directing the line of sight toward it can be provided with the existing configuration.
Further, according to Embodiment 1, the control unit 8 recognizes a tap or swipe of a finger in contact with the touch display 1 as a touch operation by that finger, so the command can be varied according to the tap/swipe, making even more kinds of commands selectable.
 In the above description, tap and swipe touch operations were illustrated, but other touch operations can also be recognized, and combining different touch operations makes even more kinds of commands selectable.
Embodiment 2.
FIG. 6 is a block diagram showing the configuration of the user interface device according to Embodiment 2; parts identical or equivalent to those in FIG. 1 are given the same reference numerals, and their description is omitted.
The user interface device of Embodiment 2 is mounted in a vehicle and further comprises a vehicle speed acquisition unit 30 that acquires the vehicle speed from the vehicle, a mode switching unit 31 that switches the touch operation mode, and a button operation mode control unit 32 that displays buttons and the like on the touch display 1 and accepts touch operations on them.
When the vehicle speed acquired by the vehicle speed acquisition unit 30 indicates a stopped state (for example, less than 10 km/h), the mode switching unit 31 switches to the button operation mode of the button operation mode control unit 32, providing a user interface in which the user operates the operation target device 100 (shown in FIG. 1) by touching buttons displayed on the touch display 1.
The mode switching unit 31 may acquire parking brake information and determine that the vehicle is stopped when the parking brake is operating and the vehicle speed is less than 10 km / h.
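The speed-based mode decision, including the optional parking-brake condition just mentioned, can be sketched as follows. The 10 km/h threshold comes from the text; the function and mode names are assumptions:

```python
STOP_SPEED_KMH = 10.0  # speeds below this count as a stopped state

def select_mode(speed_kmh, parking_brake_on=None):
    """Return 'button' (buttons shown, operated by sight) or 'blind'.

    If parking_brake_on is given, the stricter rule from the text
    applies: the vehicle counts as stopped only when the parking brake
    is engaged AND the speed is below 10 km/h.
    """
    stopped = speed_kmh < STOP_SPEED_KMH
    if parking_brake_on is not None:
        stopped = stopped and parking_brake_on
    return "button" if stopped else "blind"
```

With the brake signal available, creeping at 5 km/h with the brake released still keeps the blind operation mode active.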
 For example, commands for operating the operation target device 100 and their buttons (image data and the like) are stored in the storage unit 5 in association with each other. When the mode switches to the button operation mode, the button operation mode control unit 32 reads out the buttons and outputs them to the display control unit 3. The display control unit 3 displays the buttons on the touch display 1, and when the touch pad control unit 4 detects contact information indicating that a button was touched, it notifies the button operation mode control unit 32. On receiving this notification, the button operation mode control unit 32 selects the command corresponding to that button from the storage unit 5 and transmits it to the operation target device 100 via the communication unit 6.
 In the button operation mode, not only button operation with one finger but also enlargement/reduction operations with two fingers may be supported (for example, sliding two fingers touching the touch display 1 apart to enlarge the image and sliding them together to shrink it).
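The two-finger enlarge/shrink decision reduces to comparing the distance between the two contact points before and after the slide. A minimal sketch (the function name and interface are assumptions):

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end):
    """Return the zoom factor implied by a two-finger pinch.

    Points are (x, y) tuples. A result > 1 means the fingers moved
    apart (enlarge the image); < 1 means they moved together (shrink).
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return dist(p1_end, p2_end) / dist(p1_start, p2_start)
```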
 On the other hand, when the vehicle speed acquired by the vehicle speed acquisition unit 30 indicates a traveling state (for example, 10 km/h or more), the mode switching unit 31 switches to the operation mode of the control unit 8 (hereinafter called the blind operation mode), that is, the user interface of Embodiment 1.
 With this configuration, while the vehicle is stopped the user can operate the buttons while looking at the touch display 1, allowing efficient operation. While driving, the user can perform touch operations without looking at the touch display 1, which is highly convenient for the driver.
Switching between the blind operation mode and the button operation mode may also be based on criteria other than vehicle speed.
For example, a photoelectric sensor 33 that emits infrared light or the like may be provided at the right edge of the touch display 1 so that it can be determined whether the user on the left side (passenger seat) of the touch display 1 is performing the touch operation. In this configuration, when the user's hand reflects the infrared light, the photoelectric sensor 33 detects the reflection, judges that the passenger-seat user is performing the touch operation, and notifies the mode switching unit 31.
 According to the determination result of the photoelectric sensor 33, the mode switching unit 31 switches to the button operation mode of the button operation mode control unit 32 when the passenger-seat user is performing the touch operation, and otherwise (for example, when the driver-seat user is performing the touch operation) switches to the blind operation mode of the control unit 8.
 With this configuration, the passenger-seat user can operate the buttons while looking at the touch display 1, allowing efficient operation, while the driver-seat user can perform touch operations without looking at the touch display 1, which is highly convenient.
 The user determination unit that distinguishes whether a touch operation comes from the passenger-seat user or the driver-seat user may also be implemented by something other than the photoelectric sensor 33. For example, a hardware button may be provided on the touch display 1, and the user may be determined according to whether it is pressed.
 Alternatively, when the control unit 8 acquires an operation table corresponding to the number of contacting fingers (step ST3 in FIG. 3), it may execute the blind operation mode via steps ST4 to ST6 when three to five fingers are in contact, and switch to the button operation mode of the button operation mode control unit 32 when one or two fingers are in contact.
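The finger-count dispatch described above can be sketched as follows. The threshold (blind mode from three fingers, i.e., N = 2) follows the text; the function and mode names are assumptions:

```python
BLIND_MIN_FINGERS = 3  # N = 2 in the text: one or two fingers -> button mode

def mode_for_fingers(count):
    """Dispatch on the number of contacting fingers (a step ST3 variant).

    One or two fingers keep the familiar button-touch / pinch behaviour;
    three to five fingers enter the blind operation mode; more than the
    designed maximum is rejected as invalid.
    """
    if count < BLIND_MIN_FINGERS:
        return "button"
    if count <= 5:
        return "blind"
    return "invalid"
```

Raising or lowering `BLIND_MIN_FINGERS` corresponds to choosing a different N, as the text allows.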
 With this configuration, the single-finger button-touch operation and the two-finger enlargement/reduction operation familiar from smartphones are preserved as-is, while the blind operation mode extends operation to three or more fingers; operation is thus extended without interfering with existing touch operations. In the description above, the threshold for switching from the blind operation mode to the button operation mode was set at two contacting fingers or fewer, but it is not limited to this and may be any number N.
 Within the scope of the invention, the embodiments may be freely combined, and any constituent element of the embodiments may be modified or omitted.
 As described above, the user interface device according to the invention allows the user to perform touch operations without directing the line of sight toward the display, and is therefore suitable for user interface devices that operate in-vehicle operation target devices.
 1 touch display, 2 operation control device, 3 display control unit, 4 touch pad control unit, 5 storage unit, 6 communication unit, 7 sound output unit, 8 control unit, 10 hand, 11-15 fingers, 21-23 areas, 30 vehicle speed acquisition unit, 31 mode switching unit, 32 button operation mode control unit, 33 photoelectric sensor (user determination unit), 100 operation target device.

Claims (6)

  1.  A user interface device comprising:
     a touchpad that detects finger contact and outputs the number of contacting fingers and their positions;
     a storage unit that holds, for each number of fingers, an operation table in which a group of commands for operating an operation target device is set; and
     a control unit that recognizes the number of contacting fingers and the touch operation performed by those fingers based on the output of the touchpad, acquires from the storage unit the operation table corresponding to the number of contacting fingers, and selects, from the command group set in that operation table, the command corresponding to the touch operation performed by the contacting fingers.
  2.  The user interface device according to claim 1, wherein the control unit recognizes a tap or a swipe performed by a finger in contact with the touchpad as the touch operation.
  3.  The user interface device according to claim 1, wherein the commands in the operation table are associated with identification numbers identifying the fingers that perform the touch operation, and
     the control unit assigns the identification numbers to the fingers based on the relative positions of the fingers contacting the touchpad, identifies the finger that performed the touch operation, and selects the command corresponding to that identification number from the operation table.
  4.  The user interface device according to claim 1, further comprising:
     a display provided over the touchpad; and
     a button operation mode control unit that displays buttons on the display and, on recognizing a finger contact on a button based on the output of the touchpad, selects the command corresponding to that button,
     wherein the control unit switches to command selection by the button operation mode control unit when the number of fingers contacting the touchpad is N or fewer.
  5.  The user interface device according to claim 1, further comprising:
     a display provided over the touchpad;
     a button operation mode control unit that displays buttons on the display and, on recognizing a finger contact on a button based on the output of the touchpad, selects the command corresponding to that button; and
     a vehicle speed acquisition unit that acquires the traveling speed of the vehicle in which the operation target device is mounted,
     wherein the control unit switches to command selection by the button operation mode control unit when the traveling speed acquired by the vehicle speed acquisition unit is equal to or less than a predetermined value.
  6.  The user interface device according to claim 1, further comprising:
     a display provided over the touchpad;
     a button operation mode control unit that displays buttons on the display and, on recognizing a finger contact on a button based on the output of the touchpad, selects the command corresponding to that button; and
     a user determination unit that detects that a passenger-seat user is operating the touchpad in the vehicle in which the operation target device is mounted,
     wherein the control unit switches to command selection by the button operation mode control unit when, based on the detection result of the user determination unit, the passenger-seat user is performing the operation.
PCT/JP2013/063570 2013-05-15 2013-05-15 User interface device WO2014184902A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/063570 WO2014184902A1 (en) 2013-05-15 2013-05-15 User interface device

Publications (1)

Publication Number Publication Date
WO2014184902A1 true WO2014184902A1 (en) 2014-11-20

Family

ID=51897917

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/063570 WO2014184902A1 (en) 2013-05-15 2013-05-15 User interface device

Country Status (1)

Country Link
WO (1) WO2014184902A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016119022A (en) * 2014-12-24 2016-06-30 カルソニックカンセイ株式会社 User interface device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007105457A1 (en) * 2006-02-23 2007-09-20 Pioneer Corporation Operation input device and navigation device
JP2010108273A (en) * 2008-10-30 2010-05-13 Sony Corp Information processor, information processing method and program
JP2010262400A (en) * 2009-04-30 2010-11-18 Denso Corp Onboard electronic device operation device
JP2013105395A (en) * 2011-11-15 2013-05-30 Sony Corp Information processing apparatus, information processing method, and program

Similar Documents

Publication Publication Date Title
CN107107841B (en) Information processing apparatus
US20180232057A1 (en) Information Processing Device
KR102029842B1 (en) System and control method for gesture recognition of vehicle
US20110169750A1 (en) Multi-touchpad multi-touch user interface
US11299045B2 (en) Method for operating a display arrangement of a motor vehicle, operator control device and motor vehicle
US9446712B2 (en) Motor vehicle comprising an electronic rear-view mirror
CN111319624B (en) System and method for initiating and executing automatic lane change maneuver
US10967737B2 (en) Input device for vehicle and input method
US20140281964A1 (en) Method and system for presenting guidance of gesture input on a touch pad
CN108108042B (en) Display device for vehicle and control method thereof
JP5750687B2 (en) Gesture input device for car navigation
US11144193B2 (en) Input device and input method
JP2004362429A (en) Command input device using touch panel display
KR100924532B1 (en) Apparatus and method for controlling external peripheral device using IR control signal of car video interface
JP2016029532A (en) User interface
JP2018501998A (en) System and method for controlling automotive equipment
JP5849597B2 (en) Vehicle control device
WO2014184902A1 (en) User interface device
JP5626259B2 (en) Image display device
JP6265839B2 (en) INPUT DISPLAY DEVICE, ELECTRONIC DEVICE, ICON DISPLAY METHOD, AND DISPLAY PROGRAM
JP2008120134A (en) Manual operation device
EP2757407B1 (en) Multiple-view display system with user recognition and operation method thereof
JP6390380B2 (en) Display operation device
JP6429699B2 (en) Vehicle input system
JP6315443B2 (en) Input device, input detection method for multi-touch operation, and input detection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 13884411; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 13884411; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: JP