WO2014112080A1 - Operation Device (操作装置) - Google Patents
- Publication number
- WO2014112080A1 (PCT/JP2013/050847)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- operation surface
- touch operation
- screen
- downward movement
- cursor
- Prior art date
Classifications
- G06F3/0485 — Scrolling or panning (interaction techniques based on graphical user interfaces [GUI])
- G06F3/03547 — Touch pads, in which fingers can move on a surface
- G06F3/0482 — Interaction with lists of selectable items, e.g. menus
- G06F3/0488 — Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- B60K35/00 — Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
- B60K35/10 — Input arrangements, i.e. from user to vehicle
- B60K35/29 — Instruments characterised by the way in which information is handled, e.g. showing information on plural displays
- B60K35/80 — Arrangements for controlling instruments
- B60K2360/1438 — Touch screens
- B60K2360/1442 — Emulation of input devices
- B60K2360/146 — Instrument input by gesture
- B60K2360/182 — Distributing information between displays
- B60K2360/55 — Remote control arrangements
Definitions
- This disclosure relates to an operation device.
- A touch-type input device is known that includes a touch panel, selects a selection item on an operation screen displayed on a display unit by a touch-panel selection operation, and performs an input operation for that selection item (see, for example, Patent Document 1).
- An object of the present disclosure is to provide an operation device with which a scroll operation and the like can be performed easily while saving space.
- The disclosed operation device includes a touch operation surface that is configured to be movable up and down and is provided with a sensor that outputs a signal representing finger contact, and a control device that, when the touch operation surface moves downward while no sensor signal is output (that is, without finger contact on the touch operation surface), executes at least one of scrolling the screen to be operated and sequentially moving the position of the cursor in that screen.
- FIG. 1 is a system diagram illustrating a schematic configuration of a vehicle operation device 1 according to an embodiment.
- FIG. 2 is a top view schematically showing a touch pad 10.
- FIG. 3 is a cross-sectional view schematically showing a cross section of the main part of the touch pad 10.
- FIG. 4 is a diagram illustrating an example of an operation screen displayed on a display 20.
- FIG. 5 is a diagram schematically showing the display 20 and the touch pad 10, conceptually illustrating an operation example of the absolute coordinate mode.
- A flowchart illustrates an example of processing executed by a display control unit 30.
- A flowchart illustrates another example of processing executed by the display control unit 30, and a diagram shows an example of a cursor sequential-movement mode.
- A diagram schematically illustrates an example of the hand state during a pressing operation at the right portion of the outer frame member 50 and during a pressing operation at the left portion of the outer frame member 50.
- FIG. 15 is a cross-sectional view schematically showing a cross section along line C-C in FIG. 14.
- FIG. 1 is a system diagram showing a schematic configuration of a vehicle operating device 1 according to an embodiment.
- FIG. 2 is a top view schematically showing the touch pad 10.
- FIG. 3 is a cross-sectional view schematically showing a cross section of the main part of the touch pad 10. FIG. 2 schematically shows a hand operating the coordinate detection unit 12 (touch operation surface) of the touch pad 10, but the hand is not shown in FIG. 3.
- FIG. 4 is a diagram illustrating an example of an operation screen displayed on the display 20.
- FIG. 5 is a diagram schematically showing the display 20 and the touch pad 10 and conceptually showing an operation example in the absolute coordinate mode.
- the vehicle operating device 1 includes a touch pad 10, a display 20, and a display control unit 30.
- the touch pad 10 is provided at an appropriate place in the vehicle interior.
- The touch pad 10 is preferably disposed at a position that is easy for the driver to operate (a position the driver's hand can reach while maintaining the driving posture).
- The touch pad 10 is typically disposed at a position where an operator reaches out a hand from the front side of the touch pad 10 to operate it.
- the touch pad 10 may be disposed, for example, on the console box or around it.
- the touch pad 10 includes a coordinate detection unit 12, a downward movement detection unit 14, a control unit 16, and a memory 18.
- the touch pad 10 includes an outer frame member 50, a stopper 52, an elastic member 54, and a substrate 60.
- the coordinate detection unit 12 includes a two-dimensional substantially flat operation surface (touch operation surface) on the surface side.
- An electrostatic sensor is provided on the touch operation surface.
- the output signal of the electrostatic sensor is sent to the control unit 16.
- the coordinate detection unit 12 is configured by, for example, an electrostatic pad.
- The electrostatic pad has, for example, a structure in which electrodes (electrostatic sensors) extend linearly in the X direction and the Y direction on a plane, with an insulator sandwiched between them.
- an electrode detection signal (a signal indicating the amount of change in the charge accumulated in the electrode) may be sent to the control unit 16.
- the coordinate detection unit 12 is configured to be movable in the vertical direction (Z direction in FIG. 3).
- a mechanism for enabling the coordinate detection unit 12 to move up and down may be arbitrary.
- the coordinate detection unit 12 is supported by the substrate 60 via the elastic member 54.
- the vertical movement stroke of the coordinate detection unit 12 may be arbitrary or may be minute.
- The outer frame member 50 is provided on the outer periphery of the coordinate detection unit 12. As shown in FIG. 3, the outer frame member 50 is preferably provided in a manner protruding upward from the touch operation surface of the coordinate detection unit 12. Further, the outer frame member 50 is preferably provided so as to protrude upward from the peripheral surface around the touch pad 10 (for example, when the touch pad 10 is mounted on the console box, the surface of the console box around the touch pad 10).
- the outer frame member 50 may be attached to the coordinate detection unit 12 so as to be integrated with the coordinate detection unit 12. For example, the outer frame member 50 may be coupled to the outer periphery of the coordinate detection unit 12 by fitting, may be coupled by adhesion, or may be coupled using a fastener such as a screw.
- The outer frame member 50 may be attached so as to rest on the touch operation surface (that is, so as to cover the outer periphery of the touch operation surface), or may be attached to the outer peripheral side portion (outer peripheral side surface) of the touch operation surface in a manner that does not cover any part of the touch operation surface.
- The outer frame member 50 may be formed of an arbitrary material, and may be made of a material different from that of the coordinate detection unit 12. Alternatively, the outer frame member 50 may be formed integrally with the coordinate detection unit 12. Because the outer frame member 50 is provided integrally with the coordinate detection unit 12, it is movable in the vertical direction together with the coordinate detection unit 12.
- the stopper 52 regulates the vertical movement stroke of the outer frame member 50 and the coordinate detection unit 12.
- the stopper 52 is provided below the outer frame member 50.
- the stopper 52 may be provided below the coordinate detection unit 12.
- the outer frame member 50 and the coordinate detection unit 12 are restricted from moving downward when the stopper 52 hits the upper surface of the substrate 60.
- There are a wide variety of stopper mechanisms, and other stopper mechanisms may be used.
- a stopper mechanism that restricts the upward movement of the outer frame member 50 and the coordinate detection unit 12 may be added.
- A guide mechanism for guiding the vertical movement of the outer frame member 50 and the coordinate detection unit 12 may be added. Further, a mechanism that transmits a click feeling (or vibration) to the outer frame member 50 and the coordinate detection unit 12 when they are moved downward by a predetermined amount or more may be added.
- the elastic member 54 may be composed of an arbitrary spring such as a leaf spring or a coil spring, or may be composed of rubber or a soft resin material. The elastic member 54 urges the coordinate detection unit 12 upward so that the coordinate detection unit 12 is maintained at the nominal height.
- the downward movement detection means 14 outputs a signal indicating the downward movement of the outer frame member 50 and the coordinate detection unit 12 of the touch pad 10.
- the downward movement detecting means 14 may be constituted by, for example, a tact switch, a pressure sensor (for example, a piezoelectric element), or the like.
- The downward movement detection means 14 may be disposed at any location as long as it comes into contact with the coordinate detection unit 12 or the outer frame member 50 when the operation surface of the coordinate detection unit 12 moves downward.
- In the illustrated example, the pressure sensor constituting the downward movement detection means 14 is installed below the center of the coordinate detection unit 12, but it may instead be installed around the coordinate detection unit 12 (that is, below the outer frame member 50). Further, a plurality of pressure sensors constituting the downward movement detection means 14 may be provided at dispersed positions.
- the control unit 16 and the memory 18 are configured by a microcomputer, for example.
- the control unit 16 and the memory 18 may be mounted on the substrate 60.
- Various functions (including functions described below) of the control unit 16 may be realized by arbitrary hardware, software, firmware, or a combination thereof.
- Any part or all of the functions of the control unit 16 may be realized by an ASIC (application-specific integrated circuit) or an FPGA (field-programmable gate array).
- the function of the control unit 16 may be realized in cooperation with a plurality of computers.
- the control unit 16 detects the touch of the finger on the touch operation surface based on the output (output signal) from the electrostatic sensor of the coordinate detection unit 12. At this time, the control unit 16 generates a coordinate signal representing a coordinate position in the touch operation surface, that is, a coordinate signal representing a coordinate position (touch position of the operation finger) touched by the operator.
- When the coordinate detection unit 12 is composed of an electrostatic pad, charge is stored in the capacitor formed by each electrode and the operation finger as described above, and the amount of charge change at each electrode differs depending on the position of the operation finger. Therefore, the position of the operation finger can be specified based on the detection signals from the electrodes.
- The control unit 16 detects finger contact with the touch operation surface when the output level from the coordinate detection unit 12 exceeds a predetermined reference value (detection threshold), and generates a coordinate signal based on the arrangement position of the electrode at which the level of the detection signal is maximum.
- the predetermined reference value is, for example, a value related to the amount of change in the charge accumulated on the electrode.
- The control unit 16 determines that a selection operation has been performed by the operator when the amount of change in the charge accumulated in an electrode (the maximum charge change amount) exceeds the reference value, and generates the coordinate signal (for example, a coordinate signal representing the two-dimensional position where the charge change amount is maximum).
- the reference value may be stored in the memory 18.
- the generated coordinate signal is transmitted to the display control unit 30.
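The detection logic above (contact detected when the output level exceeds the detection threshold; coordinate taken from the electrode with the maximum detection signal) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the grid layout, units, and threshold value are assumptions.

```python
# Hypothetical sketch of the coordinate-signal generation described above.
# The threshold value and electrode readings are illustrative assumptions.

DETECTION_THRESHOLD = 30  # reference value for charge change (arbitrary units)

def coordinate_signal(x_charges, y_charges, threshold=DETECTION_THRESHOLD):
    """Return (x, y) electrode indices of the touch, or None if no contact.

    x_charges / y_charges: per-electrode charge-change readings for the
    linear electrodes running in the X and Y directions.
    """
    max_x = max(x_charges)
    max_y = max(y_charges)
    # Finger contact is detected only when the strongest reading
    # exceeds the detection threshold.
    if max_x <= threshold or max_y <= threshold:
        return None
    # The coordinate signal is based on the arrangement positions of the
    # electrodes where the detection signal is maximal.
    return (x_charges.index(max_x), y_charges.index(max_y))
```

For example, `coordinate_signal([0, 5, 42, 7], [1, 50, 3, 2])` yields `(2, 1)`, while readings that stay below the threshold yield `None` (no coordinate signal is generated).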
- The control unit 16 generates a determination signal based on the output signal from the downward movement detection means 14. For example, when the downward movement detection means 14 is composed of a pressure sensor, the determination signal is generated when the output (pressing pressure) from the pressure sensor exceeds a predetermined threshold value Pn. When the downward movement detection means 14 is composed of a tact switch, the determination signal is generated when an ON signal is input from the tact switch. When a plurality of pressure sensors constituting the downward movement detection means 14 are provided, the control unit 16 may generate a determination signal when the output from any of the pressure sensors exceeds the predetermined threshold value Pn. The generated determination signal is transmitted to the display control unit 30.
- The determination signal may be a signal that indicates only that the outer frame member 50 and the coordinate detection unit 12 have moved downward, and that does not include other information such as the position of the pressing operation (whether the outer frame member 50 or the coordinate detection unit 12 was pressed). Similarly, when a plurality of tact switches constituting the downward movement detection means 14 are provided, the control unit 16 may generate a determination signal when an ON signal is input from any of the tact switches. The generated determination signal is transmitted to the display control unit 30.
- Since the outer frame member 50 is provided as described above, there are two cases even when the same determination signal is transmitted: the touch pad 10 may transmit the determination signal together with a coordinate signal, or it may transmit only the determination signal without a coordinate signal. That is, when the operator pushes the touch operation surface of the touch pad 10 downward with a finger, both a determination signal and a coordinate signal are generated; but when the operator pushes only the outer frame member 50 downward with a finger, without touching the touch operation surface (that is, without the finger contact exceeding the detection threshold), only a determination signal is generated and no coordinate signal is generated.
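The two transmission cases above can be sketched as a small classifier. This is an illustrative sketch only; the threshold value "Pn", the function name, and the signal representation are assumptions, not from the patent.

```python
# Illustrative sketch of the two transmission cases: pressing the touch
# operation surface yields a determination signal plus a coordinate signal,
# while pressing only the outer frame member yields a determination signal
# alone. Names and values are assumptions.

PRESS_THRESHOLD_PN = 50   # assumed pressure threshold "Pn"

def classify_press(pressure, touch_coordinate):
    """Return the signals the touch pad would transmit for one press.

    pressure: output of the downward-movement pressure sensor.
    touch_coordinate: (x, y) from the electrostatic sensor, or None when
    the finger does not touch the operation surface.
    """
    if pressure <= PRESS_THRESHOLD_PN:
        return {}  # no downward movement detected: nothing transmitted
    signals = {"determination": True}
    if touch_coordinate is not None:
        # Finger pressed the touch operation surface itself.
        signals["coordinate"] = touch_coordinate
    # Otherwise only the outer frame member 50 was pressed.
    return signals
```

A press on the surface, `classify_press(80, (3, 4))`, carries both signals; a press on the frame alone, `classify_press(80, None)`, carries only the determination signal.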
- the control unit 16 communicates with the display control unit 30 and transmits various information (such as a coordinate signal, a determination signal, and a message output request) to the display control unit 30.
- a part or all of the functions of the control unit 16 may be realized by the coordinate detection unit 12.
- the display 20 is disposed at a remote position with respect to the touch pad 10.
- the display 20 may be an arbitrary display device such as a liquid crystal display or a HUD (head-up display).
- the display 20 is disposed at an appropriate position (for example, an instrument panel) in the vehicle interior.
- The display 20 may be a touch panel display or a display that cannot be operated by touch.
- On the display 20, an operation screen (see FIG. 4) representing the content that can be operated with the touch pad 10 is displayed.
- an image of a TV, a peripheral monitoring camera, or the like may be displayed on the display 20.
- the operation screen may be displayed on the entire screen as shown in FIG. 4 or may be displayed on a part of the screen. As shown in FIG. 4, the operation screen may include two or more selection items that can be operated with the touch pad 10.
- The operation screen may include another information display section (for example, a section that displays TV, audio, outside air temperature, fuel consumption, or other travel and entertainment information). In the example shown in FIG. 4, the operation screen is for performing various radio operations with the touch pad 10.
- the selection item constitutes a virtual operation button (meaning that it is not a mechanical button that is directly operated by hand).
- the selection item (operation button) may relate to an arbitrary type (function). That is, the content that can be operated with the touchpad 10 may be arbitrary.
- the selection item may include a selection item for displaying (calling) a screen (operation screen) or a map screen (for example, current location display screen) for performing various settings of the navigation device on the display 20.
- the selection items may include selection items for performing various settings of the air conditioner and selection items for displaying the operation screen on the display 20.
- the selection items may include selection items for performing various settings of audio and TV (such as volume adjustment) and selection items for displaying the operation screen on the display 20.
- The selection item may be a selection item (icon, launcher) for starting an arbitrary application. Further, the selection item may be a character input button on an operation screen such as a Japanese kana (gojūon) input screen.
- the selection items may include scrolled lists 71a to 71f in a list display area (list screen) 90 as shown in FIG.
- The list scrolling mode may be arbitrary; for example, the list may scroll in a drum-type (rotating drum) manner that wraps around, or the scrolling may stop at both ends of the list.
- the lists 71a to 71f are selection items for setting radio frequencies, and represent the frequencies that can be set.
- the selection item may include a scroll button 72 for scrolling each list.
- the scroll button 72 includes a button 72a for scrolling the list upward and a button 72b for scrolling the list downward.
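The two scrolling modes mentioned above (drum-type wrap-around versus stopping at both ends) amount to a choice between modular and clamped index arithmetic. A minimal sketch, with illustrative names only:

```python
# Hedged sketch of the two list-scroll modes: a drum (wrap-around) scroll
# and a scroll that stops at both ends. Function name and the six-entry
# list length are illustrative assumptions.

def scroll(index, step, length, drum=True):
    """Return the new item index after scrolling by `step` items."""
    if drum:
        return (index + step) % length                 # rotating drum: wraps around
    return max(0, min(length - 1, index + step))       # clamps at both ends
```

With a six-item list (like the lists 71a to 71f), scrolling down from the last item with `scroll(5, 1, 6, drum=True)` wraps to index 0, while `scroll(5, 1, 6, drum=False)` stays at index 5.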
- The selection item is changed from the normal display to the selection display, or from the selection display to the normal display, based on the coordinate signal from the touch pad 10 under the control of the display control unit 30 described later.
- a cursor 80 that can be moved by an operation on the touch pad 10 is shown on the display 20.
- the cursor 80 is positioned on the selection item “SAT” related to satellite broadcasting, for example, and therefore the selection item “SAT” is selected and displayed.
- the cursor 80 represents the selection item that is selected and displayed. Therefore, the position of the cursor 80 corresponds to the position of the selection item that is selected and displayed.
- the display control unit 30 is configured by, for example, a microcomputer, like the control unit 16, and may be embodied as an ECU.
- the connection mode between the display control unit 30 and the touch pad 10 is arbitrary, and may be wired, wireless, or a combination thereof, or may be a direct connection or an indirect connection.
- Part or all of the functions of the display control unit 30 may be realized by the control unit 16 of the touch pad 10 or by a control unit (not shown) in the display 20; conversely, part or all of the functions of the control unit 16 of the touch pad 10 may be realized by the display control unit 30.
- Vehicle speed information indicating the vehicle speed, power supply information regarding the power supply state (IG, ACC) of the vehicle, and the like may be input to the display control unit 30 as necessary.
- a controller 40 is connected to the display control unit 30.
- the controller 40 controls a device operated via the vehicle operation device 1.
- the controller 40 may include an audio controller that controls the audio device, a navigation controller that controls the navigation device, a controller that controls the air conditioner, and the like.
- the display control unit 30 may realize part or all of the functions of the controller 40.
- The display control unit 30 assists the operation on the touch pad 10 by synchronizing the display 20 and the touch pad 10. Specifically, the display control unit 30 displays an operation screen (see FIG. 4) on the display 20, and performs selection processing and decision processing of various selection items based on signals (coordinate signals and determination signals) from the touch pad 10.
- When receiving a coordinate signal from the touch pad 10, the display control unit 30 selects and displays one of the selection items on the operation screen (that is, responds to a "selection operation"); in other words, it determines the position of the cursor 80. At this time, the display control unit 30 may operate in the absolute coordinate mode.
- The absolute coordinate mode refers to a mode in which the coordinate system of the screen of the display 20 is synchronized with the coordinate system of the operation surface of the touch pad 10 in an absolute manner, each position on the operation surface corresponding to a fixed position on the screen. In the absolute coordinate mode, typically, the origin of the coordinate system of the screen of the display 20 and the origin of the coordinate system of the operation surface of the touch pad 10 are each located at fixed positions.
- the correspondence between the coordinate systems may be proportional to the size ratio.
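The proportional correspondence described above reduces to scaling each pad coordinate by the screen-to-pad size ratio. A minimal sketch, assuming illustrative pad and screen dimensions (not from the patent):

```python
# Minimal sketch of the absolute coordinate mode: a pad coordinate maps
# to a screen coordinate proportionally to the size ratio of the two
# surfaces. Dimensions below are illustrative assumptions.

PAD_SIZE = (100.0, 60.0)      # touch operation surface (e.g. mm)
SCREEN_SIZE = (800.0, 480.0)  # display screen (e.g. px)

def pad_to_screen(x, y, pad=PAD_SIZE, screen=SCREEN_SIZE):
    """Map a touch position on the pad to an absolute screen position."""
    return (x * screen[0] / pad[0], y * screen[1] / pad[1])
```

Touching the center of the pad, `pad_to_screen(50.0, 30.0)`, always lands on the center of the screen, `(400.0, 240.0)`; this fixed point-to-point correspondence is what distinguishes the absolute mode from a relative (mouse-like) mode.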
- any one selection item may be selected and displayed by default, or any selection item may be non-selected.
- The selection display (that is, the mode of the cursor 80) is arbitrary as long as the operator can recognize that the selection item is selected. For example, it may be realized by making the brightness or color of the selected selection item different from the other selection items, or by highlighting the outer frame of the selection item.
- after the display control unit 30 selects and displays one arbitrary selection item (that is, while the cursor 80 is displayed), the coordinate signal from the touch pad 10 may subsequently be interrupted.
- in that case, the selection state of the selection item may be maintained, because the operator may continue the operation again after releasing the finger from the touch pad 10.
- the selection state of the selection item may be maintained until another selection item is selected according to a subsequent coordinate signal from the touch pad 10 (or until the operation screen is switched to another screen, or the operation screen is turned off).
- the selection state of the selection item may be maintained for a relatively long predetermined time after the coordinate signal from the touch pad 10 is interrupted.
- when receiving the determination signal from the touch pad 10, the display control unit 30 realizes the operation content of the selection item selected and displayed at that time (that is, it responds to the "decision operation" and realizes the decision function).
- this operation content depends on the selection item related to the decision operation, but it may involve a screen transition such as display of lower-level selection items or a change of the operation screen, input of characters, activation of an application, or transmission of a control signal to the controller 40.
- the display of the determined selection item may be changed appropriately, or a predetermined sound may be generated, in order to notify the user that the "decision operation" has been detected.
- in some cases the determination signal from the touch pad 10 is received together with a coordinate signal from the touch pad 10, and in other cases only the determination signal is received without a coordinate signal. That is, even for the same determination signal, the display control unit 30 may or may not receive a coordinate signal at the same time.
- the display control unit 30 may respond to the above-described determination operation when receiving the determination signal from the touch pad 10 together with the reception of the coordinate signal.
- when receiving the determination signal from the touch pad 10 without receiving a coordinate signal, the display control unit 30 may, instead of responding to the decision operation described above, realize the functions described below.
- alternatively, when the display control unit 30 receives the determination signal from the touch pad 10 without receiving a coordinate signal, it may realize the decision function described above under certain conditions, and the functions described below under other conditions.
- the display control unit 30 does not respond to the determination operation described above when receiving the determination signal from the touch pad 10 without receiving the coordinate signal.
- Such a configuration is also referred to as “a configuration in which a determination operation by the outer frame member 50 is not possible”, since the determination function is not realized by a pressing operation of only the outer frame member 50.
- when the display control unit 30 receives the determination signal from the touch pad 10 without receiving a coordinate signal, the display control unit 30 may realize a predetermined operation (different from the decision operation) according to the currently displayed operation screen.
- for example, when the currently displayed operation screen includes a list, the list may be scrolled regardless of the presence or absence of the cursor 80 (that is, responding to a "list scroll operation").
- when the currently displayed operation screen is a map screen, the map screen itself may be scrolled (that is, responding to a "map screen scroll operation").
- when the operation screen includes a plurality of pages (for example, in the case of a hierarchical operation screen), the operation screens may be switched sequentially regardless of the presence or absence of the cursor 80 (that is, responding to a "screen switching (page turning) operation"). Similarly, in the case of a screen displaying a book, the page screens of the book may be switched sequentially. Further, for example, in the case of a screen displaying a WEB site, the screens may be switched sequentially like the so-called "return"/"forward" functions (that is, responding to a "screen switching operation").
- the cursor 80 may be moved sequentially in a predetermined order over the operation items in the operation screen (that is, responding to a "cursor sequential movement operation").
- the sequential movement of the cursor 80 may be realized only within the operation screen currently displayed, or may be accompanied by switching of the operation screen. Note that these functions may be realized while the determination signal is continuously received from the touch pad 10 without receiving the coordinate signal (that is, while the outer frame member 50 is kept depressed).
- when the determination signal is no longer received, the display control unit 30 stops the scrolling and the like.
- when the display control unit 30 receives the determination signal from the touch pad 10 without receiving a coordinate signal, the display control unit 30 may selectively either respond to the decision operation or realize another function.
- Such a configuration is also referred to as “a configuration in which a determination operation by the outer frame member 50 can be performed” below, because the determination operation can be realized by operating only the outer frame member 50.
- the display control unit 30 scrolls the list on the operation screen when it receives the determination signal from the touch pad 10 without receiving a coordinate signal and a predetermined list scroll condition is satisfied (that is, it responds to a "list scroll operation").
- the predetermined list scroll condition may be, for example, a case where a selection item on the operation screen currently displayed includes a list.
- the predetermined list scroll condition may be, for example, a case where the selection item of the operation screen currently displayed includes a list and the cursor 80 is positioned in the list display area 90 (see FIG. 4).
- the predetermined list scroll condition may be, for example, a case where the selection items of the currently displayed operation screen include a list and the cursor 80 is not positioned on any selection item, or the cursor 80 is not displayed.
- the list display area 90 may be an area that does not include the scroll button 72.
- when the predetermined list scroll condition is not satisfied and the cursor 80 is positioned on a selection item, the display control unit 30 may respond to the decision operation and realize the function of that selection item.
- the display control unit 30 scrolls the map screen itself when it receives a determination signal from the touch pad 10 without receiving a coordinate signal and a predetermined map scroll condition is satisfied (that is, it responds to a "map screen scroll operation").
- the predetermined map scroll condition may be, for example, a case where the operation screen currently displayed includes a map screen.
- the predetermined map scroll condition may be a case where the currently displayed operation screen includes a map screen and the cursor 80 or the pointer is not displayed on the map screen. In the latter case, the display control unit 30 may respond to the determination operation when the predetermined map scroll condition is not satisfied and the cursor 80 or the pointer is positioned on the map screen.
- the display control unit 30 realizes screen switching when it receives a determination signal from the touch pad 10 without receiving a coordinate signal and a predetermined screen switching condition is satisfied (that is, it responds to a "screen switching operation").
- the screen switching includes switching of the operation screen and switching of the WEB screen.
- the predetermined screen switching condition may be, for example, when the cursor 80 or the pointer is not displayed.
- the display control unit 30 may respond to the determination operation when the predetermined screen switching condition is not satisfied and the cursor 80 or the pointer is positioned on the screen.
- the display control unit 30 moves the cursor 80 sequentially when it receives a determination signal from the touch pad 10 without receiving a coordinate signal and a predetermined cursor sequential movement condition is satisfied (that is, it responds to a "cursor sequential movement operation").
- the predetermined cursor sequential movement condition may be a case where the cursor 80 is not displayed.
- when the predetermined cursor sequential movement condition is not satisfied and the cursor 80 is positioned on a selection item, the display control unit 30 may respond to the decision operation and realize the function of that selection item.
- only one of list scroll, map screen scroll, screen switching, and cursor sequential movement may be realized, or any two or any three of them, or all of them. Moreover, when the conditions for the respective operations overlap, one operation (for example, an operation given a higher priority in advance) may be given priority.
- the list scroll may be realized when a predetermined list scroll condition and a predetermined cursor sequential movement condition are satisfied at the same time.
- the list scroll may be realized with priority over the screen switching operation and the cursor sequential movement.
- FIG. 6 is an explanatory diagram of an example of a method for determining the scroll direction at the time of scrolling the list, and is a diagram illustrating an example of an operation screen.
- the list display area 90 is virtually divided into an upper list display area 92 and a lower list display area 94.
- in the configuration in which a decision operation by the outer frame member 50 is not possible, for example, when the display control unit 30 receives the determination signal from the touch pad 10 without receiving a coordinate signal and the cursor 80 is positioned in the list display area 90, the display control unit 30 executes list scrolling. At this time, the display control unit 30 may execute list scrolling in the upward direction when the cursor 80 is positioned in the upper list display area 92, and in the downward direction when the cursor 80 is positioned in the lower list display area 94. In the example shown in FIG. 6, since the cursor 80 is located in the upper list display area 92, list scrolling is executed in the upward direction when the outer frame member 50 is pushed down.
- accordingly, an operator who wants to scroll the list in the list display area 90 upward can first operate the touch operation surface to move the cursor 80 to the upper list display area 92, and then push down the outer frame member 50.
- alternatively, an operator who wants to scroll the list in the list display area 90 could first operate the touch operation surface to move the cursor 80 onto the upward scroll button 72a, and then push down the touch operation surface.
- however, since the scroll button 72a is narrower than the upper list display area 92, it is relatively difficult to move the cursor 80 onto it.
- in this respect, since the scroll operation by the outer frame member 50 is possible, the scroll button 72 can be eliminated.
- the display control unit 30 may determine the scroll direction at the time of list scrolling according to the position of the cursor 80 in the operation screen. For example, when the cursor 80 is positioned in the upper half area of the operation screen, the display control unit 30 performs list scrolling upward in response to receiving a determination signal that does not involve reception of a coordinate signal, When 80 is located in the lower half area of the operation screen, the list scroll may be executed in a downward direction in response to reception of the determination signal not accompanied by reception of the coordinate signal.
- the list scroll is performed in the vertical direction, but the same applies to the list scroll in the horizontal direction.
- for horizontal scrolling, the list display area may be virtually divided into left and right, and the scroll direction may be determined depending on whether the cursor 80 is positioned in the left or the right area.
- when the cursor 80 is positioned in the left area, the list scroll may be executed in the left direction in response to reception of the determination signal not accompanied by reception of a coordinate signal.
- likewise, when the cursor 80 is positioned in the right area, the list scroll may be executed in the right direction in response to reception of the determination signal not accompanied by reception of a coordinate signal.
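The halving of the display area for determining the scroll direction (FIG. 6 and the horizontal variant above) can be sketched as follows; the function name, coordinate convention (origin at top-left), and dimensions are illustrative assumptions.

```python
def scroll_direction(cursor, area_size, axis="vertical"):
    """Decide the scroll direction from the cursor position by
    virtually halving the list display area: upper half scrolls up,
    lower half scrolls down; analogously left/right for a
    horizontally scrolled list."""
    x, y = cursor
    w, h = area_size
    if axis == "vertical":
        return "up" if y < h / 2 else "down"
    return "left" if x < w / 2 else "right"

print(scroll_direction((120, 40), (320, 240)))                # up
print(scroll_direction((40, 200), (320, 240), "horizontal"))  # left
```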
- FIG. 7 is an explanatory diagram showing an example of a method for determining the scroll direction when the map is scrolled, and shows an example of an operation screen including a map screen.
- the operation screen is virtually divided into four regions (upper, lower, right, and left) by two virtual lines 84 and 86 that intersect at the center of the screen and run to the screen corners.
- in the configuration in which a decision operation by the outer frame member 50 is not possible, for example, when the display control unit 30 receives the determination signal from the touch pad 10 without receiving a coordinate signal, the pointer 82 is positioned in the map screen, and the cursor 80 is not displayed, map scrolling is executed. At this time, the display control unit 30 scrolls the map screen upward when the pointer 82 is located in the upper region, and downward when the pointer 82 is located in the lower region.
- the display control unit 30 may determine the scroll direction at the time of map scrolling according to the position of the pointer 82 in the operation screen, in the same manner as for list scrolling. Therefore, an operator who wants to scroll the map screen upward can first operate the touch operation surface to move the pointer 82 to the upper region, and then push down the outer frame member 50.
- the entire operation screen is virtually divided into four areas, but the portion of the map screen in the operation screen may be virtually divided into four areas.
- the virtual lines 84 and 86 may be displayed while the pointer 82 is moved.
- FIG. 8 is a flowchart showing an example of processing executed by the display control unit 30.
- the process shown in FIG. 8 assumes the configuration in which a decision operation by the outer frame member 50 is not possible.
- in step 800, it is determined whether a determination signal has been received from the touch pad 10. If a determination signal has been received from the touch pad 10, the process proceeds to step 802; if not, the process ends.
- in step 802, it is determined whether a coordinate signal has been received from the touch pad 10. If no coordinate signal has been received from the touch pad 10, the process proceeds to step 804. On the other hand, if a coordinate signal has been received from the touch pad 10, the process ends as it is; in this case, in another processing routine, selection processing in response to the selection operation is executed based on the received coordinate signal.
- in step 804, it is determined whether the currently displayed operation screen includes the list display area 90. If the list display area 90 is included, the process proceeds to step 806; if not, the process proceeds to step 808.
- in step 806, the list in the list display area 90 is scrolled according to the position of the cursor 80.
- for example, the display control unit 30 may execute list scrolling in the upward direction when the cursor 80 is positioned in the upper half region of the operation screen, and in the downward direction when the cursor 80 is positioned in the lower half region of the operation screen. This scrolling may be performed continuously while the determination signal is received.
- the display mode of the cursor 80 during list scrolling is arbitrary. For example, the cursor 80 may not move during list scrolling (a configuration in which only the list changes), or the cursor 80 may first move to the top or bottom position according to the scroll direction, after which only the list changes.
- in step 808, it is determined whether the currently displayed screen is a map screen.
- the map screen may be a screen (operation screen) including selection items in its periphery, as shown in FIG. If the currently displayed screen is a map screen, the process proceeds to step 810. On the other hand, if the currently displayed screen is not a map screen, the process ends; in this case, either a screen switching operation or a cursor sequential movement operation may be realized depending on the currently displayed screen.
- in step 810, the map screen is scrolled according to the position of the pointer 82 in the map screen.
- for example, the display control unit 30 may scroll the map screen upward when the pointer 82 is located in the upper region of the map screen, downward when the pointer 82 is located in the lower region, rightward when the pointer 82 is located in the right region, and leftward when the pointer 82 is located in the left region. This scrolling may be performed continuously while the determination signal is received.
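The branch structure of steps 800 to 810 can be sketched as follows; the `screen` dictionary and its keys are hypothetical stand-ins for the state the display control unit 30 actually holds, and the return values are illustrative labels.

```python
def on_determination_signal(has_coordinate_signal, screen):
    """Dispatch sketch of the FIG. 8 flow (steps 800-810).
    A determination signal accompanied by a coordinate signal is a
    press of the touch operation surface (handled as a decision
    operation by the selection routine); one without a coordinate
    signal came from the outer frame member 50."""
    if has_coordinate_signal:      # step 802: press of the touch surface
        return "decision"          # decision processing in another routine
    if screen.get("has_list"):     # step 804 -> step 806
        return "list-scroll"
    if screen.get("is_map"):       # step 808 -> step 810
        return "map-scroll"
    return "none"                  # or screen switching / cursor movement

print(on_determination_signal(False, {"has_list": True}))  # list-scroll
print(on_determination_signal(False, {"is_map": True}))    # map-scroll
print(on_determination_signal(True, {"has_list": True}))   # decision
```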
- FIG. 9 is an explanatory diagram of another example of the method for determining the scroll direction at the time of scrolling the list, and is a top view schematically showing the state of the hand operating the touch operation screen.
- FIG. 10 is a schematic diagram showing an image of the output level of the electrostatic sensor during the operation shown in FIG. In FIG. 10, two curves P10 and P20 are shown with the Y-axis position of the touch operation surface on the horizontal axis and the output level of the electrostatic sensor on the vertical axis.
- the state of the hand when operating the portion of the outer frame member 50 on the back side, corresponding to the back side of the touch operation screen, is schematically indicated by the symbol P1, and the state of the hand when operating the portion of the outer frame member 50 on the front side, corresponding to the front side of the touch operation screen, is schematically indicated by the symbol P2.
- when the front side portion of the outer frame member 50 is operated, the output level of the electrostatic sensor follows the waveform P20. In this case, if the output level of the electrostatic sensor is less than the set value Th, it can be determined that a pressing operation has been performed on the front side portion of the outer frame member 50, and if the output level of the electrostatic sensor is greater than or equal to the set value Th and less than the detection threshold for finger contact, it can be determined that the pressing operation has been executed at the back side portion of the outer frame member 50.
- the set value Th may be adapted, based on actual data of the output signal of the electrostatic sensor obtained during a pressing operation at the back side portion of the outer frame member 50 and actual data obtained during a pressing operation at the front side portion, so that these pressing operations can be discriminated.
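The discrimination between the press sites by the output level of the electrostatic sensor (FIG. 10) can be sketched as follows; the function name, labels, and threshold values are illustrative assumptions, not values from the disclosure.

```python
def classify_press(output_level, th, touch_threshold):
    """Classify a pressing operation from the electrostatic sensor
    output level: at or above the finger-contact detection threshold,
    a finger is on the touch operation surface itself; between the set
    value Th and that threshold, the hand covers part of the pad
    (back side portion of the outer frame member 50); below Th, the
    hand barely covers the pad (front side portion)."""
    if output_level >= touch_threshold:
        return "touch-surface"
    if output_level >= th:
        return "rear-frame"
    return "front-frame"

print(classify_press(5, 20, 60))   # front-frame
print(classify_press(35, 20, 60))  # rear-frame
print(classify_press(80, 20, 60))  # touch-surface
```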
- FIG. 11 is a flowchart showing another example of processing executed by the display control unit 30.
- the process shown in FIG. 11 is an example that may be executed as an alternative to the process shown in FIG. 8 described above.
- in step 1100, it is determined whether a determination signal has been received from the touch pad 10. If a determination signal has been received from the touch pad 10, the process proceeds to step 1102; if not, the process ends.
- in step 1102, it is determined whether a coordinate signal has been received from the touch pad 10. If no coordinate signal has been received from the touch pad 10, the process proceeds to step 1104. On the other hand, if a coordinate signal has been received from the touch pad 10, the process ends as it is; in this case, in another processing routine, selection processing in response to the selection operation is executed based on the received coordinate signal.
- in step 1104, it is determined whether the output level of the electrostatic sensor is greater than or equal to the set value Th (see FIG. 10). If the output level of the electrostatic sensor is greater than or equal to the set value Th, the process proceeds to step 1106; if it is less than the set value Th, the process proceeds to step 1108. Whether the output level of the electrostatic sensor is greater than or equal to the set value Th may be determined by the control unit 16, and information representing the determination result may be supplied to the display control unit 30. For example, multiple types of determination signals may be defined.
- for example, a first determination signal may be a determination signal generated together with the generation of a coordinate signal, and a second determination signal may be a determination signal generated without the generation of a coordinate signal. The second determination signal may be a first signal when the output level of the electrostatic sensor is greater than or equal to the set value Th and less than the detection threshold for finger contact, and a second signal when the output level of the electrostatic sensor is less than the set value Th.
- in that case, the display control unit 30 may proceed to step 1106 when the first signal is received, and to step 1108 when the second signal is received.
- in step 1106, scrolling is executed in the upward direction.
- the scroll target may be different depending on the currently displayed screen.
- for example, when the currently displayed screen includes the list display area 90, the list in the list display area 90 may be scrolled upward.
- when the currently displayed screen is a map screen, the map screen may be scrolled upward.
- alternatively, the cursor 80 may be moved sequentially in the upward direction.
- for example, the cursor 80 may be moved sequentially over the operation items 70a to 70n in reverse order of the alphabetic suffix after "70".
- for example, the cursor 80 currently positioned on the operation item 70h may be moved sequentially in the order of the operation items 70g, 70f, 70e, 70d, 70c, 70b, and 70a. When the cursor 80 reaches the operation item 70a, it may be stopped there, or it may be moved to the operation item 70n and the same upward movement continued.
- when the currently displayed screen is a WEB screen, the WEB screen (page) may be sequentially switched in the "return" direction.
- in step 1108, scrolling is executed in the downward direction.
- the scroll target may be different depending on the currently displayed screen.
- for example, when the currently displayed screen includes the list display area 90, the list in the list display area 90 may be scrolled downward.
- when the currently displayed screen is a map screen, the map screen may be scrolled downward.
- the cursor 80 may be sequentially moved downward.
- for example, the cursor 80 may be moved sequentially over the operation items 70a to 70n in order of the alphabetic suffix after "70".
- for example, the cursor 80 currently positioned on the operation item 70h may be moved sequentially in the order of the operation items 70i, 70j, 70k, 70l, 70m, and 70n.
- when the cursor 80 reaches the operation item 70n, it may be stopped there, or it may be moved to the operation item 70a and the same downward movement continued.
- when the currently displayed screen is a WEB screen, the WEB screen (page) may be sequentially switched in the "forward" direction.
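The cursor sequential movement of steps 1106 and 1108, including the optional wrap-around between the operation items 70a and 70n described above, can be sketched as follows; the item labels mirror the operation items 70a to 70n, but the function itself is a hypothetical sketch.

```python
# Operation items 70a .. 70n, in display order.
ITEMS = [f"70{c}" for c in "abcdefghijklmn"]

def move_cursor(current, direction, wrap=True):
    """Move the cursor one step up (toward 70a) or down (toward 70n).
    With wrap=True, moving up from 70a continues at 70n, and moving
    down from 70n continues at 70a; with wrap=False, the cursor stops
    at the end."""
    i = ITEMS.index(current)
    j = i + (-1 if direction == "up" else 1)
    if 0 <= j < len(ITEMS):
        return ITEMS[j]
    return ITEMS[j % len(ITEMS)] if wrap else current

print(move_cursor("70h", "up"))              # 70g
print(move_cursor("70a", "up"))              # wraps to 70n
print(move_cursor("70n", "down", wrap=False))  # stays at 70n
```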
- in the above, the pressing operation at the front side portion of the outer frame member 50 and the pressing operation at the back side portion of the outer frame member 50 are discriminated, and the scroll direction is varied accordingly; instead of or in addition to this, the pressing operation at the left portion of the outer frame member 50 and the pressing operation at the right portion of the outer frame member 50 may be discriminated, and the scroll direction varied according to the discrimination result. For example, as in the case of the vertical scrolling described above, scrolling may be executed in the left direction in the case of a pressing operation at the left portion of the outer frame member 50, and in the right direction in the case of a pressing operation at the right portion of the outer frame member 50.
- likewise, in the case of a pressing operation at the left portion of the outer frame member 50, the cursor 80 may be moved sequentially in the left direction (which may be accompanied by a sequential movement upward), and in the case of a pressing operation at the right portion of the outer frame member 50, the cursor 80 may be moved sequentially in the right direction (which may be accompanied by a sequential movement downward). Further, in the case of a pressing operation at the left portion of the outer frame member 50, sequential switching of the WEB screen in the "return" direction may be realized, and in the case of a pressing operation at the right portion of the outer frame member 50, sequential switching of the WEB screen in the "forward" direction may be realized.
- FIG. 13 schematically illustrates an example of a hand state during a pressing operation at a right portion of the outer frame member 50 and an example of a hand state during a pressing operation at a left portion of the outer frame member 50.
- the state of the hand when operating the right portion of the outer frame member 50, corresponding to the right side of the touch operation screen, is generally indicated by the symbol P3, and the state of the hand when operating the left portion of the outer frame member 50, corresponding to the left side of the touch operation screen, is generally indicated by the symbol P4.
- when pressing the right portion of the outer frame member 50, the hand is positioned so as to cover the right area of the touch operation screen, while when pressing the left portion of the outer frame member 50, the hand is positioned so as to cover the left area of the touch operation screen. Since this difference causes a difference in the output level of the electrostatic sensor, it is possible to determine which side of the outer frame member 50 is being operated by using the difference.
- in the former case, it can be determined that the pressing operation is performed on the right portion of the outer frame member 50, and in the latter case, that the pressing operation is performed on the left portion of the outer frame member 50.
- the set value Th1 may be adapted, based on actual data of the output signal of the electrostatic sensor obtained during a pressing operation at the right portion of the outer frame member 50 and actual data obtained during a pressing operation at the left portion, so that these pressing operations can be discriminated.
- as described above, in this example, since the outer frame member 50 is provided around the touch operation surface of the coordinate detection unit 12, a pressing operation of the touch operation surface and a pressing operation of the outer frame member 50 can be discriminated based on the presence or absence of a coordinate signal when the determination signal is generated. Thereby, based on the discrimination result, different functions can be realized for the pressing operation of the touch operation surface and the pressing operation of the outer frame member 50, so that operability can be improved while saving space. More specifically, it is possible to respond to the decision operation when the touch operation surface is pressed, while realizing list scrolling and the like when the outer frame member 50 is pressed.
- the scroll button 72 can be eliminated even in the operation screen, and space can be saved also in the operation screen. However, the scroll button 72 in the operation screen may be maintained as necessary.
- FIG. 14 shows an example of a vehicle-mounted state of a touch pad 110 according to another example.
- FIG. 15 is a cross-sectional view schematically showing a cross section along line CC in FIG. In FIG. 15, only the coordinate detection unit 12 (touch operation surface) of the touch pad 110 is illustrated.
- the touch pad 110 is arranged in the vicinity of the console box 9, but the arrangement position is arbitrary.
- the touch pad 110 is mainly different from the touch pad 10 according to each example described above in that the outer frame member 50 is not provided.
- a dead zone 120 is set on the outer periphery of the coordinate detection unit 12 (touch operation surface).
- the dead zone area 120 is an area for which no output signal of the electrostatic sensor is generated even when the area is touched with a finger, or for which the output signal is invalidated even if generated.
- the dead zone region 120 may be realized by not providing an electrode for the region.
- the width W4 of the dead zone area 120 on the near side of the touch operation surface may be the same as the width W5 of the dead zone area 120 in the other three directions.
- the touch operation surface of the coordinate detection unit 12 is preferably set higher than the surface surrounding the touch pad 110.
- in this example, the touch operation surface of the coordinate detection unit 12 is set higher than the surface around the touch pad 110 in the console box 9. Thereby, the operability of the pressing operation realized by touching only the dead zone area 120 of the touch pad 110 can be enhanced.
- however, the touch operation surface of the coordinate detection unit 12 may be at the same height as the surface surrounding the touch pad 110, or lower than it.
- as described above, in this example, it is possible to discriminate between a pressing operation of the effective region of the touch operation surface (the region for which a coordinate signal is generated when touched with a finger) and a pressing operation of the dead zone area 120 of the touch operation surface. Thereby, based on the discrimination result, different functions can be realized for a press of the effective region of the touch operation surface and a press of the dead zone area 120 of the touch operation surface, so that operability can be improved while saving space. More specifically, it is possible to respond to the decision operation when the effective region of the touch operation surface is pressed, while realizing list scrolling and the like when the dead zone area 120 of the touch operation surface is pressed.
- a mechanical switch for scroll operation may be separately provided as necessary.
- the scroll button 72 can be eliminated also in the operation screen, and space can be saved also in the operation screen.
- the scroll button 72 in the operation screen may be maintained as necessary.
- the embodiment described above relates to the vehicular operating device 1, but it is also applicable to a wide variety of operating devices other than those for vehicles (for example, operating devices for ships and operating devices for machines such as robots).
- the touch operation detection mechanism in the touch pad 10 uses an electrostatic sensor, but the touch operation may be detected by another principle (sensor).
- the touch pad 10 may be configured by an ultrasonic surface acoustic wave type touch panel.
- the position of the cursor 80 is determined in the absolute coordinate mode, but the position of the cursor 80 may be determined in the relative coordinate mode.
- the relative coordinate mode refers to a mode in which the coordinate system of the screen of the display 20 is synchronized with the coordinate system of the operation surface of the touch pad 10 in a relative manner. In the relative coordinate mode, the coordinate system of the screen of the display 20 is associated with the coordinate system of the operation surface of the touch pad 10 such that the origin of the screen coordinate system is located at the current position of the cursor 80 and the origin of the operation surface coordinate system is located at the current finger contact position on the operation surface.
- in the above, the position of the cursor 80 is determined in the absolute coordinate mode, but the absolute coordinate mode and the relative coordinate mode may be switched as appropriate. For example, the relative coordinate mode may be set when any of the selection items on the operation screen of the display 20 is selected (that is, in the selected state), and the absolute coordinate mode may be set when none of the selection items on the operation screen of the display 20 is selected (that is, when the cursor 80 is not displayed).
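The relative coordinate mode described above can be sketched as follows, assuming for simplicity a 1:1 gain between finger displacement and cursor displacement; the names are hypothetical.

```python
def move_cursor_relative(cursor, prev_touch, touch):
    """Relative coordinate mode sketch: the screen origin is taken at
    the current cursor 80 position and the pad origin at the current
    finger contact position, so the cursor moves by the finger's
    displacement rather than jumping to an absolute position."""
    dx = touch[0] - prev_touch[0]
    dy = touch[1] - prev_touch[1]
    return (cursor[0] + dx, cursor[1] + dy)

# Finger moves 15 right and 5 up; the cursor moves by the same amounts.
print(move_cursor_relative((400, 240), (10, 10), (25, 5)))  # (415, 235)
```

Contrast this with the absolute coordinate mode, where the same touch position always maps to the same screen position regardless of where the cursor currently is.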
- the cursor 80 is configured to indicate the relationship between the contact position of the user's finger on the touch operation surface of the touch pad 10 and the position on the operation screen of the display 20, but a pointer of the kind used on a normal PC (Personal Computer) or the like may be used instead. Even when a pointer is used, the cursor 80 (selection display) may be maintained.
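As an illustrative aid (not part of the publication), the difference between the absolute and relative coordinate modes described above can be sketched as follows; the function names, scaling scheme, and `gain` parameter are assumptions for illustration only:

```python
def absolute_map(touch_xy, pad_size, screen_size):
    """Absolute mode: the pad coordinate system maps (scaled) onto the
    whole screen, regardless of the current cursor position."""
    tx, ty = touch_xy
    pw, ph = pad_size
    sw, sh = screen_size
    return (tx / pw * sw, ty / ph * sh)

def relative_map(touch_xy, touch_origin, cursor_xy, gain=1.0):
    """Relative mode: the origin of the screen coordinate system is the
    current cursor position and the origin of the pad coordinate system
    is the initial finger contact position; only the displacement of the
    finger since contact moves the cursor."""
    dx = touch_xy[0] - touch_origin[0]
    dy = touch_xy[1] - touch_origin[1]
    return (cursor_xy[0] + gain * dx, cursor_xy[1] + gain * dy)
```

With this sketch, touching the center of the pad in absolute mode always lands the cursor at the center of the screen, while in relative mode the same touch leaves the cursor wherever it was until the finger moves.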
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
downward movement detection means for outputting a signal representing downward movement of the touch operation surface; and
a control device that, when downward movement of the touch operation surface is detected based on the output signal of the downward movement detection means without contact of a finger with the touch operation surface being detected based on the output signal of the sensor, executes at least one of: scrolling a list in a screen, or the screen itself, displayed on a display device arranged remotely from the touch operation surface; switching the screen displayed on the display device to another screen; and sequentially moving the position of a cursor within the screen displayed on the display device. An operating device comprising the above is provided.
A pointer of the kind used on a PC (Personal Computer) or the like may be used. Even when a pointer is used, the cursor 80 (selection display) may be maintained.
10, 110 touch pad
12 coordinate detection unit
14 downward movement detection means
15a first portion
15b second portion
16 control unit
18 memory
20 display
30 display control unit
40 controller
50 outer frame member
52 stopper
54 elastic portion
60 substrate
71a~71f lists
72 scroll button
80 cursor
82 pointer
90 list display area
120 dead zone area
Claims (11)
- A touch operation surface configured to be movable up and down and provided with a sensor that outputs a signal representing contact of a finger, the touch operation surface having, at its outer peripheral portion, an outer frame member or a dead zone area for which the signal of the sensor is not output even when a finger makes contact;
downward movement detection means for outputting a signal representing downward movement of the touch operation surface; and
a control device that, when downward movement of the touch operation surface is detected based on the output signal of the downward movement detection means without contact of a finger with the touch operation surface being detected based on the output signal of the sensor, executes at least one of: scrolling a list in a screen, or the screen itself, displayed on a display device arranged remotely from the touch operation surface; switching the screen displayed on the display device to another screen; and sequentially moving the position of a cursor within the screen displayed on the display device. An operating device comprising the above. - The operating device according to claim 1, wherein the control device scrolls a list of an operation screen displayed on the display device when downward movement of the touch operation surface is detected without contact of a finger with the touch operation surface being detected.
- The operating device according to claim 1, wherein the control device scrolls a map screen displayed on the display device when downward movement of the touch operation surface is detected without contact of a finger with the touch operation surface being detected.
- The operating device according to claim 2 or 3, wherein the control device determines the scroll direction according to the position of a cursor or pointer on the screen displayed on the display device.
- The operating device according to claim 4, wherein the control device sets the scroll direction to a first direction when the cursor or pointer is located in a first area within the screen displayed on the display device, and sets the scroll direction to a second direction opposite to the first direction when the cursor or pointer is located in a second area on the side of the screen opposite to the first area.
- The operating device according to claim 2, wherein the control device executes the scrolling when downward movement of the touch operation surface is detected without contact of a finger with the touch operation surface being detected in a situation where the cursor or pointer is located within the list display area of the operation screen in which the list is displayed, and realizes the function of the currently selected selection item in the operation screen when downward movement of the touch operation surface is detected without contact of a finger with the touch operation surface being detected in a situation where the cursor or pointer is not located within the list display area.
- The sensor is an electrostatic sensor, and
the control device determines the scroll direction based on the output signal of the electrostatic sensor. The operating device according to claim 2 or 3. - The control device detects contact of a finger when the level of the output signal of the electrostatic sensor is equal to or higher than a predetermined reference value, and
the control device sets the scroll direction to a first direction when the level of the output signal of the electrostatic sensor is lower than the predetermined reference value and equal to or higher than a predetermined threshold value, and sets the scroll direction to a second direction opposite to the first direction when the level of the output signal of the electrostatic sensor is lower than the predetermined reference value and lower than the predetermined threshold value. The operating device according to claim 7. - The operating device according to claim 8, wherein the first direction is an upward direction and the second direction is a downward direction.
- A touch operation surface configured to be movable up and down and provided with a sensor that outputs a signal representing contact of a finger, the touch operation surface having, at its outer peripheral portion, an outer frame member or a dead zone area for which the signal of the sensor is not output even when a finger makes contact;
downward movement detection means for outputting a signal representing downward movement of the touch operation surface; and
a control device that realizes different functions between a case where downward movement of the touch operation surface is detected based on the output signal of the downward movement detection means while contact of a finger with the touch operation surface is detected based on the output signal of the sensor, and a case where downward movement of the touch operation surface is detected based on the output signal of the downward movement detection means without contact of a finger with the touch operation surface being detected based on the output signal of the sensor. An operating device comprising the above. - The operating device according to claim 10, wherein the control device realizes a determination function when downward movement of the touch operation surface is detected while contact of a finger with the touch operation surface is detected, and realizes a predetermined function different from the determination function when downward movement of the touch operation surface is detected without contact of a finger with the touch operation surface being detected.
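As an illustrative aid (not part of the publication), the level-based decision logic of claims 7 to 9, combined with the contact/no-contact distinction of claims 10 and 11, can be sketched as follows; the constant values, function name, and returned action labels are assumptions for illustration only:

```python
REFERENCE = 100  # level at or above this counts as finger contact (assumed units)
THRESHOLD = 40   # lower threshold splitting the two scroll directions (assumed)

def on_downward_press(sensor_level):
    """Return the action for a downward press of the touch operation surface.

    With finger contact (level >= REFERENCE), the press realizes the
    determination (enter) function of claim 11. Without contact, the press
    triggers scrolling, and the electrostatic sensor level picks the
    direction as in claims 8 and 9.
    """
    if sensor_level >= REFERENCE:
        # Finger contact detected together with the press.
        return "determine"
    # No finger contact: the press itself triggers scrolling.
    if sensor_level >= THRESHOLD:
        return "scroll_up"    # first direction (upward, claim 9)
    return "scroll_down"      # second direction (downward, claim 9)
```

The single sensor reading thus distinguishes three cases: a pressed surface with a finger on it, a press from above the surface near the sensor (upward scroll), and a press with the finger farther from the sensor (downward scroll).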
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380070263.3A CN104919403B (zh) | 2013-01-17 | 2013-01-17 | 操作装置 |
JP2014557247A JP5962776B2 (ja) | 2013-01-17 | 2013-01-17 | 操作装置 |
US14/652,690 US10061504B2 (en) | 2013-01-17 | 2013-01-17 | Operation apparatus |
BR112015016792A BR112015016792A2 (pt) | 2013-01-17 | 2013-01-17 | dispositivo de operação |
EP13872080.0A EP2947552A4 (en) | 2013-01-17 | 2013-01-17 | CONTROL DEVICE |
PCT/JP2013/050847 WO2014112080A1 (ja) | 2013-01-17 | 2013-01-17 | 操作装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/050847 WO2014112080A1 (ja) | 2013-01-17 | 2013-01-17 | 操作装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014112080A1 true WO2014112080A1 (ja) | 2014-07-24 |
Family
ID=51209201
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/050847 WO2014112080A1 (ja) | 2013-01-17 | 2013-01-17 | 操作装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US10061504B2 (ja) |
EP (1) | EP2947552A4 (ja) |
JP (1) | JP5962776B2 (ja) |
CN (1) | CN104919403B (ja) |
BR (1) | BR112015016792A2 (ja) |
WO (1) | WO2014112080A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021015474A (ja) * | 2019-07-12 | 2021-02-12 | アルパイン株式会社 | 入力検出装置および入力検出方法 |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105320327A (zh) * | 2014-07-25 | 2016-02-10 | 南京瀚宇彩欣科技有限责任公司 | 手持式电子装置及其触控外盖 |
US9727231B2 (en) | 2014-11-19 | 2017-08-08 | Honda Motor Co., Ltd. | System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen |
US20170371515A1 (en) | 2014-11-19 | 2017-12-28 | Honda Motor Co., Ltd. | System and method for providing absolute and zone coordinate mapping with graphic animations |
JP6087394B2 (ja) * | 2015-06-17 | 2017-03-01 | 日本写真印刷株式会社 | 表示一体型入力装置 |
DE102016003072A1 (de) * | 2016-03-12 | 2017-09-14 | Audi Ag | Bedienvorrichtung und Verfahren zum Erfassen einer Benutzerauswahl zumindest einer Bedienfuktion der Bedienvorrichtung |
CN106873955A (zh) * | 2016-06-07 | 2017-06-20 | 阿里巴巴集团控股有限公司 | 动态列表的显示方法、装置、设备和*** |
DE102016216543A1 (de) * | 2016-09-01 | 2018-03-01 | Audi Ag | Bedieneinrichtung für ein Komfortsystem eines Kraftfahrzeugs, Komfortsystem mit einer Bedieneinrichtung und Kraftfahrzeug mit einem Komfortsystem |
JP7112345B2 (ja) * | 2019-02-01 | 2022-08-03 | 本田技研工業株式会社 | 表示装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005004690A (ja) * | 2003-06-16 | 2005-01-06 | Sony Corp | 入力方法および入力装置 |
JP2006029917A (ja) | 2004-07-14 | 2006-02-02 | Tokai Rika Co Ltd | タッチ式入力装置 |
JP2010176328A (ja) * | 2009-01-28 | 2010-08-12 | Sony Corp | 表示入力装置 |
JP2010191892A (ja) * | 2009-02-20 | 2010-09-02 | Sony Corp | 情報処理装置、表示制御方法、及びプログラム |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10198507A (ja) | 1997-01-13 | 1998-07-31 | Komota Kk | ポインティングデバイス |
US6496122B2 (en) * | 1998-06-26 | 2002-12-17 | Sharp Laboratories Of America, Inc. | Image display and remote control system capable of displaying two distinct images |
US7469381B2 (en) * | 2007-01-07 | 2008-12-23 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
CN100530065C (zh) * | 2006-04-20 | 2009-08-19 | 铼宝科技股份有限公司 | 透明触控面板结构 |
US20090128507A1 (en) * | 2007-09-27 | 2009-05-21 | Takeshi Hoshino | Display method of information display device |
US8441450B2 (en) * | 2008-09-30 | 2013-05-14 | Apple Inc. | Movable track pad with added functionality |
US20100275122A1 (en) * | 2009-04-27 | 2010-10-28 | Microsoft Corporation | Click-through controller for mobile interaction |
CN101882018A (zh) | 2009-05-06 | 2010-11-10 | 怡利电子工业股份有限公司 | 具有按键功能鼠标板的操作方法 |
JP2011053974A (ja) | 2009-09-02 | 2011-03-17 | Sony Corp | 操作制御装置、操作制御方法およびコンピュータプログラム |
KR101843592B1 (ko) * | 2010-04-30 | 2018-03-29 | 톰슨 라이센싱 | 동적 ui 프레임워크를 통한 주 스크린 뷰 제어 |
US8527900B2 (en) | 2010-07-21 | 2013-09-03 | Volkswagen Ag | Motor vehicle |
TWI425391B (zh) | 2010-07-28 | 2014-02-01 | Asustek Comp Inc | 活動式操作板模組以及具有此模組之電子裝置 |
US8749486B2 (en) | 2010-12-21 | 2014-06-10 | Stmicroelectronics, Inc. | Control surface for touch and multi-touch control of a cursor using a micro electro mechanical system (MEMS) sensor |
FR2973528B1 (fr) * | 2011-03-31 | 2013-04-26 | Valeo Systemes Thermiques | Module de commande et d'affichage pour vehicule automobile |
EP2523072B1 (en) | 2011-05-09 | 2018-07-25 | BlackBerry Limited | Multi-modal user input device |
WO2013029641A1 (en) * | 2011-08-31 | 2013-03-07 | Sony Ericsson Mobile Communications Ab | Method for operating a touch sensitive user interface |
-
2013
- 2013-01-17 WO PCT/JP2013/050847 patent/WO2014112080A1/ja active Application Filing
- 2013-01-17 JP JP2014557247A patent/JP5962776B2/ja active Active
- 2013-01-17 US US14/652,690 patent/US10061504B2/en active Active
- 2013-01-17 BR BR112015016792A patent/BR112015016792A2/pt not_active Application Discontinuation
- 2013-01-17 CN CN201380070263.3A patent/CN104919403B/zh active Active
- 2013-01-17 EP EP13872080.0A patent/EP2947552A4/en not_active Ceased
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005004690A (ja) * | 2003-06-16 | 2005-01-06 | Sony Corp | 入力方法および入力装置 |
JP2006029917A (ja) | 2004-07-14 | 2006-02-02 | Tokai Rika Co Ltd | タッチ式入力装置 |
JP2010176328A (ja) * | 2009-01-28 | 2010-08-12 | Sony Corp | 表示入力装置 |
JP2010191892A (ja) * | 2009-02-20 | 2010-09-02 | Sony Corp | 情報処理装置、表示制御方法、及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2947552A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021015474A (ja) * | 2019-07-12 | 2021-02-12 | アルパイン株式会社 | 入力検出装置および入力検出方法 |
JP7337574B2 (ja) | 2019-07-12 | 2023-09-04 | アルパイン株式会社 | 入力検出装置および入力検出方法 |
Also Published As
Publication number | Publication date |
---|---|
CN104919403B (zh) | 2018-03-16 |
US20150339025A1 (en) | 2015-11-26 |
US10061504B2 (en) | 2018-08-28 |
BR112015016792A2 (pt) | 2017-07-11 |
CN104919403A (zh) | 2015-09-16 |
EP2947552A4 (en) | 2016-01-13 |
EP2947552A1 (en) | 2015-11-25 |
JP5962776B2 (ja) | 2016-08-03 |
JPWO2014112080A1 (ja) | 2017-01-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5962776B2 (ja) | 操作装置 | |
JP5644962B2 (ja) | 操作装置 | |
US20110169750A1 (en) | Multi-touchpad multi-touch user interface | |
JP6277786B2 (ja) | 車両用操作装置 | |
JP6086146B2 (ja) | 車両用操作装置 | |
JP2014102660A (ja) | 操作支援システム、操作支援方法及びコンピュータプログラム | |
JP2013117900A (ja) | 車両用操作装置 | |
JP2015170282A (ja) | 車両用操作装置 | |
US20140210795A1 (en) | Control Assembly for a Motor Vehicle and Method for Operating the Control Assembly for a Motor Vehicle | |
JP6127679B2 (ja) | 操作装置 | |
JP2015228118A (ja) | 操作装置 | |
JP5849597B2 (ja) | 車両用操作装置 | |
WO2016001730A1 (en) | Operation device and operation method | |
JP5954145B2 (ja) | 入力装置 | |
JP2018010472A (ja) | 車内電子機器操作装置及び車内電子機器操作方法 | |
JP5985829B2 (ja) | 車両用操作装置 | |
JP2013084052A (ja) | 車両用操作装置 | |
JP5451246B2 (ja) | 遠隔制御システム、遠隔制御方法 | |
US11347344B2 (en) | Electronic device | |
JP2018122827A (ja) | 車載用情報入力装置および車載用情報入力システム | |
JP2014102659A (ja) | 操作支援システム、操作支援方法及びコンピュータプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13872080 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014557247 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14652690 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013872080 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015016792 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112015016792 Country of ref document: BR Kind code of ref document: A2 Effective date: 20150714 |