WO2014021063A1 - Operating device (Dispositif d'actionnement) - Google Patents

Operating device

Info

Publication number: WO2014021063A1
Authority: WO (WIPO, PCT)
Prior art keywords: display, areas, screen, shape, image
Application number: PCT/JP2013/068693
Other languages: English (en), Japanese (ja)
Inventors: 輝子 石川, 雅基 加藤, 太田 聡, 郁代 笹島, 貴之 波田野, 広川 拓郎
Original Assignee: 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Application filed by 日本精機株式会社 (Nippon Seiki Co., Ltd.)
Publication of WO2014021063A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/80 Arrangements for controlling instruments
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D1/00 Steering controls, i.e. means for initiating a change of direction of the vehicle
    • B62D1/02 Steering controls, i.e. means for initiating a change of direction of the vehicle vehicle-mounted
    • B62D1/04 Hand wheels
    • B62D1/046 Adaptations on rotatable parts of the steering wheel for accommodation of switches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B60K2360/113 Scrolling through menu items
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143 Touch sensitive instrument input devices
    • B60K2360/1446 Touch switches
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/77 Instrument locations other than the dashboard
    • B60K2360/782 Instrument locations other than the dashboard on the steering wheel
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01H ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H13/00 Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch
    • H01H13/02 Details
    • H01H13/04 Cases; Covers
    • H01H13/08 Casing of switch constituted by a handle serving a purpose other than the actuation of the switch
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01H ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H13/00 Switches having rectilinearly-movable operating part or parts adapted for pushing or pulling in one direction only, e.g. push-button switch
    • H01H13/02 Details
    • H01H13/12 Movable parts; Contacts mounted thereon
    • H01H13/14 Operating parts, e.g. push-button
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01H ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H2231/00 Applications
    • H01H2231/026 Car

Definitions

  • The present invention relates to an operating device mounted on a vehicle.
  • As background, a known technique provides a display unit that displays a plurality of options for selecting the processing content of a vehicle-mounted device, together with input operation means installed on a steering spoke of the steering wheel and divided into detection areas of substantially similar shape to the display content of the display unit. The input operation means generates operation information indicating the detection area corresponding to the contact position of the operator's finger, and the vehicle-mounted device is controlled by determining, based on the generated operation information, the option corresponding to the touched detection area.
  • An object of the present invention is to provide an operating device capable of intuitive operation.
  • To achieve this, the present invention provides an operating device comprising: input means including an operation surface having a plurality of operation areas and a sensor unit capable of detecting the contact position of a detected object on the operation surface; display means comprising a plurality of display areas; and control means for controlling a predetermined electronic device based on the contact position detected by the sensor unit and for switching the display image shown in at least one of the plurality of display areas. The plurality of display areas are arranged three-dimensionally, the plurality of operation areas are formed three-dimensionally, and at least one of the operation areas has a shape substantially similar to that of one of the display areas.
  • The present invention also provides an operating device with the same input means, display means, and control means, in which the plurality of display areas are arranged three-dimensionally, the plurality of operation areas are formed three-dimensionally, and a first three-dimensional identification unit, formed on the operation surface to divide the plurality of operation areas, has a shape substantially similar to that of a second three-dimensional identification unit dividing the plurality of display areas.
  • In one aspect, the input means is mounted on a steering wheel, the plurality of display areas are arranged above the input means, and at least one of the operation areas has a shape substantially similar to that of the display area located above it.
  • In another aspect, the input means is mounted on a steering wheel, the plurality of display areas are arranged above the input means, and the first three-dimensional identification unit has a shape substantially similar to that of the second three-dimensional identification unit located above it.
  • The plurality of operation areas may be formed so as to have different surface tactile sensations.
  • At least one of the plurality of operation areas may have, formed on its surface, a design portion imitating part of the shape of one of the plurality of display areas.
  • Based on the contact position of the detected object detected by the sensor unit, the control means may cause the display means to display an emphasized image that emphasizes the display area cooperating with the operation area touched by the detected object.
  • The display means may display an icon image of a predetermined shape as the display image in at least one of the plurality of display areas, and at least one of the plurality of operation areas may be formed with an icon operation unit having a three-dimensional shape substantially similar to the icon image.
  • The display means may include a projection type display that emits display light, and a plurality of screens that form the plurality of display areas and onto which the display light is projected to show the display image.
  • The control means may switch the display image in a direction and at a speed substantially coinciding with the locus of the detected object detected by the sensor unit.
  • The present invention enables intuitive operation.
  • One of the accompanying figures shows the electrical configuration of the operating device according to an embodiment of the present invention.
  • The operating device of this embodiment is an operating device 1000 mounted on the vehicle 1 shown in FIG.
  • In accordance with operations on the operating device 1000, the control unit (control means) 300 causes the in-vehicle electronic device to perform various operations, and the display device (display means) 500 displays images corresponding to those operations and their results.
  • The vehicle 1 includes a steering 10. The steering 10 is a part of the steering system of the vehicle 1, and includes a main body 11 and a steering wheel 12.
  • The main body 11 is a spoke portion connected to a steering shaft (not shown) of the vehicle 1, and includes a first input device 100 on the right side and a second input device 200 on the left side as viewed from the user.
  • The main body 11 is formed with attachment holes (not shown) matching the shapes of the first and second input devices 100 and 200. With the first and second input devices 100 and 200 fitted into these attachment holes, only their operation surfaces, described later, are exposed.
  • The steering wheel 12 is a ring-shaped member attached to the main body 11 and gripped by the driver when steering the vehicle 1.
  • The in-vehicle electronic device 20 is an audio device, a car navigation device, or the like. Besides an electronic device fitted in the instrument panel of the vehicle 1, it may be one detachably arranged in the vehicle 1 on a cradle placed on the dashboard or the like, or one simply brought into the vehicle 1 to be operated there, including a high-function mobile phone known as a smartphone.
  • The in-vehicle electronic device 20 is electrically connected to a control unit 300, described later, and operates according to control signals received from the control unit 300. An image corresponding to the operation of the in-vehicle electronic device 20 is displayed in a display area, described later, of the display device 500.
  • The operating device 1000 includes the first input device 100, the second input device 200, the control unit 300, a storage unit 400, and the display device 500.
  • FIG. 3 is a front view and a sectional view showing the first input device 100.
  • The first input device 100 includes a contact sensor 110 and a switch device 120.
  • The contact sensor 110 is a touch pad device that, under the control of the control unit 300 described later, detects the position at which a thumb or the like touches the operation surface, and includes a front cover 111, a sensor sheet (sensor unit) 112, and a spacer 113.
  • The front cover 111 is formed in a sheet shape from a light-shielding insulating material such as a synthetic resin, and consists of an operation surface 111a, which is touched by the user's finger or the like during touch and gesture operations, and a peripheral portion 111b around the operation surface 111a, which is covered by a case or the like (not shown) and cannot be touched by the user's finger. As shown in FIG. 3, a stepped portion 111c is formed on the operation surface 111a.
  • The operation surface 111a has a first operation region R1 and a second operation region R2 at different depth positions on either side of the stepped portion 111c; that is, the operation regions are formed three-dimensionally, and the stepped portion 111c serves as a first three-dimensional identification unit dividing the first and second operation regions R1 and R2 on the operation surface 111a.
  • The first operation region R1 is the operation region located below the stepped portion 111c on the operation surface 111a, and includes a flat portion 111d, which is a planar part of the operation surface 111a, and a hollow portion 111e, which is recessed in a circular shape so as to sink from the flat portion 111d toward the back side.
  • Here, the “front side” means the user side of the first input device 100, as indicated by the double-headed arrow in FIG. 3B, and the “back side” means the opposite side.
  • The second operation region R2 is the operation region located above the stepped portion 111c, and includes a raised portion 111f that rises from the operation surface toward the front side.
  • The sensor sheet 112 is a projected capacitive sensor sheet having a plurality of sensors 1120 (detection electrodes) for detecting the position of a detected object such as a finger, and is located on the back side of the front cover 111.
  • The sensors 1120 are made of a translucent material, and the sensor sheet 112 is therefore light-transmissive.
  • The sensor sheet 112 is schematically configured by superposing a layer having a first sensor row 112a, which detects the position of the detected object in the X direction, on a layer having a second sensor row 112b, which detects its position in the Y direction. By combining the first sensor row 112a and the second sensor row 112b, the sensors 1120 are arranged in a matrix over the sensor sheet 112.
  • The first sensor row 112a and the second sensor row 112b are each electrically connected to the control unit 300 described later, which can thereby detect the change in electrostatic capacitance at each sensor. Based on these changes in capacitance, the control unit 300 calculates an input coordinate value (X, Y) indicating the contact position of the detected object.
  • The input coordinate value is a coordinate value in an XY coordinate system set in advance over the sensors 1120 of the operation surface.
  • The input coordinate value consists of the X coordinate assigned to the center of gravity of the distribution of capacitance changes in the X direction (for example, the position of the sensor 1120 whose capacitance exceeds a certain threshold and is largest) and the Y coordinate assigned to the center of gravity of the distribution of capacitance changes in the Y direction (defined in the same way). The control unit 300 calculates the input coordinate value (X, Y) by computing this X coordinate and Y coordinate; a minimal code sketch of the calculation follows below.
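The following is a minimal sketch of this coordinate calculation, assuming a small rectangular sensor grid; the grid size, threshold value, and function names are illustrative and not taken from the patent:

```python
# Minimal sketch of the input-coordinate calculation described above:
# (X, Y) is the center of gravity of the capacitance changes that exceed
# a threshold. Grid size and threshold are illustrative assumptions.

THRESHOLD = 30  # hypothetical capacitance-change threshold

def input_coordinate(deltas):
    """deltas[y][x] = capacitance change at sensor (x, y).
    Returns the input coordinate (X, Y), or None if nothing is touched."""
    total = sx = sy = 0.0
    for y, row in enumerate(deltas):
        for x, d in enumerate(row):
            if d > THRESHOLD:          # ignore sensors below the threshold
                total += d
                sx += d * x            # weight each sensor by its change
                sy += d * y
    if total == 0:
        return None                    # no detected object on the surface
    return (sx / total, sy / total)    # center of gravity of the distribution

# Example: a finger centered near sensor (2, 1)
grid = [[0, 10, 40, 10],
        [0, 35, 80, 35],
        [0, 10, 40, 10]]
print(input_coordinate(grid))  # -> (2.0, 1.0)
```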
  • The sensor sheet 112 is integrally formed with the front cover 111 by drawing, so that it is processed into the same shape as the front cover 111 (see FIG. 3B). The front cover 111 and the sensor sheet 112 thus behave like a single sheet, and the stepped portion 111c, flat portion 111d, hollow portion 111e, raised portion 111f, and other shapes of the operation surface are formed by bends of this single sheet. Through this integral molding, the back surface of the front cover 111 and the front surface of the sensor sheet 112 are in contact with each other.
  • The sensors 1120 are accordingly arranged following the stepped shape of the front cover 111. Because of this arrangement, even when a gesture operation is performed over a stepped part of the operation surface such as the stepped portion 111c, the control unit 300 can detect the capacitance change of each sensor.
  • The spacer 113 is located on the back side of the sensor sheet 112, is formed to match the shape of the integrally molded front cover 111 and sensor sheet 112 as shown in FIG., and is a member that retains these shapes when pressure is applied from the front side.
  • The switch device 120 is located on the back side of the contact sensor 110 and is electrically connected to the control unit 300.
  • When the user performs an operation of pressing the operation surface of the first input device 100 (hereinafter referred to as a pressing operation), the switch device 120 is pressed and transmits a predetermined input signal to the control unit 300. The pressing operation is used to execute control different from that triggered by touch or gesture operations on the operation surface of the first input device 100.
  • The first input device 100 is attached to the main body 11 of the steering 10 by, for example, welding a case (not shown) of the contact sensor 110 to the main body 11 with a soft resin. Attached in this way, the contact sensor 110 sinks when the user presses the operation surface, so that the switch device 120 is pressed.
  • The above units constitute the first input device 100; a sketch combining its two kinds of input follows below.
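Because the switch device 120 sits behind the entire contact sensor 110, a pressing operation arrives as a separate signal while the sensor sheet still reports where the finger is. A minimal sketch of how a controller might combine the two signals; all names are illustrative assumptions, not the patent's implementation:

```python
# Sketch of combining the two inputs of one input device: the sensor
# sheet reports where the finger is, while the switch device reports
# that the whole operation surface has been pressed.

def classify_event(coordinate, switch_pressed):
    """coordinate: (x, y) from the sensor sheet, or None if no contact;
    switch_pressed: True if the switch device behind the sensor is closed."""
    if switch_pressed and coordinate is not None:
        return ("press", coordinate)   # pressing operation at that position
    if coordinate is not None:
        return ("touch", coordinate)   # touch, or one sample of a gesture
    return ("idle", None)

print(classify_event((2.0, 1.0), True))   # ('press', (2.0, 1.0))
print(classify_event((2.0, 1.0), False))  # ('touch', (2.0, 1.0))
```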
  • FIG. 5 is a front view and a sectional view showing the second input device 200.
  • The second input device 200 has the same configuration as the first input device 100 and includes a contact sensor 210 and a switch device 220.
  • The contact sensor 210 is a touch pad device similar to the contact sensor 110, and includes a front cover 211, a sensor sheet 212, and a spacer 213.
  • The front cover 211 is formed in a sheet shape from a light-shielding insulating material such as a synthetic resin, and consists of an operation surface 211a, which is touched by the user's finger or the like during touch and gesture operations, and a peripheral portion 211b around the operation surface 211a, which is covered by a case or the like (not shown) and cannot be touched by the user's finger. As shown in FIG. 5, the entire operation surface 211a is raised toward the front side relative to the peripheral portion 211b, and a stepped portion 211c is formed on it.
  • The operation surface 211a is divided by the stepped portion 211c into a third operation region R3 and a fourth operation region R4, which are formed three-dimensionally; the stepped portion 211c serves as the first three-dimensional identification unit on this operation surface.
  • The third operation region R3 is the operation region located below the stepped portion 211c on the operation surface 211a, and includes a flat portion 211d, which is a planar part of the operation surface 211a, and a hollow portion 211e, which is recessed in a circular shape so as to sink from the flat portion 211d toward the back side.
  • Here, the “front side” means the user side of the second input device 200, as indicated by the double-headed arrow in FIG. 5B, and the “back side” means the opposite side.
  • The fourth operation region R4 is the operation region located above the stepped portion 211c, and includes a raised portion 211f that rises from the operation surface toward the front side.
  • The sensor sheet 212 is a projected capacitive sensor sheet having a plurality of sensors (detection electrodes) for detecting the position of a detected object such as a finger, and is located on the back side of the front cover 211. The sensors are made of a translucent material, and the sensor sheet 212 is therefore light-transmissive.
  • The spacer 213 is located on the back side of the sensor sheet 212, is formed to match the shape of the integrally molded front cover 211 and sensor sheet 212 as shown in FIG., and is a member that retains these shapes when pressure is applied from the front side.
  • The switch device 220 is located on the back side of the contact sensor 210 and is electrically connected to the control unit 300.
  • When the user performs an operation of pressing the operation surface of the second input device 200 (likewise a pressing operation), the switch device 220 is pressed and transmits a predetermined input signal to the control unit 300. The pressing operation is used to execute control different from that triggered by touch or gesture operations on the operation surface of the second input device 200.
  • The above units constitute the second input device 200.
  • The control unit 300 includes a CPU (Central Processing Unit) and the like, and performs various processes and controls by executing operation programs stored in the storage unit 400. At least a part of the control unit 300 may instead be configured from dedicated circuits such as an ASIC (Application Specific Integrated Circuit).
  • The storage unit 400 includes a ROM (Read Only Memory), a RAM (Random Access Memory), a flash memory, and the like, and functions as the work area of the CPU constituting the control unit 300, as a program area storing the operation programs executed by the CPU, and as a data area.
  • The program area stores operation programs such as a program for executing the vehicle-mounted electronic device control processing and a program for executing the display device control processing, both described later.
  • The data area stores, in advance, virtual operation areas, execution conditions, corresponding operation data, image data, and the like.
  • A virtual operation area is an area set by virtually dividing the operation surface of the first or second input device 100, 200.
  • On the operation surfaces 111a and 211a, the operation areas R1 to R4, divided above and below the stepped portions 111c and 211c, are mainly set as virtual operation areas; the virtual operation areas may also include areas corresponding to the hollow portions 111e and 211e.
  • An execution condition is data serving as the trigger for transmitting a control signal to the in-vehicle electronic device 20, that is, for controlling the in-vehicle electronic device 20, in each virtual operation area.
  • Execution conditions include whether the locus of the detected object matches a specific locus, whether a predetermined virtual operation area has been touched, and whether a pressing operation has been detected.
  • A plurality of execution conditions can be set for one virtual operation area; for example, a condition on whether a specific locus is traced and a condition on whether a pressing operation is detected may both be set for the same virtual operation area.
  • A “specific locus” is, for example, a substantially arc-shaped locus along the raised portion 111f or 211f.
  • Corresponding operation data is the data of the control signal that causes the in-vehicle electronic device 20 to execute a predetermined operation.
  • Image data is the data of the image signal that causes the display device 500 to display a predetermined display image.
  • Based on operations on the first and second input devices 100 and 200 and on the resulting predetermined operation of the in-vehicle electronic device 20, the control unit 300 performs control to update the display image shown on the display device 500.
  • The data in the storage unit 400 is stored in advance as default values or registered by the user using a known data registration method; a sketch of how this data can be organized follows below.
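The data area can be pictured as a table keyed by virtual operation area, pairing each execution condition with its corresponding operation data (the control signal). The sketch below is only an illustration under assumed names; the region split, condition names, and signal names are invented for the example:

```python
# Sketch of the data area: each virtual operation area maps execution
# conditions to corresponding operation data (the control signal sent
# to the in-vehicle electronic device 20). All identifiers are invented.

EXECUTION_TABLE = {
    "R1": {"arc_gesture": "SCROLL_PERIPHERAL_IMAGE"},
    "R2": {"touch": "SHOW_CONTENT_MENU", "press": "SELECT_CONTENT"},
    "R3": {"arc_gesture": "CHANGE_VOLUME"},
    "R4": {"touch": "SHOW_FUNCTION_MENU", "press": "EXECUTE_FUNCTION"},
}

def area_of(coordinate, step_y=1.5):
    """Map a contact position to a virtual operation area. The operation
    surface is split at the stepped portion, whose assumed position is
    step_y; with y growing downward as in the grid sketch above, R1 lies
    below the step and R2 above it (first input device only)."""
    _, y = coordinate
    return "R1" if y > step_y else "R2"

def control_signal(coordinate, condition):
    return EXECUTION_TABLE.get(area_of(coordinate), {}).get(condition)

print(control_signal((2.0, 2.0), "arc_gesture"))  # SCROLL_PERIPHERAL_IMAGE
print(control_signal((2.0, 0.5), "press"))        # SELECT_CONTENT
```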
  • FIG. 6 is a perspective view showing the display device 500.
  • The display device 500 is disposed in front of the user and above the steering 10, and includes a projector (projection type display) 510, a mirror 520, a first screen 530, a second screen 540, a third screen 550, a fourth screen 560, a fifth screen 570, and a screen holder 580.
  • The projector 510 emits display light L representing a display image toward the first to fifth screens 530 to 570. It consists, for example, of a liquid crystal projector that forms the display light L by transmitting light from a light source through a liquid crystal panel. A plurality of projectors 510 may be provided.
  • The mirror 520 is a reflecting member formed by depositing a metal such as aluminum on a resin material such as polycarbonate to form a reflecting surface.
  • The display light L emitted from the projector 510 is reflected by the mirror 520 and projected onto the first to fifth screens 530 to 570.
  • The first to fifth screens 530 to 570 constitute the display areas onto which the display light L emitted from the projector 510 is projected to show display images. They are reflection screens that reflect the display light L to display a display image, and have three-dimensional shapes consisting of planes, curved surfaces, spherical surfaces, or combinations thereof.
  • The first screen 530 is the screen located at the center, and constitutes a first display area that displays, as display images, vehicle information such as vehicle speed, engine speed, remaining fuel, electric energy, and travel distance, as well as navigation information.
  • The second screen 540 is the screen located on the lower right as viewed from the user. As described later, it cooperates with the first operation region R1 of the first input device 100 to constitute a second display area that displays, as display images, information related to operations on the first operation region R1 or to the operation of the in-vehicle electronic device 20 triggered by those operations.
  • The third screen 550 is the screen located on the upper right as viewed from the user, and displays various indicators and warnings as display images. As described later, it cooperates with the second operation region R2 of the first input device 100 to constitute a third display area that displays, as display images, information related to operations on the second operation region R2 or to the operation of the in-vehicle electronic device 20 triggered by those operations.
  • The fourth screen 560 is the screen located on the lower left as viewed from the user. As described later, it cooperates with the third operation region R3 of the second input device 200 to constitute a fourth display area that displays, as display images, information related to operations on the third operation region R3 or to the operation of the in-vehicle electronic device 20 triggered by those operations.
  • The fifth screen 570 is the screen located on the upper left as viewed from the user, and displays various indicators and warnings as display images. As described later, it cooperates with the fourth operation region R4 of the second input device 200 to constitute a fifth display area that displays information related to operations on the fourth operation region R4 or to the operation of the in-vehicle electronic device 20 triggered by those operations.
  • The first to fifth screens 530 to 570 are held by a plurality of holding surfaces of the screen holder 580 that are provided with steps and differ in position in the depth direction, so that the screens are arranged three-dimensionally, that is, at positions spaced apart from one another in the depth direction.
  • For example, the second and fourth screens 540 and 560 are positioned on the front side (user side) of the first screen 530 with a space in between, and the third and fifth screens 550 and 570 are positioned on the back side of the first screen 530.
  • The screen holder 580 is a plate-like member made of, for example, synthetic resin, which holds the first to fifth screens 530 to 570 at predetermined positions via an adhesive (not shown) that does not cause thermal shrinkage.
  • The first to fifth screens 530 to 570 may instead be fixed by a frame-like member such as a bezel.
  • Stepped portions 581 to 584 are formed on the holding surfaces of the screen holder 580 that hold the first to fifth screens 530 to 570, so as to divide the first to fifth screens 530 to 570 from one another.
  • FIG. 7 is a diagram showing a gesture operation OP1 for the first operation region R1 of the first input device 100 and a display image on the second screen 540 of the display device 500 that is updated (switched) accordingly.
  • On the second screen 540, the outside temperature (indicated as “OUTSIDE TEMP” in FIG. 7) and the travelable distance (indicated as “DISTANCE TO EMPTY” in FIG. 7) are displayed as the initial display image.
  • The first operation region R1 of the first input device 100 has a shape substantially similar to that of the second screen 540 disposed above the first input device 100, and the first operation region R1 and the second screen 540 are associated with each other.
  • Here, “substantially similar” includes not only exact geometric similarity but also shapes close enough to be recognized as similar when the two are compared.
  • The user grasps the shape of the first operation region R1 by viewing the shape of the three-dimensionally arranged second screen 540, and, by touching the stepped portion 111c and the like with a fingertip, can feel through touch the three-dimensional shape of the operation surface 111a grasped through vision; the user can thus perform gesture operations and the like on the first operation region R1 intuitively. Because the second screen 540 is arranged three-dimensionally, its shape is easy to recognize visually, the three-dimensional shape of the first operation region R1 can be sensed by touch, and the user can perform gesture operations and the like intuitively without looking at the operation surface 111a.
  • When the user performs a gesture operation OP1 tracing the first operation region R1, the control unit 300 transmits an image signal to the projector 510, and the display image on the second screen 540 is switched sequentially in a direction (rightward in the present embodiment) and at a speed substantially matching the locus of the gesture operation OP1, displaying a peripheral image showing, for example, the scenery to the right rear of the vehicle 1. Because the display image on the second screen 540 is switched in a direction and at a speed substantially matching the locus of the gesture operation OP1 on the first operation region R1, the user can operate as if actually touching and moving the display image; a sketch of this behavior follows below.
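The "direction and speed substantially matching the locus" behavior can be sketched as follows: the controller samples the contact position over the gesture, derives a velocity, and advances the display image by a proportional amount. The sampling format, gain, and frame rate are illustrative assumptions:

```python
# Sketch of switching the display image at a direction and speed that
# substantially match the gesture locus: derive a velocity from the
# sampled contact positions and scroll the image accordingly.

def scroll_vector(samples):
    """samples: list of (t_seconds, x, y) contact positions over one
    gesture. Returns (vx, vy) in sensor cells per second."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dt = t1 - t0
    if dt <= 0:
        return (0.0, 0.0)
    return ((x1 - x0) / dt, (y1 - y0) / dt)

class Screen:
    def __init__(self):
        self.offset = 0.0  # horizontal scroll position of the display image

    def update(self, samples, frame_dt=1 / 60, gain=40.0):
        vx, _ = scroll_vector(samples)
        # advance the image in the gesture's direction, at a speed
        # proportional to the finger's speed (gain = pixels per cell)
        self.offset += vx * gain * frame_dt

screen = Screen()
screen.update([(0.00, 1.0, 2.0), (0.10, 3.0, 2.0)])  # rightward trace, as OP1
print(round(screen.offset, 2))  # positive offset: the image slides right
```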
  • FIG. 8 is a diagram illustrating a touch operation OP2 for the second operation region R2 of the first input device 100 and a display image on the third screen 550 of the display device 500 that is updated accordingly.
  • FIG. 9 is a diagram showing a gesture operation OP3 for the second operation region R2 of the first input device 100 and a display image on the third screen 550 of the display device 500 that is updated accordingly.
  • On the third screen 550, the turn-signal indicator is displayed as the display image of the initial screen.
  • The stepped portion 582 dividing the second and third screens 540 and 550 is formed in a shape substantially similar to that of the stepped portion 111c dividing the first and second operation regions R1 and R2.
  • That is, the stepped portion 582 serves as the second three-dimensional identification unit, substantially similar in shape to the stepped portion 111c, which is the first three-dimensional identification unit of the first input device 100.
  • By viewing the three-dimensional shape of the stepped portion 582, the user grasps part of the shape (the lower part in the present embodiment) and the position of the second operation region R2, which is divided from the first operation region R1 by the stepped portion 111c. By then touching the stepped portion 111c and the like with a fingertip and feeling through touch the three-dimensional shape of the operation surface 111a grasped through vision, the user can intuitively perform gesture operations and the like on the second operation region R2. Because the stepped portion 582 has a three-dimensional shape, it is easy to recognize visually, the three-dimensional shape of the stepped portion 111c can be sensed by touch, and the user can operate intuitively without looking at the operation surface 111a.
  • When the user performs the touch operation OP2 on the second operation region R2, the control unit 300 transmits an image signal to the projector 510, and a menu image consisting of a group of icons indicating content displayable on the display device 500 is shown on the third screen 550 at a position and in a shape following the stepped portion 582.
  • Furthermore, as shown in FIG. 9, when the user performs the gesture operation OP3, the control unit 300 slides the icons sequentially in a direction (toward the upper right in the present embodiment) and at a speed substantially matching the locus of the gesture operation OP3. Because the display image on the third screen 550 is switched in a direction and at a speed substantially matching the locus of the gesture operation OP3 on the second operation region R2, the user can operate as if actually touching and moving the display image. When a pressing operation is performed on the second operation region R2 while the menu image is displayed, the selected content is shown on the second screen 540 as a display image.
  • FIG. 10 is a diagram showing a gesture operation OP4 on the third operation region R3 of the second input device 200 and the display image on the fourth screen 560 of the display device 500 that is updated accordingly.
  • On the fourth screen 560, an audio operation image is displayed as the display image of the initial screen.
  • The third operation region R3 of the second input device 200 has a shape substantially similar to that of the fourth screen 560 disposed above the second input device 200, and the third operation region R3 and the fourth screen 560 are associated with each other.
  • The user grasps the shape of the third operation region R3 by viewing the shape of the three-dimensionally arranged fourth screen 560, and, by touching the stepped portion 211c and the like with a fingertip, can feel through touch the three-dimensional shape of the operation surface 211a grasped through vision; the user can thus perform gesture operations and the like on the third operation region R3 intuitively. Because the fourth screen 560 is arranged three-dimensionally, its shape is easy to recognize visually, the three-dimensional shape of the third operation region R3 can be sensed by touch, and the user can perform gesture operations and the like intuitively without looking at the operation surface 211a.
  • When the user performs the gesture operation OP4 on the third operation region R3, the control unit 300 transmits a volume control signal to the in-vehicle electronic device 20 to change the volume, transmits an image signal to the projector 510, and updates the volume display in the display image on the fourth screen 560 in accordance with the change in volume.
  • FIG. 11 is a diagram showing a touch operation OP5 for the fourth operation region R4 of the second input device 200 and a display image on the fifth screen 570 of the display device 500 that is updated accordingly.
  • FIG. 12 is a diagram showing a gesture operation OP6 for the fourth operation region R4 of the second input device 200 and a display image on the fifth screen 570 of the display device 500 that is updated accordingly.
  • On the fifth screen 570, the turn-signal indicator is displayed as the display image of the initial screen.
  • The stepped portion 584 dividing the fourth and fifth screens 560 and 570 is formed in a shape substantially similar to that of the stepped portion 211c dividing the third and fourth operation regions R3 and R4.
  • That is, the stepped portion 584 serves as the second three-dimensional identification unit, substantially similar in shape to the stepped portion 211c, which is the first three-dimensional identification unit of the second input device 200.
  • By viewing the three-dimensional shape of the stepped portion 584, the user grasps part of the shape (the lower part in the present embodiment) and the position of the fourth operation region R4, which is divided from the third operation region R3 by the stepped portion 211c. By then touching the stepped portion 211c and the like with a fingertip and feeling through touch the three-dimensional shape of the operation surface 211a grasped through vision, the user can intuitively perform gesture operations and the like on the fourth operation region R4. Because the stepped portion 584 has a three-dimensional shape, it is easy to recognize visually, the three-dimensional shape of the stepped portion 211c can be sensed by touch, and the user can operate intuitively without looking at the operation surface 211a.
  • When the user performs the touch operation OP5 on the fourth operation region R4, the control unit 300 transmits an image signal to the projector 510, and a menu image consisting of a group of icons indicating functions of the in-vehicle electronic device 20 is shown on the fifth screen 570 at a position and in a shape following the stepped portion 584.
  • Furthermore, as shown in FIG. 12, when the user performs the gesture operation OP6, the control unit 300 slides the icons sequentially in a direction (toward the upper left in the present embodiment) and at a speed substantially matching the locus of the gesture operation OP6. Because the display image on the fifth screen 570 is switched in a direction and at a speed substantially matching the locus of the gesture operation OP6 on the fourth operation region R4, the user can operate as if actually touching and moving the display image. When a pressing operation is performed on the fourth operation region R4 while the menu image is displayed, the selected function is executed.
  • As described above, the operating device 1000 includes: the first and second input devices (input means) 100 and 200, each comprising an operation surface 111a or 211a having a plurality of operation regions and a sensor sheet (sensor unit) 112 or 212 capable of detecting the contact position of a detected object on that operation surface; the display device (display means) 500 comprising the plurality of screens (display areas) 530 to 570; and the control unit (control means) 300. The screens 530 to 570 are arranged three-dimensionally, the operation regions R1, R2 and R3, R4 are formed three-dimensionally, and the first operation region R1 and the third operation region R3 have shapes substantially similar to those of the second screen 540 and the fourth screen 560, respectively. The user can therefore perform gesture operations and the like intuitively without looking at the operation surfaces 111a and 211a.
  • Viewed another way, the operating device 1000 includes: the first input device 100, comprising the operation surface 111a having the operation regions R1 and R2 and the sensor sheet (sensor unit) 112 capable of detecting the contact position of a detected object, and the second input device (input means) 200 of the same construction; the display device (display means) 500 comprising the plurality of screens (display areas) 530 to 570; and the control unit, which controls the predetermined in-vehicle electronic device 20 based on the contact positions detected by the sensor sheets 112 and 212 and switches the image displayed on at least one of the screens 530 to 570. The screens 530 to 570 are arranged three-dimensionally, and the operation regions R1, R2 and R3, R4 are formed three-dimensionally. The stepped portion (first three-dimensional identification unit) 111c, formed on the operation surface 111a and dividing the operation regions R1 and R2, and the stepped portion (first three-dimensional identification unit) 211c, formed on the operation surface 211a and dividing the operation regions R3 and R4, have shapes substantially similar to those of the stepped portion (second three-dimensional identification unit) 582, which divides the second screen 540 and the third screen 550, and of the stepped portion (second three-dimensional identification unit) 584, which divides the fourth screen 560 and the fifth screen 570, respectively. The user can therefore perform gesture operations and the like intuitively without looking at the operation surfaces 111a and 211a.
  • The first and second input devices 100 and 200 are mounted on the steering 10, the screens 530 to 570 are disposed above the first and second input devices 100 and 200, and the first operation region R1 and the third operation region R3 have shapes substantially similar to those of the second screen 540 and the fourth screen 560 located above them. Because the left-right positions of each associated operation region (R1, R3) and screen (540, 560) thereby substantially coincide, operations on the operation surfaces 111a and 211a can be performed more comfortably while looking at the screens 540 and 560.
  • Likewise, with the first and second input devices 100 and 200 mounted on the steering 10 and the screens 530 to 570 disposed above them, the stepped portions 111c and 211c have shapes substantially similar to those of the stepped portions 582 and 584 located above them. Because the left-right positions of each associated pair of stepped portions (111c and 582, 211c and 584) substantially coincide, operations on the operation surfaces 111a and 211a can be performed more comfortably while looking at the stepped portions 582 and 584.
  • The display device 500 includes the projector (projection type display) 510, which emits the display light L, and the screens 530 to 570, which form the plurality of display areas and show display images by having the display light L projected onto them. This makes it easy to obtain a plurality of three-dimensionally arranged display areas using the screens 530 to 570.
  • The control unit 300 switches the display image in a direction and at a speed substantially coinciding with the locus of the detected object detected by the sensor sheets 112 and 212, so the user can operate as if directly touching the display image; intuitive operation is thus possible.
  • The present invention is not limited to the above embodiment, and changes (including deletion of components) can of course be made as appropriate without departing from the gist of the invention.
  • In the above embodiment, the two input devices 100 and 200 are provided as input means, but a single input means may be provided instead. Three or more operation areas may be formed, and the shapes of two or more operation areas of one input means may be substantially similar to any of the display areas.
  • An emphasized image such as a gradation or pattern may be displayed at the periphery of each display area so as to emphasize the shape of the display area, making that shape even more clearly visible.
  • The plurality of operation areas may be given different surface tactile sensations by means such as printing, paint coating, or laser processing, so that the difference between touched operation areas can be felt more easily with a fingertip or the like; this further supports intuitive operation.
  • Further, a background image reminiscent of the tactile sensation of an operation area may be displayed in the display area associated with it (the display area whose display image is updated in response to operations on that area and whose shape is substantially similar). The user can then grasp the surface texture visually and confirm it by touch while operating the operation surface, which makes intuitive operation still easier.
  • For example, a glossy background image, such as a metallic finish, is displayed in the display area associated with an operation area whose surface has low frictional resistance (is smooth), while a background image containing concavities and convexities reminiscent of the tactile sensation is displayed in the display area associated with an operation area whose surface is uneven.
  • Each screen may also itself be formed with a texture reminiscent of the tactile sensation of its associated operation area: a screen associated with a smooth operation area is formed with a texture free from unevenness, and a screen associated with an uneven operation area is formed with a texture containing unevenness.
  • In the operating device 1000, first and second design portions 130 and 230, imitating part of the characteristic shapes of the third screen 550 and the fifth screen 570 that cooperate with the operation regions R2 and R4, are formed on the surfaces of the second operation region R2 and the fourth operation region R4 of the first and second input devices 100 and 200, respectively. The first and second design portions 130 and 230 are formed by printing at the locations of the front covers 111 and 211 corresponding to the second operation region R2 and the fourth operation region R4, respectively.
  • Here, “imitating” means that part of the shape of the screen constituting the display area is reproduced in reduced and simplified form, and includes shapes close enough to be recognized as similar when the two are compared.
  • The first design portion 130 is a curved design imitating the lower curved edge, part of the shape of the third screen 550, as shown in FIG.
  • The second design portion 230 is a curved design imitating the lower curved edge, part of the shape of the fifth screen 570, as shown in FIG.
  • By viewing part (the lower part) of the characteristic shapes of the third screen 550 and the fifth screen 570 together with the first and second design portions 130 and 230 imitating them, the user can grasp at a glance the cooperative relationship between the third screen 550 and the second operation region R2, and between the fifth screen 570 and the fourth operation region R4, even when unfamiliar with the operating device 1000, such as on first use.
  • As before, when the user performs a touch operation on the second operation region R2, the control unit 300 transmits an image signal to the projector 510, and a menu image including a group of icons indicating displayable content is shown on the third screen 550 at a position and in a shape following the stepped portion 582.
  • At this time, the control unit 300 displays an emphasized image that emphasizes the third screen 550, which cooperates with the second operation region R2 touched by the finger or other detected object; in the present embodiment, a colored background image of a color different from the normal state is displayed across the entire third screen 550 as the background.
  • The emphasized image may be any image that emphasizes the display area; a surrounding line, a gradation, a pattern, or the like may instead be displayed at the periphery of the display area.
  • Similarly, when the user performs a touch operation on the fourth operation region R4, the control unit 300 transmits an image signal to the projector 510, and a menu image including a group of icons indicating functions of the in-vehicle electronic device 20 is shown on the fifth screen 570 at a position and in a shape following the stepped portion 584.
  • At this time, the control unit 300 displays an emphasized image that emphasizes the fifth screen 570, which cooperates with the fourth operation region R4 touched by the detected object; in the present embodiment, a colored background image of a color different from the normal state is displayed across the entire fifth screen 570 as the background.
  • When the first operation region R1 or the third operation region R3 is touched, an emphasized image emphasizing the second screen 540 or the fourth screen 560 associated with that operation region is likewise displayed. The emphasized image lets the user know, at the moment of touching, whether the intended operation region R1 to R4 has been touched, improving operability.
  • As described above, the second operation region R2 and the fourth operation region R4 among the operation regions R1 to R4 have, formed on their surfaces, the first and second design portions 130 and 230 imitating part of the shapes of the third screen 550 and the fifth screen 570 among the screens 530 to 570. The cooperative relationships between the third screen 550 and the second operation region R2 and between the fifth screen 570 and the fourth operation region R4 can therefore be grasped at a glance, even by a user unfamiliar with the operating device 1000, such as on first encountering it.
  • Based on the contact positions of the detected object detected by the sensor sheets 112 and 212, the control unit 300 causes the display device 500 to display an emphasized image emphasizing the screen 540 to 570 that cooperates with the operation region R1 to R4 being touched. The user can thus tell, at the moment of touching, whether the intended operation region R1 to R4 has been touched, and operability is improved; a sketch of this behavior follows below.
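This emphasized-image behavior reduces to a lookup from the touched operation region to its cooperating screen, followed by a background redraw. A minimal sketch with invented identifiers (the region-to-screen pairing follows the embodiment above):

```python
# Sketch of the emphasized image: when the detected object touches an
# operation region, the screen cooperating with that region is
# highlighted. Identifiers and draw calls are illustrative.

COOPERATING_SCREEN = {"R1": "screen540", "R2": "screen550",
                      "R3": "screen560", "R4": "screen570"}

def on_contact(region, projector):
    screen = COOPERATING_SCREEN[region]
    # e.g. a colored background differing from the normal state, or a
    # surrounding line / gradation / pattern at the screen's periphery
    projector.draw_background(screen, color="highlight")

class FakeProjector:
    def draw_background(self, screen, color):
        print(f"{screen}: background -> {color}")

on_contact("R2", FakeProjector())  # screen550: background -> highlight
```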
  • The display device 500 displays icon images IC1 to IC4 and IC5 to IC8 of predetermined shapes as display images on the third screen 550 and the fifth screen 570, and icon operation units 111g to 111j and 211g to 211j, having three-dimensional shapes substantially similar to the icon images IC1 to IC4 and IC5 to IC8, are formed in the corresponding operation regions.
  • The icon images IC1 to IC4 displayed on the third screen 550 are circular, as shown in FIG. 15, each indicates content displayable on the display device 500, and they are displayed along the stepped portion 582; together they constitute a menu image for selecting the content to be displayed.
  • the icon operation portions 111g to 111j formed in the second operation region R2 have convex shapes substantially similar to the icon images IC1 to IC4, respectively, and the step portions 111c so as to have the same positional relationship as the icon images IC1 to IC4. It is formed to follow.
  • the icon images IC1 to IC4 are switched in a selected state or a non-selected state in accordance with a touch operation on the icon operation units 111g to 111j, respectively, and cooperate with the icon operation units 111g to 111j on a one-to-one basis. That is, as shown in FIG. 15, when the user performs (touches) the touch operation OP7 on the icon operation unit 111j, the control unit 300 transmits an image signal to the projector 510 and displays it on the third screen 550. Then, the icon image IC4 linked to the icon operation unit 111j is selected. In the present embodiment, the icon image IC4 in the selected state is displayed with negative / positive inversion.
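As a hedged illustration of this one-to-one cooperation, the sketch below toggles an icon's state on each touch and marks the selected icon for negative/positive-inverted drawing. The Icon class, the UNIT_TO_ICON table, and on_touch are names invented here for clarity.

    from dataclasses import dataclass

    @dataclass
    class Icon:
        name: str
        selected: bool = False

    # Each icon operation unit cooperates with exactly one icon image.
    UNIT_TO_ICON = {
        "111g": Icon("IC1"),
        "111h": Icon("IC2"),
        "111i": Icon("IC3"),
        "111j": Icon("IC4"),
    }

    def on_touch(unit: str) -> None:
        icon = UNIT_TO_ICON[unit]
        icon.selected = not icon.selected  # switch between selected and non-selected
        # In this embodiment a selected icon is redrawn with negative/positive inversion.
        style = "negative/positive inverted" if icon.selected else "normal"
        print(f"{icon.name} redrawn {style}")

    on_touch("111j")  # the touch operation OP7 places IC4 in the selected state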
When an icon image is in the selected state, the content corresponding to it, such as the content corresponding to the selected icon image IC4, is displayed on the second screen 540 as a display image (see the sketch after this paragraph). When the icon operation units 111g to 111i are touch-operated, the corresponding icon images IC1 to IC3 are selected in the same way. By touching the icon operation units 111g to 111j with a fingertip, the user can feel through the sense of touch the shapes of the icon images IC1 to IC4 already grasped through sight, and can therefore operate them as though actually touching the icon images.
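Purely as an assumed sketch of that dispatch, the mapping below routes the selected icon to the content shown on the second screen 540. The placeholder content strings are invented here; the description does not enumerate the selectable contents.

    # Placeholder contents; the embodiment only states that each icon
    # indicates some displayable content, without naming it.
    CONTENT_FOR_ICON = {
        "IC1": "content linked to IC1",
        "IC2": "content linked to IC2",
        "IC3": "content linked to IC3",
        "IC4": "content linked to IC4",
    }

    def render_on_second_screen(icon_name: str) -> None:
        print(f"second screen 540 now shows: {CONTENT_FOR_ICON[icon_name]}")

    render_on_second_screen("IC4")  # selecting IC4 puts its content on screen 540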
Note that the icon operation units 111g to 111j may instead have concave shapes substantially similar to the icon images IC1 to IC4.
The icon images IC5 to IC8 displayed on the fifth screen 570 are circular, as shown in FIG. 16, and each indicates a function of the in-vehicle electronic device 20; arranged along the step portion 584, they together constitute a menu image for selecting a function of the in-vehicle electronic device 20. The icon operation portions 211g to 211j formed in the fourth operation region R4 have convex shapes substantially similar to the icon images IC5 to IC8 and are formed along the step portion 211c so as to keep the same positional relationship as the icon images IC5 to IC8. Each icon image IC5 to IC8 is switched between a selected state and a non-selected state by a touch operation on the icon operation unit 211g to 211j with which it cooperates on a one-to-one basis. That is, as shown in FIG. 16, when the user performs the touch operation OP8 on the icon operation unit 211j, the control unit 300 transmits an image signal to the projector 510 and places the icon image IC8 displayed on the fifth screen 570 and linked to the icon operation unit 211j in the selected state; in the present embodiment, the selected icon image IC8 is displayed with negative/positive inversion.
When an icon image enters the selected state, the function of the in-vehicle electronic device 20 corresponding to it, such as the function corresponding to the selected icon image IC8, is executed (see the dispatch sketch following this paragraph). When the icon operation units 211g to 211i are touch-operated, the corresponding icon images IC5 to IC7 are selected in the same way. By touching the icon operation units 211g to 211j with a fingertip, the user can feel through the sense of touch the shapes of the icon images IC5 to IC8 grasped through sight, and can operate them as though actually touching the icon images.
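The following sketch, again built on assumed names (send_command, FUNCTION_FOR_ICON, on_icon_selected) and numbered placeholder functions, shows one plausible way a selected icon could trigger the cooperating function of the in-vehicle electronic device 20.

    from typing import Callable, Dict

    def send_command(name: str) -> None:
        # Stand-in for whatever interface actually drives the in-vehicle
        # electronic device 20; the real command set is not described.
        print(f"in-vehicle electronic device 20: executing {name}")

    # IC5-IC8 each indicate a device function; "function 1".."function 4"
    # are placeholders for functions the description does not name.
    FUNCTION_FOR_ICON: Dict[str, Callable[[], None]] = {
        "IC5": lambda: send_command("function 1"),
        "IC6": lambda: send_command("function 2"),
        "IC7": lambda: send_command("function 3"),
        "IC8": lambda: send_command("function 4"),
    }

    def on_icon_selected(icon_name: str) -> None:
        FUNCTION_FOR_ICON[icon_name]()  # run the function cooperating with the icon

    on_icon_selected("IC8")  # touch operation OP8 on 211j selects IC8 and runs its function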
Note that the icon operation units 211g to 211j may instead have concave shapes substantially similar to the icon images IC5 to IC8.
As described above, the display device 500 displays, as display images on the third screen 550 and the fifth screen 570 among the plurality of screens 530 to 570, icon images IC1 to IC4 and IC5 to IC8 having predetermined shapes, and icon operation portions 111g to 111j and 211g to 211j, each having a three-dimensional shape substantially similar to the corresponding icon image, are formed in the second operation region R2 and the fourth operation region R4. The user can therefore operate the icon images IC1 to IC4 and IC5 to IC8 as if touching them directly, which enables intuitive operation.
The present invention is applicable to an operating device mounted on a vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Switches That Are Operated By Magnetic Or Electric Fields (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to an operating device that enables intuitive operation. An operating device (1000) is provided with: input means (input units) (100, 200), each having an operation surface with a plurality of operation regions and a sensor element capable of detecting the contact position of a detection target on the operation surface; display means (a display unit) (500) having a plurality of display areas; and control means for controlling a predetermined electronic device on the basis of the contact position of the detection target detected by the sensor element, and for switching the display image shown in at least one of the plurality of display areas. The plurality of display areas are arranged three-dimensionally, the plurality of operation regions are formed three-dimensionally, and at least one of the plurality of operation regions has a shape approximately similar to the shape of one of the plurality of display areas.
PCT/JP2013/068693 2012-08-02 2013-07-09 Operating device WO2014021063A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2012172279 2012-08-02
JP2012-172279 2012-08-02
JP2012269982A JP2014043232A (ja) Operating device
JP2012-269982 2012-12-11

Publications (1)

Publication Number Publication Date
WO2014021063A1 (fr)

Family

ID=50027748

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/068693 WO2014021063A1 (fr) Operating device

Country Status (2)

Country Link
JP (1) JP2014043232A (fr)
WO (1) WO2014021063A1 (fr)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6329932B2 (ja) 2015-11-04 2018-05-23 Yazaki Corporation Vehicle operation system
JP2019206297A (ja) * 2018-05-30 2019-12-05 Toyota Motor Corporation Interior structure
JP2021075157A (ja) 2019-11-08 2021-05-20 Toyota Motor Corporation Input device for vehicle
WO2022230591A1 (fr) * 2021-04-28 2022-11-03 TS Tech Co., Ltd. Input device


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003312373A (ja) * 2002-04-24 2003-11-06 Toyota Motor Corp Input device
JP2006011237A (ja) * 2004-06-29 2006-01-12 Denso Corp Display system for vehicle
JP2006117009A (ja) * 2004-10-19 2006-05-11 Tokai Rika Co Ltd Input and display device for vehicle
JP2007106353A (ja) * 2005-10-17 2007-04-26 Denso Corp Vehicle information display device and vehicle information display system
JP2012059085A (ja) * 2010-09-10 2012-03-22 Diamond Electric Mfg Co Ltd In-vehicle information device
JP2012093802A (ja) * 2010-10-22 2012-05-17 Aisin Aw Co Ltd Image display device, image display method, and program

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3222471A4 (fr) * 2014-11-19 2017-11-29 Panasonic Intellectual Property Management Co., Ltd. Input device and input method therefor
JP2017039392A (ja) * 2015-08-20 2017-02-23 Mazda Motor Corporation Display device for vehicle
JP2017134508A (ja) * 2016-01-26 2017-08-03 Toyoda Gosei Co., Ltd. Touch sensor device
US10759461B2 (en) 2019-01-31 2020-09-01 Toyota Motor Engineering & Manufacturing North America, Inc. Multi-function vehicle input apparatuses with rotatable dials for vehicle systems control and methods incorporating the same
FR3121640A1 (fr) * 2021-04-09 2022-10-14 Faurecia Interieur Industrie Vehicle control panel and production method
US11635831B2 (en) 2021-04-09 2023-04-25 Faurecia Interieur Industrie Vehicle control panel and production method
US20230091049A1 (en) * 2021-09-17 2023-03-23 Toyota Jidosha Kabushiki Kaisha Vehicular display control device, vehicular display system, vehicle, display method, and non-transitory computer-readable medium storing program
US11820226B2 (en) * 2021-09-17 2023-11-21 Toyota Jidosha Kabushiki Kaisha Vehicular display control device, vehicular display system, vehicle, display method, and non-transitory computer-readable medium storing program

Also Published As

Publication number Publication date
JP2014043232A (ja) 2014-03-13

Similar Documents

Publication Publication Date Title
WO2014021063A1 (fr) Operating device
EP2010411B1 (fr) Control device
WO2014085277A1 (fr) Light-based touch controls on a steering wheel and dashboard
JP2012190185A (ja) Control device
CN106095150B (zh) Touch input device and vehicle including the same
US10166868B2 (en) Vehicle-mounted equipment operation support system
JP2014229014A (ja) Touch panel input operation device
JP2014142777A (ja) Touch panel input operation device
KR20170029180A (ko) Vehicle and control method thereof
JP2012176631A (ja) Control device
JP2018195134A (ja) In-vehicle information processing system
JP5954145B2 (ja) Input device
JP2012208762A (ja) Touch panel input operation device
JP2013033309A (ja) Touch panel input operation device
EP2988194B1 (fr) Operating device for vehicle
JP2020204868A (ja) Display device
KR102674463B1 (ko) Vehicle and control method thereof
KR102263593B1 (ko) Vehicle and control method thereof
JP2017208185A (ja) Input device
JP2020102066A (ja) Operation input device
JP6329932B2 (ja) Vehicle operation system
JP5640816B2 (ja) Input device
JP2013232081A (ja) Touch panel input operation device
JP2014029576A (ja) Touch panel input operation device
JP6911821B2 (ja) Input device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13826153

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13826153

Country of ref document: EP

Kind code of ref document: A1