WO2017188098A1 - Vehicle-mounted information processing system - Google Patents


Info

Publication number
WO2017188098A1
Authority
WO
WIPO (PCT)
Prior art keywords
operator
display
screen
unit
control unit
Prior art date
Application number
PCT/JP2017/015774
Other languages
French (fr)
Japanese (ja)
Inventor
佐藤 晴彦
吉富 輝雄
Original Assignee
カルソニックカンセイ株式会社 (Calsonic Kansei Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by カルソニックカンセイ株式会社 (Calsonic Kansei Corporation)
Publication of WO2017188098A1 publication Critical patent/WO2017188098A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02: Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04: Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for; electric constitutive elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to an in-vehicle information processing system mounted on a vehicle to control information related to the vehicle.
  • by performing a touch operation on a touch pad installed on the center console, an operator can operate a pointer or cursor displayed on a screen, scroll the screen, and perform selection and input operations.
  • the operation input device disclosed in Patent Literature 1 reduces the influence of disturbance factors, including vibration of the running vehicle, and can smooth the cursor movement on the display unit based on touch operations on the touch pad.
  • the operation device disclosed in Patent Literature 2 allows the operator to operate a pointer or cursor displayed on the screen and to scroll the screen by touch operations on a touch pad installed in the center console.
  • Patent Literatures 1 and 2 respond to the operator's operation by drawing a pointer, a cursor, a movement vector, or the like on a single screen.
  • An object of the present invention, made in view of the above, is to provide an in-vehicle information processing system in which each screen can be easily selected.
  • to achieve this object, an in-vehicle information processing system includes: a display unit having a plurality of screens; an operation unit that detects a motion of at least a part of the operator's operating hand; and a control unit that, based on the detected motion, selects the screen whose display content is to be operated.
  • according to the present invention, each screen can be easily selected.
  • FIG. 1 is a schematic diagram showing an entire in-vehicle information processing system 10 according to the present embodiment.
  • FIG. 2 is a functional block diagram showing a schematic configuration of the in-vehicle information processing system 10 of FIG.
  • the in-vehicle information processing system 10 includes a display unit 11, a touch operation unit 12, an imaging unit 13, an operation unit 14, a control unit 15, and a storage unit 16.
  • FIG. 3 is a schematic diagram illustrating an example of an image displayed on the display unit 11.
  • FIG. 3A shows an example of a menu screen
  • FIG. 3B shows an example of a map screen.
  • FIG. 4 is a schematic cross-sectional diagram of the touch operation unit 12 viewed from the side.
  • FIG. 5 is a schematic diagram when the operation unit 14 is viewed from above.
  • the in-vehicle information processing system 10 associates the position coordinates in the operation area on a screen of the display unit 11 with the position coordinates in a predetermined area of the touch operation unit 12, and superimposes the operator's operating hand on the screen based on images captured by the imaging unit 13. That is, based on the operator's touch operation on the touch operation unit 12, the operating hand superimposed on the screen virtually operates the screen at the corresponding position.
  • the in-vehicle information processing system 10 makes the movement of the operating hand superimposed on the screen correspond to the movement of the operator's actual operating hand captured by the imaging unit 13.
  • the operator is, for example, the driver of the vehicle or a passenger sitting in the passenger seat, and the operating hand is, for example, the driver's or the passenger's own hand on the center console side.
  • the display unit 11 has at least one screen.
  • the display unit 11 may be configured by an arbitrary display device such as a liquid crystal display.
  • the display unit 11 is disposed, for example, on an instrument panel.
  • the display device constituting the display unit 11 may be a touch panel display or a display without touch operation capability. In the following description, the display unit 11 is assumed to be a display that cannot be operated by touch.
  • the in-vehicle information processing system 10 may include a so-called head-up display type device in addition to or instead of the display unit 11.
  • the head-up display type device has a light emitting unit that generates display information as display light, reflects the generated display light toward an observer such as the driver, and displays a virtual image through the front windshield.
  • the observer is not limited to the driver but may be a passenger sitting in the passenger seat.
  • the display unit 11 displays information on the vehicle, function items for controlling the information, or a combination thereof.
  • the information about the vehicle includes, for example, information such as air conditioning, car navigation, audio, an image around the vehicle by an electronic mirror, a vehicle speed, a traveling position of the host vehicle in a plurality of lanes, or an inter-vehicle distance.
  • the function items for controlling the information include, for example, "return", "forward", "home", "decision", "various menus", "temperature high/low", "current location", "volume high/low", "enlargement/reduction", "speed fast/slow", "lane change", or "distance long/short".
  • the display unit 11 may display each item as a character or an icon.
  • the display unit 11 displays various menus on one screen as function items for controlling information related to the vehicle. Specifically, the display unit 11 displays "APPS" as an item for displaying various applications, "TEL" as an item for using the telephone, "A/C" as an item for controlling the air conditioner, "NAVI" as a menu for using the car navigation, "AUDIO" as a menu for using audio, "HOME" as an item for returning to the home screen, and "RETURN" as an item for returning to the previous screen.
  • the display unit 11 displays map information that is a part of the car navigation system on one screen as information about the vehicle.
  • the display unit 11 displays function items such as "Destination setting", "HOME", and "RETURN" superimposed on the map information, as a combination of information on the vehicle and function items for controlling that information.
  • the display unit 11 displays the operator's operating hand so as to overlap the display content. As shown in FIG. 3, the display unit 11 renders the operating hand translucent so that the display content behind it remains visible.
  • the display unit 11 is not limited to this, and may display the operating hand opaquely so that the display content behind it is temporarily hidden while the hand is superimposed.
  • the degree of semi-transparency, that is, the transmittance, is described here as constant regardless of the superimposed position, but it is not limited thereto and may be changed for each superimposed position.
  • for example, at a position where a selectable item is displayed, the display unit 11 may raise the transmittance above a predetermined value so that the operator can clearly recognize which item to select. Conversely, at a position where only the background is displayed, the display unit 11 may lower the transmittance below the predetermined value.
  • the display unit 11 may display the operator's operating hand with gradation.
  • the gradations described herein may include any step change with respect to light and darkness, color, or transmittance, or a combination thereof.
  • the display unit 11 preferably applies the gradation by any method that lets the operator easily view the display content behind the hand. For example, the display unit 11 may display the operating hand so that its brightness gradually increases, its color gradually lightens, or its transmittance gradually increases toward the fingertip of the superimposed hand.
  • the display unit 11 displays each piece of display content by any display method that lets the operator easily view the display content while preserving the realism of the superimposed operating hand.
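As an illustration only (not part of the publication), the position-dependent transmittance described above can be sketched as a simple function of a pixel's distance from the fingertip; the function name and all numeric values below are assumptions:

```python
def overlay_opacity(dist_to_fingertip, near=20.0, far=120.0,
                    opacity_near=0.3, opacity_far=0.8):
    """Opacity (1 - transmittance) of a pixel of the superimposed hand.

    Pixels close to the fingertip are rendered more transparent
    (lower opacity) so that the item about to be selected stays
    visible; values interpolate linearly between the two ends.
    """
    d = max(near, min(far, dist_to_fingertip))  # clamp into [near, far]
    t = (d - near) / (far - near)               # 0 at fingertip, 1 far away
    return opacity_near + t * (opacity_far - opacity_near)
```

A non-linear ramp or per-channel color gradation could be substituted without changing the overall idea.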
  • while the display unit 11 has been described as superimposing the operator's real-world hand on a virtual space in which the above display content is shown, the present invention is not limited to this.
  • the display unit 11 may superimpose the display content and the like in front of the operator's operating hand shown on the screen, as in so-called mixed reality.
  • the touch operation unit 12 is disposed, for example, on the center console.
  • the touch operation unit 12 includes a touch pad 121 and a tact switch 122, as shown in FIG. 4. The operator places his or her arm and wrist on the armrest and palm rest, respectively, and brings a part of the operating hand, for example a finger, into contact with the touch pad 121.
  • the touch pad 121 detects contact by a contact object, such as the operator's operating hand or a stylus pen, at the corresponding contact position.
  • the touch pad 121 detects contact with a part of the operator's operating hand, for example a finger, at the corresponding contact position.
  • the operator operates information displayed on each screen constituting the display unit 11 by performing a touch operation on the touch operation unit 12, particularly the touch pad 121.
  • the touch pad 121 is formed of, for example, transparent glass, and a touch sensor of any type, such as a resistive film type, a capacitance type, a surface acoustic wave type, an infrared type, or an electromagnetic induction type, can be used. In the following description, the touch pad 121 is assumed to be a capacitive touch pad.
  • the tact switch 122 is disposed immediately below the touch pad 121 and supported by the substrate.
  • the tact switch 122 is turned on when the touch pad 121 is pressed down.
  • the tact switch 122 is turned off when the operator releases the press and the touch pad 121 returns to its original position.
  • when the tact switch 122 is turned on, the operator obtains a click feeling.
  • one tact switch 122 is arranged at the central portion immediately below the touch pad 121.
  • the tact switch 122 is not limited to this; any number of switches may be arranged at any position as long as pressing from the touch pad 121 can be detected.
  • for example, one tact switch 122 may be disposed at the outer periphery immediately below the touch pad 121, or a plurality of tact switches 122 may be disposed at dispersed positions.
  • the touch operation unit 12 may be configured to detect pressing from the touch pad 121 for each predetermined area of the touch pad 121. That is, the touch operation unit 12 may be configured to detect at which position on the touch pad 121 the operator has pressed even when a plurality of the operator's fingers are in contact with the touch pad 121 at the same time.
  • the component arranged immediately below the touch pad 121 is not limited to the tact switch 122 and may have any configuration capable of detecting pressing from the touch pad 121.
  • a pressure sensitive sensor such as a piezoelectric element may be disposed immediately below the touch pad 121.
  • the touch operation unit 12 may include a filter to remove unnecessary detection signals of the touch pad 121 caused by vibration while the vehicle is traveling.
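As a sketch of such a filter (hypothetical; the smoothing constant is an assumed value, not from the publication), an exponential moving average over the raw contact coordinates suppresses vibration-induced jitter:

```python
def smooth_touch(points, alpha=0.3):
    """Exponential moving average over raw touch coordinates.

    points: iterable of (x, y) contact positions from the touch pad.
    alpha:  smoothing constant in (0, 1]; smaller values filter more
            strongly (the value here is an assumption).
    Returns the filtered sequence of (x, y) positions.
    """
    filtered = []
    prev = None
    for x, y in points:
        if prev is None:
            prev = (x, y)  # first sample passes through unchanged
        else:
            prev = (prev[0] + alpha * (x - prev[0]),
                    prev[1] + alpha * (y - prev[1]))
        filtered.append(prev)
    return filtered
```

In a real system this would run on each new sample rather than over a stored list; the batch form is used here only for clarity.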
  • the imaging unit 13 has at least one camera and is disposed on, for example, a roof panel.
  • the camera constituting the imaging unit 13 is arranged so as to be able to image the operator's operating hand on the touch pad 121.
  • the imaging unit 13 may image at least a part of the operator's operating hand, for example only the five fingers, but preferably images the entire operating hand including the back of the hand.
  • the entire operating hand means the whole part from the vicinity of the operator's wrist to the fingertips.
  • the imaging unit 13 is preferably arranged above the operating hand, for example on a roof panel, to easily image the entire operating hand.
  • the imaging unit 13 is not limited to such an arrangement, and may be arranged at any location from which at least a part of the operator's operating hand can be imaged.
  • for example, the imaging unit 13 may be arranged directly below the transparent touch pad 121 and may capture, from below, a part of the operating hand performing a touch operation on the touch pad 121. In this case, for example, by changing the palm rest portion in FIG. 4 to an arbitrary transparent support member, the imaging unit 13 can also image the entire operating hand including the back of the hand.
  • the imaging unit 13 is preferably composed of a camera with a wide dynamic range so that the operating hand can be imaged clearly both in bright daytime conditions and in dark nighttime conditions.
  • the image captured by the camera may be a black and white image or a color image.
  • the imaging unit 13 is not limited to a camera having a wide dynamic range, and may be configured with a camera capable of imaging in bright daytime conditions. In this case, the imaging unit 13 may illuminate the operating hand on the touch pad 121 with a spotlight from above so that the operating hand can be captured clearly even at night.
  • when the vehicle performs automatic driving, the operator is assumed to recline the seat and lean back in a relaxed state. At this time, if the position of the touch operation unit 12 is fixed, the operator must extend an arm to perform touch operations while leaning back, which is inconvenient. Therefore, by configuring the center console on which the touch operation unit 12 is disposed so as to be lowered rearward in conjunction with the movement of the reclining seat, for example, the operator can easily perform touch operations without extending an arm. With such a configuration, the imaging unit 13 needs to image the operating hand on the touch pad 121 at each position that the touch operation unit 12 takes as it moves with the reclining seat.
  • the imaging unit 13 is preferably configured with a camera with a wide angle of view.
  • the imaging unit 13 is not limited to this, and may be configured such that the angle of the camera itself changes in conjunction with a change in the position of the touch operation unit 12 even if the camera has a narrow angle of view.
  • the imaging unit 13 may be configured such that the camera itself translates in conjunction with a change in the position of the touch operation unit 12.
  • when the position of the touch operation unit 12 that changes in conjunction with the movement of the reclining seat is limited to two positions, for example a position for manual driving and a position for automatic driving, a camera may be arranged for each of those positions.
  • the operation unit 14 is disposed at a position where a part of the operator's operating hand is placed, for example, the palm rest portion.
  • the part of the operating hand described here is, for example, the palm side of the wrist, that is, the wrist part.
  • the operation unit 14 has, for example, a four-way switch such as a cross key on the upper surface thereof.
  • the operator places a part of the operating hand on the upper surface of the operation unit 14, in particular on the surface of the four-way switch.
  • the operation unit 14 detects the motion of at least the part of the operator's operating hand touching the four-way switch. For example, when the operator pushes the four-way switch toward the far side with the wrist, the operation unit 14 detects that pushing operation toward the far side.
  • similarly, the operation unit 14 detects pushing operations toward the near side, the left side, or the right side. That is, the operation unit 14 can detect movements of the operator's wrist in four directions.
  • the operation unit 14 transmits a signal based on the detected operation to the control unit 15.
  • the operation unit 14 will be described below as having a four-way switch, but is not limited thereto.
  • the operation unit 14 may instead have an eight-way switch.
  • in that case, the operation unit 14 can detect motions of at least a part of the operator's operating hand in eight directions.
  • the operation unit 14 is not limited to a direction switch, and may have any configuration capable of detecting the motion of at least a part of the operator's operating hand.
  • the control unit 15 is a processor that controls and manages the entire in-vehicle information processing system 10, including each of its functional blocks.
  • the control unit 15 includes a processor such as a CPU (Central Processing Unit) that executes a program that defines a control procedure. Such a program is stored in the storage unit 16, for example.
  • the control unit 15 acquires the contact information detected on the touch pad 121 from the touch operation unit 12 as an input signal. Specifically, the control unit 15 acquires detection information regarding a contact object, for example, a contact by an operator's finger and a corresponding contact position. The control unit 15 identifies accurate position coordinates on the touch pad 121 where the touch operation is performed based on the detection information regarding the corresponding contact position.
  • the control unit 15 acquires a signal related to the on or off state of the tact switch 122 from the touch operation unit 12. Specifically, when the operator depresses the tact switch 122 via the touch pad 121, the control unit 15 acquires an on-state signal. When the operator stops pressing the touch pad 121 and releases the pressing of the tact switch 122, the control unit 15 acquires an off-state signal. The control unit 15 identifies the on state or the off state of the tact switch 122 based on the acquired signal.
  • the control unit 15 selects the corresponding item on the screen constituting the display unit 11 when the touch pad 121 detects contact by a part of the operator's operating hand. At this time, the control unit 15 highlights the item; highlighting means displaying a given item with emphasis.
  • the control unit 15 provides feedback to the operator that the above item is in a selected state by highlighting. For example, as illustrated in FIG. 3A, when contact with the operator's finger is detected at a corresponding position on the touch pad 121, the control unit 15 highlights the function item “NAVI” on the screen. At this time, the control unit 15 superimposes and displays the operator's operating hand at a corresponding position based on the image captured by the imaging unit 13.
  • the control unit 15 confirms the selection of a given item on the screen when the tact switch 122 is turned on by the touch pad 121 being pressed by a part of the operator's operating hand.
  • the operation for confirming selection of a predetermined item on the screen is not limited to this, and may be an arbitrary operation such as a double tap on the touch pad 121, for example. In this case, the touch operation unit 12 may not have the tact switch 122.
  • for example, when the tact switch 122 is turned on by a press from a part of the operator's operating hand, the control unit 15 confirms the selection of the item "NAVI" displayed on the screen. At this time, the control unit 15 causes the hand superimposed on the screen to show the same movement as the part of the operator's hand touching the touch pad 121, for example a press of the index finger or a double tap.
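The highlight-then-confirm behavior described above can be sketched as a small event handler. This is illustrative only; the event names and state layout are assumptions, not from the publication:

```python
def process_event(event, state):
    """Update selection state for one input event.

    event: ('touch', item)  finger contacts the touch pad over an item
           ('release',)     finger leaves the touch pad
           ('switch_on',)   tact switch pressed through the touch pad
    state: dict with 'highlighted' and 'confirmed' entries.
    """
    kind = event[0]
    if kind == 'touch':
        state['highlighted'] = event[1]   # item under the finger is highlighted
    elif kind == 'release':
        state['highlighted'] = None       # highlight disappears
    elif kind == 'switch_on' and state['highlighted'] is not None:
        state['confirmed'] = state['highlighted']  # selection is confirmed
    return state
```

For example, touching the position corresponding to "NAVI" highlights it, and a subsequent tact-switch press confirms it; a double tap could drive the same `switch_on` transition in the variant without a tact switch.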
  • the control unit 15 causes the display unit 11 to display information on the vehicle, function items for controlling the information, or a combination thereof.
  • the control unit 15 causes at least a part of the operator's operating hand to be superimposed on the screen at a display magnification based on the size of the operation area on the screen constituting the display unit 11, by image processing described later.
  • the control unit 15 acquires a signal based on the motion of at least a part of the operator's operating hand, for example the wrist motion detected by the operation unit 14. The control unit 15 identifies the direction of the motion from the acquired signal and, based on the detected motion, selects the screen whose display content the operator is to operate.
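One possible sketch of this screen selection follows; the direction-to-screen mapping below is an assumption chosen for illustration and does not appear in the publication:

```python
# hypothetical assignment of the four switch directions to screens
SCREEN_FOR_DIRECTION = {
    "far": "upper_screen",
    "near": "lower_screen",
    "left": "left_screen",
    "right": "right_screen",
}

def select_screen(direction, current_screen):
    """Return the screen whose display content is to be operated,
    given the wrist-motion direction detected by the operation unit;
    an unassigned direction keeps the current screen."""
    return SCREEN_FOR_DIRECTION.get(direction, current_screen)
```

With an eight-way switch, the table would simply gain four diagonal entries.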
  • the control unit 15 refers to various information stored in the storage unit 16. Specifically, the control unit 15 refers to information related to the vehicle or to the function items for controlling that information, to information on the on or off state of the tact switch 122, to the image information captured by the imaging unit 13, and to information on the operator's operating hand that has undergone image processing and is finally superimposed on the display unit 11.
  • the storage unit 16 can be composed of a semiconductor memory, a magnetic memory, or the like, and stores the various information described above, a program for operating the in-vehicle information processing system 10, and the like.
  • the storage unit 16 also functions as a work memory.
  • the storage unit 16 stores information on the operator's operating hand that has undergone image processing and is finally superimposed on the display unit 11.
  • FIG. 6 is a schematic diagram illustrating an example of a correspondence relationship between a predetermined area of the touch operation unit 12 and an operation area on the screen constituting the display unit 11.
  • FIG. 6A shows a predetermined region R1 of the touch operation unit 12.
  • FIG. 6B shows the operation area R2 on the screen constituting the display unit 11.
  • the control unit 15 sets a predetermined region R1 of the touch operation unit 12 and an operation region R2 on the screen constituting the display unit 11.
  • the predetermined area R1 of the touch operation unit 12 is the area in which the operator performs touch operations with a part of the operating hand.
  • the predetermined region R1 of the touch operation unit 12 is a part or the entire region of the touch pad 121.
  • the operation area R2 on the screen constituting the display unit 11 is the area on the screen that can be virtually operated by the operating hand superimposed on the screen.
  • the operation area R2 on the screen constituting the display unit 11 is a part or the entire area of the screen.
  • the predetermined region R1 of the touch operation unit 12 is set on the far side of the touch operation unit 12 so that the operator's operating hand is superimposed over the entire touch operation unit 12.
  • the far side of the touch operation unit 12 is, for example, the far side of the touch pad 121 constituting the touch operation unit 12. That is, as shown in FIGS. 4 and 6, the far side of the touch operation unit 12 is the area on the touch pad 121 farthest from the wrist when the operator's arm and wrist are placed on the armrest and palm rest, respectively.
  • the predetermined area R1 of the touch operation unit 12 is not limited to this, and may be an arbitrary partial area on the touch pad 121 or an entire area as described above.
  • when the predetermined area R1 of the touch operation unit 12 is an arbitrary partial area on the touch pad 121, the areas on the touch pad 121 other than the predetermined area R1 may be configured not to react to touch operations.
  • the operation area R2 on the screen constituting the display unit 11 is set at the upper part of the screen so as to correspond to the predetermined area R1 set on the far side of the touch pad 121. That is, the far side and the near side of the touch pad 121 correspond to the upper part and the lower part of the screen, respectively.
  • the correspondence relationship between the touch pad 121 and the screen constituting the display unit 11 is not limited to this. For example, the above correspondence relationship may be reversed, and the near side and the far side of the touch pad 121 may correspond to the upper part and the lower part of the screen, respectively.
  • for example, the operation region R2 on the screen constituting the display unit 11 may be set at the lower part of the screen in association with the predetermined region R1 on the far side of the touch operation unit 12.
  • the control unit 15 associates the position coordinates in the set predetermined region R1 of the touch operation unit 12 with the position coordinates in the operation region R2 on the screen constituting the display unit 11. For example, consider the case where the predetermined region R1 of the touch operation unit 12 is a rectangular region covering part or all of the touch pad 121, and the operation region R2 on the screen is a rectangular region covering part or all of the screen. In this case, the control unit 15 associates the four vertices of the predetermined region R1 with the four vertices of the operation region R2. By identifying the correspondence between the position coordinates of the four vertices, the control unit 15 can determine the correspondence between the position coordinates of every point located within the rectangular region connecting those vertices.
  • Such processing may be executed as calibration at an initial stage when the in-vehicle information processing system 10 is mounted on the vehicle, or may be executed as needed.
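For axis-aligned rectangular regions, the vertex correspondence above reduces to a per-axis linear interpolation. A minimal sketch (function and parameter names are illustrative):

```python
def map_touch_to_screen(touch_xy, r1, r2):
    """Map a contact position in touchpad region R1 to the matching
    position in on-screen operation region R2.

    r1, r2: rectangles given as (x_min, y_min, x_max, y_max); the four
    vertices of r1 correspond to the four vertices of r2.
    """
    tx, ty = touch_xy
    u = (tx - r1[0]) / (r1[2] - r1[0])  # normalized position inside R1
    v = (ty - r1[1]) / (r1[3] - r1[1])
    sx = r2[0] + u * (r2[2] - r2[0])    # corresponding position inside R2
    sy = r2[1] + v * (r2[3] - r2[1])
    return sx, sy
```

A calibration step would simply record the two rectangles; non-rectangular quadrilaterals would need a projective rather than linear mapping.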
  • FIG. 7 is a schematic diagram showing a state of image processing performed by the in-vehicle information processing system 10.
  • FIG. 7A shows the operator's operating hand performing a touch operation on the touch operation unit 12.
  • FIG. 7B shows the operator's operating hand superimposed on the screen constituting the display unit 11.
  • the control unit 15 acquires image information captured by the camera from the imaging unit 13. As shown by region R3 in FIG. 7, the captured image includes at least a part of the operator's hand performing a touch operation on the touch operation unit 12, together with the touch operation unit 12 itself, particularly the touch pad 121. That is, the imaging unit 13 captures the positional relationship between the touch operation unit 12 and the operator's operating hand. In addition, since the control unit 15 associates the position coordinates in the predetermined region R1 with the position coordinates in the operation region R2 as described above, it can superimpose at least a part of the operating hand on the screen at the position corresponding to the position of the hand on the touch operation unit 12.
  • when superimposing the operator's operating hand on the display unit 11, the control unit 15 performs image processing that extracts part or all of the operating hand from the above image. That is, the control unit 15 removes image information, such as the external background, outside the contour of the operating hand. Because the periphery of the touch pad 121 is surrounded by a black edge, the control unit 15 can easily extract the operating hand from the captured image.
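Because the surround of the touch pad is a black edge, hand pixels are markedly brighter than their surroundings, so a simple brightness threshold suffices as a sketch of the extraction step (the threshold value is an assumption):

```python
def extract_hand_mask(gray_image, threshold=40):
    """Binary mask of the operating hand in a grayscale camera image.

    gray_image: 2-D list of brightness values (0-255); the black edge
    around the touch pad keeps the background near zero.
    Returns a 2-D list of booleans, True where the hand is present.
    """
    return [[pixel > threshold for pixel in row] for row in gray_image]
```

A production system would likely add morphological cleanup and work on camera frames via an image library rather than nested lists.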
  • control unit 15 may or may not color the operating part by image processing. In order to further improve the reality of the operator's operator displayed on the display unit 11, the control unit 15 is preferably colored by image processing.
  • control unit 15 When the captured image is a color image, it is preferable that the control unit 15 directly superimposes the image on the screen in accordance with the actual color and brightness of the operator's operator.
  • The control unit 15 is not limited to this. To make the display content behind the hand easier to see, the control unit 15 may instead perform image processing that discards the color and brightness of the hand and applies a predetermined color, for example.
  • Alternatively, the control unit 15 may perform image processing that discards the color and brightness of the hand and renders it completely colorless and transparent. In this case, the control unit 15 displays only the portion near the contour of the hand on the screen.
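The semi-transparent and colorless rendering options described above amount to alpha-blending the extracted hand pixels over the screen content. A minimal per-pixel sketch (hypothetical; the patent does not specify a blending formula):

```python
def blend_pixel(hand_rgb, screen_rgb, alpha):
    """Alpha-blend one hand pixel over one screen pixel.

    alpha=1.0 shows the hand opaquely; alpha=0.0 leaves only the
    screen content visible (the fully colorless/transparent case,
    where only pixels near the hand contour would be drawn).
    """
    return tuple(round(alpha * h + (1 - alpha) * s)
                 for h, s in zip(hand_rgb, screen_rgb))

# Semi-transparent hand: the screen content remains visible behind it.
print(blend_pixel((200, 160, 140), (0, 0, 255), 0.5))
```

With an intermediate alpha, the operator sees both the hand and the display content behind it, which matches the visibility goal stated in the text.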
  • In the following description, the image captured by the imaging unit 13 is assumed to be a color image, and the control unit 15 is assumed to superimpose it on the screen as it is, in accordance with the actual color and brightness of the operator's hand. That is, the description assumes that the control unit 15 does not need to perform image processing relating to color, brightness, and the like.
  • The control unit 15 determines the display magnification of the operator's hand based on the ratio between the size of the predetermined region R1 of the touch operation unit 12 in the captured image and the size of the operation region R2 on the screen constituting the display unit 11. For example, consider the case where the predetermined region R1 is a rectangular area covering part or all of the touch pad 121, and the operation region R2 is a rectangular area covering part or all of the screen. In this case, the control unit 15 calculates the ratio between the length of each side of the predetermined region R1 and the length of each corresponding side of the operation region R2. Based on this ratio, the control unit 15 determines the display magnification of the operator's hand to be superimposed on the display unit 11.
  • Based on the determined display magnification, the control unit 15 superimposes the imaged operator's hand on the display unit 11 in an enlarged, reduced, or unchanged state.
  • The control unit 15 may set the display magnification of the operator's hand equal to the above ratio, or to a different value derived from it.
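One way to derive a magnification from the side-length ratios described above can be sketched as follows. This is an illustration under stated assumptions: the choice of taking the smaller of the two ratios (so the scaled hand never overflows the region) is added for the example and is not specified by the patent.

```python
def display_magnification(r1_size, r2_size):
    """Derive the display magnification of the superimposed hand from
    the ratio between corresponding sides of the pad region R1 and the
    screen operation region R2 (widths and heights as (w, h) pairs)."""
    ratio_w = r2_size[0] / r1_size[0]
    ratio_h = r2_size[1] / r1_size[1]
    # Use the smaller ratio so the hand never overflows the region
    # (an assumption for this sketch; the ratios could be used directly).
    return min(ratio_w, ratio_h)

# A 100x60 pad region mapped onto an 800x480 screen region.
print(display_magnification((100, 60), (800, 480)))
```

Per the text, this value could also be adjusted afterward, for example per operator, per screen, or between daytime and nighttime.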
  • The display magnification determination process may be executed together with the above-described calibration when the in-vehicle information processing system 10 is first mounted on the vehicle, or may be executed as needed.
  • The control unit 15 may fix the display magnification or change it according to the situation. For example, the control unit 15 may change the display magnification of the operator's hand between daytime and nighttime, or change it appropriately according to the operator's settings.
  • The control unit 15 may change the display magnification of the superimposed operator's hand based on the size of the operation region R2 of each screen. Without being limited to this, the control unit 15 may, for example, derive an average value based on the sizes of the operation regions R2 of the screens and keep the display magnification of the superimposed hand constant based on that average value.
  • The control unit 15 may change the display magnification of the operator's hand according to not only the size of the operation region R2 on the screen but also the content displayed on the screen. For example, when the display unit 11 displays a map or function items that the operator operates, the control unit 15 may superimpose the hand on the display unit 11 at a display magnification lower than usual so that the operator can operate more easily.
  • The control unit 15 may change the display magnification for each operator, for example so that the size of the hand superimposed on the display unit 11 matches between operators whose hands differ in size.
  • Alternatively, the control unit 15 may keep the display magnification constant among operators whose hands differ in size, and superimpose each hand at a size matching the operator's actual hand.
  • The control unit 15 performs the image processing on the image captured by the imaging unit 13 within a predetermined time.
  • The predetermined time means a time delay, between the timing of the movement of the operator's actual hand and the timing of the movement of the hand superimposed on the screen, that the operator does not notice. That is, given the operator's reaction speed and powers of recognition, the control unit 15 preferably completes the image processing within a time sufficiently shorter than the delay at which the operator would feel that the operation is out of step. For example, the control unit 15 preferably limits the image processing to the above-described extraction of the imaged operator's hand and adjustment of its display magnification.
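The idea of completing only the essential image processing within an unnoticeable delay can be sketched as a simple time-budget loop. The budget value, the step structure, and all names below are illustrative assumptions, not values from the patent:

```python
import time

def process_within_budget(steps, budget_s=0.05):
    """Run image-processing steps in priority order, skipping optional
    ones once the time budget (the delay the operator would notice)
    is exhausted. Each step is a (name, required, fn) triple."""
    done = []
    start = time.monotonic()
    for name, required, fn in steps:
        if not required and time.monotonic() - start > budget_s:
            continue  # skip optional processing to keep latency low
        fn()
        done.append(name)
    return done

# Hypothetical pipeline: only extraction and magnification adjustment
# are required, matching the preference stated in the text.
steps = [
    ("extract_hand", True, lambda: None),
    ("adjust_magnification", True, lambda: None),
    ("color_correction", False, lambda: None),
]
print(process_within_budget(steps))
```

Required steps always run; optional refinements are dropped first when the frame is running late, which keeps the superimposed hand in step with the real one.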
  • The position coordinates in the predetermined region R1 at which the touch operation by the operator's hand is detected are preferably identified not by image processing of the image captured by the imaging unit 13, but based on detection information from the touch operation unit 12, in particular the touch pad 121, as described above.
  • The control unit 15 is described above as performing two image processes, but is not limited to this and may perform three or more image processes within the predetermined time.
  • Alternatively, the position coordinates in the predetermined region R1 at which the touch operation by the operator's hand is detected may be identified by image processing of the image captured by the imaging unit 13.
  • When performing the above image processing, the control unit 15 refers to information in the storage unit 16 about the predetermined region R1 of the touch operation unit 12 and the operation region R2 on the screen constituting the display unit 11. That is, the control unit 15 refers to information on the position coordinates in the predetermined region R1 of the touch pad 121 corresponding to the detection information, to information on the position coordinates in the operation region R2 of each screen constituting the display unit 11, and to information on the display magnification of the superimposed operator's hand determined by calibration or the like.
  • FIG. 8 is an enlarged view of the display unit 11 shown in FIG.
  • the display unit 11 has four screens 111, 112, 113, and 114.
  • the screen 111 has two display layers 1111 and 1112 that are different in the depth direction.
  • the display layer 1111 of the screen 111 displays an operation screen.
  • the display layer 1112 of the screen 111 displays information about the vehicle such as the vehicle speed, the traveling position of the host vehicle in a plurality of lanes, or the inter-vehicle distance.
  • the screens 112 and 113 display an image around the vehicle by an electronic mirror.
  • the screen 114 displays map information related to car navigation and function information for controlling information related to the vehicle.
  • In the example described above, the number of screens constituting the display unit 11 is four and the number of display layers of the screen 111 is two; however, the display unit 11 may be configured with an arbitrary number of screens.
  • Likewise, the screen 111 may be configured with an arbitrary number of display layers. Although only the screen 111 is described as having display layers, the present invention is not limited to this, and the other screens constituting the display unit 11 may also have a plurality of display layers.
  • the content displayed on each screen is not limited to the above, and any screen may display any content.
  • The control unit 15 makes the movement direction from the currently selected screen to the next selected screen correspond to the direction of the movement of at least a part of the detected operator's hand (here, the wrist portion). For example, in a state where the screen 111 is selected, when the operator pushes the four-way switch to the left with the wrist, the control unit 15 selects the screen 112 installed on the left side of the screen 111. Similarly, when the operator pushes the four-way switch to the right with the wrist while the screen 111 is selected, the control unit 15 selects the screen 113 installed on the right side of the screen 111. In this state, when the operator pushes the four-way switch to the right again with the wrist, the control unit 15 selects the screen 114 installed on the right side of the screen 113.
  • When the operator pushes the four-way switch further to the left while the screen 112 at the left end is selected, the control unit 15 may wrap the selection around to the right end and select the screen 114. Alternatively, the control unit 15 may leave the selection unchanged, so that the screen 112 remains selected. The control unit 15 may perform the same control when the operator pushes the four-way switch further to the right while the screen 114 at the right end is selected.
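The left/right screen selection described above, including the wrap-around and stay-put variants, can be sketched as a small state machine. This is an illustration only; the screen names follow FIG. 8, and the left-to-right ordering 112, 111, 113, 114 is an assumption based on the text:

```python
SCREENS = ["112", "111", "113", "114"]  # assumed left-to-right layout

def select_next(current, direction, wrap=True):
    """Move the selection left or right among the screens.

    With wrap=True the selection wraps around at the ends; with
    wrap=False it stays put. Both behaviors are described in the text.
    """
    i = SCREENS.index(current)
    step = {"left": -1, "right": 1}[direction]
    j = i + step
    if wrap:
        j %= len(SCREENS)
    else:
        j = max(0, min(len(SCREENS) - 1, j))
    return SCREENS[j]

print(select_next("111", "left"))              # neighbor on the left
print(select_next("112", "left"))              # wraps to the right end
print(select_next("112", "left", wrap=False))  # selection unchanged
```

The same function could be driven directly by detection information from the four-way switch of the operation unit 14.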
  • Similarly, the control unit 15 makes the movement direction from the currently selected display layer to the next selected display layer correspond to the direction of the movement of at least a part of the detected operator's hand. For example, when the display layer 1111 is selected and the operator pushes the four-way switch to the back side with the wrist, the control unit 15 selects the display layer 1112 arranged on the back side of the display layer 1111. Conversely, when the display layer 1112 is selected and the operator pushes the four-way switch to the near side with the wrist, the control unit 15 selects the display layer 1111 arranged on the near side of the display layer 1112.
  • The target selected by the above operation is not limited to each screen and each display layer; it may also be the front windshield, on which a virtual image is displayed.
  • When the display layer 1112 is selected and the operator pushes the four-way switch further to the back side, the control unit 15 selects the front windshield, which is arranged on the back side of the display layer 1112. That is, the control unit 15 treats the display layers 1111 and 1112 and the front windshield as a single hierarchical structure.
  • In a state where the display layer 1112 or the front windshield arranged at the back end of the display unit 11 is selected, when the operator pushes the four-way switch further to the back side, the control unit 15 may wrap the selection around to the front end and select the display layer 1111.
  • Alternatively, the control unit 15 may leave the selection unchanged, so that the display layer 1112 or the front windshield remains selected.
  • The control unit 15 may perform the same control when the operator pushes the four-way switch further to the near side while the screen 111 arranged at the front end of the display unit 11 is selected.
  • When the operator pushes the four-way switch to the left in this state, the control unit 15 selects the screen 112 installed on the left side; when the operator pushes it to the right, the control unit 15 selects the screen 113 installed on the right side.
  • When the operator pushes the four-way switch in the direction of the screen 111 while the screen 112 or 113 is selected, the control unit 15 may return the selection to a specific display layer or to the front windshield (for example, the display layer 1111 at the front end). The control unit 15 may instead return the selection to the display layer or the front windshield that was selected immediately before.
  • The method of selecting a screen, a display layer, or the front windshield is not limited to the above; any method may be used as long as the selection corresponds to the direction of the movement of at least a part of the detected operator's hand.
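The depth-direction selection, which treats the display layers and the front windshield as one hierarchy, can be sketched in the same way as the left/right case. The ordering and names below are illustrative assumptions based on the text:

```python
# Assumed depth order from front to back on screen 111, treated as one
# hierarchy that includes the front windshield.
DEPTH = ["layer_1111", "layer_1112", "windshield"]

def select_depth(current, direction, wrap=True):
    """Move the selection toward the back or the front. At either end,
    either wrap around or keep the current selection, matching the two
    behaviors described in the text."""
    i = DEPTH.index(current)
    j = i + (1 if direction == "back" else -1)
    if wrap:
        j %= len(DEPTH)
    else:
        j = max(0, min(len(DEPTH) - 1, j))
    return DEPTH[j]

print(select_depth("layer_1111", "back"))   # next layer toward the back
print(select_depth("windshield", "back"))   # wraps to the front end
```

Modeling the layers and the windshield as one list mirrors the statement that the control unit 15 identifies them as a single hierarchical structure.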
  • The control unit 15 may or may not suppress the display degree of the display layers that are not selected on the screen 111. In view of the operator's visibility, the control unit 15 preferably suppresses the display degree of the unselected display layers.
  • For example, the control unit 15 may reduce the luminance of an unselected display layer on the screen 111.
  • the control unit 15 may suppress the display degree by graying out.
  • In this case, the control unit 15 may suppress all of the RGB components of the unselected display layer, or only one or two of them.
  • the control unit 15 may blur the display layer that has not been selected.
  • Alternatively, the control unit 15 need not display the unselected display layers at all. Without being limited to these methods, the control unit 15 may perform control by any method that relatively improves the visibility of the selected display layer by reducing the visibility of the unselected display layers.
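Luminance reduction and graying out of an unselected display layer can be sketched per pixel as follows. The scale factor and the simple channel-averaging formula are illustrative assumptions, not values from the patent:

```python
def dim_layer(pixels, selected, luminance_scale=0.4, grayscale=False):
    """Suppress the display degree of an unselected layer by reducing
    luminance and, optionally, graying out (averaging the RGB channels).
    A selected layer is passed through unchanged."""
    if selected:
        return pixels
    out = []
    for r, g, b in pixels:
        if grayscale:
            r = g = b = (r + g + b) // 3  # gray out the pixel
        out.append((round(r * luminance_scale),
                    round(g * luminance_scale),
                    round(b * luminance_scale)))
    return out

print(dim_layer([(200, 100, 50)], selected=False))
```

Lowering only the unselected layer's luminance relatively raises that of the selected layer, which is the clear-display effect the text describes.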
  • FIG. 9 is a flowchart showing an example of the operation of the in-vehicle information processing system 10.
  • the control unit 15 performs calibration. That is, the control unit 15 associates the set position coordinates in the predetermined region R1 of the touch operation unit 12 with the position coordinates in the operation region R2 on the screen constituting the display unit 11 (step S10).
  • The control unit 15 determines, by calibration or the like, the display magnification of the operator's hand to be superimposed on the display unit 11 (step S11).
  • the control unit 15 determines whether the operator's operating hand is superimposed on the touch operation unit 12 based on the image captured by the imaging unit 13 (step S12).
  • When the control unit 15 determines that the operator's hand is superimposed on the touch operation unit 12, the control unit 15 proceeds to step S13. Otherwise, the control unit 15 returns to step S12 and waits until the operator's hand is superimposed.
  • When it is determined that the operator's hand is superimposed on the touch operation unit 12, the control unit 15 performs image processing to extract a part or the whole of the hand (step S13).
  • the control unit 15 superimposes and displays the captured operator's hand based on the display magnification determined in step S11 (step S14).
  • The control unit 15 determines whether detection information regarding the movement of at least a part of the operator's hand has been acquired from the operation unit 14 (step S15).
  • When the detection information is acquired, the control unit 15 proceeds to step S16; when it is not acquired, the control unit 15 proceeds to step S18.
  • When acquiring the detection information, the control unit 15 newly selects the screen corresponding to the identified operation direction (step S16).
  • the control unit 15 superimposes the captured operator's hand based on the display magnification adapted to the newly selected screen (step S17).
  • the control unit 15 determines whether detection information related to the touch operation has been acquired from the touch operation unit 12 (step S18).
  • When the control unit 15 acquires the detection information, it proceeds to step S19; when it does not, it returns to step S18 and waits until the detection information is acquired.
  • When acquiring the detection information, the control unit 15 performs, on the currently selected screen, the operation based on the touch operation on the touch operation unit 12 (step S19).
  • the control unit 15 ends the flow.
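The flow of FIG. 9 can be traced with a small sketch in which the sensors are replaced by plain input values. The step labels follow the flowchart; the function name, inputs, and trace strings are illustrative assumptions:

```python
def run_once(hand_detected, direction, touch_point):
    """One pass through the flowchart of FIG. 9 (steps S10-S19),
    with sensor inputs passed in as plain values for illustration."""
    trace = ["S10:calibrate", "S11:magnification"]
    if not hand_detected:          # S12: no hand over the pad yet
        return trace + ["S12:wait"]
    trace += ["S13:extract", "S14:superimpose"]
    if direction is not None:      # S15: direction input detected
        trace += [f"S16:select:{direction}", "S17:superimpose_rescaled"]
    if touch_point is not None:    # S18: touch detected
        trace.append(f"S19:operate:{touch_point}")
    return trace

print(run_once(True, "right", (40, 20)))
```

In the real system, the waits at S12 and S18 are loops on live sensor data rather than early returns; the sketch only shows the ordering of the steps.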
  • As described above, the operator's hand displayed on the display unit 11 can virtually operate the information on the screen. That is, the operator can access the screen with a feeling closer to actually touching it. The operator can intuitively recognize the relationship between the actual position of the hand and its position on the screen. Therefore, the in-vehicle information processing system 10 can reduce the time the operator spends watching the screen compared with a conventional device that displays a pointer or the like.
  • Because the in-vehicle information processing system 10 limits the time spent on image processing, it can superimpose and display the operator's hand with minimal delay. That is, the in-vehicle information processing system 10 can reduce the temporal shift between the movement of the actual hand and that of the hand superimposed on the screen. The operator can therefore operate the information displayed on the screen with less discomfort.
  • Because the in-vehicle information processing system 10 performs image processing that extracts the operator's hand imaged by the imaging unit 13, the hand can be faithfully superimposed on the screen. As a result, the operator can intuitively recognize the hand superimposed on the screen as his or her own hand.
  • Because the in-vehicle information processing system 10 performs image processing that changes the display magnification of the operator's hand imaged by the imaging unit 13, the hand can be superimposed at an optimum size for each screen. As a result, the operator can easily see the display content behind the hand while still feeling the reality of the hand superimposed on the screen.
  • The in-vehicle information processing system 10 can superimpose and display the operator's hand with minimal delay compared with the case where the position coordinates in the predetermined region R1 at which the touch operation is detected are identified by image processing.
  • Because the in-vehicle information processing system 10 identifies the position coordinates directly with the touch operation unit 12, rather than indirectly from the image captured by the imaging unit 13, it can identify the position coordinates accurately. That is, since the in-vehicle information processing system 10 detects, with the touch operation unit 12, the position the operator is actually touching, malfunctions are less likely when the operator selects a function item displayed on the screen.
  • Because the imaging unit 13 captures the operator's entire hand, the operator can easily recognize that the hand superimposed on the screen is his or her own.
  • The operator can easily recognize, from the display on the screen, which part of the hand is moving. Because the operator accurately recognizes the relationship between the actual position of the hand and its position on the screen, the operator can easily grasp the amount of movement of the hand and its movable area on the screen.
  • By imaging the operator's entire hand with the imaging unit 13, the in-vehicle information processing system 10 can accurately identify the operator's hand on the touch pad 121. That is, the in-vehicle information processing system 10 can accurately recognize the operator's hand as a human hand.
  • The in-vehicle information processing system 10 can identify each part of the operator's hand with higher accuracy. That is, the in-vehicle information processing system 10 can accurately identify which finger each finger in the captured image corresponds to.
  • By imaging the entire hand, the in-vehicle information processing system 10 can accurately identify the size of the whole hand and the proportion of each part within it.
  • The in-vehicle information processing system 10 can accurately match the amount of movement of the operator's hand on the screen, and its movable area on the screen, to the movement of the actual hand on the touch pad 121.
  • Because the in-vehicle information processing system 10 sets the predetermined region R1 on the back side of the touch pad 121 and sets the operation region R2 on the upper part of the screen, the operator's hand inevitably lies over the touch operation unit 12 as a whole. The in-vehicle information processing system 10 can therefore reliably image the operator's entire hand with the imaging unit 13. By making the region of the touch pad 121 other than the predetermined region R1 unresponsive to touch operations, the operator's attention is concentrated on the predetermined region R1. The in-vehicle information processing system 10 can then capture the operator's entire hand even more reliably.
  • By highlighting an item selected by the operator's contact, the in-vehicle information processing system 10 can make clear, as visual information, which finger is in contact with the touch pad 121.
  • The operator can easily see at which position on the screen the superimposed hand is touching, and which item is selected.
  • the in-vehicle information processing system 10 gives a click feeling to the operator when the tact switch 122 is turned on. Therefore, the operator can obtain tactile feedback by his / her own operation, and more intuitive operation is possible.
  • the tact switch 122 is used to confirm the selection, so that the operator can confirm the item selected by a natural action.
  • Owing to the reaction force acting on the finger, the operator feels more reality in the hand superimposed on the screen. That is, the operator can easily feel as if his or her actual hand were touching the screen directly.
  • Because the superimposed operator's hand is semi-transparent, the in-vehicle information processing system 10 can preserve its reality while the operator can easily see the information displayed on the screen. That is, the operator can operate the information displayed on the screen more intuitively.
  • By installing the imaging unit 13 above the touch operation unit 12, the in-vehicle information processing system 10 can more easily image the operator's entire hand.
  • Each screen can thus be easily selected. That is, the operator can select each screen based on the visual information obtained while viewing the screens of the display unit 11, without shifting his or her line of sight to the hand. Therefore, the in-vehicle information processing system 10 makes the screen selection operation easier and improves convenience for the operator.
  • Because the in-vehicle information processing system 10 makes the movement direction from the currently selected screen to the next selected screen correspond to the detected direction of operation, an intuitive screen selection operation is possible. That is, to select the next screen, the operator can move at least a part of his or her hand with a natural feeling, based on the visually identified position of each screen.
  • Because the in-vehicle information processing system 10 includes a screen having a plurality of display layers differing in the depth direction, more information can be displayed on one screen. That is, by dividing the display layers, the in-vehicle information processing system 10 can simultaneously display different types of information on one screen.
  • Because the in-vehicle information processing system 10 makes the movement direction from the currently selected display layer to the next selected display layer correspond to the detected direction of operation, an intuitive display layer selection operation is possible. That is, to select the next display layer, the operator can move at least a part of his or her hand with a natural feeling, based on the visually identified position of each display layer.
  • the in-vehicle information processing system 10 suppresses the degree of display of a display layer that is not selected, so that the visibility of the selected display layer can be improved.
  • Because the in-vehicle information processing system 10 suppresses the display degree by reducing luminance, the selected display layer can be displayed clearly. That is, by reducing the luminance of the unselected display layers, the in-vehicle information processing system 10 relatively increases the luminance of the selected display layer, enabling a clear display.
  • the in-vehicle information processing system 10 suppresses the degree of display of the display layer that is not selected by graying out, so that the visibility of the selected display layer can be relatively improved.
  • When the in-vehicle information processing system 10 displays an unselected display layer by any of the methods described above, rather than not displaying it at all, it can improve the operator's visibility of the display layer structure. That is, the operator can easily see how many display layers are provided and at which positions on the screen.
  • The in-vehicle information processing system 10 includes the operation unit 14, which is arranged on the palm rest and includes a four-way switch. The operator can therefore select each screen or display layer with a simple operation. That is, the operator can select each screen simply by pushing the four-way switch in the corresponding direction, based on the back, near, left, and right directions of each screen or display layer identified visually.
  • the in-vehicle information processing system 10 associates the back, near, left, and right directions for each screen or each display layer with the back, near, left, and right switches of the four-way switch, respectively. Therefore, the operator can perform the selection operation intuitively.
  • The in-vehicle information processing system 10 may respond to the operator's operation with a pointer, cursor, or the like instead of displaying the operator's hand on the screen.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

Provided is a vehicle-mounted information processing system which allows easy selection of screens. The vehicle-mounted information processing system (10) comprises: a display unit (11) which has a plurality of screens; an operating unit (14) which detects an action made by at least part of the operating hand of an operator; and a control unit (15) which selects a screen for the operator to operate displayed content on the basis of the detected action.

Description

In-vehicle information processing system

Cross-reference of related applications
This application claims the priority of Japanese Patent Application No. 2016-089489, filed in Japan on April 27, 2016, the entire disclosure of which is incorporated herein by reference.
The present invention relates to an in-vehicle information processing system mounted on a vehicle in order to control information related to the vehicle.
Conventionally, by performing touch operations on a touch pad installed on the center console, an operator can operate a pointer or cursor displayed on a single screen, scroll the screen, and perform selection and input operations.
For example, the operation input device disclosed in Patent Literature 1 can reduce the influence of disturbance factors, including the vibration of a running vehicle, and smooth the movement of a cursor on a display unit based on touch operations on a touch pad.
For example, the operation device disclosed in Patent Literature 2 enables operation of a pointer or cursor displayed on the screen, and scrolling of the screen, through touch operations by an operator on a touch pad installed on the center console.
JP 2015-174648 A
JP 2016-012313 A
The conventional devices disclosed in Patent Literatures 1 and 2 respond to the operator's operation by drawing a pointer, a cursor, a movement vector, or the like on a single screen. However, when a plurality of screens are installed, how the operator selects each screen is not considered. An interface that allows each screen to be selected easily is therefore desired.
An object of the present invention, made in view of the above, is to provide an in-vehicle information processing system with which each screen can be selected easily.
In order to solve the above problem, an in-vehicle information processing system according to an embodiment of the present invention includes:
a display unit having a plurality of screens;
an operation unit that detects a movement of at least a part of an operator's hand; and
a control unit that selects, based on the detected movement, the screen on which the operator operates displayed content.
According to the in-vehicle information processing system of an embodiment of the present invention, each screen can be selected easily.
FIG. 1 is a schematic diagram showing the entire in-vehicle information processing system according to the present embodiment.
FIG. 2 is a functional block diagram showing the schematic configuration of the in-vehicle information processing system of FIG. 1.
FIG. 3 is a schematic diagram showing an example of an image displayed on the display unit.
FIG. 4 is a diagram schematically showing a cross section of the touch operation unit viewed from the side.
FIG. 5 is a schematic diagram of the operation unit viewed from above.
FIG. 6 is a schematic diagram showing an example of the correspondence between a predetermined region of the touch operation unit and an operation region on a screen constituting the display unit.
FIG. 7 is a schematic diagram showing image processing performed by the in-vehicle information processing system.
FIG. 8 is an enlarged view of the display unit shown in FIG. 1.
FIG. 9 is a flowchart showing an example of the operation of the in-vehicle information processing system of FIG. 1.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a schematic diagram showing the entire in-vehicle information processing system 10 according to the present embodiment. FIG. 2 is a functional block diagram showing the schematic configuration of the in-vehicle information processing system 10 of FIG. 1. The in-vehicle information processing system 10 includes a display unit 11, a touch operation unit 12, an imaging unit 13, an operation unit 14, a control unit 15, and a storage unit 16. FIG. 3 is a schematic diagram showing an example of an image displayed on the display unit 11. FIG. 3(a) shows an example of a menu screen, and FIG. 3(b) shows an example of a map screen. FIG. 4 is a diagram schematically showing a cross section of the touch operation unit 12 viewed from the side. FIG. 5 is a schematic diagram of the operation unit 14 viewed from above.
 車載用情報処理システム10は、表示部11を構成する画面上の操作領域内の位置座標と、タッチ操作部12の所定の領域内の位置座標とを対応させて、撮像部13により撮像した画像に基づいて、操作者の操作手を画面上に重畳表示する。すなわち、タッチ操作部12上での操作者によるタッチ操作に基づいて、画面上に重畳表示した操作手が、対応する位置で仮想的に画面を操作する。車載用情報処理システム10は、画面上に重畳表示した操作手の動きを、撮像部13によって撮像した実際の操作者の操作手の動きと対応させる。操作者とは、例えば車両を運転する運転者又は助手席に座っている同乗者であり、操作手とは、例えばセンターコンソール側の運転者又は同乗者自身の手である。 The in-vehicle information processing system 10 associates the position coordinates in the operation area on the screen configuring the display unit 11 with the position coordinates in the predetermined area of the touch operation unit 12 and images captured by the imaging unit 13. Based on the above, the operator's operating hand is superimposed on the screen. That is, based on the touch operation by the operator on the touch operation unit 12, the operator superimposed on the screen operates the screen virtually at the corresponding position. The in-vehicle information processing system 10 causes the movement of the operator superimposed on the screen to correspond to the movement of the actual operator's operator captured by the imaging unit 13. The operator is, for example, a driver who drives the vehicle or a passenger sitting in the passenger seat, and the operator is, for example, the driver on the center console side or the passenger's own hand.
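As an illustrative sketch only (not part of the disclosed system), the association between a contact position on the touch operation unit and the corresponding position in the on-screen operation region can be expressed as a simple linear mapping between two rectangles; all names below are hypothetical:

```python
def map_touch_to_screen(tx, ty, pad_rect, screen_rect):
    """Map a contact point (tx, ty) inside the touchpad region
    to the corresponding point in the on-screen operation region.

    pad_rect and screen_rect are (x, y, width, height) tuples.
    """
    px, py, pw, ph = pad_rect
    sx, sy, sw, sh = screen_rect
    # Normalize the contact point to [0, 1] within the pad region,
    # then scale it into the screen operation region.
    u = (tx - px) / pw
    v = (ty - py) / ph
    return (sx + u * sw, sy + v * sh)

# A contact at the center of a 100x60 pad maps to the center
# of an 800x480 operation region.
print(map_touch_to_screen(50, 30, (0, 0, 100, 60), (0, 0, 800, 480)))
# → (400.0, 240.0)
```

The superimposed hand image would then be drawn anchored at the mapped point, so that its motion on screen tracks the real hand's motion on the pad.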
As shown in FIG. 1, the display unit 11 has at least one screen. The display unit 11 may be configured by any display device, such as a liquid crystal display. When configured as a liquid crystal display, the display unit 11 is disposed, for example, on an instrument panel. The display device constituting the display unit 11 may be a touch panel display or a display that does not accept touch operations. In the following description, the display unit 11 is assumed to be a display that does not accept touch operations.
The in-vehicle information processing system 10 may include a so-called head-up display type device in addition to or instead of the display unit 11. In this case, the head-up display type device has a light emitting unit that generates display information as display light, and reflects the generated display light toward an observer such as the driver so that a virtual image is displayed through the front windshield. The observer is not limited to the driver and may be, for example, a passenger sitting in the passenger seat.
The display unit 11 displays information about the vehicle, function items for controlling that information, or a combination thereof. Hereinafter, these are collectively referred to as "display contents". The information about the vehicle includes, for example, air conditioning, car navigation, audio, images of the vehicle's surroundings from electronic mirrors, vehicle speed, the host vehicle's position among multiple lanes, or the distance to the vehicle ahead. The function items for controlling that information include, for example, items such as "Back", "Forward", "Home", "OK", various menus, temperature up/down, current location, volume up/down, zoom in/out, speed up/down, lane change, or following distance longer/shorter. The display unit 11 may display each item as text or as an icon.
For example, as shown in FIG. 3(a), the display unit 11 displays various menus on one screen as function items for controlling information about the vehicle. Specifically, the display unit 11 displays "APPS" as an item for displaying various applications, "TEL" as an item for using the telephone, "A/C" as an item for controlling the air conditioner, "NAVI" as a menu for using car navigation, "AUDIO" as a menu for using audio, "HOME" as an item for returning to the home screen, and "RETURN" as an item for returning to the previous screen.
For example, as shown in FIG. 3(b), the display unit 11 displays map information, which is part of the car navigation system, on one screen as information about the vehicle. As a combination of information about the vehicle and function items for controlling that information, the display unit 11 displays function items such as "Destination setting", "HOME", and "RETURN" superimposed on the map information.
The display unit 11 displays the operator's operating hand superimposed on the above display contents. As shown in FIG. 3, the display unit 11 renders the operating hand semi-transparent and displays the display contents behind it. The display unit 11 is not limited to this, and may display an opaque operating hand so that the display contents behind it are temporarily hidden when the hand is superimposed.
In the following, the degree of semi-transparency, that is, the transmittance, is described as constant regardless of the superimposition position, but it is not limited to this and may vary with the superimposition position. For example, at a position where function items are displayed, the display unit 11 may display the operating hand with the transmittance raised above a predetermined value so that the operator can clearly see which item to select. Conversely, at a position where only the background is displayed, the display unit 11 may display the hand with the transmittance lowered below the predetermined value.
The display unit 11 may display the operator's operating hand with a gradation. The gradation described here may include any gradual change in brightness, color, or transmittance, or a combination thereof. The display unit 11 preferably applies the gradation in any way that lets the operator easily see the display contents behind the hand. For example, the display unit 11 may render the superimposed hand so that it becomes gradually brighter, shifts gradually to a lighter color, or becomes gradually more transparent toward the fingertips.
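A minimal sketch of such a transmittance gradation, assuming a simple per-pixel alpha blend and a linear opacity ramp from wrist to fingertip (the function names and the 0.6/0.2 opacity values are illustrative assumptions, not values from the disclosure):

```python
def blend_hand_over_screen(screen_px, hand_px, alpha):
    """Alpha-blend one hand pixel over one screen pixel (RGB tuples).
    alpha = 1.0 draws the hand fully opaque; alpha = 0.0 hides it."""
    return tuple(
        round(alpha * h + (1.0 - alpha) * s)
        for h, s in zip(hand_px, screen_px)
    )

def fingertip_alpha(y, wrist_y, fingertip_y, base_alpha=0.6, tip_alpha=0.2):
    """Gradation: lower the hand's opacity linearly toward the fingertip
    so the display contents behind the fingertip remain visible."""
    t = (wrist_y - y) / (wrist_y - fingertip_y)  # 0 at wrist, 1 at fingertip
    t = min(max(t, 0.0), 1.0)
    return base_alpha + (tip_alpha - base_alpha) * t

# Near the wrist the hand is drawn at 0.6 opacity; toward the
# fingertip the opacity ramps down to about 0.2.
print(fingertip_alpha(400, wrist_y=400, fingertip_y=100))  # → 0.6
print(fingertip_alpha(100, wrist_y=400, fingertip_y=100))
```

The same ramp could equally drive brightness or color instead of opacity, matching the alternatives named above.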
The lower the transmittance, the more realistic the superimposed hand appears, but the harder it becomes for the operator to see the display contents behind it. Conversely, the higher the transmittance, the easier it is for the operator to see the display contents behind the hand, but the less realistic it appears. Therefore, the display unit 11 preferably uses a display method that allows the operator to easily see the display contents while preserving the realism of the superimposed operating hand.
The display unit 11 has been described as superimposing the real-world operating hand on a virtual space in which the above display contents are shown, but the present invention is not limited to this. For example, as in so-called mixed reality, the display unit 11 may superimpose the display contents in front of the operating hand displayed on the screen.
As shown in FIG. 1, the touch operation unit 12 is disposed, for example, on a center console. As shown in FIG. 2, the touch operation unit 12 has a touch pad 121 and a tact switch 122. As shown in FIG. 4, the operator rests his or her arm and wrist on the armrest and palm rest, respectively, and brings a part of the operating hand, for example a finger, into contact with the touch pad 121.
The touch pad 121 detects contact by the operator's operating hand or by a contact object such as a stylus pen at the corresponding contact position. The touch pad 121 detects contact by a part of the operating hand, for example a finger, at the corresponding contact position. By performing a touch operation on the touch operation unit 12, in particular on the touch pad 121, the operator manipulates the information displayed on each screen constituting the display unit 11. The touch pad 121 is formed of, for example, transparent glass, and may use a touch sensor of any type, such as a resistive, capacitive, surface acoustic wave, infrared, or electromagnetic induction type. In the following description, the touch pad 121 is assumed to be a capacitive touch pad.
As shown in FIG. 4, the tact switch 122 is disposed immediately below the touch pad 121 and is supported by a substrate. When the touch pad 121 is displaced downward by the operator's press, the tact switch 122 is turned on by that pressure. When the press is released and the touch pad 121 returns to its original position, the tact switch 122 is turned off. When the operator presses the touch pad 121, the tact switch 122 turns on, giving the operator a click sensation.
In FIG. 4, one tact switch 122 is disposed at the center immediately below the touch pad 121, but the arrangement is not limited to this; any number of tact switches may be disposed at any locations as long as pressure from the touch pad 121 can be detected. For example, one tact switch 122 may be disposed at the periphery immediately below the touch pad 121, or a plurality of tact switches may be disposed at distributed positions. By disposing a plurality of tact switches 122 immediately below the touch pad 121, for example, the touch operation unit 12 may be configured to detect pressure from the touch pad 121 for each predetermined area of the touch pad 121. That is, the touch operation unit 12 may be configured to detect at which position on the touch pad 121 the operator pressed when a plurality of the operator's fingers are in contact with the touch pad 121 at the same time.
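A sketch of the per-area press detection described above, under the assumption that each tact switch covers a known rectangle of the pad and that the pad reports all current contact points (the data layout and names are hypothetical):

```python
def pressed_contact(switch_states, contacts, zones):
    """Attribute a press to a contact point.

    switch_states: {switch_id: bool}         on/off state of each tact switch
    contacts:      list of (x, y) points     fingers currently on the pad
    zones:         {switch_id: (x, y, w, h)} pad area each switch covers

    Returns the contact inside the zone of an 'on' switch, or None.
    """
    for sid, is_on in switch_states.items():
        if not is_on:
            continue
        zx, zy, zw, zh = zones[sid]
        for (cx, cy) in contacts:
            if zx <= cx < zx + zw and zy <= cy < zy + zh:
                return (cx, cy)
    return None

# Two fingers rest on a 100x60 pad; only the switch under the right
# half is on, so the press is attributed to the finger at (80, 30).
zones = {"left": (0, 0, 50, 60), "right": (50, 0, 50, 60)}
print(pressed_contact({"left": False, "right": True},
                      [(20, 30), (80, 30)], zones))  # → (80, 30)
```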
The component disposed immediately below the touch pad 121 is not limited to the tact switch 122, and may have any configuration as long as pressure from the touch pad 121 can be detected. For example, instead of the tact switch 122, a pressure-sensitive sensor such as a piezoelectric element may be disposed immediately below the touch pad 121.
In addition to the touch pad 121 and the tact switch 122, the touch operation unit 12 may include a filter to remove spurious detection signals of the touch pad 121 caused by vibration while the vehicle is traveling.
The imaging unit 13 has at least one camera and is disposed, for example, on a roof panel. The camera constituting the imaging unit 13 is arranged so as to image the vehicle interior from the roof panel. More specifically, the imaging unit 13 images, from above, the touch operation unit 12 and at least a part of the operating hand of the operator performing a touch operation on it.
The imaging unit 13 may image only at least a part of the operating hand, for example only the five fingers, but it is preferable to image the entire operating hand including the back of the hand. The entire operating hand is the whole region from near the operator's wrist to the fingertips. In this case, to easily image the entire operating hand, the imaging unit 13 is preferably disposed above the hand, for example on the roof panel.
The imaging unit 13 is not limited to such an arrangement, and may be disposed at any location from which at least a part of the operating hand can be imaged. For example, the imaging unit 13 may be disposed immediately below the transparent touch pad 121 and image, from below, a part of the operating hand performing a touch operation on the touch pad 121. In this case, for example, by replacing the palm rest portion in FIG. 4 with any transparent support member, the imaging unit 13 can also image the entire operating hand including the back of the hand.
The imaging unit 13 is preferably composed of a camera with a wide dynamic range so that the operating hand can be imaged clearly both in bright daytime conditions and in dark nighttime conditions. The image captured by the camera may be a black-and-white image or a color image.
The imaging unit 13 is not limited to a camera with a wide dynamic range, and may be composed of a camera capable of imaging in bright daytime conditions. In this case, the imaging unit 13 may illuminate the operating hand on the touch pad 121 with a spotlight from above so that the hand can be imaged clearly even at night.
When the vehicle drives autonomously, the operator is expected to recline the seat and lean back in a relaxed state. If the position of the touch operation unit 12 is fixed, the operator must then stretch out an arm to perform touch operations while leaning back, which is inconvenient. Therefore, by configuring the center console on which the touch operation unit 12 is disposed to slide rearward in conjunction with the movement of the reclining seat, for example, the operator can easily perform touch operations without stretching out an arm. With such a configuration, the imaging unit 13 needs to image the operating hand on the touch pad 121 at each position that the touch operation unit 12 takes as it moves with the reclining seat.
Therefore, in addition to the above, the imaging unit 13 is preferably composed of a camera with a wide angle of view. The imaging unit 13 is not limited to this; even with a camera with a narrow angle of view, the camera's own angle may be configured to change in conjunction with changes in the position of the touch operation unit 12. Similarly, the imaging unit 13 may be configured so that the camera itself translates in conjunction with changes in the position of the touch operation unit 12. The positions that the touch operation unit 12 takes in conjunction with the movement of the reclining seat may also be limited to two, for example a manual-driving position and an autonomous-driving position, with two cameras arranged to correspond to the respective positions.
As shown in FIG. 1, the operation unit 14 is disposed, for example, on the upper surface of the palm rest. As shown in FIG. 4, the operator rests his or her arm and wrist on the armrest and palm rest, respectively, and brings a part of the operating hand into contact with the upper surface of the operation unit 14. The part of the operating hand referred to here is, for example, the palm side of the wrist, that is, the heel of the hand.
As shown in FIG. 5, the operation unit 14 has, for example, a four-way switch such as a directional pad on its upper surface. The operator places a part of the operating hand on the upper surface of the operation unit 14, in particular on the surface of the four-way switch. The operation unit 14 detects the movement of at least the part of the operating hand that is in contact with the four-way switch. For example, when the operator pushes the four-way switch toward the far side with the heel of the hand, the operation unit 14 detects the push toward the far side. Similarly, when the operator pushes the four-way switch toward the near side, the left, or the right, the operation unit 14 detects a push in that direction. That is, the operation unit 14 can detect movement of the heel of the operator's hand in four directions. The operation unit 14 transmits a signal based on the detected movement to the control unit 15.
The operation unit 14 will continue to be described as having a four-way switch, but it is not limited to this. For example, the operation unit 14 may have an eight-way switch. In this case, the operation unit 14 can detect movement of at least a part of the operating hand in eight directions. The operation unit 14 is not limited to a directional switch, and may have any configuration as long as it can detect the movement of at least a part of the operating hand.
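Since the four-way switch signal is later used to choose the screen to operate, that mapping can be sketched as a small lookup; the direction labels and screen names below are purely illustrative assumptions:

```python
# Hypothetical mapping from a four-way switch direction, pressed with
# the heel of the hand, to the screen the superimposed hand operates.
DIRECTION_TO_SCREEN = {
    "far": "instrument_panel_screen",
    "near": "center_console_screen",
    "left": "left_screen",
    "right": "right_screen",
}

def select_screen(direction, current_screen):
    """Return the screen the operator switches to; an unrecognized
    direction leaves the current selection unchanged."""
    return DIRECTION_TO_SCREEN.get(direction, current_screen)

print(select_screen("far", "center_console_screen"))
# → instrument_panel_screen
```

An eight-way switch would simply extend the lookup table with the four diagonal directions.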
The control unit 15 is a processor that controls and manages the entire in-vehicle information processing system 10, including each of its functional blocks. The control unit 15 is composed of a processor such as a CPU (Central Processing Unit) that executes a program defining the control procedure. Such a program is stored, for example, in the storage unit 16.
The control unit 15 acquires the contact information detected on the touch pad 121 from the touch operation unit 12 as an input signal. Specifically, the control unit 15 acquires detection information about a contact object, for example contact by the operator's finger, and the corresponding contact position. Based on the detection information about the contact position, the control unit 15 identifies the exact position coordinates on the touch pad 121 at which the touch operation is performed.
The control unit 15 acquires a signal indicating the on or off state of the tact switch 122 from the touch operation unit 12. Specifically, when the operator presses the tact switch 122 via the touch pad 121, the control unit 15 acquires an on-state signal. When the operator stops pressing the touch pad 121 and the tact switch 122 is released, the control unit 15 acquires an off-state signal. Based on the acquired signal, the control unit 15 identifies whether the tact switch 122 is on or off.
When the touch pad 121 detects contact by a part of the operating hand, the control unit 15 selects the corresponding item on the screen constituting the display unit 11. At this time, the control unit 15 highlights the item. Highlighting means displaying a given item with emphasis. Through the highlight, the control unit 15 feeds back to the operator that the item is in the selected state. For example, as shown in FIG. 3(a), when contact by the operator's finger is detected at the corresponding position on the touch pad 121, the control unit 15 highlights the "NAVI" function item on the screen. At this time, the control unit 15 superimposes the operating hand at the corresponding position based on the image captured by the imaging unit 13.
When the tact switch 122 is turned on by a part of the operating hand pressing the touch pad 121, the control unit 15 confirms the selection of the given item on the screen. The operation for confirming the selection of an item is not limited to this, and may be any operation, for example a double tap on the touch pad 121. In that case, the touch operation unit 12 need not have the tact switch 122.
For example, in FIG. 3(a), when the tact switch 122 is turned on by a press from part of the operating hand, the control unit 15 confirms the selection of the "NAVI" item displayed on the screen. At this time, the control unit 15 displays the same movement on the screen in accordance with the movement of the part of the operating hand touching the touch pad 121, for example a press or double tap of the index finger.
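The select-then-confirm flow above (touch highlights the item under the mapped position; a tact-switch press confirms it) can be sketched as a small event handler. The item names, rectangles, and event labels are illustrative assumptions:

```python
# Hypothetical screen items: name -> (x, y, width, height) on screen.
ITEMS = {"NAVI": (200, 100, 120, 80), "AUDIO": (340, 100, 120, 80)}

def item_at(x, y):
    """Hit-test the mapped screen position against the item rectangles."""
    for name, (ix, iy, iw, ih) in ITEMS.items():
        if ix <= x < ix + iw and iy <= y < iy + ih:
            return name
    return None

def handle_event(event, screen_pos, highlighted):
    """Return (new highlighted item, action to perform).

    'touch'     -> highlight the item under the mapped position
    'switch_on' -> confirm the currently highlighted item, if any
    """
    if event == "touch":
        return item_at(*screen_pos), "highlight"
    if event == "switch_on" and highlighted is not None:
        return highlighted, "confirm"
    return highlighted, None

state, action = handle_event("touch", (250, 120), None)
print(state, action)                            # → NAVI highlight
print(handle_event("switch_on", None, state))   # → ('NAVI', 'confirm')
```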
The control unit 15 causes the display unit 11 to display information about the vehicle, function items for controlling that information, or a combination thereof. By the image processing described later, the control unit 15 superimposes at least a part of the operating hand on the screen at a display magnification based on the size of the operation region on the screen constituting the display unit 11.
The control unit 15 acquires a signal based on the movement of at least a part of the operating hand, for example the heel of the hand, detected by the operation unit 14. Based on the acquired signal, the control unit 15 identifies, for example, the direction of the movement. Based on the detected movement, the control unit 15 selects the screen on which the operator operates the display contents.
The control unit 15 refers to various information stored in the storage unit 16. Specifically, the control unit 15 refers to the above information about the vehicle or the information on function items for controlling it. The control unit 15 refers to information on the on or off state of the tact switch 122. The control unit 15 refers to the image information captured by the imaging unit 13. The control unit 15 refers to the information on the image-processed operating hand that is ultimately superimposed on the display unit 11.
The storage unit 16 can be composed of a semiconductor memory, a magnetic memory, or the like, and stores the various information described above, programs for operating the in-vehicle information processing system 10, and the like. The storage unit 16 also functions as a work memory. As one example, the storage unit 16 stores the information on the image-processed operating hand that is ultimately superimposed on the display unit 11.
In the following, the image processing performed by the in-vehicle information processing system 10 will be described in detail with reference to FIGS. 6 and 7.
FIG. 6 is a schematic diagram showing an example of the correspondence between a predetermined region of the touch operation unit 12 and an operation region on the screen constituting the display unit 11. FIG. 6(a) shows the predetermined region R1 of the touch operation unit 12. FIG. 6(b) shows the operation region R2 on the screen constituting the display unit 11.
The control unit 15 sets the predetermined region R1 of the touch operation unit 12 and the operation region R2 on the screen constituting the display unit 11. The predetermined region R1 of the touch operation unit 12 is the region in which the operator performs touch operations with a part of the operating hand. For example, the predetermined region R1 is part or all of the touch pad 121. The operation region R2 on the screen constituting the display unit 11 is the region of the screen that the operating hand superimposed on the screen can virtually operate. For example, the operation region R2 is part or all of the screen.
As shown in FIG. 6(a), the predetermined region R1 of the touch operation unit 12 is preferably set on the far side of the touch operation unit 12 so that the operating hand is superimposed over the entire touch operation unit 12. The far side of the touch operation unit 12 is, for example, the far side of the touch pad 121 constituting it. That is, as shown in FIGS. 4 and 6, the far side of the touch operation unit 12 is the region on the touch pad 121 farthest from the wrist when the operator's arm and wrist are placed on the armrest and palm rest, respectively.
The predetermined region R1 of the touch operation unit 12 is not limited to this; as described above, it may be any partial region of the touch pad 121 or its entire region. When the predetermined region R1 is a partial region of the touch pad 121, the regions of the touch pad 121 outside the predetermined region R1 may be configured not to respond to touch operations.
On the other hand, the operation region R2 on the screen constituting the display unit 11 is set at the top of the screen so as to correspond to the predetermined region R1 when that region is set on the far side of the touch pad 121. That is, the far side and the near side of the touch pad 121 correspond to the top and bottom of the screen, respectively. Although this correspondence is the most intuitive, the correspondence between the touch pad 121 and the screen constituting the display unit 11 is not limited to it. For example, the correspondence may be reversed, with the near side and far side of the touch pad 121 corresponding to the top and bottom of the screen, respectively. In that case, to superimpose the operating hand over the entire touch operation unit 12, the operation region R2 on the screen constituting the display unit 11 may be set at the bottom of the screen, corresponding to the predetermined region R1 on the far side of the touch operation unit 12.
 The control unit 15 associates the position coordinates within the set predetermined area R1 of the touch operation unit 12 with the position coordinates within the operation area R2 on the screen constituting the display unit 11. For example, consider a case in which the predetermined area R1 of the touch operation unit 12 is a rectangular area covering part or all of the touch pad 121, and the operation area R2 on the screen is a rectangular area covering part or all of the screen. In this case, the control unit 15 associates the four vertices of the predetermined area R1 with the four vertices of the operation area R2. By identifying the correspondence between the position coordinates of the four vertices, the control unit 15 can determine the correspondence between the position coordinates of every point located within the rectangular area bounded by those vertices.
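For axis-aligned rectangular areas, the four-vertex correspondence described above reduces to linear interpolation along each axis. The following is a minimal sketch under that assumption; the function name and the (x_min, y_min, x_max, y_max) representation are illustrative and not taken from the patent.

```python
def map_point(p, r1, r2):
    """Map a touch position p from the touch pad area R1 onto the
    on-screen operation area R2 by interpolating along each axis.
    Both areas are axis-aligned rectangles (x_min, y_min, x_max, y_max)."""
    x, y = p
    ax0, ay0, ax1, ay1 = r1
    bx0, by0, bx1, by1 = r2
    u = (x - ax0) / (ax1 - ax0)  # normalized position within R1
    v = (y - ay0) / (ay1 - ay0)
    return (bx0 + u * (bx1 - bx0), by0 + v * (by1 - by0))
```

Once the four vertex pairs fix r1 and r2, the position of every interior point follows from this interpolation.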
 Such processing may be executed as calibration at the initial stage when the in-vehicle information processing system 10 is mounted on the vehicle, or it may be executed as needed.
 FIG. 7 is a schematic diagram showing image processing performed by the in-vehicle information processing system 10. FIG. 7(a) shows the operator's hand performing a touch operation on the touch operation unit 12. FIG. 7(b) shows the operator's hand superimposed on the screen constituting the display unit 11.
 The control unit 15 acquires, from the imaging unit 13, the image information captured by the camera. As shown by the region R3 in FIG. 7, the captured image includes at least part of the hand of the operator performing a touch operation on the touch operation unit 12, together with the touch operation unit 12, in particular the touch pad 121. That is, the imaging unit 13 captures the positional relationship between the touch operation unit 12 and the operator's hand. In addition, as described above, since the control unit 15 associates the position coordinates in the predetermined area R1 with the position coordinates in the operation area R2, at least part of the operator's hand can be superimposed on the screen so as to correspond to the position of the hand on the touch operation unit 12.
 When superimposing the operator's hand on the display unit 11, the control unit 15 performs image processing that extracts part or all of the operator's hand from the above image. That is, the control unit 15 removes image information, such as the background, outside the contour of the operator's hand. Because the touch pad 121 is surrounded by a black border, the control unit 15 can easily extract the operator's hand from the captured image.
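The extraction step can be sketched as a simple brightness threshold, which works precisely because the black border keeps the background dark. This is a hypothetical illustration; the threshold value and the data layout are assumptions, not specified by the patent.

```python
def extract_hand(gray, threshold=60):
    """Zero out every pixel at or below the brightness threshold.
    Because the touch pad 121 is surrounded by a black border, the
    background stays dark and only the hand survives the threshold.
    'gray' is a list of rows of 0-255 intensity values."""
    return [[px if px > threshold else 0 for px in row] for row in gray]
```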
 When the captured image is a black-and-white image, the control unit 15 may or may not color the hand region by image processing. To further improve the realism of the operator's hand displayed on the display unit 11, it is preferable that the control unit 15 color it by image processing.
 When the captured image is a color image, it is preferable that the control unit 15 superimpose it on the screen as it is, matching the actual color and brightness of the operator's hand. The control unit 15 is not limited to this; to make the display content behind the hand easier to see, it may remove the color and brightness of the hand region and instead perform image processing that applies, for example, a single predetermined color. The control unit 15 may also perform image processing that removes the color and brightness of the hand region and renders it completely colorless and transparent. In this case, the control unit 15 displays only the area near the contour of the hand on the screen.
 In the following description, the image captured by the imaging unit 13 is a color image, and the control unit 15 superimposes it on the screen as it is, matching the actual color and brightness of the operator's hand. That is, the description assumes that the control unit 15 does not need to perform image processing relating to color, brightness, and the like.
 The control unit 15 determines the display magnification of the operator's hand based on the ratio of the sizes of the predetermined area R1 of the touch operation unit 12 in the captured image and the operation area R2 on the screen constituting the display unit 11. For example, consider a case in which the predetermined area R1 of the touch operation unit 12 is a rectangular area covering part or all of the touch pad 121, and the operation area R2 on the screen is a rectangular area covering part or all of the screen. In this case, the control unit 15 calculates the ratio of the length of each side of the predetermined area R1 to the length of the corresponding side of the operation area R2 on the screen. Based on these ratios, the control unit 15 determines the display magnification of the operator's hand to be superimposed on the display unit 11.
 Based on the determined display magnification, the control unit 15 superimposes the captured image of the operator's hand on the display unit 11 in an enlarged, reduced, or unchanged state.
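The magnification derived from the side-length ratios can be sketched as follows. The (width, height) representation and the function names are illustrative assumptions; the patent does not prescribe a particular formula.

```python
def display_scale(r1_size, r2_size):
    """Per-axis magnification for the superimposed hand: the ratio of
    the on-screen operation area R2 to the touch pad area R1, each
    given as a (width, height) pair."""
    return r2_size[0] / r1_size[0], r2_size[1] / r1_size[1]

def scaled_hand_size(hand_size, scale):
    """Pixel size of the hand image after applying the magnification."""
    return round(hand_size[0] * scale[0]), round(hand_size[1] * scale[1])
```

A scale above 1.0 enlarges the hand, below 1.0 reduces it, and exactly 1.0 leaves it unchanged, matching the three cases in the text.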
 The control unit 15 may set the display magnification of the operator's hand equal to the above ratio, or to a different value based on it. The magnification determination process may be executed together with the above-described calibration at the initial stage when the in-vehicle information processing system 10 is mounted on the vehicle, or it may be executed as needed. Once the display magnification is determined, the control unit 15 may keep it fixed or vary it depending on the situation. For example, the control unit 15 may change the display magnification of the operator's hand between daytime and nighttime, or change it as appropriate according to the operator's settings.
 For example, when the display unit 11 is composed of a plurality of screens, the control unit 15 may change the display magnification of the superimposed hand for each screen, based on the size of its operation area R2. Alternatively, the control unit 15 may, for example, derive the average of the sizes of the operation areas R2 of the individual screens and, based on that average, keep the display magnification of the superimposed hand constant.
 The control unit 15 may change the display magnification of the operator's hand not only according to the size of the operation area R2 on the screen but also according to the content displayed on the screen. For example, when the display unit 11 displays a map or function items that the operator manipulates, the control unit 15 may superimpose the hand on the display unit 11 at a lower magnification than usual so that the operator can operate more easily.
 For example, the control unit 15 may change the display magnification per operator so that the hands of operators with differently sized hands appear the same size when superimposed on the display unit 11. Conversely, the control unit 15 may keep the display magnification constant across operators with differently sized hands, so that the superimposed hand matches the actual size of each operator's hand.
 It is preferable that the control unit 15 perform the image processing based on the image captured by the imaging unit 13 within a predetermined time. The predetermined time means a time delay between the timing of the operation by the operator's actual hand and the timing of the movement of the hand superimposed on the screen that is short enough for the operator not to notice. That is, it is preferable that the control unit 15 complete the image processing within a time sufficiently shorter than the delay at which the operator would feel the operation unnatural, given the operator's reaction speed, perception, and the like. For example, it is preferable that the control unit 15 limit the image processing to the above-described extraction of the captured hand and adjustment of the display magnification.
 That is, the position coordinates within the predetermined area R1 of the touch operation unit 12 at which a touch operation by the operator's hand is detected are preferably identified not by image processing of the image captured by the imaging unit 13 but, as described above, based on the detection information from the touch operation unit 12, in particular the touch pad 121.
 Although the control unit 15 is described as performing the two image processing operations above, it is not limited to this and may perform three or more image processing operations, provided they complete within the predetermined time. In that case, for example, the position coordinates within the predetermined area R1 of the touch operation unit 12 at which a touch operation by the operator's hand is detected may be identified by image processing of the image captured by the imaging unit 13.
 When performing the above image processing, the control unit 15 refers, in the storage unit 16, to information on the predetermined area R1 of the touch operation unit 12 and the operation area R2 on the screen constituting the display unit 11. That is, the control unit 15 refers to information on the position coordinates within the predetermined area R1 of the touch pad 121 that are associated with the detection information. The control unit 15 refers to information on the position coordinates within the operation area R2 of each screen constituting the display unit 11. The control unit 15 refers to information on the display magnification of the superimposed operator's hand determined by calibration or the like.
 In the following, with reference to FIG. 8, an example of an operation in which the operator selects an arbitrary screen to operate from among the plurality of screens constituting the display unit 11 will be described in detail.
 FIG. 8 is an enlarged view of the display unit 11 shown in FIG. 1. The display unit 11 has four screens 111, 112, 113, and 114. The screen 111 has two display layers 1111 and 1112 at different depths. For example, the display layer 1111 of the screen 111 displays an operation screen. The display layer 1112 of the screen 111 displays information about the vehicle, such as the vehicle speed, the position of the host vehicle among multiple lanes, or the inter-vehicle distance. For example, the screens 112 and 113 display images of the vehicle's surroundings from electronic mirrors. For example, the screen 114 displays map information for car navigation and function items for controlling information related to the vehicle.
 In FIG. 8, the display unit 11 is described as being composed of four screens and the screen 111 as having two display layers, but the configuration is not limited to this. The display unit 11 may be composed of any number of screens. The screen 111 may be composed of any number of display layers. Although only the screen 111 is described as having display layers, the other screens constituting the display unit 11 may also have a plurality of display layers. The content displayed on each screen is not limited to the above; any screen may display any content.
 Based on the signal acquired from the operation unit 14, the control unit 15 makes the direction of movement from the currently selected screen to the next selected screen correspond to the direction of the detected motion of at least part of the operator's hand (the heel of the hand). For example, with the screen 111 selected, when the operator pushes the four-way switch to the left with the heel of the hand, the control unit 15 selects the screen 112 located to the left of the screen 111. Similarly, with the screen 111 selected, when the operator pushes the four-way switch to the right with the heel of the hand, the control unit 15 selects the screen 113 located to the right of the screen 111. In this state, when the operator pushes the four-way switch to the right again with the heel of the hand, the control unit 15 selects the screen 114 located to the right of the screen 113.
 When, with the screen 112 at the left end of the display unit 11 selected, the operator pushes the four-way switch further to the left, the control unit 15 may wrap around to the right end and select the screen 114, or it may leave the selection unchanged and remain on the screen 112. The control unit 15 may perform similar control when the operator pushes the four-way switch further to the right with the screen 114 at the right end of the display unit 11 selected.
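The left/right screen selection, including the two end-of-row policies described above (wrap around or stay on the edge screen), can be sketched as follows. The list ordering reflects the layout of FIG. 8; the names and the `wrap` flag are illustrative assumptions.

```python
SCREENS = ["112", "111", "113", "114"]  # left-to-right layout of FIG. 8

def next_screen(current, direction, wrap=True):
    """Move the selection one screen left or right. At either end the
    selection may wrap around or stay on the edge screen; the text
    allows both policies, chosen here via the 'wrap' flag."""
    i = SCREENS.index(current)
    j = i + (-1 if direction == "left" else 1)
    if wrap:
        j %= len(SCREENS)
    else:
        j = max(0, min(len(SCREENS) - 1, j))
    return SCREENS[j]
```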
 Based on the signal acquired from the operation unit 14, the control unit 15 makes the direction of movement from the currently selected display layer to the next selected display layer correspond to the direction of the detected motion of at least part of the operator's hand. For example, with the display layer 1111 selected, when the operator pushes the four-way switch toward the far side with the heel of the hand, the control unit 15 selects the display layer 1112 located behind the display layer 1111. Conversely, with the display layer 1112 selected, when the operator pushes the four-way switch toward the near side with the heel of the hand, the control unit 15 selects the display layer 1111 located in front of the display layer 1112.
 For example, when the in-vehicle information processing system 10 has a head-up display type device in addition to the display unit 11, the targets selected by the above operation may include, in addition to each screen and each display layer, the front windshield on which a virtual image is displayed. In this case, for example, with the display layer 1112 selected, when the operator pushes the four-way switch toward the far side with the heel of the hand, the control unit 15 selects the front windshield located behind the display layer 1112. That is, the control unit 15 treats the display layers 1111 and 1112 and the front windshield as a single hierarchical structure.
 When, with the display layer 1112 or the front windshield at the far end of the display unit 11 selected, the operator pushes the four-way switch further toward the far side, the control unit 15 may return to the front end and select the display layer 1111. Alternatively, the control unit 15 may leave the selection unchanged and remain on the display layer 1112 or the front windshield. The control unit 15 may perform similar control when the operator pushes the four-way switch further toward the near side with the screen 111 at the front end of the display unit 11 selected.
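The depth-direction selection through the layers and the windshield follows the same pattern as the left/right case. The stack order below mirrors the hierarchy described above; the stay-at-the-ends policy is one of the two alternatives the text permits, and all names are illustrative.

```python
DEPTH_STACK = ["layer_1111", "layer_1112", "windshield"]  # front to back

def next_depth(current, direction, wrap=False):
    """Move the selection through the depth-ordered stack formed by the
    display layers of screen 111 and the front windshield. 'far' pushes
    deeper, 'near' pulls forward; with wrap=False the selection stops
    at either end rather than cycling."""
    i = DEPTH_STACK.index(current)
    j = i + (1 if direction == "far" else -1)
    if wrap:
        j %= len(DEPTH_STACK)
    else:
        j = max(0, min(len(DEPTH_STACK) - 1, j))
    return DEPTH_STACK[j]
```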
 When the operator pushes the four-way switch to the left with the display layer 1112 or the front windshield selected, the control unit 15 selects the screen 112 located to the left. When the operator pushes the four-way switch to the right with the display layer 1112 or the front windshield selected, the control unit 15 selects the screen 113 located to the right.
 When the operator pushes the four-way switch in the direction that selects the screen 111 while the screen 112 or 113 is selected, the control unit 15 may return the selection to a predetermined specific display layer or to the front windshield (for example, the frontmost display layer 1111). The control unit 15 may instead return the selection to the display layer or front windshield that was selected immediately before.
 The method of selecting a screen, a display layer, or the front windshield is not limited to the above; any method may be used as long as it corresponds to the direction of the detected motion of at least part of the operator's hand.
 The control unit 15 may or may not suppress the display of the layers not selected on the screen 111. In view of ease of viewing for the operator, it is preferable that the control unit 15 suppress the display of the unselected display layers.
 Specifically, the control unit 15 may reduce the luminance of the display layers not selected on the screen 111. The control unit 15 may suppress the display by graying them out. The control unit 15 may remove all of the RGB components of an unselected display layer, or remove only one or two of the RGB components. The control unit 15 may blur the unselected display layers. The control unit 15 may simply not display the unselected display layers at all. Without being limited to these methods, the control unit 15 may use any method that relatively enhances the visibility of the selected display layer by reducing the visibility of the unselected ones.
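Several of the suppression methods listed above can be illustrated at the level of a single pixel. The mode names, the default factor, and the choice of which RGB components to drop are illustrative assumptions, not prescribed by the patent.

```python
def dim_pixel(rgb, mode="brightness", factor=0.3):
    """Suppress one pixel of an unselected display layer.
    'brightness' scales every channel down, 'grayout' averages the
    channels, 'drop_rb' removes the red and blue components (one of
    the partial-RGB options), and 'hide' blanks the pixel entirely."""
    r, g, b = rgb
    if mode == "brightness":
        return (int(r * factor), int(g * factor), int(b * factor))
    if mode == "grayout":
        m = (r + g + b) // 3
        return (m, m, m)
    if mode == "drop_rb":
        return (0, g, 0)
    if mode == "hide":
        return (0, 0, 0)
    raise ValueError(mode)
```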
 FIG. 9 is a flowchart showing an example of the operation of the in-vehicle information processing system 10.
 The control unit 15 performs calibration. That is, the control unit 15 associates the position coordinates within the set predetermined area R1 of the touch operation unit 12 with the position coordinates within the operation area R2 on the screen constituting the display unit 11 (step S10).
 The control unit 15 determines, by calibration or the like, the display magnification of the operator's hand to be superimposed on the display unit 11 (step S11).
 Based on the image captured by the imaging unit 13, the control unit 15 determines whether the operator's hand is positioned over the touch operation unit 12 (step S12).
 If the control unit 15 determines that the operator's hand is positioned over the touch operation unit 12, it proceeds to step S13. If the control unit 15 determines that the operator's hand is not positioned over the touch operation unit 12, it returns to step S12 and waits until the hand is positioned over it.
 If the control unit 15 determines that the operator's hand is positioned over the touch operation unit 12, it performs image processing that extracts part or all of the operator's hand (step S13).
 Based on the display magnification determined in step S11, the control unit 15 superimposes the captured image of the operator's hand on the display (step S14).
 The control unit 15 determines whether it has acquired, from the operation unit 14, detection information on a motion of at least part of the operator's hand (step S15).
 If the control unit 15 has acquired the detection information, it proceeds to step S16. If it has not acquired the detection information, it proceeds to step S18.
 If the control unit 15 has acquired the detection information, it newly selects the screen corresponding to the identified direction of motion (step S16).
 The control unit 15 superimposes the captured image of the operator's hand based on the display magnification suited to the newly selected screen (step S17).
 The control unit 15 determines whether it has acquired detection information on a touch operation from the touch operation unit 12 (step S18).
 If the control unit 15 has acquired the detection information, it proceeds to step S19. If it has not acquired the detection information, it returns to step S18 and waits until it acquires the detection information.
 If the control unit 15 has acquired the detection information, it executes, on the currently selected screen, the operation based on the touch operation on the touch operation unit 12 (step S19).
 The control unit 15 then ends the flow.
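The flow of FIG. 9 can be sketched as a single pass through the steps. The `system` object and its method names are hypothetical stand-ins for the units described above, not an API defined by the patent.

```python
def run(system):
    """Walk once through steps S10-S19 of FIG. 9. 'system' is a
    hypothetical object bundling the control, imaging, touch, and
    display units; its method names are illustrative."""
    system.calibrate()                    # S10: associate R1 with R2
    scale = system.decide_scale()         # S11: pick display magnification
    while not system.hand_over_pad():     # S12: wait for the hand
        pass
    hand = system.extract_hand()          # S13: extract the hand image
    system.overlay(hand, scale)           # S14: superimpose at that scale
    if system.poll_direction():           # S15: direction gesture detected?
        system.select_screen()            # S16: switch the selected screen
        system.overlay(hand, system.decide_scale())  # S17: rescale overlay
    while not system.poll_touch():        # S18: wait for a touch
        pass
    system.execute_touch()                # S19: act on the selected screen
```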
 As described above, in the in-vehicle information processing system 10 according to the present embodiment, the operator's hand displayed on the display unit 11 virtually manipulates the information on the screen, enabling intuitive operation by the operator. That is, the operator can access the screen in a state closer to the actual sensation. The operator can intuitively recognize the relationship between the actual position of the hand and its position on the screen. Accordingly, compared with a conventional device that displays a pointer or the like, the in-vehicle information processing system 10 can reduce the time the operator spends gazing at the screen.
 Because the in-vehicle information processing system 10 limits the time spent on image processing, it can superimpose the operator's hand with minimal delay. That is, the in-vehicle information processing system 10 can reduce the temporal lag between the movements of the actual hand and those of the hand superimposed on the screen. This allows the operator to manipulate the information displayed on the screen with less sense of incongruity.
 Because the in-vehicle information processing system 10 performs image processing that extracts the operator's hand captured by the imaging unit 13, it can faithfully superimpose the operator's hand on the screen. This allows the operator to intuitively recognize the hand superimposed on the screen as his or her own.
 Because the in-vehicle information processing system 10 performs image processing that changes the display magnification of the operator's hand captured by the imaging unit 13, it can superimpose the hand at the optimum size for each screen. This allows the operator to feel the realism of the hand superimposed on the screen while easily viewing the content displayed behind it.
 車載用情報処理システム10は、タッチ操作が検出される所定の領域R1内の位置座標を画像処理により識別する場合と比べて、最小限の遅延により、操作者の操作手を重畳表示させることが可能である。車載用情報処理システム10は、撮像部13により撮像された画像に基づいて間接的に位置座標を識別するのではなく、タッチ操作部12により直接識別するので、精度良く位置座標を識別することができる。すなわち、車載用情報処理システム10は、タッチ操作部12により操作手が実際に接触している位置を検出するので、操作者が画面上に表示された機能項目などを選択する際に、誤動作を引き起こす可能性が低い。 The in-vehicle information processing system 10 can display an operator's operator in a superimposed manner with a minimum delay compared to the case where the position coordinates in the predetermined region R1 where the touch operation is detected are identified by image processing. Is possible. The in-vehicle information processing system 10 does not indirectly identify the position coordinates based on the image picked up by the image pickup unit 13 but directly identifies them by the touch operation unit 12, so that the position coordinates can be accurately identified. it can. That is, since the in-vehicle information processing system 10 detects the position where the operator is actually touching by the touch operation unit 12, when the operator selects a function item displayed on the screen, a malfunction occurs. Less likely to cause.
 車載用情報処理システム10では、撮像部13が操作者の操作手全体を撮像することで、操作者は、画面上に重畳表示された操作手が自身の手であることを容易に認識できる。操作者は、操作手のどの部分を動かしているのかを、画面上の表示に基づいて容易に認識できる。操作者は、実際の操作手の位置と、画面上の位置との関係を正確に認識するので、操作手の移動量及び移動可能な画面上の領域を容易に把握できる。 In the in-vehicle information processing system 10, the imaging unit 13 captures an image of the operator's entire operating hand, so that the operator can easily recognize that the operating hand superimposed on the screen is his / her hand. The operator can easily recognize which part of the operator is moving based on the display on the screen. Since the operator accurately recognizes the relationship between the actual position of the operator and the position on the screen, the operator can easily grasp the movement amount of the operator and the movable area on the screen.
 By capturing the operator's entire operating hand with the imaging unit 13, the in-vehicle information processing system 10 can accurately identify the operating hand on the touch pad 121. That is, the system can accurately recognize the operating hand as a human hand, and can identify each part of the hand with higher accuracy; for example, it can accurately determine which finger each finger in the captured image corresponds to. By imaging the whole hand, the system can accurately identify the overall size of the hand and the proportion of that size occupied by each part. As a result, the system can accurately map the movement amount of the operating hand on the screen, and the screen region within which it can move, to the actual movement of the hand on the touch pad 121.
 Because the in-vehicle information processing system 10 sets the predetermined region R1 on the far side of the touch pad 121 and sets the operation region R2 in the upper part of the screen, the operator inevitably places his or her operating hand over the whole of the touch operation unit 12. The imaging unit 13 can therefore inevitably capture the operator's entire operating hand. By making the regions of the touch pad 121 other than the predetermined region R1 unresponsive to touch operations, the operator's attention is concentrated on the predetermined region R1, and the system can capture the operator's entire operating hand even more reliably.
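The correspondence between position coordinates in the predetermined region R1 and the operation region R2, referred to throughout this section, can be sketched as a simple linear mapping. This is an illustrative example, not part of the patent disclosure; the region bounds and function name are assumptions.

```python
# Illustrative sketch (not from the patent): map a touch position in the
# predetermined region R1 on the touch pad to the corresponding position
# in the operation region R2 on the screen. Bounds are hypothetical
# (x, y, width, height) tuples in pixels.
R1 = (0, 0, 200, 100)     # region on the touch pad
R2 = (0, 0, 800, 400)     # operation region on the screen

def r1_to_r2(x, y):
    """Linearly map touch-pad coordinates in R1 to screen coordinates in R2."""
    x1, y1, w1, h1 = R1
    x2, y2, w2, h2 = R2
    return (x2 + (x - x1) * w2 / w1,
            y2 + (y - y1) * h2 / h1)
```

With these example bounds, a touch at the center of R1 lands at the center of R2.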
 By highlighting the item the operator has selected by contact, the in-vehicle information processing system 10 lets the operator clearly recognize, as visual information, which finger is touching the touch pad 121. The operator can easily see at which position on the screen the superimposed hand is touching, or which item is selected.
 The in-vehicle information processing system 10 gives the operator a click sensation when the tact switch 122 is turned on. The operator therefore receives tactile feedback from his or her own action, enabling more intuitive operation. In other words, because the tact switch 122 is used to confirm a selection, the operator can confirm the selected item with a natural motion. In addition, the reaction force acting on the finger makes the hand superimposed on the screen feel even more real; that is, the operator readily has the illusion that his or her actual hand is touching the screen directly.
 Because the in-vehicle information processing system 10 renders the superimposed operating hand semi-transparent, it preserves a sense of reality while keeping the information displayed on the screen easily visible. The operator can thus operate the on-screen information more intuitively.
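The semi-transparent superimposition described above amounts to alpha blending the hand image over the screen content. The following sketch is illustrative only, not part of the patent disclosure; the pixel format and the alpha value are assumptions.

```python
# Illustrative sketch (not from the patent): draw one pixel of the
# operating hand semi-transparently over one pixel of the screen
# content, so the content behind the hand remains visible.
# Pixels are (R, G, B) tuples; alpha=0.5 is an assumed example.
def blend(hand_px, screen_px, alpha=0.5):
    """Alpha-blend a hand pixel over a screen pixel."""
    return tuple(round(alpha * h + (1 - alpha) * s)
                 for h, s in zip(hand_px, screen_px))
```

Applied over a whole image, this keeps both the hand's outline and the display content behind it legible; a lower alpha makes the hand more transparent.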
 By installing the imaging unit 13 above the touch operation unit 12, the in-vehicle information processing system 10 can image the operator's entire operating hand more easily.
 In the in-vehicle information processing system 10, the operator moves at least a part of his or her operating hand on the operation unit 14, and so can easily select each screen. That is, based on the visual information obtained while viewing the screens of the display unit 11, the operator can select each screen without shifting his or her gaze to the operating hand. The system therefore simplifies the operator's screen selection and improves convenience.
 The in-vehicle information processing system 10 makes the movement direction from the currently selected screen to the screen selected next correspond to the direction of the detected motion, enabling intuitive screen selection. That is, to select the next screen, the operator can move at least a part of the operating hand in a natural way, guided by the visually identified position of each screen.
 Because the in-vehicle information processing system 10 includes a screen having a plurality of display layers at different positions in the depth direction, more information can be shown on a single screen. That is, even different kinds of information can be displayed simultaneously on one screen by separating them into display layers.
 The in-vehicle information processing system 10 makes the movement direction from the currently selected display layer to the display layer selected next correspond to the direction of the detected motion, enabling intuitive layer selection. That is, to select the next display layer, the operator can move at least a part of the operating hand in a natural way, based on the visually identified position of each layer.
 Because the in-vehicle information processing system 10 suppresses the degree of display of the display layers that are not selected, it can increase the visibility of the selected display layer.
 The in-vehicle information processing system 10 suppresses the degree of display by lowering luminance, so the selected display layer can be shown clearly. That is, by lowering the luminance of the unselected display layers, the system relatively raises the luminance of the selected layer, enabling a clear display.
 The in-vehicle information processing system 10 may instead suppress the degree of display of the unselected display layers by graying them out, which likewise relatively increases the visibility of the selected layer.
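The two suppression methods above, lowering luminance and graying out, can be sketched per pixel as follows. This is an illustrative example, not part of the patent disclosure; the pixel format, dimming factor, and luma coefficients are assumptions.

```python
# Illustrative sketch (not from the patent): suppress the display of an
# unselected layer's pixel either by lowering its luminance or by
# graying it out. Pixels are (R, G, B) tuples; the factor of 0.4 and
# the BT.601-style luma weights are assumed examples.
def dim(px, factor=0.4):
    """Lower the luminance of a pixel by a constant factor."""
    return tuple(round(c * factor) for c in px)

def gray_out(px):
    """Replace a pixel with its gray (luma) equivalent."""
    y = round(0.299 * px[0] + 0.587 * px[1] + 0.114 * px[2])
    return (y, y, y)
```

Either function, applied to every pixel of an unselected layer, leaves that layer faintly visible so the operator can still perceive the layer structure, while the selected layer stands out.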
 When the in-vehicle information processing system 10 displays the unselected display layers by any of the methods above, it improves the operator's visibility of the layer structure compared with not displaying them at all. That is, the operator can easily see how many display layers are provided and at which positions on the screen.
 The in-vehicle information processing system 10 includes the operation unit 14, which is arranged on a palm rest and constituted by a four-way switch. The operator can therefore select each screen or each display layer with a simple action: based on the far, near, left, and right directions of the visually identified screens or layers, the operator simply presses the four-way switch in the corresponding direction. Because the system associates the far, near, left, and right directions of the screens or layers with the far-side, near-side, left, and right switches of the four-way switch, respectively, the operator can make selections intuitively.
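The direction-to-selection correspondence described above can be sketched as a lookup from switch direction to the screen arranged in that direction. This is an illustrative example, not part of the patent disclosure; the screen names and their layout are hypothetical.

```python
# Illustrative sketch (not from the patent): associate each direction of
# the four-way switch on the palm rest with the screen arranged in that
# direction. Screen names and layout are hypothetical.
LAYOUT = {
    "far":   "cluster",   # screen toward the windshield
    "near":  "console",   # screen nearest the operator
    "left":  "map",
    "right": "audio",
}

def select_screen(switch_direction):
    """Return the screen selected when the switch is pressed in a direction."""
    return LAYOUT[switch_direction]
```

Because each press maps to the screen the operator already sees in that direction, the selection follows the operator's visual intuition; the same table could map the far/near directions to display layers at different depths instead.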
 In the present embodiment, an example has been described in which at least a part of the operator's operating hand is superimposed on the screen based on the image captured by the imaging unit 13, but the present invention is not limited to this. Instead of superimposing the operating hand on the screen, the in-vehicle information processing system 10 may respond to the operator's operation with a pointer, a cursor, or the like.
 It will be apparent to those skilled in the art that the present invention can be embodied in forms other than the embodiments described above without departing from its spirit or essential characteristics. The foregoing description is therefore illustrative, not restrictive. The scope of the invention is defined by the appended claims rather than by the foregoing description, and all changes that come within the range of equivalency of the claims are intended to be embraced therein.
DESCRIPTION OF REFERENCE NUMERALS

10 In-vehicle information processing system
11 Display unit
111, 112, 113, 114 Screens
1111, 1112 Display layers
12 Touch operation unit
121 Touch pad
122 Tact switch
13 Imaging unit
14 Operation unit
15 Control unit
16 Storage unit
R1 Predetermined region
R2 Operation region
R3 Region

Claims (9)

  1.  An in-vehicle information processing system comprising:
      a display unit having a plurality of screens;
      an operation unit that detects a motion of at least a part of an operating hand of an operator; and
      a control unit that selects, based on the detected motion, the screen on which the operator operates display content.
  2.  The in-vehicle information processing system according to claim 1, wherein
      the control unit makes a movement direction from the currently selected screen to the screen selected next correspond to a direction of the detected motion.
  3.  The in-vehicle information processing system according to claim 1 or 2, wherein
      the display unit includes a screen having a plurality of display layers at different positions in a depth direction.
  4.  The in-vehicle information processing system according to claim 3, wherein
      the control unit makes a movement direction from the currently selected display layer to the display layer selected next correspond to a direction of the detected motion.
  5.  The in-vehicle information processing system according to claim 3 or 4, wherein
      the control unit suppresses a degree of display of the display layers that are not selected.
  6.  The in-vehicle information processing system according to claim 5, wherein
      the control unit suppresses the degree of display by lowering luminance.
  7.  The in-vehicle information processing system according to claim 5, wherein
      the control unit suppresses the degree of display by graying out.
  8.  The in-vehicle information processing system according to any one of claims 1 to 7, wherein
      the operation unit is arranged on a palm rest and is constituted by a four-way switch.
  9.  The in-vehicle information processing system according to any one of claims 1 to 8, further comprising:
      a touch operation unit that detects contact by the operating hand of the operator; and
      an imaging unit that captures images of at least a part of the operating hand and the touch operation unit,
      wherein the control unit associates position coordinates within an operation region on the selected screen with position coordinates within a predetermined region of the touch operation unit, and superimposes at least a part of the operating hand on the screen based on the image captured by the imaging unit.
PCT/JP2017/015774 2016-04-27 2017-04-19 Vehicle-mounted information processing system WO2017188098A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-089489 2016-04-27
JP2016089489A JP2017197016A (en) 2016-04-27 2016-04-27 On-board information processing system

Publications (1)

Publication Number Publication Date
WO2017188098A1 true WO2017188098A1 (en) 2017-11-02

Family

ID=60161585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015774 WO2017188098A1 (en) 2016-04-27 2017-04-19 Vehicle-mounted information processing system

Country Status (2)

Country Link
JP (1) JP2017197016A (en)
WO (1) WO2017188098A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7338624B2 (en) * 2018-06-18 2023-09-05 日本精機株式会社 VEHICLE DISPLAY DEVICE, VEHICLE DISPLAY CONTROL METHOD, VEHICLE DISPLAY CONTROL PROGRAM
JP2020071641A (en) * 2018-10-31 2020-05-07 株式会社デンソー Input operation device and user interface system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293601A (en) * 2005-04-08 2006-10-26 Nissan Motor Co Ltd Information operation apparatus
WO2007088942A1 (en) * 2006-02-03 2007-08-09 Matsushita Electric Industrial Co., Ltd. Input device and its method
JP2014056462A (en) * 2012-09-13 2014-03-27 Toshiba Alpine Automotive Technology Corp Operation device
JP2015149022A (en) * 2014-02-07 2015-08-20 東日本電信電話株式会社 information providing apparatus and information providing method
JP2015152964A (en) * 2014-02-10 2015-08-24 トヨタ自動車株式会社 Information display system for vehicle


Also Published As

Publication number Publication date
JP2017197016A (en) 2017-11-02

Similar Documents

Publication Publication Date Title
CN110045825B (en) Gesture recognition system for vehicle interaction control
JP4351599B2 (en) Input device
KR101367593B1 (en) Interactive operating device and method for operating the interactive operating device
US8085243B2 (en) Input device and its method
US9604542B2 (en) I/O device for a vehicle and method for interacting with an I/O device
US9956878B2 (en) User interface and method for signaling a 3D-position of an input means in the detection of gestures
CN108108042B (en) Display device for vehicle and control method thereof
US11112873B2 (en) Method for operating a display device for a motor vehicle and motor vehicle
CN109643219A (en) Method for being interacted with the picture material presented in display equipment in the car
US20180157324A1 (en) Method and Device for Interacting with a Graphical User Interface
WO2014162697A1 (en) Input device
JPWO2012026402A1 (en) Vehicle control device
KR101806172B1 (en) Vehicle terminal control system and method
CN111032414A (en) Control system for main display of autonomous vehicle
WO2017138545A1 (en) Information processing system, information processing device, control method, and program
US10953749B2 (en) Vehicular display device
JP2017197015A (en) On-board information processing system
JP2018136616A (en) Display operation system
WO2017188098A1 (en) Vehicle-mounted information processing system
JP2018195134A (en) On-vehicle information processing system
JP2018010472A (en) In-vehicle electronic equipment operation device and in-vehicle electronic equipment operation method
TWM564749U (en) Vehicle multi-display control system
WO2017175666A1 (en) In-vehicle information processing system
JP2009184551A (en) On-vehicle equipment input device
JP2017187922A (en) In-vehicle information processing system

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17789382

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17789382

Country of ref document: EP

Kind code of ref document: A1