WO2019146032A1 - Gesture operation device and gesture operation method - Google Patents

Gesture operation device and gesture operation method

Info

Publication number
WO2019146032A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture operation
operator
gesture
display
unit
Prior art date
Application number
PCT/JP2018/002242
Other languages
French (fr)
Japanese (ja)
Inventor
直志 宮原
下谷 光生
Original Assignee
Mitsubishi Electric Corporation
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2018/002242
Priority to JP2019567457A
Publication of WO2019146032A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • The present invention relates to a gesture operation device and a gesture operation method, and more particularly to a gesture operation device and a gesture operation method for guiding an operator to perform a gesture operation.
  • Conventionally, gesture operation devices have been developed that perform volume adjustment, answer incoming calls, and the like when an operator such as a driver performs a gesture operation.
  • A gesture operation is an operation for executing an arbitrary function by moving one's hand or the like in an arbitrary direction in three-dimensional space.
  • A technology has been disclosed for displaying information that guides the gesture operation the operator should perform when executing an arbitrary function (see, for example, Patent Document 1).
  • In Patent Document 1, the information for guiding a gesture operation is displayed in two dimensions. Therefore, in the case of a gesture operation involving movement in a three-dimensional direction, the guiding information is expressed in two dimensions, and the operator may not be able to correctly understand the gesture operation. In this case, the gesture operation device may not be able to accurately determine the gesture operation performed by the operator.
  • The present invention has been made to solve such a problem, and it is an object of the present invention to provide a gesture operation device and a gesture operation method capable of accurately determining a gesture operation performed by an operator.
  • The gesture operation device according to the present invention includes an event detection unit that detects an event, and a display control unit that performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event detected by the event detection unit.
  • The gesture operation method according to the present invention detects an event and, for the detected event, performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event.
  • Because the gesture operation device includes the event detection unit and this display control unit, it can accurately determine the gesture operation performed by the operator.
  • Likewise, because the gesture operation method performs control to display the guiding operation object in three dimensions, the gesture operation performed by the operator can be accurately determined.
  • FIG. 1 is a block diagram showing an example of the configuration of the gesture operation device 1 according to the first embodiment.
  • FIG. 1 shows the minimum configuration required to constitute the gesture operation device according to this embodiment.
  • In the following, a case where the operator performs a gesture operation by moving his or her own hand is described, but the invention is not limited to this.
  • As shown in FIG. 1, the gesture operation device 1 includes an event detection unit 2 and a display control unit 3.
  • The event detection unit 2 detects an event. Examples of events, as described later, include an incoming e-mail, an incoming call, and operation of an in-vehicle device.
  • The display control unit 3 is connected to the display device 4. For an event detected by the event detection unit 2, the display control unit 3 performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event.
  • FIGS. 2 to 4 are diagrams showing examples of the gesture operation.
  • In these figures, the X axis indicates the left-right direction as viewed from the operator, the Y axis indicates the up-down direction as viewed from the operator, and the Z axis indicates the front-rear direction as viewed from the operator.
  • The operator can perform a gesture operation of rotating his or her hand in the front-rear and up-down directions.
  • The operator can perform a gesture operation of moving his or her hand in the front-rear direction.
  • The operator can perform a gesture operation of moving his or her hand in the left-right direction. In this way, the operator can perform a gesture operation by moving the hand in any direction in three-dimensional space.
  • FIGS. 5 to 7 show examples of the display device 4.
  • In these figures, the X axis indicates the left-right direction as viewed from the operator, the Y axis indicates the up-down direction as viewed from the operator, and the Z axis indicates the front-rear direction as viewed from the operator.
  • FIG. 5 shows an example of the display device 4 that performs autostereoscopic display.
  • Autostereoscopic display means that an image appears stereoscopic when the operator 5 views the display screen with the naked eye, that is, the image is perceived as having depth in a three-dimensional display space.
  • In FIG. 5, by alternately displaying a display object for the right eye of the operator 5 and a display object for the left eye on the XY plane, the operator 5 perceives the display object as being displayed three-dimensionally.
  • FIG. 6 shows an example of the display device 4 that displays a virtual image, such as a HUD (Head Up Display).
  • FIG. 7 shows an example of the display device 4 configured by stacking a plurality of transmissive display surfaces in the Z-axis direction. By displaying a single display object superimposed across the display surfaces, the operator 5 perceives the display object as being displayed three-dimensionally.
  • The display device 4 is any one, or any combination, of the display devices shown in FIGS. 5 to 7, and, under the control of the display control unit 3, displays in three dimensions an operation object for guiding a gesture operation in three-dimensional space as shown in FIGS. 2 to 4. Details of the operation object will be described later.
  • FIG. 8 is a block diagram showing another example of the configuration of a gesture operation device, the gesture operation device 6.
  • The gesture operation device 6 includes an event detection unit 2, a display control unit 3, a communication unit 7, a gesture operation acquisition unit 8, an operation determination unit 9, and a device control unit 10.
  • The communication unit 7 has a terminal communication unit 11 that can be communicably connected to the mobile communication terminal 12.
  • The terminal communication unit 11 receives from the mobile communication terminal 12, for example, information indicating that an e-mail has arrived or that a call has arrived. The terminal communication unit 11 also transmits information for operating the mobile communication terminal 12 to the mobile communication terminal 12 in accordance with the determination result of the operation determination unit 9.
  • The event detection unit 2 detects as an event, for example, that the terminal communication unit 11 has received from the mobile communication terminal 12 information indicating an incoming e-mail or an incoming call.
  • For an event detected by the event detection unit 2, the display control unit 3 controls the display device 4 to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event.
  • The gesture operation acquisition unit 8 is connected to the gesture operation detection device 13 and acquires the gesture operation of the operator 5 from the gesture operation detection device 13.
  • FIG. 9 is a diagram showing an example of the configuration of the gesture operation detection device 13.
  • The gesture operation detection device 13 is installed, for example, on a floor console in a car, and includes at least one of a ToF (Time of Flight) sensor, an image sensor, and a proximity sensor.
  • The gesture operation space 17 corresponds to the detection range of the gesture operation detection device 13. That is, the gesture operation detection device 13 detects gesture operations performed by the operator 5 within the gesture operation space 17.
  • The gesture operation detection device 13 detects not only gesture operations corresponding to actions the operator can perform to advance an event, but also other hand movements, which are likewise detected as gesture operations.
  • The gesture operation detection device 13 may comprise a single sensor, in which case it detects gesture operations from one direction. It may also comprise multiple sensors; for example, if three sensors are provided at three points A, B, and C as shown in FIG. 10, the gesture operation detection device 13 detects gesture operations from three directions, and the detection accuracy of gesture operations can be improved compared with the case of a single sensor.
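  • As a rough illustration of why detection from several directions improves accuracy, the following Python sketch fuses hand-position estimates from three hypothetical sensors with a per-axis median; the sensor model and fusion rule are assumptions for illustration, not something specified in this publication.

```python
# Minimal sketch (assumed, not from the patent): fusing hand-position
# estimates from sensors at points A, B, and C to reduce detection error.
from statistics import median

def fuse_positions(estimates):
    """Combine per-sensor (x, y, z) estimates with a per-axis median,
    which tolerates one sensor being occluded or noisy."""
    xs, ys, zs = zip(*estimates)
    return (median(xs), median(ys), median(zs))

# Readings from three sensors observing the same hand; sensor B is noisy.
sensor_a = (0.10, 0.52, 0.30)
sensor_b = (0.95, 0.50, 0.31)  # outlier, e.g. partially occluded view
sensor_c = (0.11, 0.53, 0.29)

print(fuse_positions([sensor_a, sensor_b, sensor_c]))
# -> (0.11, 0.52, 0.30): closer to the truth than the noisy reading
```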
  • The operation determination unit 9 determines whether the gesture operation acquired by the gesture operation acquisition unit 8 is a movement that follows the operation object. The operation determination unit 9 is also connected to the voice input device 14 and can determine what operation to perform based on what the operator 5 utters via the voice input device 14. In this way, the operation determination unit 9 can determine the content of a voice operation by the operator 5.
  • The device control unit 10 controls devices such as the on-vehicle device 15 and the voice output device 16 based on the gesture operation.
  • The device control unit 10 also controls the mobile communication terminal 12, which is in communication with the communication unit 7, based on the gesture operation.
  • Examples of the on-vehicle device 15 include a navigation device, an audio device, and various control devices provided in the vehicle.
  • Examples of the voice output device 16 include a speaker.
  • FIG. 11 is a block diagram showing an example of the hardware configuration of the gesture operation device 6. The same applies to the gesture operation device 1 shown in FIG. 1.
  • The functions of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10 in the gesture operation device 6 are realized by a processing circuit. That is, the gesture operation device 6 includes a processing circuit for communicating with the mobile communication terminal 12, detecting an event, controlling display, acquiring a gesture operation, determining a gesture operation, and controlling devices.
  • The processing circuit is a processor 18 (also called a central processing unit, processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes a program stored in the memory 19.
  • Each function of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10 in the gesture operation device 6 is realized by software, firmware, or a combination of software and firmware. The software or firmware is described as a program and stored in the memory 19.
  • The processing circuit realizes the function of each unit by reading and executing the program stored in the memory 19. That is, the gesture operation device 6 includes the memory 19 for storing programs whose execution results in communicating with the mobile communication terminal 12, detecting an event, controlling display, acquiring a gesture operation, determining a gesture operation, and controlling devices.
  • The memory 19 is, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD, or any storage medium to be used in the future.
  • FIG. 12 is a flowchart showing an example of the operation of the gesture operation device 1 shown in FIG. 1.
  • In step S101, the event detection unit 2 determines whether an event has been detected.
  • The event detection unit 2 repeats the process of step S101 until an event is detected; when it determines that an event has been detected, the process proceeds to step S102.
  • In step S102, for the event detected by the event detection unit 2, the display control unit 3 controls the display device 4 to display, in three dimensions, an operation object that guides a gesture operation indicating the intention of the operator 5 toward an action the operator 5 can perform to advance the event.
  • Specifically, a three-dimensional operation object for guiding a three-dimensional gesture operation as shown in FIGS. 2 to 4 is displayed on the display device 4, which is capable of three-dimensional display.
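  • The two-step flow of FIG. 12 can be summarized in the short Python sketch below; the class and method names are hypothetical placeholders, and the detection and rendering back ends are stubbed out with print statements.

```python
# Minimal sketch of the FIG. 12 flow (names are hypothetical, not from the patent).
import time

class EventDetector:
    """Hypothetical stand-in for the event detection unit 2."""
    def __init__(self):
        self._queued = ["incoming_mail"]  # pretend one e-mail arrives
    def poll(self):
        return self._queued.pop(0) if self._queued else None

class DisplayController:
    """Hypothetical stand-in for the display control unit 3."""
    def show_operation_object_3d(self, event):
        # A real unit would drive an autostereoscopic display or HUD.
        print(f"displaying 3D operation object for event: {event}")

def run(detector, display, poll_interval_s=0.1):
    while True:
        event = detector.poll()                    # step S101: event detected?
        if event is None:
            time.sleep(poll_interval_s)            # repeat S101 until one arrives
            continue
        display.show_operation_object_3d(event)    # step S102: guide the operator
        break

run(EventDetector(), DisplayController())
```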
  • FIG. 13 is a flowchart showing an example of the operation of the gesture operation device 6 shown in FIG. 8. FIG. 13 describes a case where the operation object is displayed as a tutorial, which is illustrated using FIG. 14.
  • Here, a tutorial means displaying on the display device 4, as an operation object, a series of gesture operations that the operator 5 should perform in response to an event.
  • In step S201, the event detection unit 2 determines whether an event has been detected. Specifically, the event detection unit 2 repeats the process of step S201 until the terminal communication unit 11 receives from the mobile communication terminal 12 information indicating an incoming e-mail or an incoming call. When the terminal communication unit 11 receives such information from the mobile communication terminal 12, the event detection unit 2 determines that an event has been detected, and the process proceeds to step S202.
  • When the terminal communication unit 11 receives information indicating that an e-mail has arrived from the mobile communication terminal 12, an unread icon 20 is displayed on the display device 4.
  • The example in FIG. 14 indicates that there are two unread e-mails.
  • In step S202, for the event detected by the event detection unit 2, the display control unit 3 controls the display device 4 to three-dimensionally display an operation object indicating a series of gesture operations that the operator 5 should perform to advance the event.
  • In the example of FIG. 14, the tutorial shows that when the operator 5 performs a gesture operation to select the unread icon 20 displayed on the display device 4, a list of unread e-mails is displayed, and that when the operator 5 further selects a specific e-mail from the list, the contents of that e-mail are read out by voice.
  • Specifically, an operation object 21 indicating a gesture operation for selecting the unread icon 20 is three-dimensionally displayed on the display device 4. At this time, the operation object 21 moves from the front toward the back of the display device 4 so as to select the unread icon 20.
  • In addition, text to the effect of "select a mail and it can be read out by voice" is displayed.
  • In step S203, the gesture operation acquisition unit 8 determines whether it has acquired from the gesture operation detection device 13 a gesture operation by the operator 5, that is, whether the operator 5 has performed a gesture operation.
  • If a gesture operation has been acquired, the process proceeds to step S205. If a gesture operation has not been acquired, the process proceeds to step S204.
  • In step S204, the gesture operation acquisition unit 8 determines whether a predetermined time has elapsed. This time may be set in advance or may be set arbitrarily by the user. If the predetermined time has elapsed, the operation shown in FIG. 13 ends. If it has not elapsed, the process returns to step S203.
  • In step S205, the operation determination unit 9 determines whether the gesture operation by the operator 5 acquired by the gesture operation acquisition unit 8 is a movement that follows the tutorial displayed on the display device 4. That is, the operation determination unit 9 determines whether the gesture operation by the operator 5 follows the operation object shown in the tutorial. If the gesture operation does not follow the tutorial, the process proceeds to step S206. If it does, the process proceeds to step S207.
  • In step S206, the display control unit 3 performs control to notify the operator that the gesture operation has failed. Specifically, the display control unit 3 performs control to display on the display device 4 that the gesture operation by the operator 5 does not follow the tutorial.
  • In step S207, having determined that the gesture operation by the operator 5 follows the tutorial, the device control unit 10 controls the target device based on the gesture operation.
  • For example, the contents of the e-mail selected by the gesture operation of the operator 5 are read out.
  • The device control unit 10 may control the voice output device 16 so that the contents of the e-mail are read out from the voice output device 16, or may control the terminal communication unit 11 so that the contents of the e-mail are read out from the mobile communication terminal 12.
  • In step S208, the operation determination unit 9 determines whether the gesture operation by the operator 5 has ended. Specifically, the operation determination unit 9 determines whether the gesture operations acquired by the gesture operation acquisition unit 8 have completed all the operations in the tutorial displayed on the display device 4. When the gesture operation has ended, the operation shown in FIG. 13 ends. If it has not ended, the process returns to step S207.
  • In this way, the operator 5 can correctly understand what kind of gesture operation should be performed when an e-mail arrives.
  • When the display device 4 is a HUD as shown in FIG. 6, the driver who is the operator 5 can perform gesture operations without turning his or her eyes away from the road ahead.
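  • The loop of steps S201 to S208 can be read as a small state machine. The sketch below is an assumed Python illustration with hypothetical names; the detector, determination, notification, and device control are collapsed into plain function arguments.

```python
# Minimal sketch of the FIG. 13 tutorial flow (hypothetical names, stubbed I/O).
import time

def run_tutorial(get_gesture, matches_tutorial, act_on, notify_failure,
                 tutorial_steps, timeout_s=10.0):
    deadline = time.monotonic() + timeout_s
    done = 0
    while done < len(tutorial_steps):
        gesture = get_gesture()                     # S203: any gesture yet?
        if gesture is None:
            if time.monotonic() > deadline:         # S204: give up on timeout
                return False
            continue
        if not matches_tutorial(gesture, tutorial_steps[done]):  # S205
            notify_failure()                        # S206
            continue
        act_on(gesture)                             # S207: e.g. read mail aloud
        done += 1                                   # S208: all steps finished?
    return True

# Usage with trivial stand-ins: one 'select_unread' step succeeds immediately.
ok = run_tutorial(
    get_gesture=iter(["select_unread"]).__next__,
    matches_tutorial=lambda g, step: g == step,
    act_on=lambda g: print("reading selected mail aloud"),
    notify_failure=lambda: print("gesture did not follow the tutorial"),
    tutorial_steps=["select_unread"],
)
print("tutorial completed:", ok)
```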
  • FIG. 15 is a flowchart showing an example of the operation of the gesture operation device 6 shown in FIG. 8. Here, a case where an operation object that guides the single gesture operation the operator 5 should perform next when an event occurs is displayed will be described using FIG. 16.
  • In step S301, the event detection unit 2 repeats the process of step S301 until an event is detected, as in step S201 of FIG. 13. When the event detection unit 2 determines that an event has been detected, the process proceeds to step S302.
  • When there is an incoming call, the display device 4 displays information on the caller together with a handset icon.
  • In the example of FIG. 16, "A call from Mr. Iwasaki" is displayed as the caller information.
  • In step S302, for the event detected by the event detection unit 2, the display control unit 3 controls the display device 4 to three-dimensionally display an operation object indicating the single gesture operation that the operator 5 should perform next.
  • In the example of FIG. 16, an operation object 21 in which a hand is placed over the handset icon is displayed.
  • The display control unit 3 may perform control to display the operation object 21 so as to express the movement of placing a hand over the handset icon.
  • In step S303, the gesture operation acquisition unit 8 determines whether it has acquired from the gesture operation detection device 13 a gesture operation by the operator 5, that is, whether the operator 5 has performed a gesture operation.
  • At this time, a detection icon 22 corresponding to the hand of the operator 5 detected by the gesture operation detection device 13 is displayed on the display device 4.
  • The display device 4 may display the handset icon, the operation object 21, and the detection icon 22 in order from the back to the front. That is, the display control unit 3 performs control so that the operation object 21 is displayed on the back side of the three-dimensional display space and the detection icon 22, which is an icon of the hand of the operator 5 acquired by the gesture operation acquisition unit 8, is displayed on the front side of the three-dimensional display space.
  • If a gesture operation has been acquired, the process proceeds to step S305. If a gesture operation has not been acquired, the process proceeds to step S304.
  • In step S304, as in step S204 of FIG. 13, the gesture operation acquisition unit 8 determines whether a predetermined time has elapsed. If the predetermined time has elapsed, the operation shown in FIG. 15 ends. If it has not elapsed, the process returns to step S303.
  • In step S305, the operation determination unit 9 determines whether the gesture operation by the operator 5 acquired by the gesture operation acquisition unit 8 is a movement that follows the operation object displayed on the display device 4. If the gesture operation does not follow the operation object, the process proceeds to step S306. If it does, the process proceeds to step S307.
  • In step S306, as in step S206 of FIG. 13, the display control unit 3 performs control to notify the operator that the gesture operation has failed. Specifically, the display control unit 3 performs control to display on the display device 4 that the gesture operation by the operator 5 does not follow the operation object.
  • In step S307, having determined that the gesture operation by the operator 5 follows the operation object, the device control unit 10 controls the target device based on the gesture operation.
  • In the example of FIG. 16, when the detection icon 22 overlaps the handset icon, the call can be answered. At this time, the device control unit 10 controls the terminal communication unit 11 so that a call can be made with the mobile communication terminal 12.
  • In step S308, the operation determination unit 9 determines whether the gesture operation by the operator 5 has ended. Specifically, the operation determination unit 9 determines whether the gesture operation acquired by the gesture operation acquisition unit 8 has completed the operation indicated by the operation object displayed on the display device 4. When the gesture operation has ended, the operation shown in FIG. 15 ends. If it has not ended, the process returns to step S302.
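  • The test in step S307 of whether the detection icon 22 overlaps the handset icon is essentially a hit test between two screen regions. The following Python sketch, an assumption for illustration, implements it with axis-aligned rectangles, one common way to realize such a check.

```python
# Minimal sketch (assumed): answering a call when the operator's hand icon
# overlaps the handset icon, as in the FIG. 16 example.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def overlaps(a: Rect, b: Rect) -> bool:
    """Axis-aligned rectangle intersection test."""
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

handset_icon = Rect(0.45, 0.45, 0.10, 0.10)    # fixed on the display
detection_icon = Rect(0.48, 0.47, 0.08, 0.08)  # follows the operator's hand

if overlaps(detection_icon, handset_icon):
    print("answer the call via the terminal communication unit")
```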
  • FIG. 13 shows the operation in the case of displaying the operation object as a tutorial, and FIG. 15 shows the operation in the case of displaying an operation object that guides the single gesture operation the operator 5 should perform next when an event occurs. These operations may also be combined.
  • For example, when the operator 5 experiences an event for the first time, the operation of FIG. 13 may be performed to display the operation object as a tutorial, and when the operator 5 experiences the same event the next time or later, the operation of FIG. 15 may be performed to display an operation object that guides a single gesture operation.
  • The choice between the operations of FIG. 13 and FIG. 15 may also be made for each operator 5. In this case, it is necessary to identify each operator 5 individually.
  • In FIG. 17, a dial is displayed together with an operation object 21 that guides a clockwise or counterclockwise gesture operation.
  • An example of such a dial is one for adjusting the volume.
  • In FIG. 18, folders A to C are displayed together with an operation object 21 that guides a gesture operation of rotating in the front-rear direction as viewed from the operator 5.
  • Examples of such folders include a telephone directory.
  • The operation objects in FIGS. 17 and 18 are displayed as still images in step S202 of FIG. 13 or step S302 of FIG. 15. When the operator 5 actually performs the gesture operation in step S205 of FIG. 13 or step S305 of FIG. 15, however, the dial rotates, or the display moves through folders A to C, in accordance with the actual gesture operation. That is, when the operation determination unit 9 determines that the gesture operation by the operator 5 is a movement that follows the operation object, the display control unit 3 performs control so that the operation object moves in accordance with the gesture operation acquired by the gesture operation acquisition unit 8.
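  • One way to picture an operation object that "moves in accordance with the acquired gesture operation" is a render loop that maps the measured hand rotation directly onto the dial's displayed angle. This Python sketch is an assumed illustration, not the implementation of this publication.

```python
# Minimal sketch (assumed): the displayed dial tracks the operator's measured
# hand rotation once the gesture is judged to follow the operation object.
def render_dial(angle_deg: float) -> None:
    print(f"dial drawn at {angle_deg:+.1f} degrees")  # stand-in for 3D drawing

def follows_operation_object(delta_deg: float) -> bool:
    # Accept only rotation-like movement; anything else is ignored here.
    return abs(delta_deg) > 0.5

dial_angle = 0.0
hand_rotation_samples = [4.0, 5.5, 0.1, 6.0]  # degrees per frame from the sensor
for delta in hand_rotation_samples:
    if follows_operation_object(delta):
        dial_angle += delta       # the operation object moves with the gesture
        render_dial(dial_angle)
```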
  • In FIG. 19, a list of operation items is displayed; the operation items correspond to, for example, functions of the on-vehicle device 15.
  • The list of operation items is arranged so as to form a ring in the front-rear direction as viewed from the operator 5.
  • For example, when the operator 5 selects "Setting", the screen changes to a screen for configuring various settings. When "AM Radio" is selected, a function for listening to AM radio is executed; when "FM Radio" is selected, a function for listening to FM radio is executed; when "Music" is selected, a function for listening to music is executed; and when "Navigation" is selected, a navigation function such as route guidance to a destination is executed.
  • As in FIGS. 17 and 18, when the operator 5 actually performs the gesture operation, the list of operation items moves in accordance with the actual gesture operation.
  • The display is not limited to the example shown in FIG. 19; for example, a display in which the dial of FIG. 17 is rotated may be used.
  • Example 1: The gesture operation may be performed with the same operations as those of the mobile communication terminal 12 that the operator 5 normally uses.
  • FIG. 20 illustrates an example in which the operation screen of the mobile communication terminal 12 is three-dimensionally displayed on the display device 4.
  • The operation screen of the mobile communication terminal 12 shown in FIG. 20 is the screen displayed when a call arrives.
  • The terminal communication unit 11 acquires from the mobile communication terminal 12, together with the information indicating the incoming call, information on the operation screen that the mobile communication terminal 12 displays when a call arrives.
  • As shown in FIG. 21, the display control unit 3 performs control to display on the display device 4 an operation object indicating that the handset icon, which indicates answering, is to be slid to the right.
  • The operator 5 can answer the incoming call by performing a gesture operation that follows the operation object shown in FIG. 21. At this time, because the operator 5 performs the gesture operation while looking at the three-dimensionally displayed operation screen, the operator 5 can perform the gesture operation with the same feel as operating the mobile communication terminal 12 that he or she normally uses.
  • Alternatively, an operation screen as shown in FIG. 22 may be displayed on the display device 4.
  • In this case, the display control unit 3 performs control to display on the display device 4 an operation object indicating selection of the handset icon that indicates answering.
  • The operator 5 can answer the incoming call by performing a gesture operation that follows this operation object.
  • The gesture operation device 6 may store the operation screen of the mobile communication terminal 12 in a storage unit (not shown), or the communication unit 7 may acquire information on the operation screen corresponding to the mobile communication terminal 12 from an external server or the like.
  • Example 2: When the on-vehicle device 15 has a radio and a music playback function, these functions may be executed by gesture operation. For example, when the operator 5 says "music", "radio", "mobile music", or the like while holding up a hand, the event detection unit 2 detects these actions as an event.
  • In this case, the display control unit 3 displays on the display device 4, for example, a music playback list and a hand icon as operation objects.
  • The display control unit 3 may also display an operation object indicating that the list can be scrolled by performing a flick gesture in the up-down direction.
  • The operator 5 scrolls the list by performing up-down flick gestures in accordance with the operation object displayed on the display device 4, and selects a specific piece of music by reading out its name.
  • A specific piece of music may instead be selected by gesture operation or, when the display device 4 is equipped with a touch panel, by touch operation.
  • Further, an operation object as shown in FIG. 17 may be displayed on the display device 4.
  • The operator 5 can adjust the volume by performing a gesture operation such as turning the dial with the thumb, forefinger, and middle finger.
  • The rotation axis of the dial may be inclined toward the operator 5.
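  • Example 2 raises an event from the combination of a held-up hand and a spoken keyword. The sketch below is an assumed Python illustration of that trigger logic; the hand detector and speech recognizer are stubbed out as plain values, and the event names are hypothetical.

```python
# Minimal sketch (assumed): raising an event when the operator holds up a
# hand and speaks a known keyword, as in Example 2.
KEYWORD_EVENTS = {
    "music": "show_music_playlist",
    "radio": "show_radio_controls",
    "mobile music": "show_terminal_music",
}

def detect_event(hand_raised: bool, utterance: str):
    """Return a hypothetical event name, or None if the trigger is not met."""
    if not hand_raised:
        return None
    return KEYWORD_EVENTS.get(utterance.strip().lower())

print(detect_event(True, "Music"))    # -> 'show_music_playlist'
print(detect_event(False, "music"))   # -> None (hand not held up)
```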
  • Example 3: When the on-vehicle device 15 has a navigation function, that function may be executed by gesture operation. For example, when the operator 5 says "navigation", "destination", or "map" while holding up a hand, the event detection unit 2 detects these actions as an event.
  • In this case, the display control unit 3 displays, for example, a list of destinations and a hand icon as operation objects.
  • The display control unit 3 may display an operation object indicating that the list can be scrolled by performing an up-down flick gesture and that a specific destination can be selected by performing a touch gesture. The operator 5 scrolls the list by performing up-down flick gestures in accordance with the operation object displayed on the display device 4, and selects a specific destination by performing a touch gesture.
  • The destination may also be selected by, for example, reading out its name.
  • Alternatively, the display control unit 3 displays, for example, a map and a hand icon as operation objects.
  • The display control unit 3 may display an operation object indicating that the map can be enlarged or reduced by performing a pinch-in or pinch-out gesture.
  • An operation object may also be displayed indicating that the map can be scrolled by moving the hand forward, backward, left, or right with the palm facing downward.
  • The display control unit 3 may display an operation object indicating that the order of the home screen 23, the current location screen 24, and the map scroll screen 25 changes each time a gesture of moving the hand upward with the palm facing upward is performed (a sketch of this cycling behavior follows this example).
  • Alternatively, the home screen 23 may be brought to the front when the hand is turned over.
  • The display control unit 3 may also display an operation object indicating that a specific item is selected when, for example, a touch gesture is performed.
  • In this way, the operator 5 can correctly understand what kind of gesture operation should be performed, and the gesture operation device 6 can accurately determine the gesture operation performed by the operator 5.
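  • The screen-cycling behavior mentioned above, where the order of the home screen 23, the current location screen 24, and the map scroll screen 25 changes on each palm-up gesture, can be modeled as rotating a screen stack. This Python sketch is an assumed illustration using the screen names from the text.

```python
# Minimal sketch (assumed): cycling the layered screens 23-25 each time a
# palm-up upward gesture is recognized.
from collections import deque

screens = deque(["home screen 23", "current location screen 24",
                 "map scroll screen 25"])  # front of the deque = front screen

def on_palm_up_gesture():
    screens.rotate(-1)  # the front screen moves to the back
    print("front screen is now:", screens[0])

on_palm_up_gesture()  # -> current location screen 24
on_palm_up_gesture()  # -> map scroll screen 25
```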
  • FIG. 25 is a block diagram showing an example of the configuration of the gesture operation device 26 according to the second embodiment of the present invention.
  • As shown in FIG. 25, the gesture operation device 26 according to the second embodiment is characterized in that the device control unit 10 has a sound image control unit 27.
  • The other configuration and operation are the same as in the first embodiment, and detailed description is therefore omitted here.
  • The sound image control unit 27 is connected to the voice output device 16 and controls the sound image of the sound output from the voice output device 16. Specifically, the sound image control unit 27 controls the sound image in accordance with the display position of the operation object displayed on the display device 4 under the control of the display control unit 3.
  • FIG. 26 shows operation objects for sorting an incoming e-mail into a storage folder or a trash can.
  • In the example of FIG. 26, the incoming e-mail is displayed at the front left, the trash can is displayed at the front right, and the storage folder is displayed at the back right.
  • As the operation object, a hand icon moves along an arrow from the incoming e-mail to the trash can or the storage folder.
  • The sound image control unit 27 controls the sound image in accordance with the movement of the hand icon, so that the sound heard when the hand icon moves toward the trash can differs from the sound heard when it moves toward the storage folder.
  • The sound image control unit 27 may also control the sound image in accordance with the gesture operation when the operator 5 actually performs it.
  • In this way, the operator 5 can correctly understand what kind of gesture operation should be performed, and the gesture operation device 26 can accurately determine the gesture operation performed by the operator 5.
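  • As a rough illustration of controlling a sound image according to the display position of the operation object, the following assumed Python sketch computes equal-power stereo panning gains from the icon's horizontal position; a real system would also encode depth and elevation, which this sketch omits.

```python
# Minimal sketch (assumed): steering a stereo sound image so that it tracks
# the on-screen position of the moving hand icon (left-right pan from x).
import math

def pan_gains(x_norm: float):
    """Equal-power panning. x_norm: 0.0 = far left, 1.0 = far right."""
    theta = x_norm * math.pi / 2
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# Icon path toward the trash can (front right) vs. the storage folder (back right):
for label, x in [("toward trash can", 0.9), ("toward storage folder", 0.7)]:
    left, right = pan_gains(x)
    print(f"{label}: L={left:.2f} R={right:.2f}")
```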
  • FIG. 27 is a block diagram showing an example of the configuration of the gesture operation device 28 according to Embodiment 3 of the present invention.
  • As shown in FIG. 27, the gesture operation device 28 according to the third embodiment is characterized by including a state acquisition unit 29.
  • The other configuration and operation are the same as in the first embodiment, and detailed description is therefore omitted here.
  • The state acquisition unit 29 is connected to the state detection device 30 and acquires the state of the operator 5, including at least the eye position or line of sight of the operator 5, detected by the state detection device 30.
  • The state detection device 30 is configured by a camera, for example, and identifies the eye position or line of sight of the operator 5 from the image captured by the camera.
  • The display control unit 3 controls the display position of the operation object based on the state of the operator 5 acquired by the state acquisition unit 29.
  • In this way, the operation object can be displayed where the operator 5 can easily view it. The operator 5 can thereby correctly understand what kind of gesture operation should be performed, and the gesture operation device 28 can accurately determine the gesture operation performed by the operator 5.
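  • A simple reading of "control the display position based on the operator's state" is to re-center the operation object on the point where the line of sight meets the display. The following Python sketch is an assumed illustration of that idea; the geometry and coordinate values are hypothetical.

```python
# Minimal sketch (assumed): placing the operation object where the operator's
# line of sight meets the display plane (display modeled at z = 0).
def gaze_point_on_display(eye_pos, gaze_dir):
    """eye_pos, gaze_dir: (x, y, z) tuples; returns (x, y) on the z=0 plane.
    Assumes the gaze direction is not parallel to the display (dz != 0)."""
    ex, ey, ez = eye_pos
    dx, dy, dz = gaze_dir
    t = -ez / dz                      # ray parameter where the ray hits z = 0
    return (ex + t * dx, ey + t * dy)

eye = (0.0, 1.2, 0.6)                 # driver's eye, 0.6 m in front of the display
gaze = (0.1, -0.2, -1.0)              # looking slightly down and to the right
x, y = gaze_point_on_display(eye, gaze)
print(f"display operation object near ({x:.2f}, {y:.2f})")
```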
  • FIG. 28 is a block diagram showing an example of the configuration of the gesture operation device 31 according to the fourth embodiment of the present invention.
  • As shown in FIG. 28, the gesture operation device 31 according to the fourth embodiment is characterized by including an operation history information storage unit 32.
  • The other configuration and operation are the same as in the third embodiment, and detailed description is therefore omitted here.
  • The operation history information storage unit 32 is composed of a storage device such as an HDD (Hard Disk Drive) or a semiconductor memory, for example, and stores the results of the operation determination unit 9's determinations as to whether the gesture operation follows the operation object.
  • FIG. 29 is a flowchart showing an example of the operation of the gesture operation device 31. Steps S401 to S406 in FIG. 29 correspond to steps S201 to S206 in FIG. 13, and steps S408 and S409 in FIG. 29 correspond to steps S207 and S208 in FIG. 13, so description of these steps is omitted. Hereinafter, step S407 will be described.
  • In step S407, the operation history information storage unit 32 stores the result determined by the operation determination unit 9 as operation history information. Specifically, when the operation determination unit 9 determines in step S405 that the gesture operation by the operator 5 does not follow the operation object shown in the tutorial, the determination result is stored in the operation history information storage unit 32.
  • The operation history information stored in the operation history information storage unit 32 is stored for each operator 5. Each operator 5 may be identified based on an image of the operator 5's face detected by the state detection device 30.
  • In subsequent executions of step S405, the operation determination unit 9 can determine whether the gesture operation by the operator 5 follows the operation object shown in the tutorial using the operation history information stored in the operation history information storage unit 32. The operation determination unit 9 can also perform machine learning using the accumulated operation history information. Thereby, even if the gesture operation by the operator 5 differs slightly from the operation object shown in the tutorial, the determination can take into consideration the operator 5's gesture operation habits, and the operation can still be judged to follow the operation object.
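  • One assumed way for the determination to "take the operator's habits into consideration" is to widen a per-operator matching tolerance learned from the stored history. The Python sketch below illustrates this idea; it is not the algorithm of this publication, which describes only machine learning over operation history information in general terms.

```python
# Minimal sketch (assumed): per-operator tolerance learned from operation
# history, so habitual small deviations still count as following the object.
from collections import defaultdict

history = defaultdict(list)  # operator id -> past deviation magnitudes

def record(operator_id: str, deviation: float) -> None:
    history[operator_id].append(deviation)

def follows_object(operator_id: str, deviation: float,
                   base_tolerance: float = 0.10) -> bool:
    past = history[operator_id]
    # Widen the tolerance toward this operator's typical deviation.
    habit = sum(past) / len(past) if past else 0.0
    return deviation <= max(base_tolerance, 1.5 * habit)

record("driver_a", 0.18)   # driver_a habitually overshoots a little
record("driver_a", 0.20)
print(follows_object("driver_a", 0.22))  # True: within the learned habit
print(follows_object("driver_b", 0.22))  # False: exceeds the base tolerance
```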
  • FIG. 30 is a flowchart showing another example of the operation of the gesture operation device 31.
  • Steps S501 to S506 in FIG. 30 correspond to steps S201 to S206 in FIG. 13, and steps S509 and S510 in FIG. 30 correspond to steps S207 and S208 in FIG. 13, so description of these steps is omitted.
  • Hereinafter, steps S507 and S508 will be described.
  • In step S507, the state acquisition unit 29 acquires, as the state of the operator 5, the reaction of the operator 5 to the result determined by the operation determination unit 9. Specifically, when the operation determination unit 9 determines in step S505 that the gesture operation by the operator 5 does not follow the operation object shown in the tutorial and notification that the gesture operation has failed is given in step S506, the state detection device 30 detects the reaction of the operator 5, and the state acquisition unit 29 acquires the detected reaction.
  • The reaction of the operator 5 includes the facial expression or movement of the operator 5.
  • In step S508, the operation history information storage unit 32 stores, as operation history information, the determination result by the operation determination unit 9 in association with the reaction of the operator 5 acquired by the state acquisition unit 29.
  • The operation history information stored in the operation history information storage unit 32 is stored for each operator 5.
  • Each operator 5 may be identified based on an image of the operator 5's face detected by the state detection device 30.
  • In subsequent executions of step S505, the operation determination unit 9 can determine whether the gesture operation by the operator 5 follows the operation object shown in the tutorial using the operation history information stored in the operation history information storage unit 32. The operation determination unit 9 can also perform machine learning using the accumulated operation history information. Thereby, even if the gesture operation by the operator 5 differs slightly from the operation object shown in the tutorial, the determination can take into consideration the operator 5's gesture operation habits, and the operation can still be judged to follow the operation object.
  • Although FIGS. 29 and 30 describe the case of displaying a tutorial, the same applies to the case of displaying an operation object that guides the single gesture operation the operator 5 should perform next when an event occurs.
  • In that case, the operation determination unit 9 may omit the process of step S405 in FIG. 29 or step S505 in FIG. 30.
  • As described above, when determining whether the gesture operation by the operator 5 follows the operation object shown in the tutorial, the determination can be made in consideration of the operator 5's gesture operation habits. Therefore, the gesture operation device 31 can accurately determine the gesture operation performed by the operator 5.
  • FIG. 31 is a block diagram showing an example of the configuration of the gesture operation device 33 according to the fifth embodiment of the present invention.
  • As shown in FIG. 31, the gesture operation device 33 according to the fifth embodiment is characterized in that the communication unit 7 includes an external information acquisition unit 34.
  • The other configuration and operation are the same as in the first embodiment, and detailed description is therefore omitted here.
  • The external information acquisition unit 34 acquires various external information from outside.
  • Examples of the external information include information related to the operator 5, traffic information, and weather information.
  • Information related to the operator 5 includes, for example, the schedule of the operator 5, contact information, favorite music, SNS (Social Networking Service) information, traffic information or weather information for the set destination, POI (Point Of Interest) information, purchasing behavior information, and information on home appliances connected to the home network.
  • For example, the event detection unit 2 detects, as an event, that the external information acquisition unit 34 has acquired information on a home appliance. Then, as shown in FIG. 32, the display control unit 3 performs control to display on the display device 4 an operation object 21 indicating an operation of turning the power of an air conditioner installed on the first floor of the operator 5's home on or off. For example, the operation object may be displayed so that the power is switched on and off alternately each time the hand icon presses the switch icon. As a result, the operator 5 can easily grasp which room of the house contains the home appliance to be operated, and can correctly understand what kind of gesture operation should be performed.
  • Therefore, the gesture operation device 33 can accurately determine the gesture operation performed by the operator 5.
  • FIG. 33 is a block diagram showing an example of the configuration of the gesture operation device 35 according to the sixth embodiment of the present invention.
  • As shown in FIG. 33, the gesture operation device 35 is characterized in that the device control unit 10 has a vehicle information acquisition unit 36.
  • The other configuration and operation are the same as in the first embodiment, and detailed description is therefore omitted here.
  • The vehicle information acquisition unit 36 acquires vehicle information from the various control devices provided in the vehicle, which constitute the in-vehicle device 15. Vehicle information is various information regarding the vehicle.
  • For example, the event detection unit 2 detects, as an event, that the vehicle information acquisition unit 36 has acquired vehicle information. Then, as shown in FIG. 34, the display control unit 3 performs control to display on the display device 4 an operation object 21 indicating an operation of closing the trunk of the vehicle.
  • Thereby, the operator 5 can correctly understand the gesture operation for closing the trunk of the vehicle.
  • Similarly, the event detection unit 2 may detect, as an event, that the terminal communication unit 11 has acquired information on the mobile communication terminal 12.
  • In this case, the display control unit 3 controls the display device 4 to display an operation object 21 indicating an operation of switching the display of information related to the mobile communication terminal 12.
  • In this way, the operator 5 can correctly understand the gesture operations for controlling the in-vehicle device 15 or the mobile communication terminal 12. Therefore, the gesture operation device 35 can accurately determine the gesture operation performed by the operator 5.
  • The gesture operation device described above is applicable not only to an on-vehicle navigation device, that is, a car navigation device, but also to a PND (Portable Navigation Device) that can be mounted on a vehicle, to a navigation device built as a system in appropriate combination with a server provided outside the vehicle, and to devices other than navigation devices.
  • In that case, each function or each component of the gesture operation device is distributed among the functions that construct the system.
  • Specifically, the functions of the gesture operation device can be arranged on a server.
  • For example, the user side includes the display device 4, the mobile communication terminal 12, the gesture operation detection device 13, the voice input device 14, the in-vehicle device 15, and the voice output device 16.
  • The server 37 includes the event detection unit 2, the display control unit 3, the communication unit 7, the gesture operation acquisition unit 8, the operation determination unit 9, the device control unit 10, and the terminal communication unit 11. A gesture operation system can be constructed in this way. The same applies to the gesture operation device 26 shown in FIG. 25, the gesture operation device 28 shown in FIG. 27, the gesture operation device 31 shown in FIG. 28, the gesture operation device 33 shown in FIG. 31, and the gesture operation device 35 shown in FIG. 33.
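  • A minimal sketch of this client-server split is shown below in Python; the message format, class names, and operation-object identifiers are all assumptions for illustration, with the client owning only the sensors and display while the server holds the decision logic.

```python
# Minimal sketch (assumed): the detection/display-control units run on a
# server, while the client only forwards events and renders what it is told.
import json

class GestureOperationServer:
    """Server-side units: event detection and display control (simplified)."""
    OPERATION_OBJECTS = {"incoming_mail": "3d_select_unread_icon",
                         "incoming_call": "3d_touch_handset_icon"}

    def handle(self, message: str) -> str:
        event = json.loads(message)["event"]             # event detection
        obj = self.OPERATION_OBJECTS.get(event, "none")  # display control
        return json.dumps({"display_object": obj})

class VehicleClient:
    """User side: owns the display device and sensors, no decision logic."""
    def __init__(self, server):
        self.server = server
    def report_event(self, event: str) -> None:
        reply = json.loads(self.server.handle(json.dumps({"event": event})))
        print("render:", reply["display_object"])        # drive display device 4

VehicleClient(GestureOperationServer()).report_event("incoming_mail")
# -> render: 3d_select_unread_icon
```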
  • Software for executing the operations in the above embodiments may be incorporated into, for example, a server.
  • The gesture operation method realized by the server executing this software detects an event and, for the detected event, performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event.
  • Each embodiment may be freely combined, and each embodiment may be appropriately modified or omitted.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The purpose of the present invention is to provide a gesture operation device and a gesture operation method capable of accurately determining a gesture operation performed by an operator. The gesture operation device according to the present invention is provided with an event detection unit which detects an event, and a display control unit which performs control to display a three-dimensional operation object for guiding the operator to perform a gesture operation that can be performed by the operator to cause the event detected by the event detection unit to proceed, and that indicates an intention of the operator.

Description

Gesture operation device and gesture operation method

The present invention relates to a gesture operation device and a gesture operation method, and more particularly to a gesture operation device and a gesture operation method for guiding an operator to perform a gesture operation.

Conventionally, gesture operation devices have been developed that perform volume adjustment, answer incoming calls, and the like when an operator such as a driver performs a gesture operation. A gesture operation is an operation for executing an arbitrary function by moving one's hand or the like in an arbitrary direction in three-dimensional space. For example, a technology has been disclosed for displaying information that guides the gesture operation the operator should perform when executing an arbitrary function (see, for example, Patent Document 1).

Patent Document 1: International Publication No. 2012/011263

In Patent Document 1, the information for guiding a gesture operation is displayed in two dimensions. Therefore, in the case of a gesture operation involving movement in a three-dimensional direction, the guiding information is expressed in two dimensions, and the operator may not be able to correctly understand the gesture operation. In this case, the gesture operation device may not be able to accurately determine the gesture operation performed by the operator.

The present invention has been made to solve such a problem, and it is an object of the present invention to provide a gesture operation device and a gesture operation method capable of accurately determining a gesture operation performed by an operator.

In order to solve the above problems, the gesture operation device according to the present invention includes an event detection unit that detects an event, and a display control unit that performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event detected by the event detection unit.

The gesture operation method according to the present invention detects an event and, for the detected event, performs control to display, in three dimensions, an operation object that guides a gesture operation indicating the operator's intention toward an action the operator can perform to advance the event.

According to the present invention, because the gesture operation device includes the event detection unit and the display control unit described above, it is possible to accurately determine the gesture operation performed by the operator.

Likewise, because the gesture operation method performs control to display the guiding operation object in three dimensions, it is possible to accurately determine the gesture operation performed by the operator.

The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing an example of the gesture operation according to Embodiment 1 of the present invention.
FIG. 3 is a diagram showing an example of the gesture operation according to Embodiment 1 of the present invention.
FIG. 4 is a diagram showing an example of the gesture operation according to Embodiment 1 of the present invention.
FIG. 5 is a diagram showing an example of the display device according to Embodiment 1 of the present invention.
FIG. 6 is a diagram showing an example of the display device according to Embodiment 1 of the present invention.
FIG. 7 is a diagram showing an example of the display device according to Embodiment 1 of the present invention.
FIG. 8 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 1 of the present invention.
FIG. 9 is a diagram showing an example of the configuration of the gesture operation detection device according to Embodiment 1 of the present invention.
FIG. 10 is a diagram showing an example of the configuration of the gesture operation detection device according to Embodiment 1 of the present invention.
FIG. 11 is a block diagram showing an example of the hardware configuration of the gesture operation device according to Embodiment 1 of the present invention.
FIG. 12 is a flowchart showing an example of the operation of the gesture operation device according to Embodiment 1 of the present invention.
FIG. 13 is a flowchart showing an example of the operation of the gesture operation device according to Embodiment 1 of the present invention.
FIG. 14 is a diagram showing an example of the display of an operation object according to Embodiment 1 of the present invention.
FIG. 15 is a flowchart showing an example of the operation of the gesture operation device according to Embodiment 1 of the present invention.
FIG. 16 is a diagram showing an example of the display of an operation object according to Embodiment 1 of the present invention.
FIG. 17 is a diagram showing an example of an operation object according to Embodiment 1 of the present invention.
FIG. 18 is a diagram showing an example of an operation object according to Embodiment 1 of the present invention.
FIG. 19 is a diagram showing an example of an operation object according to Embodiment 1 of the present invention.
FIG. 20 is a diagram showing an example of the display of the operation screen of the mobile communication terminal according to Embodiment 1 of the present invention.
FIG. 21 is a diagram showing an example of an operation object according to Embodiment 1 of the present invention.
FIG. 22 is a diagram showing an example of the display of the operation screen of the mobile communication terminal according to Embodiment 1 of the present invention.
FIG. 23 is a diagram showing an example of an operation object according to Embodiment 1 of the present invention.
FIG. 24 is a diagram showing an example of layered screens according to Embodiment 1 of the present invention.
FIG. 25 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 2 of the present invention.
FIG. 26 is a diagram showing an example of an operation object according to Embodiment 2 of the present invention.
FIG. 27 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 3 of the present invention.
FIG. 28 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 4 of the present invention.
FIG. 29 is a flowchart showing an example of the operation of the gesture operation device according to Embodiment 4 of the present invention.
FIG. 30 is a flowchart showing an example of the operation of the gesture operation device according to Embodiment 4 of the present invention.
FIG. 31 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 5 of the present invention.
FIG. 32 is a diagram showing an example of an operation object according to Embodiment 5 of the present invention.
FIG. 33 is a block diagram showing an example of the configuration of the gesture operation device according to Embodiment 6 of the present invention.
FIG. 34 is a diagram showing an example of an operation object according to Embodiment 6 of the present invention.
FIG. 35 is a diagram showing an example of an operation object according to Embodiment 6 of the present invention.
FIG. 36 is a block diagram showing an example of the configuration of the gesture operation system according to an embodiment of the present invention.
Embodiments of the present invention will be described below with reference to the drawings.
<Embodiment 1>
<Configuration>
FIG. 1 is a block diagram showing an example of the configuration of a gesture operation device 1 according to Embodiment 1. FIG. 1 shows the minimum configuration required to constitute the gesture operation device according to this embodiment. In the following, the case where the operator performs gesture operations by moving his or her own hand is described, but the gesture operations are not limited to this.
As shown in FIG. 1, the gesture operation device 1 includes an event detection unit 2 and a display control unit 3. The event detection unit 2 detects an event; as described later, examples of events include an incoming mail, an incoming call, and an operation of an in-vehicle device. The display control unit 3 is connected to a display device 4 and, for an event detected by the event detection unit 2, performs control to display in three dimensions an operation object that guides the operator to a gesture operation indicating the operator's intention for an action the operator can perform to advance the event.
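Purely as an illustration of this two-unit structure, the following Python sketch wires an event detection unit to a display control unit that asks a 3D-capable display to render a guiding operation object. All names (Event, EventDetector, DisplayControl, Display3D) and the depth value are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    kind: str      # e.g. "mail", "call", "device_operation"
    payload: dict

class Display3D:
    """Stand-in for a 3D-capable display (autostereoscopic, HUD, or layered panels)."""
    def show_object_3d(self, description: str, depth: float) -> None:
        print(f"[3D depth={depth:+.2f}] {description}")

class EventDetector:
    """Event detection unit 2: turns external notifications into events."""
    def detect(self, raw: dict) -> Optional[Event]:
        if raw.get("type") in ("mail", "call", "device_operation"):
            return Event(kind=raw["type"], payload=raw)
        return None

class DisplayControl:
    """Display control unit 3: maps a detected event to a guiding operation object."""
    def __init__(self, display: Display3D) -> None:
        self.display = display

    def guide(self, event: Event) -> None:
        # Display an operation object guiding the gesture for this event.
        self.display.show_object_3d(f"gesture guide for {event.kind}", depth=-0.5)

detector, control = EventDetector(), DisplayControl(Display3D())
event = detector.detect({"type": "mail", "from": "..."})
if event is not None:
    control.guide(event)
```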
FIGS. 2 to 4 are diagrams showing examples of gesture operations. In FIGS. 2 to 4, the X axis indicates the left-right direction as viewed from the operator, the Y axis indicates the up-down direction as viewed from the operator, and the Z axis indicates the front-rear direction as viewed from the operator.
 図2に示すように、操作者は、自身の手を前後方向かつ上下方向に回転させるジェスチャー操作を行うことができる。図3に示すように、操作者は、自身の手を前後方向に移動させるジェスチャー操作を行うことができる。図4に示すように、操作者は、自身の手を左右方向に移動させるジェスチャー操作を行うことができる。このように、操作者は、三次元空間における任意の方向に手を動かすことによってジェスチャー操作を行うことができる。 As shown in FIG. 2, the operator can perform a gesture operation to rotate his / her hand in the front-rear direction and in the vertical direction. As shown in FIG. 3, the operator can perform a gesture operation to move his / her hand in the front-rear direction. As shown in FIG. 4, the operator can perform a gesture operation to move his or her hand in the left and right direction. In this way, the operator can perform the gesture operation by moving the hand in any direction in the three-dimensional space.
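The following toy sketch suggests how the gestures of FIGS. 2 to 4 might be distinguished from a sequence of tracked 3D hand positions; the thresholds, sample values, and labels are illustrative assumptions, not the patent's detection algorithm.

```python
# Toy classifier: decide the dominant motion of a hand path given as
# (x, y, z) samples, with x left-right, y up-down, z front-rear.
def classify_gesture(path):
    """path: list of (x, y, z) samples; returns a coarse gesture label."""
    dx = max(p[0] for p in path) - min(p[0] for p in path)
    dy = max(p[1] for p in path) - min(p[1] for p in path)
    dz = max(p[2] for p in path) - min(p[2] for p in path)
    if dy > 0.5 * max(dx, dz) and dz > 0.5 * max(dx, dy):
        return "rotate front-rear/up-down"   # FIG. 2: large Y and Z excursions
    if dz >= max(dx, dy):
        return "move front-rear"             # FIG. 3: Z dominates
    return "move left-right"                 # FIG. 4: X dominates

# Example: a sweep mostly along X is classified as left-right movement.
print(classify_gesture([(0.0, 0.10, 0.00), (0.2, 0.10, 0.02), (0.4, 0.12, 0.00)]))
```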
FIGS. 5 to 7 are diagrams showing examples of the display device 4. In FIGS. 5 to 7, the X axis indicates the left-right direction as viewed from the operator, the Y axis indicates the up-down direction as viewed from the operator, and the Z axis indicates the front-rear direction as viewed from the operator.
FIG. 5 shows an example of the display device 4 that performs autostereoscopic display. Autostereoscopic display means making an image appear stereoscopic when the operator 5 views the display screen with the naked eye, that is, making the operator perceive depth in a three-dimensional display space. Specifically, as shown in FIG. 5, display objects for the right eye and for the left eye of the operator 5 are displayed alternately on the X-Y plane, so that the operator 5 perceives the display object as being displayed stereoscopically.
FIG. 6 shows an example of the display device 4 that performs virtual image display, such as a HUD (Head Up Display). As shown in FIG. 6, by changing the focal distance of a display object, which is a virtual image, the operator 5 can perceive the display object as being displayed stereoscopically.
FIG. 7 shows an example of the display device 4 configured by stacking a plurality of transmissive display surfaces in the Z-axis direction. By displaying one display object superimposed across the display surfaces, the operator 5 can perceive the display object as being displayed stereoscopically.
The display device 4 is configured by any one of, or any combination of, the display devices shown in FIGS. 5 to 7 and, under the control of the display control unit 3, displays in three dimensions an operation object that guides a gesture operation in three-dimensional space as shown in FIGS. 2 to 4. Details of the operation object will be described later.
Next, another configuration of a gesture operation device including the gesture operation device 1 shown in FIG. 1 will be described.
FIG. 8 is a block diagram showing an example of the configuration of a gesture operation device 6 according to this other configuration.
As shown in FIG. 8, the gesture operation device 6 includes the event detection unit 2, the display control unit 3, a communication unit 7, a gesture operation acquisition unit 8, an operation determination unit 9, and a device control unit 10.
The communication unit 7 has a terminal communication unit 11 that can be communicably connected to a mobile communication terminal 12. The terminal communication unit 11 receives from the mobile communication terminal 12, for example, information indicating that a mail has been received and information indicating that a call has been received. The terminal communication unit 11 also transmits information for operating the mobile communication terminal 12 to the mobile communication terminal 12 in accordance with the determination result of the operation determination unit 9.
The event detection unit 2 detects, as an event, the reception by the terminal communication unit 11 of information from the mobile communication terminal 12 indicating an incoming mail, an incoming call, or the like. The display control unit 3 controls the display device 4 so as to display in three dimensions an operation object that guides a gesture operation indicating the operator's intention for an action the operator can perform to advance the event detected by the event detection unit 2.
The gesture operation acquisition unit 8 is connected to a gesture operation detection device 13 and acquires the gesture operation of the operator 5 from the gesture operation detection device 13. FIG. 9 is a diagram showing an example of the configuration of the gesture operation detection device 13. As shown in FIG. 9, the gesture operation detection device 13 is installed, for example, on a floor console in a vehicle and includes at least one of a ToF (Time of Flight) sensor, an image sensor, and a proximity sensor. A gesture operation space 17 corresponds to the detection range of the gesture operation detection device 13; that is, the gesture operation detection device 13 detects gesture operations performed by the operator 5 within the gesture operation space 17. The gesture operations detected here include not only gesture operations for actions the operator can perform to advance an event described later, but also other hand movements.
The gesture operation detection device 13 may be composed of a single sensor, in which case it detects gesture operations from one direction. It may also be composed of a plurality of sensors. In that case, if three sensors are provided at points A, B, and C as shown in FIG. 10, for example, the gesture operation detection device 13 detects gesture operations from three directions, so the detection accuracy of the gesture operation can be made higher than with a single sensor.
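As a rough illustration of why several sensors help, the sketch below averages per-sensor position estimates and tests whether the fused position lies inside the gesture operation space 17, modeled here as a box; the coordinates, bounds, and fusion rule are assumed values, not the patent's method.

```python
# Fuse estimates from sensors at points A, B, C and test membership in the
# gesture operation space 17, modeled as an axis-aligned box (illustrative).
SPACE_MIN = (-0.3, -0.2, -0.3)   # assumed bounds of gesture space 17, in meters
SPACE_MAX = (0.3, 0.2, 0.3)

def fuse(estimates):
    """Average per-sensor (x, y, z) estimates; more sensors -> less noise."""
    n = len(estimates)
    return tuple(sum(e[i] for e in estimates) / n for i in range(3))

def in_gesture_space(pos):
    return all(SPACE_MIN[i] <= pos[i] <= SPACE_MAX[i] for i in range(3))

# Three sensors each report a slightly different estimate of the hand position.
readings = [(0.11, 0.02, 0.05), (0.09, 0.01, 0.07), (0.10, 0.03, 0.06)]
hand = fuse(readings)
print(hand, "inside gesture space:", in_gesture_space(hand))
```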
Returning to FIG. 8, the operation determination unit 9 determines whether the gesture operation acquired by the gesture operation acquisition unit 8 is a movement that follows the operation object. The operation determination unit 9 is also connected to a voice input device 14 and can determine what operation is intended based on what the operator 5 utters via the voice input device 14. In this way, the operation determination unit 9 can determine the content of a voice operation by the operator 5.
When the operation determination unit 9 determines that a gesture operation follows the operation object, the device control unit 10 controls devices such as an in-vehicle device 15 and a voice output device 16 based on that gesture operation. In that case, the device control unit 10 also controls, based on the gesture operation, the mobile communication terminal 12 communicating with the communication unit 7. Examples of the in-vehicle device 15 include a navigation device, an audio device, and various control devices provided in the vehicle. Examples of the voice output device 16 include a speaker.
FIG. 11 is a block diagram showing an example of the hardware configuration of the gesture operation device 6. The same applies to the gesture operation device 1 shown in FIG. 1.
The functions of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10 in the gesture operation device 6 are realized by a processing circuit. That is, the gesture operation device 6 includes a processing circuit for communicating with the mobile communication terminal 12, detecting an event, controlling display, acquiring a gesture operation, determining a gesture operation, and controlling devices. The processing circuit is a processor 18 (also called a central processing unit, processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes programs stored in a memory 19.
The functions of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10 in the gesture operation device 6 are realized by software, firmware, or a combination of software and firmware. The software or firmware is described as programs and stored in the memory 19. The processing circuit realizes the function of each unit by reading and executing the programs stored in the memory 19. That is, the gesture operation device 6 includes the memory 19 for storing programs that, when executed, result in the execution of the steps of communicating with the mobile communication terminal 12, detecting an event, controlling display, acquiring a gesture operation, determining a gesture operation, and controlling devices. These programs can also be said to cause a computer to execute the procedures or methods of the terminal communication unit 11, the event detection unit 2, the display control unit 3, the gesture operation acquisition unit 8, the operation determination unit 9, and the device control unit 10. Here, the memory may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD, or any storage medium to be used in the future.
<Operation>
FIG. 12 is a flowchart showing an example of the operation of the gesture operation device 1 shown in FIG. 1.
In step S101, the event detection unit 2 determines whether an event has been detected. The event detection unit 2 repeats the process of step S101 until an event is detected, and when it determines that an event has been detected, the process proceeds to step S102.
In step S102, the display control unit 3 controls the display device 4 so as to display in three dimensions an operation object that guides a gesture operation indicating the intention of the operator 5 for an action the operator 5 can perform to advance the event detected by the event detection unit 2. Specifically, under the control of the display control unit 3, a three-dimensional operation object that guides a three-dimensional gesture operation as shown in FIGS. 2 to 4 is displayed on the display device 4, which is capable of three-dimensional display.
FIG. 13 is a flowchart showing an example of the operation of the gesture operation device 6 shown in FIG. 8. FIG. 13 concerns the case where the operation objects are displayed as a tutorial, which is described with reference to FIG. 14. In this embodiment, a tutorial means displaying on the display device 4, as operation objects, a series of gesture operations to be performed by the operator 5 in response to an event.
In step S201, the event detection unit 2 determines whether an event has been detected. Specifically, the event detection unit 2 repeats the process of step S201 until the terminal communication unit 11 receives from the mobile communication terminal 12 information indicating an incoming mail or an incoming call. When the terminal communication unit 11 receives such information from the mobile communication terminal 12, the event detection unit 2 determines that an event has been detected, and the process proceeds to step S202.
For example, as shown in FIG. 14, when the terminal communication unit 11 receives from the mobile communication terminal 12 information indicating that a mail has arrived, an unread icon 20 is displayed on the display device 4. The example in FIG. 14 indicates that there are two unread mails.
In step S202, in response to the event detected by the event detection unit 2, the display control unit 3 controls the display device 4 so as to display in three dimensions operation objects indicating the series of gesture operations to be performed by the operator 5 in order to advance the event.
In the example of FIG. 14, the tutorial shows that when the operator 5 performs a gesture operation of selecting the unread icon 20 displayed on the display device 4, a list of unread mails is displayed, and that when the operator 5 further performs a gesture operation of selecting a specific mail from the list, the content of that mail is read aloud by voice. In FIG. 14, an operation object 21 indicating the gesture operation of selecting the unread icon 20 is displayed in three dimensions on the display device 4. The operation object 21 moves from the near side toward the far side of the display device 4 so as to select the unread icon 20. The display device 4 also shows the text "Select a mail to have it read aloud by voice".
In step S203, the gesture operation acquisition unit 8 determines whether it has acquired from the gesture operation detection device 13 a gesture operation of the operator 5 detected by the gesture operation detection device 13, that is, whether the operator 5 has performed a gesture operation. If a gesture operation has been acquired, the process proceeds to step S205; otherwise, the process proceeds to step S204.
In step S204, the gesture operation acquisition unit 8 determines whether a fixed time has elapsed. This time may be a preset time or a time arbitrarily set by the user. If the fixed time has elapsed, the operation shown in FIG. 13 ends; otherwise, the process returns to step S203.
In step S205, the operation determination unit 9 determines whether the gesture operation of the operator 5 acquired by the gesture operation acquisition unit 8 is a movement that follows the tutorial displayed on the display device 4, that is, whether the gesture operation follows the operation objects shown in the tutorial. If the gesture operation does not follow the tutorial, the process proceeds to step S206; if it does, the process proceeds to step S207.
In step S206, the display control unit 3 performs control to notify the operator that the gesture operation has failed. Specifically, the display control unit 3 performs control to display on the display device 4 that the gesture operation of the operator 5 does not follow the tutorial.
In step S207, having determined that the gesture operation of the operator 5 follows the tutorial, the device control unit 10 controls the target device based on that gesture operation. In the example of FIG. 14, the content of the mail selected by the gesture operation of the operator 5 is read aloud. At this time, the device control unit 10 may control the voice output device 16 so that the content of the mail is read aloud from the voice output device 16, or may control the terminal communication unit 11 so that the content of the mail is read aloud from the mobile communication terminal 12.
In step S208, the operation determination unit 9 determines whether the gesture operations of the operator 5 have been completed. Specifically, the operation determination unit 9 determines whether the gesture operations acquired by the gesture operation acquisition unit 8 have carried out all the operations following the tutorial displayed on the display device 4. If the gesture operations have been completed, the operation shown in FIG. 13 ends; otherwise, the process returns to step S207.
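The flow of FIG. 13 can be summarized compactly in code. The sketch below is an interpretation under stated assumptions (stub callables, an illustrative timeout value), not the patent's implementation.

```python
# Sketch of the FIG. 13 flow: detect an event (S201), show the tutorial
# (S202), wait for a gesture with a timeout (S203/S204), judge it against the
# tutorial (S205), report failure (S206) or control the device (S207), and
# loop until every tutorial step is done (S208).
import time

TIMEOUT_S = 10.0   # illustrative value for the fixed time of step S204

def run_tutorial(detect_event, show_tutorial, poll_gesture, matches, control):
    event = detect_event()                     # S201: blocks until an event
    steps = show_tutorial(event)               # S202: tutorial gesture sequence
    done = 0
    start = time.monotonic()
    while done < len(steps):
        gesture = poll_gesture()               # S203: None if nothing detected
        if gesture is None:
            if time.monotonic() - start > TIMEOUT_S:
                return False                   # S204: give up after the timeout
            continue
        if not matches(gesture, steps[done]):  # S205: does it follow the tutorial?
            print("gesture did not follow the tutorial")   # S206
            continue
        control(gesture)                       # S207: e.g. read the mail aloud
        done += 1                              # S208: loop until all steps done
    return True

# Demo with stubs: one event, a two-step tutorial, gestures supplied in order.
gestures = iter(["select unread icon", None, "select mail"])
ok = run_tutorial(
    detect_event=lambda: "mail",
    show_tutorial=lambda e: ["select unread icon", "select mail"],
    poll_gesture=lambda: next(gestures, None),
    matches=lambda g, step: g == step,
    control=lambda g: print("device action for:", g),
)
print("tutorial completed:", ok)
```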
As shown in FIG. 13, by displaying the tutorial on the display device 4, the operator 5 can correctly understand what gesture operation to perform when a mail arrives. In particular, when the display device 4 is a HUD as shown in FIG. 6, a driver serving as the operator 5 can perform gesture operations without diverting his or her line of sight from the road ahead.
FIG. 15 is a flowchart showing an example of the operation of the gesture operation device 6 shown in FIG. 8. FIG. 15 concerns the case where, when an event occurs, an operation object guiding the single gesture operation the operator 5 should perform next is displayed, which is described with reference to FIG. 16.
In step S301, as in step S201 of FIG. 13, the event detection unit 2 repeats the process of step S301 until an event is detected. When the event detection unit 2 determines that an event has been detected, the process proceeds to step S302.
For example, as shown in FIG. 16, when the terminal communication unit 11 receives from the mobile communication terminal 12 information indicating that a call has arrived, the display device 4 displays a handset icon together with information on the caller. In the example of FIG. 16, "A call from Mr. Iwasaki" is displayed as the caller information.
In step S302, the display control unit 3 controls the display device 4 so as to display in three dimensions an operation object indicating the single gesture operation the operator 5 should perform next in response to the event detected by the event detection unit 2.
In the example of FIG. 16, an operation object 21 with a hand placed on the handset icon is displayed. The display control unit 3 may also perform control to display the operation object 21 so as to express the movement of placing a hand on the handset icon.
In step S303, the gesture operation acquisition unit 8 determines whether it has acquired from the gesture operation detection device 13 a gesture operation of the operator 5 detected by the gesture operation detection device 13, that is, whether the operator 5 has performed a gesture operation. In the example of FIG. 16, a detection icon 22 corresponding to the hand of the operator 5 detected by the gesture operation detection device 13 is displayed on the display device 4. At this time, the display device 4 may display the handset icon, the operation object 21, and the detection icon 22 in order from the back toward the front. That is, the display control unit 3 performs control to display the operation object 21 on the far side of the three-dimensional display space and to display the detection icon 22, which is an icon of the hand of the operator 5 acquired by the gesture operation acquisition unit 8, on the near side of the three-dimensional display space. If a gesture operation has been acquired, the process proceeds to step S305; otherwise, the process proceeds to step S304.
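This back-to-front ordering can be illustrated with a trivial depth-sorted renderer; the depth values and names below are assumptions for illustration only.

```python
# Toy depth ordering for FIG. 16: the handset icon sits deepest, the guiding
# operation object 21 in the middle, and the detection icon 22 that tracks the
# operator's real hand nearest the viewer. More negative z is further away.
scene = [
    {"name": "handset icon", "z": -1.0},
    {"name": "operation object 21", "z": -0.6},
    {"name": "detection icon 22", "z": -0.2},  # follows the detected hand
]

def render_back_to_front(objects):
    # Draw far objects first so nearer ones appear in front of them.
    for obj in sorted(objects, key=lambda o: o["z"]):
        print(f"draw {obj['name']} at z={obj['z']:+.1f}")

render_back_to_front(scene)
```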
In step S304, as in step S204 of FIG. 13, the gesture operation acquisition unit 8 determines whether a fixed time has elapsed. If the fixed time has elapsed, the operation shown in FIG. 15 ends; otherwise, the process returns to step S303.
In step S305, the operation determination unit 9 determines whether the gesture operation of the operator 5 acquired by the gesture operation acquisition unit 8 is a movement that follows the operation object displayed on the display device 4. If the gesture operation does not follow the operation object, the process proceeds to step S306; if it does, the process proceeds to step S307.
In step S306, as in step S206 of FIG. 13, the display control unit 3 performs control to notify the operator that the gesture operation has failed. Specifically, the display control unit 3 performs control to display on the display device 4 that the gesture operation of the operator 5 does not follow the operation object.
In step S307, having determined that the gesture operation of the operator 5 follows the operation object, the device control unit 10 controls the target device based on that gesture operation. In the example of FIG. 16, when the detection icon 22 overlaps the handset icon, a call becomes possible. At this time, the device control unit 10 controls the terminal communication unit 11 so that a call can be made with the mobile communication terminal 12.
In step S308, the operation determination unit 9 determines whether the gesture operation of the operator 5 has been completed. Specifically, the operation determination unit 9 determines whether the gesture operation acquired by the gesture operation acquisition unit 8 has carried out the operation following the operation object displayed on the display device 4. If the gesture operation has been completed, the operation shown in FIG. 15 ends; otherwise, the process returns to step S302.
As shown in FIG. 15, by displaying an operation object that guides the single gesture operation the operator 5 should perform next when an event occurs, the operator 5 can correctly understand what gesture operation to perform when a call arrives.
<Modification 1>
FIG. 13 shows the operation when the operation objects are displayed as a tutorial, and FIG. 15 shows the operation when an operation object guiding the single gesture operation the operator 5 should perform next is displayed when an event occurs; these operations may be combined. For example, for an event the operator 5 experiences for the first time, the operation of FIG. 13 may be performed to display the operation objects as a tutorial, and when the same kind of event is experienced thereafter, the operation of FIG. 15 may be performed to display an operation object guiding the single gesture operation to perform next. When there are a plurality of operators 5, the operation of FIG. 13 or FIG. 15 may be performed for each operator 5; in this case, the operators 5 need to be recognized individually.
<Modification 2>
In step S202 of FIG. 13 or step S302 of FIG. 15, the display control unit 3 may perform control to display, together with the operation object, an object representing the function targeted by the gesture operation, as shown for example in FIG. 17 or FIG. 18. In FIG. 17, a dial is displayed together with an operation object 21 guiding a clockwise or counterclockwise gesture operation; the dial may be, for example, a dial for adjusting the volume. In FIG. 18, folders A to C are displayed together with an operation object 21 guiding a gesture operation of rotating in the front-rear direction as viewed from the operator 5; the folders may be, for example, a phone book. With such a display, the operator 5 can correctly understand what operation to perform even without a physical operation button. In FIGS. 17 and 18, the hand may be omitted and only the arrow displayed.
The examples of FIGS. 17 and 18 are displayed as still images in step S202 of FIG. 13 or step S302 of FIG. 15, but when the operator 5 actually performs a gesture operation in step S205 of FIG. 13 or step S305 of FIG. 15, the dial rotates, or the folders A to C move so that their order changes, in accordance with the actual gesture operation. That is, when the operation determination unit 9 determines that the gesture operation of the operator 5 is a movement that follows the operation object, the display control unit 3 performs control so that the operation object moves in accordance with the gesture operation acquired by the gesture operation acquisition unit 8.
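As a minimal illustration of an operation object moving in step with the accepted gesture, the sketch below applies each judged hand-rotation increment to a displayed dial angle, as with the dial of FIG. 17; the gain and the increment values are assumptions.

```python
# Keep a displayed dial angle in sync with accepted hand-rotation gestures.
dial_angle = 0.0   # degrees; what the display control unit would render
GAIN = 1.0         # assumed mapping from hand rotation to dial rotation

def on_gesture_rotation(delta_deg):
    """Called once the operation determination unit accepts a rotation gesture."""
    global dial_angle
    dial_angle = (dial_angle + GAIN * delta_deg) % 360.0
    print(f"dial now at {dial_angle:.1f} degrees")

# The hand rotates clockwise in three judged increments.
for delta in (15.0, 20.0, 10.0):
    on_gesture_rotation(delta)
```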
When the folders A to C move in accordance with the actual gesture operation, a sound effect may be added in accordance with the movement. This allows the operator 5 to understand more intuitively and accurately what operation to perform.
<Modification 3>
In step S202 of FIG. 13 or step S302 of FIG. 15, the display control unit 3 may perform control so that a list of operation items rotates in accordance with the movement of the hand, as shown for example in FIG. 19 and as sketched after this paragraph. The list of operation items corresponds, for example, to functions of the in-vehicle device 15. In FIG. 19, the operation items are arranged so as to form a ring in the front-rear direction as viewed from the operator 5. When the operator 5 selects "Setting", the screen transitions to a screen for making various settings; selecting "AM Radio" executes the function for listening to AM radio; selecting "FM Radio" executes the function for listening to FM radio; selecting "Music" executes the function for playing music; and selecting "Navigation" executes a navigation function such as route guidance to a destination. With such a display, the operator 5 can accurately grasp the dynamic feel of operating the target. In FIG. 19, the hand may be omitted and only the arrow displayed.
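A ring of operation items that rotates with the hand can be modeled with a double-ended queue; the rotation step convention below is an assumption for illustration.

```python
# Toy model of the FIG. 19 ring: the item at the front of the ring is the one
# a selection gesture would activate, and hand sweeps rotate the ring.
from collections import deque

items = deque(["Setting", "AM Radio", "FM Radio", "Music", "Navigation"])

def rotate_ring(steps):
    """Positive steps: the hand sweeps the ring away from the operator."""
    items.rotate(-steps)
    print("front item:", items[0])

rotate_ring(1)    # front item: AM Radio
rotate_ring(2)    # front item: Music
rotate_ring(-3)   # back to Setting
```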
When the operator 5 actually performs a gesture operation in step S205 of FIG. 13 or step S305 of FIG. 15, the list of operation items moves in accordance with the actual gesture operation. The display is not limited to the example of FIG. 19: in FIG. 17 the dial may be displayed so as to rotate, and in FIG. 18 the folders A to C may be displayed so that their order changes.
When the list of operation items moves in accordance with the actual gesture operation, a sound effect may be added in accordance with the movement. This allows the operator 5 to understand more intuitively and accurately what operation to perform.
<Example 1>
The gesture operations may be made similar to the operations of the mobile communication terminal 12 that the operator 5 normally uses.
FIG. 20 shows an example in which the operation screen of the mobile communication terminal 12 is displayed in three dimensions on the display device 4. The operation screen of the mobile communication terminal 12 shown in FIG. 20 is the screen displayed when a call arrives. The terminal communication unit 11 acquires from the mobile communication terminal 12, together with the information indicating that a call has arrived, information on the operation screen displayed on the mobile communication terminal 12 when the call arrives. Then, in step S202 of FIG. 13 or step S302 of FIG. 15, the display control unit 3 performs control to display on the display device 4 an operation object indicating that the handset icon for answering is to be slid to the right, as shown for example in FIG. 21. By performing a gesture operation following the operation object shown in FIG. 21, the operator 5 can answer the incoming call. Since the operator 5 performs the gesture operation while looking at the operation screen displayed in three dimensions, the operator 5 can perform gesture operations with the same feel as operating the mobile communication terminal 12 he or she normally uses.
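A slide-to-answer judgment of this kind might reduce to a rightward-travel threshold, as in the hedged sketch below; the threshold value and coordinate convention are assumptions, not the patent's criterion.

```python
# Judge the FIG. 21 gesture: the handset icon must be slid to the right past a
# threshold for the call to be answered.
ANSWER_THRESHOLD = 0.15   # assumed meters of rightward hand travel

def judge_slide(path):
    """path: list of (x, y, z) hand samples; x grows to the operator's right."""
    rightward = path[-1][0] - path[0][0]
    return rightward >= ANSWER_THRESHOLD

samples = [(0.00, 0.00, 0.00), (0.08, 0.00, 0.01), (0.18, 0.01, 0.00)]
if judge_slide(samples):
    print("answer the call")   # the device control unit would enable the call
else:
    print("gesture did not follow the operation object")
```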
As another example, when the mobile communication terminal 12 receives a call, an operation screen as shown in FIG. 22 may be displayed on the display device 4. In this case, as shown for example in FIG. 23, the display control unit 3 performs control to display on the display device 4 an operation object indicating that the handset icon for answering is to be selected. By performing a gesture operation in accordance with FIG. 22, the operator 5 can answer the incoming call.
The operation screens of the mobile communication terminal 12 shown in FIGS. 20 to 23 may be acquired by the terminal communication unit 11 from the mobile communication terminal 12 when a call arrives, as described above, but this is not restrictive. For example, the gesture operation device 6 may store the operation screens of the mobile communication terminal 12 in a storage unit (not shown), or the communication unit 7 may acquire information on the operation screens corresponding to the mobile communication terminal 12 from an external server or the like.
<Example 2>
When the in-vehicle device 15 has functions for playing the radio and music, these functions may be executed by gesture operations. For example, when the operator 5 holds up a hand and utters "music", "radio", "mobile music", or the like, the event detection unit 2 detects these actions as an event.
When the operator 5 utters "music", the display control unit 3 displays on the display device 4, for example, a music playlist and a hand icon as operation objects. At this time, the display control unit 3 may display an operation object indicating that the list can be scrolled by a gesture operation of flicking up or down.
The operator 5 scrolls the list by performing an up-down flick gesture operation following the operation object displayed on the display device 4, and selects a specific piece of music by reading its title aloud. The music may instead be selected by a gesture operation or, when the display device 4 has a touch panel, by a touch operation.
When adjusting the volume, an operation object as shown in FIG. 17, for example, may be displayed on the display device 4. In this case, the operator 5 can adjust the volume by performing a gesture operation such as rotating the dial with his or her thumb, index finger, and middle finger. At this time, in order to enhance the operator 5's feel of operation, the dial may be shaped so that its rotation axis is tilted toward the operator 5.
<Example 3>
When the in-vehicle device 15 has a navigation function, that function may be executed by gesture operations. For example, when the operator 5 holds up a hand and utters "navigation", "destination", "map", or the like, the event detection unit 2 detects these actions as an event.
When the operator 5 utters "destination", the display control unit 3 displays, for example, a list of destinations and a hand icon as operation objects. At this time, the display control unit 3 may display an operation object indicating that the list can be scrolled by an up-down flick gesture operation and that a specific destination can be selected by a touching gesture operation. The operator 5 scrolls the list by performing an up-down flick gesture operation following the operation object displayed on the display device 4, and selects a specific destination by performing a touching gesture operation. The destination may instead be selected, for example, by reading its place name aloud.
When the operator 5 utters "map", the display control unit 3 displays, for example, a map and a hand icon as operation objects. At this time, the display control unit 3 may display an operation object indicating that the map can be enlarged or reduced by a pinch-in or pinch-out gesture operation, and may display an operation object indicating that the map can be scrolled by a gesture operation of moving the hand forward, backward, left, or right with the palm facing down.
For example, as shown in FIG. 24, when a home screen 23, a current location screen 24, and a map scroll screen 25 are hierarchized, the display control unit 3 may display an operation object indicating that the order of the home screen 23, the current location screen 24, and the map scroll screen 25 changes each time a gesture operation of moving the hand upward with the palm facing up is performed. In this case, a gesture operation of flipping the hand over may bring the home screen 23 to the top. When the home screen 23 has selectable items, the display control unit 3 may display an operation object indicating that a specific item is selected by, for example, a touching gesture operation.
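The cycling of hierarchized screens can be illustrated as rotating a small stack; the gesture names and ordering conventions below are assumptions for illustration.

```python
# Toy model of FIG. 24: a palm-up upward gesture cycles the stacking order of
# the three screens, and flipping the hand over brings the home screen to the top.
screens = ["home screen 23", "current location screen 24", "map scroll screen 25"]

def palm_up_raise():
    """Cycle the order: the top screen goes to the bottom of the stack."""
    screens.append(screens.pop(0))
    print("top:", screens[0])

def flip_hand():
    """Bring the home screen to the top, keeping the relative order."""
    while screens[0] != "home screen 23":
        screens.append(screens.pop(0))
    print("top:", screens[0])

palm_up_raise()   # top: current location screen 24
palm_up_raise()   # top: map scroll screen 25
flip_hand()       # top: home screen 23
```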
From the above, according to Embodiment 1, since an operation object guiding a gesture operation involving movement in three-dimensional directions is displayed in three dimensions, the operator 5 can correctly understand what gesture operation to perform. The gesture operation device 6 can therefore accurately determine the gesture operation performed by the operator 5.
<Embodiment 2>
FIG. 25 is a block diagram showing an example of the configuration of a gesture operation device 26 according to Embodiment 2 of the present invention.
As shown in FIG. 25, the gesture operation device 26 according to Embodiment 2 is characterized in that the device control unit 10 has a sound image control unit 27. The other configuration and operation are the same as in Embodiment 1, so a detailed description is omitted here.
The sound image control unit 27 is connected to the voice output device 16 and controls the sound image of the sound output from the voice output device 16. Specifically, the sound image control unit 27 controls the sound image according to the display position of the operation object displayed on the display device 4 under the control of the display control unit 3.
For example, consider the case where an operation object as shown in FIG. 26 is displayed on the display device 4. FIG. 26 shows an operation object for sorting an incoming mail into a storage folder or a trash can. On the display device 4, the incoming mail is displayed at the front left, the trash can at the front right, and the storage folder at the back right. In the operation object, a hand icon moves along an arrow from the incoming mail toward the trash can or the storage folder. Since the sound image control unit 27 controls the sound image in accordance with the movement of the hand icon, the sound heard when the hand icon moves toward the trash can differs from the sound heard when it moves toward the storage folder.
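One plausible reading of this sound image control is stereo panning plus distance attenuation driven by the hand icon's display position; the panning law, positions, and values below are illustrative assumptions, not the patent's method.

```python
# Map a display position to stereo gains so that movement toward the
# front-right trash can and toward the back-right storage folder sound different.
import math

def gains_for_position(x, z):
    """x: left(-1)..right(+1); z: near(0)..far(-1). Returns (left_gain, right_gain)."""
    pan = (x + 1.0) / 2.0                      # 0 = hard left, 1 = hard right
    level = 1.0 / (1.0 + abs(z))               # farther back sounds quieter
    left = math.cos(pan * math.pi / 2.0) * level   # equal-power panning
    right = math.sin(pan * math.pi / 2.0) * level
    return left, right

for name, (x, z) in {"trash can (front right)": (0.8, -0.1),
                     "storage folder (back right)": (0.8, -0.9)}.items():
    left, right = gains_for_position(x, z)
    print(f"{name}: L={left:.2f} R={right:.2f}")
```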
The sound image control unit 27 may also control the sound image in accordance with the gesture operation when the operator 5 actually performs it.
From the above, according to Embodiment 2, since an operation object guiding a gesture operation involving movement in three-dimensional directions is displayed in three dimensions and the sound field is controlled as well, the operator 5 can understand intuitively and correctly what gesture operation to perform. The gesture operation device 26 can therefore accurately determine the gesture operation performed by the operator 5.
<Embodiment 3>
FIG. 27 is a block diagram showing an example of the configuration of a gesture operation device 28 according to Embodiment 3 of the present invention.
As shown in FIG. 27, the gesture operation device 28 according to Embodiment 3 is characterized by including a state acquisition unit 29. The other configuration and operation are the same as in Embodiment 1, so a detailed description is omitted here.
The state acquisition unit 29 is connected to a state detection device 30 and acquires the state of the operator 5 detected by the state detection device 30, including at least the eye position and the line of sight of the operator 5. The state detection device 30 is configured by a camera and identifies the eye position or line of sight of the operator 5 from images captured by the camera. The display control unit 3 controls the display position of the operation object based on the state of the operator 5 acquired by the state acquisition unit 29.
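As a loose illustration only, the display position of the operation object could be derived from the gaze point with an offset and clamping so that the object stays visible without blocking what the operator is looking at; the offset, bounds, and coordinate convention are assumptions, not the patent's rule.

```python
# Place the operation object relative to the operator's gaze point, given in
# display coordinates normalized to -1..+1 on each axis.
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def place_operation_object(gaze_x, gaze_y):
    """Returns where to draw the operation object for the current gaze point."""
    # Keep the object slightly below the gaze point and within the screen.
    x = clamp(gaze_x, -0.8, 0.8)
    y = clamp(gaze_y - 0.2, -0.8, 0.8)
    return x, y

print(place_operation_object(0.95, 0.10))    # gaze far right -> clamped on screen
print(place_operation_object(-0.10, -0.75))  # gaze low -> clamped on screen
```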
From the above, according to Embodiment 3, since the display position of the operation object is controlled based on the state of the operator 5, an operation object that is easy for the operator 5 to see can be displayed. This allows the operator 5 to correctly understand what gesture operation to perform, so the gesture operation device 28 can accurately determine the gesture operation performed by the operator 5.
<Embodiment 4>
<Configuration>
FIG. 28 is a block diagram showing an example of the configuration of a gesture operation device 31 according to Embodiment 4 of the present invention.
As shown in FIG. 28, the gesture operation device 31 according to Embodiment 4 is characterized by including an operation history information storage unit 32. The other configuration and operation are the same as in Embodiment 3, so a detailed description is omitted here.
The operation history information storage unit 32 is configured by a storage device such as a hard disk drive (HDD) or a semiconductor memory, and stores the results of the operation determination unit 9's determinations of whether a gesture operation is a movement that follows the operation object.
<Operation>
FIG. 29 is a flowchart showing an example of the operation of the gesture operation device 31. Steps S401 to S406 in FIG. 29 correspond to steps S201 to S206 in FIG. 13, and steps S408 and S409 in FIG. 29 correspond to steps S207 and S208 in FIG. 13, so their description is omitted here. Step S407 is described below.
In step S407, the operation history information storage unit 32 stores the result determined by the operation determination unit 9 as operation history information. Specifically, when the operation determination unit 9 determines in step S405 that the gesture operation of the operator 5 does not follow the operation objects shown in the tutorial, that determination result is stored in the operation history information storage unit 32. The operation history information stored in the operation history information storage unit 32 is stored for each operator 5. Each operator 5 may be identified based on an image of the operator 5's face detected by the state detection device 30.
In step S405 from the next time onward, the operation determination unit 9 can use the operation history information stored in the operation history information storage unit 32 to determine whether the gesture operation of the operator 5 follows the operation objects shown in the tutorial. The operation determination unit 9 can also perform machine learning using the accumulated operation history information. This makes it possible to take the operator 5's gesture habits into account, for example by determining that a gesture operation follows the operation object even when it differs slightly from the operation object shown in the tutorial.
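A crude stand-in for such history-based adaptation is a per-operator tolerance widened by previously accepted deviations, as sketched below; the representation, thresholds, and acceptance rule are assumptions, and real use would employ machine learning as the text describes.

```python
# Use stored operation history to tolerate an operator's gesture habits:
# past accepted gestures per operator widen the matching tolerance.
history = {}            # operator id -> list of deviations of accepted gestures
BASE_TOLERANCE = 0.10   # assumed deviation allowed for every operator

def judge(operator_id, deviation):
    """deviation: how far the gesture is from the tutorial's ideal motion."""
    past = history.get(operator_id, [])
    # Allow a habitual offset: widen tolerance by the operator's mean deviation.
    habit = sum(past) / len(past) if past else 0.0
    accepted = deviation <= BASE_TOLERANCE + habit
    if accepted:
        history.setdefault(operator_id, []).append(deviation)
    return accepted

print(judge("driver", 0.08))     # True: within base tolerance, habit recorded
print(judge("driver", 0.15))     # True: habit of 0.08 widens the tolerance
print(judge("passenger", 0.15))  # False: no history for this operator yet
```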
FIG. 30 is a flowchart showing an example of the operation of the gesture operation device 31. Steps S501 to S506 in FIG. 30 correspond to steps S201 to S206 in FIG. 13, and steps S509 and S510 in FIG. 30 correspond to steps S207 and S208 in FIG. 13; their description is therefore omitted here. Steps S507 and S508 are described below.
In step S507, the state acquisition unit 29 acquires, as the state of the operator 5, the operator 5's reaction to the result determined by the operation determination unit 9. Specifically, when the operation determination unit 9 determines in step S505 that the gesture operation by the operator 5 does not follow the operation object shown in the tutorial and the operator is notified in step S506 that the gesture operation has failed, the state detection device 30 detects the operator 5's reaction and the state acquisition unit 29 acquires it. The reaction of the operator 5 includes, for example, the operator 5's facial expression or movement.
In step S508, the operation history information storage unit 32 stores the determination result from the operation determination unit 9 and the reaction of the operator 5 acquired by the state acquisition unit 29 in association with each other as operation history information. The operation history information is stored separately for each operator 5. Each operator 5 may be identified based on, for example, the image of the operator 5's face detected by the state detection device 30.
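Continuing the illustrative sketch above (again with hypothetical names and reaction labels), a history record could pair each determination with the detected reaction, so that later learning can distinguish failures the operator intended as the guided gesture from unrelated motions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HistoryRecord:
    guided: tuple              # direction shown by the operation object
    observed: tuple            # direction the operator actually moved
    ok: bool                   # determination result from step S505
    reaction: Optional[str]    # e.g. "frustrated", "retry"; None if none detected

INTENT_REACTIONS = {"frustrated", "retry"}   # hypothetical reaction labels

def habitual_misses(records):
    """Failed attempts whose reaction suggests the operator believed the
    gesture matched the guide; only these should widen the tolerance."""
    return [r for r in records if not r.ok and r.reaction in INTENT_REACTIONS]
```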
In step S505 of subsequent runs, the operation determination unit 9 can use the operation history information stored in the operation history information storage unit 32 to determine whether the gesture operation by the operator 5 follows the operation object shown in the tutorial. The operation determination unit 9 can also perform machine learning on the accumulated operation history information. This allows the determination to take the operator 5's gesture habits into account; for example, a gesture operation that deviates slightly from the operation object shown in the tutorial can still be judged to follow that operation object.
Although FIGS. 29 and 30 illustrate the case of displaying a tutorial, the same processing is also applicable to the case of displaying an operation object that guides the one gesture operation the operator 5 should perform next when an event occurs.
When the operator 5 performs an operation other than a gesture operation, such as operating a shift lever or a remote control device, the operation determination unit 9 need not perform the process of step S405 in FIG. 29 or step S505 in FIG. 30.
As described above, according to the fourth embodiment, the operation history information stored in the operation history information storage unit 32 is used to determine whether the gesture operation by the operator 5 follows the operation object shown in the tutorial, so the determination can take the operator 5's gesture habits into account.
Fifth Preferred Embodiment
FIG. 31 is a block diagram showing an example of the configuration of the gesture operation device 33 according to the fifth embodiment of the present invention.
As shown in FIG. 31, the gesture operation device 33 according to the fifth embodiment is characterized in that the communication unit 7 includes an external information acquisition unit 34. The other configuration and operation are the same as in the first embodiment, and thus detailed description is omitted here.
The external information acquisition unit 34 acquires various external information from outside the device. Examples of the external information include information related to the operator 5, traffic information, and weather information. Information related to the operator 5 includes, for example, the operator 5's schedule, contacts, favorite music, SNS (Social Networking Service) information, traffic or weather information for a set destination, POI (Point Of Interest) information, purchasing behavior information, and information on home appliances connected to the operator's home network.
For example, when the external information acquisition unit 34 acquires information on a home appliance connected to the operator 5's home network, the event detection unit 2 detects the acquisition of that information as an event. Then, as shown in FIG. 32, the display control unit 3 controls the display device 4 to display an operation object 21 representing an operation of turning the air conditioner installed on the first floor of the operator 5's home on or off. For example, an operation object may be displayed in which the power state toggles between on and off each time a hand icon presses a switch icon. This allows the operator 5 to easily grasp which room's home appliance is being operated and to understand exactly what gesture operation to perform.
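As a non-authoritative sketch of this flow, assuming hypothetical event names and a stand-in display interface (none of which are specified by the embodiment), the acquisition of external information might be routed to an operation-object display as follows:

```python
EVENT_TO_OBJECT = {                              # hypothetical mapping
    "aircon_status": "toggle_power_switch_3d",   # FIG. 32 style switch press
}

class Display:
    def show_operation_object(self, name, context=None):
        print(f"render 3D operation object '{name}' ({context})")

class EventDetector:
    def __init__(self, display):
        self.display = display

    def on_external_info(self, kind, payload):
        """Entry point called when the external information acquisition
        unit receives new information, e.g. home-appliance status."""
        obj = EVENT_TO_OBJECT.get(kind)
        if obj is not None:
            self.display.show_operation_object(obj, context=payload)

# Example: appliance status arriving from the home network
EventDetector(Display()).on_external_info(
    "aircon_status", {"floor": 1, "room": "living", "power": "off"})
```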
As described above, according to the fifth embodiment, the operation object is displayed using external information, so the operator 5 can accurately understand, for example, the gesture operation for controlling a home appliance installed at home. The gesture operation device 33 can therefore accurately determine the gesture operation performed by the operator 5.
Sixth Preferred Embodiment
FIG. 33 is a block diagram showing an example of the configuration of the gesture operation device 35 according to the sixth embodiment of the present invention.
As shown in FIG. 33, the gesture operation device 35 according to the sixth embodiment is characterized in that the device control unit 10 includes a vehicle information acquisition unit 36. The other configuration and operation are the same as in the first embodiment, and thus detailed description is omitted here.
The vehicle information acquisition unit 36 acquires vehicle information from various control devices provided in the vehicle, which are in-vehicle devices 15. Vehicle information is any of various kinds of information about the vehicle.
For example, when the vehicle information acquisition unit 36 acquires, from the in-vehicle device 15, information indicating that the trunk door of the vehicle is open, the event detection unit 2 detects the acquisition of that vehicle information as an event. Then, as shown in FIG. 34, the display control unit 3 controls the display device 4 to display an operation object 21 representing the motion of closing the trunk of the vehicle. This allows the operator 5 to correctly understand the gesture operation for closing the trunk.
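The full loop from vehicle event to device control could look like the following sketch, which reuses `_angle` from the history sketch and the stand-in `Display` above; the trunk-close direction, the tolerance, and the `vehicle.send` interface are all assumptions, not the embodiment's actual interfaces:

```python
GUIDED_TRUNK_CLOSE = (0.0, -1.0, 0.0)    # hypothetical downward pull motion

def handle_trunk_open(display, gesture_reader, vehicle):
    display.show_operation_object("close_trunk_3d")   # guide (FIG. 34)
    observed = gesture_reader.read_direction()        # e.g. (0.0, -0.9, 0.1)
    if _angle(GUIDED_TRUNK_CLOSE, observed) <= 0.35:  # follows the object?
        vehicle.send("trunk", "close")                # device control step
```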
Similarly, for example, when the terminal communication unit 11 acquires information about the mobile communication terminal 12 from that terminal, the event detection unit 2 detects the acquisition of that information as an event. Then, as shown in FIG. 35, the display control unit 3 controls the display device 4 to display an operation object 21 representing the motion of switching the displayed information about the mobile communication terminal 12. This allows the operator 5 to correctly understand the gesture operation for switching that display.
As described above, according to the sixth embodiment, the operation object is displayed using information acquired from the in-vehicle device 15 or the mobile communication terminal 12, so the operator 5 can accurately understand the gesture operation for controlling the in-vehicle device 15 or the mobile communication terminal 12. The gesture operation device 35 can therefore accurately determine the gesture operation performed by the operator 5.
The gesture operation device described above is applicable not only to an on-vehicle navigation device, that is, a car navigation device, but also to a navigation device built as a system by appropriately combining a PND (Portable Navigation Device) mountable on a vehicle, a server provided outside the vehicle, and the like, as well as to devices other than navigation devices. In that case, the functions or components of the gesture operation device are distributed among the functions that make up the system.
Specifically, as one example, the functions of the gesture operation device can be placed on a server. For example, as shown in FIG. 36, the user side includes the display device 4, the mobile communication terminal 12, the gesture operation detection device 13, the voice input device 14, the in-vehicle device 15, and the voice output device 16. The server 37 includes the event detection unit 2, the display control unit 3, the communication unit 7, the gesture operation acquisition unit 8, the operation determination unit 9, the device control unit 10, and the terminal communication unit 11. With such a configuration, a gesture operation system can be built. The same applies to the gesture operation device 26 shown in FIG. 25, the gesture operation device 28 shown in FIG. 27, the gesture operation device 31 shown in FIG. 28, the gesture operation device 33 shown in FIG. 31, and the gesture operation device 35 shown in FIG. 33.
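A minimal sketch of this split, assuming a JSON message format and an in-process stand-in for the network link (neither of which is specified by the embodiment), is:

```python
import json

class Server:
    """Server-side stand-in holding detection/determination (FIG. 36)."""
    def handle(self, message: str) -> str:
        frame = json.loads(message)
        # event detection and operation determination would run here;
        # the reply tells the user side what to render
        return json.dumps({"show": "close_trunk_3d",
                           "for": frame["operator_id"]})

def client_tick(gesture_reader, server, display):
    """User side: forward raw gesture data, render what the server says."""
    frame = {"operator_id": "driver",
             "direction": gesture_reader.read_direction()}
    reply = json.loads(server.handle(json.dumps(frame)))  # stands in for a network call
    if "show" in reply:
        display.show_operation_object(reply["show"])
```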
Thus, even in a configuration in which the functions of the gesture operation device are distributed among the functions that make up the system, the same effects as in the above embodiments are obtained.
The software that executes the operations in the above embodiments may also be incorporated into, for example, a server. The gesture operation method realized by the server executing this software is to detect an event and, for the detected event, perform control to display in three dimensions an operation object that guides a gesture operation indicating the operator's intent, within the actions the operator can perform to advance the event.
Thus, by incorporating the software that executes the operations in the above embodiments into a server and running it there, the same effects as in the above embodiments are obtained.
Within the scope of the invention, the embodiments may be freely combined, and each embodiment may be modified or omitted as appropriate.
Although the present invention has been described in detail, the above description is in all aspects illustrative, and the invention is not limited thereto. It is understood that countless variations not illustrated can be conceived without departing from the scope of the invention.
Reference Signs List: 1 gesture operation device, 2 event detection unit, 3 display control unit, 4 display device, 5 operator, 6 gesture operation device, 7 communication unit, 8 gesture operation acquisition unit, 9 operation determination unit, 10 device control unit, 11 terminal communication unit, 12 mobile communication terminal, 13 gesture operation detection device, 14 voice input device, 15 in-vehicle device, 16 voice output device, 17 gesture operation space, 18 processor, 19 memory, 20 unread icon, 21 operation object, 22 detection icon, 23 home screen, 24 current location screen, 25 map scroll screen, 26 gesture operation device, 27 sound image control unit, 28 gesture operation device, 29 state acquisition unit, 30 state detection device, 31 gesture operation device, 32 operation history information storage unit, 33 gesture operation device, 34 external information acquisition unit, 35 gesture operation device, 36 vehicle information acquisition unit, 37 server.

Claims (18)

1.  A gesture operation device comprising:
    an event detection unit that detects an event; and
    a display control unit that, for the event detected by the event detection unit, performs control to display in three dimensions an operation object that guides a gesture operation indicating the intent of the operator, within the actions the operator can perform to advance the event.
2.  The gesture operation device according to claim 1, wherein the display control unit performs control to display the operation object indicating a series of the gesture operations to be performed by the operator in response to the event.
3.  The gesture operation device according to claim 1, wherein the display control unit performs control to display the operation object indicating the one gesture operation the operator should perform next in response to the event.
4.  The gesture operation device according to claim 3, further comprising a gesture operation acquisition unit that acquires the gesture operation of the operator, wherein the display control unit performs control to display the operation object on the far side of a three-dimensional display space and to display an icon of the operator's hand, acquired by the gesture operation acquisition unit, on the near side of the three-dimensional display space.
5.  The gesture operation device according to claim 1, wherein the display control unit performs control to display, together with the operation object, an object representing the function targeted by the gesture operation.
6.  The gesture operation device according to claim 1, further comprising:
    a gesture operation acquisition unit that acquires the gesture operation of the operator; and
    an operation determination unit that determines whether the gesture operation acquired by the gesture operation acquisition unit is a motion that follows the operation object,
    wherein, when the operation determination unit determines that the motion follows the operation object, the display control unit performs control so that the operation object moves in accordance with the gesture operation acquired by the gesture operation acquisition unit.
7.  The gesture operation device according to claim 1, further comprising a terminal communication unit communicably connectable to a mobile communication terminal, wherein the display control unit performs control to display, as the operation object, an operation screen of the mobile communication terminal with which the terminal communication unit is communicating.
8.  The gesture operation device according to claim 1, further comprising a sound image control unit that controls a sound image, wherein the sound image control unit controls the sound image according to the display position of the operation object.
9.  The gesture operation device according to claim 1, further comprising a state acquisition unit that acquires the state of the operator, including at least the position of the operator's eyes and the operator's line of sight, wherein the display control unit controls the display position of the operation object based on the state of the operator acquired by the state acquisition unit.
10.  The gesture operation device according to claim 1, further comprising:
    a gesture operation acquisition unit that acquires the gesture operation of the operator;
    an operation determination unit that determines whether the gesture operation acquired by the gesture operation acquisition unit is a motion that follows the operation object; and
    an operation history information storage unit that stores the result determined by the operation determination unit as operation history information.
11.  The gesture operation device according to claim 10, further comprising a state acquisition unit that acquires the state of the operator, wherein the state acquisition unit acquires the operator's reaction to the result determined by the operation determination unit, and the operation history information storage unit stores the operator's reaction as part of the operation history information.
12.  The gesture operation device according to claim 10, wherein the operation determination unit does not make the determination when the operator performs an operation other than the gesture operation.
13.  The gesture operation device according to claim 1, further comprising an external information acquisition unit that acquires external information, wherein the display control unit performs control to display the operation object based on the external information acquired by the external information acquisition unit.
14.  The gesture operation device according to claim 1, further comprising a vehicle information acquisition unit that acquires vehicle information, wherein the display control unit performs control to display the operation object based on the vehicle information acquired by the vehicle information acquisition unit.
15.  The gesture operation device according to claim 1, further comprising a terminal communication unit communicably connectable to a mobile communication terminal, wherein the display control unit performs control to display the operation object based on information about the mobile communication terminal acquired via the terminal communication unit.
16.  The gesture operation device according to claim 1, wherein the display control unit performs control to display the operation object on at least one of a display device that performs autostereoscopic display, a display device that performs virtual image display, and a display device configured by stacking a plurality of transmissive display surfaces.
17.  The gesture operation device according to claim 1, further comprising:
    a gesture operation acquisition unit that acquires the gesture operation of the operator;
    an operation determination unit that determines whether the gesture operation acquired by the gesture operation acquisition unit is a motion that follows the operation object; and
    a device control unit that controls a device based on the gesture operation when the operation determination unit determines that the motion follows the operation object.
18.  A gesture operation method comprising:
    detecting an event; and
    for the detected event, performing control to display in three dimensions an operation object that guides a gesture operation indicating the intent of the operator, within the actions the operator can perform to advance the event.
PCT/JP2018/002242 2018-01-25 2018-01-25 Gesture operation device and gesture operation method WO2019146032A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/002242 WO2019146032A1 (en) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method
JP2019567457A JP6900133B2 (en) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002242 WO2019146032A1 (en) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method

Publications (1)

Publication Number Publication Date
WO2019146032A1 (en)

Family

Family ID: 67395344

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/002242 WO2019146032A1 (en) 2018-01-25 2018-01-25 Gesture operation device and gesture operation method

Country Status (2)

Country Link
JP (1) JP6900133B2 (en)
WO (1) WO2019146032A1 (en)



Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110046470A * 2008-07-15 2011-05-04 IP Solutions, Inc. Naked eye three dimensional video image display system, naked eye three dimensional video image display device, amusement game machine and parallax barrier sheet
US8970484B2 (en) * 2010-07-23 2015-03-03 Nec Corporation Three dimensional display device and three dimensional display method
WO2016088410A1 * 2014-12-02 2016-06-09 Sony Corporation Information processing device, information processing method, and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008089985A (en) * 2006-10-02 2008-04-17 Pioneer Electronic Corp Image display device
JP2009089068A (en) * 2007-09-28 2009-04-23 Victor Co Of Japan Ltd Control device for electronic apparatus, control method and control program
WO2012011263A1 * 2010-07-20 2012-01-26 Panasonic Corporation Gesture input device and gesture input method
JP2013211712A (en) * 2012-03-30 2013-10-10 Sony Corp Output controller, output control method, and program
JP2017501500A * 2013-09-17 2017-01-12 Amazon Technologies, Inc. Approach for 3D object display
JP2017027401A * 2015-07-23 2017-02-02 Denso Corporation Display operation device

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11531459B2 (en) 2016-05-16 2022-12-20 Google Llc Control-article-based control of a user interface
US11841933B2 (en) 2019-06-26 2023-12-12 Google Llc Radar-based authentication status feedback
US11868537B2 (en) 2019-07-26 2024-01-09 Google Llc Robust radar-based gesture-recognition by user equipment
US11288895B2 (en) 2019-07-26 2022-03-29 Google Llc Authentication management through IMU and radar
US11360192B2 (en) 2019-07-26 2022-06-14 Google Llc Reducing a state based on IMU and radar
US11385722B2 (en) 2019-07-26 2022-07-12 Google Llc Robust radar-based gesture-recognition by user equipment
US11790693B2 (en) 2019-07-26 2023-10-17 Google Llc Authentication management through IMU and radar
US11402919B2 (en) 2019-08-30 2022-08-02 Google Llc Radar gesture input methods for mobile devices
US11467672B2 (en) 2019-08-30 2022-10-11 Google Llc Context-sensitive control of radar-based gesture-recognition
US11687167B2 (en) 2019-08-30 2023-06-27 Google Llc Visual indicator for paused radar gestures
US11281303B2 (en) 2019-08-30 2022-03-22 Google Llc Visual indicator for paused radar gestures
US12008169B2 (en) 2019-08-30 2024-06-11 Google Llc Radar gesture input methods for mobile devices
KR20210040243A (en) * 2019-10-03 2021-04-13 구글 엘엘씨 Facilitating user-proficiency in using radar gestures to interact with an electronic device
KR102320754B1 (en) * 2019-10-03 2021-11-02 구글 엘엘씨 Facilitating user-proficiency in using radar gestures to interact with an electronic device

Also Published As

Publication number Publication date
JPWO2019146032A1 (en) 2020-07-02
JP6900133B2 (en) 2021-07-07

Similar Documents

Publication Publication Date Title
WO2019146032A1 (en) Gesture operation device and gesture operation method
US10123300B2 (en) Tactile feedback in an electronic device
EP3000013B1 (en) Interactive multi-touch remote control
US9323446B2 (en) Apparatus including a touch screen and screen change method thereof
JP5694204B2 (en) System and method for using textures in a graphical user interface device
WO2009131089A1 (en) Portable information terminal, computer readable program and recording medium
CN105138259A (en) Operation execution method and operation execution device
US20170068418A1 (en) Electronic apparatus, recording medium, and operation method of electronic apparatus
WO2014208691A1 (en) Portable device and method for controlling portable device
JP2010003307A (en) Portable information terminal, computer readable program and recording medium
US20150253887A1 (en) Information processing apparatus
EP3657311A1 (en) Apparatus including a touch screen and screen change method thereof
JP2019175449A (en) Information processing apparatus, information processing system, movable body, information processing method, and program
JP2017173989A (en) Image display device
JP6078375B2 (en) Electronic device, control program, and operation method of electronic device
US9733725B2 (en) Control unit, input apparatus and method for an information and communication system
KR101511118B1 (en) Apparatus and method for displaying split screen
KR20150009695A (en) Method for operating application and electronic device thereof
KR101422003B1 (en) Method of displaying menu in terminal and Terminal using this same
US9582150B2 (en) User terminal, electronic device, and control method thereof
KR102117450B1 (en) Display device and method for controlling thereof
JP5227356B2 (en) Information terminal and information input method
US20200272325A1 (en) Input control device, input device, and input control method
JP2016063366A (en) Portable display terminal
JP2014164388A (en) Information presentation device

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18901876; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019567457; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 18901876; Country of ref document: EP; Kind code of ref document: A1)