WO2019016875A1 - Touch operation determination device and method for determining the validity of a touch operation - Google Patents

Touch operation determination device and method for determining the validity of a touch operation

Info

Publication number
WO2019016875A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
indicator
input surface
touch input
coordinates
Prior art date
Application number
PCT/JP2017/026010
Other languages
English (en)
Japanese (ja)
Inventor
家田 邦代
下谷 光生
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2019530274A (patent JP6639745B2)
Priority to PCT/JP2017/026010 (publication WO2019016875A1)
Priority to CN201780093030.3A (patent CN110869891B)
Publication of WO2019016875A1

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • the present invention relates to a technique for preventing an erroneous operation on a touch input device having two touch input surfaces.
  • Patent Document 1 discloses a vehicle display device provided with two screens each functioning as a touch input surface.
  • In that device, when the two screens are touched almost simultaneously, the touch on the screen with the lower priority is invalidated, thereby preventing the erroneous operation.
  • However, the determination of the validity of a touch then depends on the priorities assigned to the two screens (touch input surfaces), so the determination is biased: whenever the two screens are touched almost simultaneously, the touch on the lower-priority screen is always invalidated. The determination result may therefore differ from the operator's intention. Specifically, when the operator intends to touch the high-priority screen and touches the low-priority screen by mistake, the result matches the operator's intention; conversely, when the operator intends to touch the low-priority screen and touches the high-priority screen by mistake, the result is contrary to the operator's intention.
  • The present invention has been made to solve the problems described above. Its object is to eliminate the bias of the determination in a touch operation determination device that determines the validity of touches on two touch input surfaces, and to enable a determination that better matches the operator's intention.
  • A touch operation determination device according to the present invention includes: an indicator position detection unit that detects the position of an indicator in the space in which a touch input device having a first touch input surface and a second touch input surface is disposed; a first touch coordinate acquisition unit that acquires first touch coordinates indicating the position of a first touch, which is a touch of the indicator on the first touch input surface; a second touch coordinate acquisition unit that acquires second touch coordinates indicating the position of a second touch, which is a touch of the indicator on the second touch input surface; and a validity determination unit that, when the time between the first touch and the second touch is shorter than a predetermined threshold time, determines the validity of each of the first touch and the second touch based on the time-series change in the position of the indicator before the first touch and the second touch were performed.
  • According to the present invention, when the first touch and the second touch are performed almost simultaneously, the validity of each of the first and second touches is determined based on the time-series change in the position of the indicator before the touches were performed. It is therefore unnecessary to set priorities on the first and second touch input surfaces, and a bias in the determination of the validity of each of the first and second touches is prevented. As a result, the determination result can better match the operator's intention.
  • FIG. 1 is a functional block diagram showing the configuration of a touch operation system according to Embodiment 1. FIGS. 2 and 3 show examples of a touch input device, and the subsequent figures illustrate its operation.
  • FIG. 12 is a functional block diagram showing the configuration of a touch operation system according to Embodiment 2, with accompanying figures explaining its operation.
  • FIG. 14 is a flowchart showing the validity determination processing in Embodiment 2.
  • FIG. 15 is a functional block diagram showing the configuration of a touch operation system according to Embodiment 3. A further figure shows a modification of the touch input device.
  • FIG. 1 is a functional block diagram showing the configuration of the touch operation system according to the first embodiment.
  • the touch operation system includes a touch input device 100 including a first touch panel 10 and a second touch panel 20, a touch operation determination device 30, and an operation target device 40.
  • The first touch panel 10 of the touch input device 100 includes a first touch input surface 11 that receives the operator's touch operation and a first touch coordinate detection unit 12 that detects coordinates indicating the operator's touch position on the first touch input surface 11.
  • Likewise, the second touch panel 20 of the touch input device 100 includes a second touch input surface 21 that receives the operator's touch operation and a second touch coordinate detection unit 22 that detects coordinates indicating the operator's touch position on the second touch input surface 21.
  • In this embodiment, the indicator with which the operator touches the first touch panel 10 and the second touch panel 20 is assumed to be the operator's hand, but it may also be, for example, a stylus pen.
  • An example of the touch input device 100 is shown in FIGS. 2 and 3.
  • In this example, the first touch input surface 11 and the second touch input surface 21 are each flat, and the first touch input surface 11 and the second touch input surface 21 form an angle of less than 180°.
  • FIGS. 2 and 3 show an example in which, when the operator tries to touch the first touch input surface 11 with the index finger of the hand 500, the thumb erroneously touches the second touch input surface 21, so that an erroneous touch occurs on the second touch input surface 21.
  • In this case, the first touch coordinate detection unit 12 detects the coordinates of the touch position P1 of the operator's hand 500 on the first touch input surface 11, and the second touch coordinate detection unit 22 detects the coordinates of the touch position P2 of the operator's hand 500 on the second touch input surface 21.
  • Because the first touch input surface 11 and the second touch input surface 21 form an angle of less than 180°, such an erroneous touch is likely to occur.
  • Such an erroneous touch can be caused not only by the thumb but also by a bent little finger or the palm.
  • Hereinafter, the touch of the operator on the first touch input surface 11 is referred to as the "first touch", and the coordinates indicating its position as the "first touch coordinates".
  • Likewise, the touch of the operator on the second touch input surface 21 is referred to as the "second touch", and the coordinates indicating its position as the "second touch coordinates".
  • The touch operation determination device 30 includes a first touch coordinate acquisition unit 31, a second touch coordinate acquisition unit 32, an indicator position detection unit 33, and a validity determination unit 34.
  • The first touch coordinate acquisition unit 31 acquires, from the first touch coordinate detection unit 12 of the touch input device 100, information on the presence or absence of a first touch on the first touch input surface 11 and the first touch coordinates indicating the position of the first touch.
  • Similarly, the second touch coordinate acquisition unit 32 acquires, from the second touch coordinate detection unit 22 of the touch input device 100, information on the presence or absence of a second touch on the second touch input surface 21 and the second touch coordinates indicating the position of the second touch.
  • the indicator position detection unit 33 detects the position of the indicator (the operator's hand 500) in the space where the touch input device 100 is disposed.
  • As a method of detecting the position of the indicator, for example, analyzing an image of the operator captured by a camera (not shown) or analyzing the direction and distance of the indicator measured by a sensor (not shown) is conceivable.
  • In principle, the validity determination unit 34 outputs the first touch coordinates acquired by the first touch coordinate acquisition unit 31 and the second touch coordinates acquired by the second touch coordinate acquisition unit 32 to the operation target device 40, which is the target of the touch operation. However, when the time between the first touch and the second touch is shorter than a predetermined threshold time (that is, when the first touch and the second touch are performed almost simultaneously), the validity determination unit 34 determines the validity of each of the first and second touches and outputs only the coordinates of the touch position determined to be valid to the operation target device 40. The validity of the first and second touches is determined based on the time-series change in the position of the indicator before the first and second touches were performed.
  • The validity determination unit 34 includes a first indicator coordinate calculation unit 341, a second indicator coordinate calculation unit 342, and a determination unit 343.
  • The first indicator coordinate calculation unit 341 knows in advance the position at which the first touch input surface 11 is disposed, and calculates, based on the position of the indicator detected by the indicator position detection unit 33, the coordinates of the position on the first touch input surface 11 closest to the indicator (hereinafter referred to as the "first indicator coordinates").
  • The second indicator coordinate calculation unit 342 knows in advance the position at which the second touch input surface 21 is disposed, and calculates, based on the position of the indicator detected by the indicator position detection unit 33, the coordinates of the position on the second touch input surface 21 closest to the indicator (hereinafter referred to as the "second indicator coordinates").
  • the first and second indicator coordinate calculation units 341 and 342 hold the history of the first and second indicator coordinates, that is, the time-series changes for a predetermined time.
  • For example, the first indicator coordinate calculation unit 341 calculates, as the first indicator coordinates, the coordinates of the position Q1 on the first touch input surface 11 closest to the operator's hand 500, and the second indicator coordinate calculation unit 342 calculates, as the second indicator coordinates, the coordinates of the position Q2 on the second touch input surface 21 closest to the operator's hand 500.
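  • The following minimal Python sketch illustrates one way such an indicator coordinate could be computed for a flat, rectangular touch input surface: the detected 3-D indicator position is projected onto the surface plane and clamped to the panel edges. The surface representation (an origin corner plus two orthonormal edge vectors) and all names are assumptions made for this sketch, not details taken from the patent.

```python
import numpy as np

def closest_point_on_surface(p, origin, u, v, width, height):
    """Return the point on a flat rectangular surface closest to 3-D point p.

    The surface is described by an origin corner, two orthonormal in-plane
    unit vectors u and v, and its extents along u and v (a hypothetical
    representation chosen for this sketch).
    """
    d = np.asarray(p, dtype=float) - origin
    s = np.clip(np.dot(d, u), 0.0, width)    # in-plane component along u, clamped
    t = np.clip(np.dot(d, v), 0.0, height)   # in-plane component along v, clamped
    return origin + s * u + t * v

# Example: the hand position relative to one panel (arbitrary coordinate frame).
hand = np.array([0.10, 0.05, 0.08])          # metres
q1 = closest_point_on_surface(hand, np.zeros(3),
                              np.array([1.0, 0.0, 0.0]),
                              np.array([0.0, 1.0, 0.0]),
                              width=0.30, height=0.20)
print(q1)  # coordinates of the position Q1 closest to the hand on that surface
```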
  • The determination unit 343 determines the validity of each of the first touch and the second touch based on the time-series change of the first indicator coordinates calculated by the first indicator coordinate calculation unit 341 and the time-series change of the second indicator coordinates calculated by the second indicator coordinate calculation unit 342.
  • In general, the operator of a touch panel tends to press a finger perpendicularly onto the touch input surface. Therefore, when the operator tries to touch the first touch input surface 11, the operator's hand 500 moves substantially perpendicularly to the first touch input surface 11, as shown in the figures. In this case, the first indicator coordinates (position Q1) hardly change, while the second indicator coordinates (position Q2) change largely. Although not illustrated, when the operator tries to touch the second touch input surface 21, the operator's hand 500 moves substantially perpendicularly to the second touch input surface 21, so the second indicator coordinates (position Q2) hardly change, while the first indicator coordinates (position Q1) change largely.
  • The determination unit 343 compares the amount of change in the first indicator coordinates during a predetermined time immediately before the first touch was performed with the amount of change in the second indicator coordinates during the predetermined time immediately before the second touch was performed, validates the touch corresponding to the smaller amount of change, and invalidates the touch corresponding to the larger amount of change. That is, if the amount of change in the first indicator coordinates is smaller, the determination unit 343 determines that the first touch is valid and the second touch is invalid; conversely, if the amount of change in the second indicator coordinates is smaller, it determines that the second touch is valid and the first touch is invalid. (A code sketch of this comparison is given after the following remarks.)
  • When the amounts of change in the first indicator coordinates and the second indicator coordinates are equal, either the first touch or the second touch may be invalidated.
  • For example, the first touch may always be invalidated, or the second touch may always be invalidated.
  • A bias of this limited degree in the determination is considered acceptable.
  • Alternatively, the earlier of the first and second touches may be invalidated, or the later one may be invalidated.
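  • As a concrete illustration of the comparison described above, the following Python sketch measures how much each set of indicator coordinates changed during the window immediately before the touches and treats the touch whose indicator coordinates changed least as valid. The history format, the use of total path length as the "amount of change", and the tie-breaking choice are illustrative assumptions, not specifics from the patent.

```python
import numpy as np

def coordinate_change(history):
    """Total displacement of an indicator-coordinate history.

    `history` is a sequence of (timestamp, (x, y, z)) samples covering the
    predetermined time immediately before the touch (assumed format).
    """
    pts = np.array([p for _, p in history], dtype=float)
    if len(pts) < 2:
        return 0.0
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())

def determine_validity(first_history, second_history):
    """Return 'first' or 'second': which of the two touches is valid."""
    change1 = coordinate_change(first_history)    # analogue of step S401
    change2 = coordinate_change(second_history)   # analogue of step S402
    if change1 < change2:
        return "first"    # first indicator coordinates barely moved
    if change2 < change1:
        return "second"   # second indicator coordinates barely moved
    return "second"       # tie: one of the options above (always invalidate the first touch)
```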
  • the operation target device 40 illustrated in FIG. 1 may be any device as long as it can be a target of touch operation using the touch input device 100.
  • a navigation device, an audio display device, or the like is assumed as the operation target device 40.
  • the operation target device 40 may be plural.
  • the first touch panel 10 and the second touch panel 20 may be used to operate different operation target devices 40, respectively.
  • a mode is conceivable in which the first touch panel 10 is used for operating the navigation device and the second touch panel 20 is used for operating the audio display device.
  • the first touch panel 10 and the second touch panel 20 may be used to operate different applications.
  • For example, the first touch panel 10 may be used to operate a navigation application, and the second touch panel 20 may be used to operate an application for playing video and music.
  • Alternatively, the first touch panel 10 and the second touch panel 20 may be used to operate the same application, that is, to operate different attributes of the same application.
  • For example, the first touch panel 10 may be used for the map operation of a navigation application, and the second touch panel 20 may be used for the facility search operation of the same application.
  • Although the touch input device 100, the touch operation determination device 30, and the operation target device 40 are each shown as separate blocks, two or more of them may be configured integrally.
  • For example, the touch input device 100 may incorporate the touch operation determination device 30.
  • When the touch operation system is applied to a portable device such as a smartphone, the touch input device 100, the touch operation determination device 30, and the operation target device 40 are all housed in one case and configured integrally.
  • FIG. 6 is a flowchart showing the operation of the first touch coordinate acquisition unit 31. The operation of the first touch coordinate acquisition unit 31 will be described based on FIG.
  • First, the first touch coordinate acquisition unit 31 determines the presence or absence of the operator's touch (first touch) on the first touch input surface 11 based on the output signal of the first touch coordinate detection unit 12 (step S101). If the first touch is not performed (NO in step S101), step S101 is repeated. If the first touch is performed (YES in step S101), the first touch coordinate acquisition unit 31 acquires, from the first touch coordinate detection unit 12, the first touch coordinates indicating the position of the first touch (step S102). Then, the first touch coordinate acquisition unit 31 transmits the acquired first touch coordinates to the determination unit 343 (step S103), and the process returns to step S101.
  • FIG. 7 is a flowchart showing the operation of the second touch coordinate acquisition unit 32. The operation of the second touch coordinate acquisition unit 32 will be described based on FIG. 7.
  • First, the second touch coordinate acquisition unit 32 determines the presence or absence of the operator's touch (second touch) on the second touch input surface 21 based on the output signal of the second touch coordinate detection unit 22 (step S201). If the second touch is not performed (NO in step S201), step S201 is repeated. If the second touch is performed (YES in step S201), the second touch coordinate acquisition unit 32 acquires, from the second touch coordinate detection unit 22, the second touch coordinates indicating the position of the second touch (step S202). Then, the second touch coordinate acquisition unit 32 transmits the acquired second touch coordinates to the determination unit 343 (step S203), and the process returns to step S201.
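  • A minimal Python sketch of this acquisition loop might look as follows. The `detector.poll()` method and the queue used to forward coordinates to the determination unit are hypothetical interfaces invented for this sketch, not APIs described in the patent.

```python
import queue
import time

def touch_coordinate_acquisition(detector, out_queue, poll_interval=0.01):
    """Poll one touch coordinate detector and forward detected touch coordinates.

    `detector.poll()` is assumed to return (touched, coords) for its touch
    input surface; coordinates are sent to the determination unit through
    `out_queue` (both interfaces are assumptions for illustration).
    """
    while True:
        touched, coords = detector.poll()                 # step S101 / S201
        if touched:
            out_queue.put((time.monotonic(), coords))     # steps S102-S103 / S202-S203
        time.sleep(poll_interval)

# Usage sketch: one loop per touch panel, each feeding the same queue.
to_determination_unit = queue.Queue()
# touch_coordinate_acquisition(first_detector, to_determination_unit)
# touch_coordinate_acquisition(second_detector, to_determination_unit)
```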
  • In the following description, the time difference between the moment the first touch is performed on the first touch input surface 11 and the moment the determination unit 343 receives the first touch coordinates indicating its position, and the time difference between the moment the second touch is performed on the second touch input surface 21 and the moment the determination unit 343 receives the second touch coordinates indicating its position, are ignored. That is, it is assumed that the determination unit 343 receives the first touch coordinates at the same time as the first touch is performed on the first touch input surface 11, and receives the second touch coordinates at the same time as the second touch is performed on the second touch input surface 21.
  • FIG. 8 is a flowchart showing operations of the indicator position detection unit 33 and the validity determination unit 34. The operations of the indicator position detection unit 33 and the validity determination unit 34 will be described based on FIG. 8.
  • the indicator position detection unit 33 acquires the position of the indicator (the operator's hand 500) in the space where the touch input device 100 is disposed (step S301).
  • Next, the first indicator coordinate calculation unit 341 calculates the first indicator coordinates, which are the coordinates of the position on the first touch input surface 11 closest to the indicator, based on the position of the indicator acquired by the indicator position detection unit 33 (step S302).
  • Similarly, the second indicator coordinate calculation unit 342 calculates the second indicator coordinates, which are the coordinates of the position on the second touch input surface 21 closest to the indicator, based on the position of the indicator acquired by the indicator position detection unit 33 (step S303).
  • Thereafter, the determination unit 343 checks whether the first touch coordinates have been received from the first touch coordinate acquisition unit 31 (step S304). If the determination unit 343 has received the first touch coordinates (YES in step S304), it checks whether the second touch coordinates are received before the threshold time has elapsed since the reception of the first touch coordinates (step S305). If the second touch coordinates are not received before the threshold time elapses (NO in step S305), the determination unit 343 outputs the first touch coordinates received from the first touch coordinate acquisition unit 31 to the operation target device 40 (step S306).
  • On the other hand, if the second touch coordinates are received before the threshold time elapses (YES in step S305), the determination unit 343 executes the "validity determination process", which determines the validity of each of the first and second touches (step S307), and the process then returns to step S301.
  • FIG. 9 is a flowchart showing the validity determination process.
  • In the validity determination process, the determination unit 343 first calculates, from the history of the first indicator coordinates calculated by the first indicator coordinate calculation unit 341, the amount of change in the first indicator coordinates during the predetermined time immediately before the first touch was performed (step S401).
  • Next, the determination unit 343 calculates, from the history of the second indicator coordinates calculated by the second indicator coordinate calculation unit 342, the amount of change in the second indicator coordinates during the predetermined time immediately before the second touch was performed (step S402).
  • The determination unit 343 then compares the amount of change in the first indicator coordinates calculated in step S401 with the amount of change in the second indicator coordinates calculated in step S402 (step S403). If the amount of change in the first indicator coordinates is smaller (YES in step S404), the determination unit 343 determines that the first touch is valid and the second touch is invalid (step S405), and outputs only the first touch coordinates acquired from the first touch coordinate acquisition unit 31 to the operation target device 40 (step S406).
  • Conversely, if the amount of change in the second indicator coordinates is smaller (NO in step S404), the determination unit 343 determines that the second touch is valid and the first touch is invalid (step S407), and outputs only the second touch coordinates acquired from the second touch coordinate acquisition unit 32 to the operation target device 40 (step S408).
  • Returning to FIG. 8, if the first touch coordinates have not been received (NO in step S304), the determination unit 343 checks whether the second touch coordinates have been received from the second touch coordinate acquisition unit 32 (step S308).
  • If the determination unit 343 has received the second touch coordinates (YES in step S308), it checks whether the first touch coordinates are received before the threshold time has elapsed since the reception of the second touch coordinates (step S309). If the first touch coordinates are not received before the threshold time elapses (NO in step S309), the determination unit 343 outputs the second touch coordinates acquired from the second touch coordinate acquisition unit 32 to the operation target device 40 (step S310).
  • On the other hand, if the first touch coordinates are received before the threshold time elapses (YES in step S309), the determination unit 343 executes the validity determination process, which determines the validity of each of the first and second touches (step S311), and the process then returns to step S301.
  • Since the validity determination process of step S311 is the same as the validity determination process of step S307 (FIG. 9), its description is omitted here. If the second touch coordinates have not been received either (NO in step S308), that is, if neither the first touch coordinates nor the second touch coordinates have been received, the process returns to step S301.
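  • The overall flow of FIG. 8 can be summarized in the following Python sketch, which tracks the indicator coordinate histories each cycle and, when one touch arrives, waits up to the threshold time for the other touch before either forwarding the single touch or running the validity determination. All class and method names (`indicator_sensor.position()`, `panel.closest_point()`, `panel.read_touch()`, `panel.wait_touch()`, `target_device.send()`) and the numeric constants are assumptions for illustration; `determine_validity` is the sketch shown earlier.

```python
import time
from collections import deque

THRESHOLD_TIME = 0.15   # seconds; illustrative value, not taken from the patent
HISTORY_SPAN = 0.30     # the "predetermined time" over which indicator coordinates are kept

def main_loop(indicator_sensor, panel1, panel2, target_device):
    hist1, hist2 = deque(), deque()
    while True:
        now = time.monotonic()
        hand = indicator_sensor.position()                  # step S301 (assumed API)
        hist1.append((now, panel1.closest_point(hand)))     # step S302
        hist2.append((now, panel2.closest_point(hand)))     # step S303
        for h in (hist1, hist2):                            # keep only the recent history
            while h and now - h[0][0] > HISTORY_SPAN:
                h.popleft()

        t1 = panel1.read_touch()    # returns (time, coords) or None; step S304
        t2 = panel2.read_touch()    # step S308
        if t1 and not t2:
            t2 = panel2.wait_touch(timeout=THRESHOLD_TIME)   # step S305
        elif t2 and not t1:
            t1 = panel1.wait_touch(timeout=THRESHOLD_TIME)   # step S309

        if t1 and t2:                                        # steps S307 / S311
            winner = determine_validity(hist1, hist2)        # sketch shown earlier
            target_device.send(t1[1] if winner == "first" else t2[1])
        elif t1:
            target_device.send(t1[1])                        # step S306
        elif t2:
            target_device.send(t2[1])                        # step S310
```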
  • As described above, according to the touch operation determination device 30 of the first embodiment, when the first touch and the second touch are performed almost simultaneously, the validity of each of the first touch and the second touch is determined based on the time-series change in the position of the indicator before the first and second touches were performed (specifically, the amounts of change in the first indicator coordinates and the second indicator coordinates). It is therefore unnecessary to set a priority on each of the first touch input surface and the second touch input surface, and a bias in the determination of the validity of each of the first touch and the second touch is prevented. As a result, the determination of the validity of a touch can better match the operator's intention.
  • FIGS. 10 and 11 each show an example of the hardware configuration of the touch operation determination device 30.
  • Each element of the touch operation determination device 30 shown in FIG. 1 (the first touch coordinate acquisition unit 31, the second touch coordinate acquisition unit 32, the indicator position detection unit 33, and the validity determination unit 34) is realized by the processing circuit 50 shown in FIG. 10.
  • That is, the processing circuit 50 includes: the indicator position detection unit 33 that detects the position of the indicator in the space in which the touch input device 100 having the first touch input surface 11 and the second touch input surface 21 is disposed; the first touch coordinate acquisition unit 31 that acquires the first touch coordinates indicating the position of the first touch, which is a touch of the indicator on the first touch input surface 11; the second touch coordinate acquisition unit 32 that acquires the second touch coordinates indicating the position of the second touch, which is a touch of the indicator on the second touch input surface 21; and the validity determination unit 34 that, when the time between the first touch and the second touch is shorter than the predetermined threshold time, determines the validity of each of the first touch and the second touch based on the time-series change in the position of the indicator before the first touch and the second touch were performed.
  • Dedicated hardware may be applied to the processing circuit 50, or a processor that executes a program stored in a memory (also called a central processing unit (CPU), processing unit, arithmetic unit, microprocessor, microcomputer, or DSP (Digital Signal Processor)) may be applied.
  • When dedicated hardware is applied, the processing circuit 50 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination thereof.
  • Each function of each element of touch operation determination device 30 may be realized by a plurality of processing circuits, or those functions may be realized collectively by one processing circuit.
  • FIG. 11 shows the hardware configuration of the touch operation determination device 30 when the processing circuit 50 is configured using a processor.
  • the function of each element of the touch operation determination device 30 is realized by a combination of software or the like (software, firmware, or software and firmware).
  • Software and the like are described as a program and stored in the memory 52.
  • the processor 51 as the processing circuit 50 reads out and executes a program stored in the memory 52 to realize the functions of the respective units.
  • In other words, when executed by the processor 51, the program causes the touch operation determination device 30 to detect the position of the indicator in the space in which the touch input device 100 having the first touch input surface 11 and the second touch input surface 21 is disposed, to acquire the first touch coordinates indicating the position of the first touch on the first touch input surface 11 and the second touch coordinates indicating the position of the second touch on the second touch input surface 21, and, when the time between the first touch and the second touch is shorter than the predetermined threshold time, to determine the validity of each of the first touch and the second touch based on the time-series change in the position of the indicator before the touches were performed.
  • Here, the memory 52 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), a drive device therefor, or any storage medium to be used in the future.
  • The present invention is not limited to the above; some of the elements of the touch operation determination device 30 may be realized by dedicated hardware, and other elements may be realized by software or the like.
  • For example, the functions of some elements can be realized by the processing circuit 50 as dedicated hardware, while the functions of other elements can be realized by the processing circuit 50 as the processor 51 reading and executing a program stored in the memory 52.
  • the touch operation determination device 30 can realize the above-described functions by hardware, software, or the like, or a combination thereof.
  • FIG. 12 is a functional block diagram showing the configuration of the touch operation system according to the second embodiment.
  • The touch operation system shown in FIG. 12 differs from the configuration shown in FIG. 1 in that the first indicator coordinate calculation unit 341 and the second indicator coordinate calculation unit 342 of the validity determination unit 34 are replaced with a first incident angle calculation unit 344 and a second incident angle calculation unit 345, respectively.
  • The first incident angle calculation unit 344 calculates the incident angle of the indicator with respect to the first touch input surface 11 (hereinafter referred to as the "first incident angle") based on the time-series change in the position of the indicator detected by the indicator position detection unit 33.
  • Similarly, the second incident angle calculation unit 345 calculates the incident angle of the indicator with respect to the second touch input surface 21 (hereinafter referred to as the "second incident angle") based on the time-series change in the position of the indicator detected by the indicator position detection unit 33.
  • That is, the first incident angle calculation unit 344 calculates the angle θ1 formed by the movement direction of the operator's hand 500 and the first touch input surface 11 as the first incident angle, and the second incident angle calculation unit 345 calculates the angle θ2 formed by the movement direction of the operator's hand 500 and the second touch input surface 21 as the second incident angle.
  • As described above, the operator of a touch panel tends to press a finger perpendicularly onto the touch input surface. Therefore, when the operator tries to touch the first touch input surface 11, the operator's hand 500 moves substantially perpendicularly to the first touch input surface 11, so the first incident angle (θ1) becomes larger (closer to perpendicular). Although not illustrated, when the operator tries to touch the second touch input surface 21, the operator's hand 500 moves substantially perpendicularly to the second touch input surface 21, so the second incident angle (θ2) becomes larger.
  • Accordingly, when the time between the first touch on the first touch input surface 11 and the second touch on the second touch input surface 21 is shorter than the threshold time, the determination unit 343 compares the first incident angle at the time the first touch was performed with the second incident angle at the time the second touch was performed. If the first incident angle is larger, the determination unit 343 determines that the first touch is valid and the second touch is invalid; conversely, if the second incident angle is larger, it determines that the second touch is valid and the first touch is invalid.
  • In the second embodiment, the validity determination process shown in FIG. 14 is executed in step S307 and step S311.
  • In this process, the determination unit 343 first obtains the first incident angle at the time the first touch was performed, which is calculated by the first incident angle calculation unit 344 from the history of the position of the indicator (step S501).
  • Next, the determination unit 343 obtains the second incident angle at the time the second touch was performed, which is calculated by the second incident angle calculation unit 345 from the history of the position of the indicator (step S502).
  • The determination unit 343 then compares the first incident angle calculated in step S501 with the second incident angle calculated in step S502 (step S503). If the first incident angle is larger (YES in step S504), the determination unit 343 determines that the first touch is valid and the second touch is invalid (step S505), and outputs only the first touch coordinates acquired from the first touch coordinate acquisition unit 31 to the operation target device 40 (step S506). On the other hand, if the second incident angle is larger (NO in step S504), the determination unit 343 determines that the second touch is valid and the first touch is invalid (step S507), and outputs only the second touch coordinates acquired from the second touch coordinate acquisition unit 32 to the operation target device 40 (step S508).
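  • As a concrete illustration of this comparison, the sketch below estimates each incident angle from the most recent segment of the indicator's movement and the unit normal of each surface, then treats the touch with the larger (more nearly perpendicular) incident angle as valid. Representing each surface by its unit normal, estimating the movement direction from a single segment, and the tie-breaking behaviour are assumptions made for this sketch.

```python
import numpy as np

def incident_angle(prev_pos, curr_pos, surface_normal):
    """Angle (radians) between the indicator's movement direction and a surface.

    pi/2 means the indicator approached the surface perpendicularly;
    `surface_normal` is the surface's unit normal (assumed representation).
    """
    motion = np.asarray(curr_pos, float) - np.asarray(prev_pos, float)
    n = np.asarray(surface_normal, float)
    cos_to_normal = abs(np.dot(motion, n)) / (np.linalg.norm(motion) * np.linalg.norm(n))
    return float(np.arcsin(np.clip(cos_to_normal, 0.0, 1.0)))  # angle to the surface plane

def determine_validity_by_angle(theta1, theta2):
    """Steps S503-S508 in outline: the touch with the larger incident angle is valid."""
    if theta1 > theta2:      # YES in step S504
        return "first"
    return "second"          # NO in step S504 (including the tie case)
```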
  • When the first incident angle and the second incident angle are equal, either the first touch or the second touch may be invalidated.
  • For example, the first touch may always be invalidated, or the second touch may always be invalidated (in the flow of FIG. 14, step S504 results in NO when the angles are equal, so the first touch is always invalidated). A bias of this limited degree in the determination is considered acceptable.
  • Alternatively, the earlier of the first and second touches may be invalidated, or the later one may be invalidated.
  • As described above, according to the touch operation determination device 30 of the second embodiment, when the first touch and the second touch are performed almost simultaneously, the validity of each of the first touch and the second touch is determined based on the first and second incident angles calculated from the time-series change in the position of the indicator before the touches were performed. It is therefore unnecessary to set a priority on each of the first touch input surface and the second touch input surface, and a bias in the determination of the validity of each of the first touch and the second touch is prevented. As a result, the determination of the validity of a touch can better match the operator's intention.
  • FIG. 15 is a functional block diagram showing the configuration of the touch operation system according to the third embodiment.
  • The touch operation system of FIG. 15 is obtained by adding a first display screen 13 and a second display screen 23 to the touch input device 100 and a display control unit 35 to the touch operation determination device 30 in the configuration of FIG. 1.
  • the first display screen 13 and the second display screen 23 of the touch input device 100 are, for example, liquid crystal display devices.
  • the first touch input surface 11 of the first touch panel 10 is transparent and disposed on the first display screen 13.
  • Similarly, the second touch input surface 21 of the second touch panel 20 is transparent and disposed on the second display screen 23. That is, the first touch panel 10 and the first display screen 13, and the second touch panel 20 and the second display screen 23, each constitute a touch panel monitor having both an image display function and a touch operation function.
  • The display control unit 35 of the touch operation determination device 30 inputs the video signals output from the operation target device 40 to the first display screen 13 and the second display screen 23, and causes images to be displayed on the first display screen 13 and the second display screen 23.
  • Thus, the present invention is also applicable to a touch input device 100 provided with two touch panel monitors.
  • Although FIG. 15 shows an example in which the first display screen 13, the second display screen 23, and the display control unit 35 are added to the touch operation system of FIG. 1, the third embodiment is not limited to this; a combination with the touch operation system of Embodiment 2 (FIG. 12) is also possible.
  • Also in the third embodiment, there may be a plurality of operation target devices 40.
  • For example, a touch panel monitor consisting of the first touch panel 10 and the first display screen 13 (hereinafter referred to as the "first touch panel monitor") and a touch panel monitor consisting of the second touch panel 20 and the second display screen 23 (hereinafter referred to as the "second touch panel monitor") may be used to display the operation screens of different operation target devices 40.
  • For example, the first touch panel monitor may display the operation screen of a navigation device, and the second touch panel monitor may display the operation screen of an audio display device.
  • the first touch panel monitor and the second touch panel monitor may be used to display operation screens of different applications.
  • For example, the first touch panel monitor may display the operation screen of a navigation application, and the second touch panel monitor may display the operation screen of an application for playing video and music.
  • Alternatively, the first touch panel monitor and the second touch panel monitor may be used to display operation screens of the same application, that is, operation screens of different attributes of the same application.
  • For example, the first touch panel monitor may be used to display and operate the map of a navigation application, and the second touch panel monitor may display the operation screen for the facility search of the same application.
  • first display screen 13 and the second display screen 23 may be provided in the touch input device 100.
  • The touch input device 100 may also support non-contact gesture operation, in which the operator can input an operation without directly touching the first touch input surface 11 or the second touch input surface 21. It is unlikely that a normal (contact) touch operation is performed on the first touch input surface 11 or the second touch input surface 21 while the operator is performing a non-contact gesture operation.
  • Therefore, when the first touch on the first touch input surface 11 and the second touch on the second touch input surface 21 are detected while the operator is performing a non-contact gesture operation, the validity determination unit 34 may invalidate both of them.
  • The "threshold time" (the threshold of the time difference between the first touch and the second touch), which is the criterion for deciding whether to perform the validity determination process, may be changed according to the traveling state of the vehicle. For example, while the vehicle is traveling, an erroneous touch is likely to occur due to the shaking of the vehicle, so it is preferable to lengthen the threshold time and thereby increase the sensitivity with which erroneous touches are detected.
  • The threshold time may also be changed according to the traveling speed of the vehicle. The higher the traveling speed, the more difficult it is for the operator (driver) to direct attention to the operation of the touch input device 100, and the more likely an erroneous touch becomes. It is therefore preferable to lengthen the threshold time, and thereby increase the detection sensitivity for erroneous touches, as the traveling speed of the vehicle increases.
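  • As a toy illustration of this adjustment, the following sketch increases the threshold time with the vehicle speed; the specific values and the linear mapping are arbitrary assumptions, not values taken from the patent.

```python
def threshold_time(vehicle_speed_kmh, stopped_threshold=0.10, max_threshold=0.30):
    """Threshold time (s) used to decide whether two touches count as
    almost simultaneous.  Higher speed -> longer threshold, i.e. higher
    sensitivity to erroneous touches (all values are illustrative)."""
    if vehicle_speed_kmh <= 0.0:
        return stopped_threshold                     # vehicle stopped
    scale = min(vehicle_speed_kmh, 100.0) / 100.0    # saturate at 100 km/h
    return stopped_threshold + scale * (max_threshold - stopped_threshold)

print(threshold_time(0), threshold_time(60), threshold_time(120))  # approx. 0.1, 0.22, 0.3
```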
  • Although FIG. 2 shows an example in which the first touch input surface 11 and the second touch input surface 21 of the touch input device 100 are arranged vertically, they may be arranged horizontally.
  • At least one of the first touch input surface 11 and the second touch input surface 21 may be curved. Even when the first touch input surface 11 or the second touch input surface 21 has a curved shape, erroneous touch is likely to occur, and thus the application of the present invention is effective.
  • each embodiment can be freely combined, or each embodiment can be appropriately modified or omitted.
  • Reference signs: 100 touch input device, 10 first touch panel, 11 first touch input surface, 12 first touch coordinate detection unit, 13 first display screen, 20 second touch panel, 21 second touch input surface, 22 second touch coordinate detection unit, 23 second display screen, 30 touch operation determination device, 31 first touch coordinate acquisition unit, 32 second touch coordinate acquisition unit, 33 indicator position detection unit, 34 validity determination unit, 35 display control unit, 341 first indicator coordinate calculation unit, 342 second indicator coordinate calculation unit, 343 determination unit, 344 first incident angle calculation unit, 345 second incident angle calculation unit, 40 operation target device, 500 operator's hand.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A touch operation determination device (30) comprises: an indicator position detection unit (33) that detects the position of an indicator in a space in which a touch input device (100) is disposed, the touch input device being provided with a first touch input surface (11) and a first touch coordinate detection unit (12); and a validity determination unit (34) that determines the validity of a first touch on the first touch input surface (11) and of a second touch on a second touch input surface (21). When the time between the first touch and the second touch is shorter than a predetermined threshold time, the validity determination unit (34) determines the validity of the first touch and the second touch on the basis of the time-series change in the position of the indicator before the first touch and the second touch are performed.
PCT/JP2017/026010 2017-07-19 2017-07-19 Touch operation determination device and method for determining the validity of a touch operation WO2019016875A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2019530274A JP6639745B2 (ja) 2017-07-19 2017-07-19 Touch operation determination device and touch operation validity determination method
PCT/JP2017/026010 WO2019016875A1 (fr) 2017-07-19 2017-07-19 Touch operation determination device and method for determining the validity of a touch operation
CN201780093030.3A CN110869891B (zh) 2017-07-19 2017-07-19 Touch operation determination device and touch operation validity determination method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/026010 WO2019016875A1 (fr) 2017-07-19 2017-07-19 Touch operation determination device and method for determining the validity of a touch operation

Publications (1)

Publication Number Publication Date
WO2019016875A1 true WO2019016875A1 (fr) 2019-01-24

Family

ID=65015904

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/026010 WO2019016875A1 (fr) 2017-07-19 2017-07-19 Touch operation determination device and method for determining the validity of a touch operation

Country Status (3)

Country Link
JP (1) JP6639745B2 (fr)
CN (1) CN110869891B (fr)
WO (1) WO2019016875A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021163155A (ja) * 2020-03-31 2021-10-11 アルパイン株式会社 操作制御装置

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111642045A (zh) * 2020-05-29 2020-09-08 意诺科技有限公司 智能控制器及其控制方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5269648B2 (ja) * 2009-03-02 2013-08-21 パナソニック株式会社 携帯端末装置及び入力装置
JP5557316B2 (ja) * 2010-05-07 2014-07-23 Necカシオモバイルコミュニケーションズ株式会社 情報処理装置、情報生成方法及びプログラム
KR20130102298A (ko) * 2012-03-07 2013-09-17 주식회사 팬택 휴대용 단말기 및 휴대용 단말기의 디스플레이 제어 방법
JP5461735B2 (ja) * 2012-04-27 2014-04-02 パナソニック株式会社 入力装置、入力支援方法及びプログラム
JP6005417B2 (ja) * 2012-06-26 2016-10-12 株式会社東海理化電機製作所 操作装置
JP6221265B2 (ja) * 2013-03-04 2017-11-01 株式会社デンソー タッチパネル操作装置及びタッチパネル操作装置における操作イベント判定方法
JP2017084216A (ja) * 2015-10-30 2017-05-18 京セラドキュメントソリューションズ株式会社 入力処理装置、及びそれを備えた画像形成装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013025621A (ja) * 2011-07-22 2013-02-04 Nec Corp コンテンツ表示装置、情報処理装置、コンテンツ表示方法およびプログラム
JP2015064783A (ja) * 2013-09-25 2015-04-09 京セラドキュメントソリューションズ株式会社 タッチパネル装置、及びこれを備えた画像形成装置
JP2015232844A (ja) * 2014-06-10 2015-12-24 株式会社デンソー 入力装置
JP2016062183A (ja) * 2014-09-16 2016-04-25 キヤノン株式会社 情報処理装置、その制御方法、プログラム、及び記憶媒体
JP2016109505A (ja) * 2014-12-04 2016-06-20 トヨタ自動車株式会社 車両用表示装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021163155A (ja) * 2020-03-31 2021-10-11 アルパイン株式会社 操作制御装置
JP7378902B2 (ja) 2020-03-31 2023-11-14 アルパイン株式会社 操作制御装置

Also Published As

Publication number Publication date
CN110869891A (zh) 2020-03-06
JP6639745B2 (ja) 2020-02-05
JPWO2019016875A1 (ja) 2019-11-07
CN110869891B (zh) 2023-07-04

Similar Documents

Publication Publication Date Title
US9207801B2 (en) Force sensing input device and method for determining force information
US9678606B2 (en) Method and device for determining a touch gesture
JP5805890B2 (ja) タッチパネルシステム
US9195381B2 (en) Information processing apparatus, method for controlling the same, and storage medium to receive a touch operation for rotating a displayed image
US20120249440A1 (en) method of identifying a multi-touch rotation gesture and device using the same
WO2012129975A1 (fr) Procédé d'identification de geste de rotation et dispositif utilisant ce procédé
US20120249487A1 (en) Method of identifying a multi-touch shifting gesture and device using the same
JP6005563B2 (ja) タッチパネル装置、制御方法
JP6188998B2 (ja) タッチパネル制御装置及び車載情報機器
JP6639745B2 (ja) タッチ操作判定装置およびタッチ操作の有効性判定方法
JP6370118B2 (ja) 情報処理装置、情報処理方法、及びコンピュータプログラム
US8952934B2 (en) Optical touch systems and methods for determining positions of objects using the same
US10126856B2 (en) Information processing apparatus, control method for information processing apparatus, and storage medium
US20140320430A1 (en) Input device
JP6877546B2 (ja) タッチ操作判定装置およびタッチ操作の有効性判定方法
JP6112506B2 (ja) 携帯型電子機器
US10558270B2 (en) Method for determining non-contact gesture and device for the same
KR102030169B1 (ko) 차량용 터치 오입력 검출 장치 및 그 방법
KR102082696B1 (ko) 정보 처리장치 및 터치패널의 조작 관리방법
JP2013037481A (ja) 入力装置、情報処理装置、入力制御方法およびプログラム
TWI553531B (zh) 光學觸控裝置及觸控點座標之計算方法
JP2016115303A (ja) 操作検出装置
JP2017207868A (ja) 情報処理装置、入力操作の判別方法、コンピュータプログラム
JP2012234299A (ja) 携帯端末装置、入力制御方法、および入力制御プログラム
WO2018167813A1 (fr) Dispositif de détection de fonction de pavé tactile et procédé de détection de fonction de pavé tactile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17918293

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019530274

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17918293

Country of ref document: EP

Kind code of ref document: A1