WO2017010593A1 - Gesture recognition device - Google Patents

Gesture recognition device

Info

Publication number
WO2017010593A1
WO2017010593A1 PCT/KR2015/007407
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
face
detection unit
motion
gesture detection
Prior art date
Application number
PCT/KR2015/007407
Other languages
English (en)
Korean (ko)
Inventor
임채열
Original Assignee
재단법인 다차원 스마트 아이티 융합시스템 연구단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 재단법인 다차원 스마트 아이티 융합시스템 연구단 filed Critical 재단법인 다차원 스마트 아이티 융합시스템 연구단
Priority to PCT/KR2015/007407 priority Critical patent/WO2017010593A1/fr
Publication of WO2017010593A1 publication Critical patent/WO2017010593A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The present invention relates to a gesture recognition device, and more particularly to a gesture recognition device that minimizes the power consumed for gesture recognition.
  • The present invention minimizes power consumption by performing the gesture recognition operation only when a user's motion and face are recognized before the gesture itself.
  • In a preferred embodiment, once the face is recognized, the invention minimizes the amount of computation for gesture detection by recognizing gestures only within a certain region (the face region, or a region of predetermined size spaced apart from the face region) determined from the face position.
  • Conventionally, a gesture recognition device that recognizes the gestures an operator makes toward a target object is combined with a display device such as a television receiver, personal computer, or tablet terminal.
  • With such a device, the operator does not need to wear special equipment such as a data glove, and can operate the display device smoothly using his or her hands or fingers.
  • In such a device, gesture recognition is performed using a Hidden Markov Model (HMM), continuous dynamic programming (DP), or the like.
  • Such systems maintain a powered-on state so that a gesture can be recognized at any time. Consequently, the system performs the gesture recognition operation even when there is no gesture input, and unnecessary power is consumed excessively.
  • The present invention has been made to solve the above problems. By detecting at low power whether a user's gesture is forthcoming, and running the gesture detection module only when gesture detection is actually needed, the total power consumed for gesture detection can be minimized.
  • An object of the present invention is to provide a gesture recognition device that operates in this way.
  • Another object of the present invention is to provide a gesture recognition device that minimizes the amount of computation for gesture detection and recognition by setting the gesture detection area to part of the image rather than the entire image, thereby reducing the power consumed for gesture recognition.
  • According to one aspect of the present invention, a gesture detection device includes a motion detector, a face detector, a gesture detector, and a result output unit. The motion detector detects a user's motion and, upon detecting motion, transmits a first wake-up signal to the face detector.
  • The face detector normally operates in a first state; when the first wake-up signal is received from the motion detector, it operates in a second state for a predetermined time, attempts to detect the user's face, and then returns to the first state. Upon detecting a face, it transmits a second wake-up signal to the gesture detector.
  • The gesture detector likewise operates in the first state; when the second wake-up signal is received from the face detector, it operates in the second state for a predetermined time, attempts to detect the user's gesture, and then returns to the first state. Upon detecting a gesture, it outputs a corresponding operation signal.
  • The result output unit performs an operation corresponding to the operation signal received from the gesture detector.
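The staged first-state/second-state wake-up chain described above can be sketched in Python. This is an illustrative sketch, not code from the patent; the class names, the frame representation, and the callback interface are all assumptions.

```python
from enum import Enum

class State(Enum):
    FIRST = "low-power"    # sleep / power-off state
    SECOND = "active"      # normal operating state

class Stage:
    """One detector in the chain: stays in the low-power first state
    until woken, runs its check in the second state, then goes back."""
    def __init__(self, name, detect_fn):
        self.name = name
        self.detect_fn = detect_fn
        self.state = State.FIRST

    def wake_and_detect(self, frame):
        self.state = State.SECOND          # wake up for a predetermined time
        result = self.detect_fn(frame)
        self.state = State.FIRST           # always return to the first state
        return result

def run_pipeline(frame, motion_fn, face_fn, gesture_fn):
    """Motion -> face -> gesture: each later stage is woken only when
    the earlier stage fires, so most of the time only the cheap motion
    sensor is doing any work."""
    motion = Stage("motion", motion_fn)
    face = Stage("face", face_fn)
    gesture = Stage("gesture", gesture_fn)
    if not motion.wake_and_detect(frame):
        return None                        # no motion: face/gesture never wake
    if not face.wake_and_detect(frame):
        return None                        # no face: gesture never wakes
    return gesture.wake_and_detect(frame)  # operation signal (or None)
```

Each stage spends almost all of its time in the low-power first state; a later stage is woken only by the stage before it, which is the power-saving point of the claimed structure.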
  • the first state may be an operating state that consumes less power than the second state.
  • the first state may be a power off state or a sleep state.
  • The face detection unit may determine that the user's face is detected only when the detected direction of the user's face is within a predetermined angle relative to the face detection unit.
  • the face detector may determine that the user's face is detected only when the detected user's face size is greater than or equal to the first threshold and less than or equal to the second threshold.
  • When the face detector detects a user's face, it selects the detected face as the gesture detection target and additionally transmits its location information to the gesture detector.
  • The gesture detector sets the user's gesture detection area based on the location information received from the face detector and performs gesture detection only within that area.
  • When a plurality of faces are detected, the face detection unit selects one of the faces as the gesture detection target and additionally transmits its location information to the gesture detection unit.
  • The gesture detection unit sets the user's gesture detection area based on the location information of the gesture detection target provided by the face detection unit and performs gesture detection only within that area.
  • the face detector may select a face closest to the motion information detected by the motion detector as a gesture detection target.
  • the face detector may select a face that overlaps all or a part of the motion information detected by the motion detector as a gesture detection target.
  • the gesture detection unit may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
  • the gesture detecting unit may set a region having a predetermined size spaced apart from the position of the gesture sensing target as a gesture sensing region.
  • the gesture detector may detect a shape and a movement direction of a user's gesture and output an operation signal corresponding thereto.
  • Such a gesture detection apparatus may further include an alarm unit configured to provide alarm information for a predetermined time when a face is detected through the face detection unit, indicating that the gesture detection unit is ready to detect the user's gesture.
  • According to another aspect of the present invention, a gesture detection device includes a motion detector, a face/gesture detector, and a result output unit. The motion detector detects a user's motion and transmits a wake-up signal to the face/gesture detector when motion is detected. When the wake-up signal is received from the motion detector, the face/gesture detector operates in a second state for a predetermined time, detecting the user's face and then the user's gesture.
  • The face/gesture detection unit outputs an operation signal corresponding to the detected gesture, and returns to a first state when no face is detected or no gesture is detected within the predetermined time. The result output unit performs an operation corresponding to the operation signal received from the face/gesture detector.
  • the first state may be an operating state that consumes less power than the second state.
  • The face/gesture detector may include: a face detector configured, upon receiving the wake-up signal from the motion detector, to detect a user's face for a predetermined time and select the detected face as the gesture detection target; and a gesture detection unit configured to receive information on the selected gesture detection target from the face detector, set the user's gesture detection region based on that information, and perform gesture detection only within the gesture detection region.
  • the gesture detection unit may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
  • According to yet another aspect of the present invention, a gesture detection device includes a motion/face detection unit, a gesture detection unit, and a result output unit. The motion/face detection unit detects a user's motion, detects the user's face when motion is detected, and transfers a wake-up signal to the gesture detector when a face is detected.
  • The gesture detector normally operates in a first state; when the wake-up signal is received from the motion/face detection unit, it operates in a second state for a predetermined time, attempts to detect the user's gesture, and then returns to the first state. Upon detecting a gesture, it outputs a corresponding operation signal.
  • The result output unit performs an operation corresponding to the operation signal received from the gesture detector.
  • the first state may be an operating state that consumes less power than the second state.
  • The motion/face detection unit selects the face detected when the user's face is detected as the gesture detection target and additionally transmits position information on the gesture detection target to the gesture detection unit.
  • The gesture detection unit sets the user's gesture detection area based on the location information of the gesture detection target received from the motion/face detection unit and performs gesture detection only within that area.
  • the gesture detection unit may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
  • The gesture recognition apparatus determines whether preparation for gesture input is complete by detecting the user's movement and face, and operates the gesture recognition module only when that preparation is complete, which reduces total power consumption and computation.
  • Since detection of human motion can be driven at low power, the apparatus can detect a person's motion (and face) at low power and then decide whether to perform the gesture recognition operation, which requires relatively more power; overall power consumption is thereby reduced.
  • In addition, a gesture can be detected using only part of the image information instead of all of it, which minimizes the amount of computation for gesture detection.
  • FIG. 1 is a view showing a gesture recognition device according to a first example of the present invention
  • FIG. 2 illustrates an example of determining that a face of a user is detected according to the present invention
  • FIG. 3 is a view illustrating an example of selecting one of the faces as a gesture detection target when detecting a plurality of faces according to the present invention
  • FIG. 4 illustrates an example of recognizing a gesture on a detected face area according to the present invention
  • FIG. 5 illustrates an example of recognizing a gesture on an area spaced a predetermined distance from a detected face area according to the present invention
  • FIG. 6 is a view showing an example of the shape of the gesture detected in accordance with the present invention.
  • FIG. 7 is a view showing a gesture recognition device according to a second example of the present invention.
  • FIG. 8 is a diagram illustrating a gesture recognizing apparatus according to a third example of the present invention.
  • The terms "first" and "second" may be used to describe various components, but the components should not be limited by these terms; the terms are used only to distinguish one component from another.
  • FIG. 1 is a diagram illustrating a gesture recognizing apparatus according to a first example of the present invention.
  • The gesture recognition apparatus may include a motion detecting unit 10, a face detecting unit 20, a gesture detecting unit 30, and a result output unit 40. Hereinafter, each component will be described in detail.
  • the motion detector 10 detects a user's motion (motion).
  • a human body sensor or an image sensor may be applied to the motion detector 10.
  • In the case of a human body detection sensor, information about the human body is acquired at regular intervals through a passive infrared (PIR) method, an active near-infrared method, ultrasonic waves, microwaves, or the like; when the difference between successively acquired readings exceeds a predetermined level, it may be determined that the user's motion has been detected.
  • PIR: passive infrared method
  • NIR: active near-infrared method
  • ultrasonic waves
  • microwaves
  • CMOS: complementary metal-oxide-semiconductor
  • CCD: charge-coupled device
  • IR image sensor
  • The image sensor may detect whether there is user motion based on a difference value or a matching result between the image frames acquired at predetermined intervals.
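As a rough illustration of this frame-difference approach (a sketch under assumed thresholds, not the patent's implementation), motion can be reported when enough pixels change between two grayscale frames:

```python
def motion_detected(prev_frame, curr_frame, pixel_delta=20, min_changed=4):
    """Compare two grayscale frames (lists of rows of pixel values) taken
    a predetermined interval apart; report motion when at least
    `min_changed` pixels differ by more than `pixel_delta`.
    Both thresholds are illustrative assumptions."""
    changed = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            if abs(p - c) > pixel_delta:
                changed += 1
    return changed >= min_changed
```

A real implementation would work on sensor frames (e.g. via an image library) rather than nested lists, but the decision rule — thresholded per-pixel difference plus a count threshold — is the same.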
  • When the motion detector 10 detects motion through any of the methods described above, it transmits a wake-up signal to the separate face detector 20.
  • the face detector 20 normally operates in a first state.
  • When the face detector 20 receives a wake-up signal from the motion detector 10, it operates in a second state for a predetermined time and attempts to detect the user's face.
  • Specifically, the face detector 20 detects the user's face within a certain region for a predetermined time and then returns to the first state.
  • a sleep state, a power off state, or the like may be applied to the first state, and a normal operating state may be applied to the second state.
  • the first state may mean an operating state that consumes a smaller amount of power than the second state.
  • The power-off state means that power to all components is turned off.
  • The sleep state is a state in which only some components are kept operating while the others are powered down, minimizing total power consumption.
  • When a wake-up signal is received, power is supplied to all components so that they all operate.
  • For example, as the first state, the face detector 20 may operate in a sleep state in which power is cut off to all components except the one that receives the wake-up signal from the motion detector 10. Upon receiving the wake-up signal, it supplies power to the other components (such as the module that detects the user's face) and, as the second state, performs the face detection operation.
  • Any component capable of detecting the shape of a human face may be applied to the face detector 20; for example, a human body detection sensor or an image sensor may be used.
  • The face detector 20 may determine that the user's face is detected only in the following cases.
  • The user's face is oriented within a predetermined angle relative to the face detector (i.e., the user is facing the front).
  • FIG. 2 is a diagram illustrating an example of determining that a face of a user is detected according to the present invention.
  • the face detector 20 may determine that the face of the user is detected only when the face of which the face of the user faces the front is detected. More specifically, the face detector 20 may determine that the face of the user is detected only when the detected direction of the face of the user is within a predetermined angle with respect to the face detector 20.
  • various face detection algorithms can be applied. For example, this may be implemented by learning a matching template for face recognition through a forward image.
  • the user's face may be detected by detecting the direction of the user's face based on the shape information of the face.
  • In other words, the face detector 20 determines whether a face is detected from the overall direction of the face, rather than locating the eyes in the image and computing a gaze direction, which reduces the overall amount of computation and the power consumed.
  • the face detector 20 may detect only a face of a user located within a certain distance.
  • The distance between the user and the face detector 20 may be measured using a separate sensor, with the face detector 20 performing the detection operation only when the distance value is within a predetermined range.
  • Alternatively, the distance between the user and the face detector 20 may be estimated from the face information detected by the face detector 20 itself, with face detection performed only when that distance value is within the predetermined range.
  • For example, the face detector 20 may count the number of pixels occupied by the face at the current resolution and treat the face as detected only when the count is within a predetermined range.
  • the face detector 20 may detect only a face of a user located within a predetermined range. In other words, the face detector 20 may detect only a face of a user spaced apart from the face detector by a minimum A distance or more and a maximum B distance or less.
  • The face detector 20 may determine that the user's face is detected only when the detected face size is greater than or equal to a first threshold and less than or equal to a second threshold. To this end, the face detector 20 may count the number of pixels of the face according to the resolution and determine that a face is detected only when the count is within a predetermined range.
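This size gate can be sketched as a simple pixel-count check (the threshold values below are illustrative assumptions, not values from the patent):

```python
def face_size_ok(face_w, face_h, min_pixels=40 * 40, max_pixels=200 * 200):
    """Accept a detected face only when its pixel count lies between the
    first threshold (face too small, user too far away) and the second
    threshold (face too large, user too close). Thresholds are assumed."""
    pixels = face_w * face_h
    return min_pixels <= pixels <= max_pixels
```

Because the pixel count of a face scales with its apparent size, this single check simultaneously enforces the minimum A distance and maximum B distance mentioned in the text.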
  • the face detecting unit 20 provides the gesture detecting unit 30 with the position information on the detected face.
  • The gesture detector 30 receives this information and sets the gesture detection area based on the position information, which reduces the amount of computation required to detect a gesture.
  • The gesture detector 30 is described in detail below.
  • the face detector 20 may select the detected face as a gesture detection target and provide location information thereof to the gesture detector 30.
  • When a plurality of faces are detected, the face detector 20 may select one of them as the gesture detection target according to a predetermined rule and provide its location information to the gesture detector 30.
  • FIG. 3 is a diagram illustrating an example of selecting one of the faces as a gesture detection target when detecting a plurality of faces according to the present invention.
  • As an example, the face detector 20 may select the face closest to the motion information detected by the motion detector 10 as the gesture detection target.
  • the face detector 20 may select the face information closest to the specific direction as the gesture detection target based on the detected motion information.
  • FIG. 3 illustrates an example of selecting, as the gesture detection target, the face located nearest in the upward direction from the detected motion; however, the face detector 20 according to the present invention is not limited to this embodiment.
  • the face detector 20 may select a face that overlaps all or a part of the motion information detected by the motion detector 10 as a gesture detection target. In other words, when the detected motion information is detected on a specific face, the face detector 20 may select a corresponding face as a gesture detection target and provide the position information to the gesture detector 30.
  • In this manner, the face detector 20 may select a gesture detection target, and the gesture detector 30, provided with this information, may perform the gesture detection operation based on the information on the detection target.
  • the gesture detector 30 normally operates in a first state.
  • When the gesture detector 30 receives a wake-up signal from the face detector 20, it operates in the second state for a predetermined time and attempts to detect the user's gesture. It performs gesture detection on the predetermined area for a predetermined time and then returns to the first state.
  • a sleep state, a power off state, or the like may be applied to the first state, and a normal operating state may be applied to the second state.
  • the first state may mean an operating state that consumes a smaller amount of power than the second state.
  • Like the face detector, the gesture detector 30 may operate, as the first state, in a sleep state in which power is cut off to all components except the one that receives the wake-up signal from the face detector 20. Upon receiving the wake-up signal, power is supplied to the other components (such as the module that detects the user's gesture) and gesture detection is performed.
  • all the components capable of detecting the gesture of the user's hand may be applied to the gesture detecting unit 30.
  • Preferably, a vision sensor may be applied to detect the user's gesture more accurately.
  • the gesture detecting unit 30 when the gesture detecting unit 30 receives the (location) information on the gesture detecting object from the face detecting unit 20, the gesture detecting unit 30 sets the gesture detecting region based on the information.
  • The gesture detection operation is then performed only within the gesture detection area. That is, instead of performing gesture detection on all of the acquired image information, the gesture detector 30 performs it on only part of the image, reducing both computation and power consumption.
  • FIG. 4 illustrates an example of recognizing a gesture on a detected face area according to the present invention.
  • the gesture detector 30 may detect a gesture input on an area on a face selected as a gesture detection target. To this end, the gesture detection unit 30 may set an area within a predetermined range as a gesture detection area based on the position of the gesture detection target.
  • FIG. 5 is a diagram illustrating an example of recognizing a gesture on an area spaced a predetermined distance from a detected face area according to the present invention.
  • the gesture detector 30 may set a region having a predetermined size spaced apart from a position of the gesture sensing object as a gesture sensing region.
  • FIG. 5 illustrates an example in which a region spaced a predetermined distance below the position of the gesture sensing object (the face) is set as the gesture sensing region, but the present invention is not limited to this embodiment.
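A gesture region of predetermined size spaced below the detected face, as in FIG. 5, might be computed like this (a sketch only; the gap, scale factors, and box layout are illustrative assumptions, with image y increasing downward):

```python
def gesture_region_below_face(face_box, gap=20, scale_w=2.0, scale_h=1.5):
    """Given a face bounding box (x, y, w, h), return a region of
    predetermined size spaced `gap` pixels below the face, centered
    horizontally under it — roughly where a hand gesture would appear.
    All parameters are illustrative assumptions."""
    x, y, w, h = face_box
    region_w = int(w * scale_w)             # hands sweep wider than the face
    region_h = int(h * scale_h)
    region_x = x + w // 2 - region_w // 2   # centered under the face
    region_y = y + h + gap                  # spaced below the face box
    return (region_x, region_y, region_w, region_h)
```

The gesture detector would then crop each frame to this box and run detection only on the crop, which is the computation-saving point of limiting detection to a sub-region.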
  • By setting the gesture detection area through the various methods described above and performing gesture detection only within that area, the gesture detector 30 can minimize the amount of computation required. When a gesture is then detected, an operation signal corresponding to the detected gesture is output to the result output unit 40.
  • the gesture detector 30 may detect a shape and a movement direction of a user's gesture and output an operation signal corresponding thereto.
  • the gesture detector 30 may detect the shapes of various gestures shown in FIG. 6 through image analysis.
  • The gesture detector 30 may also detect the movement direction of the detected gesture.
  • a moving direction of a hand may be detected using difference information between image data acquired at predetermined time intervals.
  • The movement direction of a hand may also be detected by performing block matching to obtain a motion vector for each block and then calculating the average of all the motion vectors.
  • the direction of the movement can be tracked by obtaining the optical flow and detecting which direction the optical flow is moving as a whole.
  • the present invention is not limited to the above embodiment, and other motion sensing methods may also be applied.
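The motion-vector-averaging approach above can be sketched as follows (illustrative only; a real implementation would obtain the per-block vectors from block matching or optical flow, and the left/right/up/down classification rule is an assumption):

```python
def dominant_direction(motion_vectors):
    """Average per-block motion vectors (dx, dy) and classify the overall
    hand movement as left/right/up/down, with image y growing downward.
    The classification picks whichever axis has the larger average motion."""
    n = len(motion_vectors)
    avg_dx = sum(dx for dx, _ in motion_vectors) / n
    avg_dy = sum(dy for _, dy in motion_vectors) / n
    if abs(avg_dx) >= abs(avg_dy):
        return "right" if avg_dx >= 0 else "left"
    return "down" if avg_dy >= 0 else "up"
```

Averaging over all blocks suppresses noisy individual vectors, so the result reflects the whole hand's overall motion rather than any single block.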
  • the result output unit 40 performs an operation corresponding to the operation signal received from the gesture detection unit 30.
  • To this end, the gesture detector 30 stores table information mapping each detected gesture shape and movement direction to a corresponding operation signal; using this table, the gesture detector 30 provides the appropriate operation signal to the result output unit 40.
  • For example, suppose gesture A is set to change the TV channel (increase or decrease the channel number), gesture B to adjust the TV volume (increase or decrease the volume), and gesture C to correspond to the TV's power on/off control operation.
  • In this case, if the user's gesture B is detected, the gesture detector 30 provides an operation signal for volume control (volume increase or decrease) to the result output unit 40, and the result output unit 40 performs the corresponding operation.
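The table lookup from gesture shape and movement direction to an operation signal might look like this (the mapping mirrors the A/B/C example in the text, but the table contents and signal names are otherwise a hypothetical sketch):

```python
# Hypothetical mapping from (gesture shape, movement direction) to a TV
# operation signal, following the A/B/C example: A = channel, B = volume,
# C = power toggle (no direction needed).
GESTURE_TABLE = {
    ("A", "up"): "channel_up",
    ("A", "down"): "channel_down",
    ("B", "up"): "volume_up",
    ("B", "down"): "volume_down",
    ("C", None): "power_toggle",
}

def operation_signal(shape, direction=None):
    """Look up the operation signal for a detected gesture shape and
    movement direction; returns None for unmapped combinations."""
    return GESTURE_TABLE.get((shape, direction))
```

The result output unit would receive the returned signal string (e.g. `"volume_up"`) and execute the corresponding device operation.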
  • the gesture sensing device may further include an alarm unit (not shown).
  • The alarm unit (not shown) may provide the user with an alarm message indicating that the gesture detector 30 is ready to detect the user's gesture.
  • Specifically, the alarm unit may provide alarm information for the predetermined time during which the gesture detector 30 performs gesture detection, informing the user that gesture detection is ready.
  • the gesture recognizing apparatus may include a motion detector 10, a face detector 20, and a gesture detector 30, each of which is distinguished.
  • The face detector 20 and the gesture detector 30 normally operate in a first state (a sleep state or a power-off state).
  • Each operates in a second state for a predetermined time only when it receives its wake-up signal, and then returns to the first state.
  • Accordingly, whether the user is preparing to input a gesture is determined through the low-power motion detector 10, and the face detector 20 and the gesture detector 30 are woken up to perform the detection operations only when the user is ready; as a result, the apparatus can be driven with low power as a whole.
  • the amount of computation required for the gesture detection may be reduced.
  • In one embodiment, the face detector 20 and the gesture detector 30 may operate as one component; in another embodiment, the motion detector 10 and the face detector 20 may operate as one component.
  • FIG. 7 is a diagram illustrating a gesture recognizing apparatus according to a second example of the present invention.
  • the face detector 20 and the gesture detector 30 of the first example may be integrated and operate as one component.
  • the PIR sensor may be applied to the motion detector 110, and the image sensor may be applied to the face / gesture detector 125.
  • the face / gesture detection unit 125 may be woken up only when a motion is detected, and thus may be implemented at low power.
  • FIG. 8 is a diagram illustrating a gesture recognizing apparatus according to a third example of the present invention.
  • the motion detector 10 and the face detector 20 of the first example may be integrated and operate as one component.
  • the PIR sensor may be applied to the motion / face detector 215 and the image sensor may be applied to the gesture detector 230.
  • The motion/face detection operation is performed using the PIR sensor, which can be driven at low power, and the device as a whole can be realized at low power by waking up the gesture detector 230 only when a face is detected.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a gesture recognition device for minimizing power consumption for gesture recognition. More specifically, the present invention relates to a gesture recognition device that minimizes power consumption by performing a gesture recognition operation only when a user's motion and face are recognized, and that minimizes the amount of computation for gesture detection by recognizing the gesture only within a predetermined region (a face region, or a region of predetermined size spaced a predetermined distance from the face region) based on the recognized face position once the face is recognized.
PCT/KR2015/007407 2015-07-16 2015-07-16 Dispositif de reconnaissance de geste WO2017010593A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/007407 WO2017010593A1 (fr) 2015-07-16 2015-07-16 Dispositif de reconnaissance de geste

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/007407 WO2017010593A1 (fr) 2015-07-16 2015-07-16 Dispositif de reconnaissance de geste

Publications (1)

Publication Number Publication Date
WO2017010593A1 true WO2017010593A1 (fr) 2017-01-19

Family

ID=57757992

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/007407 WO2017010593A1 (fr) 2015-07-16 2015-07-16 Dispositif de reconnaissance de geste

Country Status (1)

Country Link
WO (1) WO2017010593A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107422859A (zh) * 2017-07-26 2017-12-01 广东美的制冷设备有限公司 基于手势的调控方法、装置及计算机可读存储介质和空调
KR20200121513A (ko) * 2019-04-16 2020-10-26 경북대학교 산학협력단 동작 인식 기반 조작 장치 및 방법
KR102437979B1 (ko) * 2022-02-22 2022-08-30 주식회사 마인드포지 제스처 기반의 객체 지향적 인터페이싱 방법 및 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130229508A1 (en) * 2012-03-01 2013-09-05 Qualcomm Incorporated Gesture Detection Based on Information from Multiple Types of Sensors
WO2013133624A1 (fr) * 2012-03-06 2013-09-12 모젼스랩 주식회사 Appareil d'interface utilisant une reconnaissance de mouvement, et procédé destiné à commander ce dernier
KR20130109031A (ko) * 2012-03-26 2013-10-07 실리콤텍(주) 모션 제스처 인식 모듈 및 그것의 모션 제스처 인식 방법
EP2680191A2 (fr) * 2012-06-26 2014-01-01 Google Inc. Reconnaissance faciale
US20140368423A1 (en) * 2013-06-17 2014-12-18 Nvidia Corporation Method and system for low power gesture recognition for waking up mobile devices


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107422859A (zh) * 2017-07-26 2017-12-01 广东美的制冷设备有限公司 基于手势的调控方法、装置及计算机可读存储介质和空调
CN107422859B (zh) * 2017-07-26 2020-04-03 广东美的制冷设备有限公司 基于手势的调控方法、装置及计算机可读存储介质和空调
KR20200121513A (ko) * 2019-04-16 2020-10-26 경북대학교 산학협력단 동작 인식 기반 조작 장치 및 방법
KR102192051B1 (ko) * 2019-04-16 2020-12-16 경북대학교 산학협력단 동작 인식 기반 조작 장치 및 방법
KR102437979B1 (ko) * 2022-02-22 2022-08-30 주식회사 마인드포지 제스처 기반의 객체 지향적 인터페이싱 방법 및 장치

Similar Documents

Publication Publication Date Title
AU2014297039B2 (en) Auto-cleaning system, cleaning robot and method of controlling the cleaning robot
WO2017039308A1 (fr) Appareil d'affichage de réalité virtuelle et procédé d'affichage associé
WO2017119664A1 (fr) Appareil d'affichage et ses procédés de commande
EP3281058A1 (fr) Appareil d'affichage de réalité virtuelle et procédé d'affichage associé
WO2020059939A1 (fr) Dispositif d'intelligence artificielle
WO2020091505A1 (fr) Dispositif électronique et son procédé d'interaction intelligente
WO2018208093A1 (fr) Procédé de fourniture de rétroaction haptique et dispositif électronique destiné à sa mise en œuvre
EP3482341A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2017010593A1 (fr) Dispositif de reconnaissance de geste
EP3281089A1 (fr) Dispositif intelligent et procédé de fonctionnement de ce dernier
WO2021047070A1 (fr) Procédé et appareil de photographie de terminal, terminal mobile et support de stockage lisible
WO2009157654A2 (fr) Procédé et appareil de détection de mouvement et de position d'une pression et support d'enregistrement où est enregistré le programme destiné à mettre en œuvre le procédé
WO2020180051A1 (fr) Appareil électronique et son procédé de commande
WO2020076055A1 (fr) Dispositif électronique contenant un dispositif de saisie par crayon et procédé d'utilisation de celui-ci
WO2017023140A1 (fr) Dispositif et procédé pour gérer une puissance dans un dispositif électronique
WO2020141727A1 (fr) Robot de soins de santé et son procédé de commande
WO2018143509A1 (fr) Robot mobile et son procédé de commande
WO2017215354A1 (fr) Procédé et appareil de stockage de données de jauge de mesure
WO2021080171A1 (fr) Procédé et dispositif permettant la détection d'un port à l'aide d'un capteur inertiel
WO2015194697A1 (fr) Dispositif d'affichage vidéo et son procédé d'utilisation
WO2016080662A1 (fr) Procédé et dispositif de saisie de caractères coréens sur la base du mouvement des doigts d'un utilisateur
WO2018120717A1 (fr) Procédé et dispositif de commande de climatiseur
WO2019143122A1 (fr) Dispositif d'affichage, système d'affichage et procédé de commande associé
WO2022019582A1 (fr) Robot et procédé de commande associé
WO2020153691A1 (fr) Appareil électronique et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15898354

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15898354

Country of ref document: EP

Kind code of ref document: A1