TWI483141B - System and method for gesture recognition - Google Patents

System and method for gesture recognition

Info

Publication number
TWI483141B
Authority
TW
Taiwan
Prior art keywords
hand
image
index
centroid point
edge points
Prior art date
Application number
TW102100142A
Other languages
Chinese (zh)
Other versions
TW201428541A (en)
Inventor
Chih Yin Chiang
Che Wei Chang
Original Assignee
Chunghwa Picture Tubes Ltd
Priority date
Filing date
Publication date
Application filed by Chunghwa Picture Tubes Ltd
Priority to TW102100142A
Publication of TW201428541A
Application granted
Publication of TWI483141B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Description

Gesture recognition system and gesture recognition method

The invention relates to a recognition system and a recognition method, and more particularly to a gesture recognition system and a gesture recognition method.

Traditional human-computer interaction interfaces are the mouse, the keyboard, and the joystick. With the rapid development of technology, touch screens have recently been widely adopted in various electronic products as human-computer interaction interfaces. To make human-computer interaction more intuitive, motion-sensing control provides a new input method, one form of which is gesture recognition. Because gestures are a primitive and natural means of expression, they are one of the communication methods commonly used between people in daily life. Gesture recognition applied to human-machine interface design, healthcare, virtual reality, digital art creation, and game design has recently attracted growing attention from consumers.

The information used to recognize a gesture is mainly the movement trajectory of the hand. The system analyzes this gesture information to determine the user's gesture and performs human-computer interaction according to the different gestures. However, when the user performs gesture control continuously, the user must keep moving the arm so that the hand keeps moving. Constantly moving the arm fatigues the user and thus shortens the time the system can be used. Moreover, the position of the hand cannot be accurately held at a fixed point, so the system cannot be manipulated precisely. Therefore, a control scheme that does not significantly increase user fatigue and can be operated precisely is one of the development directions of gesture control systems.

The invention provides a gesture recognition system and a gesture recognition method that output gesture information according to the finger count of a hand in an image, so as to improve operation accuracy and reduce user fatigue.

The invention provides a gesture recognition system including an image sensor and an image processor. The image sensor provides a first image. The image processor is coupled to the image sensor and outputs gesture information according to the finger count of a hand in the first image.

The invention also provides a gesture recognition method, including: receiving a first image provided by an image sensor; and determining, through an image processor, the finger count of a hand in the first image to output gesture information.

In an embodiment of the invention, the image processor extracts a skin-color portion of the first image to determine whether a hand appears in the first image. When a hand appears in the first image, the image processor performs edge detection on the skin-color portion of the first image and calculates the centroid point of the skin-color portion as the centroid point of the hand, so as to determine the finger count of the hand according to a plurality of edge points of the skin-color portion and the centroid point of the hand.

In an embodiment of the invention, the image processor takes, as the finger count of the hand, the number of the edge points that are located vertically above the centroid point of the hand and whose distance from the centroid point of the hand is greater than a preset distance.

In an embodiment of the invention, the image processor determines whether the edge points correspond to real fingers, and takes, as the finger count of the hand, the number of the edge points that correspond to real fingers and whose distance from the centroid point of the hand is greater than a preset distance.

In an embodiment of the invention, the image processor determines whether the edge points correspond to real fingers according to the shapes of the edge points.

In an embodiment of the invention, the image processor determines whether the edge points correspond to real fingers according to the slope variation of the lines formed by the edge points.

In an embodiment of the invention, the image processor outputs position information of the centroid point of the hand.

In an embodiment of the invention, the image sensor further provides a second image, and the image processor outputs the gesture information according to the difference between the finger count of the hand in the first image and the finger count of the hand in the second image, where the time point of the first image is different from the time point of the second image.

In an embodiment of the invention, when the finger count of the hand in the first image is the same as the finger count of the hand in the second image, the image processor determines the moving direction of the centroid point of the hand according to the position of the centroid point of the hand in the first image and the position of the centroid point of the hand in the second image, and outputs the gesture information according to the finger count of the hand in the first image and the moving direction of the centroid point of the hand.

Based on the above, in the gesture recognition system and gesture recognition method of the embodiments of the invention, the image processor outputs gesture information according to the finger count in the first image, thereby improving operation accuracy and reducing user fatigue.

To make the above features and advantages of the invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.

FIG. 1 is a system diagram of a gesture recognition system according to an embodiment of the invention. Referring to FIG. 1, in this embodiment the gesture recognition system 100 includes an image sensor 110 and an image processor 120. The image sensor 110 converts received light into digital information and provides digitized image information (such as a first image IMG1 or a second image IMG2). The time point of the first image IMG1 precedes the time point of the second image IMG2, that is, the time point of the first image IMG1 is different from the time point of the second image IMG2, and the two time points may be separated by one unit time (for example, 0.5 second).
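As a concrete illustration only, a minimal Python sketch of grabbing two frames one unit time apart might look as follows; the OpenCV camera interface and the 0.5-second interval are assumptions for the example, not requirements of the patent:

```python
import time
import cv2  # assumed frame source; any camera backend would do

UNIT_TIME = 0.5  # seconds between the first and second image (example value from the text)

def capture_frame_pair(device=0, interval=UNIT_TIME):
    """Grab two frames separated by `interval` seconds (IMG1, IMG2)."""
    cap = cv2.VideoCapture(device)
    try:
        ok1, img1 = cap.read()      # first image (earlier time point)
        time.sleep(interval)        # wait one unit time
        ok2, img2 = cap.read()      # second image (later time point)
        if not (ok1 and ok2):
            raise RuntimeError("camera did not return two frames")
        return img1, img2
    finally:
        cap.release()
```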

The image processor 120 is coupled to the image sensor 110 and detects whether an image of a hand appears in the first image IMG1 and/or the second image IMG2. When an image of a hand appears in the first image IMG1 and/or the second image IMG2, the image processor 120 outputs gesture information IGES according to the finger count of the hand in the first image IMG1 and/or the second image IMG2. Alternatively, the image processor 120 may calculate the positions of the centroid point of the hand in the first image IMG1 and the second image IMG2, and output the gesture information IGES according to the finger count of the hand in the first image IMG1 and/or the second image IMG2 and the positions of the centroid point of the hand in the first image IMG1 and the second image IMG2. Furthermore, the image processor 120 may correspondingly output position information IPOS of the centroid point of the hand in the first image IMG1 and/or the second image IMG2.

More specifically, if the image processor 120 outputs the gesture information IGES according to a single image (such as the first image IMG1 or the second image IMG2), the image processor 120 may output the gesture information IGES according to the finger count of the hand in the first image IMG1 and may output the position information IPOS corresponding to the centroid point of the hand in the first image IMG1. Alternatively, the image processor 120 may output the gesture information IGES according to the finger count of the hand in the second image IMG2 and may output the position information IPOS corresponding to the centroid point of the hand in the second image IMG2.

If the image processor 120 outputs the gesture information IGES according to two adjacent images (such as the first image IMG1 and the second image IMG2), the image processor 120 may output the gesture information IGES according to the difference between the finger count of the hand in the first image IMG1 and the finger count of the hand in the second image IMG2. Furthermore, when the finger count of the hand in the first image IMG1 is the same as the finger count of the hand in the second image IMG2, the image processor 120 may determine the moving direction of the centroid point of the hand according to the position of the centroid point of the hand in the first image IMG1 and the position of the centroid point of the hand in the second image IMG2, and may output the gesture information IGES according to the finger count of the hand in the first image IMG1 or the second image IMG2 and the moving direction of the centroid point of the hand.

In an embodiment of the invention, when the movement of the centroid point of the hand is greater than a predetermined amount, the image processor 120 determines that the centroid point of the hand has moved; otherwise, the image processor 120 regards the centroid point of the hand as not having moved. The predetermined amount may depend on the size of the images (such as the first image IMG1 and the second image IMG2): for horizontal movement, the predetermined amount may be the product of the horizontal width of the image and a specific ratio (for example, 25%), and for vertical movement, the predetermined amount may be the product of the vertical height of the image and a specific ratio (for example, 25%). The moving direction of the centroid point and the specific ratio may be set by those skilled in the art, and the embodiments of the invention are not limited thereto.
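A minimal sketch of this movement test, assuming image coordinates with the y axis pointing downward and using the 25% ratio above as an example threshold; the direction labels are illustrative, not taken from the patent:

```python
def centroid_motion(c1, c2, img_w, img_h, ratio=0.25):
    """Classify the centroid movement between two frames.

    c1, c2: (x, y) centroid positions in IMG1 and IMG2.
    The 25% ratio mirrors the example threshold in the text.
    """
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    if abs(dx) >= abs(dy):                      # treat as horizontal motion
        if abs(dx) > ratio * img_w:
            return "right" if dx > 0 else "left"
    else:                                       # treat as vertical motion
        if abs(dy) > ratio * img_h:
            return "down" if dy > 0 else "up"   # image y grows downward
    return "still"                              # below threshold: regarded as not moved
```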

FIG. 2 is a schematic diagram of multiple gestures according to an embodiment of the invention. Referring to FIG. 1 and FIG. 2, in this embodiment the images 210, 220, 230, 240, 250, and 260 correspond to different gestures, and the first image IMG1 and the second image IMG2 may each be one of the images 210, 220, 230, 240, 250, and 260.

In this embodiment, the image processor 120 filters the colors to extract the skin-color portion of an image. If the image contains no skin-color portion, or the skin-color portion is too large or too small, it is determined that no hand appears in the image; otherwise, it is determined that a hand appears in the image. Then, when a hand appears in the image, the image processor 120 performs edge detection on the skin-color portion of the image and calculates the centroid point of the skin-color portion as the centroid point of the hand, so as to determine the finger count of the hand according to a plurality of edge points of the skin-color portion and the centroid point of the hand. Whether the skin-color portion is too large or too small may be decided by the proportion of the image it occupies; for example, a skin-color portion occupying more than 75% of the image is regarded as too large, and one occupying less than 15% is regarded as too small. The above is for illustration, and the embodiments of the invention are not limited thereto.
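The following Python sketch illustrates this stage under stated assumptions: the HSV skin range is a common heuristic not given in the patent, OpenCV 4 is assumed for the `findContours` return signature, and the convex hull is used only as one simple way to obtain a handful of candidate edge points (the patent does not say how points such as A1~A7 are selected):

```python
import cv2
import numpy as np

def find_hand_region(img_bgr, min_ratio=0.15, max_ratio=0.75):
    """Return a binary skin mask, or None when no plausible hand is present.

    The 15%/75% area bounds follow the example figures in the text.
    """
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))   # rough skin tones (heuristic)
    ratio = cv2.countNonZero(mask) / mask.size              # fraction of skin pixels
    if ratio < min_ratio or ratio > max_ratio:               # too small or too large
        return None                                          # treat as "no hand"
    return mask

def hand_edges_and_centroid(mask):
    """Candidate edge points of the skin region plus its centroid (OpenCV 4 assumed)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    hand = max(contours, key=cv2.contourArea)                # largest skin blob
    m = cv2.moments(hand)
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])    # centroid of the skin part
    hull = cv2.convexHull(hand)                              # hull corners as candidate edge points
    return hull.reshape(-1, 2), centroid
```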

For the image 210, the image processor 120 extracts the skin-color portion H1 of the image 210, and since the skin-color portion H1 is neither too large nor too small, the image processor 120 determines that a hand (i.e., the skin-color portion H1) appears in the image 210. Next, the image processor 120 detects at least the edge points A1~A7 and calculates the centroid point MA of the skin-color portion H1. Then, the image processor 120 determines whether the edge points A1~A7 are fingers of the hand according to the positions of the edge points A1~A7 relative to the centroid point MA and the distances between the edge points A1~A7 and the centroid point MA, so as to decide the finger count of the hand.

More specifically, since the fingers generally extend upward, a real finger is generally located above the centroid point MA in the vertical direction, and when a finger is extended its fingertip is far from the centroid point MA; that is, the distance between the fingertip and the centroid point MA is greater than a preset distance (for example, one third of the vertical height of the image). Accordingly, the image processor 120 takes, as the finger count of the hand, the number of the edge points A1~A7 that are located vertically above the centroid point MA of the hand and whose distance from the centroid point MA of the hand is greater than the preset distance; that is, the image processor 120 regards the edge points A1~A5 as fingers of the hand and decides that the finger count of the hand is 5.
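A sketch of this counting rule, assuming the candidate edge points and the centroid are already available (for instance from the helpers above) and using one third of the image height as the example preset distance:

```python
import numpy as np

def count_fingers(edge_points, centroid, img_h, min_dist_ratio=1/3):
    """Count edge points treated as fingertips.

    A point counts as a finger when it lies above the centroid (smaller y in
    image coordinates) and farther from the centroid than one third of the
    image height, following the example in the text.
    """
    cx, cy = centroid
    min_dist = img_h * min_dist_ratio
    count = 0
    for x, y in edge_points:
        above = y < cy                              # vertically higher than the centroid
        far = np.hypot(x - cx, y - cy) > min_dist   # farther than the preset distance
        if above and far:
            count += 1
    return count
```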

For the image 220, the image processor 120 extracts the skin-color portion H2 of the image 220, detects at least the edge points B1~B6, and calculates the centroid point MB of the skin-color portion H2. Then, following the determination applied to the image 210, the image processor 120 regards the edge points B1~B4 as fingers of the hand and decides that the finger count of the hand is 4. For the image 230, the image processor 120 extracts the skin-color portion H3 of the image 230, detects at least the edge points C1~C6, and calculates the centroid point MC of the skin-color portion H3. Then, following the determination applied to the image 210, the image processor 120 regards the edge points C2~C4 as fingers of the hand and decides that the finger count of the hand is 3.

For the image 240, the image processor 120 extracts the skin-color portion H4 of the image 240, detects at least the edge points D1~D6, and calculates the centroid point MD of the skin-color portion H4. Then, following the determination applied to the image 210, the image processor 120 regards the edge points D3 and D4 as fingers of the hand and decides that the finger count of the hand is 2. For the image 250, the image processor 120 extracts the skin-color portion H5 of the image 250, detects at least the edge points E1~E6, and calculates the centroid point ME of the skin-color portion H5. Then, following the determination applied to the image 210, the image processor 120 regards the edge point E4 as a finger of the hand and decides that the finger count of the hand is 1.

For the image 260, the image processor 120 extracts the skin-color portion H6 of the image 260, detects at least the edge points F1~F6, and calculates the centroid point MF of the skin-color portion H6. Then, following the determination applied to the image 210, the image processor 120 determines that no finger of the hand appears in the image 260 and decides that the finger count of the hand is 0.

According to the above, the image processor 120 may output the gesture information IGES according to the finger count of the hand in a single image (such as 210, 220, 230, 240, 250, or 260); that is, the image 210 corresponds to one gesture, the image 220 corresponds to another gesture, and so on. Alternatively, the image processor 120 may output the gesture information IGES according to the change in the finger count of the hand between two adjacent images (such as 210, 220, 230, 240, 250, and 260); that is, a change from the image 210 to the image 220 corresponds to one gesture, a change from the image 210 to the image 230 corresponds to another gesture, and so on. Furthermore, when the finger counts of the hand in two adjacent images (such as 210, 220, 230, 240, 250, and 260) are the same, the image processor 120 may output the gesture information IGES according to the finger count of the hand in the images and the moving direction of the centroid point of the hand; that is, two adjacent images both being the image 210 with the centroid point MA of the hand not moving corresponds to one gesture, while two adjacent images both being the image 210 with the centroid point MA of the hand moving up, down, left, or right corresponds to four further gestures respectively.
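A sketch of how such a mapping could be expressed; the gesture labels are purely illustrative, since the patent only states that each count, count change, or count-plus-direction combination corresponds to some gesture in the information IGES:

```python
def gesture_from_frames(count1, count2, motion="still"):
    """Map the finger counts of two consecutive frames to a gesture label."""
    if count1 != count2:
        # e.g. 5 -> 4 and 5 -> 3 are two different gestures
        return f"transition_{count1}_to_{count2}"
    if motion == "still":
        return f"hold_{count1}_fingers"
    # same count in both frames: the motion direction disambiguates
    return f"move_{count1}_fingers_{motion}"     # e.g. "move_5_fingers_left"
```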

In the above embodiment, the image processor 120 determines whether an edge point is a finger of the hand by comparing the vertical position of the edge point with the centroid point of the hand. However, in an embodiment of the invention, the image processor 120 may first determine whether the edge points of the skin-color portion correspond to real fingers, and take, as the finger count of the hand, the number of the edge points that correspond to real fingers and whose distance from the centroid point of the hand is greater than a preset distance.

Taking the image 210 as an example, the image processor 120 detects the edge points A1~A7 and calculates the centroid point MA of the skin-color portion H1. In terms of shape, the tip of a finger resembles the letter "U", and the edge points A1~A5 have shapes resembling the letter "U", so the edge points A1~A5 should correspond to real fingers, whereas the shapes of the edge points A6 and A7 have low similarity to the letter "U", so the edge points A6 and A7 should not correspond to real fingers. Alternatively, in terms of line slope, the slopes of the lines formed by the edge points A1~A5 vary greatly, so the edge points A1~A5 should correspond to real fingers, whereas the slopes of the lines formed by the edge points A6 and A7 vary little, so the edge points A6 and A7 should not correspond to real fingers. Moreover, since the edge points A1~A5 are far from the centroid point MA, the finger count of the hand in the image 210 is determined to be 5.
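A sketch of the slope-variation test, assuming each candidate edge point comes with a short ordered segment of surrounding contour points; the spread threshold is an assumed tuning value, not a figure from the patent:

```python
import numpy as np

def looks_like_fingertip(contour_segment, min_slope_spread=2.0):
    """Heuristic check that an edge-point segment curves like a 'U'.

    A fingertip segment bends sharply, so the slopes (here measured as
    angles) of successive point pairs spread over a wide range; a flat
    wrist or palm edge does not.
    """
    pts = np.asarray(contour_segment, dtype=float)
    if len(pts) < 3:
        return False                              # too few points to measure curvature
    dx = np.diff(pts[:, 0])
    dy = np.diff(pts[:, 1])
    slopes = np.arctan2(dy, dx)                   # angles avoid division by zero
    spread = slopes.max() - slopes.min()          # large spread -> strong curvature
    return spread > min_slope_spread
```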

FIG. 3 is a flowchart of a gesture recognition method according to an embodiment of the invention. Referring to FIG. 3, in this example a first image provided by an image sensor is received (step S310), and the finger count of a hand in the first image is determined through an image processor to output gesture information (step S320).

FIG. 4 is a detailed flowchart of step S320 of FIG. 3 according to an embodiment of the invention. Referring to FIG. 3 and FIG. 4, in this embodiment the skin-color portion of the first image is extracted through the image processor (step S410), and whether a hand appears in the first image is determined according to the extracted skin-color portion (step S420). When a hand appears in the first image, that is, the determination result of step S420 is "yes", edge detection is performed on the skin-color portion of the first image through the image processor, and the position of the centroid point of the skin-color portion is calculated as the centroid point of the hand (step S430). Next, the finger count of the hand is determined through the image processor according to the plurality of edge points of the skin-color portion and the centroid point of the hand (step S440), and the gesture information is output through the image processor according to the finger count of the hand in the first image (step S450). When no hand appears in the first image, that is, the determination result of step S420 is "no", the flow returns to step S310. Likewise, after step S450 the flow returns to step S310.
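Tying the helper sketches above together, one possible rendering of the FIG. 3/FIG. 4 flow for a single image (again an assumption-laden sketch reusing the earlier helpers, not the patented implementation) is:

```python
def recognize_single_image(img_bgr):
    """One pass of the Fig. 3/Fig. 4 flow, reusing the helpers defined above.

    Returns (gesture_info, centroid) or None when no hand is found, in which
    case the caller simply waits for the next frame (back to step S310).
    """
    mask = find_hand_region(img_bgr)                    # S410/S420: skin part, hand present?
    if mask is None:
        return None
    edge_pts, centroid = hand_edges_and_centroid(mask)  # S430: edge detection + centroid
    h = img_bgr.shape[0]
    count = count_fingers(edge_pts, centroid, h)        # S440: finger count from edge points
    return f"hold_{count}_fingers", centroid            # S450: gesture info + position info
```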

FIG. 5 is a flowchart of a gesture recognition method according to another embodiment of the invention. Referring to FIG. 5, in this example a first image and a second image provided by an image sensor are received, where the time point of the first image is different from the time point of the second image (step S510). Then, gesture information is output through an image processor according to the difference between the finger count of the hand in the first image and the finger count of the hand in the second image (step S520).

FIG. 6 is a detailed flowchart of step S520 of FIG. 5 according to another embodiment of the invention. Referring to FIG. 5 and FIG. 6, in this embodiment the skin-color portions of the first image and the second image are extracted through the image processor (step S610), and whether a hand appears in both the first image and the second image is determined according to the extracted skin-color portions (step S620). When a hand appears in both the first image and the second image, that is, the determination result of step S620 is "yes", edge detection is performed on the skin-color portions of the first image and the second image through the image processor, and the positions of the centroid points of the skin-color portions are calculated as the centroid points of the hand (step S630). Next, the finger count of the hand in the first image and the finger count of the hand in the second image are determined through the image processor according to the plurality of edge points of the skin-color portions and the centroid points of the hand (step S640). Then, whether the finger count of the hand in the first image is the same as the finger count of the hand in the second image is determined (step S650).

When the finger count of the hand in the first image is the same as the finger count of the hand in the second image, that is, the determination result of step S650 is "yes", the moving direction of the centroid point of the hand is determined through the image processor according to the position of the centroid point of the hand in the first image and the position of the centroid point of the hand in the second image, and the gesture information is output according to the finger count of the hand in the first image and the moving direction of the centroid point of the hand (step S660). When the finger count of the hand in the first image is different from the finger count of the hand in the second image, that is, the determination result of step S650 is "no", the gesture information is output through the image processor according to the difference between the finger count of the hand in the first image and the finger count of the hand in the second image (step S670). When no hand appears in one of the first image and the second image, or no hand appears in either of them, that is, the determination result of step S620 is "no", the flow returns to step S510. Likewise, after steps S660 and S670 the flow returns to step S510.
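Similarly, the FIG. 5/FIG. 6 flow for an image pair could be sketched by reusing the same helpers; the early return stands in for going back to step S510:

```python
def recognize_image_pair(img1, img2):
    """One pass of the Fig. 5/Fig. 6 flow (sketch reusing the helpers above)."""
    mask1, mask2 = find_hand_region(img1), find_hand_region(img2)
    if mask1 is None or mask2 is None:            # S620 "no": wait for new images
        return None                               # caller returns to step S510
    pts1, c1 = hand_edges_and_centroid(mask1)     # S630: edges + centroids of both images
    pts2, c2 = hand_edges_and_centroid(mask2)
    h, w = img1.shape[:2]
    n1 = count_fingers(pts1, c1, h)               # S640: finger counts of both images
    n2 = count_fingers(pts2, c2, h)
    if n1 == n2:                                  # S650 "yes" -> S660
        return gesture_from_frames(n1, n2, centroid_motion(c1, c2, w, h))
    return gesture_from_frames(n1, n2)            # S650 "no" -> S670
```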

The order of the above steps is for illustration, and the embodiments of the invention are not limited thereto. Details of the above steps may be found in the embodiments of FIG. 1 and FIG. 2 and are not repeated here.

In summary, in the gesture recognition system and gesture recognition method of the embodiments of the invention, the image processor may output gesture information according to the finger count in a single image, or according to the difference between the finger counts of the hand in two adjacent images. Moreover, when the finger counts of the hand in two adjacent images are the same, the image processor may output gesture information according to the finger count of the hand in the images and the moving direction of the centroid point of the hand. Thereby, operation accuracy can be improved and user fatigue can be reduced.

Although the invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Anyone with ordinary knowledge in the art may make modifications and refinements without departing from the spirit and scope of the invention, and the protection scope of the invention is therefore defined by the appended claims.

100‧‧‧gesture recognition system

110‧‧‧image sensor

120‧‧‧image processor

210, 220, 230, 240, 250, 260‧‧‧images

A1~A7, B1~B6, C1~C6, D1~D6, E1~E6, F1~F6‧‧‧edge points

H1~H6‧‧‧skin-color portions

IGES‧‧‧gesture information

IMG1‧‧‧first image

IMG2‧‧‧second image

IPOS‧‧‧position information

MA~MF‧‧‧centroid points

S310, S320, S410, S420, S430, S440, S450, S510, S520, S610, S620, S630, S640, S650, S660, S670‧‧‧steps

FIG. 1 is a system diagram of a gesture recognition system according to an embodiment of the invention.

FIG. 2 is a schematic diagram of multiple gestures according to an embodiment of the invention.

FIG. 3 is a flowchart of a gesture recognition method according to an embodiment of the invention.

FIG. 4 is a detailed flowchart of step S320 of FIG. 3 according to an embodiment of the invention.

FIG. 5 is a flowchart of a gesture recognition method according to another embodiment of the invention.

FIG. 6 is a detailed flowchart of step S520 of FIG. 5 according to another embodiment of the invention.

210, 220, 230, 240, 250, 260‧‧‧images

A1~A7, B1~B6, C1~C6, D1~D6, E1~E6, F1~F6‧‧‧edge points

H1~H6‧‧‧skin-color portions

MA~MF‧‧‧centroid points

Claims (16)

1. A gesture recognition system, comprising: an image sensor for providing a first image; and an image processor coupled to the image sensor to output gesture information according to a finger count of a hand in the first image, wherein the image processor extracts a skin-color portion of the first image to determine whether the hand appears in the first image, and when the hand appears in the first image, the image processor performs edge detection on the skin-color portion of the first image and calculates a centroid point of the skin-color portion as a centroid point of the hand, so as to determine the finger count of the hand according to a plurality of edge points of the skin-color portion and the centroid point of the hand.

2. The gesture recognition system of claim 1, wherein the image processor takes, as the finger count of the hand, the number of the edge points that are located vertically above the centroid point of the hand and whose distance from the centroid point of the hand is greater than a preset distance.

3. The gesture recognition system of claim 1, wherein the image processor determines whether the edge points correspond to real fingers, and takes, as the finger count of the hand, the number of the edge points that correspond to real fingers and whose distance from the centroid point of the hand is greater than a preset distance.

4. The gesture recognition system of claim 3, wherein the image processor determines whether the edge points correspond to real fingers according to the shapes of the edge points.

5. The gesture recognition system of claim 3, wherein the image processor determines whether the edge points correspond to real fingers according to the slope variation of the lines formed by the edge points.

6. The gesture recognition system of claim 1, wherein the image processor outputs position information of the centroid point of the hand.

7. The gesture recognition system of claim 1, wherein the image sensor further provides a second image, and the image processor outputs the gesture information according to the difference between the finger count of the hand in the first image and the finger count of the hand in the second image, wherein the time point of the first image is different from the time point of the second image.

8. The gesture recognition system of claim 7, wherein when the finger count of the hand in the first image is the same as the finger count of the hand in the second image, the image processor determines the moving direction of the centroid point of the hand according to the position of the centroid point of the hand in the first image and the position of the centroid point of the hand in the second image, and outputs the gesture information according to the finger count of the hand in the first image and the moving direction of the centroid point of the hand.

9. A gesture recognition method, comprising: receiving a first image provided by an image sensor; and determining, through an image processor, a finger count of a hand in the first image to output gesture information; wherein the step of determining the finger count of the hand in the first image comprises: extracting, through the image processor, a skin-color portion of the first image to determine whether the hand appears in the first image; when the hand appears in the first image, performing, through the image processor, edge detection on the skin-color portion of the first image and calculating the position of a centroid point of the skin-color portion as a centroid point of the hand; and determining, through the image processor, the finger count of the hand according to a plurality of edge points of the skin-color portion and the centroid point of the hand.

10. The gesture recognition method of claim 9, wherein the step of determining the finger count of the hand according to the edge points and the centroid point of the hand comprises: taking, through the image processor, as the finger count of the hand, the number of the edge points that are located vertically above the centroid point of the hand and whose distance from the centroid point of the hand is greater than a preset distance.

11. The gesture recognition method of claim 9, wherein the step of determining the finger count of the hand according to the edge points and the centroid point of the hand comprises: determining, through the image processor, whether the edge points correspond to real fingers, and taking, as the finger count of the hand, the number of the edge points that correspond to real fingers and whose distance from the centroid point of the hand is greater than a preset distance.

12. The gesture recognition method of claim 11, wherein the step of determining whether the edge points correspond to real fingers comprises: determining, through the image processor, whether the edge points correspond to real fingers according to the shapes of the edge points.

13. The gesture recognition method of claim 11, wherein the step of determining whether the edge points correspond to real fingers comprises: determining, through the image processor, whether the edge points correspond to real fingers according to the slope variation of the lines formed by the edge points.

14. The gesture recognition method of claim 9, further comprising: outputting, through the image processor, position information of the centroid point of the hand.

15. The gesture recognition method of claim 9, further comprising: receiving a second image provided by the image sensor, wherein the time point of the first image is different from the time point of the second image; and outputting, through the image processor, the gesture information according to the difference between the finger count of the hand in the first image and the finger count of the hand in the second image.

16. The gesture recognition method of claim 15, wherein the step of outputting the gesture information through the image processor according to the difference between the finger count of the hand in the first image and the finger count of the hand in the second image comprises: when the finger count of the hand in the first image is the same as the finger count of the hand in the second image, determining, through the image processor, the moving direction of the centroid point of the hand according to the position of the centroid point of the hand in the first image and the position of the centroid point of the hand in the second image, and outputting the gesture information according to the finger count of the hand in the first image and the moving direction of the centroid point of the hand.
TW102100142A 2013-01-03 2013-01-03 System and method for gesture recognition TWI483141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW102100142A TWI483141B (en) 2013-01-03 2013-01-03 System and method for gesture recognition

Publications (2)

Publication Number Publication Date
TW201428541A TW201428541A (en) 2014-07-16
TWI483141B true TWI483141B (en) 2015-05-01

Family

ID=51726099

Country Status (1)

Country Link
TW (1) TWI483141B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5454043A (en) * 1993-07-30 1995-09-26 Mitsubishi Electric Research Laboratories, Inc. Dynamic and static hand gesture recognition through low-level image analysis
US20030185445A1 (en) * 2002-03-29 2003-10-02 Industrial Technology Research Institute Method for extracting and matching gesture features of image
US7068843B2 (en) * 2002-03-29 2006-06-27 Industrial Technology Research Institute Method for extracting and matching gesture features of image
US20080037875A1 (en) * 2006-08-14 2008-02-14 Hye Jin Kim Method and apparatus for shoulder-line detection and gesture spotting detection
TW201101197A (en) * 2009-06-30 2011-01-01 Univ Nat Taiwan Science Tech Method and system for gesture recognition
TW201120681A (en) * 2009-12-10 2011-06-16 Tatung Co Method and system for operating electric apparatus

Also Published As

Publication number Publication date
TW201428541A (en) 2014-07-16


Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees