TW201224459A - Human motion identification and locating method - Google Patents

Human motion identification and locating method Download PDF

Info

Publication number
TW201224459A
Authority
TW
Taiwan
Prior art keywords
user
acceleration
angular velocity
sensing signal
sensing
Prior art date
Application number
TW99142308A
Other languages
Chinese (zh)
Other versions
TWI422824B (en)
Inventor
Chi-Wen Teng
Tung-Wu Lu
Original Assignee
Inst Information Industry
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inst Information Industry filed Critical Inst Information Industry
Priority to TW99142308A priority Critical patent/TWI422824B/en
Publication of TW201224459A publication Critical patent/TW201224459A/en
Application granted granted Critical
Publication of TWI422824B publication Critical patent/TWI422824B/en

Links

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Navigation (AREA)

Abstract

An embodiment of the invention provides a method for identifying a human motion. The method comprises the following steps: receiving a first acceleration and a first angular velocity from a first inertial measurement unit worn at a location above the waist of the human; receiving a second angular velocity from a second inertial measurement unit worn on the human's lower leg or ankle; and determining the human's motion state according to the first acceleration and a variation of the first acceleration.

Description

201224459
NMI99043/0213-A42805-TW/final

VI. Description of the Invention:

[Technical Field]

The present invention relates to a method for identifying and locating human motion features, and in particular to a method for identifying and locating human motion features using inertial measurement devices.

[Prior Art]

Most current mobile positioning technologies rely on signals received from the global positioning system (GPS) or from a wireless local area network (wireless LAN) to provide positioning data.
However, when the GPS signal is blocked by buildings or forest, no positioning signal can be received, and the accuracy of wireless LAN positioning is severely affected by the complexity of the spatial geometry (for example, large crowds of visitors).

Inertial positioning, by contrast, detects the acceleration of a moving object to compute its displacement and is therefore unaffected by the external environment. However, the more accurate the accelerometer, the more expensive it is. Because accelerometer measurements contain errors, the displacement obtained by double-integrating the acceleration accumulates error over time. In addition, such environment-independent inertial positioning is mostly applied to machines such as aircraft and vehicles, whose acceleration and deceleration are pronounced and smooth; for agents with complex motion patterns (for example animals or humans), traditional inertial positioning cannot be applied.

[Summary of the Invention]

An embodiment of the present invention provides a human motion feature identification method for identifying a motion state of a user, comprising: receiving a first acceleration and a first angular velocity transmitted by a first inertial measurement device worn above the waist of the user; receiving a second angular velocity transmitted by a second inertial measurement device worn on a lower leg or an ankle of the user; and determining the motion state of the user according to the first acceleration and a variation of the first acceleration.
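The time growth of that double-integration error can be made concrete with a short numerical sketch (the bias value and the time step below are illustrative assumptions, not figures from the patent):

```python
def drift_error(bias, duration, dt=0.01):
    """Position error from double-integrating a constant accelerometer bias.

    Grows roughly as 0.5 * bias * t**2, i.e. quadratically with time.
    """
    v = x = 0.0  # velocity error, position error
    for _ in range(int(duration / dt)):
        v += bias * dt
        x += v * dt
    return x

e2 = drift_error(0.05, 2.0)    # roughly 0.1 m of drift after 2 s
e20 = drift_error(0.05, 20.0)  # roughly 10 m of drift after 20 s
```

A ten-fold longer walk thus yields a roughly hundred-fold larger position error, which is the motivation for the gait-model approach of this invention rather than raw double integration.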
Another embodiment of the present invention provides a human motion feature identification and positioning method, comprising: measuring, via a first inertial measurement device worn on or above the waist of a user, an acceleration in a vertical direction to obtain a first sensing signal and an angular velocity in a horizontal direction to obtain a second sensing signal; performing a heel-strike detection procedure according to the first sensing signal and the second sensing signal; performing a behavior identification procedure according to the first sensing signal and the second sensing signal to generate a behavior identification result; storing a plurality of data items between two heel strikes; receiving a third sensing signal transmitted by a second inertial measurement device worn on a lower leg or an ankle of the user; calculating a vertical angular velocity according to the third sensing signal; and calculating a movement distance of the user according to the stored data and the vertical angular velocity to generate positioning information.

[Detailed Description of the Embodiments]

The foregoing and other technical contents, features, and effects of the present invention will be clearly presented in the following detailed description of preferred embodiments with reference to the drawings. Directional terms mentioned in the following embodiments, such as up, down, left, right, front, or back, refer only to the directions in the accompanying drawings; they are used for illustration and are not intended to limit the present invention.

FIG. 1 is a schematic diagram of how the inertial measurement devices required by the human motion feature identification and positioning method of the present invention are worn.
In the present invention, the human motion feature identification and positioning method is carried out with two inertial measurement units (IMUs). A first inertial measurement device must be worn on the user's upper body (on or above the waist), for example at wearing positions 11 and 12, while a second inertial measurement device is worn on the user's lower leg or ankle, for example at wearing position 13. The first inertial measurement device measures a vertical acceleration and a first angular velocity in the horizontal direction of the user, which are used to determine the user's motion state. The second inertial measurement device measures a second angular velocity in the vertical direction of the user, from which the user's movement distance is determined as a basis for positioning. The motion states include ascending stairs, descending stairs, standing up, sitting down, and so on. Each inertial measurement device may be covered by a housing.

FIG. 2 is a flow chart of a human motion feature identification method according to the present invention. In step S21, an electronic device receives a first acceleration and a first angular velocity transmitted by the first inertial measurement device worn on or above the user's waist. In step S22, the electronic device receives a second angular velocity transmitted by the second inertial measurement device worn on the user's lower leg or ankle. The first acceleration is the user's vertical acceleration, the first angular velocity is the user's angular velocity in the horizontal direction, and the second angular velocity is an angular velocity in the vertical direction. In step S23, signal conversion is performed on the received first acceleration, first angular velocity, and second angular velocity; the conversion may be a wavelet transform or an analog-to-digital conversion. Then, in step S24, a motion state of the user is determined according to the first acceleration, the first angular velocity, and the second angular velocity.
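The determination of step S24, combined with the threshold tests described later for FIG. 5, can be sketched as follows. The threshold values and all names here are illustrative assumptions, since the patent does not disclose concrete numbers:

```python
def classify_motion(vertical_acc, antero_posterior_acc,
                    first_threshold=1.2, second_threshold=0.8):
    """Threshold cascade over waist-IMU features (thresholds are made up)."""
    if vertical_acc > first_threshold:
        return "descending stairs"
    if antero_posterior_acc > second_threshold:
        return "walking"
    # Remaining states (ascending stairs, standing up, sitting down) are
    # resolved by a motion-pattern comparison procedure.
    return "motion-pattern comparison"

states = [classify_motion(1.5, 0.2), classify_motion(0.4, 1.0),
          classify_motion(0.4, 0.3)]
```

The cascade returns "descending stairs", "walking", and "motion-pattern comparison" for the three sample inputs, mirroring the branch order of the flow chart.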
FIG. 3 is a schematic diagram of an embodiment of an inertial measurement device according to the present invention. The inertial measurement device 31, worn on or above the user's waist, comprises an acceleration sensor 32, an angular velocity sensor 33, a control unit 34, and a transmitting unit 35. The acceleration sensor 32 senses the user's vertical acceleration and generates a first sensing signal. The angular velocity sensor 33 senses the user's angular velocity in the horizontal direction and generates a second sensing signal. The control unit 34 receives the first sensing signal and the second sensing signal, generates a vertical acceleration value and a horizontal angular velocity value, and transmits them to a receiving device through the transmitting unit 35. The transmitting unit 35 communicates with the receiving device through a wireless network interface, a Bluetooth interface, an infrared interface, a radio-frequency interface, or another contactless connection.

FIG. 4 is a schematic diagram of another embodiment of an inertial measurement device according to the present invention. The inertial measurement device 41, worn on the user's lower leg or ankle, comprises an angular velocity sensor 42, a control unit 43, and a transmitting unit 44. The angular velocity sensor 42 measures the user's angular velocity in a vertical direction and generates a third sensing signal. The control unit 43 receives and converts the third sensing signal into a vertical angular velocity value and transmits it to a receiving device through the transmitting unit 44. The transmitting unit 44 communicates with the receiving device through a wireless network interface, a Bluetooth interface, an infrared interface, a radio-frequency interface, or another contactless connection.

FIG. 5 is a flow chart of an embodiment of a human motion feature identification method according to the present invention.
In step S51, a vertical acceleration is measured via the first inertial measurement device worn on or above the user's waist, and a first sensing signal is obtained. In step S52, signal conversion, such as a wavelet transform, is performed on the first sensing signal to obtain a vertical acceleration value. In step S53, it is determined whether the vertical acceleration value is greater than a first threshold value; if so, the user's motion state is determined to be a descending-stairs state; if not, the method proceeds to step S54. In step S54, an antero-posterior acceleration is measured and a second sensing signal is obtained. In step S55, signal conversion, such as a wavelet transform, is performed on the second sensing signal to obtain an antero-posterior acceleration value. In step S56, it is determined whether the antero-posterior acceleration value is greater than a second threshold value; if so, the user's motion state is determined to be a walking state; if not, the method proceeds to step S57, in which a motion-pattern comparison procedure is executed. In step S57, a plurality of comparison rules are used to determine the user's current motion state. In this embodiment, the preset comparison rules can determine whether the current motion state is an ascending-stairs state, a standing-up state, or a sitting-down state, but the invention is not limited thereto. The standing-up state refers to a user who was sitting and then stands up; the sitting-down state refers to a user who was standing and then sits down.

FIG. 6 is a flow chart of an embodiment of a human motion feature identification and positioning method according to the present invention. In step S61, a vertical acceleration is measured via the first inertial measurement device worn on or above the user's waist to obtain a first sensing signal, and a horizontal angular velocity is measured to obtain a second sensing signal. In step S62, noise filtering is performed on the first sensing signal and the second sensing signal. In step S63, signal conversion is performed on the filtered first and second sensing signals. In step S64, signal processing is performed on the converted signals to generate a plurality of processed values. In step S65, a heel-strike detection procedure is executed to produce a heel-strike detection result. In step S66, the converted signals are received and a behavior identification procedure is executed to identify the user's current motion state. In step S67, the identification result of the behavior identification procedure is received. In step S68, the user's walking distance between two heel strikes is calculated according to the identification result and the heel-strike detection result.

For a clearer explanation, please refer to FIG. 7, which is a flow chart of an embodiment of heel-strike detection according to the present invention. In step S71, a sensing signal corresponding to an acceleration is received from the inertial measurement device worn on the user's lower leg or ankle. In step S72, the sensing signal is sampled; the sampling frequency may be 50 Hz. In step S73, noise filtering is performed on the sampled signal. In step S74, a detection value is calculated from the filtered signal. In step S75, it is determined whether the detection value is greater than a threshold value. In this embodiment, the threshold value is 0.5 and the detection value lies between 0 and 1. If the detection value is not greater than 0.5, the method returns to step S71. If the detection value is greater than 0.5, a heel-strike signal is generated, and the data between two heel-strike signals are stored as a reference for the subsequent calculation of the user's movement distance.
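A minimal sketch of the FIG. 7 loop: sampling, smoothing, and thresholding at 0.5. The moving-average filter and the min-max normalization are illustrative stand-ins for the noise-filtering and detection-value steps, which the patent leaves unspecified:

```python
def detect_heel_strikes(samples, threshold=0.5, window=5):
    """Indices where the smoothed, normalized shank signal exceeds threshold.

    `samples` are raw lower-leg acceleration magnitudes (e.g. sampled at
    50 Hz). The moving average and the min-max normalization are assumed
    stand-ins for the filtering and detection-value computation of FIG. 7.
    """
    if len(samples) < window:
        return []
    smoothed = [sum(samples[i:i + window]) / window
                for i in range(len(samples) - window + 1)]
    lo, hi = min(smoothed), max(smoothed)
    span = (hi - lo) or 1.0          # detection values lie in [0, 1]
    return [i for i, s in enumerate(smoothed)
            if (s - lo) / span > threshold]

# One burst of shank acceleration yields one cluster of strike indices; the
# samples between two clusters would be buffered for the step-length step.
strikes = detect_heel_strikes([0.0] * 10 + [10.0] * 5 + [0.0] * 10)
```

In a real pipeline the clusters would be collapsed to single strike events before the inter-strike data are stored.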
In step S81 of FIG. 8, the data between two of the user's heel strikes are received. In step S82, the user's movement distance is calculated. In step S83, a direction of the user is calculated from an angular velocity value transmitted by the inertial measurement device worn on or above the user's waist and from an initial direction of the user. In step S84, a map correction procedure is performed: based on the user's direction and movement distance, it determines whether the user's heading has changed substantially, for example in a turn. If such a change is found, the procedure clears the user's previously recorded data and begins receiving new data, which improves the accuracy of the user's positioning. In step S85, the user's initial position, movement distance, and direction are converted into coordinate information, and in step S86 the user's position is displayed on the map according to that coordinate information.

FIG. 9A is a schematic diagram of an embodiment of calculating the user's movement distance. From an angular acceleration transmitted by the inertial measurement device worn on the user's lower leg or ankle, the included angles φ1 and φ2 can be calculated: φ1 is the angle between a plumb line perpendicular to the ground and the front leg, and φ2 is the angle between the plumb line and the rear leg.
In this embodiment, the user's step length is calculated using a mathematical model of human motion:

d_model = f(L, P, φ1, φ2, θ) = N(L·sin φ1 + L·sin φ2 + P·sin θ)

where the parameters are:
φ1: the angle between a plumb line perpendicular to the ground and the front leg
φ2: the angle between a plumb line perpendicular to the ground and the rear leg
θ: the rotation angle of the pelvis
N: the number of heel strikes
L, P: the leg length and the pelvis width

FIG. 9B is a schematic diagram of a user going upstairs. To calculate the user's movement distance when going upstairs, another mathematical model is used:

d = (L − 0.18)·tan(φ1 + φ2)

FIG. 9C is a schematic diagram of a user going downstairs. To calculate the user's movement distance when going downstairs, a further mathematical model is used:

d = 1.7·P·sin(φ1 + φ2)
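The three distance models above can be collected into one sketch (angles in radians, with L, P, θ, and N as defined for FIG. 9A). The stair formulas are reconstructed from a partly illegible source, so in particular the 1.7 coefficient of the descending model should be treated as an assumption:

```python
from math import sin, tan

def level_distance(n_strikes, leg, pelvis, phi1, phi2, theta):
    """FIG. 9A: d = N * (L*sin(phi1) + L*sin(phi2) + P*sin(theta))."""
    return n_strikes * (leg * sin(phi1) + leg * sin(phi2)
                        + pelvis * sin(theta))

def upstairs_distance(leg, phi1, phi2):
    """FIG. 9B (reconstructed): d = (L - 0.18) * tan(phi1 + phi2)."""
    return (leg - 0.18) * tan(phi1 + phi2)

def downstairs_distance(pelvis, phi1, phi2):
    """FIG. 9C (reconstructed): d = 1.7 * P * sin(phi1 + phi2);
    the 1.7 coefficient is an assumed reading of the source."""
    return 1.7 * pelvis * sin(phi1 + phi2)

# Two heel strikes with modest swing angles give a plausible walking distance.
d_level = level_distance(2, 0.9, 0.3, 0.25, 0.30, 0.10)
```

Since the level-ground model is linear in N, doubling the heel-strike count exactly doubles the distance, which is the property the heel-strike counter of FIG. 7 relies on.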

It should be noted that when the user is going up or down stairs, the angle between the rear leg and the plumb line perpendicular to the ground should be greater than the angle between the front leg and the plumb line, that is, φ2 > φ1. This property can therefore be used to design a judgment mechanism that avoids computing with erroneously received information. Please refer to FIG. 9D, a schematic diagram of an embodiment of a step-length calculation device according to the present invention. The step-length calculation unit 91 comprises an angle calculation device 92, a judgment unit 93, and a step-length calculation module 94. The angle calculation device 92 receives an angular acceleration transmitted by the inertial measurement device worn on the user's lower leg or ankle and calculates the angle φ1 between a plumb line perpendicular to the ground and the user's front leg and the angle φ2 between the plumb line and the user's rear leg. The judgment unit 93 first compares the magnitudes of φ1 and φ2. If φ1 is greater than φ2, the result of the angle calculation device 92 is erroneous and is not passed on. If φ1 is less than φ2, the result is correct and is passed to the step-length calculation module 94 to calculate the user's step length.

FIG. 10 is a flow chart of an embodiment of a user direction calculation method. In step S101, the user is first given an initial direction. In step S102, an angular velocity value transmitted by the inertial measurement device worn on or above the user's waist is received, and a turning angle of the user is calculated. In step S103, a current direction of the user is calculated according to the initial direction and the turning angle.

The foregoing are merely preferred embodiments of the present invention and are not intended to limit the scope of its implementation; all simple equivalent changes and modifications made according to the claims and the description of the invention remain within the scope covered by this patent. In addition, no single embodiment of the invention need achieve all of the features or advantages disclosed. Moreover, the abstract and the title serve only to assist in searching patent documents and are not intended to limit the scope of the rights of the invention.

[Brief Description of the Drawings]

FIG. 1 is a schematic diagram of how the inertial measurement devices required by the human motion feature identification and positioning method of the present invention are worn.
FIG. 2 is a flow chart of a human motion feature identification method according to the present invention.
FIG. 3 is a schematic diagram of an embodiment of an inertial measurement device according to the present invention.
FIG. 4 is a schematic diagram of another embodiment of an inertial measurement device according to the present invention.
FIG. 5 is a flow chart of an embodiment of a human motion feature identification method according to the present invention.
FIG. 6 is a flow chart of an embodiment of a human motion feature identification and positioning method according to the present invention.
FIG. 7 is a flow chart of an embodiment of heel-strike detection according to the present invention.
FIG. 8 is a flow chart of an embodiment of a user positioning method according to the present invention.
FIG. 9A is a schematic diagram of an embodiment of calculating the user's movement distance.
FIG. 9B is a schematic diagram of a user going upstairs.
FIG. 9C is a schematic diagram of a user going downstairs.
FIG. 9D is a schematic diagram of an embodiment of a step-length calculation device according to the present invention.
FIG. 10 is a flow chart of an embodiment of a user direction calculation method.

[Description of Reference Numerals]

11, 12, 13: wearing positions
31: inertial measurement device
32: acceleration sensor
33: angular velocity sensor
34: control unit
35: transmitting unit
41: inertial measurement device
42: angular velocity sensor
43: control unit
44: transmitting unit
91: step-length calculation unit
92: angle calculation device
93: judgment unit
94: step-length calculation module
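The bookkeeping of FIG. 8 and FIG. 10, together with the φ2 > φ1 validity check of FIG. 9D, can be sketched as follows (the function names and the sample step values are illustrative):

```python
from math import cos, sin, pi

def angles_are_valid(phi1, phi2):
    """FIG. 9D check: on stairs the rear-leg angle phi2 must exceed the
    front-leg angle phi1; otherwise the angle result is rejected."""
    return phi2 > phi1

def update_position(x, y, heading, turn_angle, step_length):
    """Steps S83-S85: fold the waist-gyro turn into the heading, then
    advance the coordinates by one step length along the new heading."""
    heading += turn_angle
    return (x + step_length * cos(heading),
            y + step_length * sin(heading),
            heading)

# Start at the origin facing +x, take a 0.7 m step, turn 90 degrees, step again.
x, y, h = update_position(0.0, 0.0, 0.0, 0.0, 0.7)
x, y, h = update_position(x, y, h, pi / 2, 0.7)
```

After the two steps the position is (0.7, 0.7) with the heading rotated by 90 degrees, which is the kind of large heading change the map correction procedure of step S84 would detect as a turn.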

Claims (1)

201224459 七、申請專利範圍: 1. 一種人體運動特徵辨識方法,用以辨識一使用者的 一運動狀態,包括: 接收配戴在該使用者的一腰部以上的一第一慣性量測 裝置傳送的一第一加速度與一第一角速度; 接收配戴在該使用者的一小腿或一腳騍上的一第二慣 性量測裝置傳送的一第二角速度;以及 根據該第一加速度與該第一加速度的一變化量,判斷 該使用者的該運動狀態。 2. 如申請專利範圍第1項所述之人體運動特徵辨識方 法,其中該第一加速度為一垂直加速度。 3. 如申請專利範圍第1項所述之人體運動特徵辨識方 法,其中該第一角速度為該使用者之一水平角速度。 4. 如申請專利範圍第1項所述之人體運動特徵辨識方 法,其中該第一慣性量測裝置包括: 一加速度感測器,用以感應該使用者的一垂直加速度 並產生一第一感測資料; 一角速度感測器,用以感應該使用者的一水平角速度 並產生一第二感測資料;以及 一控制單元,接收該第一感測資料與該第二感測資料 以產生該第一加速度與該第一角速度。 5. 如申請專利範圍第1項所述之人體運動特徵辨識方 法,其中該第一慣性量測裝置更包括一發送單元,用以傳 送該第一加速度與該第一角速度至一電子裝置。 6. 如申請專利範圍第1項所述之人體運動特徵辨識方 NMI99043/0213-A42805-TW/final 14 201224459 法,其中該第二慣性量測裝置包括: 一角速度感測器’用以感應該使用者的一垂直角速度 並產生一第三感測資料; 一控制單元,接收該第三感測資料以產生該第二角速 度。 7.如申請專利範圍第6項所述之人體運動特徵辨識方 法’其中該第一慣性量測裝置更包括一發送單元,用以傳 送該第二角速度至一電子裝置。 籲 8.如申請專利範圍第丨項所述之人體運動特徵辨識方 法,其中更包括: 以及 判斷該使用者 判斷該第一加速度是否大於一臨界值; 當該第一加速度大於該第一臨界值時, 的該運動狀態為一下樓梯狀態。 體運動特徵辨識方 9.如申清專利範圍第8項所述之人 法,其中更包括: 量測一中間加速度; 列斷該中間加 判斷該使用者 運動特徵辨識 當該第一加速度小於該第一臨界值時, 速度是否大於一第二臨界值;以及 當該中間加速度大於該第一臨界值時, 的該運動狀態為一走路狀態。 10.如申請專利範圍第9項所述之人 方法,其中更包括: 執行〜運動模 當該第一加速度大於該第一臨界值 式比對程序。 ' 11. 一種人體運動特徵辨識與定位方法,勺 NMI99043/0213-A42805-TW/final , * 201224459 透過配戴在使用者的腰部或腰部以上的一第一慣性量 測裝置量測一垂直方向的一加速度並得到一第一感測信號 以及量測一水平方向的一角速度並得到一第二感測信號; 根據該第一感測信號與該第二感測信號執行一腳跟偵 測程序; 根據該第一感測信號與該第二感測信號執行一行為辨 識程序以產生一行為辨識結果; 儲存兩次腳跟著地之間的複數個資料; 接收配戴在該使用者的一小腿或一腳踝上的一第二慣 性量測裝置傳送的一第三感測信號; 根據該第三感測信號計算一垂直角速度;以及 根據該等資料與該垂直角速度計算該使用者的一移動 距離並產生一定位資訊。 12. 如申請專利範圍第11項所述之人體運動特徵辨識 與定位方法,其中更包括: 設定該使用者的一初始方向; 根據該第一感測信號產生一垂直加速度; 根據該第二感測信號產生一水平角速度;以及 根據該水平角速度計算該使用者的一轉向角度; 根據該初始方向與該轉向角度產生該使用者的一目前 方向。 13. 如申請專利範圍第12項所述之人體運動特徵辨識 與定位方法,其中該定位資訊更依據該使用者的該初始方 向與目前方向所決定。 14. 如申請專利範圍第11項所述之人體運動特徵辨識 NMI99043/0213-A42805-TW/final 16 201224459 與定位方法,其中該腳跟偵測程序包括: 對該第一感測信號與該第二感測信號進行一取樣程序 以產生一第一取樣信號與一第二取樣信號; 根據該第一取樣信號與該第二取樣信號產生複數個偵 測值;以及 當該偵測值大於一臨界值時,產生一腳跟著地信號。 15.如申請專利範圍第14項所述之人體運動特徵辨識 與定位方法,其中該臨界值為0.5。 NMI99043/0213-A42805-TW/fmal 17201224459 VII. Patent Application Range: 1. 
A human body motion feature identification method for identifying a motion state of a user, comprising: receiving a first inertial measurement device worn on a waist of the user a first acceleration and a first angular velocity; receiving a second angular velocity transmitted by a second inertial measurement device mounted on a lower leg or an ankle of the user; and according to the first acceleration and the first The amount of change in acceleration determines the state of motion of the user. 2. The human motion feature recognition method of claim 1, wherein the first acceleration is a vertical acceleration. 3. The human motion feature recognition method of claim 1, wherein the first angular velocity is a horizontal angular velocity of the user. 4. The human motion characteristic identification method according to claim 1, wherein the first inertial measurement device comprises: an acceleration sensor for sensing a vertical acceleration of the user and generating a first sense Measuring data; a corner speed sensor for sensing a horizontal angular velocity of the user and generating a second sensing data; and a control unit receiving the first sensing data and the second sensing data to generate the The first acceleration is the first angular velocity. 5. The human motion feature recognition method of claim 1, wherein the first inertial measurement device further comprises a transmitting unit for transmitting the first acceleration and the first angular velocity to an electronic device. 6. The human motion feature recognition method NMI99043/0213-A42805-TW/final 14 201224459 according to claim 1, wherein the second inertial measurement device comprises: an angular velocity sensor 'to sense the a vertical angular velocity of the user and generating a third sensing data; a control unit receiving the third sensing data to generate the second angular velocity. 7. 
The human motion feature recognition method of claim 6, wherein the first inertial measurement device further comprises a transmitting unit for transmitting the second angular velocity to an electronic device. 8. The method of identifying a human motion feature according to the above application, further comprising: determining whether the user determines whether the first acceleration is greater than a threshold; and when the first acceleration is greater than the first threshold At the time, the state of motion is the state of the stairs. The method for identifying the motion of a body motion is as follows: 9. The method of claim 8, wherein the method further comprises: measuring an intermediate acceleration; determining the intermediate motion plus determining the motion characteristic of the user when the first acceleration is less than the The first threshold value, whether the speed is greater than a second threshold value; and when the intermediate acceleration is greater than the first threshold value, the motion state is a walking state. 10. The method of claim 9, wherein the method further comprises: performing a ~motional mode when the first acceleration is greater than the first threshold value comparison procedure. 11. 
A method for recognizing and locating human motion characteristics, spoon NMI99043/0213-A42805-TW/final, * 201224459 Measuring a vertical direction by a first inertial measuring device worn on the waist or waist of the user And accelerating a first sensing signal and measuring an angular velocity in a horizontal direction and obtaining a second sensing signal; performing a heel detection procedure according to the first sensing signal and the second sensing signal; The first sensing signal and the second sensing signal perform a behavior recognition program to generate a behavior recognition result; store a plurality of data between the heel strikes the ground; receive a calf or a worn on the user a third sensing signal transmitted by a second inertial measuring device on the pedal; calculating a vertical angular velocity according to the third sensing signal; and calculating a moving distance of the user from the vertical angular velocity according to the data and generating A positioning information. 12. The method for recognizing and locating a human motion feature according to claim 11, further comprising: setting an initial direction of the user; generating a vertical acceleration according to the first sensing signal; The measured signal produces a horizontal angular velocity; and a steering angle of the user is calculated based on the horizontal angular velocity; a current direction of the user is generated based on the initial direction and the steering angle. 13. The method for identifying and locating a human motion feature according to claim 12, wherein the positioning information is determined according to the initial direction and the current direction of the user. 14. 
The method for recognizing and locating human motion characteristics of claim 11, wherein the heel detection procedure comprises: performing a sampling process on the first sensing signal and the second sensing signal to generate a first sampling signal and a second sampling signal; generating a plurality of detection values according to the first sampling signal and the second sampling signal; and when a detection value is greater than a threshold, generating a heel-strike signal. 15. The method for recognizing and locating human motion characteristics of claim 14, wherein the threshold is 0.5.
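The threshold comparisons recited in claims 8 to 10 can be sketched as follows. This is a minimal illustration, not the patented implementation: the numeric threshold values, the function and variable names, and the "stationary" fallback are all assumptions, since the claims do not disclose concrete values.

```python
STAIR_THRESHOLD = 1.5  # hypothetical first threshold (g); not disclosed in the claims
WALK_THRESHOLD = 0.3   # hypothetical second threshold (g); not disclosed in the claims

def classify_motion_state(first_acceleration: float,
                          intermediate_acceleration: float) -> str:
    """Classify a motion state in the manner of claims 8 and 9.

    Compare the first (vertical) acceleration against a first threshold
    to detect a stair state; otherwise compare an intermediate
    acceleration against a second threshold to detect a walking state.
    """
    if first_acceleration > STAIR_THRESHOLD:
        # Claim 10 additionally performs a motion-mode comparison
        # procedure in this branch; omitted from this sketch.
        return "stairs"
    if intermediate_acceleration > WALK_THRESHOLD:
        return "walking"
    return "stationary"

if __name__ == "__main__":
    print(classify_motion_state(2.0, 0.1))  # stairs
    print(classify_motion_state(0.5, 0.8))  # walking
```

The two-stage comparison keeps the stair test (large vertical acceleration) ahead of the walking test, matching the claim dependency order.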
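Claim 12's heading update (initial direction plus a steering angle derived from the horizontal angular velocity) can be sketched as a simple discrete integration. The fixed sample interval `dt` and the wrap to one full turn are assumptions for illustration; the claim does not specify how the steering angle is computed from the angular velocity.

```python
import math

def current_direction(initial_direction_rad: float,
                      horizontal_angular_velocity: list,
                      dt: float) -> float:
    """Return the user's current direction per claim 12.

    Integrates horizontal angular-velocity samples (rad/s) over a fixed
    sample interval dt into a steering angle, adds it to the initial
    direction, and wraps the result into [0, 2*pi).
    """
    steering_angle = sum(w * dt for w in horizontal_angular_velocity)
    return (initial_direction_rad + steering_angle) % (2.0 * math.pi)
```

With no gyroscope samples the heading stays at the initial direction, which matches the claim's "generating a current direction according to the initial direction and the steering angle".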
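The heel detection procedure of claims 14 and 15 can be sketched as: down-sample both sensing signals, combine each sample pair into a detection value, and emit a heel-strike event when the detection value exceeds the 0.5 threshold of claim 15. The down-sampling step and the product-magnitude combination are illustrative assumptions; the claims do not specify how the detection values are formed.

```python
THRESHOLD = 0.5  # the threshold value recited in claim 15

def detect_heel_strikes(first_signal: list, second_signal: list,
                        step: int = 2) -> list:
    """Sketch of the heel detection procedure of claim 14.

    Performs a sampling process on both sensing signals, generates a
    detection value per sampled pair (here: |a * b|, an assumed
    combination), and returns the down-sampled indices at which a
    heel-strike signal would be generated.
    """
    sampled_a = first_signal[::step]
    sampled_b = second_signal[::step]
    strikes = []
    for i, (a, b) in enumerate(zip(sampled_a, sampled_b)):
        detection_value = abs(a * b)
        if detection_value > THRESHOLD:
            strikes.append(i)  # index in the down-sampled domain
    return strikes
```

The returned indices mark candidate heel strikes; in the method of claim 11, the data stored between two such strikes then feed the moving-distance calculation.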
TW99142308A 2010-12-06 2010-12-06 Human motion identification and locating method TWI422824B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW99142308A TWI422824B (en) 2010-12-06 2010-12-06 Human motion identification and locating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW99142308A TWI422824B (en) 2010-12-06 2010-12-06 Human motion identification and locating method

Publications (2)

Publication Number Publication Date
TW201224459A true TW201224459A (en) 2012-06-16
TWI422824B TWI422824B (en) 2014-01-11

Family

ID=46725870

Family Applications (1)

Application Number Title Priority Date Filing Date
TW99142308A TWI422824B (en) 2010-12-06 2010-12-06 Human motion identification and locating method

Country Status (1)

Country Link
TW (1) TWI422824B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7092216B2 (en) * 2019-01-24 2022-06-28 富士通株式会社 Information processing program, information processing method and information processing system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI348639B (en) * 2005-12-16 2011-09-11 Ind Tech Res Inst Motion recognition system and method for controlling electronic device
DE102006018545B4 (en) * 2006-04-21 2009-12-31 Andrea Wimmer Pedometer for four-legged friends
US20080146968A1 (en) * 2006-12-14 2008-06-19 Masuo Hanawaka Gait analysis system
KR100834723B1 (en) * 2007-05-14 2008-06-05 팅크웨어(주) Method and apparatus for decide vertical travel condition using sensor
PT103933A (en) * 2008-01-17 2009-07-17 Univ Do Porto PORTABLE DEVICE AND METHOD FOR MEASURING AND CALCULATING DYNAMIC PARAMETERS OF PEDESTRIAN LOCOMOTION
CN201387660Y (en) * 2009-04-28 2010-01-20 中国科学院合肥物质科学研究院 Automatic detecting and alarming system for human body falling-over
CN101579238B (en) * 2009-06-15 2012-12-19 吴健康 Human motion capture three dimensional playback system and method thereof
CN101894252B (en) * 2010-03-29 2012-12-05 天津大学 Walking movement classification method based on triaxial acceleration transducer signals

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI468646B (en) * 2012-07-11 2015-01-11 Univ Nat Cheng Kung Calculation method of step distance of computing device
CN110260860A (en) * 2019-06-20 2019-09-20 武汉大学 Indoor moving measurement and positioning method for determining posture and system based on foot inertial sensor
CN110260860B (en) * 2019-06-20 2023-06-23 武汉大学 Indoor movement measurement positioning and attitude determination method and system based on foot inertial sensor

Also Published As

Publication number Publication date
TWI422824B (en) 2014-01-11

Similar Documents

Publication Publication Date Title
CN106815857B (en) Gesture estimation method for mobile auxiliary robot
EP2936066B1 (en) Swing compensation in step detection
Cho et al. Autogait: A mobile platform that accurately estimates the distance walked
TWI538662B (en) Body motion detector device and control method of the same
TWI468646B (en) Calculation method of step distance of computing device
JP6127873B2 (en) Analysis method of walking characteristics
JP2015062654A (en) Gait estimation device, program thereof, stumble risk calculation device and program thereof
CN106767790B (en) The method that human body lower limbs motion model merges estimation pedestrian's mobile tracking with Kalman filtering
US9159213B1 (en) Motion detection method and device
US20140257766A1 (en) Adaptive probabilistic step detection for pedestrian positioning
TW201214292A (en) Automatic identifying method for exercise mode
US20210093917A1 (en) Detecting outdoor walking workouts on a wearable device
JP2016150193A (en) Motion analysis device
WO2015182304A1 (en) Information processing device, information processing method, and computer program
TW201224459A (en) Human motion identification and locating method
JP2015014587A (en) Information processor, position determination method and position determination program
Gowda et al. UMOISP: Usage mode and orientation invariant smartphone pedometer
JP6416722B2 (en) Step counting device, step counting method, and program
JP2015224932A (en) Information processing device, information processing method, and computer program
JP2008054768A (en) Pedometer device and pedometer measuring method
JP6578874B2 (en) Walking cycle detection method and detection apparatus
KR101995482B1 (en) Motion sensing method and apparatus for gait-monitoring
KR20190120638A (en) Apparatus for pedestrian dead reckoning using shoe model and method for the same
JP6054905B2 (en) Path shape determination device, exercise support system, and program
JPWO2016063661A1 (en) Information processing apparatus, information processing method, and program