CN213403447U - Bluetooth headset capable of recognizing gestures - Google Patents

Bluetooth headset capable of recognizing gestures

Info

Publication number
CN213403447U
CN213403447U
Authority
CN
China
Prior art keywords
sensor
earphone
chip
angular velocity
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202022659118.9U
Other languages
Chinese (zh)
Inventor
张磊
张彤
马亚卿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Gewu Liangzhi Information Technology Co ltd
Original Assignee
Xi'an Gewu Liangzhi Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Gewu Liangzhi Information Technology Co ltd filed Critical Xi'an Gewu Liangzhi Information Technology Co ltd
Priority to CN202022659118.9U priority Critical patent/CN213403447U/en
Application granted granted Critical
Publication of CN213403447U publication Critical patent/CN213403447U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The utility model relates to the technical field of electronic information and discloses a Bluetooth headset capable of recognizing gestures. A sensing unit comprising an acceleration sensor and an angular velocity sensor is arranged in the Bluetooth headset; a control unit is connected with the sensing unit through an FPC (flexible printed circuit) board; a storage unit is connected with the control unit through an FPC board. The acceleration sensor and the angular velocity sensor acquire acceleration and angular velocity data as the headset moves with the head. The control unit receives and processes these data, determines the direction of movement from the processed data, increments a count when the data are greater than or equal to a counting threshold, and determines the movement posture of the headset when the count exceeds a trigger threshold. When the headset stops moving, the posture is taken as a single posture; when the headset continues to move, the next posture is detected until the headset stops moving, and the several single postures are fused into a complex posture. The control instruction corresponding to the single or complex posture is then looked up in the posture number-instruction mapping table and sent to the communication terminal.

Description

Bluetooth headset capable of recognizing gestures
Technical Field
The utility model relates to the technical field of electronic information, and in particular to a Bluetooth headset capable of recognizing gestures.
Background
Compared with an ordinary earphone, a Bluetooth headset transmits audio and control signals between devices over wireless Bluetooth instead of an ordinary signal cable, which greatly improves portability.
In an existing Bluetooth headset, the neck carries a button and a charging interface, and the control-chip circuit board is mounted from the bottom upwards so that the circuit board aligns with the charging interface; however, when the major axis of the inner diameter of the earphone shell is 5.5 mm or less and the minor axis is 4.5 mm or less, a control chip with a larger package, or a circuit board carrying more units, cannot be mounted.
In addition, existing Bluetooth headsets are controlled through keys: adjustment and control are possible only by pressing or touching the keys of the headset by hand, and the communication terminal cannot be controlled when a button is not pressed fully, which makes them inconvenient to use.
SUMMARY OF THE UTILITY MODEL
The utility model provides a Bluetooth headset capable of recognizing gestures, and a control method thereof, which solve the problems that controlling a Bluetooth headset through buttons is inconvenient and that the information interaction is inaccurate.
The utility model is realized through the following scheme:
A Bluetooth headset capable of recognizing gestures comprises a loudspeaker and a charging interface, wherein the loudspeaker is packaged at the top of the earphone shell through a top cover and the charging interface is installed at the bottom of the earphone shell through a bottom cover; inside the earphone shell there are also a sensing unit arranged above the charging interface, a control unit arranged at the arc-shaped bend of the earphone shell, a Bluetooth module and a power supply unit connected to the control unit, and a storage unit arranged at the loudspeaker;
the sensing unit comprises an acceleration sensor and an angular velocity sensor which are respectively used for acquiring acceleration and angular velocity data when the earphone moves along with the head;
the storage unit comprises a memory for storing processing data;
the input end of the control unit is connected to the sensing unit and the storage unit through FPC (flexible printed circuit) boards; it takes in the acceleration and angular velocity data collected by the sensing unit, calls the processing data to process them, and then outputs a control instruction; the output end is connected to the Bluetooth module and outputs the control instruction to the communication terminal;
the power supply unit is a power supply battery, one end of the power supply unit is connected to the charging interface, and the other end of the power supply unit is connected to the control unit.
Further, in the sensing unit the acceleration sensor and the angular velocity sensor are packaged together in a gyroscope.
Furthermore, the sensing unit also comprises a plurality of light sensors, arranged on the storage unit, the control unit and the sensing unit respectively, with a light-sensing hole provided in the earphone shell at the position of each light sensor; each light sensor measures the distance between the earphone and the ear through its light-sensing hole to judge whether the earphone is in the ear, and once the earphone is in the ear it raises an interrupt and sends a signal to the control unit.
Further, the light sensor includes a first light sensor, a second light sensor, and a third light sensor; the first light sensor is arranged on the sensing unit; the second light sensor and the third light sensor are respectively arranged on the storage unit and the control unit.
Furthermore, the first light sensor, the second light sensor and the third light sensor all adopt infrared sensors.
Further, the gyroscope uses chip U7, which is arranged on the reverse side of the sensing unit; the first, second and third light sensors use chips U4, U5 and U6, arranged on the front of the sensing unit, the reverse side of the storage unit and the reverse side of the control unit respectively; the front of the control unit carries the control chip U2; the memory uses chip U3, which is arranged on the front of the storage unit.
Further, the chip U7 of the gyroscope is connected to PIO [54], PIO [53] of the control chip U2 through pins 14 and 13; the first photo-sensor chip U4, the second photo-sensor chip U5 and the third photo-sensor chip U6 are respectively connected to a pin K3 of the control chip U2 through input ends of peripheral circuits, and VOUT ends of the first photo-sensor chip U4, the second photo-sensor chip U5 and the third photo-sensor chip U6 are respectively connected to pins G3, J2 and J3 of the control chip U2; pins E3, F2 and H2 of the control chip U2 are respectively connected to pins 1, 2 and 3 of a chip U3 of the FLASH memory; the positive and negative electrodes of the battery are connected to the P1 pin and the P2 pin of the control chip U2, respectively.
The utility model has the following advantages:
1) The acceleration sensor and the angular velocity sensor acquire acceleration and angular velocity data while the earphone is moving. The control unit receives and processes the data, judges the current movement direction from the processed data, and compares the processed data with the set acceleration threshold and angular velocity threshold; it counts when the data are greater than or equal to the thresholds, compares the count with the trigger threshold, and, when the count is greater than the trigger threshold, triggers an instruction and determines the single posture of the earphone's movement. A single posture is output when the earphone stops moving; when the earphone keeps moving, the judgment continues to obtain the next posture until the earphone stops, and the several postures are fused into a complex posture. The control instruction corresponding to the single or complex posture is determined from the posture number-instruction mapping table and sent to the communication terminal. This replaces the traditional key-press adjustment, and the two-threshold method reduces the user's misoperation rate and improves control accuracy;
2) The light sensors detect the distance between the earphone and the head, which makes it convenient to determine when the earphone has been placed in the ear and indirectly ensures the accuracy with which the acceleration sensor and the angular velocity sensor measure the earphone's acceleration and angular velocity.
Drawings
FIG. 1 is a schematic diagram of the internal structure of a Bluetooth headset;
FIG. 2 is a layout view of the front side of each unit circuit board;
FIG. 3 is a layout view of the reverse side of each unit circuit board;
FIG. 4 is a schematic circuit diagram of a Bluetooth headset;
FIG. 5 is a circuit diagram of a Bluetooth headset;
FIG. 6 is a flow chart of a Bluetooth headset control method;
FIG. 7 is a flow chart of single gesture detection;
FIG. 8 is a graph of a gyroscope;
In the figures: 1 - top cover; 2 - loudspeaker; 3 - storage unit; 4 - control unit; 5 - sensing unit; 6 - charging interface; 7 - bottom cover; 8 - Bluetooth module; 9 - power supply unit; U2 - main control chip; U4 - first light sensor chip; U5 - second light sensor chip; U6 - third light sensor chip; U7 - gyroscope chip.
Detailed Description
The present invention will be described in detail with reference to the following embodiments.
Example 1
A Bluetooth headset capable of recognizing gestures is shown in figure 1 and comprises a loudspeaker 2 packaged at the top of a headset shell through a top cover 1 and a charging interface 6 installed at the bottom of the headset shell through a bottom cover 7; the earphone shell also comprises a sensing unit 5 arranged above the charging interface 6, a control unit 4 arranged at the arc bending part of the earphone shell, a Bluetooth module 8 and a power supply unit 9 connected to the control unit, and a storage unit 3 arranged at the loudspeaker 2;
the sensing unit 5 comprises an acceleration sensor and an angular velocity sensor which are respectively used for collecting acceleration and angular velocity data when the earphone moves along with the head, and the acceleration sensor and the angular velocity sensor are packaged by a gyroscope;
The sensing unit 5 further comprises a plurality of light sensors, arranged on the storage unit 3, the control unit 4 and the sensing unit 5 respectively, with a light-sensing hole provided in the earphone shell at the position of each light sensor; each light sensor measures the distance between the earphone and the ear through its light-sensing hole to judge whether the earphone is in the ear, and once the earphone is in the ear it raises an interrupt and sends a signal to the control unit;
the light sensors include a first light sensor, a second light sensor and a third light sensor; the first light sensor is arranged on the sensing unit 5; the second optical sensor and the third optical sensor are respectively arranged on the storage unit 3 and the control unit 4; the first optical sensor, the second optical sensor and the third optical sensor are all infrared sensors.
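As a minimal firmware sketch of this in-ear detection (the threshold value, the driver function ir_read_distance_mm and the callback notify_control_unit are assumptions for illustration, not taken from the patent):

```c
#include <stdbool.h>
#include <stdint.h>

#define IN_EAR_DISTANCE_MM 5U                    /* assumed proximity threshold, not from the patent */

extern uint16_t ir_read_distance_mm(void);       /* hypothetical infrared-sensor driver */
extern void notify_control_unit(bool in_ear);    /* hypothetical signal to the control chip */

static volatile bool g_in_ear = false;

/* Interrupt handler for one light sensor: read the distance through the
 * light-sensing hole, decide whether the earphone is in the ear, and
 * signal the control unit only when the state changes. */
void light_sensor_isr(void)
{
    bool in_ear = (ir_read_distance_mm() <= IN_EAR_DISTANCE_MM);

    if (in_ear != g_in_ear) {
        g_in_ear = in_ear;
        notify_control_unit(in_ear);
    }
}
```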
The storage unit 3 includes a memory for storing processing data;
as shown in fig. 4, the input end of the control unit 4 is connected to the sensing unit and the storage unit through the FPC flexible boards, inputs the acceleration and angular velocity data collected by the sensing unit, calls the processing data to process the acceleration and angular velocity data, and outputs a control instruction; the output end is connected with the Bluetooth module 8 and outputs a control instruction to the communication terminal;
specifically, the control unit receives and processes acceleration and angular velocity data, and judges the current movement direction of the earphone according to the processed data; when the data is larger than or equal to the threshold value, counting; when the number of counts is greater than the trigger threshold, determining a single gesture of the movement of the earphone; when the earphone stops moving, judging the earphone to be in a single posture; when the earphone continues to move, the next gesture is continuously determined until the earphone stops moving, and the multiple gestures are fused to judge the gesture as a complex gesture; determining a control instruction corresponding to a single attitude or a complex attitude according to the attitude number-instruction mapping table, and sending the control instruction to the communication terminal;
the power supply unit 9 is a power supply battery, one end of the power supply battery is connected to the charging interface 6, and the other end of the power supply battery is connected to the control unit 4.
As shown in fig. 2 and 3, the chip adopted by the gyroscope is U7, and is arranged on the reverse side of the sensing unit; chips adopted by the first optical sensor, the second optical sensor and the third optical sensor are respectively U4, U5 and U6 and are respectively arranged on the front surface of the sensing unit, the back surface of the storage unit and the back surface of the control unit; the front surface of the control unit is provided with a control chip U2 with the model of QCC 3020; the memory adopts a chip U3, which is arranged on the front surface of the memory unit.
As shown in fig. 5, the chip U7 of the gyroscope is connected to the PIO [54], PIO [53] of the control chip U2 through pins 14 and 13; the first photo-sensor chip U4, the second photo-sensor chip U5 and the third photo-sensor chip U6 are respectively connected to a pin K3 of the control chip U2 through input ends of peripheral circuits, and VOUT ends of the first photo-sensor chip U4, the second photo-sensor chip U5 and the third photo-sensor chip U6 are respectively connected to pins G3, J2 and J3 of the control chip U2; pins E3, F2 and H2 of the control chip U2 are respectively connected to pins 1, 2 and 3 of a chip U3 of the FLASH memory; the positive and negative electrodes of the battery are connected to the P1 pin and the P2 pin of the control chip U2, respectively.
The specific process used is as follows:
the Bluetooth earphone is placed at the position of an ear opening, the light sensor collects the distance between the earphone and the ear through the light sensing hole and judges whether the earphone is inserted into the ear, after the earphone is inserted into the ear, interruption is caused, and a signal is sent to the control unit 4; after the earphone is put into the ear, the earphone moves along with the change of the head posture, and the acceleration sensor and the angular velocity sensor respectively acquire the acceleration original data and the angular velocity original data when the earphone moves; the control chip receives and processes the acceleration original data and the angular velocity original data into acceleration data and angular velocity data, and judges the current motion direction according to the processed data; comparing the processed data with a set acceleration threshold and a set angular velocity threshold, and counting when the data is greater than or equal to the threshold; when the number of counts is greater than the trigger threshold, determining a single gesture of the movement of the earphone; when the earphone stops moving, judging the earphone to be in a single posture; when the earphone continues to move, the next gesture is continuously determined until the earphone stops moving, and the multiple gestures are fused to judge the gesture as a complex gesture; and determining a control instruction corresponding to a single attitude or a complex attitude according to the attitude number-instruction mapping table, and sending the control instruction to the communication terminal.
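A condensed sketch of the overall flow just described is given below; every type and function name is an illustrative placeholder, not the patent's actual firmware API, and the details of each step are expanded in Example 2.

```c
#include <stdbool.h>

typedef struct { float ax, ay, az, gx, gy, gz; } motion_t;   /* processed data */
typedef int pose_t;      /* a detected single posture (placeholder) */
typedef int command_t;   /* a control instruction (placeholder) */

extern bool      headset_in_ear(void);               /* set by the light sensor interrupt */
extern bool      headset_is_moving(void);
extern motion_t  imu_read_and_convert(void);         /* raw data plus the conversion formulas of Example 2 */
extern bool      double_threshold_fired(const motion_t *m);   /* counting threshold, then trigger threshold */
extern pose_t    classify_single_pose(const motion_t *m);
extern pose_t    fuse_poses(const pose_t *poses, int n);      /* several single postures -> complex posture */
extern command_t map_pose_to_command(pose_t p);               /* posture number-instruction mapping table */
extern void      bluetooth_send(command_t cmd);               /* to the communication terminal */

void gesture_control_task(void)
{
    if (!headset_in_ear())
        return;                                 /* only detect gestures while the earphone is worn */

    pose_t poses[8];
    int n = 0;
    while (headset_is_moving() && n < 8) {
        motion_t m = imu_read_and_convert();
        if (double_threshold_fired(&m))         /* two-threshold method of the patent */
            poses[n++] = classify_single_pose(&m);
    }
    if (n > 0)
        bluetooth_send(map_pose_to_command(fuse_poses(poses, n)));
}
```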
The beneficial effect of this embodiment is: the traditional key-type adjustment is changed, and the method of twice threshold values is adopted, so that the misoperation rate of a user is reduced, and the control accuracy is improved.
Example 2
On the basis of embodiment 1, the control method of the gesture-recognizable bluetooth headset, as shown in fig. 6, includes the following steps:
B1. initializing a sensor, and setting the measurement range and precision of an acceleration sensor and an angular velocity sensor; setting a counting threshold comprising an acceleration threshold and an angular velocity threshold, a trigger threshold and an attitude number-instruction mapping table on a control unit;
B2. detecting a single gesture when the headset moves with the head;
as shown in fig. 7, the steps of detecting a single gesture are as follows:
S1, when the earphone moves with the head, the acceleration sensor and the angular velocity sensor respectively acquire the current raw acceleration data and raw angular velocity data of the earphone, express them as 16-bit two's-complement data or quaternion data, and transmit them to the control unit;
s2, the control unit receives and processes the acceleration original data and the angular velocity original data to obtain acceleration data and angular velocity data; respectively comparing the acceleration data and the angular velocity data with the acceleration threshold and the angular velocity threshold, and determining whether to count or clear the data;
When the raw acceleration data and the raw angular velocity data are 16-bit two's-complement data, the data are processed as follows:
the conversion formula of the acceleration raw data is as follows:
a_m = (ACC_m × ACC_RATIO) / δ
where a_m is a_x, a_y or a_z; ACC_RATIO is the acceleration scale factor set when the acceleration sensor is initialized; ACC_m is the component of the raw acceleration data read along the x-, y- or z-axis; a_x, a_y, a_z are the components of the processed acceleration along the x-, y- and z-axes; δ is a set precision constant, specifically 32768;
the conversion formula of the angular velocity raw data is as follows:
g_m = (GYR_m × GYR_RATIO) / δ
where g_m is g_x, g_y or g_z; GYR_RATIO is the angular velocity scale factor set when the angular velocity sensor is initialized; GYR_m is the component of the raw angular velocity data read along the x-, y- or z-axis; g_x, g_y, g_z are the components of the processed angular velocity along the x-, y- and z-axes; δ is a set precision constant, specifically 32768;
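As a minimal sketch of the two conversion formulas above, assuming δ = 32768 and that ACC_RATIO / GYR_RATIO are the full-scale values configured at sensor initialization (the function name is illustrative):

```c
#include <stdint.h>

#define DELTA 32768.0f   /* precision constant δ from the description */

/* Convert one 16-bit two's-complement raw sample to physical units.
 * ratio is ACC_RATIO (in g) for acceleration or GYR_RATIO (in °/s) for
 * angular velocity, i.e. the scale factor set when the sensor is initialized. */
static float convert_raw_sample(int16_t raw, float ratio)
{
    return (float)raw * ratio / DELTA;   /* a_m = ACC_m × ACC_RATIO / δ (same form for g_m) */
}

/* Example use on one axis (values illustrative):
 *   float ax = convert_raw_sample(13612, ACC_RATIO);   // acceleration in g
 *   float gz = convert_raw_sample(124,   GYR_RATIO);   // angular velocity in °/s
 */
```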
when the acceleration raw data and the angular velocity raw data are quaternion data, the data processing formula is as follows:
Convert the quaternion into floating-point numbers, and let the four resulting floating-point numbers be q0, q1, q2, q3;
And (3) obtaining an Euler angle according to the floating point number:
pitch = arcsin(2 × (q0 × q2 − q1 × q3)) × 57.3;
roll = arctan2(2 × (q0 × q1 + q2 × q3), 1 − 2 × (q1² + q2²)) × 57.3;
yaw = arctan2(2 × (q0 × q3 + q1 × q2), 1 − 2 × (q2² + q3²)) × 57.3;
where pitch is the pitch angle, roll is the roll angle and yaw is the heading (course) angle, all in degrees (°);
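A minimal sketch of these three Euler-angle formulas (the struct and function names are illustrative; 57.3 is the radian-to-degree factor used in the text):

```c
#include <math.h>

typedef struct { float q0, q1, q2, q3; } quat_t;
typedef struct { float pitch, roll, yaw; } euler_t;   /* in degrees */

#define RAD_TO_DEG 57.3f   /* factor used in the formulas above */

/* Euler angles from a unit quaternion, following the three formulas above. */
static euler_t quat_to_euler(quat_t q)
{
    euler_t e;
    e.pitch = asinf(2.0f * (q.q0 * q.q2 - q.q1 * q.q3)) * RAD_TO_DEG;
    e.roll  = atan2f(2.0f * (q.q0 * q.q1 + q.q2 * q.q3),
                     1.0f - 2.0f * (q.q1 * q.q1 + q.q2 * q.q2)) * RAD_TO_DEG;
    e.yaw   = atan2f(2.0f * (q.q0 * q.q3 + q.q1 * q.q2),
                     1.0f - 2.0f * (q.q2 * q.q2 + q.q3 * q.q3)) * RAD_TO_DEG;
    return e;
}
```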
the conditions for determining whether to count or clear data are:
when the acceleration data is larger than or equal to the acceleration threshold and the angular velocity data is larger than or equal to the angular velocity threshold, counting +1, and entering S3;
otherwise it is determined that the data are to be cleared, and the current acceleration data and angular velocity data are cleared;
S3, the control unit compares the current count with the trigger threshold to determine whether to trigger the instruction or to clear the data;
when the number of counts is larger than or equal to the trigger threshold, determining a trigger instruction, and entering step S4;
when the number of counts is less than the trigger threshold, determining to clear data, and clearing current acceleration data and angular velocity data;
S4, the control unit determines the head movement posture from the current acceleration data and angular velocity data, and then clears the current acceleration data and angular velocity data;
B3. after the detection of the single gesture is finished, detecting whether the earphone continues to move;
if the movement is continued, returning to the step B2 to continue to detect the next gesture;
if the motion is stopped, go to step B4;
B4. if the earphone stops after a single gesture has been detected, the movement posture of the earphone is determined to be that single gesture;
if the movement is continued after the detection of the single gesture is finished, fusing a plurality of single gestures to obtain a complex gesture after the earphone stops;
B5. and determining a control instruction corresponding to a single attitude or a complex attitude according to the attitude number-instruction mapping table, and sending the control instruction to the communication terminal.
It should be noted that:
in step S2, the key to whether to count is: removing the unconscious posture; based on this, the scheme removes the action within a period of time according to the habit of people, for example, removes the data within the action time of less than 100ms, that is, the action time is more than 100ms and can be counted as the effective trigger data.
In step B4, consider a complex gesture such as recognizing a continuous left-right head shake: at the turning point between left and right, the raw acceleration and angular velocity data are obviously small, or even 0, for a short period. To keep recognizing, if the data reverse within a short period (for example within 100 ms), the measured data must be tracked continuously and the next gesture detected, until the earphone stops moving and the several gestures are fused into a complex gesture. However, if two consecutive postures reverse only after a longer period (for example after 200 ms), they cannot be fused and are determined to be two independent postures. Through this processing, the single or complex gesture with which the user controls the Bluetooth headset can be recognized effectively, and the misoperation rate is reduced.
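A sketch of that timing rule (the 100 ms and 200 ms windows come from the text; behavior for a gap between 100 ms and 200 ms is not specified, so this sketch simply does not fuse in that case):

```c
#include <stdbool.h>
#include <stdint.h>

#define FUSE_WINDOW_MS  100U   /* data reversal within 100 ms: keep tracking and fuse the postures */
#define SPLIT_WINDOW_MS 200U   /* reversal only after 200 ms: two separate single postures */

/* Decide whether a newly detected single posture should be fused with the
 * previous one into a complex posture, based on the gap between them
 * (e.g. the turning point of a continuous left-right head shake). */
static bool should_fuse(uint32_t prev_pose_end_ms, uint32_t new_pose_start_ms)
{
    uint32_t gap_ms = new_pose_start_ms - prev_pose_end_ms;

    if (gap_ms <= FUSE_WINDOW_MS)
        return true;                    /* keep tracking: part of one complex posture */
    if (gap_ms >= SPLIT_WINDOW_MS)
        return false;                   /* two independent single postures */
    return false;                       /* 100-200 ms gap: unspecified in the text, do not fuse */
}
```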
The posture number-instruction mapping table is as follows:

Head posture (including but not limited to) | Corresponding instruction (including but not limited to)
Nod once | Volume down
Raise head once | Volume up
Shake head to the left | Previous song
Shake head to the right | Next song
Shake head twice | Reject call
Nod twice | Answer call

It should be noted that the head postures and the corresponding instructions may be re-paired; they are not limited to the correspondence in the table above.
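The table could be held in firmware as a simple lookup array; the sketch below encodes the example pairings above (the enum names and numbering are illustrative and, as noted, the pairings are adjustable):

```c
/* Illustrative encoding of the posture number-instruction mapping table. */
typedef enum {
    POSE_NOD_ONCE, POSE_RAISE_ONCE, POSE_SHAKE_LEFT,
    POSE_SHAKE_RIGHT, POSE_SHAKE_TWICE, POSE_NOD_TWICE
} pose_t;

typedef enum {
    CMD_VOLUME_DOWN, CMD_VOLUME_UP, CMD_PREV_SONG,
    CMD_NEXT_SONG, CMD_REJECT_CALL, CMD_ANSWER_CALL
} command_t;

/* Lookup table: index by posture, get the instruction to send to the
 * communication terminal. */
static const command_t pose_to_command[] = {
    [POSE_NOD_ONCE]    = CMD_VOLUME_DOWN,
    [POSE_RAISE_ONCE]  = CMD_VOLUME_UP,
    [POSE_SHAKE_LEFT]  = CMD_PREV_SONG,
    [POSE_SHAKE_RIGHT] = CMD_NEXT_SONG,
    [POSE_SHAKE_TWICE] = CMD_REJECT_CALL,
    [POSE_NOD_TWICE]   = CMD_ANSWER_CALL,
};
```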
The beneficial effect of this embodiment is as follows: the control chip receives the raw data of the earphone's movement, processes the data with the formulas, and compares the processed data against the two thresholds to determine a single posture of the head movement; tracking then continues. When tracking detects that the earphone has stopped moving, the detected single posture is taken as the instruction-triggering posture; when tracking detects that the earphone keeps moving, each posture is judged in turn until the earphone is detected to stop, and the several postures are fused into a complex posture which is taken as the instruction-triggering posture. After the triggering posture is determined, the control instruction is determined from the posture number-instruction mapping table and sent to the communication terminal to realize control. Compared with traditional gesture recognition, the two-threshold method reduces the user's misoperation rate and improves control accuracy.
Example 3
In this embodiment, during one single-gesture detection, the data detection, processing and analysis at a certain moment proceed as follows. The gyroscope coordinate system is as shown in fig. 8. Assume the total count before the current moment is 19; the acceleration threshold, chosen from the range 0.4 g-0.8 g, is set to 0.6 g; the angular velocity threshold, chosen from the range 120 °/s-150 °/s, is set to 140 °/s; and the trigger threshold is set to 20;
The earphone moves with the head, and the acceleration sensor and the angular velocity sensor respectively collect the raw acceleration data and raw angular velocity data of the earphone at the current moment; as 16-bit two's-complement data these are a_x = 13612, a_y = -824, a_z = 8952; g_x = 307, g_y = 84, g_z = 124;
The control unit receives and processes the acceleration raw data and the angular velocity raw data, and the data processing formula is as follows:
According to the conversion formula for the raw acceleration data:
a_m = (ACC_m × ACC_RATIO) / δ
the acceleration data at the current moment are: a_x = 0.99 g, a_y = 0.12 g, a_z = 0.21 g;
According to the conversion formula for the raw angular velocity data:
g_m = (GYR_m × GYR_RATIO) / δ
the angular velocity data at the current moment are: g_x = -103.72 °/s, g_y = -58.35 °/s, g_z = 291.52 °/s;
Combining the gyroscope coordinate system shown in fig. 8 with the acceleration and angular velocity analysis:
a_x = 0.99 g > 0, so the acceleration has a component of 0.99 on the positive x-axis;
a_y = 0.12 g > 0, so the acceleration has a component of 0.12 on the positive y-axis;
a_z = 0.21 g > 0, so the acceleration has a component of 0.21 on the positive z-axis;
g_x = -103.72 °/s < 0, so the angular velocity has a component of 103.72 in the negative x-direction;
g_y = -58.35 °/s < 0, so the angular velocity has a component of 58.35 in the negative y-direction;
g_z = 291.52 °/s > 0, so the angular velocity has a component of 291.52 in the positive z-direction;
Since |a_x| > |a_y|, |a_x| > |a_z| and |g_z| > |g_x|, |g_z| > |g_y|, the acceleration data on the x-axis and the angular velocity data on the z-axis are selected as the judgment reference. The components of a_x and g_z lie on the positive x-axis and positive z-axis respectively; combining the gyroscope coordinate system of fig. 8 with the data showing the earphone moving to the right at the current moment, the movement posture of the earphone is judged to be a head shake to the right;
Furthermore, the current acceleration data a_x = 0.99 g satisfies a ≥ 0.6 g, and the angular velocity data g_z = 291.52 °/s satisfies g ≥ 140 °/s, so the count is incremented by 1; the count at this moment is 19 + 1 = 20;
Since the count is greater than or equal to the trigger threshold of 20, an instruction is triggered;
The control unit determines that, in the posture-instruction mapping table, the instruction corresponding to the right head-shake posture is "next song" and takes it as the control instruction; through the wireless Bluetooth module, the control unit then makes the communication terminal execute the "play next song" instruction.
If, in Example 3, a_x and g_z do not satisfy the counting condition, or the count is less than the trigger threshold, the acceleration and angular velocity data at that moment are cleared; the acceleration sensor and the angular velocity sensor then collect the raw acceleration data and raw angular velocity data of the earphone at the next moment, and the judgment continues according to the method above.
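A minimal sketch of the dominant-axis selection used in this example: pick the axis with the largest absolute acceleration and the axis with the largest absolute angular velocity, then classify the direction by their signs. Only the right head-shake case from the example is shown; the mirrored left case and the pose names are assumptions.

```c
#include <math.h>

typedef enum { POSE_UNKNOWN, POSE_SHAKE_RIGHT_EX, POSE_SHAKE_LEFT_EX } example_pose_t;

/* Simplified direction classification following Example 3: use the dominant
 * acceleration axis and the dominant angular-velocity axis plus their signs. */
static example_pose_t classify_shake(float ax, float ay, float az,
                                     float gx, float gy, float gz)
{
    /* Acceleration dominant on X, angular velocity dominant on Z:
     * |a_x| > |a_y|, |a_x| > |a_z|; |g_z| > |g_x|, |g_z| > |g_y| */
    if (fabsf(ax) > fabsf(ay) && fabsf(ax) > fabsf(az) &&
        fabsf(gz) > fabsf(gx) && fabsf(gz) > fabsf(gy)) {
        if (ax > 0.0f && gz > 0.0f)
            return POSE_SHAKE_RIGHT_EX;   /* Example 3: positive X and Z -> right head shake */
        if (ax < 0.0f && gz < 0.0f)
            return POSE_SHAKE_LEFT_EX;    /* assumed mirror case, not spelled out in the text */
    }
    return POSE_UNKNOWN;
}
```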
The content of the utility model is not limited to the above examples; any equivalent transformation of the technical solution of the utility model that a person skilled in the art arrives at after reading this description falls within the scope of the claims of the utility model.

Claims (7)

1. A Bluetooth headset capable of recognizing gestures comprises a loudspeaker and a charging interface, wherein the loudspeaker is packaged at the top of a headset shell through a top cover, and the charging interface is installed at the bottom of the headset shell through a bottom cover; the earphone shell is characterized by further comprising a sensing unit arranged above the charging interface, a control unit arranged at the arc bending part of the earphone shell, a Bluetooth module and a power supply unit which are connected to the control unit, and a storage unit arranged at the loudspeaker part;
the sensing unit comprises an acceleration sensor and an angular velocity sensor which are respectively used for acquiring acceleration and angular velocity data when the earphone moves along with the head;
the storage unit comprises a memory for storing processing data;
the input end of the control unit is connected to the sensing unit and the storage unit through FPC (flexible printed circuit) soft boards respectively, and is used for inputting acceleration and angular velocity data collected by the sensing unit, calling processing data to process the acceleration and angular velocity data and then outputting a control instruction; the output end is connected with the Bluetooth module and outputs a control instruction to the communication terminal;
the power supply unit is a power supply battery, one end of the power supply unit is connected to the charging interface, and the other end of the power supply unit is connected to the control unit.
2. The Bluetooth headset of claim 1, wherein the sensing unit encapsulates an acceleration sensor and an angular velocity sensor by a gyroscope.
3. The Bluetooth headset of claim 2, wherein the sensing unit further comprises a plurality of light sensors respectively disposed on the storage unit, the control unit and the sensing unit, and a light sensing hole is disposed at a headset housing where each light sensor is disposed; the optical sensor collects the distance between the earphone and the ear through the photosensitive hole, judges whether the earphone is in the ear or not, causes interruption after the earphone is in the ear, and sends a signal to the control unit.
4. The gesture recognizable bluetooth headset of claim 3, wherein the light sensor comprises a first light sensor, a second light sensor, and a third light sensor; the first light sensor is arranged on the sensing unit; the second light sensor and the third light sensor are respectively arranged on the storage unit and the control unit.
5. The Bluetooth headset of claim 4, wherein the first light sensor, the second light sensor and the third light sensor are infrared sensors.
6. The Bluetooth headset of claim 5, wherein the gyroscope is provided with a chip U7 disposed on the opposite side of the sensing unit; chips adopted by the first optical sensor, the second optical sensor and the third optical sensor are respectively U4, U5 and U6 and are respectively arranged on the front surface of the sensing unit, the back surface of the storage unit and the back surface of the control unit; the front surface of the control unit is provided with a control chip U2; the memory adopts a chip U3, which is arranged on the front surface of the memory unit.
7. The Bluetooth headset of claim 6, wherein the gyroscope chip U7 is connected to PIO [54], PIO [53] of the control chip U2 through 14 and 13 pins; the first photo-sensor chip U4, the second photo-sensor chip U5 and the third photo-sensor chip U6 are respectively connected to a pin K3 of the control chip U2 through input ends of peripheral circuits, and VOUT ends of the first photo-sensor chip U4, the second photo-sensor chip U5 and the third photo-sensor chip U6 are respectively connected to pins G3, J2 and J3 of the control chip U2; pins E3, F2 and H2 of the control chip U2 are respectively connected to pins 1, 2 and 3 of a chip U3 of the FLASH memory; the positive and negative electrodes of the battery are connected to the P1 pin and the P2 pin of the control chip U2, respectively.
CN202022659118.9U 2020-11-17 2020-11-17 Bluetooth headset capable of recognizing gestures Active CN213403447U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202022659118.9U CN213403447U (en) 2020-11-17 2020-11-17 Bluetooth headset capable of recognizing gestures

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202022659118.9U CN213403447U (en) 2020-11-17 2020-11-17 Bluetooth headset capable of recognizing gestures

Publications (1)

Publication Number Publication Date
CN213403447U true CN213403447U (en) 2021-06-08

Family

ID=76195095

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202022659118.9U Active CN213403447U (en) 2020-11-17 2020-11-17 Bluetooth headset capable of recognizing gestures

Country Status (1)

Country Link
CN (1) CN213403447U (en)

Similar Documents

Publication Publication Date Title
CN112291669A (en) Bluetooth headset capable of recognizing gesture and preparation and control method thereof
TWI457793B (en) Real-time motion recognition method and inertia sensing and trajectory
JP5952486B2 (en) Terminal control method and apparatus, and terminal
US8010911B2 (en) Command input method using motion recognition device
KR100811015B1 (en) Method and apparatus for entering data using a virtual input device
CN104267819B (en) Can gesture wake up electronic equipment and electronic equipment gesture awakening method
CN104238743B (en) Information processor and information processing method
US20060125789A1 (en) Contactless input device
US11422609B2 (en) Electronic device and method for controlling operation of display in same
CN107222623A (en) Held state identifying device, method and electronic equipment
CN213403447U (en) Bluetooth headset of distinguishable gesture
CN106055958B (en) A kind of unlocking method and device
CN113495609A (en) Sleep state judgment method and system, wearable device and storage medium
CN113641278A (en) Control method, control device, electronic equipment and storage medium
KR101053411B1 (en) Character input method and terminal
KR20210015638A (en) Apparatus and method for detecting fall in low power using sensor
US10712831B2 (en) Information processing apparatus, method, and program
US11467697B2 (en) Electronic device and method for distinguishing between different input operations
WO2022194029A1 (en) Robot feedback method and robot
CN115291786A (en) False touch judgment method and device based on machine learning and storage medium
CN109089191A (en) Plug-hole processing method and Related product
JP2009099041A (en) Pen type input device
JP2003114754A (en) Device for inputting hand-written information
KR102157304B1 (en) Wearable device and interface method thereof
KR20210105783A (en) Electronic device and method for recognizing context thereof

Legal Events

Date Code Title Description
GR01 Patent grant