CN113359990A - Friend making method, friend making device, first wearing equipment and readable storage medium - Google Patents


Publication number
CN113359990A
CN113359990A (publication) · CN202110625244.9A (application)
Authority
CN
China
Prior art keywords
wearable device
friend
making
gesture
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110625244.9A
Other languages
Chinese (zh)
Inventor
张腾飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd filed Critical Guangdong Genius Technology Co Ltd
Priority to CN202110625244.9A priority Critical patent/CN113359990A/en
Publication of CN113359990A publication Critical patent/CN113359990A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/12 Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application is suitable for the technical field of smart wearable devices, and provides a friend making method, a friend making device, a first wearable device, and a readable storage medium. The method includes the following steps: acquiring an electromyographic signal collected by an electromyographic sensor; identifying a target gesture corresponding to the electromyographic signal; and, if the target gesture is a preset gesture, executing a friend-making pairing operation. To a certain extent, the method and the device can solve the problems that the existing friend-making method of the smart watch is cumbersome and provides a poor user experience.

Description

Friend making method, friend making device, first wearing equipment and readable storage medium
Technical Field
The application belongs to the technical field of smart wearable devices, and particularly relates to a friend making method, a friend making device, a first wearable device, and a readable storage medium.
Background
With the development of science and technology and the improvement of living standards in China, smart watches have become increasingly popular.
At present, the smart watch provides a "touch-touch" friend-making function: when making friends, a child first taps the screen to start the friend-making application, and the smart watch then performs friend-making pairing. Alternatively, if the friend-making application has been started but is running in the background, the child must first manually switch to the interface of the friend-making application before the smart watch performs friend-making pairing. In either case, the child has to operate the smart watch manually before friend-making pairing can take place, which is not only troublesome but also reduces the child's friend-making experience.
Disclosure of Invention
The embodiments of the application provide a friend making method, a friend making device, a first wearable device, and a readable storage medium, which can, to a certain extent, solve the problems that the existing friend-making method of a smart watch is cumbersome and provides a poor user experience.
In a first aspect, an embodiment of the present application provides a friend making method applied to a first wearable device, where the first wearable device is provided with an electromyographic sensor. The method includes:
acquiring an electromyographic signal acquired by the electromyographic sensor;
identifying a target gesture corresponding to the electromyographic signal;
and if the target gesture is a preset gesture, executing friend making pairing operation.
Optionally, if the target gesture is a preset gesture, performing friend making pairing operation includes:
if the target gesture is a preset gesture, playing a prompt voice;
and when a voice confirmation instruction of the user is received, carrying out friend-making pairing operation according to the voice confirmation instruction.
Optionally, before the performing the friend making pairing operation if the target gesture is the preset gesture, the method further includes:
acquiring current voice data of a user;
correspondingly, if the target gesture is a preset gesture, performing friend making pairing operation, including:
if the target gesture is a preset gesture, performing semantic recognition on the voice data;
and when the semantics of the voice data include a preset keyword, executing the friend-making pairing operation.
Optionally, the performing friend-making pairing operation includes:
performing a wireless search operation;
sending a connection request to each searched second wearable device;
and when connection agreement information returned by any one of the second wearable devices is received, performing friend-making pairing with the second wearable device corresponding to the connection agreement information.
Optionally, the performing friend-making pairing operation includes:
performing a wireless search operation;
calculating the signal intensity of each searched second wearable device;
sending a connection request to a second wearable device with the strongest signal intensity;
and when the connection agreement information sent by the second wearable device with the strongest signal strength is received, making a friend with the second wearable device with the strongest signal strength.
Optionally, before the calculating and searching the signal strength of each second wearable device, the method further includes:
acquiring a current position;
taking the current position as a center, and taking a preset distance as a target length value to construct a preset area;
correspondingly, the calculating and searching the signal strength of each second wearable device includes:
and calculating the signal intensity of each second wearable device in the searched preset area.
Optionally, the recognizing the target gesture corresponding to the electromyographic signal includes:
inputting the electromyographic signal into a convolutional layer in a trained neural network model for feature extraction to obtain a target feature vector;
and inputting the target feature vector into a fully connected layer in the trained neural network model for classification to obtain a target gesture corresponding to the electromyographic signal.
In a second aspect, an embodiment of the present application provides a friend-making device, which is applied to a first wearing device, where the first wearing device is provided with an electromyographic sensor, and includes:
the acquisition module is used for acquiring the electromyographic signals acquired by the electromyographic sensor;
the recognition module is used for recognizing a target gesture corresponding to the electromyographic signal;
and the execution module is used for executing friend making pairing operation if the target gesture is a preset gesture.
In a third aspect, an embodiment of the present application provides a first wearable device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and the computer program implements the steps of the method according to the first aspect when executed by a processor.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a first wearable device, causes the first wearable device to perform the friend making method according to any one of the first aspect.
It is understood that, for the beneficial effects of the second to fifth aspects, reference can be made to the related description of the first aspect, which is not repeated here.
Compared with the prior art, the embodiment of the application has the advantages that:
the application provides a friend making method, which is characterized in that when a target gesture corresponding to an electromyographic signal acquired by first wearable equipment is a preset gesture, friend making pairing operation is executed. When needing to make friends promptly, only need to make and predetermine the gesture, first wearing equipment can make friends and pair, need not the manual first wearing equipment of back that operates of user and then makes friends and pairs simple and convenient. Moreover, the preset gesture can be a gesture frequently made in a real friend making scene, so that the real friend making scene can be simulated when the user makes the preset gesture, and the experience of making friends of the user is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic flow chart of a friend making method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a friend making device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a first wearable device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Example one
At present, the process of a user making friends with a smart watch is as follows: the user manually starts the friend-making application, and the smart watch then performs friend-making pairing. If the friend-making application has been started but is running in the background, the user must first manually switch to the interface of the friend-making application before the smart watch performs friend-making pairing. Therefore, whether the friend-making application is not yet started or is running in the background of the smart watch, the smart watch performs the friend-making pairing operation only after the user operates it manually. Requiring manual operation is not only troublesome but also differs greatly from a real friend-making scene, reducing the user's friend-making experience.
Therefore, in order to make friend-making behavior between users simple and convenient and to improve the friend-making experience, the present application provides a friend making method. When a user needs to make friends, the user only needs to make a preset gesture, and the first wearable device performs friend-making pairing without the user having to operate the first wearable device manually beforehand, which is simple and convenient. Moreover, the preset gesture may be a gesture frequently made in a real friend-making scene, so that making the preset gesture simulates a real friend-making scene and improves the user's friend-making experience.
In the following, a detailed description is given of a friend making method provided in an embodiment of the present application, where the method is applied to a first wearable device, and the first wearable device is provided with an electromyographic sensor, referring to fig. 1, the method includes:
and S101, acquiring an electromyographic signal acquired by the electromyographic sensor.
In step S101, an electromyographic (EMG) signal is a bioelectric signal generated along with muscle contraction. Therefore, when the user of the first wearable device makes a preset gesture, the electromyographic sensor of the first wearable device can collect the electromyographic signal corresponding to the preset gesture. After the sensor has collected this signal, the first wearable device acquires the electromyographic signal from the electromyographic sensor.
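As an illustrative sketch (not part of the patent), continuous EMG samples from a sensor are commonly segmented into fixed-length, overlapping windows before gesture recognition; the window size and step below are assumed values:

```python
from collections import deque

class EmgWindower:
    """Buffer a continuous EMG sample stream and emit fixed-length,
    overlapping windows suitable for gesture classification."""

    def __init__(self, window_size=200, step=100):
        self.window_size = window_size  # samples per window (assumed)
        self.step = step                # samples to advance per window (assumed)
        self._buf = deque()

    def push(self, samples):
        """Append new sensor samples; return any complete windows."""
        self._buf.extend(samples)
        windows = []
        while len(self._buf) >= self.window_size:
            windows.append(list(self._buf)[: self.window_size])
            for _ in range(self.step):  # slide the window forward
                self._buf.popleft()
        return windows
```

With `window_size=4, step=2`, pushing the samples `1..6` yields the windows `[1, 2, 3, 4]` and `[3, 4, 5, 6]`, with `5, 6` retained for the next window.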
And S102, identifying a target gesture corresponding to the electromyographic signal.
In step S102, after acquiring the electromyographic signal, the first wearable device recognizes a target gesture corresponding to the electromyographic signal.
In some possible implementations, the first wearable device recognizes the target gesture corresponding to the electromyographic signal as follows: the first wearable device inputs the electromyographic signal into a convolutional layer in a trained neural network model for feature extraction to obtain a target feature vector, and then inputs the target feature vector into a fully connected layer in the trained neural network model for classification, thereby obtaining the target gesture corresponding to the electromyographic signal.
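The convolutional-layer/fully-connected-layer pipeline above can be sketched in miniature as follows. The layer sizes, the gesture set, and the use of random (untrained) weights are illustrative assumptions, not details from the patent:

```python
import numpy as np

# Hypothetical gesture vocabulary; the patent mentions handshake-like gestures.
GESTURES = ["handshake", "fist_bump", "high_five", "none"]

rng = np.random.default_rng(0)
conv_kernel = rng.standard_normal((8, 5))          # 8 filters of width 5 (assumed)
fc_weights = rng.standard_normal((len(GESTURES), 8))

def extract_features(signal, kernel=conv_kernel):
    """Convolutional layer: 1-D convolution per filter, then global max
    pooling, producing the target feature vector."""
    n_filters, _ = kernel.shape
    feats = np.empty(n_filters)
    for f in range(n_filters):
        responses = np.convolve(signal, kernel[f], mode="valid")
        feats[f] = responses.max()                 # global max pooling
    return feats

def classify(signal):
    """Fully connected layer: score each gesture and take the argmax."""
    logits = fc_weights @ extract_features(signal)
    return GESTURES[int(np.argmax(logits))]
```

A trained model would use learned weights in place of the random ones; the data flow (signal → feature vector → gesture label) is the same.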
It should be noted that the trained neural network model is trained as follows: the first wearable device obtains training electromyographic signals and inputs them into the convolutional layer of the neural network model to be trained for feature extraction, obtaining an initial feature vector. The initial feature vector is then input into the fully connected layer of the neural network model to be trained to obtain a target loss value. If the target loss value is greater than a preset threshold, the process returns to acquiring training electromyographic signals; if the target loss value is less than or equal to the preset threshold, training stops and the trained neural network model is obtained.
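The threshold-based training loop described above can be sketched as follows; `get_training_batch`, `model_forward`, and `model_update` are hypothetical stand-ins for the real data pipeline and network, and the loss threshold is an assumed value:

```python
def train_until_threshold(get_training_batch, model_forward, model_update,
                          loss_threshold=0.05, max_steps=1000):
    """Keep acquiring training EMG signals and updating the model until the
    target loss value drops to the preset threshold (or max_steps is hit)."""
    loss = float("inf")
    for _ in range(max_steps):
        signals, labels = get_training_batch()    # acquire training EMG signals
        loss = model_forward(signals, labels)     # forward pass -> target loss value
        if loss <= loss_threshold:                # loss small enough: stop training
            return loss
        model_update()                            # otherwise take a training step
    return loss
```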
And S103, if the target gesture is a preset gesture, executing friend making pairing operation.
In step S103, the preset gesture may include at least one of a handshake, a punch, and a clap. And if the first wearable device recognizes that the target gesture is a preset gesture, the first wearable device executes friend making pairing operation.
It should be noted that, when the target gesture is the preset gesture, if the friend-making application is in the start state, the first wearable device may directly perform the friend-making pairing operation. If the friend-making application is not started, the first wearable device needs to start the friend-making application first and then execute friend-making pairing operation.
It should be understood that, during the friend-making pairing of the first wearable device and the second wearable device, the user of the first wearable device and the user of the second wearable device can maintain the preset gesture, improving the probability that friend-making pairing between the two devices succeeds. Moreover, when friend-making pairing is completed, the first wearable device can notify its user by voice and the second wearable device can notify its user by voice, so that the users of both devices know the friend-making pairing status in real time.
In some possible implementations, the first wearable device performs the friend-making pairing operation as follows: the first wearable device performs a wireless search operation and then sends a connection request to each searched second wearable device. In the friend-making process, the user of the second wearable device generally also makes the preset gesture, so the second wearable device also performs steps S101-S103. Therefore, when the second wearable device receives the connection request, it can determine whether the preset gesture has been detected; if so, the user of the second wearable device also wants to make friends. The second wearable device can therefore automatically send connection agreement information to the first wearable device, so that its user does not need to send the connection agreement information manually, which is simple and convenient.
When the first wearable device receives the connection agreement information returned by the second wearable device, the first wearable device performs friend-making pairing with the second wearable device corresponding to the connection agreement information.
The wireless search operation includes broadcasting or listening for near field communication signals. The type of the near field communication signal may be selected according to actual situations, for example, a bluetooth signal and a radio frequency signal are selected as the near field communication signal in this embodiment, which is not limited herein.
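The search-then-request flow above can be sketched as follows. The device object and its `search_nearby`/`request_connection` methods are hypothetical; a real implementation would sit on top of Bluetooth or another near-field communication stack:

```python
def pair_with_first_agreeing(first_device):
    """Send a connection request to every searched second wearable device;
    pair with the first one that returns connection agreement information
    (i.e. that also detected the preset gesture)."""
    for peer in first_device.search_nearby():       # wireless search operation
        if first_device.request_connection(peer):   # peer agrees to connect?
            return peer                             # friend-making pairing done
    return None                                     # nobody nearby agreed
```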
Because the distance between the first wearable device and the second wearable device is short during friend making, the signal strength between them is strong. Therefore, when the first wearable device finds many second wearable devices, it may calculate the signal strength of each searched second wearable device and then send the connection request only to the second wearable device with the strongest signal strength, so as not to disturb users of second wearable devices who do not need friend-making pairing. Finally, when the connection agreement information sent by the second wearable device with the strongest signal strength is received, the first wearable device performs friend-making pairing with that device.
It should be noted that, when the second wearable device with the strongest signal strength detects the preset gesture, the second wearable device with the strongest signal strength may also automatically send the connection agreement information to the first wearable device.
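Selecting the second wearable device with the strongest signal can be sketched as follows, assuming RSSI values in dBm (less negative means stronger); the scan-result format is an illustrative assumption:

```python
def strongest_peer(scan_results):
    """scan_results: {device_id: rssi_dbm}. Return the id of the device
    with the strongest signal strength, or None if nothing was found."""
    if not scan_results:
        return None
    # max() over dBm values picks the least negative, i.e. strongest, signal
    return max(scan_results, key=scan_results.get)
```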
When the number of second wearable devices searched by the first wearable device is large, the time the first wearable device needs to calculate the signal strength of each searched second wearable device is also long. Therefore, to reduce this calculation time, the first wearable device acquires its current position before calculating the signal strengths. It then constructs a preset area with the current position as the center and a preset distance as the target length value. Finally, it screens out the searched second wearable devices outside the preset area and calculates the signal strength only for the second wearable devices within the preset area.
In this embodiment, a preset area is set and the searched second wearable devices are filtered according to the preset area, reducing the number of second wearable devices for which the first wearable device calculates signal strength. This in turn reduces the calculation time and speeds up friend-making pairing between the first wearable device and the second wearable device.
The preset region may be a circular region, a rectangular region, or a square region. The shape of the preset area may be set by a user according to actual conditions, and the application is not limited herein. When the preset region is a rectangular region, the preset distance may include a first sub-distance and a second sub-distance, the first sub-distance may be a length of the rectangular region, and the second sub-distance may be a width of the rectangular region.
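The preset-area screening above can be sketched as follows for a circular region, treating positions as simple planar (x, y) coordinates for illustration (a real device would use geographic coordinates):

```python
import math

def filter_by_preset_area(current_pos, peers, preset_distance):
    """Keep only the second wearable devices inside a circular preset area
    centered on the current position with radius preset_distance.
    peers: {device_id: (x, y)}."""
    cx, cy = current_pos
    return {
        dev: pos
        for dev, pos in peers.items()
        if math.hypot(pos[0] - cx, pos[1] - cy) <= preset_distance
    }
```

A rectangular preset area would instead compare the x and y offsets against the first and second sub-distances separately.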
It should be noted that sometimes the user of the first wearable device makes the preset gesture without wanting to make friends. Therefore, in order to prevent the first wearable device from being triggered by mistake into performing the friend-making pairing operation, when the first wearable device detects the preset gesture it plays a prompt voice, and performs the friend-making pairing operation according to a voice confirmation instruction only when such an instruction is received from the user.
In this embodiment, in order to prevent the first wearable device from being triggered by mistake to perform the friend-making pairing operation, the first wearable device performs the friend-making pairing operation only when detecting the preset gesture and receiving the voice confirmation instruction.
In other embodiments, users typically say words such as "add friends" when they want to add each other as friends. Therefore, in order to prevent the first wearable device from being triggered by mistake into performing the friend-making pairing operation, the first wearable device may acquire the user's current voice data before performing the friend-making pairing operation, and perform semantic recognition on the voice data when the preset gesture is detected. If the semantics of the voice data contain the preset keyword, indicating that the user of the first wearable device wants to make friends, the first wearable device performs the friend-making pairing operation.
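The gesture-plus-keyword check above can be sketched as follows; the keyword list is an illustrative assumption (the text gives "add friends" as an example), and a real system would run speech recognition before this step:

```python
# Hypothetical preset keywords; "add friends" comes from the example above.
PRESET_KEYWORDS = ("add friends", "make friends")

def should_pair(gesture_is_preset, recognized_text, keywords=PRESET_KEYWORDS):
    """Trigger friend-making pairing only when the target gesture is the
    preset gesture AND the recognized speech contains a preset keyword."""
    if not gesture_is_preset:
        return False
    text = recognized_text.lower()
    return any(kw in text for kw in keywords)
```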
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example two
Fig. 2 shows an example of a friend making apparatus applied to a first wearable device provided with an electromyographic sensor; for convenience of explanation, only the portion related to the embodiment of the present application is shown. The apparatus 200 includes:
the acquiring module 201 is configured to acquire an electromyographic signal acquired by an electromyographic sensor.
The recognition module 202 is configured to recognize a target gesture corresponding to the electromyographic signal.
The executing module 203 is configured to execute a friend making pairing operation if the target gesture is a preset gesture.
Optionally, the executing module 203 comprises:
and the playing unit is used for playing the prompt voice if the target gesture is a preset gesture.
And the first execution unit is used for executing friend-making pairing operation according to the voice confirmation instruction when the voice confirmation instruction of the user is received.
Optionally, the apparatus 200 further comprises:
and the voice data acquisition unit is used for acquiring the current voice data of the user.
Accordingly, the execution module 203 includes:
and the semantic recognition unit is used for performing semantic recognition on the voice data if the target gesture is a preset gesture.
And the second execution unit is used for executing the friend-making pairing operation when the semantics of the voice data include the preset keyword.
Optionally, the executing module 203 is specifically configured to execute:
a wireless search operation is performed.
And sending a connection request to the searched second wearable devices.
And when connection agreement information returned by any second wearable device is received, performing friend-making pairing with the second wearable device corresponding to the connection agreement information.
Optionally, the executing module 203 is specifically configured to execute:
a wireless search operation is performed.
And calculating the signal intensity of each searched second wearable device.
And sending a connection request to the second wearable device with the strongest signal strength.
And when the connection agreement information sent by the second wearable device with the strongest signal strength is received, making a friend with the second wearable device with the strongest signal strength.
Optionally, the apparatus 200 further comprises:
and the position acquisition unit is used for acquiring the current position.
And the area construction unit is used for constructing a preset area by taking the current position as a center and a preset distance as a target length value.
Optionally, the executing module 203 is specifically configured to execute:
and calculating the signal intensity of each second wearable device in the searched preset area.
Optionally, the identifying module 202 is specifically configured to perform:
inputting the electromyographic signal into a convolutional layer in a trained neural network model for feature extraction to obtain a target feature vector;
and inputting the target feature vector into a fully connected layer in the trained neural network model for classification to obtain a target gesture corresponding to the electromyographic signal.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the method embodiment of the present application, and specific reference may be made to a part of the method embodiment, which is not described herein again.
EMBODIMENT III
Fig. 3 is a schematic view of a first wearable device provided in the third embodiment of the present application. As shown in fig. 3, the first wearable device 300 of this embodiment includes: a processor 301, a memory 302, and a computer program 303 stored in the memory 302 and operable on the processor 301. The processor 301 implements the steps of the above-described method embodiments when executing the computer program 303. Alternatively, the processor 301 implements the functions of the modules/units in the device embodiments when executing the computer program 303.
Illustratively, the computer program 303 may be divided into one or more modules/units, which are stored in the memory 302 and executed by the processor 301 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used for describing the execution process of the computer program 303 in the first wearable device 300. For example, the computer program 303 may be divided into an acquisition module, an identification module, and an execution module, and each module has the following specific functions:
acquiring an electromyographic signal acquired by an electromyographic sensor;
identifying a target gesture corresponding to the electromyographic signal;
and if the target gesture is a preset gesture, executing friend making pairing operation.
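The three-module flow above can be sketched as one function, with the sensor, the gesture model, and the pairing radio supplied as hypothetical callables; the gesture names are illustrative only.

```python
def friend_pairing_step(read_emg, recognize_gesture, pair, preset_gesture):
    """One pass through the claimed flow: acquire the electromyographic
    signal, identify the target gesture, and trigger the friend-making
    pairing operation only when the target gesture is the preset gesture."""
    emg_signal = read_emg()                         # acquisition module
    target_gesture = recognize_gesture(emg_signal)  # identification module
    if target_gesture == preset_gesture:            # execution module
        return pair()
    return None
```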
The first wearable device may include, but is not limited to, the processor 301 and the memory 302. Those skilled in the art will appreciate that fig. 3 is only an example of the first wearable device 300 and does not constitute a limitation of the first wearable device 300, which may include more or fewer components than those shown, or combine some components, or include different components; for example, the first wearable device may further include an input/output device, a network access device, a bus, etc.
The Processor 301 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 302 may be an internal storage unit of the first wearable device 300, such as a hard disk or a memory of the first wearable device 300. The memory 302 may also be an external storage device of the first wearable device 300, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), etc. provided on the first wearable device 300. Further, the memory 302 may include both an internal storage unit and an external storage device of the first wearable device 300. The memory 302 is used to store the computer program and other programs and data required by the first wearable device. The memory 302 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated; in practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/first wearable device and method may be implemented in other ways. For example, the above-described apparatus/first wearable device embodiments are merely illustrative; the division of the above modules or units is only one logical function division, and there may be other division manners in actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in an electrical, mechanical, or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the above method embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program includes computer program code, and the computer program code may be in source code form, object code form, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in each jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A friend making method applied to a first wearable device provided with an electromyographic sensor, comprising:
acquiring an electromyographic signal acquired by the electromyographic sensor;
identifying a target gesture corresponding to the electromyographic signal;
and if the target gesture is a preset gesture, executing friend making pairing operation.
2. The friend making method according to claim 1, wherein if the target gesture is a preset gesture, performing friend making pairing operation comprises:
if the target gesture is a preset gesture, playing a prompt voice;
and when a voice confirmation instruction of the user is received, carrying out friend-making pairing operation according to the voice confirmation instruction.
3. The friend making method according to claim 1, wherein before performing the friend making pairing operation if the target gesture is a preset gesture, the method further comprises:
acquiring current voice data of a user;
correspondingly, if the target gesture is a preset gesture, performing friend making pairing operation, including:
if the target gesture is a preset gesture, performing semantic recognition on the voice data;
and when the semantics of the voice data contain a preset keyword, executing the friend-making pairing operation.
4. The friend making method according to claim 1, wherein the performing friend making pairing operation comprises:
performing a wireless search operation;
sending a connection request to each searched second wearable device;
and when the connection agreeing information returned by any one of the second wearable devices is received, making friends with the second wearable device corresponding to the connection agreeing information.
5. The friend making method according to claim 1, wherein the performing friend making pairing operation comprises:
performing a wireless search operation;
calculating the signal strength of each searched second wearable device;
sending a connection request to the second wearable device with the strongest signal strength;
and when the connection agreement information sent by the second wearable device with the strongest signal strength is received, making a friend with the second wearable device with the strongest signal strength.
6. The friend making method according to claim 5, further comprising, before calculating the signal strength of each searched second wearable device:
acquiring a current position;
constructing a preset area by taking the current position as a center and a preset distance as the radius;
correspondingly, the calculating the signal strength of each searched second wearable device comprises:
calculating the signal strength of each searched second wearable device located in the preset area.
7. The friend making method according to claim 1, wherein the recognizing of the target gesture corresponding to the electromyographic signal comprises:
inputting the electromyographic signal into a convolutional layer in a trained neural network model for feature extraction to obtain a target feature vector;
and inputting the target feature vector into a fully connected layer in the trained neural network model for classification to obtain a target gesture corresponding to the electromyographic signal.
8. A friend-making device, applied to a first wearable device provided with an electromyographic sensor, comprising:
the acquisition module is used for acquiring the electromyographic signals acquired by the electromyographic sensor;
the recognition module is used for recognizing a target gesture corresponding to the electromyographic signal;
and the execution module is used for executing friend making pairing operation if the target gesture is a preset gesture.
9. A first wearable device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1-7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202110625244.9A 2021-06-04 2021-06-04 Friend making method, friend making device, first wearing equipment and readable storage medium Pending CN113359990A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110625244.9A CN113359990A (en) 2021-06-04 2021-06-04 Friend making method, friend making device, first wearing equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110625244.9A CN113359990A (en) 2021-06-04 2021-06-04 Friend making method, friend making device, first wearing equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN113359990A true CN113359990A (en) 2021-09-07

Family

ID=77532292

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110625244.9A Pending CN113359990A (en) 2021-06-04 2021-06-04 Friend making method, friend making device, first wearing equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN113359990A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014114162A1 (en) * 2013-01-26 2014-07-31 广州市沃希信息科技有限公司 Electronic device communication method and electronic device communication system
CN104750251A (en) * 2015-03-09 2015-07-01 联想(北京)有限公司 Information processing method and first wearable equipment
CN105407453A (en) * 2015-11-23 2016-03-16 深圳还是威健康科技有限公司 Bluetooth pairing method and device
CN108712729A (en) * 2018-05-30 2018-10-26 福州米鱼信息科技有限公司 A kind of active koinotropic type wearable device and its implementation
US20190110140A1 (en) * 2016-04-07 2019-04-11 Sonova Ag Body-Worn Personal Device with Pairing Control


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Wei (ed.): "Leap Motion Human-Computer Interaction Application Development", 30 November 2015 *

Similar Documents

Publication Publication Date Title
US10169639B2 (en) Method for fingerprint template update and terminal device
US11074466B2 (en) Anti-counterfeiting processing method and related products
CN107147618B (en) User registration method and device and electronic equipment
CN108509033B (en) Information processing method and related product
CN108932102B (en) Data processing method and device and mobile terminal
CN108958503A (en) input method and device
CN111491123A (en) Video background processing method and device and electronic equipment
JP6609266B2 (en) Fingerprint identification method, apparatus, program, and recording medium
CN108766416B (en) Speech recognition method and related product
CN107291238B (en) Data processing method and device
CN111444321A (en) Question answering method, device, electronic equipment and storage medium
CN112307281A (en) Entity recommendation method and device
CN105068660A (en) Method for controlling increase and decrease of sound volume by means of fingerprint recognition and mobile terminal
WO2023103917A1 (en) Speech control method and apparatus, and electronic device and storage medium
CN107729439A (en) Obtain the methods, devices and systems of multi-medium data
CN113359990A (en) Friend making method, friend making device, first wearing equipment and readable storage medium
CN111382598B (en) Identification method and device and electronic equipment
CN114360528B (en) Speech recognition method, device, computer equipment and storage medium
CN107463822B (en) Biometric pattern control method and related product
CN113643706B (en) Speech recognition method, device, electronic equipment and storage medium
CN114220034A (en) Image processing method, device, terminal and storage medium
CN110188678B (en) Vein identification method and related product
CN114283453A (en) Method and device for acquiring information of wandering animal, storage medium and electronic equipment
CN108418960B (en) Electronic device, operation control method and related product
CN114764363A (en) Prompting method, prompting device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210907
