CN109274812B - Method for controlling electronic device through ultrasonic gestures and related product - Google Patents


Info

Publication number
CN109274812B
CN109274812B · CN201810917635.6A
Authority
CN
China
Prior art keywords
electronic device
gesture
ultrasonic
moments
control command
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810917635.6A
Other languages
Chinese (zh)
Other versions
CN109274812A (en)
Inventor
米岚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN201810917635.6A priority Critical patent/CN109274812B/en
Publication of CN109274812A publication Critical patent/CN109274812A/en
Application granted granted Critical
Publication of CN109274812B publication Critical patent/CN109274812B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a method for controlling an electronic device through ultrasonic gestures, wherein the electronic device comprises an ultrasonic sensor, which is a sensor with signal transmitting and receiving functions. The method includes: transmitting a first ultrasonic signal through the ultrasonic sensor to generate a sound field; acquiring a second ultrasonic signal through the ultrasonic sensor, wherein the second ultrasonic signal is all or part of the echo of the first ultrasonic signal; and determining a gesture according to the second ultrasonic signal and controlling the electronic device according to the gesture. The technical scheme provided by the application improves the user experience.

Description

Method for controlling electronic device through ultrasonic gestures and related product
Technical Field
The present application relates to the field of communications and terminals, and in particular, to a method for controlling an electronic device through ultrasonic gestures and a related product.
Background
In the prior art, mobile terminals (such as mobile phones, tablet computers, etc.) have become the electronic devices preferred and most frequently used by users. Existing mobile terminals are controlled through physical keys on the frame; these keys are costly, have a limited service life, and degrade the user experience.
Disclosure of Invention
The embodiment of the application provides a method for controlling an electronic device through an ultrasonic gesture and a related product, which can realize the function of a frame key in a non-contact mode through the ultrasonic gesture and improve the user experience.
In a first aspect, an embodiment of the present application provides an electronic device, including: the ultrasonic sensor is a sensor with a signal transmitting and receiving function;
the ultrasonic sensor is used for transmitting a first ultrasonic signal to generate a sound field;
the ultrasonic sensor is further used for acquiring a second ultrasonic signal, wherein the second ultrasonic signal is all or part of echo signals of the first ultrasonic signal;
and the processor is used for determining a gesture according to the second ultrasonic signal and controlling the electronic device according to the gesture.
In a second aspect, a method of ultrasonic gesture control of an electronic device is provided, the electronic device comprising: the ultrasonic sensor is a sensor with a signal transmitting and receiving function;
transmitting a first ultrasonic signal through the ultrasonic sensor to generate a sound field;
acquiring a second ultrasonic signal through the ultrasonic sensor, wherein the second ultrasonic signal is all or part of echo signals of the first ultrasonic signal;
and determining a gesture according to the second ultrasonic signal, and controlling the electronic device according to the gesture.
In a third aspect, a computer-readable storage medium is provided, which stores a computer program for electronic data exchange, wherein the computer program causes a computer to execute the method provided in the second aspect.
In a fourth aspect, there is provided a computer program product comprising a non-transitory computer readable storage medium having a computer program stored thereon, the computer program being operable to cause a computer to perform the method provided by the second aspect.
The embodiment of the application has the following beneficial effects:
It can be seen that, in the electronic device provided by the present application, the ultrasonic sensor transmits a first ultrasonic signal to generate a sound field and acquires a second ultrasonic signal; a gesture is determined from the second ultrasonic signal, and a control command corresponding to the gesture is then generated to control the electronic device, so that the function of a frame key is realized in a non-contact manner.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Fig. 2 is a schematic view of an electronic device disclosed in an embodiment of the present application.
Fig. 3 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Fig. 4a is a schematic diagram of an ultrasonic wave transmission reflection provided in an embodiment of the present application.
Fig. 4b is a schematic diagram of an ultrasonic wave emitting position provided by the embodiment of the present application.
Fig. 5a is a schematic structural diagram of an input matrix according to an embodiment of the present application.
Fig. 5b is a schematic structural diagram of inputting three-dimensional data according to an embodiment of the present application.
Fig. 6 is a flowchart illustrating a method for controlling an electronic device through ultrasonic gestures according to an embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of a mobile phone disclosed in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In the electronic device provided in the first aspect, the processor is further configured to acquire a first application program currently running in the electronic device, generate a control command of the gesture in the first application program according to the gesture, and execute the control command on the first application program.
In the electronic device provided in the first aspect, the processor is specifically configured to: determine a first result corresponding to the first application program after the first application program executes the control command; collect at least one parameter associated with the first application program; determine whether the first result conflicts with the at least one parameter; if a conflict is determined, suspend execution of the control command, send a prompt message, receive a response message to the prompt message, and execute the control command only after execution is confirmed according to the response message; and if no conflict is determined, execute the control command.
In the electronic device provided in the first aspect, the processor is further configured to unlock the electronic device according to the gesture if the electronic device is in a screen lock state;
or the processor is further configured to control the electronic device according to the gesture after the electronic device is unlocked, if the electronic device is in a screen-locked state and the command corresponding to the gesture is a non-unlocking command.
In the electronic device provided in the first aspect, the processor is specifically configured to control the electronic device to enter a login interface of a first game according to the gesture, acquire a picture, perform face recognition on the picture to determine a first identity, acquire an account and a password of the first game according to the first identity, and use the account and the password to realize login of the first game.
In a method provided in a second aspect, the controlling the electronic device according to the gesture specifically includes:
and acquiring a first application program currently operated by the electronic device, generating a control command of the gesture in the first application program according to the gesture, and executing the control command on the first application program.
In the method provided in the second aspect, the executing the control command on the first application specifically includes:
determining a first result corresponding to the first application program after the first application program executes the control command; collecting at least one parameter associated with the first application program; judging whether the first result conflicts with the at least one parameter; if so, suspending execution of the control command, sending a prompt message, receiving a response message to the prompt message, and executing the control command only after execution is confirmed according to the response message; and if no conflict is determined, executing the control command.
In a method provided in a second aspect, the controlling the electronic device according to the gesture specifically includes:
if the electronic device is in a screen locking state, unlocking the electronic device according to the gesture;
or if the electronic device is in a screen locking state and the command corresponding to the gesture is a non-unlocking command, unlocking the electronic device and then controlling the electronic device according to the gesture.
In a method provided in a second aspect, the controlling the electronic device according to the gesture specifically includes:
controlling the electronic device to enter a login interface of a first game according to the gesture, acquiring a picture, carrying out face recognition on the picture to determine a first identity, acquiring an account and a password of the first game according to the first identity, and using the account and the password to realize login of the first game.
In a method provided in the second aspect, the transmitting, by the ultrasonic sensor, a first ultrasonic signal comprises: transmitting, by the ultrasonic sensor, a first ultrasonic signal at a plurality of transmission moments, each at a corresponding frequency; the acquiring of the second ultrasonic signal by the ultrasonic sensor comprises: receiving, by the ultrasonic sensor, a second ultrasonic signal at a plurality of reception moments; and the determining a gesture according to the second ultrasonic signal specifically includes:
the method comprises the steps of obtaining a plurality of emission moments of a first ultrasonic signal and a plurality of frequency emission moments corresponding to the emission moments, a plurality of receiving moments of a second ultrasonic signal and a plurality of received signal strengths, enabling the emission moments, the receiving moments and the received signal strengths to form input data, and inputting the input data into a preset neural network model to calculate and determine gestures.
In the method provided in the second aspect, the forming the plurality of transmission time instants, the plurality of reception time instants, and the plurality of received signal strengths into input data specifically includes:
the method comprises the steps of obtaining the type of sample input data in a training sample of a preset neural network model and the arrangement rule of the sample input data, if the type is matrix data, forming an input matrix by a plurality of transmitting moments, a plurality of receiving moments and a plurality of received signal strengths according to the arrangement rule, and if the type is a three-dimensional data block, forming an input three-dimensional data block by a plurality of transmitting moments, a plurality of receiving moments and a plurality of reflected signal strengths according to the arrangement rule.
In the method provided in the second aspect, the forming the plurality of transmission time instants, the plurality of reception time instants, and the plurality of received signal strengths into input data specifically includes:
obtaining the type of sample input data in a training sample of the preset neural network model and the arrangement rule of the sample input data; if the type is matrix data of size [H0][W0], determining the total number Y of transmission moments, reception moments, and received signal strengths; if Y < H0*W0, calculating the number n of values to insert (the original gives this formula only as an equation image) and executing an insertion process to obtain padded data, where the insertion process specifically includes: inserting n transmission moments into the plurality of transmission moments, inserting n reception moments into the plurality of reception moments, and inserting n received signal strengths into the plurality of received signal strengths; and forming the padded data into an input matrix of size [H0][W0] according to the arrangement rule, where H0 is the height of the matrix and W0 is the width of the matrix.
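Since the padding formula appears only as an image in the original, the sketch below assumes one plausible reading: n = ceil((H0*W0 − Y) / 3) values are appended to each of the three lists so that the combined data fills the [H0][W0] matrix. Both the formula and the zero fill value are assumptions.

```python
import math

def pad_measurements(tx, rx, strengths, h0, w0, fill=0.0):
    """Pad the three measurement lists to fill an h0 x w0 input matrix.

    Y is the combined length of the three lists; when Y < h0*w0 we
    append n = ceil((h0*w0 - Y) / 3) fill values to each list.
    """
    y = len(tx) + len(rx) + len(strengths)
    if y >= h0 * w0:
        return tx, rx, strengths
    n = math.ceil((h0 * w0 - y) / 3)
    pad = [fill] * n
    return tx + pad, rx + pad, strengths + pad
```

Any deterministic padding works as long as the same rule was applied when building the training samples.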
In the method provided by the second aspect, the computationally determining the gesture by inputting the input data into the preset neural network model specifically includes:
and inputting the input data into a preset neural network model to execute multilayer forward operation to obtain a forward operation result, and determining the gesture according to the forward operation result.
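The multilayer forward operation can be sketched as below. This is a minimal fully connected pass on a flat input vector; the patent does not fix the architecture, so the layer sizes, the ReLU activation, and the argmax readout are illustrative assumptions.

```python
def forward(x, layers):
    """Multilayer forward operation; layers is a list of (weights, biases).

    Each weights entry is a list of per-neuron weight rows. Hidden layers
    use ReLU; the forward operation result is the index of the
    highest-scoring gesture class.
    """
    for idx, (weights, biases) in enumerate(layers):
        out = []
        for neuron_w, b in zip(weights, biases):
            v = sum(wi * xi for wi, xi in zip(neuron_w, x)) + b
            if idx < len(layers) - 1:
                v = max(0.0, v)  # ReLU on hidden layers only
            out.append(v)
        x = out
    return max(range(len(x)), key=lambda i: x[i])
```

In practice the weights would come from training the preset model on labelled gesture samples.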
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 includes: a housing 110, a circuit board 120, a battery 130 (optional), a cover plate 140, a touch display screen 150, and an ultrasonic sensor 170. The circuit board 120, the ultrasonic sensor 170, the battery 130, and the cover plate 140 are arranged on the housing 110, and a circuit connected to the touch display screen 150 is further arranged on the circuit board 120. The circuit board 120 may further include an application processor (AP) 190.
The ultrasonic sensor has functions of transmitting and receiving ultrasonic signals.
The touch Display screen may be a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), a Light Emitting Diode (LED) Display screen, an Organic Light Emitting Diode (OLED) Display screen, or the like.
Referring to fig. 2, fig. 2 is a schematic structural diagram of another electronic device disclosed in an embodiment of the present application. The electronic device 200 includes a storage and processing circuit 210, and a communication circuit 220 and an audio component 240 connected to the storage and processing circuit 210; in some specific electronic devices 200, a display component 230 or a touch component may be further provided.
The electronic device 200 may include control circuitry that may include storage and processing circuitry 210. The storage and processing circuit 210 may be a memory, such as a hard disk drive memory, a non-volatile memory (e.g., a flash memory or other electronically programmable read-only memory used to form a solid state drive, etc.), a volatile memory (e.g., a static or dynamic random access memory, etc.), etc., and the embodiments of the present application are not limited thereto. Processing circuitry in the storage and processing circuitry 210 may be used to control the operation of the electronic device 200. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuitry 210 may be used to run software in the electronic device 200, such as Voice Over Internet Protocol (VOIP) telephone call applications, simultaneous interpretation functions, media playing applications, operating system functions, and the like. Such software may be used to perform control operations such as, for example, camera-based image capture, ambient light measurement based on an ambient light sensor, proximity sensor measurement based on a proximity sensor, information display functions implemented based on a status indicator such as a status indicator light of a light emitting diode, touch event detection based on a touch sensor, operations associated with performing wireless communication functions, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in the electronic device 200, to name a few.
The electronic device 200 may also include input-output circuitry 250. The input-output circuit 250 may be used to enable the electronic device 200 to input and output data, i.e., to allow the electronic device 200 to receive data from an external device and also to allow the electronic device 200 to output data from the electronic device 200 to an external device. The input-output circuit 250 may further include a sensor 270. The sensors 270 may include ambient light sensors, optical and capacitive based proximity sensors, touch sensors (e.g., optical based touch sensors and/or capacitive touch sensors, where the touch sensors may be part of a touch display screen or may be used independently as a touch sensor structure), acceleration sensors, and other sensors, among others.
The input-output circuitry 250 may also include a touch sensor array (i.e., the display 230 may be a touch display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The electronic device 200 may also include an audio component 240. The audio component 240 may be used to provide audio input and output functionality for the electronic device 200. Audio components 240 in electronic device 200 may include speakers, microphones, buzzers, tone generators, and other components for generating and detecting sound, such as ultrasonic sensors.
The communication circuit 220 may be used to provide the electronic device 200 with the capability to communicate with external devices. The communication circuit 220 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 220 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, the wireless Communication circuitry in Communication circuitry 220 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving Near Field coupled electromagnetic signals. For example, the communication circuit 220 may include a near field communication antenna and a near field communication transceiver. The communications circuitry 220 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuit and antenna, and the like.
The electronic device 200 may further include a battery, power management circuitry, and other input-output units 260. The input-output unit 260 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes or other status indicators, and the like.
A user may enter commands through the input-output circuitry 250 to control the operation of the electronic device 200, and may use the output data of the input-output circuitry 250 to enable receipt of status information and other outputs from the electronic device 200.
The electronic device 200 may further include an ultrasonic sensor, which may be integrated with the audio component 240, such as the audio component 240 including a microphone, which may integrate the functions of the ultrasonic sensor for transmitting and receiving ultrasonic signals.
Referring to fig. 3, fig. 3 provides an electronic device 300, which includes a processor 301 connected to an ultrasonic sensor 302. The connection between the processor 301 and the ultrasonic sensor 302 includes but is not limited to a bus; other modes, such as a single-wire connection, may also be adopted.
An ultrasonic sensor 302 for emitting a first ultrasonic signal to generate a sound field;
the ultrasonic sensor 302 is further configured to acquire a second ultrasonic signal, where the second ultrasonic signal is all or part of an echo signal of the first ultrasonic signal;
the processor 301 is configured to determine a gesture according to the second ultrasonic signal, and control the electronic apparatus according to the gesture.
In the above electronic device, the ultrasonic sensor transmits the first ultrasonic signal to generate a sound field and also acquires the second ultrasonic signal; the gesture is determined from the second ultrasonic signal, and the control command corresponding to the gesture is then generated to control the electronic device.
Optionally, the processor 301 is further configured to obtain a first application currently running on the electronic device, generate a control command of the gesture in the first application according to the gesture, and execute the control command on the first application.
This is provided because the same gesture may correspond to different control commands in different applications; for example, a first gesture in an audio application may correspond to a previous-track or next-track control command, while the same gesture in a video application may correspond to a fast-forward or rewind control command. Of course, in practical applications, the present application does not limit the specific correspondence between gestures and application programs.
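The per-application lookup described above can be sketched as a simple table. The application names, gesture names, and command strings here are hypothetical; the patent only requires that the same gesture may resolve to different commands in different applications.

```python
# Hypothetical (application, gesture) -> control-command table.
GESTURE_COMMANDS = {
    ("audio_player", "swipe_right"): "next_track",
    ("video_player", "swipe_right"): "fast_forward",
}

def command_for(app, gesture):
    """Resolve a recognized gesture into the current app's control command."""
    return GESTURE_COMMANDS.get((app, gesture))
```

The processor would first query the currently running first application, then resolve the gesture through such a table before executing the command.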
Optionally, the executing the control command to the first application specifically includes:
a processor, configured to determine a first result corresponding to the first application after the first application executes the control command, collect at least one parameter associated with the first application, determine whether the first result conflicts with the at least one parameter, if yes, suspend execution of the control command, send a prompt message, receive a response message of the prompt message, and execute the control command after the execution of the control command is confirmed according to the response message; if no conflict is determined, the control command is executed.
According to the above technical solution, it can be determined whether the result of executing a control command conflicts with the device's current state, avoiding the impact of an accidental gesture on the user experience. For example, suppose the user is using a music application and the first control command is to close it; the at least one collected parameter includes a headset-wearing parameter, and if the headset is determined to still be worn, the first result (closing the music application) is determined to conflict with that parameter.
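The conflict check and prompt flow can be sketched as follows. The parameter names and the `confirm` callback (standing in for the prompt/response exchange) are illustrative, not from the patent.

```python
def execute_with_conflict_check(result, parameters, confirm):
    """Suspend a control command whose predicted result conflicts with a
    collected parameter; proceed only if the user confirms.

    Example conflict (from the description): closing the music
    application while the headset-wearing parameter says it is worn.
    """
    conflict = (result == "close_music_app"
                and parameters.get("headset_worn", False))
    if conflict and not confirm():   # prompt message / response message
        return "suspended"
    return "executed"
```

Here `confirm` would show the prompt message and return True only when the response message confirms execution.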
Optionally, the processor 301 is further configured to unlock the electronic device according to the gesture if the electronic device is locked;
or the electronic device is further used for controlling the electronic device according to the gesture after the electronic device is unlocked if the electronic device is in a screen locking state and the command corresponding to the gesture is a non-unlocking command.
The processor is specifically configured to control the electronic device to enter a login interface of a first game according to the gesture, acquire a picture, perform face recognition on the picture to determine a first identity corresponding to the picture, acquire an account and a password of the first game according to the first identity, and use the account and the password to log in to the first game.
According to the above technical solution, the game account can be logged in quickly through a gesture, and the face-recognition step improves security by preventing other users from logging in.
Optionally, the first ultrasonic signal may be a first ultrasonic signal transmitted by the ultrasonic sensor at a frequency corresponding to a plurality of transmission moments; the second ultrasonic signal includes: a second ultrasonic signal is received at a plurality of reception timings by the ultrasonic sensor.
The plurality of different frequencies may specifically be acoustic signals above 20 kHz; the present application limits neither the number of frequencies nor their specific values. In an optional embodiment, there may be 10 frequencies: 21 kHz, 22 kHz, 23 kHz, 24 kHz, 25 kHz, 26 kHz, 27 kHz, 28 kHz, 29 kHz, and 30 kHz.
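Generating the example frequency set can be sketched as below. The 96 kHz sampling rate is an assumption chosen so that tones up to 30 kHz stay below the Nyquist limit; the patent does not specify a sampling rate or waveform.

```python
import math

def tone(freq_hz, n_samples, sample_rate=96000):
    """One ultrasonic tone as a list of PCM samples (assumed sine wave)."""
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n_samples)]

FREQS_KHZ = range(21, 31)  # the example set: 21 kHz ... 30 kHz
signals = {f: tone(f * 1000, 256) for f in FREQS_KHZ}
```

Each sensor (or each transmission moment) would emit one of these distinct tones so that echoes can be attributed to their source frequency.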
The processor 301 is configured to obtain the multiple transmission moments of the first ultrasonic signal at multiple frequencies, the multiple reception moments of the second ultrasonic signal, and the multiple received signal strengths; combine them into input data; input the input data into the preset neural network model to calculate and determine the gesture; and generate the control command corresponding to the gesture.
The ultrasonic transmitter transmits ultrasonic signals with different frequencies at each transmitting moment in the electronic device, so that reflected ultrasonic signals received by the ultrasonic receiver are also of different frequencies, the transmitting moments, the receiving moments and the reflected signal intensities corresponding to the ultrasonic signals with different frequencies can form input data, the input data is input into a preset trained neural network model to be calculated, and then gestures of the ultrasonic waves can be obtained, and control commands corresponding to the gestures are generated to control the electronic device.
Referring to fig. 4a, fig. 4a is a schematic view of a second ultrasonic signal. The electronic device shown in fig. 4a is exemplified by a smart phone, whose plurality of ultrasonic sensors may be installed on two sides, specifically the left side and the right side of the smart phone.
The number of the ultrasonic sensors on each side may be multiple, each ultrasonic sensor transmits a first ultrasonic signal with one frequency, different ultrasonic sensors transmit ultrasonic signals with different frequencies, and the ultrasonic sensor may receive all or part of echo signals (i.e., second ultrasonic signals) of the first ultrasonic signals transmitted by the multiple ultrasonic sensors.
Referring to fig. 4b, fig. 4b is a schematic diagram of the reflecting object (a hand) at a position a, a position b and a position c. An ultrasonic wave is also a sound wave and shares its characteristics; for example, its speed is about 340 m/s. Because different positions of the reflecting object affect the reception moments and the strengths of the reflected signals, the transmission moments, reception moments and reflected signal strengths are combined into input data, a forward operation result is calculated from the preset neural network model, and the gesture can be determined according to the forward operation result. The following explains how the input data is composed and how the corresponding gesture is determined according to the forward operation result.
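Since the text gives the approximate speed of sound, a minimal sketch of how a transmit/receive moment pair maps to a reflector distance can illustrate why position affects reception time (the function name and example values are illustrative, not from the patent):

```python
SPEED_OF_SOUND = 340.0  # m/s, approximate speed of sound in air (from the text)

def reflector_distance(t_transmit, t_receive):
    """The echo travels to the reflecting object (the hand) and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND * (t_receive - t_transmit) / 2.0

# A 1 ms round trip corresponds to a hand about 17 cm from the device.
d = reflector_distance(0.0, 0.001)
```

A hand at positions a, b and c thus yields distinct reception moments, which is the information the input data carries into the model.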
Optionally, the implementation manner of forming the input data by the multiple transmitting moments, the multiple receiving moments, and the multiple reflected signal strengths may specifically be:
the processor 301 is specifically configured to obtain the type of the sample input data in the training samples of the preset neural network model and the arrangement rule of the sample input data; if the type is matrix data, form an input matrix from the plurality of transmission moments, the plurality of reception moments and the plurality of received signal strengths according to the arrangement rule; and if the type is a three-dimensional data block, form an input three-dimensional data block from the plurality of transmission moments, the plurality of reception moments and the plurality of received signal strengths according to the arrangement rule.
The above-mentioned input data is illustrated with a practical example in which the type of the input data is matrix data. The arrangement rule may be: if the number of transmission moments, reception moments and received signal strengths is not enough to fill a matrix, supplement zero elements until they form one. A schematic diagram of this supplementing is shown in fig. 5a; the final black boxes are the supplemented zero elements, and each box in fig. 5a represents one element of the matrix. The arrangement rule may instead be: reception moment - received signal strength - transmission moment; other arrangement rules are of course also possible.
When the type of the input data is a three-dimensional data block, the arrangement rule may likewise be: if the number of transmission moments, reception moments and received signal strengths is not enough to fill the three-dimensional data block, supplement zero elements until they form one. A schematic diagram of this supplementing is shown in fig. 5b; the last black boxes are the supplemented zero elements, and each box in fig. 5b represents one element of the three-dimensional data block.
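The zero-supplementing arrangement can be sketched as follows (the transmit → receive → strength order is one of the orders the text mentions; the helper name and example values are illustrative):

```python
def pad_to_matrix(tx_times, rx_times, strengths, h, w):
    """Concatenate the three sequences in one possible arrangement order
    (transmission moments, then reception moments, then received signal
    strengths), supplement zero elements until the flat length reaches
    h * w, then reshape into h rows of w columns."""
    flat = list(tx_times) + list(rx_times) + list(strengths)
    if len(flat) > h * w:
        raise ValueError("too many elements for a %d x %d matrix" % (h, w))
    flat += [0.0] * (h * w - len(flat))  # the 'black box' zero elements
    return [flat[i * w:(i + 1) * w] for i in range(h)]

m = pad_to_matrix([0.0, 0.1], [0.01, 0.11], [0.8, 0.7], 2, 4)
# m == [[0.0, 0.1, 0.01, 0.11], [0.8, 0.7, 0.0, 0.0]]
```

The two trailing zeros in the second row play the role of the black boxes in fig. 5a.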
In the training method of the neural network model, each of a plurality of sample input data is input into the neural network model for training, updating the weight data in the model; once all of the plurality of sample input data have been used to update the weight data, the model at that moment is a trained neural network model, and the weight data no longer change after training. The plurality of sample input data needs to include at least sample input data corresponding to each of the various gestures, with at least one separate sample input data per gesture. Because the weight data in the preset neural network model no longer change, the input data on which the preset neural network model performs the forward operation must be of the same type as the sample input data; if the types are inconsistent, the result of the operation executed by the neural network model may deviate considerably. Specifically, matrix-matrix multiplication and operations between three-dimensional data blocks are performed according to the positions of elements; if the types are inconsistent - for example, the input matrix shown in fig. 5a versus the input three-dimensional data shown in fig. 5b - then even when the same plurality of transmission moments, plurality of reception moments and plurality of received signal strengths are used to form both, the positions of most elements will not match. This positional mismatch leads to a large deviation in the calculated result, making the forward output result inaccurate, which in turn biases the gesture determined according to the forward output result. Forming the input data with the same type and the same arrangement rule avoids this inconsistency of positions and types and improves the accuracy of the forward output result.
Optionally, the processor 301 is specifically configured to obtain the type of the sample input data in the training samples of the preset neural network model and the arrangement rule of the sample input data; if the type is matrix data of size [H0][W0], determine the total number Y of the plurality of transmission moments, the plurality of reception moments and the plurality of received signal strengths; if Y < H0*W0, compute

n = ⌈(H0 × W0 − Y) / 3⌉

and execute a process of inserting n values to obtain the data after insertion processing. The process of inserting n values specifically includes: inserting n transmission moments among the plurality of transmission moments, inserting n reception moments among the plurality of reception moments, and inserting n received signal strengths among the plurality of received signal strengths. The data after insertion processing forms an input matrix according to the arrangement rule, the size of the input matrix being [H0][W0], where H0 is the height value of the matrix and W0 is the width value of the matrix.
The n transmission moments may be inserted in multiple ways. In one alternative, the n transmission moments are appended after the plurality of transmission moments, with a set time interval between adjacent inserted moments; the n reception moments are appended after the plurality of reception moments, likewise separated by a set time interval; and the n received signal strengths are appended after the plurality of received signal strengths. Each of the n inserted strengths may be the average value of the plurality of received signal strengths; alternatively, the n inserted strengths may be discretely distributed values that lie within a set range and whose average equals the average of the plurality of received signal strengths.
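A minimal sketch of this insertion scheme, under the assumption that n is chosen so the three sequences together reach h·w elements, i.e. n = ⌈(h·w − Y)/3⌉ (the exact formula in the source is an image, so this n is a reconstruction); inserted moments continue at a fixed interval and inserted strengths use the mean of the real strengths:

```python
import math

def insert_values(tx, rx, rssi, h, w, dt=0.01):
    """Append n extra transmission/reception moments at a set time
    interval dt after the last real moment, and n strengths equal to
    the average of the real received signal strengths, so that the
    three sequences together can fill an h x w matrix."""
    y = len(tx) + len(rx) + len(rssi)
    n = math.ceil((h * w - y) / 3)  # assumed form of the padding count
    tx2 = tx + [tx[-1] + dt * (i + 1) for i in range(n)]
    rx2 = rx + [rx[-1] + dt * (i + 1) for i in range(n)]
    mean = sum(rssi) / len(rssi)
    rssi2 = rssi + [mean] * n
    return tx2, rx2, rssi2

tx2, rx2, rssi2 = insert_values([0.0, 0.1], [0.01, 0.11], [0.8, 0.6], 2, 4)
```

Because the inserted values continue the real timing pattern and match the real average strength, they resemble genuinely collected data more closely than plain zeros would.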
This insertion mode imitates the originally collected transmission moments, reception moments and received signal strength values as closely as possible, which improves the fidelity of the input matrix data and thereby the accuracy of the forward operation result.
Optionally, the above inputting the input data into the preset neural network model to calculate and determine the gesture specifically may include: and the processor is specifically used for inputting the input data into a preset neural network model to execute multilayer forward operation to obtain a forward operation result, and determining the gesture according to the forward operation result.
Optionally, determining the gesture according to the forward operation result may specifically include: the processor is specifically configured to extract, from the forward operation result, X elements whose element values are greater than a set threshold and the X positions corresponding to those elements, and to determine that the forward operation result is a first gesture if more than X/2 of the X positions correspond to the first gesture.
Optionally, the processor is specifically configured to extract, from the forward operation result, X elements whose element values are greater than a set threshold and the X positions corresponding to those elements, determine the plurality of gestures corresponding to the X positions, select from them the second gesture that occupies the largest number of the X positions, and determine that the forward operation result is the second gesture.
It should be noted that the gesture corresponding to each element value in the forward operation result may be determined during training. Because the training sample input data is labeled sample data - it is known which gesture each training sample belongs to - inputting a training sample into the preset neural network model yields a forward operation result, and the gesture at each position whose element exceeds the set threshold is the gesture corresponding to that position. By computing over a plurality of sample input data in the same way, the gestures corresponding to the elements of all forward operation results can be obtained.
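The threshold-and-majority rule above can be sketched as follows (the position-to-gesture table, threshold and gesture names are illustrative assumptions):

```python
def decide_gesture(output, position_labels, threshold):
    """Extract the positions whose element value exceeds the threshold,
    map each position to the gesture fixed for it during training, and
    return the gesture that owns the most of those positions."""
    strong = [i for i, v in enumerate(output) if v > threshold]
    if not strong:
        return None  # no element is confident enough
    counts = {}
    for i in strong:
        counts[position_labels[i]] = counts.get(position_labels[i], 0) + 1
    return max(counts, key=counts.get)

# Three of four positions exceed 0.5; two of them are labeled "swipe".
g = decide_gesture([0.9, 0.8, 0.2, 0.95], ["swipe", "swipe", "tap", "tap"], 0.5)
```

With X = 3 strong positions, "swipe" occupies 2 > X/2 of them, so it is selected.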
Referring to fig. 6, fig. 6 is a method for controlling an electronic device through ultrasonic gestures according to the present application, where the method is executed by the electronic device, and the electronic device may be configured as shown in fig. 2 or fig. 3, and the method includes the following steps:
step S601, transmitting a first ultrasonic signal through the ultrasonic sensor to generate a sound field;
step S602, acquiring a second ultrasonic signal through the ultrasonic sensor, wherein the second ultrasonic signal is all or part of echo signals of the first ultrasonic signal;
step S603, determining a gesture according to the second ultrasonic signal, and controlling the electronic device according to the gesture.
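Steps S601 to S603 can be sketched as a single loop body (the class and callback names are hypothetical placeholders, not from the source):

```python
class UltrasonicSensor:
    """Hypothetical stand-in for the device's ultrasonic sensor."""
    def transmit(self):
        pass  # S601: emit the first ultrasonic signal to generate a sound field
    def receive(self):
        return [0.9, 0.1]  # S602: echo features of the second signal (placeholder)

def control_loop(sensor, classify, apply_command):
    """One pass of the method: transmit, acquire the echo, determine
    the gesture, then control the device according to it."""
    sensor.transmit()               # S601
    echo = sensor.receive()         # S602
    gesture = classify(echo)        # S603: determine the gesture...
    apply_command(gesture)          # ...and control the electronic device
    return gesture

result = control_loop(UltrasonicSensor(),
                      lambda echo: "swipe" if echo[0] > 0.5 else "tap",
                      lambda gesture: None)
```

In the patent the classifier role is played by the preset neural network model described earlier.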
In the electronic device above, the ultrasonic sensor transmits the first ultrasonic signal to generate a sound field and also acquires the second ultrasonic signal; the gesture is determined according to the second ultrasonic signal, and a control command corresponding to the gesture is then generated to control the electronic device.
The executing the control command on the first application program specifically includes:
determining a first result that the first application program would produce after executing the control command, collecting at least one parameter associated with the first application program, and determining whether the first result conflicts with the at least one parameter; if so, suspending execution of the control command, sending a prompt message, receiving a response message to the prompt message, and executing the control command once the response message confirms its execution; if no conflict is determined, executing the control command.
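A sketch of this conflict check, with hypothetical callbacks standing in for the prompt and execution logic (the parameter shape and all names are assumptions for illustration):

```python
def execute_with_conflict_check(predicted_result, parameters, execute, prompt_user):
    """If the first result the command would produce conflicts with any
    collected parameter, suspend the command and prompt the user;
    execute only on confirmation. Otherwise execute directly."""
    conflicting = [p for p in parameters if p.get("forbids") == predicted_result]
    if conflicting:
        if prompt_user("The command conflicts with the application state. Execute anyway?"):
            return execute()
        return None  # execution stays suspended
    return execute()

# No conflict: the command runs directly without prompting.
ok = execute_with_conflict_check("mute", [{"forbids": "close"}],
                                 lambda: "done", lambda msg: False)
```

The prompt path only runs when a conflict is found, matching the suspend-then-confirm flow described above.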
Optionally, the controlling the electronic device according to the gesture specifically includes:
and acquiring a first application program currently operated by the electronic device, generating a control command of the gesture in the first application program according to the gesture, and executing the control command on the first application program.
Optionally, the controlling the electronic device according to the gesture specifically includes:
controlling the electronic device to enter a login interface of a first game according to the gesture, acquiring a picture, performing face recognition on the picture to determine a first identity corresponding to the picture, acquiring an account and a password of the first game according to the first identity, and using the account and the password to log in to the first game.
Optionally, the controlling the electronic device according to the gesture specifically includes:
if the electronic device is in a screen locking state, unlocking the electronic device according to the gesture;
or if the electronic device is in a screen locking state and the command corresponding to the gesture is a non-unlocking command, unlocking the electronic device and then controlling the electronic device according to the gesture.
Optionally, the transmitting the first ultrasonic signal by the ultrasonic sensor includes: transmitting, by the ultrasonic sensor, a first ultrasonic signal at a frequency corresponding to a plurality of transmission timings; the acquiring of the second ultrasonic signal by the ultrasonic sensor comprises: receiving, by the ultrasonic sensor, a second ultrasonic signal at a plurality of reception timings; the determining a gesture according to the second ultrasonic signal specifically includes:
the method comprises the steps of obtaining a plurality of transmitting moments of a first ultrasonic signal, a plurality of frequencies corresponding to the transmitting moments, a plurality of receiving moments of a second ultrasonic signal and a plurality of received signal strengths, enabling the transmitting moments, the receiving moments and the received signal strengths to form input data, and inputting the input data into a preset neural network model to calculate and determine gestures.
Optionally, the forming the multiple transmitting time instants, the multiple receiving time instants, and the multiple received signal strengths into input data specifically includes:
the method comprises the steps of obtaining the type of sample input data in a training sample of a preset neural network model and the arrangement rule of the sample input data, if the type is matrix data, forming an input matrix by a plurality of transmitting moments, a plurality of receiving moments and a plurality of received signal strengths according to the arrangement rule, and if the type is a three-dimensional data block, forming an input three-dimensional data block by a plurality of transmitting moments, a plurality of receiving moments and a plurality of reflected signal strengths according to the arrangement rule.
Optionally, the forming the multiple transmitting time instants, the multiple receiving time instants, and the multiple received signal strengths into input data specifically includes:
obtaining the type of the sample input data in the training samples of the preset neural network model and the arrangement rule of the sample input data; if the type is matrix data of size [H0][W0], determining the total number Y of the plurality of transmission moments, the plurality of reception moments and the plurality of received signal strengths; if Y < H0*W0, computing

n = ⌈(H0 × W0 − Y) / 3⌉;

executing a process of inserting n values to obtain the data after insertion processing, where the process of inserting n values specifically includes: inserting n transmission moments among the plurality of transmission moments, inserting n reception moments among the plurality of reception moments, and inserting n received signal strengths among the plurality of received signal strengths; and forming an input matrix from the data after insertion processing according to the arrangement rule, the size of the input matrix being [H0][W0], where H0 is the height value of the matrix and W0 is the width value of the matrix.
Optionally, the inputting the input data into a preset neural network model to calculate and determine the gesture specifically includes:
and inputting the input data into a preset neural network model to execute multilayer forward operation to obtain a forward operation result, and determining the gesture according to the forward operation result.
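A minimal sketch of a multilayer forward operation (the layer sizes, weights and activation are illustrative stand-ins for the trained model's fixed weight data, not the patent's actual network):

```python
def forward(x, layers):
    """Run the input through each layer in turn; each layer is a
    (weights, biases) pair applied as a linear map followed by ReLU.
    The final output vector is the forward operation result."""
    for weights, biases in layers:
        x = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# Two inputs -> two hidden units -> one output score (toy weights).
layers = [
    ([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]),
    ([[1.0, 1.0]], [0.0]),
]
score = forward([0.2, 0.8], layers)
```

The gesture is then read off the result with the threshold-and-position rule described in the earlier embodiments.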
Fig. 7 is a block diagram illustrating a partial structure of a mobile phone related to a mobile terminal according to an embodiment of the present disclosure. Referring to fig. 7, the handset includes: radio Frequency (RF) circuit 910, memory 920, input/output unit 930, sensor 950, audio collector 960, Wireless Fidelity (WiFi) module 970, application processor AP980, touch display 933, power supply 990, and the like. Those skilled in the art will appreciate that the handset configuration shown in fig. 7 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 7:
the input and output unit 930 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the handset. Specifically, the input unit 930 may include a fingerprint recognition apparatus 931, a face recognition apparatus 936, an iris recognition apparatus 937, and other input devices 932. The other input devices 932 may include, but are not limited to, one or more of physical keys, function keys (e.g., volume control keys, switch keys), a trackball, a mouse, a joystick, and the like.
the application processor AP980 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions and processes of the mobile phone by running or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby integrally monitoring the mobile phone. Optionally, AP980 may include one or more processing units; alternatively, the AP980 may integrate an application processor that handles primarily the operating system, user interface, and applications, etc., and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into the AP 980.
Further, the memory 920 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
RF circuitry 910 may be used for the reception and transmission of information. In general, the RF circuit 910 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The handset may also include at least one sensor 950, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the touch display screen according to the brightness of ambient light, and the proximity sensor may turn off the touch display screen and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio collector 960, speaker 961 and microphone 962 may provide an audio interface between the user and the handset. The audio collector 960 can transmit the electrical signal converted from received audio data to the speaker 961, which converts it into a sound signal for playback; conversely, the microphone 962 converts a collected sound signal into an electrical signal, which the audio collector 960 receives and converts into audio data. The audio data is then processed by the AP980 and either sent to another handset through the RF circuit 910 or output to the memory 920 for further processing.
The above-mentioned mobile phone may further include: an ultrasonic sensor, which may be provided integrally with the speaker 961. That is, the speaker 961 can transmit or receive an ultrasonic signal.
A speaker 961 for emitting a first ultrasonic signal to generate a sound field;
a microphone 962, configured to acquire a second ultrasonic signal, where the second ultrasonic signal is all or part of an echo signal of the first ultrasonic signal;
and the application processor AP980 is used for determining a gesture according to the second ultrasonic signal and controlling the electronic device according to the gesture.
The application processor AP980 is also used for unlocking the electronic device according to the gesture if the electronic device is in a screen locking state;
or the electronic device is further used for controlling the electronic device according to the gesture after the electronic device is unlocked if the electronic device is in a screen locking state and the command corresponding to the gesture is a non-unlocking command.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 970, and provides wireless broadband Internet access for the user. Although fig. 7 shows the WiFi module 970, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope of not changing the essence of the application.
The handset also includes a power supply 990 (e.g., a battery) for supplying power to various components, and optionally, the power supply may be logically connected to the AP980 via a power management system, so that functions of managing charging, discharging, and power consumption are implemented via the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, a light supplement device, a light sensor, and the like, which are not described herein again.
It can be seen that in the technical solution provided by the present application, the electronic device transmits a first ultrasonic signal to generate a sound field, acquires a second ultrasonic signal, determines a gesture according to the second ultrasonic signal, and generates the control command corresponding to the gesture to control the electronic device, thereby improving the user's experience.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods for controlling an electronic device by ultrasonic gestures as described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the methods of ultrasonic gesture control of an electronic device as recited in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application - in essence, the part contributing beyond the prior art, or all or part of the technical solution - may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, or a magnetic or optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (13)

1. An electronic device, comprising: the ultrasonic sensor is a sensor with a signal transmitting and receiving function;
the ultrasonic sensor is used for transmitting a first ultrasonic signal to generate a sound field, and comprises: transmitting, by the ultrasonic sensor, a first ultrasonic signal at a frequency corresponding to a plurality of transmission timings;
the ultrasonic sensor is further configured to acquire a second ultrasonic signal, and includes: receiving, by the ultrasonic sensor, a second ultrasonic signal at a plurality of reception timings; the second ultrasonic signal is all or part of echo signals of the first ultrasonic signal;
the processor is configured to obtain a plurality of transmission moments of the first ultrasonic signal, a plurality of reception moments of the second ultrasonic signal, and a plurality of received signal strengths, and to combine the plurality of transmission moments, the plurality of reception moments and the plurality of received signal strengths into input data, which comprises: obtaining the type of the sample input data in the training samples of a preset neural network model and the arrangement rule of the sample input data; if the type is matrix data of size [H0][W0], determining the total number Y of the plurality of transmission moments, the plurality of reception moments and the plurality of received signal strengths; if Y < H0*W0, computing

n = ⌈(H0 × W0 − Y) / 3⌉;

executing a process of inserting n values to obtain the data after insertion processing, wherein the process of inserting n values specifically includes: inserting n transmission moments among the plurality of transmission moments, inserting n reception moments among the plurality of reception moments, and inserting n received signal strengths among the plurality of received signal strengths; forming an input matrix from the data after insertion processing according to the arrangement rule, the size of the input matrix being [H0][W0], where H0 is the height value of the matrix and W0 is the width value of the matrix; inputting the input data into the preset neural network model to calculate and determine a gesture; and controlling the electronic device according to the gesture.
2. The electronic device of claim 1,
the processor is further configured to acquire a first application program currently running in the electronic device, generate a control command of the gesture in the first application program according to the gesture, and execute the control command on the first application program.
3. The electronic device of claim 2,
the processor is specifically configured to determine a first result corresponding to the first application after the first application executes the control command, collect at least one parameter associated with the first application, and judge whether the first result conflicts with the at least one parameter; if a conflict is determined, suspend execution of the control command, send a prompt message, receive a response message to the prompt message, and execute the control command after execution is confirmed according to the response message; if no conflict is determined, execute the control command.
4. The electronic device of claim 1,
the processor is further configured to unlock the electronic device according to the gesture if the electronic device is in a screen-locked state;
or the processor is further configured to, if the electronic device is in a screen-locked state and the command corresponding to the gesture is a non-unlocking command, unlock the electronic device and then control the electronic device according to the gesture.
5. The electronic device of claim 4,
the processor is specifically configured to control the electronic device to enter a login interface of a first game according to the gesture, capture a picture, perform face recognition on the picture to determine a first identity, acquire an account and a password of the first game according to the first identity, and log in to the first game using the account and the password.
6. A method for controlling an electronic device through ultrasonic gestures, wherein the electronic device comprises an ultrasonic sensor, the ultrasonic sensor being a sensor with both signal transmitting and receiving functions, and the method comprises:
transmitting, by the ultrasonic sensor, a first ultrasonic signal to generate a sound field, which includes: transmitting, by the ultrasonic sensor, the first ultrasonic signal at a set frequency at a plurality of transmission moments;
acquiring a second ultrasonic signal by the ultrasonic sensor, which includes: receiving, by the ultrasonic sensor, the second ultrasonic signal at a plurality of reception moments, the second ultrasonic signal being all or part of the echo signal of the first ultrasonic signal;
acquiring a plurality of transmission moments of the first ultrasonic signal, a plurality of reception moments of the second ultrasonic signal, and a plurality of received signal strengths, and combining the plurality of transmission moments, the plurality of reception moments, and the plurality of received signal strengths into input data, which includes: obtaining the type of the sample input data in a training sample of a preset neural network model and the arrangement rule of the sample input data; if the type is matrix data of size [H0][W0], determining the total number Y of the plurality of transmission moments, the plurality of reception moments, and the plurality of received signal strengths; if Y < H0 * W0, computing

n = H0 * W0 - Y;

executing an insertion process of n values to obtain insertion-processed data, where the insertion process specifically includes: inserting n transmission moments among the plurality of transmission moments, inserting n reception moments among the plurality of reception moments, and inserting n received signal strengths among the plurality of received signal strengths; and forming the insertion-processed data into an input matrix according to the arrangement rule, the size of the input matrix being [H0][W0], where H0 is the height value of the matrix and W0 is the width value of the matrix;
inputting the input data into a preset neural network model to calculate and determine a gesture;
and controlling the electronic device according to the gesture.
7. The method of claim 6, wherein the controlling the electronic device according to the gesture specifically comprises:
acquiring a first application program currently running on the electronic device, generating a control command of the gesture in the first application program according to the gesture, and executing the control command on the first application program.
8. The method according to claim 7, wherein the executing the control command on the first application specifically comprises:
determining a first result corresponding to the first application program after the first application program executes the control command, collecting at least one parameter associated with the first application program, and judging whether the first result conflicts with the at least one parameter; if a conflict is determined, suspending execution of the control command, sending a prompt message, receiving a response message to the prompt message, and executing the control command after execution is confirmed according to the response message; if no conflict is determined, executing the control command.
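A minimal sketch of the conflict-guarded execution described in claim 8, using an assumed volume-control scenario: the result the command would produce is compared with a parameter associated with the running application (here, a maximum volume), and execution pauses for user confirmation when they conflict. All names and the volume example are hypothetical, not the patent's own implementation.

```python
def apply_gesture_command(state, command, confirm):
    # First result: predict what the command would do to the app.
    predicted_volume = state["volume"] + (10 if command == "volume_up" else 0)
    max_volume = state["max_volume"]          # parameter associated with the app
    if predicted_volume > max_volume:          # first result conflicts
        # Suspend execution, send a prompt message, and wait for the response.
        if not confirm(f"{command} would exceed max volume; proceed?"):
            return state                       # execution stays suspended
    new_state = dict(state)
    new_state["volume"] = min(predicted_volume, max_volume)
    return new_state

s = {"volume": 95, "max_volume": 100}
declined = apply_gesture_command(s, "volume_up", lambda msg: False)
accepted = apply_gesture_command(s, "volume_up", lambda msg: True)
```

When the user declines, the state is untouched; when the user confirms, the command executes clamped to the associated parameter.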
9. The method of claim 6, wherein the controlling the electronic device according to the gesture specifically comprises:
if the electronic device is in a screen locking state, unlocking the electronic device according to the gesture;
or if the electronic device is in a screen locking state and the command corresponding to the gesture is a non-unlocking command, unlocking the electronic device and then controlling the electronic device according to the gesture.
10. The method of claim 8, wherein the controlling the electronic device according to the gesture specifically comprises:
controlling the electronic device to enter a login interface of a first game according to the gesture, capturing a picture, performing face recognition on the picture to determine a first identity, acquiring an account and a password of the first game according to the first identity, and logging in to the first game using the account and the password.
11. The method of claim 10, wherein the combining the plurality of transmit time instants, the plurality of receive time instants, and the plurality of received signal strengths into input data specifically comprises:
obtaining the type of the sample input data in a training sample of a preset neural network model and the arrangement rule of the sample input data; if the type is matrix data, forming the plurality of transmission moments, the plurality of reception moments, and the plurality of received signal strengths into an input matrix according to the arrangement rule; and if the type is a three-dimensional data block, forming the plurality of transmission moments, the plurality of reception moments, and the plurality of received signal strengths into an input three-dimensional data block according to the arrangement rule.
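A sketch of the two arrangement branches of claim 11. The claim does not fix the arrangement rule itself, so the rule assumed here (one row per measurement type for matrix data, one channel per type for the three-dimensional data block) is purely illustrative.

```python
def arrange_input(tx, rx, strengths, kind):
    # kind mirrors the sample-input-data type of the training sample.
    if kind == "matrix":
        # [3][N] matrix: one row per measurement type (assumed rule).
        return [list(tx), list(rx), list(strengths)]
    if kind == "block":
        # [3][1][N] three-dimensional data block: one channel per type.
        return [[list(tx)], [list(rx)], [list(strengths)]]
    raise ValueError(f"unknown sample-input type: {kind}")

m = arrange_input([1, 2], [3, 4], [5, 6], "matrix")
b = arrange_input([1, 2], [3, 4], [5, 6], "block")
```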
12. The method according to claim 10, wherein the inputting the input data into a preset neural network model calculation determination gesture specifically comprises:
inputting the input data into the preset neural network model to execute a multilayer forward operation to obtain a forward operation result, and determining the gesture according to the forward operation result.
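The multilayer forward operation of claim 12 can be illustrated with a toy fully connected network: each layer multiplies by a weight matrix and adds a bias, ReLU is applied between layers, and the final forward operation result is mapped to a gesture label by taking the highest score. The weights, labels, and the ReLU/argmax choices are assumptions for illustration, not the patent's trained model.

```python
def forward(x, layers):
    # layers: list of (weight_matrix, bias_vector) pairs.
    for i, (w, b) in enumerate(layers):
        x = [sum(wi * xi for wi, xi in zip(row, x)) + bi
             for row, bi in zip(w, b)]
        if i < len(layers) - 1:
            x = [max(0.0, v) for v in x]   # ReLU on hidden layers only
    return x                                # forward operation result

def determine_gesture(scores, labels):
    # Gesture determined from the forward operation result by argmax.
    return labels[max(range(len(scores)), key=scores.__getitem__)]

layers = [([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]),   # hidden layer
          ([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.1])]    # output layer
out = forward([2.0, 1.0], layers)
gesture = determine_gesture(out, ["swipe_left", "swipe_right"])
```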
13. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 6-12.
CN201810917635.6A 2018-08-13 2018-08-13 Method for controlling electronic device through ultrasonic gestures and related product Active CN109274812B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810917635.6A CN109274812B (en) 2018-08-13 2018-08-13 Method for controlling electronic device through ultrasonic gestures and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810917635.6A CN109274812B (en) 2018-08-13 2018-08-13 Method for controlling electronic device through ultrasonic gestures and related product

Publications (2)

Publication Number Publication Date
CN109274812A CN109274812A (en) 2019-01-25
CN109274812B true CN109274812B (en) 2021-02-02

Family

ID=65153520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810917635.6A Active CN109274812B (en) 2018-08-13 2018-08-13 Method for controlling electronic device through ultrasonic gestures and related product

Country Status (1)

Country Link
CN (1) CN109274812B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110032276A (en) * 2019-03-07 2019-07-19 永德利硅橡胶科技(深圳)有限公司 Quick start voice translation method and Related product
CN110297542B (en) * 2019-06-28 2022-10-18 Oppo广东移动通信有限公司 Parameter adjusting method and related equipment
CN110737387B (en) * 2019-10-09 2021-06-04 Oppo广东移动通信有限公司 Page display method and related equipment
CN112865826A (en) * 2019-11-28 2021-05-28 华为技术有限公司 Touch operation identification method and wearable device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103064606A (en) * 2012-12-24 2013-04-24 天津三星光电子有限公司 Screen unlocking method for mobile terminal
CN104898844A (en) * 2015-01-23 2015-09-09 瑞声光电科技(常州)有限公司 Gesture recognition and control device based on ultrasonic positioning and gesture recognition and control method based on ultrasonic positioning
CN106203380A (en) * 2016-07-20 2016-12-07 中国科学院计算技术研究所 Ultrasound wave gesture identification method and system
CN106303599A (en) * 2016-08-11 2017-01-04 腾讯科技(深圳)有限公司 A kind of information processing method, system and server

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3324204B1 (en) * 2016-11-21 2020-12-23 HTC Corporation Body posture detection system, suit and method


Also Published As

Publication number Publication date
CN109274812A (en) 2019-01-25

Similar Documents

Publication Publication Date Title
CN109271121B (en) Application display method and mobile terminal
CN109274812B (en) Method for controlling electronic device through ultrasonic gestures and related product
TWI679552B (en) Unlocking control method and mobile terminal
US11227042B2 (en) Screen unlocking method and apparatus, and storage medium
CN108900231B (en) Dynamic antenna adjustment method and related product
CN109240551B (en) Method for controlling electronic device by using gestures and related product
CN103389863B (en) A kind of display control method and device
CN104375886B (en) Information processing method, device and electronic equipment
CN106445596B (en) Method and device for managing setting items
CN109150221B (en) Master-slave switching method for wearable equipment and related product
CN109739394B (en) SAR value processing method and mobile terminal
CN108833683B (en) Dynamic antenna adjustment implementation method and related product
CN107317918B (en) Parameter setting method and related product
CN106209608B (en) Method and device for distinguishing and synchronizing chat information
CN108834013B (en) Wearable equipment electric quantity balancing method and related product
CN111246061B (en) Mobile terminal, method for detecting shooting mode and storage medium
CN108989546B (en) Approach detection method of electronic device and related product
CN110688051B (en) Screen recording operation method and device, computer readable storage medium and terminal
CN107194223B (en) Fingerprint identification area display method and related product
CN108810261B (en) Antenna switching method in call and related product
CN108388459B (en) Message display processing method and mobile terminal
CN108196663B (en) Face recognition method and mobile terminal
CN110277097B (en) Data processing method and related equipment
CN111372003A (en) Camera switching method and device and terminal
CN110891262A (en) Bluetooth pairing method, system and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant