CN111857369B - Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal - Google Patents


Publication number
CN111857369B
CN111857369B (application CN202010727791.3A)
Authority
CN
China
Prior art keywords
proximity sensor
mobile terminal
calibrating
gesture
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010727791.3A
Other languages
Chinese (zh)
Other versions
CN111857369A (en)
Inventor
丛国华
韩冰天
李腾飞
Current Assignee
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN202010727791.3A
Publication of CN111857369A
Application granted
Publication of CN111857369B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • G01S2007/4975Means for monitoring or calibrating of sensor obstruction by, e.g. dirt- or ice-coating, e.g. by reflection measurement on front-screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present disclosure relates to the field of computer technology, and in particular to a method, an apparatus, a terminal, and a storage medium for calibrating a proximity sensor of a mobile terminal. The method for calibrating the proximity sensor of the mobile terminal comprises the following steps: acquiring sensor information detected by a motion sensor of the mobile terminal; determining the posture of the mobile terminal according to the sensor information; acquiring detection information of the proximity sensor when the posture is determined to meet a preset posture condition; and calibrating the background noise value of the proximity sensor according to the detection information. According to this method, the background noise value of the proximity sensor is calibrated according to the current detection information of the proximity sensor only when the mobile terminal is judged to be in a preset posture in which the probability of occlusion is low. The background noise value can therefore be calibrated accurately during ordinary use of the mobile terminal, improving the detection accuracy of the proximity sensor.

Description

Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal
Technical Field
The present disclosure relates to the field of computer technology, and in particular, to a method, an apparatus, a terminal, and a storage medium for calibrating a proximity sensor of a mobile terminal.
Background
A proximity sensor is used for short-range distance measurement: it continuously emits infrared light, and when a nearby object blocks the emitted light, the sensor derives distance information from the energy of the infrared light reflected back by the object. The proximity sensor is usually installed at the top of a mobile phone screen. When the phone is in a call and the user's face is close to the screen, the proximity sensor senses the distance between the phone and the face so that the screen can be turned off, preventing accidental touches during the call.
The proximity sensor of a mobile phone is generally calibrated when the phone leaves the factory, and once that calibration is finished it is not easily repeated during use. However, during use, situations that disturb the background noise value of the proximity sensor inevitably occur. The background noise value is the detection signal value of the proximity sensor when the mobile terminal is unoccluded; it is produced by diffraction of infrared light inside the terminal. Stains on the screen protector or the screen itself, or a shift of the sensor's position after the phone is knocked or dropped, can all change the actual background noise value. When the background noise value changes, the proximity sensor's distance measurement becomes inaccurate, so the background noise value needs to be recalibrated while the phone is in use.
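The role the background noise value plays in proximity detection can be sketched as follows. This is a minimal illustration; the signal values, threshold, and function name are assumptions for exposition, not taken from the patent:

```python
def is_occluded(raw_signal: float, noise_floor: float, threshold: float = 50.0) -> bool:
    """Report a near-field occlusion when the reflected-IR signal rises
    sufficiently above the calibrated background noise value.

    The noise floor is the sensor reading with nothing in front of the
    terminal (internal IR diffraction only); if it drifts upward, e.g.
    from a stained screen protector, a comparison against a stale stored
    value misclassifies the reading.
    """
    return (raw_signal - noise_floor) > threshold

# The same raw reading, judged against a stale vs. a recalibrated floor:
print(is_occluded(raw_signal=180.0, noise_floor=100.0))  # True
print(is_occluded(raw_signal=180.0, noise_floor=160.0))  # False
```

This is why a drifted noise floor makes the sensor "see" an occlusion that is not there, or miss one that is.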
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a method of calibrating a proximity sensor of a mobile terminal, comprising:
acquiring sensor information detected by a motion sensor of the mobile terminal;
determining the posture of the mobile terminal according to the sensor information;
acquiring detection information of the proximity sensor when it is determined that the posture meets a preset posture condition;
and calibrating the background noise value of the proximity sensor according to the detection information.
In a second aspect, the present disclosure provides an apparatus for calibrating a proximity sensor of a mobile terminal, comprising:
a sensor information acquisition unit, configured to acquire sensor information detected by a motion sensor of the mobile terminal;
a terminal posture determining unit, configured to determine the posture of the mobile terminal according to the sensor information;
a detection information determining unit, configured to acquire detection information of the proximity sensor when it is determined that the posture meets a preset posture condition;
and a calibration unit, configured to calibrate the background noise value of the proximity sensor according to the detection information.
In a third aspect, the present disclosure provides a terminal comprising:
at least one memory and at least one processor;
wherein the memory is configured to store program code, and the processor is configured to call the program code stored in the memory to perform the method of calibrating a proximity sensor of a mobile terminal according to the embodiments of the present disclosure.
In a fourth aspect, the present disclosure provides a non-transitory computer storage medium storing program code for performing the method of calibrating a proximity sensor of a mobile terminal provided according to an embodiment of the present disclosure.
According to the method for calibrating a proximity sensor of a mobile terminal provided by the embodiments of the present disclosure, when the mobile terminal is judged to be in a preset posture in which the probability of occlusion is low, the background noise value of the proximity sensor is calibrated according to the detection information of the proximity sensor. The background noise value can therefore be calibrated accurately during ordinary use of the mobile terminal, improving the detection accuracy of the proximity sensor.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of a method of calibrating a proximity sensor of a mobile terminal provided in accordance with one embodiment of the present disclosure;
FIG. 2 is a flow chart of a method of calibrating a proximity sensor of a mobile terminal provided in accordance with another embodiment of the present disclosure;
FIG. 3 is a flow chart of a training manner of a machine learning model provided according to an embodiment of the present disclosure;
fig. 4 is a schematic structural view of an apparatus for calibrating a proximity sensor of a mobile terminal according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a terminal device for implementing an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the steps recited in the method embodiments of the present disclosure may be performed in a different order and/or in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that the modifiers "a", "an", and "a plurality of" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
It should be noted that "in response to" in this disclosure denotes a condition or state upon which an operation depends; when that condition or state is satisfied, the one or more operations that depend on it may be performed in real time or with a set delay.
For the purposes of this disclosure, the phrase "a and/or B" means (a), (B), or (a and B).
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Referring to fig. 1, fig. 1 illustrates a flowchart of a method 100 of calibrating a proximity sensor of a mobile terminal provided by an embodiment of the present disclosure. Mobile terminals in embodiments of the present disclosure may include, but are not limited to, devices such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), smart wearable devices, and the like. The method 100 comprises steps S101-S104:
step S101: sensor information detected by a motion sensor of the mobile terminal is acquired.
In this embodiment, the motion sensor may include, but is not limited to, an acceleration sensor, a gravity sensor, a gyroscope, a magnetic sensor, an orientation sensor, or any other sensor capable of detecting the motion state and posture of the terminal.
Step S102: and determining the posture of the mobile terminal according to the sensor information.
The current posture of the mobile terminal may be determined from the sensor information detected by the motion sensor using any relevant technique in the art, which is not limited in this embodiment. In some embodiments, a corresponding sensor threshold or condition may be preset for one or more specific target postures; when the sensor information detected by the motion sensor meets the corresponding threshold or condition, the target posture of the mobile terminal can be determined.
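A threshold-based posture check of the kind described above might look like the following sketch. The device-axis convention (x along the short screen edge, y along the long edge, z out of the screen) and the thresholds are illustrative assumptions, not values from the patent:

```python
import math

def posture_from_accelerometer(ax: float, ay: float, az: float) -> str:
    """Classify the terminal posture from one accelerometer sample
    (units of m/s^2; at rest the vector magnitude is roughly 9.8)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return "unknown"
    # Screen facing up: gravity is mostly along the screen normal.
    if az / g > 0.9:
        return "screen-up"
    # Tilt of the screen's long edge against the horizontal plane.
    long_edge_angle = math.degrees(math.asin(min(1.0, abs(ay) / g)))
    if long_edge_angle > 45.0:
        return "portrait"
    return "other"

print(posture_from_accelerometer(0.0, -9.5, 1.5))  # gravity along long edge
print(posture_from_accelerometer(0.1, 0.2, 9.8))   # gravity along screen normal
```

Fixed thresholds like these are exactly what the later machine-learning embodiments replace, since the boundary between postures is hard to capture with a single cut-off.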
Step S103: acquiring detection information of the proximity sensor when the posture is determined to meet the preset posture condition.
The preset posture condition is set such that, when the posture of the mobile terminal conforms to it, the probability that the proximity sensor is occluded at close range is low; examples include the portrait (vertical-screen) posture and the screen-up posture.
In some implementations, the preset posture condition is the portrait posture: when the mobile terminal is determined to be in the portrait posture, detection information of the proximity sensor is acquired. The inventors' experiments show that when a user holds the mobile terminal in the portrait posture, the proximity sensor is almost never occluded at close range.
In this embodiment, when the mobile terminal is determined to be in a posture meeting the preset condition, the proximity sensor may perform a single detection to acquire one piece of detection information, or may detect continuously or repeatedly within a preset duration to acquire multiple pieces of detection information.
Step S104: calibrating the background noise value of the proximity sensor according to the detection information.
In this embodiment, one or more pieces of detection information may be used to calibrate the background noise value of the proximity sensor; multiple pieces of detection information for the calibration may be acquired by repeating steps S101 to S103.
In some embodiments, the detection information obtained in step S103 may be used directly as the calibrated background noise value of the proximity sensor. When two or more pieces of detection information are used for calibration, the calibrated background noise value may be obtained with a data processing method such as averaging or taking the median.
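Combining several pieces of detection information by averaging or taking the median can be sketched as follows. The readings are invented for illustration; the patent does not prescribe which method to use:

```python
import statistics

def calibrated_noise_floor(detections: list[float], method: str = "median") -> float:
    """Combine several proximity-sensor readings, collected while the
    terminal was judged unoccluded, into one calibrated background
    noise value.  The median is robust against an occasional reading
    taken while something briefly passed in front of the sensor."""
    if not detections:
        raise ValueError("at least one detection is required")
    if method == "mean":
        return statistics.mean(detections)
    return statistics.median(detections)

readings = [101.0, 99.0, 100.0, 240.0, 100.5]   # one outlier (transient occlusion)
print(calibrated_noise_floor(readings))          # median ignores the outlier
print(calibrated_noise_floor(readings, "mean"))  # mean is pulled up by it
```

The comparison of the two outputs illustrates why a robust statistic may be preferable when one of the repeated detections was disturbed.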
Calibrating the background noise value of the proximity sensor requires that the mobile terminal, and in particular the proximity sensor on it, is not occluded. Therefore, according to the method for calibrating a proximity sensor of a mobile terminal provided by this embodiment, when the mobile terminal is judged to be in a preset posture in which the probability of occlusion is low, the background noise value is calibrated according to the detection information of the proximity sensor, so that it can be calibrated accurately during use of the mobile terminal and the detection accuracy of the proximity sensor is improved.
Referring to fig. 2, fig. 2 shows a flowchart of a method 200 of calibrating a proximity sensor of a mobile terminal provided according to an embodiment of the present disclosure. The method 200 comprises steps S201-S205:
step S201: and receiving a user operation event of the mobile terminal.
Step S202: sensor information detected by a motion sensor of the mobile terminal is acquired in response to a user operation event.
Step S203: and determining the posture of the mobile terminal according to the sensor information.
Step S204: and when the gesture is determined to be the vertical screen gesture, acquiring current detection information of the proximity sensor.
Step S205: and calibrating the background noise value of the proximity sensor according to the detection information.
A user operation event is an event in which the user operates the mobile terminal, for example triggering a physical key such as the home key or power key, touching the screen, lighting the screen, or unlocking the screen by fingerprint, touch, face recognition, and the like.
According to the method for calibrating a proximity sensor of a mobile terminal provided by this embodiment, a received user operation event serves as the trigger condition for calibrating the background noise value of the proximity sensor. This further ensures that a terminal currently in the portrait posture is being held by a user rather than having entered that posture through some unexpected event, and thus that the proximity sensor is not currently occluded by another object, so that the background noise value can be calibrated accurately.
In some embodiments, the user operation event includes a screen lighting event and/or a screen unlocking event. The screen lighting event includes lighting the screen by raising the terminal, touching the screen, or pressing the power key; the screen unlocking event includes unlocking by touch, face recognition, or the power key. The inventors' experiments show that when a screen lighting event or screen unlocking event is received, the probability that the proximity sensor is occluded is extremely low. Further restricting the user operation event to a screen lighting event and/or a screen unlocking event therefore ensures that the proximity sensor is not currently occluded by another object, so that the background noise value can be calibrated accurately.
In some cases, the terminal screen may be lit by non-human factors; for example, jostling or a raise-to-wake gesture may light the screen while the terminal is in a user's pocket or bag and its proximity sensor is occluded at close range. For such cases, in some embodiments step S104 further comprises: determining that the detection information cannot be used to calibrate the background noise value of the proximity sensor if the detection information meets a preset condition. The preset condition is set such that, when the detection information meets it, the probability that the proximity sensor is occluded at close range is high. For example, if the detection information indicates that the proximity sensor is currently occluded at close range, the detection information cannot be used to calibrate the background noise value. Illustratively, the preset condition may be "greater than 0.5 cm": if the detection information is 1 cm, it indicates an occlusion about 1 cm from the proximity sensor, and the detection information cannot be used to calibrate the background noise value.
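The validity gate described above can be sketched directly. It follows the text's illustrative "greater than 0.5 cm" condition; the function name and parameterisation are assumptions:

```python
def detection_usable_for_calibration(detected_distance_cm: float,
                                     max_unoccluded_cm: float = 0.5) -> bool:
    """Reject a reading that itself indicates a close-range occlusion
    (e.g. the screen lit up inside a pocket).  Readings above the cut-off
    mean an object was detected near the sensor and must not be folded
    into the background noise value."""
    return detected_distance_cm <= max_unoccluded_cm

print(detection_usable_for_calibration(0.2))  # residual reading: usable
print(detection_usable_for_calibration(1.0))  # occlusion at ~1 cm: discard
```

Gating each reading this way keeps a pocket-lit screen from corrupting the stored noise floor.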
In some embodiments, the mobile terminal stores the current background noise value of the proximity sensor in advance, and step S104 further comprises: comparing the detection information with the current background noise value, and calibrating the background noise value of the proximity sensor according to the comparison result. Optionally, when the detection information differs from the current background noise value, or each of multiple pieces of detection information differs from it, or the difference exceeds a preset range, the detection information is taken as the calibrated background noise value, so that the background noise value can be calibrated more accurately.
Further, in some embodiments, the calibrated background noise value may be stored in the mobile terminal as the new current background noise value for use in the next calibration.
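The compare-then-update logic of these embodiments can be sketched as follows; the tolerance band is an assumed parameter standing in for the "preset range" in the text:

```python
def maybe_recalibrate(stored_floor: float, detection: float,
                      tolerance: float = 5.0) -> float:
    """Compare a fresh unoccluded reading against the stored background
    noise value and adopt it as the new value only when the drift
    exceeds the tolerance band; otherwise keep the stored value."""
    if abs(detection - stored_floor) > tolerance:
        return detection        # drift detected: this becomes the new floor
    return stored_floor         # within tolerance: keep the old value

print(maybe_recalibrate(100.0, 102.0))  # small jitter: floor unchanged
print(maybe_recalibrate(100.0, 140.0))  # real drift: floor updated
```

The returned value would then be persisted as the new current background noise value, ready for the next comparison.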
In some embodiments, step S102 includes:
inputting the sensor information into a pre-trained machine learning model to determine the posture of the mobile terminal.
Although the sensor information measured by the motion sensor differs somewhat each time the same person holds the mobile terminal in the same posture, or when different people use the terminal in the same posture, human anatomy and common usage habits mean that the difference necessarily falls within a certain range, and that range is not easily expressed as a fixed value or condition. Therefore, the method for calibrating a proximity sensor of a mobile terminal provided by the embodiments of the present disclosure uses machine learning to identify the boundaries between different terminal postures, so that terminal posture detection can achieve higher accuracy.
Referring to fig. 3, fig. 3 shows a flowchart of a training manner 300 of a machine learning model provided by an embodiment of the disclosure, including steps S301 to S302:
Step S301: acquiring multiple sets of training data, each set comprising two or more pieces of test information detected by a motion sensor of a test terminal while the test terminal was in a preset terminal posture;
Step S302: training the machine learning model with the test information contained in the multiple sets of training data as input and the terminal posture corresponding to the test information as the expected output.
In this embodiment, because each set of training data used to train the machine learning model contains two or more pieces of test information, interference from anomalous test information can be eliminated and the recognition accuracy of the machine learning model improved.
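One way to assemble such a training example, i.e. two or more motion-sensor samples labelled with the posture in which they were captured, is sketched below. The mean long-edge tilt used as the feature is an illustrative choice for exposition, not a feature mandated by the patent:

```python
import math

def make_training_example(samples, label):
    """One training example = two or more (x, y, z) accelerometer samples
    captured while a tester held the device in a known posture, paired
    with that posture as the expected output.  Averaging over the group
    damps any single anomalous sample."""
    tilts = []
    for ax, ay, az in samples:
        g = math.sqrt(ax * ax + ay * ay + az * az)
        tilts.append(math.degrees(math.asin(min(1.0, abs(ay) / g))))
    return (sum(tilts) / len(tilts), label)

# Two groups of test readings, each with more than two samples:
portrait = make_training_example(
    [(0.0, -9.6, 1.0), (0.2, -9.4, 2.0), (0.1, -9.7, 0.5)], "portrait")
landscape = make_training_example(
    [(9.6, -0.5, 1.0), (9.7, 0.3, 0.8), (9.5, -0.2, 1.5)], "landscape")
print(portrait, landscape)
```

Feeding many such (feature, label) pairs from multiple testers to a classifier is the training procedure that steps S301 and S302 describe.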
In some embodiments, the portrait posture may be defined as the angle between the line along the long side of the mobile terminal screen and the horizontal plane exceeding 45 degrees. A tester can repeatedly hold the test terminal in the portrait or landscape posture, record the terminal posture, and store one or more pieces of sensing data detected by the motion sensor of the test terminal in that posture, thereby obtaining multiple sets of training data. To eliminate individual differences between testers as much as possible, training data generated by multiple testers may be used.
In some embodiments, the portrait posture may instead be defined as the angle between the line along the long side of the mobile terminal screen and the line connecting the user's eyes exceeding 45 degrees. A tester can repeatedly hold the test terminal in vertical, horizontal, and sideways postures, record the terminal posture, and store one or more pieces of sensing data detected by the motion sensor in that posture, thereby obtaining multiple sets of training data.
In some embodiments, the machine learning model may be trained with training data corresponding to the portrait posture as positive examples and training data corresponding to the landscape posture as negative examples; the trained model can then determine whether the mobile terminal is in the portrait posture.
In some embodiments, the test information includes values in at least two directions, and the training scheme 300 further comprises: merging the test information in each set of training data to obtain processed training data, in which the values corresponding to the same direction across the pieces of test information form an array in time order. The merging may be applied to all the test information in each set of training data, or only to part of it; for example, after missing or anomalous values are deleted from the training data, the remaining test information is merged. The processed training data is then used as the input for training the machine learning model.
In the embodiments of the present disclosure, the coordinate data includes, but is not limited to, plane rectangular-coordinate data and spatial rectangular-coordinate data, composed of the coordinate values on each coordinate axis. For example, the test information in a set of training data may be spatial rectangular-coordinate data such as (x1, y1, z1), (x2, y2, z2), …, (xn, yn, zn), where n is greater than or equal to 2. Merging these n samples yields the processed training data (x1, x2, …, xn, y1, y2, …, yn, z1, z2, …, zn), in which the values for the same direction (e.g., x1, x2, …, xn) form an array in time order; that is, the training data is transformed from [xyz, xyz, xyz, …] into [x, x, x, …, y, y, y, …, z, z, z, …]. Arranging the values of each direction into a time-ordered array makes it easier for the machine learning model to learn which positions (coordinate axes) in the data drive changes in the classification result, which improves training efficiency and reduces the sample size required for training.
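The regrouping from [xyz, xyz, …] to [x, x, …, y, y, …, z, z, …] can be expressed directly:

```python
def regroup_by_axis(samples):
    """Merge a group's time-ordered (x, y, z) samples so that all values
    for one axis are contiguous and remain in time order:
    [xyz, xyz, ...] becomes [x, x, ..., y, y, ..., z, z, ...]."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    zs = [s[2] for s in samples]
    return xs + ys + zs

print(regroup_by_axis([(1, 4, 7), (2, 5, 8), (3, 6, 9)]))
# -> [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Each processed training vector thus has a fixed layout in which a given index always corresponds to the same axis, which is what lets the model associate positions with axes.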
In some embodiments, the machine learning model is a neural network model. Because neural network models fit classification functions well, using a neural network as the machine learning model gives the trained model higher classification accuracy.
In some embodiments, step S104 further comprises: calibrating the background noise value of the proximity sensor according to two or more pieces of detection information. In this embodiment, the multiple pieces of detection information may be processed with a data processing method such as averaging, taking the median, or data fitting before being used to calibrate the background noise value, so that the calibration is more accurate.
As shown in fig. 4, an apparatus 400 for calibrating a proximity sensor of a mobile terminal is provided according to an embodiment of the present disclosure, comprising a sensor information acquisition unit 401, a terminal posture determining unit 402, a detection information determining unit 403, and a calibration unit 404, wherein:
the sensor information acquisition unit 401 is configured to acquire sensor information detected by a motion sensor of the mobile terminal;
the terminal posture determining unit 402 is configured to determine the posture of the mobile terminal according to the sensor information;
the detection information determining unit 403 is configured to acquire detection information of the proximity sensor when it is determined that the posture meets a preset posture condition;
and the calibration unit 404 is configured to calibrate the background noise value of the proximity sensor according to the detection information.
Since the apparatus embodiments essentially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant details. The apparatus embodiments described above are merely illustrative; the modules illustrated as separate modules may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, and those of ordinary skill in the art can understand and implement them without undue effort.
In some embodiments, the detection apparatus 400 further includes a receiving unit for receiving a user operation event of the mobile terminal; the sensor information acquisition unit 401 is further configured to acquire sensor information detected by a motion sensor of the mobile terminal in response to the user operation event.
In some embodiments, the user operation event includes a screen lighting event and/or a screen unlocking event. The screen lighting event includes lighting the screen by lifting the terminal, touching the screen, or pressing the power key; the screen unlocking event includes unlocking by touch, face recognition, or the power key. The inventors' experiments show that when a screen lighting event or a screen unlocking event is received, the probability that the proximity sensor is occluded is extremely low. Therefore, by further restricting the user operation event to a screen lighting event and/or a screen unlocking event, it can be determined that the proximity sensor is currently not occluded by other objects, enabling accurate calibration of the background noise value of the proximity sensor.
In some cases, the terminal screen may be lit by non-human factors; for example, the terminal may be jostled or lifted inside a user's pocket or bag, lighting the screen while the proximity sensor is occluded at close range. For such cases, in some embodiments, the calibration unit 404 is further configured to determine that the detection information cannot be used to calibrate the background noise value of the proximity sensor if the detection information meets a preset condition. The preset condition is set such that, when the detection information meets it, the probability that the proximity sensor is occluded at close range is high. For example, if it is determined from the detection information that the proximity sensor is currently occluded at close range, the detection information cannot be used to calibrate the background noise value of the proximity sensor. Illustratively, the preset condition may be "greater than 0.5 cm". For example, when the detection information is 1 cm, it indicates that there is an occlusion about 1 cm from the proximity sensor, and the detection information cannot be used to calibrate the background noise value of the proximity sensor.
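A minimal sketch of this check, using the "greater than 0.5 cm" preset condition from the example above (function and parameter names are assumptions):

```python
def detection_usable_for_calibration(detection_cm, threshold_cm=0.5):
    """A reading that meets the preset condition (here: greater than
    0.5 cm, per the example in the text) suggests a close-range
    occlusion, so it must not be used for background-noise calibration."""
    meets_preset_condition = detection_cm > threshold_cm
    return not meets_preset_condition

print(detection_usable_for_calibration(1.0))  # False: occlusion near 1 cm
print(detection_usable_for_calibration(0.3))  # True: usable reading
```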
In some embodiments, the mobile terminal pre-stores the current background noise value of the proximity sensor, and the calibration unit 404 is further configured to compare the detection information with the current background noise value and calibrate the background noise value of the proximity sensor according to the comparison result. Optionally, when the detection information differs from the current background noise value, when a plurality of pieces of obtained detection information all differ from the current background noise value, or when the difference between the detection information and the current background noise value exceeds a preset range, the detection information is used as the calibrated background noise value of the proximity sensor, so that the background noise value can be calibrated more accurately.
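The comparison step might be sketched as follows (the tolerance value and all names are assumptions for illustration):

```python
def update_background_noise(current, detection, tolerance=0.05):
    """Keep the stored background-noise value unless the new detection
    deviates from it by more than the preset range (tolerance)."""
    if abs(detection - current) > tolerance:
        return detection   # recalibrate to the new reading
    return current         # within the preset range: keep the old value

print(update_background_noise(0.10, 0.30))  # 0.3 (recalibrated)
print(update_background_noise(0.10, 0.12))  # 0.1 (unchanged)
```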
In some embodiments, the calibration unit 404 is further configured to calibrate the background noise value of the proximity sensor according to two or more pieces of detection information. In this embodiment, multiple pieces of detection information may be processed by data processing methods such as averaging, taking the median, or data fitting, and the result used to calibrate the background noise value, so that the background noise value can be calibrated more accurately.
Further, in some embodiments, the apparatus 400 further includes a storage unit, configured to store the calibrated background noise value as a new current background noise value in the mobile terminal, for use in calibrating the background noise value next time.
In some embodiments, the terminal pose determination unit 402 is configured to input sensor information to a pre-trained machine learning model to determine the pose in which the mobile terminal is located.
Although the sensor information measured by the motion sensor differs somewhat each time the same person holds the mobile terminal in the same posture, or when different people hold the mobile terminal in the same posture, the human body structure and common usage habits of mobile terminals ensure that the difference falls within a certain range, and this range is not easily expressed by a fixed value or condition. Therefore, the method for calibrating the proximity sensor of the mobile terminal provided by the embodiments of the present disclosure uses machine learning to identify the boundaries between different terminal gestures, so that terminal gesture detection can achieve higher accuracy.
According to one or more embodiments of the present disclosure, there is provided a training apparatus of a machine learning model, including:
the training data acquisition unit is used for acquiring multiple groups of training data, wherein each group of training data comprises more than two pieces of testing information detected by a motion sensor of the testing terminal when the testing terminal is in a preset terminal posture;
And the training unit is used for taking the test information contained in the plurality of groups of training data as input and the terminal gesture corresponding to the test information as expected output to train the machine learning model.
In some embodiments, the vertical screen (portrait) gesture may be defined as the angle between the line along the long side of the mobile terminal screen and the horizontal plane exceeding 45 degrees. A tester can hold the test terminal many times in a portrait or landscape gesture, record the terminal gesture of the test terminal, and store one or more pieces of sensing data detected by the motion sensor of the test terminal under that gesture, thereby obtaining multiple sets of training data. To eliminate individual differences among testers as much as possible, training data generated by multiple testers may be used.
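The angle-based definition above can be illustrated directly from accelerometer gravity components; this is a minimal sketch in which the device y-axis is taken as the screen's long side, and the function name and axis convention are assumptions:

```python
import math

def is_portrait(ax, ay, az):
    """Return True when the angle between the screen's long side
    (assumed device y-axis) and the horizontal plane exceeds 45
    degrees, using accelerometer gravity components in m/s^2."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return False
    # Elevation of the long side above the horizontal plane
    angle = math.degrees(math.asin(min(1.0, abs(ay) / g)))
    return angle > 45.0

# Upright phone: gravity almost entirely along the y-axis
print(is_portrait(0.3, 9.7, 0.5))  # True
# Flat on a table: gravity along the z-axis
print(is_portrait(0.1, 0.2, 9.8))  # False
```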
In some embodiments, the portrait gesture may alternatively be defined as the angle between the line along the long side of the mobile terminal screen and the line connecting the user's eyes exceeding 45 degrees. A tester can hold the test terminal many times in vertical, horizontal, and sideways postures, record the terminal gesture of the test terminal, and store one or more pieces of sensing data detected by the motion sensor of the test terminal under that gesture, thereby obtaining multiple sets of training data.
In some embodiments, the machine learning model may be trained with training data corresponding to the vertical screen gesture as a positive example and training data corresponding to the horizontal screen gesture as a negative example, and the trained machine learning model may be used to determine whether the mobile terminal is in the vertical screen gesture.
In some embodiments, the test information includes values in at least two directions, and the training device further comprises a training data processing unit for merging the test information included in each set of training data to obtain processed training data, where the values of the test information corresponding to the same direction form a time-ordered array in the processed training data. In this embodiment, the merging may be performed on all of the test information included in each set of training data, or on only part of it; for example, the merging may be performed on the remaining test information after missing or abnormal values have been deleted from the training data. The processed training data is then used as the input for training the machine learning model.
In the embodiments of the present disclosure, the coordinate data includes, but is not limited to, plane rectangular coordinate data and spatial rectangular coordinate data, and consists of coordinate values corresponding to coordinate axes. For example, the test information included in each set of training data may be spatial rectangular coordinate data such as (x1, y1, z1), (x2, y2, z2), …, (xn, yn, zn), where n is greater than or equal to 2. The n items of spatial rectangular coordinate data are merged to obtain the processed training data (x1, x2, …, xn, y1, y2, …, yn, z1, z2, …, zn), in which values corresponding to the same direction (e.g., x1, x2, …, xn) form a time-ordered array; that is, the training data is rearranged from [xyz, xyz, xyz, …] into [x, x, x, …, y, y, y, …, z, z, z, …]. By arranging the values of each direction in time order in this way, the machine learning model can more easily learn which positions (coordinate axes) of the data produce changes in the classification result, which improves the training efficiency of the machine learning model and reduces the sample size required for training.
In some embodiments, the machine learning model is a neural network model. Because a neural network model fits classification functions well, using a neural network as the machine learning model gives the trained model higher classification accuracy.
Referring now to fig. 5, a schematic diagram of a terminal device 600 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The terminal device shown in fig. 5 is only one example, and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the terminal apparatus 600 may include a processing device (e.g., a central processor, a graphic processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage device 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the terminal apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
In general, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the terminal device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows a terminal device 600 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communication means 609, or from storage means 608, or from ROM 602. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 601.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the client and the server may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the terminal device, or may exist separately without being incorporated into the terminal device.
The computer-readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to: acquiring sensor information detected by a motion sensor of the mobile terminal; determining the gesture of the mobile terminal according to the sensor information; when the gesture is determined to accord with a preset gesture condition, acquiring detection information of the proximity sensor; and calibrating the background noise value of the proximity sensor according to the detection information.
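The four steps listed above could be sketched as a single flow; all four callables below are hypothetical stand-ins for platform sensor and storage APIs, not functions from the patent:

```python
def calibrate_proximity_sensor(read_motion_sensor, is_preset_gesture,
                               read_proximity, save_noise_value):
    """One pass of the calibration flow described in the text."""
    sensor_info = read_motion_sensor()        # 1. motion-sensor data
    if not is_preset_gesture(sensor_info):    # 2. gesture check
        return None                           #    skip calibration
    detection = read_proximity()              # 3. proximity reading
    save_noise_value(detection)               # 4. calibrate noise value
    return detection

# Stubbed usage: upright terminal, proximity sensor reads 0.0 cm
saved = []
result = calibrate_proximity_sensor(
    read_motion_sensor=lambda: (0.1, 9.8, 0.2),
    is_preset_gesture=lambda info: abs(info[1]) > 7.0,  # crude portrait test
    read_proximity=lambda: 0.0,
    save_noise_value=saved.append,
)
print(result, saved)  # 0.0 [0.0]
```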
Or the above computer-readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to: acquiring multiple sets of training data, wherein each set of training data comprises more than two pieces of test information detected by a motion sensor of a test terminal when the test terminal is in a preset terminal posture; and training the machine learning model by taking the test information contained in the plurality of sets of training data as input and the terminal gesture corresponding to the test information as expected output.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of the unit is not limited to the unit itself in some cases, and for example, the sensor information acquisition unit may also be described as "a unit for acquiring sensor information detected by a motion sensor of the mobile terminal".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a method of calibrating a proximity sensor of a mobile terminal, including: acquiring sensor information detected by a motion sensor of the mobile terminal; determining the gesture of the mobile terminal according to the sensor information; when the gesture is determined to accord with a preset gesture condition, acquiring detection information of the proximity sensor; and calibrating the background noise value of the proximity sensor according to the detection information.
According to one or more embodiments of the present disclosure, further comprising: receiving a user operation event of the mobile terminal; the acquiring the sensor information detected by the motion sensor of the mobile terminal comprises the following steps: and responding to the user operation event, and acquiring sensor information detected by a motion sensor of the mobile terminal.
According to one or more embodiments of the present disclosure, the user operation event includes a screen lighting event and/or a screen unlocking event.
According to one or more embodiments of the present disclosure, the calibrating the background noise value of the proximity sensor according to the detection information includes: and if the detection information accords with a preset condition, determining that the detection information cannot be used for calibrating the background noise value of the proximity sensor.
According to one or more embodiments of the present disclosure, the mobile terminal stores a current background noise value of the proximity sensor in advance; the calibrating the background noise value of the proximity sensor according to the detection information comprises the following steps: comparing the detection information with the current background noise value; and calibrating the background noise value of the proximity sensor according to the comparison result.
According to one or more embodiments of the present disclosure, the determining, according to the sensor information, a gesture in which the mobile terminal is located includes: the sensor information is input to a pre-trained machine learning model to determine a pose in which the mobile terminal is located.
According to one or more embodiments of the present disclosure, the training manner of the machine learning model includes: acquiring multiple sets of training data, wherein each set of training data comprises more than two pieces of test information detected by a motion sensor of a test terminal when the test terminal is in a preset terminal posture; and training the machine learning model by taking the test information contained in the plurality of sets of training data as input and the terminal gesture corresponding to the test information as expected output.
According to one or more embodiments of the present disclosure, the test information includes values in at least two directions; the training mode of the machine learning model further comprises the following steps: and merging the test information included in each group of the training data to obtain processed training data, wherein the numerical values corresponding to the same direction of the test information included in the training data generate an array in time sequence in the processed training data.
According to one or more embodiments of the present disclosure, the machine learning model is a neural network model.
According to one or more embodiments of the present disclosure, the calibrating the background noise value of the proximity sensor according to the detection information includes: and calibrating the background noise value of the proximity sensor according to more than two detection information.
According to one or more embodiments of the present disclosure, there is provided an apparatus for calibrating a proximity sensor of a mobile terminal, including: a sensor information acquisition unit for acquiring sensor information detected by a motion sensor of the mobile terminal; the terminal gesture determining unit is used for determining the gesture of the mobile terminal according to the sensor information; a detection information determining unit, configured to obtain detection information of the proximity sensor when it is determined that the gesture meets a preset gesture condition; and the calibration unit is used for calibrating the background noise value of the proximity sensor according to the detection information.
According to one or more embodiments of the present disclosure, there is provided a terminal including: at least one memory and at least one processor; wherein the memory is for storing program code, and the processor is for invoking the program code stored by the memory to perform a method of calibrating a proximity sensor of a mobile terminal provided in accordance with one or more embodiments of the present disclosure.
According to one or more embodiments of the present disclosure, there is provided a non-transitory computer storage medium storing program code for performing a method of calibrating a proximity sensor of a mobile terminal provided according to one or more embodiments of the present disclosure.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, and also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by interchanging the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (11)

1. A method of calibrating a proximity sensor of a mobile terminal, comprising:
Responding to a user operation event, and acquiring sensor information detected by a motion sensor of the mobile terminal; the user operation event comprises a screen lighting event and/or a screen unlocking event;
Determining the gesture of the mobile terminal according to the sensor information;
when the gesture is determined to be in accordance with a preset gesture, acquiring detection information of the proximity sensor; wherein the preset gesture is a vertical screen gesture;
And calibrating the background noise value of the proximity sensor according to the detection information.
2. The method of calibrating a proximity sensor of a mobile terminal according to claim 1, wherein the calibrating a background noise value of the proximity sensor according to the detection information comprises:
and if the detection information accords with a preset condition, determining that the detection information cannot be used for calibrating the background noise value of the proximity sensor.
3. A method of calibrating a proximity sensor of a mobile terminal according to claim 1,
The mobile terminal pre-stores the current background noise value of the proximity sensor;
The calibrating the background noise value of the proximity sensor according to the detection information comprises the following steps: comparing the detection information with the current background noise value; and calibrating the background noise value of the proximity sensor according to the comparison result.
4. The method of calibrating a proximity sensor of a mobile terminal of claim 1, wherein the determining the gesture in which the mobile terminal is located according to the sensor information comprises:
the sensor information is input to a pre-trained machine learning model to determine a pose in which the mobile terminal is located.
5. The method of calibrating a proximity sensor of a mobile terminal of claim 4 wherein the training pattern of the machine learning model comprises:
Acquiring multiple sets of training data, wherein each set of training data comprises more than two pieces of test information detected by a motion sensor of a test terminal when the test terminal is in a preset terminal posture;
And training the machine learning model by taking the test information contained in the plurality of sets of training data as input and the terminal gesture corresponding to the test information as expected output.
6. A method of calibrating a proximity sensor of a mobile terminal according to claim 5,
The test information comprises numerical values in at least two directions;
The training mode of the machine learning model further comprises the following steps:
And merging the test information included in each group of the training data to obtain processed training data, wherein the numerical values corresponding to the same direction of the test information included in the training data generate an array in time sequence in the processed training data.
7. A method of calibrating a proximity sensor of a mobile terminal according to claim 4,
The machine learning model is a neural network model.
8. The method of calibrating a proximity sensor of a mobile terminal according to claim 1, wherein the calibrating a background noise value of the proximity sensor according to the detection information comprises:
and calibrating the background noise value of the proximity sensor according to more than two detection information.
9. An apparatus for calibrating a proximity sensor of a mobile terminal, comprising:
a sensor information acquisition unit configured to acquire, in response to a user operation event, sensor information detected by a motion sensor of the mobile terminal, wherein the user operation event comprises a screen-on event and/or a screen unlock event;
a terminal gesture determination unit configured to determine the gesture of the mobile terminal according to the sensor information;
a detection information determination unit configured to obtain detection information of the proximity sensor when the gesture is determined to conform to a preset gesture, wherein the preset gesture is a vertical-screen (portrait) gesture; and
a calibration unit configured to calibrate the background noise value of the proximity sensor according to the detection information.
10. A terminal, comprising:
at least one memory and at least one processor;
wherein the memory is configured to store program code, and the processor is configured to invoke the program code stored in the memory to perform the method of any one of claims 1 to 8.
11. A non-transitory computer storage medium storing program code for performing the method of any one of claims 1 to 8.
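The claimed flow (claims 1, 4, 6, and 8) can be sketched as follows. This is an illustrative approximation only, not the patented implementation: the rule-based `classify_gesture` stands in for the pre-trained neural network of claims 4 and 7, and the accelerometer tuples, thresholds, and helper names are hypothetical.

```python
from statistics import mean
from typing import Optional, Sequence, Tuple

PORTRAIT = "portrait"  # the preset vertical-screen gesture of claim 1/9

def merge_axes(samples: Sequence[Tuple[float, float, float]]) -> list:
    """Per claim 6: merge a set's test information so that the values for
    the same direction (x, y, z) form a time-ordered array each."""
    return [list(axis) for axis in zip(*samples)]

def classify_gesture(samples: Sequence[Tuple[float, float, float]]) -> str:
    """Stand-in for the pre-trained model of claim 4: label the gesture
    'portrait' when gravity dominates the y-axis of the accelerometer."""
    gy = mean(s[1] for s in samples)
    gz = mean(s[2] for s in samples)
    return PORTRAIT if abs(gy) > abs(gz) else "other"

def calibrate_noise_floor(
    accel_samples: Sequence[Tuple[float, float, float]],
    proximity_readings: Sequence[float],
) -> Optional[float]:
    """Per claims 1 and 8: determine the gesture from motion-sensor samples
    gathered on a user operation event; only if it matches the preset
    portrait gesture, derive the proximity sensor's background-noise value
    from two or more detection readings (here, simply their mean)."""
    if classify_gesture(accel_samples) != PORTRAIT:
        return None  # wrong gesture: skip calibration this time
    if len(proximity_readings) < 2:
        return None  # claim 8 requires two or more pieces of detection info
    return mean(proximity_readings)

# Example: phone held upright (gravity mostly along -y), idle raw readings.
accel = [(0.1, -9.7, 0.4), (0.0, -9.8, 0.3), (0.2, -9.8, 0.5)]
print(calibrate_noise_floor(accel, [3.0, 5.0, 4.0]))  # 4.0
```

Averaging is only one plausible choice for the noise-floor update; the patent leaves the exact combination of the detection readings unspecified.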
CN202010727791.3A 2020-07-23 2020-07-23 Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal Active CN111857369B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010727791.3A CN111857369B (en) 2020-07-23 2020-07-23 Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal

Publications (2)

Publication Number Publication Date
CN111857369A CN111857369A (en) 2020-10-30
CN111857369B true CN111857369B (en) 2024-05-07

Family

ID=72950227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010727791.3A Active CN111857369B (en) 2020-07-23 2020-07-23 Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal

Country Status (1)

Country Link
CN (1) CN111857369B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116734903B (en) * 2022-10-20 2024-05-14 荣耀终端有限公司 Test method and device
CN117665958B (en) * 2024-01-31 2024-05-24 荣耀终端有限公司 Calibration method of proximity light sensor and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002211312A (en) * 2001-01-15 2002-07-31 Mazda Motor Corp Lighting system for vehicle
JP2003064921A (en) * 2001-06-18 2003-03-05 Roorando Products:Kk Vehicle locking/unlocking state determining method and control method using the same, as well as vehicle locking /unlocking state determining device and control device using the same
CN106094055A (en) * 2016-06-21 2016-11-09 广东欧珀移动通信有限公司 The calibration steps of a kind of proximity transducer and terminal
CN106500751A (en) * 2016-10-20 2017-03-15 广东欧珀移动通信有限公司 The calibration steps and mobile terminal of proximity transducer
CN108415024A (en) * 2018-01-24 2018-08-17 广东欧珀移动通信有限公司 proximity sensor calibration method, device, mobile terminal and computer-readable medium
CN111262986A (en) * 2020-01-16 2020-06-09 Oppo(重庆)智能科技有限公司 Calibration method and calibration device for proximity sensor and mobile terminal

Similar Documents

Publication Publication Date Title
US20200356821A1 (en) Method, terminal, and computer storage medium for image classification
US9076256B2 (en) Information processing device, information processing method, and program
RU2597524C2 (en) Method and apparatus for classifying number of conditions of device
CN111857369B (en) Method, device, terminal and storage medium for calibrating proximity sensor of mobile terminal
US10019219B2 (en) Display device for displaying multiple screens and method for controlling the same
CN111835916B (en) Training method and device of attitude detection model and detection method and device of terminal attitude
KR20180116843A (en) Method for detecting motion and electronic device implementing the same
CN111738316B (en) Zero sample learning image classification method and device and electronic equipment
CN114063964A (en) Volume compensation optimization method and device, electronic equipment and readable storage medium
CN112818898B (en) Model training method and device and electronic equipment
CN108594150A (en) a kind of calibration method, device, terminal and storage medium
CN111382701B (en) Motion capture method, motion capture device, electronic equipment and computer readable storage medium
CN113741750B (en) Cursor position updating method and device and electronic equipment
CN112315463B (en) Infant hearing test method and device and electronic equipment
CN115097379A (en) Positioning tracking method, device, equipment and storage medium
CN114720932A (en) Battery management system signal sampling precision testing method and device, upper computer and storage medium
CN113111692B (en) Target detection method, target detection device, computer readable storage medium and electronic equipment
CN116608881A (en) Equipment calibration method, device, equipment and medium
CN111741165B (en) Mobile terminal control method and device, mobile terminal and storage medium
CN111625755B (en) Data processing method, device, server, terminal and readable storage medium
CN116558552B (en) Calibration method and device for electronic compass, electronic equipment and medium
CN112307859B (en) User language level determining method and device, electronic equipment and medium
CN116737051B (en) Visual touch combination interaction method, device and equipment based on touch screen and readable medium
US11282178B2 (en) Electronic device and method of identifying false image of object attributable to reflection in indoor environment thereof
CN109375232B (en) Distance measuring method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant