CN114840086A - Control method, electronic device and computer storage medium

Info

Publication number
CN114840086A
Authority
CN
China
Prior art keywords
distance
target object
boundary
electronic device
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210507343.1A
Other languages
Chinese (zh)
Inventor
李雅欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210507343.1A
Publication of CN114840086A
Priority to PCT/CN2022/141461 (WO2023216613A1)
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/40 - Scenes; Scene-specific elements in video content
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a control method applied to an electronic device. The method includes: when the non-contact touch control function of the electronic device is in an open state, acquiring a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining a touch operation parameter at the first distance; and controlling the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter. The embodiment of the application also provides the electronic device and a computer storage medium.

Description

Control method, electronic device and computer storage medium
Technical Field
The present disclosure relates to non-contact touch control technologies in electronic devices, and in particular, to a control method, an electronic device, and a computer storage medium.
Background
With the rapid development of smartphones in recent years, people use them at more and more times and in more and more scenarios. The mainstream human-computer interaction mode is currently touch operation. On this basis, gesture interaction, as a novel interaction mode, is continuously being developed for scenarios such as driving and dining; gesture interaction can realize various controls on a page, such as sliding the page up and down, turning pages, photographing, capturing the screen, and ending a recording.
However, when the electronic device is controlled through a change of gesture type or a change of another body part, the electronic device can only be triggered to perform fixed functions, so the control is not fine-grained; in other words, existing non-contact touch control methods lack refinement.
Disclosure of Invention
The embodiments of the application provide a control method, an electronic device and a computer storage medium, which can make the non-contact touch control of an electronic device more fine-grained.
The technical scheme of the application is realized as follows:
the embodiment of the application provides a control method, which is applied to electronic equipment and comprises the following steps:
when the non-contact touch control function of the electronic equipment is in an open state, acquiring a video sequence corresponding to non-contact touch operation;
performing touch identification on the video sequence to obtain a target object and a touch type;
determining a first distance between the target object and the electronic device;
determining a touch operation parameter at the first distance;
and controlling the electronic equipment to execute a target function corresponding to the touch type according to the touch operation parameter.
An embodiment of the present application provides an electronic device, including:
the acquisition module is used for acquiring a video sequence corresponding to the non-contact touch operation when the non-contact touch control function of the electronic equipment is in an open state;
the processing module is used for performing touch identification on the video sequence to obtain a target object and a touch type;
a first determining module for determining a first distance between the target object and the electronic device;
the second determining module is used for determining the touch operation parameters at the first distance;
and the control module is used for controlling the electronic equipment to execute a target function corresponding to the touch type according to the touch operation parameters.
An embodiment of the present application provides an electronic device, including:
a processor and a storage medium storing instructions executable by the processor, the storage medium relying on the processor through a communication bus to perform operations; when the instructions are executed by the processor, the control method described in one or more of the above embodiments is performed.
The embodiment of the application provides a computer storage medium, which stores executable instructions, and when the executable instructions are executed by one or more processors, the processors execute the control method according to one or more embodiments.
The embodiments of the application provide a control method, an electronic device and a computer storage medium. The method includes: when the non-contact touch control function of the electronic device is in an open state, acquiring a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining a touch operation parameter at the first distance; and controlling the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter. That is to say, when implementing the non-contact touch control function of the electronic device, the embodiments of the application determine the first distance between the target object and the electronic device, then determine the touch operation parameter at that first distance, and then control the electronic device to execute the target function corresponding to the touch type according to that parameter. Different touch operation parameters can therefore be determined for the same touch type at different first distances, so the electronic device can adapt its response to the touch type according to the distance between the target object and the electronic device.
Drawings
FIG. 1 is a schematic flowchart of an alternative control method provided in an embodiment of the present application;
FIG. 2 is a schematic flowchart of a page control method in the related art;
FIG. 3a is a schematic diagram of example one of an alternative gesture box provided in an embodiment of the present application;
FIG. 3b is a schematic diagram of example two of an alternative gesture box provided in an embodiment of the present application;
FIG. 3c is a schematic diagram of example three of an alternative gesture box provided in an embodiment of the present application;
FIG. 3d is a schematic diagram of example four of an alternative gesture box provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of example one of an alternative control method provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of example two of an alternative control method provided in an embodiment of the present application;
FIG. 6a is a schematic diagram of example five of an alternative gesture box provided in an embodiment of the present application;
FIG. 6b is a schematic diagram of example six of an alternative gesture box provided in an embodiment of the present application;
FIG. 6c is a schematic diagram of example seven of an alternative gesture box provided in an embodiment of the present application;
FIG. 7a is a schematic layout diagram of example one of an alternative screen provided in an embodiment of the present application;
FIG. 7b is a schematic layout diagram of example two of an alternative screen provided in an embodiment of the present application;
FIG. 7c is a schematic layout diagram of example three of an alternative screen provided in an embodiment of the present application;
FIG. 8 is a schematic structural diagram of an alternative electronic device provided in an embodiment of the present application;
FIG. 9 is a schematic structural diagram of another alternative electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
An embodiment of the present application provides a control method, where the method is applied to an electronic device, fig. 1 is a schematic flow diagram of an optional control method provided in the embodiment of the present application, and as shown in fig. 1, the control method may include:
s101: when the non-contact touch control function of the electronic equipment is in an open state, acquiring a video sequence corresponding to non-contact touch operation;
fig. 2 is a schematic flow chart of a page control method in the related art, as shown in fig. 2, taking a gesture as an example to control a page, the page control method may include:
s201: opening a front camera;
s202: capturing pictures;
s203: detecting a continuous gesture;
s204: judging the gesture;
s205: responding to the operation of the gesture.
The electronic device opens the front camera, and the front camera captures pictures containing gestures. After a picture is captured, the gesture in the picture is continuously detected and judged, the operation corresponding to the gesture, for example a move-up operation, is obtained from the judgment, and finally the operation of the gesture is responded to.
In order to make non-contact touch control more fine-grained, an embodiment of the present application provides an alternative control method. First, when the non-contact touch control function of the electronic device is in an open state, that is, when the electronic device has started the non-contact touch control function, the front camera of the electronic device is turned on to capture the area in front of the screen; when a target object is detected in the captured pictures, a video sequence corresponding to the non-contact touch operation is acquired, where the video sequence includes a plurality of continuous image frames.
The touch operation may include an upward sliding operation, a downward sliding operation, a long pressing operation, an upward page turning operation, a downward page turning operation, and a clicking operation, which is not specifically limited in this embodiment of the present application.
S102: performing touch identification on the video sequence to obtain a target object and a touch type;
when a video sequence of non-contact touch operation is acquired, touch identification needs to be performed on the video sequence, so that a target object and a touch type included in the video sequence can be acquired.
The target object may be a body part of a human body, such as a hand, a head, an eye, and the like, and this is not particularly limited in the embodiments of the present application.
For the target object, the user may use a gesture to control the electronic device to execute the target function of the touch type corresponding to that gesture; may use a change of the head, for example shaking the head left and right, to control the electronic device to execute the target function of the touch type corresponding to the head shake; and may use a change of the eyes, for example a blink, to control the electronic device to execute the target function of the touch type corresponding to the blink action. This is not specifically limited in the embodiments of the present application.
As can be known from the description of the touch operation above, the touch type may include a sliding type, a page turning type, a clicking type and a long-press type: the up-sliding and down-sliding operations belong to the sliding type; the long-press operation belongs to the long-press type; the up-page-turning and down-page-turning operations belong to the page turning type; and the clicking operation belongs to the clicking type. This is not specifically limited in the embodiments of the present application. This operation-to-type grouping can be written out as a small illustrative table, as shown below.
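The following is a minimal sketch of that grouping in Python; the identifier names are assumptions for illustration, not terms from the patent:

```python
# Illustrative mapping from touch operations to the touch types listed above.
TOUCH_TYPE_OF = {
    "slide_up": "slide", "slide_down": "slide",
    "page_turn_up": "page_turn", "page_turn_down": "page_turn",
    "long_press": "long_press",
    "click": "click",
}

print(TOUCH_TYPE_OF["page_turn_down"])  # -> "page_turn"
```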
In an alternative embodiment, S102 may include:
performing touch identification on the video sequence to obtain a target object, a boundary box of the target object and a confidence value of touch operation of the target object;
and determining the touch type based on the confidence value of the touch operation of the target object.
In the touch recognition of a video sequence, a target object, a bounding box of the target object and a confidence value of the touch operation of the target object can be obtained. The bounding box represents boundary information of the target object; the boundary information refers to contour information of the target object, which may include the shape of the contour, the area of the contour and the size of the contour line. The bounding box is a rectangular box enclosing the target object and is described by the height and the width of its boundaries.
That is, by performing touch recognition on the video sequence, not only the target object but also its boundary information may be obtained. At the same time, the gesture type of the target object is matched to obtain the touch operation of the target object; it should be noted that one or more touch operations of the target object may be obtained by matching. The touch recognition also yields a confidence value for each matched touch operation, and the touch type may then be determined based on these confidence values.
It can be understood that, when the touch operation of only one target object is obtained by matching, the type to which that touch operation belongs is directly determined as the touch type. In an optional embodiment, when more than one touch operation of the target object is obtained through matching, determining the touch type based on the confidence values of the touch operations of the target object includes:
and determining the type to which the touch operation corresponding to the maximum value of the confidence value of the touch operation of the target object belongs as the touch type.
The confidence value of each touch operation of the target object can reflect the credibility of each touch operation, and the higher the confidence value is, the higher the credibility of the touch operation is, so here, the maximum value of the confidence value of the touch operation of the target object is selected, and the type to which the touch operation corresponding to the maximum value belongs is determined as the touch type.
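For illustration only (this is not part of the claimed method), the max-confidence selection described above can be sketched in Python as follows; the candidate structure, field names and confidence values are assumptions:

```python
# Minimal sketch: choose the touch type whose touch-operation confidence is highest.
# The candidate list below is illustrative; real recognition outputs vary.

candidates = [
    {"operation": "slide_up", "type": "slide", "confidence": 0.62},
    {"operation": "page_down", "type": "page_turn", "confidence": 0.91},
    {"operation": "click", "type": "click", "confidence": 0.35},
]

def determine_touch_type(candidates):
    # With a single match its type is used directly; with several matches,
    # the type of the operation with the maximum confidence value is chosen.
    best = max(candidates, key=lambda c: c["confidence"])
    return best["type"]

print(determine_touch_type(candidates))  # -> "page_turn"
```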
S103: determining a first distance between a target object and an electronic device;
here, after the target object is obtained through touch recognition, a first distance between the target object and the electronic device needs to be further determined, where it should be noted that the first distance may be a first distance between the target object and a screen of the electronic device, may also be a first distance between the target object and a back plate of the electronic device, and may also be a first distance between the target object and an image sensor of the electronic device, which is not specifically limited in this embodiment of the present application.
It can be understood that, if each image frame in the video sequence acquired by using the front-facing camera further corresponds to a depth image, the first distance between the target object and the electronic device may be determined according to the depth image, and the first distance between the target object and the electronic device may also be determined according to boundary information of the target object, where this is not specifically limited in this embodiment of the present application.
When the first distance between the target object and the electronic device is determined through the boundary information of the target object, since each image frame in the video sequence corresponds to one piece of boundary information, the first distance may be determined according to the boundary information of the target object in every image frame, or according to the boundary information of the target object in a single selected image frame; this is not specifically limited in the embodiments of the present application.
To enable determining the first distance between the target object and the electronic device according to the boundary information of the target object in each image frame, in an alternative embodiment, S103 may include:
selecting boundary information of a target object from the boundary information of the target object in each image frame of the video sequence;
and determining a first distance between the target object and the electronic equipment according to the boundary information of the selected target object.
Here, after the boundary information of the target object in each image frame of the video sequence is obtained through touch recognition, the boundary information of one target object is selected from the boundary information of the target object in each image frame, and the first distance between the target object and the electronic device is determined according to the selected boundary information of the one target object.
In an alternative embodiment, when the boundary information is represented by a bounding box, determining a first distance between the target object and the electronic device according to the boundary information of the selected target object includes:
determining a second distance between each boundary of the bounding box and a corresponding edge of a screen of the electronic device;
determining a target boundary according to a second distance between each boundary and the corresponding edge of the screen;
and determining a first distance between the target object and the electronic equipment by utilizing the relation between the preset boundary length value and the distance between the target object and the electronic equipment according to the length value of the target boundary.
It can be understood that, when the boundary information is represented by a bounding box, in determining the first distance between the target object and the electronic device according to the bounding box of the selected target object, a second distance between each boundary of the bounding box and the corresponding edge of the screen is determined first. By judging these second distances, it can be determined whether the bounding box contains the complete target object, that is, whether the target object is truncated by the screen.
After the second distance between each boundary of the bounding box and the corresponding edge of the screen is determined, whether the target object is truncated by the screen can be determined from these second distances. On this basis, one boundary is selected from the boundaries as the target boundary; when the target object is not truncated, or is only partially truncated, the length value of the target boundary, namely its width value or height value, remains available. The first distance between the target object and the electronic device is then determined using the length value of the selected target boundary.
It should be noted that two target boundaries may also be selected; for example, both the width value and the height value may be selected, and the width value and the height value of the bounding box are then both used to determine the first distance between the target object and the electronic device.
Therefore, once a usable boundary has been screened out, the first distance between the target object and the electronic device can be determined using the target boundary, and the touch operation parameter at the first distance can then be determined.
In order to select an uncut or partially cut target object to obtain an available target boundary to improve the accuracy of the first distance between the target object and the electronic device, in an alternative embodiment, determining the target boundary according to a second distance between each boundary of the bounding box and a corresponding edge of a screen of the electronic device includes:
when the second distance between each boundary and the corresponding edge of the screen is larger than a preset threshold value, selecting one boundary from each boundary;
when only one second distance between each boundary and the corresponding edge of the screen is smaller than or equal to a preset threshold value, selecting the boundary of which the second distance between the boundary and the corresponding edge of the screen is smaller than or equal to the preset threshold value from each boundary;
and determining the selected boundary as a target boundary.
That is, the second distance between each boundary and the corresponding edge of the screen is compared with a preset threshold. If all the second distances are greater than the preset threshold, the target object in the bounding box is not truncated by the screen, so any one boundary can be selected from the boundaries as the target boundary. If exactly one boundary has a second distance to the corresponding screen edge that is smaller than or equal to the preset threshold, the target object in the bounding box is truncated by the screen; although the target object is truncated, that boundary can still reflect the size of the target object, so the boundary whose second distance is smaller than or equal to the preset threshold can be selected as the target boundary, and the first distance between the target object and the electronic device is determined using its length value.
In addition, if the second distances of two boundaries are smaller than or equal to the preset threshold, the target object in the bounding box is truncated in both dimensions, and neither the height value nor the width value of the bounding box can reflect the size of the target object. In this case the first distance between the target object and the electronic device cannot be calculated from the length value and/or the width value of the bounding box, and prompt information may be generated to indicate that the target object does not conform to the control range of the electronic device.
Taking the target object as a hand and the bounding box as a gesture box as an example: FIG. 3a is a schematic diagram of example one of an alternative gesture box provided in an embodiment of the present application. As shown in FIG. 3a, the second distances between each boundary of the gesture box and the corresponding screen edges are all greater than the preset threshold, so the gesture is not truncated by the screen, and any boundary of the gesture box may be selected as the target boundary to determine the first distance between the hand and the electronic device. FIG. 3b shows example two: only one of the second distances is smaller than or equal to the preset threshold, so the hand is truncated by the screen, but the width boundary of the gesture box may be selected as the target boundary to determine the first distance between the hand and the electronic device. FIG. 3c shows example three: again only one second distance is smaller than or equal to the preset threshold, so the hand is truncated, but the height boundary of the gesture box may be selected as the target boundary. FIG. 3d shows example four: the second distances of two boundaries, a height boundary and a width boundary, are smaller than or equal to the preset threshold, so the hand is truncated in both dimensions; in this case no boundary of the gesture box can be selected as the target boundary, and prompt information may be generated to indicate that the target object does not conform to the control range of the electronic device.
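As a hedged sketch of the target-boundary selection just described (the boundary names, threshold value and return convention are assumptions, not taken from the patent):

```python
# Hypothetical sketch: select the target boundary from the second distances
# between the four bounding-box boundaries and the corresponding screen edges.

THRESHOLD = 10  # preset threshold in pixels (assumed value)

def select_target_boundary(second_distances):
    """second_distances maps a boundary name to its distance from the matching
    screen edge; 'top'/'bottom' are width boundaries (length = box width) and
    'left'/'right' are height boundaries (length = box height)."""
    near = [b for b, d in second_distances.items() if d <= THRESHOLD]
    if not near:
        # Not truncated: every boundary is usable, so pick any one of them.
        return next(iter(second_distances))
    if len(near) == 1:
        # Truncated on one side only: the boundary at the screen edge still
        # reflects the object's size in the other dimension, so select it.
        return near[0]
    # Truncated in both dimensions: neither width nor height is reliable;
    # the caller should prompt that the object is out of the control range.
    return None

print(select_target_boundary({"top": 40, "bottom": 3, "left": 55, "right": 60}))
# -> "bottom" (a width boundary, as in the FIG. 3b case)
```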
In an optional embodiment, calculating, according to the length value of the target boundary, a first distance between the target object and the electronic device by using a relationship between a preset boundary length value and a distance from the target object to the electronic device, includes:
and when the target boundary is a width boundary, calculating to obtain a first distance between the target object and the electronic equipment according to the length value of the target boundary and the relation between the preset boundary width value and the distance between the target object and the electronic equipment.
The relationship between the boundary width value and the distance from the target object to the electronic device is stored in the electronic device in advance, and then, after the width value of the target boundary is known, the first distance between the target object and the electronic device can be calculated and obtained by using the relationship between the preset boundary width value and the distance from the target object to the electronic device according to the width value of the target boundary.
It is understood that there is a relationship between the boundary width value and the distance from the target object to the electronic device, and by using this relationship, the first distance between the target object and the electronic device can be calculated, taking the target object as a hand and the bounding box as a gesture box as an example, the relationship between the boundary width value and the distance from the target object to the electronic device can be determined as follows:
recording the width W1 of the gesture box when the distance between the hand of the user and the electronic equipment is D1; recording the width W2 of the gesture box when the distance between the hand of the user and the electronic equipment is D2; therefore, when the distance is D, the current gesture box W may be calculated by:
W=W1+(D-D1)(W2-W1)/(D2-D1) (1)
it may be determined that the relationship between the boundary width value and the distance from the target object to the electronic device may be:
D=(W-W1)(D2-D1)/(W2-W1)+D1 (2)
the W1 and the W2 are width values of the gesture box when the hand is not intercepted by the screen.
In an optional embodiment, calculating, according to the length value of the target boundary, a first distance between the target object and the electronic device by using a relationship between a preset boundary length value and a distance from the target object to the electronic device, includes:
and when the target boundary is a height boundary, calculating to obtain a first distance between the target object and the electronic equipment according to the length value of the target boundary and the relation between the preset boundary height value and the distance between the target object and the electronic equipment.
The relationship between the boundary height value and the distance from the target object to the electronic device is stored in the electronic device in advance, and then, after the height value of the target boundary is obtained, the first distance between the target object and the electronic device can be calculated and obtained by using the relationship between the preset boundary height value and the distance from the target object to the electronic device according to the boundary height value.
It is understood that there is a certain relationship between the boundary height value and the distance from the target object to the electronic device, and by using this relationship, the first distance between the target object and the electronic device can be calculated, taking the target object as a hand and the bounding box as a gesture box as an example, the relationship between the boundary height value and the distance from the target object to the electronic device can be determined as follows:
recording the height H1 of the gesture box when the distance between the hand of the user and the electronic device is D1; recording the height H2 of the gesture box when the distance between the hand of the user and the electronic device is D2; therefore, when the distance is D, the current gesture box height H may be calculated as follows:
H=H1+(D-D1)(H2-H1)/(D2-D1) (3)
thus, it can be determined that the relationship between the boundary height value and the distance from the target object to the electronic device may be:
D=(H-H1)(D2-D1)/(H2-H1)+D1 (4)
the H1 and the H2 are height values of the gesture box when the hand is not intercepted by the screen.
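The linear relations in formulas (1) to (4) can be illustrated with the following Python sketch; the two calibration points (D1, W1, H1) and (D2, W2, H2) use assumed values:

```python
# Sketch of formulas (2) and (4): infer the hand-to-screen distance from the
# gesture-box width or height by linear interpolation between two calibration
# measurements. All numeric values below are assumptions for illustration.

D1, D2 = 20.0, 50.0    # calibration distances (cm)
W1, W2 = 300.0, 120.0  # gesture-box widths at D1 and D2 (pixels)
H1, H2 = 320.0, 130.0  # gesture-box heights at D1 and D2 (pixels)

def distance_from_width(w):
    # Formula (2): D = (W - W1)(D2 - D1)/(W2 - W1) + D1
    return (w - W1) * (D2 - D1) / (W2 - W1) + D1

def distance_from_height(h):
    # Formula (4): D = (H - H1)(D2 - D1)/(H2 - H1) + D1
    return (h - H1) * (D2 - D1) / (H2 - H1) + D1

print(distance_from_width(210.0))   # box width midway between W1 and W2 -> 35.0
print(distance_from_height(225.0))  # box height midway between H1 and H2 -> 35.0
```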
In addition to determining the first distance between the target object and the electronic device using the boundary information of one selected target object, the first distance may also be determined according to the boundary information of the target object in every image frame. In an alternative embodiment, after the boundary information of the target object in each image frame of the video sequence is acquired, the method further includes:
determining the distance between the target object in each image frame and the electronic device according to the boundary information of the target object in that image frame;
an average of distances between the target object and the electronic device in each image frame is determined as a first distance between the target object and the electronic device.
It can be understood that the distance between the target object in each image frame and the electronic device is determined using the boundary information of the target object in that image frame; the specific implementation is the same as that of determining the first distance according to the boundary information of a selected target object, and details are not repeated here.
After determining the distance between the target object and the electronic device in each image frame, an average value algorithm may be used to calculate an average value of the distances between the target object and the electronic device in each image frame, and the average value may be determined as the first distance between the target object and the electronic device.
Further, in order to determine the accurate first distance, in an alternative embodiment, determining an average value of the distances between the target object and the electronic device in each image frame as the first distance between the target object and the electronic device includes:
when the absolute value of the difference value between any two of the distances between the target object and the electronic equipment in each image frame is smaller than or equal to a preset error threshold value, determining the average value of the distances between the target object and the electronic equipment in each image frame as the first distance between the target object and the electronic equipment.
Here, before calculating the average value, it is determined whether the absolute value of the difference between any two of the per-frame distances between the target object and the electronic device is less than or equal to a preset error threshold. If so, the distance between the target object and the electronic device only jitters within the preset error threshold without a large change, so the average value may be used as the first distance between the target object and the electronic device.
In addition, in an optional embodiment, the method further includes:
and when the absolute value of the difference between some two of the distances between the target object and the electronic device in each image frame is greater than a preset error threshold, controlling the electronic device to execute the target function corresponding to the touch type according to a preset touch operation parameter at a standard distance.
That is, when the absolute value of the difference between some pair of per-frame distances is greater than the preset error threshold, the distance between the target object and the electronic device fluctuates greatly, so a reliable first distance cannot be determined. In that case the electronic device may be directly controlled to execute the target function corresponding to the touch type according to the preset touch operation parameter at the standard distance.
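A minimal sketch of this per-frame averaging with the jitter check, assuming an error threshold and a fallback convention (both illustrative, not values specified by the patent):

```python
# Sketch: average the per-frame distances only when they are stable; otherwise
# signal the caller to fall back to the standard-distance touch parameters.

ERROR_THRESHOLD = 2.0  # preset error threshold in cm (assumed value)

def first_distance(per_frame_distances):
    d = per_frame_distances
    stable = all(abs(a - b) <= ERROR_THRESHOLD
                 for i, a in enumerate(d) for b in d[i + 1:])
    if stable:
        return sum(d) / len(d)  # average is used as the first distance
    return None  # large fluctuation: use preset standard-distance parameters

print(first_distance([24.8, 25.1, 25.3]))  # -> 25.06...
print(first_distance([24.8, 31.0, 25.3]))  # -> None
```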
S104: determining a touch operation parameter at a first distance;
after the first distance is determined, the touch operation parameter at the first distance may be determined. For example, for each touch operation or touch type, a correspondence between distance and touch operation parameter is stored in the electronic device, and this correspondence may be used to determine the touch operation parameter at the first distance; the touch operation parameter at the first distance may also be calculated with a preset parameter calculation formula. This is not specifically limited in the embodiments of the present application.
In an optional embodiment, the touch operation parameter includes any one of: the sliding distance of the sliding operation, the sliding speed of the sliding operation, the page turning speed of the page turning operation, the long pressing time of the long pressing operation and the clicking frequency of the clicking operation.
That is to say, the touch operation of the target object may be a sliding operation, a long-press operation, a clicking operation, or the like, the operation parameters of the sliding operation may include a sliding distance and a sliding speed, the operation parameters of the long-press operation may include a long-press time, and the operation parameters of the clicking operation may include a clicking frequency, which is not specifically limited in this embodiment of the present application.
In order to determine the touch operation parameter at the first distance, in an alternative embodiment, S104 may include:
calculating a sensitivity coefficient at a first distance based on a corresponding relation between a preset distance and the sensitivity coefficient;
and determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance.
Here, a correspondence between distance and sensitivity coefficient is stored in the electronic device in advance. After the first distance is determined, the sensitivity coefficient at the first distance may be calculated based on this correspondence, and the touch operation parameter at the first distance may then be determined from that sensitivity coefficient.
In an optional embodiment, determining the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance includes:
and calculating the touch operation parameter at the first distance by using the sensitivity coefficient at the first distance and a preset touch operation parameter at the standard distance.
It is to be understood that the touch operation parameter corresponding to each touch operation at a standard distance is stored in the electronic device. In determining the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance, for example, the product of the sensitivity coefficient at the first distance and the preset touch operation parameter at the standard distance is determined as the touch operation parameter at the first distance.
That is to say, the sensitivity coefficient at the first distance is used to determine the touch operation parameter at the first distance, so that target objects at different distances from the electronic device obtain different touch operation parameters even for the same touch operation. This expands the non-contact touch control function, makes the user's non-contact control of the electronic device more refined, and improves the user experience.
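One way to realize this distance-to-sensitivity correspondence is sketched below; the table entries and the linear interpolation between them are assumptions (the text only requires that some preset correspondence exists):

```python
# Sketch of S104: look up the sensitivity coefficient for the first distance in
# a preset correspondence, then scale the standard-distance parameter by it.

SENSITIVITY_TABLE = [(10.0, 1.15), (25.0, 1.0), (60.0, 0.65)]  # (distance cm, K)

def sensitivity_at(distance_cm):
    pts = SENSITIVITY_TABLE
    if distance_cm <= pts[0][0]:
        return pts[0][1]
    for (d0, k0), (d1, k1) in zip(pts, pts[1:]):
        if distance_cm <= d1:  # linear interpolation between table entries
            return k0 + (distance_cm - d0) * (k1 - k0) / (d1 - d0)
    return pts[-1][1]

def touch_parameter(distance_cm, standard_parameter):
    # Product of the sensitivity at the first distance and the preset
    # touch operation parameter at the standard distance.
    return sensitivity_at(distance_cm) * standard_parameter

print(touch_parameter(25.0, 1.0))  # -> 1.0 at the standard distance
print(touch_parameter(40.0, 1.0))  # -> 0.85, reduced sensitivity farther away
```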
S105: and controlling the electronic equipment to execute the target function corresponding to the touch type according to the touch operation parameter.
After the touch operation parameter is determined, the electronic device executes the target function corresponding to the touch type according to the determined touch operation parameter. The electronic device may control a page of the screen to execute the target function corresponding to the touch type according to the touch operation parameter, or may process data according to the touch operation parameter, for example a processing function on an image; this is not specifically limited in the embodiments of the present application.
For implementing control of the screen of the electronic device, in an alternative embodiment, S105 may include:
controlling a controlled object of a screen of the electronic equipment according to the touch operation parameters to execute a target function corresponding to the touch type;
wherein the controlled object includes any one of: pages, interfaces, controls; that is to say, the control method may implement control over an interface, or control over a certain control on a screen, in addition to implementing control over a page on the screen, and this is not specifically limited in this embodiment of the present application.
Taking the page sliding as an example, in an optional embodiment, according to the touch operation parameter, the controlled object of the screen of the electronic device is controlled to execute the target function corresponding to the touch type, including:
and controlling the page of the screen according to the sliding distance of the sliding operation to execute the sliding function.
Illustratively, when a user slides a page with the palm, the electronic device responds to the sliding action of the palm, determines that the corresponding operation is a sliding operation, determines the distance between the palm and the screen, and then determines the sliding distance of the sliding operation at that distance; the electronic device then controls the page to execute the sliding function corresponding to the sliding operation according to that sliding distance.
The control method in one or more of the above embodiments is described below by way of example.
The control method is described below by taking a target object as a hand and a boundary box as a gesture box as an example, and fig. 4 is a schematic diagram of an example one of an optional control method provided in the embodiment of the present application, as shown in fig. 4, the control method may include:
s401: capturing a gesture picture;
in S401, the gesture interaction application may start the front camera, and capture a gesture image through the front camera.
S402: detecting a gesture;
S403: acquiring a gesture box;
In S402-S403, after the picture is captured, the picture is detected to obtain a gesture category, a gesture confidence and a gesture box. For the gesture detection, FIG. 5 is a schematic diagram of example two of an alternative control method provided in an embodiment of the present application; as shown in FIG. 5, the gesture detection method may include:
s501: acquiring a picture containing a gesture;
s502: performing model reasoning on the picture;
s503: post-processing the model;
s504: non-maxima suppression;
s505: and obtaining a gesture box.
After the picture including the gesture is acquired, the detection mainly involves operations such as model inference, model post-processing and non-maximum suppression, so that the gestures in a picture can be detected. Each detection result includes a gesture box, a gesture category (the category corresponding to the hand's gesture change, which corresponds to a touch type) and a gesture confidence (the confidence of the touch type corresponding to the gesture category, for example 0.9). If detection results of multiple gesture categories exist in the picture, the gesture category with the highest confidence is taken as the final detection result.
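A hedged sketch of this S501-S505 flow is given below; the detector itself is mocked with fixed boxes, and the box format (x1, y1, x2, y2, confidence, category) is an assumption:

```python
# Sketch: post-processed detections go through non-maximum suppression, and the
# gesture category with the highest confidence becomes the final result.

def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2, ...).
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def nms(dets, iou_thresh=0.5):
    dets = sorted(dets, key=lambda d: d[4], reverse=True)
    kept = []
    for d in dets:
        if all(iou(d, k) <= iou_thresh for k in kept):
            kept.append(d)
    return kept

# Mock detections for one captured frame (values assumed for illustration).
detections = [
    (100, 120, 300, 380, 0.90, "palm_swipe"),
    (105, 125, 295, 375, 0.72, "palm_swipe"),  # near-duplicate, suppressed
    (400, 200, 460, 280, 0.40, "pinch"),
]
kept = nms(detections)
best = max(kept, key=lambda d: d[4])  # highest-confidence gesture category
print(best[5], best[4])               # -> palm_swipe 0.9
```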
S404: calculating the distance between the hand and the screen;
Here, the distance between the user's hand and the mobile phone screen is calculated from the gesture box. The distance between the hand and the screen can be calculated as follows:
First, record the width and height of the gesture box when the distance between the hand and the screen is D1. FIG. 6a is a schematic diagram of example five of an alternative gesture box provided in an embodiment of the present application; as shown in FIG. 6a, when the distance between the user's hand and the screen is D1, the width W1 and the height H1 of the gesture box are recorded.
Then, record the width and height of the gesture box when the distance between the hand and the screen is D2. FIG. 6b is a schematic diagram of example six of an alternative gesture box provided in an embodiment of the present application; as shown in FIG. 6b, when the distance between the hand and the screen is D2, the width W2 and the height H2 of the gesture box are recorded.
Finally, FIG. 6c is a schematic diagram of example seven of an alternative gesture box provided in an embodiment of the present application; as shown in FIG. 6c, when the distance is D, the current gesture box width W and height H are calculated by formula (1) and formula (3).
Conversely, when the width and height of the gesture box are W and H, the current distance D between the hand and the screen can be obtained through formula (2) and formula (4).
It should be noted that, when calculating the distance between the hand and the screen, the distance may be calculated from either the width or the height of the gesture box; when the hand is at the edge of the screen, it may be truncated by the screen. If neither the width nor the height is available, the distance between the hand and the screen cannot be calculated.
S405: updating the sensitivity coefficient of the touch type corresponding to the gesture;
in S405, the sensitivity coefficient of the touch type corresponding to the gesture is adjusted according to the distance between the user hand and the mobile phone screen.
S406: and controlling the page sliding distance.
In S406, the sliding distance of the page is calculated according to the sensitivity coefficient of the touch type corresponding to the current gesture.
After the distance between the hand and the screen is calculated in S404, the sensitivity coefficient of the touch type corresponding to the gesture may be adjusted. Assuming the sensitivity coefficient is 1 when the distance between the hand and the screen is 25 cm, then at a distance D (10 cm < D < 60 cm), the updated sensitivity coefficient K of the touch type corresponding to the gesture is:
K=1-0.01*(D-25) (5)
that is, when the distance between the hand and the screen is less than 25cm, the sensitivity coefficient of the touch type corresponding to the gesture is improved; when the distance between the hand and the screen is more than 25cm, the sensitivity of the touch type corresponding to the gesture is reduced.
When calculating the sliding distance of the mobile phone page according to the sensitivity coefficient of the touch type corresponding to the current gesture, take the sliding type as an example: suppose that when the distance between the hand and the screen is the preset 25 cm, one sliding operation of the hand slides the mobile phone page by a distance M0; then, when the distance between the hand and the screen is D, one sliding operation of the gesture slides the mobile phone page by a distance M:
M=M0*K=M0*(1-0.01*(D-25)) (6)
Finally, the distance D between the hand and the screen obtained from formula (2) or formula (4) is substituted, and the page is controlled according to D to execute the sliding function corresponding to the sliding operation.
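Formulas (5) and (6) of this worked example translate directly into the following sketch; only the reference slide distance M0 is an assumed value:

```python
# Sketch of formulas (5) and (6): update the sensitivity coefficient K from the
# hand-to-screen distance D, then scale the per-swipe page sliding distance.

M0 = 1.0  # page slide distance per swipe at the 25 cm reference (assumed unit)

def sliding_distance(d_cm):
    if not (10 < d_cm < 60):
        raise ValueError("distance outside the supported 10-60 cm range")
    k = 1 - 0.01 * (d_cm - 25)  # formula (5)
    return M0 * k               # formula (6): M = M0 * K

print(sliding_distance(20))  # hand closer than 25 cm -> longer slide (1.05)
print(sliding_distance(40))  # hand farther away -> shorter slide (0.85)
```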
FIG. 7a is a schematic layout diagram of example one of an alternative screen provided in an embodiment of the present application. As shown in FIG. 7a, a communication record list is displayed on the screen, and the user can control the electronic device to slide the screen page by swinging the four fingers other than the thumb up and down. FIG. 7b shows example two: when the distance between the user's hand and the screen is 25 cm, each up-and-down swing of the four fingers slides the page by the distance of one communication record, so the first communication record displayed on the page becomes "4-17 sisters access". FIG. 7c shows example three: when the distance between the user's hand and the screen is 26 cm, each up-and-down swing of the four fingers slides the page by the distance of two communication records, so the first communication record displayed on the page becomes "4-16 lee four outgoing call". In this way, the page sliding distance corresponding to each up-and-down swing of the four fingers is determined by the distance between the hand and the screen, and different distances correspond to different sliding distances.
In this example, the sliding distance is intelligently adjusted based on the sensitivity coefficient obtained from gesture detection: the current distance between the hand and the screen is calculated from the gesture detection result, the current gesture sensitivity coefficient is adjusted according to that distance, and the adjustment acts on the sliding-distance control of the page while the user operates the mobile phone. A lower sensitivity coefficient is obtained when the user's hand is farther from the screen, and a higher sensitivity coefficient when the hand is closer.
In this way, fine-grained page control is realized: the user can choose a comfortable distance for gesture control of the page according to personal usage habits, which improves the user experience of gesture-based page control.
In addition, it should be noted that this example can be used not only for adjusting the sensitivity coefficient of the sliding distance in gesture interaction, but also for the sliding speed of the sliding operation, the click frequency of the click operation, the page turning speed of the page turning operation, the long-press time of the long-press operation, the response time of a return-to-previous-level operation, the moving distance of a character-moving operation in a game, and the like; it can also be applied to control the sensitivity coefficients of novel interactions such as face recognition, body recognition and eye gaze, so as to provide a more comfortable user experience in these scenarios.
The embodiment of the application provides a control method, comprising: when the non-contact touch control function of the electronic device is in an open state, acquiring a video sequence corresponding to a non-contact touch operation; performing touch recognition on the video sequence to obtain a target object and a touch type; determining a first distance between the target object and the electronic device; determining a touch operation parameter at the first distance; and controlling the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter. That is to say, when implementing the non-contact touch control function of the electronic device, the embodiment of the application determines the first distance between the target object and the electronic device, then determines the touch operation parameter at that first distance, and then controls the controlled object of the electronic device to execute the target function corresponding to the touch type according to that parameter. For the same touch type, different touch operation parameters can be determined at different first distances, so the electronic device can adapt its response to the touch type according to the distance between the target object and the electronic device.
Based on the same inventive concept as the foregoing embodiment, an embodiment of the present application provides an electronic device, and fig. 8 is a schematic structural diagram of an optional electronic device provided in the embodiment of the present application, and as shown in fig. 8, the electronic device includes:
the obtaining module 81 is configured to obtain a video sequence corresponding to a non-contact touch operation when a non-contact touch control function of the electronic device is in an on state;
the processing module 82 is configured to perform touch identification on the video sequence to obtain a target object and a touch type;
a first determining module 83, configured to determine a first distance between the target object and the electronic device;
a second determining module 84, configured to determine a touch operation parameter at the first distance;
the control module 85 is configured to control the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter.
In an alternative embodiment, the first determining module 83 is specifically configured to:
selecting boundary information of a target object from the boundary information of the target object in each image frame of the video sequence;
and determining a first distance between the target object and the screen of the electronic equipment according to the boundary information of the selected target object.
In an alternative embodiment, when the boundary information is represented by a bounding box, the determining, by the first determining module 83, a first distance between the target object and the electronic device according to the boundary information of the selected target object includes:
determining a second distance between each boundary of the bounding box and a corresponding edge of a screen of the electronic device;
determining a target boundary according to a second distance between each boundary and the corresponding edge of the screen;
and determining the first distance between the target object and the electronic device according to the length value of the target boundary, by using a relation between a preset boundary length value and the distance from the target object to the electronic device.
In an alternative embodiment, the determining, by the first determining module 83, the target boundary according to the second distance between each boundary of the bounding box and the corresponding edge of the screen includes:
when the second distances between the boundaries and the corresponding edges of the screen are all larger than a preset threshold, selecting one boundary from the boundaries;
when, among the second distances between the boundaries and the corresponding edges of the screen, only one second distance is smaller than or equal to the preset threshold, selecting, from the boundaries, the boundary whose second distance to the corresponding edge of the screen is smaller than or equal to the preset threshold;
and determining the selected boundary as the target boundary (a sketch of this selection rule is given below).
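For readability only, the selection rule can be sketched as follows, under the assumptions that the bounding box is given as (left, top, right, bottom) in screen pixels and that the length of a vertical boundary equals the box height while the length of a horizontal boundary equals the box width; the function and variable names are illustrative and not taken from the embodiment:

    def select_target_boundary(box, screen_w, screen_h, threshold):
        left, top, right, bottom = box
        # second distance between each boundary of the bounding box and the
        # corresponding edge of the screen
        second_distances = {
            "left": left,                 # vertical boundary, length = box height
            "right": screen_w - right,    # vertical boundary, length = box height
            "top": top,                   # horizontal boundary, length = box width
            "bottom": screen_h - bottom,  # horizontal boundary, length = box width
        }
        near_edge = [b for b, d in second_distances.items() if d <= threshold]
        if not near_edge:
            # every second distance exceeds the threshold: the object is fully
            # inside the frame, so any one boundary may be selected
            return "top"
        if len(near_edge) == 1:
            # exactly one boundary lies at the screen edge: select it, since its
            # own length runs along the direction that is not clipped
            return near_edge[0]
        return None  # more than one boundary at the edge is not specified above

For example, with a 480 x 800 screen and a bounding box (0, 50, 220, 400), only the left boundary touches the screen edge, so the left boundary (whose length is the box height) is selected.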
In an optional embodiment, the calculating, by the first determining module 83, of the first distance between the target object and the electronic device according to the length value of the target boundary and the relation between the preset boundary length value and the distance from the target object to the electronic device includes:
when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, by using a relation between a preset boundary width value and the distance from the target object to the electronic device.
In an optional embodiment, the calculating, by the first determining module 83, of the first distance between the target object and the electronic device according to the length value of the target boundary and the relation between the preset boundary length value and the distance from the target object to the electronic device includes:
when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, by using a relation between a preset boundary height value and the distance from the target object to the electronic device (both cases are sketched together below).
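Both cases can be sketched together, assuming a simple inverse-proportion relation calibrated at a reference distance; the embodiment only states that a preset relation is used, so the numeric constants below are invented for illustration:

    REF_DISTANCE_CM = 30.0   # assumed calibration distance
    REF_WIDTH_PX = 200.0     # assumed boundary width value at the calibration distance
    REF_HEIGHT_PX = 260.0    # assumed boundary height value at the calibration distance

    def first_distance_from_boundary(length_px, boundary_kind):
        # the apparent length of a boundary shrinks roughly in inverse
        # proportion to the distance from the target object to the device
        reference = REF_WIDTH_PX if boundary_kind == "width" else REF_HEIGHT_PX
        return REF_DISTANCE_CM * reference / length_px

Under these invented constants, a width boundary measured at 100 pixels would yield a first distance of 60 cm.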
In an alternative embodiment, the first determining module 83 is specifically configured to:
determining a distance between the target object in each image frame and the electronic device according to the boundary information of the target object in each image frame of the video sequence;
and determining an average of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device.
In an alternative embodiment, the determining, by the first determining module 83, of an average of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device includes:
when the absolute value of the difference between any two of the distances between the target object and the electronic device in the image frames is smaller than or equal to a preset error threshold, determining the average of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device.
In an alternative embodiment, the electronic device is further configured to:
and when the absolute value of the difference between any two of the distances between the target object and the electronic device in the image frames is greater than the preset error threshold, controlling the electronic device to execute the target function corresponding to the touch type according to a preset touch operation parameter at a standard distance (the averaging rule and this fallback are sketched below).
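The consistency check above compares every pair of per-frame distance estimates, which is equivalent to comparing only the largest and smallest estimates; a minimal sketch, with illustrative names, is:

    def fuse_frame_distances(distances, error_threshold):
        # all pairwise absolute differences are within the threshold if and
        # only if the gap between the maximum and the minimum is within it
        if max(distances) - min(distances) <= error_threshold:
            return sum(distances) / len(distances)  # first distance
        return None  # caller falls back to the parameter at the standard distance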
In an alternative embodiment, the second determining module 84 is specifically configured to:
calculating the sensitivity coefficient at the first distance based on a preset correspondence between distance and sensitivity coefficient;
and determining the touch operation parameters at the first distance according to the sensitivity coefficient at the first distance.
In an alternative embodiment, the determining, by the second determining module 84, the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance includes:
and calculating the touch operation parameter at the first distance by using the sensitivity coefficient at the first distance and a preset touch operation parameter at the standard distance (see the sketch below).
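As a sketch only, assuming a linear correspondence normalized to 1.0 at the standard distance (the embodiment leaves the preset correspondence open, so it could equally be a lookup table):

    STANDARD_DISTANCE_CM = 30.0  # assumed standard distance

    def sensitivity_at(first_distance_cm):
        # assumed preset correspondence between distance and sensitivity coefficient
        return first_distance_cm / STANDARD_DISTANCE_CM

    def touch_parameter_at(first_distance_cm, standard_param):
        # touch operation parameter at the first distance
        return sensitivity_at(first_distance_cm) * standard_param

Under this assumed correspondence, touch_parameter_at(60.0, 100) doubles a standard sliding distance of 100 pixels for a target object at 60 cm.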
In an alternative embodiment, the processing module 82 performs touch recognition on the video sequence to obtain the target object and the touch type, including:
performing touch recognition on the video sequence to obtain the target object, a bounding box of the target object, and confidence values of the touch operations of the target object, wherein the bounding box represents the boundary information;
and determining the touch type based on the confidence value of the touch operation of the target object.
In an alternative embodiment, the determining, by the processing module 82, the touch type based on the confidence value of the touch operation of the target object includes:
and determining, as the touch type, the type to which the touch operation with the maximum confidence value belongs (see the sketch below).
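This selection is a plain argmax over the per-operation confidence values; assuming the recognizer returns them as a dictionary, a one-line sketch is:

    def touch_type_from_confidences(confidences):
        # e.g. {"slide": 0.91, "click": 0.05, "long_press": 0.04} -> "slide"
        return max(confidences, key=confidences.get)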
In an optional embodiment, the touch operation parameter includes any one of: the sliding distance of a sliding operation, the sliding speed of a sliding operation, the page-turning speed of a page-turning operation, the press duration of a long-press operation, and the click frequency of a click operation.
In an alternative embodiment, the control module 85 is specifically configured to:
controlling a controlled object of a screen of the electronic equipment according to the touch operation parameters to execute a target function corresponding to the touch type; wherein the controlled object includes any one of: pages, interfaces, and controls.
In an alternative embodiment, the control module 85 controlling the controlled object of the screen of the electronic device according to the touch operation parameter to execute the target function corresponding to the touch type includes:
and controlling the page of the screen to execute the sliding function according to the sliding distance of the sliding operation (a minimal illustration follows).
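A minimal illustration, assuming a hypothetical Page object with a scroll_by() method; the embodiment names pages, interfaces, and controls as possible controlled objects but does not prescribe any particular interface:

    class Page:
        def __init__(self):
            self.offset_px = 0

        def scroll_by(self, distance_px):
            # executing the sliding function on the page of the screen
            self.offset_px += distance_px

    def execute_slide(page, sliding_distance_px):
        # sliding_distance_px is the touch operation parameter at the first distance
        page.scroll_by(sliding_distance_px)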
In practical applications, the obtaining module 81, the processing module 82, the first determining module 83, the second determining module 84, and the control module 85 may be implemented by a processor located on the electronic device, specifically, a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 9 is a schematic structural diagram of another alternative electronic device provided in an embodiment of the present application, and as shown in fig. 9, an embodiment of the present application provides an electronic device 900, including:
a processor 91, and a storage medium 92 storing instructions executable by the processor 91, the storage medium 92 performing operations in dependence on the processor 91 through a communication bus 93; when the instructions are executed by the processor 91, the control method described in one or more of the foregoing embodiments is performed.
It should be noted that, in practical applications, the various components of the terminal are coupled together through the communication bus 93. It can be understood that the communication bus 93 is configured to enable connection and communication between these components. In addition to a data bus, the communication bus 93 includes a power bus, a control bus, and a status signal bus. However, for clarity of illustration, the various buses are all labeled as the communication bus 93 in fig. 9.
The embodiment of the present application provides a computer storage medium storing executable instructions; when the executable instructions are executed by one or more processors, the processors perform the control method described in one or more of the foregoing embodiments.
The computer-readable storage medium may be a ferromagnetic random access memory (FRAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, a magnetic surface memory, an optical disc, a compact disc read-only memory (CD-ROM), or the like.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application.

Claims (19)

1. A control method, applied to an electronic device, the method comprising:
when a non-contact touch control function of the electronic device is in an on state, acquiring a video sequence corresponding to a non-contact touch operation;
performing touch identification on the video sequence to obtain a target object and a touch type;
determining a first distance between the target object and the electronic device;
determining touch operation parameters at the first distance;
and controlling the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter.
2. The method of claim 1, wherein determining the first distance between the target object and the electronic device comprises:
selecting the boundary information of the target object in one image frame from the boundary information of the target object in each image frame of the video sequence;
and determining a first distance between the target object and the electronic device according to the selected boundary information of the target object.
3. The method of claim 2, wherein when the boundary information is represented by a bounding box, the determining the first distance between the target object and the electronic device according to the selected boundary information of the target object comprises:
determining a second distance between each boundary of the bounding box and a corresponding edge of a screen of the electronic device;
determining a target boundary according to a second distance between each boundary and a corresponding edge of the screen;
and determining the first distance between the target object and the electronic device according to the length value of the target boundary, by using a relation between a preset boundary length value and the distance from the target object to the electronic device.
4. The method of claim 3, wherein determining a target boundary based on a second distance between each boundary of the bounding box and a corresponding edge of the screen comprises:
when second distances between the boundaries and the corresponding edges of the screen are all larger than a preset threshold, selecting one boundary from the boundaries;
when, among the second distances between the boundaries and the corresponding edges of the screen, only one second distance is less than or equal to a preset threshold, selecting, from the boundaries, the boundary whose second distance to the corresponding edge of the screen is less than or equal to the preset threshold;
and determining the selected boundary as the target boundary.
5. The method of claim 3, wherein the determining, according to the length value of the target boundary, the first distance between the target object and the electronic device by using the relation between the preset boundary length value and the distance from the target object to the electronic device comprises:
when the target boundary is a width boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, by using a relation between a preset boundary width value and the distance from the target object to the electronic device.
6. The method of claim 3, wherein the determining, according to the length value of the target boundary, the first distance between the target object and the electronic device by using the relation between the preset boundary length value and the distance from the target object to the electronic device comprises:
when the target boundary is a height boundary, calculating the first distance between the target object and the electronic device according to the length value of the target boundary, by using a relation between a preset boundary height value and the distance from the target object to the electronic device.
7. The method of claim 1, wherein determining the first distance between the target object and the electronic device comprises:
determining a distance between the target object in each image frame and the electronic device according to boundary information of the target object in each image frame of the video sequence;
and determining an average of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device.
8. The method of claim 7, wherein the determining an average of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device comprises:
when the absolute value of the difference between any two of the distances between the target object and the electronic device in the image frames is smaller than or equal to a preset error threshold, determining the average of the distances between the target object and the electronic device in the image frames as the first distance between the target object and the electronic device.
9. The method of claim 8, further comprising:
and when the absolute value of the difference between any two of the distances between the target object and the electronic device in the image frames is greater than the preset error threshold, controlling the electronic device to execute the target function corresponding to the touch type according to a preset touch operation parameter at a standard distance.
10. The method of claim 1, wherein the determining the touch operation parameter at the first distance comprises:
calculating a sensitivity coefficient at the first distance based on a preset correspondence between distance and sensitivity coefficient;
and determining touch operation parameters at the first distance according to the sensitivity coefficient at the first distance.
11. The method of claim 10, wherein determining the touch operation parameter at the first distance according to the sensitivity coefficient at the first distance comprises:
and calculating the touch operation parameter at the first distance by using the sensitivity coefficient at the first distance and a preset touch operation parameter at a standard distance.
12. The method of claim 1, wherein the performing touch recognition on the video sequence to obtain a target object and a touch type comprises:
performing touch recognition on the video sequence to obtain the target object, a bounding box of the target object, and confidence values of touch operations of the target object, wherein the bounding box represents the boundary information;
determining the touch type based on the confidence value of the touch operation of the target object.
13. The method of claim 12, wherein the determining the touch type based on the confidence value of the touch operation of the target object comprises:
and determining, as the touch type, the type to which the touch operation with the maximum confidence value belongs.
14. The method according to any one of claims 1 to 13, wherein the touch operation parameter comprises any one of:
the sliding distance of a sliding operation, the sliding speed of a sliding operation, the page-turning speed of a page-turning operation, the press duration of a long-press operation, and the click frequency of a click operation.
15. The method according to any one of claims 1 to 13, wherein the controlling the electronic device to execute a target function corresponding to the touch type according to the touch operation parameter includes:
controlling a controlled object of a screen of the electronic device according to the touch operation parameter, so as to execute the target function corresponding to the touch type;
wherein the controlled object includes any one of: pages, interfaces, and controls.
16. The method according to claim 15, wherein the controlling a controlled object of a screen of the electronic device according to the touch operation parameter to execute a target function corresponding to the touch type includes:
and controlling the page of the screen according to the sliding distance of the sliding operation to execute the sliding function.
17. An electronic device, comprising:
the acquisition module is used for acquiring a video sequence corresponding to a non-contact touch operation when a non-contact touch control function of the electronic device is in an on state;
the processing module is used for performing touch identification on the video sequence to obtain a target object and a touch type;
a first determining module for determining a first distance between the target object and the electronic device;
the second determining module is used for determining the touch operation parameters at the first distance;
and the control module is used for controlling the electronic device to execute the target function corresponding to the touch type according to the touch operation parameter.
18. An electronic device, comprising:
a processor, and a storage medium storing instructions executable by the processor, the storage medium performing operations in dependence on the processor through a communication bus; when the instructions are executed by the processor, the control method of any one of claims 1 to 16 is performed.
19. A computer storage medium having stored thereon executable instructions which, when executed by one or more processors, perform a control method according to any one of claims 1 to 16.
CN202210507343.1A 2022-05-10 2022-05-10 Control method, electronic device and computer storage medium Pending CN114840086A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210507343.1A CN114840086A (en) 2022-05-10 2022-05-10 Control method, electronic device and computer storage medium
PCT/CN2022/141461 WO2023216613A1 (en) 2022-05-10 2022-12-23 Control method, electronic device and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210507343.1A CN114840086A (en) 2022-05-10 2022-05-10 Control method, electronic device and computer storage medium

Publications (1)

Publication Number Publication Date
CN114840086A true CN114840086A (en) 2022-08-02

Family

ID=82568888

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210507343.1A Pending CN114840086A (en) 2022-05-10 2022-05-10 Control method, electronic device and computer storage medium

Country Status (2)

Country Link
CN (1) CN114840086A (en)
WO (1) WO2023216613A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8890812B2 (en) * 2012-10-25 2014-11-18 Jds Uniphase Corporation Graphical user interface adjusting to a change of user's disposition
CN110414495B (en) * 2019-09-24 2020-05-19 图谱未来(南京)人工智能研究院有限公司 Gesture recognition method and device, electronic equipment and readable storage medium
CN112947755A (en) * 2021-02-24 2021-06-11 Oppo广东移动通信有限公司 Gesture control method and device, electronic equipment and storage medium
CN114840086A (en) * 2022-05-10 2022-08-02 Oppo广东移动通信有限公司 Control method, electronic device and computer storage medium

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050025345A1 (en) * 2003-07-30 2005-02-03 Nissan Motor Co., Ltd. Non-contact information input device
US20090228841A1 (en) * 2008-03-04 2009-09-10 Gesture Tek, Inc. Enhanced Gesture-Based Image Manipulation
JP2010147784A (en) * 2008-12-18 2010-07-01 Fujifilm Corp Three-dimensional imaging device and three-dimensional imaging method
US20120218183A1 (en) * 2009-09-21 2012-08-30 Extreme Reality Ltd. Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US20120229377A1 (en) * 2011-03-09 2012-09-13 Kim Taehyeong Display device and method for controlling the same
US8902198B1 (en) * 2012-01-27 2014-12-02 Amazon Technologies, Inc. Feature tracking for device input
CN103017730A (en) * 2012-11-30 2013-04-03 中兴通讯股份有限公司 Single-camera ranging method and single-camera ranging system
CN105308536A (en) * 2013-01-15 2016-02-03 厉动公司 Dynamic user interactions for display control and customized gesture interpretation
US20160026254A1 (en) * 2013-03-14 2016-01-28 Lg Electronics Inc. Display device and method for driving the same
CN103472916A (en) * 2013-09-06 2013-12-25 东华大学 Man-machine interaction method based on human body gesture recognition
KR20150064597A (en) * 2013-12-03 2015-06-11 엘지전자 주식회사 Video display device and operating method thereof
US20170139482A1 (en) * 2014-06-03 2017-05-18 Lg Electronics Inc. Image display apparatus and operation method thereof
CN107291221A (en) * 2017-05-04 2017-10-24 浙江大学 Across screen self-adaption accuracy method of adjustment and device based on natural gesture
CN109782906A (en) * 2018-12-28 2019-05-21 深圳云天励飞技术有限公司 A kind of gesture identification method of advertisement machine, exchange method, device and electronic equipment
CN112394811A (en) * 2019-08-19 2021-02-23 华为技术有限公司 Interaction method for air-separating gesture and electronic equipment
CN111084606A (en) * 2019-10-12 2020-05-01 深圳壹账通智能科技有限公司 Vision detection method and device based on image recognition and computer equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023216613A1 (en) * 2022-05-10 2023-11-16 Oppo广东移动通信有限公司 Control method, electronic device and computer storage medium

Also Published As

Publication number Publication date
WO2023216613A1 (en) 2023-11-16

Similar Documents

Publication Publication Date Title
CN105488527B (en) Image classification method and device
CN108076290B (en) Image processing method and mobile terminal
CN109242765B (en) Face image processing method and device and storage medium
US20130009989A1 (en) Methods and systems for image segmentation and related applications
CN106951884A (en) Gather method, device and the electronic equipment of fingerprint
CN107291346B (en) Terminal device and method and device for processing drawing content of terminal device
EP3308536B1 (en) Determination of exposure time for an image frame
CN105512605A (en) Face image processing method and device
CN107368810A (en) Method for detecting human face and device
KR20170033805A (en) Human face recognition method, apparatus and terminal
CN107832836A (en) Model-free depth enhancing study heuristic approach and device
CN106980840A (en) Shape of face matching process, device and storage medium
CN107330868A (en) image processing method and device
EP3328062A1 (en) Photo synthesizing method and device
CN111432245B (en) Multimedia information playing control method, device, equipment and storage medium
CN107835359A (en) Triggering method of taking pictures, mobile terminal and the storage device of a kind of mobile terminal
CN113409342A (en) Training method and device for image style migration model and electronic equipment
CN107995417B (en) Photographing method and mobile terminal
CN114840086A (en) Control method, electronic device and computer storage medium
KR20210000671A (en) Head pose estimation
CN109947243B (en) Intelligent electronic equipment gesture capturing and recognizing technology based on touch hand detection
EP2888716B1 (en) Target object angle determination using multiple cameras
CN106446643B (en) Terminal control method and device
CN110069126B (en) Virtual object control method and device
CN113642551A (en) Nail key point detection method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination