CN112363659A - APP interface operation method and device, electronic equipment and storage medium - Google Patents

APP interface operation method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN112363659A
CN112363659A (application CN202011240993.1A)
Authority
CN
China
Prior art keywords
action
head
characteristic
sequence
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011240993.1A
Other languages
Chinese (zh)
Inventor
姚宏志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Puhui Enterprise Management Co Ltd
Original Assignee
Ping An Puhui Enterprise Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Puhui Enterprise Management Co Ltd filed Critical Ping An Puhui Enterprise Management Co Ltd
Priority to CN202011240993.1A priority Critical patent/CN112363659A/en
Publication of CN112363659A publication Critical patent/CN112363659A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G06F 18/232: Non-hierarchical techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to artificial intelligence technology and discloses an APP interface operation method comprising the following steps: receiving a head feature action sequence of a user of an APP by means of an attitude angle sensor; performing feature expansion on the head feature action sequence to obtain a transformed feature action sequence; calculating the distance value between the transformed feature action sequence and each feature action in a pre-constructed feature action cluster center library; and selecting the action tag corresponding to the feature action with the minimum distance value and performing the corresponding operation on the APP according to that tag. The invention also relates to blockchain technology: the feature action cluster center library may be stored in blockchain nodes. The invention further provides an APP interface operation device, an electronic device and a computer-readable storage medium. The method and device can solve the problems of poor scene adaptation of APP interface operation and an operation speed that is too high or too low.

Description

APP interface operation method and device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to an APP interface operation method, an APP interface operation device, electronic equipment and a computer readable storage medium.
Background
With the development of science and technology, it is no longer only paper books that require users to turn pages; software APPs on mobile phones, tablets and similar devices also require page-turning operations. For example, performers often play from musical scores recorded in a music score APP, so the user needs to turn pages during the performance.
Conventional APP interface page turning is mainly performed either by manually tapping the screen or by turning pages automatically at a fixed speed. Both modes achieve the page-turning purpose, but both adapt poorly to the scene: a performer's hands are occupied while playing, so there may be no time to turn pages manually, while fixed-speed page turning cannot interact with the user in real time, so pages turn too fast or too slow.
Disclosure of Invention
The invention provides an APP interface operation method and device, an electronic device and a computer-readable storage medium, and mainly aims to solve the problems of poor scene adaptation of APP interface operation and an operation speed that is too high or too low.
In order to achieve the above object, the present invention provides an APP interface operating method, including:
receiving a head characteristic action sequence of a user of the APP by using the attitude angle sensor;
performing feature expansion on the head feature action sequence to obtain a transformation feature action sequence;
calculating the distance value between the transformation characteristic action sequence and each characteristic action in a pre-constructed characteristic action clustering center library;
and selecting an action tag corresponding to the characteristic action with the minimum distance value, and executing corresponding operation on the APP according to the action tag.
Optionally, the performing feature extension on the head feature action sequence to obtain a transformed feature action sequence includes:
calculating the angle value, Euclidean distance value and average value of each head characteristic action in the head characteristic action sequence;
calculating an energy value for each head feature action within the sequence of head feature actions based on an energy formula;
and combining each head characteristic action and the corresponding angle value, Euclidean distance value, average value and energy value into a transformation characteristic action, and summarizing each transformation characteristic action to obtain the transformation characteristic action sequence.
Optionally, the calculating an angle value of the head feature action sequence includes:
calculating the angle value of the head characteristic action sequence by adopting the following calculation formula:
Angle_n = arccos( (A_n · A_{n+k}) / (|A_n| · |A_{n+k}|) ) × 180/π

wherein Angle_n represents the angle value of the nth head feature action in the head feature action sequence, A_n represents the nth head feature action in the sequence, A_{n+k} represents the (n+k)th head feature action, arccos represents the inverse cosine among the inverse trigonometric functions, and π represents the circumference ratio.
Optionally, the calculating an energy value of each head feature action within the sequence of head feature actions based on an energy formula includes:
calculating the energy value of the head characteristic action sequence by adopting the following calculation formula:
Energy_n = (a^x_n)^2 + (a^y_n)^2 + (a^z_n)^2 + (a^x_{n+k})^2 + (a^y_{n+k})^2 + (a^z_{n+k})^2

wherein Energy_n represents the energy value of the nth and (n+k)th head feature actions in the head feature action sequence, (a^x_n, a^y_n, a^z_n) represent the acceleration values of the nth head feature action on the x-, y- and z-axes respectively, and (a^x_{n+k}, a^y_{n+k}, a^z_{n+k}) represent the acceleration values of the (n+k)th head feature action on the x-, y- and z-axes respectively.
Optionally, before calculating the distance value between the transformed feature action sequence and each feature action in the pre-constructed feature action cluster center library, the method further includes:
acquiring a head characteristic action sequence set and an action tag set;
performing feature expansion on the head feature action sequence set to obtain a transformation feature action sequence set;
calculating the initial membership degree of the head feature action sequence set according to the action tag set, receiving the clustering times input by a user, and constructing a target function according to the clustering times and the initial membership degree;
and solving the minimum value of the target function according to a pre-constructed clustering function, and dividing the head characteristic action sequence set according to the minimum value to obtain the characteristic action clustering center library.
Optionally, the set of action tags includes a stationary tag, a left-right linear oscillating tag, a circular oscillating tag, and an up-down nodding tag.
Optionally, the constructing an objective function according to the clustering times and the initial membership comprises:
the objective function is constructed in the following way:
J = Σ_{j=1}^{m} Σ_t (u_{j,j+t})^k · ||s_j − s_{j+t}||^2

wherein J represents the objective function, k represents the clustering degree, s_j represents the jth transformed feature action sequence in the transformed feature action sequence set, s_{j+t} represents the (j+t)th transformed feature action sequence in the set, u_{j,j+t} represents the updated membership degree of s_j and s_{j+t} calculated on the basis of the initial membership degree, and m is the data quantity corresponding to the transformed feature action sequence set.
In order to solve the above problem, the present invention further provides an APP interface operating device, including:
the motion sequence acquisition module is used for receiving a head characteristic motion sequence of a user of the APP by utilizing the attitude angle sensor;
the characteristic expansion module is used for executing characteristic expansion on the head characteristic action sequence to obtain a transformation characteristic action sequence;
the distance value calculation module is used for calculating the distance value between the transformation characteristic action sequence and each characteristic action in a pre-constructed characteristic action clustering center library;
and the APP interface operation module is used for selecting the action tag corresponding to the characteristic action with the minimum distance value and executing corresponding operation on the APP according to the action tag.
In order to solve the above problem, the present invention also provides an electronic device, including:
a memory storing at least one instruction; and
a processor executing the instructions stored in the memory to implement the APP interface operation method described above.
In order to solve the above problem, the present invention further provides a computer-readable storage medium comprising a storage data area and a storage program area, the storage data area storing created data and the storage program area storing a computer program, wherein the computer program, when executed by a processor, implements the APP interface operation method described in any one of the above.
According to the method and device provided by the invention, a transformed feature action sequence is obtained through feature expansion, the distance value between that sequence and each feature action in a pre-constructed feature action cluster center library is calculated, and the corresponding operation is performed on the APP according to the action tag of the feature action with the minimum distance value. Compared with manually tapping the screen or operating pages at a fixed speed, analysing the head features of the APP user to obtain the corresponding action tag and operating the APP page according to that tag means the user does not need to tap the screen manually, so the applicable scenes are wider; moreover, since the page operation follows the head features, the page is never operated too fast or too slow as happens with fixed-speed operation. Therefore, the APP interface operation method, device and computer-readable storage medium provided by the invention can solve the problems of poor scene adaptation of APP interface operation and an operation speed that is too high or too low.
Drawings
Fig. 1 is a schematic flow chart of an operation method of an APP interface according to an embodiment of the present invention;
fig. 2 is a detailed flowchart of S2 in the operation method of the APP interface according to an embodiment of the present invention;
fig. 3 is a detailed flowchart of S3 in the operation method of the APP interface according to an embodiment of the present invention;
fig. 4 is a schematic block diagram of an operating device of an APP interface according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an internal structure of an electronic device implementing an operation method of an APP interface according to an embodiment of the present invention;
the implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The embodiment of the application provides an APP interface operation method. The execution subject of the APP interface operation method includes, but is not limited to, at least one of the electronic devices, such as a server or a terminal, that can be configured to execute the method provided by the embodiments of the present application. In other words, the APP interface operation method may be executed by software or hardware installed in a terminal device or a server device, and the software may be a blockchain platform. The server includes, but is not limited to, a single server, a server cluster, a cloud server, a cloud server cluster, and the like.
The invention provides an operation method of an APP interface. Fig. 1 is a schematic flow chart of an operation method of an APP interface according to an embodiment of the present invention. The method may be performed by an apparatus, which may be implemented by software and/or hardware.
In this embodiment, the operation method of the APP interface includes:
and S1, receiving the head characteristic action sequence of the user of the APP by using the posture angle sensor.
In a preferred embodiment of the present invention, the attitude angle sensor includes a triaxial accelerometer, and a three-dimensional motion attitude measurement system based on the MEMS technology is installed inside the triaxial accelerometer, so as to calculate a head characteristic motion sequence of the head of the APP user in three axes. Wherein the three axes include an x-axis, a y-axis, and a z-axis.
In detail, each head feature action in the head feature action sequence comprises acceleration values on the x-axis, y-axis and z-axis, denoted (a^x_n, a^y_n, a^z_n), wherein a^x_n represents the acceleration value on the x-axis of the nth head feature action in the head feature action sequence, a^y_n represents the acceleration value on the y-axis of the nth head feature action, and a^z_n represents the acceleration value on the z-axis of the nth head feature action.
Further, the acceleration values (a^x_n, a^y_n, a^z_n) are collectively referred to as a head feature action, and the head feature actions are gathered to obtain the head feature action sequence, denoted {A_1, A_2, …, A_t, …, A_n}, wherein A_t = (a^x_t, a^y_t, a^z_t).
For example, if the user is a performer and the APP is a music score APP that provides score reading, then when the performer plays, the performer opens the music score APP installed on a device such as a mobile phone or tablet, and the attitude angle sensor in the device acquires the performer's head feature action sequence {A_1, A_2, …, A_t, …, A_n} in real time.
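As a concrete illustration of this representation, the sketch below models each head feature action as an (ax, ay, az) acceleration triple; the helper name and the sample readings are illustrative assumptions, not taken from the patent.

```python
import math

# Each head feature action A_n is a triple of acceleration values sampled
# from the attitude angle sensor on the x-, y- and z-axes.
# Sample values below are illustrative, not from the patent.

def magnitude(action):
    """Euclidean norm of one head feature action (ax, ay, az)."""
    ax, ay, az = action
    return math.sqrt(ax * ax + ay * ay + az * az)

# A short head feature action sequence {A_1, A_2, A_3} as sampled in real time.
sequence = [(0.1, 0.0, 9.8), (0.3, -0.2, 9.7), (0.5, -0.4, 9.6)]
magnitudes = [magnitude(a) for a in sequence]
```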
And S2, performing feature expansion on the head feature action sequence to obtain a transformation feature action sequence.
Since the head feature action sequence data obtained in S1 is not sufficiently distinctive, feature expansion is required. In detail, referring to Fig. 2, S2 includes:
s21, calculating the angle value, Euclidean distance value and average value of each head characteristic motion in the head characteristic motion sequence;
s22, calculating the energy value of each head characteristic motion in the head characteristic motion sequence based on an energy formula;
and S23, combining each head characteristic action and the corresponding angle value, Euclidean distance value, average value and energy value into a transformation characteristic action, and summarizing each transformation characteristic action to obtain the transformation characteristic action sequence.
In detail, calculating the angle value of the head feature action sequence includes using the following calculation formula:
Angle_n = arccos( (A_n · A_{n+k}) / (|A_n| · |A_{n+k}|) ) × 180/π

wherein Angle_n represents the angle value of the nth head feature action in the head feature action sequence, A_n represents the nth head feature action in the sequence, A_{n+k} represents the (n+k)th head feature action, arccos represents the inverse cosine among the inverse trigonometric functions, and π represents the circumference ratio.
Further, the calculation method of the euclidean distance value of the head feature action sequence is as follows:
D_nk = sqrt( (a^x_n − a^x_{n+k})^2 + (a^y_n − a^y_{n+k})^2 + (a^z_n − a^z_{n+k})^2 )

wherein D_nk represents the Euclidean distance value between the nth and (n+k)th head feature actions in the head feature action sequence, a^x_{n+k} represents the acceleration value on the x-axis of the (n+k)th head feature action, a^y_{n+k} represents the acceleration value on the y-axis of the (n+k)th head feature action, and a^z_{n+k} represents the acceleration value on the z-axis of the (n+k)th head feature action.
The average value of the head feature action sequence is calculated as follows:

Mean_nk = (A_n + A_{n+k}) / 2

wherein Mean_nk represents the average value of the nth and (n+k)th head feature actions in the head feature action sequence. The energy formula for the energy value of the head feature action sequence is as follows:

Energy_n = (a^x_n)^2 + (a^y_n)^2 + (a^z_n)^2 + (a^x_{n+k})^2 + (a^y_{n+k})^2 + (a^z_{n+k})^2

wherein Energy_n represents the energy value of the nth and (n+k)th head feature actions in the head feature action sequence.
Each head feature action and its corresponding angle value, Euclidean distance value, average value and energy value are combined into a transformed feature action, and the transformed feature actions are gathered to obtain the transformed feature action sequence Y = (y_1, y_2, …, y_n, …, y_{n+k}), wherein y_n comprises the head feature action together with its angle value, Euclidean distance value, average value and energy value.
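The feature expansion of S2 can be sketched as follows. The pairing offset k and the exact definition of the average value are assumptions for illustration; the patent text does not fully specify them here.

```python
import math

def expand_features(seq, k=1):
    """Feature expansion sketch: pair each head feature action A_n with
    A_{n+k}, derive an angle value, a Euclidean distance value, an average
    value and an energy value, and append them to the raw acceleration
    triple. Offset k and the average definition are assumptions."""
    expanded = []
    for n in range(len(seq) - k):
        a, b = seq[n], seq[n + k]
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        # Angle between the two acceleration vectors, in degrees.
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))
        # Euclidean distance between A_n and A_{n+k}.
        dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        # Average of the six acceleration components (assumed definition).
        mean = (sum(a) + sum(b)) / 6.0
        # Energy: sum of squared acceleration values of both actions.
        energy = sum(x * x for x in a) + sum(y * y for y in b)
        expanded.append(a + (angle, dist, mean, energy))
    return expanded
```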
And S3, calculating the distance value between the transformed feature action sequence and each feature action in a pre-constructed feature action cluster center library.
In a preferred embodiment of the present invention, referring to fig. 3, the process of constructing the pre-constructed feature action cluster center library includes:
and S31, acquiring a head feature action sequence set and an action label set.
In a preferred embodiment of the present invention, the head feature action sequences of different people over different time periods may be acquired by using the attitude angle sensor and gathered to obtain the head feature action sequence set. As in S1, each head feature action sequence in the set is composed of acceleration values on the three axes, i.e., the x-axis, y-axis and z-axis.
The action tag set includes a stationary tag, a left-right linear head-shaking tag, a circular head-shaking tag, an up-down nodding tag, and the like.
Further, the head feature action sequence set and the action tag set are in a one-to-one correspondence relationship, for example, the action tag corresponding to the head feature action sequence a in the head feature action sequence set is static, the action tag corresponding to the head feature action sequence B is circular shaking, and the action tag corresponding to the head feature action sequence C is up-down nodding.
S32, executing the feature extension on the head feature action sequence set to obtain a transformation feature action sequence set.
In detail, the feature extension is the same as S2, and includes calculating an angle value, an euclidean distance value, an average value, and an energy value, and summarizing the angle value, the euclidean distance value, the average value, and the energy value to obtain the transformed feature motion sequence set.
S33, calculating the initial membership degree of the head feature action sequence set according to the action label set, receiving the clustering times input by a user, and constructing an objective function according to the clustering times and the initial membership degree.
In detail, the membership degree represents the degree of fuzziness with which each transformed feature action sequence in the set belongs to an action tag. For example, if the probability that transformed feature action sequence A is the circular head-shaking tag is 0.9 and the probability that it is the up-down nodding tag is 0.1, these values are transformed through a pre-constructed membership function to obtain the initial membership degree. The membership function may be an S-shaped membership function, a combined Gaussian membership function, a generalized bell-shaped membership function, or the like.
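A minimal sketch of one of the listed membership function families, the S-shaped (sigmoid) function, follows; the steepness and crossover parameters are illustrative assumptions, not values from the patent.

```python
import math

def sigmoid_membership(x, a=1.0, c=0.0):
    """S-shaped (sigmoid) membership function: a controls steepness,
    c the crossover point where membership equals 0.5."""
    return 1.0 / (1.0 + math.exp(-a * (x - c)))

# Illustrative usage: map tag probabilities such as 0.9 and 0.1 to
# initial membership degrees through the function.
initial_memberships = [sigmoid_membership(p, a=6.0, c=0.5) for p in (0.9, 0.1)]
```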
The objective function is:
J = Σ_{j=1}^{m} Σ_t (u_{j,j+t})^k · ||s_j − s_{j+t}||^2

wherein J represents the objective function, k represents the number of clustering times input by the user, s_j represents the jth transformed feature action sequence in the transformed feature action sequence set, s_{j+t} represents the (j+t)th transformed feature action sequence in the set, u_{j,j+t} represents the updated membership degree of s_j and s_{j+t} calculated on the basis of the initial membership degree, and m is the data quantity corresponding to the transformed feature action sequence set.
S34, solving the minimum value of the objective function according to the pre-constructed clustering function, and dividing the head characteristic action sequence set according to the minimum value to obtain the characteristic action clustering center library.
In a preferred embodiment of the invention, the clustering function can adopt the K-means algorithm. When clustering with K-means, the minimum value of the objective function is solved in place of the minimum distance of the original K-means algorithm, and the iterative updating is carried out continuously, thereby completing the division of the head feature action sequence set and obtaining the feature action cluster center library.
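As an illustration of how such a cluster centre library could be produced, the sketch below uses plain K-means on transformed feature vectors; note that the patent replaces the distance minimum with the minimum of its membership-weighted objective function, which is not reproduced here.

```python
import random

def build_cluster_centers(samples, n_clusters, iters=20, seed=0):
    """Minimal K-means sketch standing in for the patent's clustering step:
    iteratively assign samples to their nearest centre and move each centre
    to the mean of its assigned samples."""
    rng = random.Random(seed)
    centers = rng.sample(samples, n_clusters)
    for _ in range(iters):
        # Assign every transformed feature action to its nearest centre.
        buckets = [[] for _ in range(n_clusters)]
        for s in samples:
            i = min(range(n_clusters),
                    key=lambda c: sum((x - y) ** 2 for x, y in zip(s, centers[c])))
            buckets[i].append(s)
        # Move each centre to the mean of its bucket.
        for c, bucket in enumerate(buckets):
            if bucket:
                centers[c] = tuple(sum(col) / len(bucket) for col in zip(*bucket))
    return centers
```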
Once the feature action cluster center library has been constructed, in a preferred embodiment of the present invention, the distance value between the transformed feature action sequence and each feature action in the pre-constructed feature action cluster center library is calculated by the following distance calculation formula:
D_s = sqrt( Σ (Y − u_i)^2 )

wherein u_i represents the value of a feature action in the feature action cluster center library, and D_s represents the distance value between the transformed feature action sequence Y and that feature action.
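The minimum-distance tag selection of this step can be sketched as follows; `math.dist` computes the Euclidean distance, and the tag strings follow the patent's action tag set.

```python
import math

def nearest_action_label(transformed, centers, labels):
    """Select the action tag whose cluster centre has the minimum Euclidean
    distance to the transformed feature action vector."""
    best = min(range(len(centers)),
               key=lambda i: math.dist(transformed, centers[i]))
    return labels[best]

# Tag names follow the patent's action tag set.
labels = ["stationary", "left-right shake", "circular shake", "up-down nod"]
```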
In one embodiment of the present invention, the feature action cluster center library may be stored in one or more nodes of a blockchain.
S4, selecting an action label corresponding to the characteristic action with the minimum distance value, and executing corresponding operation on the APP according to the action label.
In the preferred embodiment of the present invention, the corresponding operation includes turning the page of the APP, zooming the APP interface in and out, and the like. As mentioned above, the performer sets left-right linear head shaking in the music score APP in advance to represent the page-turning operation; when the embodiment of the invention analyses the performer's head feature action sequence {A_1, A_2, …, A_t, …, A_n} and obtains the left-right linear head-shaking tag, the page-turning operation is performed on the music score APP.
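Dispatching the recognised tag to an interface operation can be sketched as below; apart from the page-turning binding given in the patent's example, the tag-to-operation bindings are illustrative assumptions.

```python
# Map recognised action tags to APP interface operations.
# Only the left-right-shake -> page-turn binding comes from the patent's
# example; the other bindings are illustrative.
OPERATIONS = {
    "stationary": "no-op",
    "left-right shake": "turn page",
    "circular shake": "zoom out",
    "up-down nod": "zoom in",
}

def dispatch(action_tag):
    """Return the APP operation bound to the recognised action tag."""
    return OPERATIONS.get(action_tag, "no-op")
```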
In summary, a transformed feature action sequence is obtained through feature expansion, the distance value between that sequence and each feature action in a pre-constructed feature action cluster center library is calculated, and the corresponding operation is performed on the APP according to the action tag of the feature action with the minimum distance value. Compared with manually tapping the screen or operating pages at a fixed speed, analysing the head features of the APP user to obtain the corresponding action tag and operating the APP page according to that tag means the user does not need to tap the screen manually, so the applicable scenes are wider; moreover, since the page operation follows the head features, the page is never operated too fast or too slow as happens with fixed-speed operation. Therefore, the APP interface operation method, device and computer-readable storage medium provided by the invention can solve the problems of poor scene adaptation of APP interface operation and an operation speed that is too high or too low.
Fig. 4 is a schematic block diagram of an operating device of an APP interface according to the present invention.
The operating device 100 of the APP interface of the present invention can be installed in an electronic device. According to the realized functions, the operating device of the APP interface may include an action sequence obtaining module 101, a feature extension module 102, a distance value calculating module 103, and an APP interface operating module 104. A module according to the present invention, which may also be referred to as a unit, refers to a series of computer program segments that can be executed by a processor of an electronic device and that can perform a fixed function, and that are stored in a memory of the electronic device.
In the present embodiment, the functions regarding the respective modules/units are as follows:
the motion sequence acquisition module 101 is configured to receive a head feature motion sequence of a user of the APP using the attitude angle sensor.
In a preferred embodiment of the present invention, the attitude angle sensor includes a triaxial accelerometer, and a three-dimensional motion attitude measurement system based on the MEMS technology is installed inside the triaxial accelerometer, so as to calculate a head characteristic motion sequence of the head of the APP user in three axes. Wherein the three axes include an x-axis, a y-axis, and a z-axis.
In detail, each head feature action in the head feature action sequence comprises acceleration values on the x-axis, y-axis and z-axis, denoted (a^x_n, a^y_n, a^z_n), wherein a^x_n represents the acceleration value on the x-axis of the nth head feature action in the head feature action sequence, a^y_n represents the acceleration value on the y-axis of the nth head feature action, and a^z_n represents the acceleration value on the z-axis of the nth head feature action.
Further, the acceleration values (a^x_n, a^y_n, a^z_n) are collectively referred to as a head feature action, and the head feature actions are gathered to obtain the head feature action sequence, denoted {A_1, A_2, …, A_t, …, A_n}, wherein A_t = (a^x_t, a^y_t, a^z_t).
For example, if the user is a performer and the APP is a music score APP that provides score reading, then when the performer plays, the performer opens the music score APP installed on a device such as a mobile phone or tablet, and the attitude angle sensor in the device acquires the performer's head feature action sequence {A_1, A_2, …, A_t, …, A_n} in real time.
The feature extension module 102 is configured to perform feature extension on the head feature action sequence to obtain a transformed feature action sequence.
The method for obtaining the transformation feature action sequence by performing feature extension on the head feature action sequence comprises the following steps: calculating the angle value, Euclidean distance value and average value of each head characteristic action in the head characteristic action sequence; calculating an energy value for each head feature action within the sequence of head feature actions based on an energy formula; and combining each head characteristic action and the corresponding angle value, Euclidean distance value, average value and energy value into a transformation characteristic action, and summarizing each transformation characteristic action to obtain the transformation characteristic action sequence.
In detail, calculating the angle value of the head feature action sequence includes using the following calculation formula:

Angle_n = arccos( (A_n · A_{n+k}) / (‖A_n‖ · ‖A_{n+k}‖) ) × 180/π

where Angle_n represents the angle value of the nth head feature action in the head feature action sequence, A_n represents the nth head feature action in the head feature action sequence, A_{n+k} represents the (n+k)th head feature action in the head feature action sequence, arccos represents the inverse cosine in the inverse trigonometric functions, and π represents the circumference ratio (pi).
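A minimal sketch of the angle calculation, assuming the formula is the standard arccos of the normalised dot product converted to degrees (the function name is illustrative, not from the patent):

```python
import math

def angle_value(a_n, a_nk):
    """Angle (in degrees) between head feature actions A_n and A_{n+k}."""
    dot = sum(p * q for p, q in zip(a_n, a_nk))
    norm_n = math.sqrt(sum(p * p for p in a_n))
    norm_nk = math.sqrt(sum(q * q for q in a_nk))
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_theta = max(-1.0, min(1.0, dot / (norm_n * norm_nk)))
    return math.degrees(math.acos(cos_theta))
```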
Further, the Euclidean distance value of the head feature action sequence is calculated as follows:

D_nk = √( (a_x^n − a_x^{n+k})² + (a_y^n − a_y^{n+k})² + (a_z^n − a_z^{n+k})² )

where D_nk represents the Euclidean distance value between the nth head feature action and the (n+k)th head feature action in the head feature action sequence, a_x^{n+k} represents the acceleration value on the x-axis of the (n+k)th head feature action, a_y^{n+k} represents its acceleration value on the y-axis, and a_z^{n+k} represents its acceleration value on the z-axis.
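The Euclidean distance step can be sketched directly from the formula above (the function name is illustrative):

```python
import math

def euclidean_distance(a_n, a_nk):
    """D_nk: Euclidean distance between the nth and (n+k)th
    head feature actions, taken component-wise over the three axes."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a_n, a_nk)))
```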
The average value of the head feature action sequence is calculated as follows:

Mean_nk = (A_n + A_{n+k}) / 2

where Mean_nk represents the component-wise average of the nth and the (n+k)th head feature actions. The energy value of the head feature action sequence is given by the following energy formula:

Energy_n = (a_x^n)² + (a_y^n)² + (a_z^n)² + (a_x^{n+k})² + (a_y^{n+k})² + (a_z^{n+k})²

where Energy_n represents the energy value of the nth head feature action and the (n+k)th head feature action in the head feature action sequence.
Each head feature action and its corresponding angle value, Euclidean distance value, average value and energy value are combined into a transformation feature action, and the transformation feature actions are summarized to obtain the transformation feature action sequence Y = (y_1, y_2, …, y_n, …, y_{n+k}), where y_n comprises the head feature action together with its angle value, Euclidean distance value, average value and energy value.
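Putting the feature-extension steps together, a sketch might look like the following. The mean and energy formulas here are reconstructions of equations that survive only as image placeholders in the source, and all names are illustrative:

```python
import math

def transform_sequence(actions, k=1):
    """Extend each head feature action A_n with an angle, Euclidean
    distance, mean and energy computed against A_{n+k}, yielding the
    transformation feature action sequence Y."""
    transformed = []
    for n in range(len(actions) - k):
        a, b = actions[n], actions[n + k]
        dot = sum(p * q for p, q in zip(a, b))
        na = math.sqrt(sum(p * p for p in a))
        nb = math.sqrt(sum(q * q for q in b))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))
        dist = math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
        mean = tuple((p + q) / 2.0 for p, q in zip(a, b))
        energy = sum(p * p for p in a) + sum(q * q for q in b)
        transformed.append({"action": a, "angle": angle,
                            "distance": dist, "mean": mean, "energy": energy})
    return transformed
```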
The distance value calculating module 103 is configured to calculate a distance value between the transformed feature action sequence and each feature action in a pre-constructed feature action clustering center library.
In a preferred embodiment of the present invention, the process of constructing the pre-constructed feature action clustering center library includes: acquiring a head characteristic action sequence set and an action tag set; executing the feature extension on the head feature action sequence set to obtain a transformation feature action sequence set; calculating the initial membership degree of the head feature action sequence set according to the action tag set, receiving the clustering times input by a user, and constructing a target function according to the clustering times and the initial membership degree; and solving the minimum value of the target function according to a pre-constructed clustering function, and dividing the head characteristic action sequence set according to the minimum value to obtain the characteristic action clustering center library.
In a preferred embodiment of the present invention, the head feature action sequences of different people over different time periods may be acquired by using the attitude angle sensor and summarized to obtain the head feature action sequence set. As in S1, each head feature action sequence in the set is composed of acceleration values on three axes, i.e., the x-axis, y-axis and z-axis.
The action tag set includes a static tag, a left-right linear head-shaking tag, a circular head-shaking tag, an up-down nodding tag, and the like.
Further, the head feature action sequence set and the action tag set are in one-to-one correspondence; for example, the action tag corresponding to head feature action sequence A in the set is static, the action tag corresponding to head feature action sequence B is circular head-shaking, and the action tag corresponding to head feature action sequence C is up-down nodding.
In detail, the feature extension similarly includes calculating an angle value, an euclidean distance value, an average value, and an energy value, and summarizing the angle value, the euclidean distance value, the average value, and the energy value to obtain the transformed feature motion sequence set.
In detail, the membership degree represents the degree of fuzziness of the action tag corresponding to each transformation feature action sequence in the transformation feature action sequence set. For example, if the probability that transformation feature action sequence A is a circular head-shaking tag is 0.9 and the probability that it is an up-down nodding tag is 0.1, these probabilities are transformed according to a pre-constructed membership function into the degrees to which sequence A belongs to the circular head-shaking tag and the up-down nodding tag, thereby obtaining the initial membership. The membership function includes an S-shaped (sigmoid) membership function, a combined Gaussian membership function, a generalized bell-shaped membership function, and the like.
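For illustration, one of the listed options, an S-shaped (sigmoid) membership function, might look like the following; the steepness `a` and midpoint `c` parameters are hypothetical choices, not values from the patent:

```python
import math

def sigmoid_membership(x, a=10.0, c=0.5):
    """S-shaped membership function: maps a tag probability x in [0, 1]
    to a membership degree in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a * (x - c)))

# A 0.9 tag probability yields a high membership, a 0.1 probability a low one.
high = sigmoid_membership(0.9)
low = sigmoid_membership(0.1)
```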
The objective function is:

J = Σ_{j=1}^{m} Σ_t (u_{j,j+t})^k · ‖s_j − s_{j+t}‖²

where J represents the objective function, k represents the number of clustering iterations input by the user, s_j represents the jth transformation feature action sequence in the transformation feature action sequence set, s_{j+t} represents the (j+t)th transformation feature action sequence in the set, u_{j,j+t} represents the updated membership of s_j and s_{j+t} calculated on the basis of the initial membership, and m is the number of data items corresponding to the transformation feature action sequence set.
In a preferred embodiment of the invention, the clustering function may adopt the K-means algorithm. When clustering with K-means, the minimum value of the objective function is solved in place of the distance minimum of the original K-means algorithm, and iterative updating is carried out continuously until the division of the head feature action sequence set is completed, thereby obtaining the feature action clustering center library.
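As a rough sketch of the clustering step, plain K-means is shown below; the patent's variant substitutes the minimum of the objective J for the distance minimum, which this standard version does not implement:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Standard K-means: assign each point to the nearest center, then
    recompute each center as the mean of its cluster, for a fixed
    number of iterations."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])),
            )
            clusters[nearest].append(p)
        centers = [
            tuple(sum(vs) / len(vs) for vs in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers
```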
Once the feature action clustering center library has been constructed, in a preferred embodiment of the present invention the distance value between the transformation feature action sequence and each feature action in the pre-constructed feature action clustering center library is calculated using the following distance calculation formula:

D_s = ‖Y − u_i‖ = √( Σ_r (Y_r − u_{i,r})² )

where u_i represents a feature action in the feature action clustering center library, Y_r and u_{i,r} denote the rth components of Y and u_i, and D_s represents the distance value between the transformation feature action sequence and that feature action.
In one embodiment of the present invention, the feature action clustering center library may be stored in one or more nodes of a blockchain.
The APP interface operation module 104 is configured to select an action tag corresponding to the feature action with the smallest distance value, and perform corresponding operation on the APP according to the action tag.
In the preferred embodiment of the present invention, the corresponding operation includes turning pages of the APP, zooming the APP interface in and out, and the like. As mentioned above, the musician sets the left-right linear head shake in the sheet-music APP in advance to represent the page-turning operation; when the embodiment of the invention analyzes the obtained head feature action sequence {A_1, A_2, …, A_t, …, A_n} and identifies a left-right linear head shake, the page-turning operation is performed on the sheet-music APP.
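The selection step reduces to a nearest-center lookup followed by a tag-to-operation mapping. The cluster centers, tags and operations below are entirely hypothetical placeholders, not values from the patent:

```python
import math

# Hypothetical feature-action cluster centers, keyed by action tag.
CENTER_LIBRARY = {
    "static": (0.0, 0.0, 0.0),
    "left_right_shake": (1.2, 0.1, 0.0),
    "up_down_nod": (0.1, 1.3, 0.2),
}

# Hypothetical mapping from action tag to APP operation.
OPERATIONS = {
    "static": "none",
    "left_right_shake": "turn_page",
    "up_down_nod": "zoom",
}

def select_operation(feature):
    """Pick the tag whose cluster center is nearest to the transformed
    feature action, then look up the APP operation for that tag."""
    label = min(CENTER_LIBRARY,
                key=lambda tag: math.dist(feature, CENTER_LIBRARY[tag]))
    return label, OPERATIONS[label]
```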
Fig. 5 is a schematic structural diagram of an electronic device implementing an APP interface operation method according to the present invention.
The electronic device 1 may comprise a processor 10, a memory 11 and a bus, and may further comprise a computer program, such as an operating program 12 of an APP interface, stored in the memory 11 and executable on the processor 10.
The memory 11 includes at least one type of readable storage medium, which includes flash memory, removable hard disk, multimedia card, card-type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disk, optical disk, etc. The memory 11 may in some embodiments be an internal storage unit of the electronic device 1, such as a removable hard disk of the electronic device 1. The memory 11 may also be an external storage device of the electronic device 1 in other embodiments, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 1. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device 1. The memory 11 may be used not only to store application software installed in the electronic device 1 and various types of data, such as codes of the operating program 12 of the APP interface, but also to temporarily store data that has been output or is to be output.
The processor 10 may be composed of an integrated circuit in some embodiments, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital Processing chips, graphics processors, and combinations of various control chips. The processor 10 is a Control Unit (Control Unit) of the electronic device, connects various components of the whole electronic device by using various interfaces and lines, and executes various functions and processes data of the electronic device 1 by running or executing programs or modules (for example, operating programs for executing an APP interface, etc.) stored in the memory 11 and calling data stored in the memory 11.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The bus is arranged to enable connection communication between the memory 11 and at least one processor 10 or the like.
Fig. 5 shows only an electronic device with some of its components; it will be understood by those skilled in the art that the structure shown in Fig. 5 does not limit the electronic device 1, which may include fewer or more components than shown, a combination of certain components, or a different arrangement of components.
For example, although not shown, the electronic device 1 may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 10 through a power management device, so as to implement functions of charge management, discharge management, power consumption management, and the like through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device 1 may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Further, the electronic device 1 may further include a network interface, and optionally, the network interface may include a wired interface and/or a wireless interface (such as a WI-FI interface, a bluetooth interface, etc.), which are generally used for establishing a communication connection between the electronic device 1 and other electronic devices.
Optionally, the electronic device 1 may further comprise a user interface, which may be a Display (Display), an input unit (such as a Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable for displaying information processed in the electronic device 1 and for displaying a visualized user interface, among other things.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
The operating program 12 of the APP interface stored in the memory 11 of the electronic device 1 is a combination of instructions that, when executed in the processor 10, can implement:
receiving a head characteristic action sequence of a user of the APP by using the attitude angle sensor;
performing feature expansion on the head feature action sequence to obtain a transformation feature action sequence;
calculating the distance value between the transformation characteristic action sequence and each characteristic action in a pre-constructed characteristic action clustering center library;
and selecting an action tag corresponding to the characteristic action with the minimum distance value, and executing corresponding operation on the APP according to the action tag.
Further, the integrated modules/units of the electronic device 1, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. The computer-readable medium may include: any entity or device capable of carrying said computer program code, recording medium, U-disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM).
Further, the computer usable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus, device and method can be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means through software or hardware. Terms such as first and second are used to denote names and do not denote any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. An operation method of an APP interface is characterized by comprising the following steps:
receiving a head characteristic action sequence of a user of the APP by using the attitude angle sensor;
performing feature expansion on the head feature action sequence to obtain a transformation feature action sequence;
calculating the distance value between the transformation characteristic action sequence and each characteristic action in a pre-constructed characteristic action clustering center library;
and selecting an action tag corresponding to the characteristic action with the minimum distance value, and executing corresponding operation on the APP according to the action tag.
2. The method for operating an APP interface of claim 1, wherein said performing feature expansion on said head feature action sequence to obtain a transformed feature action sequence comprises:
calculating the angle value, Euclidean distance value and average value of each head characteristic action in the head characteristic action sequence;
calculating an energy value for each head feature action within the sequence of head feature actions based on an energy formula;
and combining each head characteristic action and the corresponding angle value, Euclidean distance value, average value and energy value into a transformation characteristic action, and summarizing each transformation characteristic action to obtain the transformation characteristic action sequence.
3. The method of operation of the APP interface of claim 2, wherein the calculating the angular value of the sequence of head feature actions comprises:
calculating the angle value of the head characteristic action sequence by adopting the following calculation formula:
Angle_n = arccos( (A_n · A_{n+k}) / (‖A_n‖ · ‖A_{n+k}‖) ) × 180/π

wherein Angle_n represents the angle value of the nth head feature action in the head feature action sequence, A_n represents the nth head feature action in the head feature action sequence, A_{n+k} represents the (n+k)th head feature action in the head feature action sequence, arccos represents the inverse cosine in the inverse trigonometric functions, and π represents the circumference ratio.
4. The method of operating an APP interface of claim 2, wherein the calculating an energy value for each head feature action within the sequence of head feature actions based on an energy formula comprises:
calculating the energy value of the head characteristic action sequence by adopting the following calculation formula:
Energy_n = (a_x^n)² + (a_y^n)² + (a_z^n)² + (a_x^{n+k})² + (a_y^{n+k})² + (a_z^{n+k})²

wherein Energy_n represents the energy value of the nth head feature action and the (n+k)th head feature action in the head feature action sequence, a_x^n, a_y^n and a_z^n respectively represent the acceleration values of the nth head feature action on the x-axis, y-axis and z-axis, and a_x^{n+k}, a_y^{n+k} and a_z^{n+k} respectively represent the acceleration values of the (n+k)th head feature action on the x-axis, y-axis and z-axis.
5. The method of claim 1, wherein before calculating the distance value between the transformed feature action sequence and each feature action in the pre-constructed feature action cluster center library, the method further comprises:
acquiring a head characteristic action sequence set and an action tag set;
performing feature expansion on the head feature action sequence set to obtain a transformation feature action sequence set;
calculating the initial membership degree of the head feature action sequence set according to the action tag set, receiving the clustering times input by a user, and constructing a target function according to the clustering times and the initial membership degree;
and solving the minimum value of the target function according to a pre-constructed clustering function, and dividing the head characteristic action sequence set according to the minimum value to obtain the characteristic action clustering center library.
6. The method of claim 5, wherein the set of action tags includes a static tag, a left-right linear oscillating tag, a circular oscillating tag, and an up-down nodding tag.
7. The method of operating an APP interface of claim 5, wherein the constructing an objective function according to the cluster times and the initial membership comprises:
the objective function is constructed in the following way:
J = Σ_{j=1}^{m} Σ_t (u_{j,j+t})^k · ‖s_j − s_{j+t}‖²

wherein J represents the objective function, k represents the number of clustering iterations, s_j represents the jth transformation feature action sequence in the transformation feature action sequence set, s_{j+t} represents the (j+t)th transformation feature action sequence in the set, u_{j,j+t} represents the updated membership of s_j and s_{j+t} calculated on the basis of the initial membership, and m is the number of data items corresponding to the transformation feature action sequence set.
8. An operating device for an APP interface, the operating device comprising:
the motion sequence acquisition module is used for receiving a head characteristic motion sequence of a user of the APP by utilizing the attitude angle sensor;
the characteristic expansion module is used for executing characteristic expansion on the head characteristic action sequence to obtain a transformation characteristic action sequence;
the distance value calculation module is used for calculating the distance value between the transformation characteristic action sequence and each characteristic action in a pre-constructed characteristic action clustering center library;
and the APP interface operation module is used for selecting the action tag corresponding to the characteristic action with the minimum distance value and executing corresponding operation on the APP according to the action tag.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of operation of the APP interface as claimed in any one of claims 1 to 7.
10. A computer-readable storage medium comprising a storage data area and a storage program area, wherein the storage data area stores created data, and the storage program area stores a computer program; wherein the computer program, when executed by a processor, implements a method of operation of the APP interface of any of claims 1 to 7.
CN202011240993.1A 2020-11-09 2020-11-09 APP interface operation method and device, electronic equipment and storage medium Pending CN112363659A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011240993.1A CN112363659A (en) 2020-11-09 2020-11-09 APP interface operation method and device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN112363659A true CN112363659A (en) 2021-02-12

Family

ID=74509020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011240993.1A Pending CN112363659A (en) 2020-11-09 2020-11-09 APP interface operation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112363659A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103365417A (en) * 2013-06-20 2013-10-23 天津市莱科信息技术有限公司 Mobile terminal and electronic book page turning method based on same
CN107515670A (en) * 2017-01-13 2017-12-26 维沃移动通信有限公司 A kind of method and mobile terminal for realizing automatic page turning
CN107544673A (en) * 2017-08-25 2018-01-05 上海视智电子科技有限公司 Body feeling interaction method and body feeling interaction system based on depth map information
CN108089891A (en) * 2017-11-30 2018-05-29 维沃移动通信有限公司 A kind of application program launching method, mobile terminal
WO2018141409A1 (en) * 2017-02-06 2018-08-09 Telefonaktiebolaget Lm Ericsson (Publ) Initiating a control operation in response to a head gesture
CN110110616A (en) * 2019-04-19 2019-08-09 出门问问信息科技有限公司 A kind of electronic equipment and control method
CN110348275A (en) * 2018-04-08 2019-10-18 中兴通讯股份有限公司 Gesture identification method, device, smart machine and computer readable storage medium
CN111291655A (en) * 2020-01-21 2020-06-16 杭州微洱网络科技有限公司 Head pose matching method for 2d image measured in E-commerce image
CN111814556A (en) * 2020-06-09 2020-10-23 厦门大学 Teaching assistance method and system based on computer vision
CN111885265A (en) * 2020-07-31 2020-11-03 Oppo广东移动通信有限公司 Screen interface adjusting method and related device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination