CN112837083A - User behavior data processing method and device, computer equipment and storage medium - Google Patents

User behavior data processing method and device, computer equipment and storage medium

Info

Publication number
CN112837083A
CN112837083A
Authority
CN
China
Prior art keywords
user
vehicle
information
browsed
face image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911169205.1A
Other languages
Chinese (zh)
Inventor
龙军成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dasou Vehicle Software Technology Co Ltd
Original Assignee
Zhejiang Dasou Vehicle Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dasou Vehicle Software Technology Co Ltd
Priority to CN201911169205.1A
Publication of CN112837083A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/02 - Marketing; Price estimation or determination; Fundraising
    • G06Q30/0281 - Customer communication at a business location, e.g. providing product or service information, consulting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/02 - Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201 - Market modelling; Market analysis; Collecting market data

Landscapes

  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application relates to a user behavior data processing method and apparatus, a computer device, and a storage medium. The method comprises: obtaining a user's operation information on a display system, the operation information including the vehicles browsed by the user and the length of time each vehicle was browsed; assigning a corresponding numerical value to each vehicle browsed by the user; weighting the value of each vehicle according to how long it was browsed and summing the weighted values to obtain a purchase-intention evaluation value; and predicting the user's vehicle-purchase intention from the evaluation value. Because the evaluation is computed from behavior data rather than from a salesperson's judgment, the method avoids the subjective factors introduced by business personnel and can predict the user's vehicle-purchase intention objectively and accurately.

Description

User behavior data processing method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for processing user behavior data, a computer device, and a storage medium.
Background
With the rapid development of the economy, living standards have risen quickly and demand for purchasing vehicles is strong. Business personnel in automobile-related industries need to understand a user's vehicle-purchase intention in order to carry out corresponding business promotion for that user.
In the conventional scheme, business personnel can only determine a user's vehicle-purchase intention through communication with the user. However, because business personnel have limited knowledge of vehicles and their own subjective biases, the information provided by the user cannot be accurately analyzed during communication, and the user's vehicle-purchase intention cannot be accurately predicted.
Disclosure of Invention
In view of the above, it is necessary to provide a user behavior data processing method, apparatus, computer device, and storage medium capable of improving the accuracy of predicting a user's vehicle-purchase intention.
A user behavior data processing method is applied to a face recognition system, and comprises the following steps:
acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in a display system and the time length for which each vehicle was browsed; the browsed vehicle information comprises each vehicle browsed by the user;
according to the browsed vehicle information, giving a corresponding numerical value to each vehicle browsed by the user;
according to the time length for browsing each vehicle, giving different weights to the numerical value corresponding to each vehicle, and adding the numerical values to obtain operation reference information corresponding to each vehicle;
and calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
In one embodiment, before the acquiring the behavior data information of the user, the method further includes:
when a user accesses a display system, acquiring a video stream;
when the pixel of a face image in a video stream is larger than a preset pixel threshold value, intercepting the face image in the video stream, and identifying the face image to obtain user information corresponding to the face image;
and logging in a user account corresponding to the user information into a display system according to the user information.
In one embodiment, before the user accesses the display system, the method further comprises:
acquiring user information and a face image corresponding to the user;
establishing an incidence relation between the user information and the face image;
and generating a corresponding user account according to the incidence relation.
In one embodiment, the intercepting a face image in the video stream, and identifying the face image to obtain user information corresponding to the face image includes:
and when the face image is identified to have no corresponding user information, generating a prompt message of a registered account, and displaying the prompt message in a display system.
In one embodiment, the intercepting the face image in the video stream, and identifying the face image to obtain the user information corresponding to the face image further includes:
when more than one face image is recognized, generating a user selection window, and displaying the user selection window in a display system;
acquiring the selection operation of a user on the user selection window, and determining the user information to be acquired according to the selection operation;
and logging in a user account corresponding to the user information into a display system according to the user information.
In one embodiment, the method further comprises:
acquiring target vehicle information; the target vehicle information comprises each physical vehicle whose distance from the user is smaller than a distance threshold;
according to the target vehicle information, giving a corresponding numerical value to each target vehicle;
acquiring the stay time of a user in the distance threshold of each target vehicle;
according to the time length of the user staying in the distance threshold of each target vehicle, giving different weights to the numerical value corresponding to each target vehicle, and then adding the numerical values to obtain reference information corresponding to each target vehicle;
and calculating the association information of each target vehicle and the user according to the reference information, and associating and storing the relationship between the association reference information and the corresponding user.
In one embodiment, the method further comprises:
acquiring a time point when a user enters an exhibition hall and a time point when the user leaves the exhibition hall;
determining the time length of the user accessing the exhibition hall according to the time point of the user entering the exhibition hall and the time point of the user leaving the exhibition hall;
and associating and storing the relationship between the time length of the user accessing the exhibition hall and the corresponding user.
A user behavior data processing device, applied to a face recognition system, the device comprising:
the operation information acquisition module is used for acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in a display system and the time length of each vehicle browsed by the user; the browsed vehicle information comprises each vehicle browsed by the user;
the numerical value giving module is used for giving a corresponding numerical value to each vehicle browsed by the user according to the browsed vehicle information;
the operation reference information calculation module is used for giving different weights to the numerical values corresponding to each vehicle according to the time length for browsing each vehicle and then adding the numerical values to obtain operation reference information corresponding to each vehicle;
and the associated reference information calculation module is used for calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in a display system and the time length of each vehicle browsed by the user; the browsed vehicle information comprises each vehicle browsed by the user;
according to the browsed vehicle information, giving a corresponding numerical value to each vehicle browsed by the user;
according to the time length for browsing each vehicle, giving different weights to the numerical value corresponding to each vehicle, and adding the numerical values to obtain operation reference information corresponding to each vehicle;
and calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in a display system and the time length of each vehicle browsed by the user; the browsed vehicle information comprises each vehicle browsed by the user;
according to the browsed vehicle information, giving a corresponding numerical value to each vehicle browsed by the user;
according to the time length for browsing each vehicle, giving different weights to the numerical value corresponding to each vehicle, and adding the numerical values to obtain operation reference information corresponding to each vehicle;
and calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
According to the user behavior data processing method and apparatus, computer device, and storage medium described above, the user's operation information on the display system is obtained; a corresponding numerical value is assigned to each vehicle the user browses in the display system; each vehicle's value is weighted according to the length of time the user browsed it, and the weighted values are summed to obtain the operation reference information corresponding to each vehicle; the associated reference information of each vehicle and the user is calculated from the operation reference information; the relationship between the associated reference information and the corresponding user is stored in an associated manner; and the user's vehicle-purchase intention is predicted. Because the processing is carried out by computer equipment, subjective factors introduced by business personnel are avoided, and the user's vehicle-purchase intention can be predicted more objectively and accurately.
Drawings
FIG. 1 is a diagram illustrating an exemplary implementation of a method for processing user behavior data;
FIG. 2 is a flowchart illustrating a method for processing user behavior data according to an embodiment;
FIG. 3 is a flowchart illustrating a method for processing user behavior data according to another embodiment;
FIG. 4 is a block diagram showing the configuration of a user behavior data processing device according to an embodiment;
FIG. 5 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The user behavior data processing method provided by the application can be applied to the application environment shown in fig. 1. Wherein the display screen 102 communicates with the server 104 via a network. The server 104 obtains behavior data information of the user on the display system 102, where the behavior data information includes vehicle information browsed by the user in the display system 102 and a time duration for browsing each vehicle, and the browsed vehicle information includes each vehicle browsed by the user. According to the behavior data information, the server 104 assigns corresponding numerical values to each vehicle browsed by the user, then assigns different weights to the numerical values corresponding to each vehicle according to the time length of each vehicle browsed by the user, adds the numerical values to obtain operation reference information corresponding to each vehicle, calculates the associated reference information of each vehicle and the user according to the operation reference information, and associates and stores the relationship between the associated reference information and the corresponding user, so that the vehicle purchasing intention of the user can be predicted. The display system 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, and other intelligent devices, and the server 104 may be implemented by an independent server or a server cluster formed by a plurality of servers.
In an embodiment, as shown in fig. 2, a user behavior data processing method is provided, which is applied to a face recognition system, and is described by taking the application of the method to the server in fig. 1 as an example, and includes the following steps:
step 202, acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by a user in the display system and the time length of each vehicle browsed; the browsed vehicle information includes each vehicle browsed by the user.
The display system is an intelligent device placed in a vehicle exhibition hall and may be, for example, a display screen. When visiting the vehicle exhibition, the user can browse the vehicles he or she is interested in by accessing and operating the display screen. The user's operation information on the display screen comprises the information of the vehicles browsed by the user on the display screen and the length of time each vehicle was browsed. The browsed vehicle information includes the number of vehicles browsed by the user and information such as the brand and model of each browsed vehicle.
Specifically, in one embodiment, a user may click on the display screen to perform an operation on the display screen. And after the display screen receives the clicking operation of the user, displaying the related page. The server acquires page information displayed by the display screen and the stay time of the page, wherein the page information comprises corresponding vehicle information. By analyzing the page information displayed on the display screen and the stay time of the page, the vehicle information browsed by the user, the time for browsing each vehicle and the like can be obtained.
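For illustration only (not part of the original disclosure), the behavior data the server assembles from the display screen could be represented as simple browse-event records such as the Python sketch below; the class name, field names, and sample values are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class BrowseEvent:
        """One page view on the display screen: which vehicle was shown and for how long."""
        vehicle_id: str       # hypothetical identifier for the browsed vehicle
        brand: str            # brand shown on the page
        model: str            # model shown on the page
        dwell_seconds: float  # how long the page stayed on screen

    # Example of the session data the server might assemble for one logged-in user.
    session_events = [
        BrowseEvent("v001", "BrandA", "ModelX", 12.0),
        BrowseEvent("v002", "BrandB", "ModelY", 45.0),
        BrowseEvent("v003", "BrandC", "ModelZ", 75.0),
    ]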
And step 204, according to the browsed vehicle information, assigning a corresponding numerical value to each vehicle browsed by the user.
After the user's operation information on the display screen is obtained, each vehicle browsed by the user is known. A corresponding numerical value is assigned to each vehicle browsed by the user, and adding the values of all browsed vehicles yields an initial evaluation value. The associated reference information includes an evaluation value of the user's intention to purchase a vehicle.
Specifically, assume that the user browses 50 vehicles in the display screen. In one embodiment, the value corresponding to each vehicle browsed by the user may be set to 1, and then the initial vehicle-purchasing intention evaluation value of the user is 50. In other embodiments, the value may be other values, such as 2, 3, etc.
And step 206, according to the time length of browsing each vehicle, giving different weights to the numerical values corresponding to each vehicle, and then adding the numerical values to obtain the operation reference information corresponding to each vehicle.
Because the user's degree of preference differs from vehicle to vehicle, the length of time the user stays on each vehicle's page, that is, the browsing duration of each vehicle, also differs. The operation reference information includes an evaluation value of the user's intention to purchase a vehicle.
Specifically, in one embodiment, when the user browses a vehicle for no more than 30 seconds, the user is considered uninterested in that vehicle, and its value may be given a weight of 0. When the browsing time exceeds 30 seconds, the user may be considered interested in the vehicle: a browsing time between 30 and 60 seconds gives the vehicle's value a weight of 1, a browsing time between 60 and 90 seconds gives it a weight of 2, and so on. The weighted values of all browsed vehicles are then added to obtain the user's vehicle-purchase intention evaluation value, that is, the operation reference information corresponding to each vehicle.
For example, assume that the user browses 50 vehicles on the display screen and the value corresponding to each vehicle is set to 1, where 10 vehicles are browsed for less than 30 seconds, 25 vehicles for between 30 and 60 seconds, and 15 vehicles for between 60 and 90 seconds. Then 10 × 1 × 0 + 25 × 1 × 1 + 15 × 1 × 2 = 55, and the user's vehicle-purchase intention evaluation value is 55.
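As a minimal sketch (assuming a base value of 1 per vehicle and the 30-second weight bands from the example above), the weighted summation could be written as follows in Python; the function names are illustrative and not part of the disclosure.

    def browse_weight(dwell_seconds: float) -> int:
        """Weight bands from the example: up to 30 s -> 0, 30-60 s -> 1, 60-90 s -> 2, and so on."""
        if dwell_seconds <= 30:
            return 0
        return int((dwell_seconds - 30) // 30) + 1  # each further 30-second band adds 1 to the weight

    def purchase_intention_score(dwell_times, base_value=1):
        """Weighted sum over all browsed vehicles (the operation reference information)."""
        return sum(base_value * browse_weight(t) for t in dwell_times)

    # Reproduces the worked example: 10 vehicles under 30 s, 25 between 30-60 s, 15 between 60-90 s.
    dwell_times = [20] * 10 + [45] * 25 + [75] * 15
    assert purchase_intention_score(dwell_times) == 55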
And step 208, calculating the associated reference information of each vehicle and each user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
The evaluation criteria of the purchase intention evaluation value are different depending on the value corresponding to each vehicle and the weight given to the value.
Specifically, in one embodiment, when the value corresponding to each vehicle is set to 1 and a weight of 0 is given when a vehicle is browsed for no more than 30 seconds, the evaluation criterion may be set as follows: an evaluation value no greater than 10 predicts a first-level vehicle-purchase intention, a value between 10 and 20 predicts a second-level intention, and so on, where a higher level represents a stronger intention to purchase a vehicle. In addition, the vehicle the user is most interested in can be predicted from the browsing durations: the vehicle browsed for the longest time is the one the user is most interested in.
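Under the assumed reading that the evaluation value is divided into equal bands and a higher level means a stronger purchase intention, the grading could be sketched as follows; the band width of 10 mirrors the example above and is purely illustrative.

    import math

    def intention_level(score: float, band_width: float = 10.0) -> int:
        """Map a purchase-intention evaluation value to a level; a higher level means a stronger intention.

        Band 1 covers values up to 10, band 2 covers 10 to 20, and so on.
        """
        if score <= 0:
            return 0
        return max(1, math.ceil(score / band_width))

    assert intention_level(55) == 6  # the evaluation value of 55 from the example falls in the sixth band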
In the above user behavior data processing method, the server acquires the user's operation information on the display screen, including the vehicles browsed by the user and the duration for which each vehicle was browsed; assigns a corresponding numerical value to each vehicle browsed by the user; weights each vehicle's value according to its browsing duration and sums the weighted values to obtain the operation reference information corresponding to each vehicle; calculates the associated reference information of each vehicle and the user from the operation reference information; stores the relationship between the associated reference information and the corresponding user in an associated manner; and predicts the user's vehicle-purchase intention from the associated reference information. Subjective factors introduced by business personnel are thus avoided, and the user's vehicle-purchase intention can be predicted objectively and accurately.
In one embodiment, prior to step 202, the method further comprises: when a user accesses a display system, acquiring a video stream; when the pixels of the face image in the video stream are larger than a preset pixel threshold value, intercepting the face image in the video stream, and identifying the face image to obtain user information corresponding to the face image; and logging in a user account corresponding to the user information into the display system according to the user information.
The display system is provided with a camera, and the camera can acquire a video stream. Because foot traffic in the exhibition hall is heavy, the video stream may contain multiple face images, and some visitors merely pass by without using the display screen. Therefore, when a face image is detected in the video stream, the distance between the corresponding user and the display screen is judged. When the distance is smaller than a distance threshold, the face image is captured from the video stream and recognized.
Specifically, when a face image is detected in the video stream, the video stream is subjected to image processing and analysis. The pixel size of the face image in the video stream is obtained and compared with a preset pixel threshold. When the pixel size of the face image is larger than the preset pixel threshold, the user corresponding to the face image is considered close enough to the display screen; the face image is then captured from the video stream and recognized with a face recognition algorithm to obtain the user information corresponding to the current face image.
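The pixel-threshold check alone could look like the sketch below; the face detection itself would be performed by whatever detector the display system uses, so the bounding-box format and threshold value are assumptions.

    PIXEL_THRESHOLD = 120 * 120  # hypothetical preset pixel threshold (face area in pixels)

    def faces_close_enough(face_boxes, pixel_threshold=PIXEL_THRESHOLD):
        """Keep only detected faces whose image area exceeds the preset pixel threshold,
        i.e. visitors assumed to be standing close enough to the display screen.
        `face_boxes` is assumed to be a list of (x, y, width, height) tuples in pixels."""
        return [box for box in face_boxes if box[2] * box[3] > pixel_threshold]

    # A distant passer-by (60 x 60 px) is ignored; a nearby user (200 x 220 px) is kept.
    assert faces_close_enough([(10, 10, 60, 60), (300, 80, 200, 220)]) == [(300, 80, 200, 220)]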
In one embodiment, when the face image is recognized with a face recognition algorithm, facial features of the face image can be extracted and compared with the facial features of face images stored in a database to calculate face similarity. When the maximum face similarity is greater than a similarity threshold, the user corresponding to the face image in the current video stream and the user corresponding to that maximum similarity in the database are considered to be the same person. The user account is then sent to the display system using the association between the user information and the user account, and the user login is completed automatically. In other embodiments, other face recognition algorithms may also be used to recognize the face image in the video stream, which is not limited herein.
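A sketch of the similarity comparison, assuming the facial features are fixed-length vectors compared by cosine similarity; the threshold value and data layout are illustrative, and any face recognition model could supply the feature vectors.

    from typing import Dict, Optional
    import numpy as np

    SIMILARITY_THRESHOLD = 0.8  # hypothetical; tuned per deployment

    def match_user(query_feature: np.ndarray,
                   stored_features: Dict[str, np.ndarray],
                   threshold: float = SIMILARITY_THRESHOLD) -> Optional[str]:
        """Compare a captured face feature against the stored features of registered users.
        Returns the user id with the highest cosine similarity if that similarity exceeds
        the threshold, otherwise None (treated as an unregistered visitor)."""
        best_user, best_sim = None, -1.0
        for user_id, stored in stored_features.items():
            sim = float(np.dot(query_feature, stored) /
                        (np.linalg.norm(query_feature) * np.linalg.norm(stored)))
            if sim > best_sim:
                best_user, best_sim = user_id, sim
        return best_user if best_sim > threshold else None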
In this embodiment, the video stream is acquired, the face image in the video stream is identified to obtain the user information corresponding to the face image, and then the user account corresponding to the user information is logged in the display system, that is, the user login is automatically completed by using the face identification technology, so that the complicated process of manually logging in the display system by the user is avoided.
In one embodiment, prior to the user accessing the display system, the method further comprises: acquiring user information and a face image corresponding to the user; establishing an incidence relation between user information and a face image; and generating a corresponding user account according to the incidence relation.
The user information is used for uniquely identifying the user, and can be the name, the mobile phone number and the like of the user.
Specifically, before a user accesses a display system, information of the user is recorded, a face image of the user is acquired by using a camera, an association relation between the user information and the face image is established, and a corresponding user account is generated and stored in a database.
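A sketch of binding the recorded user information to the captured face image and generating an account record; the name and phone-number fields follow the text above, while the id generation and storage format are assumptions.

    import uuid
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class UserAccount:
        name: str                  # user information used to uniquely identify the user
        phone: str
        face_feature: List[float]  # feature vector extracted from the registered face image
        account_id: str = field(default_factory=lambda: uuid.uuid4().hex)

    def register_user(name: str, phone: str, face_feature: List[float]) -> UserAccount:
        """Establish the association between the user information and the face image and
        generate the corresponding user account (to be stored in the database)."""
        return UserAccount(name=name, phone=phone, face_feature=face_feature)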
In this embodiment, before a user accesses the display system, the user information and the face image corresponding to the user are obtained, the association between them is established, and a corresponding user account is generated. This is the preparation for performing face recognition on the user and provides the basis for recognizing the user information and automatically logging the user account into the display system when the user later accesses it.
In one embodiment, the step of capturing a face image in a video stream, and identifying the face image to obtain user information corresponding to the face image includes: and when the face image is identified to have no corresponding user information, generating a prompt message of the registered account, and displaying the prompt message in a display system.
Before accessing the display system, some users may not have recorded their user information and corresponding face images. When such a user accesses the display system, the user cannot be identified and the user account cannot be logged into the display system automatically.
Specifically, when it is identified that the database contains no face image or user information corresponding to the current face image, the server generates a prompt message for registering an account and displays it on the display system. Through the prompt, the user can enter the account registration page and input personal information. The server then binds the user information to the current face image, generates a user account, and logs the user account into the display system.
In this embodiment, when it is recognized that the current face image does not have corresponding user information, user information filled by a user is acquired, the user information is bound with the current face image, a user account is generated, and the user account is logged in the display system. After logging in the user account, the server can acquire the operation information of the user on the display system and predict the car purchasing intention of the user.
In one embodiment, intercepting a face image in a video stream, and identifying the face image to obtain user information corresponding to the face image further includes: when more than one face image is recognized, generating a user selection window, and displaying the user selection window in a display system; acquiring the selection operation of a user on a user selection window, and determining user information to be acquired according to the selection operation; and logging in a user account corresponding to the user information into a display system according to the user information.
When accessing a display system, there may be a case where a plurality of users access the same display system together. For example, user A and friend user B enter the exhibition hall together, browse the vehicles using the same display system and discuss. At this time, the display system recognizes two different face images.
Specifically, when more than one face image is recognized, the server generates a user selection window and displays the user selection window on the display system. The user selection window comprises currently recognized face images, a user can select one of the face images, the selection operation of the user is obtained, the user information to be obtained is determined according to the selection operation, and a user account corresponding to the user information is logged in a display system.
In one embodiment, the user may select a facial image in the user selection window by clicking. In other embodiments, the user may also select the face image in the user selection window by checking or selecting a box.
This embodiment solves the problem of which user account to log in when multiple users access the display system at the same time, ensuring the stability of the display system and the accuracy of analyzing the user's vehicle-purchase intention from the user's operation information on the display system.
In one embodiment, the behavioural data information comprises physical vehicle information viewed by the user, the method further comprising:
step 302, acquiring target vehicle information; the target vehicle information includes each physical vehicle whose distance from the user is less than a distance threshold.
Wherein, besides being provided with a plurality of display systems, the exhibition hall of the vehicle is also provided with a plurality of entity vehicles. The target vehicle is a physical vehicle with a distance from the user smaller than a distance threshold value. The target vehicle information includes the number of physical vehicles whose distance from the user is smaller than a distance threshold value, and information such as a brand and a model corresponding to the vehicles.
Specifically, in one embodiment, the physical vehicles are each provided with a detector. When the detector detects that the distance between the user and the entity vehicle is smaller than the distance threshold value, the user is considered to be interested in the entity vehicle, the entity vehicle is identified as a target vehicle, and vehicle information of the vehicle is obtained.
In one embodiment, the distance threshold may be set to 50 centimeters, that is, when the distance between the user and the physical vehicle is less than 50 centimeters, the user is considered to be interested in the physical vehicle, the physical vehicle is identified as the target vehicle, and the vehicle information of the vehicle is acquired.
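A sketch of identifying target vehicles from detector readings, using the 50-centimetre threshold from the example; how the distances are actually measured depends on the detectors and is not modelled here.

    DISTANCE_THRESHOLD_CM = 50.0  # the 50 cm example used above

    def target_vehicle_ids(measured_distances_cm, threshold=DISTANCE_THRESHOLD_CM):
        """Return the ids of physical vehicles whose measured distance to the user is below
        the threshold; these are treated as target vehicles the user is interested in.
        `measured_distances_cm` maps a vehicle id to its latest distance reading in centimetres."""
        return [vid for vid, d in measured_distances_cm.items() if d < threshold]

    assert target_vehicle_ids({"suv_01": 32.0, "sedan_07": 180.0}) == ["suv_01"]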
And step 304, assigning a corresponding numerical value to each target vehicle according to the target vehicle information.
After the target vehicle information is obtained, each physical vehicle whose distance from the user is smaller than the distance threshold, that is, each target vehicle, is known. A corresponding numerical value is assigned to each target vehicle, and the values corresponding to the target vehicles are added to obtain the reference information corresponding to the target vehicles.
The reference information includes an initial vehicle-purchasing intention evaluation value of the user.
Specifically, assume that the user is less than 50 centimeters away from 20 physical vehicles during the process of viewing the vehicle exhibition, i.e., has viewed 20 target vehicles. In one embodiment, the value corresponding to each vehicle browsed by the user may be set to 1, and then the initial vehicle-purchasing intention evaluation value of the user is 20. In other embodiments, the value may be other values, such as 2, 3, etc.
Step 306, obtaining the time length of the user staying in the distance threshold of each target vehicle.
Because the user's degree of preference differs from vehicle to vehicle, the length of time the user stays within the distance threshold of each target vehicle, that is, the time spent viewing each target vehicle, also differs.
Specifically, a first time point and a last time point corresponding to the time when the distance between the user and the target vehicle is smaller than the distance threshold are obtained, a time interval is calculated, and the time interval is used as the time length for the user to watch the target vehicle.
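A sketch of deriving the viewing duration from the first and last proximity detections described above; the timestamp values are hypothetical.

    from datetime import datetime

    def viewing_seconds(proximity_timestamps):
        """Duration the user stayed within the distance threshold of one target vehicle,
        taken as the interval between the first and last proximity detections."""
        if not proximity_timestamps:
            return 0.0
        return (max(proximity_timestamps) - min(proximity_timestamps)).total_seconds()

    # Example: detections from 14:00:00 to 14:01:30 give a 90-second viewing duration.
    stamps = [datetime(2019, 11, 25, 14, 0, 0),
              datetime(2019, 11, 25, 14, 0, 40),
              datetime(2019, 11, 25, 14, 1, 30)]
    assert viewing_seconds(stamps) == 90.0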
And 308, according to the stay time of the user in the distance threshold of each target vehicle, giving different weights to the numerical values corresponding to each target vehicle, and then adding the numerical values to obtain the reference information corresponding to each target vehicle.
The reference information includes an estimation value of the intention of the user to purchase the vehicle.
Specifically, in one embodiment, when the user views a target vehicle for no more than 60 seconds, the user is considered uninterested in that vehicle, and its value may be given a weight of 0. When the viewing time exceeds 60 seconds, the user may be considered interested in the vehicle: a viewing time between 60 and 120 seconds gives the vehicle's value a weight of 1, a viewing time between 120 and 180 seconds gives it a weight of 2, and so on. The weighted values of all viewed vehicles are then added to obtain the vehicle-purchase intention evaluation value, that is, the reference information corresponding to each target vehicle.
For example, assume that the user views 20 target vehicles in the exhibition hall and the value corresponding to each vehicle is set to 2, where 5 target vehicles are viewed for less than 60 seconds, 10 for between 60 and 120 seconds, 3 for between 120 and 180 seconds, and 2 for between 180 and 240 seconds. Then 5 × 2 × 0 + 10 × 2 × 1 + 3 × 2 × 2 + 2 × 2 × 2 = 40, and the user's vehicle-purchase intention evaluation value is 40.
And 310, calculating the association information of each target vehicle and the user according to the reference information, and associating and storing the relationship between the association reference information and the corresponding user.
The evaluation criteria of the purchase intention evaluation value are different depending on the value corresponding to each vehicle and the weight given to the value.
Specifically, in one embodiment, when the value corresponding to each vehicle is set to 2 and a weight of 0 is given when a target vehicle is viewed for no more than 60 seconds, the evaluation criterion of the associated reference information value may be set as follows: a value no greater than 15 predicts a first-level vehicle-purchase intention, a value between 15 and 30 predicts a second-level intention, and so on, where a higher level represents a stronger intention to purchase a vehicle. In addition, the vehicle the user is most interested in can be predicted from the viewing durations: the vehicle viewed for the longest time is the one the user is most interested in.
In this embodiment, the distance between the user and the physical vehicle is detected to obtain the time length of the user staying in front of the physical vehicle, and the time length is analyzed to predict the car purchasing intention of the user. Subjective factors caused by business personnel are avoided, and the objectivity and the accuracy of the prediction of the car purchasing intention of the user are improved.
In one embodiment, the method further comprises: acquiring a time point when a user enters an exhibition hall and a time point when the user leaves the exhibition hall; determining the time length of the user accessing the exhibition hall according to the time point of the user entering the exhibition hall and the time point of the user leaving the exhibition hall; and the relation between the time length for the user to access the exhibition hall and the corresponding user is stored in an associated manner.
Specifically, the time points at which the user enters and leaves the exhibition hall are obtained, and the length of the user's visit is calculated from them. The relationship between the visit duration and the corresponding user is stored in an associated manner, and the visit duration can be analyzed to predict the user's vehicle-purchase intention: the longer the user's visit to the exhibition hall, the stronger the predicted intention to purchase a vehicle.
In this embodiment, the time points at which the user enters and leaves the exhibition hall are obtained, and the length of the visit is calculated and analyzed to predict the user's vehicle-purchase intention, improving the objectivity and accuracy of the prediction.
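A sketch of computing the visit duration and of the record that would be stored in association with the user; the identifiers and timestamps are hypothetical.

    from datetime import datetime

    def visit_minutes(entry_time: datetime, exit_time: datetime) -> float:
        """Length of the user's visit to the exhibition hall, in minutes."""
        return (exit_time - entry_time).total_seconds() / 60.0

    # The duration is stored in association with the user, on the basis that a longer
    # visit suggests a stronger vehicle-purchase intention.
    visit_record = {
        "user_id": "u001",
        "visit_minutes": visit_minutes(datetime(2019, 11, 25, 10, 0),
                                       datetime(2019, 11, 25, 11, 30)),
    }
    assert visit_record["visit_minutes"] == 90.0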
It should be understood that although the steps in the flowcharts of fig. 2-3 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the execution of these steps is not strictly limited to the order shown, and the steps may be performed in other orders. Moreover, at least some of the steps in fig. 2-3 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 4, there is provided a user behavior data processing apparatus 400, for implementing face recognition, including: an operation information obtaining module 401, a numerical value assigning module 402, an operation reference information calculating module 403, and an association reference information calculating module 404, wherein:
an operation information obtaining module 401, configured to obtain behavior data information of a user; the behavior data information comprises vehicle information browsed by a user in the display system and the time length of each vehicle browsed; the browsed vehicle information includes each vehicle browsed by the user.
A value assigning module 402, configured to assign a corresponding value to each vehicle browsed by the user according to the browsed vehicle information.
The operation reference information calculating module 403 is configured to assign different weights to the numerical values corresponding to each vehicle according to the time length for browsing each vehicle, and add the numerical values to obtain the operation reference information corresponding to each vehicle.
And an associated reference information calculation module 404, configured to calculate associated reference information of each vehicle and the user according to the operation reference information, and store a relationship between the associated reference information and the corresponding user in an associated manner.
In one embodiment, the user behavior data processing apparatus 400 further includes a user account automatic login module 405, configured to obtain a video stream when a user accesses the display system; when the pixels of the face image in the video stream are larger than a preset pixel threshold value, intercepting the face image in the video stream, and identifying the face image to obtain user information corresponding to the face image; and logging in a user account corresponding to the user information into the display system according to the user information.
In one embodiment, the user behavior data processing apparatus 400 further includes a user account generation module 406, configured to obtain user information and a facial image corresponding to the user; establishing an incidence relation between user information and a face image; and generating a corresponding user account according to the incidence relation.
In one embodiment, the user account generation module 406 is further configured to generate a prompt message for registering an account when it is recognized that the face image does not have corresponding user information, and display the prompt message in the display system.
In one embodiment, the user account generation module 406 is further configured to generate a user selection window when more than one facial image is recognized, and display the user selection window in the display system; acquiring the selection operation of a user on a user selection window, and determining user information to be acquired according to the selection operation; and logging in a user account corresponding to the user information into a display system according to the user information.
In one embodiment, the user behavior data processing device 400 is further configured to obtain target vehicle information; the target vehicle information comprises each entity vehicle of which the distance between the target vehicle information and the user is smaller than a distance threshold value; according to the target vehicle information, giving a corresponding numerical value to each target vehicle; acquiring the stay time of a user in the distance threshold of each target vehicle; according to the time length of the user staying in the distance threshold of each target vehicle, giving different weights to the numerical value corresponding to each target vehicle, and adding the numerical values to obtain reference information corresponding to each target vehicle; and calculating the association information of each target vehicle and the user according to the reference information, and associating and storing the relationship between the association reference information and the corresponding user.
In one embodiment, the user behavior data processing device 400 is further configured to obtain a time point when the user enters the exhibition hall and a time point when the user leaves the exhibition hall; determining the time length of the user accessing the exhibition hall according to the time point of the user entering the exhibition hall and the time point of the user leaving the exhibition hall; and associating and storing the relationship between the time length of the user accessing the exhibition hall and the corresponding user.
For specific limitations of the user behavior data processing apparatus, reference may be made to the above limitations of the user behavior data processing method, which are not repeated here. The modules in the user behavior data processing apparatus may be implemented wholly or partially by software, hardware, or a combination thereof. The modules may be embedded in or independent of a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 5. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a user behavior data processing method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 5 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer devices to which the solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program: acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in the display system and the time length for which each vehicle was browsed; the browsed vehicle information comprises each vehicle browsed by the user; according to the browsed vehicle information, giving a corresponding numerical value to each vehicle browsed by the user; according to the time length of browsing each vehicle, giving different weights to the numerical values corresponding to each vehicle, and adding the numerical values to obtain operation reference information corresponding to each vehicle; and calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
In one embodiment, the processor, when executing the computer program, further performs the steps of: when a user accesses a display system, acquiring a video stream; when the pixels of the face image in the video stream are larger than a preset pixel threshold value, intercepting the face image in the video stream, and identifying the face image to obtain user information corresponding to the face image; and logging in a user account corresponding to the user information into the display system according to the user information.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring user information and a face image corresponding to the user; establishing an incidence relation between user information and a face image; and generating a corresponding user account according to the incidence relation.
In one embodiment, the processor, when executing the computer program, further performs the steps of: and when the face image is identified to have no corresponding user information, generating a prompt message of the registered account, and displaying the prompt message in a display system.
In one embodiment, the processor, when executing the computer program, further performs the steps of: when more than one face image is recognized, generating a user selection window, and displaying the user selection window in a display system; acquiring the selection operation of a user on a user selection window, and determining user information to be acquired according to the selection operation; and logging in a user account corresponding to the user information into a display system according to the user information.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring target vehicle information; the target vehicle information comprises each entity vehicle of which the distance between the target vehicle information and the user is smaller than a distance threshold value; according to the target vehicle information, giving a corresponding numerical value to each target vehicle; acquiring the stay time of a user in the distance threshold of each target vehicle; according to the time length of the user staying in the distance threshold of each target vehicle, giving different weights to the numerical value corresponding to each target vehicle, and adding the numerical values to obtain reference information corresponding to each target vehicle; and calculating the association information of each target vehicle and the user according to the reference information, and associating and storing the relationship between the association reference information and the corresponding user.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring a time point when a user enters an exhibition hall and a time point when the user leaves the exhibition hall; determining the time length of the user accessing the exhibition hall according to the time point of the user entering the exhibition hall and the time point of the user leaving the exhibition hall; and associating and storing the relationship between the time length of the user accessing the exhibition hall and the corresponding user.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of: acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in the display system and the time length for which each vehicle was browsed; the browsed vehicle information comprises each vehicle browsed by the user; according to the browsed vehicle information, giving a corresponding numerical value to each vehicle browsed by the user; according to the time length of browsing each vehicle, giving different weights to the numerical values corresponding to each vehicle, and adding the numerical values to obtain operation reference information corresponding to each vehicle; and calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
In one embodiment, the processor, when executing the computer program, further performs the steps of: when a user accesses a display system, acquiring a video stream; when the pixels of the face image in the video stream are larger than a preset pixel threshold value, intercepting the face image in the video stream, and identifying the face image to obtain user information corresponding to the face image; and logging in a user account corresponding to the user information into the display system according to the user information.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring user information and a face image corresponding to the user; establishing an incidence relation between user information and a face image; and generating a corresponding user account according to the incidence relation.
In one embodiment, the processor, when executing the computer program, further performs the steps of: and when the face image is identified to have no corresponding user information, generating a prompt message of the registered account, and displaying the prompt message in a display system.
In one embodiment, the processor, when executing the computer program, further performs the steps of: when more than one face image is recognized, generating a user selection window, and displaying the user selection window in a display system; acquiring the selection operation of a user on a user selection window, and determining user information to be acquired according to the selection operation; and logging in a user account corresponding to the user information into a display system according to the user information.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring target vehicle information; the target vehicle information comprises each entity vehicle of which the distance between the target vehicle information and the user is smaller than a distance threshold value; according to the target vehicle information, giving a corresponding numerical value to each target vehicle; acquiring the stay time of a user in the distance threshold of each target vehicle; according to the time length of the user staying in the distance threshold of each target vehicle, giving different weights to the numerical value corresponding to each target vehicle, and adding the numerical values to obtain reference information corresponding to each target vehicle; and calculating the association information of each target vehicle and the user according to the reference information, and associating and storing the relationship between the association reference information and the corresponding user.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring a time point when a user enters an exhibition hall and a time point when the user leaves the exhibition hall; determining the time length of the user accessing the exhibition hall according to the time point of the user entering the exhibition hall and the time point of the user leaving the exhibition hall; and associating and storing the relationship between the time length of the user accessing the exhibition hall and the corresponding user.
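A minimal sketch of this duration calculation, assuming entry and exit timestamps are available for each user:

```python
# Illustration only: exhibition hall visit duration from entry/exit time points,
# stored against the corresponding user.
from datetime import datetime

visit_durations = {}   # user_id -> seconds spent in the exhibition hall

def record_visit(user_id: str, entered: datetime, left: datetime) -> None:
    visit_durations[user_id] = (left - entered).total_seconds()
```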
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For the sake of brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of technical features contains no contradiction, it should be considered as falling within the scope recorded in this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A user behavior data processing method is applied to a face recognition system, and comprises the following steps:
acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by a user in a display system and the time length of each vehicle browsed; the browsed vehicle information comprises each vehicle browsed by the user;
according to the browsed vehicle information, giving a corresponding numerical value to each vehicle browsed by the user;
according to the time length for browsing each vehicle, giving different weights to the numerical value corresponding to each vehicle, and adding the numerical values to obtain operation reference information corresponding to each vehicle;
and calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
2. The method of claim 1, wherein before the obtaining the behavior data information of the user, the method further comprises:
when a user accesses a display system, acquiring a video stream;
when the pixel size of a face image in the video stream is larger than a preset pixel threshold, intercepting the face image from the video stream, and identifying the face image to obtain user information corresponding to the face image;
and logging in a user account corresponding to the user information into a display system according to the user information.
3. The method of claim 2, wherein prior to the user accessing a display system, the method further comprises:
acquiring user information and a face image corresponding to the user;
establishing an association relationship between the user information and the face image;
and generating a corresponding user account according to the association relationship.
4. The method according to claim 2, wherein the intercepting the face image in the video stream and the recognizing the face image to obtain the user information corresponding to the face image comprises:
and when it is identified that the face image has no corresponding user information, generating a prompt message for account registration, and displaying the prompt message in a display system.
5. The method according to claim 2, wherein the intercepting the face image in the video stream and the recognizing the face image to obtain the user information corresponding to the face image further comprises:
when more than one face image is recognized, generating a user selection window, and displaying the user selection window in a display system;
acquiring the selection operation of a user on the user selection window, and determining the user information to be acquired according to the selection operation;
and logging in a user account corresponding to the user information into a display system according to the user information.
6. The method of claim 1, wherein the behavior data information includes physical vehicle information viewed by a user, the method further comprising:
acquiring target vehicle information; the target vehicle information comprises each physical vehicle whose distance from the user is smaller than a distance threshold;
according to the target vehicle information, giving a corresponding numerical value to each target vehicle;
acquiring the duration for which the user stays within the distance threshold of each target vehicle;
according to the duration for which the user stays within the distance threshold of each target vehicle, giving different weights to the numerical value corresponding to each target vehicle, and then adding the weighted numerical values to obtain reference information corresponding to each target vehicle;
and calculating the associated reference information of each target vehicle and the user according to the reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
7. The method of claim 1, further comprising:
acquiring a time point when a user enters an exhibition hall and a time point when the user leaves the exhibition hall;
determining the time length of the user accessing the exhibition hall according to the time point of the user entering the exhibition hall and the time point of the user leaving the exhibition hall;
and associating and storing the relationship between the time length of the user accessing the exhibition hall and the corresponding user.
8. A user behavior data processing device, applied to a face recognition system, characterized by comprising:
the operation information acquisition module is used for acquiring behavior data information of a user; the behavior data information comprises vehicle information browsed by the user in a display system and the time length of each vehicle browsed by the user; the browsed vehicle information comprises each vehicle browsed by the user;
the numerical value giving module is used for giving a corresponding numerical value to each vehicle browsed by the user according to the browsed vehicle information;
the operation reference information calculation module is used for giving different weights to the numerical values corresponding to each vehicle according to the time length for browsing each vehicle and then adding the numerical values to obtain operation reference information corresponding to each vehicle;
and the associated reference information calculation module is used for calculating the associated reference information of each vehicle and the user according to the operation reference information, and storing the relationship between the associated reference information and the corresponding user in an associated manner.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 7 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN201911169205.1A 2019-11-25 2019-11-25 User behavior data processing method and device, computer equipment and storage medium Pending CN112837083A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911169205.1A CN112837083A (en) 2019-11-25 2019-11-25 User behavior data processing method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911169205.1A CN112837083A (en) 2019-11-25 2019-11-25 User behavior data processing method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112837083A true CN112837083A (en) 2021-05-25

Family

ID=75922516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911169205.1A Pending CN112837083A (en) 2019-11-25 2019-11-25 User behavior data processing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112837083A (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120271715A1 (en) * 2011-03-25 2012-10-25 Morton Timothy B System and method for the automatic delivery of advertising content to a consumer based on the consumer's indication of interest in an item or service available in a retail environment
CN105025145A (en) * 2014-04-22 2015-11-04 上海斐讯数据通信技术有限公司 Photo-related contact method and apparatus
CN105701687A (en) * 2016-02-17 2016-06-22 网易传媒科技(北京)有限公司 User behavior-based data processing method and apparatus
KR20180104961A (en) * 2017-03-14 2018-09-27 주식회사 에그락 System and method for recommending shops-to-visit at off-line shopping malls based on customers' behaviors
US20180336578A1 (en) * 2017-05-16 2018-11-22 Fujitsu Limited Interest level evaluation method, interest level evaluating device and recording medium
CN108460064A (en) * 2017-07-24 2018-08-28 平安科技(深圳)有限公司 Method for closing, device, storage medium and the terminal of task based access control dimension
JP2019053409A (en) * 2017-09-13 2019-04-04 ヤフー株式会社 Addition device, addition method, addition program and model
KR20190084610A (en) * 2018-01-09 2019-07-17 롯데정보통신 주식회사 Car display system and method of displaying a car in the system
CN108334588A (en) * 2018-01-29 2018-07-27 北京搜狐新媒体信息技术有限公司 A kind of user tag construction method and device
CN108491806A (en) * 2018-03-28 2018-09-04 成都信达智胜科技有限公司 A kind of fast human face recognition
CN109410009A (en) * 2018-10-31 2019-03-01 徐凤林 A kind of method and apparatus of purchase intention that predicting mobile terminal user
CN110377389A (en) * 2019-07-12 2019-10-25 北京旷视科技有限公司 Image labeling guidance method, device, computer equipment and storage medium
CN110443644A (en) * 2019-07-25 2019-11-12 重庆电子工程职业学院 The consumer's automatic locking method intelligently sold based on 4 S auto shop
CN110490646A (en) * 2019-08-01 2019-11-22 北京东方国信科技股份有限公司 The determination method and device of automobile brand target user

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113850956A (en) * 2021-09-08 2021-12-28 摩拜(北京)信息技术有限公司 Bicycle borrowing and helmet returning method, electric bicycle, server and user terminal

Similar Documents

Publication Publication Date Title
US10382670B2 (en) Cognitive recording and sharing
CN108805898B (en) Video image processing method and device
JPWO2018142766A1 (en) Learned model providing method and learned model providing apparatus
CN111654700B (en) Privacy mask processing method and device, electronic equipment and monitoring system
CN110660078B (en) Object tracking method, device, computer equipment and storage medium
CN111667001A (en) Target re-identification method and device, computer equipment and storage medium
CN110991249A (en) Face detection method, face detection device, electronic equipment and medium
CN114650430A (en) Business list display method and device, electronic equipment and storage medium
CN111476275A (en) Target detection method based on picture recognition, server and storage medium
CN115544475A (en) Business service method and device
CN115311042A (en) Commodity recommendation method and device, computer equipment and storage medium
CN111291087A (en) Information pushing method and device based on face detection
CN113918738A (en) Multimedia resource recommendation method and device, electronic equipment and storage medium
CN112837083A (en) User behavior data processing method and device, computer equipment and storage medium
CN110795980A (en) Network video-based evasion identification method, equipment, storage medium and device
CN110677448A (en) Associated information pushing method, device and system
US11527091B2 (en) Analyzing apparatus, control method, and program
CN113420203A (en) Object recommendation method and device, electronic equipment and storage medium
CN112905935A (en) Page recording method, page recording animation generation method, equipment and storage medium
JP7015927B2 (en) Learning model application system, learning model application method, and program
CN114549849A (en) Image recognition method and device, computer equipment and storage medium
CN109523360B (en) Information recommendation method and system
CN112836115A (en) Information recommendation method and device, computer equipment and storage medium
CN115497252A (en) Early warning method and device for self-service equipment, computer equipment and storage medium
CN110866292A (en) Interface display method and device, terminal equipment and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination