CN104133920A - User behavior collection method and device - Google Patents

User behavior collection method and device

Info

Publication number
CN104133920A
CN104133920A CN201410344319.6A
Authority
CN
China
Prior art keywords
user
smart machine
wearable smart
information
head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410344319.6A
Other languages
Chinese (zh)
Inventor
余凯
丁二锐
吴中勤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201410344319.6A priority Critical patent/CN104133920A/en
Publication of CN104133920A publication Critical patent/CN104133920A/en
Pending legal-status Critical Current

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiments of the invention disclose a user behavior collection method and device. The method comprises the following steps: identifying target object information on the basis of a wearable smart device; collecting, on the basis of the wearable smart device, the behavior information of a user who wears the wearable smart device with respect to the target object; and storing the target object information and the behavior information as a behavior record for the target object. The technical scheme of the embodiments collects user behavior that is richer in information, so as to satisfy statistical analysis requirements based on user behavior.

Description

User behavior collection method and device
Technical field
The present invention relates to the field of computer application technology, and in particular to a user behavior collection method and device.
Background
Typically, stores need to analyze commodity sales and potential purchasing power; shopping malls need to manage and analyze mall operations, customer consumption characteristics, and the operating condition of each store; cinemas need to analyze and make decisions about the screening effect of films; and advertisers need to analyze and select print advertisement images for effectiveness. Most of these analyses involve statistics, analysis, and decision-making based on the behavior of the counterpart (i.e., the user), and therefore require collecting user behavior.
However, collecting user behavior in the prior art suffers from many problems, especially that the collected data are rather one-dimensional and cannot satisfy statistical analysis requirements based on user behavior. Taking a shopping mall as an example, when the mall performs statistical analysis of commodity sales, it can usually only collect limited user behavior data through shopping receipts. It therefore cannot build models for specific consumers, cannot count the commodities a customer paid attention to while shopping but ultimately did not buy, cannot count how much time a customer spent on a given commodity, and cannot count how many users lingered in a certain area during a preset time period or at a preset time point.
Summary of the invention
In view of this, the embodiments of the present invention provide a user behavior collection method and device, which can solve the problem that the user behavior collected by the prior art is one-dimensional and cannot satisfy statistical analysis requirements based on user behavior.
In a first aspect, an embodiment of the present invention provides a user behavior collection method, comprising:
identifying target object information based on a wearable smart device;
collecting, based on the wearable smart device, the behavior information of a user who wears the wearable smart device with respect to the target object; and
storing the target object information and the behavior information as a behavior record for the target object.
In a second aspect, an embodiment of the present invention further provides a user behavior collection device, comprising:
a target object information identification unit, configured to identify target object information based on a wearable smart device;
a behavior information collection unit, configured to collect, based on the wearable smart device, the behavior information of the user who wears the wearable smart device with respect to the target object; and
a behavior record storage unit, configured to store the target object information and the behavior information as a behavior record for the target object.
The beneficial technical effect of the technical scheme proposed by the embodiments of the present invention is as follows:
By identifying target object information with a wearable smart device, collecting the behavior information of the user who wears the wearable smart device with respect to the target object, and storing them as a behavior record for the target object, the technical scheme of the embodiments can collect user behavior that is richer in information, so as to satisfy statistical analysis requirements based on user behavior.
Brief description of the drawings
In order to illustrate the technical schemes in the embodiments of the present invention more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below cover only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from the content of the embodiments and these drawings without creative effort.
Fig. 1 is a flowchart of the user behavior collection method described in Embodiment One of the present invention;
Fig. 2 is a flowchart of the user behavior collection method described in Embodiment Two of the present invention;
Fig. 3 is a structural block diagram of the user behavior collection device described in Embodiment Three of the present invention;
Fig. 4 is a structural block diagram of the user behavior collection device described in Embodiment Four of the present invention.
Detailed Description
To make the technical problems solved, the technical schemes adopted, and the technical effects achieved by the present invention clearer, the technical schemes of the embodiments are described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the present invention.
The technical scheme of the present invention is further illustrated below with reference to the drawings and through embodiments.
Embodiment One
Fig. 1 is a flowchart of the user behavior collection method provided by Embodiment One of the present invention. This embodiment is applicable to a system composed of a server and at least one wearable smart device, where the wearable smart device serves as an information collection tool that obtains information and transmits it to the server, and the server and the wearable smart device can exchange data directly or indirectly. Meanwhile, other devices (not limited to the wearable smart device, e.g., a mobile phone) can be used to present information to the user or a merchant (for example, once associated with the user's smart device, the user can retrieve his or her own shopping history). In this embodiment, the wearable smart device collects the behavior information of the user who wears it with respect to a target object and sends it to a server, so that the server can perform analysis and statistics — for example, recording the consumption behavior of the wearer as a consumer sample, so as to analyze the target object and/or the behavior of the user wearing the wearable smart device. The method described in this embodiment may be executed by a user behavior collection device configured in the wearable smart device.
As shown in Fig. 1, the user behavior collection method described in this embodiment comprises:
S101: identifying target object information based on a wearable smart device.
A wearable smart device is a smart device "worn" on the user's body, including head-mounted smart devices, smart wristbands, smart watches, and the like. It is worn directly by the user or integrated into clothes or accessories, combines with the person, realizes a wearable computing mode of mobile-network computing through wireless transmission, and can assist the user in improving information perception and information processing capabilities.
Those skilled in the art should understand that this embodiment collects the behavior information of multiple users, each wearing a wearable smart device, with respect to target objects; the wearable smart devices worn by the respective users may be the same or different, and each can exchange data directly or indirectly with the same server.
The target object described in this embodiment includes, but is not limited to, physical commodities, videos, print images, stores, and the like. Identifying target object information through the wearable smart device means identifying the identification information of the target object. The identification information uniquely identifies the target object or its kind, and includes, but is not limited to, marks that can be identified by electronic scanning or image acquisition, such as the label, QR code, and/or image of the target object. The object image may be, for example, the LOGO or the main appearance features of the target object.
Specifically, the target object information may be identified by the wearable smart device through methods such as image scanning, image capture, or gaze tracking. Collection may be performed when the wearable smart device receives a preset user operation, or the wearable smart device may be controlled to perform information collection automatically according to preset rules. Taking a head-mounted smart device as an example, gaze tracking may be performed on the user wearing the device, and whether to collect target object information may be determined according to the user's dwell time on the target object and/or the pupil characteristics of the user observing it. Whether to collect may also be determined according to the user's operation on the head-mounted smart device (for example, the user touches or taps a specific part of the device; the user lifts the device so that it is within a preset angle range or so that the distance between the lens and the eyes reaches a preset range; or the device receives a preset user gesture, preset voice information, or preset expression information).
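The gaze-dwell trigger described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the sample format, object identifiers, and the dwell-time threshold are all assumptions.

```python
# Hypothetical sketch of the gaze-dwell trigger: target object information
# is captured only when the wearer's gaze rests continuously on the same
# object for at least a preset dwell time.

DWELL_THRESHOLD_S = 2.0  # assumed preset dwell-time threshold

def should_capture(gaze_samples, threshold_s=DWELL_THRESHOLD_S):
    """gaze_samples: list of (timestamp_s, object_id) gaze fixations.
    Returns the object_id whose continuous fixation meets the threshold,
    or None if no object was watched long enough."""
    if not gaze_samples:
        return None
    start_t, current = gaze_samples[0]
    for t, obj in gaze_samples[1:]:
        if obj != current:            # gaze moved to a new object: reset
            start_t, current = t, obj
        elif t - start_t >= threshold_s:
            return current            # dwell threshold reached
    return None

samples = [(0.0, "shelf"), (0.5, "camera_X"), (1.2, "camera_X"), (2.8, "camera_X")]
print(should_capture(samples))  # camera_X (watched continuously for 2.3 s)
```

In a real device the samples would come from the eye tracker, and the same trigger could equally be driven by pupil features or a preset operation, as the text notes.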
S102: collecting, based on the wearable smart device, the behavior information of the user who wears the wearable smart device with respect to the target object.
From the large number of users wearing wearable smart devices, very rich and accurate user behavior information can be obtained.
For example, gaze tracking by the head-mounted smart device yields the user's attention time on the target object; position locating by the head-mounted smart device yields the user's dwell time at the target object; a GPS (Global Positioning System) sensor determines where the user is shopping; and the user's attention to the target object can be obtained by collecting the user's voice, gestures, pupil information, external laser information, operation information on the buttons of the head-mounted smart device, and so on.
S103: storing the target object information and the behavior information as a behavior record for the target object.
The behavior record may be stored locally on the wearable smart device, or uploaded via Bluetooth, WiFi (Wireless Fidelity), a wireless network, etc., to a server or database for storage. The server can then perform behavior analysis on the target object and/or the user according to the stored behavior records.
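The behavior record of steps S101–S103 can be sketched as a simple data structure. The field names, file path, and JSON-lines format are illustrative assumptions; the patent only specifies that the target object information and behavior information are stored together.

```python
# Minimal sketch of the behavior record: target object information and
# behavior information bundled into one record and appended to local
# storage (uploading to a server would serialize the same structure).
import json
import time

def make_behavior_record(object_info, behavior_info, user_id=None):
    record = {
        "object": object_info,      # e.g. label / QR code / image feature
        "behavior": behavior_info,  # e.g. attention time, dwell time
        "timestamp": time.time(),
    }
    if user_id is not None:         # Embodiment Two adds identity info
        record["user_id"] = user_id
    return record

def save_locally(record, path="behavior_log.jsonl"):
    """Append one record per line to a local log file."""
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

record = make_behavior_record({"qr": "SKU-123"}, {"attention_s": 4.2}, user_id="u42")
print(sorted(record))  # ['behavior', 'object', 'timestamp', 'user_id']
```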
According to the technical scheme of Embodiment One, by identifying target object information with the wearable smart device, collecting the behavior information of the user wearing it with respect to the target object, and storing them as a behavior record for the target object, user behavior that is richer in information can be collected, so as to satisfy statistical analysis requirements based on user behavior.
Embodiment Two
Fig. 2 is a flowchart of the user behavior collection method described in Embodiment Two of the present invention. This embodiment is applicable to a user behavior analysis system composed of a server and at least one wearable smart device, where each wearable smart device serves as a client of the system, collects the behavior of the user wearing it, and sends the collected user behavior records to the server for storage. The executive agent of the method described in this embodiment is the user behavior collection device configured in each wearable smart device. This embodiment describes the technical scheme from the perspective of a single wearable smart device, taking a head-mounted smart device as an example.
As shown in Fig. 2, the user behavior collection method described in this embodiment comprises:
S201: connecting the head-mounted smart device to the user behavior analysis system of shopping mall A.
The head-mounted smart device (e.g., smart glasses) is connected to the user behavior analysis system of mall A as a client. The connection may be made by joining a LAN (Local Area Network), or by installing client software on the head-mounted smart device; communication and data exchange with the server may take place via Bluetooth, WiFi (Wireless Fidelity), a wireless network, etc.
There are also multiple occasions on which the head-mounted smart device may connect to the user behavior analysis system of mall A; for example, it may scan the QR code of the system, obtain a web address, and connect, or connect after the client software is detected as installed.
S202: collecting, by the head-mounted smart device, images of the target object that the user of the device pays attention to, and recording the user's attention time on the same target object.
Taking smart glasses as an example of the head-mounted smart device, gaze tracking is performed by the smart glasses, images of the target object that the wearer pays attention to are collected by the built-in camera of the smart glasses, and the user's attention time on the same target object is recorded.
It should be noted that determining the wearer's attention to the same target object includes, but is not limited to, the attention time. The subsequent operations of this embodiment use attention time as an example; if attention is measured by other features, substituting them according to the same principle also falls within the scope disclosed by this embodiment. It should also be noted that when the collected image contains multiple objects, the user's eyeball orientation can be tracked to determine the main object of attention.
This operation — collecting images of the target object the wearer pays attention to through the built-in camera of the head-mounted smart device and recording the user's attention time on the same target object — is one way of collecting, based on a wearable smart device, the behavior information of the user wearing it with respect to a target object.
Of course, those skilled in the art should understand that collecting the behavior information of the user wearing the wearable smart device with respect to a target object is not limited to this way. For example:
Mode one: gaze tracking by the head-mounted smart device yields the user's attention time on the target object;
Mode two: position locating by the head-mounted smart device yields the user's dwell time at the target object;
Mode three: analyzing the user's eyeball or head movements through the head-mounted smart device yields the user's behavior information with respect to the target object;
Mode four: gesture recognition and/or speech recognition based on the wearable smart device yields the user's behavior information with respect to the target object;
Mode five: receiving the user's hard-button operations on the head-mounted smart device yields the user's behavior information with respect to the target object.
It should be noted that the collection of behavior information is not limited to the above modes, and the modes may be used alone or in combination.
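Since the modes above may be used alone or in combination, the combined behavior information can be sketched as a merge of per-mode results. The mode outputs and field names here are hypothetical stand-ins for the device's real sensor pipelines.

```python
# Sketch of combining collection modes: each mode produces a partial
# behavior-information dict, and the results are merged into one
# behavior record entry.

def merge_behavior_info(*mode_results):
    """Merge behavior info produced by individual collection modes."""
    merged = {}
    for result in mode_results:
        merged.update(result)
    return merged

gaze = {"attention_s": 4.2}          # mode one: gaze tracking
position = {"dwell_s": 30.0}         # mode two: position locating
gesture = {"gesture": "thumbs_up"}   # mode four: gesture recognition
print(merge_behavior_info(gaze, position, gesture))
```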
S203: judging whether the attention time is greater than or equal to a preset time threshold; if so, performing operation S204; otherwise, returning to operation S202.
S204: extracting the label, QR code, LOGO, object image, etc., of the target object from the collected images as the target object information.
The target object information serves as the identification information of the target object, used to uniquely identify the target object or its kind. It includes, but is not limited to, marks that can be identified by electronic scanning or image acquisition, such as the label, QR code, and image of the target object.
S205: continuing to record the user's attention time on this target object as the user's behavior information with respect to it.
If the attention time is greater than or equal to the preset time threshold, the target object meets the preset condition for user behavior collection, and the user's behavior information with respect to it — for example, the attention time — needs to be recorded continuously.
S206: judging, by the head-mounted smart device, whether the user has finished paying attention to this target object; if so, performing operation S207; otherwise, returning to operation S205.
S207: confirming and recording the user's identity information by the head-mounted smart device.
This operation adds a record of the user's identity information when the user's behavior toward the target object is collected, adding the dimension of the actor to the collected information and making the collected user behavior content richer and more informative. Meanwhile, because each behavior record is attributed to an actor, it can provide a basis for early warning of information fraud, and provide the mall with a reliable basis for analyzing foot traffic, the average number of objects a customer pays attention to, modeling analysis of mall customers, and analysis of consumption levels and consumption characteristics of target objects inside the mall.
It should be noted that there are multiple concrete methods for recording the user's identity information. For example, the user's iris image may be collected by the built-in camera of the head-mounted smart device and the identity information recorded according to the iris image; or the user's identity information may be identified through the registration information of the wearable smart device (such as the aforementioned head-mounted smart device).
Specifically, the user identity information is used to identify and distinguish the user. It may be the user's iris image, or a piece of registration information that can uniquely distinguish the wearable smart device when it registers in the system — for example, the device number automatically assigned by the system at registration, or, when the user's wearable smart device is bound to a mobile phone in the system, the identifier of the user's phone (e.g., the phone number) as the user identity information.
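The registration-based identity option described above can be sketched as a lookup against a registration table. The table contents, device identifiers, and masked phone number are all hypothetical.

```python
# Illustrative sketch of registration-based identity: the device's
# registration record (an assigned device number bound to a phone
# identifier) serves as the user identity information.

REGISTRY = {  # hypothetical server-side registration table
    "device-0007": {"phone": "138****0001"},
}

def identity_from_registration(device_id, registry=REGISTRY):
    """Return the identity bound to a registered device, or None."""
    entry = registry.get(device_id)
    return entry["phone"] if entry else None

print(identity_from_registration("device-0007"))  # 138****0001
```

The iris-recognition option would instead derive a unique identifier from the iris image captured by the built-in infrared camera.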
This embodiment determines the user identity by iris recognition: technically, an infrared camera is configured on the head-mounted smart device, and the user's iris image is recorded by this camera, or information that can uniquely identify the user is extracted from the iris image and used as the identity information of the user wearing the device. User identity recognition may also be performed when the head-mounted smart device is put on.
S208: sending the identity information, the target object information, and the behavior information, as a behavior record for the target object, to the server of the user behavior analysis system of mall A for storage.
Of course, collecting user behavior may also be an independent activity of the head-mounted smart device: it can be used to collect the wearer's behavior toward various target objects, so that modeling analysis can be performed on this user's behavior, enabling the study of the user's consumption habits or the realization of automatic shopping or consumption recommendations.
After the behavior record of the user with respect to the target object is stored, the following operations may also be performed, either automatically or upon receiving a preset statistical analysis instruction:
statistically processing the behavior records locally according to a preset algorithm, so as to perform behavior analysis on the target object and/or the user; or
uploading the behavior records to the server for statistics according to a preset algorithm, so as to perform behavior analysis on the target object and/or the user.
The operation of receiving a preset statistical analysis instruction may comprise: performing position locating based on the wearable smart device to obtain position information, and obtaining the preset statistical analysis instruction from the position information; obtaining the instruction through speech recognition based on the wearable smart device; obtaining the instruction through gesture recognition based on the wearable smart device; or obtaining the instruction by receiving the user's hard-button input on the wearable smart device. These methods may be used alone or in combination. For example, the preset statistical analysis instruction may be defined as the user nodding, making an "OK" hand gesture, saying "submit", and/or leaving the mall; upon receiving this instruction, the above statistical analysis operation is started.
The technical scheme of Embodiment Two, taking a head-mounted smart device as an example, identifies the wearer's information, the commodity information of interest, and the user's behavior information toward the commodities through the head-mounted smart device, and can thus collect mall-customer behavior that is richer in information, so as to satisfy the mall's requirement for statistical analysis of customers' shopping behavior.
Embodiment Three
Fig. 3 is a structural block diagram of the user behavior collection device described in Embodiment Three of the present invention. As shown in Fig. 3, the user behavior collection device described in this embodiment comprises:
a target object information identification unit 301, configured to identify target object information based on a wearable smart device;
a behavior information collection unit 302, configured to collect, based on the wearable smart device, the behavior information of the user who wears the wearable smart device with respect to the target object; and
a behavior record storage unit 303, configured to store the target object information and the behavior information as a behavior record for the target object.
The device provided by this embodiment can perform the user behavior collection method provided by Embodiment One of the present invention, and possesses the corresponding functional units for performing the method as well as its beneficial effects.
Embodiment Four
Fig. 4 is a structural block diagram of the user behavior collection device described in Embodiment Four of the present invention. As shown in Fig. 4, the user behavior collection device described in this embodiment comprises:
a target object information identification unit 401, configured to identify target object information based on a wearable smart device;
a behavior information collection unit 402, configured to collect, based on the wearable smart device, the behavior information of the user who wears the wearable smart device with respect to the target object;
an identity information recognition unit 403, configured to identify the user's identity information based on the wearable smart device; and
a behavior record storage unit 404, configured to record the user's identity information, in correspondence with the target object information and the behavior information, as the user's behavior record.
Further, the identity information recognition unit 403 is specifically configured to collect the user's iris image through the built-in camera of the wearable smart device and identify the user's identity information according to the collected iris image, or to identify the user's identity information through the registration information of the wearable smart device.
Further, the target object information identification unit 401 is specifically configured to collect, through the built-in camera of the wearable smart device, the label, QR code, and/or object image of a target object that meets a preset condition as the target object information.
Further, the target object comprises physical commodities, videos, print images, and/or stores.
Further, the wearable smart device is a head-mounted smart device, and the behavior information collection unit 402 is specifically configured to realize at least one of the following:
obtaining the user's attention time on the target object by gaze tracking through the head-mounted smart device;
obtaining the user's dwell time at the target object by position locating through the head-mounted smart device;
obtaining the user's behavior information with respect to the target object by analyzing the user's eyeball or head movements through the head-mounted smart device;
obtaining the user's behavior information with respect to the target object by gesture recognition and/or speech recognition based on the wearable smart device;
obtaining the user's behavior information with respect to the target object by receiving the user's hard-button operations on the head-mounted smart device.
Further, the device also comprises a statistical analysis unit 405, configured to perform the following operations after the behavior record of the user with respect to the target object is stored, either automatically or upon receiving a preset statistical analysis instruction:
statistically processing the behavior records locally according to a preset algorithm, so as to perform behavior analysis on the target object and/or the user; or
uploading the behavior records to the server for statistics according to a preset algorithm, so as to perform behavior analysis on the target object and/or the user;
wherein the operation of receiving a preset statistical analysis instruction specifically comprises at least one of the following:
performing position locating based on the wearable smart device to obtain position information, and obtaining the preset statistical analysis instruction from the position information;
obtaining the preset statistical analysis instruction through speech recognition based on the wearable smart device;
obtaining the preset statistical analysis instruction through gesture recognition based on the wearable smart device;
obtaining the preset statistical analysis instruction by receiving the user's hard-button input on the wearable smart device.
The device provided by this embodiment can perform the user behavior collection method provided by Embodiment Two of the present invention, and possesses the corresponding functional units for performing the method as well as its beneficial effects.
All or part of the content of the technical schemes provided by the above embodiments can be realized by software programming, the software program being stored in a readable storage medium, such as a hard disk, optical disc, or floppy disk in a computer.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to them; without departing from the concept of the present invention, more other equivalent embodiments may be included, and the scope of the present invention is determined by the scope of the appended claims.

Claims (14)

1. A user behavior collection method, characterized by comprising:
identifying target object information based on a wearable smart device;
collecting, based on the wearable smart device, the behavior information of a user who wears the wearable smart device with respect to the target object; and
storing the target object information and the behavior information as a behavior record for the target object.
2. The method according to claim 1, characterized by further comprising:
identifying the user's identity information based on the wearable smart device; and
recording the user's identity information, in correspondence with the target object information and the behavior information, as the user's behavior record.
3. The method according to claim 2, characterized in that identifying the identity information of the user based on the wearable smart device comprises:
capturing an iris image of the user through a built-in camera of the wearable smart device, and identifying the identity information of the user according to the captured iris image; or
identifying the identity information of the user through login information of the wearable smart device.
4. The method according to claim 1 or 2, characterized in that identifying target object information based on the wearable smart device comprises:
capturing, through a built-in camera of the wearable smart device, a label, a two-dimensional code, and/or an object image of a target object that meets a preset condition as the target object information.
5. The method according to claim 1 or 2, characterized in that the target object comprises a physical commodity, a video, a planar image, and/or a retail shop.
6. The method according to claim 1 or 2, characterized in that the wearable smart device is a head-mounted smart device, and collecting, based on the wearable smart device, the behavior information of the user wearing the wearable smart device with respect to the target object comprises at least one of the following:
performing gaze tracking through the head-mounted smart device to obtain an attention time of the user on the target object;
performing position location through the head-mounted smart device to obtain a dwell time of the user at the target object;
analyzing eyeball or head movements of the user through the head-mounted smart device to obtain the behavior information of the user with respect to the target object;
performing gesture recognition and/or speech recognition based on the wearable smart device to obtain the behavior information of the user with respect to the target object; and
receiving a hard-key operation by the user on the head-mounted smart device to obtain the behavior information of the user with respect to the target object.
7. The method according to claim 1 or 2, characterized in that, after the behavior record of the user for the target object is stored, the method further comprises:
performing the following operation automatically, or performing the following operation upon receiving a preset statistical analysis instruction:
performing statistics on the behavior record locally according to a preset algorithm, so as to perform behavior analysis on the target object and/or the user; or
uploading the behavior record to a server, so that statistics are performed according to a preset algorithm to carry out behavior analysis on the target object and/or the user;
wherein the operation of receiving the preset statistical analysis instruction comprises at least one of the following:
performing position location based on the wearable smart device to obtain position information, and obtaining the preset statistical analysis instruction from the position information;
performing speech recognition based on the wearable smart device to obtain the preset statistical analysis instruction;
performing gesture recognition based on the wearable smart device to obtain the preset statistical analysis instruction; and
receiving the preset statistical analysis instruction through a hard-key input by the user on the wearable smart device.
8. A user behavior collection apparatus, characterized by comprising:
a target object information identification unit, configured to identify target object information based on a wearable smart device;
a behavior information collection unit, configured to collect, based on the wearable smart device, behavior information of a user wearing the wearable smart device with respect to the target object; and
a behavior record storage unit, configured to store the target object information and the behavior information as a behavior record for the target object.
9. The apparatus according to claim 8, characterized by further comprising an identity information identification unit, configured to identify identity information of the user based on the wearable smart device;
wherein the behavior record storage unit is further configured to record the identity information of the user in correspondence with the target object information and the behavior information as a behavior record for the user.
10. The apparatus according to claim 9, characterized in that the identity information identification unit is specifically configured to:
capture an iris image of the user through a built-in camera of the wearable smart device, and identify the identity information of the user according to the captured iris image; or
identify the identity information of the user through login information of the wearable smart device.
11. The apparatus according to claim 8 or 9, characterized in that the target object information identification unit is specifically configured to:
capture, through a built-in camera of the wearable smart device, a label, a two-dimensional code, and/or an object image of a target object that meets a preset condition as the target object information.
12. The apparatus according to claim 8 or 9, characterized in that the target object comprises a physical commodity, a video, a planar image, and/or a retail shop.
13. The apparatus according to claim 8 or 9, characterized in that the wearable smart device is a head-mounted smart device, and the behavior information collection unit is specifically configured to implement at least one of the following:
performing gaze tracking through the head-mounted smart device to obtain an attention time of the user on the target object;
performing position location through the head-mounted smart device to obtain a dwell time of the user at the target object;
analyzing eyeball or head movements of the user through the head-mounted smart device to obtain the behavior information of the user with respect to the target object;
performing gesture recognition and/or speech recognition based on the wearable smart device to obtain the behavior information of the user with respect to the target object; and
receiving a hard-key operation by the user on the head-mounted smart device to obtain the behavior information of the user with respect to the target object.
14. The apparatus according to claim 8 or 9, characterized by further comprising a statistical analysis unit, configured to:
after the behavior record of the user for the target object is stored,
perform the following operation automatically, or perform the following operation upon receiving a preset statistical analysis instruction:
performing statistics on the behavior record locally according to a preset algorithm, so as to perform behavior analysis on the target object and/or the user; or
uploading the behavior record to a server, so that statistics are performed according to a preset algorithm to carry out behavior analysis on the target object and/or the user;
wherein the operation of receiving the preset statistical analysis instruction comprises at least one of the following:
performing position location based on the wearable smart device to obtain position information, and obtaining the preset statistical analysis instruction from the position information;
performing speech recognition based on the wearable smart device to obtain the preset statistical analysis instruction;
performing gesture recognition based on the wearable smart device to obtain the preset statistical analysis instruction; and
receiving the preset statistical analysis instruction through a hard-key input by the user on the wearable smart device.
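The statistics step of claims 7 and 14 — a preset statistical analysis instruction arriving via position, speech, gesture, or a hard key, after which the behavior records are tallied locally by a preset algorithm or uploaded to a server — can be sketched as below. This is a hypothetical illustration only; the instruction-source names, the stand-in "preset algorithm" (a per-object tally), and the `upload` hook are assumptions, not part of the patent.

```python
from collections import Counter
from typing import Callable, Iterable, Optional

# Sources through which the preset statistical analysis instruction
# may be received, per claims 7 and 14 (names are hypothetical).
INSTRUCTION_SOURCES = {"position", "speech", "gesture", "hard_key"}

def analyze_locally(records: Iterable[dict]) -> Counter:
    # Stand-in for the "preset algorithm": count behavior records
    # per target object so the objects can be compared.
    return Counter(r["object_info"] for r in records)

def handle_instruction(source: str, records: list,
                       upload: Optional[Callable] = None):
    if source not in INSTRUCTION_SOURCES:
        raise ValueError(f"unknown instruction source: {source}")
    if upload is not None:
        # Server-side branch: hand the records off for remote statistics.
        return upload(records)
    # Local branch: perform statistics on the device itself.
    return analyze_locally(records)

records = [
    {"object_info": "shop-A"},
    {"object_info": "shop-A"},
    {"object_info": "video-3"},
]
stats = handle_instruction("gesture", records)
```

Passing an `upload` callable selects the server branch; omitting it keeps the analysis on-device, matching the "or" between the two operations in the claims.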
CN201410344319.6A 2014-07-18 2014-07-18 User behavior collection method and device Pending CN104133920A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410344319.6A CN104133920A (en) 2014-07-18 2014-07-18 User behavior collection method and device

Publications (1)

Publication Number Publication Date
CN104133920A true CN104133920A (en) 2014-11-05

Family

ID=51806598

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410344319.6A Pending CN104133920A (en) 2014-07-18 2014-07-18 User behavior collection method and device

Country Status (1)

Country Link
CN (1) CN104133920A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090973A (en) * 2014-07-18 2014-10-08 百度在线网络技术(北京)有限公司 Information presentation method and device
CN104318752A (en) * 2014-11-14 2015-01-28 陈尚卫 Method and system for behavior data acquisition based on high-frequency acoustic waves
CN106056405A (en) * 2016-05-27 2016-10-26 上海青研科技有限公司 Advertisement directional-pushing technology based on virtual reality visual interest area
CN106296160A (en) * 2015-05-12 2017-01-04 广州杰赛科技股份有限公司 The acquisition of information of a kind of position association and authentication method
CN106296204A (en) * 2015-05-13 2017-01-04 广州杰赛科技股份有限公司 A kind of convenient shopping and method of mobile payment
CN106647291A (en) * 2015-10-30 2017-05-10 霍尼韦尔国际公司 Wearable control device, control system and method for controlling controlled electric appliance
CN107025690A (en) * 2015-10-27 2017-08-08 Sk普兰尼特有限公司 For the method and apparatus for the information for building the position on shown commodity
CN107180363A (en) * 2017-05-15 2017-09-19 泰康保险集团股份有限公司 Data capture method and device
CN108334185A (en) * 2017-01-20 2018-07-27 深圳纬目信息技术有限公司 A kind of eye movement data reponse system for wearing display equipment
CN108769266A (en) * 2018-07-12 2018-11-06 谭飞伍 A kind of information-pushing method and system
JP2018195284A (en) * 2017-05-12 2018-12-06 富士ゼロックス株式会社 Program, method, device, and system for managing plural work table
WO2018218860A1 (en) * 2017-05-31 2018-12-06 深圳正品创想科技有限公司 Commodity recommendation method and device
CN109271824A (en) * 2018-08-31 2019-01-25 出门问问信息科技有限公司 A kind of method and device identifying image in 2 D code
CN110888578A (en) * 2018-09-10 2020-03-17 宝沃汽车(中国)有限公司 Vehicle function optimization method and device and vehicle with same
CN113052197A (en) * 2019-12-28 2021-06-29 中移(成都)信息通信科技有限公司 Method, apparatus, device and medium for identity recognition
CN113807894A (en) * 2021-09-18 2021-12-17 陕西师范大学 Advertisement putting method, system and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002244608A (en) * 2001-02-22 2002-08-30 Hitachi Ltd Advertisement system using window glass
US20130054576A1 (en) * 2011-08-23 2013-02-28 Buckyball Mobile, Inc. Identifying digital content using bioresponse data
CN103561635A (en) * 2011-05-11 2014-02-05 谷歌公司 Gaze tracking system
US20140100955A1 (en) * 2012-10-05 2014-04-10 Microsoft Corporation Data and user interaction based on device proximity
CN103810254A (en) * 2014-01-22 2014-05-21 浙江大学 User behavior real-time analyzing method based on cloud terminal
CN103927350A (en) * 2014-04-04 2014-07-16 百度在线网络技术(北京)有限公司 Smart glasses based prompting method and device

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090973B (en) * 2014-07-18 2018-09-07 百度在线网络技术(北京)有限公司 A kind of information demonstrating method and device
CN104090973A (en) * 2014-07-18 2014-10-08 百度在线网络技术(北京)有限公司 Information presentation method and device
CN104318752A (en) * 2014-11-14 2015-01-28 陈尚卫 Method and system for behavior data acquisition based on high-frequency acoustic waves
CN106296160A (en) * 2015-05-12 2017-01-04 广州杰赛科技股份有限公司 The acquisition of information of a kind of position association and authentication method
CN106296204A (en) * 2015-05-13 2017-01-04 广州杰赛科技股份有限公司 A kind of convenient shopping and method of mobile payment
CN107025690A (en) * 2015-10-27 2017-08-08 Sk普兰尼特有限公司 For the method and apparatus for the information for building the position on shown commodity
CN106647291A (en) * 2015-10-30 2017-05-10 霍尼韦尔国际公司 Wearable control device, control system and method for controlling controlled electric appliance
CN106056405A (en) * 2016-05-27 2016-10-26 上海青研科技有限公司 Advertisement directional-pushing technology based on virtual reality visual interest area
CN108334185A (en) * 2017-01-20 2018-07-27 深圳纬目信息技术有限公司 A kind of eye movement data reponse system for wearing display equipment
JP2018195284A (en) * 2017-05-12 2018-12-06 富士ゼロックス株式会社 Program, method, device, and system for managing plural work table
JP7167421B2 (en) 2017-05-12 2022-11-09 富士フイルムビジネスイノベーション株式会社 Program, method, device and system for managing multiple workbenches
CN107180363B (en) * 2017-05-15 2021-01-26 泰康保险集团股份有限公司 Data acquisition method and device
CN107180363A (en) * 2017-05-15 2017-09-19 泰康保险集团股份有限公司 Data capture method and device
WO2018218860A1 (en) * 2017-05-31 2018-12-06 深圳正品创想科技有限公司 Commodity recommendation method and device
CN108769266A (en) * 2018-07-12 2018-11-06 谭飞伍 A kind of information-pushing method and system
CN109271824A (en) * 2018-08-31 2019-01-25 出门问问信息科技有限公司 A kind of method and device identifying image in 2 D code
CN109271824B (en) * 2018-08-31 2021-07-09 出门问问信息科技有限公司 Method and device for identifying two-dimensional code image
CN110888578A (en) * 2018-09-10 2020-03-17 宝沃汽车(中国)有限公司 Vehicle function optimization method and device and vehicle with same
CN113052197A (en) * 2019-12-28 2021-06-29 中移(成都)信息通信科技有限公司 Method, apparatus, device and medium for identity recognition
CN113052197B (en) * 2019-12-28 2024-03-12 中移(成都)信息通信科技有限公司 Method, device, equipment and medium for identity recognition
CN113807894A (en) * 2021-09-18 2021-12-17 陕西师范大学 Advertisement putting method, system and device

Similar Documents

Publication Publication Date Title
CN104133920A (en) User behavior collection method and device
CN104090973B (en) A kind of information demonstrating method and device
US10609267B2 (en) Systems and methods for analyzing advertisement effectiveness using wearable camera systems
US9952427B2 (en) Measurement method and system
CN104103024A (en) User evaluation information acquisition method and device
US10902498B2 (en) Providing content based on abandonment of an item in a physical shopping cart
US10991004B2 (en) Utilizing population density to facilitate providing offers
US10839464B2 (en) System and method for managing interaction between commercial and social users
US20180033045A1 (en) Method and system for personalized advertising
KR20100114860A (en) Touchpoint customization system
JP2015133033A (en) Recommendation device, recommendation method and program
US20220005081A1 (en) Marketplace For Advertisement Space Using Gaze-Data Valuation
CN105046630A (en) image tag add system
US20220253907A1 (en) System and method for identifying tailored advertisements based on detected features in a mixed reality environment
JP2019204431A (en) Computer program and method
US12033190B2 (en) System and method for content recognition and data categorization
CN108595651A (en) Customized information display methods, device and user terminal based on recognition of face
CN105005982A (en) Image processing including object selection
US20220114616A1 (en) Digital anthropology and ethnography system
CN105022773A (en) Image processing system including image priority
CN105183739A (en) Image Processing Server
KR102340904B1 (en) System and method for virtual fitting based on augument reality
CN105955451A (en) Travel prompt method based on user image
US10133931B2 (en) Alert notification based on field of view
CN103440307A (en) Method and device for providing media information

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20141105

RJ01 Rejection of invention patent application after publication