WO2016000163A1 - Method and device for processing user data - Google Patents
Method and device for processing user data
- Publication number
- WO2016000163A1 (PCT/CN2014/081247)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- user
- user behavior
- smart
- behavior
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/206—Drawing of charts or graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/70—Services for machine-to-machine communication [M2M] or machine type communication [MTC]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4806—Sleep evaluation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
Definitions
- The embodiments of the present invention relate to the field of communications technologies, and in particular, to a method and a device for processing user data.
Background Art
- In the prior art, user behavior is detected by sensors in non-wearable smart devices (such as smartphones and tablet computers), and the resulting data is displayed on those non-wearable smart devices.
- Wearable smart devices (such as smart bracelets, smart watches, and smart rings) likewise carry sensors that detect user behavior and acquire data such as step count, sleep, heart rate, and blood pressure.
- The embodiments of the invention provide a method and a device for processing user data, so that the user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.
- An embodiment of the present invention provides a method for processing user data, including: acquiring, by a first device, first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting user behavior, and the second data is data acquired by the at least one second device by detecting the user behavior;
- The preset rule includes: dividing the detection time of the user behavior, from occurrence to stop, into multiple time segments according to a preset time length; for each time segment, selecting the user data corresponding to that time segment from the first data and the at least one second data according to a selection rule; and summing the user data corresponding to all time segments as the user data of the user behavior.
- the method further includes:
- The presenting of the user behavior and/or the user data includes displaying the user behavior and/or the user data in coordinate form, where the coordinates include a time axis. Specifically: when it is determined, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection period, the center point of the area is calculated, and the user behavior is displayed at that center point on the time axis of the first device's display screen, at the position corresponding to the detection period.
- the method further includes:
- The preset rule includes selecting, according to the state of the user behavior, whichever of the first data or the second data has the higher priority as the user data corresponding to the user behavior.
- an embodiment of the present invention provides a first device, including:
- an acquiring module configured to acquire first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting user behavior, and the second data is data acquired by the at least one second device by detecting the user behavior;
- a processing module configured to determine, according to a preset rule, the user data corresponding to the user behavior based on the first data and the second data;
- a presentation module for presenting the user behavior and/or the user data.
- The preset rule includes: dividing the detection time of the user behavior, from occurrence to stop, into a plurality of time segments according to a preset time length; for each time segment, selecting one of the first data and the at least one second data according to the selection rule as the user data corresponding to that time segment; and summing the user data corresponding to all time segments as the user data of the user behavior.
- The preset rule further includes: when it is detected, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection period, calculating the center point of the area;
- the presentation module is further configured to display the user behavior at the center point, on the time axis in the display screen of the first device, at the position corresponding to the detection period.
- the device further includes:
- a determining module configured to determine the state of the user behavior according to the first data and the at least one second data;
- the preset rule includes selecting the first data or the second data having a high priority as the user data corresponding to the user behavior according to the state of the user behavior.
- In the embodiments, the first device acquires the first data it obtains by detecting the user behavior and the at least one piece of second data obtained by the second device by detecting the user behavior, then determines the user data corresponding to the user behavior according to the preset rule, and presents the user behavior and/or the user data on the first device, so that the user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.
- FIG. 1 is a flowchart of Embodiment 1 of a method for processing user data according to the present invention;
- FIG. 2 is a flowchart of Embodiment 2 of a method for processing user data according to the present invention;
- FIG. 3 is a flowchart of Embodiment 3 of a method for processing user data according to the present invention;
- FIG. 4 is a schematic structural diagram of Embodiment 1 of a first device of the present invention;
- FIG. 5 is a schematic structural diagram of Embodiment 2 of a first device of the present invention.
Detailed Description
- FIG. 1 is a flowchart of Embodiment 1 of a method for processing user data according to the present invention. As shown in FIG. 1, the method in this embodiment may include:
- Step 101: The first device acquires first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting user behavior, and the second data is data acquired by at least one second device by detecting the user behavior.
- The first device may be a non-wearable smart device or a wearable smart device;
- the second device is a wearable smart device.
- a non-wearable smart device can be a smartphone, a tablet computer, or the like.
- Wearable smart devices can be smart glasses, smart watches, smart rings, and more.
- The first device acquires the first data and the at least one piece of second data. Specifically, the first device obtains the data it acquires by detecting the user behavior itself and the data acquired by the second device by detecting the user behavior; that is, the first data is acquired by the first device itself, and the second data is acquired by the second device itself.
- Step 102: The first device determines, according to a preset rule, the user data corresponding to the user behavior based on the first data and the second data, and presents the user behavior and/or the user data.
- That is, the first device determines the user data corresponding to the user behavior from the acquired data according to the preset rule, and presents the user behavior and/or the user data on the first device.
- The user data is the data determined, according to the preset rule, from the first data and the second data acquired above.
- the presentation is not limited to vision, but may also include hearing, touch, taste, and the like.
- In this embodiment, the first device obtains the first data by detecting the user behavior itself and obtains the at least one piece of second data acquired by the second device by detecting the user behavior, then determines the user data corresponding to the user behavior according to a preset rule, and presents the user behavior and/or the user data on the first device. This enables the user to immediately see the most accurate and most needed data in different scenarios, improving the user experience.
- The preset rule described in the foregoing embodiment may include: dividing the detection time of the user behavior, from occurrence to stop, into at least one time segment according to a preset time length, and for each time segment, selecting the user data corresponding to that time segment from the first data and the at least one second data according to a selection rule. The details are described below in conjunction with the embodiment of FIG. 2.
- FIG. 2 is a flowchart of Embodiment 2 of a method for processing user data according to the present invention. As shown in FIG. 2, the method in this embodiment may include:
- Step 201: The first device acquires the first data and the at least one piece of second data, where the first data is the step-count data obtained by the first device itself by detecting the user's running behavior, and the second data is the step-count data obtained by the at least one second device by detecting the user's running behavior.
- In this embodiment, the first device is exemplified by a smart phone, the second device is exemplified by a smart bracelet and smart shoes, and running is taken as the user behavior. The first data is the data obtained by the smart phone itself by detecting the user's running behavior, and the second data is the data obtained by the smart bracelet and/or the smart shoes by detecting the user's running behavior.
- Step 202: The first device determines, according to the preset rule, the step-count data corresponding to the user's running based on the first data and the second data, and presents the user data.
- If the smart phone has step data while the smart bracelet and the smart shoes have none, the data acquired by the smart phone is selected; if the smart phone has no step data, the smart bracelet has step data, and the smart shoes have none, the data obtained by the smart bracelet is selected; if only the smart shoes have step data, the data obtained by the smart shoes is selected. That is, whichever party has the step data is selected as the user data.
- The data recording points are then refined: in this embodiment, the data is divided into one point every 5 minutes; taking 1 hour as an example, it is divided into 12 time segments.
- Within each time segment, the step data with the larger amount of motion is selected as the step-count data.
- For example, in the first 5-minute segment, the step count acquired by the smartphone is 150 steps, the step count acquired by the smart bracelet is 158 steps, and the step count obtained by the smart shoes is 160 steps; the smart shoes' reading of 160 steps is therefore selected as the step count for the first 5-minute segment. The step counts for the remaining time segments are selected in the same way.
- The 5-minute length is not absolute but relative; the specific length can be determined by the capabilities of the smart device and is not limited here.
- The data corresponding to the respective time segments are then summed to obtain the final user data for the detection time from the occurrence of the user's running behavior to its stop, which is presented on the smart phone.
- In this embodiment, the smart phone obtains the step data it acquires by detecting the user's running behavior and the step data obtained by the smart bracelet and the smart shoes by detecting that behavior; the data recording points are then refined, the user data with the larger amount of motion is selected in each time segment, and finally the user data corresponding to all time segments are summed to obtain the final user data for the detection time of the user behavior from occurrence to stop. In this way, the user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.
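The per-segment selection and summation described in this embodiment can be sketched as follows. This is a minimal illustration, not the patent's implementation; the device names and the pick-the-larger-reading rule follow the running example above:

```python
def total_steps(segments):
    """Sum per-segment step counts, picking one source per segment.

    `segments` is a list of dicts mapping device name -> step count for
    one 5-minute segment (devices with no data are simply absent).
    Per the selection rule, the reading with the larger amount of
    motion wins; if only one device reported, that reading is used.
    """
    total = 0
    for readings in segments:
        if readings:  # skip segments where no device reported data
            total += max(readings.values())
    return total

# One hour split into 12 five-minute segments; the first segment matches
# the example above: phone 150, bracelet 158, shoes 160 -> 160 is selected.
hour = [{"phone": 150, "bracelet": 158, "shoes": 160}] + \
       [{"phone": 140, "shoes": 155}] * 11
print(total_steps(hour))  # prints 1865 (160 + 11 * 155)
```

If a device drops out mid-run (e.g. the bracelet loses its connection), its key is simply absent from later segments and the remaining devices still contribute.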
- The method embodiment can also display, on the corresponding time axis, the location of the user during a certain period according to the user data obtained by the terminal device.
- the first data may be the latitude and longitude obtained by the first device (for example, a smart phone), and the second data is the latitude and longitude obtained by the second device (for example, a smart watch).
- This embodiment takes one latitude and longitude point every 30 seconds as an example.
- the specific time interval can be configured according to actual conditions, and is not limited herein.
- When the smart watch has not acquired latitude and longitude data, the latitude and longitude data acquired by the smart phone is used; when the smart phone has not acquired latitude and longitude data but the smart watch has, the latitude and longitude data acquired by the smart watch is used.
- At points where the two overlap in time, the latitude and longitude data obtained by the smart watch is used, because the smart watch's latitude and longitude data comes from GPS, whereas the smart phone's may come from GPS, a base station, or WiFi, and the data provided by base stations and WiFi is less accurate and may be biased.
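The source-selection rule for location points can be sketched as follows. This is an illustrative fragment; the function name and tuple format are assumptions, but the preference for the watch's GPS-derived fix over the phone's possibly base-station- or WiFi-derived fix follows the text above:

```python
def pick_location(phone_fix, watch_fix):
    """Choose the latitude/longitude sample for one 30-second point.

    The smart watch's fix comes from GPS, while the phone's may come
    from GPS, a base station, or WiFi (less accurate), so when both
    devices report a fix for the same point the watch's is preferred.
    Either argument may be None when that device has no fix.
    """
    if watch_fix is not None:
        return watch_fix
    return phone_fix  # may itself be None if neither device has a fix

# Phone-only fix: fall back to the phone's data.
print(pick_location((39.90, 116.40), None))
# Both devices report: the watch's GPS fix wins.
print(pick_location((39.90, 116.40), (39.91, 116.41)))
```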
- The presenting of the user behavior and/or the user data includes displaying them in coordinate form, where the coordinates include a time axis. Specifically: when it is determined, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection period, the center point of the area is calculated, and the user behavior is displayed at that center point on the time axis of the first device's display screen, at the position corresponding to the detection period.
- The smart phone plots the user's motion trajectory and aggregates the range of the user's active area during the period. When it detects that the motion track of the user behavior stays within the same area during a detection period, it calculates the center point of the area and displays, on the time axis in its display screen, the user behavior occurring at that center point during the detection period.
- In this embodiment, the latitude and longitude data obtained by a non-wearable smart device such as a smart phone by detecting the user behavior and the latitude and longitude data acquired by a wearable smart device such as a smart watch by detecting the user behavior are combined. The acquired latitude and longitude points are then plotted to obtain the user's motion trajectory, and the range of the user's active area during the period is aggregated. When it is detected that the motion track of the user behavior stays within the same area during a detection period, the center point of that area is calculated, and the user behavior occurring at that center point during the detection period is displayed on the time axis in the smart phone's display screen. This enables the user to immediately see the most accurate and most needed data in different scenarios, improving the user experience.
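The same-area check and center-point calculation can be sketched as follows. The centroid-plus-radius test is one plausible reading of "the motion track is in the same area"; the patent does not specify the geometry, so the radius parameter and the flat-plane distance approximation are assumptions:

```python
import math

def cluster_center(track, radius):
    """If every point of `track` (a list of (lat, lon) tuples) stays
    within `radius` of the track's centroid, the user's movement is
    treated as happening in one area and the centroid is returned as
    the point to plot on the timeline; otherwise None is returned
    (the behavior did not stay in a single area). Distances use a
    flat approximation, which is adequate for small areas."""
    n = len(track)
    cx = sum(p[0] for p in track) / n
    cy = sum(p[1] for p in track) / n
    if all(math.hypot(p[0] - cx, p[1] - cy) <= radius for p in track):
        return (cx, cy)
    return None

# Four samples clustered around one spot: same area -> center returned.
print(cluster_center([(0, 0), (0, 0.001), (0.001, 0), (0.001, 0.001)], 0.01))
# Two samples far apart: not one area -> None.
print(cluster_center([(0, 0), (1, 1)], 0.01))
```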
- The preset rule described in Embodiment 1 may further include selecting, according to the state of the user behavior, whichever of the first data or the second data has the higher priority as the user data corresponding to the user behavior. The details are described below in conjunction with the embodiment of FIG. 3.
- FIG. 3 is a flowchart of Embodiment 3 of a method for processing user data according to the present invention. As shown in FIG. 3, the method in this embodiment may include:
- Step 301: The first device acquires first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting user behavior, and the second data is data acquired by at least one second device by detecting the user behavior.
- the first device is exemplified by a smart phone, smart glasses, a smart watch
- the second device is exemplified by smart shoes, a smart bracelet, and a smart ring.
- The first data is data acquired by the first device itself by detecting user behavior, that is, data acquired by the smart phone, smart glasses, and smart watch by detecting user behavior, for example, data such as step count, heart rate, and blood pressure.
- The second data is data obtained by the at least one second device by detecting the user behavior, that is, data acquired by the smart shoes, the smart bracelet, and the smart ring by detecting the user behavior, for example, data such as step count, heart rate, and blood pressure.
- The smart shoes, smart bracelet, and smart ring periodically broadcast ADV_IND messages. After receiving an ADV_IND message, the smart phone, smart glasses, and smart watch send SCAN_REQ messages to scan nearby Bluetooth devices.
- After receiving the SCAN_REQ message, the smart shoes, the smart bracelet, and the smart ring respond with a SCAN_RSP message. The SCAN_RSP message carries the device identification number (ID), the device's Bluetooth address, and the like. After receiving the SCAN_RSP message, the smart phone, smart glasses, and smart watch establish a connection with the corresponding device according to each device's Bluetooth address and acquire the capabilities of the smart shoes, smart bracelet, and smart ring, such as the service information supported by each device.
- In this way, smart devices such as the smart phone, smart glasses, and smart watch obtain both the data they acquire by detecting the user's behavior themselves and the data acquired by smart devices such as the smart shoes, smart bracelet, and smart ring by detecting the user's behavior.
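The discovery exchange described above can be sketched as a toy simulation. The PDU names ADV_IND, SCAN_REQ, and SCAN_RSP are standard Bluetooth LE advertising messages, but the classes and payload fields here are illustrative, not a real Bluetooth stack:

```python
class Peripheral:               # e.g. smart shoes, bracelet, ring
    def __init__(self, device_id, bt_addr, services):
        self.device_id, self.bt_addr, self.services = device_id, bt_addr, services

    def advertise(self):
        # Periodic ADV_IND broadcast announcing the device's address.
        return {"type": "ADV_IND", "addr": self.bt_addr}

    def on_scan_req(self, msg):
        assert msg["type"] == "SCAN_REQ"
        # SCAN_RSP carries the device ID, Bluetooth address, and services.
        return {"type": "SCAN_RSP", "id": self.device_id,
                "addr": self.bt_addr, "services": self.services}

class Central:                  # e.g. smart phone, glasses, watch
    def __init__(self):
        self.known = {}

    def discover(self, peripheral):
        adv = peripheral.advertise()
        rsp = peripheral.on_scan_req({"type": "SCAN_REQ", "addr": adv["addr"]})
        # Connect using the address from SCAN_RSP and record capabilities.
        self.known[rsp["addr"]] = {"id": rsp["id"], "services": rsp["services"]}
        return rsp["addr"]

phone = Central()
addr = phone.discover(Peripheral("shoe-01", "AA:BB", ["step_count"]))
print(phone.known[addr])  # {'id': 'shoe-01', 'services': ['step_count']}
```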
- Step 302: The first device determines the state of the user behavior according to the first data and the at least one second data.
- Motion sensors (acceleration sensors, gravity sensors, gyroscopes, etc.) in the smart phone, smart glasses, and smart watch are used to identify the user's state, such as motion, rest, or sleep.
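As a sketch, state identification from motion-sensor data might look like the following; the thresholds and the night-time heuristic are invented for illustration and are not values given in the patent:

```python
def classify_state(accel_samples, hour):
    """Toy classifier for the user-behavior state (motion / rest /
    sleep) from a window of accelerometer magnitudes (in g, gravity
    removed) plus the hour of day. Thresholds are illustrative
    assumptions, not values from the patent."""
    avg = sum(accel_samples) / len(accel_samples)
    if avg > 1.5:                      # sustained acceleration -> motion
        return "motion"
    if avg < 0.1 and (hour >= 22 or hour < 7):
        return "sleep"                 # nearly still at night -> sleep
    return "rest"

print(classify_state([2.0, 1.8, 2.2], hour=15))   # motion
print(classify_state([0.05, 0.04, 0.06], hour=23))  # sleep
```

A real implementation would typically fuse several sensors and use learned models rather than fixed thresholds; this only shows where the state feeds into Step 303.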
- Step 303: The first device selects, according to the state of the user behavior, whichever of the first data or the second data has the higher priority as the user data corresponding to the user behavior, and presents it.
- A priority policy for the user data is configured in advance. When the smart phone, smart glasses, and smart watch are all carried by the user, the priority policy configured on the smart phone prevails; failing that, the policy configured on the smart glasses prevails, and then the policy configured on the smart watch. In this embodiment, only step-count data, heart rate data, blood pressure data, and sleep quality data are taken as examples.
- For step-count data, the data obtained from the smart shoes or smart foot ring is preferred; if the sensor data of the smart shoes or smart foot ring cannot be obtained, the data obtained by the smart bracelet or smart watch prevails, and next the data obtained by the smart ring or smart glasses. That is, the priority order is: smart shoes or smart foot ring > smart bracelet or smart watch > smart ring or smart glasses.
- For heart rate or blood pressure data, the data obtained from the smart bracelet or smart watch is preferred; if the sensor data of the smart bracelet or smart watch cannot be obtained, the heart rate or blood pressure data obtained by the smart ring prevails, followed by that obtained from the smart foot ring or smart shoes. That is, the priority order is: smart bracelet or smart watch > smart ring > smart foot ring or smart shoes.
- Sleep quality is generally derived from data such as dream activity, pulse, and body-movement recordings.
- For sleep quality data, the data obtained from the smart bracelet or smart watch is preferred; if the sensor data of the smart bracelet or smart watch is not available, the sleep quality data obtained by the smart ring prevails, followed by that obtained by the smart foot ring. That is, the priority order is: smart bracelet or smart watch > smart ring > smart foot ring.
- By default, step-count data is taken first from the smart shoes or smart foot ring, then from the smart bracelet or smart watch, and then from the smart ring or smart glasses. If the user instead prefers the data obtained by the smart ring or smart glasses, then that obtained by the smart shoes or smart foot ring, and then that obtained by the smart bracelet or smart watch, the priority policy can be configured according to the user's own personalized requirements; the priority order then becomes: smart ring or smart glasses > smart shoes or smart foot ring > smart bracelet or smart watch.
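The priority policy above can be sketched as an ordered list of device groups per data type, where the highest-priority group that actually reported data wins. This is a minimal illustration; the device-group names and dictionary layout are assumptions, while the orderings follow the defaults in this embodiment:

```python
# Per data type, device groups in descending priority order; the
# orderings mirror the defaults described in this embodiment.
DEFAULT_PRIORITY = {
    "steps":      [["shoes", "foot_ring"], ["bracelet", "watch"], ["ring", "glasses"]],
    "heart_rate": [["bracelet", "watch"], ["ring"], ["foot_ring", "shoes"]],
    "sleep":      [["bracelet", "watch"], ["ring"], ["foot_ring"]],
}

def select(data_type, available, policy=DEFAULT_PRIORITY):
    """`available` maps device -> reading; return the reading from the
    highest-priority device group that reported data, or None if no
    listed device has data."""
    for group in policy[data_type]:
        for device in group:
            if device in available:
                return available[device]
    return None

# Smart-shoe sensor data unavailable -> fall back to the watch's steps.
print(select("steps", {"watch": 158, "ring": 150}))  # 158
# Bracelet/watch unavailable -> the ring's heart rate prevails.
print(select("heart_rate", {"shoes": 70, "ring": 72}))  # 72
```

A user-configured policy simply replaces the ordered list for a data type, e.g. `{"steps": [["ring", "glasses"], ["shoes", "foot_ring"], ["bracelet", "watch"]]}` for the personalized order described above.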
- The corresponding user data selected by the above priority policy is presented on the smart phone, smart glasses, smart watch, smart shoes, smart bracelet, and smart ring.
- The presentation method can be visual, auditory, tactile, gustatory, or otherwise.
- For example, the corresponding user data is displayed on the smart phone, smart glasses, and smart watch, played as sound through the smart phone, and used to remind the user by vibration through the smart shoes, smart bracelet, and smart ring.
- In this embodiment, smart devices such as the smart phone, smart glasses, and smart watch obtain both the data they acquire by detecting the user behavior themselves and the data acquired by devices such as the smart shoes, smart bracelet, and smart ring by detecting the user behavior; the state of the user behavior is identified based on the acquired data, and the user data with the higher priority is selected and presented on the smart phone, smart glasses, smart watch, smart shoes, smart bracelet, and smart ring. This enables the user to immediately see the most accurate and most needed data in different scenarios and improves the user experience.
- As shown in FIG. 4, the first device 01 of this embodiment may include an acquiring module 11, a processing module 12, and a presentation module 13, where the acquiring module 11 is configured to acquire first data and at least one piece of second data, the first data being data acquired by the first device itself by detecting the user behavior and the second data being data acquired by the at least one second device by detecting the user behavior.
- The processing module 12 is configured to determine, according to the preset rule, the user data corresponding to the user behavior based on the first data and the second data, and the presentation module 13 is configured to present the user behavior and/or the user data.
- The preset rule may include: dividing the detection time of the user behavior, from occurrence to stop, into multiple time segments according to a preset time length; for each time segment, selecting the user data corresponding to that time segment from the first data and the at least one second data according to the selection rule; and summing the user data corresponding to all time segments as the user data of the user behavior.
- The preset rule may further include: when it is determined, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection period, calculating the center point of the area;
- the presentation module 13 is further configured to display the user behavior at the center point, on the time axis in the first device's display screen, at the position corresponding to the detection period.
- the first device in this embodiment may be used to perform the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
- FIG. 5 is a schematic structural diagram of Embodiment 2 of the first device of the present invention.
- On the basis of the device structure shown in FIG. 4, the first device 01 of the present embodiment may further include a determining module 14.
- the determining module 14 is configured to determine the state of the user behavior according to the first data and the at least one second data;
- the preset rule may include, according to a state of the user behavior, selecting the first data or the second data having a high priority as user data corresponding to the user behavior.
- the first device in this embodiment may be used to perform the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
- the disclosed apparatus and method may be implemented in other manners.
- The device embodiments described above are merely illustrative. The division of units is only a division by logical function; in actual implementation there may be another division manner, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be electrical, mechanical or otherwise.
- the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
- the above-described integrated unit implemented in the form of a software functional unit can be stored in a computer readable storage medium.
- the software functional unit is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform the methods of the various embodiments of the present invention.
- the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Cardiology (AREA)
- General Engineering & Computer Science (AREA)
- Vascular Medicine (AREA)
- Human Computer Interaction (AREA)
- Physiology (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2014/081247 WO2016000163A1 (zh) | 2014-06-30 | 2014-06-30 | User data processing method and device |
CN201480008208.6A CN105519074B (zh) | 2014-06-30 | 2014-06-30 | User data processing method and device |
EP14896735.9A EP3145156A4 (en) | 2014-06-30 | 2014-06-30 | User data processing method and device |
JP2016575952A JP6380961B2 (ja) | 2014-06-30 | 2014-06-30 | User data processing method and device |
US15/391,083 US20170213367A1 (en) | 2014-06-30 | 2016-12-27 | User data processing method, and device for displaying data acquired from a wearable device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2014/081247 WO2016000163A1 (zh) | 2014-06-30 | 2014-06-30 | User data processing method and device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/391,083 Continuation US20170213367A1 (en) | 2014-06-30 | 2016-12-27 | User data processing method, and device for displaying data acquired from a wearable device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016000163A1 true WO2016000163A1 (zh) | 2016-01-07 |
Family
ID=55018249
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/081247 WO2016000163A1 (zh) | 2014-06-30 | 2014-06-30 | User data processing method and device |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170213367A1 (zh) |
EP (1) | EP3145156A4 (zh) |
JP (1) | JP6380961B2 (zh) |
CN (1) | CN105519074B (zh) |
WO (1) | WO2016000163A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10351777B2 (en) | 2013-03-15 | 2019-07-16 | All Power Labs, Inc. | Simultaneous pyrolysis and communition for fuel flexible gasification and pyrolysis |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107811624A (zh) * | 2017-12-12 | 2018-03-20 | 深圳金康特智能科技有限公司 | A user information collection *** based on dual smart wearable devices |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070049814A1 (en) * | 2005-08-24 | 2007-03-01 | Muccio Philip E | System and device for neuromuscular stimulation |
CN202282004U (zh) * | 2011-06-02 | 2012-06-20 | 上海巨浪信息科技有限公司 | Mobile health management *** based on context awareness and activity analysis |
CN103198615A (zh) * | 2013-03-21 | 2013-07-10 | 浙江畅志科技有限公司 | Human fall detection and early-warning apparatus based on multi-sensor collaboration |
CN103810254A (zh) * | 2014-01-22 | 2014-05-21 | 浙江大学 | Cloud-based real-time user behavior analysis method |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050195094A1 (en) * | 2004-03-05 | 2005-09-08 | White Russell W. | System and method for utilizing a bicycle computer to monitor athletic performance |
JP2006334087A (ja) * | 2005-06-01 | 2006-12-14 | Medical Electronic Science Inst Co Ltd | Sleep state determination system and sleep state determination method |
JP4905918B2 (ja) * | 2006-02-22 | 2012-03-28 | 株式会社タニタ | Health management device |
US20100305480A1 (en) * | 2009-06-01 | 2010-12-02 | Guoyi Fu | Human Motion Classification At Cycle Basis Of Repetitive Joint Movement |
US9664518B2 (en) * | 2010-08-27 | 2017-05-30 | Strava, Inc. | Method and system for comparing performance statistics with respect to location |
US8694282B2 (en) * | 2010-09-30 | 2014-04-08 | Fitbit, Inc. | Methods and systems for geo-location optimized tracking and updating for events having combined activity and location information |
US8849610B2 (en) * | 2010-09-30 | 2014-09-30 | Fitbit, Inc. | Tracking user physical activity with multiple devices |
AU2013217184A1 (en) * | 2012-02-02 | 2014-08-21 | Tata Consultancy Services Limited | A system and method for identifying and analyzing personal context of a user |
JP2013168026 (ja) * | 2012-02-15 | 2013-08-29 | Omron Healthcare Co Ltd | Sleep analysis result display program, sleep improvement support screen display program, and sleep improvement action result display program |
US9582755B2 (en) * | 2012-05-07 | 2017-02-28 | Qualcomm Incorporated | Aggregate context inferences using multiple context streams |
CN107095679B (zh) * | 2012-06-04 | 2020-12-01 | 耐克创新有限合伙公司 | Computer-implemented method and *** for use by a user performing exercise |
US20140003983A1 (en) * | 2012-06-28 | 2014-01-02 | Trebor International | Restrained, unattached, ultrapure pump diaphragm |
JP5846179B2 (ja) * | 2013-09-30 | 2016-01-20 | ダイキン工業株式会社 | Biological information acquisition device |
JP2017079807A (ja) * | 2014-03-11 | 2017-05-18 | 株式会社東芝 | Biological sensor, biological data collection terminal, biological data collection system, and biological data collection method |
- 2014
- 2014-06-30 EP EP14896735.9A patent/EP3145156A4/en not_active Withdrawn
- 2014-06-30 WO PCT/CN2014/081247 patent/WO2016000163A1/zh active Application Filing
- 2014-06-30 JP JP2016575952A patent/JP6380961B2/ja active Active
- 2014-06-30 CN CN201480008208.6A patent/CN105519074B/zh active Active
- 2016
- 2016-12-27 US US15/391,083 patent/US20170213367A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070049814A1 (en) * | 2005-08-24 | 2007-03-01 | Muccio Philip E | System and device for neuromuscular stimulation |
CN202282004U (zh) * | 2011-06-02 | 2012-06-20 | 上海巨浪信息科技有限公司 | Mobile health management *** based on context awareness and activity analysis |
CN103198615A (zh) * | 2013-03-21 | 2013-07-10 | 浙江畅志科技有限公司 | Human fall detection and early-warning apparatus based on multi-sensor collaboration |
CN103810254A (zh) * | 2014-01-22 | 2014-05-21 | 浙江大学 | Cloud-based real-time user behavior analysis method |
Non-Patent Citations (1)
Title |
---|
See also references of EP3145156A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10351777B2 (en) | 2013-03-15 | 2019-07-16 | All Power Labs, Inc. | Simultaneous pyrolysis and communition for fuel flexible gasification and pyrolysis |
Also Published As
Publication number | Publication date |
---|---|
CN105519074A (zh) | 2016-04-20 |
EP3145156A4 (en) | 2017-05-31 |
JP2017522962A (ja) | 2017-08-17 |
EP3145156A1 (en) | 2017-03-22 |
CN105519074B (zh) | 2019-06-07 |
JP6380961B2 (ja) | 2018-08-29 |
US20170213367A1 (en) | 2017-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11044684B2 (en) | Method and device for measuring amount of user physical activity | |
KR102446811B1 (ko) | Method for integrating and providing data collected from a plurality of devices, and electronic device implementing the same | |
KR102072788B1 (ko) | Portable device and method for changing the content screen of the portable device | |
EP3162284B1 (en) | Communication method and apparatus for wearable device | |
US20180042542A1 (en) | System, device and method for remotely monitoring the well-being of a user with a wearable device | |
CN107734487B (zh) | Method, central apparatus, and device for controlling a wearable electronic device | |
US9798385B1 (en) | User physical attribute based device and content management system | |
US10182770B2 (en) | Smart devices that capture images and sensed signals | |
EP3660688A1 (en) | Information processing system, information processing device, information processing method, and recording medium | |
US11983313B2 (en) | User physical attribute based device and content management system | |
CN107223247A (zh) | Method, ***, and wearable apparatus for obtaining a plurality of health parameters | |
KR20180075931A (ko) | Method and apparatus for providing an item recommendation service within an online game | |
US11651238B2 (en) | Earpiece advisor | |
WO2017193566A1 (zh) | Wearable device data management method, terminal, and *** | |
WO2014131003A1 (en) | System and method for monitoring biometric data | |
WO2016124495A1 (en) | Smart air quality evaluating wearable device | |
US20170339255A1 (en) | Dynamic feedback for wearable devices | |
CN106293810B (zh) | Application processing method and apparatus based on a VR device, and VR device | |
CN113646027A (zh) | Electronic device and method for providing stress-relief information by the electronic device | |
WO2016000163A1 (zh) | User data processing method and device | |
US20140115092A1 (en) | Sensory communication sessions over a network | |
CN106030442B (zh) | Method and apparatus for selecting an interaction device | |
JP2016170589A (ja) | Information processing apparatus, information processing method, and program | |
WO2016151061A1 (en) | Wearable-based health and safety warning systems | |
CA3117604C (en) | Apparatus for determining mobile application user engagement |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14896735; Country of ref document: EP; Kind code of ref document: A1 |
REEP | Request for entry into the european phase | Ref document number: 2014896735; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2014896735; Country of ref document: EP |
ENP | Entry into the national phase | Ref document number: 2016575952; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |