WO2016000163A1 - Method and device for processing user data - Google Patents

Method and device for processing user data

Info

Publication number
WO2016000163A1
WO2016000163A1 (application PCT/CN2014/081247; CN2014081247W)
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
user behavior
smart
behavior
Prior art date
Application number
PCT/CN2014/081247
Other languages
English (en)
French (fr)
Inventor
甘元莉
王红军
单振威
刘冰
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to PCT/CN2014/081247 priority Critical patent/WO2016000163A1/zh
Priority to CN201480008208.6A priority patent/CN105519074B/zh
Priority to EP14896735.9A priority patent/EP3145156A4/en
Priority to JP2016575952A priority patent/JP6380961B2/ja
Publication of WO2016000163A1 publication Critical patent/WO2016000163A1/zh
Priority to US15/391,083 priority patent/US20170213367A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/20Drawing from basic elements, e.g. lines or circles
    • G06T11/206Drawing of charts or graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/70Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021Measuring pressure in heart or blood vessels
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4806Sleep evaluation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/681Wristwatch-type devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer

Definitions

  • The embodiments of the present invention relate to the field of communications technologies, and in particular, to a method and a device for processing user data.

Background Art
  • User behavior is detected by sensors in non-wearable smart devices (such as mobile phones and tablet computers), and the resulting data is displayed on the non-wearable smart devices.
  • non-wearable smart devices such as mobile phones, tablet computers, etc.
  • Wearable smart devices include smart bracelets, smart watches, smart rings, etc.
  • The detected data includes step count, sleep, heart rate, blood pressure, etc.
  • The embodiments of the invention provide a method and a device for processing user data, so that in different scenarios the user can immediately see the most accurate and most needed data, improving the user experience.
  • An embodiment of the present invention provides a method for processing user data, including: acquiring, by a first device, first data and at least one second data, where the first data is data acquired by the first device itself detecting user behavior, and the second data is data acquired by at least one second device detecting the user behavior;
  • The preset rule includes: during the detection time from when the user behavior occurs until it stops, dividing the detection time into multiple time segments according to a preset time length; for each time segment, selecting, according to a selection rule, the user data corresponding to that time segment from the first data and the at least one second data; and summing the user data corresponding to each time segment as the user data of the user behavior.
  • the method further includes:
  • The presenting of the user behavior and/or the user data includes displaying the user behavior and/or the user data in coordinate form, where the coordinates include a time axis. Specifically: when it is determined, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection time period, the center point of the area is calculated, and the user behavior is displayed at the center point, on the time axis in the first device's display screen, at the position corresponding to the detection time period.
  • the method further includes:
  • The preset rule includes selecting, according to the state of the user behavior, the first data or the second data having the higher priority as the user data corresponding to the user behavior.
  • an embodiment of the present invention provides a first device, including:
  • An acquiring module, configured to acquire first data and at least one second data, where the first data is data acquired by the first device itself detecting user behavior, and the second data is data acquired by at least one second device detecting the user behavior;
  • a processing module, configured to determine, according to a preset rule, the user data corresponding to the user behavior from the first data and the second data;
  • a presentation module for presenting the user behavior and/or the user data.
  • The preset rule includes: during the detection time from when the user behavior occurs until it stops, dividing the detection time into multiple time segments according to a preset time length; for each time segment, selecting, according to a selection rule, one of the first data and the at least one second data as the user data corresponding to that time segment; and summing the user data corresponding to each time segment as the user data of the user behavior.
  • The preset rule further includes: when it is detected, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection time period, calculating the center point of the area;
  • the presentation module is further configured to display the user behavior at the center point, on the time axis in the display screen of the first device, at the position corresponding to the detection time period.
  • the device further includes:
  • a determining module, configured to determine the state of the user behavior according to the first data and the at least one second data;
  • The preset rule includes selecting, according to the state of the user behavior, the first data or the second data having the higher priority as the user data corresponding to the user behavior.
  • In the embodiments of the present invention, the first device acquires the first data obtained by its own detection of the user behavior and the at least one second data obtained by the second device detecting the user behavior; it then determines, according to the preset rule, the user data corresponding to the user behavior, and presents the user behavior and/or the user data on the first device, so that in different scenarios the user can immediately see the most accurate and most needed data, improving the user experience.
  • FIG. 1 is a flowchart of Embodiment 1 of a method for processing user data according to the present invention;
  • FIG. 2 is a flowchart of Embodiment 2 of a method for processing user data according to the present invention;
  • FIG. 3 is a flowchart of Embodiment 3 of a method for processing user data according to the present invention;
  • FIG. 4 is a schematic structural diagram of Embodiment 1 of a first device of the present invention;
  • FIG. 5 is a schematic structural diagram of Embodiment 2 of a first device of the present invention.

Detailed Description
  • FIG. 1 is a flowchart of Embodiment 1 of the method for processing user data according to the present invention. As shown in FIG. 1, the method in this embodiment may include:
  • Step 101: The first device acquires first data and at least one second data, where the first data is data acquired by the first device itself detecting user behavior, and the second data is data acquired by at least one second device detecting the user behavior.
  • The first device may be a non-wearable smart device or a wearable smart device,
  • the second device is a wearable smart device.
  • a non-wearable smart device can be a smartphone, a tablet computer, or the like.
  • Wearable smart devices can be smart glasses, smart watches, smart rings, and more.
  • The first device acquires the first data and the at least one second data. Specifically, this can be understood as: the first device acquires the data obtained by its own detection of the user behavior and the data obtained by the second device's detection of the user behavior; that is, the first data is acquired by the first device itself, and the second data is acquired by the second device itself.
  • Step 102: The first device determines, according to a preset rule, the user data corresponding to the user behavior from the first data and the second data, and presents the user behavior and/or the user data.
  • That is, the first device determines, according to the preset rule, the user data corresponding to the user behavior, and presents the user behavior and/or the user data on the first device.
  • The user data is the data determined, according to the preset rule, from the first data and the second data acquired above.
  • The presentation is not limited to vision; it may also include hearing, touch, taste, and the like.
  • In this embodiment, the first device obtains the first data acquired by its own detection of the user behavior and the at least one second data acquired by the second device's detection of the user behavior, then determines the user data corresponding to the user behavior according to a preset rule, and presents the user behavior and/or the user data on the first device. This enables the user to immediately see the most accurate and most needed data in different scenarios, improving the user experience.
  • The preset rule described in the foregoing embodiment may include: during the detection time from when the user behavior occurs until it stops, dividing the detection time into at least one time segment according to a preset time length; for each time segment, the user data corresponding to that segment is selected from the first data and the at least one second data according to a selection rule. The details are described below with reference to the embodiment of FIG. 2.
  • FIG. 2 is a flowchart of Embodiment 2 of the method for processing user data according to the present invention. As shown in FIG. 2, the method in this embodiment may include:
  • Step 201: The first device acquires first data and at least one second data, where the first data is the data obtained by the first device itself detecting the user's running behavior, and the second data is the data obtained by the at least one second device detecting the user's running behavior.
  • the first device is exemplified by a smart phone
  • the second device is exemplified by a smart bracelet and smart shoes,
  • and running is taken as the user behavior.
  • Then the first data is the data obtained by the smartphone itself detecting the user's running behavior, and the second data is the data obtained by the smart bracelet and/or the smart shoes detecting the user's running behavior.
  • Step 202: The first device determines, according to the preset rule, the user data corresponding to the user's running from the first data and the second data, and presents the user data.
  • If the smartphone has step-count data while the smart bracelet and smart shoes do not, the data acquired by the smartphone is selected; if the smartphone has no step-count data, the smart bracelet has step-count data, and the smart shoes do not, the data acquired by the smart bracelet is selected; if only the smart shoes have step-count data, the data acquired by the smart shoes is selected. That is, whichever party has the step-count data is selected to provide the user data.
  • The data recording points are then refined. In this embodiment, the detection time is divided into segments of 5 minutes each; taking 1 hour as an example, it is divided into 12 segments.
  • Within each time segment, the step-count data showing the larger amount of motion is selected as the user data.
  • For example, in the first 5-minute segment, the step count acquired by the smartphone is 150 steps, the step count acquired by the smart bracelet is 158 steps, and the step count acquired by the smart shoes is 160 steps; the smart shoes' value of 160 steps is therefore selected as the step count for the first 5-minute segment, and the step counts for the remaining segments are selected in the same way.
  • The 5-minute length is not absolute but relative; the specific length can be determined by the capabilities of the smart devices, and is not limited here.
  • Finally, the data corresponding to the respective time segments are summed to obtain the final user data over the detection time from when the user's running behavior occurs until it stops, and the result is presented on the smartphone.
  • In this embodiment, the smartphone obtains the step-count data acquired by its own detection of the user's running behavior and the step-count data acquired by the smart bracelet and the smart shoes detecting the user's running behavior; the data recording points are then refined, the step-count data with the larger amount of motion is selected within each time segment, and finally the user data corresponding to each time segment is summed to obtain the final user data over the detection time from when the user behavior occurs until it stops. In this way, the user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.
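The segment-and-sum rule of this embodiment can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the function name, the data layout, and the use of the per-segment maximum as the "larger amount of motion" criterion are assumptions for illustration.

```python
from typing import Dict, List

def merge_step_counts(per_device_segments: Dict[str, List[int]]) -> int:
    """For each time segment, pick the largest step count reported by any
    device, then sum the per-segment winners into the final step count.

    per_device_segments maps a device name to its per-segment step counts
    (one entry per 5-minute segment, 12 segments per hour in this example).
    """
    num_segments = max(len(v) for v in per_device_segments.values())
    total = 0
    for i in range(num_segments):
        # A device contributes 0 for any segment it has no data for.
        candidates = [counts[i] if i < len(counts) else 0
                      for counts in per_device_segments.values()]
        total += max(candidates)
    return total

# First 5-minute segment from the example: phone 150, bracelet 158, shoes 160.
data = {
    "smartphone":     [150, 140],
    "smart_bracelet": [158, 139],
    "smart_shoes":    [160, 142],
}
print(merge_step_counts(data))  # 160 + 142 = 302
```

With the example figures above, the shoes' 160 steps win the first segment, the shoes' 142 steps win the second, and the summed total is presented as the user data.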
  • The method embodiment can also display, on the corresponding time axis, the location of the user during a certain time period according to the user data obtained by the terminal devices.
  • the first data may be the latitude and longitude obtained by the first device (for example, a smart phone), and the second data is the latitude and longitude obtained by the second device (for example, a smart watch).
  • This embodiment takes recording one latitude and longitude point every 30 seconds as an example.
  • the specific time interval can be configured according to actual conditions, and is not limited herein.
  • When the smart watch has not acquired latitude and longitude data, the data acquired by the smartphone is used; when the smartphone has not acquired latitude and longitude data but the smart watch has, the data acquired by the smart watch is used.
  • When both devices have data for the same time point, the latitude and longitude data obtained by the smart watch is used, because the smart watch's data comes from GPS, while the smartphone's may come from GPS, a base station, or Wi-Fi, and the latitude and longitude data provided by a base station or Wi-Fi is less accurate and biased.
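The per-sample source selection described above can be sketched as a small helper. This is an illustrative sketch; the function name and the `(lat, lon)`/`None` representation of a fix are assumptions for illustration.

```python
from typing import Optional, Tuple

Fix = Tuple[float, float]  # (latitude, longitude)

def pick_position(phone_fix: Optional[Fix],
                  watch_fix: Optional[Fix]) -> Optional[Fix]:
    """Select the latitude/longitude to record for one 30-second sample.

    Each fix is a (lat, lon) tuple, or None when the device has no fix.
    The watch is preferred whenever it has data, since its fix always
    comes from GPS, while the phone may fall back to base-station or
    Wi-Fi positioning, which is less accurate.
    """
    if watch_fix is not None:
        return watch_fix
    return phone_fix  # may be None if neither device has a fix
```

For example, `pick_position((39.90, 116.40), None)` falls back to the phone's fix, while any sample where the watch reports a fix uses the watch.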
  • The presenting of the user behavior and/or the user data includes displaying the user behavior and/or the user data in coordinate form, where the coordinates include a time axis. Specifically: when it is determined, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection time period, the center point of the area is calculated, and the user behavior is displayed at the center point, on the time axis in the display screen of the first device, at the position corresponding to the detection time period.
  • That is, the smartphone draws the user's motion trajectory and aggregates the range of the user's activity area during the time period; when it detects that the motion track of the user behavior stays within the same area during a detection time period, it calculates the center point of the area and displays, on the time axis in its display, the user behavior occurring at that center point during the detection time period.
  • In this embodiment, the latitude and longitude data obtained by a non-wearable smart device such as a smartphone detecting the user behavior and the latitude and longitude data obtained by a wearable smart device such as a smart watch detecting the user behavior are acquired. The acquired latitude and longitude points are then drawn to obtain the user's motion trajectory, and the range of the user's activity area during the period is aggregated. When it is detected that the motion track of the user behavior stays within the same area during a detection time period, the center point of that area is calculated, and the user behavior occurring at the center point during that time period is displayed on the time axis in the smartphone's display. This enables the user to immediately see the most accurate and most needed data in different scenarios, improving the user experience.
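The center-point calculation can be sketched as follows. The patent does not specify how the center is computed; the arithmetic mean of the track's latitude/longitude points used here is an assumption for illustration.

```python
from typing import List, Tuple

Fix = Tuple[float, float]  # (latitude, longitude)

def area_center(track: List[Fix]) -> Fix:
    """Return the center point of a motion track whose points all fall in
    the same area during one detection period; the user behavior is then
    displayed at this point against the detection period on the time axis.

    Computed here as the mean latitude and mean longitude of the track
    (an illustrative choice; fine for small areas where the Earth's
    curvature can be ignored).
    """
    lats = [p[0] for p in track]
    lons = [p[1] for p in track]
    n = len(track)
    return (sum(lats) / n, sum(lons) / n)
```

For a track of points sampled every 30 seconds that stays inside one activity area, `area_center(track)` gives the single point at which the behavior is plotted.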
  • The preset rule described in Embodiment 1 may further include selecting, according to the state of the user behavior, the first data or the second data having the higher priority as the user data corresponding to the user behavior. The details are described below with reference to the embodiment of FIG. 3.
  • FIG. 3 is a flowchart of Embodiment 3 of a method for processing user data according to the present invention. As shown in FIG. 3, the method in this embodiment may include:
  • Step 301: The first device acquires first data and at least one second data, where the first data is data acquired by the first device itself detecting user behavior, and the second data is data acquired by at least one second device detecting the user behavior.
  • In this embodiment, the first device is exemplified by a smartphone, smart glasses, and a smart watch,
  • and the second device by smart glasses, a smart watch, smart shoes, a smart bracelet, and a smart ring.
  • The first data is data acquired by the first device itself detecting the user behavior, that is, data acquired by the smartphone, smart glasses, and smart watch detecting the user behavior, for example, step count, heart rate, and blood pressure.
  • The second data is data obtained by the at least one second device detecting the user behavior, that is, data acquired by the smart shoes, smart bracelet, and smart ring detecting the user behavior, for example, step count, heart rate, and blood pressure.
  • The smart shoes, smart bracelet, and smart ring periodically broadcast ADV_IND messages. After receiving an ADV_IND message, the smartphone, smart glasses, and smart watch broadcast SCAN_REQ messages to scan nearby Bluetooth devices.
  • After receiving the SCAN_REQ message, the smart shoes, smart bracelet, and smart ring respond with a SCAN_RSP message.
  • The SCAN_RSP message carries the device identifier (ID), the device's Bluetooth address, and other information. After receiving the SCAN_RSP messages, the smartphone, smart glasses, and smart watch establish a connection with each corresponding device according to its Bluetooth address and acquire the capabilities of the smart shoes, smart bracelet, and smart ring, such as the service information each device supports.
  • In this way, smart devices such as the smartphone, smart glasses, and smart watch obtain both the data acquired by their own detection of the user behavior and the data acquired by smart devices such as the smart shoes, smart bracelet, and smart ring detecting the user behavior.
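The ADV_IND → SCAN_REQ → SCAN_RSP → connect exchange described above can be sketched as a small simulation. This is a toy model of the message flow, not a Bluetooth implementation: the class names, the string message constants, and the `ScanRsp` fields are illustrative assumptions mirroring the text.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ScanRsp:
    device_id: str       # device identifier carried in the SCAN_RSP
    bt_address: str      # Bluetooth address used to establish the connection
    services: List[str]  # service information (capabilities) the device supports

class Peripheral:
    """Models a smart shoe / smart bracelet / smart ring that advertises."""
    def __init__(self, device_id: str, bt_address: str, services: List[str]):
        self._rsp = ScanRsp(device_id, bt_address, services)

    def advertise(self) -> str:
        return "ADV_IND"  # periodically broadcast

    def on_scan_req(self, msg: str) -> ScanRsp:
        assert msg == "SCAN_REQ"
        return self._rsp  # respond with a SCAN_RSP

class Central:
    """Models a smartphone / smart glasses / smart watch scanning nearby devices."""
    def __init__(self) -> None:
        self.connections: Dict[str, List[str]] = {}

    def discover_and_connect(self, peripheral: Peripheral) -> ScanRsp:
        if peripheral.advertise() == "ADV_IND":        # heard an advertisement
            rsp = peripheral.on_scan_req("SCAN_REQ")   # scan the device
            # Connect using the Bluetooth address, then record the
            # service information the peripheral supports.
            self.connections[rsp.bt_address] = rsp.services
            return rsp

phone = Central()
shoes = Peripheral("shoe-01", "AA:BB:CC:DD:EE:FF", ["step_count"])
rsp = phone.discover_and_connect(shoes)
```

After the exchange, the central holds a connection entry keyed by the peripheral's Bluetooth address and knows which services (here, step counting) it can query.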
  • Step 302: The first device determines the state of the user behavior according to the first data and the at least one second data.
  • That is, motion sensors (acceleration sensors, gravity sensors, gyroscopes, etc.) in the smartphone, smart glasses, and smart watch are used to identify the user's state.
  • States of the user behavior include motion, rest, sleep, and the like.
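A very rough version of this state identification can be sketched as follows. The patent does not give thresholds or features; the acceleration-magnitude threshold and the nighttime sleep window below are illustrative assumptions only.

```python
from typing import List

def classify_state(accel_magnitudes: List[float], hour_of_day: int) -> str:
    """Classify the user's state from motion-sensor readings.

    accel_magnitudes: acceleration magnitudes (m/s^2, gravity removed)
    sampled over a short window; hour_of_day: 0-23 local time.
    The 1.5 m/s^2 threshold and the 22:00-07:00 sleep window are
    illustrative assumptions, not values from the patent.
    """
    avg = sum(accel_magnitudes) / len(accel_magnitudes)
    if avg > 1.5:
        return "motion"          # sustained movement detected
    if hour_of_day >= 22 or hour_of_day < 7:
        return "sleep"           # still during typical sleep hours
    return "rest"                # still during waking hours
```

A real implementation would fuse readings from several devices and richer features (step cadence, heart rate), but the output states match those listed above: motion, rest, sleep.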
  • Step 303: The first device selects, according to the state of the user behavior, the first data or the second data having the higher priority as the user data corresponding to the user behavior, and presents it.
  • A user data priority policy is configured. When the smartphone, smart glasses, and smart watch are all worn by the user, the user data priority policies configured on the smartphone, the smart glasses, and the smart watch are consistent. In this embodiment, only step-count data, heart-rate data, blood-pressure data, and sleep-quality data are taken as examples.
  • For step-count data, priority is given to the data obtained from the smart shoes or smart foot ring; if the sensor data of the smart shoes or smart foot ring cannot be obtained, the step count obtained by the smart bracelet or smart watch prevails, and next the step count obtained by the smart ring or smart glasses. That is, the priority order is: smart shoes or smart foot ring > smart bracelet or smart watch > smart ring or smart glasses.
  • For heart-rate or blood-pressure data, priority is given to the data obtained from the smart bracelet or smart watch; if the sensor data of the smart bracelet or smart watch cannot be obtained, the heart-rate or blood-pressure data obtained by the smart ring prevails, followed by the data obtained from the smart foot ring or smart shoes. That is, the priority order is: smart bracelet or smart watch > smart ring > smart foot ring or smart shoes.
  • Sleep quality is generally evaluated from data such as dream state, pulse, and body-movement records.
  • For sleep-quality data, priority is given to the data obtained from the smart bracelet or smart watch; if the sensor data of the smart bracelet or smart watch is not available, the sleep-quality data obtained by the smart ring prevails, followed by the sleep-quality data obtained by the smart foot ring. That is, the priority order is: smart bracelet or smart watch > smart ring > smart foot ring.
  • For example, step-count data defaults to the data obtained from the smart shoes or smart foot ring, followed by the data acquired by the smart bracelet or smart watch, and then the data obtained by the smart ring or smart glasses. If the user instead prefers the step count obtained by the smart ring or smart glasses first, then the smart shoes or smart foot ring, and then the smart bracelet or smart watch, the priority policy can be configured according to the user's own personalized requirements; the priority order then becomes: smart ring or smart glasses > smart shoes or smart foot ring > smart bracelet or smart watch.
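The priority policy can be sketched as an ordered lookup table with fallback. The metric-to-device-group orders follow the text above; the group names, function name, and the `None`-means-unavailable convention are illustrative assumptions.

```python
from typing import Dict, List, Optional

# Per-metric priority orders from the text above; the first group that has
# data wins. Device-group names are illustrative.
PRIORITY: Dict[str, List[str]] = {
    "step_count":     ["shoes_or_foot_ring", "bracelet_or_watch", "ring_or_glasses"],
    "heart_rate":     ["bracelet_or_watch", "ring", "foot_ring_or_shoes"],
    "blood_pressure": ["bracelet_or_watch", "ring", "foot_ring_or_shoes"],
    "sleep_quality":  ["bracelet_or_watch", "ring", "foot_ring"],
}

def select_by_priority(metric: str,
                       readings: Dict[str, Optional[float]],
                       priority: Dict[str, List[str]] = PRIORITY) -> Optional[float]:
    """Return the reading from the highest-priority device group that
    actually has data for the metric. `readings` maps device group to a
    value, with None meaning the sensor data could not be obtained."""
    for group in priority[metric]:
        value = readings.get(group)
        if value is not None:
            return value
    return None  # no device could provide this metric
```

A user-personalized policy is then just a different `priority` dict passed in, e.g. one listing `"ring_or_glasses"` first for `"step_count"`.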
  • Finally, the corresponding user data selected by the above priority policy is presented on the smartphone, smart glasses, smart watch, smart shoes, smart bracelet, and smart ring.
  • The presentation may be visual, auditory, tactile, gustatory, or in other ways.
  • For example, the corresponding user data is displayed on the smartphone, smart glasses, and smart watch; played as sound through the smartphone; and used to remind the user by vibration through the smart shoes, smart bracelet, and smart ring.
  • In this embodiment, smart devices such as the smartphone, smart glasses, and smart watch obtain both the data acquired by their own detection of the user behavior and the data acquired by devices such as the smart shoes, smart bracelet, and smart ring detecting the user behavior; the state of the user behavior is identified based on the acquired data, the data with the higher priority is selected according to that state, and the result is presented on the smartphone, smart glasses, smart watch, smart shoes, smart bracelet, and smart ring. This enables the user to immediately see the most accurate and most needed data in different scenarios and improves the user experience.
  • As shown in FIG. 4, the first device 01 of this embodiment may include an acquiring module 11, a processing module 12, and a presentation module 13. The acquiring module 11 is configured to acquire first data and at least one second data, where the first data is data acquired by the first device itself detecting the user behavior, and the second data is data acquired by the at least one second device detecting the user behavior;
  • the processing module 12 is configured to determine, according to the preset rule, the user data corresponding to the user behavior from the first data and the second data, and the presentation module 13 is configured to present the user behavior and/or the user data.
  • The preset rule may include: during the detection time from when the user behavior occurs until it stops, dividing the detection time into multiple time segments according to a preset time length; for each time segment, selecting, according to a selection rule, the user data corresponding to that time segment from the first data and the at least one second data; and summing the user data corresponding to each time segment as the user data of the user behavior.
  • The preset rule may further include: when it is determined, according to the first data and the at least one second data, that the motion track of the user behavior stays within the same area during a detection time period, calculating the center point of the area;
  • the presentation module 13 is further configured to display the user behavior at the center point, on the time axis in the first device's display screen, at the position corresponding to the detection time period.
  • the first device in this embodiment may be used to perform the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
  • FIG. 5 is a schematic structural diagram of Embodiment 2 of the first device of the present invention.
  • On the basis of the device structure shown in FIG. 4, the first device 01 of this embodiment may further include a determining module 14.
  • The determining module 14 is configured to determine the state of the user behavior according to the first data and the at least one second data;
  • the preset rule may include selecting, according to the state of the user behavior, the first data or the second data having the higher priority as the user data corresponding to the user behavior.
  • the first device in this embodiment may be used to perform the technical solution of the foregoing method embodiment, and the implementation principle and the technical effect are similar, and details are not described herein again.
  • the disclosed apparatus and method may be implemented in other manners.
  • The device embodiments described above are merely illustrative.
  • The division of units is only a logical function division; in actual implementation there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be electrical, mechanical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-described integrated unit implemented in the form of a software functional unit can be stored in a computer readable storage medium.
  • the software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention.
  • the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Cardiology (AREA)
  • General Engineering & Computer Science (AREA)
  • Vascular Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Physiology (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Embodiments of the present invention provide a method and device for processing user data, including: a first device acquires first data and at least one piece of second data, the first data being data acquired by the first device itself by detecting a user behavior, and the second data being data acquired by at least one second device by detecting the user behavior. The first device determines, according to a preset rule and based on the first data and the second data, the user data corresponding to the user behavior, and presents the user behavior and/or the user data. This lets the user immediately see the most accurate and most needed data in different scenarios, improving the user experience.

Description

Method and Device for Processing User Data

Technical Field

Embodiments of the present invention relate to the field of communications technologies, and in particular, to a method and device for processing user data.

Background Art

Nowadays, wearable smart devices have quietly emerged in ever greater variety: smart watches, smart glasses, smart bands, and so on. These devices are not only easy to carry but also provide a wealth of practical functions, and their emergence has greatly changed how modern people live, exercise, and relax.

At present, data such as step count, sleep, heart rate, and blood pressure are detected by the sensors built into a non-wearable smart device (for example, a mobile terminal such as a smartphone or tablet) and displayed on that device. Alternatively, such data are detected by the sensors built into a wearable smart device (for example, a smart band, smart watch, or smart ring) and synchronized for display on a non-wearable smart device.

With the growing interconnection between devices, the requirements for sharing transmitted data and presenting it to users according to certain rules are becoming ever more refined. A user may carry both a wearable smart device and a non-wearable smart device; there is currently no particularly good method for letting the user immediately see the most accurate and most needed data in different scenarios, or for recording and displaying these data more scientifically.

Summary of the Invention
Embodiments of the present invention provide a method and device for processing user data, so that a user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.

According to a first aspect, an embodiment of the present invention provides a method for processing user data, including: acquiring, by a first device, first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior;

determining, by the first device according to a preset rule and based on the first data and the second data, user data corresponding to the user behavior, and presenting the user behavior and/or the user data.

In a first possible implementation of the first aspect, the preset rule includes: within the detection time from occurrence to stop of the user behavior, dividing the detection time into multiple time segments of a preset length; for each time segment, selecting, according to a selection rule, either the first data or the at least one piece of second data as the user data corresponding to that time segment; and summing the user data corresponding to the respective time segments to obtain the user data of the user behavior.

With reference to the first possible implementation of the first aspect, in a second possible implementation, the method further includes:

presenting the user behavior and/or the user data includes displaying the user behavior and/or the user data in coordinate form, the coordinates including a time axis, and specifically includes: when it is detected, according to the first data and the at least one piece of second data, that the motion track of the user behavior stays within a single area during a detection period, calculating the center point of the area, and displaying, at the point on the time axis of the first device's display screen corresponding to the detection period, that the user behavior occurred at the center point.

In a third possible implementation of the first aspect, the method further includes:

determining, by the first device according to the first data and the at least one piece of second data, the state of the user behavior;

correspondingly, the preset rule includes selecting, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior.
According to a second aspect, an embodiment of the present invention provides a first device, including:

an acquiring module, configured to acquire first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior;

a processing module, configured to determine, according to a preset rule and based on the first data and the second data, user data corresponding to the user behavior;

a presentation module, configured to present the user behavior and/or the user data.

In a first possible implementation of the second aspect, the preset rule includes: within the detection time from occurrence to stop of the user behavior, dividing the detection time into multiple time segments of a preset length; for each time segment, selecting, according to a selection rule, either the first data or the at least one piece of second data as the user data corresponding to that time segment; and summing the user data corresponding to the respective time segments to obtain the user data of the user behavior.

With reference to the first possible implementation of the second aspect, in a second possible implementation, the preset rule further includes: when it is detected, according to the first data and the at least one piece of second data, that the motion track of the user behavior stays within a single area during a detection period, calculating the center point of the area;

correspondingly, the presentation module is further configured to display, at the point on the time axis of the first device's display screen corresponding to the detection period, that the user behavior occurred at the center point.

In a third possible implementation of the second aspect, the device further includes:

a determining module, configured to determine, according to the first data and the at least one piece of second data, the state of the user behavior;

correspondingly, the preset rule includes selecting, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior.

According to the method and device for processing user data provided in the embodiments of the present invention, the first device acquires the first data obtained by itself by detecting a user behavior and at least one piece of second data obtained by a second device by detecting the user behavior, then determines, according to a preconfigured rule, the user data corresponding to the user behavior, and presents the user behavior and/or the user data on the first device, so that the user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.

Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative effort.

FIG. 1 is a flowchart of Embodiment 1 of the method for processing user data according to the present invention;

FIG. 2 is a flowchart of Embodiment 2 of the method for processing user data according to the present invention;

FIG. 3 is a flowchart of Embodiment 3 of the method for processing user data according to the present invention;

FIG. 4 is a schematic structural diagram of Embodiment 1 of the first device according to the present invention;

FIG. 5 is a schematic structural diagram of Embodiment 2 of the first device according to the present invention.

Detailed Description of the Embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

FIG. 1 is a flowchart of Embodiment 1 of the method for processing user data according to the present invention. As shown in FIG. 1, the method of this embodiment may include:

Step 101: A first device acquires first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior.

The first device may be a non-wearable smart device or a wearable smart device, and the second device is a wearable smart device. For example, the non-wearable smart device may be a smartphone, a tablet, or the like, and the wearable smart device may be smart glasses, a smart watch, a smart ring, or the like.

The first device acquires the first data and the at least one piece of second data. Specifically, this may be understood as the first device acquiring both the data obtained by itself by detecting the user behavior and the data obtained by the second device by detecting the user behavior; that is, the first data is data acquired by the first device itself, and the second data is data acquired by the second device itself.

Step 102: The first device determines, according to a preset rule and based on the first data and the second data, user data corresponding to the user behavior, and presents the user behavior and/or the user data.

According to a preconfigured rule, the first device determines, from the acquired first data and second data, the user data corresponding to the user behavior, and presents the user behavior and/or the user data on the first device. The user data is the data determined according to the preset rule from the acquired first data and second data. In addition, presentation is not limited to the visual sense; it may also involve hearing, touch, taste, and so on.

In this embodiment, the first device acquires the first data obtained by itself by detecting a user behavior and at least one piece of second data obtained by a second device by detecting the user behavior, then determines, according to a preconfigured rule, the user data corresponding to the user behavior, and presents the user behavior and/or the user data on the first device, so that the user can immediately see the most accurate and most needed data in different scenarios, improving the user experience.
The preset rule described in the foregoing embodiment may include dividing, within the detection time from occurrence to stop of the user behavior, the detection time into at least one time segment of a preset length. For each time segment, either the first data or the at least one piece of second data is selected, according to a selection rule, as the user data corresponding to that time segment. This is described in detail below with reference to the embodiment of FIG. 2.

FIG. 2 is a flowchart of Embodiment 2 of the method for processing user data according to the present invention. As shown in FIG. 2, the method of this embodiment may include:

Step 201: A first device acquires first data and at least one piece of second data, where the first data is step-count data acquired by the first device itself by detecting the user's running behavior, and the second data is step-count data acquired by at least one second device by detecting the user's running behavior.

In this embodiment, the first device is exemplified by a smartphone, and the second devices by a smart band and smart shoes. Running is used as the example user behavior: the first data is the step-count data acquired by the smartphone itself by detecting the user's running, and the second data is the running data acquired by the smart band and/or the smart shoes by detecting the user's running.

Step 202: The first device determines, according to a preset rule and based on the first data and the second data, the step-count data corresponding to the user's running, and presents the user data.

Specifically, within the detection time from occurrence to stop of the running behavior: if the smartphone has step-count data while the smart band and the smart shoes do not, the step-count data acquired by the smartphone is selected; if only the smart band has step-count data, the step-count data acquired by the smart band is selected; and if only the smart shoes have step-count data, the step-count data acquired by the smart shoes is selected. In other words, the party that has step-count data is selected as the source of the user data.

If, within the detection time from occurrence to stop of the running behavior, at least two of the smartphone, the smart band, and the smart shoes have step-count data, the data recording points are refined. In this embodiment, one point is recorded every 5 minutes; taking 1 hour as an example, it is divided into 12 time segments. Within each time segment, the step-count data reflecting the greater amount of motion is selected. For example, if in the first 5-minute segment the smartphone records 150 steps, the smart band records 158 steps, and the smart shoes record 160 steps, the 160-step reading from the smart shoes is selected as the step-count data for that segment, and so on for the remaining segments. Of course, the 5-minute interval is not absolute but relative; the specific interval may be determined by the capabilities of the smart devices and is not limited here.

Then, the step-count data corresponding to the respective time segments are summed to obtain the final user data for the detection time from occurrence to stop of the user's running, which is presented on the smartphone.
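The per-segment selection and summation described above can be sketched as follows. This is a minimal illustration: the function name, device names, and two-segment layout are assumptions for the example, not something the patent text prescribes.

```python
def merge_step_counts(per_device_segments):
    """Sum, over all time segments, the largest per-segment step count.

    `per_device_segments` maps a device name to its list of per-segment
    step counts (e.g. one entry per 5-minute segment). Within each segment
    the reading reflecting the greater amount of motion is kept, and the
    kept readings are summed into the final user data.
    """
    segment_lists = list(per_device_segments.values())
    n_segments = len(segment_lists[0])
    total = 0
    for i in range(n_segments):
        # Keep the largest reading among all devices for segment i.
        total += max(counts[i] for counts in segment_lists)
    return total

# Example from the text: in the first 5-minute segment the phone records
# 150 steps, the band 158, and the shoes 160, so 160 is kept.
readings = {
    "smartphone": [150, 120],
    "smart_band": [158, 130],
    "smart_shoes": [160, 125],
}
print(merge_step_counts(readings))  # 160 + 130 = 290
```

With the example readings, the first segment keeps the shoes' 160 steps and the second keeps the band's 130, giving a total of 290.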
In this embodiment, the smartphone acquires the step-count data obtained by itself by detecting the user's running and the step-count data obtained by the smart band and the smart shoes by detecting the user's running; the data recording points are then refined, and within each time segment the reading reflecting the greater amount of motion is selected as the user data; finally, the user data corresponding to the respective time segments are summed to obtain the final user data for the detection time from occurrence to stop of the user behavior, which is presented on the smartphone. This lets the user immediately see the most accurate and most needed data in different scenarios, improving the user experience.
On the basis of the foregoing embodiments, this method embodiment may further display, on the corresponding time axis, the user's location during a certain period according to the user data obtained by the terminal devices. In this embodiment, the first data may be the latitude and longitude obtained by the first device (for example, a smartphone), and the second data the latitude and longitude obtained by the second device (for example, a smart watch). Recording one latitude-longitude point every 30 seconds is used as an example; the specific interval may be configured according to the actual situation and is not limited here.

When the smartphone acquires latitude-longitude data but the smart watch does not, the data acquired by the smartphone is used; when the smartphone does not acquire latitude-longitude data but the smart watch does, the data acquired by the smart watch is used.

When both the smartphone and the smart watch acquire latitude-longitude data, the data acquired by the smart watch is used at the points where the time segments overlap. The reason is that the smart watch's latitude-longitude data comes from GPS, whereas the smartphone's may come from GPS but may also come from a base station or Wi-Fi, and the latitude-longitude data provided by base stations and Wi-Fi is not precise enough and is subject to deviation.

Presenting the user behavior and/or the user data includes displaying the user behavior and/or the user data in coordinate form, the coordinates including a time axis, and specifically includes: when it is detected, according to the first data and the at least one piece of second data, that the motion track of the user behavior stays within a single area during a detection period, calculating the center point of the area, and displaying, at the point on the time axis of the first device's display screen corresponding to the detection period, that the user behavior occurred at the center point.

Based on the acquired latitude-longitude data, the smartphone plots the user's motion track and aggregates the area within which the user has been active during this period. When it detects that, during a detection period, the motion track of the user behavior stays within a single area, it calculates the center point of that area and displays, on the time axis of its display screen, the user behavior that occurred at the center point during the detection period.

In this embodiment, a non-wearable smart device such as a smartphone acquires the latitude-longitude data obtained by itself by detecting the user behavior and the latitude-longitude data obtained by a wearable smart device such as a smart watch by detecting the user behavior; the acquired latitude-longitude points are then plotted to derive the user's motion track and aggregate the area within which the user has been active during this period. When it is detected that, during a detection period, the motion track of the user behavior stays within a single area, the center point of the track within that area is calculated, and the user behavior that occurred at the center point during the detection period is displayed on the time axis of the smartphone's display screen. This lets the user immediately see the most accurate and most needed data in different scenarios, improving the user experience.
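The center-point calculation above could look like the following sketch. The centroid of the samples is one plausible reading of "center point of the area"; the patent text does not fix the exact formula, and the helper name and sample coordinates are illustrative assumptions.

```python
def track_center(points):
    """Center point of a motion track that stays within one area.

    `points` is a list of (latitude, longitude) samples recorded during
    the detection period (e.g. one every 30 seconds). The centroid of the
    samples is used here as a simple notion of the area's center.
    """
    lat = sum(p[0] for p in points) / len(points)
    lon = sum(p[1] for p in points) / len(points)
    return lat, lon

# Four samples scattered around a small area aggregate to its center,
# roughly (39.91, 116.41) for the illustrative coordinates below.
samples = [(39.90, 116.40), (39.92, 116.40), (39.90, 116.42), (39.92, 116.42)]
print(track_center(samples))
```

The returned point would then be placed at the time-axis position corresponding to the detection period on the display.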
The preset rule described in Embodiment 1 may further include selecting, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior. This is described in detail below with reference to the embodiment of FIG. 3.

FIG. 3 is a flowchart of Embodiment 3 of the method for processing user data according to the present invention. As shown in FIG. 3, the method of this embodiment may include:

Step 301: A first device acquires first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior.

In this embodiment, the first device is exemplified by a smartphone, smart glasses, or a smart watch, and the second device by smart glasses, a smart watch, smart shoes, a smart band, or a smart ring.

The first data is the data acquired by the first device itself by detecting the user behavior, that is, data such as step count, heart rate, and blood pressure acquired by the smartphone, smart glasses, or smart watch. The second data is the data acquired by at least one second device by detecting the user behavior, that is, data such as step count, heart rate, and blood pressure acquired by the smart shoes, smart band, or smart ring. First, the smart shoes, smart band, and smart ring periodically broadcast ADV_IND packets. After receiving an ADV_IND packet, the smartphone, smart glasses, or smart watch broadcasts a SCAN_REQ packet to scan for nearby Bluetooth devices. After receiving the SCAN_REQ packet, the smart shoes, smart band, or smart ring respond with a SCAN_RSP packet carrying information such as the device identity (ID) and the device's Bluetooth address. After receiving the SCAN_RSP packet, the smartphone, smart glasses, or smart watch establishes a connection with the corresponding device according to its Bluetooth address and obtains the capabilities of the smart shoes, smart band, or smart ring, such as the services the device supports.

Then, smart devices such as the smartphone, smart glasses, and smart watch acquire the data obtained by themselves by detecting the user behavior and the data obtained by smart devices such as the smart shoes, smart band, and smart ring by detecting the user behavior.

Step 302: The first device determines, according to the first data and the at least one piece of second data, the state of the user behavior.

The user's state is identified using the motion sensors (accelerometer, gravity sensor, gyroscope, and so on) on the smartphone, smart glasses, or smart watch. Alternatively, the state of the user behavior is identified by having devices such as the smartphone, smart glasses, or smart watch collect and integrate the data acquired by themselves or the data acquired from the smart shoes, smart band, or smart ring. States of user behavior include, for example, exercising, resting, and sleeping.

Step 303: The first device selects, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior, and presents it.

Specifically, a high-priority user-data policy is configured on a first device that has a processor (in this embodiment, only the smartphone, smart glasses, and smart watch are used as examples). When the smartphone, smart glasses, and smart watch are all worn by the user, the policy configured on the smartphone prevails, followed by the policy configured on the smart glasses, and finally the policy configured on the smart watch. In this embodiment, only step-count data, heart-rate data, blood-pressure data, and sleep-quality data are used as examples.

The details of the priority policy are as follows:

When the user is exercising and step-count data is needed, the step-count data acquired from the smart shoes or smart anklet takes precedence; if the sensor data of the smart shoes or smart anklet cannot be obtained, the step-count data acquired by the smart band or smart watch prevails, followed by the step-count data acquired by the smart ring or smart glasses. That is, the priority order is: smart shoes or smart anklet > smart band or smart watch > smart ring or smart glasses.

When heart-rate or blood-pressure data is needed, the heart-rate or blood-pressure data acquired from the smart band or smart watch takes precedence; if the sensor data of the smart band or smart watch cannot be obtained, the data acquired by the smart ring prevails, followed by the data acquired by the smart anklet or smart shoes. That is, the priority order is: smart band or smart watch > smart ring > smart anklet or smart shoes.

When the user is sleeping and sleep-quality data is needed (sleep quality is generally derived from data such as dreaming, pulse, and actigraphy), the sleep-quality data acquired from the smart band or smart watch takes precedence; if the sensor data of the smart band or smart watch cannot be obtained, the sleep-quality data acquired by the smart ring prevails, followed by the heart-rate or blood-pressure data acquired by the smart anklet. That is, the priority order is: smart band or smart watch > smart ring > smart anklet.

The priority policy may also be configured according to the user's habits. In most cases the default is to prioritize the step-count data acquired from the smart shoes or smart anklet, then from the smart band or smart watch, and then from the smart ring or smart glasses. However, when the user has personalized settings, for example preferring the smart ring or smart glasses for step-count data, then the smart shoes or smart anklet, and then the smart band or smart watch, the priority policy is configured according to the user's own personalized needs; the current priority order then becomes: smart ring or smart glasses > smart shoes or smart anklet > smart band or smart watch.
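A minimal sketch of such a priority policy follows. The table keyed by (state, metric), the device names, and the function name are illustrative assumptions; the orderings mirror the defaults described above, and a personalized setting would simply replace the ordered list.

```python
# Illustrative default priority tables per user state and metric.
DEFAULT_PRIORITY = {
    ("exercising", "steps"): ["smart_shoes", "smart_band", "smart_ring"],
    ("exercising", "heart_rate"): ["smart_band", "smart_ring", "smart_shoes"],
    ("sleeping", "sleep_quality"): ["smart_band", "smart_ring", "smart_anklet"],
}

def pick_reading(state, metric, available, priority=DEFAULT_PRIORITY):
    """Return (device, reading) from the highest-priority device with data.

    `available` maps device name -> reading; devices whose sensor data
    could not be obtained are simply absent from the map.
    """
    for device in priority[(state, metric)]:
        if device in available:
            return device, available[device]
    return None  # no configured device produced data

# The shoes are unavailable here, so the band (next in priority) is used.
device, steps = pick_reading("exercising", "steps",
                             {"smart_band": 158, "smart_ring": 150})
print(device, steps)  # smart_band 158
```

Passing a user-specific table as `priority` models the personalized configuration described in the text.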
The user data selected by the above priority policy is then presented on the smartphone, smart glasses, smart watch, smart shoes, smart band, or smart ring. The presentation may take many forms: visual, auditory, tactile, gustatory, and so on. For example, the corresponding user data may be displayed on the smartphone, smart glasses, or smart watch; played back as sound by the smartphone; or signaled to the user by vibration through the smart shoes, smart band, or smart ring.

In this embodiment, smart devices such as a smartphone, smart glasses, and a smart watch acquire the data obtained by themselves by detecting the user behavior and the data obtained by smart devices such as smart shoes, a smart band, and a smart ring by detecting the user behavior. The state of the user behavior is then identified from the acquired data. Next, a high-priority user-data policy is configured on a processor-equipped device such as the smartphone, smart glasses, or smart watch, and the user data selected by the priority policy is presented on the smartphone, smart glasses, smart watch, smart shoes, smart band, and smart ring. This lets the user immediately see the most accurate and most needed data in different scenarios, improving the user experience.
FIG. 4 is a schematic structural diagram of Embodiment 1 of the first device according to the present invention. As shown in FIG. 4, the first device 01 of this embodiment may include an acquiring module 11, a processing module 12, and a presentation module 13. The acquiring module 11 is configured to acquire first data and at least one piece of second data, where the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior; the processing module 12 is configured to determine, according to a preset rule and based on the first data and the second data, the user data corresponding to the user behavior; and the presentation module 13 is configured to present the user behavior and/or the user data.

On the basis of the foregoing embodiment, specifically, the preset rule may include: within the detection time from occurrence to stop of the user behavior, dividing the detection time into multiple time segments of a preset length; for each time segment, selecting, according to a selection rule, either the first data or the at least one piece of second data as the user data corresponding to that time segment; and summing the user data corresponding to the respective time segments to obtain the user data of the user behavior.

Further, the preset rule may include: when it is detected, according to the first data and the at least one piece of second data, that the motion track of the user behavior stays within a single area during a detection period, calculating the center point of the area;

correspondingly, the presentation module 13 is further configured to display, at the point on the time axis of the first device's display screen corresponding to the detection period, that the user behavior occurred at the center point.

The first device of this embodiment may be used to execute the technical solutions of the method embodiments shown above; its implementation principles and technical effects are similar and are not described again here.

FIG. 5 is a schematic structural diagram of Embodiment 2 of the first device according to the present invention. As shown in FIG. 5, on the basis of the device structure shown in FIG. 4, the first device 01 of this embodiment may further include a determining module 14. The determining module 14 is configured to determine, according to the first data and the at least one piece of second data, the state of the user behavior; correspondingly, the preset rule may include selecting, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior.

The first device of this embodiment may be used to execute the technical solutions of the method embodiments shown above; its implementation principles and technical effects are similar and are not described again here.
In the several embodiments provided in the present invention, it should be understood that the disclosed device and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative. For instance, the division into units is merely a division by logical function; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.

The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.

The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

A person skilled in the art may clearly understand that, for convenience and brevity of description, only the division into the foregoing functional modules is used as an example. In practical applications, the foregoing functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or some of the functions described above. For the specific working processes of the device described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not described again here.

Finally, it should be noted that the foregoing embodiments are merely intended to describe the technical solutions of the present invention rather than to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements may be made to some or all of the technical features thereof; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims

Claims

1. A method for processing user data, comprising:

acquiring, by a first device, first data and at least one piece of second data, wherein the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior;

determining, by the first device according to a preset rule and based on the first data and the second data, user data corresponding to the user behavior, and presenting the user behavior and/or the user data.

2. The method according to claim 1, wherein the preset rule comprises: within the detection time from occurrence to stop of the user behavior, dividing the detection time into multiple time segments of a preset length; for each time segment, selecting, according to a selection rule, either the first data or the at least one piece of second data as the user data corresponding to that time segment; and summing the user data corresponding to the respective time segments to obtain the user data of the user behavior.

3. The method according to claim 2, wherein presenting the user behavior and/or the user data comprises displaying the user behavior and/or the user data in coordinate form, the coordinates comprising a time axis, and specifically comprises:

when it is detected, according to the first data and the at least one piece of second data, that the motion track of the user behavior stays within a single area during a detection period, calculating the center point of the area, and displaying, at the point on the time axis of the first device's display screen corresponding to the detection period, that the user behavior occurred at the center point.

4. The method according to claim 1, further comprising: determining, by the first device according to the first data and the at least one piece of second data, the state of the user behavior;

correspondingly, the preset rule comprises selecting, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior.

5. A first device, comprising:

an acquiring module, configured to acquire first data and at least one piece of second data, wherein the first data is data acquired by the first device itself by detecting a user behavior, and the second data is data acquired by at least one second device by detecting the user behavior; a processing module, configured to determine, according to a preset rule and based on the first data and the second data, user data corresponding to the user behavior;

a presentation module, configured to present the user behavior and/or the user data.

6. The first device according to claim 5, wherein the preset rule comprises:

within the detection time from occurrence to stop of the user behavior, dividing the detection time into multiple time segments of a preset length; for each time segment, selecting, according to a selection rule, either the first data or the at least one piece of second data as the user data corresponding to that time segment; and summing the user data corresponding to the respective time segments to obtain the user data of the user behavior.

7. The first device according to claim 6, wherein presenting the user behavior and/or the user data comprises displaying the user behavior and/or the user data in coordinate form, the coordinates comprising a time axis, and specifically comprises: when it is detected, according to the first data and the at least one piece of second data, that the motion track of the user behavior stays within a single area during a detection period, calculating the center point of the area;

correspondingly, the presentation module is further configured to display, at the point on the time axis of the first device's display screen corresponding to the detection period, that the user behavior occurred at the center point.

8. The first device according to claim 5, further comprising: a determining module, configured to determine, according to the first data and the at least one piece of second data, the state of the user behavior;

correspondingly, the preset rule comprises selecting, according to the state of the user behavior, the first data or the second data having a higher priority as the user data corresponding to the user behavior.
PCT/CN2014/081247 2014-06-30 2014-06-30 Method and device for processing user data WO2016000163A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
PCT/CN2014/081247 WO2016000163A1 (zh) 2014-06-30 2014-06-30 Method and device for processing user data
CN201480008208.6A CN105519074B (zh) 2014-06-30 2014-06-30 Method and device for processing user data
EP14896735.9A EP3145156A4 (en) 2014-06-30 2014-06-30 User data processing method and device
JP2016575952A JP6380961B2 (ja) 2014-06-30 2014-06-30 User data processing method and device
US15/391,083 US20170213367A1 (en) 2014-06-30 2016-12-27 User data processing method, and device for displaying data acquired from a wearable device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/081247 WO2016000163A1 (zh) 2014-06-30 2014-06-30 Method and device for processing user data

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/391,083 Continuation US20170213367A1 (en) 2014-06-30 2016-12-27 User data processing method, and device for displaying data acquired from a wearable device

Publications (1)

Publication Number Publication Date
WO2016000163A1 true WO2016000163A1 (zh) 2016-01-07

Family

ID=55018249

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/081247 WO2016000163A1 (zh) 2014-06-30 2014-06-30 Method and device for processing user data

Country Status (5)

Country Link
US (1) US20170213367A1 (zh)
EP (1) EP3145156A4 (zh)
JP (1) JP6380961B2 (zh)
CN (1) CN105519074B (zh)
WO (1) WO2016000163A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10351777B2 (en) 2013-03-15 2019-07-16 All Power Labs, Inc. Simultaneous pyrolysis and communition for fuel flexible gasification and pyrolysis

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107811624A (zh) * 2017-12-12 2018-03-20 深圳金康特智能科技有限公司 User information collection system based on dual smart wearable devices

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070049814A1 (en) * 2005-08-24 2007-03-01 Muccio Philip E System and device for neuromuscular stimulation
CN202282004U (zh) * 2011-06-02 2012-06-20 上海巨浪信息科技有限公司 Mobile health management system based on context awareness and activity analysis
CN103198615A (zh) * 2013-03-21 2013-07-10 浙江畅志科技有限公司 Human fall detection and early-warning apparatus based on multi-sensor cooperation
CN103810254A (zh) * 2014-01-22 2014-05-21 Zhejiang University Cloud-based real-time user behavior analysis method

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050195094A1 (en) * 2004-03-05 2005-09-08 White Russell W. System and method for utilizing a bicycle computer to monitor athletic performance
JP2006334087A (ja) * 2005-06-01 2006-12-14 Medical Electronic Science Inst Co Ltd Sleep state determination system and sleep state determination method
JP4905918B2 (ja) * 2006-02-22 2012-03-28 株式会社タニタ Health management device
US20100305480A1 (en) * 2009-06-01 2010-12-02 Guoyi Fu Human Motion Classification At Cycle Basis Of Repetitive Joint Movement
US9664518B2 (en) * 2010-08-27 2017-05-30 Strava, Inc. Method and system for comparing performance statistics with respect to location
US8694282B2 (en) * 2010-09-30 2014-04-08 Fitbit, Inc. Methods and systems for geo-location optimized tracking and updating for events having combined activity and location information
US8849610B2 (en) * 2010-09-30 2014-09-30 Fitbit, Inc. Tracking user physical activity with multiple devices
AU2013217184A1 (en) * 2012-02-02 2014-08-21 Tata Consultancy Services Limited A system and method for identifying and analyzing personal context of a user
JP2013168026A (ja) * 2012-02-15 2013-08-29 Omron Healthcare Co Ltd Sleep analysis result display program, sleep improvement support screen display program, and sleep improvement behavior result display program
US9582755B2 (en) * 2012-05-07 2017-02-28 Qualcomm Incorporated Aggregate context inferences using multiple context streams
CN107095679B (zh) * 2012-06-04 2020-12-01 耐克创新有限合伙公司 Computer-implemented method and system for use by a user performing exercise
US20140003983A1 (en) * 2012-06-28 2014-01-02 Trebor International Restrained, unattached, ultrapure pump diaphragm
JP5846179B2 (ja) * 2013-09-30 2016-01-20 ダイキン工業株式会社 Biological information acquisition device
JP2017079807A (ja) * 2014-03-11 2017-05-18 株式会社東芝 Biological sensor, biological data collection terminal, biological data collection system, and biological data collection method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070049814A1 (en) * 2005-08-24 2007-03-01 Muccio Philip E System and device for neuromuscular stimulation
CN202282004U (zh) * 2011-06-02 2012-06-20 上海巨浪信息科技有限公司 Mobile health management system based on context awareness and activity analysis
CN103198615A (zh) * 2013-03-21 2013-07-10 浙江畅志科技有限公司 Human fall detection and early-warning apparatus based on multi-sensor cooperation
CN103810254A (zh) * 2014-01-22 2014-05-21 Zhejiang University Cloud-based real-time user behavior analysis method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3145156A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10351777B2 (en) 2013-03-15 2019-07-16 All Power Labs, Inc. Simultaneous pyrolysis and communition for fuel flexible gasification and pyrolysis

Also Published As

Publication number Publication date
CN105519074A (zh) 2016-04-20
EP3145156A4 (en) 2017-05-31
JP2017522962A (ja) 2017-08-17
EP3145156A1 (en) 2017-03-22
CN105519074B (zh) 2019-06-07
JP6380961B2 (ja) 2018-08-29
US20170213367A1 (en) 2017-07-27

Similar Documents

Publication Publication Date Title
US11044684B2 (en) Method and device for measuring amount of user physical activity
KR102446811B1 (ko) 복수의 디바이스들로부터 수집된 데이터 통합 및 제공 방법 및 이를 구현한 전자 장치
KR102072788B1 (ko) 휴대 장치 및 휴대 장치의 콘텐트 화면 변경방법
EP3162284B1 (en) Communication method and apparatus for wearable device
US20180042542A1 (en) System, device and method for remotely monitoring the well-being of a user with a wearable device
CN107734487B (zh) 控制可穿戴电子设备的方法、中心装置及设备
US9798385B1 (en) User physical attribute based device and content management system
US10182770B2 (en) Smart devices that capture images and sensed signals
EP3660688A1 (en) Information processing system, information processing device, information processing method, and recording medium
US11983313B2 (en) User physical attribute based device and content management system
CN107223247A (zh) 用于获得多个健康参数的方法、***和可穿戴装置
KR20180075931A (ko) 온라인 게임 내에서 아이템 추천 서비스를 제공하는 방법 및 그 장치
US11651238B2 (en) Earpiece advisor
WO2017193566A1 (zh) 一种穿戴式设备数据的管理方法、终端及***
WO2014131003A1 (en) System and method for monitoring biometric data
WO2016124495A1 (en) Smart air quality evaluating wearable device
US20170339255A1 (en) Dynamic feedback for wearable devices
CN106293810B (zh) 基于vr设备的应用处理方法、装置和vr设备
CN113646027A (zh) 电子装置和用于由该电子装置提供用于减压的信息的方法
WO2016000163A1 (zh) 用户数据的处理方法和设备
US20140115092A1 (en) Sensory communication sessions over a network
CN106030442B (zh) 交互设备的选择方法和装置
JP2016170589A (ja) 情報処理装置、情報処理方法およびプログラム
WO2016151061A1 (en) Wearable-based health and safety warning systems
CA3117604C (en) Apparatus for determining mobile application user engagement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14896735

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2014896735

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014896735

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2016575952

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE