CN104238729A - Somatic sense control implementing method and somatic sense control implementing system
- Publication number
- CN104238729A CN104238729A CN201310245412.7A CN201310245412A CN104238729A CN 104238729 A CN104238729 A CN 104238729A CN 201310245412 A CN201310245412 A CN 201310245412A CN 104238729 A CN104238729 A CN 104238729A
- Authority
- CN
- China
- Prior art keywords
- gray level image
- somatic sense control
- mobile device
- movement locus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a somatic sense control implementing method. The method comprises the following steps: acquiring grayscale images of a series of actions of a photographed object; recognizing the movement locus of the photographed object from the acquired grayscale images; and performing an interactive operation on the user interface of a mobile device according to the movement locus. The invention also provides a somatic sense control implementing system. With the method and the system, somatic sense control can be implemented on a mobile device.
Description
Technical field
The present invention relates to somatic sense control technology, and more particularly to a method and system for implementing somatic sense control on a mobile device.
Background technology
Somatic sense technology allows a person to interact directly with a nearby device or environment through body movements, without any complex control equipment, giving the user an immersive interaction with the content. With the development of science and technology, somatic sense technology has been applied more and more widely in daily life, for example in somatic sensing games, 3D virtual reality, and air mice. However, current somatic sense devices are essentially all implemented with an infrared sensor combined with a radio-frequency terminal; that is, a somatic sense device comprises a fixed receiving terminal and a hand-held transmitting terminal. Because both a receiving terminal and a transmitting terminal are required, somatic sense technology has so far been unsuitable for mobile devices. A method is therefore needed that allows somatic sense technology to be implemented on a mobile device.
Summary of the invention
In view of the above, it is necessary to provide a somatic sense control method and system capable of implementing somatic sense control on a mobile device.
A somatic sense control method runs on a mobile device and comprises the steps of: (a) acquiring grayscale images of a series of actions of a photographed object; (b) recognizing the movement locus of the photographed object from the acquired grayscale images; and (c) performing an interactive operation on the user interface of the mobile device according to the movement locus.
A somatic sense control implementing system is applied to a mobile device and comprises: an acquisition module for acquiring grayscale images of a series of actions of a photographed object; an identification module for recognizing the movement locus of the photographed object from the acquired grayscale images; and an interactive module for performing an interactive operation on the user interface of the mobile device according to the movement locus.
Compared with the prior art, the somatic sense control method and system provided by the present invention can, once somatic sense input is enabled, capture the user's somatic sense input with a grayscale camera and control the user interface (UI) of the mobile device according to that input. The interaction has a strong virtual-reality feel and is convenient for the user; and because no receiving terminal and transmitting terminal are required to implement the somatic sense technology, space is saved.
Accompanying drawing explanation
Fig. 1 is a hardware architecture diagram of a preferred embodiment of the somatic sense control implementing system of the present invention.
Fig. 2 is a flowchart of a preferred embodiment of the somatic sense control method of the present invention.
Fig. 3, comprising Fig. 3A, Fig. 3B and Fig. 3C, shows a concrete example of controlling the user interface by somatic sense according to the present invention.
Main element symbol description
Mobile device | 100 |
Body sense control realization system | 10 |
Acquisition module | 11 |
Judge module | 12 |
Reminding module | 13 |
Identification module | 14 |
Interactive module | 15 |
Storage unit | 20 |
Image unit | 30 |
Cache memory | 40 |
Digital signal processing appts | 50 |
Display screen | 60 |
File | 70 |
The following embodiments further illustrate the present invention in conjunction with the above drawings.
Embodiment
Referring to Fig. 1, a hardware architecture diagram of a preferred embodiment of the somatic sense control implementing system is shown. The somatic sense control implementing system 10 runs on a mobile device 100, which may be a portable device such as a mobile phone, a tablet computer, a personal digital assistant (PDA), or Google Glass. The mobile device 100 further comprises components such as a storage unit 20, a camera unit 30, a cache memory 40, a digital signal processing (DSP) device 50 and a display screen 60. The components 10-60 of the mobile device communicate over a system bus.
The camera unit 30 captures, at a preset time interval such as 1 second or 0.5 second, grayscale images of a series of actions of a photographed object located within its shooting range. The photographed object is an article, or all or part of a human body, that needs to be tracked, for example a human hand. In this preferred embodiment, the camera unit 30 is a camera whose shooting speed is greater than 20 frames per second, which ensures that the movement locus obtained from the captured grayscale images is continuous. The cache memory 40 may be a high-speed cache such as internal memory, and stores the captured grayscale images. The DSP device 50 performs difference operations on the captured grayscale images and obtains the movement locus of the photographed object from the operation results. The display screen 60 displays the interaction results according to the movement locus.
The somatic sense control implementing system 10 acquires images of a series of actions of the photographed object within the shooting range of the camera unit 30; when the photographed object is an object supported by the mobile device 100, it recognizes the movement locus of the photographed object and sends the locus to the user interface (UI) of the display screen 60, so that an interactive operation can be performed according to the locus. A supported object is a photographed object that the mobile device 100 can recognize, and may be a model of any article, or of any part of a person, that needs to be tracked, for example a model of a human hand. The storage unit 20 may store the supported objects.
The somatic sense control implementing system 10 comprises an acquisition module 11, a judge module 12, a reminding module 13, an identification module 14 and an interactive module 15. The modules 11-15 comprise computerized program instructions.
After somatic sense input of the mobile device 100 is enabled, the acquisition module 11 acquires, through the camera unit 30, grayscale images of a series of actions of the photographed object within the shooting range. In this preferred embodiment, the acquisition module 11 stores each acquired grayscale image in the cache memory 40 at a preset time interval. The preset time interval is set by the user or by the system, for example to a value such as 0.5 second or 1 second.
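The acquisition flow just described, grabbing a frame at a preset interval and keeping it in a bounded cache for the identification module, can be sketched as follows. This is an illustrative stand-in rather than the patented implementation: `grab_frame` is a hypothetical camera callback supplied by the caller, and the interval and capacity values are assumed defaults.

```python
import time
from collections import deque

class FrameCache:
    """Bounded, time-ordered store of recently captured grayscale frames.

    Mimics the role of the cache memory 40 described above: frames
    grabbed at a preset interval are appended, and the oldest frames
    are evicted so the recognizer always works on a recent window.
    """
    def __init__(self, capacity=32):
        self.frames = deque(maxlen=capacity)  # oldest frame drops out first

    def capture(self, grab_frame, interval=0.5, count=8):
        """Grab `count` frames via the caller-supplied `grab_frame`
        callback, sleeping `interval` seconds between grabs."""
        for _ in range(count):
            self.frames.append(grab_frame())
            time.sleep(interval)
```

With a 0.5-second interval and a capacity of 32, for instance, the cache holds at most the last 16 seconds of motion.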
The judge module 12 judges from the acquired grayscale images whether the photographed object is an object supported by the mobile device 100. In this preferred embodiment, the judge module 12 obtains the outline of the photographed object from the acquired grayscale images and compares the outline with the supported objects stored in the storage unit 20. When the supported objects stored in the storage unit 20 include the outline of the photographed object, the judge module 12 judges that the photographed object is an object supported by the mobile device 100; otherwise, it judges that the photographed object is not supported.
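One simple way to realize the outline comparison described above is to overlap a binary silhouette extracted from the grayscale images with each stored model mask and accept any match above a threshold. The intersection-over-union test and the 0.7 threshold are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def is_supported_object(silhouette, templates, min_iou=0.7):
    """Return (matched, name) for a captured object silhouette.

    silhouette: 2-D boolean mask of the photographed object's outline.
    templates:  dict of model name -> boolean mask of the same shape,
                standing in for the supported objects in storage unit 20.
    A template counts as a match when its intersection-over-union with
    the silhouette reaches min_iou.
    """
    for name, tmpl in templates.items():
        inter = np.logical_and(silhouette, tmpl).sum()
        union = np.logical_or(silhouette, tmpl).sum()
        if union and inter / union >= min_iou:
            return True, name
    return False, None
```

On failure the caller would trigger the reminding module's "recognition failed" prompt; on success it would pass the frames on to the identification module.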
When the photographed object is not an object supported by the mobile device 100, the reminding module 13 prompts on the display screen 60 that recognition of the photographed object has failed.
When the photographed object is an object supported by the mobile device 100, the identification module 14 recognizes the movement locus of the photographed object from the acquired grayscale images. In this preferred embodiment, the identification module 14 obtains the grayscale images stored in the cache memory 40 and performs difference operations on them through the DSP device 50 to obtain the movement locus of the photographed object.
It should be noted that, when performing the difference operations, the identification module 14 arranges the obtained grayscale images in chronological order and differences adjacent grayscale images pairwise to obtain the displacement between each adjacent pair. The identification module 14 then combines all the displacements to obtain the movement locus of the photographed object.
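The pairwise differencing just described can be sketched as follows: adjacent frames are differenced in chronological order, the centroid of each changed region approximates the object's position for that pair, and the shifts between successive centroids are combined into the movement locus. This is a minimal sketch of the idea only; the intensity threshold and the centroid-based displacement estimate are assumptions, not details from the patent.

```python
import numpy as np

def movement_locus(frames, threshold=30):
    """Recover per-step displacements from time-ordered grayscale frames.

    frames: list of 2-D uint8 arrays, already in chronological order.
    Each adjacent pair is differenced; pixels whose intensity changed
    by more than `threshold` mark the moving region, and the centroid
    of that region stands for the object's position for the pair.
    Returns the list of (dx, dy) shifts between successive centroids.
    """
    centroids = []
    for a, b in zip(frames, frames[1:]):
        diff = np.abs(b.astype(np.int16) - a.astype(np.int16)) > threshold
        ys, xs = np.nonzero(diff)
        if ys.size:  # skip pairs where nothing moved
            centroids.append((xs.mean(), ys.mean()))
    return [(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(centroids, centroids[1:])]
```

For a hand sweeping left to right, the returned displacements are a run of positive dx values, which the interactive module can apply to the UI.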
The interactive module 15 sends the movement locus to the UI of the display screen 60 and performs an interactive operation on the UI according to the locus. In this preferred embodiment, the interactive module 15 controls the UI according to the movement locus. Referring to Fig. 3: Fig. 3A shows a grayscale image captured by the camera unit 30; the identification module 14 recognizes from the captured grayscale images the movement locus shown in Fig. 3B; and the interactive module 15 controls the UI according to that locus as shown in Fig. 3C. In Fig. 3C, the interactive module 15 moves the file 70 from position A to position B along the movement locus.
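Applying the locus to a UI element, as in the file move of Fig. 3C, then reduces to accumulating the displacements onto the element's screen position. The scale factor mapping camera pixels to screen pixels is a hypothetical parameter, not something the patent specifies.

```python
def apply_locus(position, displacements, scale=1.0):
    """Move a UI element (e.g. file 70) along a recognized locus.

    position:      the element's starting (x, y), e.g. position A in Fig. 3C.
    displacements: (dx, dy) steps produced by the identification module.
    scale:         camera-to-screen scaling (assumed, not from the patent).
    Returns the final (x, y), e.g. position B.
    """
    x, y = position
    for dx, dy in displacements:
        x += dx * scale
        y += dy * scale
    return (x, y)
```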
Referring to Fig. 2, a flowchart of a preferred embodiment of the somatic sense control method is shown.
Step S10: the acquisition module 11 acquires, through the camera unit 30, grayscale images of a series of actions of the photographed object within the shooting range. In this preferred embodiment, the acquisition module 11 stores each acquired grayscale image in the cache memory 40 at a preset time interval, which is set by the user or by the system, for example to 0.5 second or 1 second.
Step S20: the judge module 12 judges from the acquired grayscale images whether the photographed object is an object supported by the mobile device 100. When it is not, step S30 is performed: the reminding module 13 prompts on the display screen 60 that recognition of the photographed object has failed, and the flow ends. When it is, step S40 is performed.
In this preferred embodiment, the judge module 12 obtains the outline of the photographed object from the acquired grayscale images and compares it with the supported objects stored in the storage unit 20. When the supported objects stored in the storage unit 20 include the outline of the photographed object, the judge module 12 judges that the photographed object is an object supported by the mobile device 100; otherwise, it judges that the photographed object is not supported.
Step S40: the identification module 14 recognizes the movement locus of the photographed object from the acquired grayscale images. In this preferred embodiment, the identification module 14 obtains the grayscale images stored in the cache memory 40 and performs difference operations on them through the DSP device 50 to obtain the movement locus of the photographed object.
It should be noted that, when performing the difference operations, the identification module 14 arranges the obtained grayscale images in chronological order and differences adjacent grayscale images pairwise to obtain the displacement between each adjacent pair. The identification module 14 then combines all the displacements to obtain the movement locus of the photographed object.
Step S50: the interactive module 15 sends the movement locus to the UI of the display screen 60 and performs an interactive operation on the UI according to the locus. In this preferred embodiment, the interactive module 15 controls the UI according to the movement locus.
Finally, it should be noted that the above embodiments merely illustrate the technical solutions of the present invention and are not restrictive. Although the present invention has been described in detail with reference to the above preferred embodiments, those of ordinary skill in the art should understand that modifications or equivalent replacements may be made to the technical solutions of the present invention without departing from their spirit and scope.
Claims (10)
1. A somatic sense control method, running on a mobile device, characterized in that the method comprises:
an acquisition step: acquiring grayscale images of a series of actions of a photographed object;
a recognition step: recognizing the movement locus of the photographed object from the acquired grayscale images; and
an interaction step: performing an interactive operation on the user interface of the mobile device according to the movement locus.
2. The somatic sense control method of claim 1, characterized in that, before the recognition step, the method further comprises a judgment step:
obtaining the outline of the photographed object from the acquired grayscale images, and comparing the outline with the supported objects stored in a storage unit of the mobile device; and
judging from the comparison result whether the photographed object is an object supported by the mobile device.
3. The somatic sense control method of claim 1, characterized in that, in the acquisition step, each acquired grayscale image is stored in a cache memory of the mobile device at a preset time interval.
4. The somatic sense control method of claim 3, characterized in that the recognition step comprises:
obtaining the grayscale images stored in the cache memory, and performing difference operations on the obtained grayscale images to obtain the movement locus of the photographed object.
5. The somatic sense control method of claim 4, characterized in that the recognition step further comprises:
arranging the obtained grayscale images in chronological order;
differencing adjacent grayscale images pairwise to obtain the displacement between each adjacent pair; and
combining the displacements to obtain the movement locus of the photographed object.
6. A somatic sense control implementing system, applied to a mobile device, characterized in that the system comprises:
an acquisition module for acquiring grayscale images of a series of actions of a photographed object;
an identification module for recognizing the movement locus of the photographed object from the acquired grayscale images; and
an interactive module for performing an interactive operation on the user interface of the mobile device according to the movement locus.
7. The somatic sense control implementing system of claim 6, characterized in that the system further comprises a judge module for obtaining the outline of the photographed object from the acquired grayscale images, comparing the outline with the supported objects stored in a storage unit of the mobile device, and judging from the comparison result whether the photographed object is an object supported by the mobile device.
8. The somatic sense control implementing system of claim 6, characterized in that the acquisition module stores each acquired grayscale image in a cache memory of the mobile device at a preset time interval.
9. The somatic sense control implementing system of claim 8, characterized in that the identification module recognizes the movement locus of the photographed object by:
obtaining the grayscale images stored in the cache memory, and performing difference operations on the obtained grayscale images to obtain the movement locus of the photographed object.
10. The somatic sense control implementing system of claim 9, characterized in that the identification module further recognizes the movement locus of the photographed object by:
arranging the obtained grayscale images in chronological order;
differencing adjacent grayscale images pairwise to obtain the displacement between each adjacent pair; and
combining the displacements to obtain the movement locus of the photographed object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310245412.7A | 2013-06-20 | 2013-06-20 | Somatic sense control implementing method and somatic sense control implementing system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104238729A (en) | 2014-12-24 |
Family
ID=52226976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310245412.7A (Pending) | Somatic sense control implementing method and somatic sense control implementing system | 2013-06-20 | 2013-06-20 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104238729A (en) |
- 2013-06-20: application CN201310245412.7A filed in China; published as CN104238729A (en), status Pending
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105338117A (en) * | 2015-11-27 | 2016-02-17 | 亮风台(上海)信息科技有限公司 | Method, device and system for generating AR applications and presenting AR instances |
WO2017088777A1 (en) * | 2015-11-27 | 2017-06-01 | 亮风台(上海)信息科技有限公司 | Method, device and system for generating ar application and presenting ar instance |
CN105338117B (en) * | 2015-11-27 | 2018-05-29 | 亮风台(上海)信息科技有限公司 | For generating AR applications and method, equipment and the system of AR examples being presented |
US10885713B2 (en) | 2015-11-27 | 2021-01-05 | Hiscene Information Technology Co., Ltd | Method, apparatus, and system for generating an AR application and rendering an AR instance |
CN106778537A (en) * | 2016-11-28 | 2017-05-31 | 中国科学院心理研究所 | A kind of collection of animal social network structure and analysis system and its method based on image procossing |
CN106970738A (en) * | 2017-02-16 | 2017-07-21 | 全球能源互联网研究院 | A kind of exchange method and device at the pseudo operation interface corresponding with power equipment |
CN109618088A (en) * | 2018-01-05 | 2019-04-12 | 马惠岷 | Intelligent camera system and method with illumination identification and reproduction capability |
Legal Events

Code | Title | Description
---|---|---
C06 | Publication |
PB01 | Publication |
C10 | Entry into substantive examination |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 2014-12-24