CN202362731U - Man-machine interaction system - Google Patents

Man-machine interaction system

Info

Publication number
CN202362731U
CN202362731U · Application CN2011204226884U (CN201120422688U)
Authority
CN
China
Prior art keywords
image
human
camera head
computer interaction
gesture motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2011204226884U
Other languages
Chinese (zh)
Inventor
董德福
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunhu Times Technology Co., Ltd.
Original Assignee
BEIJING DEXIN INTERACTIVE NETWORK TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING DEXIN INTERACTIVE NETWORK TECHNOLOGY CO LTD filed Critical BEIJING DEXIN INTERACTIVE NETWORK TECHNOLOGY CO LTD
Priority to CN2011204226884U priority Critical patent/CN202362731U/en
Application granted granted Critical
Publication of CN202362731U publication Critical patent/CN202362731U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The utility model relates to a man-machine interaction system comprising a camera device and a controller. The camera device transmits images captured in real time to the controller. The controller comprises: a receiving module that receives the images transmitted by the camera device; an action recognition chip, connected to the receiving module, that outputs gesture identifier information for the operator's gesture in the received images; a storage module that stores correspondence information between gesture identifiers and interface control commands; and a control chip, connected to both the action recognition chip and the storage module, that outputs the interface control command corresponding, in the storage module, to the gesture identifier of the operator's gesture. The technical scheme of the utility model realizes man-machine interaction based on image capture and gestures, diversifying the ways in which man-machine interaction can be implemented, and is well suited to practical application.

Description

Human-computer interaction system
Technical field
The utility model relates to human-computer interaction technology, and in particular to a human-computer interaction system.
Background art
Human-computer interaction technology is widely used in daily life and work, for example in motion-sensing games and the control of electrical appliances. Motion-sensing games in particular are extremely popular, because they combine exercise with entertainment.
Existing human-computer interaction technology is usually realized through a control device. For example, a motion-sensing game is typically implemented with a computer and a motion-sensing controller, or with a television, a set-top box and a motion-sensing controller. A motion-sensing controller, such as a game pad, is usually held in one or both hands of the user, who performs control operations with it.
In the course of realizing the utility model, the inventor found that a control device is normally a physical entity composed of elements such as buttons, joysticks, light sources, gravity-acceleration sensors and small screens. However, human-computer interaction technology need not be limited to physical control devices; for example, touch-screen devices such as tablet computers realize human-computer interaction through finger touches on the screen. The ways of implementing human-computer interaction therefore await further enrichment.
In view of this demand on existing human-computer interaction technology, the inventor, drawing on many years of practical experience and professional knowledge in designing and manufacturing such products, and combining this with scientific study and active innovation, sought to create a human-computer interaction system of new structure that satisfies this demand and is more practical. After continuous research and design, and repeated study of samples and improvement, the utility model of true practical value described herein was finally created.
Summary of the invention
The purpose of the utility model is to satisfy the demand on existing human-computer interaction technology by providing a human-computer interaction system of new structure. The technical problem to be solved is to diversify the ways in which human-computer interaction can be implemented, making the technology highly practical.
The purpose of the utility model, and the solution of its technical problem, can be realized by the following technical scheme.
The human-computer interaction system proposed by the utility model comprises a camera device and a control device. The camera device transmits images captured in real time to the control device. The control device comprises: a receiving module, which receives the images transmitted by the camera device; an action recognition chip, connected to the receiving module, which outputs gesture identifier information for the operator's gesture in the images received by the receiving module; a storage module, which stores correspondence information between gesture identifiers and interface control commands; and a control chip, connected to the action recognition chip and the storage module respectively, which outputs the interface control command corresponding, in the storage module, to the gesture identifier of the operator's gesture.
The purpose of the utility model, and the solution of its technical problem, can be further realized by the following technical measures.
Preferably, in the aforesaid human-computer interaction system, the camera device comprises a 2D camera device or a 3D camera device.
Preferably, in the aforesaid human-computer interaction system, the 2D camera device comprises one or two 2D cameras.
Preferably, in the aforesaid human-computer interaction system, the 3D camera device comprises: an infrared light source, an LED light source or a laser light source; a CMOS image sensor, which outputs an infrared-light-coded image, an LED-light-coded image or a laser-coded image; and a scene depth processing module, connected to the CMOS image sensor, which receives the coded image output by the CMOS image sensor and outputs a scene depth image to the control device.
Preferably, the aforesaid human-computer interaction system comprises a handheld electronic terminal device.
Preferably, the handheld electronic terminal device comprises a mobile phone, a notebook computer, a tablet computer or a handheld game console.
Through the above technical scheme, the human-computer interaction system of the utility model has at least the following advantages and beneficial effects: by using a camera device (such as a 2D or 3D camera device) to capture images, and using an action recognition chip to identify the gesture identifier information of the operator's gesture in those images, the control chip can convert the operator's gestures into interface control commands. Human-computer interaction based on image capture and gestures is thereby realized, the ways of implementing human-computer interaction are diversified, and the system is highly practical.
In summary, the utility model is a clear technical improvement with obvious positive effects, and is truly a novel, progressive and practical new design.
The above description is only an overview of the technical scheme of the utility model. So that the technical means of the utility model may be understood more clearly and implemented according to the contents of the specification, and so that the above and other purposes, features and advantages of the utility model may be more readily apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Description of drawings
Fig. 1 is a schematic diagram of the human-computer interaction system of the utility model.
Embodiment
To further explain the technical means adopted by the utility model to achieve its intended purpose, and their effects, specific embodiments, structures, characteristics and effects of the human-computer interaction system proposed by the utility model are described in detail below with reference to the accompanying drawing and preferred embodiments.
Fig. 1 shows a human-computer interaction system according to a specific embodiment of the utility model. The system may be a handheld electronic terminal device such as a mobile phone, notebook computer, tablet computer or handheld game console.
The human-computer interaction system comprises a camera device 1 and a control device 2. Camera device 1 and control device 2 may both be integrated and built into the system. Of course, other arrangements are possible: for example, camera device 1 and control device 2 may be set up separately, with control device 2 built into the system while camera device 1 exchanges information with control device 2 over a wired connection (such as USB) or a wireless connection. Control device 2 may specifically comprise: a receiving module 21, an action recognition chip 22, a storage module 23 and a control chip 24.
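The division of labor just described — a camera device feeding a receiving module, an action recognition chip, a storage module and a control chip — can be sketched in a few lines of Python. This is an illustrative model only; the class and method names, and the callback-based recognizer, are assumptions of this sketch and do not appear in the patent.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional


@dataclass
class ControlDevice:
    """Sketch of control device 2 and its four described components."""
    # storage module 23: gesture identifier -> interface control command
    command_table: Dict[str, str]
    # action recognition chip 22: turns the frame sequence into a gesture id
    recognize: Callable[[List[bytes]], Optional[str]]
    # receiving module 21's buffer of frames from camera device 1
    frames: List[bytes] = field(default_factory=list)

    def receive(self, frame: bytes) -> Optional[str]:
        """Accept one frame; return an interface control command once
        the recognizer identifies a gesture, else None."""
        self.frames.append(frame)
        gesture_id = self.recognize(self.frames)  # action recognition chip
        if gesture_id is None:
            return None
        # control chip 24: look up the matching interface control command
        return self.command_table.get(gesture_id)
```

With a toy recognizer that reports a gesture after three frames, `receive` returns `None` twice and then the mapped command, mirroring the flow from receiving module to control chip.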
Camera device 1 is connected to control device 2, for example to receiving module 21 within control device 2. Camera device 1 may be a 2D camera device or a 3D camera device, the latter being, for example, an existing 3D camera.
Camera device 1 is mainly used to capture images in real time and transmit them to control device 2. For example, a 3D camera device captures scene depth images in real time and transmits the captured scene depth images to control device 2. Real-time capture here means, for example, sampling images at a predetermined sampling frequency. If camera device 1 and control device 2 are both built into the system, they can be connected by a signal line, i.e. camera device 1 transmits its captured image information to control device 2 over a wired connection. The utility model does not limit the particular type of camera device 1 or the manner of its connection to control device 2.
A 2D camera device is simply the common camera in use today, and such cameras are very cheap. The 2D camera device may comprise one or two ordinary 2D cameras.
A 3D camera device may comprise an infrared light source, a CMOS image sensor and a scene depth processing module. The infrared light source may instead be an LED light source or a laser light source.
The infrared light source should meet the Class 1 safety requirements of the IEC 60825-1 standard; an LED or laser light source should likewise meet the safety requirements of the corresponding standard.
The CMOS image sensor is mainly used to receive the infrared light emitted by the infrared light source (or the LED light emitted by an LED source, or the laser light emitted by a laser source), to generate from the received light an infrared-light-coded image (or an LED-light-coded or laser-coded image), and then to transmit the generated coded image to the scene depth processing module.
The scene depth processing module is connected to the CMOS image sensor. It may be a PS1080 chip or, of course, a chip of another model with a similar function. The module is mainly used to process the coded infrared (or LED-light or laser) image, generate scene depth images frame by frame, and transmit the generated scene depth images to control device 2.
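As a rough illustration of how a structured-light depth module of this kind recovers depth, the shift (disparity) of the projected light pattern observed by the image sensor can be triangulated against the emitter-sensor baseline. The function and all constants below are illustrative assumptions of this sketch, not values from the PS1080 datasheet.

```python
def depth_from_disparity(disparity_px: float,
                         baseline_m: float = 0.075,
                         focal_px: float = 580.0) -> float:
    """Triangulation sketch: a projected pattern shifted by `disparity_px`
    pixels between emitter and sensor corresponds to a scene depth of
    baseline * focal_length / disparity. Constants are illustrative."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px
```

For example, with the assumed 7.5 cm baseline and 580 px focal length, a 58-pixel pattern shift corresponds to a depth of about 0.75 m; repeating this per pixel yields the frame-by-frame depth image the module is described as producing.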
Control device 2 is mainly used to recognize, from the images captured by camera device 1 (such as the scene depth images captured by a 3D camera device), the interface control command expressed by the operator's gesture (i.e. to analyze which interface control command the operator's gesture is intended to express), and to output that command, so that interface control operations can be performed on the controlled object without any physical control device. The interface control commands output by control device 2 may be provided to other modules in the human-computer interaction system, or to other equipment connected to the system.
Receiving module 21 in control device 2 is connected to camera device 1 by wire or wirelessly. It is mainly used to receive, over that wired or wireless connection, the image sequence transmitted by camera device 1 (such as the scene depth image sequence transmitted by a 3D camera device). Receiving module 21 may specifically comprise a USB interface, signal lines, a buffer medium and the like.
Action recognition chip 22 in control device 2 is connected to both receiving module 21 and control chip 24. It is mainly used to compare the series of images received by receiving module 21 (such as scene depth images) in order to determine the operator's gesture in the images, to determine the gesture identifier information for that gesture, and to output the determined gesture identifier information to control chip 24.
Storage module 23 in control device 2 is connected to control chip 24. Storage module 23 may be a memory or flash storage, and stores correspondence information between gesture identifiers and interface control commands; for example, it may store the correspondence between gesture index numbers and interface control commands.
The interface control commands here may be commands for controlling a menu in the human-computer interaction system: for example, a command to move the cursor in a mobile phone's main menu to the left, a command to move that cursor down, a command to launch an application from the main menu, or a command to enter the next-level submenu of an option in the main menu.
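The correspondence information held by the storage module amounts to a lookup table from gesture identifiers to interface control commands. A minimal sketch, in which every gesture and command name is invented for illustration and not taken from the patent:

```python
from typing import Dict, Optional

# Hypothetical gesture-identifier -> interface-control-command table,
# as the storage module might hold it (all names are illustrative).
GESTURE_COMMANDS: Dict[str, str] = {
    "wave_left_to_right": "MENU_CURSOR_RIGHT",
    "wave_right_to_left": "MENU_CURSOR_LEFT",
    "hand_circle_vertical": "ENTER_SUBMENU",
    "ok_sign": "CONFIRM_SELECTION",
}


def command_for(gesture_id: str) -> Optional[str]:
    """Control-chip side of the lookup: return the matching interface
    control command, or None for an unknown gesture identifier."""
    return GESTURE_COMMANDS.get(gesture_id)
```

Keeping the table in a separate store, as the patent does, means new gesture-to-command bindings can be added without touching the recognition logic.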
Action recognition chip 22 may, starting from the moment it first receives an image (such as a scene depth image) from receiving module 21, compare all the images received within a predetermined period (such as 3 seconds) — using existing comparison techniques, for example — to determine the operator's gesture represented by the images received in that period. After the comparison for a period is finished, the chip may delete some or all of the compared images, for example the images received in the first second, and then continue to recognize the images received in each subsequent period.
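The buffering behavior described above — compare every image received within a predetermined period, then delete the oldest portion and continue — is a sliding window over the frame stream. A sketch, with the frame rate, window length and drop length all being illustrative assumptions:

```python
from collections import deque


class FrameWindow:
    """Sliding window of recently received frames, as the recognition
    chip is described to keep: compare everything inside a fixed period,
    then discard the oldest frames and continue. Sizes are illustrative."""

    def __init__(self, fps: int = 30, window_seconds: int = 3,
                 drop_seconds: int = 1):
        self.capacity = fps * window_seconds  # e.g. 3 s of frames
        self.drop = fps * drop_seconds        # e.g. discard oldest 1 s
        self.frames: deque = deque()

    def push(self, frame) -> bool:
        """Add a frame; return True when the window holds a full period
        and is ready for a comparison pass."""
        self.frames.append(frame)
        return len(self.frames) >= self.capacity

    def advance(self) -> None:
        """After a comparison pass, delete the oldest second of frames."""
        for _ in range(min(self.drop, len(self.frames))):
            self.frames.popleft()
```

This keeps memory bounded while letting successive comparison passes overlap, so a gesture spanning two passes is not lost at the boundary.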
Action recognition chip 22 can determine the operator's gesture from the registration (overlap) of the hand and forearm in adjacent images. For example, by comparing the registration of a series of adjacent images, the chip may judge that the operator's hand moves from left to right and so determine that the operator is waving from left to right; or it may judge that the hand is moving cyclically in the vertical direction and so determine that the operator's hand is drawing a vertical circle. The above examples are all dynamic gestures; it should be noted that gestures may also be static, such as an OK sign.
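The adjacent-frame registration comparison can be approximated by tracking the hand's centroid across frames: dominant horizontal travel indicates a left-to-right or right-to-left wave, while a large vertical span with little net displacement suggests a vertical circle. A simplified sketch operating on (x, y) centroids of the segmented hand region, with all thresholds chosen arbitrarily for illustration:

```python
from typing import List, Optional, Tuple


def classify_gesture(centroids: List[Tuple[float, float]],
                     min_travel: float = 50.0) -> Optional[str]:
    """Classify a track of hand centroids into the two motions the
    embodiment describes: a horizontal wave or a vertical circle.
    Thresholds and labels are illustrative assumptions."""
    if len(centroids) < 2:
        return None
    xs = [x for x, _ in centroids]
    ys = [y for _, y in centroids]
    dx = xs[-1] - xs[0]            # net horizontal displacement
    dy_span = max(ys) - min(ys)    # total vertical extent

    # dominant horizontal travel -> a wave, with direction from dx's sign
    if abs(dx) >= min_travel and abs(dx) > dy_span:
        return "wave_left_to_right" if dx > 0 else "wave_right_to_left"

    # large vertical span but little net movement -> a closed vertical loop
    if (dy_span >= min_travel and abs(dx) < min_travel / 2
            and abs(ys[-1] - ys[0]) < min_travel / 2):
        return "hand_circle_vertical"
    return None
```

A static gesture such as the OK sign would instead be recognized per frame from hand shape, which this motion-based sketch does not attempt.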
Correspondence information between motions and gesture identifiers may be stored in action recognition chip 22 in advance; in this way, after determining the motion the operator performed, the chip can determine from this stored correspondence which gesture identifier to output.
Action recognition chip 22 in this embodiment may be an existing action recognition chip, such as one produced by the US company Canesta. This embodiment does not limit the specific structure of action recognition chip 22, the specific operator gestures it determines, the specific way it determines gesture identifiers, or the specific form of the correspondence information stored in it.
Control chip 24 in control device 2 is connected to both action recognition chip 22 and storage module 23. It is mainly used to search the correspondence information between gesture identifiers and interface control commands stored in advance in storage module 23 for the interface control command matching the gesture identifier output by action recognition chip 22, and to output that matching command, for example to other modules in the human-computer interaction system.
The interface control commands output by control chip 24 (such as those output to the human-computer interaction system) may be commands to move the cursor in a mobile phone's or game console's main menu left, right, up or down; commands to launch an application from the main menu of a mobile phone, game console or notebook computer; commands to enter the next-level submenu of an option in such a main menu; and so on. The format of the interface control commands that control chip 24 outputs to the system should be a command format the system supports, and the chip may generate the commands using a protocol the system supports.
A concrete example of the operation performed by control chip 24: after receiving from action recognition chip 22 the gesture identifier for a horizontal hand-circle motion, control chip 24, according to the correspondence information stored in storage module 23, outputs to the executive component in a notebook computer an interface control command to display the next-level submenu of the option at the cursor in the operating system's main menu. Another concrete example: after receiving from action recognition chip 22 the gesture identifier for a left-to-right hand wave, control chip 24, according to the correspondence information stored in storage module 23, outputs to the executive component in a notebook computer an interface control command to activate the option at the cursor in the operating system's main menu. Control chip 24 in this embodiment may be an FPGA chip or the like; this embodiment does not limit the specific way it transmits interface control commands to the system, the specific protocol it uses for the commands, and so on.
The above are only preferred embodiments of the utility model and do not restrict it in any form. Although the utility model is disclosed above by way of preferred embodiments, they are not intended to limit it. Any person skilled in the art may, without departing from the scope of the technical scheme of the utility model, use the technical content disclosed above to make minor changes or modifications amounting to equivalent embodiments; any simple amendment, equivalent change or modification made to the above embodiments according to the technical essence of the utility model still falls within the scope of the technical scheme of the utility model.

Claims (6)

1. A human-computer interaction system, characterized in that the system comprises: a camera device and a control device;
the camera device transmits images captured in real time to the control device;
the control device comprises:
a receiving module, which receives the images transmitted by the camera device;
an action recognition chip, connected to the receiving module, which outputs gesture identifier information for the operator's gesture in the images received by the receiving module;
a storage module, which stores correspondence information between gesture identifiers and interface control commands;
a control chip, connected to the action recognition chip and the storage module respectively, which outputs the interface control command corresponding, in the storage module, to the gesture identifier of the operator's gesture.
2. The human-computer interaction system as claimed in claim 1, characterized in that the camera device comprises: a 2D camera device or a 3D camera device.
3. The human-computer interaction system as claimed in claim 2, characterized in that the 2D camera device comprises: one or two 2D cameras.
4. The human-computer interaction system as claimed in claim 2, characterized in that the 3D camera device comprises:
an infrared light source, an LED light source or a laser light source;
a CMOS image sensor, which outputs an infrared-light-coded image, an LED-light-coded image or a laser-coded image;
a scene depth processing module, connected to the CMOS image sensor, which receives the infrared-light-coded, LED-light-coded or laser-coded image output by the CMOS image sensor and outputs a scene depth image to the control device.
5. The human-computer interaction system as claimed in any one of claims 1 to 4, characterized in that the system comprises: a handheld electronic terminal device.
6. The human-computer interaction system as claimed in claim 5, characterized in that the handheld electronic terminal device comprises: a mobile phone, a notebook computer, a tablet computer or a handheld game console.
CN2011204226884U 2011-10-31 2011-10-31 Man-machine interaction system Expired - Fee Related CN202362731U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011204226884U CN202362731U (en) 2011-10-31 2011-10-31 Man-machine interaction system


Publications (1)

Publication Number Publication Date
CN202362731U (en) 2012-08-01

Family

ID=46573903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011204226884U Expired - Fee Related CN202362731U (en) 2011-10-31 2011-10-31 Man-machine interaction system

Country Status (1)

Country Link
CN (1) CN202362731U (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102799191A (en) * 2012-08-07 2012-11-28 北京国铁华晨通信信息技术有限公司 Method and system for controlling pan/tilt/zoom based on motion recognition technology
CN102799191B (en) * 2012-08-07 2016-07-13 通号通信信息集团有限公司 Pan-tilt control method and system based on motion recognition technology
CN103916660A (en) * 2013-01-07 2014-07-09 义明科技股份有限公司 3D image sensing device and 3D image sensing method
CN103916660B (en) * 2013-01-07 2016-05-04 义明科技股份有限公司 3D image sensing device and 3D image sensing method
CN103412649A (en) * 2013-08-20 2013-11-27 苏州跨界软件科技有限公司 Control system based on non-contact type hand motion capture
CN103744359A (en) * 2013-10-30 2014-04-23 杭州古北电子科技有限公司 Method and device for controlling electric appliance through motion sensor
CN106980376A (en) * 2017-03-29 2017-07-25 广州新节奏智能科技股份有限公司 Intelligent laser detection system
WO2018176243A1 (en) * 2017-03-29 2018-10-04 广州新节奏智能科技股份有限公司 Intelligent laser detection system
CN110277042A (en) * 2019-06-17 2019-09-24 深圳市福瑞达显示技术有限公司 Real-time rotating display system and method for human-computer interaction

Similar Documents

Publication Publication Date Title
CN202362731U (en) Man-machine interaction system
CN103577108B (en) Video file transfer method and video file transfer system
CN102769802A (en) Man-machine interactive system and man-machine interactive method of smart television
CN102902476B Method and system for controlling electronic equipment by a touch terminal
CN102184014A (en) Intelligent appliance interaction control method and device based on mobile equipment orientation
WO2013167057A2 (en) Television interface focus control method, apparatus and system
CN202433831U (en) Man-machine interaction system
CN102902406B Touch terminal and system for controlling electronic equipment thereof
EP3859489A1 (en) Gesture-based manipulation method and terminal device
CN102637127A (en) Method for controlling mouse modules and electronic device
CN102902356B Gesture control system and control method thereof
CN102955565A (en) Man-machine interaction system and method
CN104661066A (en) Multi-point floating touch remote control device and remote control method thereof
CN102868925A (en) Intelligent TV (television) control method
CN102685581B (en) Multi-hand control system for intelligent television
CN101916141A (en) Interactive input device and method based on space orientation technique
CN202460088U Holographic projection somatosensory interactive system
CN202433830U (en) Human-machine interactive system
CN103914305A (en) Method and system for freely controlling applications on mobile terminal
CN201765582U (en) Controller of projection type virtual touch menu
CN103024502B (en) In intelligent television, realize the system and method for handwriting input word
CN107291358A (en) Content display control method, electronic equipment, and videoconference client
CN203537531U (en) Full-function remote controller
CN202257441U (en) Man-machine interaction system
CN107102754A (en) Terminal control method and device, storage medium

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20160425

Address after: Zone B, Floor 2, Building 3, No. 5 Rongchang Street, Beijing Economic and Technological Development Zone, Beijing 100176

Patentee after: Beijing Yunhu Times Technology Co., Ltd.

Address before: Block C, Longsheng Building, No. 5 Rongchang Street, Yizhuang Economic Development Zone, Daxing District, Beijing 100176

Patentee before: Beijing Dexin Interactive Network Technology Co.,Ltd.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120801

Termination date: 20191031