Background Art
Human-computer interaction technology has been widely applied in daily life and work, for example in motion-sensing video games and in the control of electrical appliances. Motion-sensing video games in particular are greatly favored by users, because they combine the purposes of fitness and entertainment.
Existing human-computer interaction technology is usually realized by means of a control device. For example, a motion-sensing video game is typically implemented through a computer together with a motion-sensing control device, or through a television set, a set-top box and a motion-sensing control device. A motion-sensing control device, such as a gamepad, is usually held in one or both hands of the user, who performs control operations with it.
In the course of realizing the present utility model, the inventor found that a control device is usually a physical entity, typically composed of elements such as buttons, joysticks, light sources, gravity-acceleration sensors and a small screen. However, present human-computer interaction technology need not be limited to physical devices; for example, touch-screen devices such as tablet computers realize human-computer interaction through finger touches on the screen. The implementations of human-computer interaction still await further enrichment.
In view of the demand left unmet by existing human-computer interaction technology, the inventor, drawing on many years of practical experience and professional knowledge in the design and manufacture of such products, and applying scientific study and active innovation, sought to create a human-computer interaction system of a new structure that satisfies this demand and has greater practicality. After continuous research and design, and after repeated prototyping and improvement, the utility model of real practical value was finally created.
Summary of the invention
The purpose of the utility model is to satisfy the demand left by existing human-computer interaction technology by providing a human-computer interaction system of a new structure. The technical problem to be solved is to diversify the implementations of human-computer interaction technology, making it highly suitable for practical use.
The purpose of the utility model and the solution of its technical problem are achieved by the following technical scheme.
According to the human-computer interaction system proposed by the utility model, the system comprises a camera device and a control device. The camera device transmits images captured in real time to the control device. The control device comprises: a receiving module that receives the images transmitted by the camera device; an action recognition chip, connected to the receiving module, that outputs gesture-action identification information for the operator's gesture actions in the images received by the receiving module; a memory module that stores correspondence information between gesture-action identification information and interface control commands; and a control chip, connected to both the action recognition chip and the memory module, that outputs the interface control command corresponding, in the memory module, to the gesture-action identification information of the operator's gesture action.
The purpose of the utility model and the solution of its technical problem may be further achieved by the following technical measures.
Preferably, in the aforementioned human-computer interaction system, the camera device comprises a 2D camera device or a 3D camera device.
Preferably, in the aforementioned human-computer interaction system, the 2D camera device comprises one or two 2D cameras.
Preferably, in the aforementioned human-computer interaction system, the 3D camera device comprises: an infrared light source, an LED light source or a laser light source; a CMOS image sensor that outputs an infrared-light-coded image, an LED-light-coded image or a laser-coded image; and a scene-depth processing module, connected to the CMOS image sensor, that receives the infrared-light-coded image, LED-light-coded image or laser-coded image output by the CMOS image sensor and outputs a scene-depth image to the control device.
Preferably, in the aforementioned human-computer interaction system, the human-computer interaction system comprises a handheld electronic terminal device.
Preferably, in the aforementioned human-computer interaction system, the handheld electronic terminal device comprises a mobile phone, a notebook computer, a tablet computer or a handheld game console.
By the above technical scheme, the human-computer interaction system of the utility model has at least the following advantages and beneficial effects: the utility model captures images with a camera device (such as a 2D or 3D camera device), uses the action recognition chip to identify the gesture-action identification information of the operator in those images, and enables the control chip to convert the operator's gesture actions into interface control commands, thereby realizing human-computer interaction based on image capture and gesture actions, diversifying the implementations of human-computer interaction, and being highly suitable for practical use.
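The capture-recognize-convert flow described above can be sketched as a minimal illustration in Python. All module names, gesture identifiers and command strings below are hypothetical stand-ins, not part of the utility model; a real action recognition chip would compare successive frames, which is stubbed out here:

```python
# Illustrative sketch of the dataflow: camera -> receiver ->
# action recognition -> command lookup. All names are hypothetical.

# Memory module: correspondence between gesture-action
# identification information and interface control commands.
GESTURE_TO_COMMAND = {
    "WAVE_LEFT_TO_RIGHT": "CURSOR_RIGHT",
    "HAND_CIRCLE_VERTICAL": "ENTER_SUBMENU",
}

def receiver_module(camera_frames):
    """Receives the images transmitted by the camera device."""
    for frame in camera_frames:
        yield frame

def action_recognition_chip(frames):
    """Outputs gesture-action identification information (stub).

    A real chip would compare adjacent frames; for illustration the
    frames are assumed to already carry a recognized label."""
    for frame in frames:
        yield frame["gesture_id"]

def control_chip(gesture_ids, mapping):
    """Looks up and outputs the matching interface control command."""
    for gid in gesture_ids:
        if gid in mapping:
            yield mapping[gid]

frames = [{"gesture_id": "WAVE_LEFT_TO_RIGHT"}]
commands = list(control_chip(action_recognition_chip(receiver_module(frames)),
                             GESTURE_TO_COMMAND))
print(commands)  # ['CURSOR_RIGHT']
```

The generator chain mirrors the wiring of the claimed modules: each stage consumes the previous stage's output, and only the final stage consults the stored correspondence table.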
In summary, the utility model is a distinct technical improvement with evident positive effects, and is truly a novel, progressive and practical new design.
The above description is only an overview of the technical scheme of the utility model. In order that the technical means of the utility model may be understood more clearly, that it may be implemented according to the contents of the specification, and that the above and other purposes, features and advantages of the utility model may be more apparent and comprehensible, preferred embodiments are set out below and described in detail with reference to the accompanying drawings.
Embodiment
To further explain the technical means adopted by the utility model to achieve its intended purpose, and the effects thereof, the embodiments, structures, features and effects of the human-computer interaction system proposed according to the utility model are described in detail below with reference to the accompanying drawings and preferred embodiments.
Fig. 1 shows a human-computer interaction system according to a specific embodiment of the utility model. The human-computer interaction system may be a handheld electronic terminal device such as a mobile phone, a notebook computer, a tablet computer or a handheld game console.
The above human-computer interaction system comprises a camera device 1 and a control device 2. The camera device 1 and the control device 2 may both be integrated and built into the human-computer interaction system. Of course, other arrangements of the system are also possible: for example, the camera device 1 and the control device 2 may be set up separately and independently, with the control device 2 integrated and built into the human-computer interaction system while the camera device 1 exchanges information with the control device 2 through a wired connection (such as USB) or a wireless connection. The control device 2 may specifically comprise: a receiving module 21, an action recognition chip 22, a memory module 23 and a control chip 24.
The camera device 1 is connected to the control device 2, for example to the receiving module 21 in the control device 2. The camera device 1 may be a 2D camera device or a 3D camera device, the latter being, for example, an existing 3D camera.
The camera device 1 is mainly used to capture images in real time and transmit the captured images to the control device 2. For example, a 3D camera device captures scene-depth images in real time and transmits the captured scene-depth images to the control device 2. Real-time capture here means, for example, sampling images at a predetermined sampling frequency. If the camera device 1 and the control device 2 are both built into the human-computer interaction system, they may be connected by a signal line, i.e. the camera device 1 transmits its captured image information to the control device 2 over a wired connection. The utility model does not limit the specific type of the camera device 1 or its connection mode with the control device 2.
The 2D camera device is an ordinary camera as currently in common use, whose price is very low. The 2D camera device may comprise one or two ordinary 2D cameras.
The 3D camera device may comprise an infrared light source, a CMOS image sensor and a scene-depth processing module. The infrared light source may instead be an LED light source or a laser light source.
The infrared light source should meet the Class 1 safety requirements of the IEC 60825-1 standard. An LED light source or laser light source should likewise meet the safety requirements of the corresponding standard.
The CMOS image sensor is mainly used to receive the infrared light emitted by the infrared light source (or the LED light emitted by the LED light source, or the laser emitted by the laser light source), to generate an infrared-light-coded image (or an LED-light-coded image or a laser-coded image) based on the received light, and then to transmit the generated coded image to the scene-depth processing module.
The scene-depth processing module is connected to the CMOS image sensor. The scene-depth processing module may be a PS1080 chip or, of course, a chip of another model with a similar function and effect. The scene-depth processing module is mainly used to process the coded infrared image (or LED-light-coded image or laser-coded image), generate scene-depth images frame by frame, and transmit the generated scene-depth images to the control device 2.
The control device 2 is mainly used to identify, from the images captured by the camera device 1 (such as the scene-depth images captured by the 3D camera device), the interface control command expressed by the operator's gesture action (i.e. to analyze which interface control command the operator's gesture action is intended to express), and to output that interface control command, thereby performing interface control operations on the controlled object without any physical control device. The interface control commands output by the control device 2 may be provided to other modules within the human-computer interaction system, or to other devices connected to the human-computer interaction system.
The receiving module 21 in the control device 2 is connected to the camera device 1 by wire or wirelessly. The receiving module 21 is mainly used to receive, by wired or wireless means, the image sequence transmitted by the camera device 1 (such as the scene-depth image sequence transmitted by the 3D camera device). The receiving module 21 may specifically comprise a USB interface, a signal line, a buffer storage medium, and so on.
The action recognition chip 22 in the control device 2 is connected to both the receiving module 21 and the control chip 24. The action recognition chip 22 is mainly used to compare a series of images received by the receiving module 21 (such as scene-depth images) in order to determine the operator's gesture action in the images, to determine the gesture-action identification information of that gesture action, and to output the determined gesture-action identification information to the control chip 24.
The memory module 23 in the control device 2 is connected to the control chip 24. The memory module 23 may be a RAM, a flash memory, or the like. The memory module 23 stores correspondence information between gesture-action identification information and interface control commands; for example, it may store the correspondence between index numbers of gesture actions and interface control commands.
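The index-number correspondence held in the memory module 23 can be illustrated as a simple lookup table. The index values and command names below are invented for illustration only; the utility model does not prescribe any particular encoding:

```python
# Hypothetical correspondence table as it might be laid out in the
# memory module 23: gesture index numbers mapped to command codes.
CORRESPONDENCE = {
    0x01: "MAIN_MENU_CURSOR_LEFT",
    0x02: "MAIN_MENU_CURSOR_DOWN",
    0x03: "LAUNCH_APPLICATION",
    0x04: "ENTER_NEXT_SUBMENU",
}

def lookup_command(gesture_index):
    """Return the interface control command stored for a gesture index,
    or None when no matching entry exists."""
    return CORRESPONDENCE.get(gesture_index)

print(lookup_command(0x02))  # MAIN_MENU_CURSOR_DOWN
```

Returning `None` for an unrecognized index models the case where the control chip simply outputs no command when no match is found in the stored correspondence.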
The above interface control command may be an interface control command directed at a menu in the human-computer interaction system, for example: a command to move the cursor in the mobile phone's main menu to the left, a command to move the cursor in the mobile phone's main menu down, a command to launch an application program in the mobile phone's main menu, a command to enter the next-level submenu of an option in the mobile phone's main menu, and so on.
The action recognition chip 22 may, starting from the time point at which the receiving module 21 first receives an image (such as a scene-depth image), compare the images received within a predetermined period (such as 3 seconds) with one another (for example, using existing comparison techniques) to determine the operator's gesture action represented by the images received within that period. After the comparison for the period is finished, the action recognition chip 22 may delete some or all of the images already compared, for example deleting the images received in the first second; afterwards, the action recognition chip 22 continues to recognize the images received within the next predetermined period.
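The windowed buffering just described can be sketched as follows. The sampling rate is an assumption (the specification only fixes the 3-second window and the deletion of the first second of compared frames), and the frame objects are dummies:

```python
from collections import deque

SAMPLE_RATE_HZ = 30          # assumed camera sampling frequency
WINDOW_SECONDS = 3           # predetermined comparison period
TRIM_SECONDS = 1             # oldest frames deleted after comparison

WINDOW_FRAMES = SAMPLE_RATE_HZ * WINDOW_SECONDS
TRIM_FRAMES = SAMPLE_RATE_HZ * TRIM_SECONDS

def windows(frame_stream):
    """Yield successive 3-second windows of frames, dropping the
    oldest 1 second of already-compared frames between windows."""
    buf = deque()
    for frame in frame_stream:
        buf.append(frame)
        if len(buf) == WINDOW_FRAMES:
            yield list(buf)
            for _ in range(TRIM_FRAMES):
                buf.popleft()

stream = range(120)          # 4 seconds of dummy frame indices at 30 Hz
first, second = list(windows(stream))[:2]
print(first[0], first[-1], second[0])  # 0 89 30
```

Successive windows overlap by 2 seconds, so a gesture straddling a window boundary is still seen whole in the following comparison pass.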
The action recognition chip 22 may determine the operator's gesture action in the images by judging the overlap of the hand and forearm across adjacent images. For example, by comparing the overlap across a series of adjacent images, the action recognition chip 22 may judge that the operator's hand is moving from left to right, and thus determine that the operator's action is waving from left to right; or it may judge that the operator's hand is moving cyclically in the vertical direction, and thus determine that the operator's action is drawing a vertical circle with the hand. The above examples are all dynamic gesture actions; it should be noted that a gesture action may also be static, such as an OK gesture.
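A greatly simplified sketch of this trajectory-based judgment follows. It classifies a track of hand centroids from adjacent frames; the coordinate thresholds are arbitrary assumptions and the real chip would work on image overlap rather than precomputed centroids:

```python
def classify_gesture(hand_positions):
    """Classify a track of (x, y) hand centroids taken from adjacent
    frames as a left-to-right wave or a vertical circle.
    Thresholds are illustrative assumptions only."""
    xs = [p[0] for p in hand_positions]
    ys = [p[1] for p in hand_positions]
    # Strictly increasing x with little vertical spread: a wave.
    if all(b > a for a, b in zip(xs, xs[1:])) and max(ys) - min(ys) < 10:
        return "WAVE_LEFT_TO_RIGHT"
    # Track returns near its start with large vertical spread: a circle.
    start, end = hand_positions[0], hand_positions[-1]
    if abs(start[0] - end[0]) < 10 and abs(start[1] - end[1]) < 10 \
            and max(ys) - min(ys) > 30:
        return "HAND_CIRCLE_VERTICAL"
    return "UNKNOWN"

wave = [(0, 50), (20, 52), (40, 49), (60, 51)]
circle = [(50, 0), (70, 30), (50, 60), (30, 30), (52, 2)]
print(classify_gesture(wave), classify_gesture(circle))
# WAVE_LEFT_TO_RIGHT HAND_CIRCLE_VERTICAL
```

A static gesture such as the OK sign would instead be recognized from the hand's shape in a single frame rather than from the trajectory, so it falls outside this sketch.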
The correspondence between action information and gesture-action identification information may be stored in advance in the action recognition chip 22. In this way, after determining the action performed by the operator, the action recognition chip 22 can determine, from this stored correspondence, the gesture-action identification information to be output.
The action recognition chip 22 in the present embodiment may be an existing action recognition chip, such as a chip produced by the U.S. company Canesta. The present embodiment does not limit the specific structure of the action recognition chip 22, the specific gesture actions of the operator that it determines, the specific way in which it determines gesture-action identification information, the specific form in which the correspondence is stored in the chip, and so on.
The control chip 24 in the control device 2 is connected to both the action recognition chip 22 and the memory module 23. The control chip 24 is mainly used to search the correspondence information between gesture-action identification information and interface control commands stored in advance in the memory module 23 for the interface control command matching the gesture-action identification information output by the action recognition chip 22, and to output the matched interface control command, for example to other modules in the human-computer interaction system.
The interface control commands output by the control chip 24 (such as those output to the human-computer interaction system) may be: commands to move the cursor in the mobile phone's main menu left/right/up/down; commands to move the cursor in the game console's main menu left/right/up/down; commands to launch an application program in the main menu of the mobile phone/game console/notebook computer; commands to enter the next-level submenu of an option in the main menu of the mobile phone/game console/notebook computer; and so on. The format of the interface control commands output by the control chip 24 to the human-computer interaction system should be a command format supported by the human-computer interaction system, and the control chip 24 may generate the interface control commands using a protocol supported by the human-computer interaction system.
A concrete example of the operation performed by the control chip 24: after receiving from the action recognition chip 22 the gesture-action identification information of a horizontal circle drawn by the hand, the control chip 24, according to the correspondence information stored in the memory module 23, outputs to an executive component in the notebook computer an interface control command to display the next-level submenu of the option at the cursor position in the operating system's main menu. Another concrete example: after receiving from the action recognition chip 22 the gesture-action identification information of a hand waving from left to right, the control chip 24, according to the correspondence information stored in the memory module 23, outputs to an executive component in the notebook computer an interface control command to activate the option at the cursor position in the operating system's main menu. The control chip 24 in the present embodiment may be an FPGA chip or the like; the present embodiment does not limit the specific transmission mode by which the control chip 24 transmits interface control commands to the human-computer interaction system, the specific protocol adopted by the control chip 24, and so on.
The above are only preferred embodiments of the utility model and do not limit the utility model in any form. Although the utility model has been disclosed above by way of preferred embodiments, these are not intended to limit it. Any person skilled in the art may, without departing from the scope of the technical scheme of the utility model, use the technical content disclosed above to make slight changes or modifications resulting in equivalent embodiments of equivalent variation; any simple amendment, equivalent variation or modification made to the above embodiments according to the technical essence of the utility model, provided it does not depart from the content of the technical scheme of the utility model, still falls within the scope of the technical scheme of the utility model.