CN101807111A - Information apparatus, control method and system thereof - Google Patents

Information apparatus, control method and system thereof

Info

Publication number
CN101807111A
CN101807111A
Authority
CN
China
Prior art keywords
special object
movement locus
trajectory
information equipment
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200910007425A
Other languages
Chinese (zh)
Inventor
孔晓东
李季檩
吴亚栋
吴波
陈芒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Priority to CN200910007425A priority Critical patent/CN101807111A/en
Publication of CN101807111A publication Critical patent/CN101807111A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an information apparatus, a control method and a system thereof based on object detection and tracking. The method comprises the steps of: automatically analyzing live video captured by a camera; detecting a special object in the video; and tracking the object and recording its motion trajectory. Based on the obtained trajectory, a computer can determine whether the user has issued a signal and which signal was issued; the determined signal is then converted into a command that the target system can execute. Thus, the user can control the system in a non-contact manner by waving the special object.

Description

Information apparatus, control method and system thereof
Technical field
The present invention relates to non-contact control, and more particularly to an information apparatus and a control method and system thereof that allow a user to control the information apparatus in a non-contact manner by waving a specific object.
Background technology
In the field of system control, intelligent non-contact control is one of the most promising directions. Among the various non-contact control methods, those based on visual information are particularly important, because visual information gives a machine a way of perceiving the world similar to that of a human.
In addition, thanks to the rapid development of manufacturing technology, camera devices have become cheaper and cheaper while their performance keeps improving. Cameras are now a standard accessory of many information apparatuses, from mobile phones to notebook computers and from automatic teller machines to bulletin boards. All of this provides a solid foundation for applications based on visual information. At present, however, cameras often play only a simple role; in an automatic teller machine, for example, the camera is used merely to record visual information. There is therefore a need to develop more methods based on visual information so as to expand the range of applications of electronic devices equipped with cameras.
Patent document 1 (CN1534544) proposes a non-contact control method for a large-sized display. It is a method of controlling a computer with a laser pointer, and Figure 10 shows the system to which this prior-art method is applied. The system comprises a projection screen, a laser pointer, a projector, a video camera and a computer.
The laser pointer has a wireless transmitting module, and the computer has a wireless receiving module and a video capture card. When the system works, the computer first outputs video to the projector, and the projector projects the video onto the screen. The video camera then captures images of the projection screen and sends them to the computer through the capture card. If the images contain the laser spot, the computer can obtain the position of the spot by image processing and reposition the cursor of the operating system interface accordingly. In addition, the laser pointer can send extra control commands through its wireless transmitting module; the computer receives these commands through its wireless receiving module and converts them into mouse click operations. In this way the user can control the computer with the laser pointer as if it were a mouse.
As shown in Figure 11, at step S11 the user shines the laser spot onto the screen with the laser pointer, while at step S21 the wireless transmitting module sends an extra control instruction.
At step S12, the video camera captures the image on the screen and transmits it to the computer. At step S13, the computer processes the image to determine the position of the laser spot on the screen. Then, at step S14, the computer calculates the position of the cursor from the position of the laser spot.
After the wireless transmitting module of the laser pointer sends an additional command such as 'click', the wireless receiving module of the computer receives this extra instruction at step S22. Then, at step S23, the computer converts the received command into an executable 'click' command, thereby performing a click operation at the position indicated by the laser pointer. In this way, the additional commands can be used to emulate the control that pointing devices such as a mouse or a trackball provide to the computer.
In this prior art, however, the image of the projection screen is essential for the computer to locate the cursor, so the projection screen image must be known and complete. If the user stands close to the projection screen, his shadow is superimposed on it, which inevitably causes an occlusion problem and a loss of part of the projection screen image. In that case the computer cannot work correctly.
Furthermore, because the cursor position is determined from the position of the laser spot, the detection of the laser spot must be highly reliable and unambiguous. Interference in the environment, such as glare on the projection screen, may lead to incorrect detection results, which greatly reduces the reliability of the system.
Summary of the invention
The object of the present invention is to provide an information apparatus and a control method and system thereof that allow a user to control the information apparatus in a non-contact manner by waving a specific object.
In a first aspect of the present invention, a method of controlling a contactless system is proposed. The contactless system comprises an information apparatus and a camera that faces a predefined special object and captures live video of the special object. The method comprises: an object detection step of detecting the special object in the live video captured by the camera and outputting the position information of the special object; an object tracking step of generating the motion trajectory of the special object based on its position information; a trajectory analysis step of analyzing the motion trajectory of the special object to detect a specific trajectory pattern and outputting a signal representing that trajectory pattern; and a signal conversion step of converting the signal into a command that the target system can execute.
In a second aspect of the present invention, an information apparatus is proposed that has a camera and is applied in a contactless system, the camera facing a predefined special object and capturing live video of the special object. The information apparatus comprises: an object detection unit that detects the special object in the live video captured by the camera and outputs the position information of the special object; an object tracking unit that generates the motion trajectory of the special object based on the position information of the special object and on time information; a trajectory analysis unit that analyzes the motion trajectory of the special object to detect a predefined trajectory pattern and outputs a signal representing that trajectory pattern; and a signal conversion unit that converts the signal into a command that the target system can execute.
In a third aspect of the present invention, a contactless system comprising the above information apparatus is proposed.
With the above structure and method of the present invention, robust and reliable control of the target system can be achieved in a non-contact manner.
Description of drawings
The above features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic diagram of a contactless control system according to an embodiment of the invention;
Fig. 2 is a schematic block diagram of the contactless control system according to the embodiment of the invention;
Fig. 3 is a flowchart describing the process of the control method according to the embodiment of the invention;
Fig. 4 is a flowchart describing the special-object detection process;
Fig. 5 is a flowchart describing the special-object tracking process;
Fig. 6 is a flowchart describing the process of detecting the trajectory of the special object;
Fig. 7 is a flowchart describing the signal conversion process;
Figs. 8A, 8B and 8C are schematic diagrams of an example of the control method according to the embodiment of the invention;
Fig. 9 is a schematic diagram of a practical application of the method according to the embodiment of the invention;
Fig. 10 is a schematic diagram of a non-contact control method according to the prior art; and
Fig. 11 is a flowchart of the non-contact control method according to the prior art.
Embodiment
Preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings. In the drawings, the same reference numerals denote identical or similar components even when they appear in different figures. For clarity and conciseness, detailed descriptions of well-known functions and structures are omitted so as not to obscure the subject matter of the invention.
Fig. 1 is a schematic diagram of a contactless control system according to an embodiment of the invention. As shown in Fig. 1, the contactless control system according to the embodiment comprises an information apparatus 100 such as a computer, a camera 110 arranged to face the special object, a display screen 120 and a special object 130 such as a pen. According to another embodiment of the invention, the camera 110 may be integrated into the information apparatus 100.
The special object 130, for example a pen or a hand, can be chosen by the user 150 and defined by the user 150 himself. When the system works, the camera 110 captures live video of this object and inputs it into the information apparatus 100.
The information apparatus 100 can detect and track this object. According to the motion trajectory of the object, the position of the cursor displayed on the screen of the information apparatus 100 is updated. The information apparatus 100 then compares the trajectory with preset patterns, so that various controls such as a left click or a right click can be realized. In this way, the user 150 can use the special object to control a target system connected to the information apparatus, or a target system inside the information apparatus.
Fig. 2 is a schematic block diagram of the contactless control system according to the embodiment of the invention. As shown in Fig. 2, while the special object is being waved by the user 150, the camera 110 captures live video and inputs it into the information apparatus 100. Based on this live video, the information apparatus 100 generates control commands corresponding to the waving motion and sends them to the target system 140 to control its operation. As mentioned above, the target system 140 here may also be a part of the information apparatus.
The storage unit 105 of the information apparatus stores an image of the predefined special object, for example a pen, a hand or a human face, and at least one predefined trajectory pattern, for example 'V', 'Λ' and 'o', which correspond respectively to signals such as 'left click', 'right click' and 'double click'.
As shown in Fig. 2, the object detection unit 101 provided in the information apparatus 100 receives the live video from the camera and matches each frame of the live video against the image or model of the predefined special object. If a frame matches, the special object is considered to be present in the live video.
The information apparatus 100 is further provided with an object tracking unit 102, which generates the motion trajectory curve of the special object from the detection results of the object detection unit 101. The trajectory analysis unit 103 compares the recorded motion trajectory with the trajectory patterns stored in advance in the storage unit 105. When a trajectory pattern matches, the signal conversion unit 104 takes the command corresponding to the matched trajectory pattern as the command represented by the recorded trajectory. Finally, the signal conversion unit 104 converts the signal into a command suitable for execution by the target system 140.
As shown in Fig. 2, the information apparatus 100 is also provided with a definition unit 106 that allows the user 150 to define his own special objects and trajectory patterns. When the user 150 wants to define his own object to wave, an image of the object is taken by the camera 110 and stored in the storage unit 105 as the template of the special object. The user 150 can also input specific trajectory patterns, for example 'm', to define his own trajectory patterns.
The specific operation of the control method and of each unit of the information apparatus according to the embodiment of the invention is described below with reference to the flowcharts. Fig. 3 is a flowchart describing the process of the control method according to the embodiment of the invention.
At step S31, the user 150 waves a hand-held special object such as a pen in front of the camera in order to control the mouse actions on the screen of the information apparatus 100. At step S32, the camera 110 of the information apparatus 100, for example a computer, captures live video of the special object and inputs it into the object detection unit 101 of the information apparatus.
Next, at step S33, the object detection unit 101 receives the live video from the camera and matches each frame of the live video against the image of the predefined special object. If a frame matches, the special object is considered to be present in the live video.
At step S34, the object tracking unit 102 generates the motion trajectory curve of the special object from the detection results of the object detection unit 101. The trajectory analysis unit 103 compares the recorded motion trajectory with the trajectory patterns stored in advance in the storage unit 105. When a trajectory pattern matches, at step S35 the signal conversion unit 104 takes the command corresponding to the matched trajectory pattern as the command represented by the recorded trajectory. Finally, at step S36, the signal conversion unit 104 converts the signal into a command suitable for execution by the target system 140.
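For illustration only, the overall flow of Fig. 3 (detect, track, analyze, convert) can be sketched in Python as below. This is a minimal sketch assuming OpenCV for video capture; the class name ContactlessController, the four callables standing in for units 101 to 104, and the camera index are illustrative assumptions and not part of the patent.

```python
import cv2

# Minimal sketch of the detect -> track -> analyze -> convert flow of Fig. 3.
# The four callables stand in for units 101-104; their names are assumptions.
class ContactlessController:
    def __init__(self, detector, tracker_factory, analyzer, converter):
        self.detector = detector                # object detection unit (101)
        self.tracker_factory = tracker_factory  # object tracking unit (102)
        self.analyzer = analyzer                # trajectory analysis unit (103)
        self.converter = converter              # signal conversion unit (104)
        self.trajectory = []                    # recorded motion trajectory

    def run(self, camera_index=0):
        cap = cv2.VideoCapture(camera_index)    # live video from the camera (110)
        tracker = None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if tracker is None:
                bbox = self.detector(frame)     # step S33: locate the special object
                if bbox is not None:
                    tracker = self.tracker_factory(frame, bbox)
            else:
                self.trajectory.append(tracker(frame))   # step S34: extend trajectory
                signal = self.analyzer(self.trajectory)  # match against stored patterns
                if signal is not None:
                    self.converter(signal)               # steps S35/S36: issue command
                    self.trajectory.clear()
        cap.release()
```

The concrete detector, tracker, analyzer and converter functions sketched in the following sections can be plugged into this skeleton.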
Fig. 4 is a flowchart describing the special-object detection process. As shown in Fig. 4, at step S41 the object detection unit 101 uses a technique such as template matching or texture matching to detect the special object in the captured live video according to the special-object template stored in the storage unit 105, and at step S42 it judges whether the special object is present. If not, the flow returns to step S41 and detection continues; otherwise, at step S43, the object detection unit outputs the position of the special object. The position of the object can be used to position the cursor on the screen and to generate the motion trajectory of the special object.
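As one possible realization of steps S41 to S43, template matching can be done with OpenCV as sketched below. The matching-score threshold of 0.8 and the function name are assumptions for illustration; the patent leaves the concrete matching technique open.

```python
import cv2

def detect_special_object(frame, template, threshold=0.8):
    """Return (x, y, w, h) of the best template match in the frame,
    or None when no match exceeds the score threshold (step S42)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    tmpl = cv2.cvtColor(template, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(gray, tmpl, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    if max_score < threshold:
        return None                       # object not found, keep detecting (S41)
    x, y = max_loc
    h, w = tmpl.shape[:2]
    return (x, y, w, h)                   # position output at step S43
```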
As mentioned above, the special object may need to be predefined before object detection is performed. The object can be very simple, for example a red dot, or very complex, for example a human face or a hand. This special object is then used in the subsequent detection.
Fig. 5 is a flowchart describing the special-object tracking process. As shown in Fig. 5, at step S51, before tracking, the information about the special object is obtained. Then, at step S52, the object tracking unit creates a tracker for the special object using, for example, the mean-shift tracking method, and initializes the tracker with the colour information of the special object. At step S53, once the object tracking unit starts working, it continuously updates with new video frames. For each new frame, the tracker searches for the region most similar to the object, i.e. the region whose information matches the information of the special object. According to embodiments of the invention, the object tracking unit may also use other tracking methods, for example the conditional probability density propagation method.
At step S54, once stable tracking is achieved, the object tracking unit 102 obtains the position information and the region of the object in every frame. From the detected position information and region of the special object, the motion trajectory of the object can then be created in time order.
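A colour-histogram mean-shift tracker of the kind mentioned at step S52 could be initialized and updated with OpenCV roughly as follows. This is a sketch under the assumption that the object region from the detection step is given as an (x, y, w, h) box; the hue-histogram choice is the usual OpenCV recipe rather than anything prescribed by the patent.

```python
import cv2

def make_tracker(first_frame, bbox):
    """Create a mean-shift tracker initialized with the colour information
    of the special object; returns a function mapping a frame to the
    object's centre point for the trajectory (step S54)."""
    x, y, w, h = bbox
    roi = first_frame[y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])   # hue histogram
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    window = (x, y, w, h)

    def track(frame):
        nonlocal window
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        _, window = cv2.meanShift(back_proj, window, term)   # step S53: update window
        wx, wy, ww, wh = window
        return (wx + ww // 2, wy + wh // 2)                  # centre point per frame

    return track
```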
Fig. 6 is a flowchart describing the process of detecting the trajectory of the special object. Before the trajectory of the special object is analyzed, the trajectory patterns are first defined. A trajectory pattern is used to detect a specific shape of the trajectory.
When the trajectory analysis unit 103 starts working, it obtains the detected trajectory from the object tracking unit 102. Then, at step S62, it analyzes this trajectory against all predefined trajectory patterns stored in the storage unit 105. At step S63, when a specific trajectory pattern is detected, the trajectory analysis unit 103 outputs the signal corresponding to that pattern. In this way, the user 150 can send signals to the target system with his own actions.
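The comparison at step S62 can be realized in many ways. One simple sketch, assuming each stored pattern ('V', 'Λ', 'o', ...) is kept as a list of 2-D points, resamples and normalizes both trajectories and compares their mean point-to-point distance; the resampling length and the distance threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np

def _normalize(points, n=32):
    """Resample a point sequence to n points, centre it and scale it to a unit box."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return None
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    if d[-1] == 0:
        return None
    t = np.linspace(0.0, d[-1], n)
    pts = np.column_stack([np.interp(t, d, pts[:, 0]), np.interp(t, d, pts[:, 1])])
    pts -= pts.mean(axis=0)
    return pts / (float(np.max(np.abs(pts))) or 1.0)

def match_trajectory(trajectory, patterns, max_dist=0.25):
    """Return the name of the best-matching predefined pattern, or None (step S63)."""
    traj = _normalize(trajectory)
    if traj is None:
        return None
    best_name, best_dist = None, float("inf")
    for name, template_points in patterns.items():
        tmpl = _normalize(template_points)
        if tmpl is None:
            continue
        dist = float(np.mean(np.linalg.norm(traj - tmpl, axis=1)))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_dist else None
```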
Fig. 7 is a flowchart describing the signal conversion process. The signals output from the trajectory analysis unit 103 correspond to the actions of the user 150, but they cannot yet be executed by the target system 140, because the target system 140 cannot understand these actions. Therefore, in the signal conversion unit 104, the signal corresponding to the analyzed trajectory is obtained at step S71 and converted into a suitable command at step S72. After that, at step S73, the signal conversion unit 104 outputs the command.
Figs. 8A, 8B and 8C are schematic diagrams of an example of the control method according to the embodiment of the invention. In this example, a red dot is defined as the special object and is used in the object detection and object tracking processes. In addition, three different trajectory patterns are defined: the 'V' pattern, the 'Λ' pattern and the 'o' pattern.
Fig. 9 is a schematic diagram of a practical application of the method according to the embodiment of the invention. This example shows how a video file can be opened with the red dot. To realize this function, the 'V' pattern is first associated with a left mouse click, the 'Λ' pattern with a right mouse click, and the 'o' pattern with a mouse double click. When the user 150 moves the red dot along one of the predefined patterns, the command corresponding to that pattern is sent to the information apparatus 100 to control it. Thus, when the user 150 writes the 'o' pattern, a double-click command is sent to the operating system of the information apparatus 100, which opens the video file.
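The mapping described in this example could be expressed as a small lookup table, sketched below. The handler bodies are placeholders (they only print); an actual system would invoke the operating system's input interface to issue the click, which the patent does not tie to any particular API.

```python
# Hypothetical pattern-to-command table for the Fig. 9 example; names are illustrative.
def left_click():
    print("left click at the current cursor position")

def right_click():
    print("right click at the current cursor position")

def double_click():
    print("double click, e.g. to open the video file under the cursor")

COMMAND_TABLE = {
    "V": left_click,        # 'V' pattern  -> left mouse click
    "Lambda": right_click,  # 'Λ' pattern  -> right mouse click
    "o": double_click,      # 'o' pattern  -> mouse double click
}

def convert_signal(signal):
    """Steps S71-S73: turn a recognized pattern signal into an executable command."""
    handler = COMMAND_TABLE.get(signal)
    if handler is not None:
        handler()
```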
As described above, according to embodiments of the invention, when the special object moves in front of the camera, its live video is captured by the camera. The information apparatus then obtains the video and analyzes it to detect whether a specific object is present. Next, the detected object is tracked and its motion trajectory is recorded. By comparing the recorded motion trajectory with the preset trajectory patterns, the information apparatus can determine whether the user has sent a signal and which signal has been sent. Finally, the determined signal is converted into an instruction suitable for execution by the target system. The user can therefore carry out the control process without contact.
As described above, according to embodiments of the invention, the camera (with or without a screen) faces the special object, so the special object can be captured directly by the camera and the object can be detected without capturing a displayed screen image, which solves the occlusion problem.
In addition, embodiments of the invention propose using object tracking techniques in place of object detection techniques alone, thereby realizing reliable system control. According to the method of the embodiments, the user uses different trajectory patterns to send different commands to the target system and thereby perform different operations. Because these trajectory patterns are unique in both space and time, they can guarantee the reliability and accuracy of the system control.
In addition, in embodiments of the invention, the special object can be defined by the user. This allows any very simple object, such as a red dot, to be used as the special object, and at the same time allows the user to use a very complex object, such as a human face, as the special object.
In addition, in the embodiments of the invention, an ordinary PC camera can be used to capture images, so no additional imaging device or capture card is needed.
As described above, the apparatus and method of the present invention can be applied to camera-equipped information apparatuses, for example desktop PCs, laptop PCs, mobile phones, PDAs, electronic whiteboards, remote controls, monitoring devices and so on.
The above description is only intended to illustrate embodiments of the present invention. Those skilled in the art will appreciate that any modification or partial replacement made without departing from the scope of the present invention shall fall within the scope defined by the claims; therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (11)

1. A method of controlling a contactless system, the contactless system comprising an information apparatus and a camera facing a predefined special object and capturing live video of the special object, the method comprising:
an object detection step of detecting the special object in the live video captured by the camera and outputting the position information of the special object;
an object tracking step of generating the motion trajectory of the special object based on the position information of the special object;
a trajectory analysis step of analyzing the motion trajectory of the special object to detect a specific trajectory pattern and outputting a signal representing the trajectory pattern;
and a signal conversion step of converting the signal into a command executable by the target system.
2. The method of claim 1, wherein the object detection step comprises:
reading a pre-stored template or model of the special object;
matching the special-object template against each frame of the captured live video;
and, in the case of a match, outputting the position information of the special object in each frame of the live video.
3. The method of claim 1, wherein the object tracking step comprises:
obtaining the position information of the special object in each frame of the live video;
and generating the motion trajectory of the special object in time order.
4. The method of claim 1, wherein the trajectory analysis step comprises:
checking the generated motion trajectory against the predefined trajectory patterns stored in advance;
and, when the motion trajectory of the special object matches a predefined trajectory pattern, generating a signal representing the motion trajectory.
5. The method according to any one of claims 2 to 4, further comprising, before the object detection step, a step of defining the special object and the trajectory patterns according to the user's needs.
6. An information apparatus having a camera and applied in a contactless system, the camera facing a predefined special object and capturing live video of the special object, the information apparatus comprising:
an object detection unit that detects the special object in the live video captured by the camera and outputs the position information of the special object;
an object tracking unit that generates the motion trajectory of the special object based on the position information of the special object;
a trajectory analysis unit that analyzes the motion trajectory of the special object to detect a predefined trajectory pattern and outputs a signal representing the trajectory pattern;
and a signal conversion unit that converts the signal into a command executable by the target system.
7. The information apparatus of claim 6, wherein the object detection unit reads a pre-stored template of the special object, matches the special-object template against each frame of the captured live video and, in the case of a match, outputs the position information of the special object in each frame of the live video.
8. The information apparatus of claim 6, wherein the object tracking unit obtains the position information of the special object in each frame of the live video and generates the motion trajectory of the special object in time order.
9. The information apparatus of claim 6, wherein the trajectory analysis unit checks the generated motion trajectory against the predefined trajectory patterns stored in advance and, when the motion trajectory of the special object matches a predefined trajectory pattern, generates a signal representing the trajectory pattern.
10. The information apparatus according to any one of claims 7 to 9, further comprising a definition unit that defines the special object and the trajectory patterns according to the user's needs.
11. A contactless system comprising the information apparatus according to any one of claims 6 to 10.
CN200910007425A 2009-02-13 2009-02-13 Information apparatus, control method and system thereof Pending CN101807111A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910007425A CN101807111A (en) 2009-02-13 2009-02-13 Information apparatus, control method and system thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910007425A CN101807111A (en) 2009-02-13 2009-02-13 Information apparatus, control method and system thereof

Publications (1)

Publication Number Publication Date
CN101807111A true CN101807111A (en) 2010-08-18

Family

ID=42608925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910007425A Pending CN101807111A (en) 2009-02-13 2009-02-13 Information apparatus, control method and system thereof

Country Status (1)

Country Link
CN (1) CN101807111A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1534544A (en) * 2003-04-01 2004-10-06 中国科学院电子学研究所 Large screen non contact type control mode
CN101071350A (en) * 2006-05-11 2007-11-14 北京华旗资讯数码科技有限公司 Device for operating cursor, window by identifying dynamic trace
CN101295442A (en) * 2008-06-17 2008-10-29 上海沪江虚拟制造技术有限公司 Non-contact stereo display virtual teaching system
CN101354608A (en) * 2008-09-04 2009-01-28 中兴通讯股份有限公司 Method and system for implementing video input

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012171190A1 (en) * 2011-06-15 2012-12-20 青岛海信信芯科技有限公司 Television, control method and control device for the television
CN103135746A (en) * 2011-11-25 2013-06-05 夏普株式会社 Non-touch control method and non-touch control system and non-touch control device based on static postures and dynamic postures
CN103135746B (en) * 2011-11-25 2018-01-02 夏普株式会社 Non-contact control method, system and equipment based on static posture and dynamic posture
CN103942811A (en) * 2013-01-21 2014-07-23 中国电信股份有限公司 Method and system for determining motion trajectory of characteristic object in distributed and parallel mode
CN103942811B (en) * 2013-01-21 2017-08-15 中国电信股份有限公司 Distributed parallel determines the method and system of characteristic target movement locus
CN111034171A (en) * 2017-09-26 2020-04-17 索尼半导体解决方案公司 Information processing system

Similar Documents

Publication Publication Date Title
CN103092432B (en) The trigger control method of man-machine interactive operation instruction and system and laser beam emitting device
CN102541256B (en) There is the location-aware posture of visual feedback as input method
CN110434853B (en) Robot control method, device and storage medium
CN102339125A (en) Information equipment and control method and system thereof
US8525876B2 (en) Real-time embedded vision-based human hand detection
JP4243248B2 (en) User interface system based on pointing device
EP2133848B1 (en) Computer-implemented process for controlling a user-selected electronic component using a pointing device
US20110115892A1 (en) Real-time embedded visible spectrum light vision-based human finger detection and tracking method
JP5264844B2 (en) Gesture recognition apparatus and method
CN102103409A (en) Man-machine interaction method and device based on motion trail identification
US20140071042A1 (en) Computer vision based control of a device using machine learning
CN104427252A (en) Method for synthesizing images and electronic device thereof
CN104166509A (en) Non-contact screen interaction method and system
Rahman et al. Motion-path based gesture interaction with smart home services
CN112801061A (en) Posture recognition method and system
CN111596776B (en) Electronic whiteboard writing pen and teaching system thereof
CN103135746A (en) Non-touch control method and non-touch control system and non-touch control device based on static postures and dynamic postures
CN101807111A (en) Information apparatus, control method and system thereof
Qian et al. Arnnotate: An augmented reality interface for collecting custom dataset of 3d hand-object interaction pose estimation
Margetis et al. Augmenting physical books towards education enhancement
CN103135745A (en) Non-touch control method and non-touch control information device and non-touch control system based on depth images
CN1326023C (en) Electronic display system positioning method and positioner thereof
US9761009B2 (en) Motion tracking device control systems and methods
CN109151298A (en) Video camera control method, equipment and system based on screen
Kavitha et al. Interactive Screens Using Hand Gestures and Microcontroller

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20100818