CN104199547B - Virtual touch screen operation device, system and method - Google Patents

Virtual touch screen operation device, system and method

Info

Publication number
CN104199547B
CN104199547B (application CN201410436118.9A)
Authority
CN
China
Prior art keywords
coordinate
hand
value
camera
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410436118.9A
Other languages
Chinese (zh)
Other versions
CN104199547A (en)
Inventor
廖裕民 (Liao Yumin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rockchip Electronics Co Ltd
Original Assignee
Fuzhou Rockchip Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou Rockchip Electronics Co Ltd
Priority to CN201410436118.9A
Publication of CN104199547A
Application granted
Publication of CN104199547B
Legal status: Active (current)
Anticipated expiration


Abstract

The invention provides a virtual touch screen operation device, system and method. In the method, hand recognition is performed on image data captured by cameras while the user holds one hand flat and suspended in an image capture area, in order to determine the position of the hand center in the image. Using the camera resolution, the pixel position of the hand center is converted into a two-dimensional coordinate value on the XZ coordinate plane and a two-dimensional coordinate value on the YZ coordinate plane, from which the coordinate value of the hand center in an XYZ three-dimensional coordinate system is established. The working mode is selected according to the number of fingers the user uses, the user's operating position and operation mode are judged from the three-dimensional coordinate value, a picture showing the hand position on a virtual touch screen and any effective operation is drawn from the judgment result and the coordinates of all currently operable icons, and the drawn image is displayed. With this human-computer interactive virtual touch device, system and method, the user can conveniently perform free human-computer interaction through a virtual touch screen anytime and anywhere.

Description

Virtual touch screen operation device, system and method
Technical field
The present invention relates to the field of virtual touch control technology, and in particular to a virtual touch screen operation device, system and method.
Background technology
A touch screen (touch control screen) is an important component of prior-art human-computer interaction. Existing touch screen input relies entirely on a physical touch panel device; that is, a physically present touch screen is required to complete the human-computer interaction, which greatly limits the places and conditions under which such interaction can take place.
Summary of the invention
In view of the above problems, the present invention provides a virtual touch screen operation device, system and method that overcome, or at least partly solve, the problems described above.
The present invention provides a virtual touch screen operation device including a display control unit and a display unit. The device further includes:
a view recognition unit, for performing hand recognition on the image data collected by the cameras for the single hand of the user's two hands that needs to operate, so as to determine the position of the hand center in the image;
a horizontal-plane two-dimensional coordinate establishing unit, for converting the pixel position of the hand center into a two-dimensional coordinate value on the XZ coordinate plane according to the in-image position of the hand center identified by the view recognition unit and the pixel resolution of the camera;
a vertical-plane two-dimensional coordinate establishing unit, for converting the pixel position of the hand center into a two-dimensional coordinate value on the YZ coordinate plane according to the in-image position of the hand center identified by the view recognition unit and the pixel resolution of the camera;
a three-dimensional coordinate computing unit, for establishing the coordinate value of the hand center in the XYZ three-dimensional coordinate system from the XZ-plane and YZ-plane two-dimensional coordinate values determined respectively by the horizontal-plane and vertical-plane two-dimensional coordinate establishing units;
a working mode judging unit, for selecting the working mode according to the number of fingers the user uses in the recognized single-hand image;
an action judging unit, for judging the user's operating position and operation mode from the three-dimensional hand coordinate value established by the three-dimensional coordinate computing unit and the working mode selected by the working mode judging unit; and
a chart drawing unit, for drawing a picture of the hand position and any effective operation on the virtual touch screen according to the judgment result of the action judging unit and the coordinates of all currently operable icons, and calling the display control unit to control the display unit to show the image of the user's operation, so that the user learns from this feedback the position on the virtual touch screen corresponding to the current hand center and continues moving the hand according to the displayed image to carry out virtual touch operations.
The present invention also provides a virtual touch screen operation system, including a virtual touch screen operation device as described in any of the above and two camera devices communicatively connected with the device.
The present invention also provides a virtual touch screen operation method, the method including:
the user suspends the single hand to be operated flat in an image capture area, and hand recognition is performed on the image data collected by the cameras to determine the position of the hand center in the image;
according to the in-image position of the identified hand center and the pixel resolution of the camera, the pixel position of the hand center is converted into two-dimensional coordinate values on the XZ and YZ coordinate planes respectively;
the coordinate value of the hand center in the XYZ three-dimensional coordinate system is established from its two-dimensional coordinate values on the XZ and YZ coordinate planes;
the working mode is selected according to the number of fingers the user uses in the recognized single-hand image;
the user's operating position and operation mode are judged from the established three-dimensional hand coordinate value and the determined working mode;
a picture of the hand position and any effective operation on the virtual touch screen is drawn according to the judgment result and the coordinates of all currently operable icons; and
the drawn image is displayed for the user to view, so that the user learns from this feedback the position on the virtual touch screen corresponding to the current hand center and continues moving the hand according to the displayed image to carry out virtual touch operations.
With the virtual touch screen operation device, system and method provided by the present invention, images are captured by cameras, the operating position is determined by recognizing the hand in the images, and the working mode is judged by counting fingers; the obtained three-dimensional hand coordinate is mapped directly onto operating actions on a virtual touch screen, and feedback is shown to the user on a display. Touch screen input thus no longer requires a physical device: a virtual touch screen input environment can be built quickly from the cameras in smart glasses and a smart bracelet, or in a portable smart mobile device, allowing touch input anytime and anywhere and enabling the user to perform free human-computer interaction through the virtual touch screen.
Description of the drawings
Fig. 1 is a schematic diagram of the hardware architecture of a virtual touch screen operation system in an embodiment of the present invention;
Fig. 2 is a functional block diagram of a virtual touch screen operation device in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the principle for converting the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane in an embodiment of the present invention;
Fig. 4 is a schematic flow chart of a virtual touch screen operation method in an embodiment of the present invention.
Label declaration:
System 100
Device 10
Ambient brightness sensing unit 101
View recognition unit 102
Longitudinal view recognition subunit 1021
Transverse view recognition subunit 1022
Finger number judging unit 103
Working mode judging unit 104
Horizontal-plane two-dimensional coordinate establishing unit 105
Vertical-plane two-dimensional coordinate establishing unit 106
Three-dimensional coordinate computing unit 107
Action judging unit 108
Chart drawing unit 109
Display control unit 110
Display unit 111
Camera device 20
First camera 201
Second camera 202
Display device 21
Specific embodiment
To describe the technical content, structural features, objectives and effects of the invention in detail, embodiments are explained below in conjunction with the accompanying drawings.
Referring to Fig. 1, a schematic diagram of the hardware architecture of a virtual touch screen operation system in an embodiment of the present invention, the system 100 includes a virtual touch screen operation device 10, two camera devices 20 and a display device 21, and realizes touch input by detecting the user's gestures.
Referring also to Fig. 2, a functional block diagram of the virtual touch screen operation device in this embodiment, the device 10 includes an ambient brightness sensing unit 101, a view recognition unit 102, a finger number judging unit 103, a working mode judging unit 104, a horizontal-plane two-dimensional coordinate establishing unit 105, a vertical-plane two-dimensional coordinate establishing unit 106, a three-dimensional coordinate computing unit 107, an action judging unit 108, a chart drawing unit 109, a display control unit 110 and a display unit 111. The device 10 may be an electronic device such as a camera, mobile phone or tablet computer; the camera devices 20 communicate with the device 10 through a network whose transmission medium may be a wireless medium such as Bluetooth, ZigBee or WiFi.
Each camera device 20 includes a first camera 201 and a second camera 202, and the two camera devices serve respectively as the longitudinal camera device and the transverse camera device. The longitudinal camera device may be a portable mobile electronic device such as smart glasses positioned above the user's hand; the transverse camera device may be a portable mobile electronic device such as a smart bracelet placed in front of the user. Further, the first camera 201 and the second camera 202 of each camera device 20 are respectively an ordinary camera and an infrared camera. The ordinary camera captures images of the user's operating actions under good lighting conditions and sends them to the device 10 for analysis; the infrared camera captures images of the user's operating actions under poor lighting conditions and sends them to the device 10 for analysis. The view recognition unit 102 includes a longitudinal view recognition subunit 1021 and a transverse view recognition subunit 1022, arranged to correspond to the cameras of the longitudinal and transverse camera devices, and performs recognition processing on the images they collect.
In the initial state, the two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set perpendicular to each other so that hand actions can be captured simultaneously in the vertical and horizontal directions. Typically, the two cameras in the smart glasses (an ordinary camera and an infrared camera) point downward, while the two cameras in the smart bracelet or smartphone (an ordinary camera and an infrared camera) are placed horizontally. The rectangular region covered by both pairs of cameras together forms the image capture area.
The ambient brightness sensing unit 101 senses the brightness of the environment and sends the ambient brightness value to the view recognition unit 102. The view recognition unit 102 decides whether to use the ordinary camera or the infrared camera according to a preset brightness threshold. For example, if the brightness sensing range is 1 to 100 and the threshold is 50, the ordinary camera is used when the ambient brightness value is greater than 50, and the infrared camera is used when the ambient brightness value is less than 50.
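The threshold rule described above can be sketched in a few lines of Python. The function name and the tie-break at exactly the threshold are assumptions, since the patent only defines the greater-than and less-than cases:

```python
def select_camera(brightness: int, threshold: int = 50) -> str:
    """Pick the camera type from an ambient brightness reading (range 1-100).

    Mirrors the example in the text: ordinary camera above the preset
    threshold, infrared camera below it. The behavior at exactly the
    threshold is an assumption (the patent leaves it unspecified).
    """
    return "ordinary" if brightness > threshold else "infrared"
```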
After the camera type to be used has been determined from the ambient brightness value, the initial positioning operation starts, as follows. During initial positioning, the user suspends the hand to be operated, clenched into a fist, flat in the position that both selected camera groups can photograph, i.e. the image capture area, and holds it still for a certain time to complete the initialization of the hand position, allowing the device 10 to recognize and locate the initial position of the hand for subsequent operations. The principle by which the device 10 recognizes and locates the hand position is described in detail below.
During interactive operation, the user suspends the hand to be operated (hereinafter, the single hand) flat in the image capture area. The longitudinal view recognition subunit 1021 decides whether to use the ordinary or the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit 101, and then performs hand recognition on the image data collected by the chosen ordinary or infrared camera of the longitudinal camera device above the hand, to determine the position of the hand center in the image. Likewise, the transverse view recognition subunit 1022 decides which camera to use according to the ambient brightness value detected by the ambient brightness sensing unit 101, and then performs hand recognition on the image data collected by the chosen ordinary or infrared camera of the transverse camera device in front of the hand, to determine the position of the hand center in the image.
The hand center position determined by the longitudinal view recognition subunit 1021 is the in-image position of the hand center pixel on the XZ coordinate plane; for example, the hand center pixel is located at row a, column b of the XZ-plane image. The hand center position determined by the transverse view recognition subunit 1022 is the in-image position of the hand center pixel on the YZ coordinate plane.
Further, the methods for determining the hand center with the ordinary camera include the plain background method and the colored glove assistance method. The plain background method is as follows: the background of the one-handed operation must be relatively simple and uniform in color, so the hand image can be extracted directly through the color interval of human skin tone; the row of the center point is then obtained by averaging the highest and lowest points of the extracted hand region, and its column by averaging the leftmost and rightmost points. The colored glove assistance method is as follows: the user wears special pure-red gloves; since ordinary cameras all sample in RGB (red-green-blue), the pure-red region can be extracted directly (green or blue may also be used as the glove finger endpoint color). The row of the center point is then obtained by averaging the highest and lowest points of the extracted hand region, and its column by averaging the leftmost and rightmost points.
The methods for determining the hand center with the infrared camera include the temperature filtering method and the colored glove assistance method. The temperature filtering method is as follows: since the surface temperature of the human body is higher than the ambient temperature, the warmer hand image can be extracted directly; the row of the center point is then obtained by averaging the highest and lowest points of the extracted hand region, and its column by averaging the leftmost and rightmost points. The colored glove assistance method is as follows: the user wears special gloves whose surface heats up, so the hot region in the image can be extracted directly; the row of the center point is then obtained by averaging the highest and lowest points of the extracted hand region, and its column by averaging the leftmost and rightmost points.
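All four extraction methods above share the same center-finding rule, which can be sketched as follows. Representing the extracted hand region as an iterable of (row, column) pixels is an assumption; integer averaging is used here, though the text does not specify rounding:

```python
def hand_center(mask):
    """Center of an extracted hand region, per the averaging rule in the
    text: row = mean of the highest and lowest pixel rows, column = mean of
    the leftmost and rightmost pixel columns. `mask` is an iterable of
    (row, col) pixels judged to belong to the hand (by skin color, glove
    color, or temperature); this representation is an assumption.
    """
    rows = [r for r, _ in mask]
    cols = [c for _, c in mask]
    return (min(rows) + max(rows)) // 2, (min(cols) + max(cols)) // 2
```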
The horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel position of the hand center into a two-dimensional coordinate value on the XZ coordinate plane, according to the in-image position of the hand center identified by the longitudinal view recognition subunit 1021 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel position of the hand center into a two-dimensional coordinate value on the YZ coordinate plane, according to the in-image position of the hand center identified by the transverse view recognition subunit 1022 and the pixel resolution of the camera.
Referring to Fig. 3, the principle for converting the hand center pixel position into a two-dimensional coordinate value on the XZ coordinate plane is as follows: the pixel at the lower-left corner of the image is taken as the origin 0 of the two-dimensional coordinate system, and from the image resolution and the coordinate value range after conversion, the ratio of the coordinate value range to the number of rows and columns of the image is computed. For example, if the XZ-plane image resolution is 2000 wide by 1000 high and the coordinate value range of the two-dimensional XZ coordinate system is 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z-axis coordinate range to the image row count is 100/1000 and the ratio of the X-axis coordinate range to the image column count is 150/2000. Multiplying the pixel position of the hand center by these computed row and column ratios yields the converted two-dimensional coordinate value. For example, if the pixel position of a hand center is row 300, column 200, the Z-axis coordinate of the hand center is 300*100/1000 = 30 and its X-axis coordinate is 200*150/2000 = 15. The conversion of the hand center pixel position into a two-dimensional coordinate value on the YZ coordinate plane follows the same principle and is not repeated here.
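The worked example above can be checked with a short sketch. The default resolution and coordinate ranges come from the text's example; integer division is an assumption (the text's two examples divide exactly):

```python
def pixel_to_plane(row, col, img_h=1000, img_w=2000, z_range=100, x_range=150):
    """Convert a hand-center pixel position (row, col) to an (X, Z)
    coordinate on the XZ plane, using the row/column ratios from the
    example: a 2000x1000 image, X range 1-150, Z range 1-100.
    """
    z = row * z_range // img_h   # row ratio: 100/1000
    x = col * x_range // img_w   # column ratio: 150/2000
    return x, z
```

With the text's example pixel (row 300, column 200) this yields X = 15 and Z = 30, matching the figures given.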
The three-dimensional coordinate computing unit 107 establishes the coordinate value of the hand center in the XYZ three-dimensional coordinate system from the XZ-plane and YZ-plane two-dimensional coordinate values of the hand center determined respectively by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106.
The working principle for establishing the hand center's coordinate in the XYZ three-dimensional coordinate system is as follows: since the XZ and YZ coordinate planes share a common Z axis, the Z values of all coordinate endpoints on the XZ plane and all coordinate endpoints on the YZ plane are extracted and compared, and endpoints whose Z-axis coordinate values are identical or closest are considered the same endpoint; the XZ-plane coordinate value and YZ-plane coordinate value judged to belong to the same endpoint are then merged into one coordinate endpoint, used as the coordinate value in the XYZ three-dimensional coordinate system. Since the two Z values may differ, the Z value of the newly produced three-dimensional coordinate is the XZ-plane Z value plus the YZ-plane Z value, divided by 2, while the X and Y coordinate values in the three-dimensional coordinate system equal the X coordinate value from the XZ plane and the Y coordinate value from the YZ plane respectively.
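The merging rule above can be sketched as follows. Greedy nearest-Z matching is an assumption: the patent only says endpoints with identical or closest Z values are treated as the same endpoint, without spelling out a pairing algorithm:

```python
def match_and_merge(xz_points, yz_points):
    """Pair each XZ-plane endpoint (x, z) with the YZ-plane endpoint (y, z)
    whose Z value is closest, then merge each pair into an XYZ coordinate:
    X and Y carry over from their planes, Z is the average of the two
    plane Z values (as the text prescribes when they differ).
    """
    merged = []
    for x, z1 in xz_points:
        # nearest-Z match; a hypothetical stand-in for the patent's
        # "identical or closest" endpoint association
        y, z2 = min(yz_points, key=lambda p: abs(p[1] - z1))
        merged.append((x, y, (z1 + z2) / 2))
    return merged
```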
While the longitudinal view recognition subunit 1021 sends the in-image position of the hand center to the horizontal-plane two-dimensional coordinate establishing unit 105, it also sends the recognized hand image to the finger number judging unit 103.
The finger number judging unit 103 recognizes from the hand image the number of fingers the user uses in the longitudinal view.
Specifically, the finger number judging unit 103 determines the finger count by recognizing finger endpoints in the hand image. The methods for recognizing finger endpoints with the ordinary camera include the plain background method and the colored glove assistance method. The plain background method is as follows: the background of the one-handed operation must be relatively simple and uniform in color, so the hand image can be extracted directly through the color interval of human skin tone; the cut-off position of each strip extending from the hand is then computed with a contour endpoint algorithm as the endpoint position of each finger, and the endpoints are counted. The colored glove assistance method is as follows: the user wears special gloves whose fingertip positions are pure red; since ordinary cameras all sample in RGB (red-green-blue), the pure-red points can be extracted directly (green or blue may also be used as the glove fingertip color), and the endpoints are then counted.
The methods for recognizing finger endpoints with the infrared camera include the temperature filtering method and the colored glove assistance method. The temperature filtering method is as follows: since the surface temperature of the human body is higher than the ambient temperature, the warmer hand image can be extracted directly; the cut-off position of each strip extending from the hand is then computed with a contour endpoint algorithm as the endpoint position of each finger, and the endpoints are counted. The colored glove assistance method is as follows: the user wears special gloves whose fingertip positions heat up, so the hot points in the image can be extracted directly, and the endpoints are then counted.
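Once fingertip pixels have been extracted (pure-red glove tips or hot points), counting them reduces to counting connected clusters. The flood fill below is an assumed stand-in for the "contour endpoint algorithm" the text references without detailing:

```python
def count_fingertips(points):
    """Count fingertip endpoints as 4-connected clusters of extracted
    pixels. `points` is an iterable of (row, col) pixels that passed the
    color or temperature filter; this representation, and 4-connectivity,
    are assumptions not fixed by the patent.
    """
    remaining = set(points)
    clusters = 0
    while remaining:
        clusters += 1
        stack = [remaining.pop()]
        while stack:          # flood-fill one cluster
            r, c = stack.pop()
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in remaining:
                    remaining.remove(nb)
                    stack.append(nb)
    return clusters
```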
The working mode judging unit 104 selects the working mode according to the number of fingers the user uses, as determined by the finger number judging unit 103. For example, when no finger is extended (the hand is clenched into a fist), the operation mode on the virtual touch screen is to change the position coordinate of the hand on the screen; when only one finger is used (the other fingers clenched), the operation mode is to select an icon; when two fingers are used, the operation mode is to drag a selected icon; when three fingers are used, the operation mode is to slide the whole screen. At most six working modes, from 0 fingers to 5 fingers, can be defined, and the working mode corresponding to each finger count can be redefined.
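The finger-count-to-mode table above amounts to a small, user-redefinable mapping. The mode names are illustrative; entries for 4 and 5 fingers are left unassigned because the text does not specify them:

```python
# Default table from the text; per the patent, every entry may be redefined.
DEFAULT_MODES = {
    0: "move cursor",    # fist: change the hand's position coordinate
    1: "select icon",    # one finger extended, others clenched
    2: "drag icon",      # two fingers: drag a selected icon
    3: "slide screen",   # three fingers: slide the whole screen
}

def working_mode(finger_count, modes=DEFAULT_MODES):
    """Look up the working mode for a finger count; counts without a
    defined mode (e.g. 4 or 5 by default) fall back to 'undefined'."""
    return modes.get(finger_count, "undefined")
```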
The action judging unit 108 judges the user's operating position and operation mode from the three-dimensional hand coordinate value established by the three-dimensional coordinate computing unit 107, the working mode determined by the working mode judging unit 104, and the XZ coordinate range regions of the operable icons fed back by the chart drawing unit 109.
In the present embodiment, during the hand position initialization phase, the action judging unit 108 takes the lowest vertical endpoint of the hand center's three-dimensional coordinate value (namely the minimum Y-axis coordinate) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinate value of the hand center onto the operable area, and, combined with the current working mode, judges the operating result corresponding to the hand passing through the click judgment plane. For example, under the 0-finger working mode of changing the hand position coordinate on the screen, a click has no meaning; under the 1-finger working mode of selecting an icon, a click selects the icon at that screen position; under the 2-finger working mode of dragging a selected icon, a click means the user starts or ends dragging an icon; under the 3-finger working mode of sliding the whole screen, a click means the user starts or ends dragging the whole screen picture.
Since the initial value of the click judgment plane is set by the action judging unit 108 using the Y value during the hand position initialization phase, the Y values of the hand coordinate are all greater than or equal to the judgment value of the click plane at that point. When the user then moves the hand to operate normally in a working mode, the action judging unit 108 no longer resets the Y-axis value of the click judgment plane each time it receives the three-dimensional coordinate of the hand center, but directly uses that Y-axis value to judge whether an effective click action on the virtual touch screen has occurred.
Mapping the three-dimensional coordinate value of the hand center onto the operable area of the touch screen works as follows: the operable area range is set to the coordinate value range of the XZ coordinate plane, so the XZ-plane coordinate of the hand center maps directly onto a planar position coordinate in the operable area.
A click action is judged from the three-dimensional coordinate value of the hand center as follows: once the Y-axis value of the click judgment plane has been selected, whenever the Y value in the hand center's three-dimensional coordinate is less than that value, the endpoint is judged to have passed through the click judgment plane, i.e. the finger has performed a click; the region containing the hand center then determines which position the user finally clicked.
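The two-phase click logic (set the plane once during initialization, then compare against it) can be sketched as a small class. The class layout and names are assumptions; the rules inside follow the text:

```python
class ClickJudge:
    """Click detection against a fixed judgment plane: the plane's Y value
    is the minimum Y observed while the hand is held still during
    initialization, and a click is reported whenever the hand center's Y
    later drops below that plane.
    """

    def __init__(self, init_samples):
        # Initialization phase: every sample lies at or above the plane,
        # so the minimum Y defines the click judgment plane.
        self.plane_y = min(y for _, y, _ in init_samples)

    def is_click(self, center_xyz):
        """True when the hand center's Y passes below the judgment plane."""
        _, y, _ = center_xyz
        return y < self.plane_y
```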
The chart drawing unit 109 draws a picture of the hand position and any effective operation on the virtual touch screen according to the judgment result of the action judging unit 108 and the coordinates of all currently operable icons. The chart drawing unit 109 first initializes the XZ-plane coordinate values of all operable icons and feeds the XZ-plane coordinate region of each operable icon back to the action judging unit 108. It changes the positions of the operable icons according to the operations, makes specific responses according to the position region of each click (for example highlighting, dragging or deleting), sends the drawn image to the display control unit 110, and at the same time updates the coordinate values of icons whose positions have moved and feeds them back to the action judging unit 108.
The display control unit 110 converts the image drawn by the chart drawing unit 109 into timing that the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual touch screen on the display device 21 for the user to view. From this feedback the user learns the position on the virtual touch screen corresponding to the current hand center, and can then continue moving the hand according to the displayed content to proceed with virtual touch operations.
Referring to Fig. 4, a schematic flow chart of the virtual touch screen operation method in an embodiment of the present invention, the method includes:
Step S30: the ambient brightness sensing unit 101 senses the brightness of the environment, and the view recognition unit 102 decides whether to use the ordinary camera or the infrared camera according to the preset brightness threshold and the ambient brightness value sensed by the ambient brightness sensing unit 101.
In the initial state, the two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set perpendicular to each other so that hand actions can be captured simultaneously in the vertical and horizontal directions. Typically, the two cameras in the smart glasses (an ordinary camera and an infrared camera) point downward, while the two cameras in the smart bracelet or smartphone (an ordinary camera and an infrared camera) are placed horizontally. The rectangular region covered by both pairs of cameras together forms the image capture area.
Step S31: the user suspends the hand to be operated, clenched into a fist, flat in the image capture area and holds it still for a certain time; the device 10 recognizes and locates the initial position of the hand, completing the initialization of the user's hand position.
The principle by which the device 10 recognizes and locates the hand position is described in detail below.
Step S32: the user suspends the hand to be used (hereinafter, the single hand) flat in the image capture area. The longitudinal view recognition subelement 1021 performs hand recognition on the image data collected by the ordinary or infrared camera of the longitudinal imaging device above the single hand, to determine the position of the hand center point in the image. The transverse view recognition subelement 1022 performs hand recognition on the image data collected by the ordinary or infrared camera of the transverse imaging device in front of the single hand, to determine the position of the hand center point in the image.
Specifically, the hand center position determined by the longitudinal view recognition subelement 1021 is the pixel position of the hand center point in the XZ-plane image; for example, the hand center pixel lies at row a, column b of the XZ-plane image. The hand center position determined by the transverse view recognition subelement 1022 is the pixel position of the hand center point in the YZ-plane image.
Further, the methods of determining the hand center point with an ordinary camera include the plain background method and the color glove auxiliary method. The plain background method is as follows: the background of the hand operation must be relatively simple and uniform in color, so that the hand image can be extracted directly from the color interval of human skin; the row number of the center point is then obtained as the average of the highest and lowest points of the extracted hand image region, and the column number of the center point as the average of the left-most and right-most points. The color glove auxiliary method is as follows: the user wears a special pure red glove; since ordinary cameras all sample in RGB (red-green-blue), the pure red region can be extracted directly (green or blue can also be used as the glove color). The center point is then computed from the extracted region in the same way: row number from the average of the highest and lowest points, column number from the average of the left-most and right-most points.
The methods of determining the hand center point with an infrared camera include the temperature filtering method and the color glove auxiliary method. The temperature filtering method is as follows: because the surface temperature of the human body is higher than the ambient temperature, the warmer hand image can be extracted directly; the row number of the center point is then obtained as the average of the highest and lowest points of the extracted hand image region, and the column number as the average of the left-most and right-most points. The color glove auxiliary method is as follows: the user wears a special glove whose surface heats up, so the hot region of the image can be extracted directly; the center point is then computed from the extracted region in the same way.
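The centre-point rule shared by all four extraction methods above (row from the average of the highest and lowest points, column from the average of the left-most and right-most points) can be sketched in Python. This is an illustrative reconstruction, not the patent's implementation; the binary mask stands in for whatever skin-colour, glove-colour or temperature thresholding produced it.

```python
def hand_center(mask):
    """Return the hand centre (row, col) of a binary hand mask.

    mask is a list of rows of 0/1 values, as produced by any of the
    extraction methods (skin colour, glove colour, temperature).
    Row = average of highest and lowest occupied rows;
    col = average of left-most and right-most occupied columns.
    """
    rows = [r for r, line in enumerate(mask) if any(line)]
    cols = [c for line in mask for c, v in enumerate(line) if v]
    return ((min(rows) + max(rows)) // 2, (min(cols) + max(cols)) // 2)

# A toy 4x4 mask with the "hand" occupying rows 1-2, columns 1-2:
mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
center = hand_center(mask)  # -> (1, 1)
```

In a real pipeline the mask would come from thresholding a camera frame; only the averaging rule is taken from the description.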
Step S33: the horizontal plane two-dimensional coordinate establishing unit 105 converts the hand center pixel position into a two-dimensional coordinate value on the XZ plane, according to the in-image position of the hand center identified by the longitudinal view recognition subelement 1021 and the pixel resolution of the camera. The vertical plane two-dimensional coordinate establishing unit 106 converts the hand center pixel position into a two-dimensional coordinate value on the YZ plane, according to the in-image position of the hand center identified by the transverse view recognition subelement 1022 and the pixel resolution of the camera.
The conversion from the hand center pixel position to a two-dimensional coordinate value on the XZ plane works as follows: the bottom-left pixel of the image is taken as the origin 0 of the two-dimensional coordinate system, and from the image resolution and the target coordinate value range the ratio of the coordinate range to the number of image rows and columns is computed. For example, if the XZ-plane image resolution is 2000 wide by 1000 high and the coordinate range of the two-dimensional XZ coordinate system is 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z-axis coordinate range to the image rows is 100/1000 and the ratio of the X-axis coordinate range to the image columns is 150/2000. Multiplying the pixel position of the hand center point by these ratios yields its two-dimensional coordinate value. For example, if a hand center point lies at pixel row 300, column 200, its Z coordinate is 300*100/1000 = 30 and its X coordinate is 200*150/2000 = 15. The conversion from the hand center pixel position to a two-dimensional coordinate value on the YZ plane is identical and is not repeated here.
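The worked conversion above can be expressed as a short Python function; the function name and argument order are illustrative only.

```python
def pixel_to_plane_coords(row, col, img_h, img_w, z_range, x_range):
    """Convert a hand-centre pixel (row, col) in the XZ-plane image
    into (X, Z) plane coordinates, as in step S33.

    img_h, img_w: image resolution in rows and columns.
    z_range, x_range: extents of the virtual plane's Z and X axes
    (e.g. Z: 1..100, X: 1..150).
    """
    z = row * z_range / img_h  # row ratio maps onto the Z axis
    x = col * x_range / img_w  # column ratio maps onto the X axis
    return x, z

# Worked example from the description: a 2000x1000 (width x height)
# image, X range 1..150, Z range 1..100, pixel at row 300, column 200:
x, z = pixel_to_plane_coords(300, 200, 1000, 2000, 100, 150)
# z = 300*100/1000 = 30.0, x = 200*150/2000 = 15.0
```

The same function serves the YZ plane with Y substituted for X.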
Step S34: the three-dimensional coordinate computing unit 107 establishes the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center point on the XZ plane and the YZ plane determined by the horizontal plane two-dimensional coordinate establishing unit 105 and the vertical plane two-dimensional coordinate establishing unit 106, respectively.
The method of establishing the coordinate value of the hand center point in the XYZ three-dimensional coordinate system is as follows: since the XZ plane and the YZ plane share a common Z axis, the Z values of all coordinate end points on the XZ plane and all coordinate end points on the YZ plane are extracted and compared; end points whose Z coordinate values are equal or closest are regarded as the same end point, and the XZ-plane coordinate value and YZ-plane coordinate value judged to belong to the same end point are merged into one coordinate end point, which is taken as the XYZ three-dimensional coordinate value. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the XZ-plane Z coordinate plus the YZ-plane Z coordinate, divided by 2; the X and Y coordinate values of the three-dimensional coordinate system equal the X coordinate value of the XZ plane and the Y coordinate value of the YZ plane, respectively.
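The Z-matching merge described above might be sketched as follows. The matching tolerance `z_tol` is an assumption, since the patent only requires Z values to be consistent or closest; everything else follows the averaging rule in the text.

```python
def merge_to_3d(xz_points, yz_points, z_tol=2.0):
    """Fuse end points from the XZ and YZ planes into XYZ points.

    xz_points: list of (x, z) pairs; yz_points: list of (y, z) pairs.
    Points with the closest Z values (within z_tol, an assumed
    tolerance) are treated as the same physical end point; the fused
    Z is the mean of the two Z values, X comes from the XZ plane and
    Y from the YZ plane, as in step S34.
    """
    merged = []
    for x, z1 in xz_points:
        # find the YZ point whose Z value is closest to this one
        y, z2 = min(yz_points, key=lambda p: abs(p[1] - z1))
        if abs(z2 - z1) <= z_tol:
            merged.append((x, y, (z1 + z2) / 2.0))
    return merged

pts = merge_to_3d([(15.0, 30.0)], [(40.0, 31.0)])
# -> [(15.0, 40.0, 30.5)]
```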
Step S35: the finger number judging unit 103 recognizes, from the hand image in the longitudinal view, the number of fingers the user is using to operate.
Specifically, the finger number judging unit 103 determines the number of fingers by recognizing the finger end points in the hand image. The methods of recognizing finger end points with an ordinary camera include the plain background method and the color glove auxiliary method. The plain background method is as follows: the background of the hand operation must be relatively simple and uniform in color, so that the hand image can be extracted directly from the color interval of human skin; the cut-off position of each strip extending from the hand is then computed by a figure end-point algorithm and taken as the end point of each finger, and the end points are counted. The color glove auxiliary method is as follows: the user wears a special glove whose fingertips are pure red; since ordinary cameras all sample in RGB (red-green-blue), the pure red points can be extracted directly (green or blue can also be used as the glove fingertip color), and the end points are then counted.
The methods of recognizing finger end points with an infrared camera include the temperature filtering method and the color glove auxiliary method. The temperature filtering method is as follows: because the surface temperature of the human body is higher than the ambient temperature, the warmer hand image can be extracted directly; the cut-off position of each strip extending from the hand is then computed by a figure end-point algorithm and taken as the end point of each finger, and the end points are counted. The color glove auxiliary method is as follows: the user wears a special glove whose fingertips are heated points, so the hot spots in the image can be extracted directly and then counted.
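A much-simplified stand-in for the figure end-point algorithm: counting contiguous runs of extracted pixels along one scan line taken across the fingers. The patent's algorithm locates the cut-off of each extending strip; this sketch only illustrates the final counting step and is not the patented method itself.

```python
def count_fingertips(scanline):
    """Count finger end points along a horizontal scan line.

    scanline is a sequence of 0/1 values from the extracted hand image
    (skin colour, glove fingertip colour, or hot spots); each
    contiguous run of 1s is counted as one fingertip.
    """
    count, inside = 0, False
    for v in scanline:
        if v and not inside:  # entering a new run of hand pixels
            count += 1
        inside = bool(v)
    return count

n = count_fingertips([0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0])  # three runs -> 3
```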
Step S36: the operation mode judging unit 104 selects the operation mode according to the number of fingers the user is using, as determined by the finger number judging unit 103.
For example, when no finger is extended (the hand forming a fist), the operation on the virtual touch screen is to change the position coordinate of the hand on the screen; when only one finger is used (the other fingers forming a fist), the operation mode is to select an icon; when two fingers are used, the operation mode is to drag the selected icon; and when three fingers are used, the operation mode is to slide the whole screen. At most six operation modes, for 0 to 5 fingers, can be defined, and the operation mode corresponding to each number of fingers can be redefined.
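The finger-count-to-mode mapping reads naturally as a lookup table. The entries for 0 to 3 fingers come from the description; the entries for 4 and 5 fingers are placeholders, since the patent leaves all mappings user-redefinable.

```python
# Operation modes from step S36, keyed by number of extended fingers.
# 4 and 5 are placeholders: the patent defines six slots (0-5) but
# only specifies the first four, noting all can be redefined.
FINGER_MODES = {
    0: "move cursor position",
    1: "select icon",
    2: "drag selected icon",
    3: "scroll whole screen",
    4: "user-defined",
    5: "user-defined",
}

def select_mode(finger_count, modes=FINGER_MODES):
    """Return the operation mode for a detected finger count."""
    return modes.get(finger_count, "unknown")

mode = select_mode(2)  # -> "drag selected icon"
```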
Step S37: the action judging unit 108 judges the operating position and operation mode of the user according to the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit 107, the operation mode determined by the operation mode judging unit 104, and the XZ coordinate range regions of the operable icons fed back by the chart drawing unit 109.
In this embodiment, during the hand position initialization phase, the action judging unit 108 takes the lowest end point in the vertical direction of the three-dimensional coordinate of the hand center point (that is, the minimum Y-axis coordinate value) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinate of the hand center point onto the operable area, and, combined with the current operation mode, judges the operating result corresponding to the hand passing through the click judgment plane. For example, under the zero-finger mode of changing the hand position coordinate on the screen, a click has no meaning; under the one-finger mode of selecting an icon, a click selects the icon at that screen position; under the two-finger mode of dragging the selected icon, a click action means the user starts or ends dragging an icon; and under the three-finger mode of sliding the whole screen, a click action means the user starts or ends dragging the whole screen picture.
Because the action judging unit 108 sets the initial Y-axis value of the click judgment plane during the hand position initialization phase, the Y values of the hand coordinates are all greater than or equal to the decision value of the click judgment plane at that point. When the user then moves the hand to operate under a normal operation mode, the action judging unit 108 no longer resets the click judgment plane Y-axis value each time it receives the three-dimensional coordinate of the hand center point; instead it judges directly from that Y-axis value whether an effective click action on the virtual touch screen has occurred.
The method of mapping the three-dimensional coordinate of the hand center point onto the operable area of the touch screen is: the operable area range is set to the coordinate value range of the XZ plane, so the XZ coordinate of the hand center point maps directly to a plane position coordinate in the operable area.
The method of judging a click action from the three-dimensional coordinate of the hand center point is: once the click judgment plane Y-axis value has been selected, whenever the Y value of the hand center three-dimensional coordinate is less than the click judgment plane Y-axis value, the end point is judged to have passed through the click judgment plane, i.e. the finger has performed a click; the region in which the hand center point lies then determines which position the user has clicked.
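The click-plane logic of step S37, combining the initialization of the judgment plane (the lowest Y seen while the hand is held still) with the later threshold test and the direct XZ mapping onto the operable area, might look like the sketch below; the class and method names are illustrative only.

```python
class ClickDetector:
    """Click judgment plane, as described for step S37.

    During the hand position initialization phase, the minimum Y value
    of the hand centre defines the click judgment plane; afterwards any
    hand-centre Y below that plane counts as a click, and the XZ part
    of the 3-D coordinate maps straight onto the operable area.
    """

    def __init__(self, init_y_values):
        # lowest vertical position seen during initialization
        self.click_plane_y = min(init_y_values)

    def update(self, x, y, z):
        """Return (clicked?, (x, z) position on the screen plane)."""
        clicked = y < self.click_plane_y
        return clicked, (x, z)

det = ClickDetector([12.0, 10.5, 11.0])   # plane fixed at y = 10.5
hit, pos = det.update(15.0, 9.8, 30.0)    # y dips below the plane
# hit -> True, pos -> (15.0, 30.0)
```

Which operation the click triggers would then depend on the current finger-count mode, as in the examples above.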
Step S38: the chart drawing unit 109 draws the picture of the hand position on the virtual touch screen and of the effective operation, according to the judgment result of the action judging unit 108 and the coordinates of all currently operable icons.
At initialization, the chart drawing unit 109 sets the XZ-plane coordinate values of all operable icons and feeds the XZ coordinate region of each operable icon back to the action judging unit 108. The chart drawing unit 109 changes the positions of the operable icons according to the operations performed, makes a specific response according to the location of each click action (for example highlighting, dragging or deletion), sends the drawn image to the display control unit 110, and at the same time updates the coordinate values of any moved operable icons and feeds them back to the action judging unit 108.
Step S39: the display control unit 110 converts the image drawn by the chart drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual touch screen on the display device 21 for the user to watch. From this feedback the user can learn the position of the current hand center point on the virtual touch screen, and can then continue moving the hand according to the displayed content to proceed with the virtual touch operation.
In the virtual touch screen operation device, system and method provided by the present invention, a camera captures images, image recognition of the hand determines the operating position, the operation mode is judged by recognizing the number of fingers, the obtained hand three-dimensional coordinate is mapped directly to an operation action on the virtual touch screen, and the result is displayed as feedback to the user. Touch screen input therefore no longer requires a physical device: a virtual touch screen input environment can be built quickly from the imaging devices in smart glasses and a smart bracelet, or in a smart portable mobile device, so that the user can perform touch screen input anytime and anywhere and interact with the machine freely through the virtual touch screen.
The foregoing is only an embodiment of the present invention and does not thereby limit the scope of the claims; any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect use in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (14)

1. A human-computer interactive virtual touch control device, comprising a display control unit and a display unit, characterized in that the device comprises:
a view recognition unit, for performing hand recognition on the image data collected by a camera of the hand the user needs to operate with, to determine the position of the hand center point in the image;
a horizontal plane two-dimensional coordinate establishing unit, for converting the hand center pixel position into a two-dimensional coordinate value on the XZ plane, according to the in-image position of the hand center identified by the view recognition unit and the pixel resolution of the camera;
a vertical plane two-dimensional coordinate establishing unit, for converting the hand center pixel position into a two-dimensional coordinate value on the YZ plane, according to the in-image position of the hand center identified by the view recognition unit and the pixel resolution of the camera;
a three-dimensional coordinate computing unit, for establishing the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center pixel position on the XZ plane and the YZ plane determined by the horizontal plane two-dimensional coordinate establishing unit and the vertical plane two-dimensional coordinate establishing unit, respectively; the Z coordinate value of the three-dimensional coordinate system being the mean of the Z coordinate value of the XZ plane and the Z coordinate value of the YZ plane, the X coordinate value of the three-dimensional coordinate system being the X coordinate value of the XZ plane, and the Y coordinate value of the three-dimensional coordinate system being the Y coordinate value of the YZ plane;
an operation mode judging unit, for selecting the operation mode according to the number of fingers used by the user in the single-hand image recognized by the view recognition unit;
an action judging unit, for judging the operating position of the user according to the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit, and judging the operation mode of the user according to the operation mode selected by the operation mode judging unit; and
a chart drawing unit, for drawing the picture of the hand position on the virtual touch screen and of the effective operation according to the judgment result of the action judging unit and the coordinates of all currently operable icons, and for calling the display control unit to control the display unit to display the image of the user operation, so that the user learns from the feedback the position of the current hand center point on the virtual touch screen and continues moving the hand according to the image on the display unit to perform the virtual touch operation.
2. The human-computer interactive virtual touch control device of claim 1, characterized by further comprising an ambient brightness sensing unit, for sensing the brightness value of the environment;
the view recognition unit comprising:
a longitudinal view recognition subelement, for judging whether to use an ordinary camera or an infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit, and, after the camera to use is determined, performing hand recognition on the image data it collects to determine the pixel position of the hand center point in the XZ-plane image; and
a transverse view recognition subelement, for judging whether to use an ordinary camera or an infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit, and, after the camera to use is determined, performing hand recognition on the image data it collects to determine the pixel position of the hand center point in the YZ-plane image.
3. The human-computer interactive virtual touch control device of claim 2, characterized by further comprising a finger number judging unit, for recognizing the number of fingers used by the user from the longitudinal-view hand image recognized by the longitudinal view recognition subelement.
4. The human-computer interactive virtual touch control device of claim 1, characterized in that the horizontal plane two-dimensional coordinate establishing unit converts the hand center pixel position into a two-dimensional coordinate value on the XZ plane, and the vertical plane two-dimensional coordinate establishing unit converts the hand center pixel position into a two-dimensional coordinate value on the YZ plane, specifically by: taking the bottom-left pixel of the image as the origin 0 of the two-dimensional coordinate system, and computing, from the image resolution and the coordinate value range after conversion to two-dimensional coordinates, the ratio of the coordinate value range to the number of rows and columns of each image.
5. The human-computer interactive virtual touch control device of claim 1, characterized in that the action judging unit is further used, during the hand position initialization phase, for taking the lowest end point in the vertical direction of the three-dimensional coordinate of the hand center point as the Y-axis value of the click judgment plane, mapping the three-dimensional coordinate of the hand center point onto the operable area, and judging, combined with the current operation mode, the operating result corresponding to the hand passing through the click judgment plane.
6. The human-computer interactive virtual touch control device of claim 1, characterized in that the chart drawing unit is further used for feeding the XZ-plane coordinate regions of the initialized operable icons back to the action judging unit; the action judging unit judging the operating position and operation mode of the user according to the hand three-dimensional coordinate value established by the three-dimensional coordinate computing unit, the operation mode determined by the operation mode judging unit, and the XZ coordinate range regions of the operable icons fed back by the chart drawing unit.
7. A human-computer interactive virtual touch control system, characterized by comprising the human-computer interactive virtual touch control device of any one of claims 1-6 and two imaging devices in communication connection with the device.
8. The human-computer interactive virtual touch control system of claim 7, characterized in that the imaging devices comprise a first camera and a second camera, the first camera serving as the longitudinal imaging device and the second camera serving as the transverse imaging device, the shooting directions of the first camera and the second camera being set perpendicular to each other.
9. The human-computer interactive virtual touch control system of claim 8, characterized in that the first camera is an ordinary camera and the second camera is an infrared camera.
10. A human-computer interactive virtual touch control method, characterized in that the method comprises:
the user suspending the hand to be operated with flat in the image capture area, and performing hand recognition on the image data collected by the camera to determine the position of the hand center point in the image;
converting the hand center pixel position into two-dimensional coordinate values on the XZ plane and the YZ plane respectively, according to the in-image position of the identified hand center and the pixel resolution of the camera;
establishing the coordinate value of the hand center point in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the hand center pixel position on the XZ plane and the YZ plane; the Z coordinate value of the three-dimensional coordinate being the mean of the Z coordinate value of the XZ plane and the Z coordinate value of the YZ plane, and the X and Y coordinate values of the three-dimensional coordinate system being equal to the X coordinate value of the XZ plane and the Y coordinate value of the YZ plane, respectively;
selecting the operation mode according to the number of fingers used by the user in the recognized single-hand image;
judging the operating position of the user according to the established hand three-dimensional coordinate value, and judging the operation mode of the user according to the determined operation mode;
drawing the picture of the hand position on the virtual touch screen and of the effective operation according to the judgment result and the coordinates of all currently operable icons; and
displaying the drawn image for the user to watch, so that the user learns from the feedback the position of the current hand center point on the virtual touch screen and continues moving the hand according to the drawn image to perform the virtual touch operation, the drawn image being the picture of the hand position on the virtual touch screen and of the effective operation drawn according to the judgment result and the coordinates of all currently operable icons.
11. The human-computer interactive virtual touch control method of claim 10, characterized in that, before the step of the user suspending the hand to be operated with flat in the image capture area and performing hand recognition on the image data collected by the camera to determine the position of the hand center point in the image, the method comprises:
judging whether to use an ordinary camera or an infrared camera according to the sensed ambient brightness value and a preset brightness threshold;
the user making a fist with the hand to be operated with, suspending it flat in the image capture area and keeping it still for a certain time, and the selected camera recognizing and locating the initial position of the hand.
12. The human-computer interactive virtual touch control method of claim 11, characterized in that, after the step of the selected camera recognizing and locating the initial position of the hand, the method further comprises:
taking the lowest end point in the vertical direction of the three-dimensional coordinate of the hand center point as the Y-axis value of the click judgment plane, mapping the three-dimensional coordinate of the hand center point onto the operable area, and judging, combined with the current operation mode, the operating result corresponding to the hand passing through the click judgment plane.
13. The human-computer interactive virtual touch control method of claim 10, characterized in that the step of the user suspending the hand to be operated with flat in the image capture area and performing hand recognition on the image data collected by the camera to determine the position of the hand center point in the image is specifically:
determining the pixel positions of the hand center point in the XZ-plane and YZ-plane images respectively.
14. The human-computer interactive virtual touch control method of claim 13, characterized in that the step of converting the hand center pixel position into two-dimensional coordinate values on the XZ plane and the YZ plane respectively, according to the in-image position of the identified hand center and the pixel resolution of the camera, is specifically:
taking the bottom-left pixel of the image as the origin 0 of the two-dimensional coordinate system, and computing, from the image resolution and the coordinate value range after conversion to two-dimensional coordinates, the ratio of the coordinate value range to the number of rows and columns of each image.
CN201410436118.9A 2014-08-29 2014-08-29 Virtual touch screen operation device, system and method Active CN104199547B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410436118.9A CN104199547B (en) 2014-08-29 2014-08-29 Virtual touch screen operation device, system and method


Publications (2)

Publication Number Publication Date
CN104199547A CN104199547A (en) 2014-12-10
CN104199547B true CN104199547B (en) 2017-05-17

Family

ID=52084848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410436118.9A Active CN104199547B (en) 2014-08-29 2014-08-29 Virtual touch screen operation device, system and method

Country Status (1)

Country Link
CN (1) CN104199547B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6500986B2 (en) * 2015-06-25 2019-04-17 富士通株式会社 Electronic device and drive control method
CN105929957A (en) * 2016-04-26 2016-09-07 深圳市奋达科技股份有限公司 Control method, apparatus and device for intelligent glasses
CN106200904A (en) * 2016-06-27 2016-12-07 乐视控股(北京)有限公司 A kind of gesture identifying device, electronic equipment and gesture identification method
CN106802717A (en) * 2017-01-20 2017-06-06 深圳奥比中光科技有限公司 Space gesture remote control thereof and electronic equipment
CN106933347A (en) * 2017-01-20 2017-07-07 深圳奥比中光科技有限公司 The method for building up and equipment in three-dimensional manipulation space
CN106951087B (en) * 2017-03-27 2020-02-21 联想(北京)有限公司 Interaction method and device based on virtual interaction plane
JP6737824B2 (en) * 2018-03-13 2020-08-12 ファナック株式会社 Control device, control method, and control program
CN112306305B (en) * 2020-10-28 2021-08-31 黄奎云 Three-dimensional touch device
CN112506372B (en) * 2020-11-30 2022-06-07 广州朗国电子科技股份有限公司 Graffiti spray drawing method and device based on touch screen and storage medium
CN115617178B (en) * 2022-11-08 2023-04-25 润芯微科技(江苏)有限公司 Method for completing key and function triggering by no contact between finger and vehicle
CN116661656B (en) * 2023-08-02 2024-03-12 安科优选(深圳)技术有限公司 Picture interaction method and shooting display system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101799717A (en) * 2010-03-05 2010-08-11 天津大学 Man-machine interaction method based on hand action catch
WO2013018099A3 (en) * 2011-08-04 2013-07-04 Eyesight Mobile Technologies Ltd. System and method for interfacing with a device via a 3d display

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3114813B2 (en) * 1991-02-27 2000-12-04 日本電信電話株式会社 Information input method
AU2008299883B2 (en) * 2007-09-14 2012-03-15 Facebook, Inc. Processing of gesture-based user interactions


Also Published As

Publication number Publication date
CN104199547A (en) 2014-12-10

Similar Documents

Publication Publication Date Title
CN104199547B (en) Virtual touch screen operation device, system and method
CN104199550B (en) Virtual keyboard operation device, system and method
CN104199548B (en) A kind of three-dimensional man-machine interactive operation device, system and method
CN102662577B (en) A kind of cursor operating method based on three dimensional display and mobile terminal
CN103197885B (en) The control method and its mobile terminal of mobile terminal
CN106598227A (en) Hand gesture identification method based on Leap Motion and Kinect
CN106959808A (en) A kind of system and method based on gesture control 3D models
CN103809866B (en) A kind of operation mode switching method and electronic equipment
TWI471815B (en) Gesture recognition device and method
MX2009000305A (en) Virtual controller for visual displays.
US20140022171A1 (en) System and method for controlling an external system using a remote device with a depth sensor
CN102880304A (en) Character inputting method and device for portable device
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
CN101847057A (en) Method for touchpad to acquire input information
CN104267802A (en) Human-computer interactive virtual touch device, system and method
CN107239222A (en) The control method and terminal device of a kind of touch-screen
CN104199549B (en) A kind of virtual mouse action device, system and method
CN106325726A (en) A touch control interaction method
TW201624196A (en) A re-anchorable virtual panel in 3D space
US11500453B2 (en) Information processing apparatus
Hartanto et al. Real time hand gesture movements tracking and recognizing system
TW201439813A (en) Display device, system and method for controlling the display device
CN103543825A (en) Camera cursor system
CN204229299U (en) A kind of electronic equipment
US9189075B2 (en) Portable computer having pointing functions and pointing system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Building 18, No. 89 Software Avenue, Gulou District, Fuzhou, Fujian 350003

Applicant after: FUZHOU ROCKCHIP ELECTRONICS CO., LTD.

Address before: Building 18, No. 89 Software Avenue, Gulou District, Fuzhou, Fujian 350003

Applicant before: Fuzhou Rockchip Semiconductor Co., Ltd.

COR Change of bibliographic data
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Building 18, No. 89 Software Avenue, Gulou District, Fuzhou, Fujian 350003, China

Patentee after: Ruixin Microelectronics Co., Ltd

Address before: Building 18, No. 89 Software Avenue, Gulou District, Fuzhou, Fujian 350003, China

Patentee before: Fuzhou Rockchips Electronics Co.,Ltd.

CP01 Change in the name or title of a patent holder