CN107123099B - Human-computer interface data acquisition method - Google Patents

Human-computer interface data acquisition method

Info

Publication number
CN107123099B
CN107123099B (application CN201710289656.3A)
Authority
CN
China
Prior art keywords
human
image
computer interface
wide-angle camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710289656.3A
Other languages
Chinese (zh)
Other versions
CN107123099A (en)
Inventor
Wang Shi (王石)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Huaxin Intelligent Technology Co Ltd
Guangdong Hust Industrial Technology Research Institute
Original Assignee
Dongguan Huaxin Intelligent Technology Co Ltd
Guangdong Hust Industrial Technology Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Huaxin Intelligent Technology Co Ltd, Guangdong Hust Industrial Technology Research Institute filed Critical Dongguan Huaxin Intelligent Technology Co Ltd
Priority to CN201710289656.3A
Publication of CN107123099A
Application granted
Publication of CN107123099B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Library & Information Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a human-computer interface data acquisition method, which comprises the following steps: providing a wide-angle camera, a surface light source device, an upper computer and a human-computer interface, and aiming the wide-angle camera and the surface light source device at the human-computer interface; starting the wide-angle camera and the surface light source device to perform brightness detection and light source control, so that the wide-angle camera acquires a clear image; connecting the human-computer interface to the upper computer to calibrate the screen; acquiring any pixel point through the camera and transmitting it to the upper computer for image correction; carrying out template configuration through the upper computer, framing a plurality of rectangular regions of interest in the corrected image and establishing a template library; acquiring an image of the real-time human-computer interface through the wide-angle camera and transmitting it to the upper computer for pattern matching; and comparing the data on the upper computer before transmitting it to the wireless transceiver module. The method is simple to operate, highly universal, and acquires data quickly.

Description

Human-computer interface data acquisition method
Technical Field
The invention relates to the technical field of mechanical automation control, in particular to a human-computer interface data acquisition method.
Background
Industrial production has already achieved a degree of mechanization and automation, and production equipment is the primary source of production process data. When first-hand data cannot be read directly from a production unit, production data has to be acquired through external sensors, radio frequency identification, product bar codes and the like, which increases cost and implementation difficulty and affects the accuracy of data acquisition. Even when data can be taken from a device's data interface, the lack of an enforced standard means that the interfaces of different industries, devices, manufacturers, models and even versions are difficult to make consistent and standardized, which further increases the difficulty of data acquisition. Data acquisition devices and systems therefore have to be developed case by case for each equipment environment, which raises the development and maintenance costs of equipment data acquisition, complicates operation and limits universality.
Disclosure of Invention
Therefore, a human-computer interface data acquisition method that is simple to operate, universal and quick to match is needed to solve the problems of human-computer interface data acquisition.
A human-computer interface data acquisition method comprises the following steps:
The first step: providing a wide-angle camera, a surface light source device, an upper computer and a human-computer interface, and aiming the wide-angle camera and the surface light source device at the human-computer interface;
The second step: starting the wide-angle camera and the surface light source device to perform brightness detection and light source control, so that the wide-angle camera acquires a clear image;
The third step: connecting the human-computer interface to the upper computer to calibrate the screen;
The fourth step: acquiring any pixel point through the camera and transmitting it to the upper computer for image correction;
The fifth step: carrying out template configuration through the upper computer, framing a plurality of rectangular regions of interest in the corrected image and establishing a template library;
The sixth step: acquiring an image of the real-time human-computer interface through the wide-angle camera and transmitting it to the upper computer for pattern matching;
The seventh step: comparing the data on the upper computer and then transmitting the data to the wireless transceiver module.
According to this human-computer interface data acquisition method, the wide-angle camera, the surface light source device and the upper computer work in cooperation, so that data can be acquired from the human-computer interfaces of different industries, devices, manufacturers, models and versions. Through image acquisition by the wide-angle camera, and screen calibration, image correction, template configuration and pattern matching on the upper computer, human-computer interface data can be acquired accurately and quickly. The method does not require a dedicated data acquisition device and system to be developed for each equipment environment; it is simple to operate, highly universal, and obtains data quickly.
In one embodiment, in the second step, the surface light source device covers and illuminates the human-computer interface, and the brightness of the light source is controlled according to the average brightness measured by the wide-angle camera, implementing negative feedback compensation and eliminating the influence of changes in ambient light.
In one embodiment, the third step specifically includes: extracting, clockwise, the four corner pixel coordinates A1(x1, y1), A2(x2, y2), A3(x3, y3) and A4(x4, y4) of the human-computer interface from the image acquired by the wide-angle camera, and measuring the actual width W and height H of the human-computer interface to obtain the actual coordinates B1(0,0), B2(0,H), B3(W,H) and B4(W,0).
In one embodiment, the pixel coordinate and the actual coordinate satisfy the following formula 1:
Xi=(A*xi+B*yi+C)/(xi+G*yi+H)
Yi=(D*xi+E*yi+F)/(xi+G*yi+H), i=1,2,3,4,
where Ai(xi, yi) are the extracted pixel coordinates and Bi(Xi, Yi) the corresponding actual coordinates; written out for the four corner pairs, this gives eight linear equations in the eight parameters A to H.
The transformation parameter set (A…H) is derived from formula 1.
In one embodiment, for image correction, any pixel coordinate Ai(xi, yi) of the image is acquired through the wide-angle camera and transmitted to the upper computer, which applies formula 2:
X=(A*x+B*y+C)/(x+G*y+H)
Y=(D*x+E*y+F)/(x+G*y+H)
to obtain the actual coordinate Bi(Xi, Yi); the image is drawn according to Bi, completing the image correction.
In one embodiment, the rectangular region of interest includes starting coordinates (Xn, Yn), width Wn, height Hn, and image Pn within the region.
In one embodiment, the start coordinates (Xn, Yn), width Wn, height Hn and in-region image Pn form a template library: R1(X1, Y1, W1, H1, P1), R2(X2, Y2, W2, H2, P2) … Rn(Xn, Yn, Wn, Hn, Pn).
In one embodiment, the data comparison comprises:
a. the value obtained after the first pattern matching is transmitted directly to the subsequent unit and stored in a memory;
b. the value obtained by each subsequent pattern matching is compared with the value stored in the memory;
c. if they are the same, the value is discarded; if they differ, the value is transmitted to the subsequent unit and the value stored in the memory is updated to the current value.
Drawings
FIG. 1 is a flow chart of a human-machine interface data acquisition method according to the present invention.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Fig. 1 is a schematic flow chart of a human-computer interface data acquisition method according to a preferred embodiment of the present invention. The human-computer interface data acquisition method includes the following steps:
the first step is as follows: the wide-angle camera, the surface light source device, the upper computer and the human-computer interface are provided, and the wide-angle camera and the surface light source device are aligned to the human-computer interface.
The second step: the wide-angle camera and the surface light source device are started to perform brightness detection and light source control. The surface light source device covers and illuminates the human-computer interface, and the brightness of the light source is controlled according to the average brightness measured by the wide-angle camera, implementing negative feedback compensation, eliminating the influence of changes in ambient light, and enabling the wide-angle camera to acquire a clear image.
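As an illustration of this negative-feedback control, the following minimal sketch adjusts a normalized light level from the mean gray value of each frame; capture_frame(), set_light_level(), the target brightness and the gain are assumed placeholders for the actual camera driver and surface light source controller, not part of the patent.

import numpy as np

TARGET_BRIGHTNESS = 128.0   # assumed target mean gray level of the image
GAIN = 0.05                 # assumed proportional gain of the feedback loop

def regulate_light(level, capture_frame, set_light_level):
    """One iteration of the brightness negative feedback: raise the surface
    light source when the image is too dark, lower it when too bright, so
    that changes in ambient light are compensated."""
    frame = capture_frame()                        # grayscale frame as ndarray
    error = TARGET_BRIGHTNESS - float(np.mean(frame))
    level = min(1.0, max(0.0, level + GAIN * error / 255.0))
    set_light_level(level)                         # e.g. PWM duty cycle in [0, 1]
    return level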
The third step: the method comprises the steps that a human-computer interface is connected with an upper computer to carry out screen calibration, four-corner pixel coordinates A1(x1, y1), A2(x2, y2), A3(x3, y3) and A4(x4, y4) of the human-computer interface are extracted clockwise from an image acquired by a wide-angle camera, the width W and the height H of the human-computer interface are actually measured, and actual coordinates B1(0,0), B2(0, H), B3(W, H) and B4(W,0) are obtained. The pixel coordinate and the actual coordinate satisfy formula 1:
Xi=(A*xi+B*yi+C)/(xi+G*yi+H)
Yi=(D*xi+E*yi+F)/(xi+G*yi+H), i=1,2,3,4,
where Ai(xi, yi) are the extracted pixel coordinates and Bi(Xi, Yi) the corresponding actual coordinates; written out for the four corner pairs, this gives eight linear equations in the eight parameters A to H.
Formula 1 is solved by Gaussian elimination to obtain the transformation parameter set (A…H), and the parameters A…H are stored for later use.
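The calibration can be sketched as the routine below, which builds the eight linear equations implied by formula 1 for the four corner pairs and solves them; numpy.linalg.solve stands in for the Gaussian elimination mentioned above, and the function name and argument layout are illustrative assumptions rather than taken from the patent.

import numpy as np

def solve_transform(pixel_corners, width, height):
    """pixel_corners: [(x1, y1), ..., (x4, y4)] read clockwise from the camera
    image; width and height are the measured W and H of the human-computer
    interface. Returns (A, B, C, D, E, F, G, H) as used in formulas 1 and 2."""
    actual = [(0, 0), (0, height), (width, height), (width, 0)]   # B1..B4
    M, b = [], []
    for (x, y), (X, Y) in zip(pixel_corners, actual):
        # X*(x + G*y + H) = A*x + B*y + C  ->  linear in the unknowns A..H
        M.append([x, y, 1, 0, 0, 0, -X * y, -X])
        b.append(X * x)
        # Y*(x + G*y + H) = D*x + E*y + F
        M.append([0, 0, 0, x, y, 1, -Y * y, -Y])
        b.append(Y * x)
    return tuple(np.linalg.solve(np.array(M, float), np.array(b, float)))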
The fourth step: acquire arbitrary pixel through the camera and transmit for the host computer and carry out image correction, image correction passes through any pixel coordinate Ai (xi, yi) of wide angle camera acquisition image to carry for the host computer, according to formula 2:
X=(A*x+B*y+C)/(x+G*y+H)
Y=(D*x+E*y+F)/(x+G*y+H)
to obtain the actual coordinate Bi(Xi, Yi); the image is drawn according to Bi, completing the image correction.
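A sketch of how formula 2 is applied with the stored parameters (the helper below is hypothetical and reuses the parameter tuple from the calibration sketch):

def correct_point(params, x, y):
    """Map a camera pixel Ai(xi, yi) to its actual coordinate Bi(Xi, Yi)
    according to formula 2."""
    A, B, C, D, E, F, G, H = params
    d = x + G * y + H                  # common denominator of formula 2
    return (A * x + B * y + C) / d, (D * x + E * y + F) / d

Redrawing the corrected image then amounts to evaluating correct_point() for every pixel of the camera frame and writing its gray value at the rounded (Xi, Yi) position of the output image.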
The fifth step: and (3) carrying out template configuration through an upper computer, framing a plurality of rectangular interest areas in the corrected image and establishing a template library, wherein the rectangular interest areas comprise indicator lamps, digital characters and the like. Wherein, the rectangular interest area comprises a start coordinate (Xn, Yn), a width Wn, a height Hn and an image Pn in the area, and the start coordinate (Xn, Yn), the width Wn, the height Hn and the image Pn in the area form a template library: r1(X1, Y1, W1, H1, P1), R2(X2, Y2, W3, H4, P5) … Rn (Xn, Yn, Wn, Hn, Pn).
A sixth step: acquiring an image of a real-time human-computer interface through a wide-angle camera and transmitting the image to an upper computer for pattern matching; and matching the currently acquired human-computer interface image of the equipment and a template set in a template library by mode matching according to the initial coordinates (Xn, Yn), the width Wn, the height Hn and the image Pn in the region to acquire the numerical value of each rectangular interest region.
A seventh step of: the host computer carries out data comparison, later with data transmission for wireless transceiver module, wherein data comparison includes:
a. the value obtained after the first pattern matching is transmitted directly to the subsequent unit and stored in a memory;
b. the value obtained by each subsequent pattern matching is compared with the value stored in the memory;
c. if they are the same, the value is discarded; if they differ, the value is transmitted to the subsequent unit and the value stored in the memory is updated to the current value.
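A minimal sketch of this first-transmit-then-on-change logic; the send callback standing in for the wireless transceiver module is hypothetical.

def compare_and_forward(new_values, cache, send):
    """new_values: {region: value} from pattern matching; cache: values kept
    in memory; send: callable that forwards data to the wireless node."""
    for name, value in new_values.items():
        if name not in cache or cache[name] != value:
            send(name, value)      # first or changed reading: transmit it
            cache[name] = value    # update the value stored in memory
        # identical readings are discarded and nothing is transmitted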
Screen calibration is carried out only at the first installation, and template configuration is repeated only when the human-computer interface is found to deviate from the preset configuration and the acquired data needs to be changed.
The wireless transceiver module is a low-power local area network protocol (ZigBee) wireless data transceiver node. When there is data to transmit, the node transmits the change and the data; when there is nothing to transmit, the node remains in a dormant state. Each data acquisition device correspondingly includes such a wireless data transceiver node, and multiple nodes can form a wireless data acquisition network based on the human-computer interface; the nodes communicate through the Z-Stack protocol and can be flexibly deployed on any equipment whose operation needs to be monitored.
According to this human-computer interface data acquisition method, the wide-angle camera, the surface light source device and the upper computer work in cooperation, so that data can be acquired from the human-computer interfaces of different industries, devices, manufacturers, models and versions. Through image acquisition by the wide-angle camera, and screen calibration, image correction, template configuration and pattern matching on the upper computer, human-computer interface data can be acquired accurately and quickly. The method does not require a dedicated data acquisition device and system to be developed for each equipment environment; it is simple to operate, highly universal, and obtains data quickly.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination of these technical features is described, but any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above embodiments merely express several implementations of the present invention, and although their description is relatively specific and detailed, it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, and these fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (6)

1. A human-computer interface data acquisition method, characterized by comprising the following steps:
the first step: providing a wide-angle camera, a surface light source device, an upper computer and a human-computer interface, and aiming the wide-angle camera and the surface light source device at the human-computer interface;
the second step: starting the wide-angle camera and the surface light source device to perform brightness detection and light source control, so that the wide-angle camera acquires a clear image;
the third step: connecting the human-computer interface to the upper computer to calibrate the screen;
the fourth step: acquiring any pixel point through the camera and transmitting it to the upper computer for image correction;
the fifth step: carrying out template configuration through the upper computer, framing a plurality of rectangular regions of interest in the corrected image and establishing a template library;
the sixth step: acquiring an image of the real-time human-computer interface through the wide-angle camera and transmitting it to the upper computer for pattern matching;
the seventh step: comparing the data by the upper computer and then transmitting the data to the wireless transceiver module;
wherein the third step specifically comprises: extracting, clockwise, the four corner pixel coordinates A1(x1, y1), A2(x2, y2), A3(x3, y3) and A4(x4, y4) of the human-computer interface from the image acquired by the wide-angle camera, and measuring the actual width W and height H of the human-computer interface to obtain the actual coordinates B1(0,0), B2(0,H), B3(W,H) and B4(W,0);
the pixel coordinate and the actual coordinate satisfy formula 1:
Xi=(A*xi+B*yi+C)/(xi+G*yi+H)
Yi=(D*xi+E*yi+F)/(xi+G*yi+H), i=1,2,3,4,
where Ai(xi, yi) and Bi(Xi, Yi) are the pixel coordinates and actual coordinates of the four corners;
the transformation parameter set (A…H) is derived from formula 1.
2. The human-computer interface data acquisition method according to claim 1, wherein in the second step, the surface light source device covers and illuminates the human-computer interface, and the brightness of the light source is controlled according to the average brightness acquired by the wide-angle camera, implementing negative feedback compensation and eliminating the influence caused by changes in ambient light.
3. The human-computer interface data acquisition method according to claim 1, wherein for the image correction, any pixel coordinate Ai(xi, yi) of the image is acquired through the wide-angle camera and transmitted to the upper computer, which applies formula 2:
X=(A*x+B*y+C)/(x+G*y+H)
Y=(D*x+E*y+F)/(x+G*y+H)
to obtain the actual coordinate Bi(Xi, Yi); the image is drawn according to Bi, completing the image correction.
4. The human-computer interface data acquisition method according to claim 1, wherein the rectangular region of interest comprises a start coordinate (Xn, Yn), a width Wn, a height Hn, and an image Pn within the region.
5. The human-computer interface data acquisition method according to claim 4, characterized in that the start coordinates (Xn, Yn), the width Wn, the height Hn and the in-region image Pn form a template library: R1(X1, Y1, W1, H1, P1), R2(X2, Y2, W2, H2, P2) … Rn(Xn, Yn, Wn, Hn, Pn).
6. The human-machine interface data collection method of claim 1, wherein the data comparison comprises:
a. the value obtained after the first pattern matching is transmitted directly to the subsequent unit and stored in a memory;
b. the value obtained by each subsequent pattern matching is compared with the value stored in the memory;
c. if they are the same, the value is discarded; if they differ, the value is transmitted to the subsequent unit and the value stored in the memory is updated to the current value.
CN201710289656.3A 2017-04-27 2017-04-27 Human-computer interface data acquisition method Active CN107123099B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710289656.3A CN107123099B (en) 2017-04-27 2017-04-27 Human-computer interface data acquisition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710289656.3A CN107123099B (en) 2017-04-27 2017-04-27 Human-computer interface data acquisition method

Publications (2)

Publication Number Publication Date
CN107123099A CN107123099A (en) 2017-09-01
CN107123099B (en) 2020-02-14

Family

ID=59725130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710289656.3A Active CN107123099B (en) 2017-04-27 2017-04-27 Human-computer interface data acquisition method

Country Status (1)

Country Link
CN (1) CN107123099B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101709957A (en) * 2009-12-14 2010-05-19 上海应用技术学院 Straightness automatic measurement method based on machine vision
CN102483751A (en) * 2009-10-15 2012-05-30 博世汽车部件(苏州)有限公司 Navigation system and method with improved destination searching
CN102611822A (en) * 2012-03-14 2012-07-25 海信集团有限公司 Projector and projection image rectifying method thereof
CN102750067A (en) * 2011-04-19 2012-10-24 中国科学院软件研究所 Large screen interaction method based on handheld device
CN202563233U (en) * 2011-12-21 2012-11-28 海信集团有限公司 Projection image correcting device
CN103093221A (en) * 2013-01-31 2013-05-08 冠捷显示科技(厦门)有限公司 Displayer capable of tracking reading materials intelligently and collecting images of reading materials and method thereof
CN103839058A (en) * 2012-11-21 2014-06-04 方正国际软件(北京)有限公司 Information locating method for document image based on standard template
CN105278659A (en) * 2014-06-18 2016-01-27 中国电信股份有限公司 Target positioning method and device based on visual line tracking technology
CN105979234A (en) * 2016-06-13 2016-09-28 Tcl集团股份有限公司 Projection image correction method and projection device
CN105989577A (en) * 2015-02-17 2016-10-05 中兴通讯股份有限公司 Image correction method and device

Also Published As

Publication number Publication date
CN107123099A (en) 2017-09-01

Similar Documents

Publication Publication Date Title
RU86023U1 (en) TWO PROTOCOL PORTABLE MAINTENANCE FIELD WITH FIELD CONDITIONS WITH THE OPPORTUNITY OF RADIO FREQUENCY COMMUNICATION
CN103776841B (en) Synthetic leather automatic defect detecting device and detection method
EP1596324A2 (en) Contactless data carrier system
US9241110B2 (en) Information display device and information device system
CN105137389B (en) A kind of video auxiliary radio frequency positioning method and device
US11646807B2 (en) Methods including detecting cables connected to ports of communications network equipment and related systems
RU2728726C2 (en) Base station for collecting data from localized sensors
CN109556510B (en) Position detection device and computer-readable storage medium
US20160370787A1 (en) Operation management system
US20190114816A1 (en) Augmented reality light beacon
CN109945992B (en) Calibration method of electronic tag with temperature sensor
CN108627104A (en) A kind of dot laser measurement method of parts height dimension
US20230271325A1 (en) Industrial internet of things systems for monitoring collaborative robots with dual identification, control methods and storage media thereof
CN104581038A (en) Camera position recognition system
CN112308930B (en) Camera external parameter calibration method, system and device
CN104243815A (en) Focusing method and electronic equipment
CN108180935B (en) Fault detection method and device of sensor
CN102236777A (en) Hand-held mobile data acquisition terminal
CN107493377B (en) Thermotechnical parameter acquisition device and method based on mobile terminal application
CN107123099B (en) Human-computer interface data acquisition method
CN106709388B (en) Reader-writer power calibration device and calibration method
US11402244B2 (en) Automatic calibration of a measuring circuit
US20150262414A1 (en) Image processing device, image processing method, and image processing program
EP4332940A1 (en) Measurement device and method for transmitting output of sensor in measurement device
US20140052302A1 (en) Method and sensor node for managing power consumption and power consumption information collection apparatus using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant