CN116448385A - Automatic testing method and device for intelligent glasses and electronic equipment - Google Patents


Info

Publication number
CN116448385A
CN116448385A
Authority
CN
China
Prior art keywords
standard
display
data
partial
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310416016.XA
Other languages
Chinese (zh)
Inventor
成传罡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Bounds Inc
Original Assignee
Meta Bounds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Bounds Inc filed Critical Meta Bounds Inc
Priority to CN202310416016.XA
Publication of CN116448385A
Legal status: Pending

Classifications

    • G: PHYSICS
        • G01: MEASURING; TESTING
            • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
                • G01M 11/00: Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
                    • G01M 11/02: Testing optical properties
                        • G01M 11/0207: Details of measuring devices
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
                • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
                    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The embodiment of the application provides an automatic testing method and device for intelligent glasses, and electronic equipment. The automatic testing method for intelligent glasses comprises the following steps: in response to a test instruction, sending test data corresponding to the test instruction to the intelligent glasses to be tested; acquiring a display picture corresponding to the test data, collected through the intelligent glasses to be tested; obtaining a standard picture corresponding to the test data; performing visual algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture; and determining, based on the display data and the standard data, whether the test instruction was successfully executed. With this automatic testing method, the test result can be obtained automatically, so the efficiency of testing intelligent glasses can be improved.

Description

Automatic testing method and device for intelligent glasses and electronic equipment
Technical Field
The application relates to the technical field of glasses, in particular to an intelligent glasses automatic testing method, an intelligent glasses automatic testing device and electronic equipment.
Background
Intelligent glasses embody a technology that computes the position and angle of the camera image in real time and overlays corresponding virtual images, so that real-world information and virtual-world information are integrated seamlessly. The goal of this technique is to wrap the virtual world around the real world on the screen and allow interaction with it. As a result, intelligent glasses are increasingly popular with users and widely applied in people's daily life.
In the production of intelligent glasses, testing is generally performed in the prior art by human visual inspection and debugging, which is extremely inefficient.
Disclosure of Invention
The embodiment of the application provides an automatic testing method and device for intelligent glasses, and electronic equipment, which can improve the efficiency of testing intelligent glasses at least to a certain extent.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned in part by the practice of the application.
According to an aspect of the embodiments of the present application, there is provided an automated testing method for smart glasses, including: responding to a test instruction, and sending test data corresponding to the test instruction to the intelligent glasses to be tested; acquiring a display picture corresponding to the test data acquired through the intelligent glasses to be tested; obtaining a standard picture corresponding to the test data; performing visual algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture; and determining whether the test instruction is successfully executed or not based on the display data and the standard data.
According to an aspect of the embodiments of the present application, there is provided an intelligent glasses automation test device, including: the driving module is used for responding to the test instruction, sending test data corresponding to the test instruction to the intelligent glasses to be tested, acquiring a display picture corresponding to the test data acquired through the intelligent glasses to be tested, and acquiring a standard picture corresponding to the test data; the image processing module is electrically connected with the driving module and is used for receiving the display picture and the standard picture sent by the driving module, carrying out algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture, and determining whether the test instruction is successfully executed or not based on the display data and the standard data.
In some embodiments of the present application, based on the foregoing solution, the smart glasses automated testing apparatus further includes: the communication module is electrically connected with the driving module, is used for carrying out data transmission with the driving module and is used for establishing electrical connection with the intelligent glasses to be tested; the display module is electrically connected with the driving module and is used for displaying a test interface corresponding to the intelligent glasses to be tested if a selection instruction aiming at the intelligent glasses to be tested is received, determining a test instruction corresponding to the test instruction selection operation according to the test instruction selection operation triggered by a user on the test interface and sending the test instruction to the driving module; and the data storage module is electrically connected with the image processing module and used for storing the display data, the standard data, the display picture, the standard picture and the test result received from the image processing module.
In some embodiments of the present application, based on the foregoing solution, the communication module is further configured to: before responding to a test instruction, establishing electrical connection with the intelligent glasses to be tested; the drive module is further configured to: responding to a test instruction received through a test interface; acquiring test data corresponding to the test instruction; and sending the test data to the intelligent glasses to be tested which are electrically connected.
In some embodiments of the present application, based on the foregoing solution, the image processing module is further configured to: identifying a region of interest in the display screen and/or the standard screen; dividing the display picture and the standard picture based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures; performing visual algorithm processing on the plurality of partial display pictures to obtain a plurality of partial display data corresponding to the plurality of partial display pictures; determining a partial display data threshold value of the plurality of partial display data as the display data; performing visual algorithm processing on the partial standard pictures to obtain a plurality of partial standard data corresponding to the partial standard pictures; a partial standard data threshold value of the plurality of partial standard data is determined as the standard data.
In some embodiments of the present application, based on the foregoing solution, the image processing module is further configured to: respectively carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial display pictures to obtain a plurality of defect display data serving as the plurality of partial display data; and respectively carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial standard pictures to obtain the plurality of defect standard data serving as the plurality of partial standard data.
In some embodiments of the present application, based on the foregoing solution, the image processing module is further configured to: performing color space processing and/or gray scale processing on the partial display pictures to obtain a plurality of gray scale display data; the plurality of gray scale display data are used as the plurality of partial display data; performing color space processing and/or gray scale processing on the plurality of partial standard pictures to obtain a plurality of gray scale standard data; and using the plurality of gray standard data as the plurality of partial standard data.
In some embodiments of the present application, based on the foregoing solution, the image processing module is further configured to: identify a region of interest in the display picture and/or the standard picture; divide the display picture and the standard picture based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures; perform gray scale processing on the partial display pictures to obtain a plurality of pre-definition display data; perform gradient processing on the plurality of pre-definition display data to obtain definition display data serving as the display data; perform gray scale processing on the partial standard pictures to obtain a plurality of pre-definition standard data; and perform gradient processing on the plurality of pre-definition standard data to obtain definition standard data serving as the standard data.
In some embodiments of the present application, based on the foregoing solution, the image processing module is further configured to: comparing the display data with the standard data to obtain the matching similarity of the display data and the standard data; if the matching similarity is within a set range, determining that the test instruction is successfully executed; and if the matching similarity exceeds the set range, determining that the test instruction fails to execute.
According to one aspect of the embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the smart glasses automated test method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; and a storage means for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the smart glasses automated test method as described in the above embodiments.
In the technical scheme provided by some embodiments of the present application, in response to a test instruction, test data corresponding to the test instruction is sent to the intelligent glasses to be tested; a display picture corresponding to the test data, collected through the intelligent glasses to be tested, is acquired; a standard picture corresponding to the test data is obtained; visual algorithm processing is performed on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture; and, based on the display data and the standard data, it is determined whether the test instruction was successfully executed. With this automatic testing method, after the test instruction is issued, a visual algorithm is used for comparison and the test result is obtained automatically, so the efficiency of testing the intelligent glasses can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of embodiments of the present application may be applied;
FIG. 2 illustrates a flow chart of a smart glasses automated testing method according to one embodiment of the present application;
FIG. 3 illustrates a flow chart of a smart glasses automated testing method according to one embodiment of the present application;
FIG. 4 illustrates a block diagram of an intelligent eyeglass automated test apparatus, according to one embodiment of the present application;
fig. 5 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present application. One skilled in the relevant art will recognize, however, that the aspects of the application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of the embodiments of the present application may be applied.
As shown in fig. 1, a system architecture 100 may include automated test equipment 101 and smart glasses 102. The automated test equipment 101 and the smart glasses 102 can perform data transmission through wired or wireless communication, where the wireless transmission may be over a network, Bluetooth, and the like. It should be understood that the number of automated test equipment 101 and smart glasses 102 shown in fig. 1 is merely illustrative; there may be any number of each, as implementation requires. Alternatively, the smart glasses 102 may be Augmented Reality (AR) glasses.
It should be further noted that, apart from the processing in a single specific embodiment, there is no substantial distinction among the automated test equipment 101 in the present application: each automated test equipment 101 is capable of performing the solution provided herein. For example, in one embodiment of the present application, the automated test equipment 101 may, in response to a test instruction, send test data corresponding to the test instruction to the smart glasses 102; acquire a display picture corresponding to the test data, collected through the smart glasses 102; obtain a standard picture corresponding to the test data; perform visual algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture; and determine, based on the display data and the standard data, whether the test instruction was successfully executed.
It should be noted that the automatic testing method for the smart glasses provided in the embodiments of the present application is generally executed by the automated test equipment 101, and accordingly the automated test equipment 101 is generally deployed on the user side.
The implementation details of the technical solutions of the embodiments of the present application are described in detail below:
fig. 2 shows a flow chart of a smart glasses automated test method according to one embodiment of the present application, which may be performed by a terminal device, which may be the automated test equipment 101 shown in fig. 1. Referring to fig. 2, the automated testing method for smart glasses at least includes steps S210 to S250, which are described in detail as follows:
in step S210, in response to the test instruction, test data corresponding to the test instruction is sent to the smart glasses to be tested.
In one implementation manner of this embodiment, an electrical connection may be established with the smart glasses to be tested before responding to the test instruction. The connection may be wired or wireless, and the wireless connection includes, but is not limited to, a Bluetooth connection, a network connection, and the like.
In one implementation manner of this embodiment, test data corresponding to a test instruction may be obtained in response to the test instruction received through the test interface or the test key; and sending test data to the electrically connected intelligent glasses to be tested.
In one implementation manner of this embodiment, when the test instruction is received through the test interface, if a selection instruction for the smart glasses to be tested is received, the test interface corresponding to the smart glasses to be tested may be displayed; determining a test instruction corresponding to the test instruction selection operation according to the test instruction selection operation triggered by the user on the test interface, and acquiring test data corresponding to the test instruction; and sending test data to the electrically connected intelligent glasses to be tested.
In this embodiment, if the automated test equipment is electrically connected to only one pair of smart glasses, the selection instruction for the smart glasses to be tested may simply be the connection instruction between the automated test equipment and the smart glasses. If the automated test equipment is electrically connected to a plurality of smart glasses that execute the same test and share the same test interface, the selection instruction may likewise be the instruction that completes the connection between the automated test equipment and the smart glasses. The selection instruction may also be a start-up instruction or a test operation instruction of the automated test equipment: the test interface may be displayed automatically after the automated test equipment starts up or after the test operation instruction is received, and that test interface corresponds to the smart glasses to be tested that are connected to the automated test equipment.
In one implementation of this embodiment, the test instruction may include a power-on instruction, a display instruction, a screen-switching instruction, or a power-off instruction.
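Step S210's dispatch of test data for a received instruction can be sketched as follows. This is a minimal illustration only: the `TEST_DATA` table, the instruction names, and the `send` callable standing in for the established connection are all hypothetical assumptions, not details from the patent.

```python
# Hypothetical mapping from test instruction names to the test data
# payloads sent to the smart glasses; values are illustrative.
TEST_DATA = {
    "power_on": b"\x01",
    "display": b"\x02",
    "switch_screen": b"\x03",
    "power_off": b"\x04",
}

def send_test_data(instruction, send):
    """Look up the test data corresponding to a test instruction and
    hand it to the (already established) connection's send callable."""
    if instruction not in TEST_DATA:
        raise ValueError(f"unknown test instruction: {instruction}")
    payload = TEST_DATA[instruction]
    send(payload)
    return payload
```

In use, `send` would wrap whatever transport (Bluetooth, network) the electrical connection established before the test.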
In step S220, a display picture corresponding to the test data, collected through the smart glasses to be tested, is obtained.
In one implementation manner of this embodiment, the display picture corresponding to the test data may be acquired through a camera or other image acquisition device, where the camera or other image acquisition device may be a part of the automated test equipment, or may be provided separately from the automated test equipment.
In step S230, a standard screen corresponding to the test data is acquired.
In one implementation of the present embodiment, a standard picture corresponding to the test data, stored in advance, may be acquired from the terminal device; alternatively, the standard picture corresponding to the test data may be retrieved from a server or the cloud through a network or other communication means. The standard picture is the picture that the smart glasses should display in an ideal state.
In step S240, visual algorithm processing is performed on the display screen and the standard screen, so as to obtain display data corresponding to the display screen and standard data corresponding to the standard screen.
In one implementation of the present embodiment, a region of interest (ROI) in the display picture or the standard picture may be identified; the display picture and the standard picture are divided based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures; visual algorithm processing is performed on the plurality of partial display pictures to obtain a plurality of partial display data corresponding to them; a partial display data threshold value of the plurality of partial display data is determined as the display data, where the selected data may be all or part of the partial display data; visual algorithm processing is performed on the plurality of partial standard pictures to obtain a plurality of partial standard data corresponding to them; and a partial standard data threshold value of the plurality of partial standard data is determined as the standard data, where the selected data may be all or part of the partial standard data.
In this embodiment, the region of interest may also be obtained by identifying the regions of interest in the display screen and the standard screen, that is, identifying the regions of interest in the display screen and the regions of interest in the standard screen, respectively, and then regarding the overlapping regions of the regions of interest in the display screen and the regions of interest in the standard screen as the regions of interest.
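Taking the overlap of the display picture's ROI and the standard picture's ROI, as described above, amounts to a rectangle intersection. A minimal sketch, assuming ROIs are given as `(x, y, width, height)` tuples (a representation chosen here for illustration):

```python
def roi_intersection(a, b):
    """Overlap of two rectangular ROIs, each given as (x, y, width,
    height); returns None when the rectangles do not overlap."""
    x = max(a[0], b[0])
    y = max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    if x2 <= x or y2 <= y:
        return None
    return (x, y, x2 - x, y2 - y)
```

The resulting rectangle would then be used to crop both pictures before the partial-picture division.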
In one implementation of the present embodiment, image smoothing processing and/or image binarization processing and/or edge detection processing may be performed on the plurality of partial display pictures respectively to obtain a plurality of defect display data, which may be all or part of the plurality of partial display data. For example, image smoothing processing is performed on each partial display picture to obtain a smoothing processing result used to detect defects in that partial display picture; the smoothing processing results corresponding to the plurality of partial display pictures then form all or part of the plurality of defect display data. Likewise, image binarization processing or edge detection processing may be performed on the plurality of partial display pictures respectively, and the resulting binarization processing results or edge detection processing results may also be included in the plurality of defect display data.
In one implementation of this embodiment, the visual algorithm processing may be performed on a plurality of partial standard frames, and may include: and carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial standard pictures to obtain a plurality of defect standard data, wherein the plurality of defect standard data can be all or part of the plurality of partial standard data as the plurality of partial standard data.
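The smoothing and binarization operations named above can be sketched in plain Python on 2-D lists of gray values. This is a generic illustration of the two classic operations, not the patent's implementation; the 3x3 box filter and the fixed threshold of 128 are assumptions.

```python
def smooth(img):
    """3x3 box-filter image smoothing; border pixels are left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) // 9
    return out

def binarize(img, threshold=128):
    """Fixed-threshold image binarization: gray >= threshold -> 1, else 0."""
    return [[1 if px >= threshold else 0 for px in row] for row in img]
```

A real pipeline would typically smooth first to suppress noise, then binarize or run edge detection to isolate defect regions.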
In one implementation of this embodiment, color space processing and/or gray scale processing may be performed on a plurality of partial display images to obtain a plurality of gray scale display data; the plurality of gradation display data are set as a plurality of partial display data.
In this embodiment, the plurality of gray scale display data may be all or part of the plurality of partial display data, and the gray scale display data may include color space channel values obtained by color space processing; the color space may be the Hue-Saturation-Value (HSV) color space, the Red-Green-Blue (RGB) color space, the Lab color space (a color-opponent space), or the like. The gray scale display data may further include a luminance value obtained by gray scale processing: gray scale processing is performed on each partial display picture to obtain the gray value of every pixel in that picture, and the average pixel gray value of the picture is used as its luminance value.
In one implementation of this embodiment, color space processing and/or gray scale processing may be performed on a plurality of partial standard images to obtain a plurality of gray scale standard data; the plurality of gradation standard data may be all or part of the plurality of partial standard data as the plurality of partial standard data.
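The luminance computation described above (per-pixel gray conversion, then the picture's average gray value) can be sketched as follows. The Rec.601 weights used for the RGB-to-gray conversion are a common convention assumed here; the patent does not specify the weighting.

```python
def luminance(rgb_img):
    """Convert each (r, g, b) pixel to a gray value with the common
    Rec.601 weights, then return the average gray value of the picture
    as its luminance value."""
    grays = [0.299 * r + 0.587 * g + 0.114 * b
             for row in rgb_img for (r, g, b) in row]
    return sum(grays) / len(grays)
```

The same routine would be applied to each partial display picture and each partial standard picture before the deviation comparison.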
In one implementation of the present embodiment, a region of interest in the display picture and/or the standard picture may be identified; the display picture and the standard picture are divided based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures; gray scale processing is performed on the partial display pictures to obtain a plurality of pre-definition display data; gradient (Tenengrad) processing is performed on the plurality of pre-definition display data to obtain definition display data, which may be all or part of the display data; gray scale processing is performed on the partial standard pictures to obtain a plurality of pre-definition standard data; and gradient processing is performed on the plurality of pre-definition standard data to obtain definition standard data, which may be all or part of the standard data.
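The Tenengrad gradient measure named above is conventionally the mean (or sum) of squared Sobel gradient magnitudes; a larger value indicates a sharper picture. A minimal sketch of that standard formulation, operating on a 2-D list of gray values (the patent does not give its exact variant, so this is an assumed textbook version):

```python
def tenengrad(gray):
    """Tenengrad sharpness measure: mean squared Sobel gradient
    magnitude over the interior pixels of a 2-D gray image."""
    h, w = len(gray), len(gray[0])
    total, n = 0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical 3x3 Sobel responses at (x, y).
            gx = (gray[y-1][x+1] + 2*gray[y][x+1] + gray[y+1][x+1]
                  - gray[y-1][x-1] - 2*gray[y][x-1] - gray[y+1][x-1])
            gy = (gray[y+1][x-1] + 2*gray[y+1][x] + gray[y+1][x+1]
                  - gray[y-1][x-1] - 2*gray[y-1][x] - gray[y-1][x+1])
            total += gx * gx + gy * gy
            n += 1
    return total / n if n else 0.0
```

A flat picture scores 0, while a picture with a crisp edge scores high, which is why the measure serves as definition (sharpness) data.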
With continued reference to fig. 2, in step S250, it is determined whether the test instruction was successfully executed based on the display data and the standard data.
In one implementation of this embodiment, the display data and the standard data may be compared to obtain a matching similarity between them; if the matching similarity is within the set range, it is determined that the test instruction was successfully executed; if the matching similarity falls outside the set range, it is determined that the test instruction failed. In this embodiment, the difference between the display data and the standard data may be calculated first, and the matching similarity then obtained from the ratio of that difference to the display data or the standard data.
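The difference-then-ratio similarity described above can be sketched as follows. The exact formula and the `min_similarity` cut-off of 0.9 are illustrative assumptions; the patent only states that a difference is computed and its ratio to the display or standard data yields the similarity.

```python
def matching_similarity(display_value, standard_value):
    """Similarity of a display-data value to a standard-data value:
    1 minus the ratio of their difference to the standard value, so
    1.0 means identical."""
    if standard_value == 0:
        return 1.0 if display_value == 0 else 0.0
    return 1.0 - abs(display_value - standard_value) / abs(standard_value)

def instruction_succeeded(display_value, standard_value, min_similarity=0.9):
    """The test instruction counts as successfully executed when the
    matching similarity lies within the set range."""
    return matching_similarity(display_value, standard_value) >= min_similarity
```

In practice the similarity would be computed per metric (defect data, color values, luminance, gradient) and the set range tuned per metric.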
The automated test method of fig. 2 comprises: in response to a test instruction, sending test data corresponding to the test instruction to the intelligent glasses to be tested; acquiring a display picture corresponding to the test data, collected through the intelligent glasses to be tested; obtaining a standard picture corresponding to the test data; performing visual algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture; and determining, based on the display data and the standard data, whether the test instruction was successfully executed. With this automated test method, after the test instruction is issued, a visual algorithm is used for comparison and the test result is obtained automatically, so the efficiency of testing the intelligent glasses can be improved.
Fig. 3 shows a flow chart of a smart glasses automated test method according to one embodiment of the present application, which may be performed by a terminal device, such as the automated test equipment 101 shown in fig. 1. The automated test equipment 101 may comprise: a computer, an automated test tool, a Bluetooth dongle, a camera, and a test stand.
Referring to fig. 3, the method for automatically testing the intelligent glasses is described in detail as follows:
The Bluetooth dongle and the camera are connected to the computer; the automated testing tool is installed on the computer and runs the automated testing method for intelligent glasses shown in fig. 2. After the intelligent glasses are paired with the Bluetooth dongle of the automated testing tool, the glasses are placed on the test stand, with the camera lens on the stand facing the imaging display area of the lens to simulate the position of the eyes. Through Bluetooth dongle commands, the intelligent glasses are controlled to simulate key and/or touch actions to switch pictures and click. The imaging picture captured by the camera is passed into the automated testing tool, which performs template comparison via a visual algorithm and saves the data, so as to judge whether picture switching succeeded and whether each function was triggered and took effect.
The automated testing tool may include a graphical user interface (Graphical User Interface, GUI), a communication module (establishing communication among the intelligent glasses, the Bluetooth dongle, the camera and the computer), a driving module (remotely switching the glasses' screen via the Bluetooth dongle and remotely triggering the camera to take photos), an image processing module (processing pictures with the visual algorithm) and a data storage module (storing test data).
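The division into driving, image-processing and data-storage modules can be sketched as a minimal orchestration skeleton; this is a hedged illustration only, and the class and method names are assumptions rather than the tool's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class TestRecord:
    """One saved test result, kept for traceability."""
    instruction: str
    passed: bool
    display_data: list = field(default_factory=list)
    standard_data: list = field(default_factory=list)

class AutomatedTestTool:
    """Skeleton of the tool: a driver issues dongle commands, an image
    processor compares frames against templates, and a data store keeps
    the results."""
    def __init__(self, driver, image_processor, data_store):
        self.driver = driver
        self.image_processor = image_processor
        self.data_store = data_store

    def run(self, instruction):
        self.driver.send(instruction)             # switch the glasses' screen
        frame = self.driver.capture()             # photograph the imaging area
        template = self.driver.template(instruction)
        passed = self.image_processor.compare(frame, template)
        self.data_store.append(TestRecord(instruction, passed))
        return passed
```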
The test-decision rules may include: judging the quality of the display picture, whether a key or touch-key interface switch succeeded, and whether a function was triggered and took effect, by judging the degree of match between the standard picture and the actually acquired display picture, after processing, in terms of defect data, RGB color values, average pixel values and gradient-value deviation. The specific rules are explained in the embodiment of fig. 2.
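A hedged sketch of such a decision rule, assuming each metric (defect data, RGB color value, average pixel value, gradient value) has been reduced to a number and is judged against a per-metric tolerance; the metric names and tolerance values here are illustrative assumptions:

```python
def within_tolerance(display_metrics, standard_metrics, tolerances):
    """Judge a frame by checking each metric's deviation from the standard
    against a per-metric tolerance. Returns (passed, failing_metric)."""
    for name, tol in tolerances.items():
        deviation = abs(display_metrics[name] - standard_metrics[name])
        if deviation > tol:
            return False, name  # report the first failing metric
    return True, None
```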
The automated testing method for intelligent glasses shown in fig. 3 controls and switches the application-function display of the intelligent glasses through software communication, reducing the inconvenience of manual operation; it acquires the imaging picture with a vision camera and judges the result automatically by template comparison with a visual algorithm, reducing the error rate of manual judgment and intervention; because the developed automated testing tool runs without manual participation, the testing manpower required, and hence the labor cost, is reduced; and the tool saves the result together with the original imaging-picture data while testing and judging, improving test accuracy and facilitating traceability.
The automated testing method for intelligent glasses solves the inconvenience of performing functional test operations while the intelligent glasses are worn; the inconsistency of judgment standards for the imaging-picture effect when display-performance tests are carried out by different wearers; the inability to test intelligent glasses in large batches; and the inability to save the imaging picture during manual observation tests.
The following describes an embodiment of the apparatus of the present application that may be used to perform the smart glasses automated testing method of the above-described embodiments of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the method for automatically testing smart glasses described in the present application.
Fig. 4 shows a block diagram of an intelligent eyeglass automation test device, which may be provided within a terminal device, according to one embodiment of the present application.
Referring to fig. 4, an intelligent glasses automation test apparatus 400 according to an embodiment of the present application includes: a driving module 401, an image processing module 402, a communication module 403, a display module 404 and a data storage module 405.
The driving module 401 is configured to, in response to a test instruction, send test data corresponding to the test instruction to the intelligent glasses to be tested, obtain a display picture of the corresponding test data acquired through the intelligent glasses to be tested, and obtain a standard picture corresponding to the test data. The image processing module 402 is electrically connected to the driving module and is configured to receive the display picture and the standard picture sent by the driving module, perform algorithm processing on them to obtain display data corresponding to the display picture and standard data corresponding to the standard picture, and determine, based on the display data and the standard data, whether the test instruction was executed successfully. The communication module 403 is electrically connected to the driving module, performs data transmission with it, and establishes an electrical connection with the intelligent glasses to be tested. The display module 404 is electrically connected to the driving module and is configured to display, upon receiving a selection instruction for the intelligent glasses to be tested, a test interface corresponding to those glasses, determine the test instruction corresponding to a test-instruction selection operation triggered by the user on the test interface, and send that test instruction to the driving module. The data saving module 405 is electrically connected to the image processing module and is configured to save the display data, the standard data, the display picture, the standard picture and the test result received from the image processing module.
In some embodiments of the present application, based on the foregoing scheme, the communication module 403 is further configured to: before responding to the test instruction, establishing electrical connection with the intelligent glasses to be tested; the drive module is further configured to: responding to a test instruction received through a test interface; acquiring test data corresponding to the test instruction; and sending test data to the electrically connected intelligent glasses to be tested.
In some embodiments of the present application, based on the foregoing scheme, the image processing module 402 is further configured to: identifying a region of interest in the display screen and/or the standard screen; dividing a display picture and a standard picture based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures; performing visual algorithm processing on the multiple partial display pictures to obtain multiple partial display data corresponding to the multiple partial display pictures; determining a partial display data threshold value in the plurality of partial display data as display data; performing visual algorithm processing on the plurality of partial standard pictures to obtain a plurality of partial standard data corresponding to the plurality of partial standard pictures; a partial standard data threshold value among the plurality of partial standard data is determined as standard data.
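As an illustrative sketch of the region-of-interest division and the reduction of the partial data to a single value described above: here the "partial data threshold value" is read as an extreme over the partial values, with `reduce=max` as an assumption not mandated by the text, and ROIs given as (x, y, width, height) tuples:

```python
import numpy as np

def split_rois(image, rois):
    """Divide a frame into partial frames along the given regions of
    interest, each ROI being an (x, y, width, height) tuple."""
    return [image[y:y + h, x:x + w] for (x, y, w, h) in rois]

def aggregate_partial_data(partials, metric, reduce=max):
    """Run a visual-algorithm metric on every partial frame and reduce the
    partial values to a single number serving as the display (or standard)
    data. Taking the maximum is an assumption for illustration."""
    return reduce(metric(p) for p in partials)
```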
In some embodiments of the present application, based on the foregoing scheme, the image processing module 402 is further configured to: respectively carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial display pictures to obtain a plurality of defect display data serving as a plurality of partial display data; and respectively carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial standard pictures to obtain a plurality of defect standard data serving as a plurality of partial standard data.
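A minimal pure-NumPy sketch of the smoothing, binarization and edge-detection chain that yields defect data for one partial picture; the box-filter size, the threshold of 200 and the edge-pixel count as the defect measure are all assumptions, not specified by the embodiment:

```python
import numpy as np

def defect_data(partial, thresh=200):
    """Defect data for one partial picture: smooth with a 3x3 box filter,
    binarize against a brightness threshold, then count edge pixels in the
    binary image as a rough defect measure."""
    img = partial.astype(float)
    # 3x3 box smoothing over the image interior (borders dropped for brevity)
    smoothed = (img[:-2, :-2] + img[:-2, 1:-1] + img[:-2, 2:] +
                img[1:-1, :-2] + img[1:-1, 1:-1] + img[1:-1, 2:] +
                img[2:, :-2] + img[2:, 1:-1] + img[2:, 2:]) / 9.0
    binary = (smoothed > thresh).astype(int)
    # Edge pixels: positions where the binary value changes between neighbours
    edges = (np.abs(np.diff(binary, axis=0)).sum() +
             np.abs(np.diff(binary, axis=1)).sum())
    return int(edges)
```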
In some embodiments of the present application, based on the foregoing scheme, the image processing module 402 is further configured to: performing color space processing and/or gray scale processing on the multiple partial display pictures to obtain multiple gray scale display data; a plurality of gradation display data are set as a plurality of partial display data; performing color space processing and/or gray scale processing on the plurality of partial standard pictures to obtain a plurality of gray scale standard data; the plurality of gradation standard data are used as the plurality of partial standard data.
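The color-space and gray-scale step can be sketched as follows, assuming RGB input and the ITU-R BT.601 luminance weights, with the mean gray level taken as the partial's gray-scale data; the mean aggregate is an assumption for illustration:

```python
import numpy as np

def gray_data(partial_rgb):
    """Convert an RGB partial frame to grayscale with the ITU-R BT.601
    luminance weights and return the mean gray level as that partial's
    gray-scale data."""
    r, g, b = partial_rgb[..., 0], partial_rgb[..., 1], partial_rgb[..., 2]
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    return float(gray.mean())
```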
In some embodiments of the present application, based on the foregoing scheme, the image processing module 402 is further configured to: identify a region of interest in the display picture and/or the standard picture; divide the display picture and the standard picture based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures; perform gray-scale processing on the plurality of partial display pictures to obtain a plurality of pre-definition display data; perform gradient processing on the plurality of pre-definition display data to obtain definition display data serving as the display data; perform gray-scale processing on the plurality of partial standard pictures to obtain a plurality of pre-definition standard data; and perform gradient processing on the plurality of pre-definition standard data to obtain definition standard data serving as the standard data.
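A hedged sketch of the gray-scale-then-gradient chain for definition (sharpness) data: a sharp picture has strong local gradients while a blurred one has weak ones; using the mean absolute gradient as the aggregate is an assumption for illustration:

```python
import numpy as np

def sharpness_data(partial):
    """Definition (sharpness) of a partial frame: gray-scale the frame,
    then take the mean absolute gradient along both axes."""
    img = partial.astype(float)
    if img.ndim == 3:  # collapse color to gray first
        img = img.mean(axis=2)
    gy = np.abs(np.diff(img, axis=0)).mean()
    gx = np.abs(np.diff(img, axis=1)).mean()
    return float(gx + gy)
```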
In some embodiments of the present application, based on the foregoing scheme, the image processing module 402 is further configured to: comparing the display data with the standard data to obtain the matching similarity of the display data and the standard data; if the matching similarity is within the set range, determining that the test instruction is successfully executed; if the matching similarity exceeds the set range, determining that the test instruction fails to execute.
Fig. 5 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
It should be noted that, the computer system 500 of the electronic device shown in fig. 5 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 5, the computer system 500 includes a central processing unit (Central Processing Unit, CPU) 501, which can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 502 or a program loaded from a storage section 508 into a random access Memory (Random Access Memory, RAM) 503. In the RAM 503, various programs and data required for the system operation are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other through a bus 504. An Input/Output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a cathode ray tube (CRT) or liquid crystal display (Liquid Crystal Display, LCD), a speaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN (Local Area Network) card or a modem. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as needed. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as needed, so that a computer program read therefrom is installed into the storage section 508 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 509, and/or installed from the removable medium 511. When executed by the central processing unit (CPU) 501, the computer program performs the various functions defined in the system of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with a computer-readable computer program embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
A computer program embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Where each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, in accordance with embodiments of the present application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a usb disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a touch terminal, or a network device, etc.) to perform the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. An automated testing method for intelligent glasses is characterized by comprising the following steps:
responding to a test instruction, and sending test data corresponding to the test instruction to the intelligent glasses to be tested;
acquiring a display picture corresponding to the test data acquired through the intelligent glasses to be tested;
obtaining a standard picture corresponding to the test data;
performing visual algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture;
And determining whether the test instruction is successfully executed or not based on the display data and the standard data.
2. The automated smart eyewear testing method of claim 1, wherein an electrical connection is established with the smart eyewear under test prior to responding to a test instruction;
if a selection instruction aiming at the intelligent glasses to be tested is received, displaying a test interface corresponding to the intelligent glasses to be tested;
the response to the test instruction, sending test data corresponding to the test instruction to the intelligent glasses to be tested, comprises:
determining a test instruction corresponding to the test instruction selection operation according to the test instruction selection operation triggered by a user on the test interface;
acquiring test data corresponding to the test instruction;
and sending the test data to the intelligent glasses to be tested which are electrically connected.
3. The automated testing method of intelligent glasses according to claim 1, wherein the performing visual algorithm processing on the display screen and the standard screen to obtain display data corresponding to the display screen and standard data corresponding to the standard screen includes:
identifying a region of interest in the display screen and/or the standard screen;
Dividing the display picture and the standard picture based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures;
performing visual algorithm processing on the plurality of partial display pictures to obtain a plurality of partial display data corresponding to the plurality of partial display pictures;
determining a partial display data threshold value of the plurality of partial display data as the display data;
performing visual algorithm processing on the partial standard pictures to obtain a plurality of partial standard data corresponding to the partial standard pictures;
a partial standard data threshold value of the plurality of partial standard data is determined as the standard data.
4. The automated testing method of intelligent glasses according to claim 3, wherein the performing visual algorithm processing on the plurality of partial display frames to obtain a plurality of partial display data corresponding to the plurality of partial display frames comprises:
respectively carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial display pictures to obtain a plurality of defect display data serving as the plurality of partial display data;
The visual algorithm processing is performed on the plurality of partial standard pictures to obtain a plurality of partial standard data corresponding to the plurality of partial standard pictures, including:
and respectively carrying out image smoothing processing and/or image binarization processing and/or edge detection processing on the plurality of partial standard pictures to obtain the plurality of defect standard data serving as the plurality of partial standard data.
5. The automated testing method of intelligent glasses according to claim 3, wherein the performing visual algorithm processing on the plurality of partial display frames to obtain a plurality of partial display data corresponding to the plurality of partial display frames comprises:
performing color space processing and/or gray scale processing on the partial display pictures to obtain a plurality of gray scale display data;
the plurality of gray scale display data are used as the plurality of partial display data;
the visual algorithm processing is performed on the plurality of partial standard pictures to obtain a plurality of partial standard data corresponding to the plurality of partial standard pictures, including:
performing color space processing and/or gray scale processing on the plurality of partial standard pictures to obtain a plurality of gray scale standard data;
And using the plurality of gray standard data as the plurality of partial standard data.
6. The automated testing method of intelligent glasses according to claim 1, wherein the performing visual algorithm processing on the display screen and the standard screen to obtain display data corresponding to the display screen and standard data corresponding to the standard screen includes:
identifying a region of interest in the display screen and/or the standard screen;
dividing the display picture and the standard picture based on the region of interest to obtain a plurality of partial display pictures and a plurality of partial standard pictures;
gray scale processing is carried out on the partial display pictures to obtain a plurality of pre-definition display data;
performing gradient processing on the plurality of pre-definition display data to obtain definition display data serving as the display data;
gray scale processing is carried out on the partial standard pictures to obtain a plurality of pre-definition standard data;
and carrying out gradient processing on the plurality of pre-definition standard data to obtain definition standard data serving as the standard data.
7. The automated smart eyewear testing method of claim 1, wherein the determining whether the test instruction was successfully executed based on the display data and the standard data comprises:
Comparing the display data with the standard data to obtain the matching similarity of the display data and the standard data;
if the matching similarity is within a set range, determining that the test instruction is successfully executed;
and if the matching similarity exceeds the set range, determining that the test instruction fails to execute.
8. An automated testing device for intelligent glasses, characterized by comprising:
The driving module is used for responding to the test instruction, sending test data corresponding to the test instruction to the intelligent glasses to be tested, acquiring a display picture corresponding to the test data acquired through the intelligent glasses to be tested, and acquiring a standard picture corresponding to the test data;
the image processing module is electrically connected with the driving module and is used for receiving the display picture and the standard picture sent by the driving module, carrying out algorithm processing on the display picture and the standard picture to obtain display data corresponding to the display picture and standard data corresponding to the standard picture, and determining whether the test instruction is successfully executed or not based on the display data and the standard data.
9. The automated smart eyewear testing device of claim 8, further comprising:
The communication module is electrically connected with the driving module, is used for carrying out data transmission with the driving module and is used for establishing electrical connection with the intelligent glasses to be tested;
the display module is electrically connected with the driving module and is used for displaying a test interface corresponding to the intelligent glasses to be tested if a selection instruction aiming at the intelligent glasses to be tested is received, determining a test instruction corresponding to the test instruction selection operation according to the test instruction selection operation triggered by a user on the test interface and sending the test instruction to the driving module;
and the data storage module is electrically connected with the image processing module and used for storing the display data and/or the standard data and/or the display picture and/or the standard picture and/or the test result received from the image processing module.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the smart glasses automated testing method of any of claims 1-7.
CN202310416016.XA 2023-04-18 2023-04-18 Automatic testing method and device for intelligent glasses and electronic equipment Pending CN116448385A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310416016.XA CN116448385A (en) 2023-04-18 2023-04-18 Automatic testing method and device for intelligent glasses and electronic equipment


Publications (1)

Publication Number Publication Date
CN116448385A true CN116448385A (en) 2023-07-18

Family

ID=87123355




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination