CN114071120A - Camera testing system, method, storage medium and electronic equipment

Camera testing system, method, storage medium and electronic equipment

Info

Publication number: CN114071120A
Application number: CN202010768613.5A
Authority: CN (China)
Prior art keywords: time, test, video stream, frame, equipment
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 马海斌
Current Assignee: Artek Microelectronics Co Ltd
Original Assignee: Artek Microelectronics Co Ltd
Application filed by Artek Microelectronics Co Ltd

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The system comprises a test device, a display device and a device to be tested, wherein the device to be tested comprises a camera, and the test device is respectively connected with the display device and the device to be tested; the test equipment is used for triggering the display equipment to output a display object; the device to be tested is used for shooting the display object through the camera to obtain a test video stream and sending the test video stream to the test device; the test equipment is further configured to receive the test video stream, acquire a first time and a second time, and determine a working delay of the camera according to the first time and the second time, where the first time includes a time when the test equipment triggers the display equipment to output the display object, and the second time includes a time when the test equipment receives a first frame of the display object when receiving the test video stream.

Description

Camera testing system, method, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of camera technologies, and in particular, to a camera testing system, a camera testing method, a storage medium, and an electronic device.
Background
With the rapid development of imaging technology, cameras are being installed on more and more products. In order to improve the shooting effect of a camera and guarantee the performance of a product, delay measurement is usually performed on the camera.
In the related art, the delay of a camera can be measured with a stopwatch. However, this method requires an operator to manually determine the moment at which the camera shoots and the moment at which the camera outputs the image, and to press the stopwatch by hand. Because manual operation is subject to visual error and reaction error, the accuracy of the working delay measured in this way is low.
Disclosure of Invention
In order to solve the above problems, the present disclosure provides a camera test system, a camera test method, a storage medium, and an electronic device.
In a first aspect, the present disclosure provides a camera testing system, the system comprising: the device to be tested comprises a camera, and the test device is connected with the display device and the device to be tested respectively; the test equipment is used for triggering the display equipment to output a display object; the device to be tested is used for shooting the display object through the camera to obtain a test video stream and sending the test video stream to the test device; the test equipment is further configured to receive the test video stream, acquire a first time and a second time, and determine a working delay of the camera according to the first time and the second time, where the first time includes a time when the test equipment triggers the display equipment to output the display object, and the second time includes a time when the test equipment receives a first frame of the display object when receiving the test video stream.
Optionally, in case the display device comprises a light emitting device, the display object comprises light; the test equipment is also used for triggering the light-emitting equipment to emit light according to the target brightness.
Optionally, the test apparatus comprises: the processor is respectively connected with the display equipment and the video receiver, and the video receiver is connected with the equipment to be tested; the processor is used for triggering the light-emitting device to emit light according to the target brightness; the video receiver is used for receiving the test video stream sent by the equipment to be tested; the processor is further configured to store the first time when the light-emitting device is triggered to emit light according to the target brightness, obtain the first time after the test video stream sent by the device to be tested is received, determine the first frame according to a brightness characteristic of each frame of data in the test video stream, obtain the second time when the first frame is received, obtain a time difference between the first time and the second time, and use the time difference as a working delay of the camera.
Optionally, the processor is further configured to perform binarization processing on data of a Y channel of each frame of the test video stream to obtain target frame data, obtain a brightness feature of each target frame data, and determine the first frame according to the brightness feature.
Optionally, in a case where the presentation device includes a display device, the presentation object includes a target video stream; the test equipment is also used for sending the target video stream to the display equipment; the display device is used for displaying the target video stream.
Optionally, the test apparatus comprises: the processor is respectively connected with the signal source and the video receiver, the video receiver is connected with the equipment to be tested, and the signal source is connected with the display equipment; the processor is used for outputting the target video stream through the signal source; the video receiver is used for receiving the test video stream sent by the equipment to be tested; the processor is further configured to, when the target video stream is output, store a first time for outputting a first frame of the target video stream, obtain the first time after receiving the test video stream, determine the first frame according to a luminance characteristic of the first frame of the target video stream and a luminance characteristic of each frame of data in the test video stream, obtain the second time for receiving the first frame, obtain a time difference between the first time and the second time, and use the time difference as a working delay of the camera.
Optionally, the test device is further configured to obtain a size of the display device, obtain a plurality of difference frames between every two frames of data in the test video stream after receiving the test video stream sent by the device to be tested, and determine whether a frame rate of the device to be tested is stable according to image features of the plurality of difference frames, where the difference frames include difference data between every two frames of data.
Optionally, in a case that the image feature includes an area of an image region corresponding to the difference frame, the test device is further configured to determine that the frame rate of the device to be tested is stable in a case that an area of an image region corresponding to each difference frame is smaller than or equal to a preset area threshold; or, in a case that the image feature includes a width of an image region corresponding to the difference frame, the test device is further configured to determine that the frame rate of the device under test is stable in a case that the width of the image region corresponding to each difference frame is smaller than or equal to a preset width threshold.
Optionally, the device to be tested is further configured to determine a first frame of the display object from original image data captured by the camera, determine a capturing time for capturing the first frame, and send the capturing time to the test device; the test equipment is further used for obtaining a time difference value between the shooting time and the time when the test equipment triggers the display equipment to output the display object, and the time difference value is used as the working delay of the camera.
In a second aspect, the present disclosure provides a camera testing method, applied to a testing device in a camera testing system, where the system includes: the device to be tested comprises a camera, and the test device is connected with the display device and the device to be tested respectively; the method comprises the following steps: triggering the display equipment to output a display object; under the condition that the display object is shot by the camera to obtain a test video stream, receiving the test video stream sent by the equipment to be tested; acquiring first time and second time, and determining the working delay of the camera according to the first time and the second time, wherein the first time comprises the time when the test equipment triggers the display equipment to output the display object, and the second time comprises the time when the test equipment receives the first frame of the display object when receiving the test video stream.
Optionally, in case the display device comprises a light emitting device, the display object comprises light; the triggering the display device to output the display object comprises: and triggering the light-emitting device to emit light according to the target brightness.
Optionally, the method further comprises: storing the first time when the light-emitting device is triggered to emit light according to the target brightness; the obtaining the first time and the second time and determining the working delay of the camera according to the first time and the second time comprises: acquiring the first time; determining the first frame according to the brightness characteristic of each frame data in the test video stream; acquiring the second time when the first frame is received; and acquiring a time difference value between the first time and the second time, and taking the time difference value as the working delay of the camera.
Optionally, the determining the first frame according to the brightness characteristic of each frame of data in the test video stream includes: carrying out binarization processing on the data of the Y channel of each frame of the test video stream to obtain target frame data; acquiring brightness characteristics corresponding to each target frame data; and determining the first frame according to the brightness characteristic.
Optionally, in a case where the presentation device includes a display device, the presentation object includes a target video stream; the triggering the display device to output the display object comprises: and sending the target video stream to the display equipment so as to enable the display equipment to display the target video stream.
Optionally, the method further comprises: storing a first time to output a first frame of the target video stream while outputting the target video stream; the obtaining the first time and the second time and determining the working delay of the camera according to the first time and the second time comprises: acquiring the first time; determining the first frame according to the brightness characteristic of the first frame of the target video stream and the brightness characteristic of each frame of data in the test video stream; acquiring the second time when the first frame is received; and acquiring a time difference value between the first time and the second time, and taking the time difference value as the working delay of the camera.
Optionally, before the sending the target video stream to the display device, the method further includes: acquiring the size of the display equipment; after the receiving the test video stream sent by the device under test, the method further includes: acquiring a plurality of difference frames between every two frames of data in the test video stream; and determining whether the frame rate of the device to be tested is stable according to the image characteristics of the plurality of difference frames, wherein the difference frames comprise difference data between every two frames of data.
Optionally, in a case that the image feature includes an area of an image region corresponding to the difference frame, the determining, according to the image features of the difference frames, whether the frame rate of the device under test is stable includes: determining that the frame rate of the device to be tested is stable under the condition that the area of the image area corresponding to each difference frame is smaller than or equal to a preset area threshold; or, in a case that the image feature includes a width of an image area corresponding to the difference frame, the determining whether the frame rate of the device under test is stable according to the image features of the difference frames includes: and determining that the frame rate of the device to be tested is stable under the condition that the width of the image area corresponding to each difference frame is less than or equal to a preset width threshold.
Optionally, after the triggering the display device outputs the display object, the method further includes: determining a first frame of the display object in original image data shot by the camera by the equipment to be tested, and receiving shooting time sent by the equipment to be tested after determining shooting time for shooting the first frame; and acquiring a time difference value between the shooting time and the time when the test equipment triggers the display equipment to output the display object, and taking the time difference value as the working delay of the camera.
In a third aspect, the present disclosure provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the second aspect of the present disclosure.
In a fourth aspect, the present disclosure provides an electronic device comprising: a memory having a computer program stored thereon; a processor for executing the computer program in the memory to implement the steps of the method of the second aspect of the disclosure.
According to the technical scheme, the camera testing system comprises testing equipment, display equipment and equipment to be tested, wherein the equipment to be tested comprises a camera, and the testing equipment is respectively connected with the display equipment and the equipment to be tested; the test equipment is used for triggering the display equipment to output a display object; the device to be tested is used for shooting the display object through the camera to obtain a test video stream and sending the test video stream to the test device; the test equipment is further configured to receive the test video stream, acquire a first time and a second time, and determine a working delay of the camera according to the first time and the second time, where the first time includes a time when the test equipment triggers the display equipment to output the display object, and the second time includes a time when the test equipment receives a first frame of the display object when receiving the test video stream. That is to say, the test equipment can determine the working delay of the camera of the device to be tested according to the first time for triggering the display equipment to output the display object and the second time for receiving the first frame of the display object sent by the device to be tested, so that the first time and the second time can be directly obtained through the test equipment, the obtaining mode is simple, the whole test process does not need manual participation, the first time and the second time are more accurate, and the accuracy of measurement can be improved.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a schematic structural diagram of a camera testing system according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a second camera testing system provided in the embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a third camera testing system provided in the embodiment of the present disclosure;
FIG. 4 is a schematic illustration of a physical dimension marking provided by embodiments of the present disclosure;
fig. 5 is a schematic diagram of a difference frame provided by the embodiment of the present disclosure;
fig. 6 is a flowchart of a method for testing a camera according to an embodiment of the present disclosure;
fig. 7 is a flowchart of a second method for testing a camera according to an embodiment of the present disclosure;
fig. 8 is a flowchart of a third method for testing a camera according to an embodiment of the present disclosure;
fig. 9 is a block diagram of an electronic device provided by an embodiment of the present disclosure.
Description of the reference numerals
101 test equipment 102 display equipment
103 device under test 1011 processor
1012 video receiver 1013 signal source
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
In the description that follows, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or order.
First, an application scenario of the present disclosure will be explained. The present disclosure can be applied to scenarios in which the working delay of a camera is tested; the camera may be a camera in a terminal device or a camera in an unmanned aerial vehicle, which is not limited by the present disclosure. In order to ensure the performance of a product, delay measurement needs to be performed on the camera used in the product during the product debugging stage, so as to avoid poor user experience caused by an excessively long working delay of the camera.
In the related art, the working delay of a camera to be tested can be obtained manually. For example, a stopwatch can be started at the same moment as the camera to be tested starts shooting and stopped at the same moment as the camera outputs the captured image; the time counted by the stopwatch is then taken as the working delay of the camera. Alternatively, a clock display interface can be photographed with the camera to be tested; when the camera outputs the image, the current time shown on the clock display interface and the shooting time shown in the captured image are read manually, and the difference between them is taken as the working delay. However, these manual approaches require an operator to judge whether the camera has output an image and to press the stopwatch or read the corresponding times by hand. Because human vision and reaction are subject to error, the accuracy of the working delay obtained in this way is low, and the manual operation also leads to low testing efficiency.
In order to solve the above problems, the present disclosure provides a camera test system, a camera test method, a storage medium, and an electronic device. The test equipment in the camera test system can determine the working delay of the equipment to be tested according to the first time for triggering the display equipment to output the display object and the second time for receiving the first frame of the display object sent by the equipment to be tested, so that the first time and the second time can be directly obtained through the test equipment, the obtaining mode is simple, the whole process does not need manual participation, the first time and the second time are more accurate, and the accuracy of measurement can be improved.
The present disclosure is described below with reference to specific examples.
Fig. 1 is a schematic structural diagram of a camera test system according to an embodiment of the present disclosure. As shown in fig. 1, the camera test system 100 may include a test device 101, a display device 102, and a device under test 103, where the device under test 103 includes a camera, and the test device 101 is connected to the display device 102 and the device under test 103 respectively;
the testing device 101 is configured to trigger the display device 102 to output a display object;
the device under test 103 is configured to capture the display object through the camera to obtain a test video stream, and send the test video stream to the test device 101;
the testing device 101 is further configured to receive the test video stream, obtain a first time and a second time, and determine a working delay of the camera according to the first time and the second time, where the first time may include a time when the testing device 101 triggers the display device 102 to output the display object, and the second time may include a time when the testing device 101 receives a first frame of the display object when receiving the test video stream.
It should be noted that before the camera test system 100 is used, a corresponding test environment needs to be established, and the present disclosure may determine the corresponding test environment according to the type of the display device 102. For example, in the case that the display device 102 is a lighting device, the test environment may be a darkroom environment, so that the brightness of the test environment changes significantly after the lighting device emits light. In addition, the camera of the device under test 103 may be aligned to the display object output by the display device 102, so as to ensure that the camera can shoot the display object. For example, if the presentation device 102 is a light emitting device, the camera may be directed at the light emitting device; if the presentation device 102 is a projector, the camera can be directed at the area projected by the projector.
The test device 101 may comprise a start button by which a user may start the test device 101, and after the test device 101 starts to operate, the entire camera test system 100 enters an operating state. In addition, in order to ensure that the camera of the device under test 103 can capture the moment when the display device 102 outputs the display object, the device under test 103 may already be in an operating state before the test device 101 is started, that is, the camera of the device under test 103 is already in a capturing state.
After receiving the trigger operation of the user on the start button, the test device 101 may immediately trigger the display device 102 to output a display object, or trigger the display device 102 to output a display object after a preset time period; the disclosure does not limit the time point at which the display device 102 is triggered. The test device 101 may trigger the display device 102 in various manners, for example, by a pulse signal or by a level jump, which is not limited in this disclosure. In addition, the test device 101 can also record the first time at which it triggers the display device 102 to output the display object.
The presentation device 102 may output a presentation object after receiving the trigger operation of the test device 101. Because the device under test 103 is always in a working state, and the camera of the device under test 103 is always shooting the position of the display object, the camera can shoot the whole process before and after the display device 102 outputs the display object.
After the test device 101 is started, a test video stream sent by the device under test 103 may be received. Since the test video stream may include a picture before the display device 102 outputs the display object, a picture at a first time when the display object is output, and a picture after the display object is output, after receiving the test video stream sent by the device under test 103, the test device 101 needs to determine, from the test video stream, a time at the first time when the display device 102 outputs the display object, that is, a second time including a first frame of the display object. After determining the second time, the testing device 101 may obtain a first time for triggering the display device 102 to output the display object, and obtain a time difference between the first time and the second time, where the time difference is a working delay of the camera of the device under test 103.
It should be noted that after the camera of the device under test 103 captures the original image data corresponding to the display object, the original image data needs to be encoded and compressed to obtain the test video stream, and then the test video stream is sent to the test device 101.
Encoding, compression, and the like consume processing time, and transmitting the test video stream from the device under test 103 to the test device 101 consumes transmission time, so the second time acquired by the test device 101 inevitably includes this processing time and transmission time. In order to solve this problem, the device under test 103 is further configured to determine the first frame of the display object from the original image data captured by the camera, determine the capturing time at which the first frame was captured, and send the capturing time to the test device 101; the test device 101 is further configured to obtain the time difference between the capturing time and the time at which the test device 101 triggered the display device 102 to output the display object, and use this time difference as the working delay of the camera. In this way, after the camera acquires the original image data containing the first frame of the display object, the device under test 103 can directly obtain the capturing time of that first frame and only needs to send the capturing time to the test device 101; the test device 101 can then determine the working delay of the camera directly from the difference between the capturing time and the time of outputting the display object. The processing time of the original image and the transmission time of the data are thereby excluded from the working delay, so the accuracy of the working delay can be improved.
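The timing bookkeeping described above reduces to simple timestamp arithmetic. The following is a minimal sketch, not part of the patent, contrasting the two variants: the receive-time-based delay, which still contains encoding and transmission time, and the capture-time-based delay, which excludes them. All names, the shared time base, and the millisecond values are illustrative assumptions.

```python
# Minimal sketch (not the patent's implementation) of the two delay calculations
# described above. All variable names and example values are illustrative; a
# shared time base between the test device and the device under test is assumed.

def delay_from_receive_time(trigger_ms: float, first_frame_received_ms: float) -> float:
    """Working delay measured at the test device: second time minus first time.
    Still includes encoding/compression and transmission time."""
    return first_frame_received_ms - trigger_ms

def delay_from_capture_time(trigger_ms: float, capture_ms: float) -> float:
    """Working delay based on the capture time reported by the device under test.
    Excludes encoding/compression and transmission time."""
    return capture_ms - trigger_ms

# Example: display object triggered at t = 1000 ms, first frame captured at
# t = 1120 ms, first frame received by the test device at t = 1180 ms.
print(delay_from_receive_time(1000.0, 1180.0))  # 180.0 ms (includes ~60 ms overhead)
print(delay_from_capture_time(1000.0, 1120.0))  # 120.0 ms (camera working delay only)
```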
The following describes a camera testing method according to the present disclosure with reference to some examples.
In an example, the display device 102 includes a light-emitting device.
In a case where the presentation device 102 comprises a light-emitting device, the presentation object comprises light, and the test device 101 is configured to trigger the light-emitting device to emit light at a target brightness. The target brightness may be a brightness significantly different from that of the current test environment, so that the test device 101 can determine, according to the brightness change, the frame in which the display object first appears in the test video stream sent by the device under test 103.
Fig. 2 is a schematic structural diagram of a second camera test system provided in the embodiment of the present disclosure. As shown in fig. 2, the testing apparatus 101 includes a processor 1011 and a video receiver 1012, the processor 1011 is connected to the display apparatus 102 and the video receiver 1012, respectively, and the video receiver 1012 is connected to the device under test 103;
the processor 1011 is configured to trigger the light emitting device to emit light according to the target brightness;
the video receiver 1012, configured to receive the test video stream sent by the device under test 103;
the processor 1011 is further configured to store the first time when the light-emitting device is triggered to emit light according to the target brightness, obtain the first time after receiving the test video stream sent by the device under test 103, determine the first frame according to the brightness characteristic of each frame of data in the test video stream, obtain the second time when the first frame is received, obtain a time difference between the first time and the second time, and use the time difference as the working delay of the camera.
The processor 1011 of the test device 101 may send a trigger signal to the lighting device after receiving a start instruction from a user, and record a first time for triggering the lighting device, and the lighting device may emit light at a target brightness after receiving the trigger signal. If the light-emitting device is in a light-emitting state before receiving the trigger signal sent by the testing device 101, the target brightness may be a brightness that is greatly different from the initial brightness before receiving the trigger signal, for example, the target brightness may be brighter than the initial brightness or darker than the initial brightness; if the light-emitting device is in the off state until receiving the trigger signal sent by the testing device 101, the target brightness may be a lower brightness, as long as the difference between the frames of the light-emitting device before and after the light-emitting device emits light according to the target brightness, which is shot by the camera of the device under test 103, is large.
Since the camera of the device under test 103 needs to capture a picture of the light-emitting device at the first moment of emitting light according to the target brightness, the video receiver 1012 of the test device 101 may be controlled to receive the test video stream sent by the device under test 103 before the test device 101 sends the trigger signal to the light-emitting device. After receiving the test video stream sent by the device under test 103, the video receiver 1012 of the test device 101 may send the test video stream to the processor 1011 of the test device 101, where the processor 1011 may perform binarization processing on data of a Y channel of each frame of the test video stream to obtain target frame data, obtain a brightness feature of each target frame data, and determine a first frame according to the brightness feature.
For each frame of data of the test video stream, the processor 1011 may perform binarization processing on the Y channel data of the frame of data by using a method of the related art, and obtain a sum of the luminances of all the pixels. In the picture taken by the camera of the device under test 103, the luminance of the picture after the lighting device lights according to the target luminance is different from the luminance of the picture before the lighting device lights according to the target luminance, so that the first frame can be determined from the test video stream according to the luminance of each frame of data in the test video stream. For example, if the light-emitting device is in an off state before receiving the trigger signal sent by the testing device 101, and emits light according to the target brightness after receiving the trigger signal, a frame with significantly increased brightness may be determined from the test video stream as the first frame; if the light emitting device is in a light emitting state before receiving the trigger signal sent by the test device 101 and is turned off after receiving the trigger signal, a frame with significantly reduced brightness may be determined from the test video stream as the first frame.
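As a rough illustration of the processing described above, the following sketch binarizes the Y channel of each frame, sums the result as a brightness feature, and returns the index of the first frame whose brightness clearly jumps relative to the frames captured before the light-emitting device was triggered. The binarization threshold and the jump margin are assumptions, not values given in the patent.

```python
# A sketch, assuming 8-bit Y planes as numpy arrays; the threshold of 128 and
# the jump ratio are illustrative assumptions, not values from the patent.
import numpy as np

def binarize_y(y_plane: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Binarize a Y (luma) plane: pixels at or above the threshold become 1."""
    return (y_plane >= threshold).astype(np.uint8)

def brightness_feature(y_plane: np.ndarray) -> int:
    """Brightness feature of a frame: the sum of its binarized Y-channel pixels."""
    return int(binarize_y(y_plane).sum())

def find_first_frame(y_planes: list[np.ndarray], jump_ratio: float = 2.0) -> int | None:
    """Index of the first frame whose brightness clearly exceeds the baseline.

    The baseline is the brightness of the first captured frame (taken while the
    light-emitting device was still dark); a frame counts as the first frame of
    the display object once its brightness exceeds jump_ratio * baseline.
    """
    baseline = max(brightness_feature(y_planes[0]), 1)
    for i, y in enumerate(y_planes[1:], start=1):
        if brightness_feature(y) >= jump_ratio * baseline:
            return i
    return None
```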
In addition, after receiving the test video stream sent by the device under test 103, the test device 101 may obtain the first time of triggering the light-emitting device to emit light according to the target brightness, and, after determining the first frame, may obtain the second time at which the test device 101 received the first frame and obtain the difference between the first time and the second time, where the difference is used as the working delay of the camera of the device under test 103. Therefore, the test device 101 can determine the working delay of the device under test 103 by triggering the light-emitting device to emit light, and no manual participation is needed in the test process, so that the accuracy of the working delay can be improved.
It should be noted that, after emitting light at the target brightness, the light-emitting device may be turned off or may remain in a light-emitting state, which is not limited by the present disclosure.
Example two: the presentation device 102 includes a display device.
In case the presentation device 102 comprises a display device, the presentation object may comprise a target video stream, the test device 101 is further configured to send the target video stream to the display device, the display device is configured to display the target video stream. The target video stream may be a video stream to which a specific mark is added, for example, the target video stream has a high brightness.
Fig. 3 is a schematic structural diagram of a third camera test system provided in the embodiment of the present disclosure. As shown in fig. 3, the test apparatus 101 includes: a processor 1011, a signal source 1013 and a video receiver 1012, wherein the processor 1011 is connected to the signal source 1013 and the video receiver 1012, the video receiver 1012 is connected to the device under test 103, and the signal source 1013 is connected to the display device;
the processor 1011 is configured to output the target video stream via the signal source 1013;
the video receiver 1012, configured to receive the test video stream sent by the device under test 103;
the processor 1011 is further configured to, when the target video stream is output, store a first time for outputting a first frame of the target video stream, obtain the first time after receiving the test video stream, determine the first frame according to a luminance characteristic of the first frame of the target video stream and a luminance characteristic of each frame of data in the test video stream, obtain a second time for receiving the first frame, obtain a time difference between the first time and the second time, and use the time difference as a working delay of the camera.
The processor 1011 of the test device 101 may transmit a target video stream to the display device via the signal source 1013 after receiving a start instruction from a user, and record a first time at which a first frame of the target video stream is transmitted. If the display device does not display the video stream before receiving the target video stream, for example, the display device is in a standby black screen state, the target video stream may be any video stream, and when the display device displays the target video stream, the brightness of a display screen is obviously increased compared with the standby black screen state; if the display device has displayed an initial video stream before receiving the target video stream, the target video stream may be a specific video stream, for example, the brightness of the target video stream may be significantly changed from the brightness of the initial video stream. The display device can display the target video stream after receiving the target video stream transmitted by the processor 1011 through the signal source 1013.
In addition, if the target video stream is an arbitrary video stream, the camera of the device under test 103 needs to capture the first time when the display device displays the target video stream, so that the device under test 103 can start capturing the display device before the test device 101 sends the target video stream to the display device, and therefore, the test device 101 can control the video receiver 1012 of the test device 101 to receive the test video stream sent by the device under test 103 before sending the target video stream to the display device.
The video receiver 1012 of the testing device 101 may send the test video stream to the processor 1011 of the testing device 101 after receiving the test video stream sent by the device under test 103, and the processor 1011 may obtain a first time for outputting a first frame of the target video stream and obtain a target brightness characteristic of the first frame data of the target video stream. The processor 1011 can then obtain the luminance characteristics of each frame of data in the test video stream, use the frame of the test video stream with the same luminance characteristics as the target luminance characteristics as the first frame, and obtain the second time when the video receiver 1012 receives the first frame. For the method for obtaining the brightness feature, reference may be made to the method for obtaining the brightness feature in example one, and details are not described here.
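A sketch of this matching step is shown below. It reuses the binarized Y-channel pixel sum as the brightness feature and, because exact equality is unlikely after the display-camera-encode round trip, treats "the same brightness characteristic" as "within a small tolerance"; the tolerance is an assumption, not part of the patent.

```python
# Sketch only: the tolerance-based comparison is an assumption; the patent
# simply speaks of frames with the same brightness characteristic.
import numpy as np

def brightness_feature(y_plane: np.ndarray, threshold: int = 128) -> int:
    """Brightness feature: count of Y-channel pixels at or above the threshold."""
    return int((y_plane >= threshold).sum())

def match_first_frame(target_first_y: np.ndarray,
                      test_y_planes: list[np.ndarray],
                      tolerance: float = 0.05) -> int | None:
    """Index of the first test-stream frame whose brightness feature matches
    (within the tolerance) that of the target video stream's first frame."""
    target = brightness_feature(target_first_y)
    for i, y in enumerate(test_y_planes):
        if abs(brightness_feature(y) - target) <= tolerance * max(target, 1):
            return i
    return None
```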
After the processor 1011 obtains the first time of outputting the first frame of the target video stream and the second time of receiving the first frame, the difference between the first time and the second time may be obtained and used as the working delay of the camera of the device under test 103. Therefore, the test device 101 does not need to be synchronized with other external equipment, and the working delay of the camera of the device under test 103 can be obtained through the target video stream alone; the test process is simpler, the requirements on the running speed and cost of the test system are lower, no manual participation is needed, and a more accurate working delay can be obtained.
It should be noted that the testing device 101 may also add a specific flag to any frame data of the target video stream, for example, the luminance of the 5 th frame data of the target video stream may be set to a larger target luminance value, which is larger than the luminance values of the 4 th frame data and the 6 th frame data. In this way, the test device 101 may obtain a first time for outputting the 5 th frame data, after receiving the test video stream sent by the device under test 103, determine a target test frame corresponding to the 5 th frame data from the test video stream according to a target brightness value, obtain a second time for receiving the target test frame, obtain a difference between the first time and the second time, and use the difference as a working delay of the camera of the device under test 103. Thus, the device under test 103 does not need to be in a working state all the time in order to capture the first frame data of the target video stream, and the test device 101 may trigger the device under test 103 to enter the working state after receiving a start instruction from a user, so that energy consumption of the device under test 103 may be reduced.
In addition, the presentation device 102 may include both a light-emitting device and a display device, that is, the manners of example one and example two may be combined. When the test device 101 outputs the first frame data of the target video stream, it may simultaneously trigger the light-emitting device to emit light at the target brightness, so that the brightness of the first frame data captured by the camera of the device under test 103 is more prominent. In this way, when the test device 101 determines the first frame from the test video stream sent by the device under test 103, it avoids misjudgment caused by small brightness differences between frames, which would make the finally obtained working delay of the camera of the device under test 103 inaccurate; the accuracy of the working delay can therefore be improved.
Based on the method for obtaining the working delay of the camera of the device under test 103 in this example, the test device 101 may also obtain the frame rate stability of the device under test 103. In a possible implementation manner, the test device 101 is further configured to obtain a size of the display device, obtain a plurality of difference frames between every two frames of data in the test video stream after receiving the test video stream sent by the device under test 103, and determine whether the frame rate of the device under test 103 is stable according to image features of the plurality of difference frames, where the difference frames include difference data between every two frames of data.
When the test environment is set up, the size of the display device can be stored in advance in configuration information, and the test device 101 can read it directly when needed. For the case where the size of the display device cannot be determined in advance, in one possible implementation, the test device 101 may obtain the size of the display device through a physical size marker. Fig. 4 is a schematic diagram of a physical size marker provided by an embodiment of the present disclosure. As shown in fig. 4, the physical size marker may be placed above the display device and may consist of alternating black and white squares whose sizes are fixed values set in advance. After receiving the test video stream containing the physical size marker from the device under test 103, the test device 101 may count the black and white squares in the test video stream and determine the size of the display device from their number.
In another possible implementation manner, the testing device 101 may determine the size of the display device according to a unit width of each frame data and a number of frames of an image that the display device can simultaneously display, and a product of the unit width and the number of frames is the size of the display device.
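Both size-estimation options amount to a single multiplication; a minimal sketch follows, in which the marker cell size and the per-frame unit width are assumed to be known constants from the test-environment configuration.

```python
# Sketch of the two display-size estimates mentioned above; the constants are
# assumed to come from the test-environment configuration, not from the patent.

def size_from_marker(cell_count: int, cell_size_mm: float) -> float:
    """Display width from a physical size marker: number of black and white
    squares spanning the display, times the known size of one square."""
    return cell_count * cell_size_mm

def size_from_unit_width(unit_width_mm: float, frames_displayed: int) -> float:
    """Display width as the unit width of one frame times the number of frames
    the display can show at the same time."""
    return unit_width_mm * frames_displayed
```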
After obtaining the size of the display device and receiving the test video stream sent by the device under test 103, the test device 101 may perform binarization processing on each frame of data in the test video stream to obtain a binarized image corresponding to each frame of data, and then obtain a difference frame between every two frames of data in the test video stream based on the binarized image. For example, if the test video stream includes 5 frames of data, the test device 101 may obtain a difference frame between the first frame of data and the second frame of data, a difference frame between the second frame of data and the third frame of data, a difference frame between the third frame of data and the fourth frame of data, and a difference frame between the fourth frame of data and the fifth frame of data in the test video stream. Fig. 5 is a schematic diagram of the difference frame provided by the embodiment of the present disclosure, and as shown in fig. 5, an uppermost image represents a binarized image of first frame data of the test video stream, a middle image represents a binarized image of second frame data of the test video stream, and a lowermost image represents a binarized image of difference frame data after the first frame data and the second frame data are xored.
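A sketch of the difference-frame computation is given below: each frame is binarized and consecutive binarized frames are XORed, so the difference frame keeps only the pixels that changed between the two frames (as in fig. 5). The binarization threshold is an assumption.

```python
# Sketch: binarize each Y plane and XOR consecutive pairs; the threshold is assumed.
import numpy as np

def binarize(y_plane: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Binarize a Y plane: pixels at or above the threshold become 1, else 0."""
    return (y_plane >= threshold).astype(np.uint8)

def difference_frames(y_planes: list[np.ndarray]) -> list[np.ndarray]:
    """XOR of every pair of consecutive binarized frames in the test stream."""
    binarized = [binarize(y) for y in y_planes]
    return [np.bitwise_xor(a, b) for a, b in zip(binarized, binarized[1:])]
```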
After acquiring a difference frame between every two frames of data in the test video stream, the test device 101 may determine whether the frame rate of the device under test 103 is stable according to an image feature of the difference frame, where the image feature may include an area of an image region and a width of the image region.
In a case that the image feature includes an area of an image region corresponding to the difference frame, the test device 101 is further configured to determine that the frame rate of the device under test is stable when the area of the image region corresponding to each difference frame is smaller than or equal to a preset area threshold. The preset area threshold may be determined according to a requirement of the device under test 103 on frame rate stability, and under a condition that the frame rate of the device under test 103 is stable, an area of each frame of data output by the device under test 103 should be the same, based on which, if the requirement of the device under test 103 on frame rate stability is higher, for example, 99%, a smaller preset area threshold, for example, 2%, may be set; if the requirement of the dut 103 on the frame rate stability is low, for example, 90%, a larger preset area threshold may be set, for example, 5%, which is not limited in this disclosure.
After obtaining the area of the image region corresponding to each difference frame, the test device 101 may compare the area with the preset area threshold, and if the area of the image region corresponding to each difference frame is smaller than or equal to the preset area threshold, it may be determined that the frame rate of the device under test 103 is stable; if the area of the image region corresponding to the partial difference frame is larger than the preset area threshold, it is determined that the frame rate of the device under test 103 is unstable. Here, the number of partial difference frames may be determined according to the requirement of the device under test 103 on the frame rate stability, and if the requirement of the device under test 103 on the frame rate stability is higher, for example, 99%, a smaller number, for example, 1, may be set; if the requirement on the frame rate stability of the device under test 103 is low, for example, 90%, a larger number, for example, 5, may be set, which is not limited in this disclosure.
When the image feature includes the width of the image region corresponding to the difference frame, the test device 101 is further configured to determine that the frame rate of the device under test is stable when the width of the image region corresponding to each difference frame is less than or equal to a preset width threshold. The preset width threshold may be determined according to a requirement of the device under test 103 on frame rate stability, and under a condition that the frame rate of the device under test 103 is stable, the width of each frame of data output by the device under test 103 should be the same, based on which, if the requirement of the device under test 103 on frame rate stability is higher, for example, 99%, a smaller preset width threshold, for example, 2%, may be set; if the requirement for frame rate stability of the device under test 103 is low, for example 90%, a larger preset width threshold may be set, for example 5%, which is not limited in this disclosure.
After obtaining the width of the image area corresponding to each difference frame, the test device 101 may compare the width with the preset width threshold, and if the width of the image area corresponding to each difference frame is less than or equal to the preset width threshold, determine that the frame rate of the device under test 103 is stable; if the width of the image area corresponding to the partial difference frame is greater than the preset width threshold, it is determined that the frame rate of the device under test 103 is unstable. Here, the number of partial difference frames may be determined according to the requirement of the device under test 103 on the frame rate stability, and if the requirement of the device under test 103 on the frame rate stability is higher, for example, 99%, a smaller number, for example, 1, may be set; if the requirement on the frame rate stability of the device under test 103 is low, for example, 90%, a larger number, for example, 5, may be set, which is not limited in this disclosure. In this way, the frame rate stability of the device under test 103 can be determined only by acquiring the width of the image region of the difference frame, and compared with the method for determining the stability of the device under test 103 by the area of the image region corresponding to the difference frame, the calculation process is simpler, so that the efficiency of determining the frame rate stability can be improved.
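The following sketch illustrates the stability check. Here the "image region" of a difference frame is interpreted as its set of changed (non-zero) pixels, its area as the changed-pixel count and its width as the horizontal extent of the changed columns, both normalized by the frame size so they can be compared against percentage thresholds such as the 2% and 5% examples above; this interpretation and the normalization are assumptions.

```python
# Sketch only; the normalized area/width interpretation is an assumption.
import numpy as np

def region_area_ratio(diff: np.ndarray) -> float:
    """Fraction of pixels that changed between the two frames."""
    return float(np.count_nonzero(diff)) / diff.size

def region_width_ratio(diff: np.ndarray) -> float:
    """Horizontal extent of the changed region, as a fraction of the frame width."""
    cols = np.any(diff != 0, axis=0)
    if not cols.any():
        return 0.0
    idx = np.flatnonzero(cols)
    return float(idx[-1] - idx[0] + 1) / diff.shape[1]

def frame_rate_stable(diff_frames: list[np.ndarray],
                      threshold: float = 0.02,
                      use_width: bool = False) -> bool:
    """Frame rate is judged stable when every difference frame stays at or
    below the configured area (or width) threshold."""
    measure = region_width_ratio if use_width else region_area_ratio
    return all(measure(d) <= threshold for d in diff_frames)
```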
Through the above system, the test device can trigger the presentation device to output different display objects according to the type of the presentation device. When the presentation device is a light-emitting device, the working delay of the camera of the device under test can be determined from the first time of triggering the light-emitting device to emit light and the second time of receiving the first frame containing the display object; when the presentation device is a display device, the working delay can be determined from the first time of outputting the first frame of the target video stream and the second time of receiving the first frame containing the display object. The whole testing process therefore requires no manual participation, and the test device can acquire the working delay of the camera automatically, so the accuracy of the working delay can be improved and labor cost can be saved. In addition, the test device can also determine the frame rate stability of the device under test according to the image features of the difference frames between every two adjacent frames of data in the test video stream, so that the performance of the device under test can be evaluated more comprehensively.
Fig. 6 is a flowchart of a camera testing method provided by an embodiment of the present disclosure. The method is applied to a test device in a camera test system, and the system includes the test device, a display device, and a device under test, where the device under test comprises a camera and the test device is connected to the display device and to the device under test. As shown in fig. 6, the method includes:
S601, triggering the display device to output a display object.
S602, receiving the test video stream sent by the device to be tested under the condition that the camera shoots the display object to obtain the test video stream.
S603, acquiring the first time and the second time, and determining the working delay of the camera according to the first time and the second time.
The first time may include a time when the test device triggers the display device to output the display object, and the second time may include a time when the test device receives the first frame of the display object when receiving the test video stream.
It should be noted that after the camera of the device under test captures the original image data corresponding to the display object, the original image data needs to be encoded and compressed to obtain the test video stream, and the test video stream is then sent to the test device. Encoding, compression, and the like consume processing time, and sending the test video stream to the test device consumes transmission time, so the second time acquired by the test device inevitably includes this processing time and transmission time.
In order to solve this problem, after the test device triggers the display device to output a display object, the device under test may determine the first frame of the display object from the original image data captured by the camera and determine the capturing time of that first frame; the test device then receives the capturing time sent by the device under test, obtains the time difference between the capturing time and the time at which the test device triggered the display device to output the display object, and takes this time difference as the working delay of the camera. In this way, after the camera acquires the original image data containing the first frame of the display object, the device under test can directly obtain the capturing time of that frame and only needs to send it to the test device, which can then determine the working delay of the camera directly from the difference between the capturing time and the time of outputting the display object. The processing time of the original image and the transmission time of the data are thereby excluded from the working delay, so its accuracy can be improved.
By adopting the method, the test equipment can determine the working delay of the equipment to be tested according to the first time for triggering the display equipment to output the display object and the second time for receiving the first frame of the display object sent by the equipment to be tested, so that the first time and the second time can be directly obtained through the test equipment, the obtaining mode is simple, the whole process does not need manual participation, the first time and the second time are more accurate, and the accuracy of measurement can be improved.
Fig. 7 is a flowchart of a second method for testing a camera according to an embodiment of the present disclosure, where the method is applied to a testing device. As shown in fig. 7, the method includes:
S701, in a case where the display device comprises a light-emitting device, triggering the light-emitting device to emit light according to the target brightness, and storing the first time of triggering the light-emitting device to emit light according to the target brightness.
S702, acquiring the first time.
S703, determining the first frame according to the brightness characteristic of each frame of data in the test video stream.
In this step, after receiving the test video stream sent by the device to be tested, the test device may perform binarization processing on the data of the Y channel of each frame of the test video stream to obtain target frame data, obtain a luminance characteristic corresponding to each target frame data, and determine the first frame according to the luminance characteristic.
For each frame of data of the test video stream, the test device may perform binarization processing on the data of the Y channel of the frame of data by using a method of a related art, and obtain a sum of the luminances of all the pixel points. In the picture shot by the camera of the device to be tested, the brightness of the picture after the light-emitting device emits light according to the target brightness is different from the brightness of the picture before the light-emitting device emits light according to the target brightness, so that the first frame can be determined from the test video stream according to the brightness of each frame of data in the test video stream. For example, if the light-emitting device is in an off state before receiving the trigger signal sent by the testing device, and emits light according to the target brightness after receiving the trigger signal, a frame with significantly increased brightness may be determined from the test video stream as the first frame.
S704, acquire the second time at which the first frame is received.
S705, acquire the time difference between the first time and the second time, and take the time difference as the working delay of the camera.
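Putting steps S701 to S705 together, a minimal sketch of the test-device flow could look as follows; the callback names and the use of a monotonic clock are assumptions, and the first-frame check can be the luminance-jump detection sketched above.

```python
import time

def measure_working_delay(trigger_light, receive_y_planes, is_first_frame):
    """Sketch of steps S701-S705 on the test device.

    trigger_light     -- switches the light-emitting device on at the target
                         brightness (S701).
    receive_y_planes  -- iterable of Y planes from the test video stream, in
                         arrival order.
    is_first_frame    -- predicate recognising the first bright frame (S703).
    """
    first_time = time.monotonic()            # S701/S702: store and acquire the first time
    trigger_light()
    for y_plane in receive_y_planes:
        if is_first_frame(y_plane):
            second_time = time.monotonic()   # S704: second time when the first frame is received
            return second_time - first_time  # S705: time difference = working delay (seconds)
    return None
```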
With this method, the test device determines the working delay of the camera of the device to be tested from the first time at which the light-emitting device is triggered to emit light and the second time at which the first frame of the display object is received. The whole test process requires no manual participation, and the test device obtains the working delay of the camera automatically, so the accuracy of the working delay is improved.
Fig. 8 is a flowchart of a third method for testing a camera according to an embodiment of the present disclosure, where the method is applied to a testing device. As shown in fig. 8, the method includes:
S801, in a case where the display device includes a display screen, send the target video stream to the display screen, and store the first time at which the first frame of the target video stream is output.
In this step, the test device may transmit the target video stream to the display screen, and the display screen may display the target video stream.
S802, acquire the first time.
S803, determine the first frame according to the brightness characteristic of the first frame of the target video stream and the brightness characteristic of each frame of data in the test video stream.
In this step, after receiving the test video stream sent by the device to be tested, the test device may obtain the target brightness characteristic of the first frame of data of the target video stream. The test device may then obtain the brightness characteristic of each frame of data in the test video stream and take the frame of the test video stream whose brightness characteristic is the same as the target brightness characteristic as the first frame. For the way in which the brightness characteristic is obtained, reference may be made to the embodiment shown in fig. 7, which is not repeated here.
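For illustration, the sketch below matches each frame of the test video stream against the brightness characteristic of the target video stream's first frame, reusing the luminance_feature helper from the earlier sketch; because optical capture rarely reproduces the feature exactly, a relative tolerance is used here, which is an assumption rather than part of the disclosure.

```python
import numpy as np

def match_first_frame(target_first_y: np.ndarray, test_y_planes,
                      tolerance: float = 0.05):
    """Return the index of the first test-stream frame whose brightness
    characteristic matches that of the target video stream's first frame."""
    reference = luminance_feature(target_first_y)
    for index, y_plane in enumerate(test_y_planes):
        feature = luminance_feature(y_plane)
        if reference and abs(feature - reference) / reference <= tolerance:
            return index
    return None
```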
S804, acquire the second time at which the first frame is received.
S805, acquire the time difference between the first time and the second time, and take the time difference as the working delay of the camera.
It should be noted that the test device may also determine the frame rate stability of the device to be tested according to the test video stream sent by the device to be tested. In a possible implementation, before sending the target video stream to the display screen, the test device may obtain the size of the display screen; after receiving the test video stream sent by the device to be tested, the test device may obtain a plurality of difference frames between every two frames of data in the test video stream and determine whether the frame rate of the device to be tested is stable according to the image features of the plurality of difference frames, where a difference frame comprises the difference data between the two frames of data. The manner of obtaining the size of the display screen is described in detail in the embodiment corresponding to the test system and is not repeated here.
The image features may include the area of the image region and the width of the image region. In a case where the image features include the area of the image region corresponding to the difference frame, the frame rate of the device to be tested is determined to be stable when the area of the image region corresponding to each difference frame is smaller than or equal to a preset area threshold; in a case where the image features include the width of the image region corresponding to the difference frame, the frame rate of the device to be tested is determined to be stable when the width of the image region corresponding to each difference frame is smaller than or equal to a preset width threshold.
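The difference-frame check could be sketched as follows, interpreting "every two frames" as consecutive frames; the pixel-difference threshold and the use of a column-span bounding box for the region width are assumptions made for illustration.

```python
import numpy as np

def difference_region(prev_y: np.ndarray, curr_y: np.ndarray,
                      diff_threshold: int = 30):
    """Return (area, width) of the image region that changed between two frames."""
    changed = np.abs(curr_y.astype(np.int16) - prev_y.astype(np.int16)) > diff_threshold
    area = int(changed.sum())
    cols = np.where(changed.any(axis=0))[0]
    width = int(cols[-1] - cols[0] + 1) if cols.size else 0
    return area, width

def frame_rate_is_stable(y_planes, area_threshold=None, width_threshold=None):
    """Frame rate is treated as stable when every difference frame stays within
    the preset threshold(s); pass area_threshold, width_threshold, or both,
    mirroring the two alternatives described above."""
    frames = list(y_planes)
    for prev_y, curr_y in zip(frames, frames[1:]):
        area, width = difference_region(prev_y, curr_y)
        if area_threshold is not None and area > area_threshold:
            return False
        if width_threshold is not None and width > width_threshold:
            return False
    return True
```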
With this method, the test device determines the working delay of the camera of the device to be tested from the first time at which the first frame of the target video stream is output and the second time at which the first frame of the display object is received. The whole test process requires no manual participation, and the test device obtains the working delay of the camera automatically, so the accuracy of the working delay is improved. In addition, the test device can determine the frame rate stability of the device to be tested according to the image features of the difference frames between every two frames of data in the test video stream, so that the performance of the device to be tested can be evaluated more comprehensively.
With regard to the method in the above-described embodiment, the specific manner in which each step performs the operation has been described in detail in the embodiment related to the system, and will not be elaborated upon here.
Fig. 9 is a block diagram of an electronic device 900 provided by an embodiment of the disclosure. As shown in fig. 9, the electronic device 900 may include: a processor 901 and a memory 902. The electronic device 900 may also include one or more of a multimedia component 903, an input/output (I/O) interface 904, and a communications component 905.
The processor 901 is configured to control the overall operation of the electronic device 900 to complete all or part of the steps of the camera testing method described above. The memory 902 is configured to store various types of data to support the operation of the electronic device 900, such as instructions for any application or method operating on the electronic device 900 and application-related data such as contacts, sent and received messages, pictures, audio, and video. The memory 902 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 903 may include a screen and an audio component, where the screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; the received audio signal may further be stored in the memory 902 or transmitted through the communication component 905. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 904 provides an interface between the processor 901 and other interface modules such as a keyboard, a mouse, or buttons, which may be virtual or physical. The communication component 905 is used for wired or wireless communication between the electronic device 900 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein. Accordingly, the communication component 905 may include a Wi-Fi module, a Bluetooth module, an NFC module, and the like.
In an exemplary embodiment, the electronic Device 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the camera testing methods described above.
In another exemplary embodiment, a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the above-described camera testing method is also provided. For example, the computer readable storage medium may be the memory 902 described above including program instructions that are executable by the processor 901 of the electronic device 900 to perform the camera testing method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned camera testing method when executed by the programmable apparatus.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings; however, the present disclosure is not limited to the specific details of the above embodiments. Various simple modifications may be made to the technical solution of the present disclosure within its technical idea, and these simple modifications all fall within the protection scope of the present disclosure. It should further be noted that the specific features described in the above embodiments may be combined in any suitable manner; to avoid unnecessary repetition, the possible combinations are not described separately.
In addition, the various embodiments of the present disclosure may be combined in any manner, and such combinations should likewise be regarded as the disclosure of the present disclosure as long as they do not depart from the spirit of the present disclosure.

Claims (16)

1. A camera testing system, comprising: test equipment, a display device, and a device to be tested, wherein the device to be tested comprises a camera, and the test equipment is connected to the display device and to the device to be tested respectively;
the test equipment is used for triggering the display equipment to output a display object;
the device to be tested is used for shooting the display object through the camera to obtain a test video stream and sending the test video stream to the test equipment;
the test equipment is further configured to receive the test video stream, acquire a first time and a second time, and determine a working delay of the camera according to the first time and the second time, where the first time includes a time when the test equipment triggers the display equipment to output the display object, and the second time includes a time when the test equipment receives a first frame of the display object when receiving the test video stream.
2. The system of claim 1, wherein, in a case where the display device comprises a light-emitting device, the display object comprises light;
the test equipment is also used for triggering the light-emitting device to emit light according to the target brightness.
3. The system of claim 2, wherein the test equipment comprises: a processor and a video receiver, the processor being respectively connected with the display device and the video receiver, and the video receiver being connected with the device to be tested;
the processor is used for triggering the light-emitting device to emit light according to the target brightness;
the video receiver is used for receiving the test video stream sent by the equipment to be tested;
the processor is further configured to store the first time when the light-emitting device is triggered to emit light according to the target brightness, obtain the first time after the test video stream sent by the device to be tested is received, determine the first frame according to a brightness characteristic of each frame of data in the test video stream, obtain the second time when the first frame is received, obtain a time difference between the first time and the second time, and use the time difference as a working delay of the camera.
4. The system according to claim 3, wherein the processor is further configured to perform binarization processing on data of a Y channel of each frame of the test video stream to obtain target frame data, obtain a brightness feature of each target frame data, and determine the first frame according to the brightness feature.
5. The system of claim 1, wherein, in a case where the display device comprises a display screen, the display object comprises a target video stream;
the test equipment is also used for sending the target video stream to the display screen;
the display screen is used for displaying the target video stream.
6. The system of claim 5, wherein the test equipment comprises: a processor, a signal source, and a video receiver, the processor being respectively connected with the signal source and the video receiver, the video receiver being connected with the device to be tested, and the signal source being connected with the display screen;
the processor is used for outputting the target video stream through the signal source;
the video receiver is used for receiving the test video stream sent by the equipment to be tested;
the processor is further configured to, when the target video stream is output, store a first time for outputting a first frame of the target video stream, obtain the first time after receiving the test video stream, determine the first frame according to a luminance characteristic of the first frame of the target video stream and a luminance characteristic of each frame of data in the test video stream, obtain the second time for receiving the first frame, obtain a time difference between the first time and the second time, and use the time difference as a working delay of the camera.
7. The system according to claim 5, wherein the test equipment is further configured to obtain a size of the display screen, obtain a plurality of difference frames between every two frames of data in the test video stream after receiving the test video stream sent by the device to be tested, and determine whether the frame rate of the device to be tested is stable according to image features of the plurality of difference frames, wherein the difference frames comprise difference data between every two frames of data.
8. The system according to claim 7, wherein, in a case where the image features comprise the area of the image region corresponding to the difference frame, the test equipment is further configured to determine that the frame rate of the device to be tested is stable when the area of the image region corresponding to each difference frame is smaller than or equal to a preset area threshold; or,
in a case where the image features comprise the width of the image region corresponding to the difference frame, the test equipment is further configured to determine that the frame rate of the device to be tested is stable when the width of the image region corresponding to each difference frame is smaller than or equal to a preset width threshold.
9. The system according to claim 1, wherein the device to be tested is further configured to determine a first frame of the display object from the original image data captured by the camera, determine a capturing time at which the first frame is captured, and send the capturing time to the test equipment;
the test equipment is further configured to obtain a time difference between the capturing time and the time at which the test equipment triggers the display device to output the display object, and take the time difference as the working delay of the camera.
10. A camera testing method, characterized in that the method is applied to test equipment in a camera testing system, the system comprising: the test equipment, a display device, and a device to be tested, wherein the device to be tested comprises a camera, and the test equipment is connected to the display device and to the device to be tested respectively; the method comprises the following steps:
triggering the display device to output a display object;
in a case where the display object is shot by the camera to obtain a test video stream, receiving the test video stream sent by the device to be tested;
acquiring a first time and a second time, and determining the working delay of the camera according to the first time and the second time, wherein the first time comprises the time when the test equipment triggers the display device to output the display object, and the second time comprises the time when the test equipment receives the first frame of the display object when receiving the test video stream.
11. The method of claim 10, wherein, in a case where the display device comprises a light-emitting device, the display object comprises light; the triggering the display device to output the display object comprises:
and triggering the light-emitting device to emit light according to the target brightness.
12. The method of claim 11, further comprising:
storing the first time when the light-emitting device is triggered to emit light according to the target brightness;
the obtaining the first time and the second time and determining the working delay of the camera according to the first time and the second time comprises:
acquiring the first time;
determining the first frame according to the brightness characteristic of each frame data in the test video stream;
acquiring the second time when the first frame is received;
and acquiring a time difference value between the first time and the second time, and taking the time difference value as the working delay of the camera.
13. The method according to claim 10, wherein, in a case where the display device comprises a display screen, the display object comprises a target video stream; the triggering the display device to output the display object comprises:
sending the target video stream to the display screen, so that the display screen displays the target video stream.
14. The method of claim 13, wherein before the sending the target video stream to the display screen, the method further comprises:
acquiring the size of the display screen;
after receiving the test video stream sent by the device to be tested, the method further comprises:
acquiring a plurality of difference frames between every two frames of data in the test video stream;
and determining whether the frame rate of the device to be tested is stable according to the image characteristics of the plurality of difference frames, wherein the difference frames comprise difference data between every two frames of data.
15. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 10 to 14.
16. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 10 to 14.
CN202010768613.5A 2020-08-03 2020-08-03 Camera testing system, method, storage medium and electronic equipment Pending CN114071120A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010768613.5A CN114071120A (en) 2020-08-03 2020-08-03 Camera testing system, method, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN114071120A 2022-02-18

Family

ID=80231631

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010768613.5A Pending CN114071120A (en) 2020-08-03 2020-08-03 Camera testing system, method, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114071120A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114640608A (en) * 2022-04-01 2022-06-17 上海商汤智能科技有限公司 Test method and device, electronic equipment and computer readable storage medium
CN116389717A (en) * 2023-04-11 2023-07-04 深圳市龙之源科技股份有限公司 Outdoor camera detection device and control method thereof
CN117041528A (en) * 2023-08-07 2023-11-10 昆易电子科技(上海)有限公司 Time difference measuring method and system and waveform processing module
WO2023245584A1 (en) * 2022-06-23 2023-12-28 北京小米移动软件有限公司 Camera assembly testing method and apparatus, and electronic device and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1909678A (en) * 2006-08-11 2007-02-07 中国船舶重工集团公司第七○九研究所 Delay testing method for IP video communication system
JP2009130820A (en) * 2007-11-27 2009-06-11 Canon Inc Information processing apparatus
CN107094249A (en) * 2017-03-31 2017-08-25 腾讯科技(上海)有限公司 A kind of method and device for testing camera delay
CN108419017A (en) * 2018-04-28 2018-08-17 Oppo广东移动通信有限公司 Control method, apparatus, electronic equipment and the computer readable storage medium of shooting
CN111294666A (en) * 2019-07-04 2020-06-16 杭州萤石软件有限公司 Video frame transmission method and method, device and system for determining video frame transmission delay
CN110361774A (en) * 2019-07-18 2019-10-22 江苏康众数字医疗科技股份有限公司 Test the device and test method of x-ray detector time response

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination