WO2024055531A1 - Procédé d'identification de valeur d'illuminomètre, dispositif électronique et support de stockage - Google Patents

Procédé d'identification de valeur d'illuminomètre, dispositif électronique et support de stockage Download PDF

Info

Publication number
WO2024055531A1
WO2024055531A1 PCT/CN2023/078535 CN2023078535W
Authority
WO
WIPO (PCT)
Prior art keywords
value
preset
image
numerical
display area
Prior art date
Application number
PCT/CN2023/078535
Other languages
English (en)
Chinese (zh)
Inventor
林俞竹
周璇
吴晓霞
Original Assignee
深圳创维-Rgb电子有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳创维-Rgb电子有限公司
Publication of WO2024055531A1 publication Critical patent/WO2024055531A1/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/16Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/1444Selective acquisition, locating or processing of specific regions, e.g. highlighted text, fiducial marks or predetermined fields
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/148Segmentation of character regions
    • G06V30/153Segmentation of character regions using recognition of characters or words
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/19Recognition using electronic means
    • G06V30/19007Matching; Proximity measures
    • G06V30/19013Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching

Definitions

  • the present application relates to the technical field of televisions, and in particular to an illumination count value identification method, electronic equipment and storage media.
  • TVs are often equipped with the function of automatically adjusting the screen brightness or automatically adjusting the color temperature through light-sensing components.
  • because current light-sensing components have problems such as low recognition accuracy and low installation positions, it is usually necessary to perform functional mapping processing on the values measured by the light-sensing components based on the values of the illuminance meter.
  • observers therefore often need to observe and record the values of the illuminance meter as the ambient brightness and color temperature change over a wide range.
  • if observed from the front, the observer blocks the ambient light and reduces the accuracy of the illuminance count value; if observed from the side, the value may be observed and recorded erroneously. The reduced accuracy of the observed illuminance count values means that the screen brightness and color temperature cannot be correctly adjusted from the final light-sensing data.
  • the main purpose of this application is to provide an illumination count value identification method, electronic device and storage medium, aiming to solve the technical problem of low accuracy of illumination count values observed in the prior art.
  • the illumination counting value identification method includes:
  • Illuminance count values are identified from the numerical image.
  • the step of identifying the numerical display area in the illuminance meter image includes:
  • a numerical display area is determined.
  • the preset color threshold includes a preset red threshold, a preset green threshold, and a preset blue threshold, and the step of binarizing the illuminance meter image based on the preset color threshold to obtain the binary image of the illuminance meter includes:
  • the step of determining the numerical display area according to each of the connected domains includes:
  • the minimum circumscribed quadrilateral of the target connected domain is determined as the numerical display area in the illuminance meter image.
  • the step of converting the viewing angle of the numerical display area to obtain a numerical image includes:
  • the color value of each of the null value pixel points is determined to obtain a numerical image.
  • before the step of performing coordinate transformation on the position of each pixel in the numerical display area according to a perspective transformation algorithm to obtain perspective coordinates, the method further includes:
  • the initial coordinates and perspective coordinates of each representative point are substituted into the perspective transformation algorithm to determine a preset perspective transformation matrix.
  • the step of identifying illumination count values from the numerical image includes:
  • the numerical recognition results are arranged and combined to obtain the illumination count value.
  • the method further includes:
  • a light-sensing mapping curve is generated.
  • the electronic device is a physical device.
  • the electronic device includes: a memory, a processor, and a program of the illumination count value identification method that is stored on the memory and operable on the processor; when the program is executed by the processor, the steps of the illumination count value identification method described above are implemented.
  • This application also provides a storage medium. The storage medium is a computer-readable storage medium that stores a program implementing the illumination count value identification method; when that program is executed by a processor, the above-mentioned steps of the illumination count value identification method are implemented.
  • the present application also provides a computer program product, which includes a computer program that implements the above-mentioned steps of the illumination count value identification method when executed by a processor.
  • the present application provides an illuminance counting value identification method, electronic device and storage medium.
  • in the illuminance count value identification method, an illuminance meter image is acquired and the numerical display area in the illuminance meter image is identified, obtaining the image of the numerical display area on the illuminance meter; the viewing angle of the numerical display area is then converted to obtain a numerical image, realizing viewing-angle correction of the numerical display area so that an illuminance meter image collected from the side can be converted into a front view; the illuminance count value is then identified from the numerical image, realizing accurate identification of the illuminance count value in an illuminance meter image collected from the side.
  • since the image acquisition device can collect the illuminance meter image from the side, the error generated by squinting observation with the human eye is effectively reduced, the accuracy of identification of illuminance count values is improved, and the technical problem of low accuracy of illuminance count values observed in the prior art is overcome.
  • Figure 1 is a schematic structural diagram of an electronic device of the hardware operating environment involved in the illumination count value identification method in the embodiment of the present application;
  • Figure 2 is a schematic flow chart of an embodiment of the illumination count value identification method of the present application.
  • Figure 3 is a schematic diagram of an implementation manner of the binary diagram of the illuminance meter in this application;
  • Figure 4 is a schematic diagram of another implementation of the binary diagram of the illuminance meter in this application.
  • Figure 5 is a schematic diagram of an implementation manner of the numerical display area in the illuminance meter image in this application;
  • Figure 6 is a schematic diagram of another implementation of the binary diagram of the illuminance meter in this application.
  • Figure 7 is a schematic diagram of an implementation method of the light sensing mapping curve in this application.
  • Figure 8 is a schematic flow chart of another embodiment of the illumination count value identification method of the present application.
  • Figure 9 is a schematic diagram of an implementation manner of perspective transformation in this application.
  • Figure 10 is a schematic diagram of an implementation of the numerical display area before perspective conversion in this application.
  • Figure 11 is a schematic diagram of an implementation manner of the numerical image after perspective conversion in this application.
  • Figure 1 is a schematic diagram of the terminal structure of the hardware operating environment involved in the embodiment of the present application.
  • the terminal in the embodiment of this application may be a PC, a smartphone, a tablet, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a portable computer, or another mobile terminal device.
  • the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002.
  • the communication bus 1002 is used to implement connection communication between these components.
  • the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard).
  • the user interface 1003 may also include a standard wired interface and a wireless interface.
  • the network interface 1004 may include standard wired interfaces and wireless interfaces (such as WI-FI interfaces).
  • the memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as a disk memory.
  • the memory 1005 may also be a storage device independent of the aforementioned processor 1001.
  • the terminal may also include a camera, RF (Radio Frequency, radio frequency) circuit, sensor, audio circuit, WiFi module, etc.
  • sensors such as light sensors, motion sensors and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor.
  • the ambient light sensor may adjust the brightness of the display screen according to the brightness of the ambient light.
  • the proximity sensor may turn off the display screen and/or backlight when the mobile terminal moves to the ear.
  • the gravity acceleration sensor can detect the magnitude of acceleration in various directions (generally three axes).
  • when the mobile terminal is stationary, it can detect the magnitude and direction of gravity, which can be used for applications that recognize the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as pedometers and tapping); of course, the mobile terminal can also be equipped with other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be described in detail here.
  • terminal structure shown in FIG. 1 does not limit the terminal, and may include more or fewer components than shown, or combine certain components, or arrange different components.
  • memory 1005 which is a computer storage medium, may include an operating system, a network communication module, a user interface module, and an illumination count value identification application program.
  • the network interface 1004 is mainly used to connect to the backend server and communicate with the backend server;
  • the user interface 1003 is mainly used to connect to the client (user) and communicate with the client;
  • the processor 1001 can be used to call the illumination count value recognition application stored in memory 1005 and perform the following operations:
  • Illuminance count values are identified from the numerical image.
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • a numerical display area is determined.
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • the minimum circumscribed quadrilateral of the target connected domain is determined as the numerical display area in the illuminance meter image.
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • where x, y, 1 are the initial coordinates; X, Y, Z are the intermediate conversion coordinates; and X', Y', Z' are the perspective coordinates
  • the color value of each of the null value pixel points is determined to obtain a numerical image.
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • the initial coordinates and perspective coordinates of each representative point are substituted into the perspective transformation algorithm to determine a preset perspective transformation matrix.
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • the numerical recognition results are arranged and combined to obtain the illumination count value.
  • processor 1001 can call the illumination count value identification application stored in the memory 1005, and also perform the following operations:
  • a light-sensing mapping curve is generated.
  • an embodiment of the present application provides an illumination count value identification method.
  • the illumination count value identification method includes the following steps:
  • Step S10 obtain the illuminance meter image
  • the illuminance meter is an instrument that measures optical parameters such as illuminance, brightness, and color temperature.
  • the illuminance meter image is an image containing all or part of the area of the illuminance meter.
  • the illuminance meter image at least includes the numerical display area of the illuminance meter, and may also include other areas of the illuminance meter, such as the housing, the light sensor, etc.
  • the numerical display area is the area on the illuminance meter that displays the measurement results.
  • the numerical display area can be part or all of the display area of the illuminance meter.
  • the illuminance meter is photographed by an image acquisition device to obtain an image of the illuminance meter, where the image acquisition equipment includes cameras, video cameras, etc. For example, a camera can be used to take pictures of the values displayed on the illuminance meter under different illuminances, or a video camera can be used to capture the changing process of the illuminance meter values under different illuminances, with the image corresponding to each illuminance then intercepted from the video.
  • Step S20 identify the numerical display area in the illuminance meter image
  • image recognition technology is used to recognize the numerical display area in the illuminance meter image according to the shape, specification, color, etc. of the numerical display area, and the numerical display area in the illuminance meter image is determined.
  • the numerical display area can also be cut from the illuminance meter image, retaining the numerical display area and removing the other parts outside it, so that patterns, words, numbers, etc. in those other parts do not interfere with the subsequent illuminance count value recognition process; at the same time, this reduces the calculation amount of the subsequent viewing-angle conversion and illuminance count value recognition and improves recognition efficiency.
  • the step of identifying the numerical display area in the illuminance meter image includes:
  • Step S21 Binarize the illuminance meter image based on a preset color threshold to obtain a binary image of the illuminance meter;
  • a color threshold is set in advance based on the color difference between the numerical display area and other areas in the illuminance meter image; each pixel in the illuminance meter image is compared with the color threshold, and the illuminance meter image is segmented based on the preset color threshold to obtain a binary image of the illuminance meter composed of two colors, where the two colors in the binary image can be any two different colors.
  • the colors can be specifically set according to actual needs, which is not limited in this embodiment.
  • the preset color threshold can be determined based on the actual color difference between the numerical display area and other areas. For example, if the green component of the numerical display area is significantly larger than that of other areas, a green threshold can be determined based on the difference in green values in the image, and the image can be binarized based on that green threshold; if the white component of the numerical display area is significantly larger than that of other areas, green, red, and blue thresholds can be set based on the color differences, and the image can be binarized based on all three thresholds.
  • the preset color threshold includes a preset red threshold, a preset green threshold, and a preset blue threshold, and the step of binarizing the illuminance meter image based on the preset color threshold to obtain the binary image of the illuminance meter includes:
  • Step S211 obtain the color value of each pixel in the illuminance meter image, where the color value includes a red value, a green value and a blue value;
  • Step S212 Compare each color value with the preset color threshold, and determine as target pixels those whose red value is less than or equal to the preset red threshold, whose blue value is less than or equal to the preset blue threshold, and whose green value is greater than or equal to the preset green threshold;
  • Step S213 Assign the color value of each target pixel to a preset first color value, and assign the color values of the other pixels except the target pixels to a preset second color value, to obtain the binary image of the illuminance meter.
  • the illuminance meter image adopts the RGB color mode.
  • the color value of each pixel in the illuminance meter image includes a red value, a green value and a blue value.
  • the preset color thresholds include a preset red threshold, a preset green threshold and a preset blue threshold.
  • each pixel in the illuminance meter image is traversed, the color value of each pixel is obtained, and each color value is compared with the preset color threshold to determine whether the red value of each pixel is less than or equal to the preset red threshold, whether the blue value is less than or equal to the preset blue threshold, and whether the green value is greater than or equal to the preset green threshold.
  • the pixels whose red value is less than or equal to the preset red threshold, whose blue value is less than or equal to the preset blue threshold, and whose green value is greater than or equal to the preset green threshold are determined as target pixels.
  • the color value of each target pixel is assigned to the preset first color value, and the color values of the other pixels except the target pixels are assigned to the preset second color value, to obtain the binary image of the illuminance meter.
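The three-way threshold comparison of steps S211–S213 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the threshold values and the two output color values are hypothetical placeholders, since the embodiment leaves them to be tuned to the actual meter display (here, a display assumed to be distinctly green).

```python
import numpy as np

# Hypothetical preset thresholds and color values (to be tuned per meter).
RED_MAX, BLUE_MAX, GREEN_MIN = 100, 100, 150
FIRST_COLOR, SECOND_COLOR = 255, 0  # preset first / second color values

def binarize_meter_image(rgb: np.ndarray) -> np.ndarray:
    """Binarize an H x W x 3 RGB image: pixels whose red and blue values
    are at most the preset maxima and whose green value is at least the
    preset minimum become the first color; all others become the second."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    target = (r <= RED_MAX) & (b <= BLUE_MAX) & (g >= GREEN_MIN)
    return np.where(target, FIRST_COLOR, SECOND_COLOR).astype(np.uint8)
```

A green-dominated pixel such as (50, 200, 50) is mapped to the first color, while a red-dominated pixel such as (200, 50, 50) is mapped to the second.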
  • Step S22 traverse the binary image of the illuminance meter and determine at least one connected domain corresponding to the preset first color value in the binary image of the illuminance meter;
  • all pixels of the binary image of the illuminance meter are traversed, and according to a preset connected domain algorithm, at least one connected domain corresponding to the preset first color value in the binary image is determined, where the connected domain can be all connected domains, the maximum connected domain, or a connected domain with an area greater than a preset area threshold.
  • the step of determining at least one connected domain corresponding to the preset first color value in the binary image of the illuminance meter according to the preset connected domain algorithm may further include: determining whether the area of each initial connected domain is greater than or equal to the preset area threshold, retaining the initial connected domains whose area is greater than or equal to the preset area threshold, and filtering out those whose area is smaller than the preset area threshold, to reduce interference.
  • Figure 3 is a schematic diagram of an implementable manner of the binary image of the illuminance meter in this application, and Figure 4 is a schematic diagram of another implementable manner.
  • the first color value is preset to be the color value corresponding to white; the white connected domains are filtered according to the preset area threshold to obtain the binary image of the illuminance meter shown in Figure 4.
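The connected-domain traversal and area filtering of step S22 can be sketched with a plain breadth-first flood fill. This is a stand-in for whatever preset connected-domain algorithm an implementation would use (OpenCV's `connectedComponentsWithStats` is a common choice); the 4-connectivity and return format are assumptions for illustration.

```python
from collections import deque

import numpy as np

def connected_domains(binary: np.ndarray, min_area: int):
    """Label 4-connected regions of the first color value (255) and keep
    only those whose pixel count is at least min_area. Returns a list of
    pixel-coordinate lists, one per retained connected domain."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    domains = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] != 255 or seen[y, x]:
                continue
            # breadth-first flood fill from this seed pixel
            queue, comp = deque([(y, x)]), []
            seen[y, x] = True
            while queue:
                cy, cx = queue.popleft()
                comp.append((cy, cx))
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] == 255 and not seen[ny, nx]:
                        seen[ny, nx] = True
                        queue.append((ny, nx))
            if len(comp) >= min_area:  # the area filter removes noise specks
                domains.append(comp)
    return domains
```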
  • Step S23 Determine a numerical display area based on each connected domain.
  • the target connected domain corresponding to the numerical display area is determined from the connected domains according to the specifications, shape and/or position of each connected domain, and the position, specification and/or shape of the numerical display area is then determined based on the target connected domain. For example, the geometric center of the target connected domain can be used as the geometric center of the numerical display area and the numerical display area determined according to its preset graphic, or the smallest preset circumscribed graphic corresponding to the target connected domain can be determined as the numerical display area.
  • the specifications, shape, position and other information of the connected domain can be determined based on the actual conditions of the illuminance meter, which this embodiment does not limit.
  • the step of determining the numerical display area according to each of the connected domains includes:
  • Step S231 perform image recognition on each of the connected domains according to the specifications and shape of the preset reference area, and determine the target connected domain with the highest similarity to the preset reference area from each of the connected domains;
  • Step S232 Determine the minimum circumscribed quadrilateral of the target connected domain as the numerical display area in the illuminance meter image.
  • the specifications and shape of the preset reference area are determined based on the actual shape and specifications of the numerical display area, the positional relationship between the numerical display area and the image acquisition device, etc. For example, if the actual shape of the numerical display area is a rectangle, the numerical display area in the illuminance meter image is a quadrilateral; if the actual shape is a circle, the numerical display area in the illuminance meter image is an ellipse.
  • Figure 5 is a schematic diagram of an implementable manner of the numerical display area in the illuminance meter image in this application.
  • the minimum circumscribed quadrilateral determined according to the white connected domain in Figure 4 is the area selected by the white quadrilateral frame in Figure 5; that is, the area within the white quadrilateral frame in Figure 5 is the numerical display area.
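As an illustration of step S232, the sketch below approximates the minimum circumscribed quadrilateral of a connected domain by its axis-aligned bounding box. This is an assumed simplification: a production implementation would more likely compute the tighter rotated minimum-area rectangle (e.g. with `cv2.minAreaRect`) described in the embodiment.

```python
def bounding_quad(pixels):
    """Approximate the minimum circumscribed quadrilateral of a connected
    domain (a list of (y, x) pixel coordinates) by its axis-aligned
    bounding box, returned as four (x, y) corners in top-left, top-right,
    bottom-right, bottom-left order."""
    ys, xs = zip(*pixels)
    y0, y1, x0, x1 = min(ys), max(ys), min(xs), max(xs)
    return [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
```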
  • Step S30 perform perspective conversion on the numerical display area to obtain a numerical image
  • the numerical display area is converted in perspective according to a preset perspective conversion algorithm to obtain a numerical image, where the numerical image can be the image corresponding to the numerical display area after perspective conversion, or an image obtained by converting the viewing angle of the whole illuminance meter image.
  • the numerical image may be a front view of the image corresponding to the numerical display area, or may deviate from the front view by a certain conversion deviation.
  • the viewing-angle conversion may include image rotation. For example, if the illuminance meter is hung upside down, the numerical display area needs to be rotated 180 degrees before the value can be accurately recognized; in this case, the perspective conversion method includes image rotation.
  • the perspective conversion method can also include parallel projection, perspective projection, etc., which can be determined based on actual needs, test results, etc.
  • Step S40 identify the illumination count value from the numerical image
  • the numerical characters in the numerical image can be recognized through image recognition, text recognition, etc. to obtain the illuminance count value. It should be noted that if the numerical image is the image of the numerical display area after perspective conversion, the numerical characters in it can be identified directly to obtain the illuminance count value; if the numerical image is the whole illuminance meter image after perspective conversion, then the step of identifying the illuminance count value from the numerical image is to identify the illuminance count value from the numerical display area of the numerical image.
  • the step of identifying illumination count values from the numerical image includes:
  • Step S41 Binarize the numerical image to obtain a numerical binary image
  • Step S42 Perform character segmentation processing on the numerical binary image to obtain at least one character binary image
  • Step S43 perform similarity matching between each of the character binary images and a preset character template, and determine the numerical recognition results corresponding to each of the character binary images according to the character template with the highest similarity;
  • Step S44 Arrange and combine the numerical recognition results to obtain the illumination count value.
  • the numerical display area in the numerical image is binarized to obtain a numerical binary image.
  • the rows and columns of the numerical binary image are traversed respectively; a target row or target column is used as a boundary, and the numerical binary image is subjected to character segmentation processing according to the determined boundaries to obtain at least one character binary image.
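The boundary-based character segmentation just described can be sketched by scanning for blank columns between character strokes. This is an assumed simplification that only traverses columns (the embodiment also traverses rows, e.g. to separate a brightness line from a color-temperature line).

```python
import numpy as np

def segment_characters(binary: np.ndarray):
    """Split a numerical binary image into per-character images: columns
    containing no foreground (255) pixels act as boundaries between
    characters, and each contiguous run of foreground columns is cut out."""
    fg_cols = (binary == 255).any(axis=0)
    chars, start = [], None
    for x, has_fg in enumerate(fg_cols):
        if has_fg and start is None:
            start = x                         # a character run begins
        elif not has_fg and start is not None:
            chars.append(binary[:, start:x])  # run ended at a boundary column
            start = None
    if start is not None:                     # run reaching the right edge
        chars.append(binary[:, start:])
    return chars
```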
  • each character binary image is quantized into a dot matrix of 0s and 1s and subtracted point by point from the quantized 0/1 dot matrix of the preset character template; the absolute values of the corresponding differences at all points are accumulated, and the accumulated sum is determined as the error value.
  • the character template with the smallest error value, that is, the highest similarity, is determined from the character templates, and the numerical character corresponding to that template is determined as the numerical recognition result of the character binary image.
  • by arranging and combining the numerical recognition results, the illuminance count value can be obtained, where the numerical binary image is the binary image of the image corresponding to the numerical display area.
  • the preset character template is a standard front-view image corresponding to one or a combination of numbers, symbols, Chinese characters, English letters and other characters.
  • the illuminance count value includes color temperature, brightness, illuminance, etc. It should be noted that if the numerical display area includes more than one illuminance count value, the specific value of each illuminance count value can be determined according to the positional relationship of the illuminance count values in the numerical display area.
  • for example, the numerical binary image is subjected to character segmentation processing to obtain eight character binary images, and similarity matching between the eight character binary images and the preset character templates determines the numerical recognition results as "1", "1", "0.", "5", "6", "1", "2" and "1". If it is preset that the first line is the brightness and the second line is the color temperature, the brightness can be determined as "110.5" and the color temperature as "6121".
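The dot-matrix subtraction and error accumulation used for template matching can be sketched as below. The 3x3 templates are made-up placeholders for illustration only; a real system would store one standard front-view template per character ("0"–"9", ".", etc.), as the description states.

```python
import numpy as np

# Hypothetical 3x3 dot-matrix templates for two digits (placeholders).
TEMPLATES = {
    "1": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]]),
    "7": np.array([[1, 1, 1], [0, 0, 1], [0, 0, 1]]),
}

def match_character(char_img: np.ndarray) -> str:
    """Quantize the character image to a 0/1 dot matrix, subtract each
    template point by point, accumulate the absolute differences as the
    error value, and return the character whose template has the smallest
    error, i.e. the highest similarity."""
    dots = (np.asarray(char_img) > 0).astype(int)
    errors = {c: int(np.abs(dots - t).sum()) for c, t in TEMPLATES.items()}
    return min(errors, key=errors.get)
```

The per-character results are then arranged and combined, e.g. `"".join(match_character(c) for c in character_images)`, to form the illuminance count value.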
  • The method further includes:
  • Step S50: Obtain the light-sensing component value corresponding to each illumination count value;
  • Step S51: Generate a light-sensing mapping curve based on each illumination count value and each light-sensing component value.
  • The value detected by the light-sensing component under the same illuminance as each illumination count value is obtained. Taking each light-sensing component value as the abscissa and the illumination count value corresponding to that light-sensing component value as the ordinate, the light-sensing mapping curve is obtained by fitting.
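The fitting step could look like the following sketch, using a numpy polynomial fit over hypothetical paired measurements (the actual curve form and data values are not specified in the application):

```python
import numpy as np

# Hypothetical paired measurements: raw light-sensing component readings
# (abscissa) and the illuminance values recognised from the meter (ordinate).
sensor_values = np.array([10.0, 50.0, 120.0, 300.0, 600.0])
lux_values = np.array([12.0, 55.0, 130.0, 310.0, 615.0])

# Fit a low-order polynomial as the light-sensing mapping curve.
coeffs = np.polyfit(sensor_values, lux_values, deg=2)
mapping = np.poly1d(coeffs)

def sensor_to_lux(raw: float) -> float:
    """Map a raw light-sensing component value to an illuminance estimate."""
    return float(mapping(raw))
```

Once the curve is stored in the television, each raw sensor reading can be mapped to an illuminance estimate and used to drive the picture-adjustment logic.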
  • The light-sensing mapping curve can be used in a television to adjust the color temperature and brightness of the picture more accurately according to the ambient light.
  • Figure 7 is a schematic diagram of an implementable manner of the light sensing mapping curve in this application.
  • In Figure 7, the abscissa is the brightness value of the light-sensing component, and the ordinate is the illuminance meter value.
  • By obtaining the numerical image, viewing-angle correction of the image in the numerical display area is realized, so that an illuminance meter image collected from the side can be converted into a front view, and the illumination count value can then be recognized from the numerical image.
  • the step of converting the viewing angle of the numerical display area to obtain a numerical image includes:
  • Step S31: Perform coordinate transformation on the position of each pixel in the numerical display area according to a perspective transformation algorithm to obtain perspective coordinates, where the perspective transformation algorithm is: (X, Y, Z)ᵀ = M · (x, y, 1)ᵀ, X' = X/Z, Y' = Y/Z, Z' = Z/Z = 1, with M being the preset perspective transformation matrix.
  • A rectangular coordinate system is established, the coordinates of each pixel in the numerical display area in the rectangular coordinate system are determined, and the coordinates of each pixel are substituted into the perspective transformation algorithm to obtain the perspective coordinates.
  • The values of the preset perspective transformation matrix can be set in advance according to the actual situation.
  • x, y, and 1 are the initial coordinates
  • X, Y, and Z are the intermediate transformation coordinates. To keep the result in the same plane, it is necessary to divide by Z, which realizes the conversion from the three-dimensional homogeneous space back to the two-dimensional plane.
  • the obtained X', Y', and Z' are perspective coordinates.
  • The coordinate conversion of the position of each pixel in the numerical display area thus includes two coordinate conversions: the first from (x, y, 1) to (X, Y, Z), and the second from (X, Y, Z) to (X', Y', Z').
  • Through formula manipulation, the two conversions in this embodiment can also be combined into a single coordinate conversion or expanded into more than two coordinate conversions; this embodiment imposes no restriction on this.
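The two-step conversion — matrix multiplication followed by division by Z — can be sketched for a single pixel as follows (the matrix values are assumed to be known already):

```python
import numpy as np

def warp_point(matrix: np.ndarray, x: float, y: float) -> tuple[float, float]:
    """Apply a 3x3 perspective matrix to (x, y, 1) and normalise by Z.

    First conversion: (x, y, 1) -> (X, Y, Z) via the matrix.
    Second conversion: (X, Y, Z) -> (X', Y') by dividing through by Z.
    """
    X, Y, Z = matrix @ np.array([x, y, 1.0])
    return X / Z, Y / Z
```

With the identity matrix the point is unchanged; a matrix with a non-unit bottom-right entry scales the result through the division by Z.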
  • Before the step of performing coordinate transformation on the position of each pixel in the numerical display area according to a perspective transformation algorithm to obtain perspective coordinates, the method further includes:
  • Step S311: Establish a rectangular coordinate system;
  • Step S312: Obtain the initial coordinates and perspective coordinates, in the rectangular coordinate system, of at least one preset representative point in the numerical display area;
  • a rectangular coordinate system is established, at least one representative point is selected from the numerical display area, and the initial coordinates and perspective coordinates of each representative point in the rectangular coordinate system are obtained, where,
  • the representative point may be a vertex of the numerical display area, a point on the contour line of the numerical display area, or any point in the numerical display area from which accurate coordinates after perspective conversion can be determined.
  • Step S313: Substitute the initial coordinates and perspective coordinates of each representative point into the perspective transformation algorithm to determine the preset perspective transformation matrix.
  • The perspective transformation algorithm is: (X, Y, Z)ᵀ = M · (x, y, 1)ᵀ, X' = X/Z, Y' = Y/Z, Z' = Z/Z = 1, where x, y, 1 are the initial coordinates, X, Y, Z are the intermediate transformation coordinates, X', Y', Z' are the perspective coordinates, and M is the preset perspective transformation matrix.
  • the number of representative points is 4.
  • Formula 3 can thus be obtained, where x1 and y1 are the initial coordinates of the first representative point and X1', Y1' are its perspective coordinates; x2, y2 are the initial coordinates of the second representative point and X2', Y2' are its perspective coordinates; x3, y3 are the initial coordinates of the third representative point and X3', Y3' are its perspective coordinates; and x4, y4 are the initial coordinates of the fourth representative point and X4', Y4' are its perspective coordinates. Substituting the initial coordinates and perspective coordinates of the four representative points into Formula 3, the specific values of the perspective transformation matrix can be determined.
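Determining the matrix from four point pairs amounts to solving a linear system of eight equations in eight unknowns; a sketch with numpy, fixing the bottom-right matrix element to 1 (a common normalization, assumed here rather than stated by the application):

```python
import numpy as np

def perspective_matrix(src, dst) -> np.ndarray:
    """Solve the 3x3 perspective matrix from 4 (initial, perspective)
    point pairs, with the bottom-right element fixed to 1."""
    A, b = [], []
    for (x, y), (X, Y) in zip(src, dst):
        # X = (m11*x + m12*y + m13) / (m31*x + m32*y + 1), rearranged
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y]); b.append(X)
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y]); b.append(Y)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)
```

Libraries such as OpenCV expose the same computation as a ready-made routine (`cv2.getPerspectiveTransform`), which could be used instead of solving the system by hand.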
  • a numerical image can be obtained by mapping the numerical display area on the front view plane along the direction of the arrow through a preset perspective transformation matrix.
  • Step S32: Based on each of the perspective coordinates, fill the color value of each pixel in the numerical display area into a preset rectangular area;
  • A preset rectangular area is determined in the rectangular coordinate system in advance according to the actual situation, such that each perspective coordinate is located within the preset rectangular area, and the color value of each pixel in the numerical display area is assigned to the pixel at the perspective coordinate corresponding to that pixel.
  • Step S33: Determine the null pixels with empty color values in the preset rectangular area;
  • Step S34: Determine the color value of each null pixel based on the color values of the pixels adjacent to each null pixel, and obtain a numerical image.
  • The average or median of the color values of the pixels adjacent to each null pixel is calculated, and that average or median is determined as the color value of the null pixel. If null pixels with empty color values remain in the preset rectangular area, the filling process is repeated; once no null pixels with empty color values remain in the preset rectangular area, the numerical image after viewing-angle conversion is obtained.
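The iterative neighbour-averaging fill can be sketched as follows (a simplified version using the mean of the 4-neighbours; the application also allows the median):

```python
import numpy as np

def fill_null_pixels(img: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Iteratively fill masked (empty) pixels with the mean of their
    already-filled 4-neighbours until no empty pixels remain."""
    img = img.astype(float).copy()
    mask = mask.copy()
    while mask.any():
        for y, x in zip(*np.nonzero(mask)):
            vals = [img[ny, nx]
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                    and not mask[ny, nx]]
            if vals:                      # fill once a filled neighbour exists
                img[y, x] = sum(vals) / len(vals)
                mask[y, x] = False        # no longer a null pixel
    return img
```

For a colour image the same fill would be applied per channel; the loop assumes at least one filled pixel exists so that the fill front can propagate.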
  • the image of the numerical display area captured from the side can be better corrected to a frontal view, thereby helping to improve the accuracy of subsequent numerical recognition.
  • the present application also provides a computer program product, including a computer program that implements the steps of the illumination count value identification method as described above when executed by a processor.
  • The computer program product provided by this application solves the technical problem of low accuracy in reading illumination count values in the prior art. Compared with the prior art, the beneficial effects of the computer program product provided by the embodiments of the present application are the same as those of the illumination count value identification method provided by the above embodiments, and will not be described again here.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application relates to an illuminometer value identification method, an electronic device and a storage medium. The illuminometer value identification method comprises: acquiring an illuminometer image; identifying a numerical display area in the illuminometer image; performing viewing-angle conversion on the numerical display area to obtain a numerical image; and identifying an illuminometer value from the numerical image.
PCT/CN2023/078535 2022-09-13 2023-02-27 Procédé d'identification de valeur d'illuminomètre, dispositif électronique et support de stockage WO2024055531A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211110091.5 2022-09-13
CN202211110091.5A CN115457055A (zh) 2022-09-13 2022-09-13 照度计数值识别方法、电子设备及存储介质

Publications (1)

Publication Number Publication Date
WO2024055531A1 true WO2024055531A1 (fr) 2024-03-21

Family

ID=84303020

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/078535 WO2024055531A1 (fr) 2022-09-13 2023-02-27 Procédé d'identification de valeur d'illuminomètre, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN115457055A (fr)
WO (1) WO2024055531A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115457055A (zh) * 2022-09-13 2022-12-09 深圳创维-Rgb电子有限公司 照度计数值识别方法、电子设备及存储介质

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006309405A (ja) * 2005-04-27 2006-11-09 Chugoku Electric Power Co Inc:The メータ認識システム、メータ認識方法、およびメータ認識プログラム
CN204598390U (zh) * 2015-04-21 2015-08-26 利亚德光电股份有限公司 用于电子设备显示器的亮度调节***和电子设备显示器
CN105740856A (zh) * 2016-01-28 2016-07-06 宁波理工监测科技股份有限公司 一种基于机器视觉的指针式仪表示数读取方法
US20190095739A1 (en) * 2017-09-27 2019-03-28 Harbin Institute Of Technology Adaptive Auto Meter Detection Method based on Character Segmentation and Cascade Classifier
KR102150200B1 (ko) * 2019-05-31 2020-08-31 경성대학교 산학협력단 7-세그먼트를 구비한 진단 기기 이미지 인식 장치
CN113449639A (zh) * 2021-06-29 2021-09-28 深圳市海亿达科技股份有限公司 一种物联网网关对仪表的无接触数据采集方法
CN114219760A (zh) * 2021-11-12 2022-03-22 深圳市优必选科技股份有限公司 仪表的读数识别方法、装置及电子设备
CN114999371A (zh) * 2022-04-14 2022-09-02 深圳创维-Rgb电子有限公司 区域控光方法、装置、设备及可读存储介质
CN115457055A (zh) * 2022-09-13 2022-12-09 深圳创维-Rgb电子有限公司 照度计数值识别方法、电子设备及存储介质

Also Published As

Publication number Publication date
CN115457055A (zh) 2022-12-09

Similar Documents

Publication Publication Date Title
CN109104596B (zh) 投影***以及显示影像的校正方法
US9325968B2 (en) Stereo imaging using disparate imaging devices
CN110300292B (zh) 投影畸变校正方法、装置、***及存储介质
TWI719493B (zh) 投影系統、投影裝置以及其顯示影像的校正方法
US10319104B2 (en) Method and system for determining datum plane
US20180204340A1 (en) A depth map generation apparatus, method and non-transitory computer-readable medium therefor
JP2015212849A (ja) 画像処理装置、画像処理方法および画像処理プログラム
CN102508574A (zh) 基于投影屏幕的多点触控检测方法及多点触控***
CN112085775B (zh) 图像处理的方法、装置、终端和存储介质
US10168984B2 (en) Image receiving apparatus and method thereof for determining an orientation of a screen of an electronic apparatus
CN108074237B (zh) 图像清晰度检测方法、装置、存储介质及电子设备
CN109313797B (zh) 一种图像显示方法及终端
WO2024055531A1 (fr) Procédé d'identification de valeur d'illuminomètre, dispositif électronique et support de stockage
CN111694528A (zh) 显示墙的排版辨识方法以及使用此方法的电子装置
WO2022105277A1 (fr) Procédé et appareil de commande de projection, machine optique de projection et support lisible de stockage
WO2011096571A1 (fr) Dispositif d'entrée
KR102505951B1 (ko) 이미지 제공 장치, 방법 및 컴퓨터 프로그램
CN110047126B (zh) 渲染图像的方法、装置、电子设备和计算机可读存储介质
CN117253022A (zh) 一种对象识别方法、装置及查验设备
KR20140090538A (ko) 디스플레이 장치 및 제어 방법
JP6971788B2 (ja) 画面表示制御方法および画面表示制御システム
WO2019100547A1 (fr) Procédé de commande de projection, appareil, système d'interaction de projection, et support d'informations
US20090046063A1 (en) Coordinate positioning system and method with in-the-air positioning function
US20180061135A1 (en) Image display apparatus and image display method
CN112291445B (zh) 图像处理的方法、装置、设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23864274

Country of ref document: EP

Kind code of ref document: A1