CN115134492A - Image acquisition method, electronic device and computer readable medium - Google Patents


Info

Publication number
CN115134492A
CN115134492A
Authority
CN
China
Prior art keywords
exposure time
image
target
pixel mean
image acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210609863.3A
Other languages
Chinese (zh)
Other versions
CN115134492B (en)
Inventor
任彦斌
曹学友
黄怡菲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Aurora Smart Core Technology Co ltd
Original Assignee
Beijing Jihao Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jihao Technology Co Ltd filed Critical Beijing Jihao Technology Co Ltd
Priority to CN202210609863.3A (granted as CN115134492B)
Publication of CN115134492A
Application granted
Publication of CN115134492B
Legal status: Active
Anticipated expiration


Classifications

  • Studio Devices (AREA)

Abstract

The embodiment of the application discloses an image acquisition method, an electronic device, and a computer-readable medium. An embodiment of the method comprises: acquiring an image based on a historical exposure time to obtain a first image, wherein the historical exposure time is the exposure time recorded after the last image acquisition; determining an actual pixel mean of a target region in the first image; and if the actual pixel mean falls within the floating interval corresponding to the target pixel mean, outputting the first image. The embodiment improves image acquisition efficiency and adaptability to the image acquisition environment.

Description

Image acquisition method, electronic device, and computer-readable medium
Technical Field
The embodiment of the application relates to the field of computer technology, and in particular to an image acquisition method, an electronic device, and a computer-readable medium.
Background
With the development of computer technology, more and more functions in electronic devices rely on optical sensors. For example, under-screen fingerprint identification relies on an optical sensor to collect fingerprint images, and face unlocking relies on an optical sensor to collect face images. To improve the efficiency of these functions, the acquisition timing of the optical sensor usually needs to be controlled.
In the prior art, every time an optical sensor takes a picture, it must go through a photometry stage and an exposure stage. In the photometry stage, an initial image is acquired based on a photometric time, and the exposure time is predicted based on the initial image and the photometric time. In the exposure stage, the final image is acquired based on the predicted exposure time. This image acquisition mode involves a complex operation process and yields low image acquisition efficiency. In addition, environmental factors (such as aging of the on-screen fingerprint acquisition area, a protective film applied by the user, or a change of the desktop background) easily prolong the image acquisition time, so the adaptability to the image acquisition environment is low.
Disclosure of Invention
The embodiment of the application provides an image acquisition method, an electronic device, and a computer-readable medium, aiming to solve the technical problems of low image acquisition efficiency and low adaptability to the image acquisition environment in the prior art.
In a first aspect, an embodiment of the present application provides an image acquisition method, including: acquiring an image based on historical exposure time to obtain a first image, wherein the historical exposure time is the exposure time recorded after the last image acquisition; determining an actual pixel mean of a target region in the first image; and if the actual pixel mean value is located in the floating interval corresponding to the target pixel mean value, outputting the first image.
In a second aspect, an embodiment of the present application provides an electronic device, including: one or more processors; storage means having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to carry out the method as described in the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable medium, on which a computer program is stored, which when executed by a processor, implements the method as described in the first aspect.
In a fourth aspect, the present application provides a computer program product comprising a computer program that, when executed by a processor, implements the method described in the first aspect.
According to the image acquisition method, the electronic device, and the computer-readable medium provided by the embodiments of the application, a first image is acquired based on the exposure time recorded after the last image acquisition, and the actual pixel mean of a target region in the first image is then determined, so that the first image is output when the actual pixel mean falls within the floating interval of the target pixel mean. The historical exposure time can thus be reused, one image acquisition operation and one exposure time prediction operation are omitted, and image acquisition efficiency is improved. In addition, because each acquisition uses the historical exposure time recorded by the most recent acquisition, the influence of environmental factors on image acquisition is weakened, giving stronger adaptability to changes in environmental factors (such as aging of the on-screen fingerprint acquisition area, a protective film applied by the user, or a change of the desktop background).
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a flow chart of one embodiment of an image acquisition method of the present application;
FIG. 2 is a flow chart of a process for determining an actual pixel mean in the image acquisition method of the present application;
FIG. 3 is a schematic diagram of a histogram in an image acquisition method of the present application;
FIG. 4 is a flowchart of a process for determining a floating interval of a target pixel mean value in the image capturing method of the present application;
FIG. 5 is a schematic structural diagram of one embodiment of an image capture device of the present application;
fig. 6 is a schematic structural diagram of a computer system for implementing an electronic device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting it. It should also be noted that, for convenience of description, only the portions related to the invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
It should be noted that all actions of acquiring signals, information or data in the present application are performed under the premise of complying with the corresponding data protection regulation policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
Biometric technology has been widely applied in various terminal devices and electronic apparatuses. Biometric identification techniques include, but are not limited to, fingerprint identification, palm print identification, vein identification, iris identification, face identification, liveness identification, anti-counterfeiting identification, and the like. Among them, fingerprint recognition generally includes optical fingerprint recognition, capacitive fingerprint recognition, and ultrasonic fingerprint recognition. With the rise of full-screen technology, the fingerprint identification module can be arranged in a local area or over the whole area below the display screen, forming under-display optical fingerprint identification; alternatively, part or all of the optical fingerprint identification module can be integrated inside the display screen of the electronic device, forming in-display optical fingerprint identification. The display screen may be an Organic Light-Emitting Diode (OLED) display screen, a Liquid Crystal Display (LCD), or the like. Fingerprint identification generally includes steps such as fingerprint image acquisition, preprocessing, feature extraction, and feature matching. Some or all of these steps can be implemented by traditional Computer Vision (CV) algorithms, or by Artificial Intelligence (AI)-based deep learning algorithms. Fingerprint identification technology can be applied to portable or mobile terminals such as smartphones, tablet computers, and game devices, as well as to other electronic equipment such as smart door locks, automobiles, and bank automatic teller machines, for fingerprint unlocking, fingerprint payment, fingerprint attendance, identity authentication, and the like.
In biometric identification techniques, image acquisition often relies on optical sensors. For example, under-screen fingerprint identification relies on an optical sensor to collect fingerprint images, and face unlocking relies on an optical sensor to collect face images. To improve the efficiency of these functions, the acquisition timing of the optical sensor usually needs to be controlled. The application provides an image acquisition method that can reduce the time the optical sensor spends on image acquisition and improve its adaptability to the image acquisition environment.
Referring to FIG. 1, a flow 100 of one embodiment of an image acquisition method according to the present application is shown. The image acquisition method can be applied to various electronic devices with an image acquisition function. For example, such electronic devices may include, but are not limited to: smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, in-vehicle computers, palmtop computers, desktop computers, set-top boxes, smart televisions, cameras, wearable devices, smart locks, and the like. The execution subject of the image acquisition method may be a processor in the electronic device, such as the processing device 601 in fig. 6.
The image acquisition method comprises the following steps:
step 101, acquiring an image based on historical exposure time to obtain a first image.
In the present embodiment, the execution subject of the image acquisition method may first acquire the historical exposure time. The historical exposure time is the exposure time recorded after the last image acquisition; specifically, it may be the actual exposure time used in the last image acquisition, or the optimal exposure time obtained by adjusting that actual exposure time. After the historical exposure time is obtained, image acquisition can be performed based on it to obtain the first image.
In some optional implementations, if there is no historical exposure time (for example, no exposure time was recorded after the last image acquisition, or an image is being acquired for the first time), the default photometric time may be read, and image acquisition performed based on it to obtain the first image. The default photometric time is the exposure time used when the optical sensor undergoes factory verification; it may be preset as needed and is not specifically limited here.
Step 102, an actual pixel mean of the target region in the first image is determined.
In this embodiment, the execution subject may first select a target region in the first image and then compute the actual pixel mean of that region. The target region can be selected according to the brightness values of the pixels in the first image. For example, the region formed by pixels whose brightness values fall within a target brightness value interval may be used as the target region. The target brightness value interval can be determined according to the distribution of brightness values in the first image, for example, as the interval from a certain brightness value up to the maximum brightness value of the pixels in the first image.
In practice, for a given optical sensor, when shooting factors other than the exposure time (such as ambient light intensity, shooting position, shooting object, and other shooting parameters) are held constant, the exposure time is in a linear positive correlation with the actual pixel mean of the captured image, and likewise with the actual pixel mean of any fixed local region of that image. Generally, the brighter regions of an image reveal whether it is overexposed or underexposed, so the actual pixel mean of a brighter region is usually used to judge whether the image quality meets expectations. Therefore, the interval from a certain brightness value up to the maximum brightness value of the pixels in the first image can be taken as the target brightness value interval, the region formed by pixels whose brightness values fall within that interval taken as the target region, and the actual pixel mean of the target region determined.
Step 103, if the actual pixel mean falls within the floating interval corresponding to the target pixel mean, output the first image.
In the present embodiment, it can be understood that, in the absence of overexposure, the longer the exposure time, the larger the actual pixel mean of the target region in the first image and the higher the image quality. The actual pixel mean of the target region can therefore be used to measure image quality. However, a longer exposure time also lengthens image acquisition, so acquisition time and image quality need to be balanced by setting a target pixel mean that accounts for both. In practice, the target pixel mean may be determined experimentally from a large amount of data; its specific value is not limited here.
In addition, because the amount of incident light varies slightly during exposure, it is difficult for the actual pixel mean to exactly equal the target pixel mean. Allowing for this tolerance, a floating interval can be set around the target pixel mean. For example, if the optical sensor has a bit depth of 12 bits, the maximum pixel value is 4096, the target pixel mean may be set to 2048, and the floating interval may be set to [2048 - 5 × 64, 2048 + 4 × 64]. When the actual pixel mean falls within the floating interval, it can be considered to meet the requirement, and the first image can be output.
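The acceptance check described above can be sketched as follows; the function name is an illustrative assumption, and the constants mirror the 12-bit example values in the text (target mean 2048, floating interval [2048 - 5 × 64, 2048 + 4 × 64]):

```python
# Illustrative constants taken from the worked example in the text.
TARGET_MEAN = 2048
LOWER_OFFSET = 5 * 64   # allowed deviation below the target pixel mean
UPPER_OFFSET = 4 * 64   # allowed deviation above the target pixel mean

def within_floating_interval(actual_mean: float,
                             target_mean: int = TARGET_MEAN,
                             lower: int = LOWER_OFFSET,
                             upper: int = UPPER_OFFSET) -> bool:
    """Return True if the actual pixel mean falls inside the floating
    interval, i.e. the first image can be output directly."""
    return target_mean - lower <= actual_mean <= target_mean + upper
```

When this check fails, the flow falls back to re-acquiring with a corrected exposure time, as described in the optional embodiments below.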
According to the method provided by the embodiment of the application, a first image is acquired based on the exposure time recorded after the last image acquisition, and the actual pixel mean of a target region in the first image is then determined, so that the first image is output when the actual pixel mean falls within the floating interval of the target pixel mean. The historical exposure time can thus be reused, one image acquisition operation and one exposure time prediction operation are omitted, and image acquisition efficiency is improved. In addition, because each acquisition uses the historical exposure time recorded by the most recent acquisition, the influence of environmental factors on image acquisition is weakened, giving stronger adaptability to changes in environmental factors (such as aging of the on-screen fingerprint acquisition area, a protective film applied by the user, or a change of the desktop background).
In some alternative embodiments, after step 102 is performed (i.e., after determining the actual pixel mean of the target region in the first image), the execution subject may also determine an optimal exposure time based on the actual pixel mean, the historical exposure time, and the target pixel mean. The optimal exposure time may then be recorded and used as the historical exposure time for the next image acquisition. In this way, each image acquisition can reuse the most recently recorded exposure time, which reduces the influence of environmental factors on image acquisition and gives stronger adaptability to changes in environmental factors (such as aging of the on-screen fingerprint acquisition area, a protective film applied by the user, or a change of the desktop background).
Wherein the optimal exposure time may be determined using the following formula: T′ = (R − r)/k + T, where T′ is the optimal exposure time, R is the target pixel mean, r is the actual pixel mean, T is the historical exposure time, and k is the slope of the linear positive correlation between the actual pixel mean of the target region in the image and the exposure time. Taking an optical sensor that collects fingerprint images as an example, the brightness of the fingerprint region in a fingerprint image is larger and that of the background region is smaller. During factory calibration of the optical sensor, a flesh-colored rubber stamp can be used to simulate a real finger touching the fingerprint detection area, and a fingerprint image is collected through the optical sensor. The brightest region in the image corresponds to the area touched by the rubber stamp, and the actual pixel mean of that region is in a linear positive correlation with the exposure time. The slope of this correlation can be denoted k′, and k = k′ × a, where a is a fixed ratio representing the difference between the slope for a real finger and the slope for the rubber stamp; its value can be obtained from extensive experimental statistics and is not specifically limited here.
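Under the stated linear relationship, the correction can be sketched as follows; the function names and the example numbers in the usage note are illustrative assumptions, while the arithmetic follows the formula in the text:

```python
def optimal_exposure(target_mean: float, actual_mean: float,
                     historical_time: float, k: float) -> float:
    """T' = (R - r) / k + T: shift the historical exposure time by the
    pixel-mean error divided by the slope of the linear correlation."""
    return (target_mean - actual_mean) / k + historical_time

def finger_slope(rubber_slope: float, a: float) -> float:
    """k = k' * a: scale the factory-calibrated rubber-stamp slope by the
    fixed finger-to-rubber ratio a."""
    return rubber_slope * a
```

For instance, with a target mean of 2048, an actual mean of 1792, a historical exposure time of 80 ms, and a slope of 25.6 counts per millisecond, the corrected time is (2048 − 1792)/25.6 + 80 = 90 ms.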
In some alternative embodiments, when the shooting scene or the shooting environment (e.g., the ambient light intensity) changes significantly, an image shot with the historical exposure time recorded in the last shot may not meet expectations, which manifests as the actual pixel mean determined in step 102 falling outside the floating interval corresponding to the target pixel mean. In this case, the execution subject may treat the process of capturing the first image and calculating the optimal exposure time as a photometry process, acquire a second image based on the optimal exposure time, and output the second image. An image capturing effect as intended is thereby obtained.
In some alternative embodiments, when acquiring the second image based on the optimal exposure time, if it is detected that the optimal exposure time is greater than the first exposure time threshold (which may be set as required, for example, 50ms, 60ms, or 100ms, etc.), the executing body may replace the optimal exposure time with the first exposure time threshold, and acquire the second image with the replaced optimal exposure time. Therefore, the image overexposure can be avoided, and the quality of the acquired image is improved.
In some optional embodiments, when acquiring the second image based on the optimal exposure time, if the optimal exposure time is less than the second exposure time threshold (which may be set as required, and the first exposure time threshold is greater than the second exposure time threshold), the executing entity may replace the optimal exposure time with the second exposure time threshold, and acquire the second image with the replaced optimal exposure time. Therefore, the image underexposure can be avoided, and the quality of the acquired image is improved.
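The two threshold rules above amount to limiting the optimal exposure time to the range between the second and first exposure time thresholds; a minimal sketch, with illustrative function and parameter names:

```python
def clamp_exposure(t: float, t_min: float, t_max: float) -> float:
    """Replace an out-of-range optimal exposure time with the nearest
    threshold: t_max guards against overexposure, t_min against
    underexposure of the second image."""
    return max(t_min, min(t, t_max))
```

For example, with an assumed first threshold of 100 ms and second threshold of 1 ms, a computed optimal exposure of 120 ms would be replaced with 100 ms before acquiring the second image.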
In some alternative embodiments, referring to fig. 2, the actual pixel mean in step 102 may be determined using sub-steps S11 through S13 as follows:
and a sub-step S11 of converting the first image into a histogram. Specifically, the brightness value of each pixel point in the first image may be determined first, then the number of pixel points corresponding to each brightness value may be counted, and finally a histogram is drawn based on the statistical result. As shown in fig. 3, the horizontal axis of the histogram may represent the luminance value or the luminance value range, and the vertical axis may represent the number of pixels corresponding to different luminance values or different luminance value ranges. The histogram can represent the distribution condition of pixel points with different brightness values or different brightness value ranges, and the brightness degree of the image is visually represented.
And a substep S12 of selecting a target brightness value interval based on the histogram. Here, since a larger abscissa in the histogram indicates a higher brightness value, an interval formed by at least one brightness value on the right side of the histogram may be taken as the target brightness value interval, as indicated by reference numeral 301 in fig. 3.
And a substep S13, taking a region formed by pixel points of which the brightness values are within the target brightness value interval in the first image as a target region, and determining the actual pixel average value of the target region.
By converting the first image into a histogram, the actual pixel mean of the brighter region of the first image can be computed conveniently and quickly.
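Sub-steps S11 to S13 can be sketched with NumPy as follows; the bin count, the number of right-side bins kept, and the 12-bit value range are illustrative assumptions, not values fixed by the text:

```python
import numpy as np

def target_region_mean(image: np.ndarray, right_bins: int = 8,
                       bins: int = 64, max_value: int = 4096) -> float:
    # S11: build a histogram of pixel brightness values
    counts, edges = np.histogram(image, bins=bins, range=(0, max_value))
    # S12: take the interval covered by the right-most bins as the
    # target brightness value interval (the brightest region)
    threshold = edges[bins - right_bins]
    # S13: the pixels in that interval form the target region;
    # return their mean (0.0 if no pixel is that bright)
    target = image[image >= threshold]
    return float(target.mean()) if target.size else 0.0
```

With the defaults above, the target interval is [3584, 4096); e.g. a 2 × 2 image with values 100, 200, 4000, 3900 yields a target region mean of 3950.0.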
In some optional embodiments, referring to fig. 4, the floating interval of the target pixel mean in step 103 may be determined through sub-steps S21 to S24 as follows:
and a substep S21, performing image acquisition based on a plurality of candidate exposure times smaller than or equal to the first exposure time threshold, to obtain a test image set corresponding to each candidate exposure time. The first exposure threshold may refer to a maximum exposure threshold, which may be set as needed. For example, it may be set to 50ms, 60ms, 100ms, or the like. Taking the first exposure time threshold as 100ms as an example, a plurality of test images can be collected by taking 100ms as exposure time, and are summarized into a first test image set; then, collecting a plurality of test images by taking 90ms as exposure time, and summarizing the test images into a second test image set; then, collecting a plurality of test images by taking 80ms as exposure time, and summarizing the test images into a third test image set; and so on.
And a substep S22 of processing the test image set corresponding to each candidate exposure time with a target algorithm and determining the algorithm pass rate of each set. The target algorithm may be an algorithm for detecting or identifying the acquired image. For example, if the captured image is a fingerprint image, the target algorithm may be a fingerprint recognition model. Each test image in the set corresponding to each candidate exposure time can be input to the fingerprint recognition model, yielding a recognition result per image, and the algorithm pass rate of each set can then be counted from these results. Continuing with the above example, the algorithm pass rate of the first test image set may be 99%, that of the second 96%, and that of the third 80%.
And a substep S23 of selecting the target exposure time corresponding to the target algorithm pass rate from the candidate exposure times, and determining the target pixel mean based on the target test image set corresponding to the target exposure time. Here, a target algorithm pass rate may be selected from the calculated pass rates, for example the minimum pass rate (e.g., 96%) that exceeds a preset threshold (e.g., 95%). Then, the exposure time (i.e., 90 ms) corresponding to that target algorithm pass rate (i.e., 96%) is used as the target exposure time. When determining the target pixel mean based on the target test image set, the pixel mean of the target region in each test image of the set may be determined first, and these per-image means then averaged to obtain the target pixel mean.
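The selection in this sub-step can be sketched as follows, under the assumption that pass rates are given as a mapping from candidate exposure time to measured pass rate; the function name and the 95% threshold default are illustrative:

```python
def select_target_exposure(pass_rates: dict, threshold: float = 0.95) -> float:
    """Among candidate exposure times whose algorithm pass rate exceeds
    the threshold, pick the one with the smallest pass rate (i.e. the
    shortest acceptable exposure), balancing image quality against the
    time consumed by image acquisition."""
    eligible = {t: r for t, r in pass_rates.items() if r > threshold}
    if not eligible:
        raise ValueError("no candidate exposure time meets the threshold")
    return min(eligible, key=eligible.get)
```

With illustrative pass rates of 99% at 100 ms, 96% at 90 ms, and 80% at 80 ms, the function selects 90 ms: 80 ms fails the threshold, and of the remaining candidates, 90 ms has the smaller pass rate.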
And a sub-step S24 of determining a floating interval of the target pixel mean value based on the target pixel mean value and a preset offset value. Here, the preset offset value may include an upper offset value and a lower offset value, and the upper offset value and the lower offset value may be the same or different. For example, the target pixel mean value is 2048, the upper offset value may be 4 × 64, and the lower offset value may be 5 × 64, where the floating interval corresponding to the target pixel mean value is [2048-5 × 64, 2048+4 × 64 ].
The target pixel mean value is selected by testing the algorithm passing rate of the test image set corresponding to the candidate exposure times, so that the image quality and the time consumed by image acquisition can be effectively balanced, and the target pixel mean value is more reasonable to set.
With further reference to fig. 5, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an image capturing apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which is particularly applicable to various electronic devices.
As shown in fig. 5, the image capturing apparatus 500 of the present embodiment includes: a first acquisition unit 501, configured to acquire an image based on historical exposure time, to obtain a first image, where the historical exposure time is exposure time recorded after the last image acquisition; a determining unit 502, configured to determine an actual pixel mean of the target area in the first image; the first output unit 503 is configured to output the first image if the actual pixel average value is within a floating interval corresponding to the target pixel average value.
In some optional implementations of this embodiment, the apparatus further includes a recording unit, configured to determine an optimal exposure time based on the actual pixel average, the historical exposure time, and the target pixel average; and recording the optimal exposure time to be used as the historical exposure time of the next image acquisition.
In some optional implementations of the embodiment, the apparatus further includes a second output unit, configured to acquire a second image based on the optimal exposure time and output the second image if the actual pixel mean is outside the floating interval.
In some optional implementations of this embodiment, the first acquiring unit 501 is further configured to: if the optimal exposure time is greater than a first exposure time threshold, replacing the optimal exposure time with the first exposure time threshold, and acquiring a second image by adopting the replaced optimal exposure time; and if the optimal exposure time is smaller than a second exposure time threshold, replacing the optimal exposure time with the second exposure time threshold, and acquiring a second image by adopting the replaced optimal exposure time, wherein the first exposure time threshold is larger than the second exposure time threshold.
In some optional implementations of this embodiment, the determining unit 502 is further configured to convert the first image into a histogram; selecting a target brightness value interval based on the histogram; and taking a region formed by pixel points with the brightness values within the target brightness value interval in the first image as a target region, and determining the actual pixel average value of the target region.
In some optional implementations of this embodiment, the floating interval is determined by: acquiring images based on a plurality of candidate exposure times smaller than or equal to a first exposure time threshold value to obtain a test image set corresponding to each candidate exposure time; processing the test image set corresponding to each candidate exposure time based on a target algorithm, and determining the algorithm passing rate of the test image set corresponding to each candidate exposure time; selecting target exposure time corresponding to the passing rate of a target algorithm from the candidate exposure time, and determining a target pixel mean value based on a target test image set corresponding to the target exposure time; and determining a floating interval corresponding to the target pixel mean value based on the target pixel mean value and a preset deviation value.
In some optional implementations of this embodiment, the apparatus further includes a second acquisition unit, configured to: if the historical exposure time does not exist, reading default photometric time, wherein the default photometric time is exposure time used when the optical sensor is subjected to factory verification; and acquiring an image based on the default photometric time to obtain a first image.
The apparatus provided by the above embodiment of the application acquires a first image based on the exposure time recorded after the last image acquisition, then determines the actual pixel mean of a target region in the first image, and outputs the first image when the actual pixel mean falls within the floating interval of the target pixel mean. The historical exposure time can thus be reused, one image acquisition operation and one exposure time prediction operation are omitted, and image acquisition efficiency is improved. In addition, because each acquisition uses the historical exposure time recorded by the most recent acquisition, the influence of environmental factors on image acquisition is weakened, giving stronger adaptability to changes in environmental factors (such as aging of the on-screen fingerprint acquisition area, a protective film applied by the user, or a change of the desktop background).
Embodiments of the present application further provide an electronic device including one or more processors and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image acquisition method described above.
Reference is now made to fig. 6, which illustrates a schematic structural diagram of an electronic device for implementing some embodiments of the present application. The electronic device shown in fig. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in fig. 6, the electronic device 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the electronic device 600. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; output devices 607 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, etc.; storage devices 608 including, for example, a magnetic disk, a hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device 600 to communicate wirelessly or by wire with other devices to exchange data. While fig. 6 illustrates an electronic device 600 having various devices, it is to be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or multiple devices as desired.
An embodiment of the present application further provides a computer program product, which includes a computer program that, when executed by a processor, implements the image acquisition method.
In particular, according to some embodiments of the present application, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 609, or installed from the storage device 608, or installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of some embodiments of the present application.
An embodiment of the present application further provides a computer readable medium on which a computer program is stored; the computer program, when executed by a processor, implements the image acquisition method.
It should be noted that the computer readable medium described in some embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present application, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring an image based on historical exposure time to obtain a first image, wherein the historical exposure time is the exposure time recorded after the last image acquisition; determining an actual pixel mean of a target region in a first image; and outputting a first image if the actual pixel mean value is located in the floating interval corresponding to the target pixel mean value.
Computer program code for carrying out operations of embodiments of the present application may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, for example described as: a processor including a first determining unit, a second determining unit, a selecting unit, and a third determining unit. The names of these units do not, in some cases, constitute a limitation on the units themselves.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), and the like.
The foregoing description is only of preferred embodiments of the present application and is provided to illustrate the technical principles employed. Those skilled in the art will appreciate that the scope of the invention in the embodiments of the present application is not limited to the specific combinations of the above technical features, and also encompasses other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, a technical solution formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present application.

Claims (10)

1. An image acquisition method, characterized in that the method comprises:
acquiring an image based on historical exposure time to obtain a first image, wherein the historical exposure time is the exposure time recorded after the last image acquisition;
determining an actual pixel mean of a target region in the first image;
and if the actual pixel mean value is located in the floating interval corresponding to the target pixel mean value, outputting the first image.
2. The method of claim 1, wherein after determining the actual pixel mean of the target region in the first image, the method further comprises:
determining an optimal exposure time based on the actual pixel mean, the historical exposure time, and the target pixel mean;
and recording the optimal exposure time as the historical exposure time of the next image acquisition.
3. The method of claim 2, further comprising:
and if the actual pixel mean value is located outside the floating interval, acquiring a second image based on the optimal exposure time, and outputting the second image.
4. The method of claim 3, wherein said acquiring a second image based on said optimal exposure time comprises:
if the optimal exposure time is greater than a first exposure time threshold, replacing the optimal exposure time with the first exposure time threshold, and acquiring the second image using the replaced optimal exposure time;
and if the optimal exposure time is less than a second exposure time threshold, replacing the optimal exposure time with the second exposure time threshold, and acquiring the second image using the replaced optimal exposure time, wherein the first exposure time threshold is greater than the second exposure time threshold.
5. The method according to any one of claims 1-4, wherein said determining an actual pixel mean of the target region in the first image comprises:
converting the first image into a histogram;
selecting a target brightness value interval based on the histogram;
and taking a region formed by pixel points with the brightness values within the target brightness value interval in the first image as a target region, and determining the actual pixel mean value of the target region.
6. Method according to one of claims 1 to 5, characterized in that the floating interval is determined by:
acquiring images based on a plurality of candidate exposure times less than or equal to a first exposure time threshold to obtain a test image set corresponding to each candidate exposure time;
processing the test image set corresponding to each candidate exposure time based on a target algorithm, and determining an algorithm pass rate of the test image set corresponding to each candidate exposure time;
selecting, from the candidate exposure times, a target exposure time corresponding to a target algorithm pass rate, and determining a target pixel mean value based on a target test image set corresponding to the target exposure time;
and determining a floating interval corresponding to the target pixel mean value based on the target pixel mean value and a preset deviation value.
7. The method according to one of claims 1 to 6, characterized in that the method further comprises:
if no historical exposure time exists, reading a default photometric time, wherein the default photometric time is the exposure time used when the optical sensor undergoes factory verification;
and acquiring an image based on the default photometric time to obtain a first image.
8. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
9. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the method of any of claims 1-7 when executed by a processor.
CN202210609863.3A 2022-05-31 2022-05-31 Image acquisition method, electronic device, and computer-readable medium Active CN115134492B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210609863.3A CN115134492B (en) 2022-05-31 2022-05-31 Image acquisition method, electronic device, and computer-readable medium

Publications (2)

Publication Number Publication Date
CN115134492A true CN115134492A (en) 2022-09-30
CN115134492B CN115134492B (en) 2024-03-19

Family

ID=83377379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210609863.3A Active CN115134492B (en) 2022-05-31 2022-05-31 Image acquisition method, electronic device, and computer-readable medium

Country Status (1)

Country Link
CN (1) CN115134492B (en)

Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0996284A1 (en) * 1998-10-23 2000-04-26 Datalogic S.P.A. Process for regulating the exposure time of a light sensor
WO2006070046A1 (en) * 2004-12-29 2006-07-06 Nokia Corporation Exposure of digital imaging
US20090180022A1 (en) * 2008-01-16 2009-07-16 Samsung Electronics Co., Ltd. System and method for acquiring moving images
CN102291538A (en) * 2011-08-17 2011-12-21 浙江博视电子科技股份有限公司 Automatic exposure method and control device of camera
CN102685393A (en) * 2011-03-07 2012-09-19 精工爱普生株式会社 Digital camera and exposure control method
WO2014119159A1 (en) * 2013-02-04 2014-08-07 日立マクセル株式会社 Image capture device
CN104052905A (en) * 2013-03-12 2014-09-17 三星泰科威株式会社 Method and apparatus for processing image
CN105681626A (en) * 2016-02-25 2016-06-15 广东欧珀移动通信有限公司 Detection method, control method, detection device, control device and electronic device
CN106412447A (en) * 2015-07-31 2017-02-15 广达电脑股份有限公司 Exposure control system and method thereof
CN107249104A (en) * 2017-06-15 2017-10-13 武汉云衡智能科技有限公司 A kind of drive recorder smart camera automatic explosion method
CN107613190A (en) * 2016-07-11 2018-01-19 中兴通讯股份有限公司 A kind of photographic method and terminal
EP3336798A1 (en) * 2016-12-16 2018-06-20 Goodrich Corporation Hdr imaging
CN109040622A (en) * 2018-08-22 2018-12-18 中国电子科技集团公司第四十四研究所 Cmos image sensor exposure time series control method
CN109639994A (en) * 2019-01-03 2019-04-16 湖北工业大学 Embedded onboard camera exposure time dynamic adjusting method
CN110035810A (en) * 2016-11-29 2019-07-19 普诺森公司 Method and apparatus for detecting a wide range of protein concentration simultaneously
CN110248112A (en) * 2019-07-12 2019-09-17 成都微光集电科技有限公司 A kind of exposal control method of imaging sensor
US20200036882A1 (en) * 2018-07-26 2020-01-30 Qualcomm Incorporated Faster automatic exposure control system
US20200053267A1 (en) * 2018-08-13 2020-02-13 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging Control Method and Apparatus, Electronic Device, and Computer Readable Storage Medium
CN110958400A (en) * 2019-12-13 2020-04-03 上海海鸥数码照相机有限公司 System, method and device for keeping exposure of continuously shot pictures consistent
CN111064900A (en) * 2019-12-25 2020-04-24 宜宾凯翼汽车有限公司 Self-adaptive white balance method and vehicle-mounted panoramic looking-around system
CN111432134A (en) * 2020-03-17 2020-07-17 广东博智林机器人有限公司 Method and device for determining exposure time of image acquisition equipment and processor
US20210065392A1 (en) * 2019-08-29 2021-03-04 Microsoft Technology Licensing, Llc Optimized exposure control for improved depth mapping
CN112514367A (en) * 2020-01-13 2021-03-16 深圳市大疆创新科技有限公司 Electronic front curtain timing control device and method and image acquisition device
WO2021077393A1 (en) * 2019-10-25 2021-04-29 深圳市汇顶科技股份有限公司 Below-screen fingerprint acquisition method and apparatus, electronic device, and storage medium
CN113206949A (en) * 2021-04-01 2021-08-03 广州大学 Semi-direct monocular vision SLAM method based on entropy weighted image gradient
CN113473035A (en) * 2021-07-23 2021-10-01 北京字节跳动网络技术有限公司 Ambient brightness determination method and device and electronic equipment
CN113572973A (en) * 2021-09-28 2021-10-29 武汉市聚芯微电子有限责任公司 Exposure control method, device, equipment and computer storage medium
CN114067274A (en) * 2021-11-19 2022-02-18 杭州萤石软件有限公司 Control method and device of image acquisition equipment and electronic equipment
CN114189634A (en) * 2022-01-26 2022-03-15 阿里巴巴达摩院(杭州)科技有限公司 Image acquisition method, electronic device and computer storage medium
WO2022099482A1 (en) * 2020-11-10 2022-05-19 深圳市大疆创新科技有限公司 Exposure control method and apparatus, mobile platform, and computer-readable storage medium

Also Published As

Publication number Publication date
CN115134492B (en) 2024-03-19

Similar Documents

Publication Publication Date Title
US10949952B2 (en) Performing detail enhancement on a target in a denoised image
CN111193923B (en) Video quality evaluation method and device, electronic equipment and computer storage medium
CN108269254B (en) Image quality evaluation method and device
CN109005366A (en) Camera module night scene image pickup processing method, device, electronic equipment and storage medium
EP3940633B1 (en) Image alignment method and apparatus, electronic device, and storage medium
CN109618102B (en) Focusing processing method and device, electronic equipment and storage medium
CN114387548A (en) Video and liveness detection method, system, device, storage medium and program product
CN109819176A (en) A kind of image pickup method, system, device, electronic equipment and storage medium
KR20120133646A (en) Apparatus and method for estimating the number of object
CN110971833B (en) Image processing method and device, electronic equipment and storage medium
CN114419400A (en) Training method, recognition method, device, medium and equipment of image recognition model
CN113158773B (en) Training method and training device for living body detection model
CN114650361B (en) Shooting mode determining method, shooting mode determining device, electronic equipment and storage medium
CN113989387A (en) Camera shooting parameter adjusting method and device and electronic equipment
CN112069880A (en) Living body detection method, living body detection device, electronic apparatus, and computer-readable medium
CN114255177B (en) Exposure control method, device, equipment and storage medium in imaging
CN115134492B (en) Image acquisition method, electronic device, and computer-readable medium
CN110321857B (en) Accurate passenger group analysis method based on edge calculation technology
CN114332993A (en) Face recognition method and device, electronic equipment and computer readable storage medium
CN113489911A (en) Iris camera of electronic dimming and dimming mirror and iris camera control method
CN113240602A (en) Image defogging method and device, computer readable medium and electronic equipment
CN112070022A (en) Face image recognition method and device, electronic equipment and computer readable medium
US11232314B2 (en) Computer vision based approach to image injection detection
CN113723242B (en) Visual lie detection method based on video terminal, electronic equipment and medium
CN113516089B (en) Face image recognition method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201-1, 2nd Floor, Building 4, No. 188 Rixin Road, Binhai Science and Technology Park, Binhai New Area, Tianjin, 300450

Applicant after: Tianjin Jihao Technology Co.,Ltd.

Address before: 100089 Z, 17th floor, No. 1, Zhongguancun Street, Haidian District, Beijing

Applicant before: Beijing Jihao Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20230831

Address after: 1502, 15th floor, No. 3 Suzhou Street, Haidian District, Beijing, 100080

Applicant after: Beijing Aurora Smart Core Technology Co.,Ltd.

Address before: 201-1, 2nd Floor, Building 4, No. 188 Rixin Road, Binhai Science and Technology Park, Binhai New Area, Tianjin, 300450

Applicant before: Tianjin Jihao Technology Co.,Ltd.

GR01 Patent grant