CN116095498A - Image acquisition method, device, terminal and storage medium - Google Patents


Info

Publication number
CN116095498A
CN116095498A
Authority
CN
China
Prior art keywords
image
frame number
camera
image frame
code scanning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111302409.5A
Other languages
Chinese (zh)
Inventor
曹杰
张志辉
尚红霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority claimed from application CN202111302409.5A
Publication of CN116095498A
Legal status: Pending

Landscapes

  • Studio Devices (AREA)

Abstract

The disclosure relates to the field of computer technology, and in particular to an image acquisition method, an image acquisition device, a terminal and a storage medium. The image acquisition method comprises the following steps: if the running state of the camera is a code scanning state, acquiring the image frame number corresponding to the camera; acquiring an exposure initial value and a convergence speed value corresponding to the image frame number; and, based on the exposure initial value and the convergence speed value, controlling the camera to acquire a code scanning image set corresponding to the image frame number. By adopting the method and the device, the accuracy of image acquisition can be improved.

Description

Image acquisition method, device, terminal and storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular, to an image acquisition method, an image acquisition device, a terminal and a storage medium.
Background
With the development of science and technology, terminals are evolving more and more rapidly, so improving the convenience with which users operate them has become a focus of attention. A terminal can not only shoot images but also perform automatic exposure (AE). Automatic exposure is a necessary function of the terminal camera: the terminal can quickly adjust the camera's exposure in both dark and bright environments to achieve higher imaging quality, that is, to acquire higher-quality images.
Disclosure of Invention
The disclosure provides an image acquisition method, an image acquisition device, a terminal and a storage medium, the main purpose of which is to improve the accuracy of image acquisition. The technical scheme of the embodiments of the disclosure is as follows:
according to an aspect of the present disclosure, an embodiment of the present disclosure provides an image acquisition method, including:
if the running state of the camera is a code scanning state, acquiring an image frame number corresponding to the camera;
acquiring an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number;
controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure initial value and the convergence speed value; the code scanning image set is used for indicating the terminal to acquire the identification code corresponding to the code scanning image set.
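The three claimed steps can be sketched in a few lines. The `Camera` class, its fields, and every parameter value below are hypothetical stand-ins invented for illustration; the disclosure specifies no concrete API.

```python
# Minimal sketch of the claimed flow; Camera and all values are hypothetical.

class Camera:
    def __init__(self, state, frame_count):
        self.state = state              # e.g. "code_scanning" or "photo"
        self.frame_count = frame_count  # image frame number for one scan

    def capture_set(self, initial_ev, speed):
        # Stand-in for real capture: record the EV used for each frame.
        images, ev = [], initial_ev
        for _ in range(self.frame_count):
            images.append({"ev": round(ev, 3)})
            ev += speed                 # AE moves on, frame by frame
        return images

def acquire_scan_code_images(camera, initial_ev, speed):
    if camera.state != "code_scanning":           # step 1: check the state
        return None
    return camera.capture_set(initial_ev, speed)  # steps 2 and 3

cam = Camera("code_scanning", frame_count=4)
print(len(acquire_scan_code_images(cam, initial_ev=6.0, speed=0.5)))  # → 4
```

The resulting list plays the role of the code scanning image set handed to the recognizer.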
Optionally, the acquiring the number of image frames corresponding to the camera includes:
acquiring a camera module identifier corresponding to the camera;
and acquiring the image frame number corresponding to the camera module identification.
Optionally, the acquiring the number of image frames corresponding to the camera includes:
acquiring a currently running application program;
and acquiring the image frame number corresponding to the application program, and determining the image frame number as the image frame number corresponding to the camera.
Optionally, the acquiring the number of image frames corresponding to the camera includes:
acquiring image frame numbers corresponding to all application programs in an application program set, and acquiring an image frame number set;
and determining the maximum image frame number in the image frame number set as the image frame number corresponding to the camera.
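The selection rule above (take the largest frame number over the application set) can be illustrated as follows; the application names and counts are invented for illustration.

```python
# Hypothetical per-application frame counts; the method takes the maximum
# over the application set as the camera's image frame number.

app_frame_counts = {"payment_app": 4, "chat_app": 6, "browser": 3}

def frame_count_for_camera(frame_counts):
    return max(frame_counts.values())

print(frame_count_for_camera(app_frame_counts))  # → 6
```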
Optionally, the acquiring the initial exposure value corresponding to the image frame number and the convergence speed value corresponding to the image frame number includes:
acquiring an exposure initial value set and a convergence speed value set, wherein the exposure initial value set comprises each image frame number and an exposure initial value corresponding to the image frame number, and the convergence speed value set comprises each image frame number and a convergence speed value corresponding to the image frame number;
and acquiring an exposure initial value corresponding to the image frame number based on the exposure initial value set, and acquiring a convergence speed value corresponding to the image frame number based on the convergence speed value set.
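The two sets described above can be modeled as lookup tables keyed by image frame number. The values below are illustrative only and are not taken from the disclosure.

```python
# Hypothetical "exposure initial value set" and "convergence speed value
# set", each keyed by image frame number.

exposure_initial_set  = {3: 7.0, 4: 6.5, 6: 6.0}
convergence_speed_set = {3: 1.0, 4: 0.8, 6: 0.5}

def ae_params_for(frame_count):
    return (exposure_initial_set[frame_count],
            convergence_speed_set[frame_count])

print(ae_params_for(4))  # → (6.5, 0.8)
```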
Optionally, the controlling the camera to collect the scan code image set corresponding to the image frame number based on the exposure initial value and the convergence speed value includes:
calculating an exposure parameter corresponding to the camera based on the exposure initial value and the convergence speed value;
And controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure parameters.
Optionally, the controlling the camera to collect the scan code image set with the image frame number includes:
controlling the camera to acquire a first code scanning image;
if the first code scanning image meets the image condition, controlling the camera to acquire a second code scanning image with a preset frame number, wherein the preset frame number is one frame smaller than the image frame number;
and if the second code scanning image with the preset frame number is obtained, adding the first code scanning image and the second code scanning image with the preset frame number to a code scanning image set.
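The capture rule above (one first image, then, if it meets the image condition, a preset number of further images equal to the frame number minus one) might look like this; `capture` and `meets_condition` are placeholder callables supplied by the caller.

```python
# Sketch of the claimed capture loop with placeholder callables.

def collect_scan_set(capture, meets_condition, frame_count):
    first = capture()
    if not meets_condition(first):
        return []                       # first image failed the condition
    # preset frame number = image frame number minus one
    rest = [capture() for _ in range(frame_count - 1)]
    return [first] + rest               # the full code scanning image set

frames = iter(range(10))
result = collect_scan_set(lambda: next(frames), lambda img: True, 4)
print(result)  # → [0, 1, 2, 3]
```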
According to an aspect of the present disclosure, an embodiment of the present disclosure provides an image acquisition apparatus including:
the frame number acquisition unit is used for acquiring the image frame number corresponding to the camera if the running state of the camera is a code scanning state;
a speed value acquisition unit configured to acquire an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number;
the image acquisition unit is used for controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure initial value and the convergence speed value; the code scanning image set is used for indicating the terminal to acquire the identification code corresponding to the code scanning image set.
Optionally, the frame number obtaining unit includes an identification obtaining subunit and a frame number obtaining subunit, where the frame number obtaining unit is configured to, when obtaining the frame number of the image corresponding to the camera:
the identification acquisition subunit is used for acquiring the camera module identification corresponding to the camera;
the frame number acquisition subunit is used for acquiring the image frame number corresponding to the camera module identification.
Optionally, the frame number obtaining unit includes a program obtaining subunit and a frame number obtaining subunit, where the frame number obtaining unit is configured to, when obtaining the frame number of the image corresponding to the camera:
the program acquisition subunit is used for acquiring the currently running application program;
the frame number acquisition subunit is configured to acquire an image frame number corresponding to the application program, and determine the image frame number as an image frame number corresponding to the camera.
Optionally, the frame number obtaining unit is configured to, when obtaining the frame number of the image corresponding to the camera, specifically:
acquiring image frame numbers corresponding to all application programs in an application program set, and acquiring an image frame number set;
and determining the maximum image frame number in the image frame number set as the image frame number corresponding to the camera.
Optionally, the speed value obtaining unit is configured to, when obtaining an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number, specifically:
acquiring an exposure initial value set and a convergence speed value set, wherein the exposure initial value set comprises each image frame number and an exposure initial value corresponding to the image frame number, and the convergence speed value set comprises each image frame number and a convergence speed value corresponding to the image frame number;
and acquiring an exposure initial value corresponding to the image frame number based on the exposure initial value set, and acquiring a convergence speed value corresponding to the image frame number based on the convergence speed value set.
Optionally, the image acquisition unit is configured to control, based on the exposure initial value and the convergence speed value, when the camera acquires a scan code image set corresponding to the image frame number, specifically configured to:
calculating an exposure parameter corresponding to the camera based on the exposure initial value and the convergence speed value;
and controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure parameters.
Optionally, the image acquisition unit is configured to control the camera to acquire the scan code image set with the image frame number, and specifically configured to:
Controlling the camera to acquire a first code scanning image;
if the first code scanning image meets the image condition, controlling the camera to acquire a second code scanning image with a preset frame number, wherein the preset frame number is one frame smaller than the image frame number;
and if the second code scanning image with the preset frame number is obtained, adding the first code scanning image and the second code scanning image with the preset frame number to a code scanning image set.
According to an aspect of the present disclosure, there is provided a terminal including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the preceding aspects.
According to an aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method of any one of the preceding aspects.
According to an aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of any one of the preceding aspects.
The technical scheme provided by some embodiments of the present disclosure has the following beneficial effects:
in one or more embodiments of the present disclosure, if the running state of the camera is a code scanning state, the image frame number corresponding to the camera is obtained, along with an exposure initial value and a convergence speed value corresponding to that frame number; based on these two values, the camera is controlled to collect a code scanning image set of that frame number. Because the code scanning image set is used to instruct the terminal to obtain the identification code corresponding to it, there is no need to locate the identification code within the image during scanning, which shortens the time spent obtaining its position information and improves code scanning efficiency. At the same time, controlling the camera with the exposure initial value and the convergence speed value reduces the probability that an over- or under-exposed image prevents the identification code from being recognized, improves the accuracy of image acquisition, raises the success rate of identification-code recognition, and improves the user experience.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
In order to illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure; a person of ordinary skill in the art may derive other drawings from them without inventive effort.
Fig. 1 illustrates a system architecture diagram of an image acquisition method provided by an embodiment of the present disclosure;
fig. 2 illustrates a background schematic diagram of an image acquisition method provided in an embodiment of the present disclosure;
fig. 3 shows a flowchart of an image acquisition method according to an embodiment of the present disclosure;
fig. 4 shows a flowchart of an image acquisition method according to an embodiment of the present disclosure;
fig. 5 shows an exemplary schematic diagram of a scan code image acquired in a high dynamic scene provided by an embodiment of the present disclosure;
FIG. 6 shows an exemplary schematic diagram of a terminal interface provided by an embodiment of the present disclosure;
FIG. 7 shows an exemplary schematic diagram of a convergence speed value set provided by an embodiment of the disclosure;
FIG. 8 shows an exemplary schematic diagram of a scan code image provided by an embodiment of the present disclosure;
FIG. 9 shows an exemplary schematic diagram of a scan code image provided by an embodiment of the present disclosure;
fig. 10 is a schematic structural view of an image capturing device according to an embodiment of the present disclosure;
fig. 11 is a schematic structural view of an image capturing device according to an embodiment of the present disclosure;
fig. 12 is a schematic structural view of an image capturing device according to an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure;
FIG. 14 is a schematic diagram of the architecture of an operating system and user space provided by embodiments of the present disclosure;
FIG. 15 is an architecture diagram of the android operating system of FIG. 14;
FIG. 16 is an architecture diagram of the iOS operating system of FIG. 14.
Detailed Description
The technical solutions in the embodiments of the present disclosure are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present disclosure. Based on the embodiments in this disclosure, all other embodiments obtained by a person of ordinary skill in the art without inventive effort fall within the scope of protection of this disclosure.
In the description of the present disclosure, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It is also noted that, unless expressly specified and limited otherwise, "comprise" and "have" and any variations thereof are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus. The specific meaning of these terms will be understood by those of ordinary skill in the art in the specific context. Furthermore, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the objects before and after it are in an "or" relationship.
With the development of science and technology, terminals support more and more functions, greatly facilitating users' lives. For example, a terminal may support an image acquisition function. Acquired images include, but are not limited to, high-dynamic-range (HDR) images and ordinary images. Compared with an ordinary image, an HDR image provides a greater dynamic range and more image detail: from low-dynamic-range (LDR) images taken at different exposure times, the LDR image with the best detail at each exposure time is used to synthesize the final HDR image, which better reflects the visual effect of the user's real environment.
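The LDR-to-HDR synthesis sketched above can be caricatured in a few lines: for each pixel, keep the value from the frame whose exposure renders it closest to mid-grey. Real HDR pipelines are far more elaborate; the pixel values here are invented.

```python
# Toy exposure fusion: per pixel, pick the LDR frame whose value is closest
# to mid-grey (0.5). Purely illustrative of the idea, not a real pipeline.

def fuse_ldr(frames, mid=0.5):
    fused = []
    for pixels in zip(*frames):          # same pixel position across frames
        fused.append(min(pixels, key=lambda p: abs(p - mid)))
    return fused

dark   = [0.05, 0.10, 0.40]   # short exposure
normal = [0.35, 0.55, 0.90]   # medium exposure
bright = [0.70, 0.95, 1.00]   # long exposure
print(fuse_ldr([dark, normal, bright]))  # → [0.35, 0.55, 0.4]
```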
It is easy to understand that automatic exposure is a necessary function of the terminal camera: the terminal can quickly adjust the camera's exposure in dark and bright environments to achieve higher imaging quality, that is, to acquire higher-quality images.
In some embodiments, fig. 1 shows a system architecture diagram of an image acquisition method provided by an embodiment of the present disclosure. As shown in fig. 1, the terminal 11 may obtain, through the network 12, a two-dimensional code displayed by the terminal 13, for example. When the terminal 11 is far away from the terminal 13, the acquired two-dimensional code image occupies only a small proportion of the display screen of the terminal 11; the terminal 11 can then receive a zoom-in instruction for the two-dimensional code image so that it can recognize the two-dimensional code successfully.
Fig. 2 illustrates a background schematic diagram of an image acquisition method provided by an embodiment of the present disclosure, according to some embodiments. As shown in fig. 2, in the field of automatic exposure technology, the image collected by the terminal C may be as shown in fig. 2. When the terminal collects images, it may, for example, calculate the exposure time and sensitivity of the next frame in a single code scanning process based on the brightness information of the whole display picture. To improve performance and experience, a terminal camera generally adopts a "norm up" technique: the AE information from when the camera was last closed is stored and applied directly to the exposure of the first frame the next time the camera is opened, so that in the same scene an image with ideal brightness can be obtained at the instant the camera opens the second time.
Alternatively, in fig. 2, the black area is the picture displayed by the camera of the terminal C, indicating that the current scene is a night scene, and the white area is a light-emitting screen, for example the light-emitting screen area of another terminal captured by the terminal C. The other terminal includes, but is not limited to, a mobile phone, a tablet, a parking-lot payment machine, or a vending machine. The terminal C can increase the weights of the exposure and focusing parameters in the central area of the screen, which alleviates the code scanning problem in high-dynamic scenes and to some extent speeds up code scanning in other scenes. However, when the two-dimensional code image is not in the central area of the screen, the code scanning success rate drops greatly.
It is easy to understand that the terminal may also obtain illumination information from the image entropy value: by comparing the image entropy with a preset entropy threshold, the image is divided into a region of interest and a region of non-interest; different weights are then assigned to the different regions according to their entropy, and finally exposure is performed accurately and automatically. However, because the two-dimensional code has no fixed position, the terminal needs to identify its position information, which lengthens the code scanning time, reduces code scanning efficiency, and degrades the user experience.
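The image entropy mentioned here is, in the usual formulation, the Shannon entropy of the grey-level histogram. A minimal computation, with made-up pixel data, looks like this:

```python
# Shannon entropy of an image's grey-level histogram; pixel data invented.

import math
from collections import Counter

def image_entropy(pixels):
    counts = Counter(pixels)
    total = len(pixels)
    # H = -sum(p * log2 p) over the observed grey levels
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

flat  = [128] * 16              # uniform region: entropy 0
mixed = [0, 64, 128, 255] * 4   # four equally likely levels: entropy 2
print(image_entropy(flat), image_entropy(mixed))  # → 0.0 2.0
```

A low-entropy region carries little texture and would receive a low weight under the scheme described above.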
The present disclosure is described in detail below with reference to specific examples.
In one embodiment, fig. 3 illustrates a flowchart of an image acquisition method provided by an embodiment of the present disclosure. The method may be implemented by a computer program and may run on a device with an image acquisition function. The computer program may be integrated in an application or may run as a stand-alone tool application.
The image acquisition device may be a terminal with an image acquisition function, including but not limited to: wearable devices, handheld devices, personal computers, tablet computers, vehicle-mounted devices, smart phones, computing devices, or other processing devices connected to a wireless modem. Terminals may be called different names in different networks, for example: a user equipment, an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent or user equipment, a cellular telephone, a cordless telephone, a personal digital assistant (PDA), a terminal in a fifth-generation mobile communication (5G) network, a fourth-generation mobile communication (4G) network, a third-generation mobile communication (3G) network, or a future evolution network, etc.
Specifically, the image acquisition method comprises the following steps:
s101, if the running state of the camera is a code scanning state, acquiring the image frame number corresponding to the camera;
in some embodiments, the camera is a video input device. The camera of the embodiment of the disclosure is used for acquiring a code scanning image when a terminal scans a code. The camera of the embodiments of the present disclosure does not refer specifically to a certain fixed camera. For example, when the type of camera changes, the camera also changes accordingly. Cameras include, but are not limited to, digital cameras, analog cameras, and the like.
According to some embodiments, the running state of the camera indicates what the camera captures, and different running states are used to acquire different images. For example, in the photographing state the camera captures an image, and no recognition is performed during shooting to obtain an identification code contained in the image. In the code scanning state, by contrast, it is not enough for the camera merely to capture an image: the captured image must also be recognized so that the identification code in it can be obtained.
It is easy to understand that the code scanning state is used to indicate the current state of the camera, and the camera is suitable for scanning codes, but not starting other functions of the camera.
Optionally, the image frame number refers to the frame number of the image acquired by the camera in one code scanning process. The number of image frames is not particularly limited to a certain fixed number of frames. For example, when the type of camera changes, the number of image frames may also change accordingly. For example, when the time point at which the image is acquired changes, the number of image frames may also change accordingly. The number of image frames may be set, for example, when the terminal leaves the factory, or may be set based on a frame number setting instruction of the user.
According to some embodiments, the image frame number refers to an effective frame number of the code scanning image collected by the terminal control camera, that is, the terminal can obtain the identification code information corresponding to the code scanning image based on the code scanning image corresponding to the image frame number.
In some embodiments, the terminal may be in an operational state of the camera when the terminal performs the image acquisition method. If the terminal determines that the running state of the camera is the code scanning state, the terminal can acquire the image frame number corresponding to the camera.
S102, acquiring an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number;
in some embodiments, exposure refers to the amount of light that passes through the lens and strikes the photosensitive element during shooting, controlled jointly by the aperture, shutter and sensitivity. The exposure value (EV) represents all aperture and shutter combinations of the camera that give the same exposure. The exposure initial value is the exposure value applied when the camera's code scanning function is started, that is, it indicates the exposure value corresponding to the camera's initial code scanning state. The exposure initial value is not a fixed value; it corresponds to the image frame number. For example, when the image frame number changes, the exposure initial value changes accordingly.
According to some embodiments, the convergence speed value is used to control the luminance difference between two adjacent frames of images. Wherein the larger the convergence speed value is, the larger the brightness difference between the adjacent two frames of images is. The smaller the convergence speed value, the smaller the luminance difference between the adjacent two frame images. The convergence speed value is not particularly limited to a fixed speed value, and corresponds to the number of frames of the image. For example, when the number of image frames changes, the convergence speed value also changes accordingly.
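One way to picture the convergence speed value is as a bound on the per-frame change in exposure value as AE converges toward a target; a larger bound means a larger brightness jump between adjacent frames. The model and all numbers below are illustrative, not taken from the disclosure.

```python
# Toy AE convergence: the EV moves from its initial value toward a target,
# with the convergence speed value bounding the change between frames.

def converge(initial_ev, target_ev, speed, frames):
    evs, ev = [initial_ev], initial_ev
    for _ in range(frames - 1):
        step = max(-speed, min(speed, target_ev - ev))  # clamp per-frame step
        ev += step
        evs.append(ev)
    return evs

print(converge(4.0, 6.0, 0.5, 6))  # → [4.0, 4.5, 5.0, 5.5, 6.0, 6.0]
```

With a larger speed value the same target is reached in fewer frames, at the cost of bigger brightness differences between adjacent frames.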
It is easy to understand that the terminal can be in an operational state of the camera when the terminal performs the image capturing method. If the terminal determines that the running state of the camera is the code scanning state, the terminal can acquire the image frame number corresponding to the camera. The terminal may acquire an exposure initial value corresponding to the number of image frames and a convergence speed value corresponding to the number of image frames.
S103, based on the exposure initial value and the convergence speed value, controlling the camera to acquire a scanning code image set corresponding to the image frame number.
According to some embodiments, a collection refers to a collection of concrete or abstract objects that have a certain specific property. The code scanning image refers to an image including an identification code in the image. The code scanning image is not particularly limited to a certain image. For example, when the identification code included in the image changes, the scan code image also changes accordingly. The identification code includes, but is not limited to, a two-dimensional code, a bar code, and the like.
In some embodiments, the set of scan code images is a set of multi-frame scan code images acquired for a certain identification code image. The code scanning image set is used for indicating the terminal to acquire the identification code corresponding to the code scanning image set. Namely, when the terminal acquires the code scanning image set, the terminal can identify the code scanning image included in the code scanning image set to acquire an identification code. The identification code includes, but is not limited to, a two-dimensional code, a bar code, and the like.
The identification code image is an image displayed by another terminal and is not an execution subject of the embodiment of the disclosure. The scan code image set does not refer to a fixed image set. For example, when the number of image frames included in the scan code image set changes, the scan code image set also changes accordingly. For example, when the identification code image corresponding to the scan code image set changes, the scan code image set also changes accordingly.
It is easy to understand that the terminal can be in an operational state of the camera when the terminal performs the image capturing method. If the terminal determines that the running state of the camera is the code scanning state, the terminal can acquire the image frame number corresponding to the camera. The terminal may acquire an exposure initial value corresponding to the number of image frames and a convergence speed value corresponding to the number of image frames. The terminal can control the camera to collect a code scanning image set corresponding to the image frame number based on the exposure initial value and the convergence speed value.
In one or more embodiments of the present disclosure, if the running state of the camera is a code scanning state, the image frame number corresponding to the camera is obtained, along with an exposure initial value and a convergence speed value corresponding to that frame number; based on these two values, the camera is controlled to collect a code scanning image set of that frame number. Because the code scanning image set is used to instruct the terminal to obtain the identification code corresponding to it, there is no need to locate the identification code within the image during scanning, which shortens the time spent obtaining its position information and improves code scanning efficiency. At the same time, controlling the camera with the exposure initial value and the convergence speed value reduces the probability that an over- or under-exposed image prevents the identification code from being recognized, improves the accuracy of image acquisition, raises the success rate of identification-code recognition, and improves the user experience.
Referring to fig. 4, fig. 4 is a schematic flow chart of an image acquisition method according to an embodiment of the disclosure. Specifically:
S201, if the running state of the camera is a code scanning state, acquiring the image frame number corresponding to the camera;
The specific process is as described above, and will not be described here again.
In some embodiments, the disclosed embodiments may be applied to high dynamic scenes. A high dynamic scene may be, for example, a two-dimensional code identification scene. Fig. 5 shows an exemplary schematic diagram of a scan code image acquired in a high dynamic scene according to an embodiment of the present disclosure. As shown in fig. 5, the two-dimensional code image in the high dynamic scene may, for example, undergo a process from a dark image to a clear image to an overexposed image. As shown in fig. 5, in this image acquisition process, the first frame is unclear and overexposure starts from the 6th frame, so the number of image frames acquired by the terminal is 4 frames, that is, the number of effective frames acquired by the terminal is 4 frames.
In some embodiments, fig. 6 shows an exemplary schematic diagram of a terminal interface provided by an embodiment of the present disclosure. As shown in fig. 6, terminal B may, for example, display a two-dimensional code image. The execution subject of the embodiments of the present disclosure may be, for example, terminal A. A two-dimensional bar code/two-dimensional code (2-dimensional bar code) is a graphic in which specific geometric figures are distributed on a plane (in two dimensions) according to a certain rule, alternating between black and white, to record data symbol information. Terminal A may, for example, receive a click command entered for the "scan code" function. If terminal A determines that the running state of the camera is the code scanning state, terminal A can control the camera to acquire the two-dimensional code image displayed by terminal B.
According to some embodiments, when the terminal obtains the image frame number corresponding to the camera, the terminal may obtain the camera module identifier corresponding to the camera. When the terminal obtains the camera module identifier corresponding to the camera, the terminal can obtain the image frame number corresponding to the camera module identifier. Obtaining the image frame number through the camera module identifier can improve the match between the image frame number and the camera and the accuracy of image frame number acquisition, and can thereby improve the accuracy of image acquisition.
It is easy to understand that the camera module identifier is used to uniquely identify cameras of the same type. One camera corresponds to only one camera module identifier, and one camera module identifier corresponds to one image frame number. Cameras with the same module identifier correspond to the same image frame number; that is, when several cameras include camera modules with the same identifier, the image frame numbers corresponding to those cameras are the same.
Optionally, the number of image frames corresponding to the Q camera module identifier may be, for example, 2 frames, the number corresponding to the W camera module identifier may be 3 frames, and the number corresponding to the E camera module identifier may be 4 frames. If the camera module identifier acquired by the terminal for the camera is, for example, the W camera module identifier, the number of image frames acquired by the terminal for the W camera module identifier may be 3 frames, that is, the number of image frames corresponding to the camera may be 3 frames.
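The module-identifier lookup described above amounts to a simple table lookup. A minimal sketch in Python, assuming the illustrative identifiers (Q, W, E) and frame counts from this example, none of which are fixed by the disclosure:

```python
# Hypothetical lookup table: camera module identifier -> image frame number.
# The identifiers and frame counts are the illustrative values from the
# example above, not values mandated by the disclosure.
MODULE_FRAME_NUMBERS = {
    "Q": 2,  # Q camera module identifier -> 2 frames
    "W": 3,  # W camera module identifier -> 3 frames
    "E": 4,  # E camera module identifier -> 4 frames
}

def frame_number_for_camera(module_id: str) -> int:
    """Return the image frame number associated with a camera module identifier."""
    return MODULE_FRAME_NUMBERS[module_id]
```

With the example values, looking up the W identifier yields 3 frames, matching the scenario described above.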
According to some embodiments, when the terminal acquires the image frame number corresponding to the camera, the terminal may acquire the currently running application program. The terminal may acquire the image frame number corresponding to the application program and determine that number as the image frame number corresponding to the camera. Different applications may correspond to different image frame numbers, for example because different applications have different requirements for image recognition. Because different application programs correspond to different image frame numbers, acquiring the image frame number based on the currently running application program can improve the accuracy of image frame number acquisition, and thus the accuracy of image acquisition.
It is to be readily understood that the number of image frames corresponding to the R application may be 2 frames, the number of image frames corresponding to the T application may be 3 frames, and the number of image frames corresponding to the Y application may be 4 frames, for example. The current application acquired by the terminal may be, for example, a T application. The number of frames of the image corresponding to the T application acquired by the terminal may be, for example, 3 frames, i.e., the number of frames of the image corresponding to the camera may be 3 frames.
According to some embodiments, when the terminal acquires the image frame number corresponding to the camera, the terminal may acquire the image frame number corresponding to each application in the application set, and acquire the image frame number set. The terminal may determine a maximum image frame number in the image frame number set as an image frame number corresponding to the camera. The terminal directly determines the maximum image frame number in the image frame number set as the image frame number corresponding to the camera, so that the requirements of a plurality of application programs on the image frame number can be met, the image frame number can be directly acquired, the acquisition time of the image frame number is reduced, and the convenience of image acquisition can be improved.
It is readily understood that the application set comprises at least one application. The application set does not refer to a particular fixed set of applications. For example, when the types of applications included in the application set change, the application set also changes accordingly; likewise, when the number of applications included in the application set changes, the application set also changes accordingly.
Optionally, the image frame number set includes image frame numbers corresponding to each application program. The set of image frames does not refer specifically to a fixed set of frames. For example, when the number of applications included in the application set changes, the number of image frames included in the image frame number set also changes, and the image frame number set also changes accordingly.
In some embodiments, when the terminal acquires the image frame number corresponding to each application program, the terminal may determine it based on the number of image frames with which each application program historically succeeded in acquiring the identification code from a collected scan code image.
In some embodiments, the application set may include, for example, an R application, a T application, and a Y application, where the number of image frames corresponding to the R application may be, for example, 2 frames, the number of image frames corresponding to the T application may be, for example, 3 frames, and the number of image frames corresponding to the Y application may be, for example, 4 frames. The set of image frames acquired by the terminal may include, for example, 2 frames, 3 frames, and 4 frames. The terminal may acquire the maximum image frame number in the image frame number set, for example, may be 4 frames, and the terminal may determine the 4 frames as the image frame number corresponding to the camera.
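Both application-based embodiments above — taking the frame number of the currently running application, or taking the maximum frame number over an application set — can be sketched as follows. The R/T/Y identifiers and frame counts are the illustrative values from the examples, not values fixed by the disclosure:

```python
# Hypothetical per-application image frame numbers (illustrative values from
# the examples above, not values fixed by the disclosure).
APP_FRAME_NUMBERS = {"R": 2, "T": 3, "Y": 4}

def frame_number_for_app(app: str) -> int:
    """Frame number determined by the currently running application."""
    return APP_FRAME_NUMBERS[app]

def frame_number_for_app_set(apps) -> int:
    """Frame number determined as the maximum over an application set,
    so that the requirements of every application in the set are met."""
    return max(APP_FRAME_NUMBERS[app] for app in apps)
```

With the example values, the current application T yields 3 frames, while the set {R, T, Y} yields the maximum, 4 frames.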
S202, acquiring an exposure initial value set and a convergence speed value set;
In some embodiments, the exposure initial value set refers to a set of correspondences between image frame numbers and exposure initial values. The exposure initial value set includes each image frame number and the exposure initial value corresponding to each image frame number. The exposure initial value set does not refer to a particular fixed correspondence. For example, when the terminal acquires a modification instruction for the exposure initial value set, the terminal may modify the exposure initial value set based on the modification instruction. Different image frame numbers may correspond to different exposure initial values. The terminal may, for example, acquire the image frame number and the exposure initial value corresponding to the image frame number in advance, and store them in association with each other.
According to some embodiments, the convergence speed value set refers to a set of correspondences between image frame numbers and convergence speed values. The convergence speed value set includes each image frame number and the convergence speed value corresponding to each image frame number. The convergence speed value set does not refer to a particular fixed correspondence. For example, when the terminal acquires a modification instruction for the convergence speed value set, the terminal may modify the convergence speed value set based on the modification instruction. Different image frame numbers may correspond to different convergence speed values. The terminal may, for example, acquire the image frame number and the convergence speed value corresponding to the image frame number in advance, and store them in association with each other. Fig. 7 shows an exemplary schematic diagram of a convergence speed value set provided by an embodiment of the disclosure. As shown in fig. 7, when the image frame number is 8 frames, the corresponding convergence speed value may be, for example, 0.6.
It is easy to understand that when the terminal acquires the number of image frames corresponding to the camera, the terminal may acquire the initial set of exposure values and the convergence speed value set.
S203, acquiring an exposure initial value corresponding to the image frame number based on the exposure initial value set, and acquiring a convergence speed value corresponding to the image frame number based on the convergence speed value set;
In some embodiments, when the terminal acquires the exposure initial value set between the image frame number and the exposure initial value, and the convergence speed value set between the image frame number and the convergence speed value, the terminal may acquire the exposure initial value corresponding to the image frame number based on the exposure initial value set, and acquire the convergence speed value corresponding to the image frame number based on the convergence speed value set. This is possible because the exposure initial value set includes the exposure initial value corresponding to each image frame number, and the convergence speed value set includes the convergence speed value corresponding to each image frame number.
It is easy to understand that the exposure initial value determines the definition of the first frame image collected by the camera under terminal control, and the convergence speed value determines the number of effective frames of the code scanning images collected by the camera under terminal control. Therefore, obtaining the exposure initial value corresponding to the image frame number based on the exposure initial value set, and the convergence speed value corresponding to the image frame number based on the convergence speed value set, can improve the accuracy of code scanning image acquisition.
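Step S203 can be sketched as two table lookups keyed by the image frame number. In the sketch below, only the mapping of 8 frames to a convergence speed value of 0.6 comes from the fig. 7 example; every other entry is a hypothetical placeholder:

```python
# Hypothetical correspondence sets. Only the entry (8 frames -> speed 0.6)
# is taken from the fig. 7 example; the other entries, and all exposure
# initial values, are made-up placeholders for illustration.
EXPOSURE_INITIAL_VALUES = {4: 120, 6: 100, 8: 80}    # frame number -> initial exposure
CONVERGENCE_SPEED_VALUES = {4: 0.8, 6: 0.7, 8: 0.6}  # frame number -> speed

def lookup_exposure_params(frame_number: int):
    """Return (exposure initial value, convergence speed value) for a frame number."""
    return (EXPOSURE_INITIAL_VALUES[frame_number],
            CONVERGENCE_SPEED_VALUES[frame_number])
```

For an image frame number of 8, this lookup returns the initial exposure together with the 0.6 convergence speed from the fig. 7 example.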
Optionally, when the terminal obtains the exposure initial value corresponding to the image frame number based on the exposure initial value set, the terminal may determine the exposure initial value directly based on the definition of the first frame image collected by the camera, for example. The exposure initial value is not particularly limited to a fixed initial value, and may be changed when the sharpness of the first frame image changes, for example.
It is easy to understand that, after the terminal determines the exposure initial value, the terminal can acquire the convergence speed value set between the image frame number and the convergence speed value through experimental data. The convergence speed value obtained by the terminal for the image frame number at which each application program scans codes successfully may be, for example, 0.65.
S204, based on the exposure initial value and the convergence speed value, controlling the camera to acquire a code scanning image set corresponding to the image frame number.
The specific process is as described above, and will not be described here again.
According to some embodiments, when the terminal controls the camera to collect the scan code image set corresponding to the image frame number based on the exposure initial value and the convergence speed value, the terminal may calculate the exposure parameter corresponding to the camera based on the exposure initial value and the convergence speed value. The terminal can then control the camera to collect the code scanning image set corresponding to the image frame number based on the exposure parameter. Because the terminal can continuously adjust the exposure parameter while acquiring the code scanning images, it can control the camera to collect different code scanning images, and the accuracy of code scanning image acquisition can therefore be improved.
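One common way to realize such a calculation is an exponential auto-exposure convergence loop, where each new exposure parameter moves a fraction (the convergence speed value) of the remaining distance from the current exposure toward a target exposure. The disclosure does not give its exact formula, so the update rule below is an assumption; a minimal sketch under that assumption:

```python
def next_exposure(current: float, target: float, speed: float) -> float:
    """Move the exposure parameter a fraction `speed` of the way toward `target`.

    Assumption: this exponential-convergence update stands in for the
    disclosure's unspecified exposure-parameter calculation.
    """
    return current + speed * (target - current)

def exposure_schedule(initial: float, target: float, speed: float, frames: int):
    """Exposure parameter applied to each of `frames` captured images,
    starting from the exposure initial value."""
    exposures, current = [], initial
    for _ in range(frames):
        exposures.append(current)
        current = next_exposure(current, target, speed)
    return exposures
```

A larger convergence speed value reaches the target exposure in fewer frames, which is consistent with the statement above that the convergence speed value determines the number of effective frames collected.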
According to some embodiments, when the terminal controls the camera to collect the code scanning image set corresponding to the image frame number, the terminal may control the camera to collect a first code scanning image. When the terminal acquires the first code scanning image, it can detect whether the first code scanning image meets an image condition. If the first code scanning image meets the image condition, the terminal can control the camera to acquire second code scanning images of a preset frame number, wherein the preset frame number is one frame smaller than the image frame number. If the second code scanning images of the preset frame number are obtained, the terminal can add the first code scanning image and the second code scanning images of the preset frame number to the code scanning image set. Because the terminal only acquires the second code scanning images of the preset frame number after a first code scanning image meets the image condition, the probability of acquiring code scanning images that cannot be identified is reduced, and the accuracy of code scanning image acquisition can be improved.
In some embodiments, the first code scanning image refers to a frame, acquired by the camera under terminal control, that meets the image condition. The frame number corresponding to the first code scanning image is one frame, and each second code scanning image likewise corresponds to one frame. The first code scanning image can be the first frame code scanning image acquired by the camera under terminal control when the running state of the camera is the code scanning state, or it can be the second frame code scanning image. If the first frame code scanning image acquired by the camera meets the image condition, the first code scanning image can be the first frame code scanning image; if instead the second frame code scanning image acquired by the camera meets the image condition, the first code scanning image can be the second frame code scanning image.
It is easy to understand that the image condition refers to a condition for judging whether the code scanning image satisfies the image recognition requirement. The image condition is not particularly limited to a certain fixed condition. The image conditions include, but are not limited to, sharpness conditions, brightness conditions, and the like.
Optionally, the preset frame number refers to a frame number corresponding to the second scan code image. The preset frame number is a frame number that is one frame smaller than the image frame number. The preset number of frames is not particularly limited to a certain fixed number of frames. For example, when the number of image frames changes, the preset number of frames also changes.
It is easy to understand that the number of image frames may be 6 frames, for example. The preset number of frames may be, for example, 5 frames. When the terminal determines that the running state of the camera is the code scanning state, the terminal controls the first frame code scanning image acquired by the camera to meet the image condition, the terminal can continuously acquire 5 frames of images and adds the first frame code scanning image and the 5 frames of images to the code scanning image set.
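The collection procedure described above — capture until one frame meets the image condition, then capture the remaining preset frame number — can be sketched as follows. The `capture` and `meets_condition` callables are assumptions standing in for the camera driver and the (sharpness/brightness) image condition, neither of which the disclosure fixes:

```python
def collect_scan_images(capture, meets_condition, frame_number: int):
    """Collect a code scanning image set of `frame_number` frames.

    Assumptions: `capture()` returns one frame per call and
    `meets_condition(image)` implements the image condition; both are
    placeholders for terminal/camera facilities not specified here.
    """
    # Capture frames until one meets the image condition (the "first
    # code scanning image").
    while True:
        first = capture()
        if meets_condition(first):
            break
    image_set = [first]
    # The preset frame number is one frame smaller than the image frame
    # number; collect that many "second code scanning images".
    preset = frame_number - 1
    image_set.extend(capture() for _ in range(preset))
    return image_set
```

With an image frame number of 6, as in the example above, one qualifying first image plus 5 further frames yields a 6-frame code scanning image set.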
In some embodiments, fig. 8 shows an exemplary schematic diagram of a scan code image provided by an embodiment of the disclosure. As shown in fig. 8, an image acquired by a terminal without adopting the method provided by the embodiments of the present disclosure may be, for example, an overexposed image, and the terminal cannot acquire the identification code corresponding to the image based on that image. Fig. 9 shows an exemplary schematic diagram of a scan code image provided by an embodiment of the present disclosure. As shown in fig. 9, an image acquired by the terminal using the method provided by the embodiments of the present disclosure may be, for example, an image that is not overexposed, and the terminal can acquire the identification code corresponding to the image based on that image.
Optionally, when the terminal obtains the code scanning image set, the terminal can obtain the identification code corresponding to the code scanning image set based on the code scanning image set, so that accuracy of the identification code obtaining can be improved, convenience of the identification code obtaining is improved, and use experience of a user is improved.
In one or more embodiments of the present disclosure, if the operation state of the camera is a code scanning state, the image frame number corresponding to the camera is obtained, and the determination of the operation state of the camera can improve accuracy of obtaining the image frame number and accuracy of obtaining a code scanning image set. And secondly, acquiring an exposure initial value set and a convergence speed value set, wherein the exposure initial value corresponding to the image frame number can be acquired based on the exposure initial value set, the convergence speed value corresponding to the image frame number can be acquired based on the convergence speed value set, the accuracy of acquiring the exposure initial value and the convergence speed value can be improved, and the accuracy of acquiring the code scanning image set can be improved. Finally, based on the exposure initial value and the convergence speed value, the camera is controlled to acquire the scanning code image set corresponding to the image frame number, so that the probability that the acquired exposure image cannot identify the identification code can be reduced, the accuracy of image acquisition can be improved, the probability of successful identification of the identification code can be improved, and the use experience of a user can be further improved.
The following are device embodiments of the present disclosure that may be used to perform method embodiments of the present disclosure. For details not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the method of the present disclosure.
Referring to fig. 10, a schematic structural diagram of a first image capturing device according to an exemplary embodiment of the present disclosure is shown. The image acquisition device may be implemented as all or part of the device by software, hardware or a combination of both. The image acquisition apparatus 1000 includes a frame number acquisition unit 1001, a speed value acquisition unit 1002, and an image acquisition unit 1003, wherein:
a frame number obtaining unit 1001, configured to obtain an image frame number corresponding to the camera if the running state of the camera is a code scanning state;
a speed value acquisition unit 1002 for acquiring an exposure initial value corresponding to the number of image frames and a convergence speed value corresponding to the number of image frames;
an image acquisition unit 1003, configured to control the camera to acquire a scan code image set corresponding to the image frame number based on the exposure initial value and the convergence speed value; the code scanning image set is used for indicating the terminal to acquire the identification code corresponding to the code scanning image set.
Fig. 11 illustrates a schematic structural diagram of an image capturing device according to an embodiment of the present disclosure. As shown in fig. 11, the frame number acquisition unit 1001 includes an identification acquisition subunit 1011 and a frame number acquisition subunit 1021, and the frame number acquisition unit 1001 is configured to, when acquiring the number of image frames corresponding to the camera:
The identifier obtaining subunit 1011 is configured to obtain a camera module identifier corresponding to the camera;
and a frame number acquisition subunit 1021, configured to acquire an image frame number corresponding to the camera module identifier.
Fig. 12 illustrates a schematic structural diagram of an image capturing device according to an embodiment of the present disclosure. As shown in fig. 12, the frame number acquisition unit 1001 includes a program acquisition subunit 1031 and a frame number acquisition subunit 1021, and the frame number acquisition unit 1001 is configured to, when acquiring the number of image frames corresponding to the camera:
a program acquisition subunit 1031, configured to acquire a currently running application program;
a frame number acquisition subunit 1021 for acquiring the number of image frames corresponding to the application program, and determining the number of image frames as the number of image frames corresponding to the camera.
According to some embodiments, the frame number obtaining unit 1001 is configured to, when obtaining the frame number of the image corresponding to the camera, specifically:
acquiring image frame numbers corresponding to all application programs in an application program set, and acquiring an image frame number set;
and determining the maximum image frame number in the image frame number set as the image frame number corresponding to the camera.
According to some embodiments, the speed value obtaining unit 1002 is configured to, when obtaining an exposure initial value corresponding to an image frame number and a convergence speed value corresponding to the image frame number, specifically:
Acquiring an exposure initial value set and a convergence speed value set, wherein the exposure initial value set comprises each image frame number and an exposure initial value corresponding to each image frame number, and the convergence speed value set comprises each image frame number and a convergence speed value corresponding to each image frame number;
based on the exposure initial value set and the convergence speed value set, an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number are acquired.
According to some embodiments, the image acquisition unit 1003 is configured to, based on the exposure initial value and the convergence speed value, control the camera to acquire the scan code image set corresponding to the image frame number, specifically configured to:
calculating an exposure parameter corresponding to the camera based on the exposure initial value and the convergence speed value;
based on the exposure parameters, the camera is controlled to acquire a code scanning image set corresponding to the image frame number.
According to some embodiments, the image acquisition unit 1003 is configured to, when controlling the camera to acquire the scan code image set of the image frame number, specifically:
controlling a camera to acquire a first code scanning image;
if the first code scanning image meets the image condition, controlling the camera to acquire a second code scanning image with a preset frame number, wherein the preset frame number is one frame smaller than the image frame number;
If the second code scanning image with the preset frame number is obtained, adding the first code scanning image and the second code scanning image with the preset frame number to the code scanning image set.
It should be noted that, when the image capturing apparatus provided in the foregoing embodiment performs the image capturing method, only the division of the foregoing functional modules is used as an example, in practical application, the foregoing functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the image capturing device and the image capturing method provided in the foregoing embodiments belong to the same concept, which embody the detailed implementation process in the method embodiment, and are not described herein again.
The foregoing embodiment numbers of the present disclosure are merely for description and do not represent advantages or disadvantages of the embodiments.
In one or more embodiments of the present disclosure, if the running state of the camera is a code scanning state, the frame number acquiring unit may acquire the image frame number corresponding to the camera, the speed value acquiring unit may acquire the exposure initial value corresponding to the image frame number and the convergence speed value corresponding to the image frame number, and the image acquiring unit may control the camera to collect the code scanning image set corresponding to the image frame number based on the exposure initial value and the convergence speed value. Because the code scanning image set is used to instruct the terminal to acquire the identification code corresponding to the code scanning image set, the position information of the identification code in the code scanning image does not need to be acquired during code scanning, which reduces the time spent acquiring that position information and improves code scanning efficiency. Meanwhile, controlling the camera to collect the code scanning image set based on the exposure initial value and the convergence speed value reduces the probability of collecting an overexposed image from which the identification code cannot be identified, improves the accuracy of image acquisition, raises the probability that the identification code is successfully identified, and thereby improves the user experience.
The embodiments of the present disclosure further provide a computer storage medium, where a plurality of instructions may be stored, where the instructions are adapted to be loaded by a processor and executed by the processor to perform the image capturing method according to the embodiments shown in fig. 3 to fig. 9, and the specific execution process may refer to the specific description of the embodiments shown in fig. 3 to fig. 9, which is not repeated herein.
The disclosure further provides a computer program product, where at least one instruction is stored, where the at least one instruction is loaded by the processor and executed by the processor to perform the image capturing method according to the embodiment shown in fig. 3 to fig. 9, and the specific execution process may refer to the specific description of the embodiment shown in fig. 3 to fig. 9, which is not repeated herein.
Referring to fig. 13, a block diagram illustrating a structure of a terminal according to an exemplary embodiment of the present disclosure is shown. A terminal in the present disclosure may include one or more of the following: processor 110, memory 120, input device 130, output device 140, and bus 150. The processor 110, the memory 120, the input device 130, and the output device 140 may be connected by a bus 150. The processor loads and executes the image acquisition method according to the embodiment shown in fig. 3 to fig. 9, and the specific execution process may refer to the specific description of the embodiment shown in fig. 3 to fig. 9, which is not described herein.
Processor 110 may include one or more processing cores. The processor 110 connects various parts within the overall terminal using various interfaces and lines, and performs various functions of the terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and invoking data stored in the memory 120. Alternatively, the processor 110 may be implemented in at least one hardware form of digital signal processing (digital signal processing, DSP), field-programmable gate array (field-programmable gate array, FPGA), or programmable logic array (programmable logic array, PLA). The processor 110 may integrate one or a combination of a central processing unit (central processing unit, CPU), a graphics processor (graphics processing unit, GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs and the like; the GPU is responsible for rendering and drawing display content; and the modem is used to handle wireless communications. It will be appreciated that the modem may alternatively not be integrated into the processor 110 and may be implemented by a single communication chip.
The memory 120 may include a random access memory (random Access Memory, RAM) or a read-only memory (ROM). Optionally, the memory 120 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). Memory 120 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 120 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, which may be an Android (Android) system, including an Android system-based deep development system, an IOS system developed by apple corporation, including an IOS system-based deep development system, or other systems, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing various method embodiments described below, and the like. The storage data area may also store data created by the terminal in use, such as phonebooks, audio-video data, chat-record data, etc.
Referring to FIG. 14, the memory 120 may be divided into an operating system space in which the operating system is running and a user space in which native and third party applications are running. In order to ensure that different third party application programs can achieve better operation effects, the operating system allocates corresponding system resources for the different third party application programs. However, the requirements of different application scenarios in the same third party application program on system resources are different, for example, under the local resource loading scenario, the third party application program has higher requirement on the disk reading speed; in the animation rendering scene, the third party application program has higher requirements on the GPU performance. The operating system and the third party application program are mutually independent, and the operating system often cannot timely sense the current application scene of the third party application program, so that the operating system cannot perform targeted system resource adaptation according to the specific application scene of the third party application program.
To enable the operating system to distinguish the specific application scenarios of a third-party application, a data communication channel between the third-party application and the operating system needs to be opened, so that the operating system can obtain the application's current scenario information at any time and perform targeted system resource adaptation based on the current scenario.
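The scene-reporting channel described above can be sketched as follows. All class and method names here are hypothetical illustrations for this disclosure's idea of scenario-aware resource adaptation; they are not APIs of Android, IOS, or any real operating system:

```python
# Hypothetical sketch: a third-party application reports its current scenario
# to the operating system, which then prioritizes the matching system resource.
# Names and the scenario-to-resource mapping are illustrative assumptions.

class OperatingSystem:
    """Receives scenario reports and adapts system resources per scenario."""

    # Illustrative mapping from application scenario to the resource to prioritize,
    # mirroring the two examples in the description above.
    SCENE_RESOURCES = {
        "local_resource_loading": "disk_read_speed",
        "animation_rendering": "gpu_performance",
    }

    def __init__(self):
        self.current_scene = None
        self.prioritized_resource = None

    def report_scene(self, scene: str) -> None:
        """Called through the data communication channel so the OS can
        sense the application's current scenario at any time."""
        self.current_scene = scene
        self.prioritized_resource = self.SCENE_RESOURCES.get(scene)


class ThirdPartyApp:
    def __init__(self, os_channel: OperatingSystem):
        self.os_channel = os_channel

    def enter_scene(self, scene: str) -> None:
        # The application notifies the OS whenever its scenario changes.
        self.os_channel.report_scene(scene)


os_side = OperatingSystem()
app = ThirdPartyApp(os_side)
app.enter_scene("animation_rendering")
print(os_side.prioritized_resource)  # gpu_performance
```

In this sketch the adaptation is a simple table lookup; a real system would translate the reported scenario into scheduler, I/O, or GPU policy changes.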
Taking the Android system as an example, as shown in FIG. 15, the programs and data stored in the memory 120 are organized into a Linux kernel layer 320, a system runtime library layer 340, an application framework layer 360, and an application layer 380. The Linux kernel layer 320, the system runtime library layer 340, and the application framework layer 360 belong to the operating system space, while the application layer 380 belongs to the user space. The Linux kernel layer 320 provides the underlying drivers for the various hardware components of the terminal, such as the display driver, audio driver, camera driver, Bluetooth driver, Wi-Fi driver, and power management. The system runtime library layer 340 provides the main feature support for the Android system through a set of C/C++ libraries: for example, the SQLite library provides database support, the OpenGL/ES library provides 3D graphics support, and the Webkit library provides browser kernel support. The system runtime library layer 340 also provides the Android runtime library (Android Runtime), which mainly supplies the core libraries that allow developers to write Android applications in the Java language. The application framework layer 360 provides the various APIs that may be used in building applications, with which developers can build their own applications, such as activity management, window management, view management, notification management, content providers, package management, call management, resource management, and location management.
At least one application runs in the application layer 380. These applications may be native applications of the operating system, such as a contacts program, a short message program, a clock program, or a camera application; they may also be third-party applications developed by third-party developers, such as a game application, an instant messaging program, a photo beautification program, or an image acquisition program.
Taking the IOS system as an example, the programs and data stored in the memory 120 are shown in FIG. 16. The IOS system includes: a core operating system layer 420 (Core OS layer), a core services layer 440 (Core Services layer), a media layer 460 (Media layer), and a touchable layer 480 (Cocoa Touch layer). The core operating system layer 420 includes the operating system kernel, drivers, and underlying program frameworks, which provide functionality closer to the hardware for use by the program frameworks in the core services layer 440. The core services layer 440 provides the system services and/or program frameworks required by applications, such as the Foundation framework, an account framework, an advertisement framework, a data storage framework, an image acquisition framework, a geographic location framework, and a motion framework. The media layer 460 provides interfaces for applications related to audio and video, such as graphics and image interfaces, audio technology interfaces, video technology interfaces, and the audio-video wireless transmission (AirPlay) interface. The touchable layer 480 provides various commonly used interface-related frameworks for application development and is responsible for the user's touch interactions on the terminal, such as a local notification service, a remote push service, an advertisement framework, a game tool framework, a message user interface (UI) framework, a UIKit framework, and a map framework.
Among the frameworks shown in FIG. 16, the frameworks related to most applications include, but are not limited to: the Foundation framework in the core services layer 440 and the UIKit framework in the touchable layer 480. The Foundation framework provides many basic object classes and data types and supplies the most basic system services for all applications, independent of the UI. The classes provided by the UIKit framework form a basic UI class library for creating touch-based user interfaces; IOS applications can provide their UIs based on the UIKit framework, which therefore provides the infrastructure for applications to build user interfaces, draw, handle user interaction events, respond to gestures, and so on.
The manner and principle of implementing data communication between a third-party application and the operating system in the IOS system may refer to the Android system and are not repeated here.
The input device 130 is configured to receive input instructions or data; it includes, but is not limited to, a keyboard, a mouse, a camera, a microphone, or a touch device. The output device 140 is configured to output instructions or data; it includes, but is not limited to, a display device, a speaker, and the like. In one example, the input device 130 and the output device 140 may be combined into a touch display screen, which receives touch operations performed on or near it by a user with a finger, a stylus, or any other suitable object, and displays the user interface of each application. The touch display screen is typically provided on the front panel of the terminal. It may be designed as a full screen, a curved screen, or a shaped screen, or as a combination of a full screen and a curved screen or of a shaped screen and a curved screen, which the embodiments of the present disclosure do not limit.
In addition, those skilled in the art will appreciate that the terminal configurations illustrated in the above figures do not limit the terminal; the terminal may include more or fewer components than illustrated, combine certain components, or arrange components differently. For example, the terminal may further include components such as a radio frequency circuit, an input unit, a sensor, an audio circuit, a wireless fidelity (WiFi) module, a power supply, and a Bluetooth module, which are not described here.
In the embodiments of the present disclosure, the execution subject of each step may be the terminal described above. Optionally, the execution subject of each step is the operating system of the terminal, which may be an Android system, an IOS system, or another operating system; the embodiments of the present disclosure do not limit this.
The terminal of the embodiments of the present disclosure may further be equipped with a display device, which may be any device capable of implementing a display function, for example: a cathode ray tube display (cathode ray tube display, CRT), a light-emitting diode display (light-emitting diode display, LED), an electronic ink screen, a liquid crystal display (liquid crystal display, LCD), a plasma display panel (plasma display panel, PDP), and the like. A user may view displayed text, images, video, and other information using the display device on the terminal 100. The terminal may be a smart phone, a tablet computer, a gaming device, an AR (Augmented Reality) device, an automobile, a data storage device, an audio playing device, a video playing device, a notebook computer, a desktop computing device, or a wearable device such as an electronic watch, electronic glasses, an electronic helmet, an electronic bracelet, an electronic necklace, or electronic clothing.
Those skilled in the art will readily appreciate that the techniques of the present disclosure may be implemented by software and/or hardware. "Unit" and "module" in this specification refer to software and/or hardware capable of performing a specific function, either alone or in combination with other components, such as field-programmable gate arrays (Field-Programmable Gate Array, FPGA) and integrated circuits (Integrated Circuit, IC).
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of acts. Those skilled in the art should understand, however, that the present disclosure is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present disclosure. Those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules involved are not necessarily required by the present disclosure.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of the units is merely a logical function division, and other divisions are possible in an actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Further, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through service interfaces, devices, or units, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such an understanding, the technical solution of the present disclosure, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present disclosure. The aforementioned memory includes: a USB flash drive, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be performed by hardware under the instruction of a program, which may be stored in a computer-readable memory, which may include: a flash disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, and the like.
The foregoing is merely exemplary embodiments of the present disclosure and is not intended to limit its scope; equivalent changes and modifications made in accordance with the teachings of this disclosure fall within its scope. Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the claims.

Claims (17)

1. An image acquisition method, comprising:
if the running state of the camera is a code scanning state, acquiring an image frame number corresponding to the camera;
acquiring an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number;
controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure initial value and the convergence speed value; the code scanning image set is used for indicating the terminal to acquire the identification code corresponding to the code scanning image set.
2. The method of claim 1, wherein the acquiring the number of image frames corresponding to the camera comprises:
acquiring a camera module identifier corresponding to the camera;
and acquiring the image frame number corresponding to the camera module identification.
3. The method of claim 1, wherein the acquiring the number of image frames corresponding to the camera comprises:
acquiring a currently running application program;
and acquiring the image frame number corresponding to the application program, and determining the image frame number as the image frame number corresponding to the camera.
4. The method of claim 1, wherein the acquiring the number of image frames corresponding to the camera comprises:
acquiring image frame numbers corresponding to all application programs in an application program set, to obtain an image frame number set;
and determining the maximum image frame number in the image frame number set as the image frame number corresponding to the camera.
5. The method according to claim 1, wherein the acquiring an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number comprises:
acquiring an exposure initial value set and a convergence speed value set, wherein the exposure initial value set comprises each image frame number and an exposure initial value corresponding to the image frame number, and the convergence speed value set comprises each image frame number and a convergence speed value corresponding to the image frame number;
and acquiring an exposure initial value corresponding to the image frame number based on the exposure initial value set, and acquiring a convergence speed value corresponding to the image frame number based on the convergence speed value set.
6. The method of claim 1, wherein the controlling the camera to acquire the set of scan images corresponding to the number of image frames based on the exposure initial value and the convergence speed value comprises:
calculating an exposure parameter corresponding to the camera based on the exposure initial value and the convergence speed value;
and controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure parameters.
7. The method of claim 1, wherein the controlling the camera to acquire the set of scan images of the number of image frames comprises:
controlling the camera to acquire a first code scanning image;
if the first code scanning image meets the image condition, controlling the camera to acquire a second code scanning image with a preset frame number, wherein the preset frame number is one frame smaller than the image frame number;
and if the second code scanning image with the preset frame number is obtained, adding the first code scanning image and the second code scanning image with the preset frame number to a code scanning image set.
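The method of claims 1 and 5 to 7 can be sketched as follows. The per-frame-number calibration tables, the image condition, and the exponential exposure update rule are all illustrative assumptions: the claims state that an exposure parameter is computed from the exposure initial value and the convergence speed value but do not fix concrete values or a concrete formula.

```python
# Minimal sketch of the claimed image acquisition flow. All numeric values
# and the exposure update rule are assumptions for illustration only.

# Assumed calibration tables keyed by image frame number (claim 5: an exposure
# initial value set and a convergence speed value set).
EXPOSURE_INITIAL = {3: 0.40, 5: 0.25}
CONVERGENCE_SPEED = {3: 0.8, 5: 0.5}

def exposure_parameters(frames, initial, speed, target=1.0):
    """Claim 6: derive per-frame exposure parameters from the exposure
    initial value and the convergence speed value. Modeled here as an
    exponential approach toward a target exposure (an assumed formula)."""
    params, exposure = [], initial
    for _ in range(frames):
        params.append(exposure)
        exposure += speed * (target - exposure)  # converge toward target
    return params

def acquire_scan_code_images(camera_state, frames, capture, image_ok):
    """Claims 1 and 7: if the camera is in the code scanning state, acquire a
    first code scanning image, check the image condition, then collect the
    remaining frames (the preset frame number, one fewer than the image
    frame number). `capture` and `image_ok` stand in for camera hardware."""
    if camera_state != "scan_code":
        return []
    initial = EXPOSURE_INITIAL[frames]
    speed = CONVERGENCE_SPEED[frames]
    params = exposure_parameters(frames, initial, speed)

    first = capture(params[0])
    if not image_ok(first):
        return []
    images = [first]
    for p in params[1:]:  # preset frame number = frames - 1
        images.append(capture(p))
    return images
```

With `frames = 3`, `initial = 0.40`, and `speed = 0.8`, the sketch yields exposure parameters 0.40, 0.88, 0.976, illustrating how a larger convergence speed value drives the exposure toward its target within the frame budget.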
8. An image acquisition device, comprising:
the frame number acquisition unit is used for acquiring the image frame number corresponding to the camera if the running state of the camera is a code scanning state;
a speed value acquisition unit configured to acquire an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number;
the image acquisition unit is used for controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure initial value and the convergence speed value; the code scanning image set is used for indicating the terminal to acquire the identification code corresponding to the code scanning image set.
9. The apparatus of claim 8, wherein the frame number acquisition unit comprises an identification acquisition subunit and a frame number acquisition subunit, the frame number acquisition unit being configured to, when acquiring the number of image frames corresponding to the camera:
the identification acquisition subunit is used for acquiring the camera module identification corresponding to the camera;
the frame number acquisition subunit is used for acquiring the image frame number corresponding to the camera module identification.
10. The apparatus according to claim 8, wherein the frame number acquisition unit includes a program acquisition subunit and a frame number acquisition subunit, the frame number acquisition unit being configured to, when acquiring the number of image frames corresponding to the camera:
the program acquisition subunit is used for acquiring the currently running application program;
the frame number acquisition subunit is configured to acquire an image frame number corresponding to the application program, and determine the image frame number as an image frame number corresponding to the camera.
11. The apparatus according to claim 8, wherein the frame number obtaining unit is configured to, when obtaining the number of image frames corresponding to the camera, specifically:
acquiring image frame numbers corresponding to all application programs in an application program set, and acquiring an image frame number set;
and determining the maximum image frame number in the image frame number set as the image frame number corresponding to the camera.
12. The apparatus according to claim 8, wherein the speed value obtaining unit is configured to, when obtaining an exposure initial value corresponding to the image frame number and a convergence speed value corresponding to the image frame number, specifically:
acquiring an exposure initial value set and a convergence speed value set, wherein the exposure initial value set comprises each image frame number and an exposure initial value corresponding to the image frame number, and the convergence speed value set comprises each image frame number and a convergence speed value corresponding to the image frame number;
and acquiring an exposure initial value corresponding to the image frame number based on the exposure initial value set, and acquiring a convergence speed value corresponding to the image frame number based on the convergence speed value set.
13. The device according to claim 8, wherein the image acquisition unit is configured to, based on the exposure initial value and the convergence speed value, control the camera to acquire a scan code image set corresponding to the image frame number, specifically configured to:
calculating an exposure parameter corresponding to the camera based on the exposure initial value and the convergence speed value;
and controlling the camera to acquire a code scanning image set corresponding to the image frame number based on the exposure parameters.
14. The device according to claim 8, wherein the image acquisition unit is configured to, when controlling the camera to acquire the scan code image set corresponding to the image frame number, specifically:
controlling the camera to acquire a first code scanning image;
if the first code scanning image meets the image condition, controlling the camera to acquire a second code scanning image with a preset frame number, wherein the preset frame number is one frame smaller than the image frame number;
and if the second code scanning image with the preset frame number is obtained, adding the first code scanning image and the second code scanning image with the preset frame number to a code scanning image set.
15. A terminal, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-7.
17. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any of claims 1-7.
CN202111302409.5A 2021-11-04 2021-11-04 Image acquisition method, device, terminal and storage medium Pending CN116095498A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111302409.5A CN116095498A (en) 2021-11-04 2021-11-04 Image acquisition method, device, terminal and storage medium


Publications (1)

Publication Number Publication Date
CN116095498A (en) 2023-05-09

Family

ID=86208767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111302409.5A Pending CN116095498A (en) 2021-11-04 2021-11-04 Image acquisition method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN116095498A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination