CN115278097A - Image generation method, image generation device, electronic device, and medium - Google Patents

Image generation method, image generation device, electronic device, and medium Download PDF

Info

Publication number
CN115278097A
CN115278097A
Authority
CN
China
Prior art keywords
image
exposure area
image data
exposure
photosensitive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210731467.8A
Other languages
Chinese (zh)
Inventor
周新泽
裴珺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210731467.8A priority Critical patent/CN115278097A/en
Publication of CN115278097A publication Critical patent/CN115278097A/en
Pending legal-status Critical Current

Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses an image generation method, an image generation device, electronic equipment and a medium, and belongs to the technical field of shooting. Wherein the image generation device comprises an image sensor, the method comprising: under the condition of controlling the image sensor to carry out exposure, acquiring a first exposure area of the image sensor, wherein the first exposure area comprises a photosensitive area corresponding to a moving object; exposing the photosensitive unit corresponding to the first exposure area to obtain first image data; exposing a photosensitive unit corresponding to a second exposure area of the image sensor under the condition that the first exposure area is exposed to obtain second image data; a target image is generated from the first image data and the second image data.

Description

Image generation method, image generation device, electronic device, and medium
Technical Field
The application belongs to the technical field of camera shooting, and particularly relates to an image generation method and device, electronic equipment and a medium.
Background
In general, when a user photographs a subject with an electronic device, the electronic device controls its rolling shutter to expose the rows of photosensitive units of the image sensor line by line, so that the photosensitive units corresponding to the subject are exposed and a captured image of the subject is obtained.
However, the subject may be moving. Because the electronic device exposes the photosensitive units corresponding to the moving subject line by line, the total time spent exposing those units is long, and the image region where the moving subject appears in the captured image may therefore be deformed.
As a result, the shooting effect of the electronic device is poor.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image generation method, an apparatus, an electronic device, and a medium, which can reduce deformation of an image area where a moving object is located, and improve a shooting effect of the electronic device.
In a first aspect, an embodiment of the present application provides an image generation method, which is applied to an image generation apparatus including an image sensor, and includes: under the condition of controlling an image sensor to perform exposure, acquiring a first exposure area of the image sensor, wherein the first exposure area comprises a photosensitive area corresponding to a moving object; exposing the photosensitive unit corresponding to the first exposure area to obtain first image data; exposing a photosensitive unit corresponding to a second exposure area of the image sensor under the condition that the first exposure area is exposed to obtain second image data; and generating a target image according to the first image data and the second image data.
In a second aspect, an embodiment of the present application provides an image generation apparatus, including an image sensor, the image generation apparatus further including: the device comprises an acquisition module, an exposure module and a processing module. The acquisition module is used for acquiring a first exposure area of the image sensor under the condition of controlling the image sensor to perform exposure, wherein the first exposure area comprises a photosensitive area corresponding to a moving object. The exposure module is used for exposing the photosensitive unit corresponding to the first exposure area acquired by the acquisition module to obtain first image data; and exposing the photosensitive unit corresponding to a second exposure area of the image sensor under the condition that the first exposure area is exposed to obtain second image data, wherein the second exposure area comprises the exposure area except the first exposure area in the image sensor. And the processing module is used for generating a target image according to the first image data and the second image data obtained by the exposure module.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored in a storage medium, which is executed by at least one processor to implement the steps of the method according to the first aspect.
In this embodiment of the present application, when the image generation device controls its image sensor to perform exposure, the device may first obtain a first exposure region of the image sensor (that is, the exposure region corresponding to a moving object), expose the photosensitive units corresponding to the first exposure region to obtain first image data, and, once the first exposure region has finished exposing, expose the photosensitive units corresponding to a second exposure region of the image sensor (that is, the exposure region of the image sensor other than the first exposure region) to obtain second image data. The image generation device may then generate a target image from the first image data and the second image data. Because the device first acquires the first exposure region corresponding to the moving object, directly exposes the photosensitive units of that region, and exposes the photosensitive units of the second exposure region only after the first exposure region has finished exposing, it exposes the moving object's photosensitive units first rather than exposing all exposure regions of the image sensor line by line. This shortens the time needed to expose the photosensitive units corresponding to the moving object, reduces deformation of the image region where the moving object is located, and improves the shooting effect of the image generation device.
Drawings
Fig. 1 is a schematic flowchart of an image generation method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a position relationship of a photosensitive unit corresponding to an exposure region according to an embodiment of the present application;
fig. 3 is a second schematic flowchart of an image generation method according to an embodiment of the present application;
FIG. 4 is a schematic circuit diagram of a sensor pixel according to an embodiment of the present application;
fig. 5 is a third schematic flowchart of an image generation method according to an embodiment of the present application;
FIG. 6 is a fourth flowchart of an image generation method according to an embodiment of the present disclosure;
FIG. 7 is a second schematic diagram illustrating a position relationship of the photosensitive units corresponding to the exposure regions according to the embodiment of the present application;
fig. 8 is a schematic structural diagram of an image generation apparatus provided in an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 10 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and do not necessarily describe a particular order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in orders other than those illustrated or described herein. Objects distinguished by "first", "second", etc. are usually of one type, and the number of objects is not limited; for example, a first object may be one or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the objects before and after it are in an "or" relationship.
The image generation method, the image generation device, the electronic device, and the image generation medium according to the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 shows a flowchart of an image generation method provided in an embodiment of the present application, which is applied to an image generation apparatus including an image sensor. As shown in fig. 1, the image generation method provided in the embodiment of the present application may include steps 101 to 104 described below.
Step 101, in the case that the image generation device controls the image sensor to perform exposure, the image generation device acquires a first exposure area of the image sensor.
In the embodiment of the application, a rolling shutter can be arranged in the image generation device.
Optionally, in this embodiment of the application, when the user triggers the image generation device to start a camera application, the image generation device may display a preview interface of the camera application, start a camera of the image generation device, and control the rolling shutter to begin exposing the image sensor.
In an embodiment of the present application, the first exposure region includes a photosensitive region corresponding to a moving object.
The moving object may include an animal, a human, and the like.
It should be noted that the "photosensitive region corresponding to the moving object" may be understood as the photosensitive region corresponding to the image region, in the captured image, where the moving object is located.
It is understood that the image generating device may expose the photosensitive unit corresponding to the first exposure region to obtain an image of the moving object.
Optionally, in this embodiment of the application, the image generation device may acquire target position information of the first exposure area.
Wherein the target location information may include at least one of: position information of a target point of the first exposure area, size information of the first exposure area, and the like. Wherein the target point may be any one of: end points, center points, etc.
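The patent does not define a concrete data structure for the target position information; as a minimal illustrative sketch (all names here are hypothetical), it could be modeled as a point plus a size:

```python
from dataclasses import dataclass

@dataclass
class TargetPositionInfo:
    """Hypothetical container for the target position information of the
    first exposure region: the coordinates of a target point (e.g. the
    top-left end point or the center) plus the region's size in pixels."""
    point_row: int
    point_col: int
    height: int
    width: int

# A first exposure region whose top-left end point is (120, 340),
# spanning 64 rows and 96 columns of photosensitive units.
info = TargetPositionInfo(point_row=120, point_col=340, height=64, width=96)
assert (info.height, info.width) == (64, 96)
```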
In one example, the image generation device may acquire contour information of a moving object through a motion sensor, so that the image generation device may acquire target position information of the first exposure area according to the contour information of the moving object. Wherein the motion sensor may be any one of: infrared sensors, laser sensors, dynamic vision sensors, and the like.
In another example, the image generation device may acquire a preview screen while controlling the image sensor to perform exposure, and detect the preview screen to determine an area where the moving object is located, so that the image generation device may acquire target position information of the first exposure area corresponding to the area according to the position information of the area.
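The patent leaves the preview-frame detection method unspecified. One common approach, shown here purely as a hypothetical sketch (function name and threshold are assumptions, not from the patent), is frame differencing: compare consecutive preview frames and take the bounding box of the changed pixels as the first exposure region.

```python
import numpy as np

def moving_object_bbox(prev_frame: np.ndarray, cur_frame: np.ndarray,
                       threshold: int = 25):
    """Estimate the first exposure region from two grayscale preview frames.

    Returns (top, left, height, width) of the bounding box around pixels
    that changed by more than `threshold`, or None if nothing moved.
    """
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold
    if not moving.any():
        return None
    rows = np.flatnonzero(moving.any(axis=1))  # rows containing motion
    cols = np.flatnonzero(moving.any(axis=0))  # columns containing motion
    top, left = rows[0], cols[0]
    return (int(top), int(left),
            int(rows[-1] - top + 1), int(cols[-1] - left + 1))

# Example: a 3x3 bright block appears between two 8x8 preview frames.
prev = np.zeros((8, 8), dtype=np.uint8)
cur = np.zeros((8, 8), dtype=np.uint8)
cur[2:5, 3:6] = 200
print(moving_object_bbox(prev, cur))  # (2, 3, 3, 3)
```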
In still another example, at least two high-sensitivity photosensitive units (e.g., the real-sensing pixels in the following embodiments) may be disposed in the image sensor of the image generation device. While the image generation device controls the image sensor to perform exposure, these high-sensitivity units can detect the contour of a moving object. When some of the high-sensitivity units detect the contour, their position information is transmitted to the image generation device, which can then derive the target position information of the first exposure region from the position information of those units.
And 102, exposing the photosensitive unit corresponding to the first exposure area by the image generation device to obtain first image data.
Optionally, in this embodiment of the application, the image generation apparatus may expose the photosensitive unit corresponding to the first exposure area based on the target position information to obtain first image data.
The image generating device may determine the photosensitive units corresponding to the first exposure area according to the target position information, expose the photosensitive units corresponding to the first exposure area line by line, and read data corresponding to the photosensitive units corresponding to the first exposure area line by line to obtain first image data. The photosensitive unit may be specifically: a photodiode.
Specifically, the image generation device may expose the first row of photosensitive units corresponding to the first exposure region one by one and, when that row completes exposure, read out its data; it may then expose the second row one by one and read out its data when that row completes exposure; and so on, until the last row of photosensitive units corresponding to the first exposure region has been exposed one by one and its data read out, yielding the first image data.
Here, the image generating apparatus may expose a first photosensitive unit in a first row of photosensitive units corresponding to the first exposure region, then expose a second photosensitive unit in the first row of photosensitive units, and so on until a last photosensitive unit in the first row of photosensitive units is exposed, so as to expose the first row of photosensitive units corresponding to the first exposure region.
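The row-by-row, pixel-by-pixel order described above can be sketched as follows (an illustrative enumeration only; the function name and the (top, left, height, width) region representation are assumptions, not from the patent):

```python
def first_region_order(region):
    """Enumerate the first-exposure-region photosensitive units in the
    order described above: row by row, and pixel by pixel within a row.
    `region` is (top, left, height, width)."""
    top, left, h, w = region
    return [(r, c) for r in range(top, top + h) for c in range(left, left + w)]

# A 2x2 first exposure region whose top-left unit sits at row 2, column 3:
order = first_region_order((2, 3, 2, 2))
print(order)  # [(2, 3), (2, 4), (3, 3), (3, 4)]
```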
And 103, exposing the photosensitive unit corresponding to the second exposure area of the image sensor by the image generation device under the condition that the first exposure area is exposed, so as to obtain second image data.
In an embodiment of the present application, the second exposure region includes an exposure region of the image sensor except the first exposure region.
It is understood that the second exposure area may be the entire exposure area in the image sensor or the exposure area other than the first exposure area in the image sensor.
Optionally, in this embodiment of the application, the image generation apparatus may expose the photosensitive units corresponding to the second exposure area line by line, and read data corresponding to the second exposure area line by line to obtain second image data.
It should be noted that, for how the image generation device obtains the second image data, reference may be made to the description of how it obtains the first image data; the details are not repeated here.
Optionally, in this embodiment of the application, when the second exposure region consists of the exposure region of the image sensor other than the first exposure region, a row of photosensitive units of the second exposure region may lie in the same sensor row as a row of first photosensitive units of the first exposure region (for example, when the first exposure region sits in the middle of the sensor). In that case, the image generation device may first expose the part of that row on one side of the first exposure region (for example, the photosensitive units to its left), and then expose the remaining part (for example, the photosensitive units to its right).
For example, fig. 2 shows a schematic diagram of the positional relationship between the photosensitive units corresponding to the exposure regions in the embodiments of the present application. As shown in fig. 2, the photosensitive units corresponding to the first exposure region (e.g., the region 10) include a row of first photosensitive units 11. The second exposure region includes the exposure region of the image sensor other than the region 10, and its photosensitive units include a row of photosensitive units located in the same sensor row as the row of first photosensitive units 11. The image generation device may expose one portion of these photosensitive units (the photosensitive units 12) one by one, and then expose the other portion (the photosensitive units 13) one by one.
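Splitting such a shared row into a left segment and a right segment around the first exposure region can be sketched as below (a hypothetical helper using half-open column intervals; none of these names come from the patent):

```python
def row_segments(row_start: int, row_end: int,
                 region_left: int, region_right: int):
    """Split one sensor row [row_start, row_end) of the second exposure
    region around the first exposure region [region_left, region_right):
    the left-hand segment is exposed first, then the right-hand segment."""
    segments = []
    if region_left > row_start:  # pixels to the left of the first region
        segments.append((row_start, min(region_left, row_end)))
    if region_right < row_end:   # pixels to the right of the first region
        segments.append((max(region_right, row_start), row_end))
    return segments

# A 10-pixel row with the first exposure region occupying columns 3..6:
print(row_segments(0, 10, 3, 7))  # [(0, 3), (7, 10)]
```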
Step 104, the image generation device generates a target image according to the first image data and the second image data.
Optionally, in this embodiment of the application, the image generating apparatus may perform synthesis processing on the first image data and the second image data to obtain target image data, then process the target image data through the image signal processor, and compress the target image data to generate the target image.
Optionally, in this embodiment, after the image generation apparatus generates the target image, the image generation apparatus may display the target image in a preview interface of the shooting application, or the image generation apparatus may directly store the target image in a preset storage area (for example, a storage area corresponding to an "album").
How the image generation device generates the target image is illustrated below for the two possible compositions of the second exposure region.
Case 1: the second exposure region is the exposure region of the image sensor other than the first exposure region.
alternatively, in this embodiment of the application, as shown in fig. 3 in combination with fig. 1, the step 104 may be specifically implemented by a step 104a described below.
Step 104a, the image generation device performs fusion processing on the first image data and the target image data to generate a target image.
In this embodiment, the target image data is image data corresponding to an exposure area other than the first exposure area in the second image data.
Further optionally, in this embodiment of the application, the image generation device may first obtain the position information of the Q photosensitive units corresponding to the first exposure region and of the T photosensitive units corresponding to the target image data. It may then restore the first image data according to the position information of the Q photosensitive units and restore the target image data according to the position information of the T photosensitive units, so that the first image data and the target image data can be fused to generate the target image. Q and T are positive integers.
Specifically, the Q photosensitive units may be all photosensitive units corresponding to the first exposure region; the T photosensitive cells may be all photosensitive cells corresponding to the target image data.
Specifically, the position information may be coordinate information.
The coordinate information may be coordinate information in a target coordinate system, i.e., a rectangular coordinate system whose origin is a specific photosensitive unit among all exposure regions of the image sensor. The specific photosensitive unit may be either the photosensitive unit located at the center of all exposure regions of the image sensor or a photosensitive unit located at an end point of all exposure regions of the image sensor.
Specifically, position information (e.g., coordinate information) of all the photosensitive units corresponding to all the exposure areas of the image sensor is stored in the image generation apparatus in advance, so that the image generation apparatus can acquire coordinate information of Q photosensitive units and coordinate information of T photosensitive units from the coordinate information of all the photosensitive units.
Specifically, for each of the Q photosensitive units, the image generation apparatus may determine the coordinate information of one photosensitive unit as the coordinate information of a pixel point of the synthesized image data corresponding to the one photosensitive unit, and then restore the image data corresponding to the one photosensitive unit.
Specifically, for each photosensitive unit in the T photosensitive units, the image generation device may determine the coordinate information of one photosensitive unit as the coordinate information of a pixel point of the synthesized image data corresponding to the one photosensitive unit, and then restore the image data corresponding to the one photosensitive unit.
It can be understood that the image generating apparatus may determine the coordinate information of the pixel points corresponding to the Q photosensitive units and the T photosensitive units, and then restore the image data corresponding to the Q photosensitive units and the T photosensitive units according to the determined coordinate information to generate the target image.
In the embodiment of the application, the Q photosensitive units acquire the image data corresponding to the moving object and the T photosensitive units acquire the image data corresponding to the background. The image generation device can therefore restore the image data of the Q photosensitive units and the T photosensitive units according to their respective position information to obtain the target image, which avoids anomalies in the target image.
As can be seen, the image generation device can fuse the first image data with the image data in the second image data corresponding to the exposure region other than the first exposure region, so that a complete image (i.e., the target image) can be obtained quickly.
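The coordinate-based restoration described above can be made concrete with a small numpy sketch, purely as an illustration (function and variable names are hypothetical; the patent contains no code): each stored (row, col) coordinate simply tells the device where a sample belongs in the full frame.

```python
import numpy as np

def restore_target_image(height, width, first_pixels, background_pixels):
    """Scatter the Q first-exposure-region samples and the T background
    samples back into a full frame using their stored coordinates.
    Each pixel list holds ((row, col), value) pairs."""
    target = np.zeros((height, width), dtype=np.uint8)
    for (r, c), v in first_pixels + background_pixels:
        target[r, c] = v
    return target

# Toy 2x2 sensor: one moving-object sample plus three background samples.
q_pixels = [((0, 0), 255)]
t_pixels = [((0, 1), 10), ((1, 0), 20), ((1, 1), 30)]
img = restore_target_image(2, 2, q_pixels, t_pixels)
print(img.tolist())  # [[255, 10], [20, 30]]
```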
Case 2: the second exposure region is the entire exposure region of the image sensor.
alternatively, in this embodiment of the present application, the step 104 may be specifically implemented by the step 104b described below.
And step 104b, the image generation device performs fusion processing on the first image data and the target image data to generate a target image.
In an embodiment of the present application, the target image data is the image data, within the second image data, that corresponds to the part of the second exposure region other than the first exposure region.
Further optionally, in this embodiment of the application, the image generation device may first acquire the position information (for example, coordinate information) of the Q photosensitive units corresponding to the first image data and of the T photosensitive units corresponding to the second image data. According to the coordinate information of the Q photosensitive units, it can identify the photosensitive units other than the Q photosensitive units among the T photosensitive units, and thereby determine the image data in the second image data corresponding to the target image region. It then fuses the first image data with the image data of the target image region to obtain fused image data, which the image signal processor processes to generate the target image.
It should be noted that, for the description of the fusion process, reference may be made to specific descriptions in related technologies, and details are not repeated herein in the embodiments of the present application.
In the embodiment of the application, the Q photosensitive units acquire the image data corresponding to the moving object in the target image, and the T photosensitive units acquire the image data corresponding to both the moving object and the background in the target image; the image generation device can therefore fuse the first image data with the image data of the target image region to generate the target image.
As described above, the image generation device can fuse the first image data with the image data in the second image data corresponding to the target image region, thereby reducing deformation of the moving object in the target image.
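In this second case the second image data covers the whole sensor, so the fusion amounts to replacing the first-exposure-region area of the full frame with the short-exposure moving-object samples. A hypothetical sketch (names and region format are assumptions):

```python
import numpy as np

def fuse_case2(first_data, second_data, region):
    """Case 2 fusion: `second_data` covers the whole sensor, so the pixels
    inside the first exposure region are replaced with `first_data`
    (the short-exposure moving-object samples); the rest is kept.
    `region` is (top, left, height, width)."""
    top, left, h, w = region
    fused = second_data.copy()
    fused[top:top + h, left:left + w] = first_data
    return fused

# 4x4 background frame; a 2x2 moving-object patch replaces rows/cols 1..2.
second = np.full((4, 4), 50, dtype=np.uint8)
first = np.full((2, 2), 200, dtype=np.uint8)
fused = fuse_case2(first, second, (1, 1, 2, 2))
assert fused[1, 1] == 200 and fused[0, 0] == 50
```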
In the embodiment of the application, the image generation device may first obtain the first exposure area corresponding to the moving object, and then directly expose the photosensitive units corresponding to the first exposure area, instead of exposing all exposure areas of the image sensor line by line, so that the time for exposing the photosensitive units corresponding to the moving object may be reduced.
In the image generation method provided in the embodiment of the present application, when the image generation device controls its image sensor to perform exposure, the device may first obtain a first exposure region of the image sensor (that is, the exposure region corresponding to a moving object), expose the photosensitive units corresponding to the first exposure region to obtain first image data, and, once the first exposure region has finished exposing, expose the photosensitive units corresponding to a second exposure region of the image sensor (that is, the exposure region of the image sensor other than the first exposure region) to obtain second image data. The image generation device may then generate a target image from the first image data and the second image data. Because the device first acquires the first exposure region corresponding to the moving object, directly exposes the photosensitive units of that region, and exposes the photosensitive units of the second exposure region only after the first exposure region has finished exposing, it exposes the moving object's photosensitive units first rather than exposing all exposure regions of the image sensor line by line. This shortens the time needed to expose the photosensitive units corresponding to the moving object, reduces deformation of the image region where the moving object is located, and improves the shooting effect of the image generation device.
How the image sensor acquires motion information is described below, taking as an example an image sensor in which at least two high-sensitivity pixels (e.g., real-sensing pixels) are disposed.
Fig. 4 shows a schematic circuit diagram of a real-sensing pixel. As shown in fig. 4, a real-sensing pixel circuit may include: a first current amplification module 20, a second current amplification module 21, an analog-to-digital conversion module 22, a logic judgment module 23, and a signal control module 24.
The first current amplification module 20 includes: a photodiode PD1, a first end of the photodiode PD1 being grounded and a second end being connected to a first end of a first capacitor C1; and an NPN-type triode 25, whose base is connected to the second end of the first capacitor C1 and also to the first power supply V0 through a first resistor R1, whose collector is connected to the first end of a second capacitor C2 and also to the first power supply V0 through a second resistor R2, and whose emitter is grounded. The second end of the second capacitor C2 is grounded through a third resistor R3 and is further connected to the first switch tube T1.
The analog-to-digital conversion module 22 includes an analog-to-digital converter (ADC) 26; a first end of the ADC 26 is connected to the second end of the second capacitor C2, and a second end of the ADC 26 is connected to a first end of the logic determination module 23.
A first end of the signal control module 24 is connected to a second end of the logic determination module 23, and a second end of the signal control module 24 is connected to the first switch transistor T1.
The second current amplification module 21 includes: a photodiode PD2, a first end of the photodiode PD2 being grounded and a second end being connected to a first end of a third capacitor C3; and an NPN transistor 27, whose base is connected to the second end of the third capacitor C3 and, through a fourth resistor R4, to a second power supply V1, whose collector is connected to a first end of a fourth capacitor C4 and, through a fifth resistor R5, to the second power supply V1, and whose emitter is grounded. A second end of the fourth capacitor C4 is grounded through a sixth resistor R6 and is further connected to a second switch transistor T2; the second end of the fourth capacitor C4 is also connected to the ADC 26, and the second end of the signal control module 24 is further connected to the second switch transistor T2.
The first current amplification module 20 is configured to convert an optical signal into a current signal by using the photodiode PD1, amplify the current signal by using the resistor R1, the resistor R2, and the NPN transistor 25, convert the amplified current signal into a voltage signal 1 by using the resistor R3, and output the voltage signal 1 to the analog-to-digital conversion module 22.
In the analog-to-digital conversion module 22, V_ADC and Vref are the operating voltage and the reference voltage of the ADC 26, respectively. The analog-to-digital conversion module converts analog quantities such as voltage and current into digital quantities for subsequent processing. After the voltage signal 1 output by the first current amplification module 20 is input into the analog-to-digital conversion module 22, it is compared with the reference voltage Vref and converted into a photosensitive digital signal 1', and the photosensitive digital signal 1' is output to the logic determination module 23.
In the logic determination module 23, VH and VL are the maximum and minimum values of the photosensitive digital signal of the photodiode PD1 at the previous moment; that is, VH and VL indicate the voltage range within which the photosensitive digital signal of the photodiode PD1 lay at the previous moment. The logic determination module 23 compares the photosensitive digital signal 1' with VH and VL. If the photosensitive digital signal 1' falls outside the range bounded by VL and VH, the logic determination module 23 may output to the signal control module 24 a request signal requesting output of the voltage signal produced by the first current amplification module 20, i.e., the voltage signal 1.
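The range check performed by the logic determination module 23 can be illustrated with a minimal Python sketch (the function name and values are hypothetical and only illustrate the comparison described above):

```python
def logic_determine(sample, v_low, v_high):
    """Return True (i.e., request readout of the voltage signal) when the
    digitized photosensitive signal leaves the [v_low, v_high] range
    observed at the previous moment, meaning the pixel saw a brightness
    change."""
    return sample < v_low or sample > v_high

# A stable pixel stays silent; a brightness jump raises a request.
assert logic_determine(0.52, 0.40, 0.60) is False
assert logic_determine(0.75, 0.40, 0.60) is True
```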
After receiving the request signal output by the logic determination module 23, the signal control module 24 may arbitrate when to output the voltage signal 1 and output a control signal to the multiplexing switch module. The control signal is used to control output of the voltage signal 1 through a first path, where the first path includes the first switch transistor T1, a first analog signal output module, and the multiplexing switch module connected in sequence.
It can be understood that the functions of the components of the second current amplification module 21 are similar to those of the first current amplification module 20, and the functions of the second analog signal output module and the second switch transistor T2 are similar to those of the first analog signal output module and the first switch transistor T1; they are therefore not described again.
The second analog signal output module and the second switch transistor T2 form a second path for outputting the (analog) voltage signal of the second current amplification module 21.
It should be noted that, in actual implementation, the signal control module 24 may control the voltage signal 1 of the first current amplification module 20 and the voltage signal 2 of the second current amplification module 21 to be output simultaneously, and may control the multiplexing switch module to process the two voltage signals; the multiplexing switch module finally outputs the voltage signal 1, the voltage signal 2, and their sum (voltage signal 1 + voltage signal 2).
The voltage signal 1 and the voltage signal 2 can thus be used to determine the phase difference for distance or speed measurement, while their sum (voltage signal 1 + voltage signal 2) is the photosensitive signal value of the real sensing pixel (e.g., the luminance information it captured) and is used for subsequent picture processing.
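As a rough illustration of how the two voltage signals can be used, the sketch below (Python; all names and data are hypothetical, and an integer cross-correlation stands in for the actual phase-difference measurement) estimates the shift between the two photodiode signals and forms their sum as the pixel's photosensitive value:

```python
def phase_shift(sig_a, sig_b, max_shift=3):
    """Estimate the integer shift between the two photodiode signals by
    maximizing their overlap correlation (a toy stand-in for the
    phase-difference measurement used for ranging or speed)."""
    best_shift, best_score = 0, float("-inf")
    n = len(sig_a)
    for s in range(-max_shift, max_shift + 1):
        score = sum(sig_a[i] * sig_b[i + s]
                    for i in range(n) if 0 <= i + s < n)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

sig1 = [0, 0, 1, 3, 1, 0, 0]   # voltage signal 1 samples
sig2 = [0, 1, 3, 1, 0, 0, 0]   # voltage signal 2: same peak, shifted by one
assert phase_shift(sig1, sig2) == -1
# voltage signal 1 + voltage signal 2: the pixel's photosensitive value
brightness = [a + b for a, b in zip(sig1, sig2)]
assert max(brightness) == 4
```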
In the embodiment of the application, each of the at least two real sensing pixels can operate independently, and each real sensing pixel can sense changes in the external ambient brightness in real time at its clock frequency, converting the brightness change into a current change and, further, into a change in its digital signal.
If the variation of the digital signal of some of the at least two real sensing pixels exceeds a preset threshold, those real sensing pixels report to the system for readout and output a data packet carrying their position information, brightness information, and time information.
It can be understood that a change in the position of a moving object changes the ambient brightness, which in turn changes the digital signals of the corresponding real sensing pixels; the electronic device can therefore obtain the contour of the moving object from these digital-signal changes and the coordinate information of the corresponding real sensing pixels, and thereby determine the motion information of the moving object within the field of view of the image sensor.
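The way the contour can be recovered from the reported data packets may be sketched as follows (Python; the data layout and threshold are hypothetical): pixels whose digital-signal change exceeds the threshold are collected and enclosed in a bounding box, a crude stand-in for the moving object's contour.

```python
def moving_region(events, threshold):
    """events maps (row, col) coordinates of real sensing pixels to the
    change in their digital signal; return the bounding box
    (row_min, col_min, row_max, col_max) of the pixels whose change
    exceeds the threshold, or None when nothing moved."""
    changed = [(r, c) for (r, c), delta in events.items()
               if abs(delta) > threshold]
    if not changed:
        return None
    rows = [r for r, _ in changed]
    cols = [c for _, c in changed]
    return (min(rows), min(cols), max(rows), max(cols))

events = {(10, 40): 0.9, (12, 44): 1.1, (11, 42): 0.8, (50, 50): 0.1}
assert moving_region(events, threshold=0.5) == (10, 40, 12, 44)
```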
Optionally, in this embodiment of the application, the at least two real sensing pixels are distributed in the image sensor at a predetermined density, and they may be arranged in the image sensor of the photographing device in an array.
Further optionally, in this embodiment of the application, the size and density of the at least two real sensing pixels may be flexibly adjusted according to the actual application scenario, which is not limited in this embodiment of the application.
Optionally, in this embodiment of the application, the coordinate information may be a coordinate position in a target coordinate system, where the target coordinate system is a rectangular coordinate system whose origin is a specific pixel of the image sensor. The specific pixel may be either the pixel located at the center of the image sensor or a pixel located at an end of the image sensor.
It is understood that if the variation of the digital signal of a real sensing pixel exceeds the preset range, that real sensing pixel may be considered to lie on the contour of the moving object; the image generation device may therefore determine the contour of the moving object based on the coordinate information of that real sensing pixel.
Optionally, in this embodiment of the application, the image sensor includes real sensing pixels. Specifically, as shown in fig. 5 in conjunction with fig. 1, before the step 101, the image generation method provided in the embodiment of the present application may further include a step 201 described below, and the step 101 may be specifically implemented by a step 101a described below.
Step 201, the image generation device acquires the brightness change information of the real sensing pixels.
In this embodiment, the real sensing pixels may include at least two real sensing pixels.
In the embodiment of the present application, when the variation of the digital signal output by a real sensing pixel is greater than the preset threshold, the real sensing pixel may send brightness change information to the image generation device.
In this embodiment, the brightness change information may specifically be the photosensitive signal value of the real sensing pixel.
In step 101a, the image generation device determines the first exposure area based on the brightness change information and the position information of the real sensing pixels.
Further optionally, the position information of a real sensing pixel may specifically be the coordinate information of that real sensing pixel.
In a case where the real sensing pixels include a plurality of real sensing pixels, the image generation device may determine, as the first exposure area, the region enclosed by the plurality of real sensing pixels according to their brightness change information and position information; the image generation device can then acquire the coordinate information of an end point of the first exposure area and the size information of the first exposure area to obtain the first exposure area.
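The end-point coordinate and size information mentioned above can be derived from the changed pixels' coordinates, for example as in this hedged Python sketch (the function name is hypothetical; the numbers mirror the fig. 7 example):

```python
def first_exposure_region(pixel_coords):
    """Given (row, col) coordinates of the real sensing pixels whose
    brightness changed, return the region's end-point coordinate (its
    top-left corner) and its size as (height, width) in photosensitive
    units."""
    rows = [r for r, _ in pixel_coords]
    cols = [c for _, c in pixel_coords]
    top_left = (min(rows), min(cols))
    size = (max(rows) - min(rows) + 1, max(cols) - min(cols) + 1)
    return top_left, size

# Mirrors the fig. 7 example: 700 rows starting at row 1300, units 1401-2200.
corner, size = first_exposure_region([(1300, 1401), (1650, 1800), (1999, 2200)])
assert corner == (1300, 1401)
assert size == (700, 800)
```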
As described above, since the image generation device can determine the first exposure area based on the brightness change information and the position information of the real sensing pixels, the accuracy with which the image generation device determines the first exposure area can be improved.
The following description will be given for the process of exposing the photosensitive units corresponding to the first exposure area and the second exposure area by the image generating apparatus, respectively.
And aiming at the process that the image generation device exposes the photosensitive unit corresponding to the first exposure area:
optionally, in this embodiment of the present application, with reference to fig. 1, as shown in fig. 6, the step 102 may be specifically implemented by the following step 102a and step 102 b.
Step 102a, the image generation device determines N rows of first photosensitive units according to the position information of the first exposure area.
Further optionally, in this embodiment of the application, the image generation apparatus may determine, from the coordinate information of all the photosensitive units corresponding to all the exposure areas of the image sensor stored in advance, all the first photosensitive units located in the first exposure area, so as to determine N rows of first photosensitive units.
And 102b, exposing the N lines of first photosensitive units line by the image generation device to obtain first image data.
Further optionally, in this embodiment of the application, the image generation device may expose the first row of the N rows of first photosensitive units one by one and read the corresponding data when that row completes exposure, then expose the second row one by one and read its data when it completes exposure, and so on, until the last row of the N rows of first photosensitive units has been exposed one by one and its data read, thereby obtaining the first image data.
For example, fig. 7 shows a schematic diagram of the positional relationship between the photosensitive units corresponding to the exposure areas in an embodiment of the present application. As shown in fig. 7, the image generation device may determine N rows of first photosensitive units, for example 700 rows (i.e., the 1401st to 2200th photosensitive units of the 1300th to 1999th rows), according to the position information of the first exposure area. The image generation device may then expose the 1401st to 2200th photosensitive units of the 1300th row one by one and read their data, and so on, until it exposes the 1401st to 2200th photosensitive units of the 1999th row one by one and reads their data, thereby obtaining the first image data.
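The row-by-row expose-then-read sequence for the first exposure area can be sketched as follows (Python; a generator of hypothetical operations, not the actual sensor driver):

```python
def expose_first_region(row_start, row_end, col_start, col_end):
    """Yield the operation sequence for the first exposure area: each
    row's first photosensitive units are exposed, then that row's data
    is read out, before the next row is started."""
    for row in range(row_start, row_end + 1):
        yield ("expose", row, col_start, col_end)
        yield ("read", row, col_start, col_end)

# Three rows of the fig. 7 region, units 1401-2200 per row.
ops = list(expose_first_region(1300, 1302, 1401, 2200))
assert ops[0] == ("expose", 1300, 1401, 2200)
assert ops[1] == ("read", 1300, 1401, 2200)
assert len(ops) == 6  # expose + read for each of the three rows
```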
Therefore, the image generation device can expose the N lines of first photosensitive units line by line after determining the N lines of first photosensitive units, so that a complete first image can be obtained quickly, the exposure time of the photosensitive units corresponding to the moving object is reduced, the deformation phenomenon of the image area where the moving object is located can be reduced, and the shooting effect of the image generation device can be improved.
And the process of exposing the photosensitive unit corresponding to the second exposure area for the image generation device is as follows:
the second exposure area is taken as an example of an exposure area except the first exposure area in the image sensor.
Alternatively, in this embodiment of the application, the step 103 may be specifically implemented by the following steps 103a and 103 b.
Step 103a, under the condition that the first exposure area is exposed, the image generation device determines M rows of second photosensitive units according to the position information of the second exposure area.
In the embodiment of the application, M is a positive integer.
Further alternatively, in this embodiment of the present application, the image generation apparatus may determine, from the coordinate information of all the photosensitive units corresponding to all the exposure areas of the pre-stored image sensor, all the second photosensitive units located in the second exposure area, so as to determine M rows of second photosensitive units.
And 103b, exposing the M lines of second photosensitive units line by the image generation device to obtain second image data.
Further optionally, in this embodiment of the application, the image generation device may expose the first row of the M rows of second photosensitive units one by one and read the corresponding data when that row completes exposure, then expose the second row one by one and read its data when it completes exposure, and so on, until the last row of the M rows of second photosensitive units has been exposed one by one and its data read, thereby obtaining the second image data.
Therefore, the image generation device can expose the M rows of second photosensitive units line by line, so that a complete second image can be obtained quickly, the time for exposing the photosensitive units corresponding to the background image is reduced, and the shooting effect of the image generation device can be improved.
Of course, a certain row of second photosensitive cells corresponding to the second exposure region may be located in the same row as a certain row of first photosensitive cells in the first exposure region, so that when the image generating apparatus exposes the certain row of second photosensitive cells, the image generating apparatus may expose a part of second photosensitive cells located in front of the certain row of first photosensitive cells in the certain row of second photosensitive cells first, and then expose another part of second photosensitive cells located behind the certain row of first photosensitive cells to obtain second image data.
The exposure of the second photosensitive cells of the certain row by the image generating apparatus will be exemplified below.
Optionally, in this embodiment of the application, the M rows of second photosensitive units include N rows of third photosensitive units and N rows of fourth photosensitive units. Each row of third photosensitive units, the corresponding row of fourth photosensitive units, and one row of first photosensitive units are located in the same row of the image sensor (i.e., the same row of photosensitive units corresponding to the exposure area of the image sensor), with the first photosensitive units lying between the third photosensitive units and the fourth photosensitive units. Specifically, the step 103b can be realized by the steps 103b1 and 103b2 described below.
Step 103b1, the image generation device exposes the third photosensitive unit on the ith row.
In the embodiment of the application, i is a positive integer less than or equal to N.
Step 103b2, in the case that the exposure of the third photosensitive unit of the ith row is completed, the image generating device exposes the fourth photosensitive unit of the ith row.
For example, referring to fig. 7, the M rows of second photosensitive units (e.g., 3000 rows) include one part of rows (e.g., 1299 rows, i.e., the 1st to 4000th photosensitive units of the 1st to 1299th rows), N rows of third photosensitive units (e.g., 700 rows, i.e., the 1st to 1400th photosensitive units of the 1300th to 1999th rows), N rows of fourth photosensitive units (e.g., 700 rows, i.e., the 2201st to 4000th photosensitive units of the 1300th to 1999th rows), and another part of rows (e.g., 1001 rows, i.e., the 1st to 4000th photosensitive units of the 2000th to 3000th rows). The image generation device may first expose the 1st to 4000th photosensitive units of the 1st to 1299th rows line by line and read their data. Then, for each of the 1300th to 1999th rows in turn, it may expose the 1st to 1400th photosensitive units (the third photosensitive units) one by one and, once they complete exposure, expose the 2201st to 4000th photosensitive units (the fourth photosensitive units) one by one, reading the corresponding data as each part completes. Finally, the image generation device may expose the 1st to 4000th photosensitive units of the 2000th to 3000th rows line by line and read their data, thereby obtaining the second image data.
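The scan order for the second exposure area described above, with shared rows split into third and fourth photosensitive units around the already-exposed first region, might look like this small Python sketch (coordinates are 1-based and all names are hypothetical):

```python
def expose_second_region(total_rows, total_cols, first_region):
    """Return the exposure order for the second exposure area as
    (row, col_start, col_end) spans: rows outside the first exposure
    region are exposed whole, while rows overlapping it are split into
    the third units (before it) and the fourth units (after it)."""
    r0, r1, c0, c1 = first_region  # inclusive bounds of the first region
    order = []
    for row in range(1, total_rows + 1):
        if r0 <= row <= r1:
            order.append((row, 1, c0 - 1))           # third photosensitive units
            order.append((row, c1 + 1, total_cols))  # fourth photosensitive units
        else:
            order.append((row, 1, total_cols))
    return order

order = expose_second_region(6, 10, (3, 4, 4, 7))
assert order[0] == (1, 1, 10)   # full row above the first region
assert order[2] == (3, 1, 3)    # third units of the first shared row
assert order[3] == (3, 8, 10)   # fourth units of the same row
assert order[-1] == (6, 1, 10)  # full row below the first region
```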
As described above, since the image generating apparatus can expose the at least one line of the third photosensitive cells and the at least one line of the fourth photosensitive cells one by one, the time required for exposing the photosensitive cells corresponding to the background image can be reduced, and the second image can be obtained quickly.
According to the image generation method provided by the embodiment of the application, the execution subject can be an image generation device. In the embodiment of the present application, an image generation method executed by an image generation apparatus is taken as an example, and the image generation apparatus provided in the embodiment of the present application is described.
Fig. 8 shows a schematic diagram of a possible structure of an image generation apparatus related to an embodiment of the present application, which includes an image sensor. As shown in fig. 8, the image generating apparatus 60 further includes: an acquisition module 61, an exposure module 62, and a processing module 63. The obtaining module 61 is configured to obtain a first exposure area of the image sensor under the condition that the image sensor is controlled to perform exposure, where the first exposure area includes a photosensitive area corresponding to the moving object. The exposure module 62 is configured to expose the photosensitive unit corresponding to the first exposure area to obtain first image data; and exposing the photosensitive unit corresponding to the second exposure area of the image sensor under the condition that the first exposure area is exposed to obtain second image data. And a processing module 63, configured to generate a target image according to the first image data and the second image data obtained by the exposure module 62.
In a possible implementation manner, the exposure module 62 is specifically configured to determine N rows of first photosensitive units according to the position information of the first exposure area; exposing the N lines of first photosensitive units line by line to obtain first image data; wherein N is a positive integer.
In a possible implementation manner, the exposure module 62 is specifically configured to determine M rows of second photosensitive units according to the position information of the second exposure area; exposing the M lines of second photosensitive units line by line to obtain second image data; wherein M is a positive integer.
In a possible implementation manner, the processing module 63 is specifically configured to perform fusion processing on the first image data and the target image data to generate a target image. The target image data is image data corresponding to exposure areas except the first exposure area in the second image data.
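The fusion step can be pictured with a minimal Python sketch (hypothetical names; pixel values are plain numbers rather than real sensor data): image data inside the first exposure area comes from the first image data, and everything else from the second image data.

```python
def fuse(first_data, second_data, first_region):
    """Generate the target image: copy the second image data, then paste
    the first image data (the moving-object region) over the cells that
    lie inside the first exposure region (inclusive bounds)."""
    r0, r1, c0, c1 = first_region
    target = [row[:] for row in second_data]  # start from the other areas
    for r in range(r0, r1 + 1):
        for c in range(c0, c1 + 1):
            target[r][c] = first_data[r - r0][c - c0]
    return target

second = [[0] * 4 for _ in range(4)]   # background exposure data
first = [[9, 9], [9, 9]]               # moving-object exposure data
img = fuse(first, second, (1, 2, 1, 2))
assert img[1][1] == 9 and img[2][2] == 9
assert img[0][0] == 0 and img[3][3] == 0
```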
In one possible implementation, the image sensor includes real sensing pixels. The obtaining module 61 is further configured to obtain the brightness change information of the real sensing pixels. The processing module 63 is further configured to determine the first exposure area based on the brightness change information and the position information of the real sensing pixels acquired by the obtaining module 61.
According to the image generation device provided by the embodiment of the application, the image generation device first acquires the first exposure area corresponding to the moving object and directly exposes the corresponding photosensitive units, and exposes the photosensitive units corresponding to the second exposure area of the image sensor (including the exposure area other than the first exposure area) only after the first exposure area has completed exposure. In other words, the image generation device exposes the photosensitive units corresponding to the first exposure area first instead of exposing all exposure areas of the image sensor line by line, so the time needed to expose the photosensitive units corresponding to the moving object can be reduced, deformation of the image area where the moving object is located can be reduced, and the shooting effect of the image generation device can be improved.
The image generation apparatus in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, a super-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not limited in particular.
The image generation apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiment of the present application.
The image generation device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 7, achieve the same technical effect, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 9, an electronic device 80 is further provided in this embodiment of the application, including a processor 81 and a memory 82, where the memory 82 stores a program or instructions executable on the processor 81. When the program or instructions are executed by the processor 81, the steps of the image generation method embodiment are implemented with the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: the radio frequency unit 101, the network module 102, the audio output unit 103, the input unit 104, the sensor 105, the display unit 106, the user input unit 107, the interface unit 108, the memory 109, and the processor 110, and the sensor 105 includes an image sensor.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 10 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The processor 110 is configured to, under a condition that the image sensor is controlled to perform exposure, obtain a first exposure area of the image sensor, where the first exposure area includes a photosensitive area corresponding to a moving object; exposing the photosensitive unit corresponding to the first exposure area to obtain first image data; under the condition that the first exposure area is exposed, exposing a photosensitive unit corresponding to a second exposure area of the image sensor to obtain second image data, wherein the second exposure area comprises the exposure area except the first exposure area in the image sensor; and generating a target image from the first image data and the second image data.
According to the electronic device provided by the embodiment of the application, the electronic device first acquires the first exposure area corresponding to the moving object and directly exposes the corresponding photosensitive units, and exposes the photosensitive units corresponding to the second exposure area of the image sensor (including the exposure area other than the first exposure area) only after the first exposure area has completed exposure. In other words, the electronic device exposes the photosensitive units corresponding to the first exposure area first instead of exposing all exposure areas of the image sensor line by line, so the time needed to expose the photosensitive units corresponding to the moving object can be reduced, deformation of the image area where the moving object is located can be reduced, and the shooting effect of the electronic device can be improved.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to determine N rows of first photosensitive units according to the position information of the first exposure area; and exposing the N lines of first photosensitive units line by line to obtain first image data.
Therefore, the electronic equipment can expose the N lines of first photosensitive units line by line after the N lines of first photosensitive units are determined, so that a complete first image can be quickly obtained, the time for exposing the photosensitive units corresponding to the moving object is reduced, the phenomenon that the image area where the moving object is located deforms can be reduced, and the shooting effect of the electronic equipment can be improved.
Optionally, in this embodiment of the application, the processor 110 is specifically configured to determine M rows of second photosensitive units according to the position information of the second exposure area; and exposing the M lines of second photosensitive units line by line to obtain second image data.
Therefore, the electronic equipment can expose the M rows of second photosensitive units line by line, so that the time for exposing the photosensitive units corresponding to the background image can be shortened, the second image can be obtained quickly, and the shooting effect of the electronic equipment can be improved.
Optionally, in this embodiment, the processor 110 is specifically configured to perform fusion processing on the first image data and the target image data to generate a target image.
The target image data is image data corresponding to exposure areas except the first exposure area in the second image data.
Therefore, the electronic device can perform fusion processing on the image data corresponding to the exposure areas other than the first exposure area in the first image data and the second image data to quickly obtain a complete image (i.e., a target image).
Optionally, in an embodiment of the present application, the image sensor includes real sensing pixels.
The processor 110 is configured to obtain the brightness change information of the real sensing pixels, and to determine the first exposure area based on the brightness change information and the position information of the real sensing pixels.
In this way, the electronic device can determine the first exposure area according to the brightness change information and the position information of the real sensing pixels, so the accuracy with which the electronic device determines the first exposure area can be improved.
It should be understood that, in the embodiment of the present application, the input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processing unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and an application program or instructions required for at least one function (such as a sound playback function or an image playback function). Further, the memory 109 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchronous link DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 110 may include one or more processing units. Optionally, the processor 110 integrates an application processor and a modem processor, where the application processor primarily handles the operating system, the user interface, applications, and the like, and the modem processor, such as a baseband processor, primarily handles wireless communication signals. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
An embodiment of the present application further provides a readable storage medium storing a program or instructions. When the program or instructions are executed by a processor, the processes of the foregoing image generation method embodiments are implemented and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the foregoing embodiments. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement the processes of the foregoing image generation method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a chip system, or a system-on-chip.
An embodiment of the present application provides a computer program product. The program product is stored in a storage medium, and is executed by at least one processor to implement the processes of the foregoing image generation method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of another identical element in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; it may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the methods according to the embodiments of the present application.
While the embodiments of the present application have been described with reference to the accompanying drawings, the present application is not limited to those precise embodiments, which are intended to be illustrative rather than restrictive; various changes and modifications may be made by those skilled in the art without departing from the scope of the appended claims.

Claims (12)

1. An image generation method applied to an image generation apparatus including an image sensor, the method comprising:
under the condition of controlling the image sensor to carry out exposure, acquiring a first exposure area of the image sensor, wherein the first exposure area comprises a photosensitive area corresponding to a moving object;
exposing the photosensitive unit corresponding to the first exposure area to obtain first image data;
exposing a photosensitive unit corresponding to a second exposure area of the image sensor in a case that exposure of the first exposure area is completed, to obtain second image data, wherein the second exposure area comprises an exposure area in the image sensor other than the first exposure area;
and generating a target image according to the first image data and the second image data.
2. The method according to claim 1, wherein the exposing the photosensitive unit corresponding to the first exposure area to obtain first image data comprises:
determining N rows of first photosensitive units according to position information of the first exposure area;
exposing the N rows of first photosensitive units row by row to obtain the first image data;
wherein N is a positive integer.
3. The method according to claim 1, wherein exposing the photosensitive unit corresponding to the second exposure area of the image sensor to obtain second image data when the exposure of the first exposure area is completed comprises:
determining M rows of second photosensitive units according to position information of the second exposure area;
exposing the M rows of second photosensitive units row by row to obtain the second image data;
wherein M is a positive integer.
4. The method of claim 1, wherein generating a target image from the first image data and the second image data comprises:
fusing the first image data and target image data to generate the target image;
and the target image data is image data corresponding to an exposure area except the first exposure area in the second image data.
5. The method of claim 1, wherein the image sensor comprises sensing pixels; before the acquiring the first exposure area of the image sensor, the method further comprises:
acquiring brightness change information of the sensing pixels;
and determining the first exposure area based on the brightness change information and position information of the sensing pixels.
6. An image generation apparatus, characterized in that the image generation apparatus comprises an image sensor, the image generation apparatus further comprising: the system comprises an acquisition module, an exposure module and a processing module;
the acquisition module is used for acquiring a first exposure area of the image sensor under the condition of controlling the image sensor to perform exposure, wherein the first exposure area comprises a photosensitive area corresponding to a moving object;
the exposure module is configured to expose the photosensitive unit corresponding to the first exposure area acquired by the acquisition module to obtain first image data, and to expose a photosensitive unit corresponding to a second exposure area of the image sensor in a case that exposure of the first exposure area is completed to obtain second image data, wherein the second exposure area comprises an exposure area in the image sensor other than the first exposure area;
and the processing module is used for generating a target image according to the first image data and the second image data obtained by the exposure module.
7. The image generation apparatus according to claim 6, wherein the exposure module is specifically configured to determine N rows of first photosensitive units according to position information of the first exposure area, and to expose the N rows of first photosensitive units row by row to obtain the first image data;
wherein N is a positive integer.
8. The image generation apparatus according to claim 6, wherein the exposure module is specifically configured to determine M rows of second photosensitive units according to position information of the second exposure area, and to expose the M rows of second photosensitive units row by row to obtain the second image data;
wherein M is a positive integer.
9. The image generating apparatus according to claim 6, wherein the processing module is specifically configured to perform fusion processing on the first image data and target image data to generate the target image;
and the target image data is image data corresponding to an exposure area except the first exposure area in the second image data.
10. The image generation apparatus of claim 6, wherein the image sensor comprises sensing pixels;
the acquisition module is further configured to acquire brightness change information of the sensing pixels;
and the processing module is further configured to determine the first exposure area based on the brightness change information and the position information of the sensing pixels acquired by the acquisition module.
11. An electronic device comprising a processor and a memory, the memory storing a program or instructions executable on the processor, the program or instructions when executed by the processor implementing the steps of the image generation method of any of claims 1 to 5.
12. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the image generation method according to any one of claims 1 to 5.
CN202210731467.8A 2022-06-24 2022-06-24 Image generation method, image generation device, electronic device, and medium Pending CN115278097A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210731467.8A CN115278097A (en) 2022-06-24 2022-06-24 Image generation method, image generation device, electronic device, and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210731467.8A CN115278097A (en) 2022-06-24 2022-06-24 Image generation method, image generation device, electronic device, and medium

Publications (1)

Publication Number Publication Date
CN115278097A true CN115278097A (en) 2022-11-01

Family

ID=83760759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210731467.8A Pending CN115278097A (en) 2022-06-24 2022-06-24 Image generation method, image generation device, electronic device, and medium

Country Status (1)

Country Link
CN (1) CN115278097A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115278056A (en) * 2022-06-24 2022-11-01 维沃移动通信有限公司 Shooting method, shooting device, electronic equipment and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014020970A1 (en) * 2012-07-31 2014-02-06 Sony Corporation Image processing device, image processing method, and program
JP2015082675A (en) * 2013-10-21 2015-04-27 Samsung Techwin Co., Ltd. Image processing device and image processing method
CN107395997A (en) * 2017-08-18 2017-11-24 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN114390209A (en) * 2022-02-23 2022-04-22 维沃移动通信有限公司 Photographing method, photographing apparatus, electronic device, and readable storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yao Hongtao; Li Xiaoning; Tian Qingqing: "Research and Design of a Global-Exposure CMOS Image Sensor with a 5T Structure", Modern Computer (Professional Edition), no. 31 *


Similar Documents

Publication Publication Date Title
US10419661B2 (en) Shooting method and shooting device
CN112153301B (en) Shooting method and electronic equipment
US11532089B2 (en) Optical flow computing method and computing device
CN113099122A (en) Shooting method, shooting device, shooting equipment and storage medium
CN109889712B (en) Pixel circuit, image sensor, terminal equipment and signal control method
JP2021029017A (en) Photoelectric conversion device, imaging system, mobile body, and exposure control device
CN115278097A (en) Image generation method, image generation device, electronic device, and medium
JP2001177752A (en) Image pickup method and device to generate combined output image having image components photographed by different focal distances
US10154205B2 (en) Electronic device and image processing method thereof
CN101441393A (en) Projection device for image projection with document camera device connected thereto, and projection method
EP3872463A1 (en) Light sensor module, method for acquiring light sensor data, electronic equipment, and storage medium
CN113747067A (en) Photographing method and device, electronic equipment and storage medium
US20200412984A1 (en) Cross-row time delay integration method, apparatus and camera
CN114286011B (en) Focusing method and device
CN107993253B (en) Target tracking method and device
CN115278056A (en) Shooting method, shooting device, electronic equipment and medium
CN112702524B (en) Image generation method and device and electronic equipment
CN114615426A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN114390206A (en) Shooting method and device and electronic equipment
CN112399092A (en) Shooting method and device and electronic equipment
JPH11153430A (en) Taken image management device and its program recording medium
JP6679430B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
US11696040B2 (en) Image processing apparatus that retouches and displays picked-up image, image processing method, and storage medium
TWI828302B (en) Shooting methods and shooting equipment
CN113709326B (en) Lens shading correction method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination