CN113438416A - Image quantity acquisition method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113438416A
CN113438416A
Authority
CN
China
Prior art keywords
angle, posture, value, image, acquiring
Prior art date
Legal status
Granted
Application number
CN202110688027.4A
Other languages
Chinese (zh)
Other versions
CN113438416B (en)
Inventor
吴学芳
任桥
李少雄
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110688027.4A
Publication of CN113438416A
Application granted
Publication of CN113438416B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/951: Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2624: Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen
    • H04N5/265: Mixing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

The disclosure relates to an image quantity acquisition method and device, an electronic device and a storage medium. The method comprises the following steps: acquiring field angle data of a camera module in an electronic device and a configuration posture of the camera module; acquiring, for the configuration posture, the number of circles the camera module rotates around a target object according to the field angle data, wherein the number of circles is the minimum number of rotations of the camera module; acquiring the number of image frames corresponding to each circle according to the field angle data and the configuration posture, wherein the frame number is the minimum number of images shot by the camera module in each circle; and acquiring the sum of the image frame numbers over all circles, wherein the sum is the minimum number of images required to synthesize a spherical panorama. In this embodiment, by obtaining the minimum number of circles of rotation of the camera module and the minimum number of images shot per circle, the minimum number of images required to synthesize a spherical panorama can be obtained, which helps reduce shooting difficulty and improve the user experience.

Description

Image quantity acquisition method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to an image quantity obtaining method and apparatus, an electronic device, and a storage medium.
Background
At present, more and more users are accustomed to shooting images with the cameras on their electronic devices and using those images to create other images, such as spherical panoramas, to obtain a better shooting experience. Taking the spherical panorama as an example, it gives the viewer a three-dimensional stereoscopic feeling and allows viewing all of the information in the whole scene space.
To obtain such a spherical panorama, the user may capture images by rotating in place as the center of the sphere according to the guidance information of a panorama application; each captured frame is a part of the spherical surface. When one frame has been shot at every guided position, the resulting group of images is synthesized into a spherical panorama by a preset image algorithm, achieving the effect of covering the sphere with images.
In practical applications, a group of images must be captured because the Field Of View (FOV) of the camera is limited and cannot capture the whole space at once. To accommodate the cameras of as many electronic devices as possible, existing panorama applications guide the user to shoot as many frames as possible, i.e., the group usually contains 45 to 80 frames, which improves the stitching success rate. However, for non-professional users, shooting 45 to 80 frames is difficult; that is, obtaining a single spherical panorama is difficult, which degrades the user experience.
Disclosure of Invention
The present disclosure provides an image quantity acquisition method and apparatus, an electronic device, and a storage medium to solve the deficiencies of the related art.
According to a first aspect of the embodiments of the present disclosure, there is provided an image quantity acquisition method, the method including:
acquiring field angle data of a camera module in electronic equipment and a configuration posture of the camera module; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image;
acquiring, for the configuration posture, the number of circles the camera module rotates around a target object according to the field angle data, wherein the number of circles is the minimum number of rotations of the camera module;
acquiring the number of image frames corresponding to each circle according to the field angle data and the configuration posture, wherein the frame number is the minimum number of images shot by the camera module in each circle;
and acquiring the sum of the image frame numbers over all circles, wherein the sum is the minimum number of images required to synthesize a spherical panorama.
Optionally, for the configuration posture, acquiring the number of circles the camera module rotates around the target object according to the field angle data includes:
acquiring a first actual proportion for synthesizing a spherical panoramic image in each frame of image according to a preset overlapping degree;
acquiring a first actual field angle corresponding to the image according to the field angle data and the first actual proportion;
and acquiring the number of circles according to a preset first shooting angle value and the first actual field angle.
Optionally, acquiring the first actual proportion for synthesizing the spherical panorama in each frame of image according to a preset overlap degree includes:
acquiring a preset overlapping degree and a first shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the first shooting angle value is the maximum value of the shooting range in the first direction;
and subtracting the overlap degree from 1 to obtain the first actual proportion.
Optionally, acquiring the number of circles according to the preset first shooting angle value and the first actual field angle includes:
calculating the quotient of the first shooting angle value and the first actual field angle;
and rounding the quotient up to the nearest integer, which is the number of circles.
Optionally, for the configuration posture, the step of acquiring the number of circles the camera module rotates around the target object according to the field angle data is implemented with the following formula:
when the configuration posture is the transverse posture, the formula is as follows:
Cir=A/(VFOV*(1-O));
or,
when the configuration posture is a longitudinal posture, the formula is as follows:
Cir=A/(HFOV*(1-O));
wherein Cir represents the number of circles; A represents the first shooting angle value, which is 180; HFOV represents the second (horizontal) field angle; VFOV represents the first (vertical) field angle; and O represents the overlap degree, matched to the fusion algorithm for synthesizing the spherical panorama.
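As an illustration of the formula above, the circle count reduces to a ceiling division. The following is a minimal sketch; the function name and the sample FOV and overlap values are illustrative and not taken from the patent.

```python
import math

def min_circles(a: float, fov: float, overlap: float) -> int:
    """Cir = ceil(A / (FOV * (1 - O))); FOV is VFOV in the transverse
    posture and HFOV in the longitudinal posture."""
    effective_fov = fov * (1.0 - overlap)  # angle newly covered by each circle
    return math.ceil(a / effective_fov)

# A = 180, VFOV = 50 degrees, overlap O = 0.2:
# effective FOV = 40 degrees, so ceil(180 / 40) = 5 circles
print(min_circles(180, 50, 0.2))
```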
Optionally, when the shooting angle between the camera module and the reference plane is a first angle, acquiring the number of image frames corresponding to each circle according to the field angle data and the configuration posture includes:
acquiring a second actual proportion for synthesizing the spherical panoramic image in each frame of image according to the preset overlapping degree;
acquiring a second actual field angle corresponding to the image according to the field angle data and the second actual proportion;
and acquiring the frame number corresponding to the camera module at the first angle according to a preset second shooting angle value and the second actual field angle.
Optionally, acquiring the second actual proportion for synthesizing the spherical panorama in each frame of image according to the preset overlap degree includes:
acquiring a preset overlapping degree and a second shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the second shooting angle value is the maximum value of the shooting range in the second direction;
and subtracting the overlap degree from 1 to obtain the second actual proportion.
Optionally, acquiring the frame number corresponding to the camera module at the first angle according to the second shooting angle value and the second actual field angle includes:
acquiring the product of the second actual proportion and the field angle data, the product being the second actual field angle corresponding to the image;
calculating the quotient of the second shooting angle value and the second actual field angle;
and rounding the quotient up to the nearest integer, which is the frame number.
Optionally, when the shooting angle between the camera module and the reference plane is a first angle, the step of acquiring the number of image frames corresponding to each circle according to the field angle data and the configuration posture is implemented with the following formula:
when the configuration posture is the transverse posture, the formula is as follows:
Fr=B/(HFOV*(1-O));
or,
when the configuration posture is a longitudinal posture, the formula is as follows:
Fr=B/(VFOV*(1-O));
where Fr represents the frame number; B represents the second shooting angle value, which is 360; HFOV represents the second (horizontal) field angle; VFOV represents the first (vertical) field angle; and O represents the overlap degree, matched to the fusion algorithm for synthesizing the spherical panorama.
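The per-circle frame count on the reference plane follows the same pattern as the circle count. A hedged sketch follows; B = 360 comes from the text, while the function name and sample values are illustrative.

```python
import math

def frames_per_circle(b: float, fov: float, overlap: float) -> int:
    """Fr = ceil(B / (FOV * (1 - O))); FOV is HFOV in the transverse
    posture and VFOV in the longitudinal posture."""
    return math.ceil(b / (fov * (1.0 - overlap)))

# B = 360, HFOV = 60 degrees, O = 0.25: ceil(360 / 45) = 8 frames
print(frames_per_circle(360, 60, 0.25))
```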
Optionally, when the shooting angle between the camera module and the reference plane is a second angle, acquiring the number of image frames corresponding to each circle according to the field angle data and the configuration posture includes:
acquiring a preset second angle;
acquiring a sine value of the second angle;
acquiring the product of the frame number corresponding to the first angle and the sine value;
and rounding the product up to the nearest integer, which is the frame number corresponding to the second angle.
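The sine scaling in these steps can be sketched as follows. The sine relation itself is taken from the text; the function name and the example numbers are illustrative assumptions.

```python
import math

def frames_at_second_angle(frames_first_angle: int, second_angle_deg: float) -> int:
    """Frame count for a tilted circle: ceil(Fr * sin(X)), where Fr is the
    frame count at the first angle and X is the second angle in degrees."""
    return math.ceil(frames_first_angle * math.sin(math.radians(second_angle_deg)))

# With 8 frames on the reference circle and X = 30 degrees:
# sin(30 deg) = 0.5, so ceil(8 * 0.5) = 4 frames
print(frames_at_second_angle(8, 30))
```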
Optionally, the second angle is any angle within a preset shooting angle range, or is calculated from the number of circles, the overlap degree and the field angle data.
Optionally, the formula for calculating the second angle from the number of circles, the overlap degree and the field angle data is as follows:
when the configuration posture is the longitudinal posture, the second angle X is: X = HFOV × (1 − O) × N;
or,
when the configuration posture is the transverse posture, the second angle X is: X = VFOV × (1 − O) × N;
where X represents the second angle; HFOV represents the second (horizontal) field angle; VFOV represents the first (vertical) field angle; O represents the overlap degree, matched to the fusion algorithm for synthesizing the spherical panorama; and N represents the number of circles of images shot while the camera module tilts upwards or downwards.
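For completeness, the second-angle formula is a simple product; a one-line sketch with illustrative sample values:

```python
def second_angle(fov: float, overlap: float, n: int) -> float:
    """X = FOV * (1 - O) * N; FOV is HFOV in the longitudinal posture and
    VFOV in the transverse posture, N counts tilted circles."""
    return fov * (1.0 - overlap) * n

# HFOV = 60 degrees, O = 0.25, first tilted circle (N = 1): X = 45 degrees
print(second_angle(60, 0.25, 1))
```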
According to a second aspect of the embodiments of the present disclosure, there is provided an image quantity acquisition apparatus, the apparatus including:
the data acquisition module is used for acquiring the field angle data of a camera module in the electronic equipment and the configuration posture of the camera module; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image;
the circle number acquisition module, used for acquiring, for the configuration posture, the number of circles the camera module rotates around the target object according to the field angle data, wherein the number of circles is the minimum number of rotations of the camera module;
the frame number acquisition module, used for acquiring the number of image frames corresponding to each circle according to the field angle data and the configuration posture, wherein the frame number is the minimum number of images shot by the camera module in each circle;
and the quantity acquisition module, used for acquiring the sum of the image frame numbers over all circles, wherein the sum is the minimum number of images required to synthesize the spherical panorama.
Optionally, the circle number acquisition module includes:
the proportion acquisition submodule, used for acquiring a first actual proportion for synthesizing the spherical panorama in each frame of image according to the preset overlap degree;
the field angle acquisition submodule, used for acquiring a first actual field angle corresponding to the image according to the field angle data and the first actual proportion;
and the circle number acquisition submodule, used for acquiring the number of circles according to a preset first shooting angle value and the first actual field angle.
Optionally, the proportion obtaining sub-module includes:
the overlapping degree obtaining unit is used for obtaining preset overlapping degree and a first shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the first shooting angle value is the maximum value of the shooting range in the first direction;
and the first proportion acquisition unit, used for subtracting the overlap degree from 1 to obtain the first actual proportion.
Optionally, the circle number acquisition submodule includes:
a quotient calculation unit, used for calculating the quotient of the first shooting angle value and the first actual field angle;
and a circle number acquisition unit, used for rounding the quotient up to the nearest integer, which is the number of circles.
Optionally, the circle number acquisition module is implemented with the following formula:
when the configuration posture is the transverse posture, the formula is: Cir = A/(VFOV × (1 − O));
or,
when the configuration posture is the longitudinal posture, the formula is: Cir = A/(HFOV × (1 − O));
where Cir represents the number of circles; A represents the first shooting angle value, which is 180; HFOV represents the second (horizontal) field angle; VFOV represents the first (vertical) field angle; and O represents the overlap degree, matched to the fusion algorithm for synthesizing the spherical panorama.
Optionally, when a shooting angle between the camera module and the reference plane is a first angle, the frame number obtaining module includes:
the proportion acquisition submodule, used for acquiring a second actual proportion for synthesizing the spherical panorama in each frame of image according to the preset overlap degree;
the field angle acquisition submodule, used for acquiring a second actual field angle corresponding to the image according to the field angle data and the second actual proportion;
and the frame number acquisition submodule, used for acquiring the frame number corresponding to the camera module at the first angle according to a preset second shooting angle value and the second actual field angle.
Optionally, the proportion obtaining sub-module includes:
the overlapping degree obtaining unit is used for obtaining preset overlapping degree and a second shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the second shooting angle value is the maximum value of the shooting range in the second direction;
and the proportion acquisition unit, used for subtracting the overlap degree from 1 to obtain the second actual proportion.
Optionally, the frame number acquisition submodule includes:
a product acquisition unit, used for acquiring the product of the second actual proportion and the field angle data, the product being the second actual field angle corresponding to the image;
a quotient calculation unit, used for calculating the quotient of the second shooting angle value and the second actual field angle;
and a frame number acquisition unit, used for rounding the quotient up to the nearest integer, which is the frame number.
Optionally, when the shooting angle between the camera module and the reference plane is a first angle, the frame number acquisition module is implemented with the following formula:
when the configuration posture is the transverse posture, the formula is: Fr = B/(HFOV × (1 − O));
or,
when the configuration posture is the longitudinal posture, the formula is: Fr = B/(VFOV × (1 − O));
where Fr represents the frame number; B represents the second shooting angle value, which is 360; HFOV represents the second (horizontal) field angle; VFOV represents the first (vertical) field angle; and O represents the overlap degree, matched to the fusion algorithm for synthesizing the spherical panorama.
Optionally, when a shooting angle between the camera module and the reference plane is a second angle, the frame number obtaining module includes:
the angle acquisition submodule is used for acquiring a preset second angle;
a sine value obtaining submodule for obtaining a sine value of the second angle;
the product acquisition submodule, used for acquiring the product of the frame number corresponding to the first angle and the sine value;
and the frame number acquisition submodule, used for rounding the product up to the nearest integer, which is the frame number corresponding to the second angle.
Optionally, the second angle is any angle within a preset shooting angle range, or is calculated from the number of circles, the overlap degree and the field angle data.
The formula for calculating the second angle from the number of circles, the overlap degree and the field angle data is as follows:
when the configuration posture is the longitudinal posture, the second angle X is: X = HFOV × (1 − O) × N;
or,
when the configuration posture is the transverse posture, the second angle X is: X = VFOV × (1 − O) × N;
where X represents the second angle; HFOV represents the second (horizontal) field angle; VFOV represents the first (vertical) field angle; O represents the overlap degree, matched to the fusion algorithm for synthesizing the spherical panorama; and N represents the number of circles of images shot while the camera module tilts upwards or downwards.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor;
a memory for storing a computer program executable by the processor;
wherein the processor is configured to execute the computer program in the memory to implement the method of any one of the above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium storing an executable computer program which, when executed by a processor, implements any of the methods described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
as can be seen from the foregoing embodiments, in the solution provided by the embodiments of the present disclosure, the field angle data of the camera module in the electronic device and the configuration posture of the camera module may be acquired; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image; then, acquiring the number of revolutions of the camera module around the target object according to the field angle data for the configuration posture, wherein the number of revolutions refers to the minimum number of revolutions of the camera module; then, acquiring the frame number of the images corresponding to each week according to the field angle data and the configuration posture, wherein the frame number refers to the minimum number of the images shot by the camera module in each week; and finally, acquiring the sum of the number of the image frames corresponding to each week, wherein the sum is the minimum number of the images required by synthesizing the spherical panorama. Like this, through the minimum number of weeks that obtains the camera module and rotate in this embodiment to and the minimum number of the image of shooing weekly, thereby can obtain the minimum number of synthesizing the required image of spherical panorama, be favorable to reducing and shoot the degree of difficulty, promote and use experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a method of image quantity acquisition according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating acquiring the number of circles, according to an exemplary embodiment.
FIG. 3 is a flow chart illustrating acquiring the frame number, according to an exemplary embodiment.
Fig. 4(a) is a composite effect of a spherical panorama in a second direction according to an exemplary embodiment.
Fig. 4(b) is a set of images taken with one camera module rotation, according to an exemplary embodiment.
Fig. 5(a) is a composite effect of a spherical panorama in a first direction according to an exemplary embodiment.
Fig. 5(b) is a set of images taken in a first direction of a camera module according to an exemplary embodiment.
Fig. 6(a) is a schematic trajectory formed by focal points of a camera module when the camera module rotates for shooting according to an exemplary embodiment.
Fig. 6(b) is a diagram illustrating exemplary track radii when a camera module shoots in a second direction and shoots obliquely upward according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating an image quantity acquisition apparatus according to an exemplary embodiment.
FIG. 8 is a block diagram illustrating an electronic device in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present disclosure; rather, they are merely examples of devices consistent with certain aspects of the present disclosure as recited in the claims below. It should be noted that, in the following examples and embodiments, features may be combined with each other without conflict.
In order to solve the above technical problem, an embodiment of the present disclosure provides an image quantity acquisition method applied to an electronic device. The electronic device includes a camera module and a processor. It should be understood that only the components related to the present disclosure are shown in this embodiment; other components required for the normal operation of the electronic device (such as a power supply or a motherboard) are not shown and can be supplemented according to actual requirements. For convenience of description, the processor in the electronic device is used as the execution subject when describing the scheme of each embodiment.
Fig. 1 is a flowchart illustrating an image quantity acquisition method according to an exemplary embodiment, and referring to fig. 1, an image quantity acquisition method includes steps 11 to 14.
In step 11, acquiring field angle data of a camera module in the electronic equipment and a configuration posture of the camera module; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image.
In this embodiment, the processor in the electronic device may obtain the field angle data of the camera module in several ways. For example, the processor may communicate with the camera module to obtain its field angle data. As another example, local memory stores the field angle data of the camera, and the processor may read it from the local memory. As yet another example, the processor may communicate with the camera module to obtain the physical size and focal length of each lens, and then calculate the field angle from the relationship S/2 = EFL × tan(FOV/2), where S is the physical size (width W or height V), EFL is the focal length, and FOV is the field angle.
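The S/2 = EFL × tan(FOV/2) relationship can be inverted to compute the field angle; a small sketch, where the sample sensor size and focal length are illustrative:

```python
import math

def field_angle_deg(s_mm: float, efl_mm: float) -> float:
    """Solve S/2 = EFL * tan(FOV/2) for FOV, in degrees.
    s_mm is the sensor width W (for HFOV) or height V (for VFOV)."""
    return math.degrees(2.0 * math.atan(s_mm / (2.0 * efl_mm)))

# A 36 mm-wide sensor behind a 36 mm focal length: 2 * atan(0.5), about 53.13 degrees
print(round(field_angle_deg(36.0, 36.0), 2))
```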
Note that the field angle data includes a first field angle corresponding to a first direction and a second field angle corresponding to a second direction. Taking as an example a spherical panorama shot with the electronic device held upright by a user, the first direction is the vertical direction, i.e., the direction of the user's height, and correspondingly the first field angle is the field angle along the height of the electronic device; the second direction is the horizontal direction, i.e., the direction in which the user's arm extends, and correspondingly the second field angle is the field angle along the width of the electronic device.
It should be further noted that the reference plane referred to later is the plane formed by one rotation of the second direction around the first direction; taking the spherical panorama shot by a user holding the electronic device as an example, the reference plane is the horizontal plane.
For convenience of description, the scheme of each embodiment is described below by taking as an example a spherical panorama shot by a user holding the electronic device.
It is contemplated that the camera module may include multiple lenses, such as a front lens and multiple rear lenses. In this embodiment, the processor may select, from the rear lenses, the lens with the largest horizontal field angle HFOV or vertical field angle VFOV as the lens for capturing images, and then acquire the configuration posture of the camera module corresponding to that selection.
When the wide side and the long side of the camera module correspond to the wide side and the long side of the display screen, the configuration posture refers to the transverse or longitudinal posture of the electronic device when shooting the image. The transverse posture is the posture of the camera module when the display screen of the electronic device is horizontal, and the longitudinal posture is the posture of the camera module when the display screen is vertical. That is, when the camera module is in the transverse posture, the width of the shot image is greater than its height; when the camera module is in the longitudinal posture, the width of the shot image is less than its height.
It should be noted that the configuration posture may be configured in advance; that is, the electronic device may obtain the posture corresponding to a user's configuration operation and store it in local memory, and the processor may read the configuration posture from the local memory. The configuration posture may also be obtained in real time; that is, the processor may communicate with a spatial posture sensor in the electronic device (e.g., an acceleration sensor, a gravity sensor, or a gyroscope) to obtain the spatial posture of the electronic device, and then determine the posture of the camera module, i.e., the configuration posture, from the spatial posture.
In step 12, for the configuration posture, the number of circles that the camera module rotates around the target object is acquired according to the field angle data, where the number of circles is the minimum number of rotations of the camera module.
In this embodiment, the processor may acquire the number of circles that the camera module rotates around the target object according to the configuration posture and the field angle data, where the number of circles is the minimum number of rotations of the camera module. Assuming that the target object is a user holding the electronic device to shoot images, the user rotates in place and drives the camera module around the target object, that is, the camera module rotates in a horizontal plane perpendicular to the axis of the target object; a frame of image is then captured at certain angle intervals (less than or equal to the FOV). After one full rotation (i.e., 360 degrees), the captured images, once stitched, cover one band of the sphere. The shooting angle of the camera module is then adjusted upward or downward, and the above steps are repeated until, after rotating the acquired number of circles, the images fully cover the sphere. That is, the number of rotations of the camera module around the target object is the number of circles; one set of images is obtained per circle, and the sets of images corresponding to the number of circles are stacked vertically to cover the sphere.
In this embodiment, the stacking of the multiple sets of images is described above with reference to a horizontal plane perpendicular to the axis of the target object. Viewed from another angle, namely along the axis of the target object (that is, the vertical direction of the spherical panorama), after the images cover the sphere, the number of images in each column equals the number of circles. Based on this analysis, the processor can obtain the number of circles by acquiring the minimum number of images that need to be taken in the vertical direction.
Referring to fig. 2, in step 21, the processor may acquire, according to a preset overlap degree, the actual proportion of each frame of image used for synthesizing the spherical panorama, subsequently referred to as the first actual proportion for distinction. The overlap degree is the ratio of the overlapping portion of two adjacent frames of images to the original image; in this example its value may range from 15% to 30%, and a value of 30% is used to describe the embodiments below. It should be noted that the specific value of the overlap degree may be determined by the later-stage fusion algorithm.
For example, the local memory may pre-store the degree of overlap and the processor may communicate with the local memory and read the degree of overlap. For another example, a camera application or a panorama application is installed in the electronic device, and the processor may read the degree of overlap from the configuration information corresponding to the application.
The processor may acquire a first shooting angle value, which is the maximum value of the shooting range in the vertical direction; in this example it takes the value 180, which may also be understood as the camera module needing to cover a range of 180 degrees in the vertical direction.
The processor may subtract the overlap degree from 1 to obtain the first actual proportion, i.e., (1-O), where O represents the overlap degree.
In step 22, the processor may acquire the first actual field angle corresponding to the image according to the field angle data and the first actual proportion, namely: when the configuration posture is the transverse posture, the first actual field angle is VFOV*(1-O); alternatively, when the configuration posture is the longitudinal posture, the first actual field angle is HFOV*(1-O). Here HFOV represents the horizontal field angle; VFOV represents the vertical field angle; and O represents the overlap degree, which matches the fusion algorithm for synthesizing the spherical panorama.
In step 23, the processor may acquire the number of circles according to the preset first shooting angle value and the first actual field angle. The processor may calculate the quotient of the first shooting angle value and the first actual field angle, and then round the quotient up to obtain the minimum integer, which is the number of circles.
For example, for the configuration posture, the processor may acquire the number of circles that the camera module rotates around the target object according to the field angle data using the following formulas:
when the configuration posture is the transverse posture, the formula is as follows:
Cir=A/(VFOV*(1-O));
alternatively,
when the configuration posture is the longitudinal posture, the formula is as follows:
Cir=A/(HFOV*(1-O));
wherein Cir represents the number of circles, an integer rounded up; A represents the first shooting angle value, here taking the value 180.
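The circle-count formula above can be expressed as a minimal Python sketch (illustrative only and not part of the claimed embodiment; the function and parameter names are assumptions of this sketch):

```python
import math

def circle_count(hfov, vfov, overlap, longitudinal, a=180.0):
    # The longitudinal posture uses the horizontal field angle HFOV;
    # the transverse posture uses the vertical field angle VFOV.
    fov = hfov if longitudinal else vfov
    # Cir = A / (FOV * (1 - O)), rounded up to the minimum integer.
    return math.ceil(a / (fov * (1.0 - overlap)))

# Lens 3 from Table 1 (HFOV 95.78, VFOV 78.72), 30% overlap, longitudinal posture:
print(circle_count(95.78, 78.72, 0.30, longitudinal=True))  # 3
```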
In step 13, the number of frames of images corresponding to each circle is acquired according to the field angle data and the configuration posture, where the number of frames is the minimum number of images shot by the camera module per circle.
In this embodiment, the processor may acquire the number of frames of images corresponding to each circle according to the field angle data and the configuration posture. Referring to fig. 3, in step 31, the processor may acquire, according to the preset overlap degree, the actual proportion of each frame of image used for synthesizing the spherical panorama, subsequently referred to as the second actual proportion for distinction.
The processor may acquire a second shooting angle value, which is the maximum value of the shooting range in the horizontal direction; in this example it takes the value 360 degrees, which may also be understood as the camera module needing to rotate 360 degrees around the target object. The processor may subtract the overlap degree from 1 to obtain the second actual proportion, i.e., (1-O), where O represents the overlap degree.
In step 32, the processor may acquire the second actual field angle corresponding to the image according to the field angle data and the second actual proportion, namely: when the configuration posture is the transverse posture, the second actual field angle is HFOV*(1-O); alternatively, when the configuration posture is the longitudinal posture, the second actual field angle is VFOV*(1-O).
In step 33, the processor may obtain a frame number corresponding to the camera module at the first angle according to a preset second shooting angle value and the second actual field angle. The processor may calculate a quotient value of the second photographing angle value and the second actual field angle; then, the processor may perform a rounding-up operation on the quotient to obtain a minimum integer, where the minimum integer is the number of frames.
For example, the step of acquiring the number of frames of images corresponding to each circle according to the field angle data and the configuration posture is implemented using the following formulas:
when the configuration posture is the transverse posture, the formula is as follows:
Fr=B/(HFOV*(1-O));
alternatively,
when the configuration posture is the longitudinal posture, the formula is as follows:
Fr=B/(VFOV*(1-O));
in the formula, Fr represents the number of frames, an integer rounded up; B represents the second shooting angle value, here taking the value 360.
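The per-circle frame-count formula above can likewise be sketched in Python (illustrative only; names are assumptions of this sketch):

```python
import math

def frames_per_circle(hfov, vfov, overlap, longitudinal, b=360.0):
    # The longitudinal posture uses the vertical field angle VFOV;
    # the transverse posture uses the horizontal field angle HFOV.
    fov = vfov if longitudinal else hfov
    # Fr = B / (FOV * (1 - O)), rounded up to the minimum integer.
    return math.ceil(b / (fov * (1.0 - overlap)))

# Lens 3 from Table 1, 30% overlap, longitudinal posture:
print(frames_per_circle(95.78, 78.72, 0.30, longitudinal=True))  # 7
```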
It should be noted that fig. 3 illustrates the number of frames of images required for the camera module to capture one circle of the sphere when the shooting angle is the first angle (i.e., 0 degrees).
In an embodiment, when the camera module adjusts the shooting angle to a second angle, that is, the camera module tilts upward or downward so that the included angle between the camera module and the horizontal plane is the second angle, the processor may acquire the number of frames of images corresponding to each circle as follows: first, the processor may acquire a preset second angle.
In an example, the second angle may be any angle within a preset shooting angle range, such as 20 to 35 degrees or 50 to 70 degrees, and may be set according to the specific scene. The shooting angle of the camera module is then adjusted within this preset shooting angle range.
In another example, the second angle may also be calculated from the number of circles, the overlap degree, and the field angle data, namely: when the configuration posture is the longitudinal posture, the second angle X is X = HFOV*(1-O)*N; alternatively, when the configuration posture is the transverse posture, the second angle X is X = VFOV*(1-O)*N.
Wherein N is the index of the circle captured after the angle is adjusted upward or downward, and 0 < N <= (Cir-1)/2. It can be understood that, when the number of circles is Cir, excluding the one circle captured at the first angle, (Cir-1)/2 circles are captured tilted upward and (Cir-1)/2 circles are captured tilted downward.
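As a sketch of the tilt-angle formula above (illustrative only; the function and parameter names are assumptions of this sketch):

```python
def tilt_angle(hfov, vfov, overlap, longitudinal, n):
    # X = FOV * (1 - O) * N; the longitudinal posture uses HFOV,
    # the transverse posture uses VFOV.
    fov = hfov if longitudinal else vfov
    return fov * (1.0 - overlap) * n

# Lens 3 (HFOV 95.78), 30% overlap, longitudinal posture, first tilted circle (N = 1):
x = tilt_angle(95.78, 78.72, 0.30, longitudinal=True, n=1)
print(round(x))  # 67
```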
The processor may then acquire the sine value corresponding to the second angle, namely sin(90-X).
The processor may then acquire the product of the number of frames corresponding to the first angle and this sine value, i.e., Fr*sin(90-X).
Finally, the processor may round this product up to obtain the minimum integer, which is the number of frames corresponding to the second angle. That is, the processor may acquire the number of frames to be shot per circle at each different second angle.
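The steps above can be sketched as follows (illustrative only; names are assumptions of this sketch):

```python
import math

def frames_at_tilt(frames_first_angle, x_degrees):
    # Number of frames per circle at tilt X: ceil(Fr * sin(90 - X)),
    # where Fr is the frame count at the first angle (0 degrees).
    return math.ceil(frames_first_angle * math.sin(math.radians(90.0 - x_degrees)))

# 7 frames at 0 degrees and a tilt of 67 degrees, as in the worked example:
print(frames_at_tilt(7, 67.0))  # 3
```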
In step 14, the sum of the numbers of frames of images corresponding to all the circles is acquired, which is the minimum number of images required to synthesize the spherical panorama.
In this embodiment, the processor may acquire the sum of the numbers of frames of images corresponding to all the circles, thereby obtaining the minimum number of images required to synthesize the spherical panorama.
Thus, in this embodiment, by acquiring the minimum number of circles that the camera module rotates and the minimum number of images shot per circle, the minimum number of images required to synthesize a spherical panorama can be obtained, which is beneficial to reducing the shooting difficulty and improving the user experience.
The following describes the image quantity acquisition method provided by the present disclosure with reference to an embodiment in which the electronic device is a smartphone and the overlap degree is 30%:
the processor may obtain a focal length list supported by each lens in a camera module of the smartphone and a physical size of a complete pixel array of the lens, and calculate a horizontal field angle HFOV and a vertical field angle VFOV of each lens, respectively, as shown in table 1.
TABLE 1 Angle of View data
Lens barrel HFOV VFOV
Lens 0 71.22 56.48
Lens 2 71.63 56.85
Lens 3 95.78 78.72
Lens 4 71.63 56.85
Lens 5 71.22 56.48
Lens 6 71.21 56.48
As can be seen from table 1, lens 3 has the largest field angle, regardless of whether the configuration posture is the transverse posture or the longitudinal posture.
(1) Determine, according to the selected field angle, the minimum number of frames of images that need to be shot in one circle in the horizontal plane when the shooting angle is 0 degrees.
When the shooting interface is designed for longitudinal shooting, that is, the configuration posture is the longitudinal posture, the formula is:
360/(VFOV*(1-O));
alternatively,
when the shooting interface is designed for transverse shooting, that is, the configuration posture is the transverse posture, the formula is:
360/(HFOV*(1-O))。
assuming that the shooting interface of the present disclosure is designed for longitudinal shooting and images are taken with lens 3, the minimum number of frames required to capture one circle in the horizontal plane is:
360/(78.72*(1-30%))≈6.53;
rounding 6.53 up gives 7; that is, when the shooting angle is 0 degrees, a minimum of 7 frames of images are shot, and the effect is shown in fig. 4(a) and 4(b). Fig. 4(a) shows the horizontal synthesis effect of the spherical panorama, and fig. 4(b) shows the set of images captured in one rotation of the camera module. Fig. 4(b) is intended to highlight the order of the images captured in one rotation. The images in fig. 4(b) exhibit some distortion because shape processing is applied to the images when the panorama is synthesized, which is negligible.
(2) The minimum number of frames that need to be taken in one circle in the vertical direction may be determined using the following formulas:
when the shooting interface is designed for longitudinal shooting, that is, the configuration posture is the longitudinal posture, the formula is:
180/(HFOV*(1-O));
alternatively,
when the shooting interface is designed for transverse shooting, that is, the configuration posture is the transverse posture, the formula is:
180/(VFOV*(1-O))。
assuming that the shooting interface of the present disclosure is designed for longitudinal shooting and images are taken with lens 3, the minimum number of frames required to capture one circle in the vertical direction is:
180/(95.78*(1-30%))≈2.68;
rounding 2.68 up gives 3, so a minimum of 3 frames of images need to be taken in the vertical direction; the effect is shown in fig. 5(a) and 5(b). Fig. 5(a) shows the synthesis effect of the spherical panorama in the vertical direction, and fig. 5(b) shows the set of images taken by the camera module in the vertical direction. Fig. 5(b) is intended to highlight the order of the captured images. The images in fig. 5(b) exhibit some distortion because shape processing is applied to the images when the panorama is synthesized, which is negligible.
In this example, the minimum of 3 frames taken in the vertical direction can be understood as requiring a minimum of three circles (i.e., the number of circles) to capture the complete sphere: one circle in the horizontal direction, one circle tilted upward by angle X, and one circle tilted downward by angle X. The effect is shown in fig. 6(a) and 6(b). Fig. 6(a) shows the schematic trajectory formed by the focal point of the camera module during the rotating shooting; fig. 6(b) shows the radii of example trajectories when the camera module shoots in the horizontal direction and tilted upward.
Assuming that the shooting interface of the present application is designed for longitudinal shooting, the tilt angle X for lens 3 is: X = 95.78*(1-30%) ≈ 67 degrees. That is, the camera module needs to take one circle in the horizontal direction, one circle tilted upward by 67 degrees, and one circle tilted downward by 67 degrees.
Assuming that the shooting interface of the present application is designed for longitudinal shooting, the minimum number of frames required when lens 3 is tilted upward by 67 degrees is: 7*sin(90-67) ≈ 2.74; rounding 2.74 up gives 3, so a minimum of 3 frames of images need to be shot at the 67-degree tilt angle.
Similarly, the minimum number of frames required at the downward tilt angle of 67 degrees is also 3.
Thus, in this embodiment, a minimum of 7+3+3 = 13 frames need to be captured to meet the requirement of synthesizing the spherical panorama. In the related art, more than 30 frames need to be shot; the present scheme can therefore reduce the amount of image data to be captured, lower the shooting difficulty for the user, and improve the user experience.
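The arithmetic of this worked example can be reproduced with a short Python sketch (illustrative only; variable names are not from the patent):

```python
import math

# Lens 3 from Table 1, 30% overlap, longitudinal shooting interface.
HFOV, VFOV, O = 95.78, 78.72, 0.30

cir = math.ceil(180.0 / (HFOV * (1.0 - O)))   # number of circles: 3
fr0 = math.ceil(360.0 / (VFOV * (1.0 - O)))   # frames at 0 degrees: 7
x = HFOV * (1.0 - O)                          # tilt angle, about 67 degrees
fr_tilt = math.ceil(fr0 * math.sin(math.radians(90.0 - x)))  # frames per tilted circle: 3

# One circle at 0 degrees plus (cir - 1) tilted circles.
total = fr0 + (cir - 1) * fr_tilt
print(total)  # 13
```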
Based on the above image quantity obtaining method, an embodiment of the present disclosure further provides an image quantity obtaining apparatus, referring to fig. 7, where the apparatus includes:
the data acquisition module 71 is configured to acquire field angle data of a camera module in the electronic device and a configuration posture of the camera module; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image;
a circle number acquiring module 72, configured to acquire, for the configuration posture, the number of circles that the camera module rotates around the target object according to the field angle data, where the number of circles is the minimum number of rotations of the camera module;
a frame number acquiring module 73, configured to acquire the number of frames of images corresponding to each circle according to the field angle data and the configuration posture, where the number of frames is the minimum number of images shot by the camera module per circle;
and a quantity acquiring module 74, configured to acquire the sum of the numbers of frames of images corresponding to all the circles, where the sum is the minimum number of images required to synthesize the spherical panorama.
In one embodiment, the circle number acquiring module comprises:
the proportion acquiring submodule, configured to acquire, according to the preset overlap degree, the first actual proportion of each frame of image used for synthesizing the spherical panorama;
the field angle acquiring submodule, configured to acquire the first actual field angle corresponding to the image according to the field angle data and the first actual proportion;
and the circle number acquiring submodule, configured to acquire the number of circles according to the preset first shooting angle value and the first actual field angle.
In one embodiment, the proportion obtaining sub-module includes:
the overlapping degree obtaining unit is used for obtaining preset overlapping degree and a first shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the first shooting angle value is the maximum value of the shooting range in the first direction;
and the first proportion acquiring unit, configured to subtract the overlap degree from 1 to obtain the first actual proportion.
In one embodiment, the circle number acquiring submodule includes:
a quotient calculating unit, configured to calculate the quotient of the first shooting angle value and the first actual field angle;
and a circle number acquiring unit, configured to round the quotient up to obtain the minimum integer, which is the number of circles.
In one embodiment, the circle number acquiring module is implemented by the following formulas:
when the configuration posture is the transverse posture, the formula is as follows: cir ═ a/(VFOV (1-O));
alternatively,
when the configuration posture is a longitudinal posture, the formula is as follows: cir ═ a/(HFOV (1-O));
wherein Cir represents the number of circles; A represents the first shooting angle value, taking the value 180; HFOV represents the horizontal field angle; VFOV represents the vertical field angle; and O represents the overlap degree, which matches the fusion algorithm for synthesizing the spherical panorama.
In an embodiment, when a shooting angle between the camera module and a reference plane is a first angle, the frame number obtaining module includes:
the proportion acquiring submodule, configured to acquire, according to the preset overlap degree, the second actual proportion of each frame of image used for synthesizing the spherical panorama;
the field angle acquiring submodule, configured to acquire the second actual field angle corresponding to the image according to the field angle data and the second actual proportion;
and the frame number acquiring submodule, configured to acquire the number of frames corresponding to the camera module at the first angle according to the preset second shooting angle value and the second actual field angle.
In one embodiment, the proportion obtaining sub-module includes:
the overlapping degree obtaining unit is used for obtaining preset overlapping degree and a second shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the second shooting angle value is the maximum value of the shooting range in the second direction;
and the proportion acquiring unit, configured to subtract the overlap degree from 1 to obtain the second actual proportion.
In one embodiment, the frame number acquiring submodule includes:
a product acquiring unit, configured to acquire the product of the second actual proportion and the field angle data, where the product is the second actual field angle corresponding to the image;
a quotient calculating unit, configured to calculate the quotient of the second shooting angle value and the second actual field angle;
and a frame number acquiring unit, configured to round the quotient up to obtain the minimum integer, which is the number of frames.
In an embodiment, when the shooting angle between the camera module and the reference plane is a first angle, the frame number obtaining module is implemented by using the following formula:
when the configuration posture is the transverse posture, the formula is as follows: fr ═ B/(HFOV (1-O));
alternatively,
when the configuration posture is a longitudinal posture, the formula is as follows: fr ═ B/(VFOV × (1-O));
in the formula, Fr represents the number of frames; B represents the second shooting angle value, taking the value 360; HFOV represents the horizontal field angle; VFOV represents the vertical field angle; and O represents the overlap degree, which matches the fusion algorithm for synthesizing the spherical panorama.
In an embodiment, when a shooting angle between the camera module and a reference plane is a second angle, the frame number obtaining module includes:
the angle acquisition submodule is used for acquiring a preset second angle;
a sine value acquiring submodule, configured to acquire the sine value corresponding to the second angle;
the product acquiring submodule, configured to acquire the product of the number of frames corresponding to the first angle and the sine value;
and the frame number acquiring submodule, configured to round the product up to obtain the minimum integer, which is the number of frames corresponding to the second angle.
In an embodiment, the second angle is any angle within a preset shooting angle range, or is calculated from the number of circles, the overlap degree, and the field angle data.
The formulas for calculating the second angle from the number of circles, the overlap degree, and the field angle data are as follows:
when the configuration posture is the longitudinal posture, the second angle X is: x ═ HFOV (1-O) × N;
alternatively,
when the configuration posture is the transverse posture, the second angle X is: X = VFOV*(1-O)*N;
wherein X represents the second angle; HFOV represents the horizontal field angle; VFOV represents the vertical field angle; O represents the overlap degree, which matches the fusion algorithm for synthesizing the spherical panorama; and N represents the index of the circle of images shot when the camera module tilts upward or downward.
It should be noted that the content of the method embodiment shown in fig. 1 corresponds to the apparatus of this embodiment; reference may be made to the method embodiment for details, which are not repeated here.
FIG. 8 is a block diagram illustrating an electronic device in accordance with an example embodiment. For example, the electronic device 800 may be a smartphone, a computer, a digital broadcast terminal, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
Referring to fig. 8, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, communication component 816, image capture component 818, and the aforementioned housing.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute computer programs. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include computer programs for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800. The power supply module 806 may include a power chip, and the controller may communicate with the power chip to control the power chip to turn the switching device on or off to allow the battery to supply power to the motherboard circuitry or not.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the target object. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input information from the target object. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, the audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800, the relative positioning of components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in the position of the electronic device 800 or one of the components, the presence or absence of a target object in contact with the electronic device 800, orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. In this example, the sensor assembly 814 may include a magnetic sensor, a gyroscope, and a magnetic field sensor, wherein the magnetic field sensor includes at least one of: hall sensor, thin film magneto-resistance sensor, magnetic liquid acceleration sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives broadcast information or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, communications component 816 further includes a Near Field Communications (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components.
In an exemplary embodiment, a non-transitory readable storage medium is also provided, such as the memory 804, which includes an executable computer program that can be executed by the processor. The readable storage medium may be, among others, a ROM, Random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (26)

1. An image quantity acquisition method, characterized in that the method comprises:
acquiring field angle data of a camera module in electronic equipment and a configuration posture of the camera module; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image;
acquiring, for the configuration posture, the number of revolutions of the camera module around a target object according to the field angle data, wherein the number of revolutions is the minimum number of times the camera module rotates;
acquiring the number of image frames corresponding to each revolution according to the field angle data and the configuration posture, wherein the frame number is the minimum number of images shot by the camera module in each revolution;
and acquiring the sum of the numbers of image frames over all revolutions, the sum being the minimum number of images required to synthesize a spherical panorama.
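To illustrate how the four steps of claim 1 combine, here is a minimal end-to-end sketch assuming the formulas given later in the dependent claims (claims 5, 9, 10 and 12). All function and variable names are ours, not the patent's, and clamping the second angle at 90 degrees is our assumption, added so the sine scaling stays meaningful past the equator:

```python
import math

def minimum_image_count(vfov, hfov, overlap, landscape=True):
    """Sketch of claim 1: revolutions needed, frames per revolution at
    each tilt, and their sum -- the minimum image count for a spherical
    panorama. vfov/hfov are the first/second field angles in degrees;
    overlap is O in [0, 1)."""
    # In the landscape (transverse) posture, VFOV covers the vertical
    # range per revolution and HFOV covers the horizontal range per frame;
    # in the portrait (longitudinal) posture the roles are swapped.
    v_cov = vfov if landscape else hfov
    h_cov = hfov if landscape else vfov
    revolutions = math.ceil(180 / (v_cov * (1 - overlap)))   # claim 5
    frames_level = math.ceil(360 / (h_cov * (1 - overlap)))  # claim 9
    total = 0
    for n in range(1, revolutions + 1):
        x = min(v_cov * (1 - overlap) * n, 90)  # second angle (claim 12), clamped
        total += math.ceil(frames_level * math.sin(math.radians(x)))  # claim 10
    return total
```

For example, with VFOV = 60, HFOV = 80 and a 20% overlap in the landscape posture, this yields 4 revolutions of at most 6 frames each, 23 images in total.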
2. The method of claim 1, wherein acquiring, for the configuration posture, the number of revolutions of the camera module around the target object according to the field angle data comprises:
acquiring, according to a preset overlapping degree, a first actual proportion of each frame of image used for synthesizing the spherical panorama;
acquiring a first actual field angle corresponding to the image according to the field angle data and the first actual proportion;
and acquiring the number of revolutions according to a preset first shooting angle value and the first actual field angle.
3. The method of claim 2, wherein acquiring, according to the preset overlapping degree, the first actual proportion of each frame of image used for synthesizing the spherical panorama comprises:
acquiring a preset overlapping degree and a first shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the first shooting angle value is the maximum value of the shooting range in a first direction;
and subtracting the overlapping degree from 1 to obtain the first actual proportion.
4. The method according to claim 2, wherein acquiring the number of revolutions according to the preset first shooting angle value and the first actual field angle comprises:
calculating a quotient value of the first shooting angle value and the first actual field angle;
and rounding up the quotient value to obtain a minimum integer, the minimum integer being the number of revolutions.
5. The method according to claim 1, wherein acquiring, for the configuration posture, the number of revolutions of the camera module around the target object according to the field angle data is implemented by the following formula:
when the configuration posture is the transverse posture, the formula is as follows:
Cir=A/(VFOV*(1-O));
or,
when the configuration posture is a longitudinal posture, the formula is as follows:
Cir=A/(HFOV*(1-O));
wherein Cir represents the number of revolutions; A represents the first shooting angle value, which takes the value 180; HFOV represents the second field angle; VFOV represents the first field angle; and O represents the overlapping degree, which is matched with the fusion algorithm used to synthesize the spherical panorama.
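The revolution count of claim 5, combined with the rounding-up of claim 4, can be sketched as follows. This is an illustration only; the function and parameter names are ours, not the patent's:

```python
import math

def revolution_count(vfov, hfov, overlap, landscape=True):
    """Minimum number of revolutions needed to cover the 180-degree
    range (first shooting angle value A = 180). In the landscape
    (transverse) posture each revolution covers VFOV vertically; in the
    portrait (longitudinal) posture it covers HFOV."""
    fov = vfov if landscape else hfov
    effective = fov * (1 - overlap)    # new coverage contributed per revolution
    return math.ceil(180 / effective)  # round up to the minimum integer (claim 4)
```

For example, with VFOV = 60 and a 20% overlap, each revolution contributes 48 degrees of vertical coverage, so ceil(180 / 48) = 4 revolutions are needed in the landscape posture.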
6. The method according to claim 1, wherein when the shooting angle of the camera module with respect to the reference plane is a first angle, acquiring the number of image frames corresponding to each revolution according to the field angle data and the configuration posture comprises:
acquiring, according to the preset overlapping degree, a second actual proportion of each frame of image used for synthesizing the spherical panorama;
acquiring a second actual field angle corresponding to the image according to the field angle data and the second actual proportion;
and acquiring the frame number corresponding to the camera module at the first angle according to a preset second shooting angle value and the second actual field angle.
7. The method of claim 6, wherein acquiring, according to the preset overlapping degree, the second actual proportion of each frame of image used for synthesizing the spherical panorama comprises:
acquiring the preset overlapping degree and a second shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the second shooting angle value is the maximum value of the shooting range in a second direction;
and subtracting the overlapping degree from 1 to obtain the second actual proportion.
8. The method of claim 6, wherein acquiring the second actual field angle corresponding to the image and the frame number comprises:
acquiring the product of the second actual proportion and the field angle data, the product being the second actual field angle corresponding to the image;
calculating a quotient value of the second shooting angle value and the second actual field angle;
and rounding up the quotient value to obtain a minimum integer, the minimum integer being the frame number.
9. The method according to claim 1, wherein when the shooting angle of the camera module with respect to the reference plane is the first angle, acquiring the number of image frames corresponding to each revolution according to the field angle data and the configuration posture is implemented by the following formula:
when the configuration posture is the transverse posture, the formula is as follows:
Fr=B/(HFOV*(1-O));
or,
when the configuration posture is a longitudinal posture, the formula is as follows:
Fr=B/(VFOV*(1-O));
in the formula, Fr represents the frame number; B represents the second shooting angle value, which takes the value 360; HFOV represents the second field angle; VFOV represents the first field angle; and O represents the overlapping degree, which is matched with the fusion algorithm used to synthesize the spherical panorama.
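The per-revolution frame count of claim 9, with the rounding-up described in claim 8, can be sketched as follows; the names are illustrative, not the patent's:

```python
import math

def frames_per_revolution(vfov, hfov, overlap, landscape=True):
    """Minimum frames per full 360-degree revolution (second shooting
    angle value B = 360) when the camera is at the first angle. In the
    landscape posture each frame covers HFOV horizontally; in the
    portrait posture it covers VFOV."""
    fov = hfov if landscape else vfov
    return math.ceil(360 / (fov * (1 - overlap)))  # round up to the minimum integer
```

For example, with HFOV = 80 and a 20% overlap in the landscape posture, each frame contributes 64 degrees, so ceil(360 / 64) = 6 frames per revolution.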
10. The method according to claim 1, wherein when the shooting angle of the camera module with respect to the reference plane is a second angle, acquiring the number of image frames corresponding to each revolution according to the field angle data and the configuration posture comprises:
acquiring a preset second angle;
acquiring a sine value of the second angle;
acquiring the product of the corresponding frame number and the sine value;
and rounding up the product to obtain a minimum integer, the minimum integer being the frame number corresponding to the second angle.
11. The method according to claim 10, wherein the second angle is any angle within a preset shooting angle range, or is calculated from the number of revolutions, the overlapping degree, and the field angle data.
12. The method of claim 10, wherein the formula for calculating the second angle from the number of revolutions, the overlapping degree, and the field angle data is as follows:
when the configuration posture is the longitudinal posture, the second angle X is: X=HFOV*(1-O)*N;
or,
when the configuration posture is the transverse posture, the second angle X is: X=VFOV*(1-O)*N;
wherein X represents the second angle; HFOV represents the second field angle; VFOV represents the first field angle; O represents the overlapping degree, which is matched with the fusion algorithm used to synthesize the spherical panorama; and N represents the ordinal number of the revolution at which the image is shot when the camera module is tilted upwards or downwards.
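Claims 10 and 12 can be sketched together as follows. The names are illustrative, and we follow the claim wording literally, assuming the second angle is measured so that sin(X) gives the relative size of the tilted ring:

```python
import math

def frames_at_second_angle(frames_first_angle, vfov, hfov, overlap, n, landscape=True):
    """Frames for the n-th tilted revolution: compute the second angle
    X per claim 12, then scale the first-angle frame count by sin(X)
    and round up per claim 10."""
    # Claim 12: portrait posture uses HFOV, landscape posture uses VFOV.
    fov = vfov if landscape else hfov
    x_deg = fov * (1 - overlap) * n                       # second angle X in degrees
    scaled = frames_first_angle * math.sin(math.radians(x_deg))
    return math.ceil(scaled)                              # round up to minimum integer
```

For example, with 6 frames at the first angle, VFOV = 60, a 20% overlap, and N = 1 in the landscape posture, X = 48 degrees and ceil(6 * sin(48°)) = 5 frames suffice for that ring.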
13. An image quantity acquisition apparatus, characterized in that the apparatus comprises:
the data acquisition module is used for acquiring the field angle data of a camera module in the electronic equipment and the configuration posture of the camera module; the configuration posture refers to a transverse posture or a longitudinal posture corresponding to the electronic equipment when the electronic equipment shoots the image;
the revolution number acquisition module is used for acquiring, for the configuration posture, the number of revolutions of the camera module around a target object according to the field angle data, wherein the number of revolutions is the minimum number of times the camera module rotates;
the frame number acquisition module is used for acquiring the number of image frames corresponding to each revolution according to the field angle data and the configuration posture, wherein the frame number is the minimum number of images shot by the camera module in each revolution;
and the quantity acquisition module is used for acquiring the sum of the numbers of image frames over all revolutions, the sum being the minimum number of images required to synthesize the spherical panorama.
14. The apparatus of claim 13, wherein the revolution number acquisition module comprises:
the proportion acquisition submodule is used for acquiring, according to the preset overlapping degree, a first actual proportion of each frame of image used for synthesizing the spherical panorama;
the field angle acquisition submodule is used for acquiring a first actual field angle corresponding to the image according to the field angle data and the first actual proportion;
and the revolution number acquisition submodule is used for acquiring the number of revolutions according to a preset first shooting angle value and the first actual field angle.
15. The apparatus of claim 14, wherein the proportion acquisition submodule comprises:
the overlapping degree acquisition unit is used for acquiring the preset overlapping degree and a first shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the first shooting angle value is the maximum value of the shooting range in a first direction;
and the first proportion acquisition unit is used for subtracting the overlapping degree from 1 to obtain the first actual proportion.
16. The apparatus of claim 14, wherein the revolution number acquisition submodule comprises:
the quotient value calculation unit is used for calculating a quotient value of the first shooting angle value and the first actual field angle;
and the revolution number acquisition unit is used for rounding up the quotient value to obtain a minimum integer, the minimum integer being the number of revolutions.
17. The apparatus of claim 13, wherein the revolution number acquisition module is implemented by the following formula:
when the configuration posture is the transverse posture, the formula is as follows: Cir=A/(VFOV*(1-O));
or,
when the configuration posture is the longitudinal posture, the formula is as follows: Cir=A/(HFOV*(1-O));
wherein Cir represents the number of revolutions; A represents the first shooting angle value, which takes the value 180; HFOV represents the second field angle; VFOV represents the first field angle; and O represents the overlapping degree, which is matched with the fusion algorithm used to synthesize the spherical panorama.
18. The apparatus of claim 13, wherein when the shooting angle of the camera module with respect to the reference plane is a first angle, the frame number acquisition module comprises:
the proportion acquisition submodule is used for acquiring, according to the preset overlapping degree, a second actual proportion of each frame of image used for synthesizing the spherical panorama;
the field angle acquisition submodule is used for acquiring a second actual field angle corresponding to the image according to the field angle data and the second actual proportion;
and the frame number acquisition submodule is used for acquiring the frame number corresponding to the camera module at the first angle according to a preset second shooting angle value and the second actual field angle.
19. The apparatus of claim 18, wherein the proportion acquisition submodule comprises:
the overlapping degree acquisition unit is used for acquiring the preset overlapping degree and a second shooting angle value; the overlapping degree is the proportion of the overlapping part of two adjacent frames of images in the original image, and the second shooting angle value is the maximum value of the shooting range in a second direction;
and the proportion acquisition unit is used for subtracting the overlapping degree from 1 to obtain the second actual proportion.
20. The apparatus of claim 18, wherein the field angle acquisition submodule and the frame number acquisition submodule comprise:
the product acquisition unit is used for acquiring the product of the second actual proportion and the field angle data, the product being the second actual field angle corresponding to the image;
the quotient value calculation unit is used for calculating a quotient value of the second shooting angle value and the second actual field angle;
and the frame number acquisition unit is used for rounding up the quotient value to obtain a minimum integer, the minimum integer being the frame number.
21. The apparatus of claim 13, wherein when the shooting angle of the camera module with respect to the reference plane is the first angle, the frame number acquisition module is implemented by the following formula:
when the configuration posture is the transverse posture, the formula is as follows: Fr=B/(HFOV*(1-O));
or,
when the configuration posture is the longitudinal posture, the formula is as follows: Fr=B/(VFOV*(1-O));
in the formula, Fr represents the frame number; B represents the second shooting angle value, which takes the value 360; HFOV represents the second field angle; VFOV represents the first field angle; and O represents the overlapping degree, which is matched with the fusion algorithm used to synthesize the spherical panorama.
22. The apparatus of claim 13, wherein when the shooting angle of the camera module with respect to the reference plane is a second angle, the frame number acquisition module comprises:
the angle acquisition submodule is used for acquiring a preset second angle;
the sine value acquisition submodule is used for acquiring a sine value of the second angle;
the product acquisition submodule is used for acquiring the product of the corresponding frame number and the sine value;
and the frame number acquisition submodule is used for rounding up the product to obtain a minimum integer, the minimum integer being the frame number corresponding to the second angle.
23. The apparatus according to claim 22, wherein the second angle is any angle within a preset shooting angle range, or is calculated from the number of revolutions, the overlapping degree, and the field angle data.
24. The apparatus of claim 23, wherein the formula for calculating the second angle from the number of revolutions, the overlapping degree, and the field angle data is as follows:
when the configuration posture is the longitudinal posture, the second angle X is: X=HFOV*(1-O)*N;
or,
when the configuration posture is the transverse posture, the second angle X is: X=VFOV*(1-O)*N;
wherein X represents the second angle; HFOV represents the second field angle; VFOV represents the first field angle; O represents the overlapping degree, which is matched with the fusion algorithm used to synthesize the spherical panorama; and N represents the ordinal number of the revolution at which the image is shot when the camera module is tilted upwards or downwards.
25. An electronic device, comprising:
a processor;
a memory for storing a computer program executable by the processor;
wherein the processor is configured to execute the computer program in the memory to implement the method of any of claims 1 to 12.
26. A computer-readable storage medium, characterized in that the storage medium stores an executable computer program which, when executed by a processor, implements the method of any one of claims 1 to 12.
CN202110688027.4A 2021-06-21 2021-06-21 Image quantity acquisition method and device, electronic equipment and storage medium Active CN113438416B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110688027.4A CN113438416B (en) 2021-06-21 2021-06-21 Image quantity acquisition method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113438416A true CN113438416A (en) 2021-09-24
CN113438416B CN113438416B (en) 2022-12-09

Family

ID=77756830

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110688027.4A Active CN113438416B (en) 2021-06-21 2021-06-21 Image quantity acquisition method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113438416B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05100335A (en) * 1991-10-04 1993-04-23 Canon Inc Panoramic photographing device
EP2031561A1 (en) * 2007-08-27 2009-03-04 Samsung Electronics Co., Ltd. Method for photographing panoramic picture
US20140132788A1 (en) * 2012-11-09 2014-05-15 Sean Geoffrey Ramsay Systems and Methods for Generating Spherical Images
CN109362234A (en) * 2016-04-28 2019-02-19 深圳市大疆创新科技有限公司 System and method for obtaining Spherical Panorama Image
CN110049221A (en) * 2019-04-29 2019-07-23 维沃移动通信(杭州)有限公司 Image pickup method and mobile terminal
US20200177824A1 (en) * 2018-11-30 2020-06-04 Vecnos Inc. Image processing apparatus, image capturing apparatus, video reproducing system, method and program
JP2020096349A (en) * 2018-11-30 2020-06-18 ベクノス株式会社 Image processing device, imaging device, moving image reproduction system, method, and program
CN111415388A (en) * 2020-03-17 2020-07-14 Oppo广东移动通信有限公司 Visual positioning method and terminal
CN112150355A (en) * 2019-06-26 2020-12-29 华为技术有限公司 Image processing method and related equipment
CN112866583A (en) * 2020-12-30 2021-05-28 深圳追一科技有限公司 Data acquisition system, method, device, electronic equipment and storage medium



Similar Documents

Publication Publication Date Title
US9282242B2 (en) Method and electric device for taking panoramic photograph
US9712751B2 (en) Camera field of view effects based on device orientation and scene content
EP3010226B1 (en) Method and apparatus for obtaining photograph
CN106572299B (en) Camera opening method and device
EP3038345B1 (en) Auto-focusing method and auto-focusing device
US20100194860A1 (en) Method of stereoscopic 3d image capture using a mobile device, cradle or dongle
US20140132735A1 (en) Array camera, mobile terminal, and methods for operating the same
US10158798B2 (en) Imaging apparatus and method of controlling the same
EP2840445A1 (en) Photograph shooting method and electronic device
JPWO2013069047A1 (en) Image generating apparatus and image generating method
JPWO2013069049A1 (en) Image generating apparatus and image generating method
CN106503682B (en) Method and device for positioning key points in video data
JP7110443B2 (en) Shooting method and shooting device, electronic equipment, storage medium
EP4322520A1 (en) Photographing method and apparatus, and electronic device
CN105141942A (en) 3d image synthesizing method and device
CN109218709B (en) Holographic content adjusting method and device and computer readable storage medium
CN115134505B (en) Preview picture generation method and device, electronic equipment and storage medium
CN113438416B (en) Image quantity acquisition method and device, electronic equipment and storage medium
US11252341B2 (en) Method and device for shooting image, and storage medium
KR102557592B1 (en) Method and apparatus for displaying an image, electronic device and computer-readable storage medium
US9619016B2 (en) Method and device for displaying wallpaper image on screen
WO2023225910A1 (en) Video display method and apparatus, terminal device, and computer storage medium
CN116939351A (en) Shooting method, shooting device, electronic equipment and readable storage medium
KR20160149945A (en) Mobile terminal
JP2014127830A (en) Information processing device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant