CN112702506A - Shooting method, shooting device and electronic equipment

Shooting method, shooting device and electronic equipment

Info

Publication number: CN112702506A
Application number: CN201911012225.8A
Authority: CN (China)
Prior art keywords: image, focusing, scene, shot, preview image
Legal status: Pending
Priority date: 2019-10-23
Filing date: 2019-10-23
Publication date: 2021-04-23
Other languages: Chinese (zh)
Inventors: 杨宗保, 郑严
Current Assignee: Beijing Xiaomi Mobile Software Co Ltd
Original Assignee: Beijing Xiaomi Mobile Software Co Ltd
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201911012225.8A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N23/675: Focus control based on electronic image sensor signals comprising setting of focusing regions
    • H04N23/61: Control of cameras or camera modules based on recognised objects
    • H04N23/611: Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure relates to a shooting method, a shooting device, and an electronic device. The shooting method includes: acquiring a preview image of a scene to be shot, determining the focus position of the user's eyes on the preview image, and completing a focusing operation according to image information at the focus position. By collecting the position at which the user's eyes focus on the preview image and completing the focusing operation according to the image information at that position, the focus position for the scene to be shot can be changed according to the user's needs, enabling focus-following shooting. In addition, because the focus position is determined from where the user's eyes are looking on the preview image, the inconvenience of manually selecting a focus area and the interference of environmental factors with the focusing process are avoided, which improves the fluency, intelligence, and convenience of the focusing process and thus the focusing efficiency and user experience of an electronic device using the shooting method.

Description

Shooting method, shooting device and electronic equipment
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a shooting method, a shooting device, and an electronic apparatus.
Background
In the related art, electronic devices such as mobile phones generally provide a photographing function. Focusing on a scene to be shot is mainly achieved through manual or automatic focusing of the camera module. Manual focusing requires the user to select a focus area by hand, which reduces the smoothness and intelligence of the shooting operation. Automatic focusing, in turn, is limited by environmental parameters of the scene to be shot such as light contrast, brightness, and distance, which degrades the autofocus result and the overall shooting capability of the electronic device.
Disclosure of Invention
The disclosure provides a shooting method, a shooting device, and an electronic device that make the focusing process more robust to interference and improve the focusing effect and user experience of the electronic device.
According to a first aspect of the present disclosure, a shooting method is provided, where the shooting method is applied to a camera module, and the shooting method includes:
acquiring a preview image of a scene to be shot;
determining a focus position of a human eye on the preview image;
and finishing focusing operation aiming at the scene to be shot according to the image information at the focusing position.
Optionally, the determining the focus position of the human eye on the preview image includes:
acquiring images of eyes and eyeballs of a user;
determining the intersection position of the human eye sight line and the preview image according to the positions of the pupils of the two eyes in the eyeball image;
judging whether the intersection position meets a first preset condition, and if so, confirming the intersection position as the focusing position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
Optionally, the determining the focus position of the human eye on the preview image includes:
collecting a face image of a user;
determining a relative angle between a facial feature plane in the human face image and an imaging plane of the preview image; wherein the relative angle has a mapping relation with the corresponding position of the preview image;
judging whether the relative angle meets a second preset condition, if so, mapping the relative angle on a corresponding position on the preview image and confirming the corresponding position as the focusing position; if not, continuing to track the relative angle until the relative angle meets the second preset condition.
Optionally, the completing the focusing operation on the scene to be shot according to the image information at the focusing position includes:
acquiring the image definition at the focusing position;
and judging whether the image definition is greater than or equal to a first preset value, if so, finishing focusing operation on the scene to be shot, otherwise, controlling a camera lens for shooting the scene to be shot in the camera module to move until the image definition is greater than or equal to the first preset value.
Optionally, the performing a focusing operation according to the image information at the focusing position includes:
acquiring an image phase difference at the focusing position;
and judging whether the image phase difference is smaller than or equal to a second preset value, if so, finishing focusing operation on the scene to be shot, otherwise, controlling a camera lens for shooting the scene to be shot in the camera module to move until the image phase difference is smaller than or equal to the second preset value.
According to a second aspect of the present disclosure, a shooting device is provided, where the shooting device is applied to a camera module and includes:
the acquisition unit is used for acquiring a preview image of a scene to be shot;
a determination unit that determines a focus position of the human eye on the preview image;
and the focusing unit is used for finishing focusing operation aiming at the scene to be shot according to the image information at the focusing position.
Optionally, the determining unit includes:
the first acquisition module is used for acquiring the images of the eyes and the eyeballs of the two eyes of the user;
the first determining module is used for determining the intersection position of the sight line of the human eyes and the preview image according to the positions of the pupils of the two eyes in the eyeball image;
the first judgment module is used for judging whether the intersection position meets a first preset condition or not, and if so, confirming the intersection position as the focusing position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
Optionally, the determining unit includes:
the second acquisition module is used for acquiring a face image of the user;
the second determination module is used for determining the relative angle between a face feature plane in the face image and an imaging plane of the preview image; wherein the relative angle has a mapping relation with the corresponding position of the preview image;
the second judgment module is used for judging whether the relative angle meets a second preset condition or not, and if so, determining that the corresponding position of the relative angle mapped on the preview image is the focusing position; if not, continuing to track the relative angle until the relative angle meets the second preset condition.
Optionally, the focusing unit includes:
the first acquisition module is used for acquiring the image definition at the focusing position;
and the third judgment module is used for judging whether the image definition is greater than or equal to a first preset value or not, finishing focusing operation aiming at the scene to be shot if the image definition is greater than or equal to the first preset value, and controlling a camera lens for shooting the scene to be shot in the camera module to move if the image definition is not greater than the first preset value.
Optionally, the focusing unit includes:
the second acquisition module is used for acquiring the image phase difference at the focusing position;
and the fourth judgment module is used for judging whether the image phase difference is smaller than or equal to a second preset numerical value or not, finishing focusing operation aiming at the scene to be shot if the image phase difference is smaller than or equal to the second preset numerical value, and controlling a camera lens for shooting the scene to be shot in the camera module to move if the image phase difference is not smaller than or equal to the second preset numerical value.
According to a third aspect of the present disclosure, there is provided an electronic device, comprising:
the camera module is used for acquiring a scene image to be shot and a focusing position of human eyes on a preview image of the scene to be shot;
the screen module is electrically connected with the camera module to display the scene image to be shot and the focusing position;
a processor for performing the photographing method according to any one of claims 1 to 5.
Optionally, the electronic device further comprises a device body, the device body comprising a use surface facing a user and a back surface opposite to the use surface; the camera module comprises a front camera and a rear camera, wherein the lens of the front camera is matched with the use surface, and the lens of the rear camera is matched with the back surface.
Optionally, the camera module further includes a data processing module; the front camera corresponds to the rear camera in position and is respectively electrically connected with the data processing module.
Optionally, the data processing module includes a first processing chip and a second processing chip, the front-facing camera is electrically connected to the first processing chip, and the rear-facing camera is electrically connected to the second processing chip.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the shooting method described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the method and the device, the focusing position of the human eyes on the preview image of the scene to be shot is collected, and the focusing operation is completed according to the image information at the focusing position, so that the focusing position of the scene to be shot can be changed according to the requirements of a user, and focus-following shooting is realized. In addition, the focusing position of the scene to be shot is determined by collecting the focusing position of human eyes on the preview image, so that the problems of inconvenience and interference to the focusing process caused by manual focusing position selection and environmental factors are solved, the fluency, intelligence and operation convenience of the focusing process are improved, and the focusing efficiency and user experience of the electronic equipment using the shooting method are improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart of a method of capturing in an exemplary embodiment of the present disclosure;
fig. 2 is a flowchart of a photographing method in another exemplary embodiment of the present disclosure;
FIG. 3 is a flow chart of a photographing method in a further exemplary embodiment of the present disclosure;
FIG. 4 is a flow chart of a photographing method in yet another exemplary embodiment of the present disclosure;
FIG. 5 is a flow chart of a photographing method in yet another exemplary embodiment of the present disclosure;
fig. 6 is a block diagram of a photographing apparatus according to an exemplary embodiment of the present disclosure;
fig. 7 is a block diagram of a photographing apparatus according to another exemplary embodiment of the present disclosure;
fig. 8 is a block diagram of a photographing apparatus according to still another exemplary embodiment of the present disclosure;
fig. 9 is a block diagram of a photographing apparatus according to still another exemplary embodiment of the present disclosure;
fig. 10 is a block diagram of a photographing apparatus according to still another exemplary embodiment of the present disclosure;
FIG. 11 is a schematic diagram of an electronic device in an exemplary embodiment of the disclosure;
fig. 12 is a schematic view of a usage scenario of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the related art, electronic devices such as mobile phones generally provide a photographing function, and focusing on a scene to be shot is mainly achieved through manual or automatic focusing of the camera module. Manual focusing requires the user to select a focus area by hand, which reduces the smoothness and intelligence of the shooting operation. Automatic focusing, in turn, is limited by environmental parameters of the scene to be shot such as light contrast, brightness, and distance, which degrades the autofocus result and the overall shooting capability of the electronic device.
Fig. 1 is a flowchart of a photographing method in an exemplary embodiment of the present disclosure. As shown in fig. 1, the shooting method is applied to a camera module, and the shooting method can be implemented by the following steps:
in step S101, a preview image of a scene to be photographed is acquired.
In step S102, the focus position of the human eye on the preview image is determined.
There may be only one focus position of the user's eyes on the preview image, producing a shot with a single in-focus subject. Alternatively, a plurality of focus positions on the preview image may be recorded to produce a shot with multiple in-focus subjects. The present disclosure does not limit the number of focus positions.
Further, the focus position may be an imaging area or an imaging point on the preview image. For example, when a preview image of the scene to be shot is displayed on the screen associated with the camera module, a focus position that is an imaging area can be indicated by a rectangular frame displayed on the screen, and a focus position that is an imaging point can be indicated by the intersection of a cross symbol displayed on the screen.
In step S103, the focusing operation for the scene to be photographed is completed according to the image information at the focused position.
The image information at the focus position may be a parameter such as image sharpness or image phase difference, from which the focusing operation for the scene to be shot can be completed.
By collecting the position at which the user's eyes focus on the preview image of the scene to be shot and completing the focusing operation according to the image information at that position, the focus position for the scene to be shot can be changed according to the user's needs, enabling focus-following shooting. In addition, because the focus position is determined from where the user's eyes are looking on the preview image, the inconvenience of manually selecting a focus area and the interference of environmental factors with the focusing process are avoided. When the focus position needs to change during shooting, the user simply looks at a different place on the preview image, which improves the fluency, intelligence, and convenience of focus-following shooting and thus the focusing efficiency and user experience of an electronic device using the shooting method.
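For illustration only, steps S101 to S103 can be pictured as a simple loop running alongside the camera preview. The Python sketch below is a rough outline, not part of the disclosure; the camera and gaze-tracking calls (get_preview_frame, estimate_gaze_point, focus_at, is_previewing) are hypothetical placeholders rather than APIs defined by this patent.

```python
# Illustrative sketch of the shooting method (steps S101-S103), not a real device API.
# get_preview_frame(), estimate_gaze_point(), focus_at() and is_previewing() are
# hypothetical placeholders assumed for this example.

def eye_controlled_focus_loop(camera, gaze_tracker):
    while camera.is_previewing():
        # S101: acquire a preview image of the scene to be shot
        preview = camera.get_preview_frame()

        # S102: determine the focus position of the user's eyes on the preview image
        focus_pos = gaze_tracker.estimate_gaze_point(preview)  # (x, y) on the preview, or None

        # S103: complete the focusing operation using image information at that position
        if focus_pos is not None:
            camera.focus_at(preview, focus_pos)  # e.g. sharpness- or phase-difference-based focusing
```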
The following is an exemplary description of a specific manner of determining the focus position of the human eye on the preview image. Fig. 2 is a flowchart of a photographing method in another exemplary embodiment of the present disclosure, which may be implemented by the following steps in the embodiment shown in fig. 2:
in step S201, a preview image of a scene to be photographed is acquired.
In step S202, a user binocular eyeball image is acquired.
In step S203, the intersection position of the human eye sight line and the preview image is determined according to the positions of the pupils of both eyes in the eyeball image.
The eye images of the user's two eyes can be collected by a camera facing the user, and the intersection of the user's line of sight with the preview image can be calculated from the positions of the two pupils in those eye images together with the distance between the eyes and the imaging surface of the preview image. Specifically, when the gaze direction changes, the position of each pupil within its eyeball changes accordingly; the gaze direction of the two eyes can therefore be determined from the pupil positions, and the intersection of the line of sight with the preview image can then be calculated from the distance between the eyes and the imaging surface of the preview image.
Because the intersection of the line of sight with the preview image is obtained accurately by monitoring the pupil positions, the precision of eye-controlled focusing and the user experience are both improved.
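As a rough illustration of this geometry, the sketch below maps a pupil offset measured in the eye images to a gaze angle and projects that gaze onto the preview plane at the known viewing distance. The conversion constants, the screen center, and the per-user calibration it assumes are hypothetical example values, not quantities specified by the disclosure.

```python
import numpy as np

def gaze_point_on_preview(pupil_offset_px, eye_to_screen_mm,
                          px_per_rad=1200.0, screen_px_per_mm=18.0,
                          screen_center=(540, 1170)):
    """Estimate where the line of sight meets the preview imaging surface.

    pupil_offset_px: mean (dx, dy) offset of the two pupils from their calibrated
        "looking at the screen center" position, in eye-image pixels.
    eye_to_screen_mm: distance between the eyes and the preview imaging surface.
    The conversion factors are illustrative calibration values, not part of the patent.
    """
    # Pupil displacement is roughly proportional to gaze angle for small angles.
    gaze_angle = np.asarray(pupil_offset_px, dtype=float) / px_per_rad   # radians (yaw, pitch)

    # Project the gaze ray onto the screen plane at the known viewing distance.
    offset_mm = np.tan(gaze_angle) * eye_to_screen_mm                    # offset on screen, in mm
    offset_px = offset_mm * screen_px_per_mm                             # convert to screen pixels

    x = screen_center[0] + offset_px[0]
    y = screen_center[1] + offset_px[1]
    return (float(x), float(y))  # intersection position on the preview image
```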
In step S204, it is determined whether the intersection position satisfies a first preset condition, and if so, the intersection position is determined as a focus position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
Whether a given intersection position should be taken as the focus position can be determined by monitoring how long the user's line of sight stays there. That is, the first preset condition may be that the dwell time of the line of sight at the intersection position exceeds a preset duration. For example, if the gaze is judged to have stayed at a first position of the preview image for more than 3 seconds, that first position is determined to be the focus position.
Alternatively, whether an intersection position is the focus position can be determined by detecting a specified motion of the user's face or another part of the body, or by receiving specified audio. That is, the first preset condition may be that a specified motion or audio is detected while the gaze rests on the intersection position. For example, if the user looks at a second position of the preview image and then blinks twice in succession, nods, or shakes the head, or audio such as "focus position" is received, the second position is determined to be the focus position.
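A minimal sketch of the dwell-time variant of the first preset condition is given below, assuming gaze samples arrive with timestamps. The 3-second threshold and the tolerance radius are example values only, not values fixed by the disclosure.

```python
import math
import time

class DwellFocusSelector:
    """Confirm an intersection position as the focus position once the gaze has
    stayed within a small radius of it for longer than dwell_s seconds.
    The thresholds are illustrative, not specified by the disclosure."""

    def __init__(self, dwell_s=3.0, radius_px=60.0):
        self.dwell_s = dwell_s
        self.radius_px = radius_px
        self._anchor = None   # position the gaze is currently hovering around
        self._since = None    # timestamp when hovering at that position started

    def update(self, gaze_xy, now=None):
        now = time.monotonic() if now is None else now
        if self._anchor is None or math.dist(gaze_xy, self._anchor) > self.radius_px:
            # Gaze moved away: keep tracking and restart the dwell timer.
            self._anchor, self._since = gaze_xy, now
            return None
        if now - self._since >= self.dwell_s:
            return self._anchor  # first preset condition met: confirm as focus position
        return None              # condition not yet met: continue tracking
```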
In step S205, the focusing operation for the scene to be photographed is completed according to the image information at the focused position.
In another embodiment, fig. 3 is a flowchart of a photographing method in yet another exemplary embodiment of the present disclosure, which may be implemented by the following steps, as shown in fig. 3:
in step S301, a preview image of a scene to be photographed is acquired.
In step S302, a face image of the user is acquired.
In step S303, a relative angle between a face feature plane in the face image and an imaging plane of the preview image is determined, where the relative angle has a mapping relationship with a corresponding position of the preview image.
The user's face image can be collected by a camera facing the user. When the user looks at different positions of the preview image, the line of sight changes and the head rotates correspondingly (upward, downward, left, right, and so on), so a preset facial feature plane in the face image forms different relative angles with the imaging plane of the preview image. In principle, the corresponding position that can serve as the focus position can therefore be obtained from the mapping between the relative angle and positions on the preview image.
The facial feature plane may be a plane fitted by approximate calculation to any region of the face; it is used for calculating the relative angle with respect to the imaging plane of the preview image. For example, a partial region of the forehead can be approximately fitted to a plane that serves as the facial feature plane, and the corresponding position mapped onto the preview image can be obtained by monitoring the relative angle between this plane and the imaging plane of the preview image.
Obtaining the corresponding position on the preview image from a face image in this way reduces the data-processing difficulty and lowers the acquisition-precision requirements on the camera module, thereby reducing the overall cost of the camera module.
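Under the simplest assumption, the mapping from relative angle to a position on the preview image could be a calibrated linear relation between head yaw/pitch and screen coordinates. The sketch below illustrates that assumption only; the calibration range of plus or minus 20 degrees is a made-up example, not a value from the disclosure.

```python
def angle_to_preview_position(yaw_deg, pitch_deg, preview_w, preview_h,
                              max_yaw_deg=20.0, max_pitch_deg=20.0):
    """Map the relative angle between the facial feature plane and the preview
    imaging plane to a position on the preview image.

    Assumes a simple linear calibration: a relative angle of (0, 0) maps to the
    image center, and +/- max angles map to the image edges. The calibration
    limits are illustrative values, not specified by the disclosure.
    """
    # Normalize each angle into [-1, 1], clamping to the calibrated range.
    nx = max(-1.0, min(1.0, yaw_deg / max_yaw_deg))
    ny = max(-1.0, min(1.0, pitch_deg / max_pitch_deg))

    # Linear mapping to pixel coordinates on the preview image.
    x = (nx + 1.0) / 2.0 * (preview_w - 1)
    y = (ny + 1.0) / 2.0 * (preview_h - 1)
    return (x, y)

# Example: head turned 10 degrees right and 5 degrees down, 1080x2340 preview.
print(angle_to_preview_position(10.0, 5.0, 1080, 2340))
```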
In step S304, it is determined whether the relative angle satisfies a second preset condition, and if so, the corresponding position of the relative angle mapped on the preview image is determined as the focus position; if not, the relative angle is continuously tracked until the relative angle meets a second preset condition.
Whether the position corresponding to a given relative angle should be taken as the focus position can be determined by monitoring how long that relative angle is held. That is, the second preset condition may be that the relative angle is held for longer than a preset duration. For example, if the relative angle corresponds to a third position of the preview image and is judged to have been held for more than 3 seconds, that third position is determined to be the focus position.
Alternatively, whether the position corresponding to a relative angle is the focus position can also be determined by detecting a specified motion of the user's face or another part of the body, or by receiving specified audio. That is, the second preset condition may be that a specified motion or audio is detected while the relative angle corresponds to a fourth position of the preview image. For example, if the relative angle corresponds to a fourth position of the preview image and the user blinks twice in succession or audio such as "focus position" is received, the fourth position is determined to be the focus position.
In step S305, the focusing operation for the scene to be photographed is completed in accordance with the image information at the focused position.
Depending on the type of image information acquired at the focus position, the focusing operation for the scene to be shot can be completed in several ways. A specific focusing procedure is described below using the focus-position determination manner of the embodiment shown in fig. 2 as an example. In an embodiment, fig. 4 is a flowchart of a photographing method in still another exemplary embodiment of the present disclosure; as shown in fig. 4, the photographing method may be implemented by the following steps:
in step S401, a preview image of a scene to be photographed is acquired.
In step S402, a user binocular eye image is acquired.
In step S403, the intersection position of the human eye sight line and the preview image is determined according to the positions of the pupils of both eyes in the eyeball image.
The eye images of the user's two eyes can be collected by a camera facing the user, and the intersection of the user's line of sight with the preview image can be calculated from the positions of the two pupils in those eye images together with the distance between the eyes and the imaging surface of the preview image. Specifically, when the gaze direction changes, the position of each pupil within its eyeball changes accordingly; the gaze direction of the two eyes can therefore be determined from the pupil positions, and the intersection of the line of sight with the preview image can then be calculated from the distance between the eyes and the imaging surface of the preview image.
Because the intersection of the line of sight with the preview image is obtained accurately by monitoring the pupil positions, the precision of eye-controlled focusing and the user experience are both improved.
In step S404, it is determined whether the intersection position satisfies a first preset condition, and if so, the intersection position is determined as a focus position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
Whether a given intersection position should be taken as the focus position can be determined by monitoring how long the user's line of sight stays there. That is, the first preset condition may be that the dwell time of the line of sight at the intersection position exceeds a preset duration. For example, if the gaze is judged to have stayed at a first position of the preview image for more than 3 seconds, that first position is determined to be the focus position.
Alternatively, whether an intersection position is the focus position can be determined by detecting a specified motion of the user's face or another part of the body, or by receiving specified audio. That is, the first preset condition may be that a specified motion or audio is detected while the gaze rests on the intersection position. For example, if the user looks at a second position of the preview image and then blinks twice in succession, nods, or shakes the head, or audio such as "focus position" is received, the second position is determined to be the focus position.
In step S405, the image sharpness at the focus position is acquired.
In step S406, it is determined whether the image sharpness is greater than or equal to a first preset value, if so, the focusing operation for the scene to be photographed is completed, otherwise, the camera lens of the camera module for photographing the scene to be photographed is controlled to move until the image sharpness is greater than or equal to the first preset value.
Monitoring the image sharpness at the focus position keeps the data processing for the focusing operation simple and direct, which lowers the difficulty of the focusing operation and improves the reliability of the focusing result.
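As an illustration of sharpness-based (contrast) focusing, the sketch below scores the focus window with the variance of the Laplacian, a common sharpness measure, and nudges a hypothetical lens actuator until the score reaches the first preset value. The camera and lens-control interface (get_preview_frame, move_lens) and the threshold are assumptions for this example, not interfaces defined by the disclosure.

```python
import cv2

def sharpness(gray_roi):
    """Variance of the Laplacian: a simple, widely used image-sharpness measure."""
    return cv2.Laplacian(gray_roi, cv2.CV_64F).var()

def contrast_focus(camera, focus_xy, win=80, first_preset_value=150.0, max_steps=40):
    """Move the lens until sharpness in the focus window reaches the first preset value.

    camera.get_preview_frame() and camera.move_lens(step) are hypothetical
    placeholders for the device's preview stream and lens-motor interface.
    """
    x, y = int(focus_xy[0]), int(focus_xy[1])
    direction, prev = +1, -1.0
    for _ in range(max_steps):
        frame = camera.get_preview_frame()                        # BGR preview image
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        roi = gray[max(0, y - win):y + win, max(0, x - win):x + win]
        score = sharpness(roi)
        if score >= first_preset_value:
            return True                                           # focusing operation completed
        if score < prev:
            direction = -direction                                # sharpness got worse: search the other way
        camera.move_lens(step=direction)                          # move the lens and re-evaluate
        prev = score
    return False
```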
It should be noted that, when the focusing position determination method used in the shooting method in the embodiment shown in fig. 3 is adopted, the focusing method described above may also be used, and the present disclosure does not limit this. The first preset value may be set according to a specific focusing requirement, and the disclosure does not limit the first preset value.
Further, a specific focusing procedure for the scene to be shot is described below using the focus-position determination manner of the embodiment shown in fig. 3 as an example. In an embodiment, fig. 5 is a flowchart of a photographing method in still another exemplary embodiment of the present disclosure; as shown in fig. 5, the photographing method may be implemented by the following steps:
in step S501, a preview image of a scene to be photographed is acquired.
In step S502, a user face image is acquired;
in step S503, determining a relative angle between a face feature plane in a face image and an imaging plane of the preview image; wherein the relative angle has a mapping relation with the corresponding position of the preview image;
the user face image can be collected through the camera facing the user, and due to the fact that when the user observes different positions of the preview image, the sight line is changed and the head correspondingly rotates in the upward, downward, left, right and other directions, the preset face feature plane in the face image and the imaging plane of the preview image form different relative angles. Therefore, a corresponding position that can be a focused position can be theoretically obtained from the mapping relationship between the relative angle and the corresponding position of the preview image.
The face feature plane can be a functional plane which is fit-synthesized by approximate calculation at any position of a human face, and is used for calculating a relative angle between an imaging plane of the preview image and the face feature plane. For example, a partial region of the forehead of a human face can be approximately fitted to a plane as a face feature plane, and the corresponding position mapped on the preview image can be known by monitoring the relative angle between the plane and the imaging plane of the preview image.
By the method for acquiring the face image of the user, the corresponding position on the preview image can be acquired, so that the data processing difficulty is reduced, the acquisition precision requirement on the camera module is reduced, and the overall cost of the camera module is reduced.
In step S504, it is determined whether the relative angle satisfies a second preset condition, and if so, the relative angle is mapped to a corresponding position on the preview image to determine the corresponding position as a focus position; if not, the relative angle is continuously tracked until the relative angle meets a second preset condition.
Whether the position corresponding to a given relative angle should be taken as the focus position can be determined by monitoring how long that relative angle is held. That is, the second preset condition may be that the relative angle is held for longer than a preset duration. For example, if the relative angle corresponds to a third position of the preview image and is judged to have been held for more than 3 seconds, that third position is determined to be the focus position.
Alternatively, whether the position corresponding to a relative angle is the focus position can also be determined by detecting a specified motion of the user's face or another part of the body, or by receiving specified audio. That is, the second preset condition may be that a specified motion or audio is detected while the relative angle corresponds to a fourth position of the preview image. For example, if the relative angle corresponds to a fourth position of the preview image and the user blinks twice in succession or audio such as "focus position" is received, the fourth position is determined to be the focus position.
In step S505, an image phase difference at the focus position is acquired.
In step S506, it is determined whether the image phase difference is less than or equal to a second preset value, if so, the focusing operation for the scene to be shot is completed, otherwise, the camera lens of the camera module for shooting the scene to be shot is controlled to move until the image phase difference is less than or equal to the second preset value.
Completing the focusing operation for the scene to be shot by monitoring the image phase difference reduces the number of lens-movement cycles needed to approach the focal point, which shortens the focusing time and improves focusing efficiency.
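A heavily simplified sketch of phase-difference focusing follows: it estimates the shift between the left- and right-looking phase-detection signals by cross-correlation and moves a hypothetical lens proportionally until the phase difference is within the second preset value. Real phase-detection pipelines are sensor-specific; the read_pdaf_pair and move_lens calls, the gain, and the threshold are all assumptions for illustration only.

```python
import numpy as np

def phase_difference(left_signal, right_signal):
    """Estimate the shift (in pixels) between the two phase-detection signals by
    cross-correlation; the peak offset approximates the image phase difference."""
    left = left_signal - np.mean(left_signal)
    right = right_signal - np.mean(right_signal)
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)

def phase_detect_focus(camera, focus_xy, second_preset_value=1, gain=0.5, max_steps=10):
    """Move the lens until |phase difference| <= second preset value.

    camera.read_pdaf_pair(focus_xy) and camera.move_lens(step) are hypothetical
    placeholders for reading the phase-detection pixel pair at the focus position
    and driving the lens.
    """
    for _ in range(max_steps):
        left, right = camera.read_pdaf_pair(focus_xy)
        pd = phase_difference(left, right)
        if abs(pd) <= second_preset_value:
            return True                                   # phase difference within threshold: in focus
        camera.move_lens(step=int(round(-gain * pd)))     # sign and magnitude of pd give the correction
    return False
```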
Note that, when the focusing position determination method used in the shooting method in the embodiment shown in fig. 2 is adopted, the focusing method described above may be used, and the present disclosure does not limit this. Alternatively, for the focusing position determination manner adopted by the shooting method in the embodiment shown in fig. 2 and fig. 3, other image information at the focusing position may also be used to implement focusing operation, and the disclosure does not limit the specific focusing manner at the focusing position. In addition, the second preset value may be set according to a specific focusing requirement, and the disclosure does not limit the second preset value.
According to the above embodiment, the present disclosure further provides a shooting device applied to a camera module. Fig. 6 is a block diagram of a photographing apparatus according to an exemplary embodiment of the present disclosure, as shown in fig. 6, the photographing apparatus including: the device comprises an acquisition unit, a determination unit and a focusing unit. Wherein:
the acquisition unit is configured to acquire a preview image of a scene to be photographed.
The determination unit is configured to determine a focus position of the human eye on the preview image.
The focusing unit is configured to complete focusing operation for the scene to be shot according to the image information at the focusing position.
As shown in fig. 7, fig. 7 is a block diagram of a photographing apparatus according to another exemplary embodiment of the present disclosure. This embodiment is based on the embodiment shown in fig. 6, and the determining unit 62 includes a first acquisition module, a first determination module, and a first judgment module. Wherein:
the first acquisition module is configured to acquire images of eyes and eyeballs of both eyes of a user.
The first determination module is configured to determine the intersection position of the human eye sight line and the preview image according to the positions of the pupils of the two eyes in the eyeball image.
The first judging module is configured to judge whether the intersection position meets a first preset condition, and if so, the intersection position is confirmed as a focusing position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
As shown in fig. 8, fig. 8 is a block diagram of a photographing apparatus according to still another exemplary embodiment of the present disclosure. This embodiment is based on the embodiment shown in fig. 6, and the determining unit 62 includes a second acquisition module, a second determination module, and a second judgment module. Wherein:
the second acquisition module is configured to acquire a face image of the user;
the second determination module is configured to determine a relative angle between a face feature plane in the face image and an imaging plane of the preview image, wherein the relative angle has a mapping relation with a corresponding position of the preview image.
The second judging module is configured to judge whether the relative angle meets a second preset condition, and if so, the corresponding position of the relative angle mapped on the preview image is determined as a focusing position; if not, the relative angle is continuously tracked until the relative angle meets a second preset condition.
As shown in fig. 9, fig. 9 is a block diagram of a photographing apparatus according to still another exemplary embodiment of the present disclosure. This embodiment is based on the embodiment shown in fig. 6, and the focusing unit 63 includes a first acquisition module and a third judgment module. Wherein:
the first acquisition module is configured to acquire image sharpness at a focus position.
The third judging module is configured to judge whether the image definition is greater than or equal to a first preset value, if so, the focusing operation for the scene to be shot is completed, and if not, the camera lens for shooting the scene to be shot in the camera module is controlled to move until the image definition is greater than or equal to the first preset value.
As shown in fig. 10, fig. 10 is a block diagram of a photographing apparatus according to still another exemplary embodiment of the present disclosure. This embodiment is based on the embodiment shown in fig. 6, and the focusing unit 63 includes a second acquisition module and a fourth judgment module. Wherein:
the second acquisition module is configured to acquire an image phase difference at the focus position.
The fourth judging module is configured to judge whether the image phase difference is smaller than or equal to a second preset value, if so, the focusing operation aiming at the scene to be shot is completed, and if not, the camera lens for shooting the scene to be shot in the camera module is controlled to move until the image phase difference is smaller than or equal to the second preset value.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
The disclosure further proposes a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the photographing method. In an exemplary embodiment, the present disclosure provides a non-transitory computer-readable storage medium comprising instructions. For example, a memory including instructions that, when executed by a processor of an electronic device, implement the above-described photographing method of the present disclosure. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The present disclosure further proposes an electronic device, as shown in fig. 11, the electronic device 1 includes a camera module 11, a screen module 12, and a processor 13. The camera module 11 is configured to collect an image of a scene to be shot and a focus position of the human eye 2 on a preview image of the scene to be shot. The screen module 12 is electrically connected to the camera module 11 to display the image of the scene to be shot and the focusing position. The processor 13 is used to execute the shooting method in the above embodiments.
By collecting the focus position of the human eye 2 on the preview image of the scene to be shot and completing the focusing operation according to the image information at that position, the focus position for the scene to be shot can be changed according to the user's needs, enabling focus-following shooting. In addition, because the focus position is determined from where the human eye 2 is looking on the preview image, the inconvenience of manually selecting a focus area and the interference of environmental factors with the focusing process are avoided. When the focus position needs to change during shooting, the user simply looks at a different place on the preview image, which improves the fluency, intelligence, and convenience of focus-following shooting and thus the focusing efficiency and user experience of the electronic device 1 using the shooting method.
Further, in fig. 12 the solid arrows represent the lines of sight of the human eyes, and dotted lines of two different styles represent the light-collection ranges of the front camera and of the rear camera, respectively. As shown in figs. 11 and 12, the electronic device 1 further includes a device body 14, which has a use surface 141 facing the user and a back surface 142 opposite the use surface 141. The camera module 11 includes a front camera 111 and a rear camera 112; the lens of the front camera 111 is fitted to the use surface 141, and the lens of the rear camera 112 is fitted to the back surface 142. A biometric image of the user is acquired through the front camera 111, and the image of the scene to be shot is acquired through the rear camera 112. Having the front camera 111 and the rear camera 112 acquire their images independently and then work together reduces the data-processing burden on the camera module 11, and the lens orientations of the two cameras match the user's operating habits, making both the biometric image and the scene image convenient to capture.
Alternatively, the camera module 11 may implement the above shooting method using only the front camera 111 facing the user: the front camera 111 treats the scene on the user's side as the scene to be shot, collects the biometric image and the image of the scene to be shot at the same time, determines the focus position on the preview image of the scene to be shot from the biometric image, and then performs the focusing operation. This arrangement enriches the focusing and shooting functions of the front camera 111 and improves the user experience when shooting with the front camera.
Further, the camera module 11 also includes a data processing module 113; the front camera 111 and the rear camera 112 are located at corresponding positions and are each electrically connected to the data processing module 113. Placing the front camera 111 and the rear camera 112 at corresponding positions reduces the data-processing difficulty of determining the focus position of the human eye 2 on the preview image, and makes it convenient to centralize the structure and its connections.
The data processing module 113 may include a first processing chip (not labeled) and a second processing chip (not labeled); the front camera 111 is electrically connected to the first processing chip, and the rear camera 112 is electrically connected to the second processing chip. When at least one of the front camera 111 and the rear camera 112 has a high pixel-count requirement, the data-processing workload increases, so giving the data processing module 113 separate first and second processing chips improves data-processing efficiency and thus the efficiency of the focusing process and the user experience. Alternatively, the data processing module 113 may consist of a single processing chip to which the front camera 111 and the rear camera 112 are each electrically connected, so that data processing for both cameras is handled by that one chip, improving the integration of the data processing module 113 and reducing cost.
Further, the camera module 11 may also include a binocular-vision or structured-light module (not labeled) electrically connected to the front camera 111 to increase the accuracy with which the front camera 111 acquires the biometric image. The biometric image may be a binocular eye image or a face image; the disclosure is not limited in this respect.
The electronic device 1 according to the present disclosure may be a mobile phone, a tablet computer, a vehicle-mounted device, a medical terminal, and the like, and the present disclosure does not limit this.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (15)

1. A shooting method is applied to a camera module and comprises the following steps:
acquiring a preview image of a scene to be shot;
determining a focus position of a human eye on the preview image;
and finishing focusing operation aiming at the scene to be shot according to the image information at the focusing position.
2. The capture method of claim 1, wherein the determining the focus position of the human eye on the preview image comprises:
acquiring images of eyes and eyeballs of a user;
determining the intersection position of the human eye sight line and the preview image according to the positions of the pupils of the two eyes in the eyeball image;
judging whether the intersection position meets a first preset condition, and if so, confirming the intersection position as the focusing position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
3. The capture method of claim 1, wherein the determining the focus position of the human eye on the preview image comprises:
collecting a face image of a user;
determining a relative angle between a facial feature plane in the human face image and an imaging plane of the preview image; wherein the relative angle has a mapping relation with the corresponding position of the preview image;
judging whether the relative angle meets a second preset condition, if so, mapping the relative angle on a corresponding position on the preview image and confirming the corresponding position as the focusing position; if not, continuing to track the relative angle until the relative angle meets the second preset condition.
4. The shooting method according to claim 1, wherein the completing of the focusing operation for the scene to be shot according to the image information at the focusing position comprises:
acquiring the image definition at the focusing position;
and judging whether the image definition is greater than or equal to a first preset value, if so, finishing focusing operation on the scene to be shot, otherwise, controlling a camera lens for shooting the scene to be shot in the camera module to move until the image definition is greater than or equal to the first preset value.
5. The shooting method according to claim 1, wherein the performing a focusing operation according to the image information at the focusing position includes:
acquiring an image phase difference at the focusing position;
and judging whether the image phase difference is smaller than or equal to a second preset value, if so, finishing focusing operation on the scene to be shot, otherwise, controlling a camera lens for shooting the scene to be shot in the camera module to move until the image phase difference is smaller than or equal to the second preset value.
6. A shooting device, applied to a camera module, wherein the shooting device comprises:
the acquisition unit is used for acquiring a preview image of a scene to be shot;
a determination unit that determines a focus position of the human eye on the preview image;
and the focusing unit is used for finishing focusing operation aiming at the scene to be shot according to the image information at the focusing position.
7. The photographing apparatus according to claim 6, wherein the determination unit includes:
the first acquisition module is used for acquiring the images of the eyes and the eyeballs of the two eyes of the user;
the first determining module is used for determining the intersection position of the sight line of the human eyes and the preview image according to the positions of the pupils of the two eyes in the eyeball image;
the first judgment module is used for judging whether the intersection position meets a first preset condition or not, and if so, confirming the intersection position as the focusing position; if not, the intersection position is continuously tracked until the intersection position meets the first preset condition.
8. The photographing apparatus according to claim 6, wherein the determination unit includes:
the second acquisition module is used for acquiring a face image of the user;
the second determination module is used for determining the relative angle between a face feature plane in the face image and an imaging plane of the preview image; wherein the relative angle has a mapping relation with the corresponding position of the preview image;
the second judgment module is used for judging whether the relative angle meets a second preset condition or not, and if so, determining that the corresponding position of the relative angle mapped on the preview image is the focusing position; if not, continuing to track the relative angle until the relative angle meets the second preset condition.
9. The photographing apparatus according to claim 6, wherein the focusing unit includes:
the first acquisition module is used for acquiring the image definition at the focusing position;
and the third judgment module is used for judging whether the image definition is greater than or equal to a first preset value or not, finishing focusing operation aiming at the scene to be shot if the image definition is greater than or equal to the first preset value, and controlling a camera lens for shooting the scene to be shot in the camera module to move if the image definition is not greater than the first preset value.
10. The photographing apparatus according to claim 6, wherein the focusing unit includes:
the second acquisition module is used for acquiring the image phase difference at the focusing position;
and the fourth judgment module is used for judging whether the image phase difference is smaller than or equal to a second preset numerical value or not, finishing focusing operation aiming at the scene to be shot if the image phase difference is smaller than or equal to the second preset numerical value, and controlling a camera lens for shooting the scene to be shot in the camera module to move if the image phase difference is not smaller than or equal to the second preset numerical value.
11. An electronic device, comprising:
the camera module is used for acquiring a scene image to be shot and a focusing position of human eyes on a preview image of the scene to be shot;
the screen module is electrically connected with the camera module to display the scene image to be shot and the focusing position;
a processor for performing the photographing method according to any one of claims 1 to 5.
12. The electronic device according to claim 11, further comprising a device body including a use face that faces a user and a back face opposite to the use face; the camera module comprises a front camera and a rear camera, wherein the lens of the front camera is matched with the use surface, and the lens of the rear camera is matched with the back surface.
13. The electronic device of claim 12, wherein the camera module further comprises a data processing module; the front camera corresponds to the rear camera in position and is respectively electrically connected with the data processing module.
14. The electronic device of claim 13, wherein the data processing module comprises a first processing chip and a second processing chip, the front camera is electrically connected to the first processing chip, and the rear camera is electrically connected to the second processing chip.
15. A computer-readable storage medium having computer instructions stored thereon which, when executed by a processor, implement the steps of the shooting method according to any one of claims 1 to 5.
CN201911012225.8A 2019-10-23 2019-10-23 Shooting method, shooting device and electronic equipment Pending CN112702506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911012225.8A CN112702506A (en) 2019-10-23 2019-10-23 Shooting method, shooting device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911012225.8A CN112702506A (en) 2019-10-23 2019-10-23 Shooting method, shooting device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112702506A true CN112702506A (en) 2021-04-23

Family

ID=75505049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911012225.8A Pending CN112702506A (en) 2019-10-23 2019-10-23 Shooting method, shooting device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112702506A (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101523896A (en) * 2006-10-02 2009-09-02 索尼爱立信移动通讯有限公司 Focused areas in an image
CN103248822A (en) * 2013-03-29 2013-08-14 东莞宇龙通信科技有限公司 Focusing method of camera shooting terminal and camera shooting terminal
CN103338331A (en) * 2013-07-04 2013-10-02 上海斐讯数据通信技术有限公司 Image acquisition system adopting eyeballs to control focusing
CN103780839A (en) * 2014-01-21 2014-05-07 宇龙计算机通信科技(深圳)有限公司 Shooting method and terminal
CN103795926A (en) * 2014-02-11 2014-05-14 惠州Tcl移动通信有限公司 Method, system and photographing device for controlling photographing focusing by means of eyeball tracking technology
CN103905709A (en) * 2012-12-25 2014-07-02 联想(北京)有限公司 Electronic device control method and electronic device
CN104460185A (en) * 2014-11-28 2015-03-25 小米科技有限责任公司 Automatic focusing method and device
US20150095815A1 (en) * 2013-09-27 2015-04-02 International Business Machines Corporation Method and system providing viewing-angle sensitive graphics interface selection compensation
CN105208273A (en) * 2015-09-24 2015-12-30 宇龙计算机通信科技(深圳)有限公司 Method and device for taking photos through dual-camera terminal and dual-camera terminal
CN105279459A (en) * 2014-11-20 2016-01-27 维沃移动通信有限公司 Terminal anti-peeping method and mobile terminal
CN105704369A (en) * 2016-01-20 2016-06-22 努比亚技术有限公司 Information-processing method and device, and electronic device
CN106231185A (en) * 2016-08-01 2016-12-14 乐视控股(北京)有限公司 The photographic method of a kind of intelligent terminal and device
CN106296264A (en) * 2016-07-28 2017-01-04 河海大学常州校区 A kind of pushing intelligent advertisements system based on recognition of face
CN106470308A (en) * 2015-08-18 2017-03-01 联想(北京)有限公司 Image processing method and electronic equipment
CN109271914A (en) * 2018-09-07 2019-01-25 百度在线网络技术(北京)有限公司 Detect method, apparatus, storage medium and the terminal device of sight drop point
CN110177210A (en) * 2019-06-17 2019-08-27 Oppo广东移动通信有限公司 Photographic method and relevant apparatus
CN110225252A (en) * 2019-06-11 2019-09-10 Oppo广东移动通信有限公司 Camera control method and Related product

Similar Documents

Publication Publication Date Title
JP6522708B2 (en) Preview image display method and apparatus, and terminal
US10165176B2 (en) Methods, systems, and computer readable media for leveraging user gaze in user monitoring subregion selection systems
US10609273B2 (en) Image pickup device and method of tracking subject thereof
CN108076278B (en) Automatic focusing method and device and electronic equipment
CN105339841B (en) The photographic method and bimirror head apparatus of bimirror head apparatus
CN110738142A (en) method, system and storage medium for self-adaptively improving face image acquisition
CN103747183B (en) Mobile phone shooting focusing method
US9300858B2 (en) Control device and storage medium for controlling capture of images
CN108833795B (en) Focusing method and device of image acquisition equipment
CN103297696A (en) Photographing method, photographing device and photographing terminal
EP3651457B1 (en) Pupillary distance measurement method, wearable eye equipment and storage medium
CN106331498A (en) Image processing method and image processing device used for mobile terminal
CN108200340A (en) The camera arrangement and photographic method of eye sight line can be detected
CN108510540A (en) Stereoscopic vision video camera and its height acquisition methods
CN112666705A (en) Eye movement tracking device and eye movement tracking method
KR20170011362A (en) Imaging apparatus and method for the same
CN112738388B (en) Photographing processing method and system, electronic device and storage medium
JPWO2019021601A1 (en) Information processing apparatus, information processing method, and program
CN112954296B (en) Binocular vision-based fundus focusing method, binocular vision-based fundus focusing system and storage medium
CN108282650B (en) Naked eye three-dimensional display method, device and system and storage medium
CN105787435A (en) Indication method and apparatus for iris acquisition
CN113395438B (en) Image correction method and related device for eyeball tracking technology
CN108156387A (en) Terminate the device and method of camera shooting automatically by detecting eye sight line
CN109756663B (en) AR device control method and device and AR device
CN112702506A (en) Shooting method, shooting device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20210423)