CN114401340B - Collaborative shooting method, electronic equipment and medium thereof - Google Patents


Info

Publication number
CN114401340B
CN114401340B (application number CN202111682873.1A)
Authority
CN
China
Prior art keywords
screen
collaborative
shooting
mobile phone
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111682873.1A
Other languages
Chinese (zh)
Other versions
CN114401340A (en)
Inventor
刘智鹍
聂星星
王拣贤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111682873.1A priority Critical patent/CN114401340B/en
Priority to CN202311176831.XA priority patent/CN117424954A/en
Publication of CN114401340A publication Critical patent/CN114401340A/en
Application granted granted Critical
Publication of CN114401340B publication Critical patent/CN114401340B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04M — TELEPHONIC COMMUNICATION
                • H04M 1/00 — Substation equipment, e.g. for use by subscribers
                    • H04M 1/72 — Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
                        • H04M 1/724 — User interfaces specially adapted for cordless or mobile telephones
                            • H04M 1/72403 — with means for local support of applications that increase the functionality
                                • H04M 1/72427 — for supporting games or graphical animations
                    • H04M 1/02 — Constructional features of telephone sets
                        • H04M 1/0202 — Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
                            • H04M 1/0206 — Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
                                • H04M 1/0208 — characterized by the relative motions of the body parts
                                    • H04M 1/0214 — Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
                            • H04M 1/026 — Details of the structure or mounting of specific components
                                • H04M 1/0264 — for a camera module assembly
                                • H04M 1/0266 — for a display module assembly
                                    • H04M 1/0268 — including a flexible display panel
                • H04M 2250/00 — Details of telephonic subscriber devices
                    • H04M 2250/16 — including more than one display unit
                    • H04M 2250/52 — including functional features of a camera
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N 23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
                    • H04N 23/60 — Control of cameras or camera modules
                        • H04N 23/63 — Control of cameras or camera modules by using electronic viewfinders
                            • H04N 23/631 — Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to the field of electronic technologies, and in particular, to a collaborative shooting method, an electronic device, and a medium thereof. With the collaborative shooting method, when a photographer uses a folding-screen mobile phone to photograph a subject, the subject can intuitively see the shooting effect through the mirror image picture displayed on the outer screen of the mobile phone. Because the mirror image picture displayed on the outer screen is obtained by mirroring the picture displayed on the inner screen of the mobile phone, the picture the subject sees on the outer screen is similar to what the subject would see in a mirror, so the subject can directly adjust his or her posture and position according to the picture displayed on the outer screen to obtain a better shooting effect.

Description

Collaborative shooting method, electronic equipment and medium thereof
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a collaborative shooting method, an electronic device, and a medium thereof.
Background
At present, when people take pictures with the rear camera of a folding-screen mobile phone, the person being photographed (the subject) cannot view the shooting preview through the outer screen of the phone. The subject therefore cannot intuitively see his or her own imaging effect in the rear camera and can only move or adjust his or her pose according to commands or prompts from others. For example, as shown in fig. 1 (A), the photographer 10 photographs the subject 20 using the folding-screen mobile phone 100, and the outer screen A of the folding-screen mobile phone 100 displays a standby screen (such as the date and time), so the subject 20 cannot view the shooting preview through the outer screen A of the folding-screen mobile phone 100, resulting in a poor user experience.
Disclosure of Invention
The embodiment of the application provides a collaborative shooting method, electronic equipment and a medium thereof.
In a first aspect, an embodiment of the present application provides a collaborative shooting method applied to an electronic device. The electronic device includes a first camera, and a first screen (e.g., the inner screen of a mobile phone) and a second screen (e.g., the outer screen of a mobile phone) located on different sides of the electronic device. The first camera and the first screen are located on different sides of the electronic device (e.g., a phone's rear camera and its inner screen are on different sides of the phone), while the first camera and the second screen are located on the same side of the electronic device (e.g., the phone's rear camera and its outer screen are on the same side of the phone);
the method comprises the following steps:
the first screen displays a first shooting picture shot by the first camera and detects a first instruction;
in response to the first instruction, the second screen displays a second shooting picture, where the second shooting picture is a mirror image of the first shooting picture, and the first shooting picture and the second shooting picture have the same size ratio.
It is understood that the present application does not limit the order in which the first screen displays the first shooting picture and the electronic device detects the first instruction. That is, the first screen may display the first shooting picture captured by the first camera after the electronic device detects the first instruction; for example, the first shooting picture may appear on the first screen after the electronic device detects a user instruction to start the collaborative shooting function. Alternatively, the first screen may already display the first shooting picture before the first instruction is detected; for example, the user may first open the camera application of the mobile phone and only later issue the instruction to start the collaborative shooting function.
In this method, during shooting, the electronic device can display the first shooting picture on the first screen according to the user instruction while displaying the second shooting picture on the second screen. Since the second shooting picture is a mirror image of the first shooting picture, the subject can intuitively see his or her own imaging effect through the second shooting picture on the second screen, which improves the user's shooting experience.
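The mirroring relationship between the two pictures can be sketched as a horizontal flip of each preview frame. The sketch below is illustrative only (the application does not specify an implementation); a frame is modeled as rows of pixel values:

```python
def mirror_frame(frame):
    """Return the horizontally mirrored copy of a frame.

    `frame` is a list of rows (top to bottom); each row is a list of
    pixel values (left to right). Mirroring reverses each row, so the
    width, height, and therefore the size ratio are unchanged.
    """
    return [list(reversed(row)) for row in frame]


first_picture = [
    [1, 2, 3],
    [4, 5, 6],
]
second_picture = mirror_frame(first_picture)
# The second picture is left-right symmetrical to the first,
# and both have the same dimensions.
print(second_picture)  # [[3, 2, 1], [6, 5, 4]]
```

Note that applying the flip twice recovers the original frame, consistent with the second shooting picture being a pure mirror image of the first.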
In one possible implementation manner of the first aspect, the first screen displays a display interface of the camera application, and the display interface of the camera application includes the first shooting picture.
In a possible implementation manner of the first aspect, the display interface of the camera application further includes a first control, and in a case that a first operation of the first control by the user is detected, a collaborative shooting instruction is detected.
In some embodiments, the first control may be a button.
That is, the user may initiate a collaborative shooting function of the electronic device according to the first control on the camera application display interface. In some possible implementations, the first operation may be that the user clicks the first control, or a voice instruction input by the user to the first control, or the like, which is not limited by the present application.
In a possible implementation manner of the first aspect, the camera application includes a plurality of shooting modes, and
in response to the first instruction, the second screen displays a second photographing screen including:
and controlling the second screen to display a second shooting picture in the case that the camera supports the collaborative shooting function in the current shooting mode.
That is, a given shooting mode of the camera application may not support the collaborative shooting function, in which case the second screen cannot display the second shooting picture. Therefore, before the second screen displays the second shooting picture, it must be determined that the current mode of the camera application supports the collaborative shooting function, so that collaborative display of the first screen and the second screen can be achieved.
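The mode check described above can be sketched as a simple lookup. The mode names and the exact set of supported modes below are assumptions for illustration (the application lists photographing, video recording, portrait, and movie as candidate modes):

```python
# Hypothetical set of shooting modes that support collaborative shooting.
COLLABORATIVE_MODES = {"photo", "video", "portrait", "movie"}


def should_display_second_screen(current_mode: str, instruction_received: bool) -> bool:
    """Display the mirrored second picture only if the first instruction
    was received AND the current shooting mode supports collaboration."""
    return instruction_received and current_mode in COLLABORATIVE_MODES


print(should_display_second_screen("photo", True))   # True
print(should_display_second_screen("selfie", True))  # False: mode unsupported
print(should_display_second_screen("photo", False))  # False: no instruction yet
```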
In a possible implementation manner of the first aspect, the shooting mode includes one or more of a photographing mode, a video recording mode, a portrait mode, or a movie mode.
In a possible implementation manner of the first aspect, the method further includes:
detecting that a shooting mode of a camera application is switched from a first shooting mode supporting a collaborative shooting function to a second shooting mode not supporting the collaborative shooting function;
the first control is not displayed on the display interface of the camera application.
It will be appreciated that the first control is used to enable or disable the collaborative shooting function, so when a given camera mode does not support the collaborative shooting function there is naturally no need to continue displaying the first control on the camera application interface. Accordingly, when the shooting mode of the camera application is switched from a mode that supports the collaborative shooting function to one that does not, the camera interface may stop displaying the first control.
In addition, not displaying the first control in shooting modes that do not support the collaborative shooting function effectively prevents problems caused by the user accidentally touching the first control in an unsupported mode, such as the camera application showing a black screen.
In a possible implementation manner of the first aspect, the electronic device further includes a collaborative photographing function module, a first screen display module, and a second screen display module; and,
the first instruction is detected by the collaborative photographing function module; and
In response to the first instruction, the second screen displays a second photographing screen including:
the second screen display module receives a first instruction from the collaborative photographing function module through the first screen display module;
the second screen display module controls the second screen to display a second shooting picture in response to receiving the first instruction.
That is, the display content of the first screen is controlled by the first screen display module, and the display content of the second screen is controlled by the second screen display module. The electronic device detects the first instruction through the collaborative photographing function module; after detecting it, the collaborative photographing function module sends the first instruction to the second screen display module through the first screen display module, so as to achieve collaborative display between the second screen display module and the first screen display module.
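The relay described above — the collaborative module detects the instruction and forwards it via the first-screen display module to the second-screen display module — can be sketched with three minimal classes. All names here are hypothetical stand-ins, not the actual module APIs:

```python
class SecondScreenDisplay:
    """Controls the second (outer) screen."""
    def __init__(self):
        self.showing_mirror = False

    def on_instruction(self):
        # On receiving the first instruction, start showing the mirror picture.
        self.showing_mirror = True


class FirstScreenDisplay:
    """Controls the first (inner) screen and relays instructions onward."""
    def __init__(self, second_screen: SecondScreenDisplay):
        self.second_screen = second_screen

    def forward_instruction(self):
        self.second_screen.on_instruction()


class CollaborativeModule:
    """Detects the user's collaborative-shooting instruction."""
    def __init__(self, first_screen: FirstScreenDisplay):
        self.first_screen = first_screen

    def detect_instruction(self):
        # Hand the detected instruction to the first-screen display module,
        # which relays it to the second-screen display module.
        self.first_screen.forward_instruction()


second = SecondScreenDisplay()
CollaborativeModule(FirstScreenDisplay(second)).detect_instruction()
print(second.showing_mirror)  # True
```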
In a possible implementation manner of the first aspect, the electronic device further includes a view module and an image composition module, and
the second screen display module controls the second screen to display a second shooting picture in response to receiving the first instruction, and the second screen display module comprises:
the second screen display module is used for responding to the first instruction, acquiring a layer bearing a second shooting picture from the view module, and displaying the layer bearing the second shooting picture on the second screen, wherein the second shooting picture borne in the layer is acquired from the image synthesis module by the view module, and the second shooting picture is obtained by carrying out mirror image processing on the first shooting picture by the image synthesis module.
In some embodiments, the above view module may be the View framework described below, and the image composition module may be the SurfaceFlinger module described below.
In a possible implementation manner of the first aspect, the electronic device further includes a preview data stream module that stores the data of the first shooting picture; and
The image synthesis module generates a second photographed picture by:
the image composition module acquires a copy of the data of the first shooting picture from the preview data stream module, and mirrors the copied data to obtain the data of the second shooting picture.
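The two-step generation of the second picture — copy the preview data, then mirror the copy — can be sketched as below. Copying first ensures the inner-screen preview stream itself is left untouched. The class and function names are illustrative, not the actual module interfaces:

```python
import copy


class PreviewDataStream:
    """Holds the frame data of the first (inner-screen) shooting picture."""
    def __init__(self, frame):
        self.frame = frame

    def copy_frame(self):
        # Hand out a deep copy so the original preview data is never mutated.
        return copy.deepcopy(self.frame)


def synthesize_second_picture(stream: PreviewDataStream):
    """Image-composition step: take a copy of the preview frame and
    mirror it horizontally to produce the second shooting picture."""
    frame_copy = stream.copy_frame()
    return [list(reversed(row)) for row in frame_copy]


stream = PreviewDataStream([[1, 2], [3, 4]])
second = synthesize_second_picture(stream)
print(second)        # [[2, 1], [4, 3]]
print(stream.frame)  # original preview data unchanged: [[1, 2], [3, 4]]
```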
In a second aspect, embodiments of the present application also provide an electronic device including a memory and a processor coupled to the memory, where the memory stores computer program instructions that, when executed by the processor, cause the electronic device to implement the collaborative shooting method of any one of the first aspect above.
In a third aspect, an embodiment of the present application further provides a computer readable storage medium, where a computer program is stored, where the computer program is executed by a processor to implement the collaborative shooting method according to any one of the first aspect above.
In a fourth aspect, an embodiment of the present application further provides a computer program product, which when run on an electronic device, causes the electronic device to perform the collaborative shooting method according to any one of the first aspect above.
It will be appreciated that the advantages of the second to fourth aspects may be found in the relevant description of the first aspect and are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 shows an application scenario diagram of the present solution;
fig. 2 shows two states of the mobile phone 100, in which fig. 2 (A) shows a schematic view of the outer-screen side of the folding-screen mobile phone 100 in an unfolded state, fig. 2 (B) shows a schematic view of the inner-screen side of the folding-screen mobile phone 100 in an unfolded state, and fig. 2 (C) shows a schematic view of the folding-screen mobile phone 100 in a folded state;
Fig. 3 shows a schematic diagram of an internal and external screen display interface of a mobile phone, where fig. 3 (a) and fig. 3 (B) respectively show an example of an internal and external screen display interface of a collaborative display of internal and external screens provided by an embodiment of the present application;
fig. 4 shows a schematic diagram of an internal and external screen display interface of a mobile phone with internal and external screen collaborative display, where fig. 4 (a) and fig. 4 (B) respectively show an example of an internal and external screen display interface schematic diagram of internal and external screen collaborative display provided by an embodiment of the present application;
fig. 5 shows a schematic diagram of an internal and external screen display interface of a mobile phone with internal and external screen collaborative display, where fig. 5 (a) and fig. 5 (B) respectively show an example of an internal and external screen display interface schematic diagram of internal and external screen collaborative display provided by an embodiment of the present application;
fig. 6 shows a schematic diagram of an internal and external screen display interface of a mobile phone with internal and external screen collaborative display, where fig. 6 (a) and fig. 6 (B) respectively show an example of an internal and external screen display interface schematic diagram of internal and external screen collaborative display provided by an embodiment of the present application;
fig. 7 shows a flow diagram of the collaborative shooting method of the present application;
FIG. 8 is a schematic diagram of a mirroring method according to an embodiment of the application;
fig. 9 is a diagram showing a software architecture of the mobile phone 100 according to the embodiment of the present application;
FIG. 10 is a schematic diagram showing interactions between the various software structures of FIG. 9 in cooperation to implement the method of the present application;
Fig. 11 is a schematic diagram showing a hardware structure of a mobile phone 100 according to an embodiment of the present application.
Detailed Description
Various aspects of the illustrative embodiments will be described using terms commonly employed by those skilled in the art.
It should be noted that the collaborative shooting method of the present application is applicable to any multi-screen electronic device that has a shooting function and screens on opposite sides, such as a folding-screen mobile phone, a tablet computer, an intelligent wearable device, and the like; the present application is not limited in this regard. Hereinafter, for convenience of description, the camera application in the folding-screen mobile phone 100 is taken as an example, and the folding-screen mobile phone 100 is simply referred to as the mobile phone 100. It can be understood that the technical solution of the present application is also applicable to other applications with shooting functions on electronic devices, and is not limited to camera applications.
It will be appreciated that one configuration of the mobile phone 100 may be as shown in fig. 2, where figs. 2 (A)-2 (B) show the unfolded state of the mobile phone 100 and fig. 2 (C) shows the folded state. Specifically, as shown in fig. 2 (A), the mobile phone 100 has an outer screen A on which information such as the current time, network state, and battery level can be displayed, and a camera P1 is provided on the outer screen A. Fig. 2 (B) shows the inner screen of the mobile phone 100; it can be seen that when the mobile phone 100 is in the unfolded state, the inner screen can be divided into an inner screen area B-1 and an inner screen area B-2. The camera P2 may be disposed on the inner screen area B-1 or the inner screen area B-2 of the mobile phone 100, which is not limited in the present application.
Fig. 2 (C) shows a schematic view of the mobile phone 100 in the folded state; it can be seen that the inner screen areas B-1 and B-2 are no longer exposed, while the outer screen A faces outward.
In the above, for convenience of description, the inner screen area B-1 and the inner screen area B-2 of the mobile phone 100 are collectively referred to as the inner screen B, and the unfolding angle and unfolding state of the inner screen B are referred to as the unfolding angle and unfolding state of the mobile phone 100. For example, in fig. 2 (B) the inner screen B is in the unfolded state, so the mobile phone 100 is also in the unfolded state; in fig. 2 (C) the mobile phone 100 is in the folded state, so the inner screen B is also in the folded state.
In addition, for convenience of description, the following description assumes that the photographer faces the inner screen B and the subject faces the outer screen A; correspondingly, the camera P1 is referred to as the rear camera P1, and the camera P2 is referred to as the front camera P2. It will be appreciated that the collaborative photographing method of the present application is also applicable to a scenario in which the photographer faces the outer screen A and the subject faces the inner screen B, in which case the camera P1 serves as the front camera and the camera P2 serves as the rear camera.
As described in the background art, when a person takes a picture using the rear camera P1 of the mobile phone 100, the subject cannot intuitively see his or her own imaging effect, which affects the subject's user experience. To solve this technical problem, some embodiments of the present application provide a collaborative shooting method suitable for folding-screen electronic devices.
Specifically, in the collaborative shooting method according to some embodiments of the present application, the shooting picture displayed on the inner screen B of the mobile phone 100 is mirrored and then displayed at the same proportion on the outer screen A of the folding-screen mobile phone 100 (the mirrored picture is hereinafter referred to as the mirror image picture). In this way, during shooting, the subject can watch the picture currently captured by the camera P1 through the outer screen A of the folding-screen mobile phone 100, which makes it convenient for the subject to adjust his or her pose. The method for mirroring the shooting picture and displaying it at the same proportion on the inner screen B and the outer screen A is described in detail below.
For example, as shown in fig. 1 (B), when the photographer 10 photographs the subject 20 with the rear camera P1 of the mobile phone 100, the shooting picture 101a captured by the camera P1 is displayed on the inner screen B facing the photographer, and the mirror image picture 101b of the shooting picture 101a (which is left-right symmetrical to the shooting picture 101a) is displayed on the outer screen A of the folding-screen mobile phone 100 facing the subject. As described above, the mirror image picture 101b displayed on the outer screen A corresponds to the subject's own view in a mirror; that is, the information the subject 20 sees in the mirror image picture 101b on the outer screen A, such as the subject's posture and position within the whole picture, is the same as the information the photographer 10 sees in the shooting picture 101a on the inner screen. Moreover, so that the complete mirror image of the shooting picture 101a can be displayed on the outer screen A, the shooting picture 101a and the mirror image picture 101b have the same size ratio (e.g., aspect ratio).
In this way, when the photographer uses the folding-screen mobile phone 100 to photograph the subject, the subject can intuitively see the shooting effect through the mirror image picture displayed on the outer screen A. Because the mirror image picture displayed on the outer screen A is the mirrored version of the picture displayed on the inner screen B, the picture the subject sees on the outer screen A is similar to what the subject would see in a mirror, so the subject can directly adjust his or her posture and position according to the picture displayed on the outer screen A to obtain a better shooting effect.
It can be understood that the user can turn on the collaborative shooting function by clicking a collaborative-shooting-related button in the camera application interface, so that shooting is performed using the collaborative shooting method of the present application. The user may also turn on the collaborative shooting function by means of a voice command or the like, and the present application is not limited in this regard.
For example, fig. 3 (A) and 3 (B) illustrate one implementation in which the mobile phone 100 turns on the collaborative photographing function. As shown in fig. 3 (A), after the photographer 10 opens the camera application, the inner screen B of the mobile phone 100 displays the camera application interface, which includes the currently captured picture and function buttons located on both sides of the interface; from top to bottom these include a settings button, a brightness adjustment button, a flash button, an intelligent recognition button, a beauty button, and a collaborative photographing button 102. After the user clicks the collaborative photographing button 102, the mobile phone 100 starts the collaborative photographing function, and the outer screen A of the mobile phone 100 displays the mirror image picture 101b of the shooting picture 101a on the inner screen B.
After the mobile phone 100 starts the collaborative photographing function, the user can select a specific shooting mode through the function buttons on the right side of the camera application interface, such as a photographing mode button, a video recording mode button, a movie mode button, etc., to suit the shooting requirement. Whichever camera mode is selected for shooting, the pictures displayed on the inner screen B and the outer screen A of the mobile phone 100 have the same proportion.
For example, as shown in fig. 4 (A), after the user selects the photographing mode 105 from the camera modes on the right side of the inner screen B of the mobile phone 100, the shooting picture 101a is displayed on the inner screen B at an aspect ratio of 4:3; at the same time, as shown in fig. 4 (B), the outer screen A of the mobile phone 100 also displays the mirror image picture 101b of the shooting picture 101a, and the aspect ratio of the mirror image picture 101b is also 4:3.
Likewise, as shown in fig. 5 (A), when the user selects the video recording mode 106 from the camera modes on the right side of the inner screen B of the mobile phone 100, the shooting picture 101a is displayed on the inner screen B at an aspect ratio of 21:9; as shown in fig. 5 (B), the outer screen A of the mobile phone 100 displays the mirror image picture 101b of the shooting picture 101a, and the aspect ratio of the mirror image picture 101b is also 21:9.
It will be appreciated that the user may also select other modes from the camera modes on the right side of the inner screen B of the mobile phone 100, such as a self-timer mode, a movie mode, a portrait mode (not shown), etc., to which the present application is not limited. Whatever camera mode the user selects, as long as that mode supports the collaborative photographing function, the pictures displayed on the inner screen B and the outer screen A of the mobile phone 100 have the same proportion, and the proportion corresponds to the camera mode. For example, as shown in fig. 4 and fig. 5, the aspect ratio corresponding to the photographing mode is 4:3, and the aspect ratio corresponding to the video recording mode is 21:9.
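The mode-dependent but screen-independent ratio can be sketched as a lookup shared by both screens. The 4:3 and 21:9 values follow the examples above, while the mode names and the fixed-width fitting rule are assumptions for illustration:

```python
from fractions import Fraction

# Aspect ratio associated with each camera mode (per the examples above).
MODE_ASPECT_RATIOS = {
    "photo": Fraction(4, 3),   # photographing mode
    "video": Fraction(21, 9),  # video recording mode
}


def preview_size(mode: str, width: int):
    """Given a mode and a display width, return (width, height) that
    keeps the mode's aspect ratio. The inner and outer screens both use
    the same ratio for the same mode, so calling this once per screen
    yields proportionally identical pictures."""
    ratio = MODE_ASPECT_RATIOS[mode]
    return width, int(width / ratio)


print(preview_size("photo", 1200))  # (1200, 900)
print(preview_size("video", 2100))  # (2100, 900)
```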
Further, it will be appreciated that in some embodiments, after the user selects a camera mode that does not support the collaborative photographing function, the collaborative photographing function of the mobile phone 100 is automatically turned off, and the outer screen A of the mobile phone 100 no longer displays a mirror image of the shooting picture on the inner screen B. For example, as shown in figs. 6 (A)-(B), after the user clicks the lens flip button 104 and turns on the front camera P2 to enter the self-photographing mode, the collaborative photographing function of the mobile phone 100 is automatically turned off, because the collaborative photographing function only supports photographing with the rear camera P1. At this time, the collaborative photographing button 102 is no longer displayed on the inner screen B; the shooting picture 101a captured by the front camera P2 is displayed only on the inner screen B of the mobile phone 100, and the outer screen A may display a standby screen (or information such as the date and time shown in fig. 1 (A)), which is not limited herein. It will be appreciated that the camera mode that does not support the collaborative photographing function may also be another mode, and the application is not limited in this regard.
In addition, in other embodiments, after the user selects a camera mode in which the camera application does not support the collaborative photographing function, the collaborative photographing button 102 may instead be grayed out, so that the button is in an inoperable state.
In addition, it may be further understood that, in some embodiments, in order to improve the user experience, after the user exits the camera application while in a camera mode supporting the collaborative photographing function, if the user opens the camera application again within a preset period of time (e.g., 15 minutes), the collaborative photographing function is still on, so that the user does not need to click the collaborative photographing function button 102 again to start it.
The above describes the corresponding interface changes of the mobile phone 100 when the user performs the related touch operations to use the collaborative photographing function. In order to better understand the collaborative photographing method of the present application, a specific implementation procedure of the method will be described with reference to fig. 7. It will be appreciated that the following description again takes the camera application in the mobile phone 100 as an example, and that the steps of the method shown in fig. 7 may be implemented by the processor 110 of the mobile phone 100 executing a related program.
Specifically, fig. 7 illustrates a collaborative shooting method according to some embodiments of the present application. As shown in fig. 7, the method includes:
701, an on instruction of a collaborative photographing function is detected in a camera application.
As previously described, the user may activate the collaborative photographing function by clicking a collaborative photographing function button in the camera application interface. For example, as shown in fig. 3 (a) - (B), the user may click on the collaborative photographing function button 102 on the left side of the internal screen B in the camera application interface of the internal screen B of the mobile phone 100 to turn on the collaborative photographing function.
702, it is determined whether the current camera mode supports the collaborative photographing function. If the current camera mode supports the collaborative photographing function, then 703 is entered, i.e., the mobile phone 100 generates a mirror image of the shooting picture of the inner screen B and displays it on the outer screen A. If the current camera mode does not support the collaborative photographing function, then 704 is entered, i.e., the outer screen A of the mobile phone 100 does not display a mirror image of the shooting picture of the inner screen B.
For example, in the case where the current camera mode is the photographing mode shown in fig. 4 (a), since the photographing mode supports the collaborative photographing function, the outer screen A displays the mirror image 101B of the inner screen B shooting picture 101a, as shown in fig. 4 (B). For another example, in the case where the current camera mode is the self-timer mode shown in fig. 6 (a), since the self-timer mode does not support the collaborative photographing function, the outer screen A does not display a mirror image of the inner screen B shooting picture 101a, but instead displays information such as the current date and time, as shown in fig. 6 (B). It will be appreciated that in some embodiments the outer screen A may display other pictures, such as a default lock screen or a desktop, as the application is not limited in this regard.
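The branch at steps 702-704 can be sketched as follows. This is an illustrative sketch only: the mode names, the set of supported modes, and the helper function are assumptions for illustration, not the patent's actual implementation.

```python
# Hypothetical sketch of the 702-704 branch: what outer screen A shows
# depends on whether the current camera mode supports the collaborative
# photographing function. Mode names here are invented placeholders.

COLLAB_SUPPORTED_MODES = {"photographing", "video", "movie", "portrait"}

def outer_screen_content(mode: str, inner_frame: str) -> str:
    """Return the content displayed on outer screen A for a camera mode."""
    if mode in COLLAB_SUPPORTED_MODES:
        # 703: display a mirror image of inner screen B's shooting picture.
        return f"mirror({inner_frame})"
    # 704: no mirror; screen A shows e.g. date/time or a lock screen instead.
    return "date_and_time"
```

Keeping the supported-mode set in one place also makes behavior such as graying out the button 102 easy to drive from the same check.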
It can be understood that various existing technical schemes may be adopted to mirror the photographed picture. For example, in some embodiments, the mobile phone 100 may obtain the mirror image by horizontally flipping the photographed picture. For instance, assuming that the picture shot on the inner screen B, and the corresponding pixel matrix acquired by the mobile phone 100, are as shown in fig. 8 (a), the mobile phone 100 performs a mirror-inversion transformation on that pixel matrix using the following formula (1) to obtain the pixel matrix corresponding to the mirror image shown in fig. 8 (B); the mobile phone 100 then outputs the mirrored matrix to the outer screen A through the image processor to display the mirror image shown in fig. 8 (B).
Wherein, formula (1) is as follows:
M′ = S(s_x, s_y) * M    (1)
wherein M is the pixel matrix corresponding to the photographed picture, M′ is the pixel matrix corresponding to the mirror image, and S(s_x, s_y) is the mirror-inversion transformation with scale factors s_x = -1 and s_y = 1. That is, the ordinate of each pixel in the pixel matrix of the photographed picture remains unchanged while the abscissa is multiplied by -1, so the coordinates of each pixel of the photographed picture are reflected about the y-axis (i.e., horizontally flipped).
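For a row-major pixel matrix, the reflection of formula (1) can be sketched in a few lines: keeping each pixel's ordinate and negating its abscissa amounts to reversing every row. Plain nested lists stand in for the real pixel buffer here; this is an illustration, not the phone's actual image-processor path.

```python
# Minimal sketch of formula (1): horizontal flip of a pixel matrix by
# reversing each row (y unchanged, x reflected about the vertical axis).

def mirror_horizontal(pixels):
    """Return the mirror image M' of pixel matrix M (horizontal flip)."""
    return [row[::-1] for row in pixels]
```

Applying the transform twice recovers the original picture, which is a quick sanity check that the operation is a true reflection.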
In addition, it can be further understood that, as described above, after the user opens the collaborative photographing function, the mirror image of the photographing picture of the inner screen B is displayed on the display interface of the outer screen a of the mobile phone, and the aspect ratio of the mirror image picture displayed by the outer screen a is consistent with that of the photographing picture displayed by the inner screen B in the current camera mode.
It can be understood that different camera modes correspond to different image aspect ratios. In some embodiments of the present application, the correspondence between the camera modes of the mobile phone 100 that support the collaborative photographing function and their image aspect ratios is shown in table 1 below:
TABLE 1

Camera mode            Image aspect ratio
Photographing mode     4:3
Video recording mode   21:9
Movie mode             16:9
The mobile phone 100 then performs mirror image processing on the shooting picture to obtain the mirror image, and displays the mirror image on the outer screen A. It can be understood that, since the size of the outer screen A differs from that of the inner screen B, when the mobile phone 100 displays the mirror image on the outer screen A, the mirror image is first adapted to the width of the outer screen A and is then displayed according to the aspect ratio of the current camera mode.
For example, when the user shoots in the photographing mode, the aspect ratio of the shooting picture displayed on the inner screen B of the mobile phone 100 is 4:3 (refer to fig. 4 (a) above). The shooting picture is then mirrored, adapted to the width of the outer screen A, and displayed on the outer screen A at the 4:3 aspect ratio, in the manner shown in fig. 4 (B).
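The width-first adaptation just described can be sketched as below. Treating the Table 1 ratios as width:height and ignoring orientation handling are simplifying assumptions, and the screen width used in the example is invented for illustration.

```python
# Hedged sketch of the width-first adaptation: the mirror picture fills the
# width of outer screen A, and its height follows from the camera mode's
# aspect ratio (Table 1). Ratio orientation is a simplifying assumption.

ASPECT_RATIOS = {"photographing": (4, 3), "video": (21, 9), "movie": (16, 9)}

def preview_area(outer_screen_width: int, mode: str):
    """Return the (width, height) of the preview area on outer screen A."""
    w, h = ASPECT_RATIOS[mode]
    return outer_screen_width, round(outer_screen_width * h / w)
```

With a hypothetical 1200-pixel-wide outer screen, the photographing mode would yield a 1200x900 preview area and the movie mode a 1200x675 one.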
In addition, in order to further improve the user experience, when the user changes the resolution of the shooting picture displayed by the inner screen B in the same camera mode, the resolution of the mirror image picture displayed by the outer screen a may be correspondingly changed at the same time, so that the resolution of the shooting picture displayed by the inner screen B and the resolution of the mirror image picture displayed by the outer screen a are kept consistent.
The above describes a specific implementation process of the collaborative shooting method of the present application. Through this method, when shooting, the user can intuitively view the imaging effect under the rear camera P1 on the outer screen A of the mobile phone 100 and adjust his or her posture and position according to the displayed image, thereby achieving a better shooting effect and improving the shooting experience.
In order to more clearly understand the implementation details of the collaborative shooting method on the mobile phone 100, the following describes, with reference to fig. 9 to fig. 10, how the software/hardware components in the mobile phone 100 cooperate to implement the method.
Fig. 9 is a block diagram of a software architecture of a mobile phone 100 implementing the collaborative photographing method according to some embodiments of the present application. As shown in fig. 9, the software framework of the mobile phone 100 includes an application layer 10, an application framework layer 20, a hardware abstraction layer 30, and a hardware layer 40.
The application layer 10 includes a camera application 1, and the camera application 1 includes a plurality of camera modes, such as a photographing mode 11, a portrait mode 12, a video recording mode 13, a movie mode 14, a self-timer mode, and the like. Wherein, the photographing mode 11, the portrait mode 12, the video mode 13 and the movie mode 14 all support the collaborative photographing function 1105.
It will be appreciated that in some embodiments, the photographing mode 11 may include a smart recognition function 1101, an artificial intelligence (AI) photographing function 1102, a filter function 1103, a flash function 1104, a collaborative photographing function 1105, and the like, which may be selected by the user as needed to achieve the corresponding purpose.
For example, the user may turn on the smart recognition function 1101 in the photographing mode 11 to recognize an object in the image. Specifically, after the user turns on the smart recognition function in the photographing mode 11, the user only needs to aim the camera at the photographed object; when the mobile phone 100 recognizes the object, the name (such as "azalea") or the type (such as "dog") of the object is displayed on the inner screen B. In other embodiments, the user may also activate the filter function 1103 to beautify the photographed object. The application is not limited in this regard.
In some embodiments, portrait mode 12 may include collaborative photographing function 1105, and may also include other functions, such as a beauty function, special effects function (not shown), and so forth. When the user selects the portrait mode 12 for photographing, the portrait mode can keep the face of the photographed person clear while blurring the background, thereby highlighting the subject of photographing.
In some embodiments, the video recording mode 13 may include a multi-view recording function 1301, a flash function 1302, a filter function 1303, a collaborative photographing function 1105, and the like. When the user starts the multi-view recording function 1301, the front camera P2 and the rear camera P1 of the mobile phone 100 are started simultaneously, so that the user can record several objects at the same time. For example, after starting the multi-view recording function 1301, the user can record the picture captured by the front camera P2 and the picture captured by the rear camera P1 simultaneously, so as to record the scene the user is in in detail. In addition, the user can turn on the flash function 1302 to improve imaging in darker scenes; for example, when shooting in a dark place, turning on the flash function 1302 increases the exposure and improves the imaging effect of the photographed object.
In some embodiments, the movie mode 14 may include a look-up table (LUT) color-grading function 1401, a 4K high-dynamic range (HDR) function 1402, a slow motion function 1403, a flash function 1104, a collaborative photographing function 1105, and the like. The LUT function 1401 essentially provides color grading for the object photographed by the user, so that its colors are richer. In the 4K HDR function 1402, 4K refers to the resolution (4096 pixels x 2160 pixels), and HDR is an image rendering technology that models scene illumination and the reflection and refraction of light to make objects appear more realistic; 4K HDR therefore improves the quality of the photographed picture, raising both the image resolution and the realism of the objects.
The slow motion function 1403 is intended to make shooting more interesting; with it, the user can enhance the aesthetic appeal and realism of the captured footage.
In addition, through corresponding control logic, the camera application turns these functions on and off, controls the collaborative display of the inner and outer screens of the mobile phone 100, and detects the expanded state of the inner screen B.
Specifically, the camera application 1 controls the on and off of the collaborative photographing function by the collaborative photographing function on-off rule 15.
When the user touch operation detected by the mobile phone 100 accords with the collaborative photographing function opening and closing rule 15, the camera application opens the collaborative photographing function 1105. In some embodiments of the present application, the user may trigger the collaborative photographing function opening and closing rule 15 to open the collaborative photographing function 1105 of the camera application by clicking on the collaborative photographing function button 102 on the internal screen B of the mobile phone as shown in fig. 3 (a).
In other embodiments, when the collaborative photographing function is on, the user may trigger the collaborative photographing function on-off rule 15 by clicking the collaborative photographing function button 102 again, thereby turning off the collaborative photographing function 1105 of the camera application. It will be appreciated that in some embodiments, the display state of the collaborative photographing function button 102 may change with the on and off states of the function: for example, when the function is on, the button 102 is in a raised state, and when the function is off, the button 102 is restored to its original state. The application is not limited in this regard.
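The on-off rule 15 amounts to a small toggle driven by repeated clicks of the same button. The sketch below is illustrative; the class and state names are assumptions, not identifiers from the patent.

```python
# Illustrative sketch of on-off rule 15: each click of the collaborative
# photographing button 102 toggles the function, and the button's display
# state follows the function state (raised when on, normal when off).

class CollaborativeToggle:
    def __init__(self):
        self.enabled = False

    def click_button(self) -> str:
        """Toggle the function; return the button's resulting display state."""
        self.enabled = not self.enabled
        return "raised" if self.enabled else "normal"
```

Driving the button appearance from the single `enabled` flag keeps the UI state and the function state from drifting apart.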
When the user turns on the collaborative photographing function 1105, the camera application 1 controls the collaborative display of the inner and outer screens of the mobile phone through a user interface (UI) control 16.
Specifically, the camera application 1 controls the display of the inner screen B through the inner screen B display module 1601, and controls the display of the outer screen A through the outer screen A display module 1602. For example, when the user opens the camera application 1 but has not yet turned on the collaborative photographing function, the inner screen B display module 1601 first initializes the inner screen B, that is, determines the display area of the inner screen B in the current camera mode. For example, if the default camera mode when the camera application 1 is opened is the photographing mode, the inner screen B display module 1601 determines that the display area of the inner screen B needs to satisfy: an aspect ratio of 4:3, displayed centrally.
When the user starts the collaborative photographing function, the collaborative photographing function 1105 of the camera application 1 sends a collaborative photographing instruction to the mobile phone inner screen B display module 1601, and then the mobile phone inner screen B display module 1601 initiates a request for collaborative display (i.e., a request for initializing a preview stream) to the mobile phone outer screen a display module 1602, where the request for initializing the preview stream includes an instruction to determine a display area (or a preview area) of the mobile phone outer screen a and to mirror an image displayed on the display area of the mobile phone outer screen a. It will be appreciated that in some embodiments, the mobile phone internal screen B also performs internal screen initialization according to the collaborative photographing instruction. In other embodiments, the initialization of the inner screen of the mobile phone inner screen B may be completed when the camera application is started. The application is not limited in this regard.
The outer screen A display module 1602 responds to the request to initialize the preview stream by initializing the preview area of the outer screen A, that is, adding a tag for mirror image processing to the preview area, and the View framework 21 generates the layer corresponding to the preview area. The manner in which the View framework 21 generates this layer is described below.
In addition, the camera application 1 detects whether the inner screen B of the mobile phone 100 is in the expanded state through the inner screen B expanded state detection module 1603. Specifically, the detection module 1603 of the camera application determines whether the inner screen B is expanded based on the sensor data of the sensor module 43 in the hardware layer 40; the specific process is described below in conjunction with fig. 10. It is understood that the application layer 10 may also include other applications, such as video applications and chat applications, which the present application does not limit.
The application framework layer 20 of the mobile phone 100 includes a View framework 21 and a SurfaceFlinger 22.
In some embodiments, the View framework 21 is configured to create a corresponding display layer (display) according to the display area determined by the display module 1601 of the mobile phone inner screen B, so that the display layer is used to carry display data to be displayed on the outer screen a.
For example, in the photographing mode, the display area of the mobile phone inner screen B satisfies: aspect ratio 4:3 and centrally display, the display area of the corresponding mobile phone external screen A also meets the following requirements: and the aspect ratio is 4:3 and is displayed centrally, the View framework layer 21 creates a layer with the aspect ratio of 4:3 for the mobile phone external screen A, and the layer is used as a carrier of the mirror image picture to be displayed on the mobile phone external screen A.
In other embodiments, the View framework 21 is further configured to request the SurfaceFlinger 22 to perform post-mirroring on the content displayed in the preview area, attach the obtained post-mirroring content to the generated layer, and send the layer to the mobile phone external screen a display module 1602 for display.
In some embodiments, the SurfaceFlinger 22 is configured to obtain the data to be displayed from the hardware abstraction layer 30, process the display pictures accordingly, and send the processed display data to the View framework 21. In the embodiment of the present application, according to the request sent by the View framework 21, which carries the instruction to mirror the content displayed in the preview area, the SurfaceFlinger 22 acquires the content to be displayed in the preview area from the preview data stream module 32 and mirrors it. In other embodiments, the SurfaceFlinger 22 is also used to send the mirrored content to the View framework 21.
For example, the SurfaceFlinger 22 acquires the copy data of the shooting picture from the preview data stream module 32, performs mirroring processing on the copy data to obtain the mirror image data of the shooting picture, and transmits the mirror image data to the View framework 21.
It will be appreciated that the application framework layer 20 may also include other modules, such as a view manager (not shown) and a wireless fidelity (Wi-Fi) communication module (not shown), to which the present application is not limited.
The hardware abstraction layer 30 includes a photo data stream module 31, a preview data stream module 32, and a video data stream module 33. The photographing data flow module 31 is used for storing images obtained by photographing in the photographing mode 11, the portrait mode 12 and other camera modes. The video data stream module 33 is used for storing the video captured by the camera in the video recording mode 13, the film mode 14, and other camera modes.
The preview data stream module 32 is configured to store the shooting picture displayed on the inner screen B in any of the above camera modes, that is, the picture captured by the camera before the user clicks the shooting button or the recording button, such as the image preview stream displayed on the inner screen B in fig. 4 (a) or fig. 5 (a). It will be appreciated that, to enable the above collaborative display on the inner and outer screens, in some embodiments of the present application the preview data stream module 32 both stores the shooting picture of the inner screen B and copies it, so that the mirror image can be generated from the copy.
It will be appreciated that in some embodiments of the present application, the stored image or video data of the photo data stream module 31, the preview data stream module 32 and the video data stream module 33 may be implemented based on the memory shown in fig. 11 below, which is not a limitation of the present application.
The hardware layer 40 includes a sensor module 43, an image signal processing (image signal processor, ISP) driver 41, and a Display Port (DP) driver 42. Among other things, in some embodiments of the application, the sensor module 43 includes a gyroscopic sensor, as well as an acceleration sensor, described below. In other embodiments, the sensor module 43 may further include other sensors, such as a pressure sensor (not shown), a touch sensor (not shown), an image sensor (not shown), and the like, which are not limited by the present application.
The ISP driver 41 is used for processing the signals transmitted by the image sensor of the mobile phone 100. For example, the electrical signal of an image fed back by the image sensor is converted into an image visible to the naked eye. The ISP driver 41 may also perform algorithmic optimization of the noise, brightness, and the like of the image, optimize parameters such as the exposure and color temperature of the shooting scene, and implement anti-shake processing to enhance the display effect of the image.
The DP driver 42 is an interface of the external display connected to the mobile phone 100, and the mobile phone 100 can connect with the external display through the DP driver 42 and synchronously display the images displayed on the mobile phone 100 on the external display.
It will be appreciated that the above described software architecture is exemplary only and not limiting as to the software architecture of the handset 100, and that in other embodiments, the handset 100 may have more or less architecture, as the application is not limited in this regard.
In order to more intuitively understand the process of implementing the collaborative display of the inner screen and the outer screen of the mobile phone by matching the software modules, the process of displaying the mirror image of the inner screen B of the mobile phone by the outer screen a of the mobile phone will be described by taking the interaction diagram shown in fig. 10 as an example.
Specifically, as shown in fig. 10, the above-described process includes:
1001, after the collaborative photographing function 1105 detects that the collaborative photographing function opening rule is satisfied, the collaborative photographing function is opened.
1002, the collaborative photographing function 1105 sends a collaborative photographing instruction to the mobile phone inner screen B display module 1601.
1003, the mobile phone internal screen B display module 1601 performs internal screen initialization.
1004, the mobile phone inner screen B display module 1601 sends an instruction to initialize the outer screen preview area to the mobile phone outer screen a display module 1602.
1005, the mobile phone external screen a display module 1602 initializes the external screen preview area and adds a mirror label to the preview area.
In some embodiments, the mirror tag may be a string, number, or the like, e.g., the mirror tag may be an "x_flag," as the application is not limited in this regard.
1006, the mobile phone external screen a display module 1602 sends preview area information with mirror labels to the View framework 21.
It will be appreciated that in some embodiments, the information of the preview region may include coordinates of the preview region.
It will be appreciated that in some embodiments, the View framework 21 creates a layer corresponding to the preview area coordinates for the mobile phone external screen a according to the preview area coordinates with the mirror label, so as to carry the mirror image with the layer later, and displays the layer with the mirror image in the preview area.
1007, the View framework 21 creates a layer corresponding to the preview area.
The process of creating the layer specifically may refer to the above related description, and will not be described herein.
1008, the View framework 21 sends an instruction to the SurfaceFlinger 22 to mirror the display content corresponding to the preview area.
It will be appreciated that the SurfaceFlinger 22 mirrors the shooting picture to be displayed in the preview area based on the mirror tag carried by the preview area information. The mirroring process may refer to the above related description and will not be repeated here.
1009, the SurfaceFlinger 22 generates an instruction to acquire copy data of the inner screen shooting picture.
1010, the SurfaceFlinger 22 sends the instruction to acquire copy data of the inner screen shooting picture to the preview data stream module 32.
1011, the preview data stream module 32 generates the copy data of the inner screen shooting picture.
In some embodiments, the preview data stream module 32 generates bitmap copy data of the shooting picture stored by the inner screen B display module 1601 according to the instruction sent by the SurfaceFlinger 22, where a bitmap is data composed of the pixel information of an image, for example, a matrix composed of the coordinates of the image's pixels as shown in fig. 8.
1012, the preview data stream module 32 sends the copy data of the inner screen shooting picture to the SurfaceFlinger 22.
1013, the SurfaceFlinger 22 mirrors the copy data of the inner screen shooting picture to obtain the mirror image data.
The specific mirroring process may refer to the above related description, and will not be described herein.
1014, the SurfaceFlinger 22 sends the mirror image data to the View framework 21.
1015, the View framework 21 attaches the mirror image data to the above layer.
1016, the View framework 21 sends the layer carrying the mirror image to the outer screen A display module 1602.
1017, the outer screen A display module 1602 displays the layer carrying the mirror image in the preview area.
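The sequence above can be condensed into a short, hypothetical sketch: a preview area with a mirror tag is initialized, a layer is created for it, a copy of the inner screen preview is fetched and mirrored, and the mirrored layer is attached for display. The dictionaries and the handling of the "x_flag" tag are simplifications, not the actual module interfaces.

```python
# Condensed sketch of steps 1001-1017 of fig. 10 (data structures assumed).

def run_collaborative_display(inner_preview):
    # 1004-1006: initialize the outer screen preview area with a mirror tag.
    preview_area = {"coords": (0, 0, 4, 3), "tag": "x_flag"}
    # 1007: the View framework creates a layer matching the preview area.
    layer = {"coords": preview_area["coords"], "content": None}
    # 1009-1012: fetch copy data of the inner screen shooting picture,
    # leaving the original preview stream untouched.
    copy_data = [row[:] for row in inner_preview]
    # 1008, 1013: mirror the copy because the area carries the mirror tag.
    if preview_area["tag"] == "x_flag":
        copy_data = [row[::-1] for row in copy_data]
    # 1014-1017: attach the mirrored picture and display the layer.
    layer["content"] = copy_data
    return layer
```

Working on a copy of the preview data (step 1011) means the inner screen B keeps showing the unmirrored picture while the outer screen A shows its mirror.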
A schematic diagram of the hardware structure 1100 of the mobile phone 100 for implementing the above method is described below with reference to fig. 11.
As shown in fig. 11, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, keys 190, a display 194, and the like. The sensor module 180 may include a gyro sensor 180A, an acceleration sensor 180B, a touch sensor 180C, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the mobile phone 100. In other embodiments of the application, the mobile phone 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs those instructions or data again, they can be called directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves the efficiency of the system. In some embodiments of the present application, the processor 110 may be configured to implement the above mirror image processing on the picture shot on the inner screen B.
In some embodiments, the processor 110 may include one or more interfaces. The external memory interface 120 may be used to connect an external memory, such as a Micro SD card, to expand the storage capability of the mobile phone 100. The external memory communicates with the processor 110 through the external memory interface 120 to implement the data storage function. For example, the external memory may store the shooting picture displayed before the user clicks the shooting button or the recording button, or the video saved after the user stops recording.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the program code corresponding to the collaborative shooting method of the present application, so that when the program code is executed, the mobile phone 100 can execute the collaborative shooting method of the present application; the data storage area may store the shooting picture or its mirror image involved in the collaborative shooting method. In some embodiments, the data storage area may be divided into a plurality of logical partitions to store the above image or video data related to the collaborative shooting method of the present application.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 150 may provide solutions for wireless communication applied to the mobile phone 100, including 2G/3G/4G/5G. The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
The gyro sensor 180A and the acceleration sensor 180B in the sensor module 180 of the mobile phone 100 are used to obtain the unfolding angle of the inner screen B, so that the mobile phone 100 can determine the unfolding state of the inner screen B according to that angle.
In some embodiments, the mobile phone 100 may have a plurality of gyro sensors 180A and a plurality of acceleration sensors 180B, disposed on the outer screen A and the inner screen B of the mobile phone 100 respectively. For example, one set of sensors (one gyro sensor 180A and one acceleration sensor 180B) is provided on the outer screen A of the mobile phone 100, and another set is provided on the inner screen B. The mobile phone 100 then combines the data collected by the gyro sensors and the acceleration sensors on the inner screen B and the outer screen A to determine the unfolding angle of the inner screen B of the mobile phone 100.
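One plausible way to combine the two sensor sets described above is to compare the gravity vectors reported by the accelerometers on each half of the device, treating the angle between them as the unfolding angle. The following sketch is illustrative only: the patent does not specify the computation, and it assumes both readings have already been expressed in a common reference frame; the function names and the 150-degree threshold are assumptions, not taken from the embodiment.

```python
import math

def unfolding_angle(gravity_outer, gravity_inner):
    """Estimate the unfolding angle (in degrees) of inner screen B from
    the angle between the gravity vectors measured by the accelerometer
    on outer screen A and the accelerometer on inner screen B. Both
    vectors are assumed to be in a common reference frame."""
    dot = sum(a * b for a, b in zip(gravity_outer, gravity_inner))
    norm_a = math.sqrt(sum(a * a for a in gravity_outer))
    norm_b = math.sqrt(sum(b * b for b in gravity_inner))
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_angle = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    return math.degrees(math.acos(cos_angle))

def screen_state(angle_deg, threshold=150.0):
    """Classify the fold state from the unfolding angle (threshold
    value is a hypothetical choice)."""
    return "unfolded" if angle_deg >= threshold else "folded"

# Perpendicular gravity readings on the two halves -> 90 degrees.
angle = unfolding_angle((0.0, 9.8, 0.0), (9.8, 0.0, 0.0))
```

On a real Android device one would more likely read a dedicated hinge-angle sensor if the platform exposes one; the dual-accelerometer comparison above is a fallback consistent with the sensor layout this paragraph describes.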
The touch sensor 180C may be disposed on the display 194; together they form a touchscreen, also called a "touch panel". The touch sensor 180C is used to detect a touch operation acting on or near it, and may pass the detected touch operation to the application processor to determine the type of the touch event. For example, when the touch sensor 180C detects the user clicking the collaborative photographing function button 102 in fig. 3 (a), it passes the operation to the processor 110, and the processor 110 determines that the touch event is "open the collaborative photographing function". When the user clicks the collaborative photographing button 102 again, the touch sensor 180C passes the operation to the processor 110, and the processor 110 determines that the touch event is "close the collaborative photographing function". In some embodiments, visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180C may also be disposed on a surface of the mobile phone 100 at a location different from that of the display 194. The application is not limited in this regard.
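The tap-handling behavior just described — successive taps on the same button alternating between "open" and "close" events — can be modeled with a simple toggle. This is a minimal sketch of that behavior only; the class and event strings are illustrative and are not part of the patent's module names.

```python
class CollaborativeShootingToggle:
    """Illustrative model of the touch-event handling described above:
    each tap on the collaborative photographing button 102 alternates
    between opening and closing the function."""

    def __init__(self):
        self.enabled = False

    def on_button_tap(self):
        # The processor flips the state and reports the resulting event.
        self.enabled = not self.enabled
        return ("open the collaborative photographing function"
                if self.enabled
                else "close the collaborative photographing function")

toggle = CollaborativeShootingToggle()
first = toggle.on_button_tap()   # opens the function
second = toggle.on_button_tap()  # a second tap closes it again
```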
The embodiment of the application also provides an electronic device, comprising: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, wherein the processor, when executing the computer program, implements the steps of any of the method embodiments described above.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
Embodiments of the present application also provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to perform the steps of the method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the methods of the above embodiments may be implemented by a computer program instructing related hardware. The computer program may be stored in a computer readable storage medium, and when executed by a processor, may implement the steps of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing device/terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, under legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunications signals.
Each of the foregoing embodiments is described with its own emphasis; for parts that are not detailed or illustrated in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In the description above, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted, depending on the context, as "when" or "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining" or "in response to determining" or "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description and in the appended claims are used to distinguish between similar objects and are not necessarily used to indicate or imply relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (5)

1. The collaborative shooting method is characterized in that the electronic equipment comprises a first camera, a first screen and a second screen, wherein the first screen and the second screen are positioned on different sides of the electronic equipment, the first camera and the first screen are positioned on different sides of the electronic equipment, and the first camera and the second screen are positioned on the same side of the electronic equipment;
the electronic equipment comprises a collaborative photographing function module, a first screen display module, a second screen display module, a view module, an image synthesis module and a preview data stream module, wherein the collaborative photographing function module, the first screen display module and the second screen display module are positioned at an application layer;
The method comprises the following steps:
the first screen displays a first shooting picture shot by the first camera;
the collaborative photographing function module detects a first instruction, and the first instruction is determined under the condition that a first operation of a first control by a user is detected;
the collaborative shooting function module sends the first instruction to the second screen display module through the first screen display module under the condition that the current shooting mode of the electronic equipment is determined to support the collaborative shooting function;
in response to receiving the first instruction, the second screen display module sends image preview area information carrying a mirror image label to the view module;
in response to receiving the image preview area information carrying the mirror image tag, the view module sends an instruction for mirror image processing of the first shooting picture to the image synthesis module;
in response to receiving an instruction for mirroring the first photographed picture, the image synthesis module obtains copy data of the first photographed picture from the preview data stream module;
the image synthesis module performs mirror image processing on the copied data to obtain data of a second shooting picture; the second shooting picture is a mirror image picture of the first shooting picture, and the size proportion of the first shooting picture and the second shooting picture is the same;
The image synthesis module sends the data of the second shooting picture to the view module;
the view module generates a layer carrying the second shooting picture according to the data of the second shooting picture and sends the layer to the second screen display module;
and the second screen display module displays a layer carrying the second shooting picture on the second screen.
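The data flow of claim 1 — the image synthesis module obtaining copy data of the first shooting picture from the preview data stream module, mirroring it, and producing a second shooting picture of the same size proportion — can be sketched as below. The module names follow the claim, but the frame representation (a list of pixel rows) and mirroring as a horizontal flip are assumptions for illustration, not details stated in the claim.

```python
def mirror_frame(frame):
    """Horizontally flip a frame (a list of pixel rows). The mirrored
    frame keeps the same width and height as the original, matching the
    claim's requirement of identical size proportions."""
    return [list(reversed(row)) for row in frame]

class PreviewDataStream:
    """Stand-in for the preview data stream module holding the first
    shooting picture."""
    def __init__(self, frame):
        self._frame = frame
    def copy_frame(self):
        # Hand out copy data so mirroring does not touch the original.
        return [row[:] for row in self._frame]

class ImageSynthesis:
    """Stand-in for the image synthesis module of the claim."""
    def __init__(self, stream):
        self.stream = stream
    def handle_mirror_instruction(self):
        # On receiving the mirror instruction, obtain copy data of the
        # first shooting picture and mirror it into the second picture.
        return mirror_frame(self.stream.copy_frame())

# A 2x3 "frame" of pixel labels standing in for the first shooting picture.
first_picture = [[1, 2, 3],
                 [4, 5, 6]]
synth = ImageSynthesis(PreviewDataStream(first_picture))
second_picture = synth.handle_mirror_instruction()
# second_picture is [[3, 2, 1], [6, 5, 4]]: mirrored, same 2x3 proportions.
```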
2. The method of claim 1, wherein the first screen displays a display interface of a camera application and the display interface of the camera application includes the first captured picture, wherein the display interface of the camera application further includes the first control;
the method further comprises the steps of:
detecting that a shooting mode of the camera application is switched from a first shooting mode supporting the collaborative shooting function to a second shooting mode not supporting the collaborative shooting function;
and not displaying the first control on a display interface of the camera application.
3. The method of claim 2, wherein the shooting mode comprises one or more of a photographing mode, a video mode, a portrait mode, or a movie mode.
4. A readable medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the collaborative shooting method of any of claims 1-3.
5. An electronic device, the electronic device comprising:
a memory for storing instructions for execution by one or more processors of the electronic device, and,
a processor, which is one of the processors of the electronic device, for performing the collaborative shooting method of any one of claims 1 to 3.
CN202111682873.1A 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof Active CN114401340B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111682873.1A CN114401340B (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof
CN202311176831.XA CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111682873.1A CN114401340B (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311176831.XA Division CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Publications (2)

Publication Number Publication Date
CN114401340A CN114401340A (en) 2022-04-26
CN114401340B true CN114401340B (en) 2023-09-26

Family

ID=81229238

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111682873.1A Active CN114401340B (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof
CN202311176831.XA Pending CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311176831.XA Pending CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Country Status (1)

Country Link
CN (2) CN114401340B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118338119A (en) * 2021-12-31 2024-07-12 荣耀终端有限公司 Method, apparatus and readable storage medium for displaying preview image
CN118276805A (en) * 2022-06-17 2024-07-02 荣耀终端有限公司 Application interface display method, electronic device and storage medium
CN117156270B (en) * 2023-02-03 2024-07-23 荣耀终端有限公司 Photographing processing method and related device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107333047A (en) * 2017-08-24 2017-11-07 维沃移动通信有限公司 A kind of image pickup method, mobile terminal and computer-readable recording medium
CN107613196A (en) * 2017-09-05 2018-01-19 珠海格力电器股份有限公司 Self-photographing method and device and electronic equipment
CN108234875A (en) * 2018-01-15 2018-06-29 广东欧珀移动通信有限公司 Shoot display methods, device, mobile terminal and storage medium
CN108833769A (en) * 2018-05-18 2018-11-16 Oppo广东移动通信有限公司 Shoot display methods, device, mobile terminal and storage medium
CN110933306A (en) * 2019-11-29 2020-03-27 维沃移动通信有限公司 Method for sharing shooting parameters and electronic equipment
CN111385456A (en) * 2018-12-27 2020-07-07 北京小米移动软件有限公司 Photographing preview method and device and storage medium

Also Published As

Publication number Publication date
CN114401340A (en) 2022-04-26
CN117424954A (en) 2024-01-19

Similar Documents

Publication Publication Date Title
CN114401340B (en) Collaborative shooting method, electronic equipment and medium thereof
CN108833753B (en) Image acquisition and application method, terminal and computer readable storage medium
CN114092364B (en) Image processing method and related device
CN110636276B (en) Video shooting method and device, storage medium and electronic equipment
CN108632543B (en) Image display method, image display device, storage medium and electronic equipment
CN115002339A (en) Collaborative photographing method, electronic device, storage medium, and program product
CN106791390B (en) Wide-angle self-timer real-time preview method and user terminal
CN114845059B (en) Shooting method and related equipment
CN111083371A (en) Shooting method and electronic equipment
CN115633262B (en) Image processing method and electronic device
CN115567630B (en) Electronic equipment management method, electronic equipment and readable storage medium
CN114630053B (en) HDR image display method and display device
CN114866681B (en) Cross-equipment collaborative shooting method, related device and system
CN115359105B (en) Depth-of-field extended image generation method, device and storage medium
CN109923850A (en) A kind of image capture device and method
CN111064895A (en) Virtual shooting method and electronic equipment
CN113329172A (en) Shooting method and device and electronic equipment
CN115272138A (en) Image processing method and related device
CN110086998A (en) A kind of image pickup method and terminal
CN111107281B (en) Image processing method, image processing apparatus, electronic device, and medium
WO2023124202A1 (en) Image processing method and electronic device
JP2005073161A (en) Processing apparatus and image recording method
CN113542463A (en) Video shooting device and method based on folding screen, storage medium and mobile terminal
CN115914860A (en) Shooting method and electronic equipment
CN104994294B (en) A kind of image pickup method and mobile terminal of more wide-angle lens

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant