WO2020220973A1 - Photographing method and mobile terminal - Google Patents

Photographing method and mobile terminal

Info

Publication number
WO2020220973A1
WO2020220973A1 · PCT/CN2020/084225 · CN2020084225W
Authority
WO
WIPO (PCT)
Prior art keywords
angle
image sensor
preset
mobile terminal
viewfinder lens
Prior art date
Application number
PCT/CN2020/084225
Other languages
English (en)
Chinese (zh)
Inventor
Xu Wei (徐伟)
Original Assignee
Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Publication of WO2020220973A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60: Control of cameras or camera modules

Definitions

  • the present disclosure relates to the technical field of communication applications, and in particular, to a shooting method and a mobile terminal.
  • the camera function is used very frequently.
  • In the related art, the front of the mobile phone must have a window for camera shooting.
  • An elevating (pop-up) camera therefore becomes a better solution to this problem.
  • The purpose of the present disclosure is to provide a photographing method and a mobile terminal to solve the problem that the elevating camera in the related art has difficulty automatically tracking a subject.
  • the embodiments of the present disclosure provide a shooting method applied to a mobile terminal.
  • the mobile terminal includes a housing, a viewfinder lens, and an image sensor.
  • The housing is provided with an opening; the viewfinder lens is movably arranged in the housing so that it can extend from or retract into the opening, and is rotatable around the axis of the opening; the image sensor is arranged inside the housing; and the viewfinder lens projects the light entering it onto the image sensor. The shooting method includes:
  • the image frame is generated by shooting the subject.
  • An embodiment of the present disclosure also provides a mobile terminal, including a housing, a viewfinder lens, and an image sensor; the housing is provided with an opening, the viewfinder lens is movably disposed on the housing to extend from or retract into the opening and is rotatable around the axis of the opening, the image sensor is arranged inside the housing, and the viewfinder lens projects the light entering it onto the image sensor. The mobile terminal further includes:
  • a determining module configured to determine a shooting object when it is detected that the viewfinder lens protrudes from the housing through the opening;
  • an acquiring module configured to acquire the target angle between the position of the photographing object and the preset position of the image sensor within a preset time period before the target moment, where the target moment is the moment at which each image frame is generated from the photographing object;
  • An adjustment module configured to adjust the target angle to the preset angle by controlling the rotation of the viewfinder lens when the target angle is different from the preset angle;
  • the processing module is configured to photograph the subject to generate the image frame.
  • The embodiments of the present disclosure also provide a mobile terminal, including a processor, a memory, and a computer program stored on the memory and runnable on the processor; when the computer program is executed by the processor, the steps of the above shooting method are realized.
  • the embodiments of the present disclosure also provide a computer-readable storage medium having a computer program stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the shooting method described above are implemented.
  • The above technical solutions of the embodiments of the present disclosure rotate the viewfinder lens to ensure that the target angle between the position of the shooting object and the preset position of the image sensor equals the preset angle, thereby achieving real-time tracking of the shooting object.
  • FIG. 1 is a schematic diagram of imaging through a viewfinder lens and a camera in an embodiment of the disclosure
  • FIG. 2 is a schematic diagram of the first light path during imaging in an embodiment of the disclosure
  • FIG. 3 is a schematic diagram of a second light path during imaging in an embodiment of the disclosure.
  • FIG. 4 is a schematic flowchart of a shooting method according to an embodiment of the disclosure.
  • FIG. 5 is a schematic diagram of the driving motor rotating in an embodiment of the disclosure.
  • FIG. 6 is a schematic diagram of the relationship between θ (the difference between the target angle and the preset angle) and the number of image frames;
  • Figure 7 is a schematic diagram of the rotation curve corresponding to the uniform rotation and the non-uniform rotation of the motor
  • FIG. 8 is one of schematic diagrams of modules of a mobile terminal provided by an embodiment of the disclosure.
  • FIG. 9 is a second schematic diagram of modules of a mobile terminal provided by an embodiment of the disclosure.
  • FIG. 10 is a structural block diagram of a mobile terminal provided by an embodiment of the disclosure.
  • the shooting method of the embodiment of the present disclosure is applied to a mobile terminal.
  • The mobile terminal includes a housing, a viewfinder lens, and an image sensor; the housing is provided with an opening, the viewfinder lens is movably provided on the housing to extend from or retract into the opening and is rotatable around the axis of the opening, the image sensor is arranged inside the housing, and the viewfinder lens projects the light entering it onto the image sensor.
  • the above-mentioned viewfinder lens may be specifically a mirror or a reflective lens.
  • The mobile terminal of the embodiment of the present disclosure further includes a camera lens arranged inside the housing and connected to the image sensor, as shown in FIG. 1.
  • CMOS: Complementary Metal Oxide Semiconductor; CCD: Charge Coupled Device.
  • The image sensor converts the resulting image into binary data, and the image signal processor (ISP) in the terminal's central processing unit (CPU) converts the data into display data that is shown on the display screen, so that the user can see the captured image.
  • When the subject is at A, the image formed by the subject on the image sensor is A'; when the subject is at A1, the image is A1'; and when the subject is at A2, the image is A2'.
  • The frame rate of the camera shooting preview is generally 30 frames per second.
  • For two adjacent frames, the horizontal distance x that the subject's image moves on the image sensor can be obtained, and for each image frame the distance d from the image sensor to the camera lens can be obtained through the Hall device on the closed-loop motor.
  • From x and d, the angle by which the subject moved relative to the camera between the two adjacent frames is obtained, so the angle of the subject relative to the camera can be tracked in real time.
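The geometry just described can be sketched in Python (this is illustrative and not part of the patent; the function name and millimeter units are assumptions): the per-frame angle follows from the horizontal displacement x of the subject's image on the sensor and the sensor-to-lens distance d.

```python
import math

def subject_angle(x_mm: float, d_mm: float) -> float:
    """Angle (degrees) the subject moved relative to the camera between
    two adjacent frames, given the horizontal distance x_mm its image
    shifted on the sensor and the sensor-to-lens distance d_mm."""
    return math.degrees(math.atan2(x_mm, d_mm))
```

For example, a 1 mm shift with the sensor 1 mm behind the lens corresponds to a 45-degree deviation, while zero shift means the subject has not moved off axis.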
  • the purpose of tracking shooting is to keep the shooting object at a specific position of the image sensor (for example, the middle position of the image sensor) for imaging.
  • an embodiment of the present disclosure provides a shooting method, including:
  • Step 401 When it is detected that the viewfinder lens protrudes from the housing through the opening, determine the shooting object.
  • the processor controls the viewfinder lens to extend out of the housing through the opening.
  • the camera module is activated, and the camera determines the shooting target selected by the user as the shooting object.
  • the subject can be a human face.
  • the camera recognizes the image of the photographed object and collects feature points, so that the photographed object can be tracked and photographed later.
  • Step 402: Acquire a target angle between the position of the photographic subject and the preset position of the image sensor within a preset period of time before the target moment, where the target moment is the moment at which each image frame is generated from the photographic subject.
  • The target angle refers to the angle by which the current position of the photographing object deviates from the preset position of the image sensor. Before each image frame is generated, the target angle between the position of the photographing object and the preset position of the image sensor is obtained so that the rotation of the viewfinder lens can be controlled according to the target angle, thereby ensuring that the image formed by the subject on the image sensor is always located at the preset position of the image sensor.
  • the preset position of the image sensor is the center position of the image sensor.
  • Determining the preset position of the image sensor as the center position of the image sensor ensures that the image formed by the subject on the image sensor is always located at the center position of the image sensor, which conforms to the user's shooting habits.
  • Step 403 When the target angle is different from the preset angle, adjust the target angle to the preset angle by controlling the rotation of the viewfinder lens.
  • the preset angle can be specifically set to 0 degrees.
  • The viewfinder lens is controlled to rotate to ensure that the angle between the position of the shooting object and the preset position of the image sensor is 0 degrees, that is, to ensure that the image formed by the subject on the image sensor is always at the center of the image sensor.
  • Step 404 Take a picture of the subject to generate the image frame.
  • the image frame is generated by controlling the camera to shoot the subject.
  • In this method, when it is detected that the viewfinder lens extends from the housing through the opening, the shooting object is determined; within a preset time period before the target moment, the target angle between the position of the shooting object and the preset position of the image sensor is acquired, the target moment being the moment at which each image frame is generated from the photographing object; when the target angle differs from the preset angle, the rotation of the viewfinder lens is controlled to adjust the target angle to the preset angle; and the subject is photographed to generate the image frame.
  • When each image frame is taken, the viewfinder lens is rotated to ensure that the target angle between the position of the subject and the preset position of the image sensor is the preset angle, so that real-time tracking of the subject is achieved.
  • the preset time period is a time period corresponding to shooting one image frame.
  • If the time corresponding to shooting one image frame is n milliseconds, that is, an image frame is generated at time n, time 2n, time 3n, and so on, then the target angle between the position of the subject and the preset position of the image sensor is acquired in the periods 0–n, n–2n, 2n–3n, and so on.
  • If the frame rate of the camera is 30 frames per second, the time period corresponding to shooting one image frame is 33.3 ms; specifically, the target angle between the position of the subject and the preset position of the image sensor is acquired in the time periods 0–33.3 ms, 33.3–66.6 ms, and so on.
  • The target angle obtained in the 0–n time period is the angle between the subject and the image sensor at time 0; the target angle obtained in the n–2n time period is the angle at time n; the target angle obtained in the 2n–3n time period is the angle at time 2n; and so on.
  • Before each image frame is generated, the target angle between the position of the shooting object and the preset position of the image sensor is acquired, ensuring that the image formed by the subject on the image sensor always stays close to the center of the image sensor; that is, tracking shooting is realized.
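The per-frame timing above can be made concrete with a short sketch (not from the patent; the constant and function names are hypothetical): at a 30 fps preview, the acquisition window preceding the frame generated at time k·n is [(k−1)·n, k·n).

```python
FRAME_PERIOD_MS = 1000 / 30  # one frame every ~33.3 ms at 30 fps

def acquisition_windows(num_frames):
    """Per-frame acquisition windows in milliseconds: the target angle
    for the frame generated at time k*n is acquired during the
    preceding window [(k-1)*n, k*n)."""
    n = FRAME_PERIOD_MS
    return [((k - 1) * n, k * n) for k in range(1, num_frames + 1)]
```

The first window is 0–33.3 ms, the second 33.3–66.6 ms, and so on, matching the periods listed in the text.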
  • the mobile terminal further includes a camera lens, and the camera lens is arranged in the housing and connected with the image sensor;
  • Acquiring the target angle between the current position of the shooting object and the preset position of the image sensor includes:
  • acquiring a first distance and a second distance, where the first distance is the distance between the image formed by the subject on the image sensor and the preset position of the image sensor, and the second distance is the distance between the image sensor and the camera lens; and obtaining the target angle according to the first distance and the second distance.
  • The first distance can be obtained by the processor counting pixels, and the second distance can be obtained by the Hall device on a closed-loop motor.
  • the mobile terminal further includes: a motor connected to the viewfinder lens;
  • Adjusting the target angle to the preset angle by controlling the rotation of the viewfinder lens includes:
  • There is a corresponding relationship between the angle of motor rotation and the angle of viewfinder-lens rotation.
  • The first angle, by which the viewfinder lens must rotate, is the difference between the target angle and the preset angle; according to the correspondence between motor rotation and lens rotation, the second angle of motor rotation is obtained, and the processor then controls the motor to rotate by the second angle within the preset time period.
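The two-step conversion above can be sketched as follows. The linear motor-to-lens ratio is an assumption made for illustration; the patent only states that a correspondence between the two angles exists.

```python
def lens_and_motor_angles(target_deg, preset_deg, motor_per_lens=2.0):
    """First angle: rotation required of the viewfinder lens (target
    angle minus preset angle). Second angle: the motor rotation that
    produces it, assuming a linear transmission ratio motor_per_lens
    (a hypothetical value)."""
    first_angle = target_deg - preset_deg
    second_angle = first_angle * motor_per_lens
    return first_angle, second_angle
```

With a preset angle of 0 degrees, the first angle is simply the measured target angle, and no rotation is needed when the two angles already coincide.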
  • The processor sends an enable instruction (EN) to the motor driver and selects the drive mode (subdivision drive level) of the motor driver through the MODE pin.
  • The motor driver performs pulse-width modulation (PWM), and the number of pulses determines how many steps the motor takes, achieving precise control of the motor.
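The pulse-count relationship can be illustrated with a sketch; the 1.8-degree full-step angle and 1/16 microstepping level are typical stepper values assumed here, not figures from the patent.

```python
def pulse_count(motor_angle_deg, full_step_deg=1.8, microsteps=16):
    """Number of pulses the driver must emit for the stepper motor to
    advance by motor_angle_deg, given the motor's full-step angle and
    the subdivision (microstepping) level selected via the MODE pins."""
    step_deg = full_step_deg / microsteps  # angle covered by one pulse
    return round(motor_angle_deg / step_deg)
```

A finer subdivision level raises the pulse count for the same angle, which is what gives the driver its precise control over the motor.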
  • the difference between the rotation speed of the motor at the last moment of the preset time period and zero is less than a preset threshold.
  • The motor can rotate with a gradually decreasing speed throughout the preset time period, or it can rotate at a uniform speed first and then decelerate; either way, the motor speed approaches 0 at the last moment.
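One simple way to realize the "first fast, then slow" motion is to split the rotation into linearly decaying increments. This is a sketch under an assumed linear decay, not the patent's actual motion profile.

```python
def decel_increments(total_angle_deg, intervals=10):
    """Split a rotation into per-interval increments whose size decays
    linearly toward zero, so the motor speed is close to 0 at the last
    moment of the preset time period and inertia overshoot is limited."""
    weights = list(range(intervals, 0, -1))  # intervals, ..., 2, 1
    total = sum(weights)
    return [total_angle_deg * w / total for w in weights]
```

The increments sum to the full rotation while the final interval moves the least, which is exactly the condition that keeps the end-of-period speed near zero.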
  • The purpose of tracking shooting is to always keep the subject in the middle of the image sensor; that is, after reaching a steady state, the value of θ (the difference between the target angle and the preset angle) tends as close to 0 as possible.
  • the error of the motor needs to be taken into account.
  • The error of the motor mainly comes from inertia: after completing the number of steps given by the motor driver, the motor continues to move forward, and this overshoot is directly related to the motor speed.
  • There may be a situation as shown in FIG. 6 (assuming that the shooting target is fixed and the mobile phone is also fixed).
  • The motor rotates at a non-uniform speed within time t, first fast and then slow.
  • When the detected angle of motor rotation is close to θ, the speed of motor B is also close to 0, eliminating the influence of inertia.
  • the time of each frame is about 33.3ms.
  • In order to achieve the effect of tracking shooting, the motor must be able to rotate through the angle θ within the time of one frame (33.3 ms).
  • θ also needs to satisfy −0.5·FOV ≤ θ ≤ 0.5·FOV, where FOV is the field of view of the camera. Obviously, when θ exceeds this range, the camera will not be able to capture the target in the next frame and cannot track and shoot it.
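The two constraints above can be combined into a simple feasibility check (a sketch; the maximum motor speed parameter is an assumption introduced to express the "rotate θ within one frame" condition):

```python
def can_track(theta_deg, fov_deg, max_speed_deg_per_s, frame_ms=1000 / 30):
    """Tracking is feasible only if the subject stays inside the field
    of view (|theta| <= 0.5 * FOV) and the motor can rotate through
    theta within one frame period (about 33.3 ms at 30 fps)."""
    in_fov = abs(theta_deg) <= 0.5 * fov_deg
    fast_enough = abs(theta_deg) <= max_speed_deg_per_s * frame_ms / 1000
    return in_fov and fast_enough
```

A subject 10 degrees off axis is trackable with an 80-degree FOV, while one 50 degrees off axis falls outside ±0.5·FOV and is lost in the next frame.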
  • With the shooting method of the embodiment of the present disclosure, rotating the viewfinder lens ensures that, when each image frame is shot, the target angle between the position of the shooting object and the preset position of the image sensor is the preset angle, so as to realize real-time tracking of the shooting object.
  • An embodiment of the present disclosure also provides a mobile terminal, including a housing, a viewfinder lens, and an image sensor.
  • The housing is provided with an opening, the viewfinder lens is movably disposed on the housing to extend from or retract into the opening and is rotatable around the axis of the opening, the image sensor is arranged inside the housing, and the viewfinder lens projects the light entering it onto the image sensor.
  • As shown in FIG. 8, the mobile terminal 800 further includes:
  • the determining module 801 is configured to determine the shooting object when detecting that the viewfinder lens extends from the housing through the opening;
  • the obtaining module 802, configured to obtain a target angle between the position of the photographing object and the preset position of the image sensor within a preset period of time before the target moment, where the target moment is the moment at which each image frame is generated from the photographing object;
  • the adjustment module 803 is configured to adjust the target angle to the preset angle by controlling the rotation of the viewfinder lens when the target angle is different from the preset angle;
  • the processing module 804 is configured to photograph the subject to generate the image frame.
  • the preset time period is a time period corresponding to the shooting of one image frame.
  • the mobile terminal of the embodiment of the present disclosure further includes a camera lens, the camera lens is arranged in the housing and connected to the image sensor; as shown in FIG. 9, the acquisition module 802 includes:
  • the first acquisition sub-module 8021, used to acquire a first distance and a second distance, where the first distance is the distance between the image formed by the subject on the image sensor and the preset position of the image sensor, and the second distance is the distance between the image sensor and the camera lens;
  • the second obtaining submodule 8022 is configured to obtain the target angle according to the first distance and the second distance.
  • the mobile terminal of the embodiment of the present disclosure further includes: a motor connected to the viewfinder lens;
  • the adjustment module 803 includes:
  • the first determining submodule 8031 is configured to determine the first angle of rotation of the viewfinder lens according to the target angle and the preset angle;
  • the second determining sub-module 8032 is configured to determine a second angle of rotation of the motor according to the first angle
  • the control sub-module 8033 is configured to control the motor to rotate the second angle within the preset time period.
  • the difference between the rotation speed of the motor at the last moment of the preset time period and zero is less than a preset threshold.
  • the preset position of the image sensor is the center position of the image sensor.
  • By rotating the viewfinder lens, the mobile terminal of the embodiment of the present disclosure can ensure that, when each image frame is shot, the target angle between the position of the subject and the preset position of the image sensor is the preset angle, thereby realizing real-time tracking of the subject.
  • the mobile terminal is a mobile terminal corresponding to the foregoing method embodiment, and all implementation manners in the foregoing method embodiment are applicable to the embodiment of the mobile terminal, and the same technical effect can also be achieved.
  • FIG. 10 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present disclosure.
  • the mobile terminal 1000 includes but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, and a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, a power supply 1011 and other components.
  • Those skilled in the art can understand that the structure of the mobile terminal shown in FIG. 10 does not constitute a limitation on the terminal.
  • The mobile terminal may include more or fewer components than those shown in the figure, combine certain components, or use a different component arrangement.
  • mobile terminals include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the mobile terminal also includes a housing, a viewfinder lens, and an image sensor.
  • the housing is provided with an opening, and the viewfinder lens is movably provided on the housing to extend or retract from the opening, and The viewfinder lens is rotatable around the axis of the opening, the image sensor is arranged inside the housing, and the viewfinder lens is used for projecting the light incident on the viewfinder lens to the image sensor.
  • the processor 1010 is configured to detect that the viewfinder lens protrudes from the housing through the opening, and determine the shooting object;
  • the image frame is generated by shooting the subject.
  • The above technical solutions of the embodiments of the present disclosure rotate the viewfinder lens to ensure that the target angle between the position of the shooting object and the preset position of the image sensor equals the preset angle, thereby achieving real-time tracking of the shooting object.
  • The radio frequency unit 1001 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and handed to the processor 1010 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1001 can also communicate with the network and other devices through a wireless communication system.
  • the mobile terminal provides users with wireless broadband Internet access through the network module 1002, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 1003 can convert the audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into audio signals and output them as sounds. Moreover, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1004 is used to receive audio or video signals.
  • The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042; the graphics processor 10041 processes the image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame can be displayed on the display unit 1006.
  • the image frame processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or sent via the radio frequency unit 1001 or the network module 1002.
  • the microphone 10042 can receive sound and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1001 for output.
  • the mobile terminal 1000 also includes at least one sensor 1005, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved to the ear.
  • The accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify terminal posture (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and vibration-recognition-related functions (such as pedometer and tapping). The sensor 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 1006 is used to display information input by the user or information provided to the user.
  • the display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 1007 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the mobile terminal.
  • the user input unit 1007 includes a touch panel 10071 and other input devices 10072.
  • The touch panel 10071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed on or near the touch panel 10071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 10071 may include two parts, a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 1010; it also receives and executes commands sent by the processor 1010.
  • the touch panel 10071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1007 may also include other input devices 10072.
  • other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 10071 can be overlaid on the display panel 10061.
  • When the touch panel 10071 detects a touch operation on or near it, it transmits the operation to the processor 1010 to determine the type of the touch event, and the processor 1010 then provides the corresponding visual output on the display panel 10061 according to the type of the touch event.
  • Although in FIG. 10 the touch panel 10071 and the display panel 10061 are two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal; this is not specifically limited here.
  • the interface unit 1008 is an interface for connecting an external device with the mobile terminal 1000.
  • The external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • The interface unit 1008 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the mobile terminal 1000, or can be used to transfer data between the mobile terminal 1000 and an external device.
  • the memory 1009 can be used to store software programs and various data.
  • the memory 1009 may mainly include a program storage area and a data storage area.
  • The program storage area may store an operating system and application programs required by at least one function (such as a sound playback function or an image playback function); the data storage area may store data created by the use of the mobile phone (such as audio data and a phone book).
  • The memory 1009 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 1010 is the control center of the mobile terminal. It uses various interfaces and lines to connect the parts of the entire mobile terminal, runs or executes the software programs and/or modules stored in the memory 1009, and calls the data stored in the memory 1009 to perform the various functions of the mobile terminal and process data, thereby monitoring the mobile terminal as a whole.
  • The processor 1010 may include one or more processing units; optionally, the processor 1010 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 1010.
  • The mobile terminal 1000 may also include a power supply 1011 (such as a battery) for supplying power to the various components.
  • Optionally, the power supply 1011 may be logically connected to the processor 1010 through a power management system, so that functions such as charging, discharging, and power-consumption management are handled through the power management system.
  • the mobile terminal 1000 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and running on the processor.
  • when the computer program is executed by the processor, the steps of the aforementioned shooting method are implemented.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, each process of the above-mentioned shooting method embodiment is realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium is, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
  • the method of the above embodiments can be implemented by means of software plus a necessary general hardware platform; of course, it can also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present disclosure, in essence or in the part that contributes to the related technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to make a terminal (which can be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) execute the method described in each embodiment of the present disclosure.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place, or they may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • if the function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the computer software product is stored in a storage medium and includes several instructions to make a computer device (which may be a personal computer, a server, or a network device, etc.) execute all or part of the steps of the methods described in the various embodiments of the present disclosure.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or other media that can store program codes.
  • the program can be stored in a computer-readable storage medium, and when executed, it may include the procedures of the above-mentioned method embodiments.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.
  • the modules, units, and sub-units can be implemented in one or more Application Specific Integrated Circuits (ASIC), Digital Signal Processors (DSP), Digital Signal Processing Devices (DSPD), Programmable Logic Devices (PLD), Field-Programmable Gate Arrays (FPGA), general-purpose processors, controllers, microcontrollers, microprocessors, other electronic units used to implement the described functions, or combinations thereof.
  • the technology described in the embodiments of the present disclosure can be implemented through modules (for example, procedures, functions, etc.) that perform the functions described in the embodiments of the present disclosure.
  • the software codes can be stored in the memory and executed by the processor.
  • the memory can be implemented in the processor or external to the processor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a photographing method and a mobile terminal. The photographing method of the present invention comprises the following steps: when it is detected that a viewfinder lens extends out of a housing through an opening, determining a photographed object; obtaining a target angle between the position of the photographed object and a preset position of an image sensor within a preset time period before a target moment, the target moment being a moment at which each image frame is generated from the photographed object; when the target angle differs from a preset angle, adjusting the target angle to the preset angle by controlling the viewfinder lens to rotate; and photographing the photographed object to generate the image frame.
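The control flow described in the abstract — sample the target angle shortly before each frame, rotate the viewfinder lens only when that angle deviates from the preset angle, then capture the frame — can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the `Camera` class, its methods, and the `TOLERANCE` deadband are all hypothetical stand-ins introduced for illustration.

```python
# Hypothetical sketch of the abstract's shooting flow. The Camera class and
# its methods are illustrative stand-ins, not part of the patent disclosure.

PRESET_ANGLE = 0.0   # preset angle (degrees) between object and image sensor
TOLERANCE = 0.5      # assumed deadband: skip rotation for negligible offsets

class Camera:
    """Minimal mock of a pop-up viewfinder lens that can rotate."""

    def __init__(self, object_angle):
        # Angle of the photographed object relative to the image sensor's
        # preset position, as sampled in the preset period before capture.
        self.object_angle = object_angle

    def measure_angle(self):
        # Target angle obtained before the target moment (frame generation).
        return self.object_angle

    def rotate(self, delta):
        # Rotating the lens shifts the object's angle relative to the sensor.
        self.object_angle += delta

    def capture(self):
        # Generate one image frame at the current lens orientation.
        return {"frame": "image", "angle": self.object_angle}

def shoot_frame(camera):
    """Rotate the lens if the target angle differs from the preset, then shoot."""
    target_angle = camera.measure_angle()
    if abs(target_angle - PRESET_ANGLE) > TOLERANCE:
        camera.rotate(PRESET_ANGLE - target_angle)
    return camera.capture()

frame = shoot_frame(Camera(object_angle=15.0))
print(frame["angle"])  # → 0.0 (lens rotated so the angle matches the preset)
```

In this sketch the rotation is applied once per frame, matching the abstract's per-frame target moment; a real module would drive a stepper motor and re-sample continuously.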
PCT/CN2020/084225 2019-04-29 2020-04-10 Procédé de photographie et terminal mobile WO2020220973A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910355129.7 2019-04-29
CN201910355129.7A CN110049221B (zh) 2019-04-29 2019-04-29 拍摄方法及移动终端

Publications (1)

Publication Number Publication Date
WO2020220973A1 true WO2020220973A1 (fr) 2020-11-05

Family

ID=67280169

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/084225 WO2020220973A1 (fr) 2019-04-29 2020-04-10 Procédé de photographie et terminal mobile

Country Status (2)

Country Link
CN (1) CN110049221B (fr)
WO (1) WO2020220973A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112843678A (zh) * 2020-12-31 2021-05-28 上海米哈游天命科技有限公司 拍摄图像的方法、装置、电子设备及存储介质
CN115802133A (zh) * 2021-09-10 2023-03-14 Oppo广东移动通信有限公司 摄像头模组、电子设备、图像获取方法、装置、存储介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110049221B (zh) * 2019-04-29 2021-06-15 维沃移动通信(杭州)有限公司 拍摄方法及移动终端
CN113438416B (zh) * 2021-06-21 2022-12-09 北京小米移动软件有限公司 图像数量获取方法和装置、电子设备、存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002287239A (ja) * 2001-03-28 2002-10-03 Mitsubishi Electric Corp 屋外複合一体型カメラ
CN106331471A (zh) * 2015-07-10 2017-01-11 宇龙计算机通信科技(深圳)有限公司 一种自动跟踪摄像方法、装置、移动终端和旋转支架
CN107888805A (zh) * 2016-09-29 2018-04-06 贵州火星探索科技有限公司 一种手机摄像头拍照跟踪装置和方法
CN108388314A (zh) * 2018-04-04 2018-08-10 深圳天珑无线科技有限公司 一种电子设备及其摄像头组件
CN108391058A (zh) * 2018-05-17 2018-08-10 Oppo广东移动通信有限公司 图像拍摄方法、装置、电子装置及存储介质
CN109302552A (zh) * 2018-10-30 2019-02-01 维沃移动通信(杭州)有限公司 一种终端设备及终端设备的控制方法
CN110049221A (zh) * 2019-04-29 2019-07-23 维沃移动通信(杭州)有限公司 拍摄方法及移动终端

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101420525A (zh) * 2007-10-26 2009-04-29 鸿富锦精密工业(深圳)有限公司 拍照装置及方法
CN103645749A (zh) * 2013-12-23 2014-03-19 张志增 自动调节型显示设备及调节方法
WO2016151925A1 (fr) * 2015-03-26 2016-09-29 富士フイルム株式会社 Dispositif de commande de suivi, procédé de commande de suivi, programme de commande de suivi et système de capture d'image/de suivi automatique
US20160352992A1 (en) * 2015-05-27 2016-12-01 Gopro, Inc. Image Stabilization Mechanism
CN205123805U (zh) * 2015-11-02 2016-03-30 维沃移动通信有限公司 一种移动终端
CN206413038U (zh) * 2016-10-31 2017-08-15 维沃移动通信有限公司 一种摄像头及移动终端
CN108769496B (zh) * 2018-08-08 2020-11-27 深圳市广和通无线股份有限公司 一种摄像组件及电子设备
CN109389093A (zh) * 2018-10-23 2019-02-26 同济大学 基于人脸识别的面向型追踪方法


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112843678A (zh) * 2020-12-31 2021-05-28 上海米哈游天命科技有限公司 拍摄图像的方法、装置、电子设备及存储介质
CN112843678B (zh) * 2020-12-31 2023-05-23 上海米哈游天命科技有限公司 拍摄图像的方法、装置、电子设备及存储介质
CN115802133A (zh) * 2021-09-10 2023-03-14 Oppo广东移动通信有限公司 摄像头模组、电子设备、图像获取方法、装置、存储介质

Also Published As

Publication number Publication date
CN110049221A (zh) 2019-07-23
CN110049221B (zh) 2021-06-15

Similar Documents

Publication Publication Date Title
WO2020220973A1 (fr) Procédé de photographie et terminal mobile
WO2019096123A1 (fr) Procédé de commande de caméra et terminal mobile
CN108702446B (zh) 一种拍照方法和终端
CN110213414B (zh) 一种驱动控制方法、终端及计算机可读存储介质
WO2021104197A1 (fr) Procédé de poursuite d'objet et dispositif électronique
WO2020108261A1 (fr) Procédé de photographie et terminal
WO2020238380A1 (fr) Procédé de photographie panoramique et dispositif terminal
WO2020221044A1 (fr) Procédé de commande pour un moteur pas à pas et terminal mobile
WO2019129020A1 (fr) Procédé de mise au point automatique d'un appareil photo, dispositif de stockage, et terminal mobile
CN109348020B (zh) 一种拍照方法及移动终端
WO2021104227A1 (fr) Procédé de photographie et dispositif électronique
WO2021103737A1 (fr) Procédé de photographie et dispositif électronique
WO2018098638A1 (fr) Dispositif électronique, procédé de photographie et appareil
WO2021013009A1 (fr) Procédé de photographie et équipement terminal
WO2021063099A1 (fr) Procédé de photographie et dispositif électronique
WO2020020134A1 (fr) Procédé de photographie et terminal mobile
WO2021179800A1 (fr) Dispositif électronique
CN108989644A (zh) 一种摄像头控制方法及终端
US12022190B2 (en) Photographing method and electronic device
WO2021190390A1 (fr) Procédé de réglage de longueur focale, dispositif électronique, support de stockage et produit-programme
CN109639975A (zh) 一种拍摄控制方法及移动终端
CN108881721B (zh) 一种显示方法及终端
WO2020151745A1 (fr) Terminal mobile et procédé de commande
CN110769154B (zh) 一种拍摄方法及电子设备
CN108317992A (zh) 一种物距测量方法及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20798865

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20798865

Country of ref document: EP

Kind code of ref document: A1