CN108270971A - Method, device, and computer-readable storage medium for mobile terminal focusing - Google Patents

Method, device, and computer-readable storage medium for mobile terminal focusing

Info

Publication number
CN108270971A
CN108270971A CN201810100689.3A
Authority
CN
China
Prior art keywords
focusing
mobile terminal
user
animation
focal length
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810100689.3A
Other languages
Chinese (zh)
Other versions
CN108270971B (en)
Inventor
彭灿灿 (Peng Cancan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201810100689.3A priority Critical patent/CN108270971B/en
Publication of CN108270971A publication Critical patent/CN108270971A/en
Application granted granted Critical
Publication of CN108270971B publication Critical patent/CN108270971B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses a method, a device, and a computer-readable storage medium for mobile terminal focusing. The method includes: after detecting that the camera application has been opened, obtaining the shooting preview picture; monitoring for a focusing trigger event and obtaining the focus position and focal length; passing the focus position and focal length to the application layer; receiving the vertex coordinates calculated by the application layer from the focus position and focal length; and drawing a focusing animation in the on-screen preview picture according to the vertex coordinates. The invention implements the focusing animation in the application layer and introduces GPU processing, which enables smoother and more varied focusing animations, relieves the workload of the HAL layer to a certain extent, speeds up the response of the underlying camera driver, and improves the user experience.

Description

Method, device, and computer-readable storage medium for mobile terminal focusing
【Technical field】
The present invention relates to the field of information technology, and more particularly to a method, a device, and a computer-readable storage medium for mobile terminal focusing.
【Background technology】
For a mobile phone camera whose focus mode is continuous autofocus (Continuous AF), focusing is triggered when the user taps the preview picture or when factors such as FV (sharpness), Gyro (gyroscope), or SAD (brightness change) vary. During focusing, the preview picture visibly stretches and scales; this is the focusing animation. In current non-fixed-focus cameras on the market, the autofocus process is basically implemented in the HAL layer: when a trigger factor is received, the focusing process starts, and the focusing animation is realized by cropping a region of the camera data. This serial processing on the CPU delays focusing to a certain extent, and the resulting animation is rather simple.
【Invention content】
In view of the above drawbacks, the present invention provides a method, a device, and a computer-readable storage medium for mobile terminal focusing.
A method of mobile terminal focusing includes: after detecting that the camera application has been opened, obtaining the shooting preview picture; monitoring for a focusing trigger event and obtaining the focus position and focal length; passing the focus position and focal length to the application layer; receiving the vertex coordinates calculated by the application layer from the focus position and focal length; and drawing a focusing animation in the on-screen preview picture according to the vertex coordinates.
Optionally, a focusing event is triggered when a tap by the user on the preview picture is received, or when the mobile terminal shakes or the shooting environment changes so that sharpness, brightness, or the gyroscope reading varies.
Optionally, the mobile terminal calculates a focusing duration from the focal length, and plays the focusing animation for the corresponding duration.
Optionally, the focusing duration is passed to the application layer, and the application layer calculates the zoom factor of the focusing animation from the focusing duration.
Optionally, the focusing animation can be selected by the user; according to the user's selection, the chosen focusing animation is played during focusing.
Optionally, the method further includes monitoring for a focusing-completed event; after focusing is completed, a message is sent to notify the user that focusing has finished.
Optionally, the application layer is an OpenGL drawing thread, and the OpenGL drawing thread runs on the graphics processor.
Optionally, the OpenGL drawing thread runs on the Android platform and implements the shooting focus function by calling APIs provided by Android.
In addition, the present invention also proposes a device for mobile terminal focusing. The device includes a shooting unit, a display unit, a user input unit, a processor, a graphics processor, a memory, and a communication bus. The shooting unit is used to obtain static pictures or video image data; the display unit is used to display information input by the user or provided to the user; the user input unit is used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal; the graphics processor is used to process the static picture or video image data obtained by the shooting unit; the communication bus is used to implement connection and communication between the processor and the memory; the memory is used to store application data; and the processor is used to execute the mobile terminal focusing program stored in the memory, so as to realize the following steps:
After detecting that the camera application has been opened, obtaining the shooting preview picture; monitoring for a focusing trigger event and obtaining the focus position and focal length; passing the focus position and focal length to the application layer; receiving the vertex coordinates calculated by the application layer from the focus position and focal length; and drawing a focusing animation in the on-screen preview picture according to the vertex coordinates.
Optionally, a focusing event is triggered when a tap by the user on the preview picture is received, or when the mobile terminal shakes or the shooting environment changes so that sharpness, brightness, or the gyroscope reading varies.
Optionally, the mobile terminal calculates a focusing duration from the focal length, and plays the focusing animation for the corresponding duration.
Optionally, the focusing duration is passed to the application layer, and the application layer calculates the zoom factor of the focusing animation from the focusing duration.
Optionally, the focusing animation can be selected by the user; according to the user's selection, the chosen focusing animation is played during focusing.
Optionally, the method further includes monitoring for a focusing-completed event; after focusing is completed, a message is sent to notify the user that focusing has finished.
Optionally, the application layer is an OpenGL drawing thread, and the OpenGL drawing thread runs on the graphics processor.
Optionally, the OpenGL drawing thread runs on the Android platform and implements the shooting focus function by calling APIs provided by Android.
In addition, the present invention also proposes a computer-readable storage medium. The computer-readable storage medium stores one or more programs, which can be executed by one or more processors to implement the method of mobile terminal focusing described above:
After detecting that the camera application has been opened, obtaining the shooting preview picture; monitoring for a focusing trigger event and obtaining the focus position and focal length; passing the focus position and focal length to the application layer; receiving the vertex coordinates calculated by the application layer from the focus position and focal length; and drawing a focusing animation in the on-screen preview picture according to the vertex coordinates.
Optionally, a focusing event is triggered when a tap by the user on the preview picture is received, or when the mobile terminal shakes or the shooting environment changes so that sharpness, brightness, or the gyroscope reading varies.
Optionally, the mobile terminal calculates a focusing duration from the focal length, and plays the focusing animation for the corresponding duration.
Optionally, the focusing duration is passed to the application layer, and the application layer calculates the zoom factor of the focusing animation from the focusing duration.
Optionally, the focusing animation can be selected by the user; according to the user's selection, the chosen focusing animation is played during focusing.
Optionally, the method further includes monitoring for a focusing-completed event; after focusing is completed, a message is sent to notify the user that focusing has finished.
Optionally, the application layer is an OpenGL drawing thread, and the OpenGL drawing thread runs on the graphics processor.
Optionally, the OpenGL drawing thread runs on the Android platform and implements the shooting focus function by calling APIs provided by Android.
Beneficial effects of the present invention: implementing the focusing animation in the application layer and introducing GPU processing enables smoother and more varied focusing animations, relieves the workload of the HAL layer to a certain extent, speeds up the response of the underlying camera driver, and improves the user experience. In addition, the present invention can also be applied to fixed-focus cameras: by realizing the focusing process in the application layer, a simulated focusing scene can be provided on low-end products, giving the user an experience similar to real focusing.
【Description of the drawings】
Fig. 1 is a hardware structure diagram of a mobile terminal implementing embodiments of the present invention.
Fig. 2 is a schematic diagram of the wireless communication system of the mobile terminal shown in Fig. 1.
Fig. 3 is a flowchart of embodiment one of the mobile terminal focusing method provided by the invention.
Fig. 4 is a flowchart of embodiment two of the mobile terminal focusing method provided by the invention.
Fig. 5 is a flowchart of embodiment three of the mobile terminal focusing method provided by the invention.
Fig. 6 is a module diagram of embodiment four, the device for mobile terminal focusing provided by the invention.
【Specific embodiment】
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are only intended to facilitate the description of the invention and have no specific meaning in themselves; therefore "module", "component", and "unit" may be used interchangeably.
A terminal can be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, tablet computers, laptops, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example; those skilled in the art will understand that, except for elements specific to mobile use, the constructions according to embodiments of the present invention can also be applied to fixed-type terminals.
Referring to Fig. 1, a hardware structure diagram of a mobile terminal implementing embodiments of the present invention, the mobile terminal 100 may include an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not limit the mobile terminal; a mobile terminal may include more or fewer components than shown, combine certain components, or arrange the components differently.
The components of the mobile terminal are introduced in detail below with reference to Fig. 1:
The radio frequency unit 101 can be used to receive and send signals during messaging or a call; specifically, downlink information from a base station is received and passed to the processor 110 for processing, and uplink data is sent to the base station. In general, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and so on. In addition, the radio frequency unit 101 can also communicate with a network and other devices via wireless communication, using any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is not an essential component of the mobile terminal and can be omitted as needed without changing the essence of the invention.
When the mobile terminal 100 is in a call signal receiving mode, call mode, recording mode, speech recognition mode, broadcast receiving mode, or the like, the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. It may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of static pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operating modes such as a telephone call mode, recording mode, or speech recognition mode, and can process such sound into audio data. In the telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The mobile terminal 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and, when stationary, can detect the magnitude and direction of gravity; it can be used for applications that recognize the posture of the phone (such as landscape/portrait switching, related games, and magnetometer pose calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, and infrared sensor can also be configured on the phone and are not described in detail here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations by the user on or near the touch panel 1071 using a finger, stylus, or any other suitable object or attachment) and drives the corresponding connected device according to a preset program. The touch panel 1071 may include a touch detection device and a touch controller: the touch detection device detects the touch position of the user and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. The touch panel 1071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and a power switch), a trackball, a mouse, a joystick, and the like; no specific limitation is made here.
Further, the touch panel 1071 can cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it passes the operation to the processor 110 to determine the type of touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 are implemented as two independent components to realize the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions; no specific limitation is made here.
The interface unit 108 serves as an interface through which at least one external device can connect to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input (for example, data information or electric power) from an external device and transmit the received input to one or more elements in the mobile terminal 100, or to transmit data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area can store an operating system, application programs required for at least one function (such as a sound playback function or an image playback function), and so on; the data storage area can store data created according to the use of the phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The processor 110 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 can integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The shooting unit 112 is used to shoot photos or video; the photos or video after shooting are stored in the memory 109 and can be displayed on the display unit 106.
The mobile terminal 100 may also include a power supply 111 (such as a battery) that supplies power to the various components. Preferably, the power supply 111 can be logically connected to the processor 110 through a power management system, so as to realize functions such as charge management, discharge management, and power consumption management through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, which is not described in detail here.
To facilitate understanding of the embodiments of the present invention, the communication network system on which the mobile terminal of the present invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the present invention. The communication network system is an LTE system of the universal mobile communication technology, and includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator IP service 204, which are communicatively connected in sequence.
Specifically, the UE 201 can be the above-described terminal 100, which is not described again here.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022. The eNodeB 2021 can be connected with other eNodeBs 2022 through a backhaul (for example, an X2 interface); the eNodeB 2021 is connected to the EPC 203 and can provide the UE 201 with access to the EPC 203.
The EPC 203 can include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving GateWay) 2034, a PGW (PDN GateWay) 2035, and a PCRF (Policy and Charging Rules Function) 2036. The MME 2031 is the control node that processes signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 provides registers to manage functions such as the home location register (not shown) and stores user-specific information about service features, data rates, and so on. All user data can be sent through the SGW 2034; the PGW 2035 can provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
The IP service 204 can include the Internet, an intranet, an IMS (IP Multimedia Subsystem), or other IP services.
Although the above is described taking the LTE system as an example, those skilled in the art should understand that the present invention is not only applicable to the LTE system but also to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems; no limitation is made here.
Based on the above mobile terminal hardware configuration and communication network system, embodiments of the method of the present invention are proposed.
Embodiment one
Referring to Fig. 3, a method of mobile terminal focusing includes:
S101: after detecting that the camera application has been opened, obtain the shooting preview picture.
The user starts the camera program on the mobile terminal (for example, opens the "camera" app on a smartphone). The camera program calls the camera of the mobile terminal to obtain the picture to be shot, and then displays this picture on the display screen of the mobile terminal.
S102: monitor for a focusing trigger event, and obtain the focus position and focal length.
When the mobile terminal starts the camera program to shoot, monitoring for focusing trigger events can begin. If, during shooting, the user selects a focus point by tapping the preview picture on the screen (for example, the user wants to use a person as the focus of the shot and taps the position of that person on the screen), the camera program detects a focusing trigger event. Likewise, when the user adjusts the position of the mobile terminal or the shooting environment changes (for example, the sun is covered by clouds), so that factors such as sharpness, the gyroscope reading, or brightness vary, the camera program also detects a focusing trigger event. After detecting a focusing trigger event, the camera program obtains the coordinates of the focus position of the shot (i.e., the screen pixel position mPivotX, mPivotY) and the focal length.
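To make the tap-to-focus step concrete, here is a small, hypothetical sketch of how a tap recorded in screen pixels (like the mPivotX, mPivotY above) can be mapped into the [-1000, 1000] focus coordinate system that the legacy Android `android.hardware.Camera.Area` convention uses. The class and method names are illustrative, not from the patent; only the [-1000, 1000] range is an Android convention.

```java
// Hypothetical sketch: map a tap position (screen pixels) to the legacy
// android.hardware.Camera.Area focus coordinate system, where the preview
// spans [-1000, 1000] on both axes.
public class FocusTapMapper {
    /** Convert one tap coordinate (pixels) to a driver coordinate in [-1000, 1000]. */
    static int toDriverCoord(float tapPx, float previewSizePx) {
        float normalized = tapPx / previewSizePx;          // 0.0 .. 1.0 across the preview
        int coord = Math.round(normalized * 2000f - 1000f);
        return Math.max(-1000, Math.min(1000, coord));     // clamp to the valid range
    }

    public static void main(String[] args) {
        // A tap at the center of a 1080x1920 preview maps to (0, 0);
        // a tap at the right edge maps to 1000.
        System.out.println(toDriverCoord(540f, 1080f));
        System.out.println(toDriverCoord(1080f, 1080f));
    }
}
```

The same mapping applies independently to each axis, so (mPivotX, mPivotY) yields one driver-space point around which a focus rectangle can be built.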
Focal length is a measure of how strongly an optical system converges or diverges light; it refers to the distance from the center of a lens to the focus where light converges. In a camera, it is the distance from the optical center of the lens to the imaging plane, such as the film, CCD, or CMOS sensor.
The lens of a camera is a group of lens elements. When light parallel to the main optical axis passes through the lens, it converges at a point called the focus, and the distance from the focus to the lens center (i.e., the optical center) is the focal length. A lens whose focal length is fixed is a fixed-focus (prime) lens; a lens whose focal length can be adjusted is a zoom lens.
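The focal length definition above can be tied to a standard piece of optics arithmetic. The thin-lens relation 1/f = 1/u + 1/v (with f the focal length, u the object distance, and v the image distance) is not part of the patent text but illustrates why refocusing moves the lens: as the subject distance u changes, the image distance v at which the sensor sees a sharp image changes too.

```java
// Illustrative arithmetic for the thin-lens relation 1/f = 1/u + 1/v.
// Not from the patent; standard geometric optics, all values in the same unit.
public class ThinLens {
    /** Image distance v for focal length f and object distance u. */
    static double imageDistance(double f, double u) {
        return 1.0 / (1.0 / f - 1.0 / u);
    }

    public static void main(String[] args) {
        // f = 50 mm lens, subject 5 m away: the sharp image forms ~50.5 mm
        // behind the optical center, slightly beyond the focal plane.
        System.out.println(imageDistance(50.0, 5000.0));
    }
}
```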
The Camera class API of the Android platform provides the ability to obtain the focus point and focal length during shooting. The Camera class is located under the android.hardware namespace and provides methods for operating the camera. A timed task is started in the camera program, which periodically calls the corresponding Android platform API to obtain the focus point and focal length of the current shooting picture.
S103: pass the focus position and focal length to the application layer.
After obtaining the coordinates of the focus position of the shot (i.e., the screen pixel position mPivotX, mPivotY) and the focal length, the camera program passes them to the OpenGL drawing thread of the application-layer program. The OpenGL drawing thread runs on the graphics processor and on the Android platform, and implements the shooting focus function by calling the APIs provided by Android.
OpenGL has seven major capabilities:
1. modeling:OpenGL shape libraries additionally provide multiple other than providing the drafting function of basic point, line, polygon Miscellaneous three-dimension object (ball, cone, polyhedron, teapot etc.) and complex curve and surface-rendering function.
2. transformation:The transformation of OpenGL shape libraries includes basic transformation and projective transformation.Basic transformation have translation, rotation, Four kinds of scaling, mirror image transformation, projective transformation have the two kinds of transformation of parallel projection (also known as orthogonal projection) and perspective projection.Its transformation side Method advantageously reduces the run time of algorithm, improves the display speed of 3-D graphic.
3. color mode is set:There are two types of OpenGL color modes, i.e. RGBA patterns and color index (Color Index)。
4. illumination and material setting:OpenGL light has self-luminous (Emitted Light), ambient light (Ambient Light it), diffuses (Diffuse Light) and bloom (Specular Light).Material is represented with light reflectivity. The color that object is finally reflected human eye in scene (Scene) is the RGB component of light and the reflectivity of material RGB component The color formed after multiplication.
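The per-component multiplication described for lighting and materials can be sketched directly (an illustrative example, not code from this patent; the function name is hypothetical):

```python
def reflected_color(light_rgb, material_reflectance):
    # Component-wise product of the light's RGB intensity and the
    # material's RGB reflectance, as in the OpenGL lighting model above.
    return tuple(l * m for l, m in zip(light_rgb, material_reflectance))

# A white light (1, 1, 1) on a material reflecting 80% red, 20% green,
# 0% blue is perceived as a predominantly red color.
print(reflected_color((1.0, 1.0, 1.0), (0.8, 0.2, 0.0)))  # → (0.8, 0.2, 0.0)
```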
5. Texture mapping (Texture Mapping): the OpenGL texture-mapping functions can express object surface details in a very lifelike way.
6. Bitmap display and image enhancement: besides basic copying and pixel reads and writes, OpenGL also provides special image-effect processing such as blending (Blending), antialiasing (anti-aliasing), and fog (fog). These effects make simulated objects more realistic and enhance the graphics display.
7. Double buffering (Double Buffering) animation: double buffering means a foreground buffer and a background buffer. In short, the background buffer computes the scene and generates the picture, while the foreground buffer displays the picture that the background buffer has finished drawing.
S104, the vertex coordinates calculated by the application layer according to the focal position and focal length are received.
The photographing program of the Android mobile terminal calls the API provided by the Android platform and uses the OpenGL drawing thread to calculate the vertex coordinates of the preview according to the focal position of the current shot (the screen pixel positions mPivotX and mPivotY) and the focal length (mScale), and then uses the vertex coordinates to draw the image preview to the screen. The vertex coordinates are calculated by a procedure in the drawing thread.
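The patent's actual vertex-coordinate procedure is not reproduced in this excerpt. As a minimal sketch of what such a computation could look like — assuming the preview is a quad in normalized device coordinates scaled about the pivot (mPivotX, mPivotY) by mScale; the formula is a hypothetical reconstruction, not the patent's code:

```python
def preview_vertices(m_pivot_x, m_pivot_y, m_scale):
    # Corners of the full-screen preview quad in normalized device coords.
    quad = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]
    # Scale each corner about the focus pivot: p' = pivot + (p - pivot) * s.
    return [(m_pivot_x + (x - m_pivot_x) * m_scale,
             m_pivot_y + (y - m_pivot_y) * m_scale) for x, y in quad]
```

With m_scale = 1.0 the quad is unchanged; larger scales enlarge the preview around the focus point, which is the visual basis of the zoom-style focusing animation.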
S105, the focusing animation is drawn in the preview picture on the screen according to the vertex coordinates.
The process by which the OpenGL drawing thread draws the image preview to the screen is as follows:
OpenGL uses a c/s model: c is the CPU and s is the GPU. The input from c to s consists of vertex information and texture information, and the output of s is the image shown on the display.
1. VBO/VAO (vertex buffer object / vertex array object):
The VBO/VAO holds the vertex information the CPU supplies to the GPU, including vertex positions, vertex colors (the colors of the vertices only, unrelated to texture colors), texture coordinates (used for texture mapping), and other vertex attributes.
2. VertexShader (vertex shader):
The vertex shader is a program that processes the vertex information provided by the VBO/VAO. It is executed once for each vertex the VBO/VAO provides. Uniforms (a type of variable) are the same for every vertex, while attributes differ from vertex to vertex. Each execution of the vertex shader outputs varyings and a gl_Position.
3. PrimitiveAssembly (primitive assembly):
The stage after the vertex shader is primitive assembly. A primitive is a geometric object such as a triangle, a line, or a point sprite. In this stage, the vertices output by the vertex shader are assembled into primitives.
4. Rasterization:
Rasterization is the process of converting primitives into a set of two-dimensional fragments, which are then processed by the fragment shader (they form the input of the fragment shader). These two-dimensional fragments represent the pixels that can be drawn on the screen. The mechanism that generates each fragment's values from the vertex-shader outputs assigned to the vertices of each primitive is called interpolation (Interpolation).
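The interpolation step can be illustrated with the one-dimensional case (a sketch only; GPUs extend this to barycentric weights over triangle vertices):

```python
def interpolate(v0, v1, t):
    # Linearly blend two vertex-shader outputs; t is the fragment's
    # normalized position between the two vertices (0.0 .. 1.0).
    return v0 + (v1 - v0) * t

# A fragment halfway along a line whose endpoint attributes are 0.0 and
# 1.0 receives the value 0.5.
```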
5. FragmentShader (fragment shader):
The fragment shader implements a general programmable operation on fragments (pixels). It is executed once for each fragment output by the rasterization stage, and generates one or more (multi-texture) color values as output.
6. Per-Fragment Operations (per-fragment operations):
(1) PixelOwnershipTest (pixel ownership test):
This test determines whether the pixel at position (x, y) in the framebuffer is owned by the current context. For example, if a window displaying a framebuffer is covered by another window, the window system can determine that the occluded pixels do not belong to this OpenGL context, and those pixels are therefore not displayed.
(2) ScissorTest (scissor test):
If the fragment lies outside the scissor region, it is discarded.
(3) StencilTest and DepthTest (stencil and depth tests):
The depth test is comparatively easy to understand: if the depth returned by the fragment shader is greater than the depth already stored in the depth buffer (i.e., the fragment is farther away), the fragment is discarded.
(4) Blending (blending):
The newly generated fragment color value is combined with the color value stored in the framebuffer to produce a new RGBA value.
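One common combination rule is source-over alpha blending (OpenGL's glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)); the sketch below illustrates that rule and is not code from the patent:

```python
def blend(src_rgba, dst_rgba):
    # out = src * src_alpha + dst * (1 - src_alpha), per channel.
    a = src_rgba[3]
    return tuple(s * a + d * (1.0 - a) for s, d in zip(src_rgba, dst_rgba))
```

A fully opaque source replaces the framebuffer color; a half-transparent source mixes the two evenly.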
(5) Dithering (dithering):
Finally, the generated fragments are placed into a framebuffer (the front buffer, the back buffer, or an FBO). If the target is not an FBO, the fragments in the screen drawing buffer generate the pixels on the screen.
The present embodiment implements the focusing animation proposed by this patent at the application layer and introduces GPU processing, which can realize smoother and more diversified focusing animations, relieves the workload of the HAL layer, speeds up the response of the underlying camera driver, and improves the user experience.
Embodiment two
With reference to figure 4, the present embodiment adds the following steps on the basis of embodiment one:
S106, the mobile terminal calculates the focusing duration according to the focal length, and plays a focusing animation of corresponding duration according to the focusing duration.
The photographing program of the Android mobile terminal calculates how long the current focusing operation will take according to the current focal length and the processing capability of the OpenGL drawing thread. The calculation depends on the GPU processing capability of the current mobile terminal: the stronger the processing capability, the shorter the time needed to complete focusing; the weaker the processing capability, the longer the time. Likewise, the longer the focal length, the longer focusing takes; the shorter the focal length, the shorter it takes.
The photographing program calculates the focusing duration of the current shot according to the focal length and the GPU processing capability (for example, with a focal length of 70 mm and a 4-core 1.0 GHz GPU, focusing takes 2 seconds to complete). The photographing program then plays an animation of the corresponding duration in the shooting preview picture. Here, the animation decomposes the picture changes of the zooming during focusing into many instantaneous frames and plays them continuously as a series of pictures, so that the eye perceives a continuously changing picture. For example, when a distant picture is zoomed in, playing the animation realizes the effect of the distant picture gradually enlarging.
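As an illustrative model only: the patent states the tendencies (duration grows with focal length and shrinks with GPU capability) and one data point (70 mm on a 4-core 1.0 GHz GPU takes 2 s), but gives no formula. The sketch below assumes a simple proportional relation, with the constant k chosen solely so that the stated data point holds:

```python
def focusing_duration_s(focal_length_mm, gpu_cores, gpu_ghz, k=8.0 / 70.0):
    # Hypothetical: duration proportional to focal length, inversely
    # proportional to aggregate GPU capability (cores * clock).
    return k * focal_length_mm / (gpu_cores * gpu_ghz)
```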
S107, the focusing duration is passed to the application-layer program, and the application-layer program calculates the coefficient of the scaling animation according to the focusing duration.
The photographing program determines the duration of the animation to be played according to the focusing duration, and at the same time calculates the scaling ratio of the animation according to the focusing duration. For example, if the animation scales the picture, the scaling ratio needed for the animation is calculated from the focusing duration: the longer the focusing duration, the larger the scaling of the animation.
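The duration-to-scale mapping might be sketched as a simple monotone function (base and rate are hypothetical parameters, not values from the patent; only the stated tendency — longer duration, larger scaling — is preserved):

```python
def zoom_factor(focusing_duration_s, base=1.0, rate=0.25):
    # Longer focusing duration -> larger animation scaling.
    return base + rate * focusing_duration_s
```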
The user can select, in the photographing program, the animation to be played during shooting; the photographing program provides a variety of animations for the user to choose from. After the user selects an animation, the selection is saved in the photographing program, and during shooting the animation selected by the user is played.
In the present embodiment, playing an animation during focusing improves the user experience and enhances the shooting effect.
Embodiment three
With reference to figure 5, the present embodiment adds the following steps on the basis of embodiment one:
S108, the focusing completion event is monitored; after focusing is completed, information is sent to prompt the user that focusing has been completed.
The photographing program starts listening for the focusing completion event. When OpenGL finishes focusing, it sends a message notifying the photographing program that focusing is complete. After receiving the focusing completion message sent by OpenGL, the photographing program stops playing the animation, and then prompts the user that focusing is complete and that shooting can proceed, for example by playing a sound (such as two short beeps) or showing a green indicator light on the screen.
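The listen / stop-animation / prompt flow can be sketched as a small observer (class name and prompt strings are illustrative, not from the patent):

```python
class FocusMonitor:
    def __init__(self):
        self.animation_playing = True
        self.prompts = []

    def on_focus_complete(self):
        # Called when the OpenGL drawing thread reports completion.
        self.animation_playing = False          # stop the focusing animation
        self.prompts.append("beep")             # e.g. play two short beeps
        self.prompts.append("green-indicator")  # show a green light on screen

monitor = FocusMonitor()
monitor.on_focus_complete()
```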
In the present embodiment, by listening for the focusing completion event, the user is prompted that focusing is complete, which makes shooting convenient and improves the user's shooting experience.
Embodiment four
With reference to figure 6, a device for mobile terminal focusing is a mobile terminal comprising: a display unit P106, a user input unit P107, a shooting unit P112, a processor P110, a graphics processor P1041, a memory P109, and a communication bus P108.
1) The display unit P106 is used to display information input by the user or information provided to the user;
2) The user input unit P107 is used to receive input numeric or character information and to generate key signal inputs related to the user settings and function control of the mobile terminal;
3) The shooting unit P112 is used to shoot photos or videos; the photos or videos after shooting are stored in the memory P109.
4) The graphics processor P1041 processes the image data of static pictures or videos obtained by an image capture device (such as a camera) in video acquisition mode or image capture mode.
5) The communication bus P108 is used to implement the connection and communication between the processor and the memory;
6) The memory P109 is used to store program data;
7) The processor P110 is used to execute the mobile terminal focusing program stored in the memory, to realize the following steps:
S101, after it is detected that the camera application program has been opened, the shooting preview picture is obtained.
The user starts the photographing program on the mobile terminal (for example, by opening the "camera" APP on a smartphone). The photographing program calls the camera of the mobile terminal to obtain the picture to be shot, and then displays that picture on the display screen of the mobile terminal.
S102, the focusing trigger event is monitored, and the focal position and focal length are obtained.
When the mobile terminal starts the photographing program to shoot, monitoring of the focusing trigger event is started. When the user selects a focus during shooting by clicking the preview picture on the mobile terminal screen (for example, the user wants to use a person as the focus of the shot and clicks the position of that person on the screen), the photographing program detects a focusing trigger event. Likewise, when the user adjusts the position of the mobile terminal or the shooting environment changes (for example, the sun is covered by clouds), so that factors such as sharpness, gyroscope readings, or brightness change, the photographing program detects a focusing trigger event. After detecting a focusing trigger event, the photographing program obtains the coordinates of the focal position of the current shot (i.e., the screen pixel positions mPivotX and mPivotY) and the focal length.
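The two trigger paths — a user tap on the preview, or sensor/scene changes beyond some threshold — can be sketched as follows (the threshold value and parameter names are hypothetical):

```python
def focus_triggered(tap=None, sharpness_delta=0.0,
                    brightness_delta=0.0, gyro_delta=0.0, threshold=0.1):
    if tap is not None:  # the user clicked the preview at (x, y)
        return True
    # Otherwise, re-focus when any monitored factor changes noticeably.
    return max(abs(sharpness_delta), abs(brightness_delta),
               abs(gyro_delta)) > threshold
```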
Focal length, also called focal distance, measures how strongly an optical system converges or diverges light; it is the distance from the center of the lens to the focus where light converges. In a camera, it is the distance from the optical center of the lens to the imaging plane, such as the film, CCD, or CMOS sensor.
A camera lens is a group of lens elements. When light parallel to the primary optical axis passes through the lens, the rays converge at a point; this point is called the focus, and the distance from the focus to the center of the lens (the optical center) is the focal length. A lens whose focal length is fixed is a fixed-focus (prime) lens; a lens whose focal length can be adjusted is a zoom lens.
The Camera class API of the Android platform provides the ability to obtain the focus and focal length during shooting. The Camera class is located under the android.hardware namespace and provides methods for operating the camera. The photographing program starts a timed task that periodically calls the corresponding Android API to obtain the focus and focal length of the current shooting picture.
S103, the focal position and focal length are transferred to the application layer.
After the photographing program obtains the coordinates of the focal position of the current shot (i.e., the screen pixel positions mPivotX and mPivotY) and the focal length, it passes the focal position and focal length to the OpenGL drawing thread of the application-layer program; the OpenGL drawing thread runs on the graphics processor. The OpenGL drawing thread runs on the Android platform and implements the shooting focus function by calling the APIs provided by Android.
OpenGL has seven major functions:
1. Modeling: in addition to drawing basic points, lines, and polygons, the OpenGL graphics library also provides functions for drawing complex three-dimensional objects (spheres, cones, polyhedra, teapots, etc.) as well as complex curves and surfaces.
2. Transformation: the transformations in the OpenGL graphics library include basic transformations and projection transformations. The four basic transformations are translation, rotation, scaling, and mirroring; the projection transformations are parallel (also called orthographic) projection and perspective projection. These transformation methods help reduce the running time of algorithms and improve the display speed of three-dimensional graphics.
3. Color mode setting: OpenGL has two color modes, RGBA mode and color index (Color Index) mode.
4. Lighting and material setting: OpenGL light consists of emitted light (Emitted Light), ambient light (Ambient Light), diffuse light (Diffuse Light), and specular light (Specular Light). Materials are represented by their light reflectance. The color of an object in a scene (Scene) as finally perceived by the human eye is the color formed by multiplying the RGB components of the light by the RGB reflectance components of the material.
5. Texture mapping (Texture Mapping): the OpenGL texture-mapping functions can express object surface details in a very lifelike way.
6. Bitmap display and image enhancement: besides basic copying and pixel reads and writes, OpenGL also provides special image-effect processing such as blending (Blending), antialiasing (anti-aliasing), and fog (fog). These effects make simulated objects more realistic and enhance the graphics display.
7. Double buffering (Double Buffering) animation: double buffering means a foreground buffer and a background buffer. In short, the background buffer computes the scene and generates the picture, while the foreground buffer displays the picture that the background buffer has finished drawing.
S104, the vertex coordinates calculated by the application layer according to the focal position and focal length are received.
The photographing program of the Android mobile terminal calls the API provided by the Android platform and uses the OpenGL drawing thread to calculate the vertex coordinates of the preview according to the focal position of the current shot (the screen pixel positions mPivotX and mPivotY) and the focal length (mScale), and then uses the vertex coordinates to draw the image preview to the screen. The vertex coordinates are calculated by a procedure in the drawing thread.
S105, the focusing animation is drawn in the preview picture on the screen according to the vertex coordinates.
The process by which the OpenGL drawing thread draws the image preview to the screen is as follows:
OpenGL uses a c/s model: c is the CPU and s is the GPU. The input from c to s consists of vertex information and texture information, and the output of s is the image shown on the display.
1. VBO/VAO (vertex buffer object / vertex array object):
The VBO/VAO holds the vertex information the CPU supplies to the GPU, including vertex positions, vertex colors (the colors of the vertices only, unrelated to texture colors), texture coordinates (used for texture mapping), and other vertex attributes.
2. VertexShader (vertex shader):
The vertex shader is a program that processes the vertex information provided by the VBO/VAO. It is executed once for each vertex the VBO/VAO provides. Uniforms (a type of variable) are the same for every vertex, while attributes differ from vertex to vertex. Each execution of the vertex shader outputs varyings and a gl_Position.
3. PrimitiveAssembly (primitive assembly):
The stage after the vertex shader is primitive assembly. A primitive is a geometric object such as a triangle, a line, or a point sprite. In this stage, the vertices output by the vertex shader are assembled into primitives.
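Primitive assembly in the GL_TRIANGLES style — every three consecutive vertices form one triangle — can be sketched as follows (illustrative only, not code from the patent):

```python
def assemble_triangles(vertices):
    # Group the vertex stream into triangles; an incomplete trailing
    # group (fewer than three vertices) is dropped, as OpenGL does.
    return [tuple(vertices[i:i + 3]) for i in range(0, len(vertices) - 2, 3)]
```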
4. Rasterization:
Rasterization is the process of converting primitives into a set of two-dimensional fragments, which are then processed by the fragment shader (they form the input of the fragment shader). These two-dimensional fragments represent the pixels that can be drawn on the screen. The mechanism that generates each fragment's values from the vertex-shader outputs assigned to the vertices of each primitive is called interpolation (Interpolation).
5. FragmentShader (fragment shader):
The fragment shader implements a general programmable operation on fragments (pixels). It is executed once for each fragment output by the rasterization stage, and generates one or more (multi-texture) color values as output.
6. Per-Fragment Operations (per-fragment operations):
(1) PixelOwnershipTest (pixel ownership test):
This test determines whether the pixel at position (x, y) in the framebuffer is owned by the current context. For example, if a window displaying a framebuffer is covered by another window, the window system can determine that the occluded pixels do not belong to this OpenGL context, and those pixels are therefore not displayed.
(2) ScissorTest (scissor test):
If the fragment lies outside the scissor region, it is discarded.
(3) StencilTest and DepthTest (stencil and depth tests):
The depth test is comparatively easy to understand: if the depth returned by the fragment shader is greater than the depth already stored in the depth buffer (i.e., the fragment is farther away), the fragment is discarded.
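Under the common GL_LESS convention, a fragment passes the depth test only when it is nearer (smaller depth) than the value already in the depth buffer; a minimal sketch:

```python
def depth_test_passes(fragment_depth, buffer_depth):
    # GL_LESS: keep the fragment only if it is closer to the viewer.
    return fragment_depth < buffer_depth
```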
(4) Blending (blending):
The newly generated fragment color value is combined with the color value stored in the framebuffer to produce a new RGBA value.
(5) Dithering (dithering):
Finally, the generated fragments are placed into a framebuffer (the front buffer, the back buffer, or an FBO). If the target is not an FBO, the fragments in the screen drawing buffer generate the pixels on the screen.
The present embodiment implements the focusing animation proposed by this patent at the application layer and introduces GPU processing, which can realize smoother and more diversified focusing animations, relieves the workload of the HAL layer, speeds up the response of the underlying camera driver, and improves the user experience.
Embodiment five
On the basis of embodiment four, in the present embodiment the processor P110 is further used to execute the mobile terminal focusing program, to realize the following steps:
S106, the mobile terminal calculates the focusing duration according to the focal length, and plays an animation of corresponding duration according to the focusing duration.
The photographing program of the Android mobile terminal calculates how long the current focusing operation will take according to the current focal length and the processing capability of the OpenGL drawing thread. The calculation depends on the GPU processing capability of the current mobile terminal: the stronger the processing capability, the shorter the time needed to complete focusing; the weaker the processing capability, the longer the time. Likewise, the longer the focal length, the longer focusing takes; the shorter the focal length, the shorter it takes.
The photographing program calculates the focusing duration of the current shot according to the focal length and the GPU processing capability (for example, with a focal length of 70 mm and a 4-core 1.0 GHz GPU, focusing takes 2 seconds to complete). The photographing program then plays an animation of the corresponding duration in the shooting preview picture. Here, the animation decomposes the picture changes of the zooming during focusing into many instantaneous frames and plays them continuously as a series of pictures, so that the eye perceives a continuously changing picture. For example, when a distant picture is zoomed in, playing the animation realizes the effect of the distant picture gradually enlarging.
S107, the focusing duration is passed to the application-layer program, and the application-layer program calculates the zoom factor of the focusing animation according to the focusing duration.
The photographing program determines the duration of the animation to be played according to the focusing duration, and at the same time calculates the scaling ratio of the animation according to the focusing duration. For example, if the animation scales the picture, the scaling ratio needed for the animation is calculated from the focusing duration: the longer the focusing duration, the larger the scaling of the animation.
The user can select, in the photographing program, the animation to be played during shooting; the photographing program provides a variety of animations for the user to choose from. After the user selects an animation, the selection is saved in the photographing program, and during shooting the animation selected by the user is played.
In the present embodiment, playing an animation during focusing improves the user experience and enhances the shooting effect.
Embodiment six
On the basis of embodiment four, in the present embodiment the processor P110 is further used to execute the mobile terminal focusing program, to realize the following steps:
S108, the focusing completion event is monitored; after focusing is completed, information is sent to prompt the user that focusing has been completed.
The photographing program starts listening for the focusing completion event. When OpenGL finishes focusing, it sends a message notifying the photographing program that focusing is complete. After receiving the focusing completion message sent by OpenGL, the photographing program stops playing the animation, and then prompts the user that focusing is complete and that shooting can proceed, for example by playing a sound (such as two short beeps) or showing a green indicator light on the screen.
In the present embodiment, by listening for the focusing completion event, the user is prompted that focusing is complete, which makes shooting convenient and improves the user's shooting experience.
It should be noted that, herein, the terms "comprising", "including", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments, which are only illustrative rather than restrictive. Under the enlightenment of the present invention, those of ordinary skill in the art can make many other forms without departing from the concept of the invention and the scope of protection of the claims, and these all fall within the protection of the present invention.

Claims (10)

  1. A method of mobile terminal focusing, characterized in that the method of mobile terminal focusing comprises:
    after it is detected that a camera application program has been opened, obtaining a shooting preview picture;
    monitoring a focusing trigger event, and obtaining a focal position and a focal length;
    transferring the focal position and the focal length to an application layer;
    receiving vertex coordinates calculated by the application layer according to the focal position and the focal length;
    drawing a focusing animation in the preview picture on the screen according to the vertex coordinates.
  2. The method of mobile terminal focusing according to claim 1, characterized in that a clicking operation of the user on the preview picture is received, triggering a focusing event;
    after the mobile terminal shakes or the shooting environment changes, causing the sharpness, brightness, or gyroscope readings to change, a focusing event is triggered.
  3. The method of mobile terminal focusing according to claim 1, characterized in that the mobile terminal calculates a focusing duration according to the focal length, and a focusing animation of corresponding duration is played according to the focusing duration.
  4. The method of mobile terminal focusing according to claim 3, characterized in that the focusing duration is passed to the application layer, and the application layer calculates a zoom factor of the focusing animation according to the focusing duration.
  5. The method of mobile terminal focusing according to claim 3, characterized in that the focusing animation can be selected by the user, and according to the user's selection of the focusing animation, the selected focusing animation is played during focusing.
  6. The method of mobile terminal focusing according to claim 1, characterized in that the method further comprises monitoring a focusing completion event;
    after focusing is completed, sending information to prompt the user that focusing has been completed.
  7. The method of mobile terminal focusing according to claim 1, characterized in that the application layer is an OpenGL drawing thread, and the OpenGL drawing thread runs on the graphics processor.
  8. The method of mobile terminal focusing according to claim 7, characterized in that the OpenGL drawing thread runs on the Android platform and realizes the shooting focus function by calling the API provided by Android.
  9. A device for realizing mobile terminal focusing, characterized in that the device for mobile terminal focusing comprises a shooting unit, a display unit, a user input unit, a processor, a graphics processor, a memory, and a communication bus;
    the shooting unit is used to obtain the image data of static pictures or videos;
    the display unit is used to display information input by the user or information provided to the user;
    the user input unit is used to receive input numeric or character information and to generate key signal inputs related to the user settings and function control of the mobile terminal;
    the graphics processor is used to process the image data of static pictures or videos obtained by the shooting unit;
    the communication bus is used to implement the connection and communication between the processor and the memory;
    the memory is used to store the data of custom applications;
    the processor is used to execute the mobile terminal focusing program stored in the memory, so as to implement the method of mobile terminal focusing according to any one of claims 1 to 8.
  10. A computer readable storage medium, characterized in that the computer readable storage medium stores one or more programs, and the one or more programs can be executed by one or more processors to implement the method of mobile terminal focusing according to any one of claims 1 to 8.
CN201810100689.3A 2018-01-31 2018-01-31 Mobile terminal focusing method and device and computer readable storage medium Active CN108270971B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810100689.3A CN108270971B (en) 2018-01-31 2018-01-31 Mobile terminal focusing method and device and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN108270971A true CN108270971A (en) 2018-07-10
CN108270971B CN108270971B (en) 2020-07-24


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111583365A (en) * 2020-04-24 2020-08-25 完美世界(北京)软件科技发展有限公司 Animation element display processing method and device, storage medium and terminal
CN111654637A (en) * 2020-07-14 2020-09-11 RealMe重庆移动通信有限公司 Focusing method, focusing device and terminal equipment
CN112333387A (en) * 2020-10-30 2021-02-05 维沃移动通信有限公司 Shooting method, shooting device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150326776A1 (en) * 2014-05-12 2015-11-12 Vivotek Inc. Dynamical focus adjustment system and related dynamical focus adjustment method
WO2017070884A1 (en) * 2015-10-29 2017-05-04 深圳市莫孚康技术有限公司 Image focusing system and method based on wireless distance measurement, and photographing system
CN106775902A (en) * 2017-01-25 2017-05-31 北京奇虎科技有限公司 A kind of method and apparatus of image procossing, mobile terminal
CN107329649A (en) * 2017-06-14 2017-11-07 努比亚技术有限公司 Cartoon display method, terminal and computer-readable recording medium


Also Published As

Publication number Publication date
CN108270971B (en) 2020-07-24

Similar Documents

Publication Publication Date Title
CN109729266A (en) Image capturing method, terminal, and computer-readable storage medium
CN107592466A (en) Photographing method and mobile terminal
CN108076292A (en) Image capturing method, mobile terminal, and storage medium
CN106937039A (en) Dual-camera-based imaging method, mobile terminal, and storage medium
CN107835367A (en) Image processing method and device, and mobile terminal
CN107707827A (en) High-dynamic-range image capturing method and mobile terminal
CN107133939A (en) Picture synthesis method, device, and computer-readable storage medium
CN108600647A (en) Shooting preview method, mobile terminal, and storage medium
CN107770454A (en) Image processing method, terminal, and computer-readable storage medium
CN108876878B (en) Head portrait generation method and device
CN108024065A (en) Terminal photographing method, terminal, and computer-readable storage medium
CN108989678A (en) Image processing method and mobile terminal
CN108037845A (en) Display control method, mobile terminal, and computer-readable storage medium
CN107786827A (en) Video capture method, video playback method, device, and mobile terminal
CN108419008A (en) Shooting method, terminal, and computer-readable storage medium
CN107959795A (en) Information collection method, device, and computer-readable storage medium
CN107948530A (en) Image processing method, terminal, and computer-readable storage medium
CN108682040A (en) Sketch image generation method, terminal, and computer-readable storage medium
CN107948498A (en) Method for eliminating camera moiré fringes, and mobile terminal
CN108055463A (en) Image processing method, terminal, and storage medium
CN107730433A (en) Shooting processing method, terminal, and computer-readable storage medium
CN107404618A (en) Shooting method and terminal
CN107239205A (en) Photographing method, mobile terminal, and storage medium
CN107689029A (en) Image processing method, mobile terminal, and computer-readable storage medium
CN107948516A (en) Image processing method and device, and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant