CN107315529A - Photographing method and mobile terminal - Google Patents

Photographing method and mobile terminal

Info

Publication number
CN107315529A
CN107315529A (application CN201710464380.8A)
Authority
CN
China
Prior art keywords
touch
target
special effect
preview image
touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710464380.8A
Other languages
Chinese (zh)
Other versions
CN107315529B (en)
Inventor
朱宗伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201710464380.8A
Publication of CN107315529A
Application granted
Publication of CN107315529B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention disclose a photographing method and a mobile terminal. The method includes: detecting a touch operation on an object to be processed in a current preview image; determining a target area of the preview image and a target special effect according to the touch operation, the target area containing the object to be processed; adding the target special effect to the target area of the preview image to obtain a target preview image; and, when a shooting instruction is received, generating a target photo according to the target preview image.

Description

Photographing method and mobile terminal
Technical field
The embodiments of the present invention relate to the field of communications, and in particular to a photographing method and a mobile terminal.
Background
At present, more and more users take photos with mobile terminals such as mobile phones and tablet computers.
In an actual shooting scene, if a person or object that the user does not want to capture appears in the background of the photo, the photo will be flawed and will not meet the user's shooting needs. For example, when the user is photographing a scenic spot, other passers-by may linger in the background or a vehicle may pass by, so that the resulting photo contains content unrelated to the scenic spot; this degrades the presentation of the photo, and the user may be dissatisfied with it.
Therefore, current photographing methods suffer from the problem that the captured photos do not meet the user's shooting needs.
Summary of the invention
The present invention provides a photographing method, a photographing apparatus and a mobile terminal, in order to solve the problem identified in the background that captured photos do not meet the user's shooting needs.
In a first aspect, a photographing method is provided, the method including:
detecting a touch operation on an object to be processed in a current preview image;
determining a target area of the preview image and a target special effect according to the touch operation, the target area containing the object to be processed;
adding the target special effect to the target area of the preview image to obtain a target preview image;
when a shooting instruction is received, generating a target photo according to the target preview image.
In a second aspect, a mobile terminal is provided, the mobile terminal including:
a touch operation detection unit, configured to detect a touch operation on an object to be processed in a current preview image;
a region and effect determination unit, configured to determine a target area of the preview image and a target special effect according to the touch operation, the target area containing the object to be processed;
a target preview image acquisition unit, configured to add the target special effect to the target area of the preview image to obtain a target preview image;
a target photo generation unit, configured to generate a target photo according to the target preview image when a shooting instruction is received.
Thus, according to the embodiments of the present invention, based on the user's touch operation on the preview image, a target area of the preview image containing the flawed content and a target special effect are determined, the target special effect is added to the target area to obtain a target preview image, and a photo is generated according to the target preview image. The flawed content in the resulting photo carries a target special effect that reduces its visibility, which improves the presentation of the photo and satisfies the user's shooting needs. Moreover, the user can obtain a better-looking photo with a simple touch operation, without retaking the photo or applying special-effect processing to a flawed photo after shooting, which saves the user's time and improves the user experience.
The above is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented according to the content of the specification, and in order that the above and other objects, features and advantages of the present invention may become more apparent, specific embodiments of the present invention are set forth below.
Brief description of the drawings
In order to describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the steps of a photographing method according to Embodiment 1 of the present invention;
Fig. 2 is a flowchart of the steps of a photographing method according to Embodiment 2 of the present invention;
Fig. 3A is a structural block diagram of a mobile terminal according to Embodiment 3 of the present invention;
Fig. 3B is a structural block diagram of another mobile terminal according to Embodiment 3 of the present invention;
Fig. 4 is a schematic diagram of enabling blur special-effect processing on a preview image according to the present invention;
Fig. 5 is a schematic diagram of a blur effect level setting process according to the present invention;
Fig. 6 is a schematic diagram of a blur processing flow according to the present invention;
Fig. 7 is a structural block diagram of a mobile terminal according to another embodiment of the present invention;
Fig. 8 is a structural block diagram of a mobile terminal according to yet another embodiment of the present invention.
Detailed description
Exemplary embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be implemented in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and so that its scope can be fully conveyed to those skilled in the art.
Embodiment 1
Fig. 1 is a flowchart of the steps of a photographing method according to Embodiment 1 of the present invention. The method is applied to a mobile terminal and includes the following steps.
Step 101: detect a touch operation on an object to be processed in the current preview image.
The embodiments of the present invention can be applied to mobile terminals such as mobile phones and tablet computers. When the user takes photos with a camera application on a mobile terminal, the camera application can provide a shooting preview interface on the display screen of the mobile terminal. The preview image of the current scene to be captured is displayed in the shooting preview interface, and the user can determine from the preview image whether the current scene meets his or her shooting needs. The user can then submit a shooting instruction to capture the scene in the current preview image as a photo.
The display screen of the mobile terminal may include a touch sensing module or a pressure-touch sensing module, which is used to sense the user's touch-press operations, touch-and-hold operations, touch-tap operations and the like.
In a specific implementation of the embodiments of the present application, when taking a photo, the user may perform a touch operation on the object to be processed in the preview image displayed on the screen. The touch operation may be a touch-press operation, a touch-and-hold operation, a touch-tap operation, or a similar touch operation.
The object to be processed may be an object that affects the presentation of the photo and needs special-effect processing, for example a passer-by, a vehicle, or a relatively shabby building.
Step 102: determine a target area of the preview image and a target special effect according to the touch operation; the target area contains the object to be processed.
In a specific implementation, the target area to which a special effect needs to be added, and the special effect to be added, can be determined in the preview image according to the user's touch operation.
There are several concrete ways to determine the target area and the target special effect from the touch operation.
In one implementation, the target area and the target special effect are determined from a single touch operation. First, the touch region of the touch operation and the touch intensity of the operation over that region are read. Then, the target area and the target special effect are determined from the touch region and the touch intensity respectively.
A touch-press operation, a touch-and-hold operation or a touch-tap operation each has a touch region. For example, when a region of the display screen is press-touched, that region produces a press-touch response to the pressing force, and the region producing the response can be used as the touch region. As another example, when a touch-and-hold operation is performed on a region of the screen and the dwell time exceeds a preset duration, the region produces a touch-and-hold response, and that region can be used as the touch region. As yet another example, when a touch-tap operation is performed on a region of the screen, the region produces a tap response, and that region can be used as the touch region.
The touch operation may be a touch-press operation, a touch-and-hold operation or a touch-tap operation, and the touch intensity may be the pressing force of the touch-press operation, the dwell time of the touch-and-hold operation, or the number of taps of the touch-tap operation.
Such operations also carry a touch intensity. For a touch-press operation, the magnitude of the pressing force can be used as the touch intensity; for a touch-and-hold operation, the length of the dwell time can be used as the touch intensity; for a touch-tap operation, the number of taps can be used as the touch intensity.
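The three intensity readings described above can be modeled roughly as in the following Kotlin sketch; the class and function names are illustrative assumptions, not part of the patent.

```kotlin
// Illustrative sketch only: models the three touch-intensity sources the text
// describes (press pressure, dwell time, tap count). All names are hypothetical.
sealed class TouchOperation {
    /** Pressure-sensitive press; pressure assumed normalized, e.g. in newtons or 0..1. */
    data class Press(val pressure: Float) : TouchOperation()
    /** Touch-and-hold; dwell time in milliseconds. */
    data class Dwell(val durationMs: Long) : TouchOperation()
    /** Repeated taps on the same spot. */
    data class Tap(val tapCount: Int) : TouchOperation()
}

/** Reduce any of the three operation types to a single intensity value. */
fun touchIntensity(op: TouchOperation): Float = when (op) {
    is TouchOperation.Press -> op.pressure            // harder press = higher intensity
    is TouchOperation.Dwell -> op.durationMs / 1000f  // longer hold = higher intensity
    is TouchOperation.Tap -> op.tapCount.toFloat()    // more taps = higher intensity
}
```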
After the touch region and the touch intensity are read, the target area can be determined from the touch region, and the target special effect can be determined from the touch intensity.
There are several concrete ways to determine the target area from the touch region. For example, for an object to be processed that is static, the touch region containing the object can be used directly as the target area. For an object to be processed that is in motion, the region where it is located may change, so the current touch region may differ from the region the object actually occupies; in that case, the object can first be recognized in the current touch region and then tracked, and the region actually occupied by the object after it moves is used as the target area, so that the target area contains the moved object.
There are also several concrete ways to determine the target special effect from the touch intensity. For example, an effect level can be preset for each different touch intensity; a target effect level is determined for the current touch intensity, and the effect of that level for a preset effect type, or for an effect type selected by the user, is used as the target special effect. As another example, when the touch intensity changes continuously within a preset time, different effect levels can be determined for the different intensities, yielding different effects. The target effect level can then be determined from the most recent intensity reading, thereby determining the current target special effect; alternatively, the continuously changing intensities are recorded and displayed to the user in a list together with their corresponding effect levels and effects, the user selects one of the intensities according to the list, and the effect corresponding to the selected intensity is used as the current target special effect.
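A minimal sketch of the preset intensity-to-level mapping and of recording a continuously changing intensity for later selection; the thresholds and the number of levels are assumptions for illustration only.

```kotlin
// Sketch of the preset "intensity -> effect level" mapping the text describes.
// Threshold values and the four-level scale are invented for illustration.
fun effectLevelFor(intensity: Float): Int = when {
    intensity < 0.5f -> 1   // lightest touch: weakest effect
    intensity < 1.5f -> 2
    intensity < 3.0f -> 3
    else -> 4               // strongest effect
}

// When the intensity changes continuously, keep a history so the user can pick
// one of the recorded readings from a list instead of the latest value.
class IntensityHistory {
    private val readings = mutableListOf<Float>()
    fun record(intensity: Float) { readings += intensity }
    fun asChoices(): List<Pair<Float, Int>> =
        readings.map { it to effectLevelFor(it) }  // (intensity, effect level) pairs to display
}
```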
In another implementation, the target area and the target special effect are determined from at least two touch operations. First, with a first touch operation such as a touch-press, touch-and-hold or touch-tap operation, the user indicates a moving-object identification region on the preview image, and the target area is determined from this moving-object identification region. Then, a preset touch-intensity reading area is provided on the preview image; the user performs a second touch operation such as a touch-press, touch-and-hold or touch-tap operation in this reading area, and the target special effect is determined from the touch intensity of the second touch operation in the touch-intensity reading area.
As the above examples show, the target area and the target special effect can be determined from the touch operation in several concrete ways; those skilled in the art may adopt different implementations, and the embodiments of the present invention are not limited in this respect.
Step 103: add the target special effect to the target area of the preview image to obtain a target preview image.
In a specific implementation, after the target area of the preview image and the target special effect are determined, the target special effect can be added to the target area of the current preview image.
In practice, target special effects can be generated with different effect types; the effect types may include blur, blurring, fading, grayscale and other effects that reduce the visibility of the flawed content.
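As one possible realization of a visibility-reducing effect, the following sketch applies a simple box blur to a single rectangular target region of an ARGB pixel buffer. Scaling the kernel radius with the effect level is an assumption for illustration, not something the patent specifies.

```kotlin
// Minimal box blur over one rectangular target region of an ARGB pixel buffer.
// The radius-per-level scaling is an assumption; alpha is forced to opaque.
fun blurRegion(
    pixels: IntArray, width: Int, height: Int,
    left: Int, top: Int, right: Int, bottom: Int,
    level: Int
) {
    val radius = level * 2                       // higher level -> larger blur kernel
    val src = pixels.copyOf()                    // read from a snapshot, write in place
    for (y in top until bottom) {
        for (x in left until right) {
            var r = 0; var g = 0; var b = 0; var n = 0
            for (dy in -radius..radius) {
                for (dx in -radius..radius) {
                    val yy = y + dy; val xx = x + dx
                    if (yy in 0 until height && xx in 0 until width) {
                        val p = src[yy * width + xx]
                        r += (p shr 16) and 0xFF
                        g += (p shr 8) and 0xFF
                        b += p and 0xFF
                        n++
                    }
                }
            }
            pixels[y * width + x] =
                (0xFF shl 24) or ((r / n) shl 16) or ((g / n) shl 8) or (b / n)
        }
    }
}
```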
An effect type can be preset before shooting, or multiple effect types can be offered for the user to choose from after the touch operation is detected; after the user has chosen an effect type, the target special effect can be determined according to the target effect level of that effect type.
After the target special effect is added to the target area of the preview image, the target preview image is obtained. The target preview image can be displayed to the user in the shooting preview interface, so that the user can preview the photo with the effect applied in real time; when the user is satisfied with the target preview image, the photo can be taken for that target preview image.
Step 104: when a shooting instruction is received, generate a target photo according to the target preview image.
In a specific implementation, when the user is satisfied with the target preview image of the current scene, he or she can submit a shooting instruction to take the photo. When the shooting instruction is received, the photo can be generated according to the target preview image. For example, the target preview image is used directly as the photo; alternatively, the current scene corresponding to the target preview image is captured, and the original photo obtained from that capture is subjected to special-effect processing according to the target area and the target special effect of the target preview image, adding the target special effect to the target area of the original photo, thereby obtaining the target photo.
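A sketch of the two capture strategies just described: either keep the previewed frame as the photo, or re-apply the stored region and effect to the freshly captured full-resolution frame (reusing the blurRegion sketch above). The type names and the coordinate-scaling step are illustrative assumptions.

```kotlin
// Hypothetical holder for the stored target area (in preview coordinates) and effect level.
data class RegionEffect(val left: Int, val top: Int, val right: Int, val bottom: Int, val level: Int)

fun onShutter(
    previewFrame: IntArray, previewW: Int, previewH: Int,
    fullFrame: IntArray, fullW: Int, fullH: Int,
    effect: RegionEffect, usePreviewDirectly: Boolean
): IntArray {
    return if (usePreviewDirectly) {
        previewFrame                               // the preview already carries the effect
    } else {
        // Scale the preview-space region up to full-resolution coordinates,
        // then re-apply the effect to the freshly captured frame.
        val sx = fullW.toFloat() / previewW
        val sy = fullH.toFloat() / previewH
        blurRegion(
            fullFrame, fullW, fullH,
            (effect.left * sx).toInt(), (effect.top * sy).toInt(),
            (effect.right * sx).toInt(), (effect.bottom * sy).toInt(),
            effect.level
        )
        fullFrame
    }
}
```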
For example, the user needs to photograph a building, and passing pedestrians and vehicles appear in the shooting background of the preview image. The user wishes to apply special-effect processing to those pedestrians and vehicles. According to the user's touch-press operation on the pedestrians and vehicles in the preview image, the target area of the preview image containing them is determined, and a blur effect is added to that target area to blur the pedestrians and vehicles, thereby reducing their influence on the presentation of the photo.
According to the embodiments of the present invention, based on the user's touch operation on the preview image, the target area of the preview image containing the flawed content and the target special effect are determined, the target special effect is added to the target area to obtain a target preview image, and a photo is generated according to the target preview image. The flawed content in the resulting photo carries a target special effect that reduces its visibility, which improves the presentation of the photo and satisfies the user's shooting needs.
Moreover, the user can obtain a better-looking photo with a simple touch operation, without retaking the photo or applying special-effect processing to a flawed photo after shooting, which saves the user's time and improves the user experience.
Embodiment 2
Fig. 2 is a flowchart of the steps of a photographing method according to Embodiment 2 of the present invention. The method is applied to a mobile terminal and includes the following steps.
Step 201: receive a special-effect processing request submitted by the user for a target camera application, or determine that the user has started the target camera application.
Step 202: trigger detection of touch operations for the camera application.
In a specific implementation, the user can submit a special-effect processing request while using a camera application, for example by pressing the preview image in the shooting preview interface of the camera application, or through a submission entry for special-effect processing requests provided at some position in the camera application, which the user taps to submit the request. When the user's special-effect processing request is received, detection of the user's touch operations on the preview image of the camera application can be triggered, and the corresponding special-effect processing can be performed.
It is also possible to monitor whether the user starts a camera application; after the user starts the camera application, detection of the user's touch operations on the preview image of the camera application is triggered and the corresponding special-effect processing is performed.
In practice, those skilled in the art can trigger the detection of touch operations in various ways. For example, a special-effect processing plug-in can be embedded in the camera application; when the camera application starts, the plug-in is called to detect touch operations and perform the corresponding special-effect processing.
Step 203: detect a touch operation on an object to be processed in the current preview image.
Step 204: determine a target area of the preview image and a target special effect according to the touch operation; the target area contains the object to be processed.
Optionally, step 204 may include the following sub-steps:
Sub-step S11: obtain the touch region of the touch operation and determine the target area according to the touch region;
Sub-step S12: obtain the touch intensity of the touch operation over the touch region and determine the target special effect according to the touch intensity that is read.
In practice, determining the target area according to the touch region may include:
Sub-step S11-1: recognize the object to be processed in the touch region;
Sub-step S11-2: determine whether the object to be processed is in motion; if not, perform sub-step S11-3; if so, perform sub-step S11-4;
Sub-step S11-3: use the touch region as the target area;
Sub-step S11-4: track the object to be processed and use the region occupied by the object after it moves as the target area.
In a specific implementation, for the detected touch operation, the touch region of the operation and the touch intensity over that region can be read, and the target area and the target special effect are determined from the touch region and the touch intensity respectively.
In practice, the touch operation may be a touch-press operation, a touch-and-hold operation or a touch-tap operation, and the touch intensity may be the pressing force of the touch-press operation, the dwell time of the touch-and-hold operation, or the number of taps of the touch-tap operation.
Generally, the user performs a touch-press, touch-and-hold or touch-tap operation on the object to be processed in the current preview image, so that the touch region is formed at the region where the object is currently located. Therefore, after the touch region is read, the object to be processed in the touch region can be recognized, and it can be determined whether the object is in motion.
If the object to be processed is currently static, the current touch region can be used as the target area, and subsequent special-effect processing is performed on that target area.
If the object to be processed is currently in motion, it may appear in another region of the preview image at the next moment; that is, it may no longer be within the touch region of the user's touch operation, and if the special-effect processing were performed on the current touch region it might not cover the object at all. Therefore, the object tracking function on the mobile terminal can be called to track the movement of the object; the region where the object is located after it moves is identified and used as the target area, so that the special-effect processing is performed on the region that actually contains the object to be processed.
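The static/moving decision described above might look like the following sketch; the tracker callback stands in for whatever object-tracking facility the terminal provides and is not an actual platform API.

```kotlin
// Hypothetical region type in preview coordinates.
data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int)

/**
 * Resolve the target area: a static object keeps the touch region; a moving object
 * is followed by the (assumed) tracker and its current region becomes the target area.
 */
class TargetAreaResolver(private val trackCurrentRegion: (objectId: Int) -> Region?) {
    fun resolve(touchRegion: Region, objectId: Int, isMoving: Boolean): Region =
        if (!isMoving) touchRegion
        else trackCurrentRegion(objectId) ?: touchRegion  // fall back if the tracker loses the object
}
```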
In practice, before step 203, the method may further include:
when a moving-object tracking request is received, determining that the current preview image contains an object to be processed that is in motion.
In a specific implementation, the user can submit a moving-object tracking request before the touch operation is detected. For example, when the user's special-effect processing request is received or the user is detected to have started the camera application, a selection menu can be provided for the user to confirm whether special-effect processing currently needs to be performed on an object that is in motion; once the user confirms, it is determined that the current preview image contains a moving object to be processed.
Optionally, when the preview image contains an object to be processed that is in motion, the touch operation includes a first touch operation and a second touch operation, and step 204 may include the following sub-steps:
Sub-step S21: obtain the moving-object identification region indicated by the first touch operation and determine the target area according to the moving-object identification region; the moving-object identification region includes a region determined from the touch region of the first touch operation;
Sub-step S22: obtain the touch intensity of the second touch operation in a preset touch-intensity reading area and determine the target special effect according to the touch intensity that is read.
For the case where it has been determined that the current preview image contains an object to be processed that is in motion, the target area and the target special effect can be determined separately from two different user operations, the first touch operation and the second touch operation.
With a first touch operation such as a touch-press, touch-and-hold or touch-tap operation, the user can indicate a moving-object identification region on the preview image. For example, the camera application can provide several tools with which the user frames an object or a region, such as mark windows with a certain marking range; the user selects a mark window of a certain shape and size and drags it to the current position of the object to be processed. After the user stops dragging, the region inside the mark window can be used as the moving-object identification region, and the moving object to be processed is recognized within that region. Alternatively, after selecting a mark window of some shape and size, the user can simply tap on the preview image; the selected mark window is generated at the tapped position, and the region inside it is used as the moving-object identification region.
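A sketch of deriving the moving-object identification region from the first touch operation with a mark window; the preset window sizes and function names are invented for illustration.

```kotlin
// Hypothetical mark-window tool: the user picks a size, then taps to place it.
data class MarkWindow(val width: Int, val height: Int)

// A few assumed preset window sizes the user could pick from.
val presetWindows = listOf(MarkWindow(120, 120), MarkWindow(200, 200), MarkWindow(320, 240))

/** Center the chosen mark window on the tapped point, clamped to the preview bounds. */
fun identificationRegion(tapX: Int, tapY: Int, window: MarkWindow,
                         previewW: Int, previewH: Int): IntArray {
    val left = (tapX - window.width / 2).coerceIn(0, maxOf(0, previewW - window.width))
    val top = (tapY - window.height / 2).coerceIn(0, maxOf(0, previewH - window.height))
    return intArrayOf(left, top, left + window.width, top + window.height)  // left, top, right, bottom
}
```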
Furthermore, a touch-intensity reading area can be set at a preset position of the preview image, in which the user performs a second touch operation such as a touch-press, touch-and-hold or touch-tap operation. When the user's second touch operation is detected, its touch intensity can be read, and the target special effect is determined according to the touch intensity that is read.
Optionally, determining the target area according to the moving-object identification region may include:
Sub-step S21-1: recognize the object to be processed in the moving-object identification region;
Sub-step S21-2: track the object to be processed and use the region occupied by the object after it moves as the target area.
In a specific implementation, the object to be processed in the moving-object identification region can be recognized, and the object tracking function on the mobile terminal can be called to track the movement of the object; the region where the object is located after it moves is identified and used as the target area, so that the special-effect processing is performed on the region that actually contains the object to be processed.
Optionally, determining the target special effect according to the touch intensity that is read may include:
using the touch intensity selected by the user in a preset effect selection interface as the target touch intensity, the effect selection interface including a plurality of touch intensities read in the past; or using the touch intensity currently read as the target touch intensity;
looking up the target effect level corresponding to the target touch intensity;
determining the target special effect according to the target effect level of the set effect type.
It should be noted that the touch intensity may be the touch intensity read over the touch region, or the touch intensity read in the preset touch-intensity reading area.
In a specific implementation, the touch intensity of the user's touch operation may stay constant or may change continuously. For example, when the magnitude of the pressing force of a touch-press operation is used as the touch intensity, the force of the user's press may keep increasing, so the touch intensity increases accordingly. As another example, when the dwell time of a touch-and-hold operation is used as the touch intensity, the dwell time grows as time passes, so the touch intensity also increases. When the number of taps of a touch-tap operation is used as the touch intensity, the intensity increases as the user taps repeatedly. Therefore, multiple different touch intensities can be read one after another during a single touch operation, and these readings are recorded as touch intensities read in the past. For these past readings, the camera application can provide a preset effect selection interface that displays the recorded intensities, from which the user selects one; the selected intensity is used as the target touch intensity. For example, several pressing forces of 1 N, 5 N and 10 N produced by the user's earlier touch-press operation are displayed visually in the effect selection interface for the user to choose from.
Alternatively, the touch intensity currently read can be used directly as the target touch intensity.
After the target touch intensity is determined, the effect level corresponding to it can be looked up and used as the target effect level. For example, different effect levels such as 1, 2, 3 and so on can be set for different pressing forces. Generally, the higher the touch intensity, the higher the corresponding effect level, so that the degree of blur, blurring or another effect is higher.
After the target effect level is determined, the target special effect can be determined for the preset effect type. It should be noted that the effect type may be preset to one of blur, blurring, fading and the like, or may be selected by the user after the touch operation to determine the effect type currently used for special-effect processing, or may even be selected by the user after the shooting instruction is submitted.
For a given effect type, the target special effect is the effect of that type at the target effect level. For example, for the blur effect type, when the target effect level is determined to be 3, the target special effect is level-3 blur.
In practice, the effect selection interface includes the effect levels and/or effects respectively corresponding to the plurality of touch intensities.
In a specific implementation, besides the plurality of touch intensities read in the past, the effect selection interface may also include the effect levels and/or effects corresponding to those intensities. In practice, the touch intensities, the effect levels and the corresponding effects can be displayed to the user in groups. The display of an effect may include a thumbnail preview of the effect applied to the target area of the current preview image; the user can then select one of the touch intensities as the target touch intensity according to the displayed thumbnails. In this way, the user does not need to repeat the touch operation to obtain a target special effect that meets his or her needs.
In practice, before the step of looking up the target effect level corresponding to the target touch intensity, the method may further include:
setting corresponding effect levels for different touch intensities.
In a specific implementation, the effect levels corresponding to different touch intensities can be preset; that is, a certain touch intensity is bound to a certain effect level. For example, the user can perform a touch-press operation for a certain effect level, and the pressing force of that operation is bound to that effect level.
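A sketch of the binding step described above, where a measured pressure is tied to a user-chosen effect level in a settings screen; the nearest-match lookup is one possible policy, not one the patent prescribes.

```kotlin
// Hypothetical settings-side bindings between measured pressures and effect levels.
class EffectLevelBindings {
    private val bindings = mutableMapOf<Float, Int>()   // measured pressure -> bound effect level

    /** Settings step: the pressure read from a "press now" gesture is bound to the chosen level. */
    fun bind(measuredPressure: Float, level: Int) { bindings[measuredPressure] = level }

    /** Later lookups return the level bound to the closest recorded pressure, if any. */
    fun lookup(pressure: Float): Int? =
        bindings.entries.minByOrNull { kotlin.math.abs(it.key - pressure) }?.value
}
```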
By setting the effect levels corresponding to different touch intensities, the user can customize the touch intensity of each effect level, making it easy to obtain, with a familiar operation, the target special effect that meets his or her processing needs.
The touch intensities and their corresponding effect levels can be preset, can be set when the user starts the camera application, or can be set when the user starts the camera application and confirms that special-effect processing is currently needed. Those skilled in the art can choose the timing of this setting according to actual needs.
Step 205: add the target special effect to the target area of the preview image to obtain a target preview image.
Step 206: when a shooting instruction is received, generate a target photo according to the target preview image.
It should be noted that the present invention can also be applied to scenarios where special-effect processing is performed on a target subject. For example, the user can perform a touch operation on a target subject in the preview image of the current scene, in order to apply special-effect processing that enhances the presentation of that subject, such as sharpening or a halo effect.
According to the embodiments of the present invention, by recognizing the moving object to be processed in the touch region and using the region occupied by the object after it moves as the target area, the target special effect can be added to the region that actually contains the object to be processed. This solves the problem that, after the object moves, adding the target special effect to the original touch region would fail to apply the special-effect processing to the object.
In order that those skilled in the art may understand the present invention more thoroughly, specific examples are described below with reference to Fig. 4 to Fig. 6.
Fig. 4 is a schematic diagram of enabling blur special-effect processing on a preview image according to the present invention. As can be seen, the user can perform a touch-press operation at a press point of the current preview image to enable the blur special-effect processing.
Fig. 5 is a schematic diagram of a blur effect level setting process according to the present invention. As can be seen, after the camera application starts, the user can select a mode in which the pressing force of a touch-press is bound to a blur effect level. In the setting mode, the user can bind a certain blur level to a certain pressing-force level, so that in subsequent special-effect processing, once the pressing force of a touch-press operation is read, the bound blur level can be found.
Fig. 6 is a schematic diagram of a blur processing flow according to the present invention. As can be seen, after the camera application starts, the user can bind pressing forces to blur effect levels and can select whether the object currently to be blurred is moving.
If the user chooses to blur a non-moving object, the user can perform a touch-press operation directly on the object to be processed on the preview image; the region of the touch-press operation is used as the target area, the blur effect level associated with the pressing force is looked up, the target blur effect is determined according to the associated blur level, and the target blur effect is added to the target area of the preview image.
If the user chooses to blur a moving object, the user can first select a region containing the moving object on the preview image with a touch-tap operation; the camera application calls the object tracking function to track the object dynamically, and the region occupied by the object after it moves is used as the target area. Meanwhile, the user can perform a touch-and-hold operation in the touch-intensity reading area of the preview image; the blur effect level associated with the dwell time of that operation is looked up, the target blur effect is determined according to the associated blur level, and the target blur effect is added to the target area of the preview image.
The target preview image processed with the blur effect can be displayed to the user; the user can tap the shutter button according to the displayed target preview image and obtain a photo in which the object to be processed has been blurred.
Embodiment 3
Fig. 3A is a structural block diagram of a mobile terminal according to Embodiment 3 of the present invention. The mobile terminal 300 shown in Fig. 3A may specifically include a touch operation detection unit 301, a region and effect determination unit 302, a target preview image acquisition unit 303 and a target photo generation unit 304.
The touch operation detection unit 301 is configured to detect a touch operation on an object to be processed in a current preview image;
the region and effect determination unit 302 is configured to determine a target area of the preview image and a target special effect according to the touch operation, the target area containing the object to be processed;
the target preview image acquisition unit 303 is configured to add the target special effect to the target area of the preview image to obtain a target preview image;
the target photo generation unit 304 is configured to generate a target photo according to the target preview image when a shooting instruction is received.
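For orientation, the four units of Fig. 3A can be pictured as interfaces like the following sketch; the signatures and placeholder types are purely illustrative and do not come from the patent.

```kotlin
// Placeholder types so the sketch is self-contained; they carry no real data.
class TouchEvent
class TargetRegion
class TargetEffect
class PreviewImage
class Photo

interface TouchOperationDetectionUnit {          // unit 301
    fun detectTouchOnPendingObject(): TouchEvent?
}
interface RegionAndEffectDeterminationUnit {     // unit 302
    fun determine(touch: TouchEvent): Pair<TargetRegion, TargetEffect>
}
interface TargetPreviewImageAcquisitionUnit {    // unit 303
    fun applyEffect(region: TargetRegion, effect: TargetEffect): PreviewImage
}
interface TargetPhotoGenerationUnit {            // unit 304
    fun generatePhoto(targetPreview: PreviewImage): Photo
}
```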
Optionally, on the basis of Fig. 3A, the region and effect determination unit 302 may include a first target area determination sub-unit 3021 and a first target effect determination sub-unit 3022, as shown in Fig. 3B.
The first target area determination sub-unit 3021 is configured to obtain the touch region of the touch operation and determine the target area according to the touch region;
the first target effect determination sub-unit 3022 is configured to obtain the touch intensity of the touch operation over the touch region and determine the target special effect according to the touch intensity that is read.
Optionally, when the preview image contains an object to be processed that is in motion, the touch operation includes a first touch operation and a second touch operation; on the basis of Fig. 3A, the region and effect determination unit 302 may include a second target area determination sub-unit 3023 and a second target effect determination sub-unit 3024, as shown in Fig. 3B.
The second target area determination sub-unit 3023 is configured to obtain the moving-object identification region indicated by the first touch operation and determine the target area according to the moving-object identification region, the moving-object identification region including a region determined from the touch region of the first touch operation;
the second target effect determination sub-unit 3024 is configured to obtain the touch intensity of the second touch operation in a preset touch-intensity reading area and determine the target special effect according to the touch intensity that is read.
Optionally, the second target area determination sub-unit 3023 may include an object recognition module and an object tracking module.
The object recognition module is configured to recognize the object to be processed in the moving-object identification region;
the object tracking module is configured to track the object to be processed and use the region occupied by the object after it moves as the target area.
Optionally, the first target effect determination sub-unit 3022 or the second target effect determination sub-unit 3024 may include a target touch intensity determination module, a target effect level determination module and a target effect determination module.
The target touch intensity determination module is configured to use the touch intensity selected by the user in a preset effect selection interface as the target touch intensity, the effect selection interface including a plurality of touch intensities read in the past, or to use the touch intensity currently read as the target touch intensity;
the target effect level determination module is configured to look up the target effect level corresponding to the target touch intensity;
the target effect determination module is configured to determine the target special effect according to the target effect level of the set effect type.
The mobile terminal 300 can implement each process implemented by the mobile terminal in the method embodiments of Fig. 1 to Fig. 2, which is not repeated here to avoid repetition. With the mobile terminal 300 according to the embodiments of the present invention, based on the user's touch operation on the preview image, the target area of the preview image containing the flawed content and the target special effect are determined, the target special effect is added to the target area to obtain a target preview image, and a photo is generated according to the target preview image. The flawed content in the resulting photo carries a target special effect that reduces its visibility, which improves the presentation of the photo and satisfies the user's shooting needs. Moreover, the user can obtain a better-looking photo with a simple touch operation, without retaking the photo or applying special-effect processing to a flawed photo after shooting, which saves the user's time and improves the user experience. Further, by recognizing the moving object to be processed in the touch region and using the region occupied by the object after it moves as the target area, the target special effect can be added to the region that actually contains the object to be processed, which solves the problem that, after the object moves, adding the target special effect to the original touch region would fail to apply the special-effect processing to the object.
Fig. 7 is a structural block diagram of a mobile terminal according to another embodiment of the present invention. The mobile terminal 700 shown in Fig. 7 includes at least one processor 701, a memory 702, at least one network interface 704 and other user interfaces 703. The components of the mobile terminal 700 are coupled together through a bus system 705. It can be understood that the bus system 705 is used to implement connection and communication between these components. In addition to a data bus, the bus system 705 also includes a power bus, a control bus and a status signal bus. For clarity of description, however, the various buses are all labelled as the bus system 705 in Fig. 7.
The user interface 703 may include a display, a keyboard or a pointing device (for example a mouse, a trackball, a touch pad or a touch screen).
It can be understood that the memory 702 in the embodiments of the present invention may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM) or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example rather than limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM) and direct Rambus RAM (DRRAM). The memory 702 of the systems and methods described in the embodiments of the present invention is intended to include, without being limited to, these and any other suitable types of memory. The memory 702 can store preset operation rules, including preset data such as a preset sliding track, a preset pressure threshold and a preset operation time interval; the embodiments of the present invention do not restrict the specific content of the memory.
In some implementations, the memory 702 stores the following elements, executable modules or data structures, or a subset or superset thereof: an operating system 7021 and application programs 7022.
The operating system 7021 contains various system programs, such as a framework layer, a core library layer and a driver layer, for implementing various basic services and handling hardware-based tasks. The application programs 7022 contain various applications, such as a media player and a browser, for implementing various application services. A program implementing the method of the embodiments of the present invention may be contained in the application programs 7022.
In the embodiments of the present invention, by calling a program or instructions stored in the memory 702, specifically a program or instructions stored in the application programs 7022, the processor 701 is configured to: detect a touch operation on an object to be processed in a current preview image; determine a target area of the preview image and a target special effect according to the touch operation, the target area containing the object to be processed; add the target special effect to the target area of the preview image to obtain a target preview image; and, when a shooting instruction is received, generate a target photo according to the target preview image.
The methods disclosed in the above embodiments of the present invention can be applied to the processor 701 or implemented by the processor 701. The processor 701 may be an integrated circuit chip with signal processing capability. In the implementation process, each step of the above methods can be completed by an integrated logic circuit of hardware in the processor 701 or by instructions in the form of software. The processor 701 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute the methods, steps and block diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present invention can be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory or a register. The storage medium is located in the memory 702, and the processor 701 reads the information in the memory 702 and completes the steps of the above methods in combination with its hardware.
It can be understood that the embodiments described herein can be implemented with hardware, software, firmware, middleware, microcode or a combination thereof. For a hardware implementation, the processing unit can be implemented in one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), general-purpose processors, controllers, microcontrollers, microprocessors, other electronic units for performing the functions of the present invention, or a combination thereof.
For a software implementation, the techniques described in the embodiments of the present invention can be implemented by modules (for example procedures or functions) that perform the functions described herein. The software code can be stored in the memory and executed by the processor. The memory can be implemented inside or outside the processor.
Alternatively, as one embodiment, processor 701 is additionally operable to:The touch area of the touch control operation is obtained, according to The touch area determines the target area;Obtain the touch-control intensity on the touch area, root in the touch control operation The target special efficacy is determined according to the touch-control intensity of reading.
Alternatively, as another embodiment, when including be kept in motion pending right in the preview image As when, the touch control operation includes the first touch control operation and the second touch control operation, and processor 701 is additionally operable to:Obtain described first Moving Objects identification region indicated by touch control operation, the target area is determined according to the Moving Objects identification region;Its In, the Moving Objects identification region includes the region according to determined by the touch area of first touch control operation;Obtain institute Touch-control intensity of second touch control operation in default touch-control intensity reading area is stated, according to being determined the touch-control intensity of reading Target special efficacy.
Alternatively, as another embodiment, processor 701 is additionally operable to:Recognize in the Moving Objects identification region Pending object;The pending object is tracked, the region residing for the pending object after movement is regard as the target area.
Alternatively, as another embodiment, processor 701 is additionally operable to:By user in default special efficacy selection interface Selected touch-control intensity includes multiple touch-control intensity that history is read as target touch-control intensity, the special efficacy selection interface; Or, it regard the touch-control intensity currently read as target touch-control intensity;Search the target spy corresponding to the target touch-control intensity Imitate grade;According to the target special efficacy grade of setting special efficacy type, the target special efficacy is determined.
The mobile terminal 700 can implement each process implemented by the mobile terminal in the foregoing embodiments; to avoid repetition, details are not described here again. According to the embodiments of the present invention, the mobile terminal 700 determines, according to the user's touch operation on the preview image, a target area of the preview image that contains flawed content and a target special effect, adds the target special effect to the target area to obtain a target preview image, and generates a photo according to the target preview image. The flawed content of the resulting photo carries a target special effect that reduces its visibility, which improves the presentation of the photo and meets the user's shooting needs. Moreover, the user can obtain a photo with a better presentation by a simple touch operation, without retaking the photo or applying special-effect processing to a flawed photo after shooting, which saves the user's time and improves user experience. Further, by recognizing the object to be processed that is in a moving state in the touch area, and using the region in which the object to be processed is located after moving as the target area, the target special effect is added to the region that actually contains the object to be processed. This solves the problem that, if the object to be processed has moved, adding the target special effect to the touch area alone cannot apply special-effect processing to the object.
Fig. 8 is a structural block diagram of a mobile terminal according to another embodiment of the present invention. Specifically, the mobile terminal 800 in Fig. 8 may be a mobile phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), an in-vehicle computer, or the like.
The mobile terminal 800 in Fig. 8 includes a radio frequency (Radio Frequency, RF) circuit 810, a memory 820, an input unit 830, a display unit 840, a processor 860, an audio circuit 870, a WiFi (Wireless Fidelity) module 880 and a power supply 890.
The input unit 830 may be configured to receive digit or character information input by the user, and to generate signal input related to user settings and function control of the mobile terminal 800. Specifically, in the embodiments of the present invention, the input unit 830 may include a touch panel 831. The touch panel 831, also referred to as a touch screen, may collect touch operations performed by the user on or near it (for example, operations performed by the user on the touch panel 831 with a finger, a stylus or any other suitable object or accessory), and drive a corresponding connection apparatus according to a preset program. Optionally, the touch panel 831 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends the coordinates to the processor 860, and receives and executes commands sent by the processor 860. In addition, the touch panel 831 may be implemented in multiple types such as resistive, capacitive, infrared and surface acoustic wave. Besides the touch panel 831, the input unit 830 may also include other input devices 832, which may include but are not limited to one or more of a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse and a joystick.
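As a rough illustration of this two-part arrangement, the Kotlin sketch below models a touch controller that converts a raw reading from the touch detection apparatus into contact coordinates and a normalized pressure before handing them to the processor side; all types and the scaling are assumptions of the sketch, not details of the panel 831.

```kotlin
// Illustrative sketch only: touch detection apparatus -> touch controller -> processor.

data class RawTouchSignal(val sensorX: Int, val sensorY: Int, val pressureCode: Int)

data class ContactPoint(val x: Float, val y: Float, val pressure: Float)

class TouchController(
    private val panelWidth: Int,
    private val panelHeight: Int,
    private val screenWidth: Float,
    private val screenHeight: Float
) {
    // Convert a raw sensor reading into screen coordinates and a pressure in [0, 1].
    fun toContactPoint(signal: RawTouchSignal): ContactPoint = ContactPoint(
        x = signal.sensorX.toFloat() / panelWidth * screenWidth,
        y = signal.sensorY.toFloat() / panelHeight * screenHeight,
        pressure = signal.pressureCode / 255f
    )
}

// The processor side simply receives the converted contact point.
fun onContact(point: ContactPoint) {
    println("contact at (${point.x}, ${point.y}) with pressure ${point.pressure}")
}

fun main() {
    val controller = TouchController(1024, 2048, 1080f, 2160f)
    onContact(controller.toContactPoint(RawTouchSignal(512, 1024, 200)))
}
```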
The display unit 840 may be configured to display information input by the user or information provided to the user, and the various menu interfaces of the mobile terminal 800. The display unit 840 may include a display panel 841; optionally, the display panel 841 may be configured in the form of an LCD, an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
It should be noted that the touch panel 831 may cover the display panel 841 to form a touch display screen. After the touch display screen detects a touch operation on or near it, the operation is transmitted to the processor 860 to determine the type of the touch event, and the processor 860 then provides a corresponding visual output on the touch display screen according to the type of the touch event.
The touch display screen includes an application interface display area and a common control display area. The arrangement of the application interface display area and the common control display area is not limited; they may be arranged one above the other, side by side, or in any other arrangement that distinguishes the two display areas. The application interface display area may be used to display the interface of an application. Each interface may contain interface elements such as icons of at least one application and/or widget desktop controls. The application interface display area may also be an empty interface containing no content. The common control display area is used to display controls with a high usage rate, for example, application icons such as a settings button, an interface number, a scroll bar and a phone book icon.
The processor 860 is the control center of the mobile terminal 800. It connects all parts of the entire mobile phone through various interfaces and lines, and performs the various functions of the mobile terminal 800 and processes data by running or executing the software programs and/or modules stored in the first memory 821 and calling the data stored in the second memory 822, so as to monitor the mobile terminal 800 as a whole. Optionally, the processor 860 may include one or more processing units.
In the embodiments of the present invention, by calling the software programs and/or modules stored in the first memory 821 and/or the data stored in the second memory 822, the processor 860 is configured to: detect a touch operation on an object to be processed in a current preview image; determine a target area and a target special effect of the preview image according to the touch operation, where the target area includes the object to be processed; add the target special effect to the target area of the preview image to obtain a target preview image; and when a shooting instruction is received, generate a target picture according to the target preview image.
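The end-to-end flow just described can be summarized by the Kotlin sketch below. Every type and helper in it (PreviewImage, applyEffect and so on) is an assumption made for illustration and does not correspond to any particular camera API; overwriting pixels with a fixed value merely stands in for a real blur or mosaic effect.

```kotlin
// Illustrative sketch only: touch -> target area and effect -> target preview image
// -> target picture on a shooting instruction.

data class Area(val left: Int, val top: Int, val right: Int, val bottom: Int)

class PreviewImage(val width: Int, val height: Int, val pixels: IntArray)

data class Touch(val area: Area, val intensity: Float)

enum class Effect { BLUR, MOSAIC }

// Steps 1-2: derive the target area and the target special effect from the touch.
fun determineTarget(touch: Touch): Pair<Area, Effect> =
    touch.area to if (touch.intensity < 0.5f) Effect.BLUR else Effect.MOSAIC

// Step 3: add the effect to the target area of the preview image to obtain the
// target preview image (a copy whose pixels inside the area are replaced).
fun applyEffect(preview: PreviewImage, area: Area, effect: Effect): PreviewImage {
    val out = preview.pixels.copyOf()
    val fill = if (effect == Effect.BLUR) 0x00808080 else 0x00000000
    for (y in area.top until area.bottom) {
        for (x in area.left until area.right) {
            if (x in 0 until preview.width && y in 0 until preview.height) {
                out[y * preview.width + x] = fill // placeholder for the real effect
            }
        }
    }
    return PreviewImage(preview.width, preview.height, out)
}

// Step 4: when a shooting instruction is received, the target picture is generated
// from the target preview image (here the picture is the processed frame itself).
fun generateTargetPicture(targetPreview: PreviewImage): PreviewImage = targetPreview

fun main() {
    val preview = PreviewImage(8, 8, IntArray(64) { 0x00FFFFFF })
    val touch = Touch(Area(2, 2, 5, 5), intensity = 0.8f)
    val (area, effect) = determineTarget(touch)
    val picture = generateTargetPicture(applyEffect(preview, area, effect))
    println("pixels affected: " + picture.pixels.count { it != 0x00FFFFFF })
}
```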
Optionally, as an embodiment, the processor 860 is further configured to: obtain a touch area of the touch operation, and determine the target area according to the touch area; and obtain a touch intensity on the touch area in the touch operation, and determine the target special effect according to the read touch intensity.
Optionally, as another embodiment, when the preview image includes an object to be processed that is in a moving state, the touch operation includes a first touch operation and a second touch operation, and the processor 860 is further configured to: obtain a moving-object identification region indicated by the first touch operation, and determine the target area according to the moving-object identification region, where the moving-object identification region includes a region determined according to a touch area of the first touch operation; and obtain a touch intensity of the second touch operation in a preset touch-intensity reading area, and determine the target special effect according to the read touch intensity.
Optionally, as another embodiment, the processor 860 is further configured to: recognize the object to be processed in the moving-object identification region; and track the object to be processed, and use the region in which the object to be processed is located after moving as the target area.
Optionally, as another embodiment, the processor 860 is further configured to: use a touch intensity selected by the user in a preset special-effect selection interface as a target touch intensity, where the special-effect selection interface includes multiple touch intensities read in history; or use the currently read touch intensity as the target touch intensity; look up a target special-effect grade corresponding to the target touch intensity; and determine the target special effect according to the target special-effect grade and a set special-effect type.
The mobile terminal 800 can implement each process implemented by the mobile terminal in the foregoing embodiments; to avoid repetition, details are not described here again. According to the embodiments of the present invention, the mobile terminal 800 determines, according to the user's touch operation on the preview image, a target area of the preview image that contains flawed content and a target special effect, adds the target special effect to the target area to obtain a target preview image, and generates a photo according to the target preview image. The flawed content of the resulting photo carries a target special effect that reduces its visibility, which improves the presentation of the photo and meets the user's shooting needs. Moreover, the user can obtain a photo with a better presentation by a simple touch operation, without retaking the photo or applying special-effect processing to a flawed photo after shooting, which saves the user's time and improves user experience. Further, by recognizing the object to be processed that is in a moving state in the touch area, and using the region in which the object to be processed is located after moving as the target area, the target special effect is added to the region that actually contains the object to be processed. This solves the problem that, if the object to be processed has moved, adding the target special effect to the touch area alone cannot apply special-effect processing to the object.
Those of ordinary skill in the art may appreciate that the units and algorithm steps of the examples described with reference to the embodiments disclosed in the embodiments of the present invention can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such an implementation should not be considered as going beyond the scope of the present invention.
A person skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, devices and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely schematic; for example, the division of the units is merely a division by logical function, and there may be other divisions in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections between apparatuses or units through some interfaces, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention essentially, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash disk, a removable hard disk, a ROM, a RAM, a magnetic disk or an optical disc.
The foregoing is merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can be readily figured out by a person familiar with the art within the technical scope disclosed in the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the scope of the claims.

Claims (10)

1. A photographing method, applied to a mobile terminal, characterized in that the method comprises:
detecting a touch operation on an object to be processed in a current preview image;
determining a target area and a target special effect of the preview image according to the touch operation, the target area including the object to be processed;
adding the target special effect to the target area of the preview image to obtain a target preview image; and
when a shooting instruction is received, generating a target picture according to the target preview image.
2. The method according to claim 1, characterized in that the step of determining the target area and the target special effect of the preview image according to the touch operation comprises:
obtaining a touch area of the touch operation, and determining the target area according to the touch area; and
obtaining a touch intensity on the touch area in the touch operation, and determining the target special effect according to the read touch intensity.
3. The method according to claim 1, characterized in that, when the preview image includes an object to be processed that is in a moving state, the touch operation comprises a first touch operation and a second touch operation, and the step of determining the target area and the target special effect of the preview image according to the touch operation comprises:
obtaining a moving-object identification region indicated by the first touch operation, and determining the target area according to the moving-object identification region, wherein the moving-object identification region includes a region determined according to a touch area of the first touch operation; and
obtaining a touch intensity of the second touch operation in a preset touch-intensity reading area, and determining the target special effect according to the read touch intensity.
4. The method according to claim 3, characterized in that the step of determining the target area according to the moving-object identification region comprises:
recognizing the object to be processed in the moving-object identification region; and
tracking the object to be processed, and using the region in which the object to be processed is located after moving as the target area.
5. The method according to any one of claims 2 to 4, characterized in that the step of determining the target special effect according to the read touch intensity comprises:
using a touch intensity selected by the user in a preset special-effect selection interface as a target touch intensity, the special-effect selection interface including multiple touch intensities read in history, or using the currently read touch intensity as the target touch intensity;
looking up a target special-effect grade corresponding to the target touch intensity; and
determining the target special effect according to the target special-effect grade and a set special-effect type.
6. A mobile terminal, comprising:
a touch operation detection unit, configured to detect a touch operation on an object to be processed in a current preview image;
an area and special-effect determining unit, configured to determine a target area and a target special effect of the preview image according to the touch operation, the target area including the object to be processed;
a target preview image obtaining unit, configured to add the target special effect to the target area of the preview image to obtain a target preview image; and
a target picture generating unit, configured to generate, when a shooting instruction is received, a target picture according to the target preview image.
7. The mobile terminal according to claim 6, characterized in that the area and special-effect determining unit comprises:
a first target area determining subunit, configured to obtain a touch area of the touch operation and determine the target area according to the touch area; and
a first target special-effect determining subunit, configured to obtain a touch intensity on the touch area in the touch operation and determine the target special effect according to the read touch intensity.
8. The mobile terminal according to claim 6, characterized in that, when the preview image includes an object to be processed that is in a moving state, the touch operation comprises a first touch operation and a second touch operation, and the area and special-effect determining unit comprises:
a second target area determining subunit, configured to obtain a moving-object identification region indicated by the first touch operation and determine the target area according to the moving-object identification region, wherein the moving-object identification region includes a region determined according to a touch area of the first touch operation; and
a second target special-effect determining subunit, configured to obtain a touch intensity of the second touch operation in a preset touch-intensity reading area and determine the target special effect according to the read touch intensity.
9. The mobile terminal according to claim 8, characterized in that the second target area determining subunit comprises:
an object recognition module, configured to recognize the object to be processed in the moving-object identification region; and
an object tracking module, configured to track the object to be processed and use the region in which the object to be processed is located after moving as the target area.
10. The mobile terminal according to any one of claims 7 to 9, characterized in that the first target special-effect determining subunit or the second target special-effect determining subunit comprises:
a target touch intensity determining module, configured to use a touch intensity selected by the user in a preset special-effect selection interface as a target touch intensity, the special-effect selection interface including multiple touch intensities read in history, or to use the currently read touch intensity as the target touch intensity;
a target special-effect grade determining module, configured to look up a target special-effect grade corresponding to the target touch intensity; and
a target special-effect determining module, configured to determine the target special effect according to the target special-effect grade and a set special-effect type.
CN201710464380.8A 2017-06-19 2017-06-19 Photographing method and mobile terminal Active CN107315529B (en)

Priority Applications (1)

CN201710464380.8A, priority date 2017-06-19, filing date 2017-06-19: Photographing method and mobile terminal

Publications (2)

CN107315529A, published 2017-11-03
CN107315529B (granted), published 2020-05-26

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030052994A1 (en) * 2001-09-17 2003-03-20 Eric Auffret Wireless video camera
CN104159032A (en) * 2014-08-20 2014-11-19 广东欧珀移动通信有限公司 Method and device of adjusting facial beautification effect in camera photographing in real time
CN104199610A (en) * 2014-08-27 2014-12-10 联想(北京)有限公司 Information processing method and electronic device
CN105554364A (en) * 2015-07-30 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN106814959A (en) * 2015-11-30 2017-06-09 东莞酷派软件技术有限公司 A kind of U.S. face photographic method, device and terminal
CN106231182A (en) * 2016-07-29 2016-12-14 维沃移动通信有限公司 A kind of photographic method and mobile terminal
CN106791016A (en) * 2016-11-29 2017-05-31 努比亚技术有限公司 A kind of photographic method and terminal

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110309933A (en) * 2018-03-23 2019-10-08 广州极飞科技有限公司 Plant plants data measuring method, work route method and device for planning, system
US11321942B2 (en) 2018-03-23 2022-05-03 Guangzhou Xaircraft Technology Co., Ltd. Method for measuring plant planting data, device and system
CN110035227A (en) * 2019-03-25 2019-07-19 维沃移动通信有限公司 Special effect display methods and terminal device
CN113132641A (en) * 2021-04-23 2021-07-16 北京达佳互联信息技术有限公司 Shooting control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant