CN107155063A - Night scene filming control method, system and equipment - Google Patents

Night scene filming control method, system and equipment

Info

Publication number
CN107155063A
CN107155063A (application CN201710465036.0A)
Authority
CN
China
Prior art keywords
picture
positional information
night scene
eyes
face contour
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710465036.0A
Other languages
Chinese (zh)
Other versions
CN107155063B (en)
Inventor
朱斌杰 (Zhu Binjie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Tai Heng Connaught Technology Co Ltd Shanghai Branch
Original Assignee
Shenzhen Tai Heng Connaught Technology Co Ltd Shanghai Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Tai Heng Connaught Technology Co Ltd Shanghai Branch filed Critical Shenzhen Tai Heng Connaught Technology Co Ltd Shanghai Branch
Priority to CN201710465036.0A priority Critical patent/CN107155063B/en
Publication of CN107155063A publication Critical patent/CN107155063A/en
Application granted granted Critical
Publication of CN107155063B publication Critical patent/CN107155063B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The present application provides a night scene shooting control method, system and device. The method comprises: first acquiring a picture of a subject through the camera device of an electronic device; detecting the position information of the subject's eyes in the picture; calculating, from the position information of the eyes in the picture, the position information of the subject's facial contour in the picture and generating a corresponding facial contour region in the picture accordingly; and controlling the fill-light device of the electronic device to supplement light for the generated facial contour region while acquiring image data of the subject through the camera device. The present application can provide high-quality night scene shooting results for dark-skinned subjects.

Description

Night scene filming control method, system and equipment
Technical field
The present application relates to the technical field of image capture, and in particular to a night scene shooting control method, system and device.
Background art
Night scenes are a subject that photography enthusiasts love to shoot, but they are also a very difficult one, because exposure control at night is far from trivial. Shooting a night scene photo of a dark-skinned subject is harder still, especially when shooting a group photo of dark-skinned and light-skinned subjects together: because a suitable exposure value is difficult to determine, the resulting photo tends to either overexpose the light-skinned subject or underexpose the dark-skinned one. Capturing high-quality night images of dark-skinned subjects is therefore the technical problem this application seeks to solve.
Summary of the invention
In view of the above shortcomings of the prior art, the purpose of the present application is to provide a night scene shooting control method, system and device, to solve the prior-art problem of poor night scene shooting results for dark-skinned subjects.
To achieve the above and other related purposes, a first aspect of the present application provides a night scene shooting control method, applied to an electronic device having a fill-light device and a camera device, comprising: acquiring a picture of a subject through the camera device, wherein the picture at least includes the subject's face; detecting the subject's eyes in the picture, and confirming the position information of the eyes relative to the picture; calculating, from the position information of the eyes in the picture, the position information of the subject's facial contour in the picture, and generating a corresponding facial contour region in the picture accordingly; and controlling the fill-light device to supplement light for the facial contour region while acquiring image data of the subject through the camera device.
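The four claimed steps (acquire a picture, locate the eyes, derive a facial contour region, then fill-light that region while capturing) can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: `detect_eyes`, `estimate_face_region`, and the `camera`/`fill_light` objects are hypothetical names, and the face-box proportions are assumed.

```python
# Hedged sketch of the claimed pipeline. All names and proportions are
# illustrative assumptions, not taken from the patent text.

def detect_eyes(picture):
    # Placeholder: a real detector would grayscale the picture and
    # locate the bright sclera regions.
    return [(40, 30), (40, 50)]  # (row, col) of the two eyes

def estimate_face_region(eye_positions, picture_shape):
    # Derive a face bounding box from the eye positions, using the
    # inter-ocular distance as the scale reference.
    (y1, x1), (y2, x2) = eye_positions
    d = abs(x2 - x1)                            # inter-ocular distance
    top = max(0, y1 - d)                        # forehead
    bottom = min(picture_shape[0], y1 + 2 * d)  # chin
    left = max(0, min(x1, x2) - d // 2)
    right = min(picture_shape[1], max(x1, x2) + d // 2)
    return (top, left, bottom, right)

def capture_night_portrait(camera, fill_light):
    picture = camera.preview()                      # 1. acquire picture
    eyes = detect_eyes(picture)                     # 2. locate eyes
    region = estimate_face_region(eyes, (120, 80))  # 3. contour region
    fill_light.illuminate(region)                   # 4. targeted fill light
    return camera.capture()                         #    acquire image data

print(estimate_face_region([(40, 30), (40, 50)], (120, 80)))
# → (20, 20, 80, 60)
```

The `camera` and `fill_light` arguments stand in for whatever hardware abstraction the device exposes; only the geometric helper is concrete here.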
In some embodiments of the first aspect of the present application, the step of confirming the position of the eyes comprises confirming the two-dimensional coordinate position information of the eyes in the picture.
In some embodiments of the first aspect, the method further comprises a step of calculating, from the two-dimensional coordinate position information of the eyes in the picture, the position information of a plurality of boundary points of the subject's facial contour in the picture, and generating the corresponding facial contour region in the picture accordingly.
In some embodiments of the first aspect, the method comprises a step of detecting the subject's teeth in the picture and confirming the position information of the teeth relative to the picture.
In some embodiments of the first aspect, the method further comprises a step of calculating the position information of the subject's facial contour in the picture from the position information of both the eyes and the teeth in the picture.
In some embodiments of the first aspect, the method further comprises a step of providing an editing interface, so that a user can determine, in the picture of the subject, the position information of the eyes and/or the teeth relative to the picture.
In some embodiments of the first aspect, the electronic device is any one of a smartphone, a video camera, a handheld computer, or a tablet computer.
In some embodiments of the first aspect, the fill-light device is a flash lamp.
In some embodiments of the first aspect, the subject is a subject with dark skin.
A second aspect of the present application provides a night scene shooting control system, comprising: a fill-light module for providing supplementary light; a photographing module for acquiring a picture of a subject, wherein the picture at least includes the subject's face, and for generating image data of the subject from the acquired picture; a detection module for detecting the subject's eyes in the picture and confirming the position information of the eyes in the picture; a computing module for calculating, from the position information of the eyes in the picture, the position information of the subject's facial contour in the picture, and generating a corresponding facial contour region in the picture accordingly; and a control module for controlling the fill-light module to supplement light for the facial contour region generated in the picture while controlling the photographing module to acquire the image data of the subject.
In some embodiments of the second aspect of the present application, the detection module is configured to confirm the two-dimensional coordinate position information of the eyes in the picture.
In some embodiments of the second aspect, the computing module is configured to calculate, from the two-dimensional coordinate position information of the eyes in the picture, the position information of a plurality of boundary points of the subject's facial contour in the picture, and to generate the corresponding facial contour region in the picture accordingly.
In some embodiments of the second aspect, the detection module is further configured to detect the subject's teeth in the picture and to confirm the position information of the teeth relative to the picture.
In some embodiments of the second aspect, the computing module is further configured to calculate the position information of the subject's facial contour in the picture from the position information of the eyes and the teeth in the picture.
In some embodiments of the second aspect, the system further comprises an editing module for providing an editing interface, so that a user can edit, in the picture of the subject, the position information of the eyes and/or the teeth relative to the picture.
In some embodiments of the second aspect, the system is loaded and run in an electronic device, and the electronic device is any one of a smartphone, a video camera, a handheld computer, or a tablet computer.
In some embodiments of the second aspect, the subject is a subject with dark skin.
A third aspect of the present application provides an electronic device, comprising: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the instructions, when executed by the one or more processors, cause the electronic device to perform the night scene shooting control method described above.
As described above, the night scene shooting control method, system and device of the present application acquire a picture of a subject, detect the position information of the subject's eyes in the picture, calculate from the eye position information the region of the subject's facial contour in the picture, control the fill-light device to supplement light for the calculated facial contour region accordingly, and acquire image data of the subject through the camera device.
In addition, the present application can further detect the position of the subject's teeth and combine the positions of the eyes and the teeth to calculate the region of the facial contour, thereby improving the accuracy with which the subject's face (head) is located. The present application can thus apply targeted exposure to the subject's facial contour region in the captured image, providing high-quality night scene shooting results for dark-skinned subjects.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of embodiment one of the night scene shooting control system of the present application.
Fig. 2 is a structural schematic diagram of embodiment two of the night scene shooting control system of the present application.
Fig. 3 is a structural schematic diagram of embodiment one of the electronic device of the present application.
Fig. 4 is a flow chart of embodiment one of the night scene shooting control method of the present application.
Fig. 5 is a flow chart of embodiment two of the night scene shooting control method of the present application.
Fig. 6A and Fig. 6B are effect diagrams of an embodiment of the night scene shooting control system and method of the present application.
Detailed description of the embodiments
The embodiments of the present application are illustrated below by way of specific examples; those skilled in the art can easily understand other advantages and effects of the present application from the content disclosed in this specification.
In the following description, reference is made to the accompanying drawings, which describe some embodiments of the present application. It should be understood that other embodiments may also be used, and mechanical, structural, electrical and operational changes may be made without departing from the spirit and scope of the present disclosure. The following detailed description should not be considered limiting, and the scope of the embodiments herein is limited only by the claims of the granted patent. The terms used herein are merely for describing specific embodiments and are not intended to limit the application. Spatially related terms, such as "upper", "lower", "left", "right", "below", "beneath", "bottom", "above", "top" and the like, may be used in the text to describe the relationship of one element or feature shown in a figure to another element or feature.
Although the terms first, second, etc. are used herein in some instances to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first preset threshold could be termed a second preset threshold, and, similarly, a second preset threshold could be termed a first preset threshold, without departing from the scope of the various described embodiments. The first preset threshold and the second preset threshold both describe a threshold, but unless the context clearly indicates otherwise, they are not the same preset threshold. A similar situation applies to a first volume and a second volume.
Furthermore, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprising" and "including" indicate the presence of the stated features, steps, operations, elements, components, items, categories, and/or groups, but do not exclude the presence, occurrence or addition of one or more other features, steps, operations, elements, components, items, categories, and/or groups. The terms "or" and "and/or" used herein are interpreted as inclusive, meaning any one or any combination. Therefore, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
The night scene shooting control system of the present application is loaded and run in an electronic device, and is particularly suitable for shooting night scene photos of subjects with darker skin.
In practical embodiments, the electronic device is, for example, a portable electronic device including but not limited to a digital camera, a video camera, a handheld computer, a tablet computer, a mobile phone with a camera function or a smartphone, a media player, a personal digital assistant (PDA), and the like. It should be understood that the portable electronic device described in the embodiments of the present application is only one application example; the device may have more or fewer components than illustrated, or a different component configuration. The various components illustrated may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The electronic device includes a memory, a memory controller, one or more processors (CPUs), a peripheral interface, an RF circuit, an audio circuit, a speaker, a microphone, an input/output (I/O) subsystem, a touch screen, other output or control devices, and an external port. These components communicate through one or more communication buses or signal lines.
The electronic device supports various application programs, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk editing application, a spreadsheet application, a game application, a telephone application, a video conferencing application, an email application, an instant messaging application, a fitness support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application and/or a video player application.
Referring to Fig. 1, which is a structural schematic diagram of embodiment one of the night scene shooting control system of the present application. As illustrated, the night scene shooting control system 100 of the present application mainly includes a fill-light module 110, a photographing module 120, a detection module 130, a computing module 140, and a control module 150.
The fill-light module 110 is used to provide supplementary light.
In practical applications, the fill-light module 110 is, for example, the flash lamp of a camera. When the photographed subject is in dim light, the flash illuminates the subject to obtain correct exposure. It mainly includes an LED flash or a xenon flash.
A xenon flash system mainly has three parts: a battery for providing power, a gas discharge tube for producing the flash, and a circuit (composed of multiple electronic components) connecting the power supply and the gas discharge tube. When the circuit is connected to the two poles of the battery, the battery forces electrons to flow through the circuit from one pole to the other. The moving electrons (i.e., the current) provide energy to each element connected in the circuit. The discharge tube consists of a tube filled with xenon, with electrodes at both ends and a metal trigger plate in the middle. Its basic principle is to conduct current through the xenon in the tube from one electrode (where electrons move freely) to the other. The free electrons energize the xenon atoms as they move, causing them to emit photons. The circuit converts the low voltage of the battery into a high voltage and applies a very high positive voltage to the metal trigger plate of the discharge tube, so that the xenon is ionized. The fast-moving electrons collide with the xenon atoms, energizing them and producing light, so that the xenon lamp emits light.
The operating principle of an LED flash is to apply a voltage across the PN junction of the LED; the PN junction itself forms an energy level, and electron transitions on this energy level cause the tube to emit light. Because LED flashes are cheaper, they are more commonly used for the camera function of most smartphones.
The photographing module 120 is used to acquire a picture of the subject, wherein the picture at least includes the subject's face, and to generate the image data of the subject from the acquired picture.
In embodiments of the present application, the picture may be a still photo of the subject generated after the photographing module 120 shoots the subject; it may also be the dynamic image of the subject presented in the lens of the photographing module 120 before shooting. It should be noted that the latter may require the positions of the photographing module 120 and the subject to be fixed, to ensure that the relative position of the subject in the picture remains unchanged.
In practical applications, the photographing module 120 is, for example, a camera, including but not limited to an SLR camera, a digital SLR camera, a twin-lens reflex camera, a medium-format camera, a rangefinder camera, a compact camera, a pinhole camera, a folding camera, a tilt-shift camera, an instant camera, a panoramic camera, an underwater camera, and the like.
The photographing module 120 is a device that forms an image using the principle of optical imaging and records the image on a light-sensitive medium. Many devices that can record images have the features of a camera. A camera is an optical instrument for photography. After the light reflected by the subject is focused by the lens and the exposure is controlled by the shutter, the subject forms a latent image on the photosensitive material in the camera body; after processing (developing and fixing), a permanent image is formed. This technique is called photography, and is divided into general and professional photography.
The detection module 130 is used to detect the subject's eyes from the picture and to confirm the position information of the eyes in the picture. Preferably, the detection module 130 is used to confirm the two-dimensional coordinate position information of the eyes in the picture.
Referring also to Fig. 6A, in embodiments of the present application, because the color of the white of the eye (sclera) is light, it is easy to recognize even in night scene mode. The detection module 130 therefore exploits this characteristic of the sclera to detect the position of the eyes in the picture.
In practical applications, the detection module 130 can detect the subject's eyes from the picture through grayscale processing of the image. Image grayscale processing is the process of converting a color image into a grayscale image. The color of each pixel in a color image is determined by three components, R, G and B, and each component has 255 possible values, so a single pixel can take more than 16 million colors (255 × 255 × 255). A grayscale image is a special color image in which the R, G and B components are identical, so one pixel has only 255 possible values. In digital image processing, images of various formats are therefore generally first converted into grayscale images to reduce the amount of subsequent computation and lighten the load on the system.
The grayscale processing of an image can be implemented in two ways. The first method is to take the average of the R, G and B components of each pixel and assign this average to the three components of the pixel. The second method is based on the YUV color space, in which the physical meaning of the Y component is the brightness of the point; this value reflects the brightness level. According to the transformation relationship between the RGB and YUV color spaces, the correspondence between the luminance Y and the three color components R, G and B is established as Y = 0.3R + 0.59G + 0.11B, and this luminance value expresses the gray value of the image.
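Both grayscale methods just described can be expressed in a few lines of plain Python; the tiny nested-list "image" below is made up for illustration, and no image library is required.

```python
# Two grayscale conversions: channel average, and the YUV luminance
# formula Y = 0.3R + 0.59G + 0.11B from the description above.

def gray_average(r, g, b):
    # Method 1: plain average of the three channels.
    return (r + g + b) // 3

def gray_luminance(r, g, b):
    # Method 2: luminance Y from the RGB -> YUV conversion.
    return round(0.3 * r + 0.59 * g + 0.11 * b)

# A made-up 2x2 RGB image: bright skin-toned and dark pixels.
image = [[(200, 180, 170), (20, 15, 10)],
         [(250, 240, 235), (5, 5, 5)]]

gray = [[gray_luminance(*px) for px in row] for row in image]
print(gray)  # → [[185, 16], [242, 5]]
```

Note how the bright pixels (sclera-like values) stay well separated from the dark ones after conversion, which is what makes the eye-whites detectable in the grayscale picture.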
In addition, the detection module 130 further combines a two-dimensional coordinate axis technique to determine the two-dimensional coordinate position information of the eyes in the picture.
The computing module 140 is used to calculate, from the position information of the eyes in the picture, the position information of the subject's facial contour in the picture, and to generate a corresponding facial contour region in the picture accordingly.
Specifically, the computing module 140 calculates, from the two-dimensional coordinate position information of the eyes in the picture, the position information of a plurality of boundary points of the subject's facial contour in the picture, and generates the corresponding facial contour region in the picture accordingly. In a specific embodiment, the boundary points of the facial contour may be, for example, the position information of the subject's chin, forehead and ears, and a frame such as that shown in Fig. 6B is generated by connecting the above boundary points; the subject's face area is located within the frame shown in Fig. 6B.
It should be noted that the facial contour region formed by the present application is not limited to the rectangular frame shown in Fig. 6B. Specifically, the more boundary points are calculated, the more accurate the facial contour region formed, and the shape of the generated facial contour region depends on the number of boundary points.
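A minimal sketch of the boundary-point idea: the points are connected in order into a polygon, and a pixel belongs to the face region if it falls inside. With only four assumed points the region is a coarse quadrilateral; more points would refine it. The coordinates are invented for illustration and are not from the patent.

```python
# Boundary points (forehead, right ear, chin, left ear) connected in
# order enclose the face region; a standard ray-casting test decides
# whether a pixel lies inside the polygon.

def point_in_polygon(x, y, poly):
    # Ray-casting: count crossings of a horizontal ray from (x, y).
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Four illustrative boundary points: forehead, right ear, chin, left ear.
face_polygon = [(40, 10), (70, 40), (40, 80), (10, 40)]

print(point_in_polygon(40, 41, face_polygon))  # pixel near face center → True
print(point_in_polygon(5, 5, face_polygon))    # background pixel → False
```

Adding more boundary points to `face_polygon` (e.g. cheek and temple points) tightens the region without changing the membership test, which matches the remark above that the region's shape depends on the number of boundary points.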
Considering that the eyes are prominent features of the face, they occupy a relatively fixed position within it, and the distance between the eyes characterizes the size of the face and serves as the basis for scale normalization in face recognition. Most face recognition algorithms therefore rely on the localization of the eyes. That is, as long as the position of the eyes is accurately located, the other features of the face, such as the eyebrows, nose and mouth, can be located more accurately through their potential distribution relationship.
In certain embodiments, the computing module 140 can use localization algorithms such as the Hough transform method, the deformable template method, edge feature analysis or the stacking transform method, to calculate, from the two-dimensional coordinate position information of the eyes in the picture, the position information of the plurality of boundary points of the subject's facial contour in the picture, and to generate the corresponding facial contour region in the picture accordingly (i.e., the region shown by the frame in Fig. 6B).
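As a concrete illustration of the first of these algorithms, a Hough transform for circles at a known radius can locate an iris-like circular edge by letting every edge pixel vote for all candidate centers at that radius; the vote peak is the detected center. The sketch below is a generic, self-contained illustration of the technique under synthetic data, not code from the patent.

```python
import math

def hough_circle_center(edge_points, radius, height, width):
    # Accumulator array: each edge point votes for every candidate
    # center lying at `radius` from it (sampled every 5 degrees).
    acc = [[0] * width for _ in range(height)]
    for (y, x) in edge_points:
        for k in range(72):
            t = 2 * math.pi * k / 72
            cy = round(y - radius * math.sin(t))
            cx = round(x - radius * math.cos(t))
            if 0 <= cy < height and 0 <= cx < width:
                acc[cy][cx] += 1
    # The cell with the most votes is the detected circle center.
    best = max((acc[r][c], r, c)
               for r in range(height) for c in range(width))
    return best[1], best[2]

# Synthetic "iris" edge: points on a circle of radius 6 around (20, 30).
edges = [(round(20 + 6 * math.sin(a)), round(30 + 6 * math.cos(a)))
         for a in (2 * math.pi * k / 36 for k in range(36))]

print(hough_circle_center(edges, 6, 40, 60))  # ≈ (20, 30)
```

A real eye locator would first extract edge points from the grayscale picture and search over a small range of plausible iris radii rather than a single known one; that outer loop is omitted here for brevity.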
In some embodiments, the division into functional modules such as the detection module 130 and the computing module 140 is only illustrative. In practical applications, the above functions can be distributed to different functional modules as needed, for example to meet the configuration requirements of the corresponding hardware or for the convenience of software implementation; that is, the internal structure of the picture processing device can be divided into different detection modules 130 and computing modules 140 to complete all or part of the functions described above. In practical applications, the corresponding functional modules in this embodiment can be implemented by corresponding hardware, or completed by corresponding hardware executing corresponding software. For example, they can be executed by at least one piece of graphics hardware together with a photo management application; the graphics hardware can be dedicated computing hardware for processing graphics and/or an auxiliary processor for performing computing tasks. In one embodiment, the graphics hardware may include one or more programmable graphics processing units (GPUs).
The control module 150 is used to control the fill-light module 110 to supplement light for the facial contour region generated in the picture, while controlling the photographing module 120 to acquire the image data of the subject. In practical applications, the control module 150 can be implemented by corresponding hardware, or completed by corresponding hardware executing corresponding software; for example, the hardware may be dedicated computing hardware in which software providing the shooting control function and/or an auxiliary processor performs the computing tasks.
In another embodiment, the detection module 130 can also be used to detect the teeth of the subject in the picture and to confirm the position information of the teeth relative to the picture. As shown in Fig. 6A, the color characteristics of the teeth are similar to those of the white of the eye: they are also light in color and easy to recognize even in night scene mode. Therefore, when the photographed subject is smiling broadly while the picture is being shot, the detection module 130 of the present application can further detect the subject's teeth in the picture.
Accordingly, the computing module 140 can further calculate the position information of the subject's facial contour in the picture from the position information of the subject's eyes and teeth identified by the detection module 130. In practical applications, the computing module 140 can use an improved level set method based on the Fisher transform and semi-supervised learning to extract the lip contour. In this way, the calculation precision of the facial contour position can be improved, and the shooting result with it.
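One way to see why the tooth position helps: with eyes alone, the vertical extent of the face must be extrapolated from the inter-ocular distance, whereas an eye-to-mouth distance anchors the forehead and chin directly. The function below is a hedged geometric sketch; all proportions are illustrative assumptions, not values from the patent.

```python
# Refining the face box using both eye positions and the mouth (teeth)
# position. Coordinates are (row, col); proportions are assumed.

def face_box_from_eyes_and_teeth(left_eye, right_eye, mouth):
    (ey1, ex1), (ey2, ex2) = left_eye, right_eye
    (my, mx) = mouth
    eye_y = (ey1 + ey2) / 2
    eye_to_mouth = my - eye_y          # vertical anchor from the teeth
    top = eye_y - eye_to_mouth         # forehead roughly mirrors the mouth
    bottom = my + 0.5 * eye_to_mouth   # chin a bit below the mouth
    half_w = abs(ex2 - ex1)            # width from inter-ocular distance
    cx = (ex1 + ex2) / 2
    return (top, cx - half_w, bottom, cx + half_w)

box = face_box_from_eyes_and_teeth((40, 30), (40, 50), (70, 40))
print(box)  # → (10.0, 20.0, 85.0, 60.0)
```

With only the eyes, the chin row would have to be guessed as some multiple of the 20-pixel inter-ocular distance; here the measured 30-pixel eye-to-mouth distance fixes it, which is the accuracy gain the paragraph above attributes to the combined eye-and-tooth positioning.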
Please refer to Fig. 2, it is shown as the structural representation of the embodiment two of the night scene shoot control system of the application. In the present embodiment, night scene shoot control system 100 also includes editor module 160, and it is used to provide editing interface to use Person edits the eyes position and/or the dental area relative to the position letter in the picture from the picture of the object Breath.In actual applications, when detection module 130 can not detect the eyes position and/or tooth portion of the object in night scene During position either when the eyes position for going out the object detected by detection module 130 and/or not accurate enough dental area, The editing interface that can be provided by editor module 160, so that user is in a manual manner for described right in the picture The eyes position of elephant and/or dental area carry out positioning action or adjustment operation, and follow-up computing module 140 is improved whereby and calculates institute State the accuracy in the face contour region of object.
Specifically, editor module 160 is I/O systems, the inputting interface that the I/O systems are provided for equipment input/ Export the interface between peripheral hardware and Peripheral Interface, input/output peripheral such as touch-screen and other input/control devicess.Specifically Ground, editor module 160 can provide inputting interface include but is not limited to text, image, icon, soft-key button (or " virtual push button "), Drop-down menu, radio button, check box, optional list etc..Accordingly, shown user interface object can include: The nonreciprocal object of user interface outward appearance is constituted for transmission information, be available for users to interactive interactive object or it Combination.
The I/O system includes a touch screen controller and one or more input controllers for other output or control devices. The one or more input controllers receive/send electrical signals from/to the other input or control devices. Of course, in different embodiments, the other input/control devices may also include physical buttons (such as push buttons, rocker buttons, etc.), dials, slider switches, joysticks, and the like.
In some embodiments, the editing module 160 is illustrated here by taking the touch screen of the I/O system as an example. Specifically, the touch screen receives the user's input based on haptic and/or tactile contact, forming a touch-sensitive surface that receives user input. The touch screen and the touch screen controller (together with any associated modules and/or instruction sets in memory) detect contact on the touch screen (and any movement or interruption of that touch), and convert the detected contact into interaction with multimedia file objects (such as picture files or video files) displayed on the touch screen.
In one exemplary embodiment, the contact point between the touch screen and the user corresponds to one or more fingers of the user. The touch screen may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, though other display technologies may be used in other embodiments. The touch screen and the touch screen controller may use any of a variety of touch technologies to detect contact and its movement or interruption, including but not limited to capacitive, resistive, infrared and surface acoustic wave technologies, as well as other proximity sensor arrays or other technologies for determining one or more points of contact with the touch screen. The touch screen displays visual output from the portable device, whereas a touch pad does not provide visual output. The touch screen may have a resolution higher than 100 dpi. In one exemplary embodiment, the touch screen may have a resolution of about 168 dpi. The user may contact the touch screen using any suitable object or accessory, such as a stylus, a finger, and so on.
The night scene shooting control system of the present application first obtains a picture of the object to detect the positional information of the eye positions of the object in the picture, calculates the corresponding region of the face contour of the object in the picture according to the positional information of the eye positions, controls the fill-light device accordingly to provide fill light for the calculated face contour region, and obtains image data of the object through the camera device. The present application can also further detect the position of the tooth position of the object, so as to calculate the regional position of the face contour by combining the positions of the eyes and the teeth, thereby improving the accuracy of locating the face contour (head) of the object. In this way, the present application can apply focused exposure to the face contour position of the object in the captured image, and thus provide a high-quality night scene shooting effect for dark-skinned subjects.
Referring to Fig. 3, which shows a structural schematic diagram of embodiment one of the electronic device of the present application. As illustrated, the electronic device 300 provided in this embodiment mainly includes a processor 310 and a memory 320, wherein the memory 320 stores execution instructions. When the electronic device 300 runs, the processor 310 and the memory 320 communicate with each other, and the processor 310 executes the execution instructions so that the electronic device 300 performs the methods shown in Fig. 1 and Fig. 2.
In some embodiments, the processor is also operatively coupled to I/O ports and an input structure; the I/O ports may enable the electronic device 300 to interact with various other electronic devices, and the input structure may enable a user to interact with the electronic device 300. Accordingly, the input structure may include buttons, a keyboard, a mouse, a trackpad, and the like. In addition, the electronic display may include a touch component, which facilitates user input by detecting the occurrence and/or position of an object touching its screen (for example, the surface of the electronic display).
The processor is operatively coupled with the memory and/or a non-volatile storage device. More specifically, the processor may execute instructions stored in the memory and/or the non-volatile storage device to perform operations in the computing device, such as generating image data and/or transferring image data to the electronic display. As such, the processor may include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), or any combination thereof.
The memory may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. In some embodiments, the memory may also include memory remote from the one or more processors, such as network-attached storage accessed via RF circuitry or an external port and a communication network (not shown), wherein the communication network may be the Internet, one or more intranets, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), etc., or a suitable combination thereof. A memory controller may control access to the memory by other components of the device, such as the CPU and the peripheral interface.
The electronic device of the present application first obtains a picture of the object to detect the positional information of the eye positions of the object in the picture, calculates the corresponding region of the face contour of the object in the picture according to the positional information of the eye positions, controls the fill-light device accordingly to provide fill light for the calculated face contour region, and obtains image data of the object through the camera device.
In addition, the present application can further detect the position of the tooth position of the object, so as to calculate the regional position of the face contour by combining the positions of the eyes and the teeth, thereby improving the accuracy of locating the face contour (head) of the object. In this way, the present application takes advantage of the characteristic that the eyes (the whites of the eyes) and the teeth of a human body remain easy to recognize in poorly lit environments: from the detected positional information of the eye positions (or of the combination of the eyes and the teeth), it calculates the corresponding face region of the object, applies focused exposure to the face position of the object accordingly, and thus improves the shooting effect for the object. It is therefore particularly applicable to night scene shooting of dark-skinned people.
Referring to Fig. 4, which shows a flow chart of embodiment one of the night scene shooting control method of the present application. The night scene shooting control method of the present application is suitable for night scene shooting, and is particularly suitable for providing a good night scene shooting effect for dark-skinned subjects.
The night scene shooting control method of the present application is applied in an electronic device having a fill-light device and a camera device.
In actual embodiments, the electronic device is, for example, but not limited to, a portable electronic device such as a video camera, a handheld computer, a tablet computer, a mobile phone, a smartphone, a media player, or a personal digital assistant (PDA). It should be understood that the portable electronic device described in the embodiments of the present application is only one application example; the device may have more or fewer components than illustrated, or a different component configuration. The various illustrated components may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The electronic device includes a memory, a memory controller, one or more processors (CPUs), a peripheral interface, RF circuitry, audio circuitry, a speaker, a microphone, an input/output (I/O) subsystem, a touch screen, other output or control devices, and an external port. These components communicate through one or more communication buses or signal lines.
The electronic device supports various application programs, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk editing application, a spreadsheet application, a game application, a telephone application, a video conference application, an e-mail application, an instant messaging application, a fitness support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a video player application.
The fill-light device is, for example, the flash of a camera which, when the subject is poorly lit, illuminates the subject with a flash of light to obtain correct exposure; it mainly includes LED flashes and xenon flashes.
A xenon flash system mainly has three parts: a battery for providing power, a gas discharge tube for producing the flash, and a circuit (made up of multiple electronic components) for connecting the power supply and the gas discharge tube. When the circuit is connected across the two poles of the battery, the battery forces electrons to flow through the circuit from one pole of the battery to the other. The moving electrons (i.e., the current) provide energy to each element connected to the circuit. The discharge tube consists of a tube filled with xenon gas, with an electrode at each end and a metal trigger plate in the middle. Its basic principle is to conduct current through the xenon in the tube from one electrode to the other (by means of freely moving electrons). The moving free electrons energize the xenon atoms, causing them to emit photons of light. The circuit converts the low voltage of the battery into a high voltage and applies a very high positive voltage through the metal trigger plate of the discharge tube, thereby ionizing the xenon. The fast-moving electrons collide with the xenon atoms, energizing them so that they produce light, and the xenon lamp thus emits light.
The operating principle of an LED flash is to apply a voltage across the PN junction of the LED; the PN junction itself forms an energy level, and electron transitions across this energy level cause the tube to emit light. Because LED flashes are cheaper, they are more often used for the camera function of most smartphones.
The camera device refers to a camera, including but not limited to single-lens reflex cameras, digital single-lens reflex cameras, twin-lens reflex cameras, medium format cameras, coupled rangefinder cameras, compact cameras, pinhole cameras, tilt-shift cameras, paraxial cameras, instant cameras, panoramic cameras, underwater cameras, and so on.
The camera device is a device that forms an image using the principle of optical imaging and records that image on film. Many devices capable of recording images possess the features of a camera. A camera is an optical instrument used for photography. After the light reflected by the subject is focused by the lens and the exposure is controlled by the shutter, the subject forms a latent image on the photosensitive material inside the camera body, which is then processed (developed and fixed) into a permanent image; this technique is called photography. It is divided into general photography and professional photography.
As illustrated, the night scene shooting control method of the present application includes the following process steps:
Step S401: obtaining a picture of an object through the camera device, wherein the picture at least includes the face of the object; that is, the picture of the object is acquired by the camera device. In practical applications, the picture may be a still photo of the object generated after the camera device shoots the object, or, before shooting, the dynamic image of the object presented in the lens of the camera device. It should be noted that, for the latter embodiment, the positions of the camera device and the object may need to be fixed, to ensure that the object keeps a relatively constant position in the picture.
Step S402: detecting the eye positions of the object in the picture, and confirming the positional information of the eye positions relative to the picture.
Please refer to Fig. 6A. In the embodiments of the present application, because the colour of the whites of the eyes is light, they are easy to recognize even in night scene mode; therefore, this step uses this light-colour characteristic of the whites of the eyes to detect the position of the eye positions in the picture.
In practical applications, the eye positions of the object can be detected from the picture through grayscale processing of the image. Image grayscale processing is the process of transforming a colour image into a grayscale image.
Because the colour of each pixel in a colour image is determined by the three components R, G and B, and each component can take 256 values (0–255), a pixel can have a range of more than 16 million colours (256 × 256 × 256). A grayscale image is a special colour image in which the three components R, G and B are identical, so one pixel has a range of only 256 values. In digital image processing, images in various formats are therefore usually first transformed into grayscale images, to reduce the amount of subsequent image computation and lighten the burden on the system.
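The colour-count arithmetic above can be checked directly (a trivial sketch; each component takes 256 values, 0–255):

```python
# Each of R, G, B takes 256 values (0-255), so a colour pixel can take
# 256 ** 3 distinct colours, while a grey pixel takes only 256 levels.
colour_range = 256 ** 3
grey_range = 256
print(colour_range)  # 16777216, i.e. more than 16 million
print(grey_range)    # 256
```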
The grayscale processing of an image can be realized in two ways. The first method is to take the average of the R, G and B components of each pixel and assign that average to all three components of the pixel. The second method is based on the YUV colour space, in which the physical meaning of the Y component is the brightness of the point, its value reflecting the brightness level. According to the conversion relationship between the RGB and YUV colour spaces, the correspondence between the brightness Y and the three colour components R, G and B is established as Y = 0.3R + 0.59G + 0.11B, and this brightness value is used to express the gray value of the image.
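Both conversion methods described above can be sketched for a single pixel (a minimal illustration; production code would process the whole pixel array at once):

```python
def grayscale_average(pixel):
    """Method 1: assign the average of the R, G, B components."""
    r, g, b = pixel
    return (r + g + b) / 3

def grayscale_luminance(pixel):
    """Method 2: brightness from the YUV colour space,
    Y = 0.3R + 0.59G + 0.11B."""
    r, g, b = pixel
    return 0.3 * r + 0.59 * g + 0.11 * b

print(grayscale_luminance((255, 255, 255)))  # pure white -> 255.0
print(grayscale_luminance((255, 0, 0)))      # pure red   -> roughly 76.5
print(grayscale_average((128, 128, 128)))    # mid grey   -> 128.0
```

Note that the bright whites of the eyes keep a high gray value after either conversion, which is what makes them detectable in a dark scene.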
In a specific embodiment, the step of confirming the position of the eye positions includes confirming the two-dimensional coordinate position information of the eye positions in the picture; that is, determining, by means of a two-dimensional coordinate axis, the two-dimensional coordinate position information of the eye positions in the picture.
In a preferred embodiment, this step also includes providing an editing interface, so that the user can determine, from the picture of the object, the positional information of the eye positions and/or the tooth position relative to the picture. In practical applications, when this detecting step cannot detect the eye positions and/or the tooth position of the object in a night scene, or when the eye positions and/or tooth position detected by this step are not accurate enough, the editing interface can be provided so that the user can manually position or adjust the eye positions and/or the tooth position of the object in the picture, thereby improving the accuracy of the face contour region of the object calculated in the subsequent calculating step.
Specifically, this editing step can be realized by an I/O system; the editing interface provided by the I/O system is the interface between the input/output peripherals of the device and the peripheral interface, the input/output peripherals being, for example, a touch screen and other input/control devices. Specifically, the editing interfaces this step provides include, but are not limited to, text, images, icons, soft keys (or "virtual buttons"), drop-down menus, radio buttons, check boxes, selectable lists, and the like. Accordingly, the displayed user interface objects may include: non-interactive objects that convey information or form the appearance of the user interface, interactive objects available for user interaction, or a combination thereof.
The I/O system includes a touch screen controller and one or more input controllers for other output or control devices. The one or more input controllers receive/send electrical signals from/to the other input or control devices. Of course, in different embodiments, the other input/control devices may also include physical buttons (such as push buttons, rocker buttons, etc.), dials, slider switches, joysticks, and the like.
In some embodiments, this step is illustrated here by taking the touch screen of the I/O system as an example. Specifically, the touch screen receives the user's input based on haptic and/or tactile contact, forming a touch-sensitive surface that receives user input. The touch screen and the touch screen controller (together with any associated modules and/or instruction sets in memory) detect contact on the touch screen (and any movement or interruption of that touch), and convert the detected contact into interaction with multimedia file objects (such as picture files or video files) displayed on the touch screen.
In one exemplary embodiment, the contact point between the touch screen and the user corresponds to one or more fingers of the user. The touch screen may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, though other display technologies may be used in other embodiments. The touch screen and the touch screen controller may use any of a variety of touch technologies to detect contact and its movement or interruption, including but not limited to capacitive, resistive, infrared and surface acoustic wave technologies, as well as other proximity sensor arrays or other technologies for determining one or more points of contact with the touch screen. The touch screen displays visual output from the portable device, whereas a touch pad does not provide visual output. The touch screen may have a resolution higher than 100 dpi. In one exemplary embodiment, the touch screen may have a resolution of about 168 dpi. The user may contact the touch screen using any suitable object or accessory, such as a stylus, a finger, and so on.
Step S403: calculating, according to the positional information of the eye positions in the picture, the positional information of the face contour of the object in the picture, and generating accordingly the corresponding face contour region in the picture.
In concrete applications, this step calculates, according to the two-dimensional coordinate position information of the eye positions in the picture, the positional information in the picture of multiple boundary points of the face contour of the object (for example, by compensating the X and Y values), and generates accordingly the corresponding face contour region in the picture.
Considering that the eyes are prominent features of the face, that they occupy a relatively fixed position in the face, and that the distance between the eyes characterizes the size of the face and is the basis for scale normalization in face recognition, most face recognition algorithms rely on the positioning of the eyes. That is, as long as the positions of the eyes are accurately located, other features of the face, such as the eyebrows, nose and mouth, can be located more accurately through their potential distribution relationships.
In certain embodiments, this step may use positioning algorithms such as the Hough transform method, the deformable template method, edge feature analysis, or the stacking transform method, so as to calculate, according to the two-dimensional coordinate position information of the eye positions in the picture, the positional information in the picture of multiple boundary points of the face contour of the object, and generate accordingly the corresponding face contour region in the picture (i.e., the region shown in the box in Fig. 6B).
In addition, the boundary points of the face contour may be, for example, the positions of the chin, the forehead and the two ears of the object, and a frame as shown in Fig. 6B is generated by connecting these boundary points; the face region of the object is located within the frame shown in Fig. 6B. It should be noted that the face contour region formed by the present application is not limited to the rectangular frame shown in Fig. 6B; specifically, the more boundary points are calculated, the more accurate the formed face contour region, and the shape of the generated face contour region depends on the number of boundary points.
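A minimal sketch of step S403's boundary-point calculation: the inter-eye distance is used to scale a rectangular contour region around the face by compensating the X and Y values. The offset factors below are illustrative assumptions, not values taken from the patent.

```python
def face_contour_region(left_eye, right_eye):
    """Estimate a rectangular face contour region from the two eye
    coordinates. The 1.5/1.0 offset factors are illustrative only."""
    (lx, ly), (rx, ry) = left_eye, right_eye
    d = abs(rx - lx)                       # inter-eye distance sets the face scale
    cx, cy = (lx + rx) / 2, (ly + ry) / 2  # midpoint between the eyes
    # Boundary offsets: ears to the sides, forehead above, chin below.
    x0, x1 = cx - 1.5 * d, cx + 1.5 * d
    y0, y1 = cy - 1.0 * d, cy + 1.5 * d
    return (x0, y0, x1, y1)

print(face_contour_region((100, 120), (160, 120)))
# eyes 60 px apart -> box (40.0, 60.0, 220.0, 210.0)
```

With more boundary points (chin, forehead, ears located individually), the same idea yields a polygonal rather than rectangular region, as the text notes.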
Step S404: controlling the fill-light device to provide fill light for the face contour region, and obtaining image data of the object through the camera device.
In practical applications, this step can be realized by corresponding hardware, or completed by corresponding software executed on corresponding hardware; for example, the hardware may be dedicated computing hardware that performs computing tasks through software providing the shooting control function and/or a secondary processor.
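The flow of steps S401–S404 can be sketched as follows; the camera, fill-light and detection objects below are hypothetical stand-ins for the hardware and software described above, not the patent's actual interfaces:

```python
def night_scene_capture(camera, fill_light, detect_eyes, compute_region):
    picture = camera.preview()        # S401: obtain a picture of the object
    eyes = detect_eyes(picture)       # S402: locate the eye positions
    region = compute_region(eyes)     # S403: derive the face contour region
    fill_light.illuminate(region)     # S404: fill light for that region ...
    return camera.capture()           # ... then acquire the image data

# Stub hardware for illustration.
class StubCamera:
    def preview(self):
        return "picture"
    def capture(self):
        return "image-data"

class StubLight:
    def __init__(self):
        self.lit = None
    def illuminate(self, region):
        self.lit = region

light = StubLight()
result = night_scene_capture(
    StubCamera(), light,
    detect_eyes=lambda picture: ((100, 120), (160, 120)),
    compute_region=lambda eyes: (40, 60, 220, 210))
print(result, light.lit)  # image-data (40, 60, 220, 210)
```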
Fig. 5 shows a structural schematic diagram of embodiment two of the night scene shooting control method of the present application. In this embodiment, the method further includes the following steps:
Step S501: detecting the tooth position of the object in the picture, and confirming the positional information of the tooth position relative to the picture.
As shown in Fig. 6A, because the tooth position is similar in colour characteristics to the whites of the eyes, being likewise light in colour, it is easy to recognize even in night scene mode. Therefore, when the subject is smiling with teeth showing while the picture is taken, the method of the present application can also further detect the tooth position of the object in the picture.
Step S502: calculating, according to the positional information of the eye positions and the tooth position in the picture, the positional information of the face contour of the object in the picture.
In practical applications, this step can perform lip contour extraction using an improved level set method based on the Fisher transform and on semi-supervised learning, and accordingly, by combining the positional information of the eye positions and the mouth position in the picture, calculate the corresponding region of the face contour of the object in the picture. In this way, the method of the present application can further improve the accuracy of the calculated face contour position, so as to improve the shooting effect.
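The improved level set method itself is beyond a short example, but the benefit of combining the eye and tooth positions can be sketched with a simplified bounding-region calculation; once the mouth is located, the vertical extent of the face no longer has to be guessed from the inter-eye distance alone. The margin factors below are illustrative assumptions.

```python
def face_region_from_features(left_eye, right_eye, mouth):
    """Combine eye and tooth/mouth positions into a face region.
    The 0.5/0.8 margin factors are illustrative only."""
    xs = [left_eye[0], right_eye[0], mouth[0]]
    ys = [left_eye[1], right_eye[1], mouth[1]]
    d = abs(right_eye[0] - left_eye[0])              # inter-eye distance
    eye_to_mouth = mouth[1] - min(left_eye[1], right_eye[1])
    x0, x1 = min(xs) - 0.5 * d, max(xs) + 0.5 * d    # ear margins
    y0 = min(ys) - 0.8 * eye_to_mouth                # forehead margin
    y1 = max(ys) + 0.5 * eye_to_mouth                # chin margin
    return (x0, y0, x1, y1)

print(face_region_from_features((100, 120), (160, 120), (130, 170)))
# -> (70.0, 80.0, 190.0, 195.0)
```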
In summary, the night scene shooting control method of the present application mainly first obtains a picture of the object to detect the positional information of the eye positions of the object in the picture, calculates the corresponding region of the face contour of the object in the picture according to the positional information of the eye positions, controls the fill-light device accordingly to provide fill light for the calculated face contour region, and obtains image data of the object through the camera device.
In addition, the present application can further detect the position of the tooth position of the object, so as to calculate the regional position of the face contour by combining the positions of the eyes and the teeth, thereby improving the accuracy of locating the face contour (head) of the object. In this way, the present application takes advantage of the characteristic that the eyes (the whites of the eyes) and the teeth of a human body remain easy to recognize in poorly lit environments: from the detected positional information of the eye positions (or of the combination of the eyes and the teeth), it calculates the corresponding face region of the object, applies focused exposure to the face position of the object accordingly, and thus improves the shooting effect for the object. It is therefore particularly applicable to night scene shooting of dark-skinned people.
The above embodiments only exemplarily illustrate the principles and effects of the present application, and are not intended to limit it. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the present application. Therefore, all equivalent modifications or changes completed by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein should be covered by the claims of the present application.

Claims (17)

1. A night scene shooting control method, applied in an electronic device having a fill-light device and a camera device, characterized by comprising:
obtaining a picture of an object, wherein the picture at least includes the face of the object;
detecting the eye positions of the object in the picture, and confirming the positional information of the eye positions relative to the picture;
calculating, according to the positional information of the eye positions in the picture, the positional information of the face contour of the object in the picture, and generating accordingly a corresponding face contour region in the picture; and
controlling the fill-light device to provide fill light for the face contour region, and obtaining image data of the object through the camera device.
2. The night scene shooting control method according to claim 1, characterized in that the step of confirming the position of the eye positions comprises: confirming the two-dimensional coordinate position information of the eye positions in the picture.
3. The night scene shooting control method according to claim 2, characterized by further comprising the step of calculating, according to the two-dimensional coordinate position information of the eye positions in the picture, the positional information in the picture of multiple boundary points of the face contour of the object, and generating accordingly a corresponding face contour region in the picture.
4. The night scene shooting control method according to claim 1, characterized by further comprising the step of detecting the tooth position of the object in the picture, and confirming the positional information of the tooth position relative to the picture.
5. The night scene shooting control method according to claim 4, characterized by further comprising the step of calculating, according to the positional information of the eye positions and the tooth position in the picture, the positional information of the face contour of the object in the picture.
6. The night scene shooting control method according to claim 4, characterized by further comprising the step of providing an editing interface for a user to determine, from the picture of the object, the positional information of the eye positions and/or the tooth position relative to the picture.
7. The night scene shooting control method according to claim 1, characterized in that the electronic device is any one of a smartphone, a video camera, a handheld computer and a tablet computer.
8. The night scene shooting control method according to claim 1, characterized in that the fill-light device is a flash.
9. A night scene shooting control system, characterized by comprising:
a fill-light module, for providing fill light;
a photographing module, for obtaining a picture of an object, wherein the picture at least includes the face of the object, and for generating image data of the object according to the acquired picture of the object;
a detection module, for detecting the eye positions of the object from the picture, and confirming the positional information of the eye positions in the picture;
a computing module, for calculating, according to the positional information of the eye positions in the picture, the positional information of the face contour of the object in the picture, and generating accordingly a corresponding face contour region in the picture; and
a control module, for controlling the fill-light module to provide fill light for the face contour region generated in the picture, while controlling the photographing module to obtain the image data of the object.
10. The night scene shooting control system according to claim 9, characterized in that the detection module is used to confirm the two-dimensional coordinate position information of the eye positions in the picture.
11. The night scene shooting control system according to claim 10, characterized in that the computing module is used to calculate, according to the two-dimensional coordinate position information of the eye positions in the picture, the positional information in the picture of multiple boundary points of the face contour of the object, and to generate accordingly a corresponding face contour region in the picture.
12. The night scene shooting control system according to claim 9, characterized in that the detection module also detects the tooth position of the object in the picture, and confirms the positional information of the tooth position relative to the picture.
13. The night scene shooting control system according to claim 12, characterized in that the computing module also calculates, according to the positional information of the eye positions and the tooth position in the picture, the positional information of the face contour of the object in the picture.
14. The night scene shooting control system according to claim 12, characterized by further comprising an editing module, for providing an editing interface so that a user can edit, from the picture of the object, the positional information of the eye positions and/or the tooth position relative to the picture.
15. The night scene shooting control system according to claim 9, characterized in that the system is loaded and runs on an electronic device, and the electronic device is any one of a smartphone, a video camera, a handheld computer and a tablet computer.
16. The night scene shooting control system according to claim 9, characterized in that the object is an object with dark skin.
17. An electronic device, characterized by comprising:
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and execution of the instructions by the one or more processors causes the electronic device to perform the night scene shooting control method according to any one of claims 1-8.
CN201710465036.0A 2017-06-19 2017-06-19 Night scene shooting control method, system and equipment Active CN107155063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710465036.0A CN107155063B (en) 2017-06-19 2017-06-19 Night scene shooting control method, system and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710465036.0A CN107155063B (en) 2017-06-19 2017-06-19 Night scene shooting control method, system and equipment

Publications (2)

Publication Number Publication Date
CN107155063A true CN107155063A (en) 2017-09-12
CN107155063B CN107155063B (en) 2020-10-20

Family

ID=59796333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710465036.0A Active CN107155063B (en) 2017-06-19 2017-06-19 Night scene shooting control method, system and equipment

Country Status (1)

Country Link
CN (1) CN107155063B (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1514397A (en) * 2002-12-31 2004-07-21 佳能株式会社 Human ege detecting method, apparatus, system and storage medium
CN1839410A (en) * 2003-07-18 2006-09-27 佳能株式会社 Image processor, imaging apparatus and image processing method
US20150036040A1 (en) * 2013-07-31 2015-02-05 Canon Kabushiki Kaisha Exposure control apparatus and method, storage medium, and image pickup apparatus
CN104349068A (en) * 2013-08-08 2015-02-11 联想(北京)有限公司 Shooting method and electronic equipment
CN104601870A (en) * 2015-02-15 2015-05-06 广东欧珀移动通信有限公司 Rotating camera shooting method and mobile terminal
CN105096267A (en) * 2015-07-02 2015-11-25 广东欧珀移动通信有限公司 Method and device for adjusting brightness of eye portion based on shooting identification
CN105163038A (en) * 2015-09-02 2015-12-16 移康智能科技(上海)有限公司 Linked light-supplementing method and device thereof
CN105554391A (en) * 2015-12-31 2016-05-04 广州广电运通金融电子股份有限公司 Camera control method and device and financial equipment terminal
CN105657289A (en) * 2016-03-28 2016-06-08 广东欧珀移动通信有限公司 Control method and device and electronic device
US20160381281A1 (en) * 2015-06-24 2016-12-29 Canon Kabushiki Kaisha Image capture control apparatus and control method of the same
CN106303266A (en) * 2015-05-19 2017-01-04 小米科技有限责任公司 The control method of flash lamp, device and terminal
CN106291467A (en) * 2015-06-09 2017-01-04 中兴通讯股份有限公司 A kind of method realizing shooting and terminal
CN106331505A (en) * 2016-09-30 2017-01-11 西安易朴通讯技术有限公司 Shooting method and device
CN106446873A (en) * 2016-11-03 2017-02-22 北京旷视科技有限公司 Face detection method and device
CN106657798A (en) * 2017-02-28 2017-05-10 上海传英信息技术有限公司 Photographing method for intelligent terminal
CN106682620A (en) * 2016-12-28 2017-05-17 北京旷视科技有限公司 Human face image acquisition method and device
CN106713780A (en) * 2017-01-16 2017-05-24 维沃移动通信有限公司 Control method for flash lamp and mobile terminal
CN106791451A (en) * 2017-02-28 2017-05-31 上海传英信息技术有限公司 A kind of photographic method of intelligent terminal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Zhiqun: "Research on Image Enhancement Algorithms for Night Photography", China Masters' Theses Full-text Database *
BO Runfang: "Face Detection and Eye Location Based on Skin Color Information", China Masters' Theses Full-text Database *

Also Published As

Publication number Publication date
CN107155063B (en) 2020-10-20

Similar Documents

Publication Publication Date Title
US20200335136A1 (en) Method and device for processing video
US11030733B2 (en) Method, electronic device and storage medium for processing image
CN104580922B (en) A kind of control method and device for shooting light filling
WO2022179025A1 (en) Image processing method and apparatus, electronic device, and storage medium
CN107392933B (en) Image segmentation method and mobile terminal
WO2022001806A1 (en) Image transformation method and apparatus
WO2019011091A1 (en) Photographing reminding method and device, terminal and computer storage medium
CN112614057A (en) Image blurring processing method and electronic equipment
EP4072121A1 (en) Photographing method and apparatus, storage medium, and electronic device
CN112738420B (en) Special effect implementation method, device, electronic equipment and storage medium
CN108370417A (en) The mobile terminal and screening-mode conversion method of screening-mode can be converted easily
US20140354784A1 (en) Shooting method for three dimensional modeling and electronic device supporting the same
US11284020B2 (en) Apparatus and method for displaying graphic elements according to object
CN112581358A (en) Training method of image processing model, image processing method and device
US20230224574A1 (en) Photographing method and apparatus
CN108494996A (en) Image processing method, device, storage medium and mobile terminal
JP2015126326A (en) Electronic apparatus and image processing method
CN107977636B (en) Face detection method and device, terminal and storage medium
WO2021185374A1 (en) Image capturing method and electronic device
US9536133B2 (en) Display apparatus and control method for adjusting the eyes of a photographed user
CN115150542A (en) Video anti-shake method and related equipment
CN107155063A (en) Night scene filming control method, system and equipment
CN105447829B (en) Image processing method and device
US20240046560A1 (en) Three-Dimensional Model Reconstruction Method, Device, and Storage Medium
CN114979458A (en) Image shooting method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant