CN105898159A - Image processing method and terminal - Google Patents

Image processing method and terminal Download PDF

Info

Publication number
CN105898159A
CN105898159A CN201610375523.3A
Authority
CN
China
Prior art keywords
image data
image
frame
pixel
reference frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610375523.3A
Other languages
Chinese (zh)
Other versions
CN105898159B (en
Inventor
戴向东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201610375523.3A priority Critical patent/CN105898159B/en
Publication of CN105898159A publication Critical patent/CN105898159A/en
Priority to PCT/CN2017/082941 priority patent/WO2017206656A1/en
Application granted granted Critical
Publication of CN105898159B publication Critical patent/CN105898159B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The invention discloses an image processing method and a terminal. The image processing method comprises the following steps: obtaining an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data; determining a reference frame from the image data stream, and registering each frame of image data in the stream other than the reference frame against the reference frame; and, when fusing the registered frames, replacing the pixels of the black regions at the image borders with the corresponding reference-frame pixels before fusion, thereby obtaining a target image.

Description

Image processing method and terminal
Technical field
The present invention relates to image processing techniques in the field of photography, and in particular to an image processing method and a terminal.
Background technology
The camera is one of the most commonly used functions of a mobile terminal, and the camera of some mobile terminals offers an electronic aperture mode. In this mode, after the user sets an f-number, the terminal shoots continuously and without interruption for the duration of the exposure, and then superimposes the captured images with transparency applied. The overall result is essentially the same as that of a "slow shutter", the main feature being very long exposures, up to bulb ("B") level. The algorithm used by electronic aperture mode does not overexpose, and the overall result looks natural.
When shooting in electronic aperture mode, multiple images are captured continuously and then fused, so every image must be aligned; that is, before image fusion can be performed, image alignment is required. Alignment, however, produces black borders at the image boundaries; if the pixels in these black borders are fused by weighted averaging, brightness differences appear that degrade the overall appearance of the image.
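As a minimal numpy sketch of the problem just described (assuming the misalignment is a pure integer translation and the exposed border is zero-filled; the helper and values are illustrative, not part of the patent):

```python
import numpy as np

def shift_image(img, dy, dx):
    """Shift a 2-D image by (dy, dx), filling the exposed border with zeros (black)."""
    out = np.zeros_like(img)
    h, w = img.shape
    out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        img[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return out

# A uniform gray frame, displaced the way registration would displace it
frame = np.full((6, 6), 100.0)
aligned = shift_image(frame, 1, 2)   # 1 px down, 2 px right -> black top/left border

# Naive weighted average of reference and aligned frame: the border darkens
fused = (frame + aligned) / 2.0
print(fused[0, 0])   # border pixel: (100 + 0) / 2 = 50.0 -> visible dark band
print(fused[3, 3])   # interior pixel: 100.0
```

The brightness step from 100 to 50 along the border is exactly the artifact the patent's pixel-replacement step is meant to remove.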
Summary of the invention
To solve the above technical problem, embodiments of the present invention provide an image processing method and a terminal.
The terminal provided by an embodiment of the present invention includes:
an acquiring unit, configured to obtain an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data;
a registration unit, configured to determine a reference frame from the image data stream and to register each frame of image data in the stream other than the reference frame against the reference frame;
a fusion unit, configured to, when fusing the registered frames, replace the pixels of the black region at each image border with the corresponding reference-frame pixels before fusion, to obtain a target image.
In an embodiment of the present invention, the fusion unit includes:
an analysis subunit, configured to analyze each registered frame and determine the black region at the border of each frame;
a replacement-and-fusion subunit, configured to, when fusing the registered frames, replace the pixels of the black region at each image border with the corresponding reference-frame pixels before fusion, to obtain the target image.
In an embodiment of the present invention, the replacement-and-fusion subunit is further configured to determine, according to the location of the black region at the image border, the region in the reference frame corresponding to the black region, and to replace the black-region pixels with the pixels of that reference region before fusion, to obtain the target image.
In an embodiment of the present invention, the registration unit is further configured to take the first frame of the image data stream as the reference frame and to align each frame other than the first frame with the reference frame; here, alignment means aligning the pixels at the same spatial location.
In an embodiment of the present invention, the fusion unit is further configured to superimpose the pixels of each registered frame by corresponding spatial location.
The image processing method provided by an embodiment of the present invention includes:
obtaining an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data;
determining a reference frame from the image data stream, and registering each frame of image data in the stream other than the reference frame against the reference frame;
when fusing the registered frames, replacing the pixels of the black region at each image border with the corresponding reference-frame pixels before fusion, to obtain a target image.
In an embodiment of the present invention, the step of replacing the black-border pixels with the corresponding reference-frame pixels during fusion to obtain the target image includes:
analyzing each registered frame to determine the black region at the border of each frame;
when fusing the registered frames, replacing the pixels of the black region at each image border with the corresponding reference-frame pixels before fusion, to obtain the target image.
In an embodiment of the present invention, replacing the black-region pixels with the corresponding reference-frame pixels and then fusing, to obtain the target image, includes:
determining, according to the location of the black region at the image border, the reference region in the reference frame corresponding to the black region;
replacing the pixels of the black region at the image border with the pixels of that reference region, and then performing fusion, to obtain the target image.
In an embodiment of the present invention, determining the reference frame from the image data stream and registering the other frames against it includes:
taking the first frame of the image data stream as the reference frame, and aligning each frame other than the first frame with the reference frame;
here, alignment means aligning the pixels at the same spatial location.
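The registration step above (the drawings mention optical-flow matching) can be approximated, for illustration only and under the strong assumption that the hand-shake between frames is a pure integer translation, with FFT-based phase correlation; `estimate_shift` below is a hypothetical stand-in, not the patented alignment method:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) translation that maps `img` onto `ref`
    via phase correlation: the normalized cross-power spectrum of the two
    frames has an inverse FFT that peaks at the relative displacement."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    cross /= np.abs(cross) + 1e-9          # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:                         # unwrap cyclic peak to signed shift
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

rng = np.random.default_rng(0)
ref = rng.random((64, 64))                  # reference frame (first frame)
moved = np.roll(ref, shift=(3, -5), axis=(0, 1))  # frame displaced by hand shake
print(estimate_shift(ref, moved))           # → (-3, 5): shift that re-aligns it
```

Applying the returned shift (here with `np.roll`) puts the pixels of the same spatial location back on top of each other, which is exactly what the patent means by alignment; a real implementation would also handle rotation and non-cyclic borders.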
In an embodiment of the present invention, fusing the registered frames includes:
superimposing the pixels of each registered frame by corresponding spatial location.
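The steps above can be combined into one illustrative numpy sketch (assuming the frames are already registered, and assuming the black region can be detected as exactly-zero pixels, which is a simplification of the patent's analysis step):

```python
import numpy as np

def fuse(reference, frames):
    """Fuse registered frames per spatial location, after replacing each
    frame's black border (zeros exposed by alignment) with the pixels of
    the corresponding region of the reference frame."""
    patched = []
    for f in frames:
        black = (f == 0)                          # border region left by registration
        patched.append(np.where(black, reference, f))
    stack = np.stack([reference] + patched)
    return stack.mean(axis=0)                     # superimpose by position

ref = np.full((4, 4), 80.0)                       # uniform reference frame
shifted = ref.copy()
shifted[0, :] = 0.0                               # top row went black after alignment

target = fuse(ref, [shifted])
print(target[0, 0])   # 80.0 -> no dark band at the border
print(target[2, 2])   # 80.0
```

Without the `np.where` replacement, the border row would average to 40.0 instead of 80.0, reproducing the brightness difference described in the background section.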
In the technical solution of the embodiments of the present invention, the terminal, held in the hand, uses an electronic aperture to obtain an image data stream comprising multiple frames of image data; a reference frame is determined from the stream and each frame other than the reference frame is registered against it; and, when fusing the registered frames, the pixels of the black regions at the image borders are replaced with the corresponding reference-frame pixels before fusion, yielding a target image. With this technical solution, the user can shoot with the electronic aperture while holding the terminal, improving shooting convenience while avoiding the blur that hand-held shooting would otherwise cause, thereby ensuring the shooting effect and improving the user's shooting experience. During image fusion, the black borders caused by image registration are repaired, ensuring that pixel transitions are natural everywhere in the image.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing embodiments of the present invention;
Fig. 2 is a schematic diagram of the wireless communication system of the mobile terminal shown in Fig. 1;
Fig. 3 is a flow diagram of the image processing method of embodiment one of the present invention;
Fig. 4 is a schematic diagram of optical-flow pixel matching;
Fig. 5 is a schematic diagram of a simplified mobile-phone motion model;
Fig. 6 is a flowchart of aligning multiple images using optical flow according to an embodiment of the present invention;
Fig. 7 is a flow diagram of the image processing method of embodiment two of the present invention;
Fig. 8 is a schematic diagram of the prompt-box interface of an embodiment of the present invention;
Fig. 9 is a schematic diagram of image fusion according to an embodiment of the present invention;
Fig. 10 is a schematic diagram of the structure of the terminal of embodiment one of the present invention;
Fig. 11 is a schematic diagram of the structure of the terminal of embodiment two of the present invention;
Fig. 12 is a block diagram of the electrical structure of a camera.
Detailed description of the invention
The mobile terminal implementing embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part", or "unit" are used only to aid the description of the embodiments and carry no special meaning of their own; "module" and "part" may therefore be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminals described in embodiments of the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (PADs), portable media players (PMPs), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a mobile terminal; however, those skilled in the art will appreciate that, apart from elements specific to mobile use, the structures according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a position information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it is received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG, Electronic Program Guide) of Digital Multimedia Broadcasting (DMB, Digital Multimedia Broadcasting) or an electronic service guide (ESG, Electronic Service Guide) of Digital Video Broadcasting-Handheld (DVB-H, Digital Video Broadcasting-Handheld). The broadcast receiving module 111 can receive signals using various types of broadcast systems; in particular, it can receive digital broadcasts using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T, Digital Multimedia Broadcasting-Terrestrial), Digital Multimedia Broadcasting-Satellite (DMB-S, Digital Multimedia Broadcasting-Satellite), DVB-H, the MediaFLO (Media Forward Link Only) data broadcasting system, and Integrated Services Digital Broadcasting-Terrestrial (ISDB-T, Integrated Services Digital Broadcasting-Terrestrial). The broadcast receiving module 111 may be constructed to be adapted to the various broadcast systems providing broadcast signals as well as the above digital broadcast systems. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received as text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include wireless local area network (Wi-Fi, WLAN, Wireless Local Area Networks), wireless broadband (Wibro), worldwide interoperability for microwave access (Wimax), high-speed downlink packet access (HSDPA, High Speed Downlink Packet Access), and the like.
The short-range communication module 114 is a module that supports short-range communication. Examples of short-range communication technologies include Bluetooth, radio-frequency identification (RFID, Radio Frequency Identification), Infrared Data Association (IrDA, Infrared Data Association), ultra-wideband (UWB, Ultra Wideband), ZigBee, and so on.
The position information module 115 is a module for checking or obtaining the position information of the mobile terminal. A typical example of the position information module is the Global Positioning System (GPS, Global Positioning System). Using current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, and applies triangulation to the calculated information, so as to accurately compute three-dimensional current position information from longitude, latitude, and altitude. Currently, three satellites are used in the method of calculating position and time information, and the error of the calculated position and time information is corrected using one additional satellite. Moreover, the GPS module 115 can calculate velocity information by continuously computing the current position information in real time.
The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or image capture mode. The processed image frames may be displayed on a display unit 151, stored in the memory 160 (or another storage medium), or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided depending on the structure of the mobile terminal. The microphone 122 receives sound (audio data) via the microphone in operating modes such as a phone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference produced while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user, to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by being touched), a jog wheel, a jog switch, and so on. In particular, when the touch pad is superposed on the display unit 151 as a layer, a touch screen can be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration and direction of movement of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.
The interface unit 170 serves as an interface through which at least one external device can connect to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying the user's use of the mobile terminal 100 and may include a user identity module (UIM, User Identify Module), a subscriber identity module (SIM, Subscriber Identity Module), a universal subscriber identity module (USIM, Universal Subscriber Identity Module), and the like. In addition, a device having an identification module (hereinafter, "identification device") may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connecting means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or may be used to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or may serve as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or power input from the cradle may serve as a signal for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals in a visual, audible, and/or tactile manner (e.g., audio signals, video signals, alarm signals, vibration signals, etc.). The output unit 150 may include the display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a user interface (UI, User Interface) or graphical user interface (GUI, Graphical User Interface) related to the call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display captured and/or received images, a UI or GUI showing the video or image and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superposed on each other as layers to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD, Liquid Crystal Display), a thin-film-transistor LCD (TFT-LCD, Thin Film Transistor-LCD), an organic light-emitting diode (OLED, Organic Light-Emitting Diode) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be constructed to be transparent to allow viewing from the outside; these may be called transparent displays, a typical transparent display being, for example, a transparent OLED (TOLED) display. Depending on the particular embodiment desired, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in a mode such as a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, or a broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 may provide audio output related to a particular function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.
The alarm unit 153 may provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify of the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to notify the user. With such tactile output, the user can recognize the occurrence of various events even when the user's phone is in the user's pocket. The alarm unit 153 may also provide output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations performed by the controller 180, or may temporarily store data that has been output or will be output (e.g., phone books, messages, still images, video, etc.). Moreover, the memory 160 may store data on the vibration and audio signals of various patterns output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), random access memory (RAM, Random Access Memory), static random access memory (SRAM, Static Random Access Memory), read-only memory (ROM, Read Only Memory), electrically erasable programmable read-only memory (EEPROM, Electrically Erasable Programmable Read Only Memory), programmable read-only memory (PROM, Programmable Read Only Memory), magnetic memory, a magnetic disk, an optical disk, and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate each element and component.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC, Application Specific Integrated Circuit), a digital signal processor (DSP, Digital Signal Processing), a digital signal processing device (DSPD, Digital Signal Processing Device), a programmable logic device (PLD, Programmable Logic Device), a field-programmable gate array (FPGA, Field Programmable Gate Array), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. The software code may be implemented by a software application (or program) written in any suitable programming language, and the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. In the following, for the sake of brevity, a slide-type mobile terminal is taken as an example among various types of mobile terminals such as the folder-type, bar-type, swing-type, and slide-type. The present invention can therefore be applied to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 shown in Fig. 1 may be constructed to operate with communication systems, including wired and wireless communication systems as well as satellite-based communication systems, that transmit data via frames or packets.
Referring now to Fig. 2, a communication system in which a mobile terminal according to embodiments of the present invention can operate is described.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include, for example, frequency division multiple access (FDMA, Frequency Division Multiple Access), time division multiple access (TDMA, Time Division Multiple Access), code division multiple access (CDMA, Code Division Multiple Access), the Universal Mobile Telecommunications System (UMTS, Universal Mobile Telecommunications System) (in particular, Long Term Evolution (LTE, Long Term Evolution)), the Global System for Mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teaching applies equally to other types of systems.
Referring to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS, Base Station) 270, base station controllers (BSC, Base Station Controller) 275, and a mobile switching center (MSC, Mobile Switching Center) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN, Public Switched Telephone Network) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that the system as shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or by an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support a plurality of frequency assignments, each frequency assignment having a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS, Base Transceiver Station) or by other equivalent terms. In such a case, the term "base station" may be used to broadly denote a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT, Broadcast Transmitter) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. Fig. 2 also shows several satellites 300; for example, global positioning system (GPS) satellites 300 may be used. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
Although a plurality of satellites 300 are depicted in Fig. 2, it will be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking techniques, other techniques capable of tracking the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmission.
As one typical operation of the wireless communication system, the BSs 270 receive reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse link signal received by a particular base station 270 is processed within that particular BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of soft handover procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
In the mobile terminal, the mobile communication module 112 of the wireless communication unit 110 accesses the mobile communication network (e.g., a 2G/3G/4G mobile communication network) based on the data necessary for network access that is built into the mobile terminal (including user identification information and authentication information), so as to transmit mobile communication data (including uplink mobile communication data and downlink mobile communication data) for services of the mobile terminal user such as web browsing and network multimedia playback.
The wireless internet module 113 of the wireless communication unit 110 implements the function of a wireless hotspot by running the hotspot's related protocol functions. The hotspot supports access by a plurality of mobile terminals (any mobile terminal other than this one) and, by multiplexing the mobile communication connection between the mobile communication module 112 and the mobile communication network, transmits mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playback of the mobile terminal user. Since the mobile terminal essentially multiplexes its own mobile communication connection with the communication network to transmit this data, the mobile communication traffic consumed is counted toward the mobile terminal's communication tariff by the charging entity on the network side, thereby consuming the data traffic included in the tariff plan the mobile terminal has subscribed to.
Based on the above hardware structure of the mobile terminal 100 and the communication system, the various embodiments of the method of the present invention are proposed.
Fig. 3 is a schematic flowchart of the image processing method of Embodiment 1 of the present invention. The image processing method in this example is applied to a terminal. As shown in Fig. 3, the image processing method comprises the following steps:
Step 301: utilizing electric aperture to obtain image data stream, described image data stream includes multiple image Data.
In the embodiment of the present invention, the terminal may be an electronic device such as a mobile phone or a tablet computer. The terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing with the electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
In the electronic aperture mode, after the user adjusts the aperture value, the terminal can shoot continuously and without interruption during the exposure time, and then perform fusion processing on the multiple captured images. When photographing with the electronic aperture, since multiple images need to be captured continuously and fused, each image must be kept aligned. In order to ensure the ease of use and user experience of the electronic aperture, the embodiment of the present invention adds a hand-held mode to electronic aperture shooting; in the hand-held mode, the user can conveniently shoot with the electronic aperture while holding the terminal.
When shooting, an image data stream is first acquired, the image data stream including multiple frames of image data. Specifically, the raw image data stream of the shot is first acquired synchronously; then data reading and image preprocessing are performed on the raw image data stream to obtain the image data stream. Here, due to the image signal processing (ISP, Image Signal Processing) pipeline of the camera and unpredictable changes in the external environment, the individual frames of raw image data in the synchronously acquired raw image data stream will differ in illumination, noise, sharpness, and focus. Before fusion processing, each frame of raw image data in the raw image data stream needs to be preprocessed by the necessary preprocessing steps. Here, the preprocessing includes: image filtering to eliminate noise, and contrast stretching to improve the sharpness of the image and the illumination differences between images. After such preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effectiveness of the subsequent image registration algorithm.
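As a rough illustration of the preprocessing step described above, the sketch below applies simple noise filtering and a linear contrast stretch to a single-channel frame. This is a hypothetical minimal implementation, not the patent's actual code; the function names and the choice of a 3×3 box filter are our own assumptions.

```python
import numpy as np

def box_filter3(img):
    """Denoise with a 3x3 box (mean) filter; edges are padded by replication."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += p[1 + dy: 1 + dy + img.shape[0], 1 + dx: 1 + dx + img.shape[1]]
    return out / 9.0

def contrast_stretch(img, lo=0.0, hi=255.0):
    """Linearly map the frame's intensity range onto [lo, hi]."""
    mn, mx = img.min(), img.max()
    if mx == mn:  # flat frame: nothing to stretch
        return np.full(img.shape, lo, dtype=np.float64)
    return (img - mn) * (hi - lo) / (mx - mn) + lo

def preprocess(frame):
    """Filtering followed by contrast stretching, as in the preprocessing step."""
    return contrast_stretch(box_filter3(frame))
```

In a real pipeline the filter would typically be an edge-preserving or Gaussian filter applied per color channel; the point here is only the order of operations: denoise, then normalize illumination, so that subsequent registration sees more uniform frames.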
Step 302: Determining a reference frame from the image data stream, and registering each frame of image data in the image data stream other than the reference frame with the reference frame.
In the embodiment of the present invention, after each frame of image data in the image data stream has been registered, the pixels at the same spatial location in each frame are aligned, which avoids blurring during subsequent image fusion.
In the embodiment of the present invention, registration is also referred to as alignment. There are many image alignment algorithms, which fall broadly into methods based on local features and methods based on global features. A typical local-feature-based method extracts the key feature points of the images, then uses these key feature points to compute the mapping matrix of the image-space alignment model, and finally uses the mapping matrix to align the images. The registration effect of such methods can generally meet the requirements of many scenes, such as illumination changes (synthesis of differently exposed images), large-scale image shifts (panoramic image stitching), and various complex scenes such as low-light images (with increased noise). The other class is search-based alignment methods using global mutual information matching, which can reduce the matching errors caused by random feature points.
Optical flow is also a point-based matching algorithm. It is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane: a method that uses the temporal change of pixels in an image sequence and the correlation between adjacent frames to find the correspondence between the previous frame and the current frame, thereby computing the motion information of objects between adjacent frames. The purpose of studying the optical flow field is to approximate, from an image sequence, the motion field that cannot be obtained directly. Here, the motion field is in fact the motion of the object in the three-dimensional real world; the optical flow field is the projection of the motion field onto the two-dimensional image plane (as seen by the human eye or a camera).
Finding the motion velocity and direction of each pixel in every image of an image sequence is exactly the optical flow field. As shown in Fig. 4, at frame T, point A is at position (x1, y1); we then find point A again at frame T+1, and if its position is (x2, y2), the motion vector of point A can be determined as: V = (x2, y2) - (x1, y1).
Finding the position of point A at frame T+1 can be achieved by the Lucas-Kanade optical flow method, whose basic process is as follows:
Assume that the appearance of an object does not change drastically between two consecutive frames. Based on this idea, the image constraint equation can be obtained. Different optical flow algorithms solve the optical flow problem under different additional assumptions. Using partial derivatives with respect to the spatial and temporal coordinates, the image constraint equation can be written as:
I(x, y, t) = I(x + dx, y + dy, t + dt)    (1)
where I(x, y, t) is the pixel value of the image at position (x, y) at time t.
Assuming the movement is small enough, applying a Taylor expansion to the image constraint equation gives:
I(x + dx, y + dy, t + dt) = I(x, y, t) + \frac{\partial I}{\partial x}dx + \frac{\partial I}{\partial y}dy + \frac{\partial I}{\partial t}dt + H.O.T.    (2)
where H.O.T. denotes the higher-order terms, which can be ignored when the movement is sufficiently small. From equation (2) we obtain:
\frac{\partial I}{\partial x}dx + \frac{\partial I}{\partial y}dy + \frac{\partial I}{\partial t}dt = 0    (3)
V_x = \frac{dx}{dt}, \quad V_y = \frac{dy}{dt}    (4)
I_x = \frac{\partial I}{\partial x}, \quad I_y = \frac{\partial I}{\partial y}, \quad I_t = \frac{\partial I}{\partial t}    (5)
where V_x and V_y are the x and y components of the optical flow vector of I(x, y, t), and I_x, I_y, and I_t are the differences of the image at the point (x, y, t) in the respective directions. We thus have:
I_x V_x + I_y V_y = -I_t    (6)
\nabla I^T \cdot \vec{V} = -I_t    (7)
The above equation has two unknowns, so at least two independent equations are needed to solve it. The Lucas-Kanade optical flow method assumes that the motion of pixels within a spatial neighborhood is consistent: neighboring points in a scene project to neighboring points on the image, and those neighboring points share the same velocity. This is the distinctive assumption of the Lucas-Kanade method: since the basic optical flow equation provides only one constraint while the velocities in the x and y directions give two unknowns, similar motion is assumed within the neighborhood of a feature point, so that n equations can be combined to solve for the velocity in the x and y directions (n is the total number of points in the neighborhood of the feature point, including the feature point itself). This gives the following equations:
I_{x1} V_x + I_{y1} V_y = -I_{t1}
I_{x2} V_x + I_{y2} V_y = -I_{t2}
\vdots
I_{xn} V_x + I_{yn} V_y = -I_{tn}    (8)
To solve this overdetermined problem, the least squares method is used:
A\vec{V} = -b    (9)
\vec{V} = (A^T A)^{-1} A^T (-b)    (10)
The optical flow \vec{V} in the neighborhood can then be obtained as:
\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} \sum I_{xi}^2 & \sum I_{xi} I_{yi} \\ \sum I_{xi} I_{yi} & \sum I_{yi}^2 \end{bmatrix}^{-1} \begin{bmatrix} -\sum I_{xi} I_{ti} \\ -\sum I_{yi} I_{ti} \end{bmatrix}    (11)
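The least-squares solve of equations (8)-(11) can be sketched numerically as follows. This is an illustrative NumPy sketch under the stated neighborhood assumption, not the patent's implementation; the derivative arrays in the usage below are synthetic test data rather than gradients from a real image.

```python
import numpy as np

def lucas_kanade_flow(Ix, Iy, It):
    """Solve Ix*Vx + Iy*Vy = -It over a neighborhood by least squares (eq. 11).

    Ix, Iy, It are 1-D arrays holding the spatial and temporal derivatives
    at the n points of the feature-point neighborhood."""
    A = np.stack([Ix, Iy], axis=1)   # n x 2 coefficient matrix of eq. (8)
    b = -It                          # right-hand side
    # The patent writes V = (A^T A)^{-1} A^T (-b); solving the least-squares
    # problem A V = b directly yields the same solution.
    V, *_ = np.linalg.lstsq(A, b, rcond=None)
    return V  # [Vx, Vy]
```

For example, derivatives consistent with a true flow of (2, -1) recover that flow exactly, since the system is then solvable without residual.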
The Lucas-Kanade scheme above relies on the small-motion assumption; when the target moves quickly this assumption breaks down, and a multi-scale approach can solve the problem. First, a Gaussian pyramid is built for each frame, with the coarsest-scale image at the top layer and the original image at the bottom. Then, starting from the top layer, the position in the next frame is estimated and used as the initial position for the layer below; searching down the pyramid, the estimation is repeated until the bottom of the pyramid is reached. Such a search can quickly locate the motion direction and position of a pixel.
Fig. 6 shows the process of aligning multiple images using optical flow. Using the optical flow computation, sparse matching points between two images can be obtained, and the coordinates of these points are then used to compute the image mapping model. In the image alignment step, selecting the correct image alignment transformation model is very important. Common spatial transformation models include the affine transformation and the perspective transformation model.
The affine transformation can be expressed intuitively in the following form. Any parallelogram in a plane can be mapped by an affine transformation to another parallelogram; the mapping operation of the image is performed within the same spatial plane, and different transformation parameters deform it into different types of parallelograms.
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} a_{00} & a_{01} & a_{02} \\ a_{10} & a_{11} & a_{12} \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}    (12)
The perspective transformation is a more general transformation model. Compared with the affine transformation, the perspective transformation is more flexible: a perspective transformation can map a rectangle to a trapezoid. It describes the projection from one spatial plane into another spatial plane, and the affine transformation is a special case of the perspective transformation.
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix} = \begin{bmatrix} a_{00} & a_{01} & a_{02} \\ a_{10} & a_{11} & a_{12} \\ a_{20} & a_{21} & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}    (13)
The meanings of the matrix elements above are as follows:
a_{02} and a_{12} are the displacement parameters;
a_{00}, a_{01}, a_{10}, a_{11} are the scaling and rotation parameters;
a_{20}, a_{21} are the deflections in the horizontal and vertical directions.
Here the perspective transformation model is selected, mainly considering that when a handheld terminal such as a mobile phone shoots multiple images in succession, the jitter motion of the phone is mostly not within the same plane; the camera motion model is shown in Fig. 5.
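Applying the perspective matrix of equation (13) to a pixel involves a homogeneous-coordinate multiplication followed by division by the third component. The helper below is an illustrative sketch of that operation (our own naming, not code from the patent):

```python
import numpy as np

def warp_point(H, x, y):
    """Map pixel (x, y) through the 3x3 perspective matrix H of eq. (13).

    The homogeneous result is divided by its third component; this division
    is what allows a perspective transform to map a rectangle to a trapezoid."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With a_{20} = a_{21} = 0 the third component stays 1 and the mapping reduces to the affine case of equation (12), consistent with affine transformation being a special case of the perspective transformation.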
Step 303: When performing fusion processing on each frame of image data after registration, replacing the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame before fusion, to obtain the target image.
In the embodiment of the present invention, after each frame of image data has been registered (that is, aligned), the images need to be fused. Here, a fusion method in which the image pixels are sequentially superimposed is adopted. Fusion is performed according to formula (14):
F_k = \frac{k}{N} \sum_{m=1}^{k} I(m)    (14)
where I is each image, m is the m-th image, k is the number of images synthesized so far, and N is the total number of images to be synthesized.
Referring to Fig. 9, black borders are formed at the boundaries of an image after the image mapping transformation. If the pixels in these black borders participate in the fusion in a weighted-average manner, they will produce brightness differences that affect the overall visual appearance of the image. Taking the case where the reference frame is the first frame of image data in the image data stream as an example, every other frame of image data is registered to the first frame and fused, and the reference frame has no black-border pixels. For the image data with black borders, the pixels in the black borders are replaced with the pixels at the corresponding positions in the reference frame before participating in the weighted average, thereby effectively solving the problem of image black borders.
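The replace-then-fuse idea above can be sketched as follows. This is a simplified single-channel sketch under the assumption that black-border pixels are exactly zero after warping; a real implementation would more likely detect the border from the warp's validity mask. The function name and equal-weight averaging are our own simplifications.

```python
import numpy as np

def fuse_with_border_fix(frames, ref_index=0):
    """Average the registered frames; black-border pixels (== 0) in a warped
    frame are first replaced by the reference frame's pixels (step 303)."""
    ref = frames[ref_index].astype(np.float64)
    acc = np.zeros(ref.shape, dtype=np.float64)
    for f in frames:
        f = f.astype(np.float64)
        border = (f == 0)              # black border left by the mapping
        f = np.where(border, ref, f)   # substitute reference-frame pixels
        acc += f
    return acc / len(frames)           # equal-weight superposition
```

Without the substitution, the zero pixels would drag the average down near the borders and produce the visible brightness step described above; with it, border pixels average the reference value with itself and transition naturally.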
The technical solution of the embodiment of the present invention proposes an image processing method that uses the image registration principle to align the multiple frames of image data to be synthesized. This alignment method tolerates image jitter errors within a certain range, and the pixel deviation in the final synthesized image is small. More importantly, during image fusion, the black-border pixels caused by image registration are handled, ensuring natural pixel transitions at every position of the whole image.
Fig. 7 is a schematic flowchart of the image processing method of Embodiment 2 of the present invention. The image processing method in this example is applied to a terminal. As shown in Fig. 7, the image processing method comprises the following steps:
Step 701: In the hand-held mode, acquiring a raw image data stream by shooting with the electronic aperture, the raw image data stream including multiple frames of raw image data; and preprocessing each frame of raw image data in the raw image data stream to obtain the image data stream.
In the embodiment of the present invention, the terminal may be an electronic device such as a mobile phone or a tablet computer. The terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing with the electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
In the electronic aperture mode, after the user adjusts the aperture value, the terminal can shoot continuously and without interruption during the exposure time, and then perform fusion processing on the multiple captured images. When photographing with the electronic aperture, since multiple images need to be captured continuously and fused, each image must be kept aligned. In order to ensure the ease of use and user experience of the electronic aperture, the embodiment of the present invention adds a hand-held mode to electronic aperture shooting; in the hand-held mode, the user can conveniently shoot with the electronic aperture while holding the terminal.
When shooting, an image data stream is first acquired, the image data stream including multiple frames of image data. Specifically, the raw image data stream of the shot is first acquired synchronously; then data reading and image preprocessing are performed on the raw image data stream to obtain the image data stream. Here, due to the ISP pipeline of the camera and unpredictable changes in the external environment, the individual frames of raw image data in the synchronously acquired raw image data stream will differ in illumination, noise, sharpness, and focus. Before fusion processing, each frame of raw image data in the raw image data stream needs to be preprocessed by the necessary preprocessing steps. Here, the preprocessing includes: image filtering to eliminate noise, and contrast stretching to improve the sharpness of the image and the illumination differences between images. After such preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effectiveness of the subsequent image registration algorithm.
Step 702: Taking the first frame of image data in the image data stream as the reference frame, and aligning each frame of image data in the image data stream other than the first frame with the reference frame.
In the embodiment of the present invention, after each frame of image data in the image data stream has been registered, the pixels at the same spatial location in each frame are aligned, which avoids blurring during subsequent image fusion.
In the embodiment of the present invention, registration is also referred to as alignment. There are many image alignment algorithms, which fall broadly into methods based on local features and methods based on global features. A typical local-feature-based method extracts the key feature points of the images, then uses these key feature points to compute the mapping matrix of the image-space alignment model, and finally uses the mapping matrix to align the images. The registration effect of such methods can generally meet the requirements of many scenes, such as illumination changes (synthesis of differently exposed images), large-scale image shifts (panoramic image stitching), and various complex scenes such as low-light images (with increased noise). The other class is search-based alignment methods using global mutual information matching, which can reduce the matching errors caused by random feature points.
Optical flow is also a point-based matching algorithm. It is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane: a method that uses the temporal change of pixels in an image sequence and the correlation between adjacent frames to find the correspondence between the previous frame and the current frame, thereby computing the motion information of objects between adjacent frames.
Step 703: Displaying a prompt box on the display interface; the prompt box is used to indicate the direction of hand-held shake when shooting with the electronic aperture in the hand-held mode.
Referring to Fig. 8, a prompt box is displayed to the user on the interactive interface of the application. This prompt box can indicate the direction of the user's current hand-held shake, so that the user can correct it in time.
Step 704: Analyzing each frame of image data after registration to determine the black regions at the boundary of each image; and, when performing fusion processing on each frame of image data after registration, replacing the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame before fusion, to obtain the target image.
In the embodiment of the present invention, replacing the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame before fusion to obtain the target image includes:
determining, according to the black regions at the image boundaries, the reference regions in the reference frame corresponding to the black regions;
replacing the pixels of the black regions at the image boundaries with the pixels of the reference regions in the reference frame before performing fusion processing, to obtain the target image.
Here, the black regions at the image boundaries can be segmented out by a clustering algorithm, such as the K-means algorithm.
In the embodiment of the present invention, after each frame of image data has been registered (that is, aligned), the images need to be fused. Here, a fusion method in which the image pixels are sequentially superimposed is adopted.
Referring to Fig. 9, black borders are formed at the boundaries of an image after the image mapping transformation. If the pixels in these black borders participate in the fusion in a weighted-average manner, they will produce brightness differences that affect the overall visual appearance of the image. Taking the case where the reference frame is the first frame of image data in the image data stream as an example, every other frame of image data is registered to the first frame and fused, and the reference frame has no black-border pixels. For the image data with black borders, the pixels in the black borders are replaced with the pixels at the corresponding positions in the reference frame before participating in the weighted average, thereby effectively solving the problem of image black borders.
The technical solution of the embodiment of the present invention proposes an image processing method that uses the image registration principle to align the multiple frames of image data to be synthesized. This alignment method tolerates image jitter errors within a certain range, and the pixel deviation in the final synthesized image is small. More importantly, during image fusion, the black-border pixels caused by image registration are handled, ensuring natural pixel transitions at every position of the whole image. In addition, a prompt box is added during electronic aperture shooting to prevent the user from shaking the terminal excessively while shooting.
Fig. 10 is a schematic structural diagram of the terminal of Embodiment 1 of the present invention. As shown in Fig. 10, the terminal includes:
an acquiring unit 11, configured to acquire an image data stream using an electronic aperture, the image data stream including multiple frames of image data;
a registration unit 12, configured to determine a reference frame from the image data stream, and to register each frame of image data in the image data stream other than the reference frame with the reference frame;
a fusion unit 13, configured to, when performing fusion processing on each frame of image data after registration, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame before fusion, to obtain the target image.
The fusion unit 13 includes:
an analysis subunit 131, configured to analyze each frame of image data after registration to determine the black regions at the boundary of each image;
a replacement and fusion subunit 132, configured to, when performing fusion processing on each frame of image data after registration, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame before fusion, to obtain the target image.
The replacement and fusion subunit 132 is further configured to determine, according to the black regions at the image boundaries, the reference regions in the reference frame corresponding to the black regions, and to replace the pixels of the black regions at the image boundaries with the pixels of the reference regions in the reference frame before fusion, to obtain the target image.
The registration unit 12 is further configured to take the first frame of image data in the image data stream as the reference frame, and to align each frame of image data in the image data stream other than the first frame with the reference frame; here, alignment refers to aligning the pixels at the same spatial location.
The fusion unit 13 is further configured to superimpose each pixel of each frame of image data after registration according to its corresponding spatial position.
It should be noted that the description of the above terminal embodiment is similar to the description of the above method embodiment and has beneficial effects similar to those of the method embodiment. For technical details not disclosed in the terminal embodiment of the present invention, please refer to the description of the method embodiment of the present invention.
Fig. 11 is a schematic structural diagram of the terminal of Embodiment 2 of the present invention. The terminal includes: a processor 1101, a camera 1102, and a display screen 1103; the processor 1101, the camera 1102, and the display screen 1103 are all connected by a bus 1104.
The camera 1102 is configured to, in the hand-held mode, acquire a raw image data stream by shooting with the electronic aperture, the raw image data stream including multiple frames of raw image data.
The processor 1101 is configured to: preprocess each frame of raw image data in the raw image data stream to obtain the image data stream, wherein the preprocessing includes at least one of image filtering and contrast stretching; determine a reference frame from the image data stream, and register each frame of image data in the image data stream other than the reference frame with the reference frame; and, when performing fusion processing on each frame of image data after registration, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame before fusion, to obtain the target image.
The display screen 1103 is configured to display a prompt box on the display interface; the prompt box is used to indicate the direction of hand-held shake when shooting with the electronic aperture in the hand-held mode.
Fig. 12 is a block diagram of the electrical structure of the camera.
The lens 1211 is composed of a plurality of optical lenses for forming the subject image, and is a single-focus lens or a zoom lens. The lens 1211 can move in the optical axis direction under the control of the lens driver 1221, and the lens driver 1221 controls the focal position of the lens 1211 according to a control signal from the lens drive control circuit 1222; in the case of a zoom lens, the focal length can also be controlled. The lens drive control circuit 1222 performs drive control of the lens driver 1221 according to control commands from the microprocessor 1217.
An imaging element 1212 is arranged on the optical axis of the lens 1211, near the position where the subject image is formed by the lens 1211. The imaging element 1212 serves to image the subject image and obtain image data. Photodiodes constituting the individual pixels are arranged two-dimensionally in a matrix on the imaging element 1212. Each photodiode produces a photoelectric conversion current corresponding to the amount of received light, and this current is accumulated as charge by a capacitor connected to each photodiode. A Bayer-arranged RGB (red-green-blue) color filter is arranged on the front surface of each pixel.
The imaging element 1212 is connected to an imaging circuit 1213. The imaging circuit 1213 performs charge accumulation control and image signal readout control in the imaging element 1212, reduces the reset noise of the read image signal (an analog image signal), performs waveform shaping, and then raises the gain and so on to obtain an appropriate signal level.
The imaging circuit 1213 is connected to an A/D converter 1214, which performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to a bus 1227.
The bus 1227 is a transmission path for transmitting the various data read or generated inside the camera. Connected to the bus 1227 are the above-mentioned A/D converter 1214, as well as an image processor 1215, a JPEG processor 1216, the microprocessor 1217, a synchronous dynamic random access memory (SDRAM) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and a liquid crystal display (LCD) driver 1220.
The image processor 1215 performs various kinds of image processing on the image data output from the imaging element 1212, such as optical black (OB) subtraction, white balance adjustment, color matrix computation, gamma conversion, color-difference signal processing, noise removal, simultaneous (demosaicing) processing, and edge processing. When image data is recorded in a storage medium 1225, the JPEG processor 1216 compresses the image data read from the SDRAM 1218 according to the JPEG compression scheme. In addition, the JPEG processor 1216 decompresses JPEG image data for image reproduction and display: the file recorded in the storage medium 1225 is read and decompressed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226. In the present embodiment, the JPEG scheme is used for image compression and decompression, but the compression/decompression scheme is not limited to this; MPEG, TIFF, H.264, and other compression/decompression schemes may of course be used.
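The order of operations in such a pipeline can be sketched per pixel as follows. This is a minimal illustration of OB subtraction, white balance gain, and gamma conversion chained in sequence; the constants (OB level, gain, gamma, 10-bit white point) are assumptions for the example, not the parameters of image processor 1215:

```python
def process_pixel(raw, ob_level=64, wb_gain=1.8, gamma=2.2, white=1023):
    """Apply OB subtraction, white-balance gain, then gamma conversion
    to one 10-bit raw sample; all constants are illustrative only."""
    v = max(raw - ob_level, 0)                        # optical black (OB) subtraction
    v = min(v * wb_gain, white)                       # white balance gain, clipped
    return round((v / white) ** (1.0 / gamma) * 255)  # gamma conversion to 8-bit

print(process_pixel(64))    # sensor black level maps to 0
print(process_pixel(1023))  # -> 255, full scale stays white
```

The key point the sketch shows is ordering: black-level subtraction must precede the multiplicative white-balance gain, and both happen in linear light before the nonlinear gamma step.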
The microprocessor 1217 functions as the control unit of the camera as a whole and collectively controls the camera's various processing sequences. The microprocessor 1217 is connected to an operating unit 1223 and a flash memory 1224.
The operating unit 1223 includes, but is not limited to, physical buttons or virtual keys. These physical or virtual keys may be operational controls and input keys such as a power button, a shooting key, an edit key, a moving-image button, a playback button, a menu button, a cross key, an OK button, a delete button, and an enlarge button. The operating unit 1223 detects the operation states of these controls.
The detection results are output to the microprocessor 1217. In addition, a touch panel is provided on the front surface of the LCD 1226 serving as the display; it detects the user's touch position and outputs this touch position to the microprocessor 1217. According to the detection results of the operation positions from the operating unit 1223, the microprocessor 1217 executes the various processing sequences corresponding to the user's operations.
The flash memory 1224 stores programs for executing the various processing sequences of the microprocessor 1217, and the microprocessor 1217 performs overall control of the camera according to these programs. In addition, the flash memory 1224 stores various adjustment values of the camera; the microprocessor 1217 reads these adjustment values and controls the camera accordingly.
The SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. The SDRAM 1218 temporarily stores both the image data output from the analog/digital (A/D) converter 1214 and the image data processed in the image processor 1215, the JPEG processor 1216, and so on.
The memory interface 1219 is connected to the storage medium 1225 and controls the writing of image data and of data such as file headers attached to the image data into the storage medium 1225, as well as their reading from the storage medium 1225. The storage medium 1225 may be implemented as a storage medium such as a memory card that can be freely attached to and detached from the camera body; however, it is not limited to this and may also be a hard disk or the like built into the camera body.
The LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218; when display is needed, the image data stored in the SDRAM 1218 is read and displayed on the LCD 1226. Alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218; when display is needed, the JPEG processor 1216 reads the compressed image data from the SDRAM 1218 and decompresses it, and the decompressed image data is displayed on the LCD 1226.
The LCD 1226 is arranged on the back side of the camera body to display images. However, the display is not limited to an LCD; various display panels based on organic EL, i.e. organic light-emitting diodes (OLED, Organic Electro-Luminescence), may also be used for image display.
The technical solutions described in the embodiments of the present invention may be combined in any manner, provided there is no conflict between them.
In the several embodiments provided by the present invention, it should be understood that the disclosed method and smart device may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division into units is only a logical functional division; in actual implementation there may be other division manners. For example: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the present invention may all be integrated into one processing unit, or each unit may serve individually as one unit, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the technical field can readily conceive of changes or replacements within the technical scope disclosed by the present invention, and all such changes or replacements shall fall within the protection scope of the present invention.

Claims (10)

1. A terminal, characterized in that the terminal comprises:
an acquiring unit, configured to obtain an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data;
a registration unit, configured to determine a reference frame from the image data stream, and to register each frame of image data in the image data stream other than the reference frame with the reference frame;
a fusion unit, configured to, when performing fusion processing on each frame of image data after registration, replace the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion processing, to obtain a target image.
2. The terminal according to claim 1, characterized in that the fusion unit comprises:
an analysis subunit, configured to analyze each frame of image data after registration and determine the black region at the boundary of each frame of image;
a replacement and fusion subunit, configured to, when performing fusion processing on each frame of image data after registration, replace the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion processing, to obtain the target image.
3. The terminal according to claim 2, characterized in that the replacement and fusion subunit is further configured to: determine, according to the black region at the image boundary, a reference region in the reference frame corresponding to the black region; and replace the pixels of the black region at the image boundary with the pixels in the reference region of the reference frame before performing the fusion processing, to obtain the target image.
4. The terminal according to any one of claims 1 to 3, characterized in that the registration unit is further configured to take the first frame of image data in the image data stream as the reference frame, and to align each frame of image data in the image data stream other than the first frame of image data with the reference frame; wherein the alignment refers to aligning pixels of the same spatial location.
5. The terminal according to any one of claims 1 to 3, characterized in that the fusion unit is further configured to superimpose the pixels of each frame of image data after registration according to spatial location correspondence.
6. An image processing method, characterized in that the method comprises:
obtaining an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data;
determining a reference frame from the image data stream, and registering each frame of image data in the image data stream other than the reference frame with the reference frame;
when performing fusion processing on each frame of image data after registration, replacing the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion processing, to obtain a target image.
7. The image processing method according to claim 6, characterized in that, when performing fusion processing on each frame of image data after registration, replacing the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion processing, to obtain the target image, comprises:
analyzing each frame of image data after registration, and determining the black region at the boundary of each frame of image;
when performing fusion processing on each frame of image data after registration, replacing the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion processing, to obtain the target image.
8. The image processing method according to claim 7, characterized in that replacing the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion processing, to obtain the target image, comprises:
determining, according to the black region at the image boundary, a reference region in the reference frame corresponding to the black region;
replacing the pixels of the black region at the image boundary with the pixels in the reference region of the reference frame before performing the fusion processing, to obtain the target image.
9. The image processing method according to any one of claims 6 to 8, characterized in that determining a reference frame from the image data stream, and registering each frame of image data in the image data stream other than the reference frame with the reference frame, comprises:
taking the first frame of image data in the image data stream as the reference frame, and aligning each frame of image data in the image data stream other than the first frame of image data with the reference frame;
wherein the alignment refers to aligning pixels of the same spatial location.
10. The image processing method according to any one of claims 6 to 8, characterized in that performing fusion processing on each frame of image data after registration comprises:
superimposing the pixels of each frame of image data after registration according to spatial location correspondence.
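Read together, claims 6 to 10 describe: acquire a stream of frames, register each frame to the first (reference) frame, replace each registered frame's black boundary pixels with the corresponding reference-frame pixels, and superimpose pixels by spatial location. A minimal sketch of that flow follows; it is hypothetical, with registration reduced to known integer translations and superimposition realized as averaging, which the claims permit but do not mandate:

```python
def fuse_frames(frames, shifts):
    """frames: list of 2-D grayscale images (lists of lists); frames[0] is the
    reference frame. shifts[i] = (dy, dx) registers frame i to the reference.
    Pixels falling outside a shifted frame (the black boundary region) are
    replaced by the reference-frame pixel, then all frames are averaged."""
    ref = frames[0]
    h, w = len(ref), len(ref[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for frame, (dy, dx) in zip(frames, shifts):
                sy, sx = y - dy, x - dx
                if 0 <= sy < h and 0 <= sx < w:
                    total += frame[sy][sx]   # registered pixel at this location
                else:
                    total += ref[y][x]       # black region -> reference pixel
            out[y][x] = total / len(frames)
    return out

ref = [[10, 20], [30, 40]]
frame = [[11, 21], [31, 41]]
fused = fuse_frames([ref, frame], [(0, 0), (0, 1)])  # second frame shifted right by 1
```

In this toy run the column that the shifted frame cannot cover is filled from the reference, so no dark seam appears at the boundary; that substitution step is the distinguishing feature the claims emphasize.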
CN201610375523.3A 2016-05-31 2016-05-31 A kind of image processing method and terminal Active CN105898159B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610375523.3A CN105898159B (en) 2016-05-31 2016-05-31 A kind of image processing method and terminal
PCT/CN2017/082941 WO2017206656A1 (en) 2016-05-31 2017-05-03 Image processing method, terminal, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610375523.3A CN105898159B (en) 2016-05-31 2016-05-31 A kind of image processing method and terminal

Publications (2)

Publication Number Publication Date
CN105898159A true CN105898159A (en) 2016-08-24
CN105898159B CN105898159B (en) 2019-10-29

Family

ID=56709763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610375523.3A Active CN105898159B (en) 2016-05-31 2016-05-31 A kind of image processing method and terminal

Country Status (1)

Country Link
CN (1) CN105898159B (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106612397A (en) * 2016-11-25 2017-05-03 努比亚技术有限公司 Image processing method and terminal
CN106780300A (en) * 2016-11-30 2017-05-31 努比亚技术有限公司 Image processing method and device
CN107205119A (en) * 2017-06-30 2017-09-26 维沃移动通信有限公司 Image data processing method and apparatus
WO2017206656A1 (en) * 2016-05-31 2017-12-07 努比亚技术有限公司 Image processing method, terminal, and computer storage medium
CN107566753A (en) * 2017-09-29 2018-01-09 努比亚技术有限公司 Photographing method and mobile terminal
CN107995421A (en) * 2017-11-30 2018-05-04 潍坊歌尔电子有限公司 A kind of panorama camera and its image generating method, system, equipment, storage medium
CN109819163A (en) * 2019-01-23 2019-05-28 努比亚技术有限公司 A kind of image processing control, terminal and computer readable storage medium
CN110035141A (en) * 2019-02-22 2019-07-19 华为技术有限公司 A kind of image pickup method and equipment
CN111355896A (en) * 2018-12-20 2020-06-30 中国科学院国家天文台 Method for acquiring automatic exposure parameters of all-day camera
CN111583211A (en) * 2020-04-29 2020-08-25 广东利元亨智能装备股份有限公司 Defect detection method and device and electronic equipment
CN111627041A (en) * 2020-04-15 2020-09-04 北京迈格威科技有限公司 Multi-frame data processing method and device and electronic equipment
CN111932587A (en) * 2020-08-03 2020-11-13 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN112581415A (en) * 2020-11-20 2021-03-30 北京迈格威科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
TWI755768B (en) * 2019-11-29 2022-02-21 大陸商北京市商湯科技開發有限公司 Image processing method, image processing device and storage medium thereof
CN116434128A (en) * 2023-06-15 2023-07-14 安徽科大擎天科技有限公司 Method for removing unfilled region of electronic stable image based on cache frame
CN117078538A (en) * 2023-07-19 2023-11-17 华中科技大学 Correction method of remote atmospheric turbulence image based on pixel motion statistics

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010187207A (en) * 2009-02-12 2010-08-26 Olympus Corp Image synthesizing apparatus, image synthesizing program, and image synthesizing method
CN102201115A (en) * 2011-04-07 2011-09-28 湖南天幕智能科技有限公司 Real-time panoramic image stitching method of aerial videos shot by unmanned plane
WO2012108094A1 (en) * 2011-02-08 2012-08-16 オリンパス株式会社 Image processing device, image processing method, image processing program, and image pick-up device
CN103729833A (en) * 2013-11-27 2014-04-16 乐视致新电子科技(天津)有限公司 Image splicing method and device
US9129399B2 (en) * 2013-03-11 2015-09-08 Adobe Systems Incorporated Optical flow with nearest neighbor field fusion
CN105430263A (en) * 2015-11-24 2016-03-23 努比亚技术有限公司 Long-exposure panoramic image photographing device and method
CN105611181A (en) * 2016-03-30 2016-05-25 努比亚技术有限公司 Multi-frame photographed image synthesizer and method


Also Published As

Publication number Publication date
CN105898159B (en) 2019-10-29

Similar Documents

Publication Publication Date Title
CN105898159B (en) A kind of image processing method and terminal
CN106612397A (en) Image processing method and terminal
CN106303225A (en) A kind of image processing method and electronic equipment
CN105915796A (en) Electronic aperture shooting method and terminal
CN105100775B (en) A kind of image processing method and device, terminal
WO2017206656A1 (en) Image processing method, terminal, and computer storage medium
CN106878588A (en) A kind of video background blurs terminal and method
CN105744159A (en) Image synthesizing method and device
CN103813108A (en) Array camera, mobile terminal, and methods for operating the same
CN105430263A (en) Long-exposure panoramic image photographing device and method
CN101674409A (en) Mobile terminal having panorama photographing function and method for controlling operation thereof
CN105611181A (en) Multi-frame photographed image synthesizer and method
CN105187724B (en) A kind of mobile terminal and method handling image
CN105488756B (en) Picture synthetic method and device
CN105120164B (en) The processing means of continuous photo and method
CN106803879A (en) Cooperate with filming apparatus and the method for finding a view
CN105959551A (en) Shooting device, shooting method and mobile terminal
CN106791455A (en) Panorama shooting method and device
CN106954020B (en) A kind of image processing method and terminal
CN106303229A (en) A kind of photographic method and device
CN106454105A (en) Device and method for image processing
CN106851113A (en) A kind of photographic method and mobile terminal based on dual camera
CN106303290A (en) A kind of terminal and the method obtaining video
CN107071263A (en) A kind of image processing method and terminal
CN106534590A (en) Photo processing method and apparatus, and terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant