CN108965835A - Image processing method, image processing apparatus and terminal device - Google Patents
Image processing method, image processing apparatus and terminal device
- Publication number
- CN108965835A CN108965835A CN201810965516.8A CN201810965516A CN108965835A CN 108965835 A CN108965835 A CN 108965835A CN 201810965516 A CN201810965516 A CN 201810965516A CN 108965835 A CN108965835 A CN 108965835A
- Authority
- CN
- China
- Prior art keywords
- image
- colour temperature
- environment
- camera
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/13—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Processing Of Color Television Signals (AREA)
Abstract
The present application provides an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium. The method includes: obtaining a preset number of frames continuously captured by each camera, where the preset number of frames captured by each camera includes the image currently captured by that camera; estimating the colour temperature of the current environment from the obtained images; and performing white balance adjustment on a target image according to the colour temperature, where the target image is one or more of the images currently captured by the cameras. The technical solution provided by the present application allows the colour temperature of the current environment to be estimated more accurately when a user has just moved from one colour temperature environment into another.
Description
Technical field
The present application belongs to the technical field of image processing, and in particular relates to an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium.
Background technique
After a terminal device starts a camera-type application, in order to perform white balance adjustment on the image currently captured by the camera (that is, to make objects in the captured image appear in their normal colours and avoid colour casts), the colour temperature of the current environment must first be estimated. To estimate the colour temperature of the current environment more accurately, the traditional colour temperature estimation method estimates it from multiple frames (for example, 60 or 100 frames) captured by the main camera (i.e., the camera that captures the images shown on the display screen).
However, when a user moves from one colour temperature environment into another, few images have yet been captured by the terminal device in the new environment. For example, at some moment shortly after the user has entered the new environment, the main camera may have captured only 10 frames there; if colour temperature estimation is performed on the last 60 frames captured by the main camera, only 10 of those 60 frames come from the current environment while 50 come from the previous one. The colour temperature of the current environment therefore cannot be estimated accurately, and as a result the images captured just after the user enters the new environment cannot be white-balanced well.
Summary of the invention
In view of this, the present application provides an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium that allow the colour temperature of the current environment to be estimated more accurately when a user has just moved from one colour temperature environment into another.
A first aspect of the present application provides an image processing method applied to a terminal device, the terminal device including multiple cameras, the image processing method including:
obtaining a preset number of frames continuously captured by each camera, where the preset number of frames captured by each camera includes the image currently captured by that camera;
estimating the colour temperature of the current environment from the obtained images; and
performing white balance adjustment on a target image according to the colour temperature, where the target image is one or more of the images currently captured by the cameras.
A second aspect of the present application provides an image processing apparatus applied to a terminal device, the terminal device including multiple cameras, the image processing apparatus including:
an image obtaining module for obtaining a preset number of frames continuously captured by each camera, where the preset number of frames captured by each camera includes the image currently captured by that camera;
a colour temperature estimation module for estimating the colour temperature of the current environment from the obtained images; and
a white balance adjustment module for performing white balance adjustment on a target image according to the colour temperature, where the target image is one or more of the images currently captured by the cameras.
A third aspect of the present application provides a terminal device including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the method of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method of the first aspect.
A fifth aspect of the present application provides a computer program product including a computer program which, when executed by one or more processors, implements the steps of the method of the first aspect.
The present application thus provides an image processing method applied to a terminal device with multiple cameras. First, a preset number of frames continuously captured by each camera is obtained, where the frames captured by each camera include the image that camera is currently capturing. For example, if the terminal device includes three cameras, namely a first camera, a second camera, and a third camera, the preset number of frames captured by each of the three cameras is obtained, and the frames from each camera include that camera's currently captured image. Second, the colour temperature of the current environment is estimated from the obtained images. Finally, white balance adjustment is performed on a target image according to that colour temperature, where the target image is one or more of the images currently captured by the cameras. Hence, although few images have been captured in the new environment when a user has just entered another colour temperature environment, the proportion of current-environment images among those used for estimation is larger than with the traditional colour temperature estimation method, because images captured by multiple cameras are used. For example, suppose that at some moment shortly after entering the new environment each camera of the terminal device has captured 10 frames there. If the colour temperature of the current environment is estimated from the 60 frames captured in total by the terminal device's three cameras, 30 of the 60 frames used come from the current environment, a proportion of 1/2, whereas with the traditional single-camera method the proportion is only 1/6. The technical solution provided by the present application therefore allows the colour temperature of the current environment to be estimated more accurately when a user has just moved from one colour temperature environment into another.
Detailed description of the invention
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flowchart of an image processing method provided by Embodiment 1 of the present application;
Fig. 2 is a schematic diagram, provided by Embodiment 1 of the present application, of multiple cameras each continuously capturing a preset number of frames;
Fig. 3 is a schematic flowchart of another image processing method provided by Embodiment 2 of the present application;
Fig. 4 is a schematic diagram, provided by Embodiment 2 of the present application, of determining whether a region is a white region;
Fig. 5 is a schematic flowchart, provided by Embodiment 2 of the present application, of determining whether a white region exists in a selected image;
Fig. 6 is another schematic diagram, provided by Embodiment 2 of the present application, of determining whether a region is a white region;
Fig. 7 is a schematic structural diagram of an image processing apparatus provided by Embodiment 3 of the present application;
Fig. 8 is a schematic structural diagram of a terminal device provided by Embodiment 4 of the present application.
Specific embodiment
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of the present application. However, it will be clear to those skilled in the art that the present application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The image processing method provided by the embodiments of the present application is applicable to a terminal device. Illustratively, the terminal device includes but is not limited to: a smartphone, a tablet computer, a learning machine, a smart wearable device, and the like.
It should be understood that when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements and/or components, but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or sets thereof.
It should also be understood that the terminology used in this specification is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in this specification and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In specific implementations, the terminal device described in the embodiments of the present application includes but is not limited to portable devices such as a mobile phone, a laptop computer, or a tablet computer having a touch-sensitive surface (for example, a touch-screen display and/or a touchpad). It should also be understood that in some embodiments the device is not a portable communication device but a desktop computer having a touch-sensitive surface (for example, a touch-screen display and/or a touchpad).
In the following discussion, a terminal device including a display and a touch-sensitive surface is described. It should be understood, however, that the terminal device may include one or more other physical user interface devices, such as a physical keyboard, a mouse, and/or a joystick.
The various applications executable on the terminal device may use at least one common physical user interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface, as well as the corresponding information displayed on the terminal, may be adjusted and/or changed between applications and/or within the corresponding application. In this way, a common physical architecture of the terminal (for example, the touch-sensitive surface) can support various applications with user interfaces that are intuitive and transparent to the user.
In addition, in the description of the present application the terms "first", "second" and so on are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
In order to illustrate the technical solutions described above, specific embodiments are described below.
Embodiment one
The image processing method provided by Embodiment 1 of the present application is applied to a terminal device including multiple cameras. Referring to Fig. 1, the image processing method in Embodiment 1 includes:
In step S101, a preset number of frames continuously captured by each camera is obtained, where the preset number of frames captured by each camera includes the image currently captured by that camera;
In this embodiment of the present application, after it is detected that the user has started a camera-type application (an application with a photographing function) on the terminal device, the terminal device starts at least two cameras at the same time and then begins to obtain the preset number of frames continuously captured by each camera. In this embodiment, to ensure that the terminal device can start multiple cameras simultaneously, the camera-type application can be developed on the basis of the Camera 2.0 framework, so that the camera-type application can support multiple cameras working at the same time. Camera 2.0 is a camera development framework based on the Android operating system; it enables a camera-type application to support multiple cameras working simultaneously and to process each frame obtained from each camera. The traditional camera development framework is Camera 1.0; a camera-type application designed on the Camera 1.0 framework can only support one camera working at a time, and its control over the data only reaches the stream level rather than the frame level. Since the technical solution provided by the present application requires multiple cameras to work simultaneously, the camera-type application can be developed on the basis of the Camera 2.0 framework.
In this embodiment of the present application, the preset frame number may be a fixed value, for example fixed at 60 frames; alternatively, it may be a variable value. For example, the preset frame number may change with the external environment: if the current environment is detected to be relatively stable (for example, if it is detected that the user's geographical location has not changed, the current environment can be considered stable), the preset frame number may be a smaller value; if the current environment is detected to be unstable (for example, if it is detected that the user is moving at a relatively high speed, the current environment can be considered unstable), the preset frame number may be a larger value.
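As a sketch of the variable preset frame number, the heuristic below picks a smaller value when the environment looks stable and a larger one otherwise, following the text above. The specific thresholds and frame counts are hypothetical placeholders, not values from the application:

```python
def preset_frame_count(location_changed: bool, speed_mps: float,
                       stable_frames: int = 30, unstable_frames: int = 60,
                       speed_threshold: float = 1.5) -> int:
    """Return the preset frame number: smaller when the environment is
    considered stable (location unchanged, low speed), larger otherwise."""
    unstable = location_changed or speed_mps > speed_threshold
    return unstable_frames if unstable else stable_frames
```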
In this embodiment of the present application, in order to make the subsequent estimation of the colour temperature of the current environment more accurate, cameras with widely differing viewing angles, for example the front camera and the rear camera, can be started simultaneously.
In step S102, the colour temperature of the current environment is estimated from the obtained images;
In this embodiment of the present application, the environment colour temperature corresponding to each frame in the obtained images can first be obtained, and the environment colour temperatures can then be weighted-averaged to obtain the colour temperature of the current environment. Alternatively, the maximum and minimum of the environment colour temperatures can be discarded and the average of the remaining environment colour temperatures calculated. The present application does not limit the method of estimating the colour temperature of the current environment.
A method of weighted-averaging the environment colour temperatures to obtain the colour temperature of the current environment is discussed below with reference to Fig. 2:
In this embodiment of the present application, the environment colour temperatures can be weighted-averaged according to colour temperature calculation formula (1) to obtain the colour temperature of the current environment. Colour temperature calculation formula (1) is:
T = Σ_{m=1}^{M} Σ_{k=1}^{K} w_{m,k} · T_{m,k}, with Σ_{m=1}^{M} Σ_{k=1}^{K} w_{m,k} = 1 (1)
where T is the colour temperature of the current environment, M is the number of cameras of the terminal device, K is the preset frame number, T_{m,1} (m = 1, ..., M) is the environment colour temperature corresponding to the image currently captured by camera m, T_{m,2} is the environment colour temperature corresponding to the frame immediately preceding the image currently captured by camera m, T_{m,K} is the environment colour temperature corresponding to the K-th most recent frame captured by camera m, and w_{m,k} is the weight assigned to each environment colour temperature.
As shown in Fig. 2, the terminal device includes 3 cameras (M = 3), namely camera 1, camera 2, and camera 3, and each camera continuously captures 20 frames (K = 20). T_{1,1}, T_{2,1}, and T_{3,1} are the environment colour temperatures corresponding to the images currently captured by the 3 cameras; T_{1,2}, T_{2,2}, and T_{3,2} are the environment colour temperatures corresponding to the frames immediately preceding the currently captured images; and T_{1,20}, T_{2,20}, and T_{3,20} are the environment colour temperatures corresponding to the 20th most recent frames captured by the 3 cameras. When calculating the colour temperature of the current environment, for the frames captured by each camera, the longer ago a frame was captured relative to the current frame, the weaker its correlation with the current environment. The weights corresponding to T_{1,1}, T_{2,1}, and T_{3,1} can therefore be chosen as larger values, and the weights corresponding to T_{1,20}, T_{2,20}, and T_{3,20} as smaller values, so that the colour temperature of the current environment is estimated more accurately.
In step S103, white balance adjustment is performed on a target image according to the colour temperature, where the target image is one or more of the images currently captured by the cameras;
In this embodiment of the present application, "colour temperature-pixel correction value" correspondence information can be stored in the terminal device in advance. Then, according to the colour temperature of the current environment estimated in step S102 and the prestored "colour temperature-pixel correction value" correspondence information, the pixel value of each pixel in the target image is corrected. The target image may be the image currently displayed on the display screen of the terminal device.
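A minimal sketch of such a correction step is given below. The table entries and the per-channel-gain interpretation of the "pixel correction value" are assumptions of this illustration; the application does not specify the table's contents:

```python
# Hypothetical prestored "colour temperature -> per-channel gain" table.
CORRECTION_TABLE = {
    3000: (0.75, 1.00, 1.30),  # warm light: lift blue, damp red
    5000: (1.00, 1.00, 1.00),
    7000: (1.25, 1.00, 0.80),  # cool light: lift red, damp blue
}

def correct_pixel(pixel, colour_temp):
    """Apply the gains of the stored colour temperature nearest the estimate."""
    nearest = min(CORRECTION_TABLE, key=lambda t: abs(t - colour_temp))
    gains = CORRECTION_TABLE[nearest]
    return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))
```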
Embodiment 1 of the present application provides an image processing method. When a user has just entered another colour temperature environment, because images captured by multiple cameras are used when estimating the colour temperature of the current environment, the proportion of current-environment images among those used is larger than with the traditional colour temperature estimation method. The technical solution provided by the present application therefore allows the colour temperature of the current environment to be estimated more accurately when the user has just moved from one colour temperature environment into another.
Embodiment two
The image processing method provided by Embodiment 2 of the present application is applied to a terminal device including multiple cameras. Referring to Fig. 3, the image processing method in Embodiment 2 includes:
In step S301, a preset number of frames continuously captured by each camera is obtained, where the preset number of frames captured by each camera includes the image currently captured by that camera;
In Embodiment 2, step S301 is identical to step S101 in Embodiment 1; for details, refer to the description in Embodiment 1, which is not repeated here.
In step S302, an image is selected from the obtained images and the selected image is divided into multiple regions;
In this embodiment of the present application, after the frames captured by each camera are obtained through step S301, an image can be selected arbitrarily from the obtained images, and the selected image can then be divided into multiple regions. These may be multiple rectangular regions or regions of other shapes; the present application does not limit this. Suppose the terminal device includes 3 cameras; after obtaining the 20 frames continuously captured by each camera, the current frame 401 captured by camera 2 is selected from the 60 obtained frames, and the image 401 is divided into 6 × 6 rectangular regions.
In step S303, it is determined, according to a correspondence table, whether a white region exists in the selected image; if so, step S304 is executed; otherwise, step S305 is executed;
In the technical solution provided by Embodiment 2, the terminal device prestores a correspondence table which records, for each colour temperature, the pixel value of a white pixel under that colour temperature. Under different colour temperatures, white pixels take on different colours, that is, have different pixel values: for example, a white pixel may appear bluish under a high colour temperature and yellowish under a low colour temperature. The pixel values of white pixels under different colour temperatures can be recorded in advance, and the correspondence table recording "colour temperature-white pixel value" can be saved to memory before the terminal device leaves the factory. Fig. 4 shows a schematic diagram of the correspondence table 402 provided by this embodiment of the present application.
Specifically, the procedure of Fig. 5 can be used to determine whether a white region exists in the selected image:
In step S501, the average pixel value corresponding to each region in the selected image is calculated, where the average pixel value corresponding to a region is the average of the pixel values of all pixels in that region;
As shown in Fig. 4, the averages of the R values, G values, and B values of all pixels in the first region 4011 of image 401 are calculated, so as to obtain the average pixel value of region 4011, i.e. (R̄, Ḡ, B̄). All regions of image 401 are traversed to obtain the average pixel value corresponding to each region.
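Steps S302 and S501 together amount to splitting the image into a grid and averaging each cell. A minimal Python sketch, representing the image as a list of rows of (R, G, B) tuples (an assumption of this illustration), is:

```python
def region_averages(image, grid=6):
    """Split an H x W image (rows of (R, G, B) tuples) into grid x grid
    rectangular regions and return each region's average pixel value
    (R̄, Ḡ, B̄), in row-major order."""
    h, w = len(image), len(image[0])
    averages = []
    for gy in range(grid):
        for gx in range(grid):
            y0, y1 = gy * h // grid, (gy + 1) * h // grid
            x0, x1 = gx * w // grid, (gx + 1) * w // grid
            pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            n = len(pixels)
            averages.append(tuple(sum(p[c] for p in pixels) / n for c in range(3)))
    return averages
```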
In step S502, the stored pixel value corresponding to each region is obtained according to the region's average pixel value, where the stored pixel value corresponding to a region is the pixel value stored in the correspondence table that is closest to the region's average pixel value;
As shown in Fig. 4, the pixel value closest to the average pixel value (R̄, Ḡ, B̄) of region 4011 is looked up in the correspondence table 402, and the pixel value found is determined as the stored pixel value of region 4011. Specifically, the distances between (R̄, Ḡ, B̄) of region 4011 and each pixel value in table 402, i.e. R1/G1/B1, R2/G2/B2, and R3/G3/B3, can be calculated, and the pixel value at the smallest distance from (R̄, Ḡ, B̄) is determined as the stored pixel value of region 4011. For example, if R1, G1, and B1 are at the smallest distance from (R̄, Ḡ, B̄), the stored pixel value of region 4011 is R1, G1, B1. All regions in image 401 are traversed to obtain the stored pixel value of each region. In addition, if multiple pixel values in the correspondence table are equally close to a region's average pixel value, any one of them can be chosen as the stored pixel value of that region.
In step S503, the distance value corresponding to each region is determined, where the distance value corresponding to a region is the distance between the region's average pixel value and its corresponding stored pixel value;
As shown in Fig. 4, if step S502 obtains R1, G1, B1 as the stored pixel value of region 4011, the distance between the average pixel value (R̄, Ḡ, B̄) of region 4011 and R1, G1, B1 is determined as the distance value of region 4011. All regions of image 401 are traversed to obtain the distance value of each region.
In step S504, regions whose distance value is less than a preset distance are determined as white regions, and regions whose distance value is greater than or equal to the preset distance are determined as non-white regions;
If the average pixel value of a region differs too much from its corresponding stored pixel value, the probability that the region is white is small, so the region can be determined as a non-white region. All regions of the selected image are traversed to determine whether any white region exists in the selected image.
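Steps S502-S504 reduce to a nearest-neighbour lookup followed by a distance threshold. The sketch below uses Euclidean distance in RGB and a hypothetical table and preset distance, since the application does not fix either:

```python
# Hypothetical "colour temperature -> white pixel value" correspondence table.
WHITE_TABLE = {
    3000: (255, 230, 190),  # white looks yellowish under low colour temperature
    5000: (250, 250, 250),
    7000: (210, 225, 255),  # white looks bluish under high colour temperature
}

def classify_region(avg_pixel, preset_distance=60.0):
    """Return (is_white, colour_temp_of_nearest_stored_white_value)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    temp, stored = min(WHITE_TABLE.items(), key=lambda kv: dist(avg_pixel, kv[1]))
    return dist(avg_pixel, stored) < preset_distance, temp
```

The returned colour temperature is the one later used in step S304 when the region is classified as white.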
In addition, in this embodiment of the present application, the method of determining, according to the correspondence table, whether a white region exists in the selected image is not limited to steps S501-S504; other methods can also be used. As shown in Fig. 6, the colour histogram of each region in the selected image can be calculated, and the preset colour histograms corresponding to the pixel values under each colour temperature in the correspondence table can be obtained. The similarity between the colour histogram of each region in the selected image and each preset colour histogram is then calculated. For any region, if every similarity corresponding to that region is less than a preset similarity, the region is a non-white region; if there exists a preset colour histogram whose similarity is greater than the preset similarity, the region is a white region.
In step S304, the environment colour temperature corresponding to each white region is determined according to the correspondence table, and the environment colour temperature corresponding to the selected image is determined from the environment colour temperatures corresponding to the white regions;
In this embodiment of the present application, if it is determined through steps S501-S504 that a white region exists in the selected image, the colour temperature corresponding to each white region's stored pixel value can be looked up in the correspondence table, and each colour temperature found is determined as the environment colour temperature of the corresponding white region. The environment colour temperatures corresponding to the white regions can be weighted-averaged to determine the environment colour temperature corresponding to the selected image; alternatively, the maximum and minimum among the environment colour temperatures corresponding to the white regions can be discarded, and the average of the remaining environment colour temperatures determined as the environment colour temperature corresponding to the selected image.
In step S305, the environment colour temperature corresponding to the selected image is determined according to the location of the terminal device, the current weather conditions and the current time.
If no white area exists in the selected image, the environment colour temperature corresponding to the selected image can be determined from the location of the terminal device, the current weather conditions and the current time. For example, if the terminal device is outdoors, the weather is overcast and the current time is 4 p.m., the colour temperature of the current environment can be considered relatively high.
In the embodiment of the present application, correspondence information between each location, each weather condition, each time and an environment colour temperature can be stored in advance in the terminal device, so that the colour temperature of the current environment can be determined from the pre-stored correspondence information.
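A minimal sketch of such a pre-stored correspondence is a lookup table keyed by (location type, weather, time of day). The key structure and every value below are illustrative placeholders; the patent does not give the table's contents.

```python
# Illustrative pre-stored correspondence between environment conditions and
# an environment colour temperature in kelvin. Values are placeholders.

FALLBACK_COLOUR_TEMP = {
    ("outdoor", "overcast", "afternoon"): 7000,
    ("outdoor", "sunny", "noon"): 5500,
    ("indoor", "any", "evening"): 3000,
}

def fallback_colour_temp(location, weather, time_of_day, default=6500):
    """Look up the colour temperature when no white area is found."""
    return FALLBACK_COLOUR_TEMP.get((location, weather, time_of_day), default)
```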
In step S306, it is judged whether all acquired images have been traversed; if so, step S308 is executed, otherwise step S307 is executed.
In the embodiment of the present application, the environment colour temperature corresponding to each image acquired in step S301 needs to be determined. It is therefore necessary to judge whether all acquired images have been traversed; if not, another image is chosen and its environment colour temperature is determined in turn.
In step S307, an image is chosen from the remaining acquired images and divided into multiple regions, and the process returns to step S303.
In the embodiment of the present application, if step S306 judges that not all acquired images have been traversed, another image is chosen from the remaining acquired images and divided into multiple regions, and the process returns to step S303 to determine the environment colour temperature corresponding to the image selected in step S307.
In step S308, the environment colour temperatures are weighted and averaged to obtain the colour temperature of the current environment.
In step S309, white balance adjustment is performed on a target image according to the above colour temperature, the target image being one or more of the images currently acquired by the cameras.
In embodiment two of the present application, steps S308-S309 have already been described in embodiment one; for details, refer to embodiment one, which is not repeated here.
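The text does not spell out how the estimated colour temperature drives the adjustment in step S309. A hedged sketch, assuming the mapping table's white pixel value for the estimated colour temperature is used to derive per-channel gains (the `WHITE_POINT` values and the green-channel normalisation below are illustrative assumptions, not the patent's method):

```python
# Hedged sketch of a colour-temperature-driven white balance adjustment:
# scale channels so the reference white under the estimated colour
# temperature renders as neutral grey. Table values are placeholders.

WHITE_POINT = {  # colour temperature (K) -> (r, g, b) of white under it
    3000: (255, 180, 107),
    6500: (255, 249, 253),
}

def white_balance(image, colour_temp):
    """image: list of (r, g, b) tuples; returns the adjusted pixels."""
    wr, wg, wb = WHITE_POINT[colour_temp]
    gains = (wg / wr, 1.0, wg / wb)  # normalise against the green channel
    return [tuple(min(255, round(c * g)) for c, g in zip(px, gains))
            for px in image]
```

Under this sketch, a pixel equal to the 3000 K reference white maps to a neutral grey after adjustment.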
Embodiment two of the present application gives a specific method of determining the environment colour temperature of each acquired image. When the user has just entered an environment with a different colour temperature, the estimate of the current colour temperature uses images acquired by multiple cameras, so among the images used, the proportion taken under the current environment is relatively larger than in traditional colour temperature estimation methods. The technical solution provided herein can therefore estimate the colour temperature of the current environment more accurately when the user has just moved from one colour temperature environment into another.
It should be understood that the numbering of the steps in the above embodiments does not imply an execution order; the execution order of each process should be determined by its function and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present application.
Embodiment three
Embodiment three of the present application provides an image processing apparatus; for ease of illustration, only the parts relevant to the present application are shown. As shown in Fig. 7, the image processing apparatus 700 includes:
an image acquisition module 701, configured to obtain the preset number of frames continuously acquired by each camera, wherein the preset number of frames continuously acquired by each camera includes the image currently acquired by that camera;
a colour temperature estimation module 702, configured to estimate the colour temperature of the current environment according to the acquired images;
a white balance adjustment module 703, configured to perform white balance adjustment on a target image according to the above colour temperature, the target image being one or more of the images currently acquired by the cameras.
Optionally, the colour temperature estimation module 702 includes:
a colour temperature acquisition unit, configured to obtain the environment colour temperature corresponding to each frame image among the acquired images;
a weighted averaging unit, configured to weight and average the environment colour temperatures to obtain the colour temperature of the current environment.
Optionally, the weighted averaging unit is specifically configured to: weight and average the environment colour temperatures according to a colour temperature calculation formula to obtain the colour temperature of the current environment, the colour temperature calculation formula being:
T = Σ_{m=1}^{M} Σ_{k=1}^{K} w_{m,k} · T_{m,k}, with Σ_{m=1}^{M} Σ_{k=1}^{K} w_{m,k} = 1,
wherein T is the colour temperature of the current environment, M is the number of cameras of the terminal device, K is the preset number of frames, T_{m,1} (m = 1, ..., M) is the environment colour temperature corresponding to the image currently collected by the m-th camera, T_{m,2} is that of the previous frame of the m-th camera's current image, and so on up to T_{m,K} for the earliest of the K frames, and w_{m,k} are the corresponding weights.
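In code, this weighted average over M cameras and K frames per camera can be sketched as follows; the function name and the list-of-lists layout are illustrative, and the requirement that the weights sum to 1 follows from the weighted-average reading of the text.

```python
# Sketch of the colour-temperature calculation over M cameras x K frames:
# T = sum over m, k of w[m][k] * temps[m][k], with the weights summing to 1.

def current_colour_temp(temps, weights):
    """temps[m][k]: environment colour temperature of camera m's k-th most
    recent frame (k = 0 is the current frame); weights has the same shape
    and its entries sum to 1."""
    assert abs(sum(sum(row) for row in weights) - 1.0) < 1e-9
    return sum(w * t
               for trow, wrow in zip(temps, weights)
               for t, w in zip(trow, wrow))
```

Giving the current frames larger weights than older frames biases the estimate toward the environment the user is in now.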
Optionally, a preset mapping table is saved in the terminal device, the mapping table recording correspondence information between each colour temperature and the pixel value of a white pixel under that colour temperature, wherein each colour temperature corresponds to one pixel value. Correspondingly, the colour temperature acquisition unit includes:
a region division subunit, configured to choose an image from the acquired images and divide the selected image into multiple regions;
a white area determination subunit, configured to determine, according to the mapping table, whether each region in the selected image is a white area;
a first colour temperature determination subunit, configured to, if one or more regions in the selected image are white areas, determine the environment colour temperature corresponding to each white area according to the mapping table, and determine the environment colour temperature corresponding to the selected image according to the environment colour temperatures of the white areas;
a second colour temperature determination subunit, configured to, if no region in the selected image is a white area, determine the environment colour temperature corresponding to the selected image according to the location of the terminal device, the current weather conditions and the current time;
a traversal subunit, configured to traverse all acquired images and obtain the environment colour temperature corresponding to each image.
Optionally, the white area determination subunit includes:
a pixel averaging subunit, configured to calculate the pixel value average corresponding to each region in the selected image, wherein the pixel value average corresponding to a region is the average of the pixel values of all pixels in that region;
a stored pixel value subunit, configured to obtain, according to each region's pixel value average, the stored pixel value corresponding to that region, wherein the stored pixel value corresponding to a region is the pixel value stored in the mapping table that is closest to the region's pixel value average;
a distance value determination subunit, configured to determine the distance value corresponding to each region, wherein the distance value corresponding to a region is the distance between the region's pixel value average and its corresponding stored pixel value;
a white determination subunit, configured to determine regions whose distance value is less than a preset distance as white areas, and regions whose distance value is greater than or equal to the preset distance as non-white regions.
Correspondingly, the first colour temperature determination subunit is specifically configured to: if one or more regions in the selected image are white areas, look up in the mapping table the colour temperature corresponding to each white area's stored pixel value, determine each colour temperature found as the environment colour temperature of the corresponding white area, and determine the environment colour temperature corresponding to the selected image according to the environment colour temperatures of the white areas.
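The distance-based classification pipeline just described (region mean, nearest stored white pixel value, distance threshold, colour temperature lookup) can be sketched as follows. Euclidean RGB distance and the threshold value are assumptions; the patent does not fix a metric.

```python
# Sketch of the distance test for white areas: compare each region's mean
# pixel value with the nearest stored white pixel value in the mapping
# table, and treat the region as white when the distance is small enough.
import math

def region_mean(pixels):
    """Mean (r, g, b) of a region's pixels."""
    n = len(pixels)
    return tuple(sum(px[c] for px in pixels) / n for c in range(3))

def nearest_stored(mean_px, table):
    """table: colour temperature -> stored white (r, g, b);
    returns the (colour_temp, stored_pixel) pair nearest to mean_px."""
    return min(table.items(), key=lambda item: math.dist(mean_px, item[1]))

def classify(pixels, table, max_dist=30.0):
    """Return the white area's colour temperature, or None if non-white."""
    mean_px = region_mean(pixels)
    ct, stored = nearest_stored(mean_px, table)
    return ct if math.dist(mean_px, stored) < max_dist else None
```

A region that is white under some illuminant lands near one stored white point and directly yields that illuminant's colour temperature; any other region fails the distance test.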
Optionally, the white balance adjustment module 703 includes:
a target image determination unit, configured to determine the image currently displayed by the display screen of the terminal device as the target image;
a target image adjustment unit, configured to perform white balance adjustment on the target image according to the above colour temperature.
It should be noted that, since the information exchange and execution processes between the above apparatus/units are based on the same concept as the method embodiments of the present application, their specific functions and technical effects can be found in the method embodiment part and are not repeated here.
Embodiment four
Fig. 8 is a schematic diagram of the terminal device provided by embodiment four of the present application. As shown in Fig. 8, the terminal device 8 of this embodiment includes a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and runnable on the processor 80. When executing the computer program 82, the processor 80 implements the steps in each of the above method embodiments, such as steps S101 to S103 shown in Fig. 1; alternatively, the processor 80 implements the functions of each module/unit in the above apparatus embodiments, such as the functions of modules 701 to 703 shown in Fig. 7.
Illustratively, the computer program 82 can be divided into one or more modules/units, which are stored in the memory 81 and executed by the processor 80 to complete the present application. The one or more modules/units can be a series of computer program instruction segments capable of completing specific functions, the instruction segments being used to describe the execution process of the computer program 82 in the terminal device 8. For example, the computer program 82 can be divided into an image acquisition module, a colour temperature estimation module and a white balance adjustment module, whose specific functions are as follows:
obtaining the preset number of frames continuously acquired by each camera, wherein the preset number of frames continuously acquired by each camera includes the image currently acquired by that camera;
estimating the colour temperature of the current environment according to the acquired images;
performing white balance adjustment on a target image according to the above colour temperature, the target image being one or more of the images currently acquired by the cameras.
The terminal device may include, but is not limited to, the processor 80 and the memory 81. Those skilled in the art can understand that Fig. 8 is only an example of the terminal device 8 and does not constitute a limitation on it; the terminal device may include more or fewer components than illustrated, combine certain components, or use different components. For example, the terminal device may also include input/output devices, network access devices, buses, etc.
The processor 80 can be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor can be a microprocessor, or the processor can be any conventional processor.
The memory 81 can be an internal storage unit of the terminal device 8, such as a hard disk or memory of the terminal device 8. The memory 81 can also be an external storage device of the terminal device 8, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card equipped on the terminal device 8. Further, the memory 81 can include both the internal storage unit and the external storage device of the terminal device 8. The memory 81 is used to store the computer program and other programs and data needed by the terminal device, and can also be used to temporarily store data that has been output or will be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the division of the above functional units and modules is only used as an example. In practical applications, the above functions can be allocated to different functional units and modules as needed; that is, the internal structure of the above apparatus can be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit; the integrated unit can be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not used to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference can be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, each embodiment is described with its own emphasis; for parts not detailed or recorded in a certain embodiment, reference can be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or software depends on the specific application and design constraints of the technical solution. Professionals can use different methods to implement the described functions for each specific application, but such implementation should not be considered beyond the scope of the present application.
In the embodiments provided by the present application, it should be understood that the disclosed apparatus/terminal device and method can be implemented in other ways. For example, the apparatus/terminal device embodiments described above are only schematic; the division of the above modules or units is only a logical function division, and there can be other division manners in actual implementation. For example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed can be an indirect coupling or communication connection through some interfaces, apparatuses or units, and can be electrical, mechanical or in other forms.
The units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they can be located in one place or distributed over multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
If the above integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the above embodiment methods of the present application can also be completed by instructing the relevant hardware through a computer program. The computer program can be stored in a computer-readable storage medium, and when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program includes computer program code, which can be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include any entity or apparatus capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content included in the computer-readable medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments, or replace some of the technical features with equivalents; and these modifications or replacements, which do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present application, should all be included within the protection scope of the present application.
Claims (10)
1. An image processing method, characterized in that it is applied to a terminal device including multiple cameras, the image processing method comprising:
obtaining the preset number of frames continuously acquired by each camera, wherein the preset number of frames continuously acquired by each camera includes the image currently acquired by that camera;
estimating the colour temperature of the current environment according to the acquired images;
performing white balance adjustment on a target image according to the colour temperature, the target image being one or more of the images currently acquired by the cameras.
2. The image processing method according to claim 1, characterized in that the estimating the colour temperature of the current environment according to the acquired images comprises:
obtaining the environment colour temperature corresponding to each frame image among the acquired images;
weighting and averaging the environment colour temperatures to obtain the colour temperature of the current environment.
3. The image processing method according to claim 2, characterized in that the weighting and averaging the environment colour temperatures to obtain the colour temperature of the current environment comprises:
weighting and averaging the environment colour temperatures according to a colour temperature calculation formula to obtain the colour temperature of the current environment, the colour temperature calculation formula being:
T = Σ_{m=1}^{M} Σ_{k=1}^{K} w_{m,k} · T_{m,k}, with Σ_{m=1}^{M} Σ_{k=1}^{K} w_{m,k} = 1,
wherein T is the colour temperature of the current environment, M is the number of cameras of the terminal device, K is the preset number of frames, T_{m,1} is the environment colour temperature corresponding to the image currently collected by the m-th camera, T_{m,2} is that of the previous frame of the m-th camera's current image, and so on up to T_{m,K} for the earliest of the K frames, and w_{m,k} are the corresponding weights.
4. The image processing method according to claim 2, characterized in that a preset mapping table is saved in the terminal device, the mapping table recording correspondence information between each colour temperature and the pixel value of a white pixel under that colour temperature, wherein each colour temperature corresponds to one pixel value;
correspondingly, the obtaining the environment colour temperature corresponding to each frame image among the acquired images comprises:
choosing an image from the acquired images and dividing the selected image into multiple regions;
determining, according to the mapping table, whether each region in the selected image is a white area;
if one or more regions in the selected image are white areas, determining the environment colour temperature corresponding to each white area according to the mapping table, and determining the environment colour temperature corresponding to the selected image according to the environment colour temperatures of the white areas;
if no region in the selected image is a white area, determining the environment colour temperature corresponding to the selected image according to the location of the terminal device, the current weather conditions and the current time;
traversing all acquired images to obtain the environment colour temperature corresponding to each image.
5. The image processing method according to claim 4, characterized in that the determining, according to the mapping table, whether each region in the selected image is a white area comprises:
calculating the pixel value average corresponding to each region in the selected image, wherein the pixel value average corresponding to a region is the average of the pixel values of all pixels in that region;
obtaining, according to each region's pixel value average, the stored pixel value corresponding to that region, wherein the stored pixel value corresponding to a region is the pixel value stored in the mapping table that is closest to the region's pixel value average;
determining the distance value corresponding to each region, wherein the distance value corresponding to a region is the distance between the region's pixel value average and its corresponding stored pixel value;
determining regions whose distance value is less than a preset distance as white areas, and regions whose distance value is greater than or equal to the preset distance as non-white regions;
correspondingly, the determining the environment colour temperature corresponding to each white area according to the mapping table comprises:
looking up in the mapping table the colour temperature corresponding to each white area's stored pixel value, and determining each colour temperature found as the environment colour temperature of the corresponding white area.
6. The image processing method according to any one of claims 1 to 5, characterized in that the performing white balance adjustment on a target image according to the colour temperature, the target image being one or more of the images currently acquired by the cameras, comprises:
determining the image currently displayed by the display screen of the terminal device as the target image;
performing white balance adjustment on the target image according to the colour temperature.
7. An image processing apparatus, characterized in that it is applied to a terminal device including multiple cameras, the image processing apparatus comprising:
an image acquisition module, configured to obtain the preset number of frames continuously acquired by each camera, wherein the preset number of frames continuously acquired by each camera includes the image currently acquired by that camera;
a colour temperature estimation module, configured to estimate the colour temperature of the current environment according to the acquired images;
a white balance adjustment module, configured to perform white balance adjustment on a target image according to the colour temperature, the target image being one or more of the images currently acquired by the cameras.
8. The image processing apparatus according to claim 7, characterized in that the colour temperature estimation module comprises:
a colour temperature acquisition unit, configured to obtain the environment colour temperature corresponding to each frame image among the acquired images;
a weighted averaging unit, configured to weight and average the environment colour temperatures to obtain the colour temperature of the current environment.
9. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810965516.8A CN108965835B (en) | 2018-08-23 | 2018-08-23 | Image processing method, image processing device and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108965835A true CN108965835A (en) | 2018-12-07 |
CN108965835B CN108965835B (en) | 2019-12-27 |
Family
ID=64473637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810965516.8A Active CN108965835B (en) | 2018-08-23 | 2018-08-23 | Image processing method, image processing device and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108965835B (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101283604A (en) * | 2005-08-30 | 2008-10-08 | 诺基亚公司 | Image processing device with automatic white balance |
CN103051804A (en) * | 2012-12-28 | 2013-04-17 | 广东欧珀移动通信有限公司 | Intelligent photo taking method and system of mobile terminal |
US20150350620A1 (en) * | 2014-05-30 | 2015-12-03 | Canon Kabushiki Kaisha | Image pickup apparatus that performs white balance control and method of controlling the same |
CN104320642A (en) * | 2014-10-11 | 2015-01-28 | 广东欧珀移动通信有限公司 | Picture processing method and device |
CN106713887A (en) * | 2017-01-03 | 2017-05-24 | 捷开通讯(深圳)有限公司 | Mobile terminal, and white balance adjustment method |
CN107371007A (en) * | 2017-07-25 | 2017-11-21 | 广东欧珀移动通信有限公司 | White balancing treatment method, device and terminal |
CN107911682A (en) * | 2017-11-28 | 2018-04-13 | 广东欧珀移动通信有限公司 | Image white balancing treatment method, device, storage medium and electronic equipment |
CN107959851A (en) * | 2017-12-25 | 2018-04-24 | 广东欧珀移动通信有限公司 | Colour temperature detection method and device, computer-readable recording medium and computer equipment |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110930455A (en) * | 2019-11-29 | 2020-03-27 | Shenzhen Ubtech Technology Co., Ltd. | Positioning method, positioning device, terminal equipment and storage medium |
CN110930455B (en) * | 2019-11-29 | 2023-12-29 | Shenzhen Ubtech Technology Co., Ltd. | Positioning method, positioning device, terminal equipment and storage medium |
CN111551265A (en) * | 2020-04-03 | 2020-08-18 | Shenzhen Aputure Imaging Industries Co., Ltd. | Color temperature measuring method and color temperature measuring device |
CN113542711A (en) * | 2020-04-14 | 2021-10-22 | Qingdao Hisense Mobile Communication Technology Co., Ltd. | Image display method and terminal |
CN111800568A (en) * | 2020-08-06 | 2020-10-20 | Gree Electric Appliances, Inc. of Zhuhai | Light supplement method and device |
CN111800568B (en) * | 2020-08-06 | 2021-11-05 | Gree Electric Appliances, Inc. of Zhuhai | Light supplement method and device |
CN112087611A (en) * | 2020-09-07 | 2020-12-15 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Electronic equipment and display screen adjusting method thereof |
CN113676663A (en) * | 2021-08-13 | 2021-11-19 | Huizhou TCL Cloud Internet Corp. Technology Co., Ltd. | Camera white balance adjusting method and device, storage medium and terminal equipment |
CN114554170A (en) * | 2022-03-08 | 2022-05-27 | Samsung Semiconductor (China) R&D Co., Ltd. | Method of multi-sensor white balance synchronization and electronic device using the same |
CN114554170B (en) * | 2022-03-08 | 2024-06-11 | Samsung Semiconductor (China) R&D Co., Ltd. | Method for multi-sensor white balance synchronization and electronic device using same |
CN117995137A (en) * | 2024-04-07 | 2024-05-07 | Honor Device Co., Ltd. | Method for adjusting color temperature of display screen, electronic equipment and related medium |
Also Published As
Publication number | Publication date |
---|---|
CN108965835B (en) | 2019-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108965835A (en) | A kind of image processing method, image processing apparatus and terminal device | |
US11657609B2 (en) | Terminal device, information processing device, object identifying method, program, and object identifying system | |
US12020474B2 (en) | Image processing apparatus, image processing method, and non-transitory computer-readable storage medium | |
CN110113534B (en) | Image processing method, image processing device and mobile terminal | |
CN106898026B (en) | A kind of the dominant hue extracting method and device of picture | |
CN109064390A (en) | A kind of image processing method, image processing apparatus and mobile terminal | |
US11720954B2 (en) | Image-based listing using image of multiple items | |
CN110175980A (en) | Image definition recognition methods, image definition identification device and terminal device | |
CN108769634B (en) | Image processing method, image processing device and terminal equipment | |
US20220254143A1 (en) | Method and apparatus for determining item name, computer device, and storage medium | |
CN105069042A (en) | Content-based data retrieval methods for unmanned aerial vehicle spying images | |
CN104081307A (en) | Image processing apparatus, image processing method, and program | |
CN110533694A (en) | Image processing method, device, terminal and storage medium | |
CN105354792A (en) | Method for trying virtual glasses and mobile terminal | |
CN109377502A (en) | A kind of image processing method, image processing apparatus and terminal device | |
JP2017182628A (en) | Augmented reality user interface application device and control method | |
CN104125397B (en) | A kind of data processing method and electronic equipment | |
CN113676713A (en) | Image processing method, apparatus, device and medium | |
CN111598149B (en) | Loop detection method based on attention mechanism | |
CN108764139A (en) | A kind of method for detecting human face, mobile terminal and computer readable storage medium | |
CN108932703A (en) | Image processing method, picture processing unit and terminal device | |
CN114066814A (en) | Gesture 3D key point detection method of AR device and electronic device | |
CN101860680B (en) | Image processing device and image processing method thereof | |
CN108763491B (en) | Picture processing method and device and terminal equipment | |
US20230131717A1 (en) | Search processing device, search processing method, and computer program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||