CN103869977B - Image display method, device and electronic device - Google Patents

Image display method, device and electronic device

Info

Publication number
CN103869977B
Authority
CN
China
Prior art keywords
degree
pixel
sample point
depth
close
Prior art date
Legal status
Active
Application number
CN201410056786.9A
Other languages
Chinese (zh)
Other versions
CN103869977A (en)
Inventor
王琳
陈志军
张涛
Current Assignee
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date
Filing date
Publication date
Application filed by Xiaomi Inc
Priority to CN201410056786.9A
Publication of CN103869977A
Application granted
Publication of CN103869977B

Landscapes

  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides an image display method, device and electronic device, belonging to the field of image processing. The method comprises: collecting eye movement information of a user while the user browses a displayed image; determining, according to the eye movement information, the focus point in the image that the user's eyes correspond to; determining an image processing region according to the focus point; processing the image processing region according to a predetermined image processing mode; and refreshing the display with the processed image. The disclosure solves the problem that the image display mode described in the background section lacks interactivity and yields an unsatisfactory display effect. Compared with that mode, this embodiment determines the user's focus point in the image by collecting the user's eye movement information in real time while the image is browsed, which both reflects the user's intent and improves the interactivity of the display process, thereby improving the display effect.

Description

Image display method, device and electronic device
Technical field
The present disclosure relates to the field of image processing, and in particular to an image display method, device and electronic device.
Background art
To improve the display effect of images, some application software provides multiple image processing functions, including image segmentation, brightness adjustment and background blurring.
Take background blurring as an example: background blurring refers to processing an image so that its background is blurred while its foreground is displayed clearly. Refer to Fig. 1, which shows an image taken by the camera of an electronic device such as a mobile phone or tablet computer. Suppose the region outside the object at the lower left of the image needs background blurring, with that object displayed clearly as the foreground. First a focus point L is selected, usually at the center of the object; then the non-blurred region 11 is determined according to the distance between each pixel in the image and the focus point L. Specifically, the pixels whose distance to the focus point L is less than a threshold a form the non-blurred region 11 (the region enclosed by the dotted line in the figure); finally the region outside the non-blurred region 11 is blurred and the result is displayed.
In the course of realizing the present disclosure, the inventors found that the above approach has at least the following defect: the image display mode lacks interactivity. For a given image, the focus point is fixed once selected, so after the image is processed its display effect is also fixed. The above image display mode therefore lacks interactivity, and the display effect is not ideal.
Summary of the invention
To solve the problem that the above image display mode lacks interactivity and yields an unsatisfactory display effect, embodiments of the present disclosure provide an image display method, device and electronic device. The technical scheme is as follows:
In a first aspect, an image display method is provided, the method comprising:
collecting eye movement information of a user while the user browses a displayed image;
determining, according to the eye movement information, the focus point in the image that the user's eyes correspond to;
for each pixel in the image, determining the spatial proximity P_S of the pixel according to the distance between the pixel and the focus point, determining the depth proximity P_D of the pixel according to the difference between the depth value of the pixel and the depth value of the focus point, and determining the type of the pixel according to the spatial proximity P_S and the depth proximity P_D, the type comprising background sample point and foreground sample point;
taking the region corresponding to the background sample points and/or the foreground sample points as the image processing region;
processing the image processing region according to a predetermined image processing mode;
refreshing the display with the processed image;
wherein determining the spatial proximity P_S of the pixel according to the distance between the pixel and the focus point comprises: calculating the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²; and calculating the spatial proximity P_S according to the distance S, the natural constant e and a first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²));
and determining the depth proximity P_D of the pixel according to the difference between the depth value of the pixel and the depth value of the focus point comprises: calculating the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L); and calculating the depth proximity P_D according to the difference R, the natural constant e and a second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
Optionally, determining the type of the pixel according to the spatial proximity P_S and the depth proximity P_D comprises:
obtaining the final proximity P of the pixel according to the spatial proximity P_S and the depth proximity P_D;
comparing the final proximity P with a predetermined threshold;
if the final proximity P is less than the predetermined threshold, determining that the type of the pixel is background sample point;
if the final proximity P is greater than the predetermined threshold, determining that the type of the pixel is foreground sample point.
Optionally, determining the type of the pixel according to the spatial proximity P_S and the depth proximity P_D comprises:
obtaining the final proximity P of the pixel according to the spatial proximity P_S and the depth proximity P_D;
comparing the final proximity P with a first threshold and a second threshold, wherein the first threshold is less than the second threshold;
if the final proximity P is less than the first threshold, determining that the type of the pixel is background sample point;
if the final proximity P is greater than the second threshold, determining that the type of the pixel is foreground sample point;
if the final proximity P is greater than the first threshold and less than the second threshold, determining that the pixel is an undetermined sample point;
finally determining the type of the undetermined sample point according to the color vector of the undetermined sample point.
Optionally, finally determining the type of the undetermined sample point according to the color vector of the undetermined sample point comprises:
for each undetermined sample point, obtaining the color vector of the undetermined sample point;
calculating, according to the Bayes posterior probability formula, the probability that the undetermined sample point belongs to the foreground sample points and the probability that it belongs to the background sample points;
taking the type corresponding to the larger of the two probabilities as the type of the undetermined sample point.
Optionally, obtaining the final proximity P of the pixel according to the spatial proximity P_S and the depth proximity P_D comprises:
multiplying the spatial proximity P_S by the depth proximity P_D to obtain the final proximity P of the pixel.
Optionally, processing the image processing region according to the predetermined image processing mode comprises:
processing the region corresponding to the background sample points according to a first predetermined image processing mode;
or,
processing the region corresponding to the background sample points according to the first predetermined image processing mode, and processing the region corresponding to the foreground sample points according to a second predetermined image processing mode;
or,
processing the region corresponding to the foreground sample points according to the second predetermined image processing mode;
wherein the first predetermined image processing mode comprises image blurring and the second predetermined image processing mode comprises image sharpening.
In a second aspect, an image display device is provided, the device comprising:
an eye movement collection module, for collecting eye movement information of a user while the user browses a displayed image;
a focus determination module, for determining, according to the eye movement information, the focus point in the image that the user's eyes correspond to;
a region determination module, for determining an image processing region according to the focus point;
an image processing module, for processing the image processing region according to a predetermined image processing mode;
an image display module, for refreshing the display with the processed image;
wherein the region determination module comprises a type division submodule and a region determination submodule;
the type division submodule, for each pixel in the image, determines the type of the pixel according to the distance between the pixel and the focus point and the difference between the depth value of the pixel and the depth value of the focus point, the type comprising background sample point and foreground sample point;
the region determination submodule takes the region corresponding to the background sample points and/or the foreground sample points as the image processing region;
the type division submodule comprises a distance determination unit, a depth determination unit and a type determination unit;
the distance determination unit determines the spatial proximity P_S of the pixel according to the distance between the pixel and the focus point;
the depth determination unit determines the depth proximity P_D of the pixel according to the difference between the depth value of the pixel and the depth value of the focus point;
the type determination unit determines the type of the pixel according to the spatial proximity P_S and the depth proximity P_D;
the distance determination unit comprises a distance calculation subunit and a distance determination subunit;
the distance calculation subunit calculates the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²;
the distance determination subunit calculates the spatial proximity P_S according to the distance S, the natural constant e and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²));
the depth determination unit comprises a depth calculation subunit and a depth determination subunit;
the depth calculation subunit calculates the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L);
the depth determination subunit calculates the depth proximity P_D according to the difference R, the natural constant e and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
Optionally, the type determination unit comprises a proximity determination subunit, a threshold judgment subunit, a background determination subunit and a foreground determination subunit;
the proximity determination subunit obtains the final proximity P of the pixel according to the spatial proximity P_S and the depth proximity P_D;
the threshold judgment subunit compares the final proximity P with a predetermined threshold;
the background determination subunit determines that the type of the pixel is background sample point if the final proximity P is less than the predetermined threshold;
the foreground determination subunit determines that the type of the pixel is foreground sample point if the final proximity P is greater than the predetermined threshold.
Optionally, the type determination unit comprises a proximity determination subunit, a threshold comparison subunit, a first determination subunit, a second determination subunit, a third determination subunit and a final determination subunit;
the proximity determination subunit obtains the final proximity P of the pixel according to the spatial proximity P_S and the depth proximity P_D;
the threshold comparison subunit compares the final proximity P with a first threshold and a second threshold, wherein the first threshold is less than the second threshold;
the first determination subunit determines that the type of the pixel is background sample point if the final proximity P is less than the first threshold;
the second determination subunit determines that the type of the pixel is foreground sample point if the final proximity P is greater than the second threshold;
the third determination subunit determines that the pixel is an undetermined sample point if the final proximity P is greater than the first threshold and less than the second threshold;
the final determination subunit finally determines the type of the undetermined sample point according to the color vector of the undetermined sample point.
Optionally, the final determination subunit is further configured to: for each undetermined sample point, obtain the color vector of the undetermined sample point; calculate, according to the Bayes posterior probability formula, the probability that the undetermined sample point belongs to the foreground sample points and the probability that it belongs to the background sample points; and take the type corresponding to the larger of the two probabilities as the type of the undetermined sample point.
Optionally, the proximity determination subunit is further configured to multiply the spatial proximity P_S by the depth proximity P_D to obtain the final proximity P of the pixel.
Optionally, the image processing module comprises a background processing submodule, or a combined processing submodule, or a foreground processing submodule;
the background processing submodule processes the region corresponding to the background sample points according to the first predetermined image processing mode;
the combined processing submodule processes the region corresponding to the background sample points according to the first predetermined image processing mode and processes the region corresponding to the foreground sample points according to the second predetermined image processing mode;
the foreground processing submodule processes the region corresponding to the foreground sample points according to the second predetermined image processing mode;
wherein the first predetermined image processing mode comprises image blurring and the second predetermined image processing mode comprises image sharpening.
In a third aspect, an electronic device is provided, the electronic device comprising:
one or more processors;
a memory; and
one or more modules, stored in the memory and configured to be executed by the one or more processors, the one or more modules having the following functions:
collecting eye movement information of a user while the user browses a displayed image;
determining, according to the eye movement information, the focus point in the image that the user's eyes correspond to;
for each pixel in the image, determining the spatial proximity P_S of the pixel according to the distance between the pixel and the focus point, determining the depth proximity P_D of the pixel according to the difference between the depth value of the pixel and the depth value of the focus point, and determining the type of the pixel according to the spatial proximity P_S and the depth proximity P_D, the type comprising background sample point and foreground sample point;
taking the region corresponding to the background sample points and/or the foreground sample points as the image processing region;
processing the image processing region according to a predetermined image processing mode;
refreshing the display with the processed image;
wherein determining the spatial proximity P_S of the pixel according to the distance between the pixel and the focus point comprises: calculating the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²; and calculating the spatial proximity P_S according to the distance S, the natural constant e and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²));
and determining the depth proximity P_D of the pixel according to the difference between the depth value of the pixel and the depth value of the focus point comprises: calculating the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L); and calculating the depth proximity P_D according to the difference R, the natural constant e and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
Some beneficial effects of the technical scheme provided by the embodiments of the present disclosure may include:
by collecting eye movement information of the user while the user browses a displayed image, determining from the eye movement information the focus point in the image that the user's eyes correspond to, determining an image processing region according to the focus point, processing the region according to a predetermined image processing mode and then displaying the result, the scheme solves the problem that the image display mode described in the background section lacks interactivity and yields an unsatisfactory display effect. Compared with that mode, this embodiment determines the user's focus point in the image by collecting the user's eye movement information in real time while the image is browsed, which both reflects the user's intent and improves the interactivity of the display process, thereby improving the display effect.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
To illustrate the embodiments of the present disclosure more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the background blurring method provided in the background section;
Fig. 2 is an exemplary flowchart of the image display method provided by an embodiment of the present disclosure;
Fig. 3A is an exemplary flowchart of the image display method provided by another embodiment of the present disclosure;
Fig. 3B is a schematic diagram of the unprocessed image and its depth map involved in the image display method provided by the present disclosure;
Fig. 3C is an exemplary flowchart of step 304 of the image display method provided by another embodiment of the present disclosure;
Fig. 3D is a schematic diagram reflecting the spatial proximity and the depth proximity involved in the image display method provided by the present disclosure;
Fig. 3E is a schematic diagram of the finally determined foreground/background sample points involved in the image display method provided by the present disclosure;
Fig. 3F is a schematic diagram of the processed image involved in the image processing method provided by the present disclosure;
Fig. 3G is a schematic diagram of differently processed images corresponding to different focus points involved in the image processing method provided by the present disclosure;
Fig. 4 is an exemplary structural block diagram of the image display device provided by an embodiment of the present disclosure;
Fig. 5 is an exemplary structural block diagram of the image display device provided by another embodiment of the present disclosure;
Fig. 6 is an exemplary structural schematic diagram of the electronic device involved in the embodiments of the present disclosure.
The above drawings show explicit embodiments of the present disclosure, described in more detail below. These drawings and the text description are not intended to limit the scope of the disclosure in any way, but to illustrate the concepts of the present disclosure to those skilled in the art by reference to specific embodiments.
Detailed description
To make the objects, technical scheme and advantages of the present disclosure clearer, the disclosure is described in further detail below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in the disclosure without creative effort fall within the scope of protection of the disclosure.
In the embodiments of the present disclosure, the electronic device may be a mobile phone, a tablet computer, an e-book reader, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a smart television, etc.
Refer to Fig. 2, which shows an exemplary flowchart of the image display method provided by an embodiment of the present disclosure. This embodiment is illustrated with the image display method applied in an electronic device. The image display method may comprise the following steps:
In step 202, eye movement information of the user while browsing the displayed image is collected.
In step 204, the focus point in the image that the user's eyes correspond to is determined according to the eye movement information.
In step 206, an image processing region is determined according to the focus point.
In step 208, the image processing region is processed according to a predetermined image processing mode.
In step 210, the processed image is displayed by refreshing.
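Read together, steps 202 to 210 form a single capture-analyze-render pass. The sketch below is illustrative only, not code from the patent: the callables (collect_eye_movement, locate_focus and the rest) are hypothetical stand-ins for the individual steps, wired together purely to show the data flow.

```python
from typing import Callable, Tuple
import numpy as np

def run_display_pass(
    image: np.ndarray,
    collect_eye_movement: Callable[[], dict],                        # step 202
    locate_focus: Callable[[dict], Tuple[int, int]],                 # step 204
    select_region: Callable[[Tuple[int, int]], np.ndarray],          # step 206
    process_region: Callable[[np.ndarray, np.ndarray], np.ndarray],  # step 208
    refresh: Callable[[np.ndarray], None],                           # step 210
) -> None:
    """One pass of the method of Fig. 2; call repeatedly to follow the eyes."""
    eye_info = collect_eye_movement()
    focus = locate_focus(eye_info)
    region = select_region(focus)
    refresh(process_region(image, region))
```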
In summary, the image display method provided by this embodiment collects eye movement information of the user while the user browses a displayed image, determines from the eye movement information the focus point in the image that the user's eyes correspond to, determines an image processing region according to the focus point, processes the region according to a predetermined image processing mode, and then displays the result. It solves the problem that the image display mode described in the background section lacks interactivity and yields an unsatisfactory display effect. Compared with that mode, this embodiment determines the user's focus point in the image by collecting the user's eye movement information in real time while the image is browsed, which both reflects the user's intent and improves the interactivity of the display process, thereby improving the display effect.
Refer to Fig. 3A, which shows an exemplary flowchart of the image display method provided by another embodiment of the present disclosure. This embodiment is illustrated with the image display method applied in an electronic device. The image display method may comprise the following steps:
In step 301, an image, its depth information and the correspondence between the two are stored in advance.
The electronic device stores in advance an image, its depth information and the correspondence between the two. The depth information includes the depth value of each pixel in the image, which can be collected by a depth sensor or a parallel binocular camera group. The depth value of a pixel refers to the distance between the subject corresponding to that pixel and the imaging plane of the camera. Suppose the imaging plane of the camera contains mutually perpendicular x- and y-axes, and a three-dimensional Cartesian coordinate system is built by taking as the z-axis the line through the intersection of the x- and y-axes and perpendicular to the imaging plane. If the coordinates of a pixel in this coordinate system are (X, Y, Z), then Z is the depth value of that pixel.
In addition, the electronic device collects the depth value of each pixel in the image by a depth sensor or a parallel binocular camera group. A depth sensor usually includes a light emitter and a light receiver, and computes depth values from the time an emitted light signal takes to travel from the emitter to the receiver. A parallel binocular camera group imitates the human visual system: two cameras capture the subject from different angles, and the depth value of each pixel is obtained after pixel matching, analysis and calculation on the two images.
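As one concrete illustration of the parallel binocular approach, the sketch below estimates per-pixel depth from a rectified stereo pair with OpenCV's block matcher. It is a minimal example under assumed values: the file names, focal length f and baseline B are placeholders, and a real system would also need camera calibration and rectification.

```python
import cv2

# Minimal sketch of the parallel binocular camera approach (assumed inputs).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds, for each pixel, the horizontal shift (disparity)
# between the two views; StereoBM returns fixed-point values scaled by 16.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype("float32") / 16.0

f, B = 700.0, 0.1                      # assumed focal length (px) and baseline (m)
depth = (f * B) / (disparity + 1e-6)   # nearer objects have larger disparity
```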
The depth value of each pixel in the image can be represented as a depth map. Refer to Fig. 3B, and assume the image stored in advance by the electronic device in this embodiment is the one shown on the left of Fig. 3B. The right of Fig. 3B shows the corresponding depth map. Darker parts indicate larger depth values of the corresponding pixels, i.e. farther from the imaging plane; conversely, lighter parts indicate smaller depth values, i.e. nearer to the imaging plane.
In step 302, eye movement information of the user while browsing the displayed image is collected.
While the user browses the image, the electronic device collects the user's eye movement information. The electronic device may collect the eye movement information continuously, or at predetermined time intervals.
The electronic device may collect eye movement information with hardware such as a front-facing camera, an image collection assembly or an infrared assembly, or implement the collection in software. While the user browses the image, the eyes undergo subtle changes that produce extractable features, and the electronic device extracts these features as the eye movement information. Specifically, the eye movement information can be obtained by tracking and recording changes of the eyeball and its surroundings, by tracking and recording changes of the iris angle, or by actively projecting light beams such as infrared rays onto the iris and extracting the resulting variation features.
In step 303, the focus point in the image that the user's eyes correspond to is determined according to the eye movement information.
The electronic device determines, according to the eye movement information, the focus point in the image that the user's eyes correspond to. Based on the features of eye change contained in the eye movement information, the electronic device determines the focus point through analysis, modeling and simulation.
In this embodiment, assume the focus point determined by the electronic device is the point L shown on the left of Fig. 3B.
In step 304, for each pixel in the image, the type of the pixel is determined according to the distance between the pixel and the focus point and the difference between the depth value of the pixel and the depth value of the focus point.
After determining the focus point that the user's eyes correspond to, the electronic device determines the type of each pixel according to the distance between the pixel and the focus point and the difference between their depth values. The type of a pixel is either background sample point or foreground sample point.
Referring to Fig. 3C, this step can comprise the following substeps:
In step 304a, the spatial proximity P_S of the pixel is determined according to the distance between the pixel and the focus point.
The electronic device determines the spatial proximity P_S of the pixel according to the distance between the pixel and the focus point. P_S represents how close the pixel and the focus point are in the image plane. Since the focus point should be displayed clearly, the pixel corresponding to the focus point should belong to the foreground sample points, and the spatial proximity P_S measures, from the distance between the pixel and the focus point, the probability that the pixel belongs to the foreground sample points. The range of P_S is 0 < P_S ≤ 1.
Specifically, a rectangular coordinate system can be set up in the image plane and the spatial proximity P_S obtained from the coordinates of the pixel and the focus point. The electronic device calculates the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²; it then calculates the spatial proximity P_S according to the distance S, the natural constant e and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²)). The value of σ can be chosen according to the actual situation; in the course of realizing the present disclosure, the inventors found through experiments that setting σ to 1/20 of the longer side of the image works well.
In step 304b, the depth proximity P_D of the pixel is determined according to the difference between the depth value of the pixel and the depth value of the focus point.
The electronic device determines the depth proximity P_D of the pixel according to the difference between the depth value of the pixel and the depth value of the focus point. P_D represents how close the pixel and the focus point are in depth. Since the focus point should be displayed clearly, the pixel corresponding to the focus point should belong to the foreground sample points, and the depth proximity P_D measures, from the depth values of the pixel and the focus point, the probability that the pixel belongs to the foreground sample points. The range of P_D is 0 < P_D ≤ 1.
Specifically, the electronic device calculates the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L); it then calculates the depth proximity P_D according to the difference R, the natural constant e and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)). The value of σ can be chosen according to the actual situation; in the course of realizing the present disclosure, the inventors found through experiments that σ = 10 works well.
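Steps 304a and 304b can be computed for every pixel at once. The following sketch (not code from the patent) vectorizes both formulas with NumPy, defaulting σ for P_S to 1/20 of the longer image side and σ for P_D to 10, the empirical values reported above.

```python
import numpy as np

def proximities(depth: np.ndarray, focus: tuple, sigma_s: float | None = None,
                sigma_d: float = 10.0):
    """Per-pixel spatial proximity P_S (step 304a) and depth proximity
    P_D (step 304b) for a focus point (x_L, y_L)."""
    h, w = depth.shape
    if sigma_s is None:
        sigma_s = max(h, w) / 20.0           # 1/20 of the longer side
    x_l, y_l = focus
    ys, xs = np.mgrid[0:h, 0:w]              # coordinates (y_i, x_i) of every pixel
    s = (xs - x_l) ** 2 + (ys - y_l) ** 2    # distance S
    p_s = np.exp(-s / (2.0 * sigma_s ** 2))
    r = depth - depth[y_l, x_l]              # depth difference R
    p_d = np.exp(-(r ** 2) / (2.0 * sigma_d ** 2))
    return p_s, p_d
```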
Referring to Fig. 3D, the left of Fig. 3D reflects the spatial proximity P_S of each pixel in the image: the darker a region, the smaller the spatial proximity P_S of its pixels, i.e. the smaller the probability that those pixels belong to the foreground sample points; conversely, the brighter a region, the larger the spatial proximity P_S of its pixels, i.e. the larger the probability that those pixels belong to the foreground sample points.
The right of Fig. 3D reflects the depth proximity P_D of each pixel in the image: the darker a region, the smaller the depth proximity P_D of its pixels, i.e. the smaller the probability that those pixels belong to the foreground sample points; conversely, the brighter a region, the larger the depth proximity P_D of its pixels, i.e. the larger the probability that those pixels belong to the foreground sample points.
In step 304c, the type of the pixel is determined according to the spatial proximity P_S and the depth proximity P_D.
After determining the spatial proximity P_S and the depth proximity P_D of a pixel, the electronic device considers both quantities together and determines the type of the pixel from them.
In a first possible implementation, step 304c can comprise the following substeps:
First, the final proximity P of the pixel is obtained from the spatial proximity P_S and the depth proximity P_D.
The electronic device obtains the final proximity P of the pixel from P_S and P_D. Specifically, P can be obtained by multiplying the two: when P_S = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²)) and P_D = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)), the final proximity P = P_S · P_D = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²)) · e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)). Since 0 < P_S ≤ 1 and 0 < P_D ≤ 1, it follows that 0 < P ≤ 1.
Second, the final proximity P is compared with a predetermined threshold.
The electronic device compares the final proximity P with a predetermined threshold. Since 0 < P ≤ 1, the predetermined threshold is usually set to 0.5. Of course, the threshold can be adjusted to suit the actual situation; no specific restriction is placed on it here.
Third, if the final proximity P is less than the predetermined threshold, the type of the pixel is determined to be background sample point.
When the final proximity P is less than the predetermined threshold, the electronic device determines that the type of the pixel is background sample point. The larger the final proximity P of a pixel, the more likely the pixel belongs to the foreground sample points; conversely, the smaller the final proximity P, the more likely the pixel belongs to the background sample points.
In this embodiment, assume the predetermined threshold is 0.5; then when the final proximity P of a pixel is less than 0.5, the electronic device determines that the type of the pixel is background sample point.
Fourth, if the final proximity P is greater than the predetermined threshold, the type of the pixel is determined to be foreground sample point.
When the final proximity P is greater than the predetermined threshold, the electronic device determines that the type of the pixel is foreground sample point. In this embodiment, when the final proximity P of a pixel is greater than 0.5, the electronic device determines that the type of the pixel is foreground sample point.
Referring to Fig. 3E, which shows the foreground and background sample points finally determined by the first possible implementation of step 304c, the black region consists of the pixels corresponding to the background sample points and the white region of the pixels corresponding to the foreground sample points. It can be seen from the figure that the region formed by the foreground sample points is separated clearly better than under the background blurring approach provided in the background section.
Through the four substeps of the first possible implementation above, all pixels in the image are classified into foreground and background sample points after jointly considering the distance between each pixel and the focus point and the difference between their depth values (a code sketch follows). The second possible implementation, described next, provides a more accurate and reasonable way of classifying pixels.
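Given the arrays returned by the proximities sketch above, the first implementation reduces to a single comparison. The rendering below is illustrative, with 0.5 as the usual predetermined threshold:

```python
import numpy as np

def classify_single_threshold(p_s: np.ndarray, p_d: np.ndarray,
                              threshold: float = 0.5) -> np.ndarray:
    """First implementation of step 304c: final proximity P = P_S * P_D;
    pixels with P above the threshold are foreground sample points."""
    p = p_s * p_d            # 0 < P <= 1
    return p > threshold     # True = foreground, False = background
```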
In the second possible implementation, step 304c can comprise the following substeps:
First, the final proximity P of the pixel is obtained from the spatial proximity P_S and the depth proximity P_D.
The electronic device obtains the final proximity P of the pixel from P_S and P_D, specifically by multiplying the two as in the first implementation; again 0 < P ≤ 1.
Second, the final proximity P is compared with a first threshold and a second threshold.
The electronic device compares the final proximity P with a first threshold and a second threshold, where the first threshold is less than the second threshold. The difference from the first possible implementation is that the electronic device chooses two thresholds to compare with the final proximity P. Since 0 < P ≤ 1, the first threshold can be preset to 0.1 and the second to 0.9. Of course, the preset first threshold and/or second threshold can be adjusted to suit the actual situation; no specific restriction is placed on them here.
Third, if the final proximity P is less than the first threshold, the type of the pixel is determined to be background sample point.
When the final proximity P is less than the first threshold, the electronic device determines that the type of the pixel is background sample point. The larger the final proximity P of a pixel, the more likely the pixel belongs to the foreground sample points; conversely, the smaller the final proximity P, the more likely it belongs to the background sample points.
In this embodiment, assume the first threshold is 0.1; then when the final proximity P of a pixel is less than 0.1, the electronic device determines that the type of the pixel is background sample point.
Fourth, if the final proximity P is greater than the second threshold, the type of the pixel is determined to be foreground sample point.
When the final proximity P is greater than the second threshold, the electronic device determines that the type of the pixel is foreground sample point. In this embodiment, assume the second threshold is 0.9; then when the final proximity P of a pixel is greater than 0.9, the electronic device determines that the type of the pixel is foreground sample point.
Fifth, if the final proximity P is greater than the first threshold and less than the second threshold, the pixel is determined to be an undetermined sample point.
When the final proximity P is greater than the first threshold and less than the second threshold, the electronic device determines that the pixel is an undetermined sample point. An undetermined sample point is one that cannot yet be assigned to the foreground or background sample points; further analysis is needed to determine its type.
In this embodiment, when the final proximity P of a pixel is greater than 0.1 and less than 0.9, the electronic device determines that the pixel is an undetermined sample point.
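Before the color-based resolution of the sixth substep below, substeps one to five amount to a three-way split. A sketch, assuming the proximities helper above and the preset thresholds 0.1 and 0.9:

```python
import numpy as np

def classify_double_threshold(p_s: np.ndarray, p_d: np.ndarray,
                              t1: float = 0.1, t2: float = 0.9):
    """Second implementation of step 304c: split pixels into background,
    foreground and undetermined sample points (t1 < t2)."""
    p = p_s * p_d
    background = p < t1
    foreground = p > t2
    undetermined = ~(background | foreground)   # resolved by color below
    return background, foreground, undetermined
```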
Sixth, the type of the undetermined sample point is finally determined according to its color vector.
The electronic device finally determines the type of an undetermined sample point according to its color vector. In the RGB color space, every color can be represented by a three-dimensional vector c = (R, G, B), where the values of R, G and B all lie in [0, 1]: for example, red is (1, 0, 0), green is (0, 1, 0), blue is (0, 0, 1), white is (1, 1, 1), black is (0, 0, 0), and so on. Using the color vector of the undetermined sample point, the electronic device calculates the probabilities that the point belongs to the foreground sample points and to the background sample points, and takes the type corresponding to the larger probability as the type of the undetermined sample point.
Specifically, this step comprises:
(1) For each undetermined sample point, the color vector c of the undetermined sample point is obtained.
(2) The probability that the undetermined sample point (x_i, y_i) belongs to the foreground sample points, P(I(x_i, y_i) = 1 | c), and the probability that it belongs to the background sample points, P(I(x_i, y_i) = 0 | c), are calculated according to the Bayes posterior probability formula.
The Bayes posterior probability formula is P(B | A) = P(A | B) · P(B) / P(A), so:
P(I(x_i, y_i) = 1 | c) = P(c | I(x_i, y_i) = 1) · P(I(x_i, y_i) = 1) / P(c)
P(I(x_i, y_i) = 0 | c) = P(c | I(x_i, y_i) = 0) · P(I(x_i, y_i) = 0) / P(c)
Here P(I(x_i, y_i) = 1) is the probability that the undetermined sample point is a pre-estimated foreground sample point, the pre-estimated foreground sample points being those the electronic device estimates from the depth value of each pixel in the image; P(I(x_i, y_i) = 0) is the probability that it is a pre-estimated background sample point, defined analogously. For example, suppose an image contains 10000 pixels, and after obtaining, comparing and analyzing the depth values of all pixels the electronic device estimates 1000 of them to be pre-estimated foreground sample points and 9000 to be pre-estimated background sample points. Picking an undetermined sample point at random, the probability that it is a pre-estimated foreground sample point is 1000/10000 = 0.1, i.e. P(I(x_i, y_i) = 1) = 0.1, and the probability that it is a pre-estimated background sample point is 9000/10000 = 0.9, i.e. P(I(x_i, y_i) = 0) = 0.9.
P(c | I(x_i, y_i) = 1) is the probability that a pixel among the pre-estimated foreground sample points has color vector c; correspondingly, P(c | I(x_i, y_i) = 0) is the probability that a pixel among the pre-estimated background sample points has color vector c. Continuing the example, suppose that among the 1000 pre-estimated foreground sample points 650 are red, and among the 9000 pre-estimated background sample points 150 are red. For a randomly chosen undetermined sample point that is red, i.e. whose color vector is c = (1, 0, 0), we have P(c | I(x_i, y_i) = 1) = 650/1000 = 0.65 and P(c | I(x_i, y_i) = 0) = 150/9000 ≈ 0.017.
To sum up, P(I(x_i, y_i) = 1 | c) = 0.65 × 0.1 / P(c) = 0.065 / P(c), while P(I(x_i, y_i) = 0 | c) ≈ 0.017 × 0.9 / P(c) = 0.015 / P(c). Since the denominator P(c) is common to both, P(I(x_i, y_i) = 1 | c) > P(I(x_i, y_i) = 0 | c), and in the following substep the type of this red undetermined sample point is determined to be foreground sample point.
(3) The type corresponding to the larger of the two probabilities is taken as the type of the undetermined sample point.
After calculating the probability that the undetermined sample point belongs to the foreground sample points and the probability that it belongs to the background sample points, the electronic device takes the type corresponding to the larger value as the type of the undetermined sample point.
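The comparison in substep (3) does not actually need the shared denominator P(c), since it cancels. The sketch below reproduces the worked example's numbers; the function name and interface are illustrative, not from the patent.

```python
def resolve_by_color(prior_fg: float, prior_bg: float,
                     p_color_given_fg: float, p_color_given_bg: float) -> str:
    """Bayes resolution for one undetermined sample point; unnormalised
    scores suffice because P(color) divides both posteriors equally."""
    score_fg = p_color_given_fg * prior_fg   # proportional to P(fg | color)
    score_bg = p_color_given_bg * prior_bg   # proportional to P(bg | color)
    return "foreground" if score_fg > score_bg else "background"

# Worked example: priors 1000/10000 and 9000/10000; 650 of the 1000
# pre-estimated foreground pixels are red, 150 of the 9000 background ones.
print(resolve_by_color(0.1, 0.9, 650 / 1000, 150 / 9000))  # -> foreground
```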
It should be noted that in this embodiment the chosen undetermined sample point is illustrated only with red, and the quantities and probabilities in the example above are merely exemplary. In practice, the electronic device needs to obtain the color vector and compute the probabilities for every undetermined sample point before finally determining its type.
It should also be noted that, of the two ways of determining the type of a pixel in step 304c, the first has a simpler computation process and higher efficiency, while the second has a more complicated computation process but yields a more accurate division, distinguishing foreground from background more precisely. In practice, the appropriate way of determining pixel types can be chosen as required.
In step 305, the region corresponding to the background sample points and/or the foreground sample points is taken as the image processing region.
After the type of every pixel in the image has been determined, the electronic device takes the region corresponding to the background sample points and/or the foreground sample points as the image processing region.
In step 306, the image processing region is processed according to a predetermined image processing mode.
The electronic device processes the image processing region according to the predetermined image processing mode.
Specifically, in a first possible implementation, the region corresponding to the background sample points is processed according to a first predetermined image processing mode.
The electronic device processes the region corresponding to the background sample points according to the first predetermined image processing mode so that this region is displayed with a blurred effect. The first predetermined image processing mode includes, but is not limited to, Gaussian blurring or background blurring.
Referring to Fig. 3F, Fig. 3F shows the result obtained after background blurring is applied to the image provided in this embodiment.
In a second possible implementation, the region corresponding to the background sample points is processed according to the first predetermined image processing mode, and the region corresponding to the foreground sample points is processed according to a second predetermined image processing mode.
The electronic device processes the region corresponding to the background sample points according to the first predetermined image processing mode and the region corresponding to the foreground sample points according to the second predetermined image processing mode, so that the background region is displayed with a blurred effect while the foreground region is displayed more clearly. The first predetermined image processing mode includes, but is not limited to, Gaussian blurring or background blurring; the second predetermined image processing mode includes, but is not limited to, histogram equalization or image sharpening.
In a third possible implementation, the region corresponding to the foreground sample points is processed according to the second predetermined image processing mode.
The electronic device processes the region corresponding to the foreground sample points according to the second predetermined image processing mode so that this region is displayed more clearly. The second predetermined image processing mode includes, but is not limited to, histogram equalization or image sharpening.
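A sketch of step 306's second implementation: blur the background with a Gaussian kernel and sharpen the foreground with an unsharp mask. The kernel size and sharpening amount are illustrative choices, not values specified in the patent.

```python
import cv2
import numpy as np

def process_regions(image: np.ndarray, foreground: np.ndarray,
                    ksize: int = 21, amount: float = 1.0) -> np.ndarray:
    """Blur the background region and sharpen the foreground region."""
    blurred = cv2.GaussianBlur(image, (ksize, ksize), 0)
    # Unsharp masking: add back the difference from a softened copy.
    soft = cv2.GaussianBlur(image, (0, 0), 3)
    sharpened = cv2.addWeighted(image, 1 + amount, soft, -amount, 0)
    mask = foreground.astype(bool)[..., None]   # broadcast over color channels
    return np.where(mask, sharpened, blurred)
```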
In step 307, the processed image is displayed by refreshing.
The electronic device refreshes the display with the processed image. As the user's eyes move while browsing, the focus point determined by the electronic device changes accordingly, and so does the resulting image processing region, achieving a dynamically updated display of the processed image.
Referring to Fig. 3G, Fig. 3G shows differently processed images corresponding to different focus points L: the focus point is L1 in the left image, L2 in the middle image and L3 in the right image.
It should be noted that this embodiment is illustrated only with the electronic device determining the focus point from the eye movement information, dividing the pixels in the image into foreground/background, and then blurring the background. In practice, besides the background blurring provided in this embodiment, the image can also undergo other image processing operations such as local magnification, local reduction or local region extraction. For example, as the focus point changes, the region near the focus point can be magnified dynamically in real time.
In summary, the image display method provided by this embodiment collects eye movement information of the user while the user browses a displayed image, determines from the eye movement information the focus point in the image that the user's eyes correspond to, determines an image processing region according to the focus point, processes the region according to a predetermined image processing mode, and then displays the result. It solves the problem that the image display mode described in the background section lacks interactivity and yields an unsatisfactory display effect. Compared with that mode, this embodiment determines the user's focus point in the image by collecting the user's eye movement information in real time while the image is browsed, which both reflects the user's intent and improves the interactivity of the display process, thereby improving the display effect.
In addition, this embodiment provides two ways of determining the type of a pixel: the first has a simpler computation process and higher efficiency, while the second has a more complicated computation process but yields a more accurate division, distinguishing foreground from background more precisely. In practice, for image processing operations that demand high real-time performance, the first way of determining pixel types is preferred; when the computing capability of the electronic device is high, the second way can be chosen to make the finally obtained foreground/background regions more reasonable and accurate, further improving the display effect of the processed image.
The following are device embodiments of the present disclosure, which can be used to perform the method embodiments of the present disclosure. For details not disclosed in the device embodiments, please refer to the method embodiments of the present disclosure.
Refer to Fig. 4, which shows an exemplary structural block diagram of the image display device provided by an embodiment of the present disclosure. The image display device can be implemented as all or part of an electronic device in software, hardware or a combination of both. The image display device can comprise an eye movement collection module 410, a focus determination module 420, a region determination module 430, an image processing module 440 and an image display module 450.
The eye movement collection module 410 collects eye movement information of a user while the user browses a displayed image.
The focus determination module 420 determines, according to the eye movement information, the focus point in the image that the user's eyes correspond to.
The region determination module 430 determines an image processing region according to the focus point.
The image processing module 440 processes the image processing region according to a predetermined image processing mode.
The image display module 450 refreshes the display with the processed image.
In summary, the image display device provided by this embodiment collects eye movement information of the user while the user browses a displayed image, determines from the eye movement information the focus point in the image that the user's eyes correspond to, determines an image processing region according to the focus point, processes the region according to a predetermined image processing mode, and then displays the result. It solves the problem that the image display mode described in the background section lacks interactivity and yields an unsatisfactory display effect. Compared with that mode, this embodiment determines the user's focus point in the image by collecting the user's eye movement information in real time while the image is browsed, which both reflects the user's intent and improves the interactivity of the display process, thereby improving the display effect.
Please refer to Fig. 5, which shows an example structural block diagram of the image display device provided by another embodiment of the disclosure. The image display device can be implemented as all or part of an electronic device by software, hardware, or a combination of both. The image display device may comprise: an eyeball acquisition module 410, a focus determination module 420, an area determination module 430, an image processing module 440, and an image display module 450.
The eyeball acquisition module 410 is for collecting the user's eyeball movement information while the user browses a displayed image.
The focus determination module 420 is for determining, from the eyeball movement information, the focus point in the image that the eyeball corresponds to.
The area determination module 430 is for determining an image processing region from the focus point.
Specifically, the area determination module 430 comprises a type division submodule 432 and a region determination submodule 434.
The type division submodule 432 is for, for each pixel in the image, determining the type of the pixel from the distance between the pixel and the focus point and from the difference between the depth value of the pixel and the depth value of the focus point, the type being either a background sample point or a foreground sample point.
The region determination submodule 434 is for taking the region corresponding to the background sample points and/or the foreground sample points as the image processing region.
Further, the type division submodule 432 comprises a distance determining unit 432a, a depth determining unit 432b, and a type determining unit 432c.
The distance determining unit 432a is for determining the spatial proximity P_S of the pixel from the distance between the pixel and the focus point.
Specifically, the distance determining unit 432a comprises a distance computation subunit 432a1 and a distance determination subunit 432a2.
The distance computation subunit 432a1 is for computing the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)².
The distance determination subunit 432a2 is for computing the spatial proximity P_S from the distance S, the natural constant e, and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²)).
The depth determining unit 432b is for determining the depth closeness degree P_D of the pixel from the difference between the depth value of the pixel and the depth value of the focus point.
Specifically, the depth determining unit 432b comprises a depth computation subunit 432b1 and a depth determination subunit 432b2.
The depth computation subunit 432b1 is for computing the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L).
The depth determination subunit 432b2 is for computing the depth closeness degree P_D from the difference R, the natural constant e, and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
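Both measures can be sketched in a few lines of numpy. This is only an illustration of the two formulas above; the function and parameter names (proximity_maps, sigma_s, sigma_d) and the default empirical values are assumptions, not values from the disclosure:

```python
import numpy as np

def proximity_maps(depth, focus_xy, sigma_s=50.0, sigma_d=0.5):
    """Per-pixel spatial proximity P_S and depth closeness degree P_D.

    depth    : 2-D array of per-pixel depth values D(x, y)
    focus_xy : integer pixel coordinates (x_L, y_L) of the focus point
    sigma_s  : first empirical value (spatial spread)
    sigma_d  : second empirical value (depth spread)
    """
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x_l, y_l = focus_xy

    # S = (x_i - x_L)^2 + (y_i - y_L)^2, then P_S = e^(-S / (2 * sigma^2))
    s = (xs - x_l) ** 2 + (ys - y_l) ** 2
    p_s = np.exp(-s / (2.0 * sigma_s ** 2))

    # R = D(x_i, y_i) - D(x_L, y_L), then P_D = e^(-R^2 / (2 * sigma^2))
    r = depth - depth[y_l, x_l]
    p_d = np.exp(-(r ** 2) / (2.0 * sigma_d ** 2))
    return p_s, p_d
```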
The type determining unit 432c is for determining the type of the pixel from the spatial proximity P_S and the depth closeness degree P_D.
In a first possible implementation, the type determining unit 432c comprises a similarity determination subelement 432c1, a threshold judgment subelement 432c2, a background determination subelement 432c3, and a foreground determination subelement 432c4.
The similarity determination subelement 432c1 is for obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D.
Optionally, the similarity determination subelement 432c1 is further for multiplying the spatial proximity P_S by the depth closeness degree P_D to obtain the final closeness degree P of the pixel.
The threshold judgment subelement 432c2 is for judging the magnitude relationship between the final closeness degree P and a predetermined threshold.
The background determination subelement 432c3 is for, if the judged result is that the final closeness degree P is less than the predetermined threshold, determining that the type of the pixel is a background sample point.
The foreground determination subelement 432c4 is for, if the judged result is that the final closeness degree P is greater than the predetermined threshold, determining that the type of the pixel is a foreground sample point.
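This first implementation amounts to one multiplication and one comparison per pixel; a minimal sketch, where the threshold value is an assumed example rather than a figure from the disclosure:

```python
def classify_single_threshold(p_s, p_d, threshold=0.3):
    """First implementation: P = P_S * P_D compared against one threshold.

    Returns a boolean mask: True marks foreground sample points
    (P > threshold), False marks background sample points.
    """
    p = p_s * p_d          # final closeness degree P of each pixel
    return p > threshold
```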
In a second possible implementation, the type determining unit 432c comprises a similarity determination subelement 432c1, a threshold comparison subelement 432c5, a first determination subelement 432c6, a second determination subelement 432c7, a third determination subelement 432c8, and a final determination subelement 432c9.
The similarity determination subelement 432c1 is for obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D.
Optionally, the similarity determination subelement 432c1 is further for multiplying the spatial proximity P_S by the depth closeness degree P_D to obtain the final closeness degree P of the pixel.
The threshold comparison subelement 432c5 is for judging the magnitude relationship between the final closeness degree P and a first threshold and a second threshold, wherein the first threshold is less than the second threshold.
The first determination subelement 432c6 is for, if the judged result is that the final closeness degree P is less than the first threshold, determining that the type of the pixel is a background sample point.
The second determination subelement 432c7 is for, if the judged result is that the final closeness degree P is greater than the second threshold, determining that the type of the pixel is a foreground sample point.
The third determination subelement 432c8 is for, if the judged result is that the final closeness degree P is greater than the first threshold and less than the second threshold, determining that the pixel is an undetermined sample point.
The final determination subelement 432c9 is for finally determining the type of the undetermined sample point from the color vector of the undetermined sample point.
Optionally, the final determination subelement 432c9 is further for, for each undetermined sample point, obtaining the color vector of the undetermined sample point; computing, according to the Bayes posterior probability formula, the probability that the undetermined sample point belongs to the foreground sample points and the probability that it belongs to the background sample points; and choosing the type corresponding to the larger of the two probabilities as the type of the undetermined sample point.
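A minimal sketch of this Bayesian step. The disclosure only names the Bayes posterior probability formula, so the per-class colour model is an assumption here: each class is modelled as a diagonal-covariance Gaussian fitted from the already-decided sample points, with class priors taken from the sample counts:

```python
import numpy as np

def resolve_undetermined(img, fg_mask, bg_mask, und_mask):
    """Assign undetermined sample points to foreground or background by
    comparing the Bayesian posterior probabilities of their colour vectors.

    img               : H x W x 3 float array of colour vectors
    fg_mask / bg_mask : boolean masks of the already-decided sample points
    und_mask          : boolean mask of the undetermined sample points
    """
    def gaussian_loglik(x, samples):
        mu = samples.mean(axis=0)
        var = samples.var(axis=0) + 1e-6   # diagonal covariance
        return -0.5 * (((x - mu) ** 2) / var
                       + np.log(2 * np.pi * var)).sum(axis=1)

    fg, bg = img[fg_mask], img[bg_mask]
    x = img[und_mask]
    n = len(fg) + len(bg)

    # log P(class | colour) = log P(colour | class) + log P(class) + const
    log_post_fg = gaussian_loglik(x, fg) + np.log(len(fg) / n)
    log_post_bg = gaussian_loglik(x, bg) + np.log(len(bg) / n)

    out = fg_mask.copy()
    out[und_mask] = log_post_fg > log_post_bg  # pick the larger posterior
    return out                                 # final foreground mask
```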
The image processing module 440 is for processing the image processing region according to a predetermined image processing mode.
The image processing module 440 comprises: a background processing submodule 442; or, a comprehensive processing submodule 444; or, a foreground processing submodule 446.
The background processing submodule 442 is for processing the region corresponding to the background sample points according to a first predetermined image processing mode.
The comprehensive processing submodule 444 is for processing the region corresponding to the background sample points according to the first predetermined image processing mode, and processing the region corresponding to the foreground sample points according to a second predetermined image processing mode.
The foreground processing submodule 446 is for processing the region corresponding to the foreground sample points according to the second predetermined image processing mode.
Wherein, the first predetermined image processing mode comprises image virtualization (blurring) processing, and the second predetermined image processing mode comprises image sharpening processing.
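A sketch of the comprehensive case, blurring the background region and sharpening the foreground region. OpenCV is assumed to be available, and the kernel size and unsharp-mask weights are illustrative choices, not parameters from the disclosure:

```python
import cv2
import numpy as np

def process_regions(img, fg_mask):
    """Blur the background sample region (image virtualization) and
    sharpen the foreground sample region (unsharp masking)."""
    blurred = cv2.GaussianBlur(img, (21, 21), 0)
    # Unsharp mask: original + 0.5 * (original - smoothed)
    sharp = cv2.addWeighted(img, 1.5,
                            cv2.GaussianBlur(img, (0, 0), 3), -0.5, 0)
    out = np.where(fg_mask[..., None], sharp, blurred)
    return out.astype(img.dtype)
```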
The image display module 450 is for refreshing the display of the processed image.
In sum, the image display device provided by this embodiment collects the user's eyeball movement information while the user browses a displayed image, determines from that information the focus point in the image that the eyeball corresponds to, determines an image processing region from the focus point, processes the region according to a predetermined image processing mode, and refreshes the display of the processed image. This solves the problem that the image display mode described in the background section lacks interactivity and gives an unsatisfactory display effect. Compared with that display mode, this embodiment determines the user's focus point in the image by collecting the user's eyeball movement information in real time during browsing, which both matches the user's intention and improves the interactivity of the display process, thereby improving the display effect.
In addition, this embodiment provides two ways of determining the type of a pixel. The first involves a simpler computation and is more efficient; the second is computationally more complex, but its final division is more accurate, so the foreground/background separation is more precise. In practice, the first way is preferred for image processing operations with strict real-time requirements; when the electronic device has ample computing power, the second way can be chosen instead, making the resulting foreground/background regions more reasonable and accurate and further improving the display effect of the processed image.
It should be noted that when the image display device provided by the above embodiments performs image processing, the division into the above functional modules is only used as an example; in practical applications, the above functions can be allocated to different functional modules as required, i.e., the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above. In addition, the image display device provided by the above embodiments belongs to the same concept as the method embodiments of the image display method; for its specific implementation process, refer to the method embodiments, which is not repeated here.
Please refer to Fig. 6, which shows an example structural schematic diagram of the electronic device involved in the embodiments of the disclosure. The electronic device may be used to implement the image display method provided in the above embodiments.
The electronic device 600 may comprise a camera 601, a communication unit 610, a memory 620 including one or more computer-readable storage media, an input unit 630, a display unit 640, a sensor 650, an audio circuit 660, a wireless communication unit 670, a processor 680 including one or more processing cores, a power supply 690, and other components. Those skilled in the art will appreciate that the electronic device structure shown in the figure does not constitute a limitation on the electronic device, which may comprise more or fewer components than illustrated, combine some components, or arrange the components differently. Wherein:
The camera 601 can be used to capture images. Alternatively, the camera 601 can be configured as a parallel binocular camera pair, which can be used to capture the depth value of each pixel in an image. In addition, the camera can also be used to collect the user's eyeball movement information while the user browses a displayed image. Optionally, when the camera is used to collect eyeball movement information, it can be replaced by an image acquisition component or an infrared component.
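For the parallel binocular case, a per-pixel depth map can be sketched from the disparity between the two views. This assumes OpenCV block matching and calibrated values for the focal length in pixels and the baseline between the two lenses; none of these specifics come from the disclosure:

```python
import cv2

def depth_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    """Estimate per-pixel depth from a parallel binocular camera pair."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # compute() returns fixed-point disparity scaled by 16
    disparity = stereo.compute(left_gray, right_gray).astype("float32") / 16.0
    disparity[disparity <= 0] = 0.1            # avoid division by zero
    return focal_px * baseline_m / disparity   # depth = f * B / d
```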
The communication unit 610 can be used to receive and send information, or to receive and send signals during a call; the communication unit 610 can be a network communication device such as an RF (Radio Frequency) circuit, a router, or a modem. In particular, when the communication unit 610 is an RF circuit, it receives downlink information from a base station and hands it to the one or more processors 680 for processing, and it sends uplink data to the base station. Generally, an RF circuit serving as the communication unit includes but is not limited to an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and so on. In addition, the communication unit 610 can also communicate with networks and other devices by wireless communication. The wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and so on.
The memory 620 can be used to store software programs and modules; the processor 680 performs various functional applications and data processing by running the software programs and modules stored in the memory 620. The memory 620 can mainly comprise a program storage area and a data storage area, wherein the program storage area can store the operating system, the application programs required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the data storage area can store data created according to the use of the electronic device 600 (such as audio data, a phone book, etc.). In addition, the memory 620 can comprise a high-speed random access memory and can also comprise a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage component. Accordingly, the memory 620 can also comprise a memory controller to provide the processor 680 and the input unit 630 with access to the memory 620.
The input unit 630 can be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. Preferably, the input unit 630 can comprise a touch-sensitive surface 631 and other input devices 632. The touch-sensitive surface 631, also referred to as a touch display screen or a touchpad, can collect touch operations by the user on or near it (such as operations performed on or near the touch-sensitive surface 631 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connecting devices according to a preset program. Optionally, the touch-sensitive surface 631 can comprise two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 680, and can receive and execute commands sent by the processor 680. In addition, the touch-sensitive surface 631 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch-sensitive surface 631, the input unit 630 can also comprise other input devices 632. Preferably, the other input devices 632 can include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys, a switch key, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 640 can be used to display information input by the user or information provided to the user and the various graphical user interfaces of the electronic device 600; these graphical user interfaces can be composed of graphics, text, icons, video, and any combination thereof. The display unit 640 can comprise a display panel 641; optionally, the display panel 641 can be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 631 can cover the display panel 641; when the touch-sensitive surface 631 detects a touch operation on or near it, it transmits the operation to the processor 680 to determine the type of the touch event, and the processor 680 then provides a corresponding visual output on the display panel 641 according to the type of the touch event. Although in Fig. 6 the touch-sensitive surface 631 and the display panel 641 are two independent components implementing the input and output functions, in some embodiments the touch-sensitive surface 631 can be integrated with the display panel 641 to implement the input and output functions.
The electronic device 600 can also comprise at least one sensor 650, such as a depth sensor, a light sensor, a motion sensor, and other sensors. The depth sensor can be used to capture the depth value of each pixel in an image; a depth sensor usually includes a light emitter and a light receiver, and it computes the depth value by measuring the time an optical signal takes to travel from the light emitter back to the light receiver. The light sensor can comprise an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 641 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 641 and/or the backlight when the electronic device 600 is moved to the ear. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when at rest, and can be used in applications that recognize the posture of a mobile phone (such as landscape/portrait switching, related games, magnetometer attitude calibration) and in functions related to vibration recognition (such as a pedometer or tapping); as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor that can also be configured in the electronic device 600, they are not repeated here.
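The time-of-flight relation described here reduces to halving the round-trip optical path; a one-line sketch:

```python
def tof_depth(round_trip_time_s, c=299_792_458.0):
    # The emitted light travels to the scene and back, so the
    # one-way depth is half the round-trip distance c * t.
    return c * round_trip_time_s / 2.0
```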
The audio circuit 660, a loudspeaker 661, and a microphone 662 can provide an audio interface between the user and the electronic device 600. The audio circuit 660 can convert received audio data into an electrical signal and transmit it to the loudspeaker 661, which converts it into a sound signal for output; on the other hand, the microphone 662 converts a collected sound signal into an electrical signal, which the audio circuit 660 receives and converts into audio data; after the audio data is output to the processor 680 for processing, it is sent via the RF circuit 610 to, for example, another electronic device, or it is output to the memory 620 for further processing. The audio circuit 660 may also comprise an earphone jack to provide communication between an external earphone and the electronic device 600.
To implement wireless communication, the electronic device can be configured with a wireless communication unit 670, which can be a WiFi module. WiFi is a short-range wireless transmission technology; through the wireless communication unit 670 the electronic device 600 can help the user send and receive e-mail, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although the wireless communication unit 670 is shown in the figure, it should be understood that it is not an essential component of the electronic device 600 and can be omitted as required within the scope that does not change the essence of the invention.
The processor 680 is the control center of the electronic device 600; it connects all parts of the whole device using various interfaces and lines, and it performs the various functions of the electronic device 600 and processes data by running or executing the software programs and/or modules stored in the memory 620 and calling the data stored in the memory 620, thereby monitoring the device as a whole. Optionally, the processor 680 can comprise one or more processing cores; preferably, the processor 680 can integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It should be appreciated that the above modem processor may also not be integrated into the processor 680.
The electronic device 600 also comprises a power supply 690 (such as a battery) that supplies power to the components; preferably, the power supply can be logically connected to the processor 680 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 690 can also comprise any components such as one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
Although not shown, the electronic device 600 can also comprise a Bluetooth module and the like, which are not repeated here. In this embodiment, the electronic device also includes a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors; the one or more programs contain instructions for performing, on the electronic device side, the image display method provided by the embodiments illustrated in Fig. 2 or Fig. 3A of the disclosure.
In addition, the electronic device described in the disclosure can typically be any of various handheld terminal devices, such as a mobile phone or a personal digital assistant (PDA); therefore, the protection scope of the disclosure should not be limited to a particular type of electronic device.
In addition, the method according to the disclosure can also be implemented as a computer program executed by a CPU, and the computer program can be stored in a computer-readable storage medium. When the computer program is executed by the CPU, the functions defined above in the method of the disclosure are performed.
In addition, the above method steps and system elements can also be implemented with a controller and a computer-readable storage medium for storing a computer program that makes the controller implement the above steps or element functions.
In addition, it should be understood that the computer-readable storage medium (e.g., memory) described herein can be volatile memory or non-volatile memory, or can comprise both volatile and non-volatile memory. By way of example and not limitation, non-volatile memory can comprise read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can comprise random access memory (RAM), which can serve as external cache memory. By way of example and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The storage devices of the disclosed aspects are intended to comprise, but are not limited to, these and other suitable types of memory.
Those skilled in the art will also understand that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described generally in terms of their functionality. Whether such functionality is implemented as software or as hardware depends on the particular application and the design constraints imposed on the overall system. Those skilled in the art can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein can be implemented or performed with the following components designed to perform the functions described here: a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. A general-purpose processor can be a microprocessor, but alternatively the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of the method or algorithm described in connection with the disclosure herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.
In one or more exemplary designs, the functions can be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions can be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media comprise computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium can be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example and not limitation, the computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage devices, magnetic disk storage devices or other magnetic storage devices, or any other medium that can be used to carry or store required program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer or a general-purpose or special-purpose processor. In addition, any connection can properly be termed a computer-readable medium. For example, if the software is sent from a website, server, or other remote source using a coaxial cable, a fiber-optic cable, a twisted pair, a digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Exemplary embodiments of the disclosure have been disclosed, but it should be noted that various changes and modifications can be made without departing from the scope of the disclosure. The functions, steps, and/or actions of the method claims according to the disclosed embodiments described herein need not be performed in any particular order. In addition, although elements of the disclosure may be described or claimed in the singular, the plural is also contemplated unless limitation to the singular is explicitly stated.
It should be understood that, as used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly supports otherwise. It should also be understood that "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
The sequence numbers of the above embodiments of the disclosure are for description only and do not represent the merits of the embodiments.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above embodiments can be completed by hardware, or by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing are only preferred embodiments of the disclosure and are not intended to limit the disclosure; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the disclosure should be included within the protection scope of the disclosure.

Claims (13)

1. An image display method, characterized in that the method comprises:
collecting the user's eyeball movement information while the user browses a displayed image;
determining, from the eyeball movement information, the focus point in the image that the eyeball corresponds to;
for each pixel in the image, determining the spatial proximity P_S of the pixel from the distance between the pixel and the focus point, determining the depth closeness degree P_D of the pixel from the difference between the depth value of the pixel and the depth value of the focus point, and determining the type of the pixel from the spatial proximity P_S and the depth closeness degree P_D, the type being either a background sample point or a foreground sample point;
taking the region corresponding to the background sample points and/or the foreground sample points as an image processing region;
processing the image processing region according to a predetermined image processing mode;
refreshing the display of the processed image;
wherein determining the spatial proximity P_S of the pixel from the distance between the pixel and the focus point comprises: computing the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²; and computing the spatial proximity P_S from the distance S, the natural constant e, and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²));
and wherein determining the depth closeness degree P_D of the pixel from the difference between the depth value of the pixel and the depth value of the focus point comprises: computing the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L); and computing the depth closeness degree P_D from the difference R, the natural constant e, and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
2. The method according to claim 1, characterized in that determining the type of the pixel from the spatial proximity P_S and the depth closeness degree P_D comprises:
obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D;
judging the magnitude relationship between the final closeness degree P and a predetermined threshold;
if the judged result is that the final closeness degree P is less than the predetermined threshold, determining that the type of the pixel is a background sample point;
if the judged result is that the final closeness degree P is greater than the predetermined threshold, determining that the type of the pixel is a foreground sample point.
3. The method according to claim 1, characterized in that determining the type of the pixel from the spatial proximity P_S and the depth closeness degree P_D comprises:
obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D;
judging the magnitude relationship between the final closeness degree P and a first threshold and a second threshold, wherein the first threshold is less than the second threshold;
if the judged result is that the final closeness degree P is less than the first threshold, determining that the type of the pixel is a background sample point;
if the judged result is that the final closeness degree P is greater than the second threshold, determining that the type of the pixel is a foreground sample point;
if the judged result is that the final closeness degree P is greater than the first threshold and less than the second threshold, determining that the pixel is an undetermined sample point;
finally determining the type of the undetermined sample point from the color vector of the undetermined sample point.
4. The method according to claim 3, characterized in that finally determining the type of the undetermined sample point from the color vector of the undetermined sample point comprises:
for each undetermined sample point, obtaining the color vector of the undetermined sample point;
computing, according to the Bayes posterior probability formula, the probability that the undetermined sample point belongs to the foreground sample points and the probability that it belongs to the background sample points;
choosing the type corresponding to the larger of the two probabilities as the type of the undetermined sample point.
5. The method according to claim 2 or 3, characterized in that obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D comprises:
multiplying the spatial proximity P_S by the depth closeness degree P_D to obtain the final closeness degree P of the pixel.
6. The method according to any one of claims 1 to 4, characterized in that processing the image processing region according to a predetermined image processing mode comprises:
processing the region corresponding to the background sample points according to a first predetermined image processing mode;
or,
processing the region corresponding to the background sample points according to the first predetermined image processing mode, and processing the region corresponding to the foreground sample points according to a second predetermined image processing mode;
or,
processing the region corresponding to the foreground sample points according to the second predetermined image processing mode;
wherein the first predetermined image processing mode comprises image virtualization (blurring) processing, and the second predetermined image processing mode comprises image sharpening processing.
7. An image display device, characterized in that the device comprises:
an eyeball acquisition module, for collecting the user's eyeball movement information while the user browses a displayed image;
a focus determination module, for determining, from the eyeball movement information, the focus point in the image that the eyeball corresponds to;
an area determination module, for determining an image processing region from the focus point;
an image processing module, for processing the image processing region according to a predetermined image processing mode;
an image display module, for refreshing the display of the processed image;
wherein the area determination module comprises a type division submodule and a region determination submodule;
the type division submodule is for, for each pixel in the image, determining the type of the pixel from the distance between the pixel and the focus point and from the difference between the depth value of the pixel and the depth value of the focus point, the type being either a background sample point or a foreground sample point;
the region determination submodule is for taking the region corresponding to the background sample points and/or the foreground sample points as the image processing region;
the type division submodule comprises a distance determining unit, a depth determining unit, and a type determining unit;
the distance determining unit is for determining the spatial proximity P_S of the pixel from the distance between the pixel and the focus point;
the depth determining unit is for determining the depth closeness degree P_D of the pixel from the difference between the depth value of the pixel and the depth value of the focus point;
the type determining unit is for determining the type of the pixel from the spatial proximity P_S and the depth closeness degree P_D;
the distance determining unit comprises a distance computation subunit and a distance determination subunit;
the distance computation subunit is for computing the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²;
the distance determination subunit is for computing the spatial proximity P_S from the distance S, the natural constant e, and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²));
the depth determining unit comprises a depth computation subunit and a depth determination subunit;
the depth computation subunit is for computing the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L);
the depth determination subunit is for computing the depth closeness degree P_D from the difference R, the natural constant e, and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
8. The device according to claim 7, characterized in that the type determining unit comprises a similarity determination subelement, a threshold judgment subelement, a background determination subelement, and a foreground determination subelement;
the similarity determination subelement is for obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D;
the threshold judgment subelement is for judging the magnitude relationship between the final closeness degree P and a predetermined threshold;
the background determination subelement is for, if the judged result is that the final closeness degree P is less than the predetermined threshold, determining that the type of the pixel is a background sample point;
the foreground determination subelement is for, if the judged result is that the final closeness degree P is greater than the predetermined threshold, determining that the type of the pixel is a foreground sample point.
9. The device according to claim 7, characterized in that the type determining unit comprises a similarity determination subelement, a threshold comparison subelement, a first determination subelement, a second determination subelement, a third determination subelement, and a final determination subelement;
the similarity determination subelement is for obtaining the final closeness degree P of the pixel from the spatial proximity P_S and the depth closeness degree P_D;
the threshold comparison subelement is for judging the magnitude relationship between the final closeness degree P and a first threshold and a second threshold, wherein the first threshold is less than the second threshold;
the first determination subelement is for, if the judged result is that the final closeness degree P is less than the first threshold, determining that the type of the pixel is a background sample point;
the second determination subelement is for, if the judged result is that the final closeness degree P is greater than the second threshold, determining that the type of the pixel is a foreground sample point;
the third determination subelement is for, if the judged result is that the final closeness degree P is greater than the first threshold and less than the second threshold, determining that the pixel is an undetermined sample point;
the final determination subelement is for finally determining the type of the undetermined sample point from the color vector of the undetermined sample point.
10. The device according to claim 9, characterized in that
the final determination subelement is further for, for each undetermined sample point, obtaining the color vector of the undetermined sample point; computing, according to the Bayes posterior probability formula, the probability that the undetermined sample point belongs to the foreground sample points and the probability that it belongs to the background sample points; and choosing the type corresponding to the larger of the two probabilities as the type of the undetermined sample point.
11. The device according to claim 8 or 9, characterized in that
the similarity determination subelement is further for multiplying the spatial proximity P_S by the depth closeness degree P_D to obtain the final closeness degree P of the pixel.
12. The device according to any one of claims 7 to 10, characterized in that the image processing module comprises: a background processing submodule; or, a comprehensive processing submodule; or, a foreground processing submodule;
the background processing submodule is for processing the region corresponding to the background sample points according to a first predetermined image processing mode;
the comprehensive processing submodule is for processing the region corresponding to the background sample points according to the first predetermined image processing mode, and processing the region corresponding to the foreground sample points according to a second predetermined image processing mode;
the foreground processing submodule is for processing the region corresponding to the foreground sample points according to the second predetermined image processing mode;
wherein the first predetermined image processing mode comprises image virtualization (blurring) processing, and the second predetermined image processing mode comprises image sharpening processing.
13. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a memory; and
one or more modules, the one or more modules being stored in the memory and configured to be executed by the one or more processors, the one or more modules having the functions of:
collecting the user's eyeball movement information while the user browses a displayed image;
determining, from the eyeball movement information, the focus point in the image that the eyeball corresponds to;
for each pixel in the image, determining the spatial proximity P_S of the pixel from the distance between the pixel and the focus point, determining the depth closeness degree P_D of the pixel from the difference between the depth value of the pixel and the depth value of the focus point, and determining the type of the pixel from the spatial proximity P_S and the depth closeness degree P_D, the type being either a background sample point or a foreground sample point;
taking the region corresponding to the background sample points and/or the foreground sample points as an image processing region;
processing the image processing region according to a predetermined image processing mode;
refreshing the display of the processed image;
wherein determining the spatial proximity P_S of the pixel from the distance between the pixel and the focus point comprises: computing the distance S between the pixel (x_i, y_i) and the focus point (x_L, y_L), where S = (x_i - x_L)² + (y_i - y_L)²; and computing the spatial proximity P_S from the distance S, the natural constant e, and the first empirical value σ, where P_S = e^(-S/(2σ²)) = e^(-[(x_i - x_L)² + (y_i - y_L)²]/(2σ²));
and wherein determining the depth closeness degree P_D of the pixel from the difference between the depth value of the pixel and the depth value of the focus point comprises: computing the difference R between the depth value D(x_i, y_i) of the pixel and the depth value D(x_L, y_L) of the focus point, where R = D(x_i, y_i) - D(x_L, y_L); and computing the depth closeness degree P_D from the difference R, the natural constant e, and the second empirical value σ, where P_D = e^(-R²/(2σ²)) = e^(-[D(x_i, y_i) - D(x_L, y_L)]²/(2σ²)).
CN201410056786.9A 2014-02-19 2014-02-19 Method for displaying image, device and electronics Active CN103869977B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410056786.9A CN103869977B (en) 2014-02-19 2014-02-19 Method for displaying image, device and electronics


Publications (2)

Publication Number Publication Date
CN103869977A CN103869977A (en) 2014-06-18
CN103869977B true CN103869977B (en) 2016-06-08

Family

ID=50908596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410056786.9A Active CN103869977B (en) 2014-02-19 2014-02-19 Method for displaying image, device and electronics

Country Status (1)

Country Link
CN (1) CN103869977B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6612865B2 (en) * 2014-11-17 2019-11-27 ヤンマー株式会社 Display system for remote control of work equipment
CN104318228A (en) * 2014-11-24 2015-01-28 段然 Method for acquiring optimal visual field through head-mounted video recording device
CN106101533B (en) * 2016-06-15 2019-09-13 努比亚技术有限公司 Render control method, device and mobile terminal
CN111010514B (en) * 2019-12-24 2021-07-06 维沃移动通信(杭州)有限公司 Image processing method and electronic equipment
CN113256661A (en) * 2021-06-23 2021-08-13 北京蜂巢世纪科技有限公司 Image processing method, apparatus, device, medium, and program product

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5850211A (en) * 1996-06-26 1998-12-15 Sun Microsystems, Inc. Eyetrack-driven scrolling
CN102714741A (en) * 2009-10-14 2012-10-03 汤姆森特许公司 Filtering and edge encoding
CN103207664A (en) * 2012-01-16 2013-07-17 联想(北京)有限公司 Image processing method and equipment




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant