CN103019571A - Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen - Google Patents

Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen

Info

Publication number
CN103019571A
CN103019571A
Authority
CN
China
Prior art keywords
image
user
target
finger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011102804102A
Other languages
Chinese (zh)
Inventor
叶敏华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inspur LG Digital Mobile Communications Co Ltd
Original Assignee
Inspur LG Digital Mobile Communications Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inspur LG Digital Mobile Communications Co Ltd filed Critical Inspur LG Digital Mobile Communications Co Ltd
Priority to CN2011102804102A priority Critical patent/CN103019571A/en
Publication of CN103019571A publication Critical patent/CN103019571A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention provide a method and a terminal for editing and adjusting 3D (three-dimensional) images on a terminal touch screen. The method comprises the following steps: detecting a selection operation performed by a user's finger on a 3D image displayed on the touch screen; determining, according to the selection operation, the target 3D image that the user has selected for editing and adjustment; detecting a specific operation performed by the user's finger on the target 3D image; and performing the corresponding editing adjustment on the target 3D image according to the specific operation. With the method provided by the invention, users can edit 3D images conveniently.

Description

Method and terminal for editing and adjusting 3D images on a terminal touch screen
Technical field
The present invention relates to the field of 3D display and 3D image adjustment technology, and in particular to a method and terminal for editing and adjusting 3D images on a terminal touch screen.
Background technology
The 3D display principle in the prior art is as follows: the original image consists of two 2D pictures taken from slightly different viewing angles. When the image is displayed, the left eye is allowed to see only the left picture and the right eye only the right picture. Because the two eyes see the same object at different positions in the two pictures, the user perceives the object stereoscopically, as shown in Figure 1.
There are many display methods that allow the two eyes to see the same object at different positions:
1. The user wears special glasses whose two lenses open and close alternately; while closed, a lens is opaque. When the left lens is open, the television/mobile phone screen displays the left picture, and likewise for the right side. The user's left and right eyes thus see the left and right pictures respectively, producing a stereoscopic effect.
2. The user wears special glasses whose left and right lenses have different polarization directions, and the left and right pictures on the display screen are polarized differently as well, so that the left eye sees the left picture and the right eye sees the right picture.
3. Naked-eye 3D, in which a special display screen ensures that the left eye sees the left picture and the right eye sees the right picture.
In short, whatever 3D display means the television/terminal device adopts, the essence of the 3D effect is that the left and right eyes see the same object at different positions in their respective images.
In the course of realizing the present invention, the inventor found that in the prior art a terminal can only display a 3D image; the user cannot edit or adjust the 3D image on the terminal.
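As a minimal illustration of this screen-parallax principle, the following Python sketch classifies where an object appears relative to the screen from the horizontal offset between its left-eye and right-eye positions. The names and the sign convention are illustrative assumptions, not taken from the patent text.

```python
# Minimal sketch of the screen-parallax model described above.
# Parallax is taken as the signed horizontal offset between where the
# right eye and the left eye see the same object on the screen.

def parallax(left_x: float, right_x: float) -> float:
    """Signed screen parallax of one object, in pixels."""
    return right_x - left_x

def perceived_position(p: float) -> str:
    """Where the object appears relative to the screen surface."""
    if p < 0:
        return "in front of the screen (negative parallax)"
    if p > 0:
        return "behind the screen (positive parallax)"
    return "on the screen surface (zero parallax)"

if __name__ == "__main__":
    # The left picture draws the object at x=520, the right picture at x=500:
    # the right-eye position is to the left, so the parallax is negative.
    print(perceived_position(parallax(520, 500)))  # in front of the screen
```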
Summary of the invention
The object of the present invention is to provide a method and a terminal for editing and adjusting 3D images on a terminal touch screen, so that the user can edit 3D images conveniently.
To achieve the above object, in one aspect, an embodiment of the invention provides a method for editing and adjusting 3D images on a terminal touch screen, comprising:
detecting a selection operation performed by a user's finger on a 3D image displayed on the touch screen;
determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
detecting a specific operation performed by the user's finger on the target 3D image;
performing the corresponding editing adjustment on the target 3D image according to the specific operation.
To achieve the above object, in another aspect, an embodiment of the invention provides a terminal with a 3D display screen, the 3D display screen comprising a touch screen, wherein the terminal comprises:
a first detecting unit, for detecting a selection operation performed by a user's finger on a 3D image displayed on the touch screen;
a selection unit, for determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
a second detecting unit, for detecting a specific operation performed by the user's finger on the target 3D image;
an editing adjustment unit, for performing the corresponding editing adjustment on the target 3D image according to the specific operation.
To achieve the above object, in yet another aspect, an embodiment of the invention also provides a method for editing and adjusting 3D images on a terminal touch screen, the method comprising:
detecting a selection operation performed by a plurality of the user's stylus pens on a 3D image displayed on the touch screen;
determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
detecting a specific operation performed by the plurality of stylus pens on the target 3D image;
performing the corresponding editing adjustment on the target 3D image according to the specific operation.
The beneficial effect of the above method and terminal provided by the embodiments of the invention is that the user can edit 3D images very conveniently: simply by performing the corresponding operations with fingers or stylus pens on the terminal touch screen, the user can adjust and edit the position, display orientation, display size and visual effect of a 3D image on the display screen, thereby satisfying the user's diverse needs.
Description of drawings
To illustrate the technical solutions of the embodiments of the invention or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the principle of prior-art 3D image display;
Fig. 2 is an overall flowchart of a method for editing and adjusting 3D images on a terminal touch screen according to an embodiment of the invention;
Fig. 3 is a schematic diagram of the operation and effect of grabbing a target 3D image displayed with negative parallax and moving it to the left according to an embodiment of the invention;
Fig. 4 is a schematic diagram of the operation and effect of grabbing a target 3D image displayed with negative parallax and enlarging it according to an embodiment of the invention;
Fig. 5 is a schematic diagram of the operation and effect of grabbing a target 3D image displayed with negative parallax and rotating it according to an embodiment of the invention;
Fig. 6 is a schematic diagram of the operation and effect of grabbing a target 3D image displayed with negative parallax and pushing the object farther away according to an embodiment of the invention;
Fig. 7 is an overall functional block diagram of a terminal with a 3D display screen according to an embodiment of the invention;
Fig. 8 is a detailed functional block diagram of another terminal with a 3D display screen according to an embodiment of the invention;
Fig. 9 is a detailed functional block diagram of another terminal with a 3D display screen according to an embodiment of the invention;
Fig. 10 is a detailed functional block diagram of another terminal with a 3D display screen according to an embodiment of the invention;
Fig. 11 is a detailed functional block diagram of another terminal with a 3D display screen according to an embodiment of the invention.
Embodiment
To make the objects, technical solutions and advantages of the embodiments of the invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the scope of protection of the invention.
Fig. 2 is an overall flowchart of a method for editing and adjusting 3D images on a terminal touch screen according to an embodiment of the invention. As shown in Fig. 2, the method 100 comprises the following steps:
110. Detect a selection operation performed by a user's finger on a 3D image displayed on the touch screen;
120. Determine, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
130. Detect a specific operation performed by the user's finger on the target 3D image;
140. Perform the corresponding editing adjustment on the target 3D image according to the specific operation.
In the embodiments of the invention, the terminal has 3D image display capability and a 3D display screen that includes a touch screen. The user selects a 3D image and performs further operations on it with several fingers on the terminal's touch screen in order to edit and/or adjust it. The terminal device may include a mobile communication terminal, a television, a tablet computer and the like. The specific operations performed by the user's fingers on the target 3D image include moving, zooming, rotating, changing the depth of field and similar operations. The corresponding editing adjustments of the target 3D image include moving the position of the target 3D image, enlarging or shrinking it, rotating it, adjusting its depth of field, and so on. The embodiments of the invention are not limited to these; other operation modes or adjustment modes may also be included. The various embodiments of the invention are described separately below.
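The following Python sketch illustrates one possible way to tie steps 130 and 140 together, dispatching a recognized gesture on the selected target 3D image to the corresponding editing adjustment; the gesture names and handler signatures are assumptions introduced only for illustration.

```python
# Illustrative dispatcher from a detected specific operation to its editing
# adjustment. The handler bodies are stubs; each would update the target's
# left-eye and right-eye pictures as described in the embodiments below.
from typing import Any, Callable

def move(target: Any, params: dict) -> None: ...          # shift both eye pictures together
def zoom(target: Any, params: dict) -> None: ...          # enlarge or shrink both eye pictures
def rotate(target: Any, params: dict) -> None: ...        # rotate both eye pictures by the same angle
def change_depth(target: Any, params: dict) -> None: ...  # shift the eye pictures in opposite directions

EDIT_HANDLERS: dict[str, Callable[[Any, dict], None]] = {
    "move": move,
    "zoom": zoom,
    "rotate": rotate,
    "change_depth": change_depth,
}

def apply_edit(gesture: str, target: Any, params: dict) -> None:
    """Perform the editing adjustment corresponding to the detected specific operation."""
    handler = EDIT_HANDLERS.get(gesture)
    if handler is not None:
        handler(target, params)
    # Unrecognized gestures leave the target 3D image unchanged.
```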
In another embodiment, the user may replace fingers with one or more stylus pens or other touch devices to perform the specific operations on a 3D image and thereby achieve the corresponding editing adjustment. Concretely, the processing may comprise the following steps: detecting a selection operation performed by a plurality of the user's stylus pens on a 3D image displayed on the touch screen; determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment; detecting a specific operation performed by the plurality of stylus pens on the target 3D image; and performing the corresponding editing adjustment on the target 3D image according to the specific operation.
In one embodiment, the concrete processing of steps 110 and 120 may be: detecting whether a plurality of the user's fingers touch the outline of a 3D image displayed on the touch screen and, if so, determining that this 3D image is the target 3D image selected by the user for editing and adjustment. It should be noted that the positions at which the user's fingers grab the object on the touch display screen should lie midway between the object's positions in the left-eye and right-eye pictures, because the stereoscopic effect arises precisely from the object appearing at different positions in the left-eye and right-eye pictures.
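A minimal sketch of this selection step, assuming each displayed 3D object exposes its outline in left-eye and right-eye picture coordinates; all names are illustrative and not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Object3D:
    name: str
    left_outline: list[tuple[float, float]]   # outline points in the left-eye picture
    right_outline: list[tuple[float, float]]  # outline points in the right-eye picture

    def grab_outline(self) -> list[tuple[float, float]]:
        # Grab positions on the touch screen lie midway between the left-eye
        # and right-eye outlines, as required above.
        return [((lx + rx) / 2, (ly + ry) / 2)
                for (lx, ly), (rx, ry) in zip(self.left_outline, self.right_outline)]

def select_target(objects: list[Object3D],
                  touches: list[tuple[float, float]],
                  tolerance: float = 20.0) -> Optional[Object3D]:
    """Return the object whose grab outline is touched by every finger, if any."""
    for obj in objects:
        outline = obj.grab_outline()
        if all(any(abs(tx - ox) <= tolerance and abs(ty - oy) <= tolerance
                   for ox, oy in outline)
               for tx, ty in touches):
            return obj
    return None
```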
In another embodiment, where the specific operation is a move operation, the concrete processing of steps 130 and 140 may comprise the following steps: detecting the move operation performed by the user's finger on the touch screen after the target 3D image has been selected; obtaining the moving direction and moving distance of the user's finger; moving the left-eye picture and the right-eye picture corresponding to the target 3D image according to the moving direction and moving distance of the user's finger; and displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen. During this move, only the target 3D image is moved; the positions of the background and of the other images remain unchanged. The moving direction may be any direction parallel to the plane of the touch screen.
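A minimal sketch of the move adjustment, with hypothetical helper names: the same translation is applied to the object's position in both eye pictures, so the object moves without changing its depth.

```python
# Illustrative move step: shift the target's position identically in the
# left-eye and right-eye pictures, following the finger's motion.

def move_target(left_pos: tuple[float, float],
                right_pos: tuple[float, float],
                dx: float, dy: float) -> tuple[tuple[float, float], tuple[float, float]]:
    """Shift the left-eye and right-eye positions by the finger motion (dx, dy)."""
    (lx, ly), (rx, ry) = left_pos, right_pos
    return (lx + dx, ly + dy), (rx + dx, ry + dy)

# Example: the finger moved 30 px to the left and 5 px down; both eye pictures follow,
# so the parallax (and hence the perceived depth) of the target is unchanged.
new_left, new_right = move_target((520.0, 300.0), (500.0, 300.0), dx=-30.0, dy=5.0)
```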
In another embodiment, where the specific operation is a zoom operation, the concrete processing of steps 130 and 140 may comprise the following steps: detecting the zoom operation performed by the user's finger on the touch screen after the target 3D image has been selected; obtaining the zoom indication and zoom amplitude of the user's finger; shrinking or enlarging the left-eye picture and the right-eye picture corresponding to the target 3D image according to the zoom indication and zoom amplitude of the user's finger; and displaying the zoomed left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen. During this zoom, only the target 3D image is enlarged or shrunk; the background and the other images do not change. The zoom indication indicates whether the operation performed by the user's fingers is a shrink or an enlarge operation. Concretely, after two of the user's fingers have selected the object corresponding to the target 3D image, a pinching or spreading movement of the fingers indicates a shrink or enlarge operation respectively. The change in the distance between the two fingers can be detected, and the zoom amplitude to be applied to the left-eye and right-eye pictures of the 3D image can be obtained from a pre-stored mapping between the change in finger distance and the scaling factor.
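A minimal sketch of the zoom step; the pre-stored mapping between the change in finger distance and the scaling factor is assumed here to be a small table with linear interpolation, which is only one possible realization.

```python
# Illustrative zoom step: look up the scale factor for a given change in the
# distance between two fingers, then apply it to both eye pictures.
import bisect

# Assumed pre-stored mapping: change in finger distance (px) -> scale factor.
DISTANCE_DELTA = [-200.0, -100.0, 0.0, 100.0, 200.0]
SCALE_FACTOR   = [0.25,    0.5,   1.0, 1.5,   2.0]

def scale_from_finger_delta(delta: float) -> float:
    """Interpolate the scale factor for a given change in finger distance."""
    if delta <= DISTANCE_DELTA[0]:
        return SCALE_FACTOR[0]
    if delta >= DISTANCE_DELTA[-1]:
        return SCALE_FACTOR[-1]
    i = bisect.bisect_left(DISTANCE_DELTA, delta)
    d0, d1 = DISTANCE_DELTA[i - 1], DISTANCE_DELTA[i]
    s0, s1 = SCALE_FACTOR[i - 1], SCALE_FACTOR[i]
    return s0 + (s1 - s0) * (delta - d0) / (d1 - d0)

# Spreading the fingers 50 px farther apart enlarges both eye pictures by the
# same factor (1.25 with this illustrative table).
factor = scale_from_finger_delta(50.0)
```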
In another embodiment, where the specific operation is a rotation operation, the concrete processing of steps 130 and 140 may comprise: detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected; obtaining the rotation direction and rotation angle of the user's finger; rotating the left-eye picture and the right-eye picture corresponding to the target 3D image according to the rotation direction and rotation angle of the user's finger; and displaying the rotated left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen. During this rotation, only the target 3D image is rotated; the background and the other images do not change.
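A minimal sketch of the rotation adjustment, assuming the object is rotated by the same signed angle in both eye pictures about a chosen pivot such as the grab position; the helper names are illustrative.

```python
# Illustrative rotation step: rotate the object's outline by the finger's
# rotation angle in the left-eye and right-eye pictures alike.
import math

def rotate_point(point: tuple[float, float],
                 pivot: tuple[float, float],
                 angle_deg: float) -> tuple[float, float]:
    """Rotate one outline point around the pivot; positive angles are counterclockwise."""
    a = math.radians(angle_deg)
    px, py = pivot
    x, y = point[0] - px, point[1] - py
    return (px + x * math.cos(a) - y * math.sin(a),
            py + x * math.sin(a) + y * math.cos(a))

def rotate_target(left_outline, right_outline, left_pivot, right_pivot, angle_deg):
    """Apply the finger's rotation angle to both eye pictures of the target."""
    return ([rotate_point(p, left_pivot, angle_deg) for p in left_outline],
            [rotate_point(p, right_pivot, angle_deg) for p in right_outline])
```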
In another embodiment, where the user changes the depth of field of a 3D image by rotating, the method shown in Fig. 2 also comprises the step of enabling the function in which rotating the object changes the depth of field; in particular, this function is enabled by triggering a physical key or a virtual soft key. The concrete processing of steps 130 and 140 may comprise: detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected; obtaining the rotation direction and rotation amplitude of the user's finger; generating, from the rotation amplitude of the user's finger, the moving distance of the left-eye picture or right-eye picture corresponding to the target 3D image; when the rotation direction is clockwise, moving the left-eye picture corresponding to the target 3D image to the right (or to the left) by the moving distance and moving the right-eye picture corresponding to the target 3D image to the left (or to the right) by the moving distance; or, when the rotation direction is counterclockwise, moving the left-eye picture corresponding to the target 3D image to the left (or to the right) by the moving distance and moving the right-eye picture corresponding to the target 3D image to the right (or to the left) by the moving distance; and displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen. In this process of changing the depth of field by rotation, only the depth of field of the target 3D image is changed; the depth of field of the background and of the other images does not change.
In particular, when generating the moving distance of the left-eye or right-eye picture of the target 3D image from the rotation amplitude of the user's finger, the rotation angle of the user's finger can be detected, and the moving distance of the left-eye and right-eye pictures can then be obtained from a pre-stored mapping between the rotation angle of the user's finger and the displacement of the left-eye or right-eye picture of the 3D image.
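A minimal sketch of this depth-of-field adjustment, assuming a linear pre-stored mapping from rotation angle to horizontal displacement and one of the two shift conventions allowed above (the one used in the Fig. 6 walkthrough below, where a clockwise turn moves the left picture right and the right picture left).

```python
# Illustrative "rotate to change depth" step: convert the finger's rotation
# amplitude into a horizontal shift and apply it to the two eye pictures in
# opposite directions, changing only the target's parallax.

PIXELS_PER_DEGREE = 0.5  # assumed pre-stored mapping: 0.5 px of shift per degree

def depth_shift(left_x: float, right_x: float,
                angle_deg: float, clockwise: bool) -> tuple[float, float]:
    """Shift the left- and right-eye x positions in opposite directions."""
    shift = angle_deg * PIXELS_PER_DEGREE
    if clockwise:
        # Left-eye picture moves right, right-eye picture moves left.
        return left_x + shift, right_x - shift
    # Counterclockwise: the opposite shifts.
    return left_x - shift, right_x + shift

# A 20-degree clockwise turn shifts each eye picture 10 px, changing the
# parallax of the target object while the background parallax is untouched.
new_left_x, new_right_x = depth_shift(520.0, 500.0, angle_deg=20.0, clockwise=True)
```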
In the case of a clockwise rotation, the effect produced is: for a 3D target object displayed with negative parallax (Negative Parallax), the depth of field in the 3D picture becomes deeper, and the object's position in the 3D picture moves back, farther from the display screen surface; for a 3D target object displayed with positive parallax (Positive Parallax), the depth of field in the 3D picture becomes shallower, and the object's position in the 3D picture moves forward, closer to the display screen surface; the other objects and the background do not change.
In the case of a counterclockwise rotation, the effect produced is: for a 3D target object displayed with negative parallax (Negative Parallax), the depth of field in the 3D picture becomes shallower, and the object's position in the 3D picture moves forward, closer to the display screen surface; for a 3D target object displayed with positive parallax (Positive Parallax), the depth of field in the 3D picture becomes deeper, and the object's position in the 3D picture moves back, farther from the display screen surface; the other objects and the background do not change.
The working of the present embodiment is described in detail below from the user's perspective:
1) The user grabs/selects the target object he wants to operate on and edit.
The user stretches out his fingers and grabs the appropriate position on the touch screen, so that several of the user's fingers touch the outline of this target object on the display screen. The terminal device then selects this as the target object the user wants to edit or operate on, such as the apple in Fig. 3.
It should be noted that, because the stereoscopic effect arises from the object appearing at different positions in the left and right pictures, the position at which the user's finger grabs the object on the touch display screen should lie midway between the object's positions in the left and right pictures.
2) The user's finger moves on the touch screen.
As shown in Fig. 3, after the user has grabbed an object displayed in 3D and the user's finger moves on the touch screen, the terminal device moves the object's position in the left picture and in the right picture correspondingly, the moving distance following that of the finger. After the object has been moved in the left and right pictures, the device displays the left and right pictures on the 3D display screen again; the effect is that the 3D target object has moved in the 3D picture, while the other objects and the background have not moved.
3) The user's finger performs a zoom action on the touch screen.
As shown in Fig. 4, after the user has grabbed an object displayed in 3D and the user's fingers zoom on the touch screen, the device scales the size of the object in the left picture and the right picture correspondingly, the zoom amplitude following that of the fingers. After the zoom effect has been applied to the object in the left and right pictures separately, the device displays the left and right pictures on the 3D display screen again; the effect is that the 3D target object has been zoomed in the 3D picture, while the other objects and the background have not changed.
4) The user's finger performs a clockwise or counterclockwise rotation action on the touch screen.
As shown in Fig. 5, after the user has grabbed an object displayed in 3D, the user's fingers rotate as a whole on the touch screen. If the rotation is clockwise, the device rotates the object in the left picture and the right picture both clockwise, the rotation angle following that of the fingers.
After the rotation effect has been applied to the object in the left and right pictures separately, the device displays the left and right pictures on the 3D display screen again; as a result, the object in the 3D effect has also rotated by the same angle.
If the rotation is counterclockwise, the device rotates the object in the left picture and the right picture both counterclockwise, the rotation angle following that of the fingers.
After the rotation effect has been applied to the object in the left and right pictures separately, the device displays the left and right pictures on the 3D display screen again; as a result, the object in the 3D effect has also rotated by the same angle.
The other objects and the background do not change.
5) The user's finger performs a clockwise or counterclockwise rotation action on the touch screen, in combination with another setting or another key operation.
In particular, before the user rotates the object, the software may provide an option, "rotating the object changes the depth of field"; alternatively, the software may agree by default that while the user holds down a special key, such as the volume key, rotating the finger on the touch screen does not rotate the object but changes the depth of field instead (a brief sketch of this mode switch is given after this walkthrough).
As shown in Fig. 6, after the user has grabbed an object displayed in 3D, the fingers rotate as a whole on the touch screen. If the rotation is clockwise, the device moves this object in the left picture to the right and moves this object in the right picture to the left; the moving distance follows the rotation amplitude of the fingers.
After the object-movement effect has been applied in the left and right pictures separately, the device displays the left and right pictures on the 3D display screen again, and the effect is:
for a 3D target object displayed with negative parallax (Negative Parallax), the depth of field in the 3D picture becomes deeper, and the object's position in the 3D picture moves back, farther from the display screen surface;
for a 3D target object displayed with positive parallax (Positive Parallax), the depth of field in the 3D picture becomes shallower, and the object's position in the 3D picture moves forward, closer to the display screen surface.
If the rotation is counterclockwise, the device moves this object in the left picture to the left and moves this object in the right picture to the right; the moving distance follows the rotation amplitude of the fingers.
After the object-movement effect has been applied in the left and right pictures separately, the device displays the left and right pictures on the 3D display screen again, and the effect is:
for a 3D target object displayed with negative parallax (Negative Parallax), the depth of field in the 3D picture becomes shallower, and the object's position in the 3D picture moves forward, closer to the display screen surface;
for a 3D target object displayed with positive parallax (Positive Parallax), the depth of field in the 3D picture becomes deeper, and the object's position in the 3D picture moves back, farther from the display screen surface.
The other objects and the background do not change.
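A minimal sketch of the mode switch described in item 5), with illustrative names: a rotation gesture is routed either to an ordinary rotation or to a depth change, depending on the soft option or on whether a special key such as the volume key is held down.

```python
# Illustrative mode switch for the "rotate to change depth" behaviour.
from dataclasses import dataclass

@dataclass
class EditSettings:
    rotate_changes_depth: bool = False  # the "rotating the object changes the depth of field" option

def handle_rotation(settings: EditSettings, volume_key_held: bool,
                    angle_deg: float, clockwise: bool) -> str:
    direction = "clockwise" if clockwise else "counterclockwise"
    if settings.rotate_changes_depth or volume_key_held:
        # Route the gesture to the depth adjustment (see the depth_shift sketch above).
        return f"change depth: {direction} {angle_deg} deg"
    # Otherwise rotate the object in both eye pictures (see the rotate_target sketch).
    return f"rotate object: {direction} {angle_deg} deg"
```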
The advantage of the method embodiments of the present invention is that the user can edit 3D images very conveniently.
Fig. 7 is an overall functional block diagram of a terminal with a 3D display screen according to an embodiment of the invention. As shown in Fig. 7, the terminal 200 has a 3D display screen that includes a touch screen, and the terminal 200 further comprises:
a first detecting unit 210, for detecting a selection operation performed by a user's finger on a 3D image displayed on the touch screen;
a selection unit 220, for determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
a second detecting unit 230, for detecting a specific operation performed by the user's finger on the target 3D image;
an editing adjustment unit 240, for performing the corresponding editing adjustment on the target 3D image according to the specific operation.
Optionally, the first detecting unit 210 may specifically be used to detect whether a plurality of the user's fingers touch the outline of a 3D image displayed on the touch screen; and the selection unit 220 may specifically be used to determine, when the judgment result of the first detecting unit is yes, that the 3D image is the target 3D image selected by the user for editing and adjustment.
In one embodiment, as shown in Fig. 8, in the terminal 300 the second detecting unit 230 comprises: a movement detection module 231, for detecting the move operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a movement parameter acquisition module 232, for obtaining the moving direction and moving distance of the user's finger.
The editing adjustment unit 240 comprises: a first image moving module 241, for moving the left-eye picture and the right-eye picture corresponding to the target 3D image according to the moving direction and moving distance of the user's finger; and a movement display processing module 242, for displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the 3D display screen.
In another embodiment, as shown in Fig. 9, in the terminal 400 the second detecting unit 230 comprises: a zoom detection module 233, for detecting the zoom operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a zoom parameter acquisition module 234, for obtaining the zoom indication and zoom amplitude of the user's finger.
The editing adjustment unit 240 comprises: an image zoom module 243, for shrinking or enlarging the left-eye picture and the right-eye picture corresponding to the target 3D image according to the zoom indication and zoom amplitude of the user's finger; and a zoom display processing module 244, for displaying the zoomed left-eye picture and right-eye picture corresponding to the target 3D image on the 3D display screen.
In another embodiment, as shown in Fig. 10, in the terminal 500 the second detecting unit 230 comprises: a first rotation detection module 235, for detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a first rotation parameter acquisition module 236, for obtaining the rotation direction and rotation angle of the user's finger.
The editing adjustment unit 240 comprises: an image rotation module 245, for rotating the left-eye picture and the right-eye picture corresponding to the target 3D image according to the rotation direction and rotation angle of the user's finger; and a rotation display processing module 246, for displaying the rotated left-eye picture and right-eye picture corresponding to the target 3D image on the 3D display screen.
In another embodiment, as shown in Fig. 11, the terminal 600 further comprises: a depth-of-field change trigger unit 250, for enabling the function in which rotating the object changes the depth of field.
The second detecting unit 230 comprises: a second rotation detection module 237, for detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a second rotation parameter acquisition module 238, for obtaining the rotation direction and rotation amplitude of the user's finger.
The editing adjustment unit 240 comprises: a moving distance generation module 247, for generating, from the rotation amplitude of the user's finger, the moving distance of the left-eye picture or right-eye picture corresponding to the target 3D image; a second image moving module 248, for, when the rotation direction is clockwise, moving the left-eye picture corresponding to the target 3D image to the right (or to the left) by the moving distance and moving the right-eye picture corresponding to the target 3D image to the left (or to the right) by the moving distance, or, when the rotation direction is counterclockwise, moving the left-eye picture corresponding to the target 3D image to the left (or to the right) by the moving distance and moving the right-eye picture corresponding to the target 3D image to the right (or to the left) by the moving distance; and a depth-of-field adjustment display processing module 249, for displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the 3D display screen.
The working methods of the terminals 200-600 of the embodiments of the invention have been described in detail in the method embodiments above and are not repeated here.
The advantage of the device embodiments of the present invention is that, with the above terminal, the user can edit 3D images very conveniently.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented with electronic hardware, computer software, or a combination of the two. To illustrate the interchangeability of hardware and software clearly, the composition and steps of each example have been described above in general terms according to their functions. Whether these functions are performed in hardware or in software depends on the particular application and the design constraints of the technical solution. Skilled persons may use different methods for each particular application to realize the described functions, but such realization should not be considered to exceed the scope of the present invention.
The steps of the methods or algorithms described in connection with the embodiments disclosed herein may be implemented in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), internal memory, read-only memory (ROM), electrically programmable ROM, electrically erasable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above are only preferred embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention should be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be determined by the scope of protection of the claims.

Claims (13)

1. A method for editing and adjusting 3D images on a terminal touch screen, characterized in that the method comprises:
detecting a selection operation performed by a user's finger on a 3D image displayed on the touch screen;
determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
detecting a specific operation performed by the user's finger on the target 3D image;
performing the corresponding editing adjustment on the target 3D image according to the specific operation.
2. The method according to claim 1, characterized in that detecting the selection operation performed by the user's finger on the 3D image displayed on the touch screen and determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment comprise:
detecting whether a plurality of the user's fingers touch the outline of a 3D image displayed on the touch screen and, if so, determining that the 3D image is the target 3D image selected by the user for editing and adjustment.
3. The method according to claim 1 or 2, characterized in that detecting the specific operation performed by the user's finger on the target 3D image and performing the corresponding editing adjustment on the target 3D image according to the specific operation comprise:
detecting the move operation performed by the user's finger on the touch screen after the target 3D image has been selected;
obtaining the moving direction and moving distance of the user's finger;
moving the left-eye picture and the right-eye picture corresponding to the target 3D image according to the moving direction and moving distance of the user's finger;
displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
4. The method according to claim 1 or 2, characterized in that detecting the specific operation performed by the user's finger on the target 3D image and performing the corresponding editing adjustment on the target 3D image according to the specific operation comprise:
detecting the zoom operation performed by the user's finger on the touch screen after the target 3D image has been selected;
obtaining the zoom indication and zoom amplitude of the user's finger;
shrinking or enlarging the left-eye picture and the right-eye picture corresponding to the target 3D image according to the zoom indication and zoom amplitude of the user's finger;
displaying the zoomed left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
5. The method according to claim 1 or 2, characterized in that detecting the specific operation performed by the user's finger on the target 3D image and performing the corresponding editing adjustment on the target 3D image according to the specific operation comprise:
detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected;
obtaining the rotation direction and rotation angle of the user's finger;
rotating the left-eye picture and the right-eye picture corresponding to the target 3D image according to the rotation direction and rotation angle of the user's finger;
displaying the rotated left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
6. The method according to claim 1 or 2, characterized in that, before detecting the specific operation performed by the user's finger on the target 3D image, the method further comprises: enabling a function in which rotating the object changes the depth of field;
and in that detecting the specific operation performed by the user's finger on the target 3D image and performing the corresponding editing adjustment on the target 3D image according to the specific operation comprise:
detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected;
obtaining the rotation direction and rotation amplitude of the user's finger;
generating, from the rotation amplitude of the user's finger, the moving distance of the left-eye picture or right-eye picture corresponding to the target 3D image;
when the rotation direction is clockwise, moving the left-eye picture corresponding to the target 3D image to the right or to the left by the moving distance and moving the right-eye picture corresponding to the target 3D image to the left or to the right by the moving distance; or, when the rotation direction is counterclockwise, moving the left-eye picture corresponding to the target 3D image to the left or to the right by the moving distance and moving the right-eye picture corresponding to the target 3D image to the right or to the left by the moving distance;
displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
7. A terminal with a 3D display screen, characterized in that the 3D display screen comprises a touch screen and the terminal comprises:
a first detecting unit, for detecting a selection operation performed by a user's finger on a 3D image displayed on the touch screen;
a selection unit, for determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
a second detecting unit, for detecting a specific operation performed by the user's finger on the target 3D image;
an editing adjustment unit, for performing the corresponding editing adjustment on the target 3D image according to the specific operation.
8. The terminal according to claim 7, characterized in that
the first detecting unit is specifically used to detect whether a plurality of the user's fingers touch the outline of a 3D image displayed on the touch screen;
the selection unit is specifically used to determine, when the judgment result of the first detecting unit is yes, that the 3D image is the target 3D image selected by the user for editing and adjustment.
9. The terminal according to claim 7 or 8, characterized in that
the second detecting unit comprises: a movement detection module, for detecting the move operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a movement parameter acquisition module, for obtaining the moving direction and moving distance of the user's finger;
the editing adjustment unit comprises: a first image moving module, for moving the left-eye picture and the right-eye picture corresponding to the target 3D image according to the moving direction and moving distance of the user's finger; and a movement display processing module, for displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
10. The terminal according to claim 7 or 8, characterized in that
the second detecting unit comprises: a zoom detection module, for detecting the zoom operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a zoom parameter acquisition module, for obtaining the zoom indication and zoom amplitude of the user's finger;
the editing adjustment unit comprises: an image zoom module, for shrinking or enlarging the left-eye picture and the right-eye picture corresponding to the target 3D image according to the zoom indication and zoom amplitude of the user's finger; and a zoom display processing module, for displaying the zoomed left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
11. The terminal according to claim 7 or 8, characterized in that
the second detecting unit comprises: a first rotation detection module, for detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a first rotation parameter acquisition module, for obtaining the rotation direction and rotation angle of the user's finger;
the editing adjustment unit comprises: an image rotation module, for rotating the left-eye picture and the right-eye picture corresponding to the target 3D image according to the rotation direction and rotation angle of the user's finger; and a rotation display processing module, for displaying the rotated left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
12. The terminal according to claim 7 or 8, characterized in that the terminal further comprises: a depth-of-field change trigger unit, for enabling a function in which rotating the object changes the depth of field;
the second detecting unit comprises: a second rotation detection module, for detecting the rotation operation performed by the user's finger on the touch screen after the target 3D image has been selected; and a second rotation parameter acquisition module, for obtaining the rotation direction and rotation amplitude of the user's finger;
the editing adjustment unit comprises: a moving distance generation module, for generating, from the rotation amplitude of the user's finger, the moving distance of the left-eye picture or right-eye picture corresponding to the target 3D image; a second image moving module, for, when the rotation direction is clockwise, moving the left-eye picture corresponding to the target 3D image to the right or to the left by the moving distance and moving the right-eye picture corresponding to the target 3D image to the left or to the right by the moving distance, or, when the rotation direction is counterclockwise, moving the left-eye picture corresponding to the target 3D image to the left or to the right by the moving distance and moving the right-eye picture corresponding to the target 3D image to the right or to the left by the moving distance; and a depth-of-field adjustment display processing module, for displaying the moved left-eye picture and right-eye picture corresponding to the target 3D image on the terminal's 3D display screen.
13. A method for editing and adjusting 3D images on a terminal touch screen, characterized in that the method comprises:
detecting a selection operation performed by a plurality of the user's stylus pens on a 3D image displayed on the touch screen;
determining, according to the selection operation, the target 3D image selected by the user for editing and adjustment;
detecting a specific operation performed by the plurality of stylus pens on the target 3D image;
performing the corresponding editing adjustment on the target 3D image according to the specific operation.
CN2011102804102A 2011-09-20 2011-09-20 Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen Pending CN103019571A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011102804102A CN103019571A (en) 2011-09-20 2011-09-20 Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011102804102A CN103019571A (en) 2011-09-20 2011-09-20 Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen

Publications (1)

Publication Number Publication Date
CN103019571A true CN103019571A (en) 2013-04-03

Family

ID=47968228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011102804102A Pending CN103019571A (en) 2011-09-20 2011-09-20 Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen

Country Status (1)

Country Link
CN (1) CN103019571A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106933350A (en) * 2017-02-09 2017-07-07 深圳市创想天空科技股份有限公司 AR exchange methods and device
CN107357500A (en) * 2017-06-21 2017-11-17 努比亚技术有限公司 A kind of picture-adjusting method, terminal and storage medium
WO2018166156A1 (en) * 2017-03-13 2018-09-20 中兴通讯股份有限公司 Method and apparatus for browsing images
CN108701352A (en) * 2016-03-23 2018-10-23 英特尔公司 Amending image using the identification based on three dimensional object model and enhancing
CN115268723A (en) * 2022-07-29 2022-11-01 联想(北京)有限公司 Control method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266546A (en) * 2008-05-12 2008-09-17 深圳华为通信技术有限公司 Method for accomplishing operating system three-dimensional display and three-dimensional operating system
CN101770324A (en) * 2008-12-31 2010-07-07 商泰软件(上海)有限公司 Method for realizing interactive operation of 3D graphical interface
US20110010666A1 (en) * 2009-07-07 2011-01-13 Lg Electronics Inc. Method for displaying three-dimensional user interface
US20110093778A1 (en) * 2009-10-20 2011-04-21 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20110164029A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Working with 3D Objects


Similar Documents

Publication Publication Date Title
CN104731471B (en) Mobile terminal and control method thereof
KR101763263B1 (en) 3d display terminal apparatus and operating method
US9864495B2 (en) Indirect 3D scene positioning control
US9824485B2 (en) Presenting a view within a three dimensional scene
JP5703703B2 (en) Information processing apparatus, stereoscopic display method, and program
KR101601657B1 (en) Methods and systems for interacting with projected user interface
EP2638461B1 (en) Apparatus and method for user input for controlling displayed information
CN103019571A (en) Method and terminal for performing 3D (three-dimensional) image edition adjustment on terminal touch screen
JP2011108152A (en) Three-dimensional input display device
US20130222363A1 (en) Stereoscopic imaging system and method thereof
US9432652B2 (en) Information processing apparatus, stereoscopic display method, and program
JP2014503927A (en) Mobile device for displaying 3D video including a plurality of layers and display method thereof
JP2012256214A (en) Information processing device, information processing method, and program
JP5640680B2 (en) Information processing apparatus, stereoscopic display method, and program
JP2012247838A (en) Display device, display control method, and program
US20120120063A1 (en) Image processing device, image processing method, and program
US8941648B2 (en) Mobile terminal and control method thereof
CN116325720A (en) Dynamic resolution of depth conflicts in telepresence
KR101850391B1 (en) Mobile terminal and control method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130403

WD01 Invention patent application deemed withdrawn after publication