WO2014067110A1 - Drawing control method, device and mobile terminal - Google Patents
Drawing control method, device and mobile terminal
- Publication number
- WO2014067110A1 (PCT/CN2012/083876)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- attribute information
- gesture trajectory
- trajectory
- point
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04804—Transparency, e.g. transparent or translucent windows
Definitions
- the present invention relates to mobile communication technologies, and in particular, to a drawing control method, apparatus, and terminal.
Background Art
- existing drawing tools require the user to make manual selections when changing stroke thickness and transparency.
- in current touch-screen drawing, the user must select the stroke thickness and transparency before drawing, or change them after drawing by tapping through selection menus.
- the user must select among multiple options such as stroke type, thickness, transparency, and color, so the operation is complicated.
- an embodiment of the present invention provides a drawing control method, including: detecting a gesture track input by a user, where the gesture track is generated during non-contact movement of a user-controlled input device relative to the display screen; acquiring first attribute information of the gesture track, where the first attribute information is feature information of the gesture track recognized by the terminal; identifying the gesture track according to a preset rule and the first attribute information to obtain second attribute information of the gesture track, where the second attribute information includes part or all of the feature information required by the terminal to display the gesture track; and presenting the gesture track according to the second attribute information.
- the first attribute information of the gesture track includes any one of the following attributes or a combination thereof: the distance between each point on the gesture track and the display screen; the moving speed or acceleration of each point on the gesture track; the pause time of each point on the gesture track.
- the second attribute information of the gesture track includes any one of the following attributes or a combination thereof: the thickness at which each point on the gesture track is to be presented; the transparency at which each point on the gesture track is to be presented.
- before the presenting of the gesture track according to the second attribute information, the method further includes: acquiring a drawing stroke type; the presenting of the gesture track according to the second attribute information then includes: presenting the gesture track according to the second attribute information and the drawing stroke type.
- the acquiring of a drawing stroke type specifically includes: acquiring a first gesture command input by the user, and determining the drawing stroke type according to the first gesture command.
- the determining of the drawing stroke type according to the first gesture command includes: determining, according to the first gesture command, a drawing stroke type corresponding to the first gesture; or presenting at least one drawing stroke type according to the first gesture command and determining the drawing stroke type by receiving the user's selection from the at least one presented type.
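The first-gesture-command step above admits two branches: mapping the recognized command directly to a stroke type, or presenting a palette and taking the user's selection. A minimal sketch follows; the gesture identifiers and stroke names are illustrative assumptions, since the patent names no concrete gestures or stroke types:

```python
from typing import List, Optional

# Hypothetical gesture-command -> stroke-type table; all names are assumptions.
GESTURE_TO_STROKE = {"circle": "pencil", "zigzag": "brush", "tap_hold": "spray"}

def stroke_from_gesture(gesture_command: str) -> Optional[str]:
    """Branch 1: the first gesture command maps directly to one stroke type."""
    return GESTURE_TO_STROKE.get(gesture_command)

def stroke_from_selection(palette: List[str], selected_index: int) -> str:
    """Branch 2: present at least one stroke type and take the user's pick."""
    if not 0 <= selected_index < len(palette):
        raise ValueError("selection outside presented palette")
    return palette[selected_index]
```

An unrecognized command returns `None`, so a caller can fall back to presenting the palette (branch 2).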
- the method further includes: acquiring a second gesture command input by the user, and determining a drawing stroke color according to the second gesture command.
- the presenting of the gesture track according to the second attribute information specifically includes: presenting the gesture track according to the second attribute information and the drawing stroke color; or presenting the gesture track according to the second attribute information, the drawing stroke type, and the drawing stroke color.
- the method further includes: acquiring a third gesture command input by the user; and performing a full deletion operation or a partial deletion operation on the presented gesture track according to the third gesture command.
- when the first attribute information of the gesture track is the distance between each point on the gesture track and the display screen,
- and the second attribute information of the gesture track is the thickness at which each point on the gesture track is to be presented,
- the preset rule is: the closer a point on the gesture track is to the display screen, the thicker that point is presented; or alternatively, the closer a point on the gesture track is to the display screen, the thinner that point is presented.
- when the first attribute information of the gesture track is the distance between each point on the gesture track and the display screen, and the second attribute information is the transparency at which each point is to be presented,
- the preset rule is: the closer a point on the gesture track is to the display screen, the lower the transparency at which that point is presented; or alternatively, the closer a point on the gesture track is to the display screen, the higher the transparency at which that point is presented.
- when the first attribute information of the gesture track is the moving speed or acceleration of the gesture track,
- and the second attribute information is the thickness at which each point on the gesture track is to be presented,
- the preset rule is: the smaller the moving speed or acceleration of the gesture track, the thicker each point on the gesture track is presented; or alternatively, the smaller the moving speed or acceleration, the thinner each point on the gesture track is presented.
- when the first attribute information of the gesture track is the pause time of each point on the gesture track, and the second attribute information is the thickness at which each point is to be presented, the preset rule is: the longer the pause time of a point on the gesture track, the thicker that point is presented; or alternatively, the longer the pause time of a point, the thinner that point is presented.
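The preset rules above are monotone mappings from a first attribute (distance, speed, or pause time) to a second attribute (thickness or transparency). A sketch under assumed numeric ranges; the directions shown (closer means thicker, longer pause means thicker) are only one of the two alternatives each rule allows:

```python
# Illustrative preset rules; numeric ranges and mapping directions are
# assumptions (each rule also allows the inverted direction).

def thickness_from_distance(d: float, d_max: float,
                            t_min: float = 1.0, t_max: float = 10.0) -> float:
    """Closer finger -> thicker point, linear between t_max and t_min."""
    d = min(max(d, 0.0), d_max)          # clamp to the perceivable range
    return t_max - (t_max - t_min) * d / d_max

def thickness_from_pause(pause_s: float, t_min: float = 1.0,
                         t_max: float = 10.0, full_at: float = 2.0) -> float:
    """Longer pause -> thicker point, saturating after `full_at` seconds."""
    frac = min(max(pause_s, 0.0) / full_at, 1.0)
    return t_min + (t_max - t_min) * frac
```

A speed-based rule would have the same shape as the distance rule, with speed replacing distance.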
- a drawing control apparatus, including: a detecting module, configured to detect a gesture track input by a user, where the gesture track is generated during non-contact movement of a user-controlled input device relative to the display screen; a first attribute information acquiring module, configured to acquire first attribute information of the gesture track detected by the detecting module, where the first attribute information is feature information of the gesture track recognized by the terminal; and a second attribute information acquiring module, configured to identify the gesture track according to a preset rule and the first attribute information acquired by the first attribute information acquiring module, to obtain second attribute information of the gesture track, where the second attribute information includes part or all of the feature information required by the terminal to display the gesture track.
- a presentation module, configured to present the gesture track according to the second attribute information acquired by the second attribute information acquiring module.
- the first attribute information of the gesture track includes any one of the following attributes or a combination thereof: the distance between each point on the gesture track and the display screen; the moving speed or acceleration of each point on the gesture track; the pause time of each point on the gesture track.
- the second attribute information of the gesture track includes any one of the following attributes or a combination thereof: The thickness of each point to be presented on the gesture track; the transparency of each point to be presented on the gesture track.
- the drawing control apparatus further includes: a drawing stroke type acquiring module, configured to acquire a drawing stroke type; the presentation module is specifically configured to present the gesture track according to the second attribute information acquired by the second attribute information acquiring module and the drawing stroke type acquired by the drawing stroke type acquiring module.
- the drawing control device further includes: a gesture command acquiring module, configured to acquire a first gesture command input by the user
- the drawing stroke type acquisition module is specifically configured to determine a drawing stroke type according to the first gesture command acquired by the gesture command acquisition module.
- the drawing stroke type acquiring module is specifically configured to: determine, according to the first gesture command acquired by the gesture command acquiring module, a drawing stroke type corresponding to the first gesture; or present at least one drawing stroke type according to the first gesture command acquired by the gesture command acquiring module, and determine the drawing stroke type by receiving the user's selection from the at least one presented type.
- the gesture command acquiring module is further configured to acquire a second gesture command input by the user;
- the drawing control apparatus further includes: a drawing stroke color acquiring module, configured to determine a drawing stroke color according to the second gesture command acquired by the gesture command acquiring module; the presentation module is specifically configured to present the gesture track according to the second attribute information of the gesture track and the drawing stroke color; or to present the gesture track according to the second attribute information, the drawing stroke type, and the drawing stroke color.
- the gesture command acquiring module is further configured to acquire a third gesture command input by a user;
- the drawing control apparatus further includes a deletion module, configured to perform a full deletion operation or a partial deletion operation on the presented gesture track according to the third gesture command acquired by the gesture command acquiring module.
- a terminal, including: an input device, configured to detect a gesture track input by a user, where the gesture track is generated during non-contact movement of a user-controlled input device relative to the display screen; a central processor, configured to parse the first attribute information of the gesture track acquired by the input device, where the first attribute information is feature information of the gesture track recognized by the terminal, and to identify the gesture track according to a preset rule and the first attribute information to obtain second attribute information of the gesture track, where the second attribute information includes part or all of the feature information required by the terminal to display the gesture track;
- a display screen, configured to present the gesture track according to the second attribute information acquired by the central processor; and a memory, configured to store the preset rule.
- the first attribute information of the gesture track includes any one of the following attributes or a combination thereof: the distance between each point on the gesture track and the display screen; the moving speed or acceleration of each point on the gesture track; the pause time of each point on the gesture track.
- the second attribute information of the gesture track includes any one of the following attributes or a combination thereof: The thickness of each point to be presented on the gesture track; the transparency of each point to be presented on the gesture track.
- the central processing unit is further configured to acquire a drawing stroke type;
- the display screen is specifically configured to present the gesture track according to the second attribute information of the gesture track and the drawing stroke type.
- the input device is further configured to acquire a first gesture command input by the user; the central processor is specifically configured to determine a drawing stroke type according to the first gesture command acquired by the input device.
- the central processor is specifically configured to: determine, according to the first gesture command acquired by the input device, a drawing stroke type corresponding to the first gesture; or present at least one drawing stroke type according to the first gesture command acquired by the input device, and determine the drawing stroke type according to the user's selection, received by the input device, from the at least one presented type.
- the input device is further configured to acquire a second gesture command input by a user;
- the central processor is further configured to determine a drawing stroke color according to the second gesture command acquired by the input device; the display screen is specifically configured to present the gesture track according to the second attribute information of the gesture track and the drawing stroke color; or to present the gesture track according to the second attribute information, the drawing stroke type, and the drawing stroke color.
- the input device is further configured to acquire a third gesture command input by a user;
- according to the third gesture command acquired by the input device, a full deletion operation or a partial deletion operation is performed on the gesture track presented on the display screen.
- the drawing control method, apparatus, and terminal provided by the embodiments detect a gesture track input by a user, where the gesture track is generated during non-contact movement of a user-controlled input device relative to the display screen; then acquire first attribute information of the gesture track; then identify the gesture track according to a preset rule and the first attribute information to obtain second attribute information of the gesture track; and finally present the gesture track according to the second attribute information. In this way, part of the feature information required to present the track is carried in the first attribute information of the gesture track, sparing the user from frequently switching options manually to complete the gesture track
- input, which solves the problem of complicated drawing operations.
- FIG. 1 is a flowchart of a drawing control method according to an embodiment of the present invention
- FIGS. 1A-1F are schematic diagrams of gesture trajectories presented in an embodiment of the present invention.
- FIG. 2 is a flowchart of a drawing control method according to another embodiment of the present invention.
- FIG. 3 is a flowchart of a drawing control method according to another embodiment of the present invention.
- FIG. 4 is a flowchart of a drawing control method according to still another embodiment of the present invention.
- FIG. 5 is a flowchart of a drawing control method according to still another embodiment of the present invention.
- FIG. 6 to FIG. 10 are schematic diagrams showing the structure of a drawing control device according to an embodiment of the present invention.
- FIG. 11 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
- the execution subject of each embodiment of the present invention is a terminal, which may be a fixed terminal such as a desktop computer or a television; a mobile terminal such as a tablet computer or a mobile phone; or even a projector.
- the terminal of each embodiment of the present invention can support contactless touch, and can of course also support contact touch control.
- in either case, the display screen or the terminal can detect the distance of the finger from the screen, the speed or acceleration of the finger's movement, the pause time of the finger, and the like.
- the gesture involved in all embodiments of the present invention may be a gesture recognized after direct contact with the terminal, for example clicking, double-clicking, or moving on the surface of the display screen; or a gesture recognized by the terminal without direct contact,
- such as a gesture recognized while moving around the display screen of the terminal, including a movement of the hand in front of the display screen.
- FIG. 1 is a flowchart of a drawing control method according to an embodiment of the present invention. As shown in FIG. 1, the method of this embodiment includes:
- Step 100: Detect a gesture track input by a user, where the gesture track is generated during non-contact movement of a user-controlled input device relative to the display screen.
- the gesture track may be generated during non-contact movement of the user-controlled input device relative to the display screen; in this case the input device is the user's finger or another body part, as long as the terminal can recognize it.
- the gesture track may also be generated during contact movement of the input device on the display screen; in this case the input device may be a stylus, a touch pen, or a similar input device for touch-screen devices.
- Step 200: Acquire first attribute information of the gesture track.
- the first attribute information of the gesture track is the feature information of the gesture track recognized by the terminal, and specifically, the feature information of the gesture track may be identified by using a display screen of the terminal, various sensors, or a microphone.
- the first attribute information of the gesture track includes any one of the following attributes or a combination thereof: the distance between each point on the gesture track and the display screen; the moving speed or acceleration of each point on the gesture track; the pause time of each point on the gesture track.
- the first attribute information listed above is only an example; other attribute information may be used, that is, any attribute information the terminal uses to identify the gesture track may be applied to the embodiments of the present invention and falls within their protection scope.
- Step 300: Identify the gesture track according to a preset rule and the first attribute information of the gesture track, to obtain second attribute information of the gesture track.
- the second attribute information of the gesture track includes partial feature information or all feature information required for the terminal to display the gesture track.
- the second attribute information of the gesture track includes any one of the following attributes or a combination thereof: a thickness of each point to be presented on the gesture track; and a transparency of each point to be presented on the gesture track.
- the second attribute information listed above is only an example; other attribute information may be used, that is, any feature information the terminal uses to display the gesture track may be applied to the embodiments of the present invention and falls within their protection scope.
- the preset rule is a pre-stored correspondence between the first attribute information of the gesture track and the second attribute information of the gesture track.
- the preset rule may be set when the terminal leaves the factory, or may be downloaded from a server to the terminal's local storage through the network.
- the correspondence between the first attribute information and the second attribute information of the gesture track may be discrete, that is, the first attribute information is divided into different intervals, each interval corresponding to one value of the second attribute information; or it may be continuous, that is, no intervals are divided, but a conversion coefficient is preset,
- and the second attribute information of the gesture track is obtained by converting the first attribute information with this coefficient, that is, the first attribute information of the gesture track is multiplied by the preset conversion coefficient to obtain the second attribute information.
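The continuous correspondence just described reduces to a single multiplication: second attribute equals first attribute times a preset conversion coefficient. A sketch follows; the coefficient value is an arbitrary assumption, since the patent leaves it unspecified:

```python
# Assumed conversion coefficient; the patent does not fix its value.
CONVERSION_COEFFICIENT = 0.5

def second_attribute(first_attribute: float,
                     coeff: float = CONVERSION_COEFFICIENT) -> float:
    """Continuous rule: second attribute = first attribute * coefficient."""
    return first_attribute * coeff

def convert_track(first_values):
    """Apply the conversion pointwise along a gesture track."""
    return [second_attribute(v) for v in first_values]
```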
- for example, when the first attribute information is the distance between each point on the gesture track and the display screen, the second attribute information may be the thickness at which each point on the gesture track is to be presented.
- the preset rule is then a correspondence between the distance of each point from the display screen and the thickness at which that point is presented: the closer a point on the gesture track is to the display screen, the thicker the point is presented, as shown in FIG. 1A.
- the screen can detect the distance between the finger and the display screen.
- the terminal takes the maximum distance at which it can perceive the finger as the upper limit, and divides the distance between the finger and the display screen into a plurality of intervals, assumed here to be three: (0, a1), (a1, a2), (a2, a3), where a1 < a2 < a3 and a1, a2, a3 are distance values; each interval corresponds to one thickness value.
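The three-interval rule above amounts to a table lookup. A sketch with assumed boundary and thickness values — a1, a2, a3 and the per-interval thicknesses are not specified in the patent:

```python
import bisect

# Assumed distance boundaries (e.g. in mm) and per-interval thicknesses;
# the closer interval maps to the thicker stroke.
A1, A2, A3 = 20.0, 50.0, 100.0
THICKNESS_PER_INTERVAL = [9.0, 5.0, 2.0]

def thickness_for_distance(d: float) -> float:
    """Look up the stroke thickness for a finger-to-screen distance d."""
    if not 0.0 < d <= A3:
        raise ValueError("distance outside the perceivable range")
    # bisect_left selects the interval index: 0 for (0, a1], 1 for (a1, a2], ...
    return THICKNESS_PER_INTERVAL[bisect.bisect_left([A1, A2], d)]
```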
- alternatively, the preset rule may be that the closer a point on the gesture track is to the display screen, the thinner the point is presented; the specific rule may be set according to actual conditions. The thickness of the track on the display screen thus changes as the distance between the finger and the display screen changes.
- the effect of different thicknesses within a single line can thus be realized by controlling the distance between the finger and the display screen, enabling free drawing while an image is being drawn.
- area 11 within the stroke is drawn while the distance between the finger and display screen 51 is in the interval (0, a1); area 12 is drawn while the distance is in the interval (a1, a2); and area 13 is drawn while the distance is in the interval (a2, a3). This drawing effect cannot be achieved with existing drawing tools.
- the above correspondence between the distance of each point from the display screen and the thickness of each point is discrete; a continuous manner may also be used:
- a conversion coefficient is preset and used to convert the distance between each point on the gesture track and the display screen into the thickness at which that point is to be presented, that is, the distance of each point from the display screen is multiplied by the preset conversion coefficient to obtain the thickness of each point.
- as another example, the first attribute information of the gesture track may be the distance between each point and the display screen while the second attribute information is the transparency at which each point on the gesture track is to be presented.
- the preset rule is then a correspondence between the distance of each point from the display screen and the transparency of the gesture track: the closer a point on the gesture track is to the display screen, the lower the transparency at which that point is presented, as shown in FIG. 1C.
- the screen can detect the distance between the finger and the display screen; the terminal takes the maximum distance at which the finger can be perceived as the upper limit, divides the distance d between the finger and the display screen into a plurality of intervals, and each interval corresponds to one transparency value.
- the preset rule may also be that the closer a point on the gesture track is to the display screen, the higher the transparency at which that point is presented; this may be set according to actual conditions. Therefore, the transparency of the trajectory on the display screen changes as the distance between the finger and the display screen changes. When the distance between the finger and the display screen falls in a different interval, the trajectory drawn on the display screen is different, so that the user can freely switch between multiple transparency levels by controlling the distance between the finger and the display screen, eliminating the need to manually switch the transparency option before drawing.
- the effect of different transparency within a single line can also be achieved by controlling the distance between the finger and the display screen, thereby realizing free drawing in the process of drawing an image.
- the area 14 in the pen line corresponds to a distance d between the finger and the display screen 51 in the interval a1 < d < a2 when drawing; the area 15 corresponds to a distance d in the interval d > a1; and the area 16 corresponds to a distance d in the interval a3 < d < a4. This drawing effect cannot be achieved in existing drawing tools.
- the correspondence between the distance from each point on the gesture track to the display screen and the transparency of each point may likewise be discrete, or continuous. In the continuous case, no intervals are divided; instead, a conversion factor is preset and used to convert the distance between each point on the gesture track and the display screen into the transparency at which that point is to be presented; that is, the distance between each point on the gesture track and the display screen is multiplied by the preset conversion factor to obtain the transparency of each point on the gesture track.
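- As an illustration, both transparency rules described above can be sketched together; the interval table, bounds, and conversion factor are assumed values, not part of the embodiment:

```python
# Sketch of the two distance-to-transparency rules (assumed constants).
# Discrete: distance intervals map to fixed transparency levels.
# Continuous: transparency = distance * conversion_factor, clamped to [0, 1].

# (low_bound, high_bound, transparency) triples; closer => less transparent
INTERVALS = [(0.0, 10.0, 0.0), (10.0, 20.0, 0.5), (20.0, 30.0, 0.8)]

def transparency_discrete(d: float) -> float:
    for lo, hi, alpha in INTERVALS:
        if lo <= d < hi:
            return alpha
    return 1.0  # fully transparent (invisible) outside the sensed range

CONVERSION_FACTOR = 1.0 / 30.0  # assumed: fully transparent at 30 mm

def transparency_continuous(d: float) -> float:
    return min(1.0, max(0.0, d * CONVERSION_FACTOR))
```

The discrete variant reproduces the interval behavior of FIG. 1C; the continuous variant varies the transparency smoothly along the stroke instead of in steps.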
- the preset rule may also relate the moving speed or acceleration of each point on the gesture track to the thickness of each point: the smaller the moving speed or acceleration of the gesture track, the thicker the points on the gesture track are presented, see FIG. 1E.
- the screen can detect the moving speed v of the finger near the display screen and divide that speed into a plurality of intervals, for example three intervals v < b1, b1 ≤ v ≤ b2, and v > b2 (b1 < b2, where b1 and b2 are speed values), with each interval corresponding to one thickness value.
- the preset rule may also be that the smaller the moving speed or acceleration of the gesture track, the thinner the points on the gesture track are presented; this may be set according to actual conditions. Therefore, the thickness of the trajectory drawn on the display screen changes as the moving speed v of the finger near the display screen changes.
- when the moving speed falls in a different interval, the trajectory drawn on the display screen is different, so that the user can freely switch between various thicknesses by controlling the moving speed v of the finger near the display screen, without manually switching the thickness option before drawing.
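- A minimal sketch of the speed-based interval rule, with assumed bounds b1 < b2 and illustrative thickness values:

```python
# Sketch of the speed-to-thickness rule: the slower the finger moves
# near the screen, the thicker the stroke. Bounds are assumptions.

B1, B2 = 50.0, 150.0  # speed interval bounds in mm/s (illustrative)

def thickness_for_speed(v: float) -> int:
    """Map the finger's moving speed v to a stroke thickness in pixels."""
    if v < B1:
        return 8   # slow movement: thick stroke
    elif v <= B2:
        return 4   # medium speed: medium stroke
    else:
        return 2   # fast movement: thin stroke
```

An acceleration-based rule would have the same shape, with acceleration substituted for v.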
- the preset rule is a correspondence between the pause time of each point on the gesture track and the thickness of each point: the longer the pause time of a point on the gesture track, the thicker that point is presented, see FIG. 1.
- the screen can detect the pause time t of the finger near the display screen and divide it into a plurality of intervals, for example three intervals t < c1, c1 ≤ t ≤ c2, and t > c2 (c1 < c2, where c1 and c2 are pause times of the finger near the display screen), with each interval corresponding to one thickness value.
- the preset rule may also be that the longer the pause time of a point on the gesture track, the thinner that point is presented; this may be set according to actual conditions.
- the thickness of the trajectory drawn on the display screen changes as the pause time t of the finger near the display screen changes. When the pause time falls in a different interval, the trajectory drawn on the display screen is different, so that the user can freely switch between various thicknesses by controlling the pause time t of the finger near the display screen, without manually switching the thickness option before drawing.
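- The pause-time rule can be sketched point by point along a trajectory; the bounds c1 < c2, thickness values, and the (x, y, pause_time) sample format are all assumptions for illustration:

```python
# Sketch of the pause-time rule applied along a trajectory: each sampled
# point carries the pause time t measured near the screen, and longer
# pauses yield thicker points. Bounds are illustrative assumptions.

C1, C2 = 0.2, 0.8  # pause-time interval bounds in seconds (assumed)

def thickness_for_pause(t: float) -> int:
    if t < C1:
        return 2   # brief pass: thin point
    elif t <= C2:
        return 4   # moderate pause: medium point
    return 8       # long pause: thickest point

def render_points(points):
    """points: [(x, y, pause_time)] -> [(x, y, thickness)]"""
    return [(x, y, thickness_for_pause(t)) for x, y, t in points]
```

So a stroke that lingers over one spot thickens locally, while the quickly traversed segments of the same line stay thin.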
- the above uses only simple examples to illustrate the correspondence between the first attribute information and the second attribute information of the gesture track; the correspondence may also be designed as a more complex one according to actual needs. For example, the first attribute information of the gesture track may be a combination of the distance between each point on the gesture track and the display screen and the moving speed of each point, while the second attribute information is the thickness and the transparency of each point to be presented on the gesture track. In this case, the preset rule is a correspondence among the distance between each point on the gesture track and the display screen, the moving speed of each point on the gesture track, the thickness of each point to be presented, and the transparency of each point to be presented.
- for example, the preset rule may be that the closer a point on the gesture track is to the display screen, the thicker that point is presented, and the smaller the moving speed of a point on the gesture track, the lower the transparency at which that point is presented, as shown in FIG. 1F. It is even possible to use a single piece of first attribute information to correspond to a plurality of pieces of second attribute information. In short, the correspondence between the first attribute information and the second attribute information of the gesture track may be set according to actual requirements, which is not specifically limited in the embodiment of the present invention.
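- A combined rule of this kind can be sketched as one function from a pair of first-attribute values to a pair of second-attribute values; all thresholds and output values below are assumed for illustration:

```python
# Sketch of a combined preset rule: the pair (distance d, speed v)
# yields the pair (thickness, transparency). Closer => thicker,
# slower => more opaque. Thresholds are illustrative assumptions.

def render_attributes(d: float, v: float) -> tuple:
    thickness = 8 if d < 10.0 else (4 if d < 20.0 else 2)
    transparency = 0.2 if v < 50.0 else (0.5 if v <= 150.0 else 0.8)
    return thickness, transparency
```

A slow stroke close to the screen would thus come out thick and nearly opaque, while a fast, distant one comes out thin and faint, matching the combined effect of FIG. 1F.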
- Step 400 Present the gesture track according to the second attribute information of the gesture track.
- multiple drawing stroke types are supported in the drawing tool, and these can be selected at will; the lines drawn then have the characteristics of the drawing stroke type selected by the user. Drawing stroke types include, for example: pens, brushes, crayons, spray guns, and the like.
- feature information such as the drawing stroke type and color does not need to be changed frequently, so fixed values for the stroke type, color, and similar feature information can be used while the gesture track is input, while still allowing the user to modify these fixed values in advance. This saves the trouble of obtaining feature information such as the drawing stroke type and color each time an input gesture track is presented, and saves terminal resources.
- the feature information required to display the gesture track includes many types, such as the drawing stroke type, color, and so on. The values of the feature information set as the second attribute information are used preferentially; the values of feature information not set as the second attribute information may be obtained from a space in the drawing tool that stores feature information. The gesture track is finally presented according to the second attribute information of the gesture track together with the other feature information not set as the second attribute information.
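- The priority order just described can be sketched as a simple merge of the second attribute information over stored defaults; all field names and default values here are assumptions, not names used by the embodiment:

```python
# Sketch of assembling the final rendering state: values carried by the
# second attribute information take priority, and any remaining feature
# information falls back to values stored in the drawing tool.

STORED_DEFAULTS = {"stroke_type": "pen", "color": "black",
                   "thickness": 2, "transparency": 0.0}

def resolve_features(second_attribute_info: dict) -> dict:
    features = dict(STORED_DEFAULTS)        # start from stored feature info
    features.update(second_attribute_info)  # second attributes win conflicts
    return features
```

For example, a track carrying thickness and transparency in its second attribute information would keep the stored pen type and black color while overriding the default thickness and transparency.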
- the selection of the points on the gesture track may be determined according to the display requirements of the display screen of the terminal, and may also be selected according to the needs of the implementation of the solution, which is not specifically limited in the embodiment of the present invention.
- when the first attribute information involved in the embodiment of the present invention is the same as attribute information in an existing drawing tool, and the second attribute information affected by the two does not conflict, the second attribute information of the embodiment of the present invention may be combined with the attribute information in the existing drawing tool; if there is a conflict, either the second attribute information of the embodiment of the present invention or the attribute information in the existing drawing tool is selected for presentation.
- for example, when the drawing stroke type is a spray gun, the density of a drawn point varies with the length of the pause time of the spray gun, while in the embodiment of the present invention the thickness of a point varies with the pause time; the display effects of the two can be combined, that is, the density and the thickness of a point change simultaneously with the length of the pause time of the spray gun.
- the drawing control method provided by this embodiment detects a gesture track input by a user, where the gesture track is generated by the user controlling a non-contact movement between the input device and the display screen; acquires the first attribute information of the gesture track; identifies the gesture track according to the preset rule and the first attribute information to obtain the second attribute information of the gesture track; and finally presents the gesture track according to the second attribute information. In this way, part of the feature information required to present the track is carried in the first attribute information of the gesture track, preventing the user from frequently and manually switching various options to complete the input of the gesture track, thereby solving the problem of complicated drawing operations.
- FIG. 2 is a flowchart of a drawing control method according to another embodiment of the present invention. This embodiment is implemented based on the embodiment shown in FIG. 1. As shown in FIG. 2, before step 400, the method further includes:
- Step 309 Acquire a drawing stroke type.
- the presenting the gesture track according to the second attribute information of the gesture track specifically includes: presenting the gesture track according to the second attribute information of the gesture track and the drawing stroke type.
- in the embodiment of the present invention, a drawing stroke type may also be acquired before each gesture track, in order to update the drawing stroke type of the gesture track in time. The drawing stroke type is first acquired, and the gesture trajectory is then presented on the display screen using that drawing stroke type together with the second attribute information of the gesture trajectory, instead of always using a fixed drawing stroke type, so that throughout the drawing process the gesture track can be displayed in time with the drawing stroke type the user has switched to.
- specifically, the terminal may receive a gesture input by the user and match the received gesture against pre-stored gestures; if a pre-stored gesture matches the received gesture, the terminal further acquires the gesture command corresponding to that gesture. An existing graphics matching algorithm, such as a shape matching algorithm, can be used to calculate the similarity between a pre-stored gesture and the received gesture: it judges whether the pre-stored gesture is the same as or similar to the received gesture, measures the degree of similarity between the graphs, and returns a similarity value.
- the graph matching method should be invariant to geometric transformations such as translation, rotation, and scale change; the degree of similarity should be measurable and easy to calculate; and the judgment based on the matching algorithm should be consistent with human intuition.
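- As a toy stand-in for the shape matching algorithm mentioned above (not the algorithm itself), the following sketch compares two gestures after centering and scale-normalizing them, which makes the score invariant to translation and uniform scaling; rotation invariance, which a full matcher would also provide, is omitted here for brevity, and both gestures are assumed to be resampled to the same number of points:

```python
import math

def normalize(points):
    """Center the trajectory and scale it to unit size."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    shifted = [(x - cx, y - cy) for x, y in points]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def similarity(a, b):
    """Return a similarity score in (0, 1]; 1.0 means identical shapes."""
    pa, pb = normalize(a), normalize(b)
    mean_dist = sum(math.hypot(x1 - x2, y1 - y2)
                    for (x1, y1), (x2, y2) in zip(pa, pb)) / len(pa)
    return 1.0 / (1.0 + mean_dist)
```

A translated and uniformly scaled copy of a gesture scores 1.0 under this measure, matching the invariance requirement stated above.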
- the pre-stored gesture command may be set by the terminal at the factory, or may be preset by the user, or may be downloaded from the network and saved to the terminal. The operation of several gestures involved in the embodiments of the present invention is described in detail below.
- FIG. 3 is a flowchart of a drawing control method according to another embodiment of the present invention. This embodiment is implemented based on the embodiment shown in FIG. 2. As shown in FIG. 3, step 309 specifically includes:
- Step 3091 Obtain a first gesture command input by a user.
- Step 3092 Determine a drawing stroke type according to the first gesture command.
- determining the drawing stroke type according to the first gesture command means determining the type of drawing stroke corresponding to the first gesture.
- a mapping relationship between the first gesture command and the drawing stroke type is pre-stored in the terminal, and each of the first gesture commands corresponds to one drawing stroke type.
- specifically, the pre-stored mapping relationship between the first gesture command and the drawing stroke type is searched, and the drawing stroke type corresponding to the first gesture command is obtained.
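- The pre-stored mapping can be sketched as a simple lookup table; the gesture command names and stroke types below are invented for illustration and are not defined by the embodiment:

```python
# Sketch of the pre-stored mapping between first gesture commands and
# drawing stroke types; each command corresponds to one stroke type.
# All command names and type assignments are assumptions.

STROKE_TYPE_BY_COMMAND = {
    "circle_gesture": "pen",
    "zigzag_gesture": "crayon",
    "wave_gesture": "brush",
    "spiral_gesture": "spray_gun",
}

def stroke_type_for(command: str) -> str:
    # Fall back to the current default when the command is not mapped.
    return STROKE_TYPE_BY_COMMAND.get(command, "pen")
```

The second gesture command to drawing stroke color mapping described later has the same shape, with colors in place of stroke types.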
- the user can thus flexibly control the drawing stroke type through gesture commands, making the user's drawing process more coherent, smooth, and closer to the feeling of drawing in reality, which improves the usability of the terminal.
- At least one type of drawing stroke is presented according to the first gesture command; and then the drawing stroke type is determined by receiving the user's selection of at least one type of drawing stroke.
- FIG. 4 is a flowchart of a drawing control method according to another embodiment of the present invention. This embodiment is implemented based on the embodiments shown in FIGS. 1, 2, and 3; FIG. 4 shows a flowchart based on the embodiment of FIG. 3, and the method further includes:
- Step 500 Acquire a second gesture command input by the user.
- Step 600 Determine a drawing stroke color according to the second gesture command.
- the execution order of steps 3091-3092-500-600 in FIG. 4 is only an example; in a specific implementation, the order may also be steps 500-600-3091-3092.
- the presenting the gesture track according to the second attribute information of the gesture track specifically includes: presenting the gesture track according to the second attribute information of the gesture track and the drawing stroke color; or And presenting the gesture track according to the second attribute information of the gesture track, the drawing stroke type, and the drawing stroke color.
- a mapping relationship between the second gesture command and the common drawing stroke color is pre-stored in the terminal, and each second gesture command corresponds to one drawing stroke color.
- specifically, the pre-stored mapping relationship between the second gesture command and the drawing stroke color is searched, and the drawing stroke color corresponding to the second gesture command is obtained.
- the second gesture command input by the user is acquired, and the drawing stroke color is then determined according to the second gesture command, enabling the user to flexibly control the drawing stroke color through gesture commands; this makes the user's drawing process more coherent, smooth, and closer to the feeling of drawing in reality, improving the usability of the terminal.
- alternatively, a color palette is presented according to the second gesture command, and the drawing stroke color is then determined by receiving the user's selection of a color in the color palette.
- the second gesture command is a command for calling up all or part of the available drawing stroke colors and then prompting the user to select one, so that the user can complete the selection of the drawing stroke color with a single gesture command and one selection operation, thereby completing the drawing process more simply and improving the ease of use of the terminal.
- FIG. 5 is a flowchart of a drawing control method according to another embodiment of the present invention. This embodiment is implemented based on the embodiments shown in FIGS. 1, 2, 3, and 4.
- FIG. 5 is a flowchart of an implementation according to the embodiment shown in FIG. 4, where the method further includes:
- Step 700 Acquire a third gesture command input by a user.
- Step 800 Perform a total deletion operation or a partial deletion operation on the gesture track presented on the display screen according to the third gesture command.
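- Step 800 can be sketched as dispatching on the third gesture command; the command names, the point-list track representation, and the rectangular deletion region are all assumptions made for this example:

```python
# Sketch of dispatching the third gesture command to a total or partial
# deletion of the presented gesture track. Command names are assumptions.

def apply_deletion(track: list, command: str, region=None) -> list:
    """track: [(x, y)] points; returns the track after deletion."""
    if command == "delete_all":
        return []                               # total deletion
    if command == "delete_partial" and region is not None:
        x0, y0, x1, y1 = region                 # erase points in a rectangle
        return [(x, y) for x, y in track
                if not (x0 <= x <= x1 and y0 <= y <= y1)]
    return track                                # unrecognized: leave intact
```

A partial deletion here simply drops the points inside the indicated region, leaving the rest of the presented track unchanged.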
- a third gesture command input by the user is acquired, and a total deletion operation or a partial deletion operation is then performed on the gesture track according to the third gesture command, enabling the user to flexibly modify the gesture track through gesture commands and improving the usability of the terminal.
- the methods implemented by the embodiments of the present invention are not limited to drawing and may also be applied to other information input in the terminal, for example, inputting a short message. That is, when the user inputs short message information, the terminal may likewise acquire the second attribute information according to the acquired first attribute information of the gesture track and finally present the user's gesture track according to the second attribute information, so that the gesture track input by the user is no longer uniform: the thickness and transparency of the presented gesture track can be adjusted freely through the first attribute information of the gesture track. Moreover, because of this function, the user can easily increase the thickness of the input information, so that a person with poor eyesight can also use the terminal to input content.
- FIG. 6 is a schematic structural diagram of a drawing control apparatus according to an embodiment of the present invention.
- the drawing control device shown in FIG. 6 includes a detection module 61, a first attribute information acquisition module 62, a second attribute information acquisition module 63, and a presentation module 64.
- the detecting module 61 is configured to detect a gesture trajectory input by the user, where the gesture trajectory is generated by at least one finger of the user during a moving process; the first attribute information acquiring module 62 is configured to acquire the first attribute information of the gesture trajectory according to the gesture trajectory detected by the detecting module 61; the second attribute information acquiring module 63 is configured to identify the gesture trajectory according to the preset rule and the first attribute information acquired by the first attribute information acquiring module 62, so as to obtain the second attribute information of the gesture trajectory; and the presentation module 64 is configured to present the gesture trajectory according to the second attribute information acquired by the second attribute information acquiring module 63.
- the first attribute information of the gesture track is feature information of the gesture track recognized by the drawing control device.
- the first attribute information of the gesture track includes any one of the following attributes or a combination thereof: a distance between each point on the gesture track and the display screen; a moving speed or acceleration of the gesture track; and a pause time of each point on the gesture track.
- the first attribute information listed above is only an example; other attribute information, that is, any attribute information used by the terminal to identify the gesture track, may also be applied to the embodiment of the present invention and falls within its protection scope.
- the second attribute information of the gesture track includes partial feature information or all feature information required for the drawing control device to display the gesture track.
- the second attribute information of the gesture track includes any one of the following attributes or a combination thereof: a thickness of each point to be presented on the gesture track; and a transparency of each point to be presented on the gesture track.
- the second attribute information listed above is only an example; other attribute information, that is, any feature information used by the terminal to display the gesture track, may also be applied to the embodiment of the present invention and falls within its protection scope.
- the preset rule is a pre-stored correspondence between the first attribute information of the gesture track and the second attribute information of the gesture track.
- the preset rules refer to the description in the method embodiment, and details are not described herein again.
- the correspondence between the first attribute information and the second attribute information of the gesture track may be discrete, that is, the first attribute information of the gesture track is divided into different intervals, each interval corresponding to one value of the second attribute information; or it may be continuous, that is, no intervals are divided, but a conversion factor is preset and used to convert the first attribute information of the gesture track into the second attribute information: the first attribute information of the gesture track is multiplied by the preset conversion factor to obtain the second attribute information of the gesture track.
- the feature information required to display the gesture track includes many types, such as the drawing stroke type, color, and so on. The values of the feature information set as the second attribute information are used preferentially; the values of feature information not set as the second attribute information may be obtained from a space in the drawing tool that stores feature information. The gesture track is finally presented according to the second attribute information of the gesture track together with the other feature information not set as the second attribute information.
- the drawing control device detects the gesture track input by the user through the detecting module 61, where the gesture track is generated by at least one finger of the user during a moving process; the first attribute information acquiring module 62 then acquires the first attribute information of the gesture track according to the gesture track detected by the detecting module 61; the second attribute information acquiring module 63 identifies the gesture track according to the preset rule and the first attribute information acquired by the first attribute information acquiring module 62, and acquires the second attribute information of the gesture track; and the presentation module 64 presents the gesture track according to the second attribute information acquired by the second attribute information acquiring module 63. In this way, part of the feature information required to present the trajectory is carried in the first attribute information of the gesture trajectory, preventing the user from manually switching various options to complete the input of the gesture trajectory, thereby solving the problem of complicated drawing operations.
- the drawing control device further includes: a drawing stroke type acquiring module 71, configured to acquire a drawing stroke type; in this case, the presentation module 64 is specifically configured to present the gesture track according to the second attribute information of the gesture track acquired by the second attribute information acquiring module 63 and the drawing stroke type acquired by the drawing stroke type acquiring module 71.
- in the embodiment of the present invention, the drawing stroke type acquisition module 71 may also acquire the drawing stroke type before each gesture track, so as to promptly update the drawing stroke type of the gesture track. The drawing stroke type acquisition module 71 first acquires the drawing stroke type, and the gesture track is then presented on the display screen using that drawing stroke type together with the second attribute information of the gesture track, instead of always using a fixed drawing stroke type, so that the drawing control device can promptly display the gesture trajectory with the drawing stroke type the user has switched to.
- the drawing control device further includes a gesture command obtaining module 81, configured to acquire a first gesture command input by the user.
- the drawing stroke type acquisition module 71 is specifically configured to determine a drawing stroke type according to the first gesture command acquired by the gesture command acquisition module 81.
- the drawing control device further includes a drawing stroke color acquiring module 91, configured to determine a drawing stroke color according to the second gesture command acquired by the gesture command acquiring module 81.
- in this case, the presentation module 64 is configured to present the gesture track according to the second attribute information of the gesture track, the drawing stroke type, and the drawing stroke color.
- the presentation module 64 is further configured to present the gesture track according to the second attribute information of the gesture track and the drawing stroke color when the drawing stroke type is not frequently changed.
- the second gesture command input by the user is acquired, and the drawing stroke color acquiring module 91 then determines the drawing stroke color according to the second gesture command, enabling the user to flexibly control the drawing stroke color through gesture commands; the user's drawing process thus becomes more coherent, smooth, and closer to the feeling of drawing in reality, improving the usability of the terminal.
- a device module diagram is implemented based on the device of FIG. 8.
- the gesture command obtaining module 81 is further configured to acquire a third gesture command input by a user;
- the deletion module 101 is further configured to perform a total deletion operation or a partial deletion operation on the presented gesture track according to the third gesture command acquired by the gesture command acquisition module 81.
- the deleting module 101 may also be added based on the device of FIG. 8, and details are not described herein again.
- a third gesture command input by the user is acquired, and a total deletion operation or a partial deletion operation is then performed on the gesture track according to the third gesture command, enabling the user to flexibly modify the gesture track through gesture commands and improving the usability of the terminal.
- FIG. 11 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
- the device 110 includes: a display screen 111, an input device 112, a memory 113, a central processing unit 114, and a bus 115.
- the display screen 111 may be a cathode ray tube (CRT) display screen, a liquid crystal display (LCD) screen, a touch display screen, or the like, and receives instructions through the bus 115 to present a graphical user interface on the screen.
- Input device 112 may include any suitable means, such as a keyboard, mouse, track recognizer, voice recognition interface, etc., for receiving user input, and generating control inputs for transmission to a central processing unit or other component via bus 115.
- when the display screen of the device 110 is a touch screen, the display screen is also an input device.
- the memory 113 may be a RAM, a ROM, or any fixed or removable storage medium, and is configured to store a program that can execute the embodiment of the present invention or an application database of the embodiment of the present invention, and to receive, through the bus 115, information input by other components or information invoked by other components.
- the central processing unit 114 is operative to execute the program of the embodiment of the present invention stored in the memory 113 and to communicate bidirectionally with other devices via the bus.
- Memory 113 and central processor 114 may also be integrated into a physical module to which embodiments of the present invention are applied, on which the programs implementing the embodiments of the present invention are stored and executed.
- each unit of the device 110 performs the following: the input device 112 is configured to detect a gesture track input by a user, where the gesture track is generated by at least one finger of the user during a moving process; the central processing unit 114 is configured to acquire the first attribute information of the gesture track detected by the input device 112, and is further configured to identify the gesture track according to the preset rule and the first attribute information of the gesture track, so as to obtain the second attribute information of the gesture track; the display screen 111 is configured to present the gesture track according to the second attribute information acquired by the central processing unit 114; and the memory 113 is configured to store the preset rule.
- the first attribute information of the gesture track is feature information of the gesture track recognized by the drawing control device.
- the first attribute information of the gesture track includes any one of the following attributes or a combination thereof: a distance between each point on the gesture track and the display screen; a moving speed or acceleration of the gesture track; The pause time of each point.
- the first attribute information listed above is only an example; other attribute information, that is, any attribute information used by the terminal to identify the gesture track, may also be applied to the embodiment of the present invention and falls within its protection scope.
- the second attribute information of the gesture track includes partial feature information or all feature information required for the drawing control device to display the gesture track.
- the second attribute information of the gesture track includes any one of the following attributes or a combination thereof: a thickness of each point to be presented on the gesture track; and a transparency of each point to be presented on the gesture track.
- the second attribute information listed above is only an example; other attribute information, that is, any feature information used by the terminal to display the gesture track, may also be applied to the embodiment of the present invention and falls within its protection scope.
- the preset rule is a pre-stored correspondence between the first attribute information of the gesture track and the second attribute information of the gesture track.
- for the preset rules, refer to the description in the method embodiment; details are not described herein again.
- the correspondence between the first attribute information of the gesture track and the second attribute information of the gesture track may be discrete, that is, the first attribute information of the gesture track is divided into different intervals, and each interval corresponds to a value of the second attribute information of the gesture track; or it may be continuous, that is, no intervals are divided, but a conversion coefficient is preset, and the second attribute information of the gesture track is obtained by multiplying the first attribute information of the gesture track by the preset conversion coefficient.
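Both forms of the preset rule can be sketched as follows; the interval boundaries and the conversion coefficient are illustrative assumptions, not values taken from the embodiment:

```python
def thickness_discrete(distance: float) -> float:
    """Discrete rule: the first attribute (here, distance to the screen) is
    divided into intervals, and each interval maps to a fixed value of the
    second attribute (here, stroke thickness)."""
    intervals = [    # (upper bound of distance interval, thickness) - assumed values
        (1.0, 8.0),  # very close to the screen -> thick stroke
        (3.0, 4.0),
        (5.0, 2.0),
    ]
    for upper, thickness in intervals:
        if distance <= upper:
            return thickness
    return 1.0       # beyond all intervals -> thinnest stroke

def thickness_continuous(distance: float, coefficient: float = 2.0) -> float:
    """Continuous rule: no intervals; the second attribute is obtained by
    multiplying the first attribute by a preset conversion coefficient."""
    return distance * coefficient
```

The discrete form trades smoothness for predictable, stepped output; the continuous form varies the presented stroke with every small change in the first attribute.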
- the feature information required for the gesture track display includes many types, such as: drawing stroke type, color, etc.
- when the gesture track is presented, the values of the feature information set as the second attribute information are used preferentially; the values of the feature information not set as the second attribute information may be obtained from a space in the drawing tool that stores the feature information. The gesture track is finally presented according to the second attribute information of the gesture track and the other feature information not set as the second attribute information.
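The precedence just described can be sketched as a simple merge: values carried in the second attribute information win, and any remaining feature information falls back to the drawing tool's stored defaults (the default values below are assumptions for illustration):

```python
# Defaults stored by the drawing tool for feature information that is not
# carried in the second attribute information (values are illustrative).
TOOL_DEFAULTS = {
    "stroke_type": "pencil",
    "color": "black",
    "thickness": 1.0,
    "transparency": 0.0,
}

def presentation_attributes(second_attribute_info: dict) -> dict:
    """Second attribute information is used preferentially; every feature not
    set there is taken from the tool's stored feature information."""
    merged = dict(TOOL_DEFAULTS)          # start from stored defaults
    merged.update(second_attribute_info)  # second attribute info overrides
    return merged
```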
- in the terminal provided by this embodiment, the input device 112 detects a gesture track input by a user, where the gesture track is generated by at least one finger of the user during movement; the central processor 113 parses the first attribute information of the gesture track acquired by the input device 112, and recognizes the gesture track according to a preset rule and the first attribute information of the gesture track to obtain the second attribute information of the gesture track; the display screen 111 presents the gesture track according to the second attribute information acquired by the central processor. In this way, a part of the feature information required to present the track is carried in the first attribute information of the gesture track, which prevents the user from frequently switching various options manually to complete the input of the gesture track, thereby solving the problem of complicated drawing operations.
- the central processing unit 113 is further configured to acquire a drawing stroke type; the display screen 111 is specifically configured to present the gesture track according to the second attribute information of the gesture track and the drawing stroke type.
- in the embodiment of the present invention, the central processing unit 113 may also be set to acquire the drawing stroke type before each gesture track, so as to update the drawing stroke type of the gesture track in time.
- the input device 112 is further configured to acquire a first gesture command input by the user.
- the central processor is specifically configured to determine a drawing stroke type according to the first gesture command acquired by the input device 112.
- the gesture track is displayed on the display screen by first acquiring the drawing stroke type and then using the drawing stroke type together with the second attribute information of the gesture track, instead of always using a fixed drawing stroke type; therefore, the terminal can display the gesture track with the drawing stroke type changed by the user in time.
- the central processing unit 113 is specifically configured to determine, according to the first gesture command acquired by the input device 112, the drawing stroke type corresponding to the first gesture. By acquiring a first gesture command input by the user and then determining the drawing stroke type according to the first gesture command, the user can flexibly control the drawing stroke type through gesture commands, which makes the user's drawing process more coherent and smooth and closer to the feeling of drawing in reality, thereby improving the usability of the terminal.
- the input device 112 is further configured to acquire a second gesture command input by the user; the central processor 113 is further configured to determine a drawing stroke color according to the second gesture command acquired by the input device 112.
- the display screen 111 is specifically configured to present the gesture track according to the second attribute information of the gesture track and the drawing stroke color; or to present the gesture track according to the second attribute information of the gesture track, the drawing stroke type, and the drawing stroke color.
- the input device 112 is further configured to acquire a third gesture command input by the user; the central processor 113 is further configured to, according to the third gesture command acquired by the input device 112, instruct the display screen 111 to perform a full deletion operation or a partial deletion operation on the presented gesture track.
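The three gesture commands above can be sketched as a small dispatcher; the method names, default values, and deletion behavior below are hypothetical stand-ins for whatever gestures the terminal actually recognizes:

```python
class DrawingController:
    """Minimal sketch: routes recognized gesture commands to drawing state."""

    def __init__(self):
        self.stroke_type = "pencil"   # assumed initial stroke type
        self.stroke_color = "black"   # assumed initial stroke color
        self.strokes = []             # gesture tracks currently presented

    def on_first_gesture(self, stroke_type: str) -> None:
        # first gesture command: determine the drawing stroke type
        self.stroke_type = stroke_type

    def on_second_gesture(self, color: str) -> None:
        # second gesture command: determine the drawing stroke color
        self.stroke_color = color

    def on_third_gesture(self, delete_all: bool = True) -> None:
        # third gesture command: full or partial deletion of presented tracks
        if delete_all:
            self.strokes.clear()
        elif self.strokes:
            self.strokes.pop()        # partial deletion: drop the latest track
```

Mapping each command to a dedicated handler keeps stroke type, color, and deletion independent, so the user can change any one of them mid-drawing without touching the others.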
- the hardware of the terminal in this embodiment can be used to execute the processes of the drawing control method shown in FIG. 1 to FIG. 5; the specific working principle is not described here, and for details, refer to the description of the method embodiments.
- the terminal involved in all embodiments of the present invention may be a terminal capable of recognizing gestures that directly contact the display screen of the terminal, for example, clicking, double-clicking, or moving on the surface of the display screen; or a terminal capable of recognizing gestures that do not directly contact the display screen, for example, gestures recognized while moving near the display screen, including movement of the hand in front of the display screen.
- the terminal may be a mobile phone, a tablet, an iPad, a projector, a television, a desktop computer, or the like.
- the terminal provided by this embodiment detects a gesture track input by the user, where the gesture track is generated during contactless movement between the user-controlled input device and the display screen; then acquires the first attribute information of the gesture track; then recognizes the gesture track according to the preset rule and the first attribute information of the gesture track to obtain the second attribute information of the gesture track; and finally presents the gesture track according to the second attribute information of the gesture track. In this way, a part of the feature information required to present the track is carried in the first attribute information of the gesture track, which prevents the user from frequently switching various options manually to complete the input of the gesture track, thereby solving the problem of complicated drawing operations.
- the aforementioned program can be stored in a computer readable storage medium.
- when the program is executed, the steps of the foregoing method embodiments are performed; the foregoing storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014543756A JP2015504565A (ja) | 2012-10-31 | 2012-10-31 | 描画制御方法、装置、およびモバイル端末 |
KR1020137032315A KR101522919B1 (ko) | 2012-10-31 | 2012-10-31 | 드로잉 제어 방법, 장치 및 이동 단말기 |
EP12876594.8A EP2752740A4 (en) | 2012-10-31 | 2012-10-31 | DRAWING CONTROL METHOD, DEVICE THEREFOR AND MOBILE DEVICE |
CN2012800029776A CN103403650A (zh) | 2012-10-31 | 2012-10-31 | 绘图控制方法、装置及移动终端 |
PCT/CN2012/083876 WO2014067110A1 (zh) | 2012-10-31 | 2012-10-31 | 绘图控制方法、装置及移动终端 |
US14/108,082 US20140123079A1 (en) | 2012-10-31 | 2013-12-16 | Drawing control method, apparatus, and mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2012/083876 WO2014067110A1 (zh) | 2012-10-31 | 2012-10-31 | 绘图控制方法、装置及移动终端 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/108,082 Continuation US20140123079A1 (en) | 2012-10-31 | 2013-12-16 | Drawing control method, apparatus, and mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014067110A1 true WO2014067110A1 (zh) | 2014-05-08 |
Family
ID=49565840
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2012/083876 WO2014067110A1 (zh) | 2012-10-31 | 2012-10-31 | 绘图控制方法、装置及移动终端 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20140123079A1 (zh) |
EP (1) | EP2752740A4 (zh) |
JP (1) | JP2015504565A (zh) |
KR (1) | KR101522919B1 (zh) |
CN (1) | CN103403650A (zh) |
WO (1) | WO2014067110A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016110249A (ja) * | 2014-12-03 | 2016-06-20 | 日本ユニシス株式会社 | 空間手書き入力システム、空間手書き入力方法およびコンピュータプログラム |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2013243947A1 (en) * | 2012-04-02 | 2014-10-30 | Moderna Therapeutics, Inc. | Modified polynucleotides for the production of proteins |
CN103702156A (zh) * | 2013-12-09 | 2014-04-02 | 乐视致新电子科技(天津)有限公司 | 一种自定义手势轨迹的方法及装置 |
WO2015174111A1 (ja) | 2014-05-14 | 2015-11-19 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
CN104134028B (zh) * | 2014-07-29 | 2017-03-29 | 广州视源电子科技股份有限公司 | 基于手势特征的身份认证方法及*** |
CN104951083A (zh) * | 2015-07-21 | 2015-09-30 | 石狮市智诚通讯器材贸易有限公司 | 一种远距离手势输入法及输入*** |
KR20170019956A (ko) | 2015-08-13 | 2017-02-22 | 삼성전자주식회사 | 전자 장치 및 전자 장치의 입력제어 방법 |
US10216405B2 (en) * | 2015-10-24 | 2019-02-26 | Microsoft Technology Licensing, Llc | Presenting control interface based on multi-input command |
EP3399388A4 (en) * | 2015-12-30 | 2019-09-04 | Shenzhen Royole Technologies Co. Ltd. | HEAD-MOUNTED DISPLAY DEVICE, HEAD-MOUNTED DISPLAY SYSTEM AND INPUT METHOD |
CN107728916B (zh) * | 2017-09-20 | 2020-07-28 | 科大讯飞股份有限公司 | 隔空手写轨迹的显示方法及装置 |
CN108459812B (zh) * | 2018-01-22 | 2021-03-02 | 郑州升达经贸管理学院 | 一种美术轨迹显示追捕***及方法 |
CN109407842A (zh) * | 2018-10-22 | 2019-03-01 | Oppo广东移动通信有限公司 | 界面操作方法、装置、电子设备和计算机可读存储介质 |
JP7238371B2 (ja) * | 2018-12-06 | 2023-03-14 | セイコーエプソン株式会社 | 表示装置、表示システム及び表示方法 |
CN111382598B (zh) * | 2018-12-27 | 2024-05-24 | 北京搜狗科技发展有限公司 | 一种识别方法、装置和电子设备 |
US11392659B2 (en) * | 2019-02-28 | 2022-07-19 | Adobe Inc. | Utilizing machine learning models to generate experience driven search results based on digital canvas gesture inputs |
CN111862349A (zh) * | 2019-04-26 | 2020-10-30 | 北京字节跳动网络技术有限公司 | 虚拟画笔实现方法、装置和计算机可读存储介质 |
KR102570009B1 (ko) | 2019-07-31 | 2023-08-23 | 삼성전자주식회사 | Ar 객체 생성 방법 및 전자 장치 |
CN110519517B (zh) * | 2019-08-30 | 2021-05-07 | 维沃移动通信有限公司 | 临摹引导方法、电子设备及计算机可读存储介质 |
CN110688044B (zh) * | 2019-09-30 | 2021-04-13 | 联想(北京)有限公司 | 一种输入方法及电子设备 |
CN112925414A (zh) * | 2021-02-07 | 2021-06-08 | 深圳创维-Rgb电子有限公司 | 显示屏手势绘画方法、装置及计算机可读存储介质 |
CN112925470B (zh) * | 2021-05-10 | 2021-10-01 | 广州朗国电子科技股份有限公司 | 交互式电子白板的触摸控制方法、***和可读介质 |
CN113360009A (zh) * | 2021-05-31 | 2021-09-07 | 维沃移动通信有限公司 | 图像显示方法和电子设备 |
CN114401443B (zh) * | 2022-01-24 | 2023-09-01 | 脸萌有限公司 | 特效视频处理方法、装置、电子设备及存储介质 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101642326A (zh) * | 2008-08-08 | 2010-02-10 | 许素朱 | 触控式书法音乐桌 |
CN102681702A (zh) * | 2011-03-07 | 2012-09-19 | 联想(北京)有限公司 | 控制方法、控制装置以及电子设备 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3109538B2 (ja) * | 1991-07-12 | 2000-11-20 | ソニー株式会社 | 情報処理装置および入力処理方法 |
JP3895406B2 (ja) * | 1996-03-12 | 2007-03-22 | 株式会社東邦ビジネス管理センター | データ処理装置およびデータ処理方法 |
JP2803718B2 (ja) * | 1996-03-21 | 1998-09-24 | 日本電気株式会社 | 手書き入力表示装置 |
JP2002207565A (ja) * | 2000-12-19 | 2002-07-26 | Internatl Business Mach Corp <Ibm> | 入力システム、電子入力装置、デジタイザ入力用筆記具、デジタイザ、座標入力方法、座標情報伝送方法、および記憶媒体 |
US8466893B2 (en) * | 2004-06-17 | 2013-06-18 | Adrea, LLC | Use of a two finger input on touch screens |
KR20070036077A (ko) * | 2004-06-29 | 2007-04-02 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | 그래픽 사용자 인터페이스의 다중층 디스플레이 |
JP4534954B2 (ja) * | 2005-10-31 | 2010-09-01 | ぺんてる株式会社 | 手書き筆跡入力システム |
JP2008065730A (ja) * | 2006-09-11 | 2008-03-21 | Nec Corp | 携帯通信端末装置、携帯通信端末装置における座標入力方法および座標入力装置 |
US7697002B2 (en) * | 2007-01-25 | 2010-04-13 | Ricoh Co. Ltd. | Varying hand-drawn line width for display |
JP2009116769A (ja) * | 2007-11-09 | 2009-05-28 | Sony Corp | 入力装置、入力装置の制御方法、及びプログラム |
US8363019B2 (en) * | 2008-05-26 | 2013-01-29 | Lg Electronics Inc. | Mobile terminal using proximity sensor and method of controlling the mobile terminal |
US8610744B2 (en) * | 2009-07-10 | 2013-12-17 | Adobe Systems Incorporated | Methods and apparatus for natural media painting using proximity-based tablet stylus gestures |
JP4947668B2 (ja) * | 2009-11-20 | 2012-06-06 | シャープ株式会社 | 電子機器、表示制御方法、およびプログラム |
US8487889B2 (en) * | 2010-01-15 | 2013-07-16 | Apple Inc. | Virtual drafting tools |
CA2802041A1 (en) * | 2010-06-11 | 2011-12-15 | Precision Dermatology, Inc. | High oil-content emollient aerosol foam compositions |
JP2012048623A (ja) * | 2010-08-30 | 2012-03-08 | Sony Corp | 情報処理装置、パラメータ設定方法、及びプログラム |
US10345912B2 (en) * | 2011-03-07 | 2019-07-09 | Lenovo (Beijing) Co., Ltd. | Control method, control device, display device and electronic device |
- 2012
  - 2012-10-31 KR KR1020137032315A patent/KR101522919B1/ko active IP Right Grant
  - 2012-10-31 WO PCT/CN2012/083876 patent/WO2014067110A1/zh active Application Filing
  - 2012-10-31 JP JP2014543756A patent/JP2015504565A/ja active Pending
  - 2012-10-31 CN CN2012800029776A patent/CN103403650A/zh active Pending
  - 2012-10-31 EP EP12876594.8A patent/EP2752740A4/en not_active Withdrawn
- 2013
  - 2013-12-16 US US14/108,082 patent/US20140123079A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
EP2752740A4 (en) | 2015-03-11 |
EP2752740A8 (en) | 2014-08-13 |
EP2752740A1 (en) | 2014-07-09 |
KR20140082596A (ko) | 2014-07-02 |
KR101522919B1 (ko) | 2015-05-22 |
JP2015504565A (ja) | 2015-02-12 |
CN103403650A (zh) | 2013-11-20 |
US20140123079A1 (en) | 2014-05-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014067110A1 (zh) | 绘图控制方法、装置及移动终端 | |
US11550399B2 (en) | Sharing across environments | |
JP2019220237A (ja) | 文字入力インターフェース提供方法及び装置 | |
US10748506B2 (en) | Input display device and input display method | |
US20160103655A1 (en) | Co-Verbal Interactions With Speech Reference Point | |
WO2015161653A1 (zh) | 一种终端操作方法及终端设备 | |
US20170047065A1 (en) | Voice-controllable image display device and voice control method for image display device | |
KR20040063153A (ko) | 제스쳐에 기초를 둔 사용자 인터페이스를 위한 방법 및 장치 | |
US20140354553A1 (en) | Automatically switching touch input modes | |
US20150177843A1 (en) | Device and method for displaying user interface of virtual input device based on motion recognition | |
KR20160086090A (ko) | 이미지를 디스플레이하는 사용자 단말기 및 이의 이미지 디스플레이 방법 | |
US20180052598A1 (en) | Multi-touch based drawing input method and apparatus | |
CN112587925A (zh) | 引导信息的显示方法、装置、存储介质及计算机设备 | |
CN110427138A (zh) | 翻译信息处理方法、装置、电子设备及存储介质 | |
US20170038962A1 (en) | System and method for receiving a touch input at a location of a cursor extended from the touch input on a touchscreen device | |
CN107391015B (zh) | 一种智能平板的控制方法、装置、设备及存储介质 | |
TW201520877A (zh) | 手勢操作方法及游標的呼叫方法 | |
US20190265881A1 (en) | Information processing apparatus, information processing method, and storage medium | |
JP2018005663A (ja) | 情報処理装置、表示システム、プログラム | |
WO2023024536A1 (zh) | 一种绘图方法、装置、计算机设备及存储介质 | |
CN104484117B (zh) | 人机交互方法及装置 | |
JP2018005660A (ja) | 情報処理装置、プログラム、位置情報作成方法、情報処理システム | |
Lee et al. | Mouse operation on monitor by interactive analysis of intuitive hand motions | |
KR101844651B1 (ko) | 모바일 클라우드 컴퓨팅 클라이언트 환경에서 3d 터치를 이용한 모바일 기기의 마우스 입력장치 및 입력방법 | |
CN111399728B (zh) | 设置方法、电子设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2012876594 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 20137032315 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2014543756 Country of ref document: JP Kind code of ref document: A |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12876594 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |