CN109558061A - Operation control method and terminal - Google Patents

Operation control method and terminal

Info

Publication number
CN109558061A
CN109558061A (application CN201811452861.8A)
Authority
CN
China
Prior art keywords
screen
touch
target object
terminal
control body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811452861.8A
Other languages
Chinese (zh)
Other versions
CN109558061B (en)
Inventor
龚贺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811452861.8A
Publication of CN109558061A
Application granted
Publication of CN109558061B
Current legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

Embodiments of the present invention provide an operation control method and a terminal. The method includes: displaying a target object at a second position of a second screen; receiving a touch operation of a user, where the touch operation includes a first touch operation of the user at a first position of a first screen and/or a second touch operation of the user at the second position of the second screen, and the first position of the first screen corresponds to the second position of the second screen; in response to the touch operation, executing a first instruction corresponding to the target object; and displaying an execution result of the first instruction on the second screen. In the embodiments of the present invention, the touch operation that generates an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction between the multiple screens of the terminal.

Description

Operation control method and terminal
Technical field
Embodiments of the present invention relate to the field of communication technology, and in particular to an operation control method and a terminal.
Background
With the rapid development of science and technology and of the communications industry, the functions and forms of terminals (such as mobile phones and tablet computers) have undergone dramatic changes. With continuous advances in manufacturing technology, terminal screens have gradually evolved from capacitive panels to flexible screens, foldable screens, and dual screens. Dual-screen mobile phones are becoming more and more common and offer people a richer multi-screen experience. In the prior art, however, the two screens are operated separately; that is, when content displayed on a screen is operated, the operation can only be performed on that very screen, and cooperation and interaction between the two screens are lacking.
Summary of the invention
Embodiments of the present invention provide an operation control method and a terminal, so as to solve the problem in the prior art that a dual-screen terminal lacks cooperation and interaction between its screens.
To solve the above technical problem, the present invention adopts the following technical solutions.
In a first aspect, an embodiment of the present invention provides an operation control method applied to a terminal, where the terminal at least includes a first screen and a second screen that are oppositely arranged, and the method includes:
displaying a target object at a second position of the second screen;
receiving a touch operation of a user, where the touch operation includes a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at the second position of the second screen, and the first position of the first screen corresponds to the second position of the second screen;
in response to the touch operation, executing a first instruction corresponding to the target object; and
displaying an execution result of the first instruction on the second screen.
In a second aspect, an embodiment of the present invention provides a terminal, where the terminal at least includes a first screen and a second screen that are oppositely arranged, and the terminal further includes:
a first display module, configured to display a target object at a second position of the second screen;
a first receiving module, configured to receive a touch operation of a user, where the touch operation includes a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at the second position of the second screen, and the first position of the first screen corresponds to the second position of the second screen;
an execution module, configured to execute, in response to the touch operation, a first instruction corresponding to the target object; and
a second display module, configured to display an execution result of the first instruction on the second screen.
In a third aspect, an embodiment of the present invention provides a terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the operation control method described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the operation control method described above.
In the embodiments of the present invention, the touch operation that generates an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction between the multiple screens of the terminal.
Brief description of the drawings
Fig. 1 shows a flowchart of an operation control method provided by an embodiment of the present invention;
Fig. 2 shows a first schematic diagram of a picture displayed on the second screen according to an embodiment of the present invention;
Fig. 3 shows a second schematic diagram of a picture displayed on the second screen according to an embodiment of the present invention;
Fig. 4 shows a schematic diagram of gesture recognition according to an embodiment of the present invention;
Fig. 5 shows a third schematic diagram of a picture displayed on the second screen according to an embodiment of the present invention;
Fig. 6 shows a fourth schematic diagram of a picture displayed on the second screen according to an embodiment of the present invention;
Fig. 7 shows a first block diagram of a terminal according to an embodiment of the present invention;
Fig. 8 shows a second block diagram of a terminal according to an embodiment of the present invention.
Detailed description of the embodiments
Exemplary embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be implemented in various forms and should not be limited by the embodiments set forth herein; rather, these embodiments are provided so that the present invention will be thoroughly understood and its scope fully conveyed to those skilled in the art.
According to one aspect of the embodiments of the present invention, an operation control method applied to a terminal is provided.
The terminal at least includes a first screen and a second screen that are oppositely arranged, that is, the first screen and the second screen are on opposite sides of the terminal; for example, the first screen is on the front of the terminal and the second screen is on the back, or the first screen is on the back of the terminal and the second screen is on the front. The first screen and the second screen are controlled by two independent touch chips, and the two touch chips are controlled by the same CPU. The touch chips of the first screen and the second screen are enabled at the same time and capture the user's touch information.
In the embodiments of the present invention, the terminal may be a mobile terminal (such as a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, or a pedometer), a desktop computer, a smart television, or another electronic device.
As shown in Fig. 1, the operation control method includes the following steps.
Step 101: displaying a target object at a second position of the second screen.
The target object described here includes, but is not limited to, at least one of a lock-screen interface of the terminal, a main interface of the terminal, an application program interface, a virtual key, and an object used for message prompting.
Step 102: receiving a touch operation of a user.
The touch operation described here includes a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at the second position of the second screen. That is, in the embodiments of the present invention, the target object displayed on the second screen can be controlled by input on the first screen or by input on the second screen.
In the embodiments of the present invention, a correspondence between position coordinates on the first screen and position coordinates on the second screen is established in advance, so that the display position of the target object on the second screen (i.e., the second position) has a corresponding position on the first screen (i.e., the first position); in other words, the first position of the first screen corresponds to the second position of the second screen.
The touch operation includes, but is not limited to, a slide operation on the screen, a single-click operation, a double-click operation, a pressing operation, and so on. These operations may be single-point touch operations, such as a slide, single click, double click, or press performed with one finger on the screen, or multi-point touch operations, such as a slide, single click, double click, or press performed with two fingers on the screen at the same time.
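Purely as an illustration, and not as part of the claimed method, the following Kotlin sketch shows one possible way the touch operation types listed above could be distinguished from raw pointer samples; the thresholds, type names, and classification rules are assumptions introduced for this example.

```kotlin
// Hypothetical sketch only: one way to classify the touch operations listed above
// from raw pointer samples. The thresholds and names are illustrative assumptions.
enum class TouchOp { SLIDE, SINGLE_CLICK, DOUBLE_CLICK, PRESS }

data class PointerSample(
    val x: Float,
    val y: Float,
    val timeMs: Long,
    val pointerCount: Int   // pointerCount > 1 marks a multi-point touch operation
)

fun classify(samples: List<PointerSample>, lastTapEndMs: Long): TouchOp {
    val start = samples.first()
    val end = samples.last()
    val durationMs = end.timeMs - start.timeMs
    val distance = kotlin.math.hypot(end.x - start.x, end.y - start.y)
    return when {
        distance > 24f -> TouchOp.SLIDE                           // moved far enough: a slide
        durationMs > 500 -> TouchOp.PRESS                         // held long enough: a press
        start.timeMs - lastTapEndMs < 300 -> TouchOp.DOUBLE_CLICK // quick second tap
        else -> TouchOp.SINGLE_CLICK
    }
}
```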
Step 103: in response to the touch operation, executing a first instruction corresponding to the target object.
When the terminal receives the touch operation, it determines whether the touch operation, applied to the target object, generates an instruction. If it does, the terminal responds to the touch operation and executes the instruction corresponding to the target object (i.e., the first instruction); if it does not, the terminal ignores the touch operation.
For example, if the target object is a lock-screen interface, the terminal determines whether the touch operation generates a corresponding instruction in the locked state (such as an instruction to unlock the screen); if the target object is the home page of the terminal, the terminal determines whether the touch operation generates a corresponding instruction while the home page is displayed (such as an instruction to switch the desktop page); if the target object is an application program interface, the terminal determines whether the touch operation generates an instruction related to that application program.
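The following Kotlin sketch illustrates, purely as an assumed example that reuses the TouchOp type from the previous sketch, how the terminal might map a received touch operation to a first instruction according to the target object currently displayed at the corresponding position; the object and instruction names are illustrative, not part of the claimed method.

```kotlin
// Hypothetical dispatch sketch: the first instruction generated by a touch operation
// depends on the target object displayed at the corresponding position.
sealed class TargetObject {
    object LockScreen : TargetObject()
    object HomePage : TargetObject()
    data class AppInterface(val packageName: String) : TargetObject()
}

sealed class Instruction {
    object Unlock : Instruction()
    object SwitchDesktopPage : Instruction()
    data class AppSpecific(val action: String) : Instruction()
}

fun firstInstructionFor(target: TargetObject, op: TouchOp): Instruction? = when (target) {
    is TargetObject.LockScreen ->
        if (op == TouchOp.SLIDE) Instruction.Unlock else null            // e.g. slide to unlock
    is TargetObject.HomePage ->
        if (op == TouchOp.SLIDE) Instruction.SwitchDesktopPage else null // e.g. slide to switch pages
    is TargetObject.AppInterface ->
        Instruction.AppSpecific("mark_message_read")                     // the application defines its own action
}
// A null result means the touch operation generated no instruction and is ignored (step 103).
```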
In the embodiments of the present invention, the target object displayed on the second screen can be operated by touch through the first screen, which increases the cooperation and interaction between the multiple screens of the terminal.
Step 104: displaying an execution result of the first instruction on the second screen.
In this step, the terminal executes the first instruction and controls the second screen to display the execution result of the first instruction.
Continuing the foregoing examples: if the first instruction is determined to be an instruction to unlock the screen, the second screen is controlled to unlock; if the first instruction is an instruction to switch the desktop page, the desktop page displayed on the second screen is switched; if the first instruction is to mark an unread message as read, the unread message displayed in the application program interface on the second screen is marked as read.
To further explain the method described in steps 101 to 104, an example is given below.
For example, the terminal has two screens, where the first screen is on the back of the terminal and the second screen is on the front. When using the terminal, the user usually faces the front of the terminal with the back facing away, and when gripping the terminal the thumb rests on the side of the second screen while the remaining four fingers rest on the side of the first screen. Assume the user is reading a novel in a reading application whose interface is displayed on the second screen. Normally, pages are turned by sliding left and right on the second screen. In this embodiment of the present invention, the page-turning touch control can instead be assigned to the first screen: when the user wants to turn a page, a finger on the side of the first screen slides left or right on the first screen, and the novel displayed on the second screen is controlled to turn the page. In this way, the user's finger no longer blocks the text, as it would when turning pages by touch on the second screen. Moreover, touch operations on the second screen are usually performed with the thumb alone, while the other fingers stay idle because it is inconvenient for them to reach the second screen from the side of the first screen; the method provided by this embodiment of the present invention lets the fingers other than the thumb take part in touch operations, enriching the interaction between the fingers and the screens.
As another example, when the user watches a video played on the second screen (the video picture being the target object), adjusting the screen brightness or the volume by touch is usually done on the second screen. In this embodiment of the present invention, the touch control for adjusting the screen brightness or the volume can instead be assigned to the first screen: when the user wants to adjust the brightness or the volume, a finger on the side of the first screen slides up or down at different locations on the first screen, and the brightness or volume of the second screen is adjusted accordingly, improving the interactivity between the two screens.
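A minimal Kotlin sketch of the two examples above is given below, assuming hypothetical Reader and Player interfaces rather than any real API: swipes captured on the first (rear) screen are routed to page turning, brightness adjustment, or volume adjustment of the content shown on the second (front) screen.

```kotlin
// Hypothetical routing sketch for the two examples above. Reader and Player are assumed
// interfaces, not a real API: swipes captured on the first (rear) screen drive the content
// shown on the second (front) screen.
interface Reader { fun nextPage(); fun previousPage() }
interface Player { fun adjustBrightness(delta: Float); fun adjustVolume(delta: Float) }

class RearScreenRouter(private val reader: Reader?, private val player: Player?) {
    // dx and dy are normalized swipe deltas reported by the first screen's touch chip;
    // xFraction is the horizontal start position of the swipe as a fraction of screen width.
    fun onRearSwipe(dx: Float, dy: Float, xFraction: Float) {
        if (kotlin.math.abs(dx) > kotlin.math.abs(dy)) {
            // Horizontal swipe on the back: turn the page without covering the text.
            if (dx < 0f) reader?.nextPage() else reader?.previousPage()
        } else {
            // Vertical swipe on the back: one side adjusts brightness, the other adjusts volume.
            if (xFraction < 0.5f) player?.adjustBrightness(-dy) else player?.adjustVolume(-dy)
        }
    }
}
```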
In the embodiments of the present invention, the touch operation that generates an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction between the multiple screens of the terminal, makes fuller use of the fingers, and further improves the user experience.
Preferably, in the embodiments of the present invention, the projections of the first position and the second position on a first plane overlap, where the first plane is parallel to the first screen or the second screen.
For example, when the first screen and the second screen have the same size, are parallel to each other, and have their edges aligned, the projections of the first position and the second position on the first plane overlap.
When the first screen and the second screen differ in size (for example, the same shape but different areas), the projections of the first position and the second position on the first plane may still overlap. For instance, when the first screen has the same shape as the second screen but a larger area, a touch region can be defined on the first screen that has the same area and shape as the second screen and whose edges are aligned with the edges of the second screen; the user manipulates the target object within this touch region, and the projections of the first position and the second position on the first plane then overlap.
Of course, it should be understood that when the first screen and the second screen differ in size, the projections of the first position and the second position on the first plane may also fail to overlap. For example, when the first screen has the same shape as the second screen but a smaller area, and especially when the two areas differ greatly, it is difficult to define on the first screen a touch region with the same area and shape as the second screen and with aligned edges; in that case, the first position and the second position, determined according to the pre-established correspondence between position coordinates on the first screen and position coordinates on the second screen, may have projections on the first plane that do not overlap.
It should be noted that, in either of the above cases, the correspondence between the first position and the second position is determined according to the correspondence pre-established between position coordinates on the first screen and position coordinates on the second screen.
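The following Kotlin sketch illustrates, under the assumption of a simple linear mapping, how such a pre-established correspondence between position coordinates on the two screens might be implemented; an actual implementation could use any pre-established table or formula.

```kotlin
// Hypothetical sketch of a pre-established coordinate correspondence, assuming a simple
// linear mapping; any pre-established table or formula could serve the same purpose.
data class ScreenSize(val width: Float, val height: Float)

class ScreenMapper(private val first: ScreenSize, private val second: ScreenSize) {
    // Maps a first-screen coordinate (the first position) to the second-screen coordinate
    // it controls (the second position). With equal sizes and aligned edges this is the
    // identity mapping, i.e. the two projections on the first plane overlap.
    fun firstToSecond(x1: Float, y1: Float): Pair<Float, Float> =
        (x1 * second.width / first.width) to (y1 * second.height / first.height)
}

// Example: a 60 x 130 mm rear touch area mapped onto a 65 x 140 mm front display.
val mapper = ScreenMapper(ScreenSize(60f, 130f), ScreenSize(65f, 140f))
val secondPos = mapper.firstToSecond(30f, 65f)   // roughly the centre of the second screen
```

In practice the horizontal axis of the rear screen might additionally need to be mirrored, since the two screens face in opposite directions; that detail is omitted from the sketch.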
Further, the target object described in the embodiments of the present invention may be not only the lock-screen interface of the terminal, the main interface of the terminal, or an application program interface, but also a virtual key or an object used for message prompting (such as a message prompt box, a message notification bar, or a message notification icon). To explain this further, examples are given below.
For example, a racing game application demands high operability and generally has at least eight virtual keys. These eight virtual keys usually have to be operated with the two thumbs, while the remaining fingers stay idle. Assume the game interface is currently displayed on the second screen. In this embodiment of the present invention, the touch control of at least one virtual key in the game interface can be assigned to the first screen; when that virtual key needs to be touched, a finger on the side of the first screen performs the touch directly, improving the interactivity between the fingers and the screens. So that the user knows which virtual keys in the game interface have been reassigned, the virtual keys whose touch control has been assigned to the first screen are still shown on the second screen; in Fig. 2, the circular patterns drawn with lighter lines are the virtual keys whose touch control has been assigned to the first screen. Preferably, to distinguish them from the virtual keys whose touch control remains on the second screen, the virtual keys assigned to the first screen can be displayed differently, for example with adjusted transparency or a distinguishing color; the specific presentation can be designed according to actual needs.
As another example, as shown in Fig. 3, while the user is playing a game displayed on the second screen, the terminal receives a new message from another application (such as WeChat) and issues a message prompt (for example, through a message prompt box). In this embodiment of the present invention, the message prompt can be handled by a touch operation on the first screen: to check the new message, the user performs a single-click operation on the first screen at the position corresponding to the position where the message prompt box is displayed on the second screen; the terminal responds to the single-click operation, generates an instruction to display the message interface, and displays the message interface on the second screen for the user to read the message. In addition, the user can drag, through the first screen, the message prompt box, message notification bar, or message notification icon displayed on the second screen and move it so that it no longer blocks the interface the user is watching. Of course, it should be understood that the message prompt box, message notification bar, or message notification icon can also be controlled by a touch operation performed on the second screen.
Preferably, because the user cannot see the touch body on the side of the first screen, controlling the target object displayed on the second screen with that touch body can be difficult, which reduces touch accuracy. To solve this technical problem, in the embodiments of the present invention, characteristic information of the touch body located on the side of the first screen can also be detected, and according to the detected characteristic information, a virtual image matching the touch body is displayed at a third position on the second screen. When the position of the touch body changes, the virtual image changes accordingly; in this way, the user can better control, based on the virtual image, how the touch body on the side of the first screen operates the target object, improving the accuracy of the user's touch on the first screen.
The touch body described here includes, but is not limited to, the user's hand, a stylus, or another conductive object. The third position described here corresponds to a fourth position on the first screen, and the fourth position is the position of the orthographic projection of the touch body on the first screen.
It should be noted that a correspondence between position coordinates on the first screen and position coordinates on the second screen is established in advance. That is, when the virtual image matching the touch body is displayed on the second screen, the virtual image is mapped, according to this pre-established correspondence, from the fourth position where the orthographic projection of the touch body falls on the first screen to the corresponding third position on the second screen and displayed there.
To prevent the virtual image from blocking the picture displayed on the second screen, the transparency of the virtual image can be set.
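Purely as an assumed example, the following Kotlin sketch shows how the fourth position (the orthographic projection of the touch body on the first screen) could be mapped through the pre-established correspondence to the third position on the second screen and rendered there as a semi-transparent virtual image; VirtualImageView and the chosen transparency value are illustrative assumptions, not a platform API.

```kotlin
// Hypothetical rendering sketch: the fourth position is mapped to the third position on
// the second screen, where a semi-transparent virtual image is drawn.
class VirtualImageView {
    var x = 0f
    var y = 0f
    var alpha = 1f
    var visible = false
}

class HoverFeedback(
    private val firstToSecond: (Float, Float) -> Pair<Float, Float>,  // pre-established correspondence
    private val image: VirtualImageView
) {
    // Called whenever the first screen reports the projected position of the touch body.
    fun onTouchBodyProjection(fourthX: Float, fourthY: Float, withinPresetDistance: Boolean) {
        val (thirdX, thirdY) = firstToSecond(fourthX, fourthY)
        image.x = thirdX
        image.y = thirdY
        image.alpha = 0.4f                    // partial transparency so the underlying picture stays visible
        image.visible = withinPresetDistance  // hide the image once the touch body moves out of range
    }
}
```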
Specifically, the characteristic information of the touch body described here includes at least one of the following: depth information of the touch body within a preset distance from the first screen, and position information of the touch body touching the first screen.
When the characteristic information of the touch body is the aforementioned depth information, a touch-body virtual image matching the touch body is displayed, for example the finger image drawn with lighter lines shown in Fig. 5. When the characteristic information of the touch body is the aforementioned position information, a virtual image indicating the touch position of the touch body is displayed, for example the rectangular image shown in Fig. 6.
When the touch body is the user's hand, the depth information of the touch body is the depth information of the fingers of the hand, and the position information of the touch body touching the first screen is the position information of the part of the user's hand (such as a fingertip or a knuckle) touching the first screen.
In the embodiments of the present invention, the first screen and the second screen of the terminal are capacitive screens. The principle by which a screen recognizes a hovering touch body is explained below, taking a hand as an example.
As shown in Fig. 4, although the finger does not actually touch the first screen, capacitors are formed between the finger and the first screen because the finger is close to it. Since different parts of the finger are at different distances from the first screen, the capacitance values of these capacitors differ. The distance between the finger and the first screen is calculated from the capacitance values, yielding three-dimensional depth information of the finger, and the depth information of each point is stored in the memory of the terminal. Through processing by the CPU, software models a virtual finger image (with the same size as a human finger), and this virtual finger image is finally mapped to the corresponding position on the second screen.
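The following Kotlin sketch illustrates the hover-sensing principle described above under an assumed inverse-proportional relationship between capacitance change and distance; a real touch controller would apply its own calibration, so the constant and the data layout here are assumptions for this example only.

```kotlin
// Hypothetical sketch of the hover-sensing principle: each electrode of the capacitive
// panel reports a capacitance change, which is converted to an approximate finger-to-screen
// distance. The inverse-proportional model and the constant k are illustrative assumptions.
data class DepthPoint(val col: Int, val row: Int, val distanceMm: Float)

fun estimateDepthMap(
    capacitance: Array<FloatArray>,   // current reading per electrode
    baseline: Array<FloatArray>,      // reading with no touch body nearby
    k: Float = 5f
): List<DepthPoint> {
    val points = mutableListOf<DepthPoint>()
    for (row in capacitance.indices) {
        for (col in capacitance[row].indices) {
            val delta = capacitance[row][col] - baseline[row][col]
            if (delta > 0f) {
                // Larger capacitance change means the finger is closer: model distance as k / delta.
                points += DepthPoint(col, row, k / delta)
            }
        }
    }
    return points   // three-dimensional depth information, to be stored and rendered as a virtual finger
}
```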
Further, in one embodiment of the present invention, the target object includes a first target object and a second target object, where the first target object and the second target object are superimposed at the second position on the second screen.
The user may encounter the following situation when using the terminal: for example, while the user is playing a game, the terminal receives a new message from another application and issues a message prompt (for example, through a message prompt box), and the object used for message prompting happens to cover a virtual key in the game interface. For this situation, the embodiments of the present invention provide two solutions, described in detail below.
Solution 1 includes: receiving a first pressing operation of the user at the first position on the first screen; and, in response to the first pressing operation, determining, according to the pressing force of the first pressing operation, a target object to be manipulated that corresponds to the pressing force, and executing an instruction corresponding to the target object to be manipulated. The target object to be manipulated described here is the first target object or the second target object.
Solution 2 includes:
receiving a second pressing operation of the user at the second position on the second screen; and, in response to the second pressing operation, determining, according to the pressing force of the second pressing operation, a target object to be manipulated that corresponds to the pressing force, and executing an instruction corresponding to the target object to be manipulated. The target object to be manipulated described here is the first target object or the second target object.
It can be seen from Solution 1 and Solution 2 that, in the embodiments of the present invention, the user can select the target object to be manipulated according to the pressing force. For example, it may be preset that when the pressing force is less than a preset pressure value, the first target object is the target object selected for manipulation, and when the pressing force is greater than or equal to the preset pressure value, the second target object is the target object selected for manipulation. In this way, when the pressing force of the user at the first position on the first screen (or at the second position on the second screen) is less than the preset pressure value, the first target object is determined to be the selected target object and the instruction corresponding to the first target object is executed; when the pressing force is greater than or equal to the preset pressure value, the second target object is determined to be the selected target object and the instruction corresponding to the second target object is executed.
To better understand the above technical solution, an example is given below.
For example, when the message prompt box overlaps a virtual key in the game interface, and the pressing operation is performed on the first screen, it can be set that a light press on the first screen at the position corresponding to the overlap of the message prompt box and the virtual key causes the second screen to display the result of responding to the WeChat function, while a hard press at that position causes the second screen to display the result of responding to the game function. If the pressing operation is performed on the second screen, it can be set that a hard press at the overlap causes the second screen to display the result of responding to the WeChat function, while a light press at the overlap causes the second screen to display the result of responding to the game function. Of course, it should be understood that the target objects corresponding to the light press and the hard press can be exchanged, and the assignment is not limited to the above.
The user can set, through training, the light-press and hard-press force thresholds according to the user's own usage habits, and can choose whether to enable this function.
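A minimal Kotlin sketch of Solutions 1 and 2 is given below, assuming a single pressure threshold that the user can train; the default threshold, the calibration rule, and the assignment of light and hard presses to the first and second target objects are illustrative assumptions that can be swapped, as described above.

```kotlin
// Hypothetical sketch: when two target objects are superimposed at the second position,
// the pressing force selects which one is manipulated.
class OverlapSelector(private var threshold: Float = 0.5f) {
    // Light press selects the first superimposed target object, hard press the second.
    fun <T> select(pressingForce: Float, first: T, second: T): T =
        if (pressingForce < threshold) first else second

    // "Training": derive the threshold from samples of the user's own light and hard presses.
    fun calibrate(lightPresses: List<Float>, hardPresses: List<Float>) {
        threshold = ((lightPresses.average() + hardPresses.average()) / 2).toFloat()
    }
}

// Example: a message prompt box is superimposed on a game virtual key at the second position.
val selector = OverlapSelector()
val chosen = selector.select(0.8f, "game virtual key", "message prompt box")  // hard press picks the message
```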
Preferably, in the embodiments of the present invention, while the user provides the first input on the first screen, the second screen can be in a screen-off state. In addition, when the user performs a touch operation on the first screen, the screen backlight of the first screen is kept off while its touch chip remains on, which saves power for the terminal.
In summary, in the embodiments of the present invention, the touch operation that generates an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction between the multiple screens of the terminal, makes fuller use of the fingers, and improves the user experience. Further, in the embodiments of the present invention, the state of the touch body on the side of the first screen can be displayed on the second screen, which helps the user better control how the touch body operates on the first screen.
According to another aspect of the embodiments of the present invention, a terminal is provided, which can implement the details of the above operation control method and achieve the same effects.
The terminal at least includes a first screen and a second screen that are oppositely arranged, that is, the first screen and the second screen are on opposite sides of the terminal; for example, the first screen is on the front of the terminal and the second screen is on the back, or the first screen is on the back of the terminal and the second screen is on the front. The first screen and the second screen are controlled by two independent touch chips, and the two touch chips are controlled by the same CPU. The touch chips of the first screen and the second screen can be enabled at the same time and capture the user's touch information.
As shown in Fig. 7, the terminal further includes:
a first display module 701, configured to display a target object at a second position of the second screen;
a first receiving module 702, configured to receive a touch operation of a user,
where the touch operation includes a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at the second position of the second screen, and the first position of the first screen corresponds to the second position of the second screen;
an execution module 703, configured to execute, in response to the touch operation received by the first receiving module 702, a first instruction corresponding to the target object; and
a second display module 704, configured to display an execution result of the first instruction on the second screen.
Further, the terminal further includes:
a detection module, configured to detect characteristic information of a touch body located on the side of the first screen; and
a third display module, configured to display, according to the characteristic information of the touch body detected by the detection module, a virtual image matching the touch body at a third position on the second screen,
where the third position corresponds to a fourth position on the first screen, and the fourth position is the position of the orthographic projection of the touch body on the first screen.
Further, the characteristic information of the touch body includes at least one of the following: depth information of the touch body within a preset distance from the first screen, and position information of the touch body touching the first screen, where the preset distance is greater than or equal to 0.
The third display module includes:
a first display unit, configured to display, when the characteristic information of the touch body is the depth information, a touch-body virtual image matching the touch body; and a second display unit, configured to display, when the characteristic information of the touch body is the position information, a virtual image indicating the touch position of the touch body.
Further, the target object includes a first target object and a second target object, where the first target object and the second target object are superimposed at the second position.
Further, the first receiving module 702 includes:
a first receiving unit, configured to receive a first pressing operation of the user at the first position on the first screen.
The execution module 703 includes:
a first determination unit, configured to determine, in response to the first pressing operation and according to the pressing force of the first pressing operation received by the first receiving unit, a target object to be manipulated that corresponds to the pressing force,
where the target object to be manipulated is the first target object or the second target object; and
a first execution unit, configured to execute an instruction corresponding to the target object to be manipulated.
Further, the first receiving module 702 includes:
a second receiving unit, configured to receive a second pressing operation of the user at the second position on the second screen.
The execution module 703 includes:
a second determination unit, configured to determine, in response to the second pressing operation received by the second receiving unit and according to the pressing force of the second pressing operation, a target object to be manipulated that corresponds to the pressing force,
where the target object to be manipulated is the first target object or the second target object; and
a second execution unit, configured to execute an instruction corresponding to the target object to be manipulated.
Further, the target object includes at least one of a lock-screen interface of the terminal, a main interface of the terminal, an application program interface, a virtual key, and an object used for message prompting.
Preferably, the projections of the first position and the second position on a first plane overlap, where the first plane is parallel to the first screen or the second screen.
Preferably, in the embodiments of the present invention, the second screen is in a screen-off state.
In the embodiments of the present invention, the touch operation that generates an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction between the multiple screens of the terminal, makes fuller use of the fingers, and improves the user experience.
Fig. 8 is a schematic diagram of the hardware structure of a terminal implementing the embodiments of the present invention.
The terminal 800 includes, but is not limited to, a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will understand that the terminal structure shown in Fig. 8 does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiments of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 810 is configured to: after controlling the display unit 806 to display a target object at a second position of the second screen, when the user input unit 807 receives a touch operation of the user, execute, in response to the touch operation, a first instruction corresponding to the target object, and display an execution result of the first instruction on the second screen through the display unit 806.
The touch operation described here includes a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at the second position of the second screen, where the first position of the first screen corresponds to the second position of the second screen.
In the embodiments of the present invention, the touch operation that generates an instruction is performed on one screen of the terminal, and the execution result of the instruction is displayed on another screen, which increases the cooperation and interaction between the multiple screens of the terminal.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 801 may be used to receive and send signals during information transmission and reception or during a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 801 sends the data to the processor 810 for processing, and it also sends uplink data to the base station. Generally, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 801 may communicate with a network and other devices through a wireless communication system.
The terminal provides the user with wireless broadband Internet access through the network module 802, for example helping the user send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802, or stored in the memory 809, into an audio signal and output it as sound. The audio output unit 803 may also provide audio output related to a specific function performed by the terminal 800 (for example, a call signal reception sound or a message reception sound). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is configured to receive audio or video signals. The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042. The graphics processing unit 8041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 806, stored in the memory 809 (or another storage medium), or sent via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data; in a phone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 801 and output.
The terminal 800 further includes at least one sensor 805, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 8061 according to the ambient light, and the proximity sensor can turn off the display panel 8061 and/or the backlight when the terminal 800 is moved to the ear. As a motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally along three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for recognizing the terminal posture (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensor 805 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and so on, which are not described in detail here.
The display unit 806 is configured to display information input by the user or information provided to the user. The display unit 806 may include a display panel 8061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 807 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also known as a touch screen, collects touch operations performed by the user on or near it (such as operations performed on or near the touch panel 8071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 810, and receives and executes commands sent by the processor 810. The touch panel 8071 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel, among other types. In addition to the touch panel 8071, the user input unit 807 may include other input devices 8072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, and a joystick, and are not described in detail here.
Further, the touch panel 8071 may cover the display panel 8061. When the touch panel 8071 detects a touch operation on or near it, it transmits the operation to the processor 810 to determine the type of the touch event, and the processor 810 then provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in Fig. 8 the touch panel 8071 and the display panel 8061 are shown as two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the terminal; this is not limited here.
The interface unit 808 is an interface through which an external device is connected to the terminal 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, and an earphone port. The interface unit 808 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the terminal 800, or to transmit data between the terminal 800 and an external device.
The memory 809 may be used to store software programs and various data. The memory 809 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the terminal (such as audio data and a phone book). In addition, the memory 809 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 810 is the control center of the terminal. It connects the various parts of the entire terminal through various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the software programs and/or modules stored in the memory 809 and calling the data stored in the memory 809, thereby monitoring the terminal as a whole. The processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 810.
The terminal 800 may further include a power supply 811 (such as a battery) that supplies power to the various components. Preferably, the power supply 811 may be logically connected to the processor 810 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
In addition, the terminal 800 includes some functional modules that are not shown, which are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 810, a memory 809, and a computer program stored in the memory 809 and executable on the processor 810. When the computer program is executed by the processor 810, the processes of the above operation control method embodiments are implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the processes of the above operation control method embodiments are implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements that are not explicitly listed, or also includes elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part of it that contributes over the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for enabling a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the specific embodiments described above. The specific embodiments described above are merely illustrative rather than restrictive. Inspired by the present invention, those of ordinary skill in the art can devise many other forms without departing from the spirit of the present invention and the scope protected by the claims, and all of these fall within the protection of the present invention.

Claims (18)

1. An operation control method, applied to a terminal, wherein the terminal at least comprises a first screen and a second screen that are oppositely arranged, and the method comprises:
displaying a target object at a second position of the second screen;
receiving a touch operation of a user, wherein the touch operation comprises a first touch operation of the user at a first position of the first screen and/or a second touch operation of the user at the second position of the second screen, and the first position of the first screen corresponds to the second position of the second screen;
in response to the touch operation, executing a first instruction corresponding to the target object; and
displaying an execution result of the first instruction on the second screen.
2. The method according to claim 1, wherein the method further comprises:
detecting characteristic information of a touch body located on a side of the first screen; and
displaying, according to the characteristic information of the touch body, a virtual image matching the touch body at a third position on the second screen,
wherein the third position corresponds to a fourth position on the first screen, and the fourth position is a position of an orthographic projection of the touch body on the first screen.
3. The method according to claim 2, wherein the characteristic information of the touch body comprises at least one of the following: depth information of the touch body within a preset distance from the first screen, and position information of the touch body touching the first screen; and
the displaying a virtual image matching the touch body comprises:
when the characteristic information of the touch body comprises the depth information, displaying a touch-body virtual image matching the touch body; and
when the characteristic information of the touch body comprises the position information, displaying a virtual image indicating a touch position of the touch body.
4. The method according to claim 1, wherein the target object comprises a first target object and a second target object, and the first target object and the second target object are superimposed at the second position.
5. The method according to claim 4, wherein the receiving a touch operation of a user comprises:
receiving a first pressing operation of the user at the first position on the first screen; and
the in response to the touch operation, executing a first instruction corresponding to the target object comprises:
in response to the first pressing operation, determining, according to a pressing force of the first pressing operation, a target object to be manipulated corresponding to the pressing force, wherein the target object to be manipulated is the first target object or the second target object; and
executing an instruction corresponding to the target object to be manipulated.
6. The method according to claim 4, wherein the receiving a touch operation of a user comprises:
receiving a second pressing operation of the user at the second position on the second screen; and
the in response to the touch operation, executing a first instruction corresponding to the target object comprises:
in response to the second pressing operation, determining, according to a pressing force of the second pressing operation, a target object to be manipulated corresponding to the pressing force, wherein the target object to be manipulated is the first target object or the second target object; and
executing an instruction corresponding to the target object to be manipulated.
7. The method according to claim 1, wherein the target object comprises at least one of a lock-screen interface of the terminal, a main interface of the terminal, an application program interface, a virtual key, and an object used for message prompting.
8. The method according to claim 1, wherein projections of the first position and the second position on a first plane overlap, and the first plane is parallel to the first screen or the second screen.
9. The method according to any one of claims 1 to 8, wherein the second screen is in a screen-off state.
10. A terminal, comprising at least a first screen and a second screen that are arranged opposite to each other, wherein the terminal further comprises:
a first display module, configured to display a target object at a second position on the second screen;
a first receiving module, configured to receive a touch control operation of a user, wherein the touch control operation comprises: a first touch control operation of the user at a first position on the first screen, and/or a second touch control operation of the user at the second position on the second screen, and the first position on the first screen corresponds to the second position on the second screen;
an execution module, configured to execute, in response to the touch control operation, a first instruction corresponding to the target object; and
a second display module, configured to display an execution result of the first instruction on the second screen.
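To make the module structure of claim 10 concrete, the hedged sketch below expresses the four recited modules as Kotlin interfaces wired together by a Terminal class. The interface names mirror the claim wording, but the data types, method signatures and the handleOnce flow are assumptions introduced for the example.

```kotlin
// Placeholder types for the data exchanged between modules (assumptions).
data class TouchOperation(val onFirstScreen: Boolean, val x: Int, val y: Int)
data class ExecutionResult(val description: String)

// The four modules recited in claim 10, modelled as interfaces.
interface FirstDisplayModule { fun showTargetObject(secondPosition: Pair<Int, Int>) }
interface FirstReceivingModule { fun receiveTouchOperation(): TouchOperation }
interface ExecutionModule { fun execute(op: TouchOperation): ExecutionResult }
interface SecondDisplayModule { fun showResult(result: ExecutionResult) }

class Terminal(
    private val firstDisplay: FirstDisplayModule,
    private val receiver: FirstReceivingModule,
    private val executor: ExecutionModule,
    private val secondDisplay: SecondDisplayModule
) {
    // One pass through the pipeline: display the target object, receive the touch
    // control operation, execute the corresponding instruction, show the result.
    fun handleOnce(secondPosition: Pair<Int, Int>) {
        firstDisplay.showTargetObject(secondPosition)
        val op = receiver.receiveTouchOperation()
        secondDisplay.showResult(executor.execute(op))
    }
}
```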
11. The terminal according to claim 10, wherein the terminal further comprises:
a detection module, configured to detect characteristic information of a touch-control body located on a side of the first screen; and
a third display module, configured to display, according to the characteristic information of the touch-control body detected by the detection module, a virtual image matched with the touch-control body at a third position on the second screen;
wherein the third position corresponds to a fourth position on the first screen, and the fourth position is the position of the orthographic projection of the touch-control body on the first screen.
12. The terminal according to claim 11, wherein the characteristic information of the touch-control body comprises at least one of the following: depth information of the touch-control body within a predetermined distance from the first screen, and position information of the touch-control body touching the first screen;
and the third display module comprises:
a first display unit, configured to display a virtual image of the touch-control body matched with the touch-control body when the characteristic information of the touch-control body is the depth information; and
a second display unit, configured to display a virtual image indicating the touch position of the touch-control body when the characteristic information of the touch-control body is the position information.
13. The terminal according to claim 10, wherein the target object comprises a first target object and a second target object, and the first target object and the second target object are displayed superimposed at the second position.
14. The terminal according to claim 13, wherein the first receiving module comprises:
a first receiving unit, configured to receive a first pressing operation of the user at the first position on the first screen;
and the execution module comprises:
a first determination unit, configured to determine, in response to the first pressing operation received by the first receiving unit and according to a pressing force of the first pressing operation, a target object to be manipulated corresponding to the pressing force, wherein the target object to be manipulated is the first target object or the second target object; and
a first execution unit, configured to execute an instruction corresponding to the target object to be manipulated.
15. The terminal according to claim 13, wherein the first receiving module comprises:
a second receiving unit, configured to receive a second pressing operation of the user at the second position on the second screen;
and the execution module comprises:
a second determination unit, configured to determine, in response to the second pressing operation received by the second receiving unit and according to a pressing force of the second pressing operation, a target object to be manipulated corresponding to the pressing force, wherein the target object to be manipulated is the first target object or the second target object; and
a second execution unit, configured to execute an instruction corresponding to the target object to be manipulated.
16. The terminal according to claim 10, wherein the target object comprises at least one of: a lock screen interface of the terminal, a main interface of the terminal, an application program interface, a virtual key, and an object for message notification.
17. The terminal according to claim 10, wherein projections of the first position and the second position on a first plane overlap, and the first plane is parallel to the first screen or the second screen.
18. The terminal according to any one of claims 10 to 17, wherein the second screen is in a screen-off state.
CN201811452861.8A 2018-11-30 2018-11-30 Operation control method and terminal Active CN109558061B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811452861.8A CN109558061B (en) 2018-11-30 2018-11-30 Operation control method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811452861.8A CN109558061B (en) 2018-11-30 2018-11-30 Operation control method and terminal

Publications (2)

Publication Number Publication Date
CN109558061A true CN109558061A (en) 2019-04-02
CN109558061B CN109558061B (en) 2021-05-18

Family

ID=65868261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811452861.8A Active CN109558061B (en) 2018-11-30 2018-11-30 Operation control method and terminal

Country Status (1)

Country Link
CN (1) CN109558061B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012044749A2 (en) * 2010-10-01 2012-04-05 Imerj LLC Window stack modification in response to orientation change
CN103995610A (en) * 2013-02-19 2014-08-20 瀚思科技股份有限公司 Method for user input from alternative touchpads of a handheld computerized device
CN105094654A (en) * 2014-05-07 2015-11-25 中兴通讯股份有限公司 Screen control method and device
CN107077255A (en) * 2017-01-19 2017-08-18 深圳市汇顶科技股份有限公司 A kind of method and device by pressing dynamics control intelligent terminal operation
CN108153466A (en) * 2017-11-28 2018-06-12 北京珠穆朗玛移动通信有限公司 Operating method, mobile terminal and storage medium based on double screen
CN108205419A (en) * 2017-12-21 2018-06-26 中兴通讯股份有限公司 Double screens control method, apparatus, mobile terminal and computer readable storage medium

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110007840A (en) * 2019-04-10 2019-07-12 网易(杭州)网络有限公司 Object control method, apparatus, medium and electronic equipment
CN110083302A (en) * 2019-04-30 2019-08-02 维沃移动通信有限公司 A kind of method, apparatus and terminal executing predetermined registration operation
WO2020224640A1 (en) * 2019-05-08 2020-11-12 安徽华米信息科技有限公司 Display method and apparatus, intelligent wearable device, and storage medium
CN110362231A (en) * 2019-07-12 2019-10-22 腾讯科技(深圳)有限公司 The method and device that new line touch control device, image are shown
CN110362231B (en) * 2019-07-12 2022-05-20 腾讯科技(深圳)有限公司 Head-up touch device, image display method and device
CN110502182A (en) * 2019-08-28 2019-11-26 Oppo(重庆)智能科技有限公司 Operation processing method, device, mobile terminal and computer readable storage medium
CN110502182B (en) * 2019-08-28 2021-06-29 Oppo(重庆)智能科技有限公司 Operation processing method and device, mobile terminal and computer readable storage medium
CN111124243A (en) * 2019-12-18 2020-05-08 华勤通讯技术有限公司 Response method and device
CN113680047A (en) * 2021-09-08 2021-11-23 网易(杭州)网络有限公司 Terminal operation method and device, electronic equipment and storage medium
CN115576451A (en) * 2022-12-09 2023-01-06 普赞加信息科技南京有限公司 Multi-point touch device and system

Also Published As

Publication number Publication date
CN109558061B (en) 2021-05-18

Similar Documents

Publication Publication Date Title
CN109558061A (en) A kind of method of controlling operation thereof and terminal
CN109683847A (en) A kind of volume adjusting method and terminal
CN109375890A (en) A kind of screen display method and Multi-screen electronic equipment
EP3525075B1 (en) Method for lighting up screen of double-screen terminal, and terminal
CN107728886B (en) A kind of one-handed performance method and apparatus
CN109343759A (en) A kind of control method and terminal of the display of breath screen
CN110413168A (en) A kind of icon management method and terminal
CN110737374A (en) Operation method and electronic equipment
CN108415641A (en) A kind of processing method and mobile terminal of icon
CN109800045A (en) A kind of display methods and terminal
CN108762634A (en) A kind of control method and terminal
CN109669747A (en) A kind of method and mobile terminal of moving icon
CN108958614A (en) A kind of display control method and terminal
CN108536366A (en) A kind of application window method of adjustment and terminal
CN109407949A (en) A kind of display control method and terminal
CN110324497A (en) A kind of method of controlling operation thereof and terminal
CN109683802A (en) A kind of icon moving method and terminal
CN110045843A (en) Electronic pen, electronic pen control method and terminal device
CN109743449A (en) A kind of virtual key display methods and terminal
CN108469940A (en) A kind of screenshot method and terminal
CN109710130A (en) A kind of display methods and terminal
CN108509108A (en) A kind of application icon aligning method and mobile terminal
CN108540668B (en) A kind of program starting method and mobile terminal
CN109857317A (en) A kind of control method and terminal device of terminal device
CN110221799A (en) A kind of control method, terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant