CN110908556A - Interaction method, interaction device, mobile terminal and computer readable storage medium - Google Patents


Info

Publication number
CN110908556A
Authority
CN
China
Prior art keywords
instruction
preset
target
information
operation information
Prior art date
Legal status
Pending
Application number
CN201911015509.2A
Other languages
Chinese (zh)
Inventor
夏程杰
钱玲
崔士宾
Current Assignee
Shenzhen Microphone Holdings Co Ltd
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Microphone Holdings Co Ltd filed Critical Shenzhen Microphone Holdings Co Ltd
Priority to CN201911015509.2A priority Critical patent/CN110908556A/en
Publication of CN110908556A publication Critical patent/CN110908556A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention belongs to the field of computer technology and relates to an interaction method, an interaction device, a mobile terminal, and a computer-readable storage medium. The interaction method comprises the following steps: S11, acquiring operation information for a preset operation area; S12, responding to the operation information according to a first preset rule to obtain a target instruction; and S13, controlling a target object according to a second preset rule based on the target instruction. When operation information for the preset operation area is obtained, a target instruction can be derived from the operation information according to the first preset rule, and a target object can then be selected and controlled according to the target instruction and the second preset rule, thereby realizing human-computer interaction. The invention therefore gives human-computer interaction a greater degree of freedom and more control functions, improving the user experience.

Description

Interaction method, interaction device, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to an interaction method, an interaction apparatus, a mobile terminal, and a computer-readable storage medium.
Background
With the development of mobile terminals such as mobile phones, their display screens come in an ever-growing variety of styles. Display screens of current mainstream mobile terminals include the water-drop screen, the notch screen, the hole-punch screen, and the like. Moreover, driven by the demand for full-screen display, the hole-punch screen, whose camera cutout occupies only a small area, is sought after by a growing number of users.
At present, mobile terminals such as mobile phones generally provide human-computer interaction functions, but the operation modes of these functions are relatively fixed (for example, the user must tap a function control displayed on the touch screen to perform the corresponding control), so they cannot fully accommodate users' operating habits, and the user experience suffers.
In view of the above problems, those skilled in the art have sought solutions.
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide an interaction method, an interaction apparatus, a mobile terminal and a computer-readable storage medium, aiming at improving the user experience.
The invention is realized as follows:
the invention provides an interaction method, which is applied to a mobile terminal and comprises the following steps: and S11, acquiring operation information for the preset operation area. And S12, responding to the operation information according to the first preset rule to obtain the target instruction. And S13, controlling the target object according to a second preset rule based on the target instruction.
Further, before the step of S11, the method further includes: display instructions and/or status information are obtained. And displaying the preset operation area according to the display instruction and/or the state information. The preset operation area comprises at least one of a mobile terminal screen punching area and a mobile terminal preset screen area.
Further, the first preset rule includes: when the operation information meets a preset setting condition, triggering a setting target instruction; or, when the operation information meets a preset starting condition, triggering a starting target instruction; or, when the operation information meets a preset control condition, triggering a control target instruction.
Further, the second preset rule includes: the target object includes: at least one of foreground object, background object and preset object. And/or, if the target instruction is a set target instruction, acquiring and storing foreground object information and corresponding operation information of the mobile terminal, and/or background object information and corresponding operation information.
Further, the foreground object information and/or the background object information includes: and starting an entrance of a foreground object and/or a background object. And/or, a launch entry for a function of a foreground object and/or a background object.
Further, the second preset rule includes: if the target instruction is a starting target instruction, matching the operation information against the operation information corresponding to the target object; if the matching succeeds, starting and/or displaying the corresponding target object; or, if the matching fails, outputting prompt information.
Further, the second preset rule includes: and if the target instruction is a control target instruction, searching and outputting the corresponding control instruction according to the operation information.
The invention also provides an interaction device, comprising: and the display module is used for displaying the preset operation area and/or the target object. The operation receiving module is used for receiving operation information of a preset operation area. And the control module is used for controlling the target object according to the operation information.
Further, the preset operation area comprises at least one of a screen punching area and a preset screen area. And/or, the target object comprises: at least one of foreground object, background object and preset object, and starting inlet of foreground object and/or background object. And/or, a launch entry for a function of a foreground object and/or a background object.
Further, controlling the target object according to the operation information comprises triggering a target instruction through the operation information, the target instruction comprising at least one of a setting instruction, a starting instruction, and a control instruction. If the target instruction is a setting target instruction, foreground object information and its corresponding operation information, and/or background object information and its corresponding operation information, are acquired and stored. Or, if the target instruction is a starting target instruction, the operation information is matched against the operation information corresponding to the target object; if the matching succeeds, the corresponding target object is started and/or displayed; if the matching fails, prompt information is output. Or, if the target instruction is a control target instruction, the corresponding control instruction is looked up according to the operation information and output.
The invention also provides a mobile terminal comprising a memory, a processor, and a control program stored in the memory and runnable on the processor; when the control program is executed by the processor, the interaction method described above is implemented.
The invention also provides a computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the interaction method as described above.
The invention provides an interaction method, an interaction device, a terminal, and a computer-readable storage medium, wherein the interaction method comprises the following steps: S11, acquiring operation information for a preset operation area; S12, responding to the operation information according to a first preset rule to obtain a target instruction; and S13, controlling a target object according to a second preset rule based on the target instruction. When operation information for the preset operation area is obtained, a target instruction can be derived from the operation information according to the first preset rule, and a target object can then be selected and controlled according to the target instruction and the second preset rule, thereby realizing human-computer interaction. The invention therefore gives human-computer interaction a greater degree of freedom and more control functions, improving the user experience.
In order to make the aforementioned and other objects, features and advantages of the invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a schematic flow chart of an interaction method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram of a hole-punch screen provided by the first embodiment of the present invention;
FIG. 3 is a first schematic diagram of a predetermined operation region according to a first embodiment of the present invention;
FIG. 4 is a second schematic diagram of a preset operation region provided by the first embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an interaction device according to a second embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal according to a third embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the described embodiments are only some embodiments of the invention, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present invention.
The embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The first embodiment:
fig. 1 is a schematic flow chart of an interaction method according to a first embodiment of the present invention. Fig. 2 is a schematic diagram of a punching screen provided by a first embodiment of the present invention. Fig. 3 is a first schematic diagram of a preset operation region according to a first embodiment of the present invention. Fig. 4 is a second schematic diagram of the preset operation region according to the first embodiment of the present invention. For a clear description of the interaction method provided by the first embodiment of the present invention, please refer to fig. 1 to fig. 4.
The interaction method provided by the first embodiment of the invention comprises the following steps:
and S11, acquiring operation information for the preset operation area.
In one embodiment, before the operation information for the preset operation region is acquired in step S11, the method may include, but is not limited to: display instructions and/or status information are obtained. And displaying the preset operation area according to the display instruction and/or the state information.
In an embodiment, in the step of obtaining the display instruction and/or the status information, the step of obtaining the display instruction may include, but is not limited to: verifying whether the control information aiming at the preset control object meets the preset control condition. And if the control information aiming at the preset control object meets the preset control condition, triggering a display instruction.
In an embodiment, in the step of verifying whether the manipulation information for the preset manipulation object satisfies the preset manipulation condition, the preset manipulation object may include, but is not limited to: the terminal comprises at least one of an interactive switch control (a virtual control), a terminal preset screen area, a terminal preset physical key, a terminal preset fingerprint sensor and the like.
In an embodiment, in the step of verifying whether the manipulation information for the preset manipulation object satisfies a preset manipulation condition, the preset manipulation condition may include, but is not limited to, at least one of the following conditions: the interactive switch control receives a first preset operation instruction. And the terminal screen preset area receives a second preset operation instruction. And the terminal preset physical key receives a third preset operation instruction. And the terminal preset fingerprint sensor receives a fourth preset operation instruction.
In an embodiment, the first preset operation instruction, the second preset operation instruction, the third preset operation instruction, and the fourth preset operation instruction may be the same or different.
In an embodiment, the first preset operation instruction, the second preset operation instruction, the third preset operation instruction, and the fourth preset operation instruction may include, but are not limited to, at least one of the following manipulation information: touch information including at least one of a single click, a double click, a long press, a swipe, and the like. And/or, contact information including at least one of a contact area, a contact position, a contact time length, a contact operation direction (e.g., left operation, right operation, up operation, down operation, etc.), and the like.
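The touch and contact information enumerated above can be modeled as a small record, against which preset conditions are checked. The sketch below is a minimal illustration: the field names (`touch_type`, `contact_area_mm2`, etc.) and the concrete threshold in `triggers_display` are assumptions for this example, not definitions from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class OperationInfo:
    """Hypothetical container for the operation information described above."""
    touch_type: str                        # "click", "double_click", "long_press", "swipe"
    contact_area_mm2: float = 0.0          # contact area
    contact_pos: Tuple[int, int] = (0, 0)  # contact position (x, y)
    duration_ms: int = 0                   # contact time length
    direction: Optional[str] = None        # "left", "right", "up", "down"

def triggers_display(op: OperationInfo, min_area_mm2: float = 80.0) -> bool:
    """Trigger the display instruction when, e.g., the gesture is a double
    click or the contact area exceeds a preset value (two-finger press)."""
    return op.touch_type == "double_click" or op.contact_area_mm2 > min_area_mm2
```

Any of the enumerated conditions (duration, position, direction) could be tested the same way; the point is only that the instruction fires when the recorded operation information satisfies a preset predicate.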
In other embodiments, the display instructions may also be, but are not limited to being, retrieved from voice information of the user.
In one embodiment, in the step of obtaining the display instruction and/or the status information, the status information may include, but is not limited to: at least one of user status information, terminal status information, environment status information, and the like.
In an embodiment, the user status information may include, but is not limited to: at least one of human-eye information, user speed information, and information on whether the user is holding the terminal. The user speed information may include the movement speed of the user (it may be acquired by the terminal itself, or acquired and sent by a wearable terminal). The holding information may be acquired by the terminal itself, or detected by another terminal and sent to the terminal.
In an embodiment, the terminal state information may include, but is not limited to: at least one of running APP information (e.g., music player, etc.), time information, location information, speed information, terminal posture information, and the like.
In one embodiment, the environmental status information includes: at least one of brightness information and noise information.
In an embodiment, in the step of displaying the preset operation region according to the display instruction and/or the state information, the displayed preset operation region may take, but is not limited to, the following display forms: a perfect circle, an ellipse, a square, a rectangle, a drag bar, and the like.
In an embodiment, when the interactive switch control receives the first preset operation instruction and triggers the display instruction, displaying the preset operation area according to the display instruction may include, but is not limited to: enlarging the interactive switch control according to the display instruction to form the preset operation area. For example, if the touch information included in the first preset operation instruction is a double click or a hard press, the display instruction is triggered by that touch information (that is, when the user double-clicks or hard-presses the interactive switch control), and the interactive switch control is enlarged accordingly to form the preset operation area. As another example, if the touch information is a slide and the slide satisfies a preset sliding rule (for example, the user selects the interactive switch control and then performs any sliding operation), the display instruction is triggered, and the control is enlarged according to the display instruction to form the preset operation area. As another example, the display instruction is triggered when the contact area is larger than a preset area value (for example, when two fingers press the position of the interactive switch control simultaneously), and the control is enlarged to form the preset operation area. As another example, when the contact position matches a preset contact position (for example, the user clicks the lower-right corner of the interactive switch control), the control is enlarged to form the preset operation area.
In an embodiment, in the step that the interactive switch control receives the first preset operation instruction to trigger the display instruction, so as to display the preset operation area according to the display instruction, the method may include, but is not limited to: and popping out a preset operation area in a display interface corresponding to the interactive switch control according to the display instruction.
In an embodiment, in the step that the interactive switch control receives the first preset operation instruction to trigger the display instruction, so as to display the preset operation area according to the display instruction, the method may include, but is not limited to: and jumping to an interactive control interface according to the display instruction, wherein the interactive control interface comprises a preset operation area.
In an embodiment, in the step of receiving the first preset operation instruction by the interactive switch control to trigger the display instruction, the method may include, but is not limited to: and receiving a first preset operation instruction at the interactive switch control. And triggering a display instruction according to the operation information in the first preset operation instruction. For example, a display instruction is triggered according to touch information (e.g., at least one of single click, double click, re-press, long-press, and sliding) included in the first preset operation instruction. For another example, the first preset operation instruction includes contact information (the contact information includes at least one of a contact area, a contact position, and a contact duration), and the display instruction is triggered when the contact information meets the preset contact condition. For another example, the first preset operation instruction includes an operation direction, and the display instruction is triggered when the operation direction (e.g., a sliding direction of the sliding interactive switch control) matches the preset direction.
In an embodiment, the display instruction is triggered according to touch information included in the first preset operation instruction, where the touch information may include, but is not limited to, at least one of a single click, a double click, a long press, a slide, and the like. For example, when the user performs a sliding operation on the interactive switch control, the first preset operation instruction that the touch information is sliding can be acquired, and when the sliding meets a preset sliding rule, a display instruction is triggered.
In an embodiment, for specific embodiments that the preset area of the terminal screen receives the second preset operation instruction, the preset physical key of the terminal receives the third preset operation instruction, and the preset fingerprint sensor of the terminal receives the fourth preset operation instruction, the specific embodiment that the interactive switch control receives the first preset operation instruction may be referred to above, and will not be described herein again.
In an embodiment, the step of displaying the preset operation region according to the display instruction and/or the status information may include, but is not limited to: displaying the preset operation area when the state information meets the preset rule, where the preset rule includes at least one of the following: the user is watching the terminal screen; the user is holding the terminal; the user's speed is within a preset allowed speed range; the terminal time is within a preset allowed time period (e.g., daytime); the terminal location is within a preset allowed location range (e.g., at home, in the bathroom); the terminal speed is within a preset speed range; the terminal is in a held posture; the ambient brightness value is greater than or equal to a preset brightness threshold; and the ambient noise value is less than or equal to a preset noise threshold.
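A check of state information against such preset display rules can be sketched as below. This is an illustrative sketch only: the dictionary keys and the brightness/noise thresholds are assumptions for the example, and a real terminal would read these values from its sensors.

```python
def state_allows_display(state: dict,
                         min_brightness: float = 5.0,
                         max_noise_db: float = 70.0) -> bool:
    """Return True when the state information satisfies the preset display
    rules: user watching the screen, terminal held, brightness above a
    preset threshold, and noise below a preset threshold."""
    return (state.get("user_watching_screen", False)
            and state.get("terminal_held", False)
            and state.get("brightness", 0.0) >= min_brightness
            and state.get("noise_db", 0.0) <= max_noise_db)
```

When the function returns True the preset operation area is displayed; otherwise it is not.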
In an embodiment, after the step of acquiring the display instruction and the state information, in the step of displaying the preset operation region according to the display instruction and the state information, the step may include, but is not limited to: and when the state information accords with the preset display rule, displaying the preset operation area. And/or when the state information does not accord with the preset display rule, the preset operation area is not displayed.
In one embodiment, the display rules are preset, including at least one of the following rules: the method comprises the steps that a user watches a terminal screen, the user holds the terminal, the speed of the user is within a preset allowed speed range, the speed of the terminal is within a preset speed range, and the posture of the terminal is in a held state.
In one embodiment, when the status information conforms to the predetermined display rule, the predetermined operation area is displayed. For example, in fig. 2, when the user looks at the terminal screen, the preset operation area is displayed. For example, as shown in fig. 3, when the user holds the terminal, the preset operation area is displayed.
In an embodiment, the preset operation area may include, but is not limited to, at least one of a mobile terminal screen hole-punch area, a mobile terminal preset screen area, and the like. The screen hole-punch area may be the cutout area of the front camera of a hole-punch screen. A hole-punch screen, as the name suggests, has a small hole cut through the display; the under-screen camera, light sensor, microphone, and other components are integrated so that only this small hole remains in the screen, preserving a larger display area. It is a product of the trend toward full-screen terminal displays.
In one embodiment, referring to FIG. 2, the punch area A301 of the front camera may be located, for example, in the upper left corner of hole-punch screen A3; it may also be located in the upper right corner of screen A3 (not shown).
In one embodiment, in step S11, the operation information for the preset operation area is obtained; the operation information may follow the specific embodiments of the manipulation information described above, for example, clicking the preset operation area, long-pressing the preset operation area, sliding downward within the preset operation area, or clicking the preset operation area with two fingers.
In one embodiment, when the operation information for the operation area is a slide and the operation area includes the hole-punch area of the front camera, the slide may include, but is not limited to, two operation modes: in the first, the finger slides completely through the hole-punch area; in the second, the finger merely contacts the hole-punch area while sliding, i.e., without passing completely through it. The interaction method provided by this embodiment can thus make full use of the hole-punch area to add quick interactive operations, and can judge the validity of the operation information according to whether the user contacts the hole-punch area, improving interaction accuracy.
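The two swipe modes can be distinguished geometrically from the sampled finger trajectory. The sketch below models the punch area as a circle and is purely illustrative; the function name and the three result labels are assumptions for this example.

```python
import math

def classify_swipe(path, hole_center, hole_radius):
    """Classify a swipe relative to the hole-punch area.

    path: list of (x, y) sample points of the finger trajectory.
    Returns "passes_through" if the path enters and then leaves the hole,
    "touches_hole" if it contacts the hole without passing fully through,
    and "misses_hole" (invalid operation) if it never contacts the hole.
    """
    inside = [math.dist(p, hole_center) <= hole_radius for p in path]
    if not any(inside):
        return "misses_hole"            # never contacts the punch area
    if not inside[0] and not inside[-1]:
        return "passes_through"         # entered and then left the hole
    return "touches_hole"               # contacted but did not pass through
```

A swipe classified as "misses_hole" would be rejected as invalid operation information, which is how contact with the punch area can gate interaction accuracy.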
In other embodiments, the step S11 of obtaining the operation information for the preset operation area may include, but is not limited to: the operation information is acquired by the operation acquisition means. The operation acquiring device may include, but is not limited to, a motion sensing device, an image pickup device, and the like. The preset operation area may be, but is not limited to, a detection area of the motion sensing device or the image capturing device. Therefore, the interaction method provided by the embodiment can realize the space-separated interactive operation so as to improve the experience of the user.
And S12, responding to the operation information according to the first preset rule to obtain the target instruction.
In one embodiment, the first preset rule includes: when the operation information meets a preset setting condition, triggering a setting target instruction; or, when the operation information meets a preset starting condition, triggering a starting target instruction; or, when the operation information meets a preset control condition, triggering a control target instruction.
In an embodiment, specifically, the interaction method provided in this embodiment can match the target instruction according to the operation information for the preset operation area.
In one embodiment, the first preset rule may be, but is not limited to, preset and stored by the user according to the use habit thereof. The first preset rule can enable the operation information and the target instruction to form a mapping relation.
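The mapping relation between operation information and target instructions described above can be sketched as a simple lookup table. The gestures and instruction names below are illustrative assumptions (a real rule would be stored from the user's own settings):

```python
# Hypothetical first preset rule: operation information -> target instruction.
FIRST_PRESET_RULE = {
    "long_press": "set_target",      # preset setting condition
    "double_click": "start_target",  # preset starting condition
    "swipe_down": "control_target",  # preset control condition
}

def respond(operation: str):
    """Return the target instruction triggered by the operation, or None
    when the operation matches no preset condition."""
    return FIRST_PRESET_RULE.get(operation)
```

Because the rule is plain data, it can be edited by the user to match their operating habits and persisted per user.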
In one embodiment, before responding to the operation information according to the first preset rule to obtain the target instruction in step S12, the method may include, but is not limited to: acquiring biometric information of the user; and, when it is determined that no first preset rule corresponding to the user's biometric information is stored, sending a first-preset-rule acquisition request to a server to acquire and store the first preset rule sent by the server, wherein the request may include, but is not limited to, the user's biometric information. Thus the interaction method provided by this embodiment spares the user from having to reset the first preset rule after replacing the mobile phone.
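The retrieval flow above can be sketched as follows. `fetch_rule` stands in for the network request to the server, and the dictionary-based local store is an assumption for illustration:

```python
def get_first_preset_rule(biometric_id, local_store, fetch_rule):
    """Return the first preset rule for this user, fetching it from the
    server (via fetch_rule) and caching it locally when no stored copy
    corresponds to the user's biometric information."""
    rule = local_store.get(biometric_id)
    if rule is None:
        rule = fetch_rule(biometric_id)   # request keyed by biometric info
        local_store[biometric_id] = rule  # store so later calls stay local
    return rule
```

After one fetch the rule is served from the local store, which is what lets a user keep their configured gestures when moving to a new phone.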
And S13, controlling the target object according to a second preset rule based on the target instruction.
In one embodiment, the second preset rule includes: the target object includes at least one of a foreground object, a background object, and a preset object; and/or, if the target instruction is a setting target instruction, acquiring and storing foreground object information of the mobile terminal and the corresponding operation information, and/or background object information and the corresponding operation information.
In one embodiment, the foreground object information and/or the background object information includes: a start entry of the foreground object and/or the background object; and/or a start entry of a function of the foreground object and/or the background object.
In one embodiment, the second preset rule includes: if the target instruction is a starting target instruction, matching the operation information against the operation information corresponding to each target object; if the matching succeeds, starting and/or displaying the corresponding target object; or, if the matching fails, outputting prompt information.
In one embodiment, the second preset rule includes: if the target instruction is a control target instruction, searching for the corresponding control instruction according to the operation information and outputting it, so that the target object responds to the control instruction.
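A possible shape for this second preset rule is a dispatch on the target instruction: start a matched object, or look up a control instruction. The object and instruction tables below are illustrative assumptions:

```python
# Hypothetical sketch of a "second preset rule": dispatch on the target
# instruction to start a matched object or look up a control instruction.

object_bindings = {"double_click": "wechat"}    # operation -> target object
control_bindings = {"swipe_up": "volume_up"}    # operation -> control instruction

def apply_second_preset_rule(target_instruction, operation):
    if target_instruction == "start_target":
        obj = object_bindings.get(operation)
        if obj is not None:
            return ("start", obj)                # start and/or display the object
        return ("prompt", "no matching target")  # matching failed: output a prompt
    if target_instruction == "control_target":
        ctrl = control_bindings.get(operation)
        return ("control", ctrl)                 # output the control instruction
    return ("noop", None)

assert apply_second_preset_rule("start_target", "double_click") == ("start", "wechat")
assert apply_second_preset_rule("start_target", "triple_click")[0] == "prompt"
assert apply_second_preset_rule("control_target", "swipe_up") == ("control", "volume_up")
```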
In one embodiment, before step S13 of controlling the target object according to the second preset rule based on the target instruction, the method may include, but is not limited to: acquiring the target object corresponding to the current display interface, and acquiring the second preset rule corresponding to that target object. For example, if the target object corresponding to the current display interface is a system application and the target instruction is a starting target instruction, screen locking is controlled according to the second preset rule based on the target instruction. For another example, when the operation information is a double click, the obtained target instruction is a starting target instruction, and the WeChat application is started according to the second preset rule based on the target instruction.
In an embodiment, the target instruction obtained according to the operation information is a control target instruction, and in step S13, controlling the target object according to the second preset rule based on the target instruction may include, but is not limited to: controlling the target object according to the second preset rule based on the control target instruction, so that the target object sends an Internet-of-Things control instruction to another device. The target object may be an Internet-of-Things control application. Therefore, the interaction method provided by this embodiment can implement Internet-of-Things control, for example, of an air conditioner, a refrigerator, a lighting lamp, a vehicle, and the like.
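A sketch of forwarding a control target instruction to an IoT device through such a control application. The device names and the send function are illustrative assumptions:

```python
# Hypothetical sketch of forwarding a control target instruction to an
# Internet-of-Things device via a control application. Names are illustrative.

sent_messages = []

def send_iot_command(device, command):
    """Stand-in for the IoT control application's transport layer."""
    sent_messages.append((device, command))

def control_iot(operation):
    # operation information -> (device, command), per the second preset rule
    bindings = {"swipe_up": ("air_conditioner", "power_on"),
                "swipe_down": ("lighting_lamp", "power_off")}
    target = bindings.get(operation)
    if target is not None:
        send_iot_command(*target)
    return target

assert control_iot("swipe_up") == ("air_conditioner", "power_on")
assert sent_messages == [("air_conditioner", "power_on")]
```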
The interaction method provided by the first embodiment of the invention comprises the following steps: S11, acquiring operation information for a preset operation area; S12, responding to the operation information according to a first preset rule to obtain a target instruction; and S13, controlling the target object according to a second preset rule based on the target instruction. Therefore, when the interaction method provided by the first embodiment of the present invention obtains operation information for the preset operation area, it can obtain a target instruction according to the operation information and the first preset rule, select a target object according to the target instruction and the second preset rule, and control that target object, thereby implementing human-computer interaction. The interaction method provided by the first embodiment of the present invention thus offers a higher degree of freedom of human-computer interaction and richer control functions, improving the user experience.
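Chaining the three steps together, under the same illustrative assumptions as the earlier sketches, gives a compact end-to-end picture of the method:

```python
# Hypothetical end-to-end sketch of steps S11-S13: acquire operation
# information, map it to a target instruction (first preset rule), then act
# on the target object (second preset rule). All tables are illustrative.

first_rule = {"double_click": "start_target"}                           # S12 mapping
second_rule = {("start_target", "double_click"): ("start", "wechat")}   # S13 mapping

def interact(operation):
    instruction = first_rule.get(operation)           # S12: operation -> instruction
    if instruction is None:
        return None
    return second_rule.get((instruction, operation))  # S13: instruction -> action

assert interact("double_click") == ("start", "wechat")
assert interact("long_press") is None  # no binding, nothing happens
```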
Second embodiment:
fig. 5 is a schematic structural diagram of an interaction device according to a second embodiment of the present invention. For a clear description of the interaction device provided in the second embodiment of the present invention, please refer to fig. 5.
An interaction apparatus provided in a second embodiment of the present invention includes: a display module B1, an operation receiving module B2 and a control module B3.
The display module B1 is configured to display a preset operation area and/or a target object.
In an embodiment, the preset operation area may include, but is not limited to, at least one of a punch-hole area of the screen and a preset screen area.
In an embodiment, the target object may include, but is not limited to: at least one of a foreground object, a background object, and a preset object; a start entry of the foreground object and/or the background object; and/or a start entry of a function of the foreground object and/or the background object.
In one embodiment, the display module B1 includes an acquisition unit and a display unit. The acquisition unit is used for acquiring the display instruction and/or the state information. And the display unit is used for displaying the preset operation area according to the display instruction and/or the state information.
In one embodiment, the obtaining unit may include a verification subunit and a trigger subunit. The verification subunit is configured to verify whether the manipulation information for a preset manipulation object satisfies a preset manipulation condition. The trigger subunit is configured to trigger the display instruction if the manipulation information for the preset manipulation object satisfies the preset manipulation condition.
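The two subunits can be sketched as a simple gate: the display instruction fires only when the manipulation satisfies the preset condition. The condition and instruction names are hypothetical:

```python
# Hypothetical sketch of the verification and trigger subunits: a display
# instruction is emitted only when the manipulation of a preset object
# (e.g. a long press on the punch-hole area) meets a preset condition.

def verify(manipulation, preset_condition):
    """Verification subunit: check the manipulation against the condition."""
    return manipulation == preset_condition

def maybe_trigger_display(manipulation, preset_condition="long_press"):
    """Trigger subunit: emit a display instruction when verification passes."""
    if verify(manipulation, preset_condition):
        return "display_preset_operation_area"
    return None

assert maybe_trigger_display("long_press") == "display_preset_operation_area"
assert maybe_trigger_display("single_click") is None
```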
The operation receiving module B2 is configured to receive operation information of a preset operation area.
In one embodiment, the operation information may include, but is not limited to: touch information, including at least one of a single click, a double click, a long press, a swipe, and the like; and/or contact information, including at least one of a contact area, a contact position, a contact duration, a contact operation direction (e.g., leftward, rightward, upward, or downward operation), and the like.
In other embodiments, the operation receiving module B2 may also be, but is not limited to, used for acquiring operation information through an operation acquiring device. The operation acquiring device may include, but is not limited to, a motion sensing device, an image capturing device, and the like. The preset operation area may be, but is not limited to, the detection area of the motion sensing device or the image capturing device. Therefore, the interaction device provided by this embodiment can implement touchless (air-gesture) interaction, thereby improving the user experience.
The control module B3 is configured to control the target object according to the operation information.
In one embodiment, controlling the target object according to the operation information includes triggering a target instruction through the operation information, where the target instruction includes at least one of a setting instruction (or setting target instruction), a starting instruction (or starting target instruction), and a control instruction (or control target instruction). If the target instruction is the setting target instruction, foreground object information and corresponding operation information, and/or background object information and corresponding operation information, are acquired and stored. Or, if the target instruction is the starting target instruction, the operation information is matched against the operation information corresponding to the target object; if the matching succeeds, the corresponding target object is started and/or displayed; if the matching fails, prompt information is output. Or, if the target instruction is the control target instruction, the corresponding control instruction is looked up according to the operation information and output.
In one embodiment, the control module B3 may include, but is not limited to, an instruction obtaining unit and a control unit. The instruction obtaining unit may be configured to respond to the operation information according to the first preset rule to obtain the target instruction. The control unit may be configured to control the target object according to the second preset rule based on the target instruction.
In an embodiment, for the specific implementation and advantageous effects of the interaction device provided in this embodiment, reference may be made to the interaction method provided in the first embodiment of the present invention; details are not repeated here.
An interaction apparatus provided in a second embodiment of the present invention includes: a display module B1, an operation receiving module B2 and a control module B3. The display module B1 is configured to display a preset operation area and/or a target object. The operation receiving module B2 is configured to receive operation information of the preset operation area. The control module B3 is configured to control the target object according to the operation information. Therefore, when the interaction device provided by the second embodiment of the present invention obtains operation information for the preset operation area, it can obtain a target instruction according to the operation information and the first preset rule, select a target object according to the target instruction and the second preset rule, and control that target object, thereby implementing human-computer interaction. The interaction device provided by the second embodiment of the present invention thus offers a higher degree of freedom of human-computer interaction and richer control functions, improving the user experience.
The third embodiment:
fig. 6 is a schematic structural diagram of a mobile terminal according to a third embodiment of the present invention. For a clear description of the mobile terminal 1 according to the third embodiment of the present invention, please refer to fig. 6.
A mobile terminal 1 according to a third embodiment of the present invention includes: a processor A101 and a memory A201, wherein the processor A101 is configured to execute the computer program A6 stored in the memory A201 to implement the steps of the interaction method described in the first embodiment.
In an embodiment, the mobile terminal 1 provided in this embodiment may include at least one processor A101 and at least one memory A201. The at least one processor A101 may be referred to as a processing unit A1, and the at least one memory A201 may be referred to as a storage unit A2. Specifically, the storage unit A2 stores a computer program A6, and when the computer program A6 is executed by the processing unit A1, the mobile terminal 1 provided by this embodiment implements the steps of the interaction method described above, for example, step S11 shown in fig. 1: acquiring operation information for a preset operation area; step S12: responding to the operation information according to a first preset rule to obtain a target instruction; and step S13: controlling the target object according to a second preset rule based on the target instruction.
In an embodiment, the mobile terminal 1 provided in this embodiment may include a plurality of memories A201 (collectively referred to as the storage unit A2), and the storage unit A2 may include, for example, a random access memory (RAM), a cache memory, and/or a read-only memory (ROM).
In one embodiment, the mobile terminal 1 further includes a bus connecting the various components (e.g., the processor A101 and the memory A201, the punch-hole screen A3, etc.).
In one embodiment, the mobile terminal 1 in this embodiment may further include a communication interface (e.g., I/O interface a4), which may be used for communication with an external device.
In an embodiment, the mobile terminal 1 provided in this embodiment may further include a communication device A5.
The mobile terminal 1 provided by the third embodiment of the present invention includes a processor A101 and a memory A201, and the processor A101 is configured to execute the computer program A6 stored in the memory A201 to implement the steps of the interaction method described in the first embodiment. Therefore, the mobile terminal 1 provided by this embodiment offers a higher degree of freedom of human-computer interaction and richer control functions, improving the user experience.
The third embodiment of the present invention also provides a computer-readable storage medium storing a computer program A6. When executed by the processor A101, the computer program A6 implements the steps of the interaction method in the first embodiment, for example, steps S11 to S13 shown in fig. 1.
In an embodiment, the computer-readable storage medium provided by this embodiment may include any entity or device capable of carrying computer program code, or a recording medium, such as a ROM, a RAM, a magnetic disk, an optical disc, a flash memory, and the like.
When the processor A101 executes the computer program A6 stored in the computer-readable storage medium provided by the third embodiment of the present invention, a higher degree of freedom of human-computer interaction and richer control functions can be achieved, thereby improving the user experience.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
As used herein, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, including not only those elements listed, but also other elements not expressly listed.
The present invention is not limited to the above preferred embodiments, and any modification, equivalent replacement or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (12)

1. An interaction method, applied to a mobile terminal, wherein the interaction method comprises the following steps:
S11, acquiring operation information for a preset operation area;
S12, responding to the operation information according to a first preset rule to obtain a target instruction;
and S13, controlling the target object according to a second preset rule based on the target instruction.
2. The interaction method according to claim 1, wherein before step S11, the method further comprises:
acquiring a display instruction and/or state information;
displaying the preset operation area according to the display instruction and/or the state information;
the preset operation area comprises at least one of a punch-hole area of a screen of the mobile terminal and a preset screen area of the mobile terminal.
3. The interaction method according to claim 1, wherein the first preset rule comprises:
when the operation information meets a preset setting condition, triggering a setting target instruction; or,
when the operation information meets a preset starting condition, triggering a starting target instruction; or,
when the operation information meets a preset control condition, triggering a control target instruction.
4. The interaction method according to claim 3, wherein the second preset rule comprises:
the target object comprises: at least one of a foreground object, a background object and a preset object; and/or,
if the target instruction is the setting target instruction, acquiring and storing foreground object information of the mobile terminal and corresponding operation information, and/or background object information and corresponding operation information.
5. The interaction method according to claim 4, wherein the foreground object information and/or the background object information comprises:
a start entry of the foreground object and/or the background object; and/or,
a start entry of a function of the foreground object and/or the background object.
6. The interaction method according to claim 4 or 5, wherein the second preset rule comprises:
if the target instruction is the starting target instruction, matching the operation information against operation information corresponding to a target object;
if the matching is successful, starting the corresponding target object and/or displaying the corresponding target object; or,
if the matching is unsuccessful, outputting prompt information.
7. The interaction method according to claim 3, wherein the second preset rule comprises:
if the target instruction is the control target instruction, searching for the corresponding control instruction according to the operation information and outputting the control instruction.
8. An interactive apparatus, comprising:
the display module is used for displaying a preset operation area and/or a target object;
the operation receiving module is used for receiving the operation information of the preset operation area;
and the control module is used for controlling the target object according to the operation information.
9. The interaction device of claim 8,
the preset operation area comprises at least one of a punch-hole screen area and a preset screen area; and/or,
the target object comprises: at least one of a foreground object, a background object and a preset object; a start entry of the foreground object and/or the background object; and/or a start entry of a function of the foreground object and/or the background object.
10. The interaction device of claim 9,
controlling the target object according to the operation information comprises triggering a target instruction through the operation information, wherein the target instruction comprises at least one of a setting instruction, a starting instruction and a control instruction;
if the target instruction is the setting instruction, acquiring and storing foreground object information and corresponding operation information, and/or background object information and corresponding operation information; or,
if the target instruction is the starting instruction, matching the operation information against operation information corresponding to a target object; if the matching is successful, starting the corresponding target object and/or displaying the corresponding target object; or, if the matching is unsuccessful, outputting prompt information; or,
if the target instruction is the control instruction, searching for the corresponding control instruction according to the operation information and outputting the control instruction.
11. A mobile terminal, comprising a memory, a processor, and an interactive control program stored on the memory and executable on the processor, wherein the steps of the interaction method are implemented when the interactive control program is executed by the processor.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the interaction method according to any one of claims 1 to 7.
CN201911015509.2A 2019-10-24 2019-10-24 Interaction method, interaction device, mobile terminal and computer readable storage medium Pending CN110908556A (en)
