CN116048366A - Control method and electronic equipment - Google Patents

Control method and electronic equipment

Info

Publication number
CN116048366A
Authority
CN
China
Prior art keywords: parameter, input, parameter value, mode, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310108944.XA
Other languages
Chinese (zh)
Inventor
叶昊翔
康源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN202310108944.XA
Publication of CN116048366A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0489 Interaction techniques based on graphical user interfaces [GUI] using dedicated keyboard keys or combinations thereof
    • G06F3/04892 Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application discloses a control method and an electronic device. The control method includes: obtaining a first input instruction; adjusting a parameter value of a first parameter in response to the first input instruction, where the first parameter corresponds to a first perception mode; and determining a target instruction, where the target instruction is used to represent the parameter-value adjustment result of the first parameter through a second perception mode; the second perception mode is a haptic mode, and the first perception mode and the second perception mode are different.

Description

Control method and electronic equipment
Technical Field
The present disclosure relates to the field of input and output technologies, and in particular, to a control method and an electronic device.
Background
At present, when a parameter of a device is adjusted, the adjustment result is usually indicated in a single, parameter-specific manner. This makes the prompting of parameter adjustments monotonous and degrades the user experience of the device.
Disclosure of Invention
In view of this, the present application provides a control method and an electronic device, as follows:
A control method, comprising:
obtaining a first input instruction;
adjusting a parameter value of a first parameter in response to the first input instruction, where the first parameter corresponds to a first perception mode; and
determining a target instruction, where the target instruction is used to represent the parameter-value adjustment result of the first parameter through a second perception mode;
where the second perception mode is a haptic mode, and the first perception mode and the second perception mode are different.
Preferably, the above method is applied to an electronic device, and after determining the target instruction, the method further comprises:
executing the target instruction so that a target device in the electronic device represents the parameter-value adjustment result of the first parameter through the second perception mode;
or
transmitting the target instruction to an interaction device connected to the electronic device, so that a target device in the interaction device represents the parameter-value adjustment result of the first parameter through the second perception mode.
In the above method, preferably, the first parameter is an output parameter of an output device, the output device implements the first perception mode, and the output device is in a non-output state;
the second perception mode represents the parameter-value adjustment result that the first parameter takes when the output device is in an output state.
In the above method, preferably, the first parameter is the display position of an input identifier, the input identifier being used at least to indicate an input position;
where determining the target instruction includes:
determining the target instruction if the display relation between the display position of the input identifier and a control object meets a first condition, where the second perception mode in the target instruction represents the display relation between the display position of the input identifier and the control object; the control object triggers a corresponding function when a second input instruction is received.
Preferably, the above method further comprises:
obtaining target information, where the target information includes a display parameter of the control object or the display relation between the display position of the input identifier and the control object;
and obtaining a parameter value of a second parameter according to the target information, so that the second perception mode represents, through the second parameter, the display relation between the display position of the input identifier and the control object.
In the above method, preferably, the target instruction is further used to represent the parameter-value adjustment result of the first parameter through a third perception mode, where the third perception mode is different from the second perception mode; the third perception mode is a visual mode.
In the above method, preferably, the third perception mode represents the parameter-value adjustment result of the first parameter with a progress bar;
the second perception mode represents the parameter-value adjustment result of the first parameter with a second parameter;
where the parameter value of the second parameter matches the adjusted parameter value of the first parameter;
or
the adjustment trend of the parameter value of the second parameter matches the adjustment trend of the parameter-value adjustment result of the first parameter.
An electronic device, comprising:
a memory for storing a computer program and data generated by running the computer program;
a processor for executing the computer program to implement: obtaining a first input instruction; adjusting a parameter value of a first parameter in response to the first input instruction, where the first parameter corresponds to a first perception mode; and determining a target instruction, where the target instruction is used to represent the parameter-value adjustment result of the first parameter through a second perception mode; the second perception mode is a haptic mode, and the first perception mode and the second perception mode are different.
Preferably, in the above electronic device:
the electronic device further includes a first target device, and the first target device responds to the target instruction and represents the parameter-value adjustment result of the first parameter through the second perception mode;
or
the electronic device is connected to an interaction device that includes a second target device; the second target device receives the target instruction transmitted by the electronic device and, in response to the target instruction, represents the parameter-value adjustment result of the first parameter through the second perception mode.
Preferably, in the above electronic device:
the electronic device further includes an input device having an input area, the input area being used to receive touch input operations;
the input area is further used to respond to the target instruction and represent the parameter-value adjustment result of the first parameter through the second perception mode.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the present application, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a flowchart of a control method according to a first embodiment of the present application;
FIG. 2 is an exemplary diagram of changing the position of an input cursor by dragging a mouse;
FIG. 3 is an example diagram of changing input positions by sliding on a touch screen;
FIG. 4 is an exemplary diagram of a target device within an electronic device representing the adjusted screen brightness in a vibrotactile manner in an embodiment of the present application;
FIG. 5 is an exemplary diagram of a target device in an electronic device representing, in a vibrotactile manner, the touch input position after a user's finger slides on a touch screen in an embodiment of the present application;
FIG. 6 is an exemplary diagram of a target device within an interaction device connected to an electronic device representing the adjusted screen brightness in a vibrotactile manner in an embodiment of the present application;
FIG. 7 is an exemplary diagram of a target device within an interaction device connected to an electronic device representing, in a vibrotactile manner, the display position of an input cursor after it is moved by a mouse in an embodiment of the present application;
FIG. 8 is another exemplary diagram of a target device within an interaction device connected to an electronic device representing the adjusted screen brightness in a vibrotactile manner in an embodiment of the present application;
FIG. 9 is an exemplary diagram of prompting the user with the adjusted screen brightness through vibration of the mobile phone housing while the mobile phone screen is off in an embodiment of the present application;
FIG. 10 is an exemplary diagram of prompting the user with the adjusted speaker volume through vibration of the mobile phone housing while the mobile phone speaker is turned off in an embodiment of the present application;
FIG. 11 is an exemplary diagram of prompting the user with the adjusted screen brightness through vibration of the mobile phone housing while the mobile phone screen is on in an embodiment of the present application;
FIG. 12 is an exemplary diagram of prompting the user with the adjusted speaker volume through vibration of the mobile phone housing while the mobile phone speaker is turned on in an embodiment of the present application;
FIG. 13 is an exemplary graph of matching vibration amplitude values with screen brightness values in an embodiment of the present application;
FIG. 14 is an exemplary diagram of representing, through a second perception mode such as a vibrotactile mode, that the cursor has moved into the icon area of an application icon in an embodiment of the present application;
FIG. 15 is an exemplary diagram of representing, through a second perception mode such as a vibrotactile mode, that the cursor is moving toward the icon area of an application icon in an embodiment of the present application;
FIG. 16 is another partial flow chart of a control method according to the first embodiment of the present application;
FIG. 17 is an exemplary graph of matching vibration amplitude values to application icon areas in an embodiment of the present application;
FIG. 18 is an exemplary graph of vibration amplitude values matching distance in an embodiment of the present application;
FIG. 19 is an exemplary diagram of representing, by dithering the application icon and the cursor on the screen, that the cursor has moved into the icon area of an application icon in an embodiment of the present application;
FIG. 20 is an exemplary diagram of representing the adjusted screen brightness by the relative position of a progress mark on a progress bar displayed on the screen in an embodiment of the present application;
FIG. 21 is an exemplary diagram of representing the adjusted speaker volume by a progress percentage on a progress bar displayed on the screen in an embodiment of the present application;
fig. 22 is a schematic structural diagram of a control device according to a second embodiment of the present application;
fig. 23 is a schematic structural diagram of an electronic device according to a third embodiment of the present application;
fig. 24, 25, 26 and 27 are respectively schematic structural diagrams of an electronic device according to a third embodiment of the present application;
FIG. 28 is a schematic diagram of a user operating a cursor in a touchpad control graphical interface to obtain vibration feedback when the present application is applied to a notebook computer;
FIG. 29 is a schematic view of a user operating a touchpad control slider to obtain vibration feedback when the present application is applied to a notebook computer;
FIG. 30 is a schematic diagram of a user operating a mouse control cursor to obtain vibration feedback when the present application is applied to a notebook computer;
fig. 31 is a schematic diagram of a user operating a mouse control slider to obtain vibration feedback when the present application is applied to a notebook computer.
Detailed Description
The following describes the technical solutions in the embodiments of the present application clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without inventive effort fall within the protection scope of the present application.
Referring to fig. 1, a flowchart of a control method according to the first embodiment of the present application is shown. The method may be applied to an electronic device capable of data processing, such as a mobile phone, a tablet, or a computer. The technical solution in this embodiment mainly enriches the perception modes through which the parameter adjustment result is represented to the user, so as to reduce the user's reliance on any single perception mode.
Specifically, the method in this embodiment may include the following steps:
step 101: a first input instruction is obtained.
The first input instruction is an instruction generated in response to a received first input operation and is used to instruct the adjustment of the parameter value of the first parameter.
For example, as shown in fig. 2, when a user drags a mouse, the drag operation generates an input instruction indicating a change in the display position of an input cursor on the display screen.
As another example, as shown in fig. 3, when the user slides on a touch screen, the sliding operation generates an input instruction indicating a change in the input position.
Step 102: in response to the first input instruction, a parameter value of the first parameter is adjusted. The first parameter corresponds to a first perception mode.
The first perception mode may be, for example, a visual mode or an auditory mode.
In one implementation, the first parameter may be the display position of an input identifier, where the input identifier is used at least to indicate the input position. The parameter-value adjustment result of the first parameter is then the adjusted display position of the input identifier, which indicates the adjusted input position. On this basis, the parameter-value adjustment result of the first parameter is represented through the first perception mode in this embodiment.
For example, taking the first parameter as the display position of the input cursor corresponding to a mouse, the adjusted display position of the input cursor is represented visually in this embodiment.
In another implementation, the first parameter may be an output parameter of an output device, where the output device implements the first perception mode. That is, the parameter-value adjustment result of the first parameter is the adjustment result of the output parameter of the output device, and the output device represents that result through the first perception mode, such as a visual or auditory mode.
For example, taking the first parameter as a screen brightness parameter, the screen visually represents its adjusted brightness;
for another example, taking the first parameter as a speaker volume parameter, the speaker audibly represents its adjusted volume.
Step 103: and determining a target instruction, wherein the target instruction is used for representing a parameter value adjustment result of the first parameter through a second perception mode.
The second sensing mode is a touch mode, and the first sensing mode is different from the second sensing mode.
That is, in this embodiment, the parameter value adjustment result of the first parameter is represented by a haptic mode different from the first perception mode while the parameter value adjustment result of the first parameter is represented by the first perception mode.
Specifically, the second sensing mode may be a vibration mode, a heat transfer mode, an electric shock mode, or the like.
For example, taking a first parameter as an example of a display position of an input cursor corresponding to a mouse, in this embodiment, the display position of the input cursor after being adjusted is visually represented, and the display position of the input cursor after being adjusted is prompted for a user in a vibration manner on the mouse;
for another example, taking the first parameter as a screen brightness parameter, in this embodiment, the brightness of the adjusted screen is visually represented on the screen, and the brightness of the adjusted screen is prompted to the user in a vibration manner on the screen;
For another example, taking the first parameter as a speaker volume parameter, the speaker audibly characterizes the adjusted volume of the speaker, and prompts the user for the adjusted volume of the speaker in an electric shock manner on the mouse.
Based on this, in this embodiment, the user is prompted with the parameter value adjustment result of the first parameter in different sensing manners.
As can be seen from the above, in the control method provided in the first embodiment of the present application, while the parameter value of the first parameter is adjusted in response to the obtained first input instruction and represented through the first perception mode, a target instruction is determined so that the parameter-value adjustment result of the first parameter is also represented through a second perception mode, which is a haptic mode different from the first perception mode. In this way, the adjustment result can be conveyed to the user through perception modes other than the one corresponding to the adjusted parameter, in particular through a haptic mode, which enriches the ways the adjustment result can be perceived and improves the user experience.
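To make steps 101 to 103 concrete, the following minimal Python sketch models the flow under our own assumptions; the names (HapticChannel, TargetInstruction, handle_first_input) and the normalized [0, 1] value range are illustrative and do not appear in the patent.

```python
from dataclasses import dataclass
from enum import Enum, auto

class HapticChannel(Enum):
    VIBRATION = auto()
    HEAT_TRANSFER = auto()
    ELECTRO_TACTILE = auto()

@dataclass
class TargetInstruction:
    parameter: str          # which first parameter was adjusted
    new_value: float        # adjusted parameter value, normalized to [0, 1]
    channel: HapticChannel  # second perception mode used to represent it

def handle_first_input(parameter: str, current: float, delta: float) -> TargetInstruction:
    # Step 102: adjust the first parameter (e.g. screen brightness, which
    # the user normally perceives visually, the first perception mode).
    new_value = max(0.0, min(1.0, current + delta))
    # Step 103: determine a target instruction that represents the
    # adjustment result through the second, haptic perception mode.
    return TargetInstruction(parameter, new_value, HapticChannel.VIBRATION)

print(handle_first_input("screen_brightness", 0.4, 0.2))
```

The haptic channel is fixed to vibration for brevity; any of the haptic modes listed above could be selected instead.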
In one implementation, after the target instruction is determined in step 103, it may be executed on the electronic device to which the method of this embodiment is applied, so that a target device in the electronic device represents the parameter-value adjustment result of the first parameter through the second perception mode.
The target device in the electronic device is a device capable of implementing the second perception mode; specifically, a device that can be in contact with the user and that has a vibration structure.
In one possible implementation, the target device can obtain an input operation and determine an input instruction from it, such as a touch pad; in another, the target device can output content for the user, such as a display screen; in another, the target device can both obtain input operations and output content for the user, such as a touch screen. That is, the target device can interact with the user to obtain the user's input operations and can also implement the second perception mode.
In another possible implementation, the target device can implement the second perception mode but is distinct from the input device that obtains the user's input operations.
For example, taking the first parameter as a screen brightness parameter, the target device may be an output device in the electronic device, such as a touch screen in which a motor structure is configured. The adjusted brightness is represented visually on the touch screen while the electronic device executes a target instruction determined from the input operation received on the touch screen, causing the motor structure in the touch screen to vibrate, as shown in fig. 4, thereby also representing the adjusted brightness in a vibrotactile manner.
For another example, taking the first parameter as the touch input position of the user's finger, the target device is an output device of the electronic device, such as a touch screen in which a motor structure is configured. The touch input position of the user's finger is represented visually on the touch screen while the electronic device executes a target instruction determined from the input operation received on the touch screen, causing the motor structure in the touch screen to vibrate, as shown in fig. 5, thereby representing in a vibrotactile manner the touch input position after the finger slides on the touch screen.
In another implementation, after the target instruction is determined in step 103, it may be transmitted to an interaction device connected to the electronic device, so that a target apparatus in the interaction device represents the parameter-value adjustment result of the first parameter through the second perception mode.
The interaction device may be a device independent of the electronic device that can interact with the user, such as a mouse or a microphone. The interaction device is provided with a target apparatus capable of implementing the second perception mode, i.e. a component that can be in contact with the user and that has a vibration structure, for example a housing that is independent of the handset, fits around it, and contains a motor structure.
In one possible implementation, the interaction device can obtain an input operation and determine an input instruction from it, such as a touch pad or a mouse. In another, the interaction device can be in contact with the user but is distinct from a device that obtains input operations, such as a device housing fitted around the handset.
For example, taking the first parameter as the screen brightness parameter, the target apparatus is a motor structure in an input device independent of the electronic device. The adjusted brightness is represented visually on the touch screen while the electronic device transmits a target instruction, determined from the input operation received on the mouse, to the mouse so that the motor structure within the mouse vibrates, as shown in fig. 6, thereby representing the adjusted brightness in a vibrotactile manner.
For another example, taking the first parameter as the display position of the input cursor corresponding to the mouse, the target apparatus is a motor structure in an input device independent of the electronic device, such as the mouse. The display position of the input cursor after it is moved by the mouse is represented visually on the touch screen while the electronic device transmits a target instruction, determined from the input operation received on the mouse, to the mouse so that the motor structure within the mouse vibrates, as shown in fig. 7, thereby representing the moved display position in a vibrotactile manner.
For another example, taking the first parameter as the screen brightness parameter, the target apparatus is a motor structure in a device housing independent of the electronic device. The adjusted brightness is represented visually on the touch screen while the electronic device transmits a target instruction, determined from the input operation received on the touch screen or the microphone, to the device housing so that the motor structure within the housing vibrates, as shown in fig. 8, thereby representing the adjusted brightness in a vibrotactile manner.
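The two delivery paths just described, executing the target instruction locally versus transmitting it to a connected interaction device, can be sketched as follows; HousingMotor and dispatch are hypothetical stand-ins, since the patent does not define a driver interface.

```python
class HousingMotor:
    """Stand-in for a vibration motor, e.g. inside a mouse or a phone
    housing (hypothetical; not an interface defined by the patent)."""
    def __init__(self, name: str) -> None:
        self.name = name

    def vibrate(self, amplitude: float) -> None:
        print(f"{self.name} vibrating at amplitude {amplitude:.2f}")

def dispatch(amplitude: float, local_motor=None, peripheral_motor=None) -> None:
    # Execute the target instruction on the electronic device itself when
    # it carries a haptic actuator (e.g. a motor under the touch screen);
    # otherwise transmit it to the connected interaction device.
    motor = local_motor if local_motor is not None else peripheral_motor
    if motor is not None:
        motor.vibrate(amplitude)

dispatch(0.6, peripheral_motor=HousingMotor("mouse motor"))
```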
In one implementation, the first parameter is an output parameter of an output device, the output device implements the first perception mode, and the output device is in a non-output state. In that case the second perception mode represents the parameter-value adjustment result that the first parameter takes when the output device is in an output state.
That is, the output device may be in a non-output state while its output parameter is adjusted. Since the output parameter only takes effect when the output device is outputting, this embodiment uses the second perception mode to represent the adjustment result that will apply once the device outputs again, which prevents the user from being unable to perceive the adjustment merely because the output device is currently off.
For example, take the output device as a mobile phone screen. The user turns off the screen before sleeping, so the phone is in a black-screen (non-output) state, and then adjusts the screen brightness by voice control. Because the screen is off, its displayed brightness cannot prompt the user with the adjusted value; with the technical solution of this embodiment, however, the user is prompted with the adjusted screen brightness through vibration of the phone housing, as shown in fig. 9.
For another example, take the output device as a mobile phone speaker. The user mutes the speaker in a meeting, so the speaker is in a turned-off (non-output) state, and then adjusts its volume with the phone's volume keys. Because the speaker is off, no alert tone can prompt the user with the adjusted volume; with the technical solution of this embodiment, however, the user is prompted with the adjusted speaker volume through vibration of the phone housing, as shown in fig. 10.
In another implementation, the first parameter is an output parameter of the output device, the output device implements the first perception mode, and the output device is in an output state. In that case the second perception mode represents the parameter-value adjustment result of the first parameter with the output device outputting.
That is, the output device may also be in an output state while its output parameter is adjusted. The adjustment result can then be prompted to the user both through the first perception mode, such as a visual mode, and through the second perception mode.
For example, take the output device as a mobile phone screen. When the user uses the phone outdoors in strong sunlight with the screen on (output state), the user can adjust the brightness directly by touch, and the displayed brightness does prompt the real-time value; but strong ambient light can make the brightness change hard to distinguish. With the technical solution of this embodiment, the user is additionally prompted with the adjusted brightness through vibration of the phone housing, as shown in fig. 11, so the brightness change is not lost in the glare.
For another example, take the output device as a mobile phone speaker. When the user adjusts the speaker volume with the volume keys in a noisy shopping mall, the alert tone does prompt the real-time volume; but loud ambient noise can make the volume change hard to distinguish. With the technical solution of this embodiment, the user is additionally prompted with the adjusted volume through vibration of the phone housing, as shown in fig. 12, so the adjustment remains perceivable despite the noise.
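A minimal sketch of this idea, assuming a simple output_active flag: the haptic prompt is issued regardless of whether the output device is outputting, so the adjustment result stays perceivable with the screen off or the speaker muted, and remains a useful backup when glare or noise masks the primary output. All names here are illustrative.

```python
class HousingMotor:
    def vibrate(self, amplitude: float) -> None:
        print(f"housing vibrating at amplitude {amplitude:.2f}")

def prompt_adjustment(new_value: float, output_active: bool, motor: HousingMotor) -> None:
    if output_active:
        # First perception mode still applies (the screen shows the new
        # brightness, the speaker plays an alert tone), though glare or
        # ambient noise may mask it.
        print(f"output device presents adjusted value: {new_value:.2f}")
    # Second perception mode: issued in both states, so the result stays
    # perceivable even with the screen off or the speaker muted.
    motor.vibrate(new_value)

prompt_adjustment(0.3, output_active=False, motor=HousingMotor())
```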
In one implementation, the second perception mode represents the parameter-value adjustment result of the first parameter with a second parameter, where the parameter value of the second parameter matches the adjusted parameter value of the first parameter.
Specifically, a correspondence table between parameter values of the first parameter and of the second parameter is preset in this embodiment. The matching parameter value of the second parameter can then be looked up in the table from the adjusted value of the first parameter, and the target instruction is determined from that value of the second parameter, so that the target instruction represents the adjustment result of the first parameter through the value of the second parameter in the second perception mode.
For example, taking the first parameter as a screen brightness parameter, this embodiment looks up in the correspondence table the vibration amplitude matched to the adjusted brightness value. While the adjusted brightness is represented visually on the touch screen, the target instruction drives the motor structure in the mouse to vibrate at the looked-up amplitude, as shown in fig. 13, so the user is prompted with the adjusted brightness by a vibration whose amplitude matches the brightness value.
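A sketch of the preset correspondence table, with made-up brightness/amplitude pairs; the patent specifies only that the two values "match", not a particular table or lookup rule.

```python
# Hypothetical correspondence table between adjusted brightness levels
# (first parameter) and vibration amplitudes (second parameter).
BRIGHTNESS_TO_AMPLITUDE = {0.00: 0.10, 0.25: 0.30, 0.50: 0.50, 0.75: 0.70, 1.00: 0.90}

def matched_amplitude(brightness: float) -> float:
    # Pick the entry whose brightness level is closest to the adjusted
    # value; the embodiment only requires that the two values "match".
    level = min(BRIGHTNESS_TO_AMPLITUDE, key=lambda b: abs(b - brightness))
    return BRIGHTNESS_TO_AMPLITUDE[level]

print(matched_amplitude(0.8))  # 0.8 is closest to 0.75 -> amplitude 0.70
```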
In another implementation, the second parameter in the second perception mode represents the parameter-value adjustment result of the first parameter by trend: the adjustment trend of the second parameter's value matches the adjustment trend of the first parameter's value.
Specifically, in this embodiment, the adjustment trend of the first parameter, i.e. whether its value increased or decreased, is determined by comparing the adjusted value with the value before adjustment. The parameter value of the second parameter is then adjusted accordingly: if the first parameter was increased, the second parameter is increased, and if the first parameter was decreased, the second parameter is decreased, so that the two adjustment trends match. Finally, the target instruction is determined from the adjusted value of the second parameter, so that in the second perception mode the adjustment result of the first parameter is represented by a second parameter whose trend follows that of the first parameter.
For example, taking the first parameter as a screen brightness parameter, in this embodiment the vibration amplitude of the motor structure in the mouse is increased when the screen brightness is increased and decreased when the brightness is decreased. While the adjusted brightness is represented visually on the touch screen, the target instruction drives the motor structure in the mouse to vibrate with an amplitude following the same trend: if the brightness goes up, the vibration amplitude goes up, and if the brightness goes down, the amplitude goes down.
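A sketch of the trend-matching variant, under the assumption that the vibration amplitude moves by a fixed illustrative step in the same direction as the adjusted parameter:

```python
def follow_trend(old_value: float, new_value: float,
                 amplitude: float, step: float = 0.1) -> float:
    # The second parameter tracks the *direction* of the adjustment:
    # brightness up -> amplitude up; brightness down -> amplitude down.
    if new_value > old_value:
        return min(1.0, amplitude + step)
    if new_value < old_value:
        return max(0.0, amplitude - step)
    return amplitude

print(follow_trend(0.4, 0.6, amplitude=0.5))  # brightness rose -> 0.6
```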
In one implementation, the first parameter is the display position of an input identifier, which is used at least to indicate the input position. For example, the input identifier is the cursor corresponding to a mouse or a touch pad, and the cursor is used at least to indicate the input position.
On this basis, determining the target instruction in step 103 can be implemented as follows:
detect whether the display relation between the display position of the input identifier and a control object meets a first condition; if it does, determine the target instruction, in which the second perception mode represents the display relation between the display position of the input identifier and the control object. The control object triggers a corresponding function when a second input instruction is received.
A control object is an object capable of triggering a corresponding function, such as an application icon or an operation control. The second input instruction is an input instruction generated by a selection operation on the control object. For example, the user moves the cursor onto the application icon of a game with the mouse and double-clicks the left button, generating a second input instruction that instructs the game application to start.
Specifically, the first condition may be that the display position of the input identifier lies within the object area corresponding to the control object. In this case, the second perception mode represents whether the display position of the input identifier is within the object area of the control object; for example, when the cursor moves into the icon area of an application icon, this is represented through a second perception mode such as a vibrotactile mode, as shown in fig. 14.
Alternatively, the first condition may be that the display position of the input identifier is trending toward the object area of the control object. In this case, the second perception mode represents that the input identifier is moving toward the object area; for example, when the cursor moves toward the icon area of an application icon, this is represented through a second perception mode such as a vibrotactile mode, as shown in fig. 15.
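Both variants of the first condition can be checked with simple geometry. The following sketch uses an axis-aligned rectangle for the icon area and two consecutive cursor samples for the trend test, which is one plausible reading of "the change trend of the display position faces the object area", not the patent's own definition:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def first_condition_met(prev: tuple, cur: tuple, icon: Rect) -> bool:
    # Variant A: the cursor (input identifier) lies inside the icon area.
    if icon.contains(*cur):
        return True
    # Variant B: the cursor is trending toward the icon area, i.e. its
    # distance to the icon centre shrinks between two position samples.
    cx, cy = icon.x + icon.w / 2, icon.y + icon.h / 2
    def dist(p: tuple) -> float:
        return ((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5
    return dist(cur) < dist(prev)

print(first_condition_met((300, 300), (200, 180), Rect(100, 100, 64, 64)))  # True
```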
Based on the above implementation, the second parameter in the second perception mode can be obtained in this embodiment through the following steps, as shown in fig. 16:
Step 1601: obtain target information, where the target information includes a display parameter of the control object or the display relation between the display position of the input identifier and the control object.
The display parameters of the control object may include the area, color, brightness and similar attributes of the object area where the control object is located. The display relation between the display position of the input identifier and the control object may include the distance between the display position of the input identifier and the object area of the control object.
Step 1602: obtain the parameter value of the second parameter according to the target information, so that the second perception mode represents, through the second parameter, the display relation between the display position of the input identifier and the control object.
In one implementation, take the target information as the area of the object area where the control object is located. The parameter value of the second parameter is determined from that area, for example the larger the area, the larger the value, and the target instruction obtained from that value uses it to represent the area of the object area of the control object at which the input identifier is displayed.
For example, taking the target information as the area of an application icon's icon area, the vibration amplitude of the motor structure is determined from the icon's area, for example the larger the icon, the larger the amplitude, and the target instruction obtained from that amplitude uses it to represent the icon area in which the cursor is located, as shown in fig. 17.
The maximum and minimum values of the second parameter are related to the largest and smallest object areas of the control objects. For example, the maximum vibration amplitude is set according to the area of the largest application icon and the minimum amplitude according to the smallest icon; a value between the two is then selected as the vibration amplitude according to the area of the icon where the cursor is located. In this way, the second perception mode represents the area of the application icon under the cursor with vibration amplitudes of different magnitudes.
In another implementation, take the target information as the distance between the display position of the input identifier and the control object. The parameter value of the second parameter is determined from that distance, for example the smaller the distance, the larger the value, and the target instruction obtained from that value uses it to represent the distance between the display position of the input identifier and the control object.
For example, taking the target information as the distance between the cursor and an application icon, the vibration amplitude of the motor structure is determined from that distance, for example the smaller the distance, the larger the amplitude, and the target instruction obtained from that amplitude uses it to represent the distance between the cursor and the icon, as shown in fig. 18.
The maximum and minimum values of the second parameter are determined from the distance between the input identifier at its initial position in the display interface and the control object. For example, the minimum vibration amplitude is set according to the distance between the cursor's initial position on the display interface and the application icon, and the maximum amplitude is set to a preset value; a value between the two is then selected as the vibration amplitude according to the real-time distance between the cursor and the icon. In this way, the second perception mode represents the distance between the cursor and the application icon with vibration amplitudes of different magnitudes.
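Both mappings, amplitude from icon area and amplitude from cursor-to-icon distance, reduce to a clamped linear interpolation between the minimum and maximum amplitudes; the bounds and the linear form below are our own assumptions, as the patent does not prescribe a formula.

```python
def lerp(lo: float, hi: float, t: float) -> float:
    # Clamped linear interpolation between lo and hi.
    return lo + (hi - lo) * max(0.0, min(1.0, t))

def amplitude_from_area(area: float, min_area: float, max_area: float,
                        amp_min: float = 0.2, amp_max: float = 1.0) -> float:
    # Larger icon area -> larger vibration amplitude; the bounds come from
    # the smallest and largest application icons on the interface.
    if max_area == min_area:
        return amp_max
    return lerp(amp_min, amp_max, (area - min_area) / (max_area - min_area))

def amplitude_from_distance(dist: float, initial_dist: float,
                            amp_min: float = 0.2, amp_max: float = 1.0) -> float:
    # Smaller cursor-to-icon distance -> larger amplitude; the cursor's
    # initial distance on the display interface anchors the minimum amplitude.
    if initial_dist <= 0:
        return amp_max
    return lerp(amp_min, amp_max, 1.0 - dist / initial_dist)

print(amplitude_from_area(48 * 48, 32 * 32, 64 * 64))  # mid-sized icon
print(amplitude_from_distance(50.0, 200.0))            # cursor closing in
```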
In one implementation, the target instruction is further used to represent the parameter-value adjustment result of the first parameter through a third perception mode. The third perception mode is different from the second perception mode; the third perception mode is a visual mode.
That is, in this embodiment, in addition to prompting the user with the parameter-value adjustment result of the first parameter through the second perception mode, the target instruction also prompts the user with that result through a third, visual perception mode.
For example, taking the first parameter as a screen brightness parameter, in this embodiment the adjusted brightness is represented visually on the screen and prompted to the user through vibration of the screen, while at the same time it is also represented on the screen in another visual manner;
for another example, taking the first parameter as a speaker volume parameter, the speaker audibly represents its adjusted volume and the user is prompted with that volume through electro-tactile feedback on the mouse, while at the same time the adjusted volume is represented visually on the screen.
In a preferred implementation, the third perception mode represents the parameter-value adjustment result of the first parameter with a target object, where the target object is an object corresponding to the first parameter.
Taking the first parameter as the display position of the input identifier, the target object is the control object corresponding to that display position, or the target object is the input identifier itself. Specifically, the target object has a dithering amplitude, and the third perception mode represents the parameter-value adjustment result of the first parameter by that dithering amplitude, which matches the adjustment result.
For example, taking the first parameter as the display position of the cursor, the user moves the cursor into the icon area of an application icon by moving the mouse. In this embodiment, while the arrival of the cursor in the icon area is represented through the second perception mode, such as a vibrotactile mode, it is also represented through the third perception mode, such as dithering the application icon or the cursor, as shown in fig. 19.
In a preferred implementation, the third perception mode represents the parameter-value adjustment result of the first parameter with a progress bar, and the second perception mode represents it with a second parameter.
The parameter value of the second parameter matches the adjusted parameter value of the first parameter;
or the adjustment trend of the second parameter's value matches the adjustment trend of the first parameter's adjustment result.
Specifically, the progress bar has a progress scale, which may be represented by the relative position of a progress mark on the bar or by a percentage.
On this basis, according to the technical solution of this embodiment, besides representing the parameter-value adjustment result of the first parameter through the first perception mode, the user is prompted with that result through the second parameter in the second perception mode and, at the same time, through the progress scale on the progress bar.
For example, taking the first parameter as a screen brightness parameter, in this embodiment the adjusted brightness is represented visually on the screen and prompted to the user through vibration of the screen, while at the same time it is also represented on the screen by the relative position of the progress mark on a progress bar, as shown in fig. 20;
for another example, taking the first parameter as a speaker volume parameter, the speaker audibly represents its adjusted volume and the user is prompted with that volume through electro-tactile feedback on the mouse, while at the same time the adjusted volume is represented by a progress percentage on a progress bar on the screen, as shown in fig. 21.
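For the progress-bar variant of the third perception mode, a text-rendered sketch of the progress scale is enough to show the computation; the ten-segment rendering below is purely illustrative.

```python
def progress_bar(value: float, lo: float = 0.0, hi: float = 1.0) -> str:
    # Third perception mode: a visual progress scale (relative position of
    # the mark, plus a percentage), shown alongside the haptic cue for the
    # same adjustment result.
    pct = round(100 * (value - lo) / (hi - lo))
    filled = pct // 10
    return "[" + "#" * filled + "-" * (10 - filled) + f"] {pct}%"

print(progress_bar(0.65))  # [######----] 65%
```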
Referring to fig. 22, a schematic structural diagram of a control device according to the second embodiment of the present application is shown. The device may be configured in an electronic device capable of data processing, such as a mobile phone, a tablet, or a computer. The technical solution in this embodiment mainly enriches the perception modes through which the parameter adjustment result is represented to the user, so as to reduce the user's reliance on any single perception mode.
Specifically, the apparatus in this embodiment may include the following units:
an instruction obtaining unit 2201, configured to obtain a first input instruction;
a parameter adjustment unit 2202, configured to adjust a parameter value of a first parameter in response to the first input instruction, where the first parameter corresponds to a first perception mode;
an instruction determining unit 2203, configured to determine a target instruction, where the target instruction is used to represent the parameter-value adjustment result of the first parameter through a second perception mode; the second perception mode is a haptic mode, and the first perception mode and the second perception mode are different.
As can be seen from the above, in the control device provided in the second embodiment of the present application, while the parameter value of the first parameter is adjusted in response to the obtained first input instruction and represented through the first perception mode, a target instruction is determined so that the parameter-value adjustment result of the first parameter is also represented through a second perception mode, which is a haptic mode different from the first perception mode. In this way, the adjustment result can be conveyed to the user through perception modes other than the one corresponding to the adjusted parameter, in particular through a haptic mode, which enriches the ways the adjustment result can be perceived and improves the user experience.
In one implementation, after determining the target instruction, the instruction determining unit 2203 is further configured to execute the target instruction so that a target device in the electronic device represents the parameter-value adjustment result of the first parameter through the second perception mode;
alternatively, the instruction determining unit 2203 transmits the target instruction to an interaction device connected to the electronic device, so that a target apparatus in the interaction device represents the parameter-value adjustment result of the first parameter through the second perception mode.
In one implementation manner, the first parameter is an output parameter of an output device, the output device is used for implementing the first sensing mode, and the output device is in a non-output state;
and the second sensing mode characterizes a parameter value adjustment result of the first parameter under the condition that the output device is in an output state.
In one implementation, the first parameter is a display position of an input identifier, where the input identifier is used at least to indicate an input position;
the instruction determining unit 2203 is specifically configured to: determine the target instruction if the display relation between the display position of the input identifier and a control object satisfies a first condition, where the second perception mode in the target instruction characterizes the display relation between the display position of the input identifier and the control object; the control object is used to trigger a corresponding function when a second input instruction is received.
Specifically, the instruction determining unit 2203 is further configured to: obtain target information, where the target information includes a display parameter of the control object or the display relation between the display position of the input identifier and the control object; and obtain a parameter value of a second parameter according to the target information, so that the second perception mode characterizes, through the second parameter, the display relation between the display position of the input identifier and the control object.
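As a hedged illustration, the sketch below derives such a second parameter from target information. The hit test, the use of the control object's on-screen area as its display parameter, and the 50,000 px² normalizer are all assumptions of this sketch rather than details disclosed by the application:

```python
def second_parameter_from_target_info(cursor_xy, control):
    """Derive a vibration intensity (second parameter) from target information."""
    x, y = cursor_xy
    left, top, width, height = control["rect"]
    inside = left <= x <= left + width and top <= y <= top + height
    if not inside:
        return 0.0  # first condition not satisfied: no haptic feedback
    # Use a display parameter of the control object (its on-screen area) to
    # scale the intensity, so larger interactable elements vibrate more strongly.
    area = width * height
    return min(1.0, area / 50_000.0)  # 50,000 px^2 is an assumed normalizer

# Usage: cursor at (120, 110) over a 48x48 control yields a weak vibration.
print(second_parameter_from_target_info((120, 110), {"rect": (100, 100, 48, 48)}))
```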
In one implementation, the target instruction is further used to characterize the parameter value adjustment result of the first parameter through a third perception mode, the third perception mode being different from the second perception mode; specifically, the third perception mode is a visual mode.
Specifically, the third perception mode characterizes the parameter value adjustment result of the first parameter through a progress bar, while the second perception mode characterizes it through a second parameter;
the parameter value of the second parameter matches the adjusted parameter value of the first parameter, or the adjustment trend of the parameter value of the second parameter matches the adjustment trend of the parameter value of the first parameter.
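The two matching strategies can be sketched as follows; the function names, the 0 to 100 range, and the 0.05 step are illustrative assumptions only:

```python
def value_matched_intensity(first_param, lo=0.0, hi=100.0):
    """Second parameter whose value matches the adjusted first parameter."""
    return (first_param - lo) / (hi - lo)

def trend_matched_intensity(prev_intensity, prev_value, new_value, step=0.05):
    """Second parameter whose trend matches the first parameter's trend."""
    if new_value > prev_value:    # parameter rose -> vibration strengthens
        return min(1.0, prev_intensity + step)
    if new_value < prev_value:    # parameter fell -> vibration weakens
        return max(0.0, prev_intensity - step)
    return prev_intensity         # no adjustment -> intensity unchanged
```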
It should be noted that, for the specific implementation of each unit in this embodiment, reference may be made to the corresponding content above, which is not repeated here.
Referring to fig. 23, which shows a schematic structural diagram of an electronic device according to a third embodiment of the present application, the electronic device may be any device capable of data processing, such as a mobile phone, a tablet, or a computer. The technical solution of this embodiment likewise serves to enrich the perception modes through which a parameter adjustment result is represented to the user, thereby reducing the burden placed on any one specific perception mode of the user.
Specifically, the electronic device in this embodiment may include the following structure:
a memory 2301, configured to store a computer program and data generated by running the computer program;
a processor 2302, configured to execute the computer program to implement: obtaining a first input instruction; adjusting a parameter value of a first parameter in response to the first input instruction, where the first parameter corresponds to a first perception mode; and determining a target instruction, where the target instruction is used to characterize a parameter value adjustment result of the first parameter through a second perception mode; the second perception mode is a haptic mode, and the first perception mode differs from the second perception mode.
As can be seen from the above, in the electronic device provided in the third embodiment of the present application, when the parameter value of the first parameter, which corresponds to a first perception mode, is adjusted in response to the obtained first input instruction, a target instruction is also determined, so that the parameter value adjustment result of the first parameter is characterized through a second perception mode, the second perception mode being a haptic mode different from the first perception mode. Thus, the parameter value adjustment result can be represented to the user through a perception mode other than the one corresponding to the adjusted parameter, in particular through a haptic mode, which diversifies the ways in which the user can perceive the adjustment result and improves the user experience.
In one implementation, the electronic device further includes the following structure, as shown in fig. 24:
a first target device 2303, where the first target device 2303 characterizes, in response to the target instruction, the parameter value adjustment result of the first parameter through the second perception mode; the first target device 2303 is a device capable of implementing the second perception mode, specifically a device that can be in contact with the user and has a vibrating structure. In addition, the first target device 2303 may also be a device capable of interacting with the user.
In another implementation, the electronic device is connected to an interaction device, as shown in fig. 25, and the interaction device includes a second target device 2304; the second target device 2304 is configured to receive the target instruction transmitted by the electronic device, and to characterize, in response to the target instruction, the parameter value adjustment result of the first parameter through the second perception mode. For example, the second target device 2304 is a device that can be in contact with the user and has a vibrating structure, such as a housing that is separate from the mobile phone, contains a motor structure, and can be in contact with the user.
In one implementation, the electronic device further includes the following structure, as shown in fig. 26:
an input device 2305, where the input device 2305 has an input area, such as the touch area of a touch screen, for receiving touch input operations;
the input area is further used to characterize, in response to the target instruction, the parameter value adjustment result of the first parameter through the second perception mode.
For example, while the adjusted screen brightness is represented visually on the touch screen, a motor structure in the touch screen simultaneously represents the adjusted brightness through vibration, i.e., in a haptic mode.
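A minimal sketch of such dual-mode feedback follows, assuming stand-in draw_progress_bar and vibrate helpers; the actual display and haptics interfaces are not specified by this application:

```python
def on_brightness_adjusted(brightness_pct: float) -> None:
    """Characterize one brightness adjustment through two perception modes."""
    draw_progress_bar(brightness_pct)                # third (visual) mode
    vibrate(brightness_pct / 100.0, duration_ms=40)  # second (haptic) mode

def draw_progress_bar(pct: float) -> None:
    # Stand-in visual feedback: a 20-cell text progress bar.
    filled = int(pct / 5)
    print("[" + "#" * filled + "-" * (20 - filled) + f"] {pct:.0f}%")

def vibrate(intensity: float, duration_ms: int) -> None:
    # Stand-in for a platform haptics call; the real driver API is assumed.
    print(f"vibrate at intensity {intensity:.2f} for {duration_ms} ms")

on_brightness_adjusted(65.0)
```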
In one implementation, the electronic device further includes the following structure, as shown in fig. 27:
an output device 2306, where the output device 2306 is used to implement the first perception mode, and the output device is in a non-output state;
the first parameter is an output parameter of the output device, and the second perception mode characterizes the parameter value adjustment result of the first parameter for the case where the output device is in an output state.
Alternatively, the output device 2306 outputs an output interface, where an input identifier is displayed on the output interface, the input identifier being used at least to indicate an input position;
the first parameter is the display position of the input identifier, and the second perception mode is used to characterize the display relation between the display position of the input identifier and a control object, the control object being used to trigger a corresponding function when a second input instruction is received.
Alternatively, the target instruction is further used to characterize the parameter value adjustment result of the first parameter through a third perception mode, the third perception mode being different from the second perception mode; the third perception mode is a visual mode, and the output device 2306 is used to implement the third perception mode.
The third perception mode characterizes the parameter value adjustment result of the first parameter through a progress bar; the second perception mode characterizes it through a second parameter, where the parameter value of the second parameter matches the adjusted parameter value of the first parameter, or the adjustment trend of the parameter value of the second parameter matches the adjustment trend of the parameter value of the first parameter.
It should be noted that, for the specific implementation of each device in this embodiment, reference may be made to the corresponding content above, which is not repeated here.
Taking a notebook computer as an example of the electronic device: the touch pad and the mouse are among the most frequently used components when a user operates a computer. At present, a touch pad or a mouse only lets the user move a cursor and provides no feedback about the graphical operation interface. The user therefore has to keep staring at the cursor while operating the computer, which increases the visual burden; for visually impaired users, it greatly increases the difficulty of operation.
In view of this, the present application provides an interaction scheme that, in combination with the graphical interface, gives the user synchronized vibration feedback through a vibration motor in the touch pad or the mouse. The technical solution of the present application is illustrated below with reference to the accompanying drawings.
Fig. 28 is a schematic diagram of the vibration feedback obtained when the user operates a cursor (4) through a touch pad (7) to control the graphical interface, where (1), (2), and (3) are interactable elements A, B, and C in the graphical interface, and (5) and (6) represent different vibration intensities; the touch pad characterizes interactable elements of different icon areas with different vibration intensities.
Fig. 29 is a schematic diagram of the vibration feedback obtained when the user operates the touch pad (7) to control a slider (8); the relative position of the slider (8) on the speaker volume progress bar corresponds to the speaker volume, and the touch pad characterizes different speaker volumes with different vibration intensities.
Fig. 30 is a schematic diagram of the vibration feedback obtained when the user operates a mouse (9) to control the cursor (4), where the mouse characterizes interactable elements of different icon areas with different vibration intensities.
Fig. 31 is a schematic diagram of the vibration feedback obtained when the user operates the mouse (9) to control the slider (8), where the mouse characterizes different speaker volumes with different vibration intensities.
In this scheme, vibration motors are arranged in the touch pad (7) and the mouse (9) to feed different vibrations back to the user. The touch pad, the mouse, and the graphical interface cooperate at the software layer to realize vibration feedback synchronized with the user's operation.
As shown in fig. 28 and 30, when the user moves the cursor (4) by operating the touch pad or the mouse and the cursor contacts an interactable element A, B, or C, the touch pad or the mouse vibrates accordingly to prompt the user about the position of the cursor. When the cursor contacts a smaller element such as A or B, the vibration intensity of the touch pad or the mouse is lower, as at (5); when the cursor contacts a larger element such as C, the vibration intensity is higher, as at (6). Different kinds of interactable elements can thus be indicated by different vibration intensities, as sketched below.
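In the following sketch of this area-based mapping, the element coordinates, the 64x64 px area threshold, and the 0.3/0.8 intensity levels are invented for illustration:

```python
ELEMENTS = [  # invented coordinates for elements A, B, C of fig. 28
    {"name": "A", "rect": (100, 100, 48, 48)},
    {"name": "B", "rect": (200, 100, 48, 48)},
    {"name": "C", "rect": (300, 100, 128, 128)},
]

def vibration_for_cursor(x, y, weak=0.3, strong=0.8, area_threshold=64 * 64):
    """Weak pulse over small elements (A, B), strong pulse over large ones (C)."""
    for element in ELEMENTS:
        left, top, w, h = element["rect"]
        if left <= x <= left + w and top <= y <= top + h:
            return weak if w * h < area_threshold else strong
    return 0.0  # cursor over empty interface: no vibration

print(vibration_for_cursor(120, 120))  # over A -> 0.3
print(vibration_for_cursor(350, 150))  # over C -> 0.8
```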
As shown in fig. 29 and 31, when the user operates the cursor (4) to drag the slider (8), the touch pad (7) or the mouse (9) generates continuous vibration feedback, and the farther the slider (8) is dragged in one direction, the stronger the vibration, so that the drag distance of the slider is indicated by different vibration intensities.
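This distance-based mapping might look as follows, assuming an illustrative 300 px track length and a 0.1 intensity floor:

```python
def slider_vibration(drag_start_x, current_x, track_len=300.0, floor=0.1):
    """Continuous vibration whose strength grows with the drag distance."""
    distance = min(abs(current_x - drag_start_x), track_len)
    return floor + (1.0 - floor) * distance / track_len

# Usage: a longer drag produces a stronger vibration.
print(slider_vibration(0, 60))   # short drag -> weak vibration
print(slider_vibration(0, 240))  # long drag  -> strong vibration
```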
In this way, an ordinary user can learn the position of the cursor, the type of interactable element, and the degree of completion of an operation through vibration feedback without staring at the screen, which markedly improves the operating experience. Visually impaired users can operate the computer more easily through the vibration feedback, which lowers their difficulty of operation.
In this specification, the embodiments are described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and identical or similar parts among the embodiments may be referred to one another. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief, and reference may be made to the description of the method for the relevant points.
Those of skill will further appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative units and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may be disposed in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A control method, comprising:
obtaining a first input instruction;
adjusting a parameter value of a first parameter in response to the first input instruction, wherein the first parameter corresponds to a first perception mode;
determining a target instruction, wherein the target instruction is used for representing a parameter value adjustment result of the first parameter through a second perception mode;
wherein the second perception mode is a haptic mode, and the first perception mode is different from the second perception mode.
2. The method of claim 1, the method being applied to an electronic device, wherein:
after the determining the target instruction, the method further comprises:
executing the target instruction, so that a target device in the electronic device characterizes the parameter value adjustment result of the first parameter through the second perception mode;
or,
transmitting the target instruction to an interaction device connected to the electronic device, so that a target device in the interaction device characterizes the parameter value adjustment result of the first parameter through the second perception mode.
3. The method of claim 1 or 2, wherein the first parameter is an output parameter of an output device, the output device is configured to implement the first perception mode, and the output device is in a non-output state;
and the second perception mode characterizes the parameter value adjustment result of the first parameter for the case where the output device is in an output state.
4. The method of claim 1 or 2, wherein the first parameter is a display position of an input identifier, the input identifier being used at least to indicate an input position;
wherein the determining the target instruction includes:
determining the target instruction if the display relation between the display position of the input identifier and a control object satisfies a first condition, wherein the second perception mode in the target instruction characterizes the display relation between the display position of the input identifier and the control object; and the control object is configured to trigger a corresponding function when a second input instruction is received.
5. The method of claim 4, the method further comprising:
obtaining target information, wherein the target information comprises a display parameter of the control object or the display relation between the display position of the input identifier and the control object;
and obtaining a parameter value of a second parameter according to the target information, so that the second perception mode characterizes, through the second parameter, the display relation between the display position of the input identifier and the control object.
6. The method of claim 1 or 2, wherein the target instruction is further used to characterize the parameter value adjustment result of the first parameter through a third perception mode, the third perception mode being different from the second perception mode; and the third perception mode is a visual mode.
7. The method of claim 6, wherein the third perception mode characterizes the parameter value adjustment result of the first parameter through a progress bar;
the second perception mode characterizes the parameter value adjustment result of the first parameter through a second parameter;
and the parameter value of the second parameter matches the adjusted parameter value of the first parameter, or the adjustment trend of the parameter value of the second parameter matches the adjustment trend of the parameter value of the first parameter.
8. An electronic device, comprising:
a memory, configured to store a computer program and data generated by running the computer program;
a processor, configured to execute the computer program to implement: obtaining a first input instruction; adjusting a parameter value of a first parameter in response to the first input instruction, wherein the first parameter corresponds to a first perception mode; and determining a target instruction, wherein the target instruction is used to characterize a parameter value adjustment result of the first parameter through a second perception mode; the second perception mode is a haptic mode, and the first perception mode is different from the second perception mode.
9. The electronic device of claim 8, wherein:
the electronic device further comprises a first target device, the first target device characterizing, in response to the target instruction, the parameter value adjustment result of the first parameter through the second perception mode;
or,
the electronic device is connected to an interaction device, the interaction device comprising a second target device, the second target device being configured to receive the target instruction transmitted by the electronic device and to characterize, in response to the target instruction, the parameter value adjustment result of the first parameter through the second perception mode.
10. The electronic device of claim 8, wherein:
the electronic device further comprises an input device, the input device having an input area, the input area being configured to receive touch input operations;
and the input area is further configured to characterize, in response to the target instruction, the parameter value adjustment result of the first parameter through the second perception mode.