CN111158474B - Interaction method and electronic equipment - Google Patents

Interaction method and electronic equipment

Info

Publication number
CN111158474B
CN111158474B CN201911319984.9A CN201911319984A CN111158474B
Authority
CN
China
Prior art keywords
touch
touch screen
current value
user
skin roughness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911319984.9A
Other languages
Chinese (zh)
Other versions
CN111158474A (en)
Inventor
高齐鲜
于海琛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911319984.9A priority Critical patent/CN111158474B/en
Publication of CN111158474A publication Critical patent/CN111158474A/en
Application granted granted Critical
Publication of CN111158474B publication Critical patent/CN111158474B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses an interaction method and electronic equipment. An embodiment of the method comprises: when a touch of a user on the touch screen of the electronic equipment is detected, acquiring the skin roughness of the user and acquiring touch behavior data; determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data; and controlling the touch screen to release current according to the current value. This implementation enriches the tactile sensations the user experiences while using the touch screen, and improves the immersion and realism of the interaction process.

Description

Interaction method and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an interaction method and electronic equipment.
Background
With the development and continuous progress of science and technology, human-machine interaction in three-dimensional scenes has become increasingly rich. For example, based on virtual touch screen technology, a user can make the touch screen output electromagnetic pulse feedback through operations such as pressing and sliding on the touch screen, so that the user feels different tactile sensations.
In the prior art, the magnitude of the current output by the touch screen is generally controlled by detecting the pressure, the touch area and the duration of the user's touch on the screen. However, this approach considers only a few factors, resulting in a single, uniform tactile sensation while the user operates the touch screen, and a lack of immersion and realism during the user's interaction with the device.
Disclosure of Invention
The embodiment of the invention provides an interaction method and electronic equipment, and aims to solve the technical problems in the prior art that the touch screen provides only a single tactile sensation and that the immersion and realism of the interaction process are insufficient.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an interaction method, applied to an electronic device with a touch screen, including: under the condition that the touch screen is detected to be touched by a user, acquiring the skin roughness of the user and acquiring touch behavior data; determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data; and controlling the touch screen to release current according to the current value.
In a second aspect, an embodiment of the present invention provides an electronic device, including: the detection unit is used for acquiring the skin roughness of the user and acquiring touch behavior data under the condition that the user touches the touch screen; the first determining unit is used for determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data; and the release unit is used for controlling the touch screen to release current according to the current value.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the method described in any one of the embodiments of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the method described in any one of the embodiments in the first aspect.
In the embodiment of the invention, when a touch of the user on the touch screen of the electronic equipment is detected, the skin roughness of the user is obtained and touch behavior data are acquired; a current value to be released by the touch screen is then determined based on the skin roughness and the touch behavior data, and the touch screen is controlled to release current according to that value. Because the skin roughness of the user is considered when determining the current value, currents of different magnitudes can be released when different users touch the screen, or when a user touches the screen with different body parts, so that different tactile sensations are felt. This enriches the tactile sensations the user experiences while using the touch screen, and improves the immersion and realism of the interaction process.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 is a flow chart of an interaction method provided by an embodiment of the invention;
FIG. 2 is one of schematic diagrams of application scenarios of an interaction method provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a Wheatstone bridge in an interaction method provided by an embodiment of the invention;
fig. 4 is a second schematic diagram of an application scenario of the interaction method according to the embodiment of the present invention;
fig. 5 is a third schematic diagram of an application scenario of the interaction method provided by the embodiment of the present invention;
FIG. 6 is a second flowchart of an interaction method according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an electronic device provided in an embodiment of the present invention;
fig. 8 is a hardware configuration diagram of an electronic device for implementing an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart of an interaction method according to an embodiment of the invention is shown. The interaction method provided by the embodiment of the invention can be applied to electronic equipment with a touch screen (also referred to as a touch panel). In practice, the electronic device may be a smartphone, a tablet computer, a laptop, etc.
The process 100 of the interaction method provided by the embodiment of the present invention includes the following steps:
step 101, detecting the skin roughness of the user and acquiring touch behavior data under the condition that the user touches the touch screen.
In this embodiment, the electronic device may detect the skin roughness of the user and acquire the touch behavior data when the user touches the touch screen. The skin roughness may be a value for characterizing the degree of skin roughness. The touch behavior data may be data for characterizing touch behavior. For example, the touch behavior data may include, but is not limited to, pressure applied by the user on the touch screen, contact area of the user's skin with the touch screen, duration of touch, and the like.
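The description does not fix a concrete representation for the touch behavior data; as a minimal sketch (the class and field names are hypothetical, not from the patent), the three quantities it names could be grouped into a small structure:

```python
from dataclasses import dataclass

@dataclass
class TouchBehaviorData:
    """Touch behavior data named in the description (hypothetical layout)."""
    pressure: float      # pressure the user applies to the touch screen
    contact_area: float  # contact area of the user's skin with the screen
    duration: float      # duration of the touch

# example reading for a single touch event
sample = TouchBehaviorData(pressure=0.8, contact_area=55.0, duration=0.3)
```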
In this embodiment, the user may perform various operations on the touch screen, such as a pressing operation, a sliding operation, and the like, and during the above operations, the user needs to touch the touch screen. In practice, the user can use any finger or any skin part to operate the touch screen, and details are not repeated here.
By way of example, fig. 2 is one of the schematic diagrams of application scenarios of an interaction method provided by an embodiment of the present invention. In this scenario, the electronic device may be a terminal device such as a mobile phone. As shown in fig. 2, an image of an article (e.g., cotton) is presented in the display interface of the terminal device; the image may be a two-dimensional image or a three-dimensional image, which is not limited herein. When the user presses a local part of the image through the touch screen, the skin roughness of the user can be detected through the touch screen, and touch behavior data such as the pressure applied to the touch screen by the user's skin, the contact area between the user's skin and the touch screen, and the touch duration can be acquired. Since the user touches the touch screen with a finger, the skin roughness here is the skin roughness of the user's finger.
In this embodiment, the skin roughness of the user may be determined in a number of ways. For example, the skin roughness may be set by the user in advance according to the skin condition of the user, or may be detected by mounting a skin roughness detecting device in the electronic apparatus.
In some optional implementations of the present embodiment, a Wheatstone bridge is installed in the touch screen. A Wheatstone bridge is a circuit for measuring the resistance value of one of its resistors. It comprises four resistors: one is the resistor to be measured, and the resistance values of the other three are known.
As an example, fig. 3 is a schematic structural diagram of a Wheatstone bridge in the interaction method provided by the embodiment of the present invention. As shown in fig. 3, the Wheatstone bridge comprises a power supply, a voltmeter V_G, and four resistors R1, R2, R3 and R_X. Among them, the resistance values of R1, R2 and R3 are known, and R_X is the resistor to be measured. When the voltage between the two points B and D in fig. 3 changes, the voltage value is collected to calculate the value of R_X.
In this implementation, the electronic device may first treat the user's skin as the variable resistance in the Wheatstone bridge and determine its resistance value. Then, the skin roughness of the user may be determined based on that resistance value and preset correspondence information indicating the correspondence between resistance value and skin roughness.
The correspondence between the resistance value and the skin roughness may be preset by a large number of data statistics and calculations. In practice, a model for characterizing the correspondence of the resistance value to the skin roughness may be established. The model can be trained by a machine learning method (such as a supervised learning method) by taking the resistance value as an input and the skin roughness as an output.
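The description only states that the voltage between points B and D is collected to compute R_X. One way to do this, under an assumed bridge topology (divider R1–R2 on one arm and R3–R_X on the other, with the voltmeter between the two midpoints — an assumption, since fig. 3 is not reproduced here), is:

```python
def unknown_resistance(v_supply, v_bridge, r1, r2, r3):
    """Solve an unbalanced Wheatstone bridge for the unknown resistance R_X.

    Assumes V_bridge = V_supply * (R2/(R1+R2) - R_X/(R3+R_X)), i.e. the
    voltmeter reads the difference between the two divider midpoints.
    """
    ratio = r2 / (r1 + r2) - v_bridge / v_supply  # equals R_X / (R3 + R_X)
    return ratio * r3 / (1.0 - ratio)

# a balanced bridge (zero bridge voltage) with R1 == R2 gives R_X == R3
print(unknown_resistance(5.0, 0.0, 100.0, 100.0, 220.0))  # → 220.0
```

The returned resistance would then be looked up in the preset correspondence information to obtain the skin roughness.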
And 102, determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data.
In this embodiment, the electronic device may determine the current value to be released by the touch screen based on the skin roughness and the touch behavior data. It will be appreciated that when a user touches a physical object, the tactile sensation typically differs when the user's force, touch area, skin roughness, etc. differ. When the tactile sensation is instead simulated by releasing a current, different tactile sensations can be produced by releasing currents of different magnitudes. Thus, the touch screen should release different current values for different skin roughness and touch behavior data, thereby providing different tactile sensations to the user.
In this embodiment, the skin roughness and the touch behavior data may be used as the influence factors of the current values, and a current value calculation model for representing the correspondence between the influence factors and the current values may be established in advance. And inputting the influence factors such as skin roughness and touch behavior data into the model to obtain the current value to be released of the touch screen.
The current value calculation model may be a function or formula established in advance for calculating the current value. As an example, the current value calculation model may contain the following formulas:

I_x = A·i_x + C·i_x
I_y = A·i_y + C·i_y
I_z = A·i_z + C·i_z
I = (I_x + I_y + I_z)·B·D

wherein I is the current value to be released by the touch screen; I_x, I_y and I_z are the components of I in the x-axis, y-axis and z-axis directions, respectively; i is a preset base current value; i_x, i_y and i_z are the components of i in the x-axis, y-axis and z-axis directions, respectively; A is a pressure coefficient; B is a touch area coefficient; C is a duration coefficient; and D is the skin roughness of the user. The x-axis, y-axis and z-axis are the coordinate axes of the three directions of a Cartesian coordinate system established in advance.
Here, the value of the pressure coefficient may be determined based on the pressure applied to the touch screen by the user in the touch behavior data. In practice, the numerical relationship of the value of the pressure coefficient to the value of the pressure may be established in advance. For example, the value of the pressure coefficient may have a linear relationship with the value of the pressure. Similarly, the value of the touch area coefficient may be determined based on the contact area of the user's skin with the touch screen in the touch behavior data; the value of the duration factor may be determined based on the duration of the touch in the touch behavior data.
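As an illustrative sketch (the function and argument names are not from the patent), the formulas above translate directly into code:

```python
def current_to_release(i_x, i_y, i_z, a, b, c, d):
    """Evaluate the current value calculation model from the description.

    i_x, i_y, i_z: components of the preset base current value i;
    a: pressure coefficient, b: touch area coefficient,
    c: duration coefficient, d: skin roughness of the user.
    """
    comp_x = a * i_x + c * i_x  # I_x = A*i_x + C*i_x
    comp_y = a * i_y + c * i_y  # I_y = A*i_y + C*i_y
    comp_z = a * i_z + c * i_z  # I_z = A*i_z + C*i_z
    return (comp_x + comp_y + comp_z) * b * d  # I = (I_x+I_y+I_z)*B*D
```

With unit base components, A = 2, B = 1, C = 3 and D = 1, each axis component is 5 and the released current I is 15.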
And 103, controlling the touch screen to release current according to the current value.
In this embodiment, the electronic device may control the touch screen to release the current according to the current value determined in step 102. Taking the current value calculation model in the above example as an example, the current value may be released according to the components of the current value to be released by the touch screen in the directions of the x axis, the y axis and the z axis, respectively.
In some optional implementations of the embodiment, the electronic device may control the touch screen to release current according to the current value and, while the current is being released, convert it into an electrostatic attraction force that acts like a nerve impulse. When the user's skin contacts this electrostatic attraction, the neuron-like signals it generates are transmitted to the peripheral nerves in the user's skin, so that the user's cerebral cortex perceives a tactile sensation on the skin.
In some optional implementations of the present embodiment, a three-dimensional virtual object may be presented in the touch screen. In this case, when it is detected that the user slides a single finger on the touch screen, the electronic device may acquire a first sliding distance and a first sliding direction. The first sliding distance is the distance the user's single finger slides on the touch screen, and the first sliding direction is the direction in which the single finger slides. Then, the electronic device may determine the angle to be rotated and the direction to be rotated of the three-dimensional virtual object based on the first sliding distance and the first sliding direction. Here, the correspondence between the sliding direction and the direction to be rotated, and the correspondence between the sliding distance and the angle to be rotated, may be preset as needed, and are not limited herein. Finally, the three-dimensional virtual object is rotated according to the angle to be rotated and the direction to be rotated, with a preset point in the touch screen (such as the origin of a preset Cartesian coordinate system or the center point of the screen) as a reference point.
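A minimal sketch of such a preset correspondence (the mapping and the deg_per_px ratio are hypothetical choices, not specified by the patent):

```python
def rotation_from_swipe(dx, dy, deg_per_px=0.5):
    """Map a single-finger swipe displacement (dx, dy, in pixels) to the
    angle and direction to rotate the three-dimensional virtual object.

    Hypothetical correspondence: horizontal motion rotates the object
    about the y axis, vertical motion about the x axis, at a preset
    deg_per_px ratio between sliding distance and rotation angle.
    """
    return {"x_deg": dy * deg_per_px, "y_deg": dx * deg_per_px}
```

For example, a 100-pixel horizontal swipe would rotate the object 50 degrees about the y axis under this preset.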
By way of example, fig. 4 is a second schematic diagram of an application scenario of the interaction method provided by the embodiment of the present invention. In this scenario, the user has performed a single-finger sliding operation in the touch screen. At this time, the three-dimensional virtual object (such as a rectangular parallelepiped shown in fig. 4) on the touch screen can be rotated with the origin of the preset cartesian coordinate system as a reference point.
In some optional implementations of the present embodiment, a three-dimensional virtual object is presented in the touch screen. In this case, when a multi-finger slide (e.g., with two fingers, three fingers, etc.) on the touch screen is detected, the electronic device may acquire a second sliding distance and a second sliding direction. The second sliding distance may be the average of the distances the fingers slide on the touch screen, or the distance any single finger slides. The second sliding direction may be the average of the directions in which the fingers slide, or the direction any single finger slides. Then, the electronic device may determine the distance to be moved and the direction to be moved of the three-dimensional virtual object based on the second sliding distance and the second sliding direction. Finally, the three-dimensional virtual object may be moved according to the distance to be moved and the direction to be moved, with a preset point in the touch screen (such as the origin of a preset Cartesian coordinate system or the center point of the screen) as a reference point.
By way of example, fig. 5 is a third schematic diagram of an application scenario of the interaction method provided by the embodiment of the present invention. In this scenario, the user has performed a two-finger sliding operation on the touch screen. At this time, the three-dimensional virtual object on the touch screen (such as the rectangular parallelepiped shown in fig. 5) can be moved, with the origin of the preset Cartesian coordinate system as a reference point.
According to the method provided by the embodiment of the invention, when a touch of the user on the touch screen of the electronic equipment is detected, the skin roughness of the user is obtained and touch behavior data are acquired; a current value to be released by the touch screen is then determined based on the skin roughness and the touch behavior data, and the touch screen is controlled to release current according to that value. Because the skin roughness of the user is considered when determining the current value, currents of different magnitudes can be released when different users touch the screen, or when a user touches the screen with different body parts, so that different tactile sensations are felt. This enriches the tactile sensations the user experiences while using the touch screen, and improves the immersion and realism of the interaction process.
Further referring to fig. 6, which shows a second flowchart of the interaction method provided by the embodiment of the present invention. The interaction method provided by the embodiment of the present invention may be applied to an electronic device having a touch screen (also referred to as a touch panel). In practice, the electronic device may be a smartphone, a tablet computer, a laptop, etc.
The flow 600 of the interaction method provided by the embodiment of the present invention includes the following steps:
step 601, detecting the skin roughness of the user and acquiring touch behavior data under the condition that the user touches the touch screen.
Step 601 in this embodiment can refer to step 101 in the corresponding embodiment of fig. 1, and is not described herein again.
Step 602, obtaining the material of the three-dimensional virtual object in the touch area.
In this embodiment, a three-dimensional virtual object, such as cotton or a porcelain plate, may be presented in the display interface of the electronic device. By operating the touch screen and causing it to release current, the user can feel the tactile sensation of the surface of the three-dimensional virtual object. Even when the skin roughness and the touch behavior data of the user are the same, touching three-dimensional virtual objects made of different materials should produce different tactile sensations.
In some optional implementations of this embodiment, before step 601 is executed, the electronic device may perform three-dimensional image acquisition on the target object, so as to obtain a three-dimensional image. And the three-dimensional image comprises a three-dimensional virtual object corresponding to the target object. Then, the material of the three-dimensional virtual object in the three-dimensional image can be identified to obtain an identification result. In practice, the three-dimensional image may be sent to the server. The server side can identify the material of the three-dimensional virtual object in the three-dimensional image based on an artificial intelligence technology, and then returns the identification result to the electronic equipment. And then, setting a material mark for the pixel point corresponding to the three-dimensional virtual object based on the identification result. In practice, a material mark can be set for each pixel point in the three-dimensional virtual object. Therefore, after step 601 is executed, the material of the three-dimensional virtual object in the touch area may be determined based on the material marks of the pixel points in the touch area. The touch area is an area in the touch screen, which is in contact with the skin of the user.
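One plausible way to resolve the material from the per-pixel material marks (the data layout is an assumption — the patent only says a mark is set for each pixel of the recognised object) is to take the most common mark inside the touch area:

```python
from collections import Counter

def material_in_touch_area(material_marks, touch_pixels):
    """Return the dominant material mark among the touched pixels.

    material_marks: dict mapping (x, y) pixel -> material label, set when
    the three-dimensional image was recognised.
    touch_pixels: pixels of the area in contact with the user's skin.
    """
    labels = [material_marks[p] for p in touch_pixels if p in material_marks]
    if not labels:
        return None  # touch area does not overlap the marked object
    return Counter(labels).most_common(1)[0][0]
```

Taking the majority mark handles a touch area that straddles the boundary between two differently marked regions.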
Step 603, when a current value calculation model corresponding to the material is found, inputting the skin roughness and the touch behavior data into the model to obtain the current value to be released by the touch screen.
In the present embodiment, the electronic device may store in advance a plurality of current value calculation models, each corresponding to a material. The current value calculation model for a material can calculate, based on the skin roughness and the touch behavior data, the current value to be released by the touch screen for that material.
It will be appreciated that when a user touches a physical object, the tactile sensation typically differs when the user's force, touch area, skin roughness, etc. differ. When the tactile sensation is instead simulated by releasing a current, different tactile sensations can be produced by releasing currents of different magnitudes. Thus, the touch screen should release different current values for different skin roughness and touch behavior data, thereby providing different tactile sensations to the user. In addition, when a user touches a physical object, the tactile sensation usually also differs with the material of the object; for example, cotton is soft while a porcelain plate is hard. Therefore, the material should also influence the current value to be released by the touch screen, so as to provide different tactile sensations to the user.
In view of this, in the present embodiment, for each material, the skin roughness and the touch behavior data for the material may be used as the influence factor of the current value in advance, and a current value calculation model for representing the correspondence between the influence factor and the current value may be established in advance. Therefore, after the material of the three-dimensional virtual object in the touch area is determined, the current value calculation model corresponding to the material can be searched. Under the condition that the current value calculation model corresponding to the material is found, the skin roughness and the touch behavior data can be input into the current value calculation model, so that the current value to be released of the touch screen is obtained.
Here, the current value calculation model may be a function or formula established in advance for calculating the current value. As an example, the current value calculation model may contain the following formulas:

I_x = A·i_x + C·i_x
I_y = A·i_y + C·i_y
I_z = A·i_z + C·i_z
I = (I_x + I_y + I_z)·B·D

wherein I is the current value to be released by the touch screen; I_x, I_y and I_z are the components of I in the x-axis, y-axis and z-axis directions, respectively; i is a preset base current value, which differs from material to material; i_x, i_y and i_z are the components of i in the x-axis, y-axis and z-axis directions, respectively; A is a pressure coefficient; B is a touch area coefficient; C is a duration coefficient; and D is the skin roughness of the user. The x-axis, y-axis and z-axis are the coordinate axes of the three directions of a Cartesian coordinate system established in advance.
It should be noted that the value of the pressure coefficient may be determined based on the pressure applied to the touch screen by the user in the touch behavior data. In practice, the numerical relationship of the value of the pressure coefficient to the value of the pressure may be established in advance. For example, the value of the pressure coefficient may have a linear relationship with the value of the pressure. Similarly, the value of the touch area coefficient may be determined based on the contact area of the user's skin with the touch screen in the touch behavior data; the value of the duration factor may be determined based on the duration of the touch in the touch behavior data.
It should be noted that, in the current value calculation models corresponding to different materials, the base current value i and its components i_x, i_y and i_z in the x-axis, y-axis and z-axis directions may be different.
In practice, the base current value corresponding to each material can be preset based on that material's data (such as hardness, density, etc.). Specifically, a touch value may first be set for each material based on its data. In practice, the touch value of each material may be set manually, or may be obtained by inputting data about the material's surface into a touch value calculation model trained in advance. The touch value calculation model may be trained in advance from a large number of training samples using a machine learning method (e.g., a supervised learning method). Then, different base current values can be set for different touch values, yielding the base current value corresponding to each material. In practice, the base current value corresponding to each touch value can be preset through extensive data statistics and experiments.
In some optional implementations of this embodiment, in a case that a current value calculation model corresponding to the material of the three-dimensional virtual object in the touch area is not found, the electronic device may establish a current value calculation model corresponding to that material, and input the skin roughness and the touch behavior data into it to obtain the current value to be released by the touch screen. Specifically, the base current value of the material may first be estimated from the known values of other materials. For example, if the known materials include cotton and ceramic, and the material of the three-dimensional virtual object in the touch area is wood, then, because the hardness of wood lies between that of cotton and that of ceramic, the base current value of wood can be set to a value between the base current values of cotton and ceramic. Likewise, the components of the base current value in the respective directions may be set in this manner. A current value calculation model corresponding to wood is thus obtained.
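The estimation for an unseen material can be sketched as a simple interpolation between known materials ordered by hardness, as in the cotton/ceramic/wood example; the hardness values and currents below are hypothetical:

```python
# Sketch: estimate the base current of an unseen material by linear
# interpolation between two known materials, ordered by hardness
# (wood sits between cotton and ceramic). Values are hypothetical.

KNOWN = {                    # material -> (hardness, base current in mA)
    "cotton": (1.0, 0.15),
    "ceramic": (9.0, 0.60),
}

def estimate_base_current(hardness: float) -> float:
    """Interpolate a base current for a material of the given hardness."""
    (h_lo, i_lo), (h_hi, i_hi) = sorted(KNOWN.values())
    hardness = min(max(hardness, h_lo), h_hi)  # clamp into the known range
    t = (hardness - h_lo) / (h_hi - h_lo)
    return i_lo + t * (i_hi - i_lo)

wood_current = estimate_base_current(4.0)   # wood: hardness assumed ~4
```

The same interpolation can be applied per axis to set the components of the base current value.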
Step 604, controlling the touch screen to release current according to the current value.
In this embodiment, the electronic device may control the touch screen to release current according to the current value determined in the preceding step. Taking the current value calculation model in the above example as an example, the current may be released according to the components of the current value to be released in the x-axis, y-axis and z-axis directions, respectively.
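A hedged sketch of working with the per-axis current components follows; combining them into an overall magnitude via the Euclidean norm is an assumption for illustration, not something the disclosure states:

```python
import math

# Sketch of handling the per-axis components of the current to be
# released; combining them with the Euclidean norm is an assumption.

def release_current(i_x: float, i_y: float, i_z: float) -> float:
    """Overall current magnitude for components along x, y and z."""
    magnitude = math.sqrt(i_x ** 2 + i_y ** 2 + i_z ** 2)
    # a real driver would program the haptic front-end per axis here
    return magnitude
```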
In practice, the electronic device may control the touch screen to release current according to the current value and, in the process of releasing the current, convert it into electrostatic attraction that acts as a nerve-pulse-like stimulus. When the user's skin contacts this electrostatic attraction, the resulting neuron-like signals are transmitted to the peripheral nerves of the skin, and the user's cerebral cortex perceives a tactile sensation.
In some optional implementations of the embodiment, after determining the current value to be released by the touch screen, the electronic device may further determine the degree of concavity of the three-dimensional virtual object based on the skin roughness, the touch behavior data, and the direction of the current. The touch behavior data may include, but is not limited to, the pressure applied by the user to the touch screen, the contact area between the user's skin and the touch screen, the touch duration, and the like.
As an example, the degree of concavity of the three-dimensional virtual object may be determined according to the following formula:
X = A_x · cos(a_x) · C'_x / (B · D)
Y = A_y · cos(a_y) · C'_y / (B · D)
Z = A_z · cos(a_z) · C'_z / (B · D)
where X, Y and Z are the deformation distances in the x-axis, y-axis and z-axis directions, respectively; A_x, A_y and A_z are the components, in the x-axis, y-axis and z-axis directions respectively, of the pressure applied by the user to the touch screen; a_x, a_y and a_z are the included angles between the current direction and the x-axis, y-axis and z-axis directions, respectively; B is the touch area coefficient; D is the skin roughness of the user; and C'_x, C'_y and C'_z are the duration coefficients affecting the three-dimensional virtual object. The value of C'_x may be determined based on the duration of A_x; in practice, a numerical relationship between the duration coefficient and the duration may be established in advance (for example, C'_x may be linearly related to the duration of A_x). In the same way, C'_y may be determined based on the duration of A_y, and C'_z based on the duration of A_z.
In practice, the degree of concavity may be characterized by X, Y and Z, and the three-dimensional virtual object may be subjected to deformation processing based on this degree.
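The deformation formulas above can be transcribed directly; every numeric input below is a made-up sample value chosen only to exercise the computation:

```python
import math

# Direct transcription of X = A_x*cos(a_x)*C'_x/(B*D) and its Y/Z
# counterparts; every numeric input below is a made-up sample value.

def deformation(pressure_comp, angle_rad, duration_coeff, b, d):
    """Deformation distance along one axis."""
    return pressure_comp * math.cos(angle_rad) * duration_coeff / (b * d)

A = (1.2, 0.8, 2.0)                    # A_x, A_y, A_z (pressure components)
ang = (0.0, math.pi / 3, math.pi / 2)  # a_x, a_y, a_z (angles to current)
C = (0.5, 0.5, 0.5)                    # C'_x, C'_y, C'_z (duration coeffs)
B, D = 1.5, 0.4                        # touch area coefficient, skin roughness

X, Y, Z = (deformation(A[i], ang[i], C[i], B, D) for i in range(3))
```

Note that a pressure component perpendicular to the current direction (angle of 90 degrees) contributes no deformation along that axis, as the cosine term dictates.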
As can be seen from fig. 6, compared with the embodiment corresponding to fig. 1, the flow 600 of the interaction method in this embodiment adds the steps of obtaining the material of the three-dimensional virtual object in the touch area and determining the current value to be released by the touch screen based on the current value calculation model corresponding to that material. Therefore, in the scheme described in this embodiment, when different users touch the screen, when a user touches with different body parts, or when the materials of the three-dimensional virtual objects at the touched positions differ, currents of different magnitudes are released so that the user feels different tactile sensations. This further enriches the tactile experience of using the touch screen and improves the immersion and realism of the interaction process.
With further reference to fig. 7, as an implementation of the method shown in fig. 1 described above, the present invention provides an embodiment of an electronic device, which corresponds to the embodiment of the method shown in fig. 1.
As shown in fig. 7, the electronic device 700 of the present embodiment includes: a detecting unit 701, configured to, when it is detected that a user touches the touch screen, obtain skin roughness of the user, and obtain touch behavior data; a first determining unit 702, configured to determine a current value to be released by the touch screen based on the skin roughness and the touch behavior data; and a release unit 703 for controlling the touch screen to release the current according to the current value.
In some optional implementations of this embodiment, a Wheatstone bridge is installed in the touch screen; and the detecting unit 701 is further configured to: determine, using the user's skin as a variable resistance in the Wheatstone bridge, the resistance value of that variable resistance; and determine the skin roughness of the user based on the resistance value and preset correspondence information, wherein the correspondence information indicates the correspondence between resistance values and skin roughness.
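The two-step detection described here (solve the bridge for the skin's resistance, then look up roughness in preset correspondence information) might be sketched as follows; the resistor values and the roughness table are illustrative assumptions:

```python
import bisect

# Sketch: the skin is the unknown arm of a Wheatstone bridge; at
# balance R_skin = R2 * R3 / R1. Roughness then comes from a preset
# resistance-to-roughness table. All values are illustrative.

def skin_resistance(r1: float, r2: float, r3: float) -> float:
    """Unknown-arm resistance of a balanced Wheatstone bridge."""
    return r2 * r3 / r1

# preset correspondence information: (upper resistance bound, roughness)
ROUGHNESS_TABLE = [(50_000.0, "smooth"),
                   (150_000.0, "medium"),
                   (float("inf"), "rough")]

def skin_roughness(resistance: float) -> str:
    """Look up the roughness grade for a measured resistance."""
    bounds = [bound for bound, _ in ROUGHNESS_TABLE]
    return ROUGHNESS_TABLE[bisect.bisect_left(bounds, resistance)][1]
```

The actual correspondence information would be calibrated per device; the banded table stands in for whatever preset mapping is used.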
In some optional implementations of the present embodiment, the first determining unit 702 is further configured to: acquiring the material of a three-dimensional virtual object in the touch area; and under the condition that a current value calculation model corresponding to the material is found, inputting the skin roughness and the touch behavior data into the current value calculation model to obtain a current value to be released by the touch screen.
In some optional implementations of the present embodiment, the first determining unit 702 is further configured to: in a case that a current value calculation model corresponding to the material is not found, establish a current value calculation model corresponding to the material, and input the skin roughness and the touch behavior data into the established model to obtain the current value to be released by the touch screen.
In some optional implementations of this embodiment, the electronic device further includes: the acquisition unit is used for acquiring a three-dimensional image of a target object to obtain a three-dimensional image, wherein the three-dimensional image comprises a three-dimensional virtual object corresponding to the target object; the identification unit is used for identifying the material of the three-dimensional virtual object in the three-dimensional image to obtain an identification result; and the marking unit is used for setting material marks for the pixel points corresponding to the three-dimensional virtual object based on the identification result. And, the first determining unit 702 is further configured to: and determining the material of the three-dimensional virtual object in the touch area based on the material mark of the pixel point in the touch area.
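The mark-then-lookup scheme performed by the marking unit and the first determining unit can be sketched as below; the pixel data layout and the majority-vote rule over the touch area are assumptions for illustration:

```python
from collections import Counter

# Sketch: every pixel of the rendered three-dimensional object carries
# a material mark set by the marking unit; the material in the touch
# area is taken as the majority mark. Layout and tags are hypothetical.

PIXEL_MARKS = {(x, y): ("wood" if x < 2 else "ceramic")
               for x in range(4) for y in range(4)}

def material_in_touch_area(touched_pixels):
    """Most common material mark among the touched pixels, or None."""
    marks = [PIXEL_MARKS[p] for p in touched_pixels if p in PIXEL_MARKS]
    return Counter(marks).most_common(1)[0][0] if marks else None
```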
In some optional implementations of this embodiment, the electronic device further includes: a second determination unit configured to determine a degree of concavity of the three-dimensional virtual object based on the skin roughness, the touch behavior data, and the direction of the current; and a deformation processing unit for performing deformation processing on the three-dimensional virtual object based on the degree of the recess.
In some optional implementations of this embodiment, the releasing unit 703 is further configured to: and controlling the touch screen to release current according to the current value, and converting the current into electrostatic attraction in the process of releasing the current.
In some optional implementations of the present embodiment, a three-dimensional virtual object is presented in the touch screen; and, the above-mentioned electronic equipment also includes: the first acquisition unit is used for acquiring a first sliding distance and a first sliding direction under the condition that the touch screen is detected to be slid by a single finger of a user; a third determining unit configured to determine an angle to be rotated and a direction to be rotated of the three-dimensional virtual object based on the first sliding distance and the first sliding direction; and the rotating unit is used for rotating the three-dimensional virtual object according to the angle to be rotated and the direction to be rotated by taking a preset point in the touch screen as a reference point.
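Mapping a one-finger slide to an angle to be rotated and a direction to be rotated might look like the following sketch; the degrees-per-pixel gain is an invented tuning constant:

```python
import math

# Sketch: a one-finger slide of (dx, dy) pixels is turned into a
# rotation angle (scaled by an invented gain) and a direction given by
# the slide vector's bearing.

DEG_PER_PX = 0.5   # hypothetical sensitivity: degrees per pixel slid

def slide_to_rotation(dx: float, dy: float):
    """Return (angle_to_rotate_deg, direction_deg) for a slide vector."""
    distance = math.hypot(dx, dy)
    direction = math.degrees(math.atan2(dy, dx))
    return DEG_PER_PX * distance, direction
```

The multi-finger translation case described next would follow the same pattern, with the slide vector scaled into a distance and direction to move rather than a rotation.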
In some optional implementations of the present embodiment, a three-dimensional virtual object is presented in the touch screen; and, the above-mentioned electronic equipment also includes: the second acquisition unit is used for acquiring a second sliding distance and a second sliding direction under the condition that the touch screen is detected to be slid by multiple fingers of the user; a fourth determination unit configured to determine a distance to be moved and a direction to be moved of the three-dimensional virtual object based on the second sliding distance and the second sliding direction; and the moving unit is used for moving the three-dimensional virtual object according to the distance to be moved and the direction to be moved by taking a preset point in the touch screen as a reference point.
According to the electronic device provided by the embodiment of the invention, when a user is detected touching the touch screen of the electronic device, the skin roughness of the user and touch behavior data are obtained; the current value to be released by the touch screen is then determined based on the skin roughness and the touch behavior data, and the touch screen is controlled to release current according to that value. Because the skin roughness of the user is considered when determining the current value, different users, or the same user touching with different body parts, cause currents of different magnitudes to be released so that different tactile sensations are felt. This enriches the tactile experience of using the touch screen and improves the immersion and realism of the interaction process.
With further reference to fig. 8, a hardware structure of an electronic device for implementing various embodiments of the present invention is schematically illustrated.
The electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 8 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, some components may be combined, or the components may be arranged differently. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A processor 810, configured to, in a case that it is detected that a user touches the touch screen, obtain skin roughness of the user, and obtain touch behavior data; determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data; and controlling the touch screen to release current according to the current value.
In the embodiment of the invention, when a user is detected touching the touch screen of the electronic device, the skin roughness of the user and touch behavior data are obtained; the current value to be released by the touch screen is then determined based on the skin roughness and the touch behavior data, and the touch screen is controlled to release current according to that value. Because the skin roughness of the user is considered when determining the current value, different users, or the same user touching with different body parts, cause currents of different magnitudes to be released so that different tactile sensations are felt. This enriches the tactile experience of using the touch screen and improves the immersion and realism of the interaction process.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used for receiving and sending signals during a message transceiving or call process; specifically, it receives downlink data from a base station and forwards it to the processor 810 for processing, and sends uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 802, such as to assist the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the electronic apparatus 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used for receiving audio or video signals. The input unit 804 may include a Graphics Processing Unit (GPU) 8041 and a microphone 8042. The graphics processor 8041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or other storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 801.
The electronic device 800 also includes at least one sensor 805, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8081 according to the brightness of ambient light and a proximity sensor that can turn off the display panel 8081 and/or the backlight when the electronic device 800 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 805 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 806 is used to display information input by the user or information provided to the user. The Display unit 806 may include a Display panel 8081, and the Display panel 8081 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 807 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 8071 (e.g., operations by a user on or near the touch panel 8071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 810, receives a command from the processor 810, and executes the command. In addition, the touch panel 8071 can be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 8071, the user input unit 807 can include other input devices 8072. In particular, other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 8071 can be overlaid on the display panel 8081. When the touch panel 8071 detects a touch operation on or near it, the operation is transmitted to the processor 810 to determine the type of the touch event, and the processor 810 then provides a corresponding visual output on the display panel 8081 according to that type. Although in fig. 8 the touch panel 8071 and the display panel 8081 are shown as two separate components to implement the input and output functions of the electronic device, in some embodiments they may be integrated to implement these functions; the implementation is not limited herein.
The interface unit 808 is an interface for connecting an external device to the electronic apparatus 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the electronic device 800 or may be used to transmit data between the electronic device 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required by at least one function (such as a sound playing function, an image playing function, etc.); the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone. Further, the memory 809 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid state storage device.
The processor 810 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby monitoring the whole electronic device. Processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 810.
The electronic device 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and preferably, the power supply 811 may be logically coupled to the processor 810 via a power management system to manage charging, discharging, and power consumption management functions via the power management system.
In addition, the electronic device 800 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 810, a memory 809, and a computer program stored in the memory 809 and capable of running on the processor 810, where the computer program, when executed by the processor 810, implements each process of the above interaction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the interaction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An interaction method is applied to an electronic device with a touch screen, and comprises the following steps:
under the condition that the touch screen is detected to be touched by a user, acquiring the skin roughness of the user and acquiring touch behavior data;
determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data;
and controlling the touch screen to release current according to the current value.
2. The method of claim 1, wherein a Wheatstone bridge is mounted in the touch screen; and
the acquiring the skin roughness of the user comprises:
determining a resistance value of a variable resistance in the Wheatstone bridge using the user's skin as the variable resistance;
and determining the skin roughness of the user based on the resistance value and preset corresponding relation information, wherein the corresponding relation information is used for indicating the corresponding relation between the resistance value and the skin roughness.
3. The method of claim 1, wherein determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data comprises:
acquiring the material of a three-dimensional virtual object in the touch area;
and under the condition that a current value calculation model corresponding to the material is found, inputting the skin roughness and the touch behavior data into the current value calculation model to obtain a current value to be released by the touch screen.
4. The method of claim 3, wherein after the obtaining material of the three-dimensional virtual object in the touch area, the determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data further comprises:
and under the condition that the current value calculation model corresponding to the material is not found, establishing the current value calculation model corresponding to the material, and inputting the skin roughness and the touch behavior data into the established current value calculation model to obtain the current value to be released by the touch screen.
5. The method of claim 3, wherein before the acquiring the skin roughness of the user and acquiring touch behavior data under the condition that the touch screen is detected to be touched by the user, the method further comprises:
acquiring a three-dimensional image of a target object to obtain a three-dimensional image, wherein the three-dimensional image comprises a three-dimensional virtual object corresponding to the target object;
identifying the material of the three-dimensional virtual object in the three-dimensional image to obtain an identification result;
setting a material mark for a pixel point corresponding to the three-dimensional virtual object based on the identification result;
and the acquiring the material of the three-dimensional virtual object in the touch area comprises the following steps:
and determining the material of the three-dimensional virtual object in the touch area based on the material mark of the pixel point in the touch area.
6. The method of claim 3, wherein after the determining the current value to be released by the touch screen, the method further comprises:
determining a degree of concavity of the three-dimensional virtual object based on the skin roughness, the touch behavior data, and the direction of current flow;
and performing deformation processing on the three-dimensional virtual object based on the sinking degree.
7. An electronic device, the electronic device comprising:
the detection unit is used for acquiring the skin roughness of a user and acquiring touch behavior data under the condition that the touch screen is detected to be touched by the user;
the first determining unit is used for determining a current value to be released by the touch screen based on the skin roughness and the touch behavior data;
and the release unit is used for controlling the touch screen to release current according to the current value.
8. The electronic device of claim 7, wherein a Wheatstone bridge is mounted in the touch screen; and
the detection unit is further configured to:
determining a resistance value of a variable resistance in the Wheatstone bridge using the user's skin as the variable resistance;
and determining the skin roughness of the user based on the resistance value and preset corresponding relation information, wherein the corresponding relation information is used for indicating the corresponding relation between the resistance value and the skin roughness.
9. The electronic device of claim 7, wherein the first determining unit is further configured to:
acquiring the material of a three-dimensional virtual object in the touch area;
and under the condition that a current value calculation model corresponding to the material is found, inputting the skin roughness and the touch behavior data into the current value calculation model to obtain a current value to be released by the touch screen.
10. An electronic device, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the interaction method according to any one of claims 1-6.
CN201911319984.9A 2019-12-19 2019-12-19 Interaction method and electronic equipment Active CN111158474B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911319984.9A CN111158474B (en) 2019-12-19 2019-12-19 Interaction method and electronic equipment


Publications (2)

Publication Number Publication Date
CN111158474A CN111158474A (en) 2020-05-15
CN111158474B true CN111158474B (en) 2021-10-22

Family

ID=70557476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911319984.9A Active CN111158474B (en) 2019-12-19 2019-12-19 Interaction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111158474B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102906669A (en) * 2010-05-20 2013-01-30 Nokia Corporation An apparatus for a user interface and associated methods
CN103092420A (en) * 2013-01-25 2013-05-08 BOE Technology Group Co., Ltd. Touch display screen and driving method thereof
CN105556423A (en) * 2013-06-11 2016-05-04 Immersion Corporation Systems and methods for pressure-based haptic effects
CN106940622A (en) * 2011-04-22 2017-07-11 Immersion Corporation Electro-vibration touch-sensitive display
CN108509028A (en) * 2017-02-24 2018-09-07 Immersion Corporation Systems and methods for virtual affective touch

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007066717A1 (en) * 2005-12-08 2007-06-14 The University Of Tokyo Electric tactile display
EP2385562A1 (en) * 2010-05-04 2011-11-09 Koninklijke Philips Electronics N.V. Actuator device with improved tactile characteristics
US9830781B2 (en) * 2014-06-13 2017-11-28 Verily Life Sciences Llc Multipurpose contacts for delivering electro-haptic feedback to a wearer
US10146308B2 (en) * 2014-10-14 2018-12-04 Immersion Corporation Systems and methods for impedance coupling for haptic devices

Also Published As

Publication number Publication date
CN111158474A (en) 2020-05-15

Similar Documents

Publication Publication Date Title
CN109558061B (en) Operation control method and terminal
CN108132752B (en) Text editing method and mobile terminal
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
CN109005336B (en) Image shooting method and terminal equipment
CN110531915B (en) Screen operation method and terminal equipment
EP4016975A1 (en) Object position adjustment method, and electronic apparatus
CN110187822B (en) Terminal and screen display control method applied to terminal
EP4084335A1 (en) Touch key, control method and electronic device
CN111464428B (en) Audio processing method, server, electronic device, and computer-readable storage medium
CN110442261B (en) Electronic equipment and touch operation detection method thereof
CN111522613A (en) Screen capturing method and electronic equipment
CN107943406B (en) touch point determining method of touch screen and terminal
WO2021208769A1 (en) Control method and electronic device
CN108021315B (en) Control method and mobile terminal
CN111443860B (en) Touch control method and electronic equipment
CN111190528B (en) Brush display method, electronic equipment and storage medium
CN111443824A (en) Touch screen control method and electronic equipment
CN111158474B (en) Interaction method and electronic equipment
CN111328132A (en) Method for adjusting transmitting power and electronic equipment
CN110908562A (en) Icon display method and device, electronic equipment and medium
CN111026300A (en) Screen display method and electronic equipment
CN111443859B (en) Touch interaction method and electronic equipment
CN111459281B (en) Haptic feedback method, electronic device, and storage medium
CN109358792B (en) Display object selection method and terminal
CN109542293B (en) Menu interface setting method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant