CN103439030B - Texture force measuring method in haptic feedback - Google Patents

Texture force measuring method in haptic feedback

Info

Publication number
CN103439030B
CN103439030B, CN201310424215.1A
Authority
CN
China
Prior art keywords
force
texture
acceleration
Prior art date
Legal status
Active
Application number
CN201310424215.1A
Other languages
Chinese (zh)
Other versions
CN103439030A (en)
Inventor
吴涓
李明
王路
刘威
宋爱国
Current Assignee
Southeast University
Original Assignee
Southeast University
Priority date
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201310424215.1A priority Critical patent/CN103439030B/en
Publication of CN103439030A publication Critical patent/CN103439030A/en
Application granted granted Critical
Publication of CN103439030B publication Critical patent/CN103439030B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Investigating Strength Of Materials By Application Of Mechanical Stress (AREA)

Abstract

The invention discloses a texture force measuring method for haptic feedback. Taking into account the objective properties of the textured surface as well as the movement speed and active pressing force of the human hand during exploration, the texture force is decomposed into a vertical component and a horizontal component. The vertical component is further divided into an objective-factor force and a subjective-factor force, where the objective-factor force is determined by the measured micro-profile height and surface stiffness of the texture. Because changes in normal acceleration during contact reflect changes in hand speed and pressure, the subjective-factor force is modeled as a function of the normal acceleration, and the normal acceleration used in simulation is obtained by linear interpolation of accelerations measured at different pressures and speeds. The horizontal component of the texture force is the frictional resistance of the probe sweeping across the textured surface, determined by the vertical pressing force and the kinetic friction coefficient of the texture material.

Description

Texture force measuring method in force haptic rendering
Technical Field
The invention belongs to the field of texture haptic rendering and relates to a haptic expression method for textures.
Background
At present, texture haptic rendering technology mainly builds a haptic model of contact with a textured surface from the surface's contour features and reproduces the touch sensation through a haptic rendering device. Texture expression modeling methods based on force-feedback devices fall into three categories: building a force-haptic texture model from geometric constraints and a physical model; extracting contour features of the textured surface from an image to build the model; and measuring the actual contour or vibration information of a real texture and reproducing the texture information through virtual reality technology.
Measuring instruments for texture force expression based on measured models fall into two types. Non-contact instruments, such as a three-dimensional optical surface profiler, obtain the three-dimensional height information of a surface from its optical characteristics; this approach places high demands on the cleanliness and flatness of the surface under test. Contact instruments typically tap or press a probe to scan the textured surface and record the state of the interaction with a sensor. The main problem with the contact method is measurement inaccuracy: adding sensors to the system changes its original connection state and dynamic characteristics, so the measurement deviates from the original situation.
For contact measurement based on scanning the textured surface with a probe, domestic work generally obtains the pressure of the probe on the surface with a force sensor and, after simple processing, uses that pressure directly as the output force of the force-feedback device to express the microscopic contour height of the texture. Foreign research additionally measures the acceleration signal of the probe as it scans the surface and converts it for output, improving the realism of the texture touch sensation.
Current probe-based measurements of textured-surface information applied to force haptic rendering include the following. Hari et al. propose dragging a SensAble PHANTOM transversely across a real textured surface while estimating the displacement disturbance perpendicular to the surface, and building a texture force model on that basis. Jochen Lang et al. propose using a WHaT wireless tactile sensor to measure the force and acceleration of a hand-held probe tapping a texture, and using these measurements to compute the roughness and stiffness of the texture. Okamura et al. propose describing the acceleration profile of a probe sliding over a textured surface with a decaying sinusoidal signal. Guruswamy et al. propose using the acceleration distribution generated by an IIR filter, multiplied by a scaling factor, as the texture force. Xianming Ye et al. propose measuring three-dimensional force signals with two strain gauges and a force sensor, obtaining the friction of the texture from the normal and tangential forces, measuring the dynamic axial contact stress with a piezoelectric film when the probe makes contact, and using the readout for tactile representation of the texture force. Joseph et al. propose applying LPC filtering to the collected acceleration signal, outputting the predicted acceleration signal through a computer sound card, and driving a voice-coil motor on a texture detection device through a current amplifier to realize virtual texture force haptic rendering.
Disclosure of Invention
The technical problem is as follows: the invention provides a texture force measuring method in force haptic rendering that models the texture force from an acceleration signal and accounts for the influence of subjective human action on texture haptic rendering.
The technical scheme is as follows: the texture force measuring method in force haptic rendering comprises the following steps:
In the virtual texture force model, the operating handle of the force haptic rendering device controls a virtual probe that sweeps across the surface of a virtual object. Whether the virtual probe is in contact with the virtual object, and whether relative motion exists, are detected. If the probe is not in contact with the virtual object or there is no relative motion, the virtual texture force $\vec{F}_{con}$ output is 0; otherwise, the virtual texture force $\vec{F}_{con}$ is calculated according to the following formula and output:
$$\vec{F}_{con} = \vec{F}_{ver} + \vec{F}_{hor};$$
where $\vec{F}_{ver}$ is the vertical-direction force of the virtual texture force and $\vec{F}_{hor}$ is the horizontal-direction force, with
$$\vec{F}_{ver} = \vec{F}_{obj} + \vec{F}_{sub}; \qquad \vec{F}_{hor} = \vec{f},$$
where $\vec{F}_{obj}$ is the objective-factor force, $\vec{F}_{sub}$ is the subjective-factor force, and $\vec{f}$ is the horizontal friction force on the virtual probe during relative motion;
The objective-factor force $\vec{F}_{obj}$ is calculated according to the following formula:
$$\vec{F}_{obj} = k_1 \times k_s \times H \times \vec{n};$$
where $k_s$ is a constant describing the stiffness coefficient of the texture material, $H$ is the contour height of the textured surface at the contact point when the virtual probe perceives the texture, $k_1$ is a constant coefficient, and $\vec{n}$ is the vertical unit vector;
The constant $k_s$ is numerically equal to the linear coefficient $A$ relating the root-mean-square vertical acceleration to the pressing force. The root-mean-square vertical acceleration is calculated as
$$a_{rms} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} a_i^2},$$
where $N$ is the number of measured vertical-acceleration samples in a group and $a_i$ is the vertical acceleration measured by the texture detection pen. To obtain the linear coefficient $A$, correlation analysis and regression analysis are used to fit the root-mean-square vertical acceleration $a_{rms}$ as a function of the pressing force $F_p$ and the scanning speed $v$ as independent variables:
$$a_{rms}(F_p, v) = A \times F_p + B \times v,$$
where $A$ is the linear coefficient of $a_{rms}$ with respect to the pressing force $F_p$, $B$ is the linear coefficient of $a_{rms}$ with respect to the scanning speed $v$, $F_p$ is the pressing force during actual measurement, and $v$ is the scanning speed during actual measurement. The constant coefficient $k_1$ is computed from the maximum output force $F_{max}$ of the actual force-feedback device and a constant $c_1$ taking any value in the range 0.6–0.8;
The subjective-factor force $\vec{F}_{sub}$ is calculated according to the following formula:
$$\vec{F}_{sub} = k_a \times a_{ver}(F_p, v) \times \vec{n};$$
where $a_{ver}(F_p, v)$ is the vertical acceleration at an arbitrary pressing force $F_p$ and an arbitrary perceived speed $v$ during perception, obtained by linear predictive coding and bilinear interpolation, and $k_a$ is the constant coefficient that converts the acceleration signal $a_{ver}(F_p, v)$ into a force signal. $k_a$ is computed from the objective-factor force $F_{obj}$, the measured vertical acceleration at the given pressing force and exploration speed, and a constant $c_2$ that determines the weight ratio between the objective-factor force $\vec{F}_{obj}$ and the subjective-factor force $\vec{F}_{sub}$, with $c_2$ taking any value in the range 0.1–0.4; $\vec{n}$ is the vertical unit vector;
The horizontal frictional resistance $\vec{f}$ is calculated according to the following formula:
$$\vec{f} = \mu \times |\vec{F}_{ver}| \times \vec{t},$$
where $\mu$ is the sliding friction coefficient of the virtual probe as it slides on the virtual texture surface, $\vec{F}_{ver}$ is the vertical-direction texture force, and $\vec{t}$ is the horizontal unit vector.
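Putting the three components together, the decomposition can be sketched numerically. The following Python sketch is illustrative only: the function name, the parameter values, and the axis conventions (z vertical, x along the motion) are assumptions, not from the patent.

```python
import numpy as np

def texture_force(in_contact, moving, H, k1, ks, ka, a_ver, mu):
    """Return the virtual texture force vector [x, y, z] for one rendering tick.

    Sketch of the model F_con = F_obj + F_sub + f with hypothetical inputs:
    H profile height, k1/ks/ka constant coefficients, a_ver interpolated
    vertical acceleration, mu sliding friction coefficient.
    """
    n = np.array([0.0, 0.0, 1.0])  # vertical unit vector (assumed z axis)
    t = np.array([1.0, 0.0, 0.0])  # horizontal unit vector along the motion
    if not (in_contact and moving):
        return np.zeros(3)              # no contact or no relative motion: output 0
    F_obj = k1 * ks * H * n             # objective-factor force
    F_sub = ka * a_ver * n              # subjective-factor force
    F_ver = F_obj + F_sub               # vertical component
    f = mu * np.linalg.norm(F_ver) * t  # horizontal friction resistance
    return F_ver + f
```

With, say, `H=0.001`, `k1=100`, `ks=2`, `ka=0.5`, `a_ver=0.2`, `mu=0.3`, the vertical component is 0.3 N and the friction 0.09 N.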
Advantageous effects: compared with the prior art, the invention has the following advantages:
1. The objective-factor force is calculated using the measured acceleration signal, according to $\vec{F}_{obj} = k_1 \times k_s \times H \times \vec{n}$. This formula embodies the important role of the texture's objective characteristics in computing the virtual texture force: the parameter $H$ captures the microscopic contour information of the textured surface, and the parameter $k_s$ captures the stiffness of the texture material.
Domestic techniques generally obtain the objective-factor force of the virtual texture force from the scanned microscopic contour height $H$ alone, i.e. $\vec{F}_{obj} = k_1 \times H \times \vec{n}$. On this basis, the invention takes into account the stiffness of the surface material as a person touches the texture, so that the force fed back by the haptic device better restores the real texture: the parameter $k_s$, the stiffness coefficient of the texture's surface material, is added to the original formula. Considering the influence of the hardness of the texture material on human tactile perception of texture increases the realism of texture force haptic perception.
2. A virtual texture force haptic modeling algorithm is provided that considers the influence of subjective factors on texture perception. The subjective-factor force is calculated as $\vec{F}_{sub} = k_a \times a_{ver}(F_p, v) \times \vec{n}$. This formula reflects the influence of a person's subjective perception habits on texture perception: the parameter $a_{ver}(F_p, v)$, the vertical acceleration at an arbitrary pressing force and scanning speed computed from actual measurements, reflects the influence of different perception habits and perception speeds, while $k_a$ embodies the weight relation between the subjective-factor force and the objective-factor force, so that the virtual texture force perception better matches real human experience.
At present there is no domestic research on the subjective factors of the human in the perception process. Foreign researchers have studied and analyzed subjective factors, but there is no specific algorithm that uses the vertical-direction force to study the tactile representation of subjective factors in the texture force.
Drawings
FIG. 1 is a logic flow diagram of the method of the present invention;
FIG. 2 is an experimental schematic of the process of the present invention;
FIG. 3 is a diagram illustrating bilinear interpolation binary search employed in the method of the present invention.
Detailed Description
The technical solution of the present invention will be described in detail below with reference to the embodiments and the accompanying drawings.
Referring to fig. 1, the texture force measuring method in force haptic rendering of the invention works as follows: the hand controller of the force haptic rendering device is manipulated to control the virtual probe to approach the virtual texture surface in the virtual texture force model. Before the virtual probe collides with the texture surface, the output force of the haptic device is 0. When the virtual probe collides with the texture surface, the vertical force $\vec{F}_{ver}$ and the horizontal force $\vec{F}_{hor}$ of the virtual force are calculated respectively, and finally the virtual texture force $\vec{F}_{con}$ is output through the hand controller.
Referring to fig. 2, the experimental schematic of the invention: when the virtual probe contacts the texture surface, a contact force $\vec{F}_{con}$ is generated. The contact force is decomposed into a vertical force $\vec{F}_{ver}$ and a horizontal force $\vec{F}_{hor}$ for convenient analysis and calculation.
Taking 80-mesh sandpaper as an example, the specific implementation process of the invention is illustrated as follows:
(1) Measure the texture surface profile with a KS-1100 three-dimensional optical surface profiler to obtain the microscopic profile height H of the textured surface;
(2) The texture detection pen is swept across the surface of the 80-mesh sandpaper. Pressing forces $F_p$ of 0.2 N, 0.5 N, 0.8 N, 1.1 N and 1.4 N and scanning speeds $v$ of 50 mm/s, 100 mm/s, 150 mm/s and 200 mm/s are selected, and the acceleration sensor in the detection pen measures the corresponding vertical accelerations $a_i$. The root-mean-square vertical acceleration corresponding to each pair of pressing force $F_p$ and scanning speed $v$ is then
$$a_{rms} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} a_i^2},$$
where $N$ is the number of vertical-acceleration samples in each group. Correlation and regression analysis give the root-mean-square vertical acceleration $a_{rms}(F_p, v)$ as a function of the pressing force $F_p$ and the scanning speed $v$:
$$a_{rms}(F_p, v) = A \times F_p + B \times v,$$
where $A$ and $B$ are constant coefficients. The correlation analysis and regression analysis are specifically as follows:
the correlation analysis is a common statistical method for researching the closeness degree of the relationship between different variables, a statistic which can describe the linear correlation degree between the variables and is calculated according to the number of samples is called a sample correlation coefficient and is usually expressed by a Pearson simple correlation coefficient r, and the calculation formula of r is as follows:wherein xi、yiFor the two variable sequences of the correlation analysis,are respectively xi、yiN is an array xi、yiThe number of the cells. Respectively for the measured mean square difference value a of the vertical accelerationrmsAbout the pressing force value FpCarrying out correlation analysis on the scanning speed v, and calculating a correlation coefficient r, wherein weak correlation is realized when the absolute value of r is less than 0.3; when the absolute r is more than or equal to 0.3 and less than 0.5, the correlation is low; when the absolute value of r is more than or equal to 0.5 and less than 0.8, the correlation is moderate; when the absolute value of r is less than 1 and is more than or equal to 0.8, the correlation is high.
Having confirmed that the vertical-direction $a_{rms}$ is linearly correlated with the pressing force $F_p$ and the scanning speed $v$, multivariate linear regression analysis determines the mathematical relationship between the correlated quantities; the mathematical formula describing this relationship is called a multiple linear regression equation, and its regression coefficients are estimated by the least-squares method. Let the multiple linear regression equation be $a_{rms} = \beta_0 \times F_p + \beta_1 \times v$, with sample regression equation
$$\hat{a}_{rms} = \hat{\beta}_0 F_p + \hat{\beta}_1 v,$$
where $\hat{\beta}_0$, $\hat{\beta}_1$ are the estimates of $\beta_0$, $\beta_1$. The residual sum of squares is then
$$SSE = \sum \left(a_{rms} - \hat{a}_{rms}\right)^2.$$
By the principle of finding a minimum in calculus, $SSE$ has a minimum, and for $SSE$ to attain it, its partial derivatives with respect to $\beta_0$ and $\beta_1$ must equal zero, i.e.:
$$\begin{cases} \dfrac{\partial SSE}{\partial \beta_0} = -2\sum \left(a_{rms} - \hat{a}_{rms}\right) F_p = 0 \\[2ex] \dfrac{\partial SSE}{\partial \beta_1} = -2\sum \left(a_{rms} - \hat{a}_{rms}\right) v = 0 \end{cases}$$
Solving these equations gives the estimates $\hat{\beta}_0$, $\hat{\beta}_1$ of $\beta_0$, $\beta_1$, i.e. the constant coefficients $A$ and $B$;
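Setting the two partial derivatives to zero is exactly the normal-equation solution of a no-intercept least-squares problem, which `numpy.linalg.lstsq` computes directly. A sketch with synthetic stand-in data (`A_true`, `B_true`, and the noise-free responses are hypothetical, not measured values from the patent):

```python
import numpy as np

# Grid of pressing forces (N) and scanning speeds (mm/s) as in the experiment.
Fp = np.tile([0.2, 0.5, 0.8, 1.1, 1.4], 4)
v = np.repeat([50.0, 100.0, 150.0, 200.0], 5)

# Hypothetical true coefficients; a_rms is a synthetic stand-in for measurements.
A_true, B_true = 1.5, 0.02
a_rms = A_true * Fp + B_true * v

# Least-squares fit of a_rms = beta0*Fp + beta1*v (no intercept).
X = np.column_stack([Fp, v])
(A_hat, B_hat), *_ = np.linalg.lstsq(X, a_rms, rcond=None)
```

On noise-free data the recovered `A_hat`, `B_hat` equal the true coefficients; with real measurements they are the least-squares estimates $\hat{\beta}_0$, $\hat{\beta}_1$.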
(3) The objective-factor force in the vertical direction of the virtual texture force is modeled as:
$$\vec{F}_{obj} = k_1 \times k_s \times H \times \vec{n};$$
where $k_s$ is a constant describing the stiffness coefficient of the texture material, $H$ is the contour height of the textured surface at the contact point when the virtual probe perceives the texture, $k_1$ is a constant coefficient, and $\vec{n}$ is the vertical unit vector.
The constant $k_s$ is numerically equal to the linear coefficient relating the root-mean-square vertical acceleration to the pressing force, i.e. the constant coefficient $A$ in $a_{rms}(F_p, v) = A \times F_p + B \times v$. The constant coefficient $k_1$ is computed from the maximum output force $F_{max}$ of the actual force-feedback device and a constant $c_1$ taking any value in the range 0.6–0.8;
(4) The subjective-factor force in the vertical direction of the virtual texture force is modeled as:
$$\vec{F}_{sub} = k_a \times a_{ver}(F_p, v) \times \vec{n};$$
where $a_{ver}(F_p, v)$ is the vertical acceleration at an arbitrary pressing force $F_p$ and an arbitrary perceived speed $v$ during perception, and $k_a$ is the constant coefficient converting the acceleration signal $a_{ver}(F_p, v)$ into a force signal. $k_a$ is computed from the objective-factor force $F_{obj}$, the measured vertical acceleration at the given pressing force and exploration speed, and a constant $c_2$ that determines the weight ratio between the objective-factor force $\vec{F}_{obj}$ and the subjective-factor force $\vec{F}_{sub}$, with $c_2$ taking any value in the range 0.1–0.4; $\vec{n}$ is the vertical unit vector;
The vertical acceleration $a_{ver}(F_p, v)$ at an arbitrary pressing force $F_p$ and an arbitrary perceived speed $v$ is computed by linear predictive coding, as follows:
For the measured vertical-acceleration sequence $a(k)$, $k = 1, 2, \ldots, N$, where $N$ is the number of acceleration samples in one group at a specific pressing force and scanning speed, the predicted vertical-direction acceleration signal is
$$\hat{a}(k) = \sum_{l=1}^{p} h_l\, a(k-l),$$
where $p$ is the order of the linear prediction filter (the invention takes $p = 10$) and $h_l$ are the linear prediction filter coefficients, solved by the Levinson-Durbin algorithm. The prediction residual of the linear prediction filter is then $e(k) = a(k) - \hat{a}(k)$. Since the prediction-error sequence $\{e(1), e(2), \ldots, e(N)\}$ is a stationary random signal that can be synthesized from white noise, a white-noise sequence whose power spectral density $\sigma^2$ equals that of the prediction-error sequence is established. For each applied pressing force $F_i$ ($i \in \{1, \ldots, 5\}$) and scanning speed $v_j$ ($j \in \{1, \ldots, 4\}$), the corresponding parameters $(\vec{h}_{ij}, \sigma^2_{ij})$ are obtained, and a binary lookup table with pressing force $F_p$ and scanning speed $v$ as coordinates is built, as shown in FIG. 3. Each experimentally measured pressure value $F_i$ and fixed scanning-speed value $v_j$ defines a point $Q_{ij} = (F_i, v_j)$, whose value is the linear prediction parameter set of the vertical-direction acceleration signal conditioned on abscissa $F_i$ and ordinate $v_j$: the linear prediction filter coefficients $\vec{h}_{ij}$ and the white-noise power spectral density $\sigma^2_{ij}$. Define the function $g(Q_{ij}) = (\vec{h}_{ij}, \sigma^2_{ij})$. According to bilinear interpolation theory, when the pressing force and scanning speed during texture perception do not exceed the boundary of the binary lookup table, i.e. $F_1 \le F \le F_5$ and $v_1 \le v \le v_4$, the value $g(P) = (\vec{h}_{new}, \sigma^2_{new})$ at the query point $P = (F, v)$ is obtained by weighted averaging over the four neighboring grid points:
$$\begin{cases} g(R_1) = \dfrac{F_{i+1} - F}{F_{i+1} - F_i}\, g(Q_{ij}) + \dfrac{F - F_i}{F_{i+1} - F_i}\, g(Q_{(i+1)j}) \\[2ex] g(R_2) = \dfrac{F_{i+1} - F}{F_{i+1} - F_i}\, g(Q_{i(j+1)}) + \dfrac{F - F_i}{F_{i+1} - F_i}\, g(Q_{(i+1)(j+1)}) \\[2ex] g(P) = (\vec{h}_{new}, \sigma^2_{new}) = \dfrac{v_{j+1} - v}{v_{j+1} - v_j}\, g(R_1) + \dfrac{v - v_j}{v_{j+1} - v_j}\, g(R_2) \end{cases}, \quad i \in \{1, \ldots, 4\},\; j \in \{1, \ldots, 3\}$$
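The in-range bilinear weighting can be sketched as follows. For brevity the table stores one scalar per grid point rather than the $(\vec{h}, \sigma^2)$ pair; the weighting is identical because the interpolation acts componentwise. Function and variable names are mine, not the patent's.

```python
import numpy as np

def bilerp(F_grid, v_grid, table, F, v):
    """Bilinearly interpolate `table` (NumPy array indexed [i, j] over the
    pressing-force grid F_grid and speed grid v_grid) at query point (F, v),
    assumed to lie inside the grid bounds."""
    i = int(np.clip(np.searchsorted(F_grid, F) - 1, 0, len(F_grid) - 2))
    j = int(np.clip(np.searchsorted(v_grid, v) - 1, 0, len(v_grid) - 2))
    tF = (F - F_grid[i]) / (F_grid[i + 1] - F_grid[i])
    tv = (v - v_grid[j]) / (v_grid[j + 1] - v_grid[j])
    r1 = (1 - tF) * table[i, j] + tF * table[i + 1, j]          # g(R1): along F at v_j
    r2 = (1 - tF) * table[i, j + 1] + tF * table[i + 1, j + 1]  # g(R2): along F at v_{j+1}
    return (1 - tv) * r1 + tv * r2                              # g(P): along v
```

Because bilinear interpolation is exact for functions linear in each coordinate, a table built from a linear function is reproduced exactly at interior query points.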
When the pressing force or the scanning speed during texture perception exceeds the boundary of the binary lookup table, the boundary pressure or boundary speed is substituted for that coordinate of the query point to solve the current perception state $g(P)$, according to the following formula:
$$\begin{cases} (\vec{h}_{new}, \sigma^2_{new}) = g(P) = \dfrac{v_{j+1} - v}{v_{j+1} - v_j}\, g(Q_{1j}) + \dfrac{v - v_j}{v_{j+1} - v_j}\, g(Q_{1(j+1)}) & F < F_1 \\[2ex] (\vec{h}_{new}, \sigma^2_{new}) = g(P) = \dfrac{v_{j+1} - v}{v_{j+1} - v_j}\, g(Q_{5j}) + \dfrac{v - v_j}{v_{j+1} - v_j}\, g(Q_{5(j+1)}) & F > F_5 \\[2ex] (\vec{h}_{new}, \sigma^2_{new}) = g(P) = \dfrac{F_{i+1} - F}{F_{i+1} - F_i}\, g(Q_{i1}) + \dfrac{F - F_i}{F_{i+1} - F_i}\, g(Q_{(i+1)1}) & v < v_1 \\[2ex] (\vec{h}_{new}, \sigma^2_{new}) = g(P) = \dfrac{F_{i+1} - F}{F_{i+1} - F_i}\, g(Q_{i4}) + \dfrac{F - F_i}{F_{i+1} - F_i}\, g(Q_{(i+1)4}) & v > v_4 \end{cases}$$
Then the relevant parameters of the point, the linear-prediction coefficient vector \(\vec{h}_{new}\) and the variance \(\sigma^2_{new}\), are obtained. Using the power spectral density \(\sigma^2_{new}\), a white-noise sequence \(\{w(k)\}\) is generated. Finally, the linear predictive filter coefficients \(\vec{h}\) and the white-noise sequence \(\{w(k)\}\) are used to realize signal synthesis, the synthesis formula being:

$$ a_g(k) = w(k) + \sum_{l=1}^{p} t_l\, a_g(k-l) = w(k) + \vec{h}^{\,T}\vec{a}_g(k-1), $$

where \(a_g(k)\) is the acceleration value at any desired pressing force and scanning speed.
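The synthesis step above can be sketched as a short autoregressive filter driven by white noise. This is a minimal illustration, not the patent's implementation; the function name and argument layout are hypothetical, and the coefficient vector and variance are assumed to come from the bilinear interpolation step described above.

```python
import numpy as np

def synthesize_acceleration(h, sigma2, n, seed=0):
    """Synthesize an acceleration sequence by driving a p-th order
    linear-prediction filter with white noise of variance sigma2:
        a_g(k) = w(k) + sum_{l=1..p} h[l-1] * a_g(k - l)
    h      -- linear-prediction coefficients (interpolated h_new, assumed)
    sigma2 -- white-noise variance (interpolated sigma^2_new, assumed)
    n      -- number of samples to generate
    """
    rng = np.random.default_rng(seed)
    h = np.asarray(h, dtype=float)
    p = len(h)
    w = rng.normal(0.0, np.sqrt(sigma2), n)  # white-noise excitation w(k)
    a = np.zeros(n + p)                      # p leading zeros: initial conditions
    for k in range(n):
        past = a[k:k + p][::-1]              # [a_g(k-1), ..., a_g(k-p)]
        a[k + p] = w[k] + h @ past
    return a[p:]
```

For a first-order filter with coefficient 0.5 and unit-variance noise, the output is a standard AR(1) process; its sample variance converges to roughly 4/3 for long sequences.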

Claims (1)

1. A method of texture force measurement in force haptic rendering, the method comprising the steps of:
in the virtual texture force model, a virtual probe controlled through the operation handle of the force/haptic rendering device strokes the surface of a virtual object; whether the virtual probe is in contact with the virtual object and whether relative motion exists are detected; if the virtual probe is not in contact with the virtual object or there is no relative motion, the virtual texture force \(\vec{F}_{tex}\) output is 0; otherwise, the virtual texture force \(\vec{F}_{tex}\) is calculated according to the following formula and output:

$$ \vec{F}_{tex} = \vec{F}_{ver} + \vec{F}_{hor} $$

wherein \(\vec{F}_{ver}\) is the vertical-direction acting force of the virtual texture force and \(\vec{F}_{hor}\) is the horizontal-direction acting force, with \(\vec{F}_{ver} = \vec{F}_{obj} + \vec{F}_{sub}\), where \(\vec{F}_{obj}\) is the objective-factor acting force, \(\vec{F}_{sub}\) is the subjective-factor acting force, and \(\vec{F}_{hor}\) is the horizontal friction force on the virtual probe during relative motion;
the objective-factor acting force \(\vec{F}_{obj}\) is calculated according to the following formula:

$$ \vec{F}_{obj} = k_s\, k_1\, H\, \vec{j} $$

wherein \(k_s\) is a constant coefficient describing the stiffness of the texture material, \(H\) is the height of the texture-surface contour at the point where the virtual probe contacts the texture surface when perceiving the texture, \(k_1\) is a constant coefficient, and \(\vec{j}\) is the vertical-direction unit vector;
the constant \(k_s\) is numerically equal to the linear coefficient \(A\) relating the vertical-direction acceleration mean-square-deviation value to the pressing force; the vertical-direction acceleration mean-square-deviation value is computed from the measured data, where \(N\) is the number of sets of measured vertical accelerations and \(a_i\) is the vertical-direction acceleration data actually measured with the texture detection pen; to obtain the linear coefficient \(A\), correlation analysis and regression analysis are used with the pressing force \(F_p\) and scanning speed \(v\) as independent variables and the vertical-direction acceleration mean-square-deviation value \(a_{rms}\) as the dependent variable:

$$ a_{rms}(F_p, v) = A \times F_p + B \times v, $$

where \(A\) is the linear coefficient of \(a_{rms}\) with the pressing force \(F_p\), \(B\) is the linear coefficient of \(a_{rms}\) with the scanning speed \(v\), \(F_p\) is the pressing force during actual measurement, and \(v\) is the scanning speed during actual measurement; the constant coefficient \(k_1\) is calculated from a formula in which \(F_{max}\) is the maximum output force of the actual force-feedback device and the constant \(c_1\) takes any value in the range 0.6 to 0.8;
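The regression step above amounts to a two-parameter least-squares fit with no intercept. The sketch below is illustrative only; the function name is hypothetical, and the patent specifies correlation and regression analysis without fixing a particular solver.

```python
import numpy as np

def fit_rms_coefficients(F_p, v, a_rms):
    """Least-squares fit of a_rms = A * F_p + B * v (no intercept term),
    returning the linear coefficients (A, B).
    F_p   -- measured pressing forces
    v     -- measured scanning speeds
    a_rms -- measured vertical-acceleration mean-square-deviation values
    """
    X = np.column_stack([F_p, v])            # design matrix [F_p | v]
    coef, *_ = np.linalg.lstsq(X, a_rms, rcond=None)
    return float(coef[0]), float(coef[1])    # (A, B)
```

On synthetic data generated with A = 0.5 and B = 2.0, the fit recovers both coefficients; in the model, the recovered \(A\) would then be used directly as \(k_s\).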
the subjective-factor acting force \(\vec{F}_{sub}\) is calculated according to the following formula:

$$ \vec{F}_{sub} = k_a\, a_{ver}(F_p, v)\, \vec{j} $$

wherein \(a_{ver}(F_p, v)\) is the vertical-direction acceleration at any pressing force \(F_p\) and any exploring speed \(v\) during perception, obtained using linear predictive coding and the bilinear interpolation method, and \(k_a\) is the constant coefficient that converts the acceleration signal \(a_{ver}(F_p, v)\) into the subjective-factor acting force; in the calculation formula of \(k_a\), the constant \(c_2\), which determines the weight ratio between the objective-factor acting force and the subjective-factor acting force, takes any value in the range 0.1 to 0.4, and \(F_{obj}\) is the objective-factor acting force corresponding to the measured vertical acceleration under any pressing force and exploring speed; \(\vec{j}\) is the vertical-direction unit vector;
the horizontal-direction frictional resistance \(\vec{F}_{fri}\) is calculated according to the following formula:

$$ \vec{F}_{fri} = \mu\, F_{ver}\, \vec{i} $$

where \(\mu\) is the sliding friction coefficient of the virtual probe as it slides on the virtual texture surface, \(F_{ver}\) is the vertical-direction texture force, and \(\vec{i}\) is the horizontal-direction unit vector.
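The claim's force decomposition can be summarized in a few lines. This is a simplified sketch, not the patent's exact formulas: the multiplicative forms of the two vertical components are assumptions consistent with the constants described in the claim (stiffness coefficient times contour height, and acceleration-to-force conversion), and the function name is hypothetical.

```python
def texture_force(k_s, k1, H, k_a, a_ver, mu):
    """Assemble the virtual texture force from the quantities named in the
    claim: vertical force = objective-factor force + subjective-factor
    force; horizontal force = sliding friction proportional to it.
    The product forms below are assumed, not quoted from the patent."""
    F_obj = k_s * k1 * H     # objective-factor force from the surface profile
    F_sub = k_a * a_ver      # subjective-factor force from vertical acceleration
    F_ver = F_obj + F_sub    # vertical-direction texture force magnitude
    F_fri = mu * F_ver       # horizontal sliding-friction resistance
    return F_ver, F_fri
```

With, say, \(k_s k_1 H = 1.0\), \(k_a a_{ver} = 0.3\), and \(\mu = 0.3\), the sketch yields a vertical force of 1.3 and a friction force of 0.39, matching the claim's proportionality between the two directions.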
CN201310424215.1A 2013-09-17 2013-09-17 Texture force measuring method in a kind of haptic feedback Active CN103439030B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310424215.1A CN103439030B (en) 2013-09-17 2013-09-17 Texture force measuring method in a kind of haptic feedback


Publications (2)

Publication Number Publication Date
CN103439030A CN103439030A (en) 2013-12-11
CN103439030B true CN103439030B (en) 2015-10-07

Family

ID=49692726

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310424215.1A Active CN103439030B (en) 2013-09-17 2013-09-17 Texture force measuring method in a kind of haptic feedback

Country Status (1)

Country Link
CN (1) CN103439030B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105159459B (en) * 2015-09-06 2018-09-14 东南大学 A kind of dummy object 3D shape tactile sense reproduction method can be used for mobile terminal
CN107091704B (en) * 2016-02-17 2020-10-09 北京小米移动软件有限公司 Pressure detection method and device
JP2017182495A (en) * 2016-03-30 2017-10-05 ソニー株式会社 Information processing device, information processing method and program
CN108845512B (en) * 2018-06-26 2020-01-07 厦门大学 Large-texture touch reappearance force compensation system and method
CN111796709B (en) * 2020-06-02 2023-05-26 南京信息工程大学 Method for reproducing image texture features on touch screen

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3147720A1 (en) * 1981-12-02 1983-06-09 Dietmar Prof. Dr.-Ing. 5100 Aachen Boenisch Method and devices for measuring the compaction pressure of a granular material
CN101615072A (en) * 2009-06-18 2009-12-30 东南大学 Based on method for reproducing texture force touch from the shading shape technology
CN102054122A (en) * 2010-10-27 2011-05-11 东南大学 Haptic texture rendering method based on practical measurement

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05273056A (en) * 1992-03-27 1993-10-22 Rigaku Corp X-ray stress measuring method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Experimental Study on the Influence of Noise on the Realism of Haptic Texture Rendering; Cheng Yingying et al.; Proceedings of the 2009 Chinese Conference on Intelligent Automation; 2009-12-31; pp. 1737-1742 *

Also Published As

Publication number Publication date
CN103439030A (en) 2013-12-11

Similar Documents

Publication Publication Date Title
CN103439030B (en) Texture force measuring method in a kind of haptic feedback
Song et al. A novel texture sensor for fabric texture measurement and classification
Culbertson et al. Generating haptic texture models from unconstrained tool-surface interactions
CN102308195B (en) Device for predicting deformation behavior of rubber material and method for predicting deformation behavior of rubber material
CN102054122B (en) Haptic texture rendering method based on practical measurement
Chelidze et al. A dynamical systems approach to damage evolution tracking, part 1: description and experimental application
TW201329815A (en) Force sensitive interface device and methods of using same
Guruswamy et al. IIR filter models of haptic vibration textures
CN111079333B (en) Deep learning sensing method of flexible touch sensor
Lang et al. Measurement-based modeling of contact forces and textures for haptic rendering
Abdulali et al. Data-driven modeling of anisotropic haptic textures: Data segmentation and interpolation
Elango et al. Analysis on the fundamental deformation effect of a robot soft finger and its contact width during power grasping
Lee et al. Application of elastic-plastic static friction models to rough surfaces with asymmetric asperity distribution
Shao et al. Dynamic sampling design for characterizing spatiotemporal processes in manufacturing
JP2007017243A (en) Tactile sensor
CN106871829A (en) The supersonic detection device and method of a kind of roller bearing contact zone lubrication film thickness
Kawasegi et al. Physical properties and tactile sensory perception of microtextured molded plastics
Wall et al. Modelling of surface identifying characteristics using fourier series
You et al. Adaptive detection of tool-workpiece contact for nanoscale tool setting based on multi-scale decomposition of force signal
JP2019032239A (en) Fingertip contact state measurement device
CN103675200A (en) Method for inspecting surface texture
Sah et al. Process monitoring in stamping operations through tooling integrated sensing
CN107710127A (en) Method and apparatus for detecting pressure
He et al. A convolutional neural network-based recognition method of gear performance degradation mode
CN114722677A (en) Method for calculating natural frequency of bolted structure under normal and tangential load coupling effect

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant