WO2022193153A1 - Control method and apparatus based on somatosensory remote controller, and storage medium - Google Patents

Info

Publication number
WO2022193153A1
Authority
WO
WIPO (PCT)
Prior art keywords
attitude
angle
axis
information
remote controller
Prior art date
Application number
PCT/CN2021/081174
Other languages
English (en)
French (fr)
Inventor
段武阳
商志猛
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2021/081174 (WO2022193153A1)
Priority to CN202180087702.6A (CN116710870A)
Publication of WO2022193153A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions

Definitions

  • the present application relates to the technical field of somatosensory remote control, and in particular, to a control method, device and storage medium based on somatosensory remote control.
  • the absolute attitude of the somatosensory remote controller is obtained, and the absolute attitude of the somatosensory remote controller is mapped to the control command of the unmanned aerial vehicle.
  • the three-axis attitude of the somatosensory remote controller, namely the yaw angle (yaw), pitch angle (pitch) and roll angle (roll), can be obtained through a typical rotation decomposition such as ZYX Euler angle rotation; the attitude is mapped within a certain angle range to obtain the corresponding stick amount, and the stick amount is then converted into the corresponding unmanned aerial vehicle control command, so as to realize control of the unmanned aerial vehicle.
  • this mapping method is prone to serious coupling among roll, pitch and yaw, which makes the coupling of the unmanned aerial vehicle control commands serious, causes large control errors, produces movement the user did not intend, and results in a poor user experience.
  • the present application provides a control method, device and storage medium based on a somatosensory remote controller, which can reduce the coupling phenomenon of the control commands of the unmanned aerial vehicle and reduce the unintended motion of the user.
  • the present application provides a control method based on a somatosensory remote controller, comprising:
  • the control instruction is sent to the unmanned aerial vehicle, and the control instruction is used to instruct the unmanned aerial vehicle to perform a corresponding operation.
  • the present application provides a control device based on a somatosensory remote controller, the device comprising: a memory and a processor;
  • the memory is used to store instructions;
  • the processor invokes the instructions stored in the memory to implement the following operations:
  • the control instruction is sent to the unmanned aerial vehicle, where the control instruction is used to instruct the unmanned aerial vehicle to perform a corresponding operation.
  • the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the above control method based on a somatosensory remote controller.
  • the embodiments of the present application provide a control method, device and storage medium based on a somatosensory remote controller: acquire first attitude information of a target attitude of the somatosensory remote controller; determine, according to attitude change information between second attitude information of a reference attitude of the somatosensory remote controller and the first attitude information, the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information; generate control instructions according to the roll angle, pitch angle and yaw angle; and send the control instructions to the unmanned aerial vehicle, where the control instructions instruct the unmanned aerial vehicle to perform a corresponding operation.
  • in the prior art, the absolute attitude of the somatosensory remote controller is directly mapped to the control command of the unmanned aerial vehicle. In the present application, the roll angle, pitch angle or yaw angle is instead determined according to the attitude change information between the reference attitude and the target attitude of the somatosensory remote controller. A target attitude affected by uncertainty in the user's manipulation intention can therefore express that intention more clearly through the attitude change relative to the reference attitude (which is also an absolute attitude), largely offsetting the serious roll, pitch and yaw coupling caused by uncertain manipulation intention, reducing the coupling of the unmanned aerial vehicle control commands, reducing the control error of the unmanned aerial vehicle, reducing unintended UAV movement, and improving the user experience.
  • FIG. 1 is a schematic flowchart of an embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 2 is a schematic flowchart of another embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 3 is a schematic diagram of the rotation decomposition for determining the roll angle in an embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 4 is a schematic flowchart of another embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 5 is a schematic diagram of a rotation decomposition using tilt-torsion in an embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 6 is a schematic flowchart of another embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 7 is a schematic flowchart of another embodiment of a control method based on a somatosensory remote controller of the present application;
  • FIG. 8 is a schematic structural diagram of an embodiment of a control device based on a somatosensory remote controller of the present application.
  • FIG. 1 is a schematic flowchart of an embodiment of a control method based on a somatosensory remote controller of the present application.
  • the method includes: S101 , S102 , S103 and S104 .
  • S101 Acquire first attitude information of the target attitude of the somatosensory remote controller.
  • S102 Determine, according to attitude change information between second attitude information of a reference attitude of the somatosensory remote controller and the first attitude information, the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information.
  • S103 Generate a control command according to the roll angle, pitch angle, and yaw angle.
  • S104 Send the control instruction to the unmanned aerial vehicle, where the control instruction is used to instruct the unmanned aerial vehicle to perform a corresponding operation.
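As a minimal sketch of how S101 to S104 could fit together, the attitude change and the angle-to-stick mapping can be written as follows. This is an illustration only: it assumes attitudes are represented as 3x3 rotation matrices and assumes a hypothetical 30-degree full-stick angle range, neither of which is specified by the application.

```python
import math

def relative_rotation(r_ref, r_target):
    """Attitude change from reference to target: R_rel = R_ref^T @ R_target."""
    return [[sum(r_ref[k][i] * r_target[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

def angle_to_stick(angle, max_angle=math.radians(30.0)):
    """Map an angle in radians onto a normalized stick amount in [-1, 1]."""
    return max(-1.0, min(1.0, angle / max_angle))
```

The roll, pitch and yaw intermediate angles extracted from `R_rel` would each be mapped to a stick amount this way, and the stick amounts then converted into the corresponding UAV control commands.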
  • the somatosensory remote controller can use somatosensory technology to obtain the user's intention, or predict the user's needs, through the user's body movements (such as gesture operations), so as to realize interaction between the user and the UAV in a more natural way.
  • the target attitude of the somatosensory remote controller may be the absolute attitude of the somatosensory remote controller that carries the user's intention.
  • the first attitude information may be the specific attitude information of the target attitude of the somatosensory remote controller.
  • the representation methods of attitude information include rotation vector, rotation matrix, quaternion, Euler angle and other rotation representation methods.
  • the specific posture information of the target posture may be the specific posture information represented by the above-mentioned various representation methods.
  • the reference attitude of the somatosensory remote controller may be the absolute attitude of the somatosensory remote controller that serves as the comparison reference for the target attitude.
  • the second attitude information may be the specific attitude information of the reference attitude of the somatosensory remote controller.
  • the specific attitude information of the reference attitude may be the specific attitude information represented by the above various representation methods.
  • the reference attitude of the somatosensory remote controller can be a preset attitude, the attitude of the somatosensory remote controller in a certain state, the attitude at a certain moment before the target attitude, the default initial attitude of the somatosensory remote controller, and so on.
  • the attitude difference between the target attitude and the reference attitude can offset most of the vagueness in users' intentions and can show the user's intention more clearly. Therefore, through the attitude change information between the second attitude information and the first attitude information, the roll angle, pitch angle or yaw angle of the somatosensory remote controller can be determined with a more obvious and clear user intention; control instructions with a clearer user intention are generated according to these angles and sent to the unmanned aerial vehicle.
  • the control instructions can instruct the UAV to perform the operation clearly corresponding to the user's intention, thereby largely offsetting the serious roll, pitch and yaw coupling caused by uncertainty in the user's manipulation intention, reducing the coupling of UAV control commands, reducing the control error of the UAV, reducing unintended UAV movement, and improving the user experience.
  • this embodiment of the present application acquires the first attitude information of the target attitude of the somatosensory remote controller; determines, according to the attitude change information between the second attitude information of the reference attitude of the somatosensory remote controller and the first attitude information, the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information; generates control instructions according to the roll angle, pitch angle and yaw angle; and sends the control instructions to the unmanned aerial vehicle to instruct it to perform the corresponding operations.
  • because the embodiment of the present application determines the roll angle, pitch angle or yaw angle of the somatosensory remote controller from attitude change information, the actual range of movement required of the user is smaller under the same maximum attitude configuration, making the remote controller more comfortable to operate.
  • the roll angle, pitch angle or yaw angle of the somatosensory remote controller is determined by the Euler angle rotation method. That is, S102, determining the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information according to the attitude change information between the second attitude information serving as the reference attitude of the somatosensory remote controller and the first attitude information, may include: determining, by means of Euler angle rotation, the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information according to the attitude change information between the second attitude information and the first attitude information.
  • the Euler angles are a set of independent angular parameters used to determine the orientation of a rigid body rotating about a fixed point.
  • as the rigid body rotates about the fixed point (rotation angles measured in the positive direction), the attitude of the rigid body changes, and the coordinate system XYZ changes synchronously with it.
  • the angle rotated around the X axis is the roll angle (roll);
  • the angle rotated around the Y axis is the pitch angle (pitch);
  • the angle rotated around the Z axis is the yaw angle (yaw).
  • the attitude change information between the second attitude information and the first attitude information can be the attitude change information from the second attitude information to the first attitude information (that is, from the reference attitude to the target attitude; the angle obtained in this rotation direction can be specified as positive), or the attitude change information from the first attitude information to the second attitude information (that is, from the target attitude to the reference attitude; the angle obtained in this rotation direction can be specified as negative, and the roll angle, pitch angle or yaw angle of the somatosensory remote controller then takes the corresponding angle value).
  • in this embodiment, the common rotation direction from the reference attitude to the target attitude is generally used. That is, S102, determining by the Euler angle rotation method the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information according to the attitude change information between the second attitude information and the first attitude information, may further include: rotating from the reference attitude to the target attitude by means of Euler angle rotation, and determining the roll angle, pitch angle or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  • the manner of determining the roll angle may be:
  • S102, rotating from the reference attitude to the target attitude by means of Euler angle rotation and determining the roll angle, pitch angle or yaw angle according to the attitude change information from the second attitude information to the first attitude information, may further include: S102A1 and S102A2, as shown in FIG. 2.
  • S102A1 Rotate around the X-axis for the first time from the reference attitude to the target attitude, and determine an intermediate angle corresponding to the first rotation around the X-axis.
  • S102A2 Use an intermediate angle corresponding to the first rotation around the X axis as the roll angle.
  • when rotating from the reference attitude to the target attitude, the first rotation is around the X axis (the X axis of the reference attitude); the second rotation may be around the Y axis of the intermediate attitude and the third around the Z axis of the target attitude, or the second may be around the Z axis of the intermediate attitude and the third around the Y axis of the target attitude.
  • since the XYZ rotation order better matches the user's operating habits, sub-step S102A1, rotating first around the X axis from the reference attitude to the target attitude and determining the intermediate angle corresponding to the first rotation around the X axis, may further include: rotating from the reference attitude to the target attitude in the rotation order of the X axis under the reference attitude, the Y axis under the intermediate attitude, and the Z axis under the target attitude, and determining the intermediate angle corresponding to the rotation around the X axis under the reference attitude. Correspondingly, S102A2, taking the intermediate angle corresponding to the first rotation around the X axis as the roll angle, may further include: taking the intermediate angle corresponding to the rotation around the X axis under the reference attitude as the roll angle.
  • the coordinate system corresponding to the reference attitude of the somatosensory remote controller is marked xyz (see D in Figure 3); the coordinate system corresponding to the target attitude is marked x"y"z" (see E in Figure 3).
  • rotating from the reference attitude to the target attitude, the X axis direction of the reference attitude is taken as the first rotation, three rotation decompositions are performed in the XYZ (roll, pitch, yaw) order, and only the intermediate angle obtained from the decomposed first rotation around the X axis direction is selected as the original roll angle (the original roll angle can be mapped to the roll stick amount); the intermediate angles corresponding to the remaining two rotations are discarded.
  • the above XYZ rotation order refers to obtaining the attitude of the x"y"z" coordinate system relative to the xyz coordinate system by rotating the xyz coordinate system in turn around the x axis (the x axis of the reference attitude, see A in Figure 3), the y' axis (the y' axis of the intermediate coordinate system x'y'z'), and the z" axis (the z" axis of the target attitude, see C in Figure 3), which yields three intermediate angles; only the intermediate angle of the first rotation around the x axis is retained, as the roll angle.
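As a hedged sketch, the roll extraction described above can be written as follows. It assumes the attitude change is given as a rotation matrix R equal to Rx(roll) @ Ry(b) @ Rz(c), i.e. an intrinsic X-Y'-Z" sequence; the helper names are illustrative, not from the application.

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def roll_from_xyz(R):
    # For R = Rx(a) @ Ry(b) @ Rz(c): R[1][2] = -sin(a)*cos(b) and
    # R[2][2] = cos(a)*cos(b), so atan2 recovers a, the first X rotation;
    # the Y and Z intermediate angles are discarded.
    return math.atan2(-R[1][2], R[2][2])
```

For example, an attitude change that first rolls by 0.3 rad and then pitches by 0.4 rad still yields a roll of 0.3 rad from this extraction.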
  • the manner of determining the pitch angle may be:
  • S102, rotating from the reference attitude to the target attitude by the Euler angle rotation method and determining the roll angle, pitch angle or yaw angle according to the attitude change information from the second attitude information to the first attitude information, may further include: S102B1 and S102B2, as shown in FIG. 4.
  • S102B1 Rotate around the Y axis for the first time from the reference attitude to the target attitude, and determine the intermediate angle corresponding to the first rotation around the Y axis.
  • S102B2 Use the intermediate angle corresponding to the first rotation around the Y axis as the pitch angle.
  • when rotating from the reference attitude to the target attitude, the first rotation is around the Y axis; the second rotation may be around the X axis of the intermediate attitude and the third around the Z axis of the target attitude, or the second may be around the Z axis of the intermediate attitude and the third around the X axis of the target attitude.
  • only the intermediate angle corresponding to the first rotation around the Y axis is retained, and it is used as the pitch angle (pitch).
  • S102B1, rotating first around the Y axis from the reference attitude to the target attitude and determining the intermediate angle corresponding to the first rotation around the Y axis, may further include: rotating from the reference attitude to the target attitude in the rotation order of the Y axis under the reference attitude, the X axis under the intermediate attitude, and the Z axis under the target attitude, and determining the intermediate angle corresponding to the rotation around the Y axis under the reference attitude. Correspondingly, S102B2, using the intermediate angle corresponding to the first rotation around the Y axis as the pitch angle, may further include: using the intermediate angle corresponding to the rotation around the Y axis under the reference attitude as the pitch angle.
  • rotating from the reference attitude to the target attitude, three rotations are decomposed in the YXZ (pitch, roll, yaw) order with the Y axis direction as the first rotation, and only the intermediate angle obtained from the decomposed first rotation around the Y axis direction is selected as the original pitch angle (the original pitch angle can be mapped to the pitch stick amount); the intermediate angles corresponding to the remaining two rotations are discarded.
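The analogous pitch extraction can be sketched the same way, under the assumption that the attitude change is given as R = Ry(pitch) @ Rx(a) @ Rz(c), i.e. an intrinsic Y-X'-Z" sequence; again the helper names are illustrative.

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def pitch_from_yxz(R):
    # For R = Ry(b) @ Rx(a) @ Rz(c): R[0][2] = sin(b)*cos(a) and
    # R[2][2] = cos(b)*cos(a), so atan2 recovers b, the first Y rotation;
    # the X and Z intermediate angles are discarded.
    return math.atan2(R[0][2], R[2][2])
```

Note the rotation order differs per extracted angle (XYZ for roll, YXZ for pitch), which is the per-axis first-rotation selection the text describes.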
  • the roll angle and pitch angle of the somatosensory remote controller each take only the intermediate angle of the first rotation around the corresponding axis, as described above. This decomposition into multiple independent rotations has a low degree of coupling among roll, pitch and yaw in different directions, and does not depend on the order of the rotations.
  • if a single fixed rotation decomposition such as XYZ were adopted, it would not match the user's actual rotation order and would introduce more coupled stick amounts; the present method removes the dependency on the order of the user's rotation operations.
  • when pitching, the coupled roll obtained from the decomposition is smaller, and when rolling, the additional coupled pitch superimposed on the existing pitch is smaller.
  • the pitch and/or roll direction may correspond to the tilt direction (Tilt), and the yaw direction may correspond to the twist direction (Torsion). Since control of the UAV in the yaw direction (Torsion) takes more than ten times longer than control in the pitch and/or roll direction (Tilt), in the UAV's attitude-loop control framework a strategy that prioritizes Tilt before Torsion for the expected attitude angle is more efficient.
  • the roll angle, pitch angle or yaw angle of the somatosensory remote controller is determined by a tilt-torsion rotation method. That is, S102, determining the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information according to the attitude change information between the second attitude information serving as the reference attitude of the somatosensory remote controller and the first attitude information, may also include: determining, through a tilt-torsion rotation method, the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information according to the attitude change information between the second attitude information and the first attitude information.
  • the tilt-torsion rotation sequence is first explained below.
  • the first step is the tilt rotation (the rotation process from A to B shown in Figure 5): the xy plane of the coordinate system xyz is rotated around a spatial rotation axis by an angle θ, so that the xy plane of the coordinate system xyz coincides with the x'y' plane of the coordinate system x'y'z'.
  • this spatial rotation axis is a vector perpendicular to the plane formed by z and z'; projecting this axis vector onto the xy plane, the direction and magnitude of the projection give the tilt intermediate angle, with a tilt_x component (the roll component) and a tilt_y component (the pitch component).
  • the intermediate angle obtained by then rotating around the z' axis is the torsion intermediate angle, which is the yaw component of the yaw angle.
  • this completes one tilt-torsion sequence decomposition; the tilt and torsion intermediate angles can be obtained by solving.
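One way to realize the tilt-torsion decomposition above is the quaternion swing-twist split sketched below. This is an illustrative reconstruction, not the application's own algorithm: it assumes the attitude change is given as a unit quaternion (w, x, y, z) from the reference attitude to the target attitude, and whether the sign and frame conventions match the application's figures is an assumption.

```python
import math

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def tilt_torsion(q):
    # Split a unit quaternion q = (w, x, y, z) into q = q_tilt * q_torsion,
    # where q_torsion rotates about the z axis and q_tilt has its rotation
    # axis in the xy plane (its z component is zero): a swing-twist split.
    w, x, y, z = q
    n = math.hypot(w, z)
    if n < 1e-9:                  # 180-degree tilt; torsion is undefined
        return q, (1.0, 0.0, 0.0, 0.0)
    q_torsion = (w / n, 0.0, 0.0, z / n)
    q_tilt = quat_mul(q, (w / n, 0.0, 0.0, -z / n))  # q * conj(q_torsion)
    return q_tilt, q_torsion

def tilt_torsion_angles(q):
    # Intermediate angles: tilt_x (roll component) and tilt_y (pitch
    # component) from the tilt quaternion, plus the torsion angle (yaw).
    (tw, tx, ty, _), (sw, _, _, sz) = tilt_torsion(q)
    tilt = 2.0 * math.acos(max(-1.0, min(1.0, tw)))
    s = math.sin(tilt / 2.0)
    if s < 1e-12:
        return 0.0, 0.0, 2.0 * math.atan2(sz, sw)
    return tilt * tx / s, tilt * ty / s, 2.0 * math.atan2(sz, sw)
```

In the composition q = q_tilt * q_torsion the torsion acts about the original z axis first, which is equivalent to tilting first and then twisting about the tilted z' axis, matching the order described in the text.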
  • when specifically performing the tilt-torsion decomposition, the reference attitude can be rotated to the target attitude in the tilt-torsion rotation order, or the target attitude can be rotated to the reference attitude in the tilt-torsion rotation order (the tilt and torsion intermediate angles obtained are then the negatives of those corresponding to rotating the reference attitude to the target attitude).
  • in this embodiment, the commonly used rotation direction, from the reference attitude to the target attitude in the tilt-torsion rotation order, is used. That is, S102, determining through the tilt-torsion rotation method the roll angle, pitch angle or yaw angle of the somatosensory remote controller corresponding to the attitude change information according to the attitude change information between the second attitude information and the first attitude information, may also include: rotating from the reference attitude to the target attitude through a tilt-torsion rotation, and determining the roll angle, pitch angle or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  • the unmanned aerial vehicle may still exhibit yaw motion unintended by the user.
  • the yaw angle can be determined through the tilt-torsion rotation method: the decomposed torsion intermediate angle is used as the original yaw angle, and the original yaw angle can be mapped to the yaw stick amount.
  • taking as an example a user who holds the somatosensory remote controller, first tilts forward in pitch and then rolls to the left, the tilt-torsion rotation method is used for the decomposition.
  • the manner of determining the yaw angle may be:
  • S102, rotating from a reference attitude to a target attitude in a tilt-torsion rotation manner and determining the roll angle, pitch angle or yaw angle according to the attitude change information from the second attitude information to the first attitude information, may also include: S102C1, S102C2 and S102C3, as shown in FIG. 6.
  • S102C1 Perform tilt rotation from the reference posture to the intermediate posture.
  • S102C2 Rotate around the Z axis from the intermediate attitude to the target attitude, and determine the intermediate angle corresponding to the rotation around the Z axis under the intermediate attitude, wherein the Z axis of the intermediate attitude coincides with the Z axis of the target attitude.
  • S102C3 Use the intermediate angle corresponding to the rotation around the Z axis as the yaw angle.
  • the tilt rotation rotates around a spatial rotation axis to an intermediate attitude, which makes the xy plane of the coordinate system xyz corresponding to the reference attitude coincide with the x'y' plane of the coordinate system x'y'z' corresponding to the intermediate attitude.
  • the spatial rotation axis is perpendicular to the plane formed by z and z'.
  • the reference posture includes an initial posture of the somatosensory remote controller.
  • the method may further include: recording the initial posture of the somatosensory remote controller.
  • the somatosensory remote controller can be set to a locked state (optionally including a semi-locked state) and an unlocked state.
  • in the unlocked state, the unmanned aerial vehicle is controlled by the strategy in the method of the embodiment of the present application (for example, when the unmanned aerial vehicle flies normally in the air, the somatosensory remote controller is usually set to the unlocked state);
  • in the locked or semi-locked state, the strategy in the method of the embodiment of the present application is no longer used to control the UAV; another processing or control strategy is adopted instead to better meet the requirements of operational safety (for example, when the UAV stops its propellers on the ground, or brakes and hovers in the air, the somatosensory remote controller is usually set to the locked or semi-locked state).
  • the acquiring the first posture information of the target posture of the somatosensory remote controller may include: when the somatosensory remote controller is in an unlocked state, acquiring the first posture information of the target posture of the somatosensory remote controller.
  • the method may further include: when the somatosensory remote controller enters the unlocked state from the locked state or the semi-locked state, recording the current attitude of the somatosensory remote controller and using the recorded current attitude as the initial attitude of the somatosensory remote controller.
  • the method may further include: S105 , S106 , S107 and S108 .
  • S106 Decompose the current posture through a tilt-torsion rotation method to obtain a tilt vector.
  • S107 Determine the roll angle or pitch angle of the somatosensory remote controller at the current attitude according to the tilt vector, and set the yaw angle of the somatosensory remote controller at the current attitude to zero.
  • S108 Send the roll angle or pitch angle of the somatosensory remote controller at the current posture to the display terminal, so that the display terminal displays the roll angle or pitch angle of the somatosensory remote controller at the current posture.
  • when the somatosensory remote controller is in the locked or semi-locked state, the tilt-torsion rotation order is used directly to decompose its current posture, and the yaw angle is set to zero, so the yaw stick value is also zero. Because the current posture used here is an absolute posture, the remote controller may face any direction; converting the absolute yaw into a yaw stick value could therefore produce a large, meaningless value, so the yaw stick value is simply cleared to zero.
  • after entering the unlocked state, the posture at the moment of entry is taken as the initial posture (the reference posture), and yaw and the yaw stick value are computed from it.
  • because the current posture is absolute, if the same decomposition strategy as in the unlocked state were used, the remote controller could face any direction when pitched forward (absolute heading not zero), for example leaning forward while facing due east; the decomposition could then contain a large roll component, which would confuse the user when displayed in real time on a display terminal (such as the app on a user terminal or the goggles). In the semi-locked/locked state, decomposing the absolute posture directly in tilt-torsion order avoids this problem: after decomposition, the torsion intermediate angle is discarded, and the tilt is then completely unaffected by the orientation of the remote controller — tilting forward at any heading corresponds to the pitch component, and tilting right corresponds to the roll component.
  • the roll angle can be obtained as follows: rotate from the reference attitude to the target attitude with the first rotation about the X axis, and take the intermediate angle corresponding to that first rotation about the X axis as the roll angle;
  • the pitch angle can be obtained analogously: rotate from the reference attitude to the target attitude with the first rotation about the Y axis, and take the intermediate angle corresponding to that first rotation about the Y axis as the pitch angle;
  • the yaw angle can be obtained as follows: rotating from the reference attitude to the target attitude, first perform a tilt rotation to the intermediate attitude, then rotate about the Z axis in the intermediate attitude to the target attitude (the Z axis of the intermediate attitude coincides with the Z axis of the target attitude), and take the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude as the yaw angle.
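The two Euler-style extractions just described can be sketched numerically (helper names are ours; the signs assume right-handed canonical rotation matrices, which is an assumption, since the document does not fix a matrix convention): for a relative rotation matrix `R` taking the reference attitude to the target attitude, the first-about-X angle of an XYZ factorization and the first-about-Y angle of a YXZ factorization can be read directly off the matrix entries:

```python
import numpy as np

def roll_first_x(R):
    """First angle of R = Rx(roll) @ Ry(.) @ Rz(.): only this angle is kept."""
    return np.arctan2(-R[1, 2], R[2, 2])

def pitch_first_y(R):
    """First angle of R = Ry(pitch) @ Rx(.) @ Rz(.): only this angle is kept."""
    return np.arctan2(R[0, 2], R[2, 2])

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
```

For a pure roll `Rx(0.2)`, `roll_first_x` returns 0.2 while `pitch_first_y` returns 0, and symmetrically for a pure pitch; the remaining two intermediate angles of each factorization are discarded, as the bullets above describe.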
  • the raw stick values converted from the above roll, pitch, and yaw angles still need to be processed with a dead zone, an exp curve, filtering, and maximum-angle-amplitude mapping.
  • for maximum-angle-amplitude mapping, for example, if the maximum angle range in the roll direction is configured as [-20, 25] degrees, a raw roll value of -20 maps to a stick value of -1, and a raw roll value of 25 maps to a stick value of 1.
  • the mapping allows different maximum angles to be configured for the positive and negative directions, to handle the asymmetry of wrist movement in different directions: for example, the maximum angle of lifting the wrist is significantly smaller than the maximum angle of bending it down.
  • a dead zone can be configured because a human hand cannot hold the controller perfectly still in one absolute posture; with this configuration, when the wrist-holding posture is near the neutral position, a stick value of 0 is output, corresponding to a command for the UAV to hover or hold a level attitude.
  • the throttle stick value is sampled and post-processed directly as the final stick value; a stick-masking strategy is then applied for the related masking and zeroing.
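The dead-zone and asymmetric maximum-angle mapping described above can be sketched as follows (a simplified illustration under our own function name; the exp curve and filtering stages of the pipeline are omitted here):

```python
def angle_to_stick(angle_deg, lo=-20.0, hi=25.0, dead_zone_deg=2.0):
    """Map a raw decomposed angle (degrees) to a stick value in [-1, 1].

    `lo`/`hi` are the configurable negative/positive maximum angles
    (e.g. roll configured as [-20, 25] degrees), so wrist asymmetry can
    be handled; inside the dead zone the output is 0, i.e. hover/hold.
    """
    if abs(angle_deg) <= dead_zone_deg:
        return 0.0
    if angle_deg > 0:
        return min(angle_deg / hi, 1.0)
    return max(angle_deg / -lo, -1.0)

# with the example configuration: -20 deg -> -1.0 and 25 deg -> 1.0
```
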
  • FIG. 8 is a schematic structural diagram of an embodiment of a control device based on a somatosensory remote controller of the present application.
  • the control device of this embodiment can be integrated with the somatosensory remote controller or set separately from it. It should be noted that the control device of this embodiment can perform the operations of the control method based on a somatosensory remote controller described above; for details, refer to the relevant description of that method, which is not repeated here.
  • the control device 100 includes: a memory 1 and a processor 2; the processor 2 and the memory 1 are connected through a bus.
  • the control device 100 is also connected to the somatosensory remote controller 3 , and the connection between the control device 100 and the somatosensory remote controller 3 may be a wired connection or a wireless connection.
  • the processor 2 may be a microcontroller unit, a central processing unit or a digital signal processor, and so on.
  • the memory 1 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • the memory 1 is used to store instructions; the processor 2 invokes the instructions stored in the memory 1 to implement the following operations:
  • acquire first posture information of a target posture of the somatosensory remote controller; determine, according to posture change information between second posture information serving as a reference posture of the somatosensory remote controller and the first posture information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the posture change information; generate a control command according to the roll angle, pitch angle, and yaw angle; and send the control command to the unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation.
  • the processor is specifically configured to: determine, by means of Euler angle rotation, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information between the second attitude information and the first attitude information.
  • the processor is specifically configured to: rotate from the reference attitude to the target attitude by means of Euler angle rotation, and determine the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  • the processor is specifically configured to: rotate from the reference attitude to the target attitude with the first rotation about the X axis, determine the intermediate angle corresponding to that first rotation about the X axis, and take it as the roll angle.
  • the processor is specifically configured to: rotate from the reference attitude to the target attitude in the rotation order of the X axis in the reference attitude, the Y axis in the intermediate attitude, and the Z axis in the target attitude, and determine the intermediate angle corresponding to the rotation about the X axis in the reference attitude; taking the intermediate angle corresponding to the first rotation about the X axis as the roll angle then includes: taking the intermediate angle corresponding to the rotation about the X axis in the reference attitude as the roll angle.
  • the processor is specifically configured to: rotate from the reference attitude to the target attitude with the first rotation about the Y axis, determine the intermediate angle corresponding to that first rotation about the Y axis, and take it as the pitch angle.
  • the processor is specifically configured to: rotate from the reference attitude to the target attitude in the rotation order of the Y axis in the reference attitude, the X axis in the intermediate attitude, and the Z axis in the target attitude, and determine the intermediate angle corresponding to the rotation about the Y axis in the reference attitude; taking the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle then includes: taking the intermediate angle corresponding to the rotation about the Y axis in the reference attitude as the pitch angle.
  • the processor is specifically configured to: determine, by means of tilt-torsion rotation, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information between the second attitude information and the first attitude information.
  • the processor is specifically configured to: rotate from the reference attitude to the target attitude by means of tilt-torsion rotation, and determine the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  • the processor is specifically configured to: perform tilt rotation from the reference posture to the intermediate posture; rotate around the Z axis to the target posture in the intermediate posture, and determine the intermediate angle corresponding to the rotation around the Z axis in the intermediate posture, Wherein, the Z axis in the intermediate attitude coincides with the Z axis in the target attitude; the intermediate angle corresponding to the rotation around the Z axis in the intermediate attitude is taken as the yaw angle.
  • the reference posture includes the initial posture of the somatosensory remote controller.
  • the processor is specifically configured to: record the initial posture of the somatosensory remote controller.
  • the processor is specifically configured to: when the somatosensory remote controller is in an unlocked state, acquire first posture information of the target posture of the somatosensory remote controller.
  • the processor is specifically configured to: acquire the current posture of the somatosensory remote controller when it is in the locked or semi-locked state; decompose the current posture by means of tilt-torsion rotation to obtain a tilt vector; determine, according to the tilt vector, the roll or pitch angle of the somatosensory remote controller in the current posture, and set the yaw angle in the current posture to zero; and send the roll or pitch angle in the current posture to a display terminal, so that the display terminal displays it.
  • the processor is specifically configured to: record the current posture of the somatosensory remote controller when the somatosensory remote controller enters an unlocked state from a locked state or a semi-locked state; and use the recorded current posture as the initial posture of the somatosensory remote controller.
  • the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the control method based on a somatosensory remote controller as described in any of the above.
  • the computer-readable storage medium may be an internal storage unit of the above-mentioned control device, such as a hard disk or a memory.
  • the computer-readable storage medium may also be an external storage device, such as an equipped plug-in hard disk, smart memory card, secure digital card, flash memory card, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A control method, device, and storage medium based on a somatosensory remote controller. The method includes: acquiring first attitude information of a target attitude of the somatosensory remote controller (S101); determining, according to attitude change information between second attitude information serving as a reference attitude of the somatosensory remote controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the attitude change information (S102); generating a control command according to the roll angle, pitch angle, and yaw angle (S103); and sending the control command to an unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation (S104).

Description

Control method, device, and storage medium based on a somatosensory remote controller — Technical Field
The present application relates to the technical field of somatosensory remote control, and in particular to a control method, device, and storage medium based on a somatosensory remote controller.
Background Art
In the related art, the absolute attitude of the somatosensory remote controller is obtained and mapped into unmanned aerial vehicle control commands. Specifically, a typical angular rotation decomposition, such as ZYX Euler angle rotation, is used to obtain the three-axis attitude of the somatosensory remote controller — the yaw angle, pitch angle, and roll angle; the attitude is mapped proportionally within a certain angle range to obtain the corresponding stick values, and the stick values are then converted into the corresponding UAV control commands, so as to control the UAV.
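The ZYX Euler extraction mentioned here has a standard closed form. The sketch below (an illustration under right-handed canonical rotation matrices, an assumed convention since the document does not fix one) builds an attitude matrix and reads the three angles back off it:

```python
import numpy as np

def zyx_euler(R):
    """Extract (yaw, pitch, roll) from R = Rz(yaw) @ Ry(pitch) @ Rx(roll).

    Standard closed form for the non-degenerate case; gimbal lock at
    pitch = +/-90 degrees is not handled in this sketch.
    """
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = -np.arcsin(R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
```

A round trip recovers the angles: `zyx_euler(Rz(y) @ Ry(p) @ Rx(r))` returns `(y, p, r)` for moderate angles.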
However, when the user's control intent is uncertain, this mapping is prone to severe roll/pitch/yaw coupling, so the UAV control commands become heavily coupled, the UAV is controlled with large errors, motion outside the user's intent occurs, and the user experience is poor.
Summary of the Invention
In view of this, the present application provides a control method, device, and storage medium based on a somatosensory remote controller, which can reduce the coupling of UAV control commands and reduce motion outside the user's intent.
In a first aspect, the present application provides a control method based on a somatosensory remote controller, comprising:
acquiring first attitude information of a target attitude of the somatosensory remote controller;
determining, according to attitude change information between second attitude information serving as a reference attitude of the somatosensory remote controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the attitude change information;
generating a control command according to the roll angle, pitch angle, and yaw angle;
sending the control command to an unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation.
In a second aspect, the present application provides a control device based on a somatosensory remote controller, the device comprising a memory and a processor;
the memory is used to store instructions;
the processor invokes the instructions stored in the memory to implement the following operations:
acquiring first attitude information of a target attitude of the somatosensory remote controller;
determining, according to attitude change information between second attitude information serving as a reference attitude of the somatosensory remote controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the attitude change information;
generating a control command according to the roll angle, pitch angle, and yaw angle;
sending the control command to an unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation.
In a third aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the control method based on a somatosensory remote controller as described above.
Embodiments of the present application provide a control method, device, and storage medium based on a somatosensory remote controller: acquire first attitude information of a target attitude of the somatosensory remote controller; determine, according to attitude change information between second attitude information serving as a reference attitude of the controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the controller corresponding to the attitude change information; generate a control command according to the roll angle, pitch angle, and yaw angle; and send the control command to an unmanned aerial vehicle, the control command instructing the UAV to perform a corresponding operation. Compared with the related art, in which the absolute attitude of the somatosensory remote controller is mapped directly into UAV control commands, in the embodiments of the present application the target attitude is an absolute attitude, but the roll, pitch, or yaw angle is not determined directly from it; instead it is determined from the attitude change information between the target attitude and a reference attitude that is likewise an absolute attitude of the controller. A target attitude produced under uncertain user intent therefore expresses the user's control intent more clearly through its attitude change relative to the reference attitude, which largely cancels the severe roll/pitch/yaw coupling caused by uncertain intent, reduces coupling in the UAV control commands, reduces control error, reduces motion outside the user's intent, and improves the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present application.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of an embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 2 is a schematic flowchart of another embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 3 is a schematic diagram of the rotation decomposition for determining the roll angle in an embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 4 is a schematic flowchart of yet another embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 5 is a schematic diagram of the tilt-torsion rotation decomposition in an embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 6 is a schematic flowchart of yet another embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 7 is a schematic flowchart of yet another embodiment of the control method based on a somatosensory remote controller of the present application;
FIG. 8 is a schematic structural diagram of an embodiment of the control device based on a somatosensory remote controller of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings.
The flowcharts shown in the accompanying drawings are only illustrative; they need not include all of the content and operations/steps, nor be executed in the order described. For example, some operations/steps may be decomposed, combined, or partially merged, so the actual execution order may change according to the actual situation.
In the related art, the absolute attitude of the somatosensory remote controller is mapped into UAV control commands. However, when the user's control intent is uncertain, this mapping is prone to severe roll/pitch/yaw coupling, so the UAV control commands become heavily coupled, the UAV is controlled with large errors, motion outside the user's intent occurs, and the user experience is poor.
Embodiments of the present application provide a control method, device, and storage medium based on a somatosensory remote controller: acquire first attitude information of a target attitude of the somatosensory remote controller; determine, according to attitude change information between second attitude information serving as a reference attitude of the controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the controller corresponding to the attitude change information; generate a control command according to the roll angle, pitch angle, and yaw angle; and send the control command to an unmanned aerial vehicle, the control command instructing the UAV to perform a corresponding operation. Compared with the related art, in which the absolute attitude of the controller is mapped directly into UAV control commands, here the target attitude is an absolute attitude but the roll, pitch, or yaw angle is not determined directly from it; instead it is determined from the attitude change between the target attitude and a reference attitude that is likewise an absolute attitude of the controller. A target attitude produced under uncertain user intent therefore expresses the user's intent more clearly through its change relative to the reference attitude, which largely cancels the severe roll/pitch/yaw coupling caused by uncertain intent, reduces coupling in the UAV control commands, reduces control error, reduces unintended motion, and improves the user experience.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. In the absence of conflict, the following embodiments and the features of the embodiments may be combined with one another.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of an embodiment of the control method based on a somatosensory remote controller of the present application. The method includes S101, S102, S103, and S104.
S101: Acquire first attitude information of a target attitude of the somatosensory remote controller.
S102: Determine, according to attitude change information between second attitude information serving as a reference attitude of the somatosensory remote controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the attitude change information.
S103: Generate a control command according to the roll angle, pitch angle, and yaw angle.
S104: Send the control command to the unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation.
In this embodiment, the somatosensory remote controller can use somatosensory technology to obtain the user's intent, or anticipate the user's needs, from the user's body movements (for example, gestures), realizing interaction between the user and the UAV in a more natural way and giving the user a new somatosensory-interaction experience.
The target attitude of the somatosensory remote controller may be an absolute attitude of the controller that carries the user's intent, and the first attitude information may be the concrete attitude information of the target attitude. Attitude information can be represented as a rotation vector, a rotation matrix, a quaternion, Euler angles, or another rotation representation; the concrete attitude information of the target attitude may use any of these representations.
The reference attitude of the somatosensory remote controller may be an absolute attitude of the controller serving as the basis of comparison for the target attitude. The second attitude information may be the concrete attitude information of the reference attitude, again in any of the above representations. The reference attitude may be a preset attitude, the attitude of the controller in some state, the attitude at some moment before the target attitude, the default initial attitude of the controller, and so on.
The attitude difference between the target attitude and the reference attitude cancels most of the user's ambiguity and expresses the user's intent more clearly. Therefore, from the attitude change information between the second and first attitude information, roll, pitch, or yaw angles of the somatosensory remote controller that reflect the user's intent more distinctly can be determined. Control commands generated from these clearer roll, pitch, and yaw angles, when sent to the UAV, instruct it to perform operations that match the user's intent more closely, largely cancelling the severe roll/pitch/yaw coupling caused by uncertain intent, reducing coupling in the UAV control commands, reducing control error, reducing unintended motion, and improving the user experience.
In the embodiments of the present application, first attitude information of the target attitude of the somatosensory remote controller is acquired; according to the attitude change information between second attitude information serving as the reference attitude and the first attitude information, the roll, pitch, or yaw angle corresponding to the attitude change information is determined; a control command is generated from the roll, pitch, and yaw angles and sent to the UAV to instruct it to perform the corresponding operation. Compared with the related art, which maps the absolute attitude of the controller directly into UAV control commands, here the target attitude is absolute but the roll, pitch, or yaw angle is determined from the attitude change relative to a reference attitude that is likewise absolute, so a target attitude produced under uncertain user intent expresses that intent more clearly, largely cancelling the severe roll/pitch/yaw coupling, reducing command coupling, control error, and unintended motion, and improving the user experience. In addition, because the embodiments determine the roll, pitch, or yaw angle from attitude change information, under the same maximum-attitude configuration the actual range through which the user must move the controller is smaller, making operation more comfortable.
The details of S102 are described below.
In one embodiment, since Euler angles are the simplest way to express a rotation, the roll, pitch, or yaw angle of the somatosensory remote controller is determined by Euler angle rotation. That is, S102 — determining, according to the attitude change information between the second attitude information serving as the reference attitude and the first attitude information, the roll, pitch, or yaw angle corresponding to the attitude change information — may include: determining, by means of Euler angle rotation, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information between the second attitude information and the first attitude information.
Euler angles are a set of independent angular parameters used to determine the orientation of a rigid body rotating about a fixed point. They are expressed as three angles whose values represent the rotations of the rigid body about the three axes of its coordinate system: the X axis (the front-back axis of the body, with forward as the positive direction), the Y axis (the left-right axis, with right as positive), and the Z axis (the up-down axis, with down, i.e. the direction of gravity, as positive). As the rigid body rotates about the fixed point and its attitude changes, the coordinate system XYZ changes with it. The rotation angle about the X axis is the roll angle, the angle about the Y axis is the pitch angle, and the angle about the Z axis is the yaw angle.
The attitude change information between the second and first attitude information may be the change from the second to the first (i.e., rotating from the reference attitude to the target attitude; the angle obtained in this rotation direction may be defined as positive), or the change from the first to the second (i.e., rotating from the target attitude to the reference attitude; the angle in this direction may be defined as negative, and the roll, pitch, or yaw angle of the controller may take just the magnitude).
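Working with the change from the reference attitude to the target attitude can be illustrated as follows (a sketch under the common convention that an attitude matrix maps body coordinates to world coordinates; the document does not fix a representation, so this is an assumption):

```python
import numpy as np

def relative_rotation(R_ref, R_tgt):
    """Rotation taking the reference attitude to the target attitude,
    expressed in the reference body frame: R_tgt = R_ref @ R_rel."""
    return R_ref.T @ R_tgt

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# a controller yawed 15 degrees that then rolls 10 degrees relative to that pose
R_ref = Rz(np.deg2rad(15))
R_tgt = R_ref @ Rx(np.deg2rad(10))
# relative_rotation(R_ref, R_tgt) equals Rx(10 deg): the absolute yaw cancels
```

This is the key property the method relies on: the absolute heading of the controller drops out of the relative rotation, leaving only the change that carries the user's intent.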
In one embodiment, the common rotation direction from the reference attitude to the target attitude is used. That is, S102 — determining, by means of Euler angle rotation, the roll, pitch, or yaw angle corresponding to the attitude change information between the second and first attitude information — may further include: rotating from the reference attitude to the target attitude by means of Euler angle rotation, and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
With Euler angle rotation, different rotation orders represent different rotation results. Therefore, so that the roll, pitch, or yaw angle determined from the attitude change information does not depend on the rotation order, only the intermediate angle of the first rotation about an axis may be kept, as the roll, pitch, or yaw angle corresponding to that first rotation.
Accordingly, in one embodiment, the roll angle may be determined as follows:
S102 — rotating from the reference attitude to the target attitude by means of Euler angle rotation and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information — may further include S102A1 and S102A2, as shown in FIG. 2.
S102A1: Rotate from the reference attitude to the target attitude with the first rotation about the X axis, and determine the intermediate angle corresponding to the first rotation about the X axis.
S102A2: Take the intermediate angle corresponding to the first rotation about the X axis as the roll angle.
In this embodiment, when rotating from the reference attitude to the target attitude, the first rotation is about the X axis (the X axis of the reference attitude); the second and third rotations may be a second rotation about the Y axis of the intermediate attitude followed by a third rotation about the Z axis of the target attitude, or a second rotation about the Z axis of the intermediate attitude followed by a third rotation about the Y axis of the target attitude. Only the intermediate angle corresponding to the first rotation about the X axis is kept, and that angle is taken as the roll angle.
Since the XYZ rotation order matches the user's operating habits well, sub-step S102A1 may further include: rotating from the reference attitude to the target attitude in the rotation order of the X axis in the reference attitude, the Y axis in the intermediate attitude, and the Z axis in the target attitude, and determining the intermediate angle corresponding to the rotation about the X axis in the reference attitude; S102A2 may then further include: taking the intermediate angle corresponding to the rotation about the X axis in the reference attitude as the roll angle.
As shown in FIG. 3, the coordinate system corresponding to the reference attitude of the somatosensory remote controller is labelled xyz (see D in FIG. 3), and the coordinate system corresponding to the target attitude is labelled x"y"z" (see E in FIG. 3). Rotating from the reference attitude to the target attitude, three rotations are performed in XYZ (roll, pitch, yaw) order, with the first rotation about the X axis direction of the reference attitude; only the intermediate angle obtained from that first rotation about the X axis direction is selected as the raw roll angle (which can be mapped to a roll stick value), and the intermediate angles of the remaining two rotations are discarded. The XYZ rotation order above means: to obtain the attitude of the x"y"z" coordinate system relative to the xyz coordinate system, the xyz coordinate system is rotated three times — about the x axis (the x axis of the reference attitude, see A in FIG. 3), the y' axis (the y' axis of the coordinate system x'y'z' of the intermediate attitude, see B in FIG. 3), and the z" axis (the z" axis of the target attitude, see C in FIG. 3) — after which it coincides with the coordinate system x"y"z". The intermediate angles obtained from the three rotations are φ, θ, and ψ; only the intermediate angle φ of the first rotation, about the x axis, is kept, as the roll angle.
In one embodiment, the pitch angle may be determined as follows:
S102 — rotating from the reference attitude to the target attitude by means of Euler angle rotation and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information — may further include S102B1 and S102B2, as shown in FIG. 4.
S102B1: Rotate from the reference attitude to the target attitude with the first rotation about the Y axis, and determine the intermediate angle corresponding to the first rotation about the Y axis.
S102B2: Take the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle.
In this embodiment, when rotating from the reference attitude to the target attitude, the first rotation is about the Y axis; the second and third rotations may be a second rotation about the X axis of the intermediate attitude followed by a third rotation about the Z axis of the target attitude, or a second rotation about the Z axis of the intermediate attitude followed by a third rotation about the X axis of the target attitude. Only the intermediate angle corresponding to the first rotation about the Y axis is kept, and that angle is taken as the pitch angle.
Since the YXZ rotation order matches the user's operating habits well, S102B1 may further include: rotating from the reference attitude to the target attitude in the rotation order of the Y axis in the reference attitude, the X axis in the intermediate attitude, and the Z axis in the target attitude, and determining the intermediate angle corresponding to the rotation about the Y axis in the reference attitude; S102B2 may then further include: taking the intermediate angle corresponding to the rotation about the Y axis in the reference attitude as the pitch angle.
In this embodiment, rotating from the reference attitude to the target attitude, three rotations are performed in YXZ (pitch, roll, yaw) order with the first rotation about the Y axis direction; only the intermediate angle of the first rotation about the Y axis direction is selected as the raw pitch angle (which can be mapped to a pitch stick value), and the intermediate angles of the remaining two rotations are discarded.
In practical application, the roll and pitch angles of the somatosensory remote controller can both keep only the intermediate angle of the first rotation about the axis, as described above. With this decomposition into several individual rotations, the coupling between roll, pitch, and yaw in different directions is low, and the result does not depend on the order of the different rotations. Moreover, under the above operations, a fixed decomposition order such as XYZ would not match the user's rotation order and would introduce more coupled stick values; the individual-rotation decomposition reduces the dependence on the order of the user's rotation operations.
For example, if the user first pitches forward and then rolls left, with the above decomposition the coupled roll stick value obtained while pitching is smaller, and the additional coupled pitch stick value superimposed on the existing pitch stick value while rolling is smaller, reducing motion outside the user's intent.
For a UAV, the pitch and/or roll direction can correspond to the tilt direction, and the yaw direction can correspond to the torsion direction. Since controlling the yaw (torsion) direction of a UAV takes more than ten times as long as controlling the pitch and/or roll (tilt) direction, in the attitude-loop control framework of a UAV, a tilt-first, torsion-second control strategy for the desired attitude angles is more efficient.
In one embodiment, the roll, pitch, or yaw angle of the somatosensory remote controller is determined by tilt-torsion rotation. That is, S102 — determining, according to the attitude change information between the second attitude information serving as the reference attitude and the first attitude information, the roll, pitch, or yaw angle corresponding to the attitude change information — may further include: determining, by means of tilt-torsion rotation, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information between the second attitude information and the first attitude information.
For ease of understanding, the tilt-torsion rotation order is explained first.
Consider rotating the coordinate system xyz of the reference attitude to the coordinate system x"y"z" of the target attitude in this order. As shown in FIG. 5, a coordinate system x'y'z' corresponding to an intermediate attitude must be found whose z' axis coincides with the z" axis of the target attitude. The first step is therefore the tilt rotation (the rotation from A to B shown in FIG. 5): the xy plane of xyz is rotated by an angle α about a spatial rotation axis so that it coincides with the x'y' plane of x'y'z'. The spatial rotation axis is a vector perpendicular to the plane formed by z and z'. Projecting this axis vector onto the xy plane, the direction and magnitude of the projection give the tilt intermediate angle, with a tilt_x component (the roll component) and a tilt_y component (the pitch component). After the first, tilt rotation, only a second, torsion rotation about the z' axis of x'y'z' is needed (the rotation from B to C shown in FIG. 5) to make the x' and y' axes coincide with the x" and y" axes; the intermediate angle of this rotation about the z' axis is the torsion intermediate angle, which is the yaw component. Completing one such tilt-torsion decomposition yields the tilt and torsion intermediate angles.
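In quaternion terms, the tilt-torsion decomposition just described is the classic swing-twist decomposition about the z axis. Below is a hedged sketch (our own helper names; `q` is a unit quaternion (w, x, y, z) for the rotation from the reference attitude to the target attitude, and the degenerate case w = z = 0 is not handled):

```python
import math

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def tilt_torsion(q):
    """Split q = q_tilt * q_torsion with the torsion about the z axis.

    Returns (tilt_x, tilt_y, torsion): torsion is the yaw component, and
    the tilt angle is projected onto the x (roll) and y (pitch) axes.
    """
    w, x, y, z = q
    torsion = 2.0 * math.atan2(z, w)
    # multiply by the inverse torsion quaternion to isolate the tilt part
    q_tilt = quat_mul(q, (math.cos(torsion/2), 0.0, 0.0, -math.sin(torsion/2)))
    tw, tx, ty, _ = q_tilt         # z component of q_tilt is 0 by construction
    n = math.hypot(tx, ty)
    if n < 1e-12:
        return 0.0, 0.0, torsion
    tilt = 2.0 * math.atan2(n, tw)
    return tilt * tx / n, tilt * ty / n, torsion
```

For a pure tilt (rotation axis in the xy plane) the torsion comes out as 0 regardless of the tilt direction, which is exactly why the torsion angle can be discarded without the result depending on the controller's heading.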
In a concrete tilt-torsion decomposition, the reference attitude may be rotated to the target attitude in tilt-torsion order, or the target attitude may be rotated to the reference attitude in tilt-torsion order (the obtained tilt and torsion intermediate angles are then negated, giving the tilt and torsion intermediate angles for rotating the reference attitude to the target attitude).
In one embodiment, the more common direction — rotating the reference attitude to the target attitude in tilt-torsion order — is adopted. That is, S102, determining the roll, pitch, or yaw angle by tilt-torsion rotation according to the attitude change information between the second and first attitude information, may further include: rotating from the reference attitude to the target attitude by tilt-torsion rotation, and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
When the yaw angle is determined from the intermediate angle of a first rotation about the Z axis in the Euler angle scheme above, the UAV may still exhibit yaw motion outside the user's intent in some scenarios. For example, take again the user holding the somatosensory remote controller, first pitching forward and then rolling left: if a decomposition with the first rotation about the Z axis in a fixed order such as ZYX (yaw, pitch, roll) is used, a yaw angle remains after decomposition; it maps to a yaw stick value, and the UAV will exhibit unexpected yaw motion. To solve this, the yaw angle can be determined by tilt-torsion rotation, taking the decomposed torsion intermediate angle as the raw yaw angle, which can then be mapped to a yaw stick value. With tilt-torsion decomposition, the above example (first pitch forward, then roll left) yields no yaw component after decomposition, and the UAV will have no unexpected yaw motion.
That is, in one embodiment, the yaw angle may be determined as follows:
S102 — rotating from the reference attitude to the target attitude by tilt-torsion rotation and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information — may further include S102C1, S102C2, and S102C3, as shown in FIG. 6.
S102C1: Perform a tilt rotation from the reference attitude to the intermediate attitude.
S102C2: In the intermediate attitude, rotate about the Z axis to the target attitude, and determine the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude, where the Z axis of the intermediate attitude coincides with the Z axis of the target attitude.
S102C3: Take the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude as the yaw angle.
Referring also to FIG. 5, in this embodiment the tilt rotation rotates about a spatial rotation axis to the intermediate attitude, making the xy plane of the reference-attitude coordinate system xyz coincide with the x'y' plane of the intermediate-attitude coordinate system x'y'z'. The spatial rotation axis is perpendicular to the plane formed by z and z'. After the tilt rotation, a rotation about the z' axis of x'y'z' (the torsion rotation) takes the system to the target-attitude coordinate system x"y"z", making the x' and y' axes coincide with x" and y"; the intermediate angle of the rotation about the z' axis in the intermediate attitude is taken as the yaw angle. Experiments show that this approach minimizes yaw motion of the UAV outside the user's intent.
In one embodiment, the reference attitude includes the initial attitude of the somatosensory remote controller.
Before S101 — acquiring the first attitude information of the target attitude of the somatosensory remote controller — the method may further include: recording the initial attitude of the somatosensory remote controller.
In one embodiment, the somatosensory remote controller can be set to a locked state (and optionally a semi-locked state) or an unlocked state. When the controller is in the unlocked state, the UAV is controlled with the strategy of the method of the embodiments of the present application (for example, when the UAV flies normally in the air, the controller is usually set to the unlocked state). When the controller is in the locked state (or semi-locked state), the strategy of the method of the embodiments is no longer used to control the UAV; instead, another processing or control strategy is adopted to better meet operational-safety requirements (for example, when the UAV has its propellers stopped on the ground or brakes and hovers in the air, the controller is usually set to the locked or semi-locked state).
In this case, S101, acquiring the first attitude information of the target attitude of the somatosensory remote controller, may include: acquiring the first attitude information of the target attitude when the controller is in the unlocked state.
Before S101, the method may further include: when the somatosensory remote controller enters the unlocked state from the locked or semi-locked state, recording the current attitude of the controller, and taking the recorded current attitude as the initial attitude of the controller.
Referring to FIG. 7, the method may further include S105, S106, S107, and S108.
S105: When the somatosensory remote controller is in the locked or semi-locked state, acquire the current attitude of the controller.
S106: Decompose the current attitude by tilt-torsion rotation to obtain a tilt vector.
S107: Determine, according to the tilt vector, the roll or pitch angle of the controller in the current attitude, and set the yaw angle in the current attitude to zero.
S108: Send the roll or pitch angle in the current attitude to a display terminal, so that the display terminal displays the roll or pitch angle of the controller in the current attitude.
In this embodiment, when the somatosensory remote controller is in the locked or semi-locked state, the tilt-torsion rotation order is used directly to decompose the current attitude of the controller, and the yaw angle is set directly to 0, so the yaw stick value is also 0. Because the current attitude used is an absolute attitude, the controller may face any direction; converting the absolute yaw into a yaw stick value could produce a large, meaningless value, so the yaw stick value is simply cleared to zero. After the unlocked state is entered, the attitude at the moment of entry is taken as the initial attitude (the reference attitude), and yaw and the yaw stick value are computed from it.
Because the current attitude used is absolute, if the same decomposition strategy as in the unlocked state were used, the controller could face any direction when pitched forward (absolute heading not zero), for example leaning forward while facing due east; the decomposition could then contain a large roll component, which would confuse the user during real-time display on a display terminal (such as the app on a user terminal or the goggles). In the semi-locked/locked state, decomposing the absolute attitude directly in tilt-torsion order avoids this problem: after decomposition, the torsion intermediate angle is discarded, and the tilt is then completely unaffected by the orientation of the controller — tilting forward at any heading corresponds to the pitch component, and tilting right corresponds to the roll component.
In practical application, when the three angles — roll, pitch, and yaw — are determined from the attitude change information: the roll angle may be obtained by rotating from the reference attitude to the target attitude with the first rotation about the X axis and taking the corresponding intermediate angle as the roll angle; the pitch angle by rotating from the reference attitude to the target attitude with the first rotation about the Y axis and taking the corresponding intermediate angle as the pitch angle; and the yaw angle by rotating from the reference attitude through a tilt rotation to the intermediate attitude and then about the Z axis in the intermediate attitude to the target attitude (the Z axis of the intermediate attitude coinciding with that of the target attitude), taking the intermediate angle of the rotation about the Z axis in the intermediate attitude as the yaw angle.
The raw stick values converted from the above roll, pitch, and yaw angles must be processed with a dead zone, an exp curve, filtering, and maximum-angle-amplitude mapping. In maximum-angle-amplitude mapping, for example, if the maximum angle range in the roll direction is configured as [-20, 25] degrees, a raw roll value of -20 maps to a stick value of -1 and a raw roll value of 25 maps to a stick value of 1. The mapping allows different maximum angles in the positive and negative directions, to handle the asymmetry of wrist movement in different directions; for example, the angle to which the wrist can be lifted is clearly smaller than the maximum angle to which it can be bent down. A dead zone can be configured because a human hand cannot hold the controller fixed in one absolute attitude; with this configuration, when the wrist-holding attitude is near the neutral position, a stick value of 0 is output, corresponding to a command for the UAV to hover or keep its attitude level. The throttle stick value is directly sampled and post-processed as the final stick value; a stick-masking strategy is then applied for the related masking and zeroing.
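The "exp" step in the pipeline above is not specified further in the document. One common shaping of this kind (shown purely as an illustrative assumption, not the patent's formula) blends a linear and a cubic term, so small deflections are softened while the endpoints still reach ±1:

```python
def apply_expo(stick, expo=0.4):
    """Soften stick response near center; expo in [0, 1].

    expo = 0 is linear; larger expo flattens the center while keeping
    apply_expo(+/-1) == +/-1, so the full stick range is preserved.
    """
    return stick * ((1.0 - expo) + expo * stick * stick)
```
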
Referring to FIG. 8, FIG. 8 is a schematic structural diagram of an embodiment of the control device based on a somatosensory remote controller of the present application. The control device of this embodiment may be integrated with the somatosensory remote controller or set separately from it. It should be noted that the control device of this embodiment can perform the operations of the control method based on a somatosensory remote controller described above; for details, refer to the relevant description of that method, which is not repeated here.
The control device 100 includes a memory 1 and a processor 2, connected through a bus. The control device 100 is also connected to the somatosensory remote controller 3; the connection between the control device 100 and the controller 3 may be wired or wireless.
The processor 2 may be a microcontroller unit, a central processing unit, a digital signal processor, or the like.
The memory 1 may be a Flash chip, a read-only memory, a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
The memory 1 is used to store instructions; the processor 2 invokes the instructions stored in the memory 1 to implement the following operations:
acquire first attitude information of a target attitude of the somatosensory remote controller; determine, according to attitude change information between second attitude information serving as a reference attitude of the controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the controller corresponding to the attitude change information; generate a control command according to the roll angle, pitch angle, and yaw angle; and send the control command to the unmanned aerial vehicle, the control command being used to instruct the UAV to perform a corresponding operation.
Here, the processor is specifically configured to: determine, by means of Euler angle rotation, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information between the second attitude information and the first attitude information.
Here, the processor is specifically configured to: rotate from the reference attitude to the target attitude by means of Euler angle rotation, and determine the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
Here, the processor is specifically configured to: rotate from the reference attitude to the target attitude with the first rotation about the X axis, determine the intermediate angle corresponding to the first rotation about the X axis, and take it as the roll angle.
Here, the processor is specifically configured to: rotate from the reference attitude to the target attitude in the rotation order of the X axis in the reference attitude, the Y axis in the intermediate attitude, and the Z axis in the target attitude, and determine the intermediate angle corresponding to the rotation about the X axis in the reference attitude; taking the intermediate angle corresponding to the first rotation about the X axis as the roll angle includes: taking the intermediate angle corresponding to the rotation about the X axis in the reference attitude as the roll angle.
Here, the processor is specifically configured to: rotate from the reference attitude to the target attitude with the first rotation about the Y axis, determine the intermediate angle corresponding to the first rotation about the Y axis, and take it as the pitch angle.
Here, the processor is specifically configured to: rotate from the reference attitude to the target attitude in the rotation order of the Y axis in the reference attitude, the X axis in the intermediate attitude, and the Z axis in the target attitude, and determine the intermediate angle corresponding to the rotation about the Y axis in the reference attitude; taking the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle includes: taking the intermediate angle corresponding to the rotation about the Y axis in the reference attitude as the pitch angle.
Here, the processor is specifically configured to: determine, by means of tilt-torsion rotation, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information between the second attitude information and the first attitude information.
Here, the processor is specifically configured to: rotate from the reference attitude to the target attitude by means of tilt-torsion rotation, and determine the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
Here, the processor is specifically configured to: perform a tilt rotation from the reference attitude to the intermediate attitude; in the intermediate attitude, rotate about the Z axis to the target attitude, and determine the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude, where the Z axis of the intermediate attitude coincides with the Z axis of the target attitude; and take the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude as the yaw angle.
Here, the reference attitude includes the initial attitude of the somatosensory remote controller.
Here, the processor is specifically configured to: record the initial attitude of the somatosensory remote controller.
Here, the processor is specifically configured to: acquire the first attitude information of the target attitude when the somatosensory remote controller is in the unlocked state.
Here, the processor is specifically configured to: acquire the current attitude of the somatosensory remote controller when it is in the locked or semi-locked state; decompose the current attitude by tilt-torsion rotation to obtain a tilt vector; determine, according to the tilt vector, the roll or pitch angle of the controller in the current attitude, and set the yaw angle in the current attitude to zero; and send the roll or pitch angle in the current attitude to a display terminal, so that the display terminal displays it.
Here, the processor is specifically configured to: record the current attitude of the somatosensory remote controller when it enters the unlocked state from the locked or semi-locked state, and take the recorded current attitude as the initial attitude of the controller.
The present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the control method based on a somatosensory remote controller as described in any of the above.
The computer-readable storage medium may be an internal storage unit of the above control device, such as a hard disk or memory; it may also be an external storage device, such as a plug-in hard disk, a smart media card, a secure digital card, or a flash memory card.
It should be understood that the terms used in the specification of the present application are for the purpose of describing particular embodiments only and are not intended to limit the application.
It should also be understood that the term "and/or" used in the specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The above are only specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any person skilled in the art can easily conceive of various equivalent modifications or replacements within the technical scope disclosed by the present application, and these modifications or replacements shall all fall within the scope of protection of the present application. Therefore, the scope of protection of the present application shall be subject to the scope of protection of the claims.

Claims (31)

  1. A control method based on a somatosensory remote controller, characterized by comprising:
    acquiring first attitude information of a target attitude of the somatosensory remote controller;
    determining, according to attitude change information between second attitude information serving as a reference attitude of the somatosensory remote controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the attitude change information;
    generating a control command according to the roll angle, pitch angle, and yaw angle;
    sending the control command to an unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation.
  2. The method according to claim 1, characterized in that the determining, according to the attitude change information between the second attitude information serving as the reference attitude and the first attitude information, of the roll, pitch, or yaw angle corresponding to the attitude change information comprises:
    determining, by means of Euler angle rotation, according to the attitude change information between the second attitude information and the first attitude information, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information.
  3. The method according to claim 2, characterized in that the determining by means of Euler angle rotation comprises:
    rotating from the reference attitude to the target attitude by means of Euler angle rotation, and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  4. The method according to claim 3, characterized in that the rotating from the reference attitude to the target attitude by means of Euler angle rotation and determining the roll, pitch, or yaw angle comprises:
    rotating from the reference attitude to the target attitude with the first rotation about the X axis, and determining an intermediate angle corresponding to the first rotation about the X axis;
    taking the intermediate angle corresponding to the first rotation about the X axis as the roll angle.
  5. The method according to claim 4, characterized in that the rotating with the first rotation about the X axis and determining the intermediate angle corresponding to the first rotation about the X axis comprises:
    rotating from the reference attitude to the target attitude in the rotation order of the X axis in the reference attitude, the Y axis in the intermediate attitude, and the Z axis in the target attitude, and determining an intermediate angle corresponding to the rotation about the X axis in the reference attitude;
    the taking of the intermediate angle corresponding to the first rotation about the X axis as the roll angle comprises:
    taking the intermediate angle corresponding to the rotation about the X axis in the reference attitude as the roll angle.
  6. The method according to claim 3, characterized in that the rotating from the reference attitude to the target attitude by means of Euler angle rotation and determining the roll, pitch, or yaw angle comprises:
    rotating from the reference attitude to the target attitude with the first rotation about the Y axis, and determining an intermediate angle corresponding to the first rotation about the Y axis;
    taking the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle.
  7. The method according to claim 6, characterized in that the rotating with the first rotation about the Y axis and determining the intermediate angle corresponding to the first rotation about the Y axis comprises:
    rotating from the reference attitude to the target attitude in the rotation order of the Y axis in the reference attitude, the X axis in the intermediate attitude, and the Z axis in the target attitude, and determining an intermediate angle corresponding to the rotation about the Y axis in the reference attitude;
    the taking of the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle comprises:
    taking the intermediate angle corresponding to the rotation about the Y axis in the reference attitude as the pitch angle.
  8. The method according to claim 1, characterized in that the determining, according to the attitude change information between the second attitude information serving as the reference attitude and the first attitude information, of the roll, pitch, or yaw angle corresponding to the attitude change information comprises:
    determining, by means of tilt-torsion rotation, according to the attitude change information between the second attitude information and the first attitude information, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information.
  9. The method according to claim 8, characterized in that the determining by means of tilt-torsion rotation comprises:
    rotating from the reference attitude to the target attitude by means of tilt-torsion rotation, and determining the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  10. The method according to claim 9, characterized in that the rotating from the reference attitude to the target attitude by means of tilt-torsion rotation and determining the roll, pitch, or yaw angle comprises:
    performing a tilt rotation from the reference attitude to an intermediate attitude;
    in the intermediate attitude, rotating about the Z axis to the target attitude, and determining an intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude, where the Z axis of the intermediate attitude coincides with the Z axis of the target attitude;
    taking the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude as the yaw angle.
  11. The method according to claim 1, characterized in that the reference attitude comprises an initial attitude of the somatosensory remote controller.
  12. The method according to claim 11, characterized in that, before the acquiring of the first attitude information of the target attitude, the method comprises:
    recording the initial attitude of the somatosensory remote controller.
  13. The method according to claim 1, characterized in that the acquiring of the first attitude information of the target attitude comprises:
    acquiring the first attitude information of the target attitude of the somatosensory remote controller when the controller is in an unlocked state.
  14. The method according to claim 13, characterized in that the method further comprises:
    acquiring the current attitude of the somatosensory remote controller when the controller is in a locked or semi-locked state;
    decomposing the current attitude by means of tilt-torsion rotation to obtain a tilt vector;
    determining, according to the tilt vector, the roll or pitch angle of the controller in the current attitude, and setting the yaw angle in the current attitude to zero;
    sending the roll or pitch angle in the current attitude to a display terminal, so that the display terminal displays the roll or pitch angle of the controller in the current attitude.
  15. The method according to claim 13, characterized in that, before the acquiring of the first attitude information of the target attitude, the method comprises:
    recording the current attitude of the somatosensory remote controller when the controller enters the unlocked state from the locked or semi-locked state;
    taking the recorded current attitude as the initial attitude of the somatosensory remote controller.
  16. A control device based on a somatosensory remote controller, characterized in that the device comprises a memory and a processor;
    the memory is used to store instructions;
    the processor invokes the instructions stored in the memory to implement the following operations:
    acquiring first attitude information of a target attitude of the somatosensory remote controller;
    determining, according to attitude change information between second attitude information serving as a reference attitude of the somatosensory remote controller and the first attitude information, a roll angle, pitch angle, or yaw angle of the somatosensory remote controller corresponding to the attitude change information;
    generating a control command according to the roll angle, pitch angle, and yaw angle;
    sending the control command to an unmanned aerial vehicle, the control command being used to instruct the unmanned aerial vehicle to perform a corresponding operation.
  17. The device according to claim 16, characterized in that the processor is specifically configured to:
    determine, by means of Euler angle rotation, according to the attitude change information between the second attitude information and the first attitude information, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information.
  18. The device according to claim 17, characterized in that the processor is specifically configured to:
    rotate from the reference attitude to the target attitude by means of Euler angle rotation, and determine the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  19. The device according to claim 18, characterized in that the processor is specifically configured to:
    rotate from the reference attitude to the target attitude with the first rotation about the X axis, and determine an intermediate angle corresponding to the first rotation about the X axis;
    take the intermediate angle corresponding to the first rotation about the X axis as the roll angle.
  20. The device according to claim 19, characterized in that the processor is specifically configured to:
    rotate from the reference attitude to the target attitude in the rotation order of the X axis in the reference attitude, the Y axis in the intermediate attitude, and the Z axis in the target attitude, and determine an intermediate angle corresponding to the rotation about the X axis in the reference attitude;
    the taking of the intermediate angle corresponding to the first rotation about the X axis as the roll angle comprises:
    taking the intermediate angle corresponding to the rotation about the X axis in the reference attitude as the roll angle.
  21. The device according to claim 18, characterized in that the processor is specifically configured to:
    rotate from the reference attitude to the target attitude with the first rotation about the Y axis, and determine an intermediate angle corresponding to the first rotation about the Y axis;
    take the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle.
  22. The device according to claim 21, characterized in that the processor is specifically configured to:
    rotate from the reference attitude to the target attitude in the rotation order of the Y axis in the reference attitude, the X axis in the intermediate attitude, and the Z axis in the target attitude, and determine an intermediate angle corresponding to the rotation about the Y axis in the reference attitude;
    the taking of the intermediate angle corresponding to the first rotation about the Y axis as the pitch angle comprises:
    taking the intermediate angle corresponding to the rotation about the Y axis in the reference attitude as the pitch angle.
  23. The device according to claim 16, characterized in that the processor is specifically configured to:
    determine, by means of tilt-torsion rotation, according to the attitude change information between the second attitude information and the first attitude information, the roll, pitch, or yaw angle of the somatosensory remote controller corresponding to the attitude change information.
  24. The device according to claim 23, characterized in that the processor is specifically configured to:
    rotate from the reference attitude to the target attitude by means of tilt-torsion rotation, and determine the roll, pitch, or yaw angle according to the attitude change information from the second attitude information to the first attitude information.
  25. The device according to claim 24, characterized in that the processor is specifically configured to:
    perform a tilt rotation from the reference attitude to an intermediate attitude;
    in the intermediate attitude, rotate about the Z axis to the target attitude, and determine an intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude, where the Z axis of the intermediate attitude coincides with the Z axis of the target attitude;
    take the intermediate angle corresponding to the rotation about the Z axis in the intermediate attitude as the yaw angle.
  26. The device according to claim 16, characterized in that the reference attitude comprises an initial attitude of the somatosensory remote controller.
  27. The device according to claim 26, characterized in that the processor is specifically configured to:
    record the initial attitude of the somatosensory remote controller.
  28. The device according to claim 16, characterized in that the processor is specifically configured to:
    acquire the first attitude information of the target attitude of the somatosensory remote controller when the controller is in an unlocked state.
  29. The device according to claim 28, characterized in that the processor is specifically configured to:
    acquire the current attitude of the somatosensory remote controller when the controller is in a locked or semi-locked state;
    decompose the current attitude by means of tilt-torsion rotation to obtain a tilt vector;
    determine, according to the tilt vector, the roll or pitch angle of the controller in the current attitude, and set the yaw angle in the current attitude to zero;
    send the roll or pitch angle in the current attitude to a display terminal, so that the display terminal displays the roll or pitch angle of the controller in the current attitude.
  30. The device according to claim 28, characterized in that the processor is specifically configured to:
    record the current attitude of the somatosensory remote controller when the controller enters the unlocked state from the locked or semi-locked state;
    take the recorded current attitude as the initial attitude of the somatosensory remote controller.
  31. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the control method based on a somatosensory remote controller according to any one of claims 1 to 15.
PCT/CN2021/081174 2021-03-16 2021-03-16 Control method and device based on somatosensory remote controller, and storage medium WO2022193153A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/081174 WO2022193153A1 (zh) Control method and device based on somatosensory remote controller, and storage medium
CN202180087702.6A CN116710870A (zh) Control method and device based on somatosensory remote controller, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/081174 WO2022193153A1 (zh) Control method and device based on somatosensory remote controller, and storage medium

Publications (1)

Publication Number Publication Date
WO2022193153A1 true WO2022193153A1 (zh) 2022-09-22

Family

ID=83321824

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/081174 WO2022193153A1 (zh) 2021-03-16 2021-03-16 基于体感遥控器的控制方法、装置及存储介质

Country Status (2)

Country Link
CN (1) CN116710870A (zh)
WO (1) WO2022193153A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115639910A (zh) * 2022-10-28 2023-01-24 武汉恒新动力科技有限公司 Omnidirectional somatosensory interaction method and control device oriented to the workspace of a controlled object

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808675A (zh) * 2015-03-03 2015-07-29 广州亿航智能技术有限公司 Smart-terminal-based somatosensory flight control system and terminal device
CN106094865A (zh) * 2016-07-15 2016-11-09 陈昊 Unmanned aerial vehicle photographing system and photographing method
US9663227B1 (en) * 2015-12-22 2017-05-30 Gopro, Inc. Systems and methods for controlling an unmanned aerial vehicle
CN107346141A (zh) * 2016-05-06 2017-11-14 北京臻迪机器人有限公司 Somatosensory control method
CN108700893A (zh) * 2017-04-07 2018-10-23 深圳市大疆创新科技有限公司 Somatosensory remote control method, control device, gimbal, and unmanned aerial vehicle
KR102032067B1 (ko) * 2018-12-05 2019-10-14 세종대학교산학협력단 Reinforcement-learning-based unmanned aerial vehicle remote control method and apparatus
CN210222569U (zh) * 2019-09-24 2020-03-31 张雨航 Unmanned aerial vehicle control device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808675A (zh) * 2015-03-03 2015-07-29 广州亿航智能技术有限公司 Somatosensory flight control system and terminal device based on intelligent terminal
US9663227B1 (en) * 2015-12-22 2017-05-30 Gopro, Inc. Systems and methods for controlling an unmanned aerial vehicle
CN107346141A (zh) * 2016-05-06 2017-11-14 北京臻迪机器人有限公司 Somatosensory control method
CN106094865A (zh) * 2016-07-15 2016-11-09 陈昊 Unmanned aerial vehicle photographing system and photographing method thereof
CN108700893A (zh) * 2017-04-07 2018-10-23 深圳市大疆创新科技有限公司 Somatosensory remote control method, control device, gimbal, and unmanned aerial vehicle
KR102032067B1 (ko) * 2018-12-05 2019-10-14 세종대학교산학협력단 Reinforcement-learning-based method and apparatus for remotely controlling an unmanned aerial vehicle
CN210222569U (zh) * 2019-09-24 2020-03-31 张雨航 Unmanned aerial vehicle control device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115639910A (zh) * 2022-10-28 2023-01-24 武汉恒新动力科技有限公司 Omnidirectional somatosensory interaction method and control device oriented to the workspace of a controlled object
CN115639910B (zh) * 2022-10-28 2023-08-15 武汉恒新动力科技有限公司 Omnidirectional somatosensory interaction method and device oriented to the workspace of a controlled object

Also Published As

Publication number Publication date
CN116710870A (zh) 2023-09-05

Similar Documents

Publication Publication Date Title
US9619106B2 Methods and apparatus for simultaneous user inputs for three-dimensional animation
US20210074054A1 Image processing method and device, computer readable storage medium, and terminal
WO2019242553A1 (zh) Method for controlling photographing angle of photographing apparatus, control apparatus, and wearable device
CN106971423B (zh) Cube graphic rendering method, apparatus, device, and storage medium
WO2022193153A1 (zh) Control method and apparatus based on somatosensory remote controller, and storage medium
CN110188728A (zh) Head pose estimation method and system
US10867452B2 System and methods for conversion of 2D assets into 3D graphical scenes
JP7078234B2 (ja) Method for generating a 3D object placed in augmented reality space
CN109621415A (zh) Display control method and apparatus in 3D game, and computer storage medium
CN106569696B (zh) Method, system, and portable terminal for rendering and outputting panoramic images
US20030189567A1 Viewing controller for three-dimensional computer graphics
WO2022199059A1 (zh) Robotic arm control method and apparatus, operation control device, and readable storage medium
CN110096134B (zh) VR handle ray jitter correction method, apparatus, terminal, and medium
CN104820584B (zh) Construction method and system of a 3D gesture interface for natural manipulation of hierarchical information
CA2618114A1 (en) Reshaping a camera image
WO2020262392A1 (ja) Information processing device, information processing method, and program
JP2010049346A (ja) Image display device
JP7150460B2 (ja) Image processing device and image processing method
CN114075810B (zh) Spatial path fitting method and system for concrete 3D printing
CN115670660A (zh) Calibration method and apparatus, surgical robot, electronic device, and storage medium
JP2010262605A (ja) Image processing device and image processing method
JP2004072553A (ja) Image distortion correction method and program therefor
JP2000306106A (ja) Method for orienting a three-dimensional directed object and image processing device
WO2022141122A1 (zh) Unmanned aerial vehicle control method, unmanned aerial vehicle, and storage medium
JP6204781B2 (ja) Information processing method, information processing device, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21930749

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180087702.6

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21930749

Country of ref document: EP

Kind code of ref document: A1