CN114088085B - Position determining method and device for robot, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114088085B
CN114088085B
Authority
CN
China
Prior art keywords
robot
sensor
function
current position
reading information
Prior art date
Legal status
Active
Application number
CN202111392025.7A
Other languages
Chinese (zh)
Other versions
CN114088085A (en)
Inventor
姚秀勇
Current Assignee
Anker Innovations Co Ltd
Original Assignee
Anker Innovations Co Ltd
Priority date
Filing date
Publication date
Application filed by Anker Innovations Co Ltd filed Critical Anker Innovations Co Ltd
Priority to CN202111392025.7A priority Critical patent/CN114088085B/en
Publication of CN114088085A publication Critical patent/CN114088085A/en
Application granted granted Critical
Publication of CN114088085B publication Critical patent/CN114088085B/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the present disclosure disclose a position determining method and apparatus for a robot, an electronic device, and a storage medium. The robot includes a target sensor, and the target sensor includes at least one of an odometer sensor and an optical flow sensor. The method includes: acquiring current position information of the robot and reading information of the target sensor; constructing an objective function based on the current position information and the reading information, wherein the objective function includes a switch variable used to control the magnitude of the function value of a relation function; and determining, based on the objective function, next position information corresponding to the acquired current position information. By controlling the magnitude of the function value of the relation function through the switch variable, the embodiments of the present disclosure can improve the accuracy of the robot's position determination when at least one of the odometer sensor and the optical flow sensor provides erroneous information.

Description

Position determining method and device for robot, electronic equipment and storage medium
Technical Field
The disclosure relates to the field of positioning, and in particular relates to a method and device for determining a position of a robot, electronic equipment and a storage medium.
Background
In the prior art, robots are generally provided with various sensors, such as an odometer sensor, an optical flow sensor, a loop detection sensor, and the like.
Through the arranged sensor, the positioning, path planning, position prediction and the like of the robot can be realized.
However, positioning, path planning and position prediction of the robot depend heavily on the reading information of these sensors, and once a sensor provides erroneous information, the accuracy of the robot's position determination is seriously affected.
Disclosure of Invention
In view of this, to solve the above technical problems or at least some of them, embodiments of the present disclosure provide a position determining method and apparatus for a robot, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a position determining method of a robot including a target sensor including at least one of an odometer sensor and an optical flow sensor, the method including:
acquiring current position information of the robot and reading information of the target sensor;
constructing an objective function based on the current position information and the reading information, wherein the objective function includes a switch variable, the switch variable is used to control the magnitude of the function value of a relation function, and the relation function characterizes a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the target sensor;
and determining, based on the objective function, next position information corresponding to the acquired current position information.
Optionally, in the method of any embodiment of the disclosure, constructing the objective function based on the current location information and the reading information includes:
and constructing an objective function based on the current position information, the reading information and the uncertainty of the reading information.
Optionally, in a method of any embodiment of the disclosure, the target sensor comprises an odometer sensor and an optical flow sensor; and
the constructing an objective function based on the current position information and the reading information includes:
constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable and the second switching variable;
wherein the first switching variable is used for controlling the magnitude of a function value of a first relation function, the second switching variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor.
Optionally, in the method of any embodiment of the disclosure, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
Optionally, in the method of any embodiment of the disclosure, the robot further includes a loop detection sensor; and
the constructing an objective function based on the current position information and the reading information includes:
constructing an objective function based on the current position information, the reading information and the reading information of the loop detection sensor;
the objective function further includes a third switching variable, where the third switching variable is used to control a magnitude of a function value of a third relationship function, and the third relationship function characterizes a correspondence between a current position of the robot and a loop position of the robot calculated based on reading information of the loop detection sensor.
Optionally, in a method of any embodiment of the disclosure, the objective function further includes a constraint term of the third switching variable.
Optionally, in the method of any embodiment of the present disclosure, determining, based on the objective function, next location information corresponding to the obtained current location information includes:
calculating the next position information corresponding to the current position information with the objective of minimizing the value of the objective function.
In a second aspect, an embodiment of the present disclosure provides a position determining apparatus of a robot including a target sensor including at least one of an odometer sensor and an optical flow sensor, the apparatus including:
an acquisition unit configured to acquire current position information of the robot and reading information of the target sensor;
a construction unit configured to construct an objective function based on the current position information and the reading information, wherein the objective function includes a switch variable for controlling the magnitude of the function value of a relation function, and the relation function characterizes a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the target sensor;
and a determining unit configured to determine next position information corresponding to the acquired current position information based on the above objective function.
Optionally, in an apparatus of any embodiment of the disclosure, the building unit includes:
A first construction subunit configured to construct an objective function based on the current location information, the reading information, and an uncertainty of the reading information.
Optionally, in an apparatus of any embodiment of the disclosure, the target sensor includes an odometer sensor and an optical flow sensor; and
the construction unit includes:
a second construction subunit configured to construct an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable, and the second switching variable;
wherein the first switching variable is used for controlling the magnitude of a function value of a first relation function, the second switching variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor.
Optionally, in the apparatus of any embodiment of the disclosure, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
Optionally, in the apparatus of any embodiment of the disclosure, the robot further includes a loop detection sensor; and
the construction unit includes:
a third constructing subunit configured to construct an objective function based on the current position information, the reading information, and the reading information of the loop detection sensor;
the objective function further includes a third switching variable, where the third switching variable is used to control a magnitude of a function value of a third relationship function, and the third relationship function characterizes a correspondence between a current position of the robot and a loop position of the robot calculated based on reading information of the loop detection sensor.
Optionally, in an apparatus of any embodiment of the disclosure, the objective function further includes a constraint term of the third switching variable.
Optionally, in an apparatus of any embodiment of the disclosure, the determining unit includes:
and a calculating subunit configured to calculate the next position information corresponding to the current position information with the objective of minimizing the value of the objective function.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, the computer program, when executed, implementing the position determining method of the robot according to any embodiment of the first aspect of the present disclosure.
In a fourth aspect, embodiments of the present disclosure provide a computer readable medium storing a computer program which, when executed by a processor, implements the method of any embodiment of the position determining method of the robot of the first aspect described above.
In a fifth aspect, embodiments of the present disclosure provide a computer program comprising computer readable code which, when run on a device, causes a processor in the device to execute instructions for carrying out the steps of the method as in any of the embodiments of the method for position determination of a robot of the first aspect described above.
In the position determining method of the robot provided in the above embodiments of the present disclosure, the robot includes a target sensor, and the target sensor includes at least one of an odometer sensor and an optical flow sensor. First, current position information of the robot and reading information of the target sensor are acquired. Then, an objective function is constructed based on the current position information and the reading information, the objective function including a switch variable used to control the magnitude of the function value of a relation function. Finally, next position information corresponding to the acquired current position information is determined based on the objective function. By controlling the magnitude of the function value of the relation function through the switch variable, when at least one of the odometer sensor and the optical flow sensor provides erroneous reading information, the magnitude of the function value of the relation function is limited, the influence of the sensor providing the erroneous information on position determination is reduced, and the accuracy of the robot's position determination can be improved.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
Other features, objects and advantages of the present disclosure will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings:
fig. 1 is an exemplary system architecture diagram of a position determining method of a robot or a position determining apparatus of a robot provided in an embodiment of the present disclosure;
FIG. 2 is a flow chart of a method of determining a position of a robot provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an application scenario for the embodiment of FIG. 2;
FIG. 4 is a flow chart of another method of determining a position of a robot provided by an embodiment of the present disclosure;
fig. 5 is a schematic structural view of a position determining apparatus of a robot provided in an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present disclosure are used merely to distinguish between different steps, devices, or modules, and do not represent any particular technical meaning nor logical order between them.
It should also be understood that in embodiments of the present disclosure, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data, or structure referred to in the embodiments of the present disclosure may generally be understood as one or more, unless it is explicitly limited or the context indicates otherwise.
In addition, the term "and/or" in this disclosure is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the front and rear association objects are an or relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and that the same or similar features may be referred to each other, and for brevity, will not be described in detail.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is an exemplary system architecture diagram of a position determining method of a robot or a position determining apparatus of a robot provided in an embodiment of the present disclosure.
As shown in fig. 1, the system architecture 100 may include a robot 101. Optionally, the system architecture 100 may further include a network 103 and a server 102, where the network 103 may provide a medium for a communication link between the robot 101 and the server 102. The network 103 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The robot 101 and the server 102 may interact via the network 103 to receive or transmit data. Here, at least one of the robot 101 and the server 102 may be the execution subject of each step in the position determining method of the robot described in the embodiments of the present disclosure. For example, the position determining method of the robot described in the embodiments of the present disclosure may be performed by the robot 101, by the server 102, or by the robot 101 and the server 102 in cooperation with each other.
Note that, the execution subject of the position determining method of the robot provided in the embodiment of the present disclosure may be hardware or software, and is not limited herein.
Further, the robot 101 may include at least one of an odometer sensor and an optical flow sensor, and optionally, the robot 101 may further include a loop detection sensor.
It should be understood that the number of robots, servers, and networks in fig. 1 is merely illustrative. There may be any number of robots, servers, and networks as desired for implementation. In addition, when the execution subject of the position determining method of the robot provided by the embodiment of the present disclosure does not need to interact with other electronic devices, the system architecture 100 described above may include only the execution subject of the position determining method of the robot, and not include other electronic devices and networks in addition thereto. For example, the system architecture 100 described above may include only robots.
Fig. 2 shows a flowchart 200 of a method for determining a position of a robot according to an embodiment of the present disclosure. The robot includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The position determining method of the robot comprises the following steps:
step 201, obtaining current position information of the robot and reading information of the target sensor.
In this embodiment, the execution subject of the position determination method of the robot (for example, the robot shown in fig. 1) may acquire the current position information of the robot and the reading information of the target sensor.
Wherein the current position information may characterize the position of the robot.
Further, in the case where the target sensor includes only the odometer sensor, the reading information of the target sensor may be the reading information of the odometer sensor; in the case where the object sensor includes only the optical flow sensor, the reading information of the object sensor may be the reading information of the optical flow sensor; in the case where the target sensor includes an odometer sensor and an optical flow sensor, the reading information of the target sensor may include reading information of the odometer sensor and reading information of the optical flow sensor. The reading information may be a reading of the target sensor or may be data calculated based on the reading of the target sensor, for example, the reading information may be an integral of the reading of the target sensor at a plurality of times.
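As a concrete illustration of reading information computed as an integral of raw readings, the following Python sketch accumulates hypothetical odometer velocity readings over several time instants into a displacement; the sampling period and the reading values are assumptions made for the example, not taken from the patent.

```python
# Sketch: forming "reading information" u_oi as the integral of raw
# odometer readings over several time instants (hypothetical values).

def integrate_readings(velocities, dt):
    """Accumulate velocity readings (m/s) sampled every dt seconds into
    a total displacement, using simple rectangle-rule integration."""
    displacement = 0.0
    for v in velocities:
        displacement += v * dt
    return displacement

# Hypothetical odometer readings up to the current time instant i.
readings = [0.20, 0.22, 0.21, 0.19]  # m/s
u_oi = integrate_readings(readings, dt=0.1)  # about 0.082 m of travel
```

In a real system the integration would run on the robot's sampling clock and feed the resulting u_oi into the objective function below.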
Step 202, constructing an objective function based on the current position information and the reading information.
In this embodiment, the execution subject may construct an objective function based on the current position information and the reading information.
Wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relationship function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the objective sensor.
Here, the current position of the robot and the next position of the robot are relatively speaking, for example, if the position of the robot at the current time is taken as the current position of the robot, the next position of the robot may be the position of the robot after a predetermined time at the current time; if the position of the robot at a preset time before the current moment is taken as the current position of the robot, the next position of the robot can be the position of the robot at the current moment; if the position of the robot at a predetermined time after the current time is taken as the current position of the robot, the next position of the robot may be the position of the robot at 2 predetermined times after the current time, and so on. That is, in embodiments of the present disclosure, the current position of the robot is not limited to merely characterizing the position at which the robot is at the current time.
In some optional implementations of this embodiment, the executing body may execute the step 202 in the following manner: and constructing an objective function based on the current position information, the reading information and the uncertainty of the reading information.
As an example, the objective function h(x*, s*) can be expressed as the following formula (1):

h(x*, s*) = Σ_i ‖ sig(s_oi) · ( f_1(x_i, u_oi) − x_{i+1} ) ‖²_{Σ_oi} + Σ_i ‖ r_ij − s_oi ‖²    (1)

wherein x* includes x_i and x_{i+1}, and s* includes s_oi. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i is used to identify the time instant. s_oi characterizes the switching variable. sig(·) characterizes a sigmoid function used to turn the discrete 0/1 variable into a continuous variable, e.g. sig(s) = 1/(1 + e^{−s}). f_1(·) characterizes the relation function, which may be a nonlinear function. u_oi characterizes the integral of the readings of the target sensor at a plurality of time instants (the current instant i and before). Σ_oi characterizes the uncertainty of the readings of the target sensor, and ‖·‖²_{Σ} denotes the squared residual weighted by the uncertainty Σ. r_ij characterizes a priori data with an initial value of 1.
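To make the role of the switching variable concrete, the following Python sketch evaluates a single-sensor objective of the same shape as formula (1) in one dimension. It is an illustrative reconstruction under stated assumptions (an additive motion model for f_1 and arbitrary numeric values), not the patent's implementation.

```python
import math

def sig(s):
    """Sigmoid turning the 0/1 switching variable into a continuous value."""
    return 1.0 / (1.0 + math.exp(-s))

def f1(x_i, u_oi):
    """Relation function (assumed 1-D additive model): predicted next
    position from the current position and the integrated reading."""
    return x_i + u_oi

def objective(x_i, x_next, u_oi, s_oi, sigma_oi, r_ij=1.0):
    """Single-sensor switchable objective in the spirit of formula (1):
    a switch-weighted measurement residual plus a switch prior term."""
    residual = sig(s_oi) * (f1(x_i, u_oi) - x_next)
    measurement_term = residual ** 2 / sigma_oi
    prior_term = (r_ij - s_oi) ** 2
    return measurement_term + prior_term

# With a consistent reading, keeping the switch at its prior is cheap;
# with a grossly wrong reading, turning the switch "off" caps the cost.
good = objective(x_i=0.0, x_next=1.0, u_oi=1.0, s_oi=1.0, sigma_oi=0.1)
bad_on = objective(x_i=0.0, x_next=1.0, u_oi=10.0, s_oi=1.0, sigma_oi=0.1)
bad_off = objective(x_i=0.0, x_next=1.0, u_oi=10.0, s_oi=-4.0, sigma_oi=0.1)
```

Comparing `bad_on` and `bad_off` shows the mechanism the patent relies on: when the reading is erroneous, a small switch value shrinks the residual term, so the total cost is dominated by the bounded prior term instead of the large measurement error.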
As yet another example, the objective function h(x*, s*) can also be expressed as the following formula (2):

h(x*, s*) = Σ_i ‖ sig(s_oi) · ( f_1(x_i, u_oi) − x_{i+1} ) ‖²_{Σ_oi} + Σ_i ‖ r_ij − s_oi ‖²_{Λ_oi}    (2)

wherein x* includes x_i and x_{i+1}, and s* includes s_oi. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i and j are used to identify time instants. s_oi characterizes the switching variable. sig(·) characterizes a sigmoid function used to turn the discrete 0/1 variable into a continuous variable, e.g. sig(s) = 1/(1 + e^{−s}). f_1(·) is the relation function, which may be a nonlinear function. u_oi characterizes the integral of the readings of the target sensor at a plurality of time instants (the current instant i and before). Σ_oi and Λ_oi respectively characterize uncertainties: Σ_oi characterizes the uncertainty of the readings of the target sensor, and Λ_oi characterizes the uncertainty of the switching variable. r_ij characterizes a priori data with an initial value of 1.
In some alternative implementations of the present embodiment, the target sensor includes an odometer sensor and an optical flow sensor. On this basis, the execution body may execute step 202 above in the following manner: constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable, and the second switching variable.
Wherein the first switching variable is used for controlling the magnitude of a function value of a first relation function, the second switching variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor.
As an example, the objective function h(x*, s*) can be expressed as the following formula (3):

h(x*, s*) = Σ_i ‖ sig(s_oi) · ( f_1(x_i, u_oi) − x_{i+1} ) ‖²_{Σ_oi} + Σ_i ‖ sig(s_fi) · ( f_2(x_i, u_fi) − x_{i+1} ) ‖²_{Σ_fi}    (3)

wherein x* includes x_i and x_{i+1}, and s* includes s_oi and s_fi. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i and j are used to identify time instants. The switching variables include s_oi and s_fi, which respectively characterize the first switching variable and the second switching variable. sig(·) characterizes a sigmoid function used to turn the discrete 0/1 variable into a continuous variable, e.g. sig(s) = 1/(1 + e^{−s}). The relation functions include f_1(·) and f_2(·), each of which may be a nonlinear function. f_1(·) is the first relation function, characterizing the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f_2(·) is the second relation function, characterizing the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. u_oi and u_fi respectively characterize the integral of the readings of the odometer sensor at a plurality of time instants (the current instant i and before) and the integral of the readings of the optical flow sensor at a plurality of time instants. Σ_oi, Σ_fi, Λ_oi and Λ_fi respectively characterize uncertainties: Σ_oi and Σ_fi respectively characterize the uncertainty of the readings of the odometer sensor and the uncertainty of the readings of the optical flow sensor, and Λ_oi and Λ_fi respectively characterize the uncertainty of the first switching variable and the uncertainty of the second switching variable.
In some application scenarios in the above alternative implementation manner, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
As an example, the objective function h(x*, s*) can be expressed as the following formula (4):

h(x*, s*) = ∑_{i,j} [ sig(s_oi)·||f1(x_i, u_oi) − x_{i+1}||²_{Σ_oi} + sig(s_fi)·||f2(x_i, u_fi) − x_{i+1}||²_{Σ_fi} + ||r_ij − sig(s_oi)||²_{Λ_oi} + ||r_ij − sig(s_fi)||²_{Λ_fi} ]    (4)

where ||e||²_Σ = eᵀΣ⁻¹e denotes the squared Mahalanobis norm. x* includes x_i and x_{i+1}; s* includes s_oi and s_fi. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i and j are used to identify time instants. The switching variables s_oi and s_fi respectively denote the first switching variable and the second switching variable. sig() denotes a sigmoid function, sig(s) = 1/(1 + e^(−s)), used to change a discrete variable taking the values 0 or 1 into a continuous variable. The relationship functions include f1() and f2(), each of which may be a nonlinear function. f1() is the first relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. u_oi and u_fi respectively denote the integration of the readings of the odometer sensor over a plurality of times (the current time i and before) and the integration of the readings of the optical flow sensor over a plurality of times. Σ_oi and Σ_fi respectively denote the uncertainty of the readings of the odometer sensor and the uncertainty of the readings of the optical flow sensor. Λ_oi and Λ_fi respectively denote the uncertainty of the first switching variable and the uncertainty of the second switching variable. The constraint term of the first switching variable is ||r_ij − sig(s_oi)||²_{Λ_oi}, and the constraint term of the second switching variable is ||r_ij − sig(s_fi)||²_{Λ_fi}. r_ij denotes a priori data with an initial value of 1.
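To make the structure of formula (4) concrete, the sketch below evaluates a two-sensor switchable objective for a single 1-D step. The additive motion models f1, f2 (x_{i+1} = x_i + u) and all numeric uncertainties are illustrative assumptions, not values from the patent:

```python
import math

def sig(s):
    return 1.0 / (1.0 + math.exp(-s))

def h(x_i, x_next, s_o, s_f, u_o, u_f,
      sigma_o=0.1, sigma_f=0.1, lam_o=1.0, lam_f=1.0, r=1.0):
    """Formula (4)-style objective for one time step in 1-D.

    f1 and f2 are both taken as simple additive models x_{i+1} = x_i + u (an
    assumption); each data term is gated by sig(s), and each switch variable
    carries a prior term pulling sig(s) toward the prior r = 1."""
    f1 = x_i + u_o                      # odometer prediction of the next position
    f2 = x_i + u_f                      # optical-flow prediction of the next position
    data_o = sig(s_o) * (f1 - x_next) ** 2 / sigma_o
    data_f = sig(s_f) * (f2 - x_next) ** 2 / sigma_f
    prior_o = (r - sig(s_o)) ** 2 / lam_o
    prior_f = (r - sig(s_f)) ** 2 / lam_f
    return data_o + data_f + prior_o + prior_f

# Consistent readings: both sensors agree with the true displacement of 1.0.
good = h(x_i=0.0, x_next=1.0, s_o=5.0, s_f=5.0, u_o=1.0, u_f=1.0)
# Faulty odometer (wheel slip reports 3.0): switching its variable off lowers
# the cost compared with keeping it on.
bad_on = h(0.0, 1.0, s_o=5.0, s_f=5.0, u_o=3.0, u_f=1.0)
bad_off = h(0.0, 1.0, s_o=-5.0, s_f=5.0, u_o=3.0, u_f=1.0)
print(bad_off < bad_on)  # True: disabling the bad sensor reduces the objective
```

The switch-prior term charges a fixed price (roughly r² / Λ) for disabling a sensor, so a sensor is only switched off when its residual would cost more than that price.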
As yet another example, the objective function h(x*, s*) can also be expressed as the following formula (5):

h(x*, s*) = ∑_{i,j} [ sig(s_oi)·||f1(x_i, u_oi) − x_{i+1}||²_{Σ_oi} + sig(s_fi)·||f2(x_i, u_fi) − x_{i+1}||²_{Σ_fi} + sig(s_cij)·||f3(x_i, u_ij) − x_j||²_{Σ_cij} + ||r_ij − sig(s_oi)||²_{Λ_oi} + ||r_ij − sig(s_fi)||²_{Λ_fi} + ||r_ij − sig(s_cij)||²_{Λ_cij} ]    (5)

where x* includes x_i, x_{i+1} and x_j; s* includes s_oi, s_fi and s_cij. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e., the loop position. i and j are used to identify time instants. The switching variables s_oi, s_fi and s_cij respectively denote the first switching variable, the second switching variable and the third switching variable. sig() denotes a sigmoid function, sig(s) = 1/(1 + e^(−s)), used to change a discrete variable taking the values 0 or 1 into a continuous variable. The relationship functions include f1(), f2() and f3(), each of which may be a nonlinear function. f1() is the first relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relationship function, which characterizes the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively denote the integration of the readings of the odometer sensor over a plurality of times (the current time i and before), the integration of the readings of the optical flow sensor over a plurality of times, and the integration of the readings of the loop detection sensor over a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively denote the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor, and the uncertainty of the readings of the loop detection sensor. Λ_oi, Λ_fi and Λ_cij respectively denote the uncertainty of the first switching variable, the uncertainty of the second switching variable, and the uncertainty of the third switching variable. r_ij denotes a priori data with an initial value of 1.
In some optional implementations of this embodiment, the robot further includes a loop detection sensor. On this basis, the execution body may execute the above step 202 in the following manner: constructing an objective function based on the current position information, the reading information, and the reading information of the loop detection sensor.
The objective function further includes a third switching variable, where the third switching variable is used to control a magnitude of a function value of a third relationship function, and the third relationship function characterizes a correspondence between a current position of the robot and a loop position of the robot calculated based on reading information of the loop detection sensor.
As an example, the objective function h(x*, s*) can be expressed as the following formula (6):

h(x*, s*) = ∑_{i,j} [ sig(s_oi)·||f1(x_i, u_oi) − x_{i+1}||²_{Σ_oi} + sig(s_fi)·||f2(x_i, u_fi) − x_{i+1}||²_{Σ_fi} + sig(s_cij)·||f3(x_i, u_ij) − x_j||²_{Σ_cij} ]    (6)

where x* includes x_i, x_{i+1} and x_j; s* includes s_oi, s_fi and s_cij. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e., the loop position. i and j are used to identify time instants. The switching variables s_oi, s_fi and s_cij respectively denote the first switching variable, the second switching variable and the third switching variable. sig() denotes a sigmoid function, sig(s) = 1/(1 + e^(−s)), used to change a discrete variable taking the values 0 or 1 into a continuous variable. The relationship functions include f1(), f2() and f3(), each of which may be a nonlinear function. f1() is the first relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relationship function, which characterizes the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively denote the integration of the readings of the odometer sensor over a plurality of times (the current time i and before), the integration of the readings of the optical flow sensor over a plurality of times, and the integration of the readings of the loop detection sensor over a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively denote the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor, and the uncertainty of the readings of the loop detection sensor.
In some application scenarios of the above alternative implementation, the objective function further includes a constraint term for the third switching variable.
As an example, the objective function h(x*, s*) can be expressed as the following formula (7):

h(x*, s*) = ∑_{i,j} [ sig(s_oi)·||f1(x_i, u_oi) − x_{i+1}||²_{Σ_oi} + sig(s_fi)·||f2(x_i, u_fi) − x_{i+1}||²_{Σ_fi} + sig(s_cij)·||f3(x_i, u_ij) − x_j||²_{Σ_cij} + ||r_ij − sig(s_cij)||²_{Λ_cij} ]    (7)

where x* includes x_i, x_{i+1} and x_j; s* includes s_oi, s_fi and s_cij. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e., the loop position. i and j are used to identify time instants. The switching variables s_oi, s_fi and s_cij respectively denote the first switching variable, the second switching variable and the third switching variable. sig() denotes a sigmoid function, sig(s) = 1/(1 + e^(−s)), used to change a discrete variable taking the values 0 or 1 into a continuous variable. The relationship functions include f1(), f2() and f3(), each of which may be a nonlinear function. f1() is the first relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relationship function, which characterizes the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively denote the integration of the readings of the odometer sensor over a plurality of times (the current time i and before), the integration of the readings of the optical flow sensor over a plurality of times, and the integration of the readings of the loop detection sensor over a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively denote the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor, and the uncertainty of the readings of the loop detection sensor. Λ_cij denotes the uncertainty of the third switching variable. The constraint term of the third switching variable is ||r_ij − sig(s_cij)||²_{Λ_cij}. r_ij denotes a priori data with an initial value of 1.
As yet another example, the objective function h(x*, s*) can also be expressed as the following formula (8):

h(x*, s*) = ∑_{i,j} [ sig(s_oi)·||f1(x_i, u_oi) − x_{i+1}||²_{Σ_oi} + sig(s_fi)·||f2(x_i, u_fi) − x_{i+1}||²_{Σ_fi} + sig(s_cij)·||f3(x_i, u_ij) − x_j||²_{Σ_cij} + ||r_ij − sig(s_oi)||²_{Λ_oi} + ||r_ij − sig(s_fi)||²_{Λ_fi} + ||r_ij − sig(s_cij)||²_{Λ_cij} ]    (8)

where x* includes x_i, x_{i+1} and x_j; s* includes s_oi, s_fi and s_cij. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e., the loop position. i and j are used to identify time instants. The switching variables s_oi, s_fi and s_cij respectively denote the first switching variable, the second switching variable and the third switching variable. sig() denotes a sigmoid function, sig(s) = 1/(1 + e^(−s)), used to change a discrete variable taking the values 0 or 1 into a continuous variable. The relationship functions include f1(), f2() and f3(), each of which may be a nonlinear function. f1() is the first relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relationship function, which characterizes the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively denote the integration of the readings of the odometer sensor over a plurality of times (the current time i and before), the integration of the readings of the optical flow sensor over a plurality of times, and the integration of the readings of the loop detection sensor over a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively denote the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor, and the uncertainty of the readings of the loop detection sensor. Λ_oi, Λ_fi and Λ_cij respectively denote the uncertainty of the first switching variable, the uncertainty of the second switching variable, and the uncertainty of the third switching variable. The constraint term of the first switching variable is ||r_ij − sig(s_oi)||²_{Λ_oi}, the constraint term of the second switching variable is ||r_ij − sig(s_fi)||²_{Λ_fi}, and the constraint term of the third switching variable is ||r_ij − sig(s_cij)||²_{Λ_cij}. r_ij denotes a priori data with an initial value of 1.
Here, the above-mentioned switching variables (including the first switching variable, the second switching variable and the third switching variable) may also be implemented with functions such as tanh instead of the sigmoid function. In addition, in the objective function, besides the difference between the relationship function and the next position information or the loop position information, the quotient between the relationship function and the next position information or the loop position information, among others, may also be calculated. Embodiments of the present disclosure are not limited in this regard.
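As noted above, the sigmoid is not the only viable switch function. The sketch below shows a tanh-based alternative; rescaling tanh from (−1, 1) to (0, 1) is an assumption of this example, since the text does not specify a normalization:

```python
import math

def sig(s):
    return 1.0 / (1.0 + math.exp(-s))

def tanh_switch(s):
    """tanh rescaled from (-1, 1) to (0, 1) so it can gate a residual like sig()."""
    return 0.5 * (math.tanh(s) + 1.0)

# Both map the real line to (0, 1) monotonically and agree at s = 0.
assert sig(0.0) == tanh_switch(0.0) == 0.5
for s in (-4.0, -1.0, 0.5, 3.0):
    assert 0.0 < tanh_switch(s) < 1.0
print("both switch functions gate residuals with weights in (0, 1)")
```

tanh saturates faster than the sigmoid, so a tanh-based switch commits more sharply to keeping or rejecting a sensor reading for the same magnitude of s.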
Step 203, determining the next position information corresponding to the obtained current position information based on the objective function.
In this embodiment, the execution body may determine the next location information corresponding to the acquired current location information based on the objective function.
In some optional implementations of this embodiment, the execution body may execute the above step 203 in the following manner: calculating the next position information corresponding to the current position information by taking the objective function attaining its minimum value as the target.

It can be appreciated that, in the above alternative implementation, calculating the next position information corresponding to the current position information with the minimum of the objective function as the target can further improve the accuracy of determining the position of the robot.

Alternatively, a least squares method may be used to calculate, from the objective function, the next position information corresponding to the current position information.
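For fixed switch values, a formula (4)-style objective in 1-D is quadratic in x_{i+1}, so its least-squares minimum has a closed form: a switch-weighted average of the two sensor predictions. A sketch under the same additive motion-model assumption as above (the models and all numbers are invented for illustration):

```python
import math

def sig(s):
    return 1.0 / (1.0 + math.exp(-s))

def next_position(x_i, u_o, u_f, s_o, s_f, sigma_o=0.1, sigma_f=0.1):
    """Minimize sig(s_o)*(x_i+u_o - x)^2/sigma_o + sig(s_f)*(x_i+u_f - x)^2/sigma_f
    over x: a quadratic whose minimizer is the weighted mean of the predictions."""
    w_o = sig(s_o) / sigma_o
    w_f = sig(s_f) / sigma_f
    return (w_o * (x_i + u_o) + w_f * (x_i + u_f)) / (w_o + w_f)

# Both sensors healthy and agreeing: the estimate is their common prediction.
print(next_position(0.0, 1.0, 1.0, s_o=5.0, s_f=5.0))  # 1.0
# Odometer slipping (u_o = 3.0) but switched off: the estimate stays near the
# optical-flow prediction instead of being dragged toward 3.0.
est = next_position(0.0, 3.0, 1.0, s_o=-8.0, s_f=8.0)
print(abs(est - 1.0) < 0.01)  # True
```

In the full problem the switch variables are optimized jointly with the positions, but this closed form shows why a suppressed switch leaves the faulty sensor almost no influence on the estimate.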
With continued reference to fig. 3, fig. 3 is a schematic view of an application scenario of the position determining method of the robot according to the present embodiment. In fig. 3, the robot 310 includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The robot 310 first acquires its own current position information 301 and the reading information 302 of the above-mentioned object sensor. Then, based on the current position information 301 and the reading information 302, an objective function 303 is constructed. Wherein the objective function 303 includes a switching variable for controlling a magnitude of a function value of a relation function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the objective sensor. Finally, the robot 310 determines the next position information 304 corresponding to the acquired current position information 301 based on the objective function 303.
In the position determining method of the robot provided in the above embodiment of the present disclosure, the robot includes a target sensor including at least one of an odometer sensor and an optical flow sensor. Current position information of the robot and reading information of the target sensor are first acquired; an objective function is then constructed based on the current position information and the reading information; finally, the next position information corresponding to the acquired current position information is determined based on the objective function. According to the embodiment of the disclosure, the magnitude of the function value of the relationship function is controlled through the switching variable, so that when at least one of the odometer sensor and the optical flow sensor provides erroneous reading information, the magnitude of the function value of the corresponding relationship function is limited, the influence of the sensor providing the erroneous information on position determination is reduced, and the accuracy of the position determination of the robot can be improved.
With further reference to fig. 4, fig. 4 shows a flow of yet another embodiment of a method of position determination of a robot. The robot includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The flow of the position determining method of the robot comprises the following steps:
step 401, acquiring current position information of the robot and reading information of the target sensor.
Step 402, constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable and the second switching variable.
Wherein the first switching variable is used for controlling the magnitude of a function value of a first relation function, the second switching variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor.
Step 403, determining the next position information corresponding to the obtained current position information based on the objective function.
Specifically, for a multi-loop problem, there may be multiple different paths from point a to point b. If the nodes in the graph represent positions and the edges represent the paths between those positions, one node can be reached from another via multiple paths.
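The multi-path observation above can be made concrete with a toy directed graph whose nodes are positions and whose edges are paths; a depth-first enumeration counts the distinct simple routes from a to b (the graph itself is an invented example):

```python
def count_paths(graph, start, goal, seen=None):
    """Count distinct simple paths from start to goal in a directed graph
    given as an adjacency dict."""
    if seen is None:
        seen = set()
    if start == goal:
        return 1
    seen = seen | {start}   # copy: each branch keeps its own visited set
    return sum(count_paths(graph, nxt, goal, seen)
               for nxt in graph.get(start, []) if nxt not in seen)

# Positions a..d with two loops, a->b->d and a->c->d, plus the direct edge a->d.
graph = {"a": ["b", "c", "d"], "b": ["d"], "c": ["d"], "d": []}
print(count_paths(graph, "a", "d"))  # 3 distinct paths from a to d
```

Each extra path between the same pair of positions is an extra constraint in the pose graph, which is what makes gating unreliable constraints worthwhile.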
The specific implementation of the steps 401 to 403 may refer to the related description of fig. 2, and will not be described herein.
In application scenarios such as position prediction and path planning of a robot, a pose graph optimization problem is usually solved using the odometer sensor, optical flow sensor and loop detection sensor arranged in the robot. However, if there are errors in the reading information of the odometer sensor, the optical flow sensor or the loop detection sensor, the optimization of the whole pose graph is seriously affected.
Further, as an example, after the current position information of the robot and the reading information of the target sensor are acquired, x* and s* may be solved using the following formula (9):

(x*, s*) = argmin_{x*, s*} ∑_{i,j} [ sig(s_oi)·||f1(x_i, u_oi) − x_{i+1}||²_{Σ_oi} + sig(s_fi)·||f2(x_i, u_fi) − x_{i+1}||²_{Σ_fi} + sig(s_cij)·||f3(x_i, u_ij) − x_j||²_{Σ_cij} + ||r_ij − sig(s_oi)||²_{Λ_oi} + ||r_ij − sig(s_fi)||²_{Λ_fi} + ||r_ij − sig(s_cij)||²_{Λ_cij} ]    (9)

where x* includes x_i, x_{i+1} and x_j; s* includes s_oi, s_fi and s_cij. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e., the loop position. i and j are used to identify time instants. The switching variables s_oi, s_fi and s_cij respectively denote the first switching variable, the second switching variable and the third switching variable. sig() denotes a sigmoid function, sig(s) = 1/(1 + e^(−s)), used to change a discrete variable taking the values 0 or 1 into a continuous variable. The relationship functions include f1(), f2() and f3(), each of which may be a nonlinear function. f1() is the first relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relationship function, which characterizes the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relationship function, which characterizes the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively denote the integration of the readings of the odometer sensor over a plurality of times (the current time i and before), the integration of the readings of the optical flow sensor over a plurality of times, and the integration of the readings of the loop detection sensor over a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively denote the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor, and the uncertainty of the readings of the loop detection sensor. Λ_oi, Λ_fi and Λ_cij respectively denote the uncertainty of the first switching variable, the uncertainty of the second switching variable, and the uncertainty of the third switching variable.
The above formula (9) also satisfies the constraint conditions of the following formulas (10) to (15):

x_{i+1} = f1(x_i, u_oi) + w_oi    (10)

x_{i+1} = f2(x_i, u_fi) + w_fi    (11)

x_j = f3(x_i, u_ij) + w_ij    (12)

0 ≤ sig(s_oi) ≤ 1    (13)

0 ≤ sig(s_fi) ≤ 1    (14)

0 ≤ sig(s_cij) ≤ 1    (15)

In formulas (9) to (15), w_oi, w_fi and w_ij all obey zero-mean Gaussian distributions with variances Σ_oi, Σ_fi and Σ_cij, respectively, and r_ij denotes a priori data with an initial value of 1.
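An end-to-end sketch of formula (9)'s joint search over positions and switch variables, reduced to 1-D with two sensors (the loop-closure term is omitted for brevity) and solved by brute force over a small grid rather than a real least-squares solver; every model and numeric value here is an illustrative assumption:

```python
import math

def sig(s):
    return 1.0 / (1.0 + math.exp(-s))

def objective(x_next, s_o, s_f, x_i=0.0, u_o=3.0, u_f=1.0,
              sigma_o=0.1, sigma_f=0.05, lam=1.0, r=1.0):
    """Two-sensor slice of a formula (9)-style objective: switched data terms
    plus switch-prior (constraint) terms pulling sig(s) toward r = 1."""
    e_o = (x_i + u_o) - x_next   # odometer residual, cf. constraint (10)
    e_f = (x_i + u_f) - x_next   # optical-flow residual, cf. constraint (11)
    return (sig(s_o) * e_o ** 2 / sigma_o + sig(s_f) * e_f ** 2 / sigma_f
            + (r - sig(s_o)) ** 2 / lam + (r - sig(s_f)) ** 2 / lam)

# The odometer reports a 3.0 step (wheel slip) while the more certain optical
# flow sensor reports the true 1.0 step. Brute-force the joint minimum.
candidates = [(x / 10.0, s_o, s_f)
              for x in range(0, 41)          # x_next in 0.0 .. 4.0
              for s_o in (-6.0, 6.0)         # switch essentially off / on
              for s_f in (-6.0, 6.0)]
x_best, s_o_best, s_f_best = min(candidates, key=lambda v: objective(*v))
print(x_best)                                # 1.0: the optical-flow prediction
print(sig(s_o_best) < 0.5 < sig(s_f_best))   # True: slipping odometer gated off
```

Because the optical flow term carries the smaller uncertainty here, disabling the slipping odometer is the cheapest way to explain the data; a real implementation would optimize x and s continuously, as formula (9) does.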
The above embodiment provides a more general method for handling erroneous sensor readings, in which the three loops of the pose graph act simultaneously. If the readings of the odometer sensor and the optical flow sensor within a loop section are erroneous, the map data after the loop closure is not affected and the visual correction still takes effect, which ensures the accuracy of map optimization. Moreover, the method has high robustness: map corruption caused by wheel slippage of the robot or by loop detection errors does not occur.
With further reference to fig. 5, as an implementation of the method shown in the foregoing figures, the present disclosure provides an embodiment of a position determining apparatus of a robot, which corresponds to the foregoing method embodiment, and which may include the same or corresponding features as the foregoing method embodiment, in addition to the features described below, and produce the same or corresponding effects as the foregoing method embodiment. The device can be applied to various electronic equipment.
As shown in fig. 5, the robot in the position determining apparatus 500 of the robot of the present embodiment includes a target sensor including at least one of an odometer sensor and an optical flow sensor. The apparatus 500 includes: an acquisition unit 501, a construction unit 502, and a determination unit 503. Wherein, the obtaining unit 501 is configured to obtain current position information of the robot and reading information of the target sensor; a construction unit 502 configured to construct an objective function based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relationship function characterizing a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the objective sensor; a determining unit 503 configured to determine next location information corresponding to the acquired current location information based on the above-described objective function.
In the present embodiment, the acquisition unit 501 of the position determining apparatus 500 of the robot may acquire the current position information of the robot and the reading information of the target sensor.
In this embodiment, the construction unit 502 may construct an objective function based on the current position information and the reading information, where the objective function includes a switch variable, and the switch variable is used to control a magnitude of a function value of a relationship function, and the relationship function characterizes a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the objective sensor.
In the present embodiment, the determination unit 503 may determine the next position information corresponding to the acquired current position information based on the above-described objective function.
In some optional implementations of this embodiment, the building unit 502 includes:
a first construction subunit (not shown in the figure) configured to construct an objective function based on the current location information, the reading information, and the uncertainty of the reading information.
In some alternative implementations of the present embodiment, the target sensor includes an odometer sensor and an optical flow sensor; and
The construction unit 502 includes:
a second construction subunit (not shown in the figure) configured to construct an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable, and the second switching variable;
wherein the first switching variable is used for controlling the magnitude of a function value of a first relation function, the second switching variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents a corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor.
In some optional implementations of this embodiment, the objective function further includes a constraint of the first switching variable and a constraint of the second switching variable.
In some optional implementations of this embodiment, the robot further includes a loop detection sensor; and
The construction unit 502 includes:
a third constructing subunit (not shown) configured to construct an objective function based on the current position information, the reading information, and the reading information of the loop detection sensor;
the objective function further includes a third switching variable, where the third switching variable is used to control a magnitude of a function value of a third relationship function, and the third relationship function characterizes a correspondence between a current position of the robot and a loop position of the robot calculated based on reading information of the loop detection sensor.
In some optional implementations of this embodiment, the objective function further includes a constraint term of the third switching variable.
In some optional implementations of this embodiment, the determining unit 503 includes:
a calculating subunit (not shown) configured to calculate next position information corresponding to the current position information with the objective function having the minimum value obtained as a target.
In the apparatus 500 provided by the above-described embodiment of the present disclosure, the robot includes a target sensor including at least one of an odometer sensor and an optical flow sensor. The acquiring unit 501 acquires current position information of the robot and reading information of the target sensor; a construction unit 502 constructs an objective function based on the current position information and the reading information, wherein the objective function includes a switch variable for controlling a magnitude of a function value of a relationship function, and the relationship function characterizes a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the objective sensor; the determination unit 503 determines the next position information corresponding to the acquired current position information based on the above-described objective function. Therefore, the magnitude of the function value of the relation function is controlled through the switch variable, when at least one of the odometer sensor and the optical flow sensor provides error information, the magnitude of the function value of the relation function is limited, the influence degree of the sensor providing the error information on position determination is reduced, and the accuracy of the position determination of the robot can be improved.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, and an electronic device 600 shown in fig. 6 includes: at least one processor 601, memory 602, and at least one network interface 604 and other user interfaces 603. The various components in the electronic device 600 are coupled together by a bus system 605. It is understood that the bus system 605 is used to enable connected communications between these components. The bus system 605 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various buses are labeled as bus system 605 in fig. 6.
The user interface 603 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen, etc.).
It is to be appreciated that the memory 602 in embodiments of the disclosure may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 602 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 602 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 6021 and application programs 6022.
The operating system 6021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 6022 includes various application programs such as a Media Player (Media Player), a Browser (Browser), and the like for realizing various application services. A program for implementing the method of the embodiment of the present disclosure may be included in the application 6022.
In the embodiments of the present disclosure, the processor 601 is configured to execute the method steps provided by the method embodiments by calling a program or an instruction stored in the memory 602, specifically, a program or an instruction stored in the application 6022, including, for example: acquiring current position information of the robot and reading information of the target sensor; constructing an objective function based on the current position information and the reading information, wherein the objective function comprises a switch variable, and the switch variable is used for controlling the magnitude of a function value of a relation function, and the relation function represents a corresponding relation between the current position of the robot and a next position of the robot calculated based on the reading information of the objective sensor; and determining the next position information corresponding to the acquired current position information based on the objective function.
The methods disclosed in the embodiments of the present disclosure may be applied to the processor 601 or implemented by the processor 601. The processor 601 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 601 or by instructions in the form of software. The processor 601 may be a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present disclosure. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the embodiments of the present disclosure may be embodied directly in hardware, in a software module executed by a decoding processor, or in a combination of the two. The software module may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or registers. The storage medium is located in the memory 602; the processor 601 reads the information in the memory 602 and performs the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (Application Specific Integrated Circuits, ASIC), digital signal processors (Digital Signal Processor, DSP), digital signal processing devices (Digital Signal Processing Device, DSPD), programmable logic devices (Programmable Logic Device, PLD), field programmable gate arrays (Field-Programmable Gate Array, FPGA), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The electronic device provided in this embodiment may be the electronic device shown in fig. 6; it may perform all steps of the position determining method for the robot shown in fig. 2 and thereby achieve the technical effects of that method. For details, reference is made to the description of fig. 2, which is omitted here for brevity.
The disclosed embodiments also provide a storage medium (a computer-readable storage medium) storing one or more programs. The storage medium may comprise volatile memory, such as random access memory; it may also include nonvolatile memory, such as read-only memory, flash memory, a hard disk, or a solid state disk; it may also comprise a combination of the above types of memory.
When the one or more programs in the storage medium are executed by one or more processors, the above method for determining the position of the robot, executed on the electronic device side, is implemented.
The processor is configured to execute a communication program stored in the memory so as to implement the following steps of the method for determining the position of the robot executed on the electronic device side: acquiring current position information of the robot and reading information of the target sensor; constructing an objective function based on the current position information and the reading information, wherein the objective function includes a switch variable used to control the magnitude of the function value of a relation function, and the relation function represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the target sensor; and determining, based on the objective function, the next position information corresponding to the acquired current position information.
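Where a loop detection sensor is present, the description adds a third switch variable that gates the loop-closure term of the objective function. The following one-dimensional sketch illustrates that idea; the name `apply_loop_closure`, the weights, and the closed-form update scheme are illustrative assumptions rather than the disclosed implementation:

```python
def apply_loop_closure(x_pred, loop_pos, w_motion, w_loop, lam=1.0, iters=20):
    """Correct a motion-model estimate x_pred with a loop-closure position
    loop_pos by minimizing
        F(x, s3) = w_motion*(x - x_pred)^2 + s3*w_loop*(x - loop_pos)^2
                   + lam*(1 - s3)^2
    The third switch variable s3 damps the loop-closure term when the
    detected loop disagrees strongly with the motion estimate."""
    x, s3 = x_pred, 1.0
    for _ in range(iters):
        # Closed-form switch update, then the switch-weighted position update.
        s3 = min(1.0, max(0.0, 1.0 - w_loop * (x - loop_pos) ** 2 / (2.0 * lam)))
        x = (w_motion * x_pred + s3 * w_loop * loop_pos) / (w_motion + s3 * w_loop)
    return x, s3
```

A plausible loop closure pulls the estimate toward the loop position with s3 near 1; a spurious detection far from the motion estimate drives s3 toward 0, leaving the motion-model estimate intact.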
Those of skill in the art will further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative elements and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
While the foregoing embodiments have been described in some detail for purposes of clarity of understanding, it will be understood that the above description is by way of example only and is not intended to limit the scope of the disclosure, and that any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the disclosure are intended to be included within the scope of the disclosure.

Claims (8)

1. A method of position determination for a robot, the robot comprising a target sensor comprising at least one of an odometer sensor and an optical flow sensor, the method comprising:
acquiring current position information of the robot and reading information of the target sensor;
in the case that the target sensor comprises an odometer sensor and an optical flow sensor, constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, a first switching variable and a second switching variable; the first switch variable is used for controlling the magnitude of a function value of a first relation function, the second switch variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents the corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents the corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor; or, in the case that the robot further includes a loop detection sensor, constructing an objective function based on the current position information, the reading information, and the reading information of the loop detection sensor; the objective function further comprises a third switch variable, wherein the third switch variable is used for controlling the magnitude of a function value of a third relation function, and the third relation function represents the corresponding relation between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor;
And determining the next position information corresponding to the acquired current position information based on the objective function.
2. The method of claim 1, wherein the objective function is further constructed based on uncertainty of the reading information.
3. The method of claim 1, wherein in the case of constructing an objective function based on the current location information, the reading information of the odometer sensor, the reading information of the optical flow sensor, a first switching variable and a second switching variable, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
4. The method according to claim 1, wherein in case an objective function is constructed based on the current position information, the reading information and the reading information of the loop detection sensor, the objective function further contains a constraint term of the third switching variable.
5. The method according to one of claims 1 to 4, wherein the determining, based on the objective function, the next location information corresponding to the acquired current location information comprises:
and calculating the next position information corresponding to the current position information by taking the minimum value obtained by the objective function as a target.
6. A position determining apparatus of a robot, the robot including a target sensor including at least one of an odometer sensor and an optical flow sensor, the apparatus comprising:
an acquisition unit configured to acquire current position information of the robot and reading information of the target sensor;
a construction unit configured to construct an objective function based on the current position information, reading information of the odometer sensor, reading information of the optical flow sensor, a first switching variable, and a second switching variable, in a case where the target sensor includes an odometer sensor and an optical flow sensor; the first switch variable is used for controlling the magnitude of a function value of a first relation function, the second switch variable is used for controlling the magnitude of a function value of a second relation function, the first relation function represents the corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor, and the second relation function represents the corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor; or, in the case that the robot further includes a loop detection sensor, constructing an objective function based on the current position information, the reading information, and the reading information of the loop detection sensor; the objective function further comprises a third switch variable, wherein the third switch variable is used for controlling the magnitude of a function value of a third relation function, and the third relation function represents the corresponding relation between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor;
And a determining unit configured to determine next position information corresponding to the acquired current position information based on the objective function.
7. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing a computer program stored in said memory, and which, when executed, implements the method of any of the preceding claims 1-5.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any of the preceding claims 1-5.
CN202111392025.7A 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium Active CN114088085B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111392025.7A CN114088085B (en) 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114088085A CN114088085A (en) 2022-02-25
CN114088085B (en) 2023-06-23

Family

ID=80303069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111392025.7A Active CN114088085B (en) 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114088085B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850615A (en) * 2015-05-14 2015-08-19 Xidian University G2o-based SLAM back-end optimization algorithm method
CN110167137A (en) * 2019-05-08 2019-08-23 安克创新科技股份有限公司 The determination method and device of target object
CN110986930A (en) * 2019-11-29 2020-04-10 北京三快在线科技有限公司 Equipment positioning method and device, electronic equipment and storage medium
CN111337018A (en) * 2020-05-21 2020-06-26 上海高仙自动化科技发展有限公司 Positioning method and device, intelligent robot and computer readable storage medium
CN112254741A (en) * 2020-09-09 2021-01-22 安克创新科技股份有限公司 Method for detecting abnormality of mileage sensor, self-moving robot, and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10612929B2 (en) * 2017-10-17 2020-04-07 AI Incorporated Discovering and plotting the boundary of an enclosure


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
On the Positioning of Sensors with Simultaneous Bearing and Range Measurement in Wireless Sensor Networks;M. Khan 等;《IFAC-PapersOnLine》;第52卷(第24期);第334-339页 *
Unstructured-scene reconstruction and ranging based on visual-inertial fusion; Yang Zhijun; Liu Gang; Huang Lei; Qiao Dan; Bai Xue; Zhong Tao; Application Research of Computers (Issue S1); pp. 2-3 *


Similar Documents

Publication Publication Date Title
KR102402111B1 (en) Apparatus and method for performing forward computation in convolutional neural networks
CN112101530B (en) Neural network training method, device, equipment and storage medium
JP7264376B2 (en) How to generate a general-purpose trained model
JP6853148B2 (en) Detection device, detection method and detection program
JP6610278B2 (en) Machine learning apparatus, machine learning method, and machine learning program
CN113095129B (en) Gesture estimation model training method, gesture estimation device and electronic equipment
KR102159880B1 (en) Method and apparatus for metacognition driven state space exploration
CN109711530B (en) Landslide prediction method and system
CN112257848B (en) Method for determining logic core layout, model training method, electronic device and medium
CN114462594A (en) Neural network training method and device, electronic equipment and storage medium
CN111881477A (en) Indexing method and device of data content, computer equipment and storage medium
CN114088085B (en) Position determining method and device for robot, electronic equipment and storage medium
WO2018211927A1 (en) Control apparatus, control program, learning data creation method, and learning method
WO2020217620A1 (en) Training device, estimation device, training method, estimation method, and program
CN109362027B (en) Positioning method, device, equipment and storage medium
CN110533158B (en) Model construction method, system and non-volatile computer readable recording medium
CN114549945A (en) Remote sensing image change detection method and related device
WO2021092872A1 (en) Device fingerprint extraction method based on smartphone sensor
JP2021033583A (en) Control apparatus, control system, and control method
CN109492759B (en) Neural network model prediction method, device and terminal
CN111310794A (en) Target object classification method and device and electronic equipment
CN113343366B (en) Method for determining main section parameters of vehicle body and related equipment
CN117475399B (en) Lane line fitting method, electronic device and readable medium
JP5011529B2 (en) Data processing apparatus, data processing method, and program
CN112465105B (en) Computer-readable recording medium on which learning program is recorded, and learning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant