CN115585819A - Self-moving equipment and return method, return device and computer readable medium thereof - Google Patents

Self-moving equipment and return method, return device and computer readable medium thereof

Info

Publication number
CN115585819A
CN115585819A (application CN202211097575.0A)
Authority
CN
China
Prior art keywords
area
self
determining
target imaging
charging seat
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211097575.0A
Other languages
Chinese (zh)
Inventor
张泫舜
陈熙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecoflow Technology Ltd
Original Assignee
Ecoflow Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecoflow Technology Ltd filed Critical Ecoflow Technology Ltd
Priority to CN202211097575.0A
Publication of CN115585819A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3407 - Route searching; Route guidance specially adapted for specific applications
    • G01C21/3415 - Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3446 - Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3453 - Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3469 - Fuel consumption; Energy use; Emission aspects
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0217 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 - Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a self-moving device and a return method, a return device and a computer readable medium thereof. The method includes: when the self-moving device is located in a preset return area, acquiring an environment image in the advancing direction of the self-moving device; determining the position of a target imaging area and the area of the target imaging area in the environment image, the target imaging area being the imaging area where the charging seat is located; determining attitude adjustment information according to the position; determining a predicted distance between the self-moving device and the charging seat according to the area and a preset fitting function, the preset fitting function indicating the mapping relation between the distance from the self-moving device to the charging seat and the area of the target imaging area; and controlling the self-moving device to move according to the attitude adjustment information and the predicted distance so that the self-moving device docks with the charging seat. The application thereby achieves accurate ranging and positioning between the self-moving device and the charging seat, and an accurate return of the self-moving device through the attitude adjustment information and the predicted distance.

Description

Self-moving equipment and return method, return device and computer readable medium thereof
Technical Field
The present application relates to the field of self-moving device technologies, and in particular, to a self-moving device, a return method thereof, a return device, and a computer readable medium.
Background
With the continuous progress of computer technology and artificial intelligence technology, self-moving devices that work automatically have gradually entered people's lives. While a battery-powered self-moving device travels in its working area, it needs to return to the charging dock for charging when its battery level falls below a set power threshold, so that it can keep working, or it needs to return to the charging dock to stand by after finishing its operation; in other words, the self-moving device needs to complete its return journey.
In the related art, a depth camera is generally used to sense the distance between the self-moving device and the charging dock so that the self-moving device can return to the dock. However, the charging dock is usually dark in color or placed in a dark environment. When a depth camera ranges a dark object, the object absorbs the infrared light so that it cannot return normally; the distance to the charging dock therefore cannot be measured, and automatic return of the self-moving device cannot be achieved with the depth camera. In other words, because the ranging of a depth camera is affected by the color depth of the measured object, the distance cannot be calculated accurately, which makes it difficult for the self-moving device to locate the charging dock precisely and to achieve accurate automatic return control.
Therefore, how to achieve accurate control of automatic return and recharging is a technical problem to be solved.
Disclosure of Invention
An object of the present application is to provide a self-moving device, a return method thereof, a return device, and a computer readable medium, so as to alleviate the problem in the related art that a self-moving device cannot accurately identify the charging dock.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
According to an aspect of the embodiments of the present application, there is provided a return method for a self-moving device, the method including:
when the self-moving equipment is located in a preset return area, acquiring an environment image in the advancing direction of the self-moving equipment;
determining the position of a target imaging region and the area of the target imaging region in an environment image; the target imaging area is an imaging area where the charging seat is located;
determining attitude adjustment information according to the position;
determining a predicted distance between the mobile equipment and a charging seat according to the area and a preset fitting function; the preset fitting function is used for indicating the mapping relation between the distance from the mobile equipment to the charging seat and the area of the target imaging area;
and controlling the movement of the self-moving equipment according to the attitude adjustment information and the predicted distance so that the self-moving equipment docks with the charging seat.
According to an aspect of an embodiment of the present application, there is provided a return device for a self-moving apparatus, including:
the image acquisition module is used for acquiring an environment image in the advancing direction of the self-moving equipment when the self-moving equipment is positioned in a preset return area;
the area determining module is used for determining the position of the target imaging area and the area of the target imaging area in the environment image; the target imaging area is an imaging area where the charging seat is located;
the attitude adjusting module is used for determining attitude adjusting information according to the position;
the prediction distance determining module is used for determining the prediction distance between the mobile equipment and the charging seat according to the area and a preset fitting function; the preset fitting function indicates the mapping relation between the distance from the mobile equipment to the charging seat and the area of the target imaging area;
and the equipment moving module is used for controlling the movement of the self-moving equipment according to the posture adjustment information and the predicted distance, so that the self-moving equipment docks with the charging seat.
According to an aspect of embodiments of the present application, there is provided a computer-readable medium on which a computer program is stored, the computer program, when executed by a processor, implementing a return method from a mobile device as provided in any of the embodiments of the present application.
According to an aspect of the embodiments of the present application, there is provided a self-moving device including: a machine body including a vehicle body and wheels; and a control module configured to execute the return method for the self-moving device provided in any embodiment of the present application.
According to the present application, when the self-moving device is located in the preset return area, an environment image in the advancing direction of the self-moving device is acquired, the target imaging area of the charging dock and the position and area of the target imaging area are extracted from the environment image, attitude adjustment information is determined according to the position, the predicted distance between the self-moving device and the charging dock is determined according to the area and a preset fitting function, and the self-moving device is then controlled to dock with the charging dock according to the attitude adjustment information and the predicted distance. The predicted distance between the self-moving device and the charging dock can thus be determined from the area of the imaging region where the charging dock is located and its correspondence with the preset fitting function. This allows accurate ranging between the self-moving device and the charging dock, prevents the color depth of the charging dock from interfering with the ranging accuracy, achieves accurate ranging and positioning between the self-moving device and the charging dock, and realizes an accurate return of the self-moving device through the attitude adjustment information and the predicted distance.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 schematically shows a block diagram of an exemplary system architecture to which the solution of the present application applies.
Fig. 2 schematically shows a flowchart of a method for returning from a mobile device according to an embodiment of the present application.
Fig. 3 schematically shows a schematic diagram of an application scenario of the technical solution of the present application.
Fig. 4 schematically shows a fitting result diagram of one of the cases of the preset fitting functions.
Fig. 5 is a schematic diagram showing the fitting result of one of the cases of the preset fitting functions.
Fig. 6 schematically shows a fitting result diagram of one of the cases of the preset fitting functions.
Fig. 7 schematically shows a schematic view of an ambient image.
Fig. 8 schematically shows a relative position diagram of the center of the target imaging area and the self-moving device in the present application.
Fig. 9 schematically shows a schematic structural diagram of a return device of a self-moving device to which the technical solution of the present application is applied.
Fig. 10 schematically illustrates a block diagram of a system architecture of a self-moving device suitable for implementing embodiments of the present application.
Fig. 11 schematically illustrates a schematic diagram of a self-moving device provided by an embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Fig. 1 schematically shows an exemplary system architecture block diagram to which the technical solution of the present application is applied.
As shown in fig. 1, the system architecture 100 may include a terminal device 110, a network 120, and a server 130. The terminal device 110 may include a smart phone, a tablet computer, a laptop computer, an intelligent voice interaction device, an intelligent appliance, a vehicle-mounted terminal, a self-moving device, and so on. The server 130 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing cloud computing services. The network 120 may be a communication medium of various connection types capable of providing a communication link between the terminal device 110 and the server 130, such as a wired or wireless communication link. The self-moving device may be a device that contains a self-moving auxiliary function; such a function may be realized by a vehicle-mounted terminal, in which case the corresponding self-moving device may be a vehicle with the vehicle-mounted terminal. The self-moving device may also be a semi-autonomous or fully autonomous device, such as a lawn mower, a sweeper, or a robot with navigation capability.
The system architecture in the embodiments of the present application may have any number of terminal devices, networks, and servers, according to implementation needs. For example, the server 130 may be a server group composed of a plurality of server devices. In addition, the technical solution provided in the embodiment of the present application may be applied to the terminal device 110, or may be applied to the server 130, or may be implemented by both the terminal device 110 and the server 130, which is not particularly limited in this application.
In an embodiment of the present application, the return method from a mobile device provided in the embodiment of the present application is implemented by the server 130. The server 130 acquires attitude information of the mobile device and an environment image of the advancing direction of the mobile device when the mobile device enters a preset return area; determining the area of a target imaging region in an environment image; wherein the target imaging area comprises an imaging area of a charging dock; determining a predicted distance between the mobile equipment and a charging seat according to the area and a preset fitting function; the preset fitting function is used for mapping the relation between the distance from the mobile equipment to the charging seat and the area of the target imaging area; and controlling the movement of the self-moving equipment according to the attitude information and the predicted distance so as to realize the butt joint of the self-moving equipment and the charging seat.
Fig. 2 schematically illustrates a flowchart of a return method from a mobile device according to an embodiment of the present application.
As shown in fig. 2, the return method of the self-moving device includes steps S210 to S250, which are specifically as follows:
s210, when the self-moving equipment is located in a preset return area, an environment image in the advancing direction of the self-moving equipment is obtained.
Here, return refers to the process by which the self-moving device returns to its base. The preset return area is an area within a preset range threshold of the base to which the self-moving device returns. The preset return area is set according to experience with the size of the actual scene, so that once the self-moving device enters the preset return area, the environment image acquired in its advancing direction has a high probability of containing the charging dock; this reduces the acquisition of interfering images and improves the efficiency of acquiring environment images that contain the charging dock.
Fig. 3 schematically illustrates an application scenario of the technical solution of the present application. As shown in fig. 3, the preset return area 12 may be a circular area centered on the charging dock 11 with a radius R of 1.5 meters. Of course, instead of the circular area centered on the position of the charging dock 11 shown in fig. 3, a sector area or an area of another shape centered on the position of the charging dock 11 may also be used; fig. 3 shows only one possible preset return area and does not limit its shape.
The environment image is an image obtained by the self-moving device shooting the environment in its advancing direction, and it should contain the charging dock; if the image shot by the self-moving device does not contain the charging dock, the self-moving device needs to adjust its position or shooting angle until an image containing the charging dock is captured.
In an embodiment of the application, when the self-moving device is located in the preset return area, the camera of the self-moving device is controlled to acquire the environment image in the advancing direction of the self-moving device. The self-moving device may fail to capture an environment image containing the charging dock because of an obstacle; in that case it needs to adjust its position until an environment image containing the charging dock is captured. The camera then sends the environment image to the processor in the self-moving device for image analysis, so that the posture information of the self-moving device can be determined.
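As a rough illustration of this capture-and-retry behavior, the following Python sketch uses hypothetical camera, detector and motion interfaces (none of these names come from the patent); it simply keeps capturing forward-facing images and adjusting the pose until an image containing the charging dock is obtained.

```python
# Minimal sketch of the capture loop described above. The camera, detect_dock
# and adjust_pose callables are hypothetical stand-ins for the device's real
# interfaces, not APIs defined by the patent.

def acquire_dock_image(camera, detect_dock, adjust_pose, max_tries=20):
    """Capture environment images until one contains the charging dock."""
    for _ in range(max_tries):
        image = camera.capture()      # environment image in the advancing direction
        if detect_dock(image) is not None:
            return image              # hand the image to the processor for analysis
        adjust_pose()                 # e.g. rotate or shift slightly and retry
    return None                       # dock not found within the allowed attempts
```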
In an embodiment of the present application, before the self-moving device enters the preset return area, the method further includes: obtaining a first positioning signal of the self-moving device; determining the relative distance from the charging dock to the self-moving device according to the first positioning signal; and when the relative distance is smaller than or equal to a preset range threshold, determining that the self-moving device is located in the preset return area.
Specifically, this embodiment applies the Real-Time Kinematic (RTK) carrier-phase differential technique. An RTK system includes satellites, a reference station, and a rover; both the reference station and the rover have satellite receivers and can observe and receive satellite data. The reference station is a base station that provides a reference, and in this application it is mounted on the charging dock. The rover is a station that can move continuously; it is the target whose three-dimensional coordinates are to be measured, and in this application it can be mounted on the self-moving device. The positioning process of the RTK system is as follows: first, the reference station observes and receives satellite data; second, the reference station sends the observation data to the rover in real time through a radio link; third, the rover observes and receives satellite data while receiving the reference station data; fourth, the rover performs real-time differential calculation based on the reference station data and its own observations according to the relative positioning principle, thereby computing its three-dimensional coordinates and their precision. At this point, the RTK positioning is complete.
In the embodiment of the application, the self-moving device can serve as the rover and the charging dock as the reference station. The first positioning signal is the position of the self-moving device; since the position of the charging dock in the environment is fixed, i.e., known, the relative distance from the charging dock to the self-moving device can be obtained with the distance formula between two points. When the relative distance is smaller than or equal to the preset range threshold, it is determined that the self-moving device is located in the preset return area. The application determines the first positioning signal of the self-moving device through RTK positioning and thereby the relative distance between the charging dock and the self-moving device. This way of determining the relative distance is simple and allows quickly judging whether the self-moving device is located in the preset return area.
When the relative distance is greater than the preset range threshold, it is determined that the self-moving device is not in the preset return area, and the self-moving device needs to be controlled to move toward the charging dock until it enters the preset return area. This ensures that the self-moving device can reach and dock with the charging dock before its power is exhausted.
Illustratively, as shown in fig. 3, the preset range threshold for the distance between the self-moving device 10 and the charging dock 11 is 1.5 meters; when the relative distance between the self-moving device 10 and the charging dock 11 is less than 1.5 meters, it is determined that the self-moving device 10 has entered the preset return area 12, and the self-moving device 10 is controlled to perform S210.
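The in-area check described above reduces to a point-in-circle test once the RTK positions are expressed in a common planar frame. The following Python sketch assumes local east/north coordinates in meters and the 1.5 m threshold taken from the example; it is an illustration, not the patent's implementation.

```python
import math

RETURN_RADIUS_M = 1.5  # preset range threshold from the example above

def relative_distance(device_xy, dock_xy):
    """Euclidean distance between the self-moving device and the charging dock."""
    return math.hypot(device_xy[0] - dock_xy[0], device_xy[1] - dock_xy[1])

def in_preset_return_area(device_xy, dock_xy, radius=RETURN_RADIUS_M):
    """True when the device lies inside the circular preset return area."""
    return relative_distance(device_xy, dock_xy) <= radius

# Example: dock at the origin of the local frame, device about 1.2 m away
print(in_preset_return_area((0.9, 0.8), (0.0, 0.0)))  # True
```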
S220, determining the position of a target imaging region and the area of the target imaging region in the environment image; the target imaging area is an imaging area where the charging seat is located.
The target imaging area may be the charging-stand imaging area itself, or may be an imaging area including the main features of the charging stand.
Specifically, the environment image can be detected and identified through the convolutional neural network, so as to obtain an imaging area where the charging seat is located, that is, a target imaging area. The position of the central point of the target imaging area can be used as the position information of the target imaging area. And calculating to obtain the area of the target imaging region according to the vertex position in the target imaging region.
In one embodiment of the present application, determining a location of a target imaging region and an area of the target imaging region in an environmental image includes: detecting an object present in the environmental image; if the detected object comprises a charging seat, determining an image area where the charging seat is located as a target imaging area; the position of the target imaging region and the area of the target imaging region are acquired.
Specifically, the environment image may be detected by an object detection algorithm, and the type of object detection method can be selected according to the actual situation. For example, in some scenarios, the environment image may be processed by a residual neural network (ResNet) to detect the objects present in it. The object may be, for example, a charging dock, grass, a building, etc., without limitation. To improve detection efficiency, the object of interest may be set to the charging dock in the embodiments of the present application. The environment image is input into the residual neural network, which outputs a detection frame containing the charging dock, thereby determining the target imaging area; the detection result covers features of the charging dock such as texture, color and shape, as well as the position information of the detection frame.
For example, the environment image serves as the input to the residual neural network; it may be a three-channel color image of 640 × 480, or of 720 × 1280, 1080 × 1920, and so on. Since the charging dock is a small target, images of 640 × 480 size are used as the network input in the following detailed description. The residual neural network outputs the coordinate position (x, y, w, h) of the target imaging area, where (x, y) is the coordinate of the upper-left corner of the target imaging area, w is its width and h is its height, and the area of the target imaging area is determined as w·h from this position information.
Illustratively, a coordinate system of the environment image, such as the coordinate system shown in fig. 7, is constructed with the top-left vertex of the environment image as the origin and the two image edges meeting at that vertex as the coordinate axes. Given the coordinates (x_l, y_l) of the upper-left corner of the target imaging area, the width w_c of the target imaging area and the height h_c of the target imaging area, the lower-right corner coordinate is (x_l + w_c, y_l + h_c), and the coordinates (x_i, y_i) of the center point of the target imaging area, i.e., the position of the target imaging area, are calculated as

x_i = x_l + w_c / 2, y_i = y_l + h_c / 2,

and the area of the target imaging area is determined as w_c · h_c.
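The bounding-box arithmetic above is straightforward; the Python sketch below shows one way to compute the center (x_i, y_i) and pixel area of the target imaging area from a detection box (x_l, y_l, w_c, h_c). The box values in the example are illustrative only.

```python
def target_imaging_region(box):
    """Return (center, pixel_area) of the target imaging area from its detection box.

    box is (x_l, y_l, w_c, h_c): upper-left corner, width and height in pixels,
    in an image coordinate system whose origin is the top-left vertex.
    """
    x_l, y_l, w_c, h_c = box
    center = (x_l + w_c / 2.0, y_l + h_c / 2.0)   # (x_i, y_i)
    area = w_c * h_c                              # pixel area d = w_c * h_c
    return center, area

# Example: a 50 x 40 px detection box at (300, 220) in a 640 x 480 image
center, area = target_imaging_region((300, 220, 50, 40))
print(center, area)  # (325.0, 240.0) 2000
```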
And S230, determining posture adjustment information according to the position.
Before the self-moving device docks with the charging dock, the self-moving device may not be aligned with the charging dock; that is, there is a certain offset between the center of the environment image of the self-moving device and the center of the target imaging area. Therefore, in order to adjust the posture of the self-moving device, the posture adjustment information can be determined according to the above position. The posture adjustment information refers to the posture adjustment direction and the posture adjustment amount needed to align the self-moving device with the charging dock.
In one embodiment of the present application, determining pose adjustment information according to the position includes: acquiring the offset direction and the offset of the position relative to the center of the environment image; and determining the attitude adjusting direction and the attitude adjusting quantity according to the offset direction and the offset.
Specifically, in the preset return area, the offset direction and the offset amount of the target imaging area center A (i.e., the position of the target imaging area) with respect to the center B of the environment image are determined, from which the attitude adjustment information is derived and used together with the predicted distance.
In one embodiment of the present application, acquiring the offset direction and the offset amount of the position with respect to the center of the environment image includes: acquiring a first abscissa value and a first ordinate value of the center of the environment image, and a second abscissa value and a second ordinate value of the center of the target imaging area; determining the offset direction according to the first abscissa value and the second abscissa value; and determining the offset according to the first abscissa value, the first ordinate value, the second abscissa value and the second ordinate value.
In one embodiment of the present application, determining the offset direction based on the first abscissa value and the second abscissa value comprises: if the first abscissa value is larger than the second abscissa value, determining the offset direction as a first direction; if the first abscissa value is smaller than the second abscissa value, determining the offset direction as a second direction; the first direction is a left-turn direction or a right-turn direction, and the first direction and the second direction are opposite directions.
Illustratively, for the environment image shown in fig. 7, the center of the environment image is a_0(x_0, y_0), i.e., the first abscissa value is x_0 and the first ordinate value is y_0, and the center of the target imaging area is a_i(x_i, y_i). The two abscissas are compared by computing x_0 - x_i: if the result is less than 0, the self-moving device is on the right side of the charging dock, i.e., the offset direction of the self-moving device at this moment is to the right; otherwise, the offset direction is to the left.
In an embodiment of the present application, determining the offset amount according to the first abscissa value, the first ordinate value, the second abscissa value, and the second ordinate value further includes:
calculating a first absolute difference value of the first abscissa value and the second abscissa value;
calculating a second absolute difference value of the first longitudinal coordinate value and the second longitudinal coordinate value;
and determining the offset according to the first absolute difference value, the second absolute difference value and a preset inverse trigonometric function.
Exemplarily, fig. 8 schematically shows the relative position between the center of the target imaging area and the self-moving device in the present application. It should be noted that fig. 8 only shows one of the possible cases and does not limit the specific locations of the environment image center, the target imaging area center and the self-moving device. In fig. 8, the center of the target imaging area is A, the center of the environment image is B, the intersection point of the horizontal line through the environment image center B and the vertical line through the target imaging area center A is O, the line connecting the target imaging area center A and the environment image center B is the first connecting line AB, and the line connecting the target imaging area center A and the intersection point O is the second connecting line AO. The offset amount is the included angle θ between the first connecting line AB and the second connecting line AO, i.e., the offset angle by which the self-moving device must deflect toward the charging dock. When the offset amount is this offset angle, it can be expressed as

θ = arctan( |x_A - x_B| / |y_A - y_B| )

where |x_A - x_B| is the first absolute difference and |y_A - y_B| is the second absolute difference.
The posture adjustment direction of the self-moving device is determined from the offset direction, and the current orientation angle of the self-moving device is adjusted by the posture adjustment amount along that direction so that the adjusted orientation angle of the self-moving device can face the charging dock head-on. The posture adjustment direction may be the same as the offset direction or opposite to it, and the posture adjustment amount may be equal to the offset amount or to 90° minus the offset amount; which of these is used depends on the chosen manner of controlling the self-moving device to dock with the charging dock.
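Read together, S230 amounts to a sign test on the abscissas followed by an arctangent of the absolute coordinate differences. The Python sketch below is one illustrative reading of that description; the coordinate values are made up, and the mapping from the angle to a concrete motor command is left to the docking strategy chosen later.

```python
import math

def attitude_adjustment(image_center, target_center):
    """Offset direction and offset angle of the target center relative to the image center."""
    x0, y0 = image_center    # first abscissa / first ordinate values
    xi, yi = target_center   # second abscissa / second ordinate values
    # x0 - xi < 0 means the device sits to the right of the charging dock
    direction = "right" if x0 - xi < 0 else "left"
    # offset angle theta = arctan(|x_A - x_B| / |y_A - y_B|)
    theta = math.degrees(math.atan2(abs(x0 - xi), abs(y0 - yi)))
    return direction, theta

# Example with a 640 x 480 image: image center (320, 240), dock center (380, 300)
print(attitude_adjustment((320, 240), (380, 300)))  # direction 'right', offset angle 45 degrees
```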
S240, determining a predicted distance between the mobile device and the charging seat according to the area and a preset fitting function; the preset fitting function represents the mapping relation between the distance from the mobile equipment to the charging seat and the area of the target imaging area.
It should be understood that the preset fitting function is a mapping relation between the area of the target imaging region and the predicted distance obtained according to a large amount of sample data.
FIG. 4 is a diagram schematically illustrating the fitting result of one of the cases of the preset fitting functions; FIG. 5 is a diagram schematically illustrating the fitting result of one of the cases of the preset fitting functions; fig. 6 schematically shows a fitting result diagram of one of the cases of the preset fitting functions.
The predetermined fitting function in this application may be an exponential polynomial fitting function as shown in fig. 4, a quadratic polynomial fitting function as shown in fig. 5, a cubic polynomial fitting function as shown in fig. 6, or may be other fitting functions not shown. As shown in fig. 4, 5, and 6, the discrete points represent the measured actual distance between the mobile device and the charging dock, and the curves represent the predicted distance between the mobile device and the charging dock. In addition, as can be seen from the distribution of the scattered points in the graph, the fitting result of the exponential polynomial fitting function in the present application is better than that of the quadratic polynomial fitting function and the cubic polynomial fitting function, that is, the difference between the predicted distance fitted by the exponential polynomial fitting function and the actual distance obtained by measurement is smaller.
When the predetermined fitting function is an exponential polynomial fitting function, the exponential polynomial fitting function in the present application is as follows:
a · e^(b·d) + c = z

where a, b and c are the exponential fitting parameters, d is the pixel area of the target imaging area, and z is the predicted distance between the self-moving device and the charging dock; the pixel area is d = w·h. Fig. 7 schematically shows a plane coordinate diagram of the environment image and the target imaging area. As shown in fig. 7, the center of the image captured by the self-moving device is a_0(x_0, y_0), the center point of the charging dock imaging area is a_i(x_i, y_i), and the predicted distance from the self-moving device to the charging dock is z.
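For a concrete sense of how such a fitting function might be obtained and used, the following Python sketch fits z = a·e^(b·d) + c to a handful of made-up (pixel area, measured distance) samples with SciPy and then evaluates it online. The sample values and initial guess are purely illustrative; the patent only specifies the functional form.

```python
import numpy as np
from scipy.optimize import curve_fit

def distance_model(d, a, b, c):
    """Exponential mapping from pixel area d to predicted distance z."""
    return a * np.exp(b * d) + c

# Made-up calibration samples: pixel areas of the dock's imaging region
# and the corresponding measured distances in meters.
areas = np.array([800.0, 1200.0, 2000.0, 3500.0, 6000.0])
dists = np.array([1.50, 1.20, 0.90, 0.60, 0.35])

params, _ = curve_fit(distance_model, areas, dists, p0=(2.0, -1e-3, 0.2), maxfev=10000)
a, b, c = params

def predicted_distance(pixel_area):
    """Predicted distance z between the self-moving device and the charging dock."""
    return distance_model(pixel_area, a, b, c)

# Falls between the 1200 px (1.20 m) and 2000 px (0.90 m) calibration samples.
print(predicted_distance(1500.0))
```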
And S250, controlling the movement of the self-moving equipment according to the posture adjustment information and the predicted distance so that the self-moving equipment docks with the charging seat.
The self-moving device adjusts the pose of its body through the pose adjustment direction and the pose adjustment amount, so that the center of the environment image of the self-moving device coincides with the center of the target imaging area.
For example, with reference to fig. 8, when the predicted distance from the self-moving device B to the charging dock A is L1, the obtained offset direction is to the right and the obtained offset amount is the offset angle θ, and the corresponding posture adjustment information differs according to the manner in which the self-moving device is controlled to dock with the charging dock. Meanwhile, the distance L3 from the environment image center (i.e., the self-moving device) B to the intersection point O and the distance L2 from the intersection point O to the target imaging area center (charging dock) A can be calculated from the predicted distance.
In one embodiment of the present application, when the self-moving device first moves toward the intersection point O, the posture adjustment direction may be set to be the same as the offset direction and the posture adjustment amount to θ. After the self-moving device rotates by θ along the offset direction and moves forward by the distance L3 to the intersection point O, its posture is adjusted again by rotating 90 degrees counterclockwise so that its current orientation angle faces the charging dock head-on; the self-moving device with the adjusted orientation then moves toward the charging dock by the distance L2 and docks with the charging dock, completing the movement from B to O and then from O to A. For example, as shown in fig. 8, after the self-moving device deflects to the right by θ, it moves by the distance L3 to reach point O, rotates counterclockwise by 90 degrees at point O, and moves by the distance L2 to reach point A.
In another embodiment of the present application, the orientation angle of the self-moving device may first be adjusted to face the charging dock head-on, that is, the posture adjustment direction is opposite to the offset direction and the posture adjustment amount is 90° - θ. After the self-moving device rotates by 90° - θ opposite to the offset direction and the self-moving device with the adjusted orientation angle is controlled to move toward the charging dock by the distance L2, it translates by the distance L3 along the original offset direction so that it docks with the charging dock, completing the movement from B to Q and then from Q to A. For example, as shown in fig. 8, after the self-moving device deflects to the left by α = 90° - θ, it moves by the distance L2 to reach point Q, and then translates to the right by the distance L3 at point Q to reach point A.
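Both strategies follow from the same right-triangle decomposition of the predicted distance: with θ the offset angle, L2 = L1·cos θ and L3 = L1·sin θ. The Python sketch below expresses the two motion plans as simple command lists; the command tuples are placeholders, not an actual motion API, and the counterclockwise 90° turn corresponds to the specific right-offset case of fig. 8.

```python
import math

def docking_legs(l1, theta_deg):
    """Split the predicted distance L1 into the legs L2 = |OA| and L3 = |BO|."""
    theta = math.radians(theta_deg)
    return l1 * math.cos(theta), l1 * math.sin(theta)  # (L2, L3)

def plan_via_intersection(l1, theta_deg, offset_dir):
    """Strategy 1: turn toward O, drive L3, turn 90 degrees, drive L2 to the dock."""
    l2, l3 = docking_legs(l1, theta_deg)
    return [("rotate", offset_dir, theta_deg), ("forward", l3),
            ("rotate", "counterclockwise", 90.0), ("forward", l2)]

def plan_facing_dock_first(l1, theta_deg, offset_dir):
    """Strategy 2: face the dock (turn 90 - theta away from the offset), drive L2, shift L3."""
    l2, l3 = docking_legs(l1, theta_deg)
    opposite = "left" if offset_dir == "right" else "right"
    return [("rotate", opposite, 90.0 - theta_deg), ("forward", l2),
            ("translate", offset_dir, l3)]

# Example: predicted distance 1.0 m, offset angle 30 degrees, device right of the dock
print(plan_via_intersection(1.0, 30.0, "right"))
print(plan_facing_dock_first(1.0, 30.0, "right"))
```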
When the self-moving device enters the preset return area, the environment image is acquired, the target imaging area of the charging dock is extracted from the environment image, the area of the target imaging area is obtained, the predicted distance between the self-moving device and the charging dock is determined according to this area and the preset fitting function, and the self-moving device is then controlled to dock with the charging dock. In this way, the predicted distance between the self-moving device and the charging dock can be determined from the area of the target imaging area containing the charging dock and its correspondence with the preset fitting function, achieving accurate ranging between the self-moving device and the charging dock. Interference of the color depth of the charging dock with the ranging accuracy is avoided, accurate ranging and positioning between the self-moving device and the charging dock are realized, and accurate control of the automatic return and recharging of the self-moving device is thereby achieved.
Embodiments of a return device of the self-moving device of the present application are described below, which can be used to execute the return method in the above embodiments of the present application. Fig. 9 schematically shows a block diagram of a structure of a return device of a self-moving device according to an embodiment of the present application. As shown in fig. 9, the return device of the self-moving apparatus includes:
the image acquisition module 910 is configured to acquire an environment image in a forward direction of the mobile device when the mobile device is located in a preset return area;
an area determining module 920, configured to determine a position of the target imaging region and an area of the target imaging region in the environment image; the target imaging area is an imaging area where the charging seat is located;
an attitude adjustment module 930 configured to determine attitude adjustment information according to the position;
a predicted distance determining module 940, configured to determine a predicted distance between the mobile device and the charging dock according to the area and a preset fitting function; the preset fitting function indicates the mapping relation between the distance from the mobile equipment to the charging seat and the area of the target imaging area;
and a device moving module 950, configured to control movement of the self-moving device according to the posture adjustment information and the predicted distance, so that the self-moving device is docked with the charging dock.
In an embodiment of the present application, the return device is further specifically configured to: obtaining a first positioning signal from a mobile device; determining the relative distance from the charging seat to the mobile equipment according to the first positioning signal; and when the relative distance is smaller than or equal to the preset range threshold, determining that the self-moving equipment is located in the preset return area.
In an embodiment of the present application, the area determining module 920 is specifically configured to: detecting an object present in the environmental image; if the detected object comprises a charging seat, determining an image area where the charging seat is located as a target imaging area; the position of the target imaging region and the area of the target imaging region are acquired.
In an embodiment of the present application, the gesture adjustment module 930 is specifically configured to determine the pose adjustment information according to the position by: acquiring the offset direction and the offset of the position relative to the center of the environment image; and determining the attitude adjustment direction and the attitude adjustment amount according to the offset direction and the offset.
In an embodiment of the present application, the posture adjustment module 930 is further specifically configured to: wherein, the position is the position of the center of the target imaging area; acquiring a first abscissa value and a first ordinate value of the center of the environment image, and a second abscissa value and a second ordinate value of the center of the target imaging region; determining the offset direction according to the first abscissa value and the second abscissa value; and determining the offset according to the first abscissa value, the first ordinate value, the second abscissa value and the second ordinate value.
In an embodiment of the present application, the posture adjustment module 930 is further specifically configured to: if the first abscissa value is larger than the second abscissa value, determining the offset direction as a first direction; if the first abscissa value is smaller than the second abscissa value, determining the offset direction as a second direction; the first direction is a left-turn direction or a right-turn direction, and the first direction and the second direction are opposite directions.
In an embodiment of the application, the gesture adjusting module 930 is further specifically configured to: calculating a first absolute difference value of the first abscissa value and the second abscissa value; calculating a second absolute difference value of the first ordinate value and the second ordinate value; and determining the offset according to the first absolute difference value, the second absolute difference value and a preset inverse trigonometric function.
The specific details of the return device of the self-moving device provided in each embodiment of the present application have been described in detail in the corresponding method embodiment, and are not described herein again.
Fig. 10 schematically shows a system structure block diagram of an autonomous mobile device for implementing an embodiment of the present application.
It should be noted that the system 1000 of the self-moving device shown in fig. 10 is only an example, and should not bring any limitation to the functions and the application scope of the embodiments of the present application.
As shown in fig. 10, the system 1000 of the self-moving device includes a Central Processing Unit 1001 (CPU), which can perform various appropriate actions and processes according to a program stored in a Read-Only Memory 1002 (ROM) or a program loaded from a storage section 1008 into a Random Access Memory 1003 (RAM). In the random access memory 1003, various programs and data necessary for system operation are also stored. The cpu 1001, the rom 1002, and the ram 1003 are connected to each other via a bus 1004. An Input/Output interface 1005 (Input/Output interface, i.e., I/O interface) is also connected to the bus 1004.
The following components are connected to the input/output interface 1005: an input portion 1006 including a keyboard, a mouse, and the like; an output section 1007 including a Display panel such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 1008 including a hard disk and the like; and a communications portion 1009 including a network interface card such as a local area network card, modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The driver 1010 is also connected to the input/output interface 1005 as necessary. A removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1010 as necessary, so that a computer program read out therefrom is mounted into the storage section 1008 as necessary.
In particular, according to embodiments of the present application, the processes described in the various method flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from the network through the communication part 1009 and/or installed from the removable medium 1011. When the computer program is executed by the cpu 1001, various functions defined in the system of the present application are executed.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the return method from the mobile device according to the embodiment of the present application.
Fig. 11 schematically illustrates a schematic diagram of a self-moving device provided in an embodiment of the present application, where, as shown in fig. 11, the self-moving device 11 includes: a vehicle body 110 including a vehicle body 1101 and wheels 1102; and a control module 1103, configured to execute the return method for self-moving device provided in any embodiment of the present application, where specific details of the return method for self-moving device have been described in detail in corresponding method embodiments, and are not described herein again.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A return method for a self-moving device, the method comprising:
when the self-moving device is located in a preset return area, acquiring an environment image of the self-moving device in its advancing direction;
determining a position of a target imaging region and an area of the target imaging region in the environment image; wherein the target imaging region is an imaging region where a charging seat is located;
determining attitude adjustment information according to the position;
determining a predicted distance between the self-moving device and the charging seat according to the area and a preset fitting function; wherein the preset fitting function indicates a mapping relationship between the distance from the self-moving device to the charging seat and the area of the target imaging region;
and controlling the self-moving device to move according to the attitude adjustment information and the predicted distance, so that the self-moving device docks with the charging seat.
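The following Python snippet is a minimal sketch of one control cycle of the return method in claim 1. It is illustrative only: the environment image is assumed to be a NumPy array, and the detector callback, the motion command interface and the fitted constant k are hypothetical placeholders rather than part of the claimed method.

import math

def area_to_distance(area_px: float, k: float = 1200.0) -> float:
    # Hypothetical preset fitting function: the imaging area of the charging
    # seat shrinks roughly with the square of the distance, so one simple fit
    # is distance = k / sqrt(area).  k would come from offline calibration.
    return k / math.sqrt(area_px)

def return_step(image, detect_dock_region, move):
    """One control cycle: adjust attitude toward the charging seat and close the distance."""
    region = detect_dock_region(image)            # -> (cx, cy, area) or None
    if region is None:
        return False                              # charging seat not visible yet
    cx, cy, area = region
    image_center_x = image.shape[1] / 2.0         # abscissa of the environment image center
    turn = "left" if image_center_x > cx else "right"   # attitude adjustment direction
    distance = area_to_distance(area)             # predicted distance to the charging seat
    move(turn_direction=turn, forward=min(distance, 0.2))
    return distance < 0.05                        # True once close enough to dock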
2. The method of claim 1, wherein before the self-moving device is located in the preset return area, the method further comprises:
acquiring a first positioning signal of the self-moving device;
determining a relative distance from the charging seat to the self-moving device according to the first positioning signal;
and when the relative distance is smaller than or equal to a preset range threshold, determining that the self-moving device is located in the preset return area.
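A minimal sketch of the range check described in claim 2, assuming the first positioning signal has already been converted to planar coordinates; the 3 m threshold is an illustrative value only.

import math

def in_return_area(device_xy, dock_xy, range_threshold_m: float = 3.0) -> bool:
    """True when the relative distance to the charging seat is within the preset range threshold."""
    dx = device_xy[0] - dock_xy[0]
    dy = device_xy[1] - dock_xy[1]
    return math.hypot(dx, dy) <= range_threshold_m

# Example: in_return_area((2.0, 1.5), (0.0, 0.0)) -> True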
3. The method according to claim 1, wherein the attitude adjustment information includes an attitude adjustment direction and an attitude adjustment amount;
the determining attitude adjustment information according to the position comprises:
acquiring an offset direction and an offset of the position relative to the center of the environment image;
and determining the attitude adjustment direction and the attitude adjustment amount according to the offset direction and the offset.
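One way the offset could be mapped to attitude adjustment information (claim 3) is to turn in the offset direction by an amount proportional to the offset, capped per control cycle. The gain and cap below are assumed values, not taken from the patent.

def attitude_adjustment(offset_direction: str, offset_deg: float,
                        gain: float = 1.0, max_step_deg: float = 15.0):
    """Map (offset direction, offset) to (attitude adjustment direction, attitude adjustment amount)."""
    step = min(abs(offset_deg) * gain, max_step_deg)
    return offset_direction, step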
4. The method of claim 3, wherein the position is a position of the center of the target imaging region;
the acquiring an offset direction and an offset of the position relative to the center of the environment image comprises:
acquiring a first abscissa value and a first ordinate value of the center of the environment image, and a second abscissa value and a second ordinate value of the center of the target imaging region;
determining the offset direction according to the first abscissa value and the second abscissa value;
and determining the offset according to the first abscissa value, the first ordinate value, the second abscissa value and the second ordinate value.
5. The method of claim 4, wherein said determining the offset direction from the first abscissa value and the second abscissa value comprises:
if the first abscissa value is greater than the second abscissa value, determining that the offset direction is a first direction;
if the first abscissa value is smaller than the second abscissa value, determining that the offset direction is a second direction;
wherein the first direction is a left-turn direction or a right-turn direction, and the second direction is opposite to the first direction.
6. The method of claim 4, wherein determining the offset from the first abscissa value, the first ordinate value, the second abscissa value, and the second ordinate value comprises:
calculating a first absolute difference value of the first abscissa value and the second abscissa value;
calculating a second absolute difference value of the first ordinate value and the second ordinate value;
and determining the offset according to the first absolute difference value, the second absolute difference value and a preset inverse trigonometric function.
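Claims 4 to 6 together describe how the offset direction and the offset are obtained from the two centers. The sketch below illustrates one reading of them; treating the first direction as a left turn and using atan2 of the two absolute differences as the preset inverse trigonometric function are assumptions, since the claims leave both choices open.

import math

def offset_from_centers(image_center, region_center):
    """Offset direction and offset of the target imaging region center relative to the image center."""
    x1, y1 = image_center      # first abscissa / ordinate values (environment image center)
    x2, y2 = region_center     # second abscissa / ordinate values (target imaging region center)
    if x1 > x2:
        direction = "left"     # first direction (assumed here to be a left turn)
    elif x1 < x2:
        direction = "right"    # second direction, opposite to the first
    else:
        direction = "none"
    dx = abs(x1 - x2)          # first absolute difference
    dy = abs(y1 - y2)          # second absolute difference
    offset_deg = math.degrees(math.atan2(dx, dy))   # assumed inverse trigonometric function
    return direction, offset_deg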
7. The method of any one of claims 1 to 6, wherein the determining a position of a target imaging region and an area of the target imaging region in the environment image comprises:
detecting objects present in the environment image;
if the detected objects comprise the charging seat, determining the image region where the charging seat is located as the target imaging region;
and acquiring the position of the target imaging region and the area of the target imaging region.
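The claims do not fix a particular detector for the charging seat, so the sketch below stands in with a simple OpenCV color-threshold detector; the HSV bounds would be calibration values for a colored marker on the charging seat and are purely illustrative.

import cv2
import numpy as np

def locate_dock_region(image_bgr, lower_hsv=(35, 80, 80), upper_hsv=(85, 255, 255)):
    """Detect the charging seat and return the position (center) and area of its target imaging region."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       np.array(lower_hsv, dtype=np.uint8),
                       np.array(upper_hsv, dtype=np.uint8))
    # OpenCV 4.x returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                          # charging seat not present in the environment image
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    center = (x + w / 2.0, y + h / 2.0)      # position of the target imaging region
    area = float(w * h)                      # area of the target imaging region
    return center, area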
8. A return apparatus for a self-moving device, comprising:
an image acquisition module, configured to acquire an environment image of the self-moving device in its advancing direction when the self-moving device is located in a preset return area;
a region determination module, configured to determine a position of a target imaging region and an area of the target imaging region in the environment image, wherein the target imaging region is an imaging region where a charging seat is located;
an attitude adjustment module, configured to determine attitude adjustment information according to the position;
a predicted distance determination module, configured to determine a predicted distance between the self-moving device and the charging seat according to the area and a preset fitting function, wherein the preset fitting function indicates a mapping relationship between the distance from the self-moving device to the charging seat and the area of the target imaging region;
and a device movement module, configured to control the self-moving device to move according to the attitude adjustment information and the predicted distance, so that the self-moving device docks with the charging seat.
9. A computer-readable medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the return method for a self-moving device according to any one of claims 1 to 7.
10. A self-moving device, comprising:
a vehicle body, including a body and wheels; and
a control module, configured to perform the return method for a self-moving device according to any one of claims 1 to 7.
CN202211097575.0A 2022-09-08 2022-09-08 Self-moving equipment and return method, return device and computer readable medium thereof Pending CN115585819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211097575.0A CN115585819A (en) 2022-09-08 2022-09-08 Self-moving equipment and return method, return device and computer readable medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211097575.0A CN115585819A (en) 2022-09-08 2022-09-08 Self-moving equipment and return method, return device and computer readable medium thereof

Publications (1)

Publication Number Publication Date
CN115585819A (en) 2023-01-10

Family

ID=84771404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211097575.0A Pending CN115585819A (en) 2022-09-08 2022-09-08 Self-moving equipment and return method, return device and computer readable medium thereof

Country Status (1)

Country Link
CN (1) CN115585819A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117848352A (en) * 2024-03-07 2024-04-09 鲁东大学 Auxiliary positioning system based on computer vision
CN117848352B (en) * 2024-03-07 2024-05-14 鲁东大学 Auxiliary positioning system based on computer vision

Similar Documents

Publication Publication Date Title
CN111325796B (en) Method and apparatus for determining pose of vision equipment
CN111209978B (en) Three-dimensional visual repositioning method and device, computing equipment and storage medium
CN106569225B (en) Unmanned vehicle real-time obstacle avoidance method based on ranging sensor
CN110986920B (en) Positioning navigation method, device, equipment and storage medium
CN110889808A (en) Positioning method, device, equipment and storage medium
CN110501712A (en) For determining the method, apparatus, equipment and medium of position and attitude data
US10852740B2 (en) Determining the orientation of flat reflectors during robot mapping
CN111070205A (en) Pile alignment control method and device, intelligent robot and storage medium
CN110263713A (en) Method for detecting lane lines, device, electronic equipment and storage medium
CN110764110B (en) Path navigation method, device and computer readable storage medium
CN110988949A (en) Positioning method, positioning device, computer readable storage medium and mobile device
CN110850882A (en) Charging pile positioning method and device of sweeping robot
US20210192777A1 (en) Method, device and storage medium for positioning object
CN115585819A (en) Self-moving equipment and return method, return device and computer readable medium thereof
CN113848940A (en) AGV autonomous navigation control method and system
CN112925302A (en) Robot pose control method and device
JP2022093291A (en) Induction inspection using object recognition model and navigation plan
Freundlich et al. A hybrid control approach to the next-best-view problem using stereo vision
CN117612132A (en) Method and device for complementing bird's eye view BEV top view and electronic equipment
CN115421486A (en) Return control method and device, computer readable medium and self-moving equipment
CN111207754A (en) Particle filter-based multi-robot formation positioning method and robot equipment
CN110901384A (en) Unmanned vehicle control method, device, medium and electronic equipment
CN110794434A (en) Pose determination method, device, equipment and storage medium
US11514588B1 (en) Object localization for mapping applications using geometric computer vision techniques
EP4345750A1 (en) Position estimation system, position estimation method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination