CN111158355A - Automatic navigation cloud server and automatic navigation control method - Google Patents

Automatic navigation cloud server and automatic navigation control method

Info

Publication number
CN111158355A
Authority
CN
China
Prior art keywords
transport vehicle
cloud server
pose information
scene
destination
Prior art date
Legal status
Pending
Application number
CN201811320589.8A
Other languages
Chinese (zh)
Inventor
占兆武
谢恺
罗为
Current Assignee
Shenzhen Fulian Fugui Precision Industry Co Ltd
Original Assignee
Fuhuake Precision Industry Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Fuhuake Precision Industry Shenzhen Co ltd filed Critical Fuhuake Precision Industry Shenzhen Co ltd
Priority to CN201811320589.8A priority Critical patent/CN111158355A/en
Publication of CN111158355A publication Critical patent/CN111158355A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0225Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B11/00Automatic controllers
    • G05B11/01Automatic controllers electric
    • G05B11/36Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential
    • G05B11/42Automatic controllers electric with provision for obtaining particular characteristics, e.g. proportional, integral, differential for obtaining a characteristic which is both proportional and time-dependent, e.g. P. I., P. I. D.
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0285Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An automatic navigation cloud server is applied to a scene and comprises a transport vehicle, a destination, a camera device and a cloud server. The cloud server plans a reference path for the transport vehicle to travel to the destination, and the camera device shoots an image of the transport vehicle. The cloud server analyzes the current pose information of the transport vehicle according to the image, gives the ideal pose information of a forward reference point of the transport vehicle according to the reference path, and calculates a travel control instruction of the transport vehicle according to the current pose information of the transport vehicle and the ideal pose information of the forward reference point. The cloud server then transmits the travel control instruction to the transport vehicle so as to control the transport vehicle to travel to the destination according to the travel control instruction. An automatic navigation control method is also provided.

Description

Automatic navigation cloud server and automatic navigation control method
Technical Field
The invention relates to an automatic navigation cloud server based on a PID control algorithm and an automatic navigation control method.
Background
Currently, the Automatic Guided Vehicle (AGV) has become an important device in intelligent manufacturing, advanced logistics and smart factories, serving to facilitate product transportation, improve production efficiency and reduce production cost. Automatic guidance means that the vehicle travels along a preset path; common guidance methods currently include magnetic navigation, laser navigation, two-dimensional code navigation and visual navigation. Magnetic navigation requires laying magnetic strips along the preset travel path, which means modifying the existing use environment with considerable construction work. Laser navigation is susceptible to lighting changes, has difficulty recognizing transparent objects and is relatively expensive. Two-dimensional code navigation requires pasting numerous two-dimensional codes along the preset travel path, which affects the appearance of the application environment, and the codes are easily damaged or covered. Visual navigation places lower demands on hardware and higher demands on algorithms, but the image information it acquires offers the potential to obtain or mine richer information, so AGV navigation control algorithms based on visual navigation have increasingly become a research hotspot in both academia and engineering.
At present, visual navigation is mainly divided into visual navigation based on vehicle-mounted vision equipment and visual navigation based on a wireless network and vision equipment. Visual navigation based on vehicle-mounted vision equipment requires complex deep neural network learning and training, and can only grasp local environment and position information. Visual navigation based on a wireless network and vision equipment does not require modification of the site environment, can grasp nearly global environment and position information, facilitates global optimization in system design, places low demands on sensors and has low cost, making it an AGV navigation mode with good application prospects.
Disclosure of Invention
In view of the above, there is a need for an automatic navigation cloud server for controlling the travel of a transport vehicle.
An automatic navigation cloud server is applied to a scene and comprises a transport vehicle, a destination, a camera device and a cloud server. The cloud server plans a reference path for the transport vehicle to travel to the destination, and the camera device shoots an image of the transport vehicle. The cloud server analyzes the current pose information of the transport vehicle according to the image, gives the ideal pose information of a forward reference point of the transport vehicle according to the reference path, and calculates a travel control instruction of the transport vehicle according to the current pose information of the transport vehicle and the ideal pose information of the forward reference point. The cloud server then transmits the travel control instruction to the transport vehicle so as to control the transport vehicle to travel to the destination according to the travel control instruction.
An automatic navigation control method is applied to a scene, and comprises the following steps:
planning a reference path of the transport vehicle to the destination;
shooting images of the transport vehicle in the scene;
analyzing the current pose information of the transport vehicle according to the image and giving ideal pose information of a forward reference point of the transport vehicle according to the reference path;
calculating a running control instruction of the transport vehicle according to the current pose information of the transport vehicle and the ideal pose information of the forward reference point;
and controlling the transport vehicle to run to the destination according to the running control instruction.
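The five steps above can be read as one closed control loop executed by the cloud server. The following Python sketch is illustrative only; all callables (plan_reference_path, capture_image, estimate_pose, forward_reference_point, compute_drive_command, send_command, at_destination) are hypothetical names that do not appear in this disclosure and are injected as parameters so the sketch stays self-contained.

import time

def navigation_loop(plan_reference_path, capture_image, estimate_pose,
                    forward_reference_point, compute_drive_command, send_command,
                    at_destination, period_s=0.1):
    """Run the five claimed steps as a periodic loop; every callable is injected."""
    path = plan_reference_path()                          # step 1: reference path to the destination
    while not at_destination():
        image = capture_image()                           # step 2: image of the transport vehicle
        pose = estimate_pose(image)                       # step 3a: current pose (x, y, theta)
        ref_pose = forward_reference_point(path, pose)    # step 3b: ideal pose of forward reference point
        v, omega = compute_drive_command(pose, ref_pose)  # step 4: running control instruction
        send_command(v, omega)                            # step 5: drive toward the destination
        time.sleep(period_s)                              # assumed fixed control period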
The automatic navigation cloud server calculates the running control instruction so that the transport vehicle can quickly return to the reference path when a running deviation occurs, thereby realizing efficient deviation correction and greatly improving the running accuracy and efficiency of the transport vehicle.
Drawings
Fig. 1 is a functional block diagram of an automatic navigation cloud server according to a preferred embodiment of the present invention.
Fig. 2 is a schematic diagram of an application of the automatic navigation cloud server in a scene according to a preferred embodiment of the present invention.
Description of the main elements
Automatic navigation cloud server 100
Carrier vehicle 10
First wireless communication module 12
Destination 20
Image pickup device 30
Cloud server 40
Path planning module 42
Navigation module 44
Transport vehicle control module 46
Analysis module 48
Second wireless communication module 49
Scene 500
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
Referring to fig. 1, an automatic navigation cloud server 100 according to a preferred embodiment of the present invention includes a transport vehicle 10, a destination 20, a camera device 30, and a cloud server 40. The transport vehicle 10, the camera device 30 and the cloud server 40 are communicatively connected through a wireless network so as to transmit data and commands to one another. The automatic navigation cloud server 100 is arranged in a scene, such as a warehouse.
Referring to fig. 2, a scene 500 is provided with the destination 20 and the transport vehicle 10, and a camera device 30 is further disposed to capture images in the scene 500, where the images include the transport vehicle 10. The cloud server 40 extracts the current pose information of the transporter 10 through a visual positioning algorithm. The cloud server 40 also plans a reference path for the transporter 10 to travel to the destination 20 and calculates the pose information of a forward reference point. In the present embodiment, the reference path is a linear path composed of a plurality of points arranged in order, and the forward reference point is a reference point on the reference path toward which the transporter 10 travels forward. The cloud server 40 calculates a control instruction for the traveling of the transporter 10 according to the current pose information of the transporter 10 and the pose information of the forward reference point. The transporter 10 travels toward the destination 20 under the guidance of the cloud server 40.
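As a rough illustration of how such a reference path and forward reference point might be represented, the following Python sketch stores the path as an ordered list of poses (X_i, Y_i, θ_i) and picks the forward reference point a fixed number of points ahead of the nearest path point; the lookahead rule and the default value of 5 are assumptions, not taken from this disclosure.

import math

def nearest_index(path, x, y):
    # Index of the path point (X_i, Y_i) closest to the current position (x, y).
    return min(range(len(path)), key=lambda i: math.hypot(x - path[i][0], y - path[i][1]))

def forward_reference_point(path, pose, lookahead=5):
    # path: ordered list of (X_i, Y_i, theta_i); pose: current (x, y, theta).
    x, y, _ = pose
    i = nearest_index(path, x, y)
    j = min(i + lookahead, len(path) - 1)  # a point ahead of the vehicle along the path
    return path[j]                         # ideal pose (x_r, y_r, theta_r)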
The transport vehicle 10 is an Automated Guided Vehicle (AGV) and is controlled by the cloud server 40 to move within the scene. In this embodiment, the transport vehicle 10 is controlled by the cloud server 40 to travel toward the destination 20 in the scene. The transporter 10 includes a first wireless communication module 12 for establishing a wireless communication connection with the cloud server 40 to transmit data and commands.
The destination 20 is fixedly arranged at a position within the scene. In this embodiment, the destination 20 may be a rack for carrying cargo, and the transporter 10 is driven toward the destination 20 to perform loading and unloading interactions.
The camera device 30 is fixedly disposed on a ceiling in the scene, and is configured to capture images in the scene, including capturing the movement of the transporter 10 in the scene. The image information captured by the imaging device 30 is transmitted to the cloud server 40.
The cloud server 40 is configured to collect image information of the camera device 30, analyze the movement of the transporter 10, and control the movement of the transporter 10. The cloud server 40 analyzes the position and posture of the transporter 10 and its surrounding environmental conditions by analyzing the movement of the transporter 10. It is understood that the cloud server 40 further includes a plurality of 4G/5G base stations disposed within or near the scene for communicating instructions and information to the transporter 10.
The cloud server 40 includes a path planning module 42, a navigation module 44, a transporter control module 46, an analysis module 48, and a second wireless communication module 49. The second wireless communication module 49 establishes a wireless communication connection with the first wireless communication module 12 of the transporter 10 to transmit data and commands. The path planning module 42 is configured to plan the reference path traveled by the transporter 10 toward the destination 20. The navigation module 44 is configured to provide navigation information to the transporter 10 according to the reference path. The transporter control module 46 is configured to control the transporter 10 to travel to the destination 20 according to the navigation information. The analysis module 48 receives the image information transmitted by the camera device 30 and analyzes the current pose information of the transporter 10. Specifically, the analysis module 48 extracts characteristic information of the transporter 10 from the image information. In this embodiment, the analysis module 48 extracts the current pose information (x, y, θ) of the transporter 10 based on a visual positioning algorithm, where the parameters x and y are the position information of the transporter 10 in the scene global coordinate system, and θ is the heading angle information of the transporter 10.
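The disclosure does not specify the visual positioning algorithm itself. Purely as an illustration, the sketch below assumes the overhead camera is calibrated so that two markers fixed on the vehicle (rear and front) are already detected in scene global coordinates, from which the pose (x, y, θ) follows directly; the two-marker layout is an assumption, not part of this disclosure.

import math

def pose_from_markers(rear_xy, front_xy):
    # Current pose of the vehicle: midpoint of the two markers and the heading
    # angle of the rear-to-front vector in the scene global coordinate system.
    x = (rear_xy[0] + front_xy[0]) / 2.0
    y = (rear_xy[1] + front_xy[1]) / 2.0
    theta = math.atan2(front_xy[1] - rear_xy[1], front_xy[0] - rear_xy[0])
    return x, y, theta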
After the path planning module 42 formulates the reference path, the analysis module 48 analyzes the current pose information (x, y, θ) of the transporter 10 and the ideal pose information (x_r, y_r, θ_r) of the selected forward reference point, wherein the parameters x_r and y_r are the position information of the forward reference point in the scene global coordinate system, and θ_r is the ideal course angle information of the forward reference point. Based on the current pose information (x, y, θ) of the transporter 10 and the ideal pose information (x_r, y_r, θ_r) of the forward reference point, the analysis module 48 calculates three deviation amounts: an angular deviation Δθ, a distance deviation Δd, and another angular deviation Δφ. The angular deviation Δθ is defined as the angle between the heading angle of the transporter 10 and the ideal heading angle of the forward reference point, i.e., Δθ = θ − θ_r, representing that the transporter 10 needs an angle adjustment of Δθ. The distance deviation Δd is defined as the minimum distance from the current position of the transporter 10 to the reference path, i.e., Δd = min{norm[(x, y) − (X_i, Y_i)], i ∈ [1, N]}, where norm denotes the Euclidean distance and N denotes the number of quantized points on the reference path; the smaller N is, the larger the spacing between adjacent points and the lower the resolution. (X_i, Y_i) is the position parameter of the i-th point constituting the reference path. The other angular deviation Δφ is defined as the angle between the heading angle of the transporter 10 and the line connecting the destination 20 and the current position of the transporter 10, i.e., Δφ = φ − θ, where φ is the direction angle of that connecting line. According to the three deviation values, the running control command (v, ω) of the transporter 10 is calculated based on a PID (proportional, integral, derivative) algorithm, wherein the parameter v is the linear speed of the transporter 10 and the parameter ω is the angular speed of the transporter 10. The computational mathematical model is expressed as
ω = PID(Δθ, Δd, Δφ) = PID_1(Δθ) + PID_2(Δd) + PID_3(Δφ)
wherein the three components take the proportional-integral-derivative form
PID_1(Δθ) = K_{1,P}·Δθ + K_{1,I}·∫Δθ dt + K_{1,D}·dΔθ/dt
PID_2(Δd) = K_{2,P}·Δd + K_{2,I}·∫Δd dt + K_{2,D}·dΔd/dt
PID_3(Δφ) = K_{3,P}·Δφ + K_{3,I}·∫Δφ dt + K_{3,D}·dΔφ/dt
wherein K_{i,P}, i ∈ [1,3], is the proportional gain coefficient, K_{i,I}, i ∈ [1,3], is the integral gain coefficient, and K_{i,D}, i ∈ [1,3], is the derivative gain coefficient. The analysis module 48 transmits the travel control command (v, ω) to the transport vehicle 10 through the second wireless communication module 49 to control the traveling of the transport vehicle 10.
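To make the above concrete, the following Python sketch computes the three deviations Δθ, Δd and Δφ exactly as defined above and combines three PID terms into the angular velocity ω. The incremental integral/derivative bookkeeping and the constant linear speed v are assumptions, since the disclosure only specifies ω = PID_1(Δθ) + PID_2(Δd) + PID_3(Δφ) and the gain coefficients.

import math

class PID:
    # One PID term with gains (K_P, K_I, K_D) and sampling period dt.
    def __init__(self, kp, ki, kd, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev = None

    def __call__(self, error):
        self.integral += error * self.dt
        derivative = 0.0 if self.prev is None else (error - self.prev) / self.dt
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def deviations(pose, ref_pose, path, dest_xy):
    x, y, theta = pose
    _, _, theta_r = ref_pose
    d_theta = theta - theta_r                                     # angular deviation
    d_d = min(math.hypot(x - Xi, y - Yi) for Xi, Yi, _ in path)   # minimum distance to the reference path
    phi = math.atan2(dest_xy[1] - y, dest_xy[0] - x)              # direction of the line to the destination
    d_phi = phi - theta                                           # second angular deviation
    return d_theta, d_d, d_phi

def compute_drive_command(pose, ref_pose, path, dest_xy, pid1, pid2, pid3, v=0.5):
    d_theta, d_d, d_phi = deviations(pose, ref_pose, path, dest_xy)
    omega = pid1(d_theta) + pid2(d_d) + pid3(d_phi)
    return v, omega   # running control command (v, omega)

In practice the path, destination and PID instances would be bound into the callable passed to the control loop sketched earlier, for example via functools.partial.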
The automatic navigation cloud server 100 calculates the driving control command (v, ω) so that the transport vehicle 10 can quickly return to the reference path when a driving deviation occurs, thereby realizing efficient deviation correction and greatly improving the driving accuracy and efficiency of the transport vehicle 10.
In view of the above, although the preferred embodiments of the present invention have been disclosed for illustrative purposes, the present invention is not limited to the above-described embodiments, and those skilled in the relevant art can make various modifications and applications without departing from the scope of the basic technical idea of the present invention.

Claims (10)

1. An automatic navigation cloud server, applied to a scene, characterized in that: the automatic navigation cloud server comprises a transport vehicle, a destination, a camera device and a cloud server; the cloud server plans a reference path for the transport vehicle to travel to the destination; the camera device shoots an image of the transport vehicle; the cloud server analyzes current pose information of the transport vehicle according to the image, gives ideal pose information of a forward reference point of the transport vehicle according to the reference path, and calculates a running control instruction of the transport vehicle according to the current pose information of the transport vehicle and the ideal pose information of the forward reference point; and the cloud server transmits the running control instruction to the transport vehicle so as to control the transport vehicle to travel to the destination according to the running control instruction.
2. The automated navigation cloud server of claim 1, wherein: the cloud server extracts the current pose information (x, y, θ) of the transport vehicle based on a visual positioning algorithm, wherein the parameters x and y are the position information of the transport vehicle in the scene global coordinate system, and θ is the course angle information of the transport vehicle; the cloud server analyzes the ideal pose information (x_r, y_r, θ_r) of the forward reference point, wherein the parameters x_r and y_r are the position information of the forward reference point in the scene global coordinate system, and θ_r is the ideal course angle information of the forward reference point.
3. The automated navigation cloud server of claim 2, wherein: the cloud server is configured to calculate a driving control command (v, ω) of the transport vehicle according to the current pose information (x, y, θ) of the transport vehicle and the ideal pose information (x_r, y_r, θ_r) of the forward reference point, wherein the calculation of the cloud server is based on the mathematical model ω = PID(Δθ, Δd, Δφ) = PID_1(Δθ) + PID_2(Δd) + PID_3(Δφ), wherein
PID_1(Δθ) = K_{1,P}·Δθ + K_{1,I}·∫Δθ dt + K_{1,D}·dΔθ/dt
PID_2(Δd) = K_{2,P}·Δd + K_{2,I}·∫Δd dt + K_{2,D}·dΔd/dt
PID_3(Δφ) = K_{3,P}·Δφ + K_{3,I}·∫Δφ dt + K_{3,D}·dΔφ/dt
wherein the parameter Δθ is an angular deviation value, the parameter Δd is a distance deviation value, the parameter Δφ is another angular deviation value, the parameter v is the linear speed of the transport vehicle, and ω is the angular speed of the transport vehicle.
4. The automated navigation cloud server of claim 3, wherein: the cloud server provides navigation information of the transport vehicle according to the driving control instruction (v, ω).
5. The automated navigation cloud server of claim 1, wherein: the transport vehicle is an Automated Guided Vehicle (AGV), the destination is fixedly arranged at a position in the scene, the camera device is fixedly arranged in the scene and is used for shooting images in the scene, and the cloud server establishes a wireless communication connection with the transport vehicle so that the cloud server transmits the driving control instruction to the transport vehicle.
6. An automatic navigation control method is applied to a scene, and is characterized in that: the automatic navigation control method comprises the following steps:
planning a reference path of the transport vehicle to the destination;
shooting images of the transport vehicle in the scene;
analyzing the current pose information of the transport vehicle according to the image and giving ideal pose information of a forward reference point of the transport vehicle according to the reference path;
calculating a running control instruction of the transport vehicle according to the current pose information of the transport vehicle and the ideal pose information of the forward reference point;
and controlling the transport vehicle to run to the destination according to the running control instruction.
7. The automatic navigation control method of claim 6, wherein: the automatic navigation control method further comprises the following steps:
extracting current pose information (x, y, θ) of the transport vehicle based on a visual positioning algorithm, wherein the parameters x and y are the position information of the transport vehicle in the scene global coordinate system, and θ is the course angle information of the transport vehicle;
analyzing the ideal pose information (x_r, y_r, θ_r) of the destination, wherein the parameters x_r and y_r are the position information of the destination in the scene global coordinate system, and θ_r is the ideal course angle information of the destination.
8. The automatic navigation control method of claim 7, wherein: the automatic navigation control method further comprises the following steps:
calculating a driving control command (v, ω) of the transport vehicle according to the current pose information (x, y, θ) of the transport vehicle and the ideal pose information (x_r, y_r, θ_r) of the destination, wherein the calculation of the cloud server is based on the mathematical model ω = PID(Δθ, Δd, Δφ) = PID_1(Δθ) + PID_2(Δd) + PID_3(Δφ), wherein
PID_1(Δθ) = K_{1,P}·Δθ + K_{1,I}·∫Δθ dt + K_{1,D}·dΔθ/dt
PID_2(Δd) = K_{2,P}·Δd + K_{2,I}·∫Δd dt + K_{2,D}·dΔd/dt
PID_3(Δφ) = K_{3,P}·Δφ + K_{3,I}·∫Δφ dt + K_{3,D}·dΔφ/dt
wherein the parameter Δθ is an angular deviation value, the parameter Δd is a distance deviation value, the parameter Δφ is another angular deviation value, the parameter v is the linear speed of the transport vehicle, and ω is the angular speed of the transport vehicle.
9. The automatic navigation control method of claim 8, wherein: the automatic navigation control method further comprises the following steps:
providing navigation information of the transport vehicle according to the driving control instruction (v, ω).
10. The automatic navigation control method of claim 6, wherein: the destination is fixedly arranged at a position in the scene, the camera device is fixedly arranged in the scene and is used for shooting images in the scene, and the cloud server is in wireless communication connection with the transport vehicle so as to enable the cloud server to transmit the driving control command to the transport vehicle.
CN201811320589.8A 2018-11-07 2018-11-07 Automatic navigation cloud server and automatic navigation control method Pending CN111158355A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811320589.8A CN111158355A (en) 2018-11-07 2018-11-07 Automatic navigation cloud server and automatic navigation control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811320589.8A CN111158355A (en) 2018-11-07 2018-11-07 Automatic navigation cloud server and automatic navigation control method

Publications (1)

Publication Number Publication Date
CN111158355A true CN111158355A (en) 2020-05-15

Family

ID=70554511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811320589.8A Pending CN111158355A (en) 2018-11-07 2018-11-07 Automatic navigation cloud server and automatic navigation control method

Country Status (1)

Country Link
CN (1) CN111158355A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101713999A (en) * 2009-11-18 2010-05-26 北京矿冶研究总院 Navigation control method of underground autonomous scraper
JP2011237226A (en) * 2010-05-07 2011-11-24 Navitime Japan Co Ltd Navigation system, navigation device, navigation server, navigation method and program
CN103558856A (en) * 2013-11-21 2014-02-05 东南大学 Service mobile robot navigation method in dynamic environment
JP2017120238A (en) * 2015-12-28 2017-07-06 トヨタ自動車株式会社 Navigation information providing system and navigation information providing device
CN107228675A (en) * 2016-03-24 2017-10-03 高德信息技术有限公司 A kind of determination method of road residing for terminal, apparatus and system
KR101781048B1 (en) * 2016-04-20 2017-09-25 엘지전자 주식회사 Control device for a vehhicle
CN108458712A (en) * 2017-02-22 2018-08-28 深圳市城市交通规划设计研究中心有限公司 Unmanned trolley navigation system and air navigation aid, unmanned trolley
CN107463172A (en) * 2017-07-21 2017-12-12 浙江大学 It is a kind of that automated driving system is controlled based on the high in the clouds of SUMO and unified time axle
CN108674551A (en) * 2018-07-03 2018-10-19 武汉职业技术学院 A kind of remote monitoring balance car system based on NB-IoT

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Fayong: "Research on lane keeping *** based on artificial potential field", Agricultural Equipment & Vehicle Engineering, vol. 56, no. 1, pages 32-36 *
Liu Fayong: "Research on lane keeping *** based on artificial potential field", Agricultural Equipment & Vehicle Engineering, no. 01, pages 32-36 *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11487288B2 (en) 2017-03-23 2022-11-01 Tesla, Inc. Data synthesis for autonomous control systems
US12020476B2 (en) 2017-03-23 2024-06-25 Tesla, Inc. Data synthesis for autonomous control systems
US11893393B2 (en) 2017-07-24 2024-02-06 Tesla, Inc. Computational array microprocessor system with hardware arbiter managing memory requests
US11403069B2 (en) 2017-07-24 2022-08-02 Tesla, Inc. Accelerated mathematical engine
US11409692B2 (en) 2017-07-24 2022-08-09 Tesla, Inc. Vector computational unit
US11681649B2 (en) 2017-07-24 2023-06-20 Tesla, Inc. Computational array microprocessor system using non-consecutive data formatting
US11561791B2 (en) 2018-02-01 2023-01-24 Tesla, Inc. Vector computational unit receiving data elements in parallel from a last row of a computational array
US11797304B2 (en) 2018-02-01 2023-10-24 Tesla, Inc. Instruction set architecture for a vector computational unit
US11734562B2 (en) 2018-06-20 2023-08-22 Tesla, Inc. Data pipeline and deep learning system for autonomous driving
US11841434B2 (en) 2018-07-20 2023-12-12 Tesla, Inc. Annotation cross-labeling for autonomous control systems
US11636333B2 (en) 2018-07-26 2023-04-25 Tesla, Inc. Optimizing neural network structures for embedded systems
US11983630B2 (en) 2018-09-03 2024-05-14 Tesla, Inc. Neural networks for embedded devices
US11562231B2 (en) 2018-09-03 2023-01-24 Tesla, Inc. Neural networks for embedded devices
US11893774B2 (en) 2018-10-11 2024-02-06 Tesla, Inc. Systems and methods for training machine models with augmented data
US11665108B2 (en) 2018-10-25 2023-05-30 Tesla, Inc. QoS manager for system on a chip communications
US11816585B2 (en) 2018-12-03 2023-11-14 Tesla, Inc. Machine learning models operating at different frequencies for autonomous vehicles
US11537811B2 (en) 2018-12-04 2022-12-27 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11908171B2 (en) 2018-12-04 2024-02-20 Tesla, Inc. Enhanced object detection for autonomous vehicles based on field view
US11610117B2 (en) 2018-12-27 2023-03-21 Tesla, Inc. System and method for adapting a neural network model on a hardware platform
US11748620B2 (en) 2019-02-01 2023-09-05 Tesla, Inc. Generating ground truth for machine learning from time series elements
US12014553B2 (en) 2019-02-01 2024-06-18 Tesla, Inc. Predicting three-dimensional features for autonomous driving
US11567514B2 (en) 2019-02-11 2023-01-31 Tesla, Inc. Autonomous and user controlled vehicle summon to a target
US11790664B2 (en) 2019-02-19 2023-10-17 Tesla, Inc. Estimating object properties using visual image data
CN111486848A (en) * 2020-05-25 2020-08-04 上海杰销自动化科技有限公司 AGV visual navigation method, system, computer equipment and storage medium
CN113044020A (en) * 2021-04-13 2021-06-29 杭州豪盛电动车辆有限公司 Handling system
CN113325850A (en) * 2021-06-01 2021-08-31 武汉商学院 Autonomous cruise system and method for cloud tour guide robot

Similar Documents

Publication Publication Date Title
CN111158355A (en) Automatic navigation cloud server and automatic navigation control method
Barry et al. High‐speed autonomous obstacle avoidance with pushbroom stereo
CN110969655B (en) Method, device, equipment, storage medium and vehicle for detecting parking space
CN109000649B (en) Omni-directional mobile robot pose calibration method based on right-angle bend characteristics
KR101323705B1 (en) Autonomous freight transportation system using mobile robot for autonomous freight transportation
CN107169468A (en) Method for controlling a vehicle and device
CN113791621B (en) Automatic steering tractor and airplane docking method and system
KR101644270B1 (en) Unmanned freight transportation system using automatic positioning and moving route correcting
US20210318122A1 (en) Positioning apparatus capable of measuring position of moving body using image capturing apparatus
CN206193534U (en) Carrying device and storehouse deposit -holding article management system
US20210101747A1 (en) Positioning apparatus capable of measuring position of moving body using image capturing apparatus
CN111198496A (en) Target following robot and following method
WO2023045486A1 (en) Warehousing system, shuttle vehicle for warehousing system and navigation method therefor
US11687086B2 (en) Autonomous robotic navigation in storage site
CN112462762B (en) Robot outdoor autonomous moving system and method based on roadside two-dimensional code unit
CN108919810A (en) The localization for Mobile Robot and navigation system of view-based access control model teaching
CN106556395A (en) A kind of air navigation aid of the single camera vision system based on quaternary number
Behrje et al. An autonomous forklift with 3d time-of-flight camera-based localization and navigation
US20220366599A1 (en) Positioning system and moving body for measuring position of moving body using image capturing apparatus
CN111033426A (en) Moving object, position estimation device, and computer program
CN111708010B (en) Mobile equipment positioning method, device and system and mobile equipment
CN115686073B (en) Unmanned aerial vehicle-based transmission line inspection control method and system
Lee et al. Cyber Physical Autonomous Mobile Robot (CPAMR) framework in the context of industry 4.0
WO2020137311A1 (en) Positioning device and moving object
US20240124137A1 (en) Obstacle avoidance for aircraft from shadow analysis

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221207

Address after: The first floor, the second floor, the third floor and the fourth floor of the factory building No.1, f8d District, Foxconn science and Technology Industrial Park, east side of Minqing Road, Longhua street, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Fulian Fugui Precision Industry Co.,Ltd.

Address before: 518109 3rd floor, building 1, F8B, Foxconn Science Park, No.2, Donghuan 2nd Road, Longhua street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: FUHUAKE PRECISION INDUSTRY (SHENZHEN) Co.,Ltd.

TA01 Transfer of patent application right
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200515

WD01 Invention patent application deemed withdrawn after publication