WO2020227899A1 - Mobile device and control method - Google Patents

Mobile device and control method

Info

Publication number
WO2020227899A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
module
processor
control
interface
Prior art date
Application number
PCT/CN2019/086688
Other languages
English (en)
Chinese (zh)
Inventor
郭厚锦
吴易霖
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to CN201980005506.2A priority Critical patent/CN111316184A/zh
Priority to PCT/CN2019/086688 priority patent/WO2020227899A1/fr
Publication of WO2020227899A1 publication Critical patent/WO2020227899A1/fr

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar

Definitions

  • This application relates to the technical field of mobile platforms, and in particular to a mobile device and a control method.
  • Mobile devices, such as unmanned aerial vehicles, robots, mobile vehicles, mobile ships, and underwater mobile devices, are widely used in fields such as industry, agriculture, civilian applications, film and television, search and rescue, policing, and the military because of their flexible mobility, and they can be applied in complex environments. As technology develops, algorithms become more complex and the amount of data to be processed grows, so mobile devices require ever more processor computing power and place ever higher demands on processing capability.
  • This application provides an improved mobile device and control method.
  • a mobile device includes: a body; a detection device, which is provided on the body and is used to generate a detection signal; a local area network unit, which is provided in the body and includes multiple processors and a network device connected to the multiple processors, where the local area network unit is connected to the detection device and is used to receive and process the detection signal to generate a local area network signal; and a main control module, which is arranged in the body, is communicatively connected with the local area network unit, and is used to receive the local area network signal and generate a control signal according to the local area network signal.
  • a control method for controlling a mobile device includes: generating a detection signal through a detection device provided on the body; processing the detection signal and generating a local area network signal through a local area network unit provided in the body, the local area network unit including a plurality of processors and a network device connected to the plurality of processors; and generating a control signal according to the local area network signal through a main control module that is provided in the body and communicatively connected with the local area network unit.
  • the mobile device in the embodiment of the present application includes a local area network unit.
  • the local area network unit includes multiple processors connected through the network device.
  • the local area network unit can process the detection signal generated by the detection device.
  • the multiple processors of the local area network unit can handle processing tasks with a large amount of data.
  • the computing power is high, which can improve the data processing capacity and speed of the mobile device and make the mobile device more responsive.
  • Fig. 1 is a three-dimensional schematic diagram of an embodiment of a mobile device of this application.
  • Fig. 2 is a block diagram of a module of an embodiment of a mobile device of this application.
  • FIG. 3 is a schematic diagram of a three-dimensional structure of an embodiment of a local area network unit of a mobile device of this application.
  • FIG. 4 is a three-dimensional schematic diagram of an embodiment of the network device of the local area network unit shown in FIG. 3.
  • FIG. 5 is a three-dimensional schematic diagram of an embodiment of the processor of the local area network unit shown in FIG. 3.
  • FIG. 6 is a three-dimensional schematic diagram of the processor shown in FIG. 5 from another angle.
  • FIG. 7 is a three-dimensional schematic diagram of another embodiment of a mobile device of this application.
  • Fig. 8 shows a flowchart of an embodiment of the control method of the present application.
  • the mobile device in the embodiment of the application includes a body, a detection device, a local area network unit, and a main control module.
  • the detection device is arranged on the body and is used to generate detection signals.
  • the local area network unit is arranged in the body and includes multiple processors and network equipment connected to the multiple processors.
  • the local area network unit is connected with the detection device and is used to receive and process the detection signal to generate the local area network signal.
  • the main control module is arranged in the body, and is connected to the local area network unit to receive local area network signals and generate control signals according to the local area network signals.
  • the mobile device of some embodiments of the present application includes a local area network unit, and the local area network unit includes multiple processors connected through a network device. The local area network unit can process the detection signal generated by the detection device, and its multiple processors can handle processing work with a large data volume and provide high computing power, which improves the data processing capacity and speed of the mobile device and makes the mobile device responsive.
  • In some related approaches, a large amount of computation is performed by building a computer workstation, and the mobile device communicates with the computer workstation.
  • Computer workstations are bulky and fixed, whereas mobile devices move around, and a wired connection to a workstation restricts flexible mobility. Some mobile devices communicate with a computer workstation wirelessly, sending the data to be processed to the workstation, which processes it and sends the result back. This back-and-forth communication takes time, and when the workstation communicates with multiple mobile devices it also requires high communication bandwidth, so the real-time performance and reliability of mobile device control are difficult to guarantee.
  • the local area network unit of the mobile device in some embodiments of the present application is provided in the body and moves together with the mobile device, which avoids restricting the flexible movement of the mobile device and allows fast communication with the main control module of the mobile device.
  • control is therefore highly real-time and reliable, and the mobile device responds quickly. As a result, the mobile device of some embodiments of the present application not only has strong and fast processing capability but also maintains a fast response, with high real-time performance and reliability of control.
  • the control method in the embodiment of the present application is used to control a mobile device, and the mobile device includes a body.
  • the control method includes: generating a detection signal through a detection device provided on the body; and processing the detection signal and generating a local area network signal through a local area network unit provided in the body.
  • the local area network unit includes multiple processors and a network device connected to the multiple processors; a control signal is then generated according to the local area network signal through a main control module provided in the body and communicatively connected with the local area network unit.
  • the control method has a strong ability to process the detection signal, and control is real-time and reliable, so the mobile device can react quickly.
  • FIG. 1 is a three-dimensional schematic diagram of an embodiment of a mobile device 100.
  • the mobile device 100 shown in Fig. 1 is an unmanned aerial vehicle.
  • the mobile device 100 includes a body 101.
  • FIG. 2 is a module block diagram of an embodiment of the mobile device 100 shown in FIG. 1. Referring to FIGS. 1 and 2, the mobile device 100 further includes a detection device 102, a local area network unit 103, and a main control module 104.
  • the detection device 102 is provided on the body 101 for generating detection signals.
  • the local area network unit 103 is set in the body 101 and includes multiple processors 131-134 and a network device 135 connected to the multiple processors 131-134.
  • the local area network unit 103 is connected to the detection device 102 and is used to receive and process detection signals to generate a local area network signal.
  • the main control module 104 is provided in the body 101, is communicatively connected to the local area network unit 103, and is used for receiving the local area network signal and generating control signals according to the local area network signal.
  • the local area network unit 103 can process the detection signals generated by the detection device 102, and the multiple processors 131-134 of the local area network unit can handle processing tasks with a large amount of data and have high computing power, thereby improving the data processing capability and speed of the mobile device 100 and making the mobile device responsive.
  • the local area network unit 103 is provided in the body 101 and moves along with the movement of the mobile device 100, which can avoid the restrictions on the flexible movement of the mobile device 100, and can quickly communicate with the main control module 104 of the mobile device 100.
  • control therefore has strong real-time performance and high reliability, so the mobile device responds quickly. As a result, the mobile device 100 not only has strong and fast processing capability but also maintains a fast response, and control is highly real-time and reliable.
  • the detection device 102 includes a sensor, and the generated detection signal may represent information sensed by the detection device 102.
  • the detection device 102 includes at least one of a camera, a radar, a GPS (Global Positioning System) module, an altimeter, an inertial measurement unit (IMU), and a pressure gauge.
  • the camera can be used to take images and so on. Radar can be used for detecting obstacles, ranging, positioning, etc. GPS can be used for positioning.
  • the altimeter can be used to sense the flying height of mobile devices such as unmanned aerial vehicles.
  • the pressure gauge can determine the flight altitude by sensing the pressure of the air.
  • the inertial measurement unit can be used to sense the posture of the mobile device 100.
  • the detection device 102 may include other sensors and the like. For different mobile devices, different detection devices 102 can be provided.
  • the local area network unit 103 is provided in the body 101.
  • the local area network unit 103 may be provided on the top of the body 101.
  • the local area network unit 103 shown in FIG. 1 is provided on the top of the UAV.
  • the local area network unit 103 may be located at the bottom of the body 101 of the mobile device 100 or other locations.
  • the processors 131-134 of the local area network unit 103 and the network device 135 may be assembled in the body 101 as a whole.
  • the processors 131-134 of the local area network unit 103 and the network device 135 can be separated and located in different positions of the body 101, so that the space of the body 101 is used efficiently to ensure the balance and stability of the mobile device 100.
  • the network device 135 may include a switch, a hub, or the like, to implement communication among the multiple processors 131-134.
  • the local area network unit 103 may include two or more processors. For illustrative purposes only, four processors 131-134 are shown in the figure; the number of processors is not limited to four and can be set according to the actual application.
  • the processors 131-134 include at least one of a CPU (central processing unit) and a GPU (graphics processing unit), which provides high computing power with low power consumption.
  • a suitable processor can be selected according to the actual processing required and the type of the detection device 102 to ensure computing power.
  • In one embodiment, the multiple processors 131-134 include multiple CPUs. In another embodiment, the multiple processors 131-134 include multiple GPUs. In another embodiment, the multiple processors 131-134 include one CPU and one or more GPUs. In yet another embodiment, the multiple processors 131-134 include multiple CPUs and one or more GPUs. In some other embodiments, the processors may include other types of processors, such as FPGAs. In some embodiments, the multiple processors 131-134 can form a homogeneous or heterogeneous computer cluster through the network device 135 to perform distributed computing, which greatly shortens processing time and improves the responsiveness of the system.
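  • For illustration only, the following is a minimal sketch of how detection-signal processing could be distributed over such a cluster; the worker addresses, port, and message framing are assumptions for the example and are not defined by this application:

```python
# Minimal sketch (not the claimed implementation): detection data is fanned out to
# processor nodes reachable through the LAN switch. Hosts and the port are examples.
import pickle
import socket
import struct
from concurrent.futures import ThreadPoolExecutor

WORKERS = [("192.168.1.11", 9000), ("192.168.1.12", 9000)]  # hypothetical processor nodes

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        part = sock.recv(n - len(buf))
        if not part:
            raise ConnectionError("worker closed the connection")
        buf += part
    return buf

def _send_msg(sock, obj):
    data = pickle.dumps(obj)
    sock.sendall(struct.pack("!I", len(data)) + data)

def _recv_msg(sock):
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return pickle.loads(_recv_exact(sock, length))

def process_on(worker, chunk):
    """Send one chunk of detection data to a worker node and wait for its result."""
    with socket.create_connection(worker) as sock:
        _send_msg(sock, chunk)
        return _recv_msg(sock)

def distribute(detection_chunks):
    """Fan chunks out to the processor nodes in parallel and collect the results."""
    with ThreadPoolExecutor(max_workers=len(WORKERS)) as pool:
        futures = [pool.submit(process_on, WORKERS[i % len(WORKERS)], chunk)
                   for i, chunk in enumerate(detection_chunks)]
        return [f.result() for f in futures]
```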
  • At least one processor 131-134 is connected to the detection device 102, and processes detection signals generated by the detection device 102.
  • one processor 131-134 may be connected to one or more detection devices 102.
  • One processor 131-134 can process the detection signals generated by one or more detection devices 102.
  • In some cases, the detection signal generated by a detection device 102 has a large amount of data and requires a complex processing algorithm.
  • a processor 131-134 can be connected to that detection device 102 to process its detection signal and ensure a fast processing speed.
  • processors 131-134 that are well suited to processing that detection signal can be selected for the task.
  • the image data generated by the camera can be processed by the GPU.
  • In some cases, the data volume of the detection signals generated by multiple detection devices 102 is not very large, and the detection signals of multiple detection devices 102 can be provided to the same processor 131-134 for processing. In this way processor resources are used efficiently: while computing power is guaranteed, as few processors as possible are provided, so the volume of the local area network unit 103 stays as small as possible and the volume and weight of the mobile device 100 do not increase too much, preserving flexibility and portability.
  • one detection device 102 may be connected to one processor 131-134. In other embodiments, one detection device 102 may be connected to multiple processors 131-134, and multiple processors 131-134 may perform different processing on the detection signal of one detection device 102, respectively.
  • the local area network signal includes a control decision
  • the processors 131-134 are configured to determine the control decision according to the processed detection signal and provide the control decision to the main control module 104.
  • the main control module 104 is used for generating control signals according to control decisions.
  • the local area network unit 103 can process the detection signal and generate a corresponding control decision based on the information sensed by the detection device 102, and then instruct the main control module 104 to generate a corresponding control signal to control the movement, behavior, and the like of the mobile device 100.
  • the local area network unit 103 has strong processing capability and fast processing speed, so that it can realize rapid and timely control.
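  • As a hedged illustration of this flow, the sketch below maps a high-level control decision from the local area network unit to a low-level control signal in the main control module; the decision names, signal fields, and gains are hypothetical and not defined by this application:

```python
# Illustrative sketch only: a control decision produced by the LAN unit is turned
# into a low-level control signal by the main control module. All names are examples.
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    APPROACH_TARGET = auto()
    MOVE_AWAY = auto()
    TRACK_TARGET = auto()
    STOP = auto()

@dataclass
class ControlSignal:
    forward_velocity: float  # m/s, positive = toward the target
    yaw_rate: float          # rad/s, positive = turn right

def main_control(decision: Decision, bearing_rad: float = 0.0) -> ControlSignal:
    """Map a high-level decision to a motor-level control signal."""
    if decision is Decision.APPROACH_TARGET:
        return ControlSignal(forward_velocity=1.0, yaw_rate=0.5 * bearing_rad)
    if decision is Decision.MOVE_AWAY:
        return ControlSignal(forward_velocity=-1.0, yaw_rate=0.0)
    if decision is Decision.TRACK_TARGET:
        return ControlSignal(forward_velocity=0.0, yaw_rate=0.5 * bearing_rad)
    return ControlSignal(forward_velocity=0.0, yaw_rate=0.0)  # STOP
```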
  • the multiple processors 131-134 include a first processor, which is connected to the detection device 102 and is used to process the detection signal.
  • the first processor may be one or more of the processors 131-134 in the figure, and quickly process the detection signal.
  • the processor 131 is taken as the first processor as an example.
  • the multiple processors 131-134 include a second processor that is communicatively connected to the first processor 131 through the network device, and the second processor is connected to the main control module 104 and is used for sending the local area network signal to the main control module 104.
  • the processor 134 is taken as the second processor for description.
  • the second processor 134 may be responsible for communicating with the main control module 104.
  • the second processor 134 may receive the signal processed by the first processor 131 through the network device 135, and send the signal to the main control module 104.
  • the first processor 131 generates a control decision after processing the detection signal, and the second processor 134 sends the control decision to the main control module 104.
  • the second processor 134 may further process the signal processed by the first processor 131, and then send it to the main control module 104.
  • the second processor 134 is configured to receive the detection signal processed by the first processor 131, determine a control decision according to the processed detection signal, and send the control decision to the main control module 104. In this way the work is divided and coordinated, improving efficiency and processing speed.
  • the multiple processors 131-134 include multiple first processors.
  • the processors 131-133 in the figure are the first processors.
  • the first processors 131-133 process the detection signals.
  • the multiple first processors 131-133 can process detection signals independently, can process different detection signals separately, or process the same detection signal differently.
  • a plurality of first processors 131-133 may process the detection signal in cooperation.
  • the first processor 131 may perform processing such as noise removal, enhancement, restoration, segmentation, and/or feature extraction on the image data.
  • the other first processor 132 compresses and/or stores the processed image data.
  • the strengths of the first processor 131 and the first processor 132 can be different, and each processor's strengths can be fully utilized to process the detection signal faster and better. Multiple first processors 131-133 working together increase the processing speed.
  • the second processor 134 communicates with the plurality of first processors 131-133 through the network device 135, and is configured to receive the detection signals processed by the plurality of first processors 131-133 and determine the control decision according to those processed detection signals. After the multiple first processors 131-133 process different detection signals, or perform different processing on the same detection signal, the processed detection signals are provided to the second processor 134, which determines the control decision from the different processing results, so a better control decision can be reached. In one embodiment, the second processor 134 considers the differently processed detection signals together to determine the control decision; for example, a control decision that plans the movement path of the mobile device 100 is determined according to the detection signals of the camera and the radar. In another embodiment, the second processor 134 determines the control decision by jointly considering differently processed data of the same detection signal.
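  • As a sketch of such fusion, assuming the first processors deliver a camera-derived target bearing and radar-derived obstacle ranges (the field names and thresholds are illustrative, not defined by this application), the second processor could decide as follows:

```python
# Hedged sketch: fuse one camera result and one radar result into a single decision.
def fuse_and_decide(camera_result: dict, radar_result: dict) -> dict:
    bearing = camera_result.get("target_bearing_rad")                 # None if no target seen
    ranges = radar_result.get("obstacle_ranges_m", [])
    min_range = min(ranges) if ranges else float("inf")

    if min_range < 2.0:                    # an obstacle is too close: replan around it
        return {"decision": "replan_path", "avoid_range_m": min_range}
    if bearing is not None:                # clear path and a visible target: approach it
        return {"decision": "approach_target", "bearing_rad": bearing}
    return {"decision": "hold_position"}   # nothing to act on
```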
  • At least one of the processors 131-134 is used to process the detection signal using an artificial neural network. The detection signal, or a preprocessed detection signal, is input into the artificial neural network to obtain the processing result. Deep-learning and artificial-intelligence methods process data quickly, and the speed gain is especially significant when processing large volumes of data.
  • the local area network unit 103 provides a hardware foundation for the operation of the artificial neural network algorithm, so that the intelligent control of the mobile device 100 can be realized, and the mobile device 100 can be more intelligent.
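  • A toy sketch of such neural-network processing is given below; the network size is arbitrary and the weights are random placeholders (a real network would be trained offline and loaded onto the processor):

```python
# Minimal example: a tiny fully connected network scores preprocessed detection
# features and returns the index of the highest-scoring candidate decision.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)   # input: 64-dimensional feature vector
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)     # output: 3 candidate decisions

def ann_process(features: np.ndarray) -> int:
    """Forward pass through the toy network."""
    hidden = np.maximum(features @ W1 + b1, 0.0)   # ReLU hidden layer
    logits = hidden @ W2 + b2
    return int(np.argmax(logits))

decision_index = ann_process(rng.normal(size=64))  # e.g. 0=approach, 1=avoid, 2=hold
```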
  • the detection device 102 includes a camera.
  • the camera is used to capture images and generate corresponding image data.
  • the processors 131 to 134 are used to process the image data and determine control decisions based on the processed image data.
  • the amount of image data is relatively large, and the processing algorithm is relatively complex.
  • Using the local area network unit 103 for processing can quickly perform processing and reduce the workload of the main control module 104.
  • artificial neural networks may be used to process image data.
  • the processors 131-134 are used to identify the photographed object and/or determine the relative position of the photographed object and the mobile device 100 according to the image data, and determine corresponding control decisions.
  • the camera can photograph the photographed object, and the image data includes the image data of the photographed object.
  • the image data is processed to identify the photographed object and/or determine the relative position.
  • the control decision includes at least one of the following decisions: a decision to control the mobile device 100 to approach the photographed object, a decision to control the mobile device to move away from the photographed object, a decision to control the mobile device to track the photographed object, and a decision to control the mobile device to strike the photographed object.
  • the main control module 104 correspondingly controls the mobile device 100 according to the control decision, such as controlling the mobile device to move to a position close to the photographed object, move to a position far away from the photographed object, track the photographed object, or strike the photographed object. In this way, timely and fast control is achieved.
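  • For illustration, assuming an object detector running on a processor returns a bounding box for the photographed object, a sketch of estimating its bearing and apparent size from the image and choosing among these decisions might look as follows (the field of view and thresholds are example values, not part of this application):

```python
# Hypothetical sketch: derive a control decision from a detected bounding box.
import math

H_FOV_RAD = math.radians(80.0)  # assumed horizontal field of view of the camera

def decide_from_box(box, image_width, image_height, desired_area_ratio=0.05):
    x0, y0, x1, y1 = box
    center_x = 0.5 * (x0 + x1)
    bearing = (center_x / image_width - 0.5) * H_FOV_RAD          # + = object to the right
    area_ratio = ((x1 - x0) * (y1 - y0)) / (image_width * image_height)

    if area_ratio < desired_area_ratio:        # object appears small -> move closer
        return "approach", bearing
    if area_ratio > 4 * desired_area_ratio:    # object fills the frame -> back away
        return "move_away", bearing
    return "track", bearing                    # otherwise keep it centered
```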
  • the detection device 102 includes a radar, and the processors 131-134 are used to process radar data and determine the relative position information of the obstacle and the mobile device 100.
  • the distance and orientation of the obstacle from the mobile device 100 can be determined by radar.
  • the radar data volume is large and the processing algorithm is complex.
  • the local area network unit 103 can be used to realize rapid processing and reduce the workload of the main control module 104.
  • an artificial neural network may be used to process radar data.
  • the processors 131-134 are configured to determine corresponding control decisions based on relative position information, and the control decisions include at least one of the following decisions: a decision to plan a moving path of the mobile device, and a decision to control the operating state of the mobile device.
  • the path can be re-planned to effectively avoid the obstacle. Controlling the operating state of the mobile device can control the mobile device to stop moving, control the posture of the mobile device, and/or control the moving speed of the mobile device.
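  • A sketch of such a decision based on processed radar data is shown below; the obstacle representation, distances, and margins are assumptions for the example only:

```python
# Hedged sketch: choose between keeping the path, replanning, or stopping,
# given obstacle positions relative to the mobile device (+x = forward, metres).
import math

def plan_from_radar(obstacles_xy, stop_dist=1.0, avoid_dist=3.0):
    ahead = [(x, y) for x, y in obstacles_xy if x > 0 and abs(y) < 1.5]
    if not ahead:
        return {"decision": "keep_path"}
    nearest = min(ahead, key=lambda p: math.hypot(*p))
    dist = math.hypot(*nearest)
    if dist < stop_dist:
        return {"decision": "stop"}                     # control the operating state
    if dist < avoid_dist:
        side = -1.0 if nearest[1] >= 0 else 1.0         # steer away from the obstacle
        return {"decision": "replan_path", "lateral_offset_m": 2.0 * side}
    return {"decision": "keep_path"}
```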
  • the local area network unit 103 includes a wireless communication module 136 connected to at least one processor 131-134.
  • the local area network unit 103 of the mobile device 100 can communicate with the local area network unit 103 of other mobile devices 100 through the wireless communication module 136, and multiple mobile devices 100 can work together.
  • the local area network unit 103 of the mobile device 100 may communicate with external devices (for example, a computer, a mobile phone, etc.) through a wireless communication module 136.
  • the external device can debug the processors 131-134.
  • the external devices may include user equipment, and the user may send information and/or instructions to the local area network unit 103 through the user equipment.
  • the local area network unit 103 of the mobile device 100 may also communicate wirelessly with other devices having wireless communication modules through the wireless communication module 136, for example, wirelessly communicate with a server.
  • the wireless communication module 136 includes an antenna.
  • the antenna includes a WiFi antenna, which can realize wireless communication with external devices such as computers, mobile phones, and servers.
  • the wireless communication module 136 may be omitted.
  • the mobile device 100 includes a power module 107 connected to the main control module 104, and the main control module 104 is used to generate a control signal to control the power module 107.
  • the main control module 104 can control the power module 107 according to the control decision.
  • the power module 107 includes a motor 110 and a propeller 111 connected to the motor 110.
  • the main control module 104 can control the motor 110 to drive the propeller 111.
  • Other types of mobile devices 100 may include other power modules 107, such as traveling mechanisms like wheels, boat paddles, and the like.
  • the mobile device 100 includes a behavior module 108 connected to the main control module 104, and the main control module 104 is configured to generate control signals and control the behavior module 108.
  • the behavior module 108 may be used for image transmission and aerial photography, rescue missions, and/or strike confrontation missions, etc., to implement some tasks of the mobile device 100.
  • the main control module 104 may control the behavior module 108 according to the control decision.
  • the mobile device 100 includes a power supply module 105 provided in the body 101.
  • the power supply module 105 is connected to the local area network unit 103 and the main control module 104 to supply power to them and ensure that the local area network unit 103 and the main control module 104 have the power needed for normal operation.
  • the power supply module 105 includes a battery, which may be a rechargeable battery, such as a lithium battery.
  • the power supply module 105 can supply power to the processors 131-134 and the network device 135.
  • the mobile device 100 includes a sensing module 106 connected to the main control module 104, the sensing module 106 is used to generate a sensing signal to the main control module 104, and the main control module 104 is used to process the sensing signal.
  • the main control module 104 can process the sensing signals and generate control signals, which can control the power module 107 and/or the behavior module 108.
  • the sensing module 106 may include a sensor. The sensor of the sensing module 106 may be different from the sensor of the detection device 102, and the data amount of the sensing signal may be smaller than the data amount of the detection signal of the detection device 102.
  • the main control module 104 can therefore process it quickly to achieve timely and rapid control of the mobile device 100.
  • the perception module 106 includes at least one of a binocular vision module and a carrier-free communication positioning module.
  • the binocular vision module can be used for altitude positioning, distance measurement, etc., and can be used on unmanned aerial vehicles.
  • the carrier-free communication positioning module can be used for indoor precise positioning and can be used on mobile vehicles.
  • FIG. 3 is a schematic diagram of a three-dimensional structure of an embodiment of the local area network unit 103. Only two processors 131 and 132 are shown in the figure, but the number is not limited to two. In one embodiment, the processors 131 and 132 and the network device 135 may be stacked and fixed together. The processors 131 and 132 can be placed on the upper and lower sides of the network device 135, respectively, to facilitate the connection between the processors 131, 132 and the network device 135.
  • the mobile device 100 includes a power adapter board 112.
  • the power adapter board 112 is connected to the power supply module 105 (as shown in FIG. 2) and to the processors 131 and 132, and routes power from the power supply module 105 to the processors 131 and 132.
  • the processor 131 includes a first power interface 1311 and a second power interface 1312; the power supply module 105 is connected through the first power interface 1311, and a power adapter is connected through the second power interface 1312.
  • the power adapter can be connected to an external power source, such as mains power, to supply power to the processor 131.
  • the power adapter can be unplugged from the second power interface 1312, so that the mobile device 100 can move flexibly.
  • a power adapter can be inserted into the second power interface 1312 to supply power to the processor 131, which can save the power of the power supply module 105.
  • Other processors may also be similar to the processor 131, including a first power interface and a second power interface.
  • the power adapter board 112 includes a first adapter board 1121 and a second adapter board 1122, the first adapter board 1121 is connected to the power module 105 and the multiple processors 131, 132, and the second adapter board 1122 connects the power adapter and multiple processors 131 and 132.
  • FIG. 4 shows a three-dimensional schematic diagram of an embodiment of the network device 135.
  • the network device 135 includes a network device power interface 1351, which can be connected to the power supply module 105 (as shown in FIG. 2).
  • the network device 135 may include a first network device power interface connected to the power supply module 105 and a second network device power interface connected to a power adapter.
  • the first network device power interface can be connected to the power supply module 105 through the first adapter board 1121.
  • the second network device power interface can be connected to a power adapter through the second adapter board 1122.
  • the network device 135 includes multiple network interfaces 1352, which can be connected to multiple processors.
  • FIG. 5 shows a three-dimensional schematic diagram of an embodiment of the processor 131.
  • Fig. 6 is a perspective view of the processor 131 from another angle.
  • the processor 131 includes a wired network interface 1313, and is connected to the network device 135 through the wired network interface 1313.
  • the wired network interface 1313 of the processor 131 is wiredly connected to the network interface 1352 of the network device 135, so that data transmission is faster and more reliable.
  • at least one processor 131 includes a detection signal interface and is connected to the detection device 102 (as shown in FIG. 2) through the detection signal interface.
  • the detection device 102 and the processor 131 are wiredly connected to transmit data quickly and reliably.
  • the detection signal interface includes at least one of a UART interface, a CAN interface, a USB interface, an SPI interface, and an I2C interface.
  • the interfaces 1314 and 1315 are USB3.0 interfaces.
  • the interface 1316 is an HDMI interface.
  • the interface 1317 is a USB3.0 MicroB interface.
  • the interfaces 1318 and 1319 are UART interfaces.
  • the interfaces 1320 and 1321 are CAN interfaces.
  • the interface 1322 is an IO interface of I2C and SPI.
  • the detection signal interface may be at least one of the interfaces 1314, 1315, 1317, 1318, 1319, 1320, 1321, 1322.
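  • As a hedged example of reading a detection signal over one such interface, the sketch below uses the third-party pyserial package to read fixed-size frames from a UART; the port name, baud rate, and frame format are assumptions, not part of this application:

```python
# Sketch only: stream raw detection frames from a device on a processor's UART.
import serial  # pip install pyserial

def read_detection_frames(port="/dev/ttyS0", baudrate=115200, frame_size=16):
    with serial.Serial(port, baudrate, timeout=1.0) as uart:
        while True:
            frame = uart.read(frame_size)   # one fixed-size detection frame
            if len(frame) < frame_size:
                continue                     # timeout or partial frame: try again
            yield frame

# for frame in read_detection_frames():
#     handle(frame)  # hand the raw frame to the processing pipeline
```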
  • At least one processor 131 includes a main control interface, and is connected to the main control module 104 (as shown in FIG. 2) through the main control interface.
  • the main control interface includes at least one of a UART interface, a CAN interface, and a USB interface.
  • the main control interface may be at least one of the interfaces 1314, 1315, 1317, 1318, 1319, 1320, and 1321.
  • the processor 131 and the main control module 104 are wiredly connected through an interface, which can ensure timely and effective data transmission.
  • the wireless communication module 136 includes an antenna
  • the at least one processor 131 includes antenna interfaces 1323 and 1324 connected to the antenna.
  • the antenna can be inserted into the antenna interface 1323, 1324.
  • the processor 131 includes an interactive interface for connecting to an interactive device (not shown).
  • the processor 131 can be debugged and the data in the processor 131 can be viewed through the interactive device.
  • the interactive interface may include at least one of an HDMI interface and a USB interface.
  • the interactive interface may be at least one of the interfaces 1314, 1315, 1316, and 1317.
  • the interactive device includes at least one of a display, a mouse, and a keyboard, which can input instructions, load programs, view data, and so on.
  • other processors of the local area network unit may have the same interfaces as the processor 131 or different ones.
  • the interface can be set according to the actual application.
  • FIG. 7 shows a schematic diagram of another embodiment of the mobile device 200.
  • the mobile device 200 shown in FIG. 7 is a mobile trolley.
  • the local area network unit 203 is provided on the body 201 and may be provided on the top of the body 201. In another embodiment, the local area network unit 203 can be located at the bottom of the body 201 or on the chassis.
  • the processor and network equipment of the local area network unit 203 can be separated and arranged in different positions of the body 201.
  • the power module 207 of the mobile device 200 includes wheels and is driven by a motor.
  • the wheels may be universal wheels, such as Mecanum wheels.
  • the mobile device 200 is similar to the mobile device 100, and the local area network unit 203 is similar to the local area network unit 103 of the mobile device 100.
  • FIG. 8 shows a flowchart of an embodiment of a control method 300 of this application.
  • the control method 300 is used to control a mobile device, and the mobile device includes a body.
  • the mobile device may be the mobile device 100 or 200 described above.
  • the control method 300 includes the following steps.
  • In step 301, a detection signal is generated by a detection device provided on the body.
  • In step 302, the detection signal is processed and a local area network signal is generated by the local area network unit provided in the body.
  • the local area network unit includes multiple processors and a network device connected to the multiple processors.
  • In step 303, a control signal is generated according to the local area network signal through the main control module provided in the body and communicatively connected with the local area network unit.
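  • For illustration only, the loop below ties steps 301-303 together; detect(), lan_process(), and apply_control() are placeholders standing in for the detection device, local area network unit, and main control module, not APIs defined by this application:

```python
# Hedged sketch of the overall control loop (step numbers refer to FIG. 8).
import time

def control_loop(detect, lan_process, apply_control, period_s=0.05):
    while True:
        detection_signal = detect()                   # step 301: detection device
        lan_signal = lan_process(detection_signal)    # step 302: LAN unit -> LAN signal
        apply_control(lan_signal)                     # step 303: main control module
        time.sleep(period_s)
```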
  • the control method 300 has a strong ability to process detection signals, and has high control real-time and high reliability, so that the mobile device responds quickly.
  • the local area network signal includes a control decision; the processor determines the control decision according to the processed detection signal; and the control signal is generated according to the control decision.
  • the plurality of processors includes a first processor, which is connected to the detection device; the detection signal is processed by the first processor.
  • the multiple processors include a second processor communicatively connected to the first processor through the network device, and the second processor is connected to the main control module; the control method 300 includes: sending the local area network signal to the main control module through the second processor.
  • the second processor receives the detection signal processed by the first processor, and determines the control decision based on the processed detection signal.
  • the plurality of processors includes a plurality of first processors, and the second processor is communicatively connected with the plurality of first processors through the network device; the control decision is determined according to the detection signals processed by the plurality of first processors.
  • the detection device includes a camera; images are captured through the camera and corresponding image data is generated; the image data is processed by the processor, and the control decision is determined according to the processed image data.
  • the processor is used to identify the photographed object and/or determine the relative position of the photographed object and the mobile device according to the image data, and determine the corresponding control decision.
  • the control decision includes at least one of the following decisions: a decision to control the mobile device to approach the photographed object, a decision to control the mobile device to move away from the photographed object, a decision to control the mobile device to track the photographed object, and a decision to control the mobile device to strike the photographed object.
  • the detection device includes radar.
  • the radar data is processed to determine the relative position information of the obstacle and the mobile device.
  • the processor determines the corresponding control decision based on the relative position information.
  • the control decision includes at least one of the following decisions: a decision to plan the movement path of the mobile device, and a decision to control the operating state of the mobile device.
  • At least one processor uses an artificial neural network to process the detection signal.
  • the multiple processors include wired network interfaces and are connected to the network device through the wired network interfaces; the control method includes wired communication between the multiple processors and the network device.
  • the mobile device includes a sensing module connected to the main control module, and the control method includes: generating a sensing signal to the main control module through the sensing module, and processing the sensing signal through the main control module.
  • the mobile device includes a power module connected to the main control module, and the control method includes: generating a control signal through the main control module to control the power module.
  • the mobile device includes a behavior module connected to the main control module, and the control method includes: generating a control signal through the main control module to control the behavior module.
  • for relevant details, reference may be made to the description of the device embodiments.
  • the method embodiments and the device embodiments complement each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A mobile device and a control method are provided. The mobile device comprises a body, a detection device, a local area network unit, and a main control module. The detection device is arranged on the body and is used to generate a detection signal. The local area network unit is arranged on the body and comprises a plurality of processors and a network device connected to the plurality of processors. The local area network unit is connected to the detection device and is used to receive and process the detection signal and generate a local area network signal. The main control module is arranged on the body, is communicatively connected to the local area network unit, and is used to receive the local area network signal and generate a control signal according to the local area network signal.
PCT/CN2019/086688 2019-05-13 2019-05-13 Mobile device and control method WO2020227899A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980005506.2A CN111316184A (zh) 2019-05-13 2019-05-13 Mobile device and control method
PCT/CN2019/086688 WO2020227899A1 (fr) 2019-05-13 2019-05-13 Mobile device and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/086688 WO2020227899A1 (fr) 2019-05-13 2019-05-13 Mobile device and control method

Publications (1)

Publication Number Publication Date
WO2020227899A1 true WO2020227899A1 (fr) 2020-11-19

Family

ID=71148357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/086688 WO2020227899A1 (fr) 2019-05-13 2019-05-13 Mobile device and control method

Country Status (2)

Country Link
CN (1) CN111316184A (fr)
WO (1) WO2020227899A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203946284U * 2014-07-21 2014-11-19 深圳市大疆创新科技有限公司 Flight system, aircraft, and processor
US20180143633A1 (en) * 2016-06-28 2018-05-24 Faraday&Future Inc. Multi-processor soc system
CN108762152A * 2018-06-04 2018-11-06 上海哲奥实业有限公司 Open intelligent connected-vehicle domain controller hardware platform
CN108891409A * 2018-08-29 2018-11-27 固安海高汽车技术有限公司 Intelligent driving system, central domain controller, and method therefor
CN208752538U * 2018-10-29 2019-04-16 深圳市大疆创新科技有限公司 Movable platform and host thereof


Also Published As

Publication number Publication date
CN111316184A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
JP6618577B2 (ja) Electronic device that moves based on a distance to an external object
US10901437B2 (en) Unmanned aerial vehicle including an omnidirectional depth sensing and obstacle avoidance aerial system and method of operating same
WO2019033747A1 (fr) Method for determining a target for intelligent tracking by an unmanned aerial vehicle, unmanned aerial vehicle, and remote control device
CN113534828A (zh) Center-of-mass position determination method and apparatus, legged robot, and storage medium
WO2022021027A1 (fr) Target tracking method and apparatus, unmanned aerial vehicle, system, and related readable storage medium
US20190324448A1 (en) Remote steering of an unmanned aerial vehicle
CN110187720A (zh) Unmanned aerial vehicle guidance method, apparatus, system, medium, and electronic device
CN115933718A (zh) Autonomous UAV flight method fusing panoramic SLAM and target recognition
Farooq et al. A lightweight controller for autonomous following of a target platform for drones
Labrado et al. Proposed testbed for the modeling and control of a system of autonomous vehicles
WO2022126598A1 (fr) Unmanned aerial vehicle systems and structures
US20210181769A1 (en) Movable platform control method, movable platform, terminal device, and system
WO2020227899A1 (fr) Mobile device and control method
WO2021208917A1 (fr) Center-of-mass position determination method and device, walking robot, and storage medium
CN115237158A (zh) Autonomous tracking and landing control system and control method for a multi-rotor unmanned aerial vehicle
WO2022021028A1 (fr) Target detection method, device, unmanned aircraft, and computer-readable storage medium
Bhat et al. Real-time gesture control UAV with a low resource framework
Kohlbrecher et al. RoboCup Rescue 2016 team description paper hector Darmstadt
CN115922731B (zh) Robot control method and robot
Al-Khalil et al. Unmanned Ground Vehicle with Virtual Reality Vision
WO2022113482A1 (fr) Information processing device, method, and program
Kumar et al. Autonomous Drone Navigation using Monocular Camera and Light Weight Embedded System
KR20190053018A (ko) Method for controlling an unmanned aerial device including a camera, and electronic device
Singh et al. Development of a low-cost Collision Avoidance System based on Coulomb’s inverse-square law for Multi-rotor Drones (UAVs)
EP4207100A1 (fr) Method and system for providing a user interface for creating map targets

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19928373

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19928373

Country of ref document: EP

Kind code of ref document: A1