WO2020259154A1 - Device control method, terminal, controlled device, electronic device, medium and program - Google Patents

Device control method, terminal, controlled device, electronic device, medium and program

Info

Publication number
WO2020259154A1
WO2020259154A1 (PCT/CN2020/091915; CN2020091915W)
Authority
WO
WIPO (PCT)
Prior art keywords
controlled device
program code
control instruction
program
terminal
Prior art date
Application number
PCT/CN2020/091915
Other languages
English (en)
French (fr)
Inventor
张军伟 (Zhang Junwei)
Original Assignee
上海商汤智能科技有限公司 (Shanghai SenseTime Intelligent Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海商汤智能科技有限公司
Publication of WO2020259154A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: Control or regulating systems in general; functional elements of such systems; monitoring or testing arrangements for such systems or elements
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems, electric
    • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 19/4183: characterised by data acquisition, e.g. workpiece identification
    • G05B 19/41835: characterised by programme execution
    • G05B 19/41845: characterised by system universality, reconfigurability, modularity
    • G05B 19/4185: characterised by the network communication
    • G05B 19/41855: by local area network [LAN], network structure
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: specially adapted to land vehicles
    • G05D 1/0212: with means for defining a desired trajectory
    • G05D 1/0221: involving a learning process
    • G05D 1/0231: using optical position detecting means
    • G05D 1/0246: using a video camera in combination with image processing means
    • G05D 1/0276: using signals provided by a source external to the vehicle
    • G05D 1/0278: using satellite positioning signals, e.g. GPS
    • G05D 1/0287: involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D 1/12: Target-seeking control
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • This application relates to smart device control technology, in particular to a device control method, terminal, controlled device, electronic device, computer storage medium, and computer program.
  • remote control methods for smart devices can be applied in multiple fields; for example, they can be applied in the fields of remote control and programming education.
  • corresponding control instructions can be sent through fixed command buttons.
  • the embodiments of the present application propose a device control method, terminal, controlled device, electronic device, computer storage medium, and computer program.
  • the embodiment of the present application provides a device control method, the method includes:
  • the program code is sent to the first controlled device through the local agent program, so that the first controlled device runs the program code.
  • the method further includes:
  • the feedback information of the first controlled device is received, and the feedback information is generated by the first controlled device after running the program code.
  • the operation status of the above program code can be learned, and then the program code can be processed conveniently. For example, when the operation status of the program code does not meet expectations, the program code can be modified.
  • the method further includes:
  • the feedback information includes the execution result of the program code.
  • the obtaining program code for controlling the operation of the controlled device through a local agent program includes:
  • the program code submitted by the user based on the web (WEB) page is obtained through the local agent program.
  • the local agent program can collect the corresponding program code; it can be seen that the user can submit the program code more conveniently based on the WEB page.
  • the embodiment of the present application also provides another device control method, which is applied to the first controlled device, and the method includes:
  • the method further includes:
  • the terminal can learn about the operation of the program code, and then facilitate processing of the program code. For example, when the operation of the program code does not meet expectations, the program code can be modified.
  • the feedback information includes the execution result of the program code.
  • the method further includes:
  • the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
  • in this way, cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
  • the method further includes:
  • the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode;
  • according to the working mode switching information, the first controlled device is controlled to switch its working mode; the working mode switching information is sent to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information.
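The mode-switching flow above (switch locally, then forward the switching information) can be sketched as follows. This is a minimal illustration: the dict-based device objects, the `switch_working_mode` function, and the message format are assumptions, not the patented protocol.

```python
def switch_working_mode(first_device: dict, second_device: dict, target_mode: str) -> dict:
    """Apply working-mode switching info on the first controlled device,
    then forward it so the second controlled device switches as well."""
    switching_info = {"target_mode": target_mode}
    # The first controlled device switches its own working mode...
    first_device["mode"] = switching_info["target_mode"]
    # ...then sends the switching information to the second controlled
    # device, which switches based on the received information.
    second_device["mode"] = switching_info["target_mode"]
    return switching_info


# Example: switching both devices from remote control to human tracking.
car = {"mode": "remote_control"}
camera_unit = {"mode": "remote_control"}
info = switch_working_mode(car, camera_unit, "human_tracking")
```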
  • the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result.
  • the generating the control instruction of the first controlled device according to the instruction execution result and executing the control instruction of the first controlled device includes:
  • generating a control instruction of the first controlled device according to the image detection result, where the control instruction of the first controlled device is a human body tracking instruction; and controlling the movement state of the first controlled device according to the human body tracking instruction.
  • human body tracking can be implemented on the basis of the coordinated control of the first controlled device and the second controlled device.
  • the generating a control instruction of the first controlled device according to the image detection result includes:
  • in response to the image detection result indicating that the human body becomes smaller, a control instruction for controlling the first controlled device to move forward is generated; in response to the image detection result indicating that the human body becomes larger, a control instruction for controlling the first controlled device to remain still is generated; in response to the image detection result indicating that the human body is on the left side of the first controlled device, a control instruction for controlling the first controlled device to turn left is generated; in response to the image detection result indicating that the human body is on the right side of the first controlled device, a control instruction for controlling the first controlled device to turn right is generated.
  • the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
  • the first controlled device and the second controlled device are connected through a wired connection.
  • An embodiment of the present application also provides a terminal, the terminal includes an acquisition module and a first processing module, wherein:
  • the obtaining module is configured to obtain program code for controlling the operation of the first controlled device through the local agent program
  • the first processing module is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
  • the first processing module is further configured to receive feedback information of the first controlled device after sending the program code to the first controlled device, where the feedback information is generated after the first controlled device runs the program code.
  • the operation status of the above program code can be learned, and then the program code can be processed conveniently. For example, when the operation status of the program code does not meet expectations, the program code can be modified.
  • the first processing module is further configured to, after receiving the feedback information of the first controlled device, load and/or display the feedback information on the terminal where the local agent program is located.
  • the feedback information includes the execution result of the program code.
  • the obtaining module is configured to obtain the program code submitted by the user based on the WEB page through the local agent program.
  • the local agent program can collect the corresponding program code; it can be seen that the user can submit the program code more conveniently based on the WEB page.
  • An embodiment of the present application also provides a first controlled device.
  • the first controlled device includes a receiving module and a second processing module, wherein:
  • the receiving module is configured to receive the program code sent by the terminal through the local agent program for controlling the operation of the first controlled device; the program code is collected by the local agent program;
  • the second processing module is configured to run the program code.
  • the second processing module is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
  • the terminal can learn about the operation of the program code, and then facilitate processing of the program code. For example, when the operation of the program code does not meet expectations, the program code can be modified.
  • the feedback information includes the execution result of the program code.
  • the second processing module is further configured to:
  • the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
  • in this way, cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
  • the second processing module is further configured to obtain working mode switching information; control the first controlled device to switch the working mode according to the working mode switching information; and send the working mode switching information to the second controlled device so that the second controlled device switches the working mode based on the working mode switching information; the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode.
  • the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result.
  • the second processing module is configured to generate a control instruction of the first controlled device according to the image detection result, where the control instruction of the first controlled device is a human body tracking instruction; and to control the movement state of the first controlled device according to the human body tracking instruction.
  • human body tracking can be implemented based on the coordinated control of the first controlled device and the second controlled device.
  • the second processing module is configured to: in response to the image detection result indicating that the human body becomes smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body becomes larger, generate a control instruction for controlling the first controlled device to remain still; in response to the image detection result indicating that the human body is on the left side of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is on the right side of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
  • the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
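The tracking rules above map an image detection result to a motion instruction. A minimal sketch follows; the detection-result fields (`size_change`, `position`) and the instruction names are illustrative assumptions about what the image detection instruction returns, and the rule ordering when several conditions hold is a design choice of this sketch.

```python
def tracking_instruction(detection: dict) -> str:
    """Map an image detection result to a motion control instruction."""
    if detection.get("size_change") == "smaller":
        return "forward"      # body appears smaller: it moved away, advance
    if detection.get("size_change") == "larger":
        return "stop"         # body appears larger: remain still
    if detection.get("position") == "left":
        return "turn_left"    # body on the left side of the device
    if detection.get("position") == "right":
        return "turn_right"   # body on the right side of the device
    return "stop"             # no relevant change detected: hold position


# Example: the detected body shrank, so the device should advance.
cmd = tracking_instruction({"size_change": "smaller", "position": "center"})
```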
  • the first controlled device and the second controlled device are connected through a wired connection.
  • An embodiment of the present application also provides a terminal, the terminal including a processor and a memory for storing a computer program that can run on the processor; wherein,
  • when the processor runs the computer program, any one of the above-mentioned device control methods applied to the terminal is executed.
  • An embodiment of the present application also provides an electronic device, which includes a processor and a memory for storing a computer program that can run on the processor; wherein,
  • when the processor runs the computer program, it executes any one of the aforementioned device control methods applied to the first controlled device.
  • the embodiment of the present application also provides a computer storage medium on which a computer program is stored, and when the computer program is executed by a processor, any one of the above device control methods is implemented.
  • the embodiment of the present application also provides a computer program, including computer readable code; when the computer readable code runs in the terminal, the processor in the terminal executes any of the above-mentioned device control methods applied to the terminal.
  • the embodiments of the present application also provide another computer program, including computer-readable code; when the computer-readable code runs in an electronic device, the processor in the electronic device executes any of the above device control methods applied to the first controlled device.
  • the program code for controlling the operation of the first controlled device is obtained through a local agent program;
  • the program code is sent to the first controlled device through the local agent program, so that the first controlled device runs the program code.
  • the technical solution of the embodiment of the present application can obtain and push the program code based on the local agent program, and then realize control of the first controlled device based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
  • FIG. 1 is a first flowchart of a device control method according to an embodiment of the application;
  • FIG. 2 is a schematic diagram of the connection relationship between the EV3 smart car and the Raspberry Pi in an embodiment of the application;
  • FIG. 3 is a schematic diagram of the connection relationship between the EV3 smart car and the terminal in an embodiment of the application;
  • FIG. 5 is a schematic diagram of the functional flow of the agent program in an embodiment of the application;
  • FIG. 6 is a schematic diagram of the processing flow of the EV3 smart car and the Raspberry Pi in the human tracking mode in an embodiment of the application;
  • FIG. 7 is a schematic diagram of a processing flow for feedback information in an embodiment of the application;
  • FIG. 8 is a third flowchart of a device control method according to an embodiment of the application;
  • FIG. 9 is a fourth flowchart of a device control method according to an embodiment of the application;
  • FIG. 10 is a schematic diagram of the composition structure of a terminal according to an embodiment of the application;
  • FIG. 11 is a schematic structural diagram of another terminal according to an embodiment of the application;
  • FIG. 12 is a schematic diagram of the composition structure of a first controlled device according to an embodiment of the application;
  • FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the application.
  • the terms "include", "comprise", or any other variations thereof are intended to cover non-exclusive inclusion, so that a method or device including a series of elements includes not only the explicitly stated elements, but also other elements that are not explicitly listed, or elements inherent to the implementation of the method or device. Without further restrictions, an element defined by the phrase "including a..." does not exclude the existence of other related elements (such as steps in the method, or units in the device) in the method or device that includes that element.
  • a unit in the device may be, for example, part of a circuit, part of a processor, part of a program or software, etc.
  • the device control method provided by the embodiment of the application includes a series of steps, but the device control method provided by the embodiment of the application is not limited to the recorded steps.
  • similarly, the terminal and controlled device provided by the embodiments of the present application include a series of modules, but the terminals and controlled devices provided in the embodiments of the present application are not limited to the explicitly recorded modules, and may also include modules that need to be set to obtain related information or perform processing based on the information.
  • the embodiments of the present application can be applied to a control system composed of a terminal and a controlled device, and can be operated with many other general or dedicated computing system environments or configurations.
  • the terminal can be a thin client, a thick client, a handheld or laptop device, a microprocessor-based system, a set-top box, a programmable consumer electronic product, a network personal computer, a small computer system, etc.
  • the controlled device can be an electronic device such as a smart car.
  • the device control method provided in the embodiments of the present application may also be implemented by a processor executing computer program code.
  • Electronic devices such as terminals and controlled devices can be described in the general context of computer system executable instructions (such as program modules) executed by the computer system.
  • program modules may include routines, programs, object programs, components, logic, data structures, etc., which perform specific tasks or implement specific abstract data types.
  • the educational robot is a universal teaching aid for modern programming education, which can be controlled via the WEB or a mobile application (Application, APP) client; in related smart device control solutions, control is usually implemented through Bluetooth and wireless video transmission, with information processing performed on the server side; in a first exemplary solution, a mind-controlled video car system based on Wi-Fi (Wireless Fidelity) communication is proposed.
  • the system consists of four parts: an EEG acquisition card, a smart phone, a mobile car and a personal computer.
  • the connection between the smart phone and the EEG acquisition card is realized through Bluetooth, and remote control of the mobile car can be realized; in a second exemplary solution, a remote-controlled weeding robot is proposed.
  • a camera and a Global Positioning System (GPS) chip can be installed on the robot body.
  • the car completes the video capture and then transmits it to the host computer through wireless communication.
  • the instructions are then issued to the robot; in a third exemplary solution, a remote management solution for a sweeping robot is proposed.
  • the communication module can be used to wirelessly transmit the mobile phone information to the network management server.
  • the management platform provides services to users.
  • the solution mainly uses WEB to provide related services to facilitate users' management of equipment.
  • in these solutions, fixed command transmission can only be accomplished through wireless communication, and fixed control commands can only be sent through fixed command buttons; moreover, the communication quality of the wireless network affects the communication between the upper and lower computers.
  • remote control of smart devices based on the WEB is an efficient and convenient solution, which plays an extremely important role in the fields of remote control and education; however, to achieve accurate and real-time control, many problems still need to be resolved. First, because the upper and lower computers are connected wirelessly, and the wireless network is unstable with a large packet loss rate, there are significant problems in scenarios with higher performance requirements, which limits the application scenarios. Second, related technologies only implement the transmission of control instructions from the upper computer to the lower computer, and feedback information cannot be collected or processed effectively; the issuance of control instructions on the WEB side of the upper computer, and the collection and reporting of feedback information from the instructions executed by the lower computer, also require a flexibly designed architecture.
  • FIG. 1 is a flowchart 1 of the device control method of an embodiment of the application. As shown in FIG. 1, the flow may include:
  • Step 101 The terminal obtains the program code for controlling the operation of the first controlled device through the local agent program;
  • the first controlled device may be an electronic device such as a smart car, an embedded development device, or a smart robot; the first controlled device may include a sensor, a camera, and other devices, and the terminal may provide a human-computer interaction interface.
  • the first controlled device may be a separate electronic device or an integrated electronic device.
  • a local agent program can be installed in the terminal.
  • the local agent program is at least used to collect the program code submitted by the user.
  • the program code submitted by the user may be program code used to control the operation of the first controlled device; in this way, by running the local agent program, the program code used to control the operation of the first controlled device can be collected.
  • the program code submitted by the user based on the WEB page can be obtained through the local agent program; that is, the user can submit the program code for controlling the operation of the first controlled device through the WEB page of the terminal, and the local agent program can collect the corresponding program code; it can be seen that the user can submit the program code more conveniently based on the WEB page.
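The collect-and-push behaviour of the local agent program can be sketched as follows. This is a minimal illustration under stated assumptions: the `LocalAgent` class, its method names, and the callable transport standing in for the real connection to the first controlled device are all hypothetical, not the actual agent implementation.

```python
class LocalAgent:
    """Sketch of a local agent: collects program code submitted via the
    WEB page, then pushes it to the first controlled device."""

    def __init__(self, send_to_device):
        # send_to_device: callable delivering code to the controlled device
        # (a stand-in here for the real network connection).
        self._send = send_to_device
        self._pending = []

    def collect(self, program_code: str) -> None:
        """Collect program code submitted by the user via the WEB page."""
        self._pending.append(program_code)

    def push(self) -> int:
        """Send all collected code to the device; return how many were sent."""
        count = 0
        for code in self._pending:
            self._send(code)
            count += 1
        self._pending.clear()
        return count


# Example: the "device" side is simulated by appending to a list.
received = []
agent = LocalAgent(received.append)
agent.collect("print('hello from the controlled device')")
sent = agent.push()
```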
  • the type of the program code is not limited.
  • the type of the program code can be predetermined according to actual application requirements.
  • the program code can be python code or other program codes.
  • Step 102 The terminal sends the program code to the first controlled device through the local agent program.
  • the communication connection between the terminal and the first controlled device can be established in advance.
  • the embodiment of this application does not limit the connection mode of the communication connection between the terminal and the first controlled device.
  • the terminal and the first controlled device can perform data interaction through a wired connection or various wireless connection methods.
  • the above-mentioned local agent program may be used to establish a communication connection between the terminal and the first controlled device.
  • the terminal may broadcast Address Resolution Protocol (ARP) data packets in the local area network through the above-mentioned local agent program;
  • upon receiving an ARP data packet, the first controlled device in the LAN replies with a corresponding data packet, and the above local agent program can parse the reply data packet from the first controlled device to obtain the Internet Protocol (IP) address of the first controlled device, so that a communication connection between the terminal and the first controlled device can be established; in some embodiments of the present application, when there are multiple controlled devices, if the above-mentioned local agent program receives data packets from multiple controlled devices, it can generate a smart device list representing the multiple controlled devices, and the user can select at least one of the multiple controlled devices from the list; based on the local agent program, a communication connection between the terminal and the selected controlled device can be established.
  • the communication connection between the terminal and the first controlled device can also be monitored to learn the status of the communication connection between the terminal and the first controlled device; for example, based on the above-mentioned local agent program, the terminal can continuously broadcast ARP data packets in the local area network;
  • the terminal can determine the status of the communication connection between the terminal and the first controlled device based on the response to the ARP data packet; for example, after the terminal sends an ARP data packet to the first controlled device, if no reply data packet for the ARP data packet is received within a set time period, the communication connection between the terminal and the first controlled device can be considered interrupted; if a reply data packet for the ARP data packet is received within the set time, the terminal and the first controlled device can be considered to maintain a communication connection; the set time can be set according to actual needs, for example, 1 to 3 seconds.
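The probe-and-timeout logic of this connection monitoring can be sketched as follows. The `send_probe` and `check_reply` callables are stand-ins for broadcasting an ARP data packet and listening for the reply packet (real ARP needs raw sockets); the timeout mirrors the 1-3 s window mentioned above. All names are illustrative assumptions.

```python
import time


def connection_alive(send_probe, check_reply, timeout: float = 2.0) -> bool:
    """Send a probe and treat the link as alive only if a reply arrives
    within the set time window."""
    send_probe()
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check_reply():          # reply packet received in time
            return True
        time.sleep(0.01)           # poll until the window closes
    return False                   # no reply: connection interrupted


# Example with stubbed probe/reply functions.
alive = connection_alive(lambda: None, lambda: True)
dead = connection_alive(lambda: None, lambda: False, timeout=0.05)
```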
  • the terminal may send the above program code to the first controlled device according to the pre-established communication connection between the terminal and the first controlled device.
  • Step 103 The first controlled device receives and runs the above program code.
  • the first controlled device can run the uploaded program code to generate corresponding control instructions, and the first controlled device can execute the control instructions; thus, since the control instructions are generated based on the program code, the terminal can realize control of the first controlled device through the above program code.
  • the first controlled device includes a smart car
  • a control instruction can be generated, and the control instruction can indicate the motion state of the smart car.
  • step 101 to step 102 can be implemented based on the processor of the terminal, and step 103 can be implemented based on the processor of the first controlled device.
• The aforementioned processor may be at least one of an Application-Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field-Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, or a microprocessor.
• The program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
  • the first controlled device runs the program code to generate feedback information, and then the feedback information can be sent to the terminal.
  • the feedback information may be used to indicate feedback on the running status of the program code.
  • the feedback information may include the execution result of the above-mentioned program code.
  • the terminal can learn about the operation of the program code, and then facilitate processing of the program code. For example, when the operation of the program code does not meet expectations, the program code can be modified.
• The terminal may load the feedback information and/or display it on the display interface of the terminal.
  • the implementation manner of loading feedback information by the terminal is not limited.
  • the terminal loads the feedback information in a synchronous loading or asynchronous loading manner.
  • the terminal may load and/or display the feedback information on the WEB front end, so that the user can intuitively learn the operation status of the above program code through the WEB front end.
  • the first controlled device may generate the control instruction of the second controlled device by running the program code, wherein the first controlled device and the second controlled device form a communication connection;
• One second controlled device or multiple second controlled devices may be provided; the second controlled device may be an electronic device such as a Raspberry Pi, and may include devices such as a sensor and a camera.
  • the second controlled device and the first controlled device can be of the same or different types.
• After the first controlled device generates the control instruction of the second controlled device, it can send that control instruction to the second controlled device; the second controlled device can execute the control instruction of the second controlled device to obtain an instruction execution result.
  • the second controlled device may send the instruction execution result to the first controlled device; the first controlled device may generate the control instruction of the first controlled device according to the instruction execution result, and execute the control instruction of the first controlled device.
  • the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
• Cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
• The first controlled device and the second controlled device are connected through a wired connection; for example, the first controlled device and the second controlled device are connected through a serial port. In this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
  • the first controlled device may also obtain working mode switching information; the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode;
• The first controlled device can control itself to switch working modes according to the working mode switching information; the first controlled device can also send the working mode switching information to the second controlled device, and after receiving it, the second controlled device can switch its working mode based on the working mode switching information.
• In the embodiment of the present application, multiple working modes can be set for each controlled device, and a controlled device behaves differently in different working modes. It can be seen that the embodiment of the present application can realize synchronous switching of the working modes of the first controlled device and the second controlled device.
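The synchronous mode switch can be sketched as follows: the first device switches its own mode on receiving the switching information and forwards the same information downstream. The class names and mode string are illustrative assumptions.

```python
# Sketch of synchronous working-mode switching across two controlled devices.

class ModedDevice:
    def __init__(self, mode="idle"):
        self.mode = mode

    def switch(self, target_mode):
        self.mode = target_mode

class FirstDeviceWithMode(ModedDevice):
    def __init__(self, second, mode="idle"):
        super().__init__(mode)
        self.second = second

    def on_switch_info(self, target_mode):
        # Switch locally, then forward the switching information so both
        # devices end up in the same target mode.
        self.switch(target_mode)
        self.second.switch(target_mode)
```

This mirrors the flag-bit propagation described later for the EV3/Raspberry Pi pair, where a touch-sensor trigger changes one device's mode flag and the flag is forwarded over the serial port.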
  • control instruction of the second controlled device is an image detection instruction
  • execution result of the instruction is the image detection result
• The first controlled device may generate a human body tracking instruction according to the received image detection result; the motion state of the first controlled device can then be controlled according to the human body tracking instruction, so as to realize human body tracking.
  • human body tracking can be implemented on the basis of the coordinated control of the first controlled device and the second controlled device.
• The first controlled device generates a control instruction for controlling the first controlled device to advance in response to the image detection result indicating that the human body becomes smaller; generates a control instruction for controlling the first controlled device to remain stationary in response to the image detection result indicating that the human body becomes larger; generates a control instruction for controlling the first controlled device to turn left in response to the image detection result indicating that the human body is located on the left side of the first controlled device; and generates a control instruction for controlling the first controlled device to turn right in response to the image detection result indicating that the human body is located on the right side of the first controlled device.
  • the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
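The four tracking rules above reduce to a small decision function. In this sketch the detection-result keys (`size_change`, `side`) are assumed for illustration; a real detector would report coordinates from which these are derived.

```python
# Map an image detection result to a motion instruction, following the rules:
# human smaller -> advance; human larger -> stay still; human on the left ->
# turn left; human on the right -> turn right.

def tracking_instruction(detection):
    if detection.get("size_change") == "smaller":   # human is moving away
        return "forward"
    if detection.get("size_change") == "larger":    # human is too close
        return "stop"
    if detection.get("side") == "left":
        return "turn_left"
    if detection.get("side") == "right":
        return "turn_right"
    return "stop"                                   # default: hold position
```

The size checks come first so that distance keeping takes priority over steering, one plausible ordering for the rules as stated.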
• The first controlled device can include devices such as an EV3 smart car, motors, and related sensors; the EV3 smart car can drive related sensors to move through motors. The second controlled device can include devices such as a Raspberry Pi, motors, and cameras.
  • Figure 2 is a schematic diagram of the connection relationship between the EV3 smart car and the Raspberry Pi in an embodiment of the application. As shown in Figure 2, the Raspberry Pi 201 can be connected to the camera 202 through a Universal Serial Bus (USB) connection.
• The Raspberry Pi 201 can also drive the camera to rotate through a motor; the Raspberry Pi 201 can be connected to the EV3 smart car 203 through a serial port adapter; the EV3 smart car 203 can also be connected to a related sensor 204, a first driving wheel 205, and a second driving wheel 206. In one example, the related sensor 204 can be a touch sensor, and the trigger signal generated when the touch sensor is touched can be used to switch the function modes of the Raspberry Pi 201 and the EV3 smart car 203 simultaneously; the EV3 smart car 203 can also drive the first driving wheel 205 and the second driving wheel 206 to work. In some embodiments of the present application, the touch sensor can send a trigger signal to the directly connected EV3 smart car 203, which then changes the function mode flag bit of the EV3 smart car 203 and sends the flag bit to the Raspberry Pi 201 through the serial port; this in turn triggers the change of the function mode flag bit of the Raspberry Pi 201, thereby realizing synchronous switching of the function modes of the Raspberry Pi 201 and the EV3 smart car 203.
  • connection lines of the Raspberry Pi 201 and the EV3 smart car 203 can be as shown in Table 1. From Table 1, the corresponding relationship between the interface of the Raspberry Pi 201 and the color of the connection line can be seen.
  • FIG. 3 is a schematic diagram of the connection relationship between the EV3 smart car and the terminal in an embodiment of this application.
  • the EV3 smart car 203 can be connected to the terminal 301 through a wireless connection.
  • a wireless network card can be connected to the USB interface of the EV3 smart car 203;
  • the EV3 smart car 203 and the terminal 301 can be in the same network segment. In this way, the stability of the wireless network connection and the reliability of the communication between the EV3 smart car 203 and the terminal 301 can be ensured.
  • Fig. 4 is a second flowchart of the device control method according to an embodiment of the application. As shown in Fig. 4, the process may include:
  • Step 401 Connect the EV3 smart car 203 and the Raspberry Pi 201, and connect the EV3 smart car 203 and the terminal 301.
• Step 402 On the terminal side, collection of the program code from the WEB front end is implemented based on the local agent program, and the collected program code is pushed to the EV3 smart car 203.
  • the python code submitted by the user can be collected based on the local agent program.
• The EV3 smart car 203 can be made to work based on the python code; that is to say, remote control of the EV3 smart car 203 can be implemented based on the python code. In this way, the EV3 smart car 203 and other controlled equipment can be controlled more flexibly and diversely; in an education scenario, students can continuously trial-and-error and verify novel control schemes, which can cultivate students' programming thinking.
  • FIG. 5 is a schematic diagram of the functional flow of the agent program in an embodiment of the application. Referring to FIG. 5, the flow may include:
  • Step A1 Obtain a list of smart cars in the same network segment of the terminal 301 by broadcasting.
• When running the local agent program, the terminal 301 can broadcast ARP data packets in the local area network, and the EV3 smart car 203 in the local area network replies with a corresponding data packet after receiving an ARP data packet.
• The above-mentioned local agent program can parse the data packet returned by the EV3 smart car 203 to obtain the IP address of the EV3 smart car 203; when the IP addresses of multiple EV3 smart cars 203 are obtained, it can generate a list of the EV3 smart cars under the same network segment as the terminal 301.
  • Step A2 Select EV3 smart car 203 and establish a connection.
• A specific smart car can be selected from the above-mentioned EV3 smart car list, and a User Datagram Protocol (UDP) connection with the corresponding smart car can then be created.
  • Step A3 Push the program code to the EV3 smart car 203, and receive the feedback information of the EV3 smart car 203.
  • the user can submit the program code.
• The local agent program can collect the program code submitted by the user and push it to the selected EV3 smart car 203; the EV3 smart car 203 can run the program code, generate feedback information, and send the feedback information to the terminal 301.
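Step A3 above can be sketched as a UDP push-and-reply exchange. The version below runs both endpoints over loopback in one process, so it is only a shape of the protocol: the port choice, payload framing, and `"ran: "` feedback prefix are all illustrative assumptions.

```python
import socket

# Loopback sketch of step A3: the agent pushes program code to the "car"
# over UDP and receives feedback on the same socket pair.

def loopback_push(code: bytes) -> bytes:
    car = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    car.bind(("127.0.0.1", 0))                 # stand-in for the EV3's fixed port
    car.settimeout(2.0)
    agent = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    agent.settimeout(2.0)
    try:
        agent.sendto(code, car.getsockname())  # agent pushes the program code
        payload, addr = car.recvfrom(4096)     # "car" receives it
        car.sendto(b"ran: " + payload, addr)   # and replies with feedback
        feedback, _ = agent.recvfrom(4096)
        return feedback
    finally:
        car.close()
        agent.close()
```

On real hardware the two sockets live on different machines, and the car side would additionally execute the received code before replying, as step 403 describes.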
  • Step A4 Load feedback information on the front end of the WEB through asynchronous loading.
  • Step A5 After the program code is executed, the connection between the terminal 301 and the EV3 smart car 203 can be disconnected, and resources can be recovered.
• After the feedback information is received or loaded, the program code can be considered to have finished executing.
  • the connection between the terminal 301 and the EV3 smart car 203 can be released.
• Corresponding hardware resources and/or software resources, for example WEB front-end resources and network transmission resources between the terminal 301 and the EV3 smart car 203, can also be released.
  • Step 403 The EV3 smart car 203 receives and runs the program code, and pushes the image detection instruction to the Raspberry Pi 201.
• The EV3 smart car 203 starts the corresponding UDP service, which can continuously monitor a fixed port. After the local agent program pushes the program code (which can be encapsulated in a data packet) to the EV3 smart car 203, the EV3 smart car 203 can run the received program code to obtain a control instruction; if the control instruction involves an image processing task, it can generate an image detection instruction, start the corresponding service mode, and push the image detection instruction to the Raspberry Pi 201 to instruct the Raspberry Pi 201 to perform the image detection task, so that the image detection result can be obtained from the Raspberry Pi 201.
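The branch described above (forward image-processing work to the Raspberry Pi, execute everything else locally) can be sketched as a small dispatcher; the instruction dictionary format is an assumption for illustration, not the patent's wire format.

```python
# Sketch of the car-side dispatch: a control instruction that involves an
# image processing task is turned into an image detection instruction for the
# Raspberry Pi; any other instruction is executed locally.

def dispatch(control_instruction):
    if control_instruction.get("task") == "image":
        return {"target": "raspberry_pi", "instruction": "image_detection"}
    return {"target": "local", "instruction": control_instruction["action"]}
```

In the full system the `raspberry_pi` branch would also select the corresponding service mode (Table 2) before the instruction is pushed over the serial link.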
  • the EV3 smart car 203 and the Raspberry Pi 201 may have multiple working modes.
  • Table 2 illustrates several working modes of the EV3 smart car 203 and the Raspberry Pi 201 as an example.
  • Step 404 The Raspberry Pi 201 executes the image detection instruction, and transmits the image detection result (also referred to as the image processing result) to the EV3 smart car 203, and the EV3 smart car 203 executes the corresponding control command according to the image detection result.
  • the image detection instruction can be encapsulated in a data packet and sent.
• The Raspberry Pi 201 receives the data packet from the EV3 smart car 203, obtains its own working mode by parsing the packet, executes the corresponding image detection task, and sends the image detection result to the EV3 smart car 203. In the same working mode, the EV3 smart car 203 and the Raspberry Pi 201 cooperate to perform a specific task.
  • FIG. 6 is a schematic diagram of the processing flow of the EV3 smart car 203 and the Raspberry Pi 201 in the human tracking mode in the embodiment of the application.
• The human tracking mode can be explained in terms of an image processing stage, a detection data processing stage, and an instruction execution stage.
• In the image processing stage, the camera 202 collects pictures, and the collected images are then detected based on deep learning technology to obtain human body detection coordinates; the data processing stage mainly realizes human body ID recognition and position coordinate processing; the result of the data processing is sent to the EV3 smart car 203.
• The EV3 smart car 203 can control motor operation according to the logic shown in Figure 6, which realizes a basic human body tracking function. Referring to Figure 6: when it is determined that the human body becomes smaller, the motor drives the EV3 smart car 203 forward; when the human body is determined to be larger, the EV3 smart car 203 is controlled to stand still; when the human body is determined to be on the left, the motor drives the EV3 smart car 203 to turn left; when it is determined that the human body is on the right, the motor drives the EV3 smart car 203 to turn right.
  • Step 405 The EV3 smart car 203 collects feedback information and sends the feedback information to the terminal 301.
  • Figure 7 is a schematic diagram of the processing flow for feedback information in an embodiment of the application.
• When the python code is received, the EV3 smart car 203 can use its own Python interpreter to convert the source code into bytecode and then into machine code, so as to facilitate instruction execution.
  • the execution result of the program code can be redirected to a text file.
• The communication connection between the EV3 smart car 203 and the terminal 301 can be established based on a UDP communication module; the UDP communication module can detect in real time whether the above text file is updated, and after the text file is updated, its content (that is, the feedback information) is read. The read feedback information is then actively pushed to the local agent program of the terminal 301 through the established communication connection; the local agent program is implemented based on JAVA.
• The feedback information can be loaded to the WEB front end by asynchronous loading, and the WEB front end displays the feedback information in the designated area after a series of renderings.
  • a complete remote control scheme based on the EV3 smart car 203 is proposed.
• The program code can be input through the WEB terminal and transmitted in real time to the lower computer (that is, the EV3 smart car), and the lower computer can respond with feedback information. Further, through the local agent program, collection of the WEB front-end python code can be completed and the code pushed to the lower computer over the UDP connection for execution; through the local agent program, efficient and reliable information interaction between the terminal 301 and the EV3 smart car 203 can be realized. Further, a connection scheme between the terminal, the EV3 smart car 203, the Raspberry Pi 201, and related peripherals is designed to ensure that the connection is efficient and feasible, and it is compatible with most embedded development equipment.
• A complete implementation scheme is given for the connection between the EV3 smart car and the Raspberry Pi and for the control of the EV3 smart car and the Raspberry Pi. Moreover, working mode switching information can be sent to both the EV3 smart car and the Raspberry Pi, realizing a synchronous switching mechanism for different working modes, which can ensure the consistency of information interaction under different working modes.
  • the following describes a device control method of the embodiment of the present application from the perspective of the terminal.
  • FIG. 8 is the third flow chart of the device control method according to the embodiment of the application. As shown in FIG. 8, the flow may include:
  • Step 801 Obtain program code for controlling the operation of the first controlled device through the local agent program
  • the first controlled device may be an electronic device such as a smart car or an embedded development device; the first controlled device may include a sensor, a camera, and other devices, and the terminal may provide a human-computer interaction interface.
  • the first controlled device may be a single electronic device, or may integrate multiple electronic devices.
  • a local agent program can be installed in the terminal.
  • the local agent program is used at least to collect the program code submitted by the user.
  • the program code submitted by the user may be the program code used to control the operation of the first controlled device.
  • the program code used to control the work of the first controlled device can be collected.
  • the type of the program code is not limited.
  • the type of the program code can be predetermined according to actual application requirements.
  • the program code can be python code or other program codes.
  • Step 802 Send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
  • the communication connection between the terminal and the first controlled device can be established in advance.
  • the embodiment of this application does not limit the connection mode of the communication connection between the terminal and the first controlled device.
  • the device can perform data interaction through wired connection or various wireless connection methods.
• The first controlled device can run the uploaded program code to generate corresponding control instructions and execute them; thus, since the control instructions are generated based on the program code, the terminal can control the first controlled device through the above program code.
  • steps 801 to 802 may be implemented based on the processor of the terminal, and the foregoing processor may be at least one of ASIC, DSP, DSPD, PLD, FPGA, CPU, controller, microcontroller, and microprocessor. It is understandable that, for different terminals or controlled devices, the electronic devices used to implement the above-mentioned processor functions may also be other, which are not specifically limited in the embodiment of the present application.
• The program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
  • the method further includes:
  • the feedback information of the first controlled device is received, and the feedback information is generated by the first controlled device after running the program code.
  • the feedback information includes the execution result of the program code.
• The terminal can thus be informed of the operation of the above program code, which in turn makes it convenient to process the program code; for example, when the operation of the program code does not meet expectations, the program code can be modified.
  • the method further includes:
  • the obtaining the program code for controlling the operation of the controlled device through the local agent program includes:
  • the program code submitted by the user based on the WEB page is obtained through the local agent program.
  • the user can submit the program code more conveniently based on the WEB page, which in turn facilitates the local agent program to collect the program code submitted by the user.
  • Fig. 9 is a fourth flowchart of a device control method according to an embodiment of this application. As shown in Fig. 9, the process may include:
  • Step 901 A program code for controlling the operation of the first controlled device sent by a terminal through a local agent program is received; the program code is collected by the local agent program.
  • the first controlled device may be an electronic device such as a smart car or an embedded development device; the first controlled device may include a sensor, a camera, and other devices, and the terminal may provide a human-computer interaction interface.
  • the first controlled device may be a single electronic device, or may integrate multiple electronic devices.
  • a local agent program can be installed in the terminal.
  • the local agent program is used at least to collect the program code submitted by the user.
  • the program code submitted by the user may be the program code used to control the operation of the first controlled device.
  • the program code used to control the work of the first controlled device can be collected.
  • the type of the program code is not limited.
  • the type of the program code can be predetermined according to actual application requirements.
  • the program code can be python code or other program codes.
  • the communication connection between the terminal and the first controlled device can be established in advance.
  • the embodiment of this application does not limit the connection mode of the communication connection between the terminal and the first controlled device.
  • the device can perform data interaction through wired connection or various wireless connection methods.
  • Step 902 Run the program code.
• The first controlled device can run the uploaded program code to generate corresponding control instructions and execute them; thus, since the control instructions are generated based on the program code, the terminal can control the first controlled device through the above program code.
• The program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be flexibly edited, the first controlled device can be controlled more flexibly.
  • the method further includes:
  • the feedback information includes the execution result of the program code.
• The terminal can thus be informed of the operation of the above program code, which in turn makes it convenient to process the program code; for example, when the operation of the program code does not meet expectations, the program code can be modified.
  • the method further includes:
  • the embodiments of the present application provide a cooperative control solution for the first controlled device and the second controlled device.
• Cooperative control of the first controlled device and the second controlled device can be implemented based on the above program code.
  • the method further includes:
  • the working mode switching information is used to instruct the first controlled device and the second controlled device to switch the current working mode to the target working mode;
  • control the first controlled device to switch operating modes; send the operating mode switching information to a second controlled device, so that the second controlled device is based on the operating mode switching information Switch working mode.
• In the embodiment of the present application, multiple working modes can be set for each controlled device, and a controlled device behaves differently in different working modes. It can be seen that the embodiment of the present application can realize synchronous switching of the working modes of the first controlled device and the second controlled device.
  • control instruction of the second controlled device is an image detection instruction, and the execution result of the instruction is an image detection result
  • the generating the control instruction of the first controlled device according to the instruction execution result and executing the control instruction of the first controlled device includes:
  • the movement state of the first controlled device is controlled.
  • human body tracking can be implemented on the basis of the coordinated control of the first controlled device and the second controlled device.
  • the generating a control instruction of the first controlled device according to the image detection result includes:
  • a control instruction for controlling the first controlled device to turn right is generated.
  • the human body tracking can be accurately realized by controlling the motion state of the first controlled device.
• The first controlled device and the second controlled device are connected through a wired connection; in this way, the reliability of the communication between the first controlled device and the second controlled device can be ensured.
• The writing order of the steps does not imply a strict execution order and does not constitute any limitation on the implementation process; the specific execution order of each step should be determined by its function and possible inner logic.
  • an embodiment of the present application proposes a terminal.
  • FIG. 10 is a schematic diagram of the composition structure of a terminal according to an embodiment of the application. As shown in FIG. 10, the device includes an acquisition module 1001 and a first processing module 1002, where,
  • the obtaining module 1001 is configured to obtain program code for controlling the operation of the first controlled device through a local agent program
  • the first processing module 1002 is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
  • the first processing module 1002 is further configured to receive feedback information of the first controlled device after sending the program code to the first controlled device, and the feedback information It is generated after the first controlled device runs the program code.
• The first processing module 1002 is further configured to load and/or display the feedback information on the terminal where the local agent program is located after receiving the feedback information of the first controlled device.
  • the feedback information includes an execution result of the program code.
  • the obtaining module 1001 is configured to obtain the program code submitted by the user based on the WEB page through the local agent program.
  • Both the acquisition module 1001 and the first processing module 1002 can be implemented by a processor located in a terminal.
• The processor is at least one of an ASIC, DSP, DSPD, PLD, FPGA, CPU, controller, microcontroller, and microprocessor.
  • the functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be realized in the form of hardware or software function module.
  • the integrated unit is implemented in the form of a software function module and is not sold or used as an independent product, it can be stored in a computer readable storage medium.
• The technical solutions of the embodiments of the present application, in essence, or the part that contributes to the existing technology, or all or part of the technical solutions, can be embodied in the form of a software product.
• The computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the method described in the embodiments of the present application.
  • the aforementioned storage media include: U disk, mobile hard disk, read only memory (Read Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk and other media that can store program codes.
  • the computer program instructions corresponding to a device control method in the embodiments of the present application may be stored on storage media such as optical disks, hard disks, and USB flash drives.
  • FIG. 11 shows another terminal provided by an embodiment of the present application, which may include: a first memory 1101 and a first processor 1102; wherein,
  • the first memory 1101 is configured to store computer programs and data
  • the first processor 1102 is configured to execute a computer program stored in the first memory, so as to implement any device control method applied to the terminal in the foregoing embodiments.
• The above-mentioned first memory 1101 may be a volatile memory, such as RAM; or a non-volatile memory, such as ROM, flash memory, a Hard Disk Drive (HDD), or a Solid-State Drive (SSD); or a combination of the foregoing types of memories, and provides instructions and data to the first processor 1102.
  • volatile memory such as RAM
  • non-volatile memory non-volatile memory
  • ROM read-only memory
  • flash memory such as hard disk ( Hard Disk Drive (HDD) or Solid-State Drive (SSD); or a combination of the foregoing types of memories, and provides instructions and data to the first processor 1102.
  • HDD Hard Disk Drive
  • SSD Solid-State Drive
  • the first processor 1102 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understandable that, for different devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of the present application.
  • an embodiment of the present application proposes a controlled device.
  • FIG. 12 is a schematic diagram of the composition structure of a first controlled device according to an embodiment of the application. As shown in FIG. 12, the device includes a receiving module 1201 and a second processing module 1202, wherein,
  • the receiving module 1201 is configured to receive the program code for controlling the operation of the first controlled device sent by the terminal through a local agent program; the program code is collected by the local agent program;
  • the second processing module 1202 is configured to run the program code.
  • the second processing module 1202 is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
  • the feedback information includes an execution result of the program code.
  • the second processing module 1202 is further configured to:
  • the second processing module 1202 is further configured to obtain work-mode switching information; control the first controlled device to switch its work mode according to the work-mode switching information; and send the work-mode switching information to the second controlled device, so that the second controlled device switches its work mode based on the work-mode switching information; the work-mode switching information instructs the first controlled device and the second controlled device to switch from the current work mode to a target work mode.
  • the control instruction of the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
  • the second processing module 1202 is configured to generate a control instruction of the first controlled device according to the image detection result, the control instruction of the first controlled device being a human-tracking instruction, and to control the motion state of the first controlled device according to the human-tracking instruction.
  • the second processing module 1202 is configured to: in response to the image detection result indicating that the human body has become smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body has become larger, generate a control instruction for controlling the first controlled device to remain still; in response to the image detection result indicating that the human body is on the left side of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is on the right side of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
  • the first controlled device and the second controlled device are connected through a wired connection.
  • Both the receiving module 1201 and the second processing module 1202 can be implemented by a processor located in the first controlled device.
  • the processor is at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor.
  • the functional modules in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be realized in the form of hardware or software function module.
  • the computer program instructions corresponding to another device control method in the embodiments of the present application may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive; when those computer program instructions are read or executed by an electronic device, any device control method applied to the first controlled device in the foregoing embodiments is implemented.
  • FIG. 13 shows an electronic device provided by an embodiment of the present application, which may include: a second memory 1301 and a second processor 1302; wherein,
  • the second memory 1301 is configured to store computer programs and data
  • the second processor 1302 is configured to execute a computer program stored in the second memory to implement any device control method applied to the first controlled device in the foregoing embodiments.
  • the above-mentioned second memory 1301 may be a volatile memory, such as a RAM; or a non-volatile memory, such as a ROM, a flash memory, an HDD, or an SSD; or a combination of the above types of memory, and it provides instructions and data to the second processor 1302.
  • the aforementioned second processor 1302 may be at least one of ASIC, DSP, DSPD, PLD, FPGA, CPU, controller, microcontroller, and microprocessor. It is understandable that, for different electronic devices, the electronic devices used to implement the above-mentioned processor functions may also be other, which is not specifically limited in the embodiment of the present application.
  • the functions or modules contained in the apparatus provided in the embodiments of the application can be used to execute the methods described in the above method embodiments; for their specific implementation, refer to the descriptions of the above method embodiments, which are not repeated here for brevity.
  • the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform; they can of course also be implemented by hardware, but in many cases the former is the better implementation.
  • based on this understanding, the technical solution of this application, in essence, or the part contributing to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the various embodiments of the present application.
  • the embodiments of the application disclose a device control method, a terminal, a controlled device, an electronic device, a medium, and a program.
  • the method includes: obtaining, through a local agent program, program code for controlling the operation of a first controlled device; and
  • sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
  • with the technical solutions of the embodiments of the present application, program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be edited flexibly, the first controlled device can be controlled more flexibly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Selective Calling Equipment (AREA)
  • Stored Programmes (AREA)

Abstract

A device control method, a terminal, a controlled device, an electronic device, a medium, and a program are provided. The method includes: obtaining, through a local agent program, program code for controlling the operation of a first controlled device (101); and sending the program code to the first controlled device through the local agent program (102), so that the first controlled device runs the program code (103).

Description

Device control method, terminal, controlled device, electronic device, medium, and program
Cross-reference to related applications
This application is filed on the basis of, and claims priority to, Chinese patent application No. 201910578882.2 filed on June 28, 2019, the entire contents of which are incorporated herein by reference.
Technical field
This application relates to intelligent-device control technology, and in particular to a device control method, a terminal, a controlled device, an electronic device, a computer storage medium, and a computer program.
Background
At present, remote control methods for intelligent devices can be applied in many fields, for example remote control or programming education. In the related art, when an intelligent device is controlled, a corresponding control instruction can be sent through a fixed command button.
Summary
The embodiments of this application propose a device control method, a terminal, a controlled device, an electronic device, a computer storage medium, and a computer program.
An embodiment of this application provides a device control method, the method including:
obtaining, through a local agent program, program code for controlling the operation of a first controlled device; and
sending the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
In some embodiments of this application, the method further includes:
after the program code is sent to the first controlled device, receiving feedback information from the first controlled device, the feedback information being generated after the first controlled device runs the program code.
It can be seen that, by receiving the feedback information, how the program code ran can be learned, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, the method further includes:
after the feedback information from the first controlled device is received, loading and/or displaying the feedback information on the terminal where the local agent program resides.
It can be seen that loading and/or displaying the feedback information makes it easy for the user to learn intuitively how the program code ran.
In some embodiments of this application, the feedback information includes an execution result of the program code.
It can be seen that loading and/or displaying the feedback information makes it easy for the user to learn the execution result of the program code intuitively.
In some embodiments of this application, obtaining, through the local agent program, the program code for controlling the operation of the controlled device includes:
obtaining, through the local agent program, the program code submitted by a user on a web (WEB) page.
In this way, while the local agent program is running, it can collect the corresponding program code; it can be seen that the user can submit program code conveniently through a web page.
An embodiment of this application further provides another device control method, applied to a first controlled device, the method including:
receiving program code, sent by a terminal through a local agent program, for controlling the operation of the first controlled device, the program code having been collected by the local agent program; and
running the program code.
In some embodiments of this application, the method further includes:
running the program code to generate feedback information, and sending the feedback information to the terminal.
It can be seen that sending the feedback information to the terminal allows the terminal to learn how the program code ran, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, the feedback information includes an execution result of the program code.
It can be seen that sending the feedback information to the terminal allows the terminal to learn the execution result of the program code.
In some embodiments of this application, the method further includes:
generating, by running the program code, a control instruction for a second controlled device, where the first controlled device and the second controlled device form a communication connection;
sending the control instruction for the second controlled device to the second controlled device, so that the second controlled device executes the control instruction for the second controlled device; and
receiving an instruction execution result sent by the second controlled device, generating a control instruction for the first controlled device according to the instruction execution result, and executing the control instruction for the first controlled device, where the instruction execution result is obtained by the second controlled device after executing the control instruction for the second controlled device.
It can be seen that the embodiments of this application provide a cooperative control scheme for the first controlled device and the second controlled device; in some embodiments of this application, cooperative control of the first controlled device and the second controlled device can be realized based on the program code.
In some embodiments of this application, the method further includes:
obtaining work-mode switching information, the work-mode switching information instructing the first controlled device and the second controlled device to switch from a current work mode to a target work mode; and
controlling, according to the work-mode switching information, the first controlled device to switch its work mode, and sending the work-mode switching information to the second controlled device, so that the second controlled device switches its work mode based on the work-mode switching information.
In this way, multiple work modes can be configured for each controlled device, and a controlled device operates differently in different work modes. It can be seen that the embodiments of this application can realize synchronized switching of the work modes of the first controlled device and the second controlled device.
In some embodiments of this application, the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
generating the control instruction for the first controlled device according to the instruction execution result and executing the control instruction for the first controlled device includes:
generating the control instruction for the first controlled device according to the image detection result, the control instruction for the first controlled device being a human-tracking instruction; and
controlling the motion state of the first controlled device according to the human-tracking instruction.
It can be seen that, in the embodiments of this application, human tracking can be realized on the basis of the cooperative control of the first controlled device and the second controlled device.
In some embodiments of this application, generating the control instruction for the first controlled device according to the image detection result includes:
in response to the image detection result indicating that the human body has become smaller, generating a control instruction for controlling the first controlled device to move forward;
in response to the image detection result indicating that the human body has become larger, generating a control instruction for controlling the first controlled device to remain still;
in response to the image detection result indicating that the human body is on the left side of the first controlled device, generating a control instruction for controlling the first controlled device to turn left; and
in response to the image detection result indicating that the human body is on the right side of the first controlled device, generating a control instruction for controlling the first controlled device to turn right.
It can be seen that, in the embodiments of this application, human tracking can be realized accurately by controlling the motion state of the first controlled device.
In some embodiments of this application, the first controlled device and the second controlled device are connected through a wired connection.
In this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
An embodiment of this application further provides a terminal, the terminal including an obtaining module and a first processing module, where:
the obtaining module is configured to obtain, through a local agent program, program code for controlling the operation of a first controlled device; and
the first processing module is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
In some embodiments of this application, the first processing module is further configured to receive, after the program code is sent to the first controlled device, feedback information from the first controlled device, the feedback information being generated after the first controlled device runs the program code.
It can be seen that, by receiving the feedback information, how the program code ran can be learned, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, the first processing module is further configured to load and/or display, after the feedback information from the first controlled device is received, the feedback information on the terminal where the local agent program resides.
It can be seen that loading and/or displaying the feedback information makes it easy for the user to learn intuitively how the program code ran.
In some embodiments of this application, the feedback information includes an execution result of the program code.
It can be seen that loading and/or displaying the feedback information makes it easy for the user to learn the execution result of the program code intuitively.
In some embodiments of this application, the obtaining module is configured to obtain, through the local agent program, the program code submitted by a user on a web page.
In this way, while the local agent program is running, it can collect the corresponding program code; it can be seen that the user can submit program code conveniently through a web page.
An embodiment of this application further provides a first controlled device, the first controlled device including a receiving module and a second processing module, where:
the receiving module is configured to receive program code, sent by a terminal through a local agent program, for controlling the operation of the first controlled device, the program code having been collected by the local agent program; and
the second processing module is configured to run the program code.
In some embodiments of this application, the second processing module is further configured to run the program code to generate feedback information, and to send the feedback information to the terminal.
It can be seen that sending the feedback information to the terminal allows the terminal to learn how the program code ran, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, the feedback information includes an execution result of the program code.
It can be seen that sending the feedback information to the terminal allows the terminal to learn the execution result of the program code.
In some embodiments of this application, the second processing module is further configured to:
generate, by running the program code, a control instruction for a second controlled device, where the first controlled device and the second controlled device form a communication connection;
send the control instruction for the second controlled device to the second controlled device, so that the second controlled device executes the control instruction for the second controlled device; and
receive an instruction execution result sent by the second controlled device, generate a control instruction for the first controlled device according to the instruction execution result, and execute the control instruction for the first controlled device, where the instruction execution result is obtained by the second controlled device after executing the control instruction for the second controlled device.
It can be seen that the embodiments of this application provide a cooperative control scheme for the first controlled device and the second controlled device; in some embodiments of this application, cooperative control of the first controlled device and the second controlled device can be realized based on the program code.
In some embodiments of this application, the second processing module is further configured to obtain work-mode switching information; control, according to the work-mode switching information, the first controlled device to switch its work mode; and send the work-mode switching information to the second controlled device, so that the second controlled device switches its work mode based on the work-mode switching information; the work-mode switching information instructs the first controlled device and the second controlled device to switch from a current work mode to a target work mode.
In this way, multiple work modes can be configured for each controlled device, and a controlled device operates differently in different work modes. It can be seen that the embodiments of this application can realize synchronized switching of the work modes of the first controlled device and the second controlled device.
In some embodiments of this application, the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
the second processing module is configured to generate, according to the image detection result, a control instruction for the first controlled device, the control instruction for the first controlled device being a human-tracking instruction, and to control the motion state of the first controlled device according to the human-tracking instruction.
It can be seen that, in the embodiments of this application, human tracking can be realized on the basis of the cooperative control of the first controlled device and the second controlled device.
In some embodiments of this application, the second processing module is configured to: in response to the image detection result indicating that the human body has become smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body has become larger, generate a control instruction for controlling the first controlled device to remain still; in response to the image detection result indicating that the human body is on the left side of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is on the right side of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
It can be seen that, in the embodiments of this application, human tracking can be realized accurately by controlling the motion state of the first controlled device.
In some embodiments of this application, the first controlled device and the second controlled device are connected through a wired connection.
In this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
An embodiment of this application further provides a terminal, the terminal including a processor and a memory for storing a computer program runnable on the processor, where the processor, when running the computer program, executes any one of the above device control methods applied to a terminal.
An embodiment of this application further provides an electronic device, the device including a processor and a memory for storing a computer program runnable on the processor, where the processor, when running the computer program, executes any one of the above device control methods applied to a first controlled device.
An embodiment of this application further provides a computer storage medium on which a computer program is stored; when the computer program is executed by a processor, any one of the above device control methods is implemented.
An embodiment of this application further provides a computer program including computer-readable code; when the computer-readable code runs in a terminal, a processor in the terminal executes any one of the above device control methods applied to a terminal.
An embodiment of this application further provides another computer program including computer-readable code; when the computer-readable code runs in an electronic device, a processor in the electronic device executes any one of the above device control methods applied to a first controlled device.
In the device control method, terminal, controlled device, electronic device, computer storage medium, and computer program proposed by the embodiments of this application, program code for controlling the operation of a first controlled device is obtained through a local agent program, and the program code is sent to the first controlled device through the local agent program, so that the first controlled device runs the program code. With the technical solutions of the embodiments of this application, program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be edited flexibly, the first controlled device can be controlled more flexibly.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit this application.
Brief description of the drawings
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with this application and, together with the specification, serve to explain the technical solutions of the embodiments of this application.
FIG. 1 is a first flowchart of a device control method according to an embodiment of this application;
FIG. 2 is a schematic diagram of the connection between the EV3 smart car and the Raspberry Pi in an embodiment of this application;
FIG. 3 is a schematic diagram of the connection between the EV3 smart car and the terminal in an embodiment of this application;
FIG. 4 is a second flowchart of a device control method according to an embodiment of this application;
FIG. 5 is a schematic functional flowchart of the agent program in an embodiment of this application;
FIG. 6 is a schematic diagram of the processing flow of the EV3 smart car and the Raspberry Pi in human-tracking mode in an embodiment of this application;
FIG. 7 is a schematic diagram of the processing flow for feedback information in an embodiment of this application;
FIG. 8 is a third flowchart of a device control method according to an embodiment of this application;
FIG. 9 is a fourth flowchart of a device control method according to an embodiment of this application;
FIG. 10 is a schematic diagram of the composition structure of a terminal according to an embodiment of this application;
FIG. 11 is a schematic structural diagram of another terminal according to an embodiment of this application;
FIG. 12 is a schematic diagram of the composition structure of a first controlled device according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed description
This application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments provided here are only intended to explain this application, not to limit it. In addition, the embodiments provided below are some of the embodiments for implementing this application, not all of them; where there is no conflict, the technical solutions recorded in the embodiments of this application may be combined in any manner.
It should be noted that, in the embodiments of this application, the terms "include", "comprise", and any variants thereof are intended to cover non-exclusive inclusion, so that a method or apparatus including a series of elements includes not only the elements explicitly recorded but also other elements not explicitly listed, or elements inherent to implementing the method or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other related elements in the method or apparatus that includes the element (for example, steps in a method or units in an apparatus; a unit may be, for example, part of a circuit, part of a processor, part of a program or software, and so on).
For example, the device control method provided by the embodiments of this application contains a series of steps, but is not limited to the recorded steps; similarly, the terminal and controlled device provided by the embodiments of this application include a series of modules, but are not limited to the explicitly recorded modules and may also include modules needed to obtain related information or to perform processing based on information.
The term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A alone, both A and B, and B alone. In addition, the term "at least one" herein indicates any one of multiple items or any combination of at least two of multiple items; for example, including at least one of A, B, and C may indicate including any one or more elements selected from the set consisting of A, B, and C.
The embodiments of this application can be applied to a control system composed of a terminal and controlled devices, and can operate together with numerous other general-purpose or special-purpose computing system environments or configurations. Here, the terminal may be a thin client, a thick client, a handheld or laptop device, a microprocessor-based system, a set-top box, a programmable consumer electronics product, a network personal computer, a small computer system, and so on, and the controlled device may be an electronic device such as a smart car. The device control method provided by the embodiments of this application may also be implemented by a processor executing computer program code.
Electronic devices such as terminals and controlled devices may be described in the general context of computer-system-executable instructions (such as program modules) executed by a computer system. Generally, program modules may include routines, programs, target programs, components, logic, data structures, and so on, which perform specific tasks or implement specific abstract data types.
In some embodiments of this application, the educational robot is a common teaching-aid model in modern programming education, and can be controlled through the web or through a mobile application (Application, APP) client. In related intelligent-device control schemes, transmission is usually performed over Bluetooth or wireless video, and information processing is then performed on the server side. A first exemplary scheme proposes a mind-controlled video car system based on Wi-Fi (Wireless Fidelity) communication; the system consists of four parts, namely an EEG acquisition card, a smartphone, a mobile car, and a personal computer, the smartphone and the EEG acquisition card are connected via Bluetooth, and remote control of the mobile car can thereby be realized. A second exemplary scheme proposes a remotely controlled weeding robot; in actual implementation, a camera and a Global Positioning System (GPS) chip can be installed on the robot body, the car performs video acquisition and transmits the video to an upper computer through wireless communication, and after the relevant processing is completed, instructions are delivered to the robot. A third exemplary scheme proposes a remote management scheme for a sweeping car; in actual implementation, a communication module can be used to wirelessly transmit mobile-phone information to a network management server, and a management platform then provides services to users; this scheme mainly provides the relevant services through the web, which is convenient for users to manage the devices.
In the above intelligent-device control schemes, only fixed-command transmission can be completed over wireless communication, and only fixed control instructions can be sent through fixed command buttons; moreover, the communication quality of the wireless network affects the communication between the upper and lower computers.
In the related art, remote control of intelligent devices through the web is an efficient and convenient scheme and plays an extremely important role in the remote-control field and the education field; however, many problems remain to be solved in order to achieve accurate, real-time control. First, a wireless connection is used between the upper and lower computers, but wireless networks are unstable and have a high packet-loss rate, which causes serious problems in scenarios with high performance requirements and limits the application scenarios. Second, the related art only realizes the transmission of control instructions from the upper computer to the lower computer and cannot collect feedback information or process it effectively; delivering control instructions from the web side of the upper computer and collecting and reporting feedback information on instruction execution from the lower-computer side also require a flexible and complex architecture.
In view of the above technical problems, some embodiments of this application propose a device-control technical solution; the embodiments of this application can be implemented in scenarios such as educational robots, remote control, and intelligent remote control.
An embodiment of this application proposes a device control method. FIG. 1 is a first flowchart of the device control method according to an embodiment of this application; as shown in FIG. 1, the flow may include:
Step 101: the terminal obtains, through a local agent program, program code for controlling the operation of a first controlled device.
In the embodiments of this application, the first controlled device may be an electronic device such as a smart car, an embedded development device, or an intelligent robot; the first controlled device may include components such as sensors and cameras, and the terminal may provide a human-computer interaction interface. In the embodiments of this application, the first controlled device may be a single electronic device or an integrated electronic device.
In practical applications, a local agent program may be installed in the terminal; the local agent program is at least used to collect program code submitted by a user. Here, the program code submitted by the user may be program code for controlling the operation of the first controlled device; in this way, by running the local agent program, the program code for controlling the operation of the first controlled device can be collected.
In some embodiments of this application, the program code submitted by the user on a web page may be obtained through the local agent program; that is, the user may submit, through a web page of the terminal, program code for controlling the operation of the first controlled device, and the local agent program, while running, can collect the corresponding program code. It can be seen that the user can submit program code conveniently through a web page.
The embodiments of this application do not limit the type of the program code; the type may be determined in advance according to the actual application requirements, and the program code may be, for example, python code or other program code.
Step 102: the terminal sends the program code to the first controlled device through the local agent program.
In practical applications, a communication connection between the terminal and the first controlled device may be established in advance; the embodiments of this application do not limit the connection manner, and, for example, the terminal and the first controlled device may exchange data through a wired connection or various wireless connections.
In some embodiments of this application, the local agent program may be used to establish the communication connection between the terminal and the first controlled device. For example, when the local agent program runs, the terminal may broadcast Address Resolution Protocol (ARP) packets within the local area network; a first controlled device in the local area network replies with a corresponding packet upon receiving an ARP packet, and the local agent program can parse the packet replied by the first controlled device to obtain its Internet Protocol (IP) address, whereby the communication connection between the terminal and the first controlled device can be established. In some embodiments of this application, when multiple controlled devices exist in the local area network, if the local agent program receives packets replied by multiple controlled devices, it can generate an intelligent-device list representing the multiple controlled devices; the user can select at least one of the multiple controlled devices from the list, and the communication connection between the terminal and the selected controlled device can then be established based on the local agent program.
In some embodiments of this application, after the communication connection between the terminal and the first controlled device is established, the connection may also be monitored to learn its status. For example, based on the local agent program, the terminal may continue to broadcast ARP packets within the local area network and can then judge the status of the communication connection according to the replies. For example, after sending an ARP packet to the first controlled device, if the terminal does not receive a reply packet within a set duration, the communication connection between the terminal and the first controlled device can be considered interrupted; if a reply packet is received within the set duration, the communication connection can be considered maintained. The set duration may be set according to actual requirements; for example, it may be 1 to 3 seconds.
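The discovery-and-liveness behavior described above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the patent uses ARP broadcast, while this sketch uses an application-level UDP broadcast; the probe payload `DISCOVER` and the port `50000` are hypothetical.

```python
import socket
import time

PROBE = b"DISCOVER"      # hypothetical probe payload (the patent uses ARP broadcast)
DISCOVERY_PORT = 50000   # hypothetical port

def parse_reply(payload, sender_ip):
    """Turn one reply packet into an entry for the smart-device list."""
    return {"ip": sender_ip, "name": payload.decode(errors="replace"),
            "last_seen": time.time()}

def is_alive(entry, timeout=3.0, now=None):
    """A device counts as connected if it replied within the set duration (1-3 s)."""
    now = time.time() if now is None else now
    return (now - entry["last_seen"]) <= timeout

def discover(wait=1.0):
    """Broadcast a probe on the LAN and collect replies into a device list."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(wait)
    sock.sendto(PROBE, ("255.255.255.255", DISCOVERY_PORT))
    devices = []
    try:
        while True:  # collect every reply until the wait window expires
            payload, (ip, _port) = sock.recvfrom(1024)
            devices.append(parse_reply(payload, ip))
    except socket.timeout:
        pass
    finally:
        sock.close()
    return devices
```

When several devices reply, the returned list plays the role of the intelligent-device list from which the user picks one; periodically re-running the probe and checking `is_alive` gives the connection monitoring described above.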
In the embodiments of this application, the terminal may send the program code to the first controlled device through the pre-established communication connection between the terminal and the first controlled device.
Step 103: the first controlled device receives and runs the program code.
In practical applications, the first controlled device may run the uploaded program code to generate a corresponding control instruction and may then execute the control instruction; in this way, since the control instruction is generated from the program code, the terminal can control the first controlled device through the program code.
For example, when the first controlled device includes a smart car, running the program code can generate a control instruction, and the control instruction can indicate the motion state of the smart car.
In practical applications, steps 101 to 102 may be implemented by a processor of the terminal, and step 103 by a processor of the first controlled device; the processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understandable that, for different terminals or controlled devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of this application.
It can be seen that, with the technical solution of the embodiments of this application, program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be edited flexibly, the first controlled device can be controlled more flexibly.
In some embodiments of this application, the first controlled device runs the program code to generate feedback information, and can then send the feedback information to the terminal.
In the embodiments of this application, the feedback information may represent feedback on how the program code ran; for example, the feedback information may include the execution result of the program code.
It can be seen that sending the feedback information to the terminal allows the terminal to learn how the program code ran, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, after receiving the feedback information sent by the first controlled device, the terminal may load the feedback information locally and/or display it on the display interface of the terminal.
It can be seen that loading and/or displaying the feedback information locally on the terminal makes it easy for the user to learn intuitively how the program code ran.
The embodiments of this application do not limit how the terminal loads the feedback information; in some embodiments, the terminal loads the feedback information synchronously or asynchronously.
In some embodiments of this application, the terminal may load and/or display the feedback information in the web front end, making it easy for the user to learn intuitively, through the web front end, how the program code ran.
In some embodiments of this application, the first controlled device may generate, by running the program code, a control instruction for a second controlled device, where the first controlled device and the second controlled device form a communication connection.
In the embodiments of this application, one second controlled device or multiple second controlled devices may be provided; the second controlled device may be an electronic device such as a Raspberry Pi and may include components such as sensors and cameras; the second controlled device may be of the same type as the first controlled device or of a different type.
After generating the control instruction for the second controlled device, the first controlled device may send it to the second controlled device; the second controlled device may execute the control instruction and obtain an instruction execution result.
The second controlled device may send the instruction execution result to the first controlled device; the first controlled device may generate, according to the instruction execution result, a control instruction for the first controlled device and execute it.
It can be seen that the embodiments of this application provide a cooperative control scheme for the first controlled device and the second controlled device; in some embodiments of this application, cooperative control of the first controlled device and the second controlled device can be realized based on the program code.
In some embodiments of this application, the first controlled device and the second controlled device are connected through a wired connection, for example through a serial port; in this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
In some embodiments of this application, the first controlled device may also obtain work-mode switching information; the work-mode switching information instructs the first controlled device and the second controlled device to switch from the current work mode to a target work mode.
In this way, the first controlled device can switch its work mode according to the work-mode switching information, and can also send the work-mode switching information to the second controlled device; after receiving the work-mode switching information, the second controlled device can switch its work mode based on it.
That is, in the embodiments of this application, multiple work modes can be configured for each controlled device, and a controlled device operates differently in different work modes. It can be seen that the embodiments of this application can realize synchronized switching of the work modes of the first controlled device and the second controlled device.
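The synchronized mode switch above can be sketched as "switch locally, then forward the same information over the wired link". This is a hedged illustration: the patent does not specify a wire encoding, so the mode identifiers (apart from human tracking, which the text names), the frame header `0xA5`, and the `Device` abstraction are all assumptions.

```python
# Hypothetical mode table; only "human_tracking" is named in the text.
MODES = {"standby": 0, "remote_control": 1, "human_tracking": 2}

def encode_mode_switch(target_mode):
    """Build the (assumed) serial frame carrying the work-mode switching information."""
    if target_mode not in MODES:
        raise ValueError("unknown mode: " + target_mode)
    return bytes([0xA5, MODES[target_mode]])  # 0xA5: assumed frame header

class Device:
    def __init__(self, name, serial_link=None):
        self.name = name
        self.mode = "standby"
        self.serial_link = serial_link  # e.g. the EV3 -> Raspberry Pi serial port

    def switch_mode(self, target_mode):
        """Switch locally, then forward so both devices end up in the same mode."""
        self.mode = target_mode
        if self.serial_link is not None:
            self.serial_link.switch_mode(target_mode)

# The EV3 smart car forwards every switch to the Raspberry Pi behind it.
pi = Device("raspberry-pi")
ev3 = Device("ev3-smart-car", serial_link=pi)
ev3.switch_mode("human_tracking")
```

After the call, both objects report the same mode, mirroring how the touch-sensor trigger changes the EV3's mode flag and the same flag is sent over the serial port to the Raspberry Pi.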
In some embodiments of this application, the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result; in this case, the first controlled device can generate a human-tracking instruction according to the received image detection result, and can then control the motion state of the first controlled device according to the human-tracking instruction, thereby realizing human tracking.
It can be seen that, in the embodiments of this application, human tracking can be realized on the basis of the cooperative control of the first controlled device and the second controlled device.
In some embodiments of this application, in response to the image detection result indicating that the human body has become smaller, the first controlled device generates a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body has become larger, it generates a control instruction for controlling the first controlled device to remain still; in response to the image detection result indicating that the human body is on the left side of the first controlled device, it generates a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is on the right side of the first controlled device, it generates a control instruction for controlling the first controlled device to turn right.
It can be seen that, in the embodiments of this application, human tracking can be realized accurately by controlling the motion state of the first controlled device.
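The four tracking rules above can be written as one decision function. This is a sketch under stated assumptions: the detection-result format (`scale_change` / `side` keys) and the rule precedence are hypothetical, since the patent lists the four responses without defining a data format or an order of evaluation.

```python
def tracking_command(detection):
    """Map an image detection result to a motion command for the first controlled device.

    `detection` is a hypothetical result format with:
      - "scale_change": "smaller" | "larger" | "same"  (apparent size of the human body)
      - "side": "left" | "right" | "center"            (position relative to the device)
    """
    scale = detection.get("scale_change")
    side = detection.get("side")
    if scale == "smaller":   # person moved away: advance
        return "forward"
    if scale == "larger":    # person came closer: remain still
        return "stop"
    if side == "left":
        return "turn_left"
    if side == "right":
        return "turn_right"
    return "stop"            # no cue: stay put (assumed default)
```

On the EV3, each returned command would be translated into motor drive for the two drive wheels; that translation is device-specific and is omitted here.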
This application is further described below with reference to an application scenario.
In this application scenario, the first controlled device may include an EV3 smart car, motors, and related sensors; the EV3 smart car can drive the related sensors to move through the motors. The second controlled device may include a Raspberry Pi, a motor, and a camera. FIG. 2 is a schematic diagram of the connection between the EV3 smart car and the Raspberry Pi in an embodiment of this application. As shown in FIG. 2, the Raspberry Pi 201 may be connected to the camera 202 through a Universal Serial Bus (USB) connection and can collect surrounding video or image information in real time; the Raspberry Pi 201 can also drive the camera to rotate through a motor. The Raspberry Pi 201 may be connected to the EV3 smart car 203 through a serial adapter cable; the EV3 smart car 203 may also be connected to a related sensor 204, a first drive wheel 205, and a second drive wheel 206. In one example, the related sensor 204 may be a touch sensor; the trigger signal generated when the touch sensor is touched can be used to synchronously modify the function modes of the Raspberry Pi 201 and the EV3 smart car 203. The EV3 smart car 203 can also drive the first drive wheel 205 and the second drive wheel 206 to work. In some embodiments of this application, when touched, the touch sensor can generate a trigger signal for the directly connected EV3 smart car 203, thereby changing the function-mode flag bit of the EV3 smart car 203; the flag bit is then sent to the Raspberry Pi 201 through the serial port, which in turn triggers a change of the function-mode flag bit of the Raspberry Pi 201, thereby realizing synchronized switching of the function modes of the Raspberry Pi 201 and the EV3 smart car 203.
In one example, the correspondence between the connection wires of the Raspberry Pi 201 and the EV3 smart car 203 may be as shown in Table 1, from which the correspondence between the interfaces of the Raspberry Pi 201 and the colors of the connection wires can be seen.
Table 1. Connection relationship between the Raspberry Pi 201 and the EV3 smart car 203
(Table 1 is provided as an image in the original publication: Figure PCTCN2020091915-appb-000001.)
The local agent program described above may be installed in the terminal. FIG. 3 is a schematic diagram of the connection between the EV3 smart car and the terminal in an embodiment of this application. As shown in FIG. 3, the EV3 smart car 203 may be connected to the terminal 301 through a wireless connection, so that, after the local agent program of the terminal 301 obtains the program code, it can push the program code to the EV3 smart car 203 in real time. To realize the wireless connection between the EV3 smart car 203 and the terminal 301, a wireless network card may be attached to the USB interface of the EV3 smart car 203. In some embodiments of this application, the EV3 smart car 203 and the terminal 301 may be placed on the same network segment, which ensures the stability of the wireless network connection and the reliability of the communication between the EV3 smart car 203 and the terminal 301.
FIG. 4 is a second flowchart of the device control method according to an embodiment of this application; as shown in FIG. 4, the flow may include:
Step 401: connect the EV3 smart car 203 to the Raspberry Pi 201, and connect the EV3 smart car 203 to the terminal 301.
The implementation of this step has been described above and is not repeated here.
Step 402: on the terminal side, collect the program code from the web front end based on the local agent program, and push the collected program code to the EV3 smart car 203.
Here, the python code submitted by the user may be collected based on the local agent program; after the python code is pushed to the EV3 smart car 203, the EV3 smart car 203 can work based on the python code. That is, remote control of the EV3 smart car 203 can be realized based on python code, so that controlled devices such as the EV3 smart car 203 can be controlled in a more flexible and varied way; in an education scenario, students can continuously try, fail, and verify novel control schemes, which cultivates their program-design process thinking.
To realize the connection between the EV3 smart car 203 and the terminal 301, a User Datagram Protocol (UDP) service needs to be started on the EV3 smart car 203 side; the implementation of the local agent program is based on the UDP service started by the EV3 smart car 203.
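The EV3-side UDP service can be sketched as a loop that persistently listens on a fixed port and runs each pushed program. This is an illustrative skeleton, not the patent's code: the port number is hypothetical, and executing code received over the network is only reasonable on the trusted classroom LAN the patent assumes.

```python
import socket

LISTEN_PORT = 50001  # hypothetical fixed port monitored by the EV3-side UDP service

def handle_packet(payload):
    """Decode one pushed packet into the python program code it carries."""
    return payload.decode("utf-8")

def serve_forever():
    """Persistently listen on the fixed port and run each pushed program.

    Call serve_forever() on the EV3 to start the service; it blocks.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LISTEN_PORT))
    while True:
        payload, addr = sock.recvfrom(65507)  # max UDP payload size
        code = handle_packet(payload)
        exec(code, {})  # run the received python code (trusted LAN assumed)
```

In the full scheme, the `exec` output would additionally be captured and returned to the agent as feedback, as described for FIG. 7 below in the text.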
FIG. 5 is a schematic functional flowchart of the agent program in an embodiment of this application; referring to FIG. 5, the flow may include:
Step A1: obtain, by broadcasting, the list of smart cars on the same network segment as the terminal 301.
In some embodiments of this application, when the local agent program runs, the terminal 301 may broadcast ARP packets within the local area network; an EV3 smart car 203 in the local area network replies with a corresponding packet upon receiving an ARP packet, and the local agent program can parse the packet replied by the EV3 smart car 203 to obtain its IP address; when the IP addresses of multiple EV3 smart cars 203 are obtained, a list of the EV3 smart cars on the same network segment as the terminal 301 can be generated.
Step A2: select an EV3 smart car 203 and establish a connection.
In some embodiments of this application, a specific smart car can be selected from the above EV3 smart-car list, and a UDP connection to the corresponding smart car is then created.
Step A3: push the program code to the EV3 smart car 203, and receive the feedback information from the EV3 smart car 203.
In some embodiments of this application, on the web front end on the terminal side, the user can submit program code; when the user clicks the submit button, the local agent program can collect the submitted program code and then push it to the selected EV3 smart car 203; the EV3 smart car 203 can run the program code, generate feedback information, and send the feedback information to the terminal 301.
Step A4: load the feedback information in the web front end by asynchronous loading.
Step A5: after the program code has finished executing, the connection between the terminal 301 and the EV3 smart car 203 can be disconnected and the resources reclaimed.
In some embodiments of this application, after the feedback information is received or loaded, the program code can be considered to have finished executing; at this point, disconnecting the terminal 301 from the EV3 smart car 203 releases the network transmission resources between them, and the corresponding hardware resources and/or software resources (for example, web front-end resources) can also be released on the terminal 301 side.
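Steps A1 to A5 can be sketched with the network injected as a transport object, so the control flow can be exercised without a real EV3 on the LAN. This is a hedged sketch: the `transport` interface (`discover`/`connect`/`send`/`recv`/`close`) and the canned feedback string are assumptions, not the agent's actual API.

```python
class Agent:
    """Minimal sketch of the local agent program's job lifecycle (steps A1-A5)."""

    def __init__(self, transport):
        self.transport = transport  # must provide discover/connect/send/recv/close

    def run_job(self, code):
        devices = self.transport.discover()        # A1: broadcast, build device list
        if not devices:
            raise RuntimeError("no smart car found on this network segment")
        self.transport.connect(devices[0])         # A2: pick a device, open UDP connection
        self.transport.send(code.encode("utf-8"))  # A3: push the collected code
        feedback = self.transport.recv()           # A3/A4: feedback to load in the web UI
        self.transport.close()                     # A5: disconnect and reclaim resources
        return feedback.decode("utf-8")

class FakeTransport:
    """Stands in for the UDP link; echoes a canned execution result."""
    def discover(self): return ["192.168.1.20"]
    def connect(self, ip): self.ip = ip
    def send(self, data): self.sent = data
    def recv(self): return b"execution result: ok"
    def close(self): self.closed = True
```

Replacing `FakeTransport` with a real UDP implementation built on the discovery sketch given earlier yields the full agent flow; the returned feedback string is what step A4 loads asynchronously into the web front end.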
Step 403: the EV3 smart car 203 receives and runs the program code, and pushes an image detection instruction to the Raspberry Pi 201.
In some embodiments of this application, the EV3 smart car 203 starts the corresponding UDP service, which can persistently listen for information on a fixed port; after the local agent program pushes the program code to the EV3 smart car 203, the EV3 smart car 203 can run the received program code (which may be encapsulated in packets) to obtain a control instruction. If the control instruction involves an image-processing task, an image detection instruction can be generated, the corresponding service mode is started, and the image detection instruction is pushed to the Raspberry Pi 201 to instruct it to perform the image detection task; in this way, the image detection result can be obtained from the Raspberry Pi 201.
In some embodiments of this application, the EV3 smart car 203 and the Raspberry Pi 201 may have multiple work modes; Table 2 illustrates several work modes of the EV3 smart car 203 and the Raspberry Pi 201.
Table 2. Work modes of the EV3 smart car 203 and the Raspberry Pi 201
(Table 2 is provided as an image in the original publication: Figure PCTCN2020091915-appb-000002.)
Step 404: the Raspberry Pi 201 executes the image detection instruction and transmits the image detection result (also called the image-processing result) to the EV3 smart car 203, and the EV3 smart car 203 executes a corresponding control instruction according to the image detection result.
In some embodiments of this application, the image detection instruction may be encapsulated in a packet for sending; the Raspberry Pi 201 receives the packet from the EV3 smart car 203, determines its own work mode by parsing the packet, then performs the corresponding image detection task, and sends the image detection result to the EV3 smart car 203. In the same work mode, the EV3 smart car 203 and the Raspberry Pi 201 cooperatively perform a specific task.
FIG. 6 is a schematic diagram of the processing flow of the EV3 smart car 203 and the Raspberry Pi 201 in human-tracking mode in an embodiment of this application. As shown in FIG. 6, the processing flow can be described in terms of an image-processing stage, a detection-data-processing stage, and an instruction-execution stage. The image-processing stage can be triggered by the image detection instruction pushed by the EV3 smart car 203: first, a picture is captured with the camera 202, and then human detection is performed on the captured image based on deep-learning technology to obtain the human-body coordinates. The data-processing stage mainly performs human ID recognition and position-coordinate processing. The result of the data processing is sent to the EV3 smart car 203, which can control the motors according to the logic shown in FIG. 6; in this way, a basic human-tracking function can be realized. Referring to FIG. 6, when it is determined that the human body has become smaller, the motors drive the EV3 smart car 203 forward; when it is determined that the human body has become larger, the EV3 smart car 203 is kept still through motor control; when it is determined that the human body is on the left, the motors drive the EV3 smart car 203 to turn left; and when it is determined that the human body is on the right, the motors drive the EV3 smart car 203 to turn right.
Step 405: the EV3 smart car 203 collects the feedback information and sends it to the terminal 301.
Here, based on steps 401 to 405, remote control of the EV3 smart car 203 from the web front end has been realized; however, a complete feedback flow also needs to be designed to guarantee the integrity of the bidirectional information flow.
FIG. 7 is a schematic diagram of the processing flow for feedback information in an embodiment of this application. As shown in FIG. 7, on the EV3 smart car 203 side (denoted the EV3 side in FIG. 7), after the python code is received, the car's own python interpreter converts the original code into bytecode and then into machine code so that the instructions can be executed. The execution result of the program code can be redirected to a text file; when the UDP communication module is started on the EV3 side, the communication connection between the EV3 smart car 203 and the terminal 301 can be established based on the UDP communication module. The UDP communication module can detect in real time whether the text file has been updated; after the text has been updated, it reads the content (that is, the feedback information) and then actively pushes the read feedback information to the local agent program of the terminal 301 through the established connection. The local agent program, based on a JAVA framework, can load the feedback information into the web front end asynchronously, and the web front end, after a series of rendering steps, presents the feedback information in a designated area.
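The "redirect output to a text file, then push whatever was appended" feedback mechanism of FIG. 7 can be sketched with two small functions. This is a simplified illustration: the patent's UDP module watches the file in real time, whereas here the watcher is modeled as an offset-based read; the function names are hypothetical.

```python
import contextlib
import io
import os
import tempfile

def run_and_log(code, log_path):
    """Execute pushed python code and redirect its output into a text file,
    as the EV3 side does before the UDP module forwards it."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):  # capture everything the code prints
        exec(code, {})
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(buf.getvalue())

def read_new(log_path, offset):
    """Return (new_content, new_offset): the part of the file appended since
    the last check, i.e. the feedback information to push to the terminal."""
    with open(log_path, "r", encoding="utf-8") as f:
        f.seek(offset)
        new = f.read()
        return new, f.tell()
```

Looping `read_new` from the last returned offset reproduces the "detect whether the text file has been updated, then read the new content" behavior; each non-empty result would be sent over the UDP connection to the agent.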
It can be seen that the embodiments of this application propose a complete remote-control scheme based on the EV3 smart car 203: program code can be entered on the web side, transmitted in real time to the lower computer (that is, the EV3 smart car), and the lower computer replies with feedback information. Further, through the local agent program, python code can be collected from the web front end and pushed over the UDP connection to the lower-computer side for execution; through the local agent program, the information exchange between the terminal 301 and the EV3 smart car 203 is efficient and reliable. Furthermore, a connection scheme among the terminal, the EV3 smart car 203, the Raspberry Pi 201, and the related peripherals is designed, which ensures efficient and feasible connections while remaining compatible with most embedded development devices.
In the related art, the EV3 smart car and the Raspberry Pi are usually controlled independently, and no cooperative control scheme for the two is given; moreover, only single-work-mode control of the EV3 smart car or the Raspberry Pi is realized, and switching between different work modes cannot be achieved for them.
In the embodiments of this application, by contrast, a complete implementation scheme is given for the connection between the EV3 smart car and the Raspberry Pi and for their control; moreover, a synchronized switching mechanism between different work modes can be realized for the EV3 smart car and the Raspberry Pi, which guarantees the consistency of information exchange across different work modes.
Based on the content of the foregoing embodiments, a device control method according to an embodiment of this application is described below from the perspective of the terminal.
FIG. 8 is a third flowchart of the device control method according to an embodiment of this application; as shown in FIG. 8, the flow may include:
Step 801: obtain, through a local agent program, program code for controlling the operation of a first controlled device.
In the embodiments of this application, the first controlled device may be an electronic device such as a smart car or an embedded development device; the first controlled device may include components such as sensors and cameras, and the terminal may provide a human-computer interaction interface. In the embodiments of this application, the first controlled device may be a single electronic device or may integrate multiple electronic devices.
In practical applications, a local agent program may be installed in the terminal; the local agent program is at least used to collect program code submitted by a user. Here, the program code submitted by the user may be program code for controlling the operation of the first controlled device; in this way, by running the local agent program, the program code for controlling the operation of the first controlled device can be collected.
The embodiments of this application do not limit the type of the program code; the type may be determined in advance according to the actual application requirements, and the program code may be, for example, python code or other program code.
Step 802: send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
In practical applications, a communication connection between the terminal and the first controlled device may be established in advance; the embodiments of this application do not limit the connection manner, and, for example, the terminal and the first controlled device may exchange data through a wired connection or various wireless connections.
In practical applications, the first controlled device may run the uploaded program code to generate a corresponding control instruction and may then execute the control instruction; in this way, since the control instruction is generated from the program code, the terminal can control the first controlled device through the program code.
In practical applications, steps 801 to 802 may be implemented by a processor of the terminal; the processor may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understandable that, for different terminals or controlled devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of this application.
It can be seen that, with the technical solution of the embodiments of this application, program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be edited flexibly, the first controlled device can be controlled more flexibly.
In some embodiments of this application, the method further includes:
after the program code is sent to the first controlled device, receiving feedback information from the first controlled device, the feedback information being generated after the first controlled device runs the program code.
In some embodiments of this application, the feedback information includes an execution result of the program code.
It can be seen that sending the feedback information to the terminal allows the terminal to learn how the program code ran, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, the method further includes:
after the feedback information from the first controlled device is received, loading and/or displaying the feedback information on the terminal where the local agent program resides.
It can be seen that loading and/or displaying the feedback information locally on the terminal makes it easy for the user to learn intuitively how the program code ran.
In some embodiments of this application, obtaining, through the local agent program, the program code for controlling the operation of the controlled device includes:
obtaining, through the local agent program, the program code submitted by a user on a web page.
It can be seen that the user can submit program code conveniently through a web page, which in turn helps the local agent program collect the program code submitted by the user.
Based on the content of the foregoing embodiments, another device control method according to an embodiment of this application is described below from the perspective of the first controlled device.
FIG. 9 is a fourth flowchart of the device control method according to an embodiment of this application; as shown in FIG. 9, the flow may include:
Step 901: receive program code, sent by a terminal through a local agent program, for controlling the operation of the first controlled device; the program code has been collected by the local agent program.
In the embodiments of this application, the first controlled device may be an electronic device such as a smart car or an embedded development device; the first controlled device may include components such as sensors and cameras, and the terminal may provide a human-computer interaction interface. In the embodiments of this application, the first controlled device may be a single electronic device or may integrate multiple electronic devices.
In practical applications, a local agent program may be installed in the terminal; the local agent program is at least used to collect program code submitted by a user. Here, the program code submitted by the user may be program code for controlling the operation of the first controlled device; in this way, by running the local agent program, the program code for controlling the operation of the first controlled device can be collected.
The embodiments of this application do not limit the type of the program code; the type may be determined in advance according to the actual application requirements, and the program code may be, for example, python code or other program code.
In practical applications, a communication connection between the terminal and the first controlled device may be established in advance; the embodiments of this application do not limit the connection manner, and, for example, the terminal and the first controlled device may exchange data through a wired connection or various wireless connections.
Step 902: run the program code.
In practical applications, the first controlled device may run the uploaded program code to generate a corresponding control instruction and may then execute the control instruction; in this way, since the control instruction is generated from the program code, the terminal can control the first controlled device through the program code.
It can be seen that, with the technical solution of the embodiments of this application, program code can be obtained and pushed based on the local agent program, and control of the first controlled device can then be realized based on the program code; since the program code can be edited flexibly, the first controlled device can be controlled more flexibly.
In some embodiments of this application, the method further includes:
running the program code to generate feedback information, and sending the feedback information to the terminal.
In some embodiments of this application, the feedback information includes an execution result of the program code.
It can be seen that sending the feedback information to the terminal allows the terminal to learn how the program code ran, which in turn makes it convenient to process the program code; for example, when the program code does not run as expected, it can be modified.
In some embodiments of this application, the method further includes:
generating, by running the program code, a control instruction for a second controlled device, where the first controlled device and the second controlled device form a communication connection;
sending the control instruction for the second controlled device to the second controlled device, so that the second controlled device executes the control instruction for the second controlled device; and
receiving an instruction execution result sent by the second controlled device, generating a control instruction for the first controlled device according to the instruction execution result, and executing the control instruction for the first controlled device, where the instruction execution result is obtained by the second controlled device after executing the control instruction for the second controlled device.
It can be seen that the embodiments of this application provide a cooperative control scheme for the first controlled device and the second controlled device; in some embodiments of this application, cooperative control of the first controlled device and the second controlled device can be realized based on the program code.
In some embodiments of this application, the method further includes:
obtaining work-mode switching information, the work-mode switching information instructing the first controlled device and the second controlled device to switch from a current work mode to a target work mode; and
controlling, according to the work-mode switching information, the first controlled device to switch its work mode, and sending the work-mode switching information to the second controlled device, so that the second controlled device switches its work mode based on the work-mode switching information.
That is, in the embodiments of this application, multiple work modes can be configured for each controlled device, and a controlled device operates differently in different work modes. It can be seen that the embodiments of this application can realize synchronized switching of the work modes of the first controlled device and the second controlled device.
In some embodiments of this application, the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
generating the control instruction for the first controlled device according to the instruction execution result and executing the control instruction for the first controlled device includes:
generating the control instruction for the first controlled device according to the image detection result, the control instruction for the first controlled device being a human-tracking instruction; and
controlling the motion state of the first controlled device according to the human-tracking instruction.
It can be seen that, in the embodiments of this application, human tracking can be realized on the basis of the cooperative control of the first controlled device and the second controlled device.
In some embodiments of this application, generating the control instruction for the first controlled device according to the image detection result includes:
in response to the image detection result indicating that the human body has become smaller, generating a control instruction for controlling the first controlled device to move forward;
in response to the image detection result indicating that the human body has become larger, generating a control instruction for controlling the first controlled device to remain still;
in response to the image detection result indicating that the human body is on the left side of the first controlled device, generating a control instruction for controlling the first controlled device to turn left; and
in response to the image detection result indicating that the human body is on the right side of the first controlled device, generating a control instruction for controlling the first controlled device to turn right.
It can be seen that, in the embodiments of this application, human tracking can be realized accurately by controlling the motion state of the first controlled device.
In some embodiments of this application, the first controlled device and the second controlled device are connected through a wired connection; in this way, the reliability of communication between the first controlled device and the second controlled device can be ensured.
Those skilled in the art can understand that, in the above methods of the specific implementations, the order in which the steps are written does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of the steps should be determined by their functions and possible internal logic.
On the basis of the device control method proposed in the foregoing embodiments, an embodiment of this application proposes a terminal.
FIG. 10 is a schematic diagram of the composition structure of a terminal according to an embodiment of this application. As shown in FIG. 10, the device includes an obtaining module 1001 and a first processing module 1002, where:
the obtaining module 1001 is configured to obtain, through a local agent program, program code for controlling the operation of a first controlled device; and
the first processing module 1002 is configured to send the program code to the first controlled device through the local agent program, so that the first controlled device runs the program code.
In one implementation, the first processing module 1002 is further configured to receive, after the program code is sent to the first controlled device, feedback information from the first controlled device, the feedback information being generated after the first controlled device runs the program code.
In one implementation, the first processing module 1002 is further configured to load and/or display, after the feedback information from the first controlled device is received, the feedback information on the terminal where the local agent program resides.
In one implementation, the feedback information includes an execution result of the program code.
In one implementation, the obtaining module 1001 is configured to obtain, through the local agent program, the program code submitted by a user on a web page.
Both the obtaining module 1001 and the first processing module 1002 can be implemented by a processor in the terminal; the processor is at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor.
In addition, the functional modules in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software function module.
If the integrated unit is implemented in the form of a software function module and is not sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of this application. The aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
Specifically, the computer program instructions corresponding to a device control method in the embodiments of this application may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive; when the computer program instructions corresponding to a device control method in the storage medium are read or executed by a terminal, any device control method applied to the terminal in the foregoing embodiments is implemented.
Based on the same technical concept as the foregoing embodiments, referring to FIG. 11, which shows another terminal provided by an embodiment of this application, the terminal may include a first memory 1101 and a first processor 1102, where:
the first memory 1101 is configured to store computer programs and data; and
the first processor 1102 is configured to execute a computer program stored in the first memory, so as to implement any device control method applied to the terminal in the foregoing embodiments.
In practical applications, the above first memory 1101 may be a volatile memory (volatile memory), such as a RAM; or a non-volatile memory (non-volatile memory), such as a ROM, a flash memory (flash memory), a hard disk drive (Hard Disk Drive, HDD), or a solid-state drive (Solid-State Drive, SSD); or a combination of the above types of memory, and it provides instructions and data to the first processor 1102.
The above first processor 1102 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understandable that, for different devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of this application.
On the basis of the other device control method proposed in the foregoing embodiments, an embodiment of this application proposes a controlled device.
FIG. 12 is a schematic diagram of the composition structure of a first controlled device according to an embodiment of this application. As shown in FIG. 12, the device includes a receiving module 1201 and a second processing module 1202, where:
the receiving module 1201 is configured to receive program code, sent by a terminal through a local agent program, for controlling the operation of the first controlled device, the program code having been collected by the local agent program; and
the second processing module 1202 is configured to run the program code.
In one implementation, the second processing module 1202 is further configured to run the program code to generate feedback information, and to send the feedback information to the terminal.
In one implementation, the feedback information includes an execution result of the program code.
In one implementation, the second processing module 1202 is further configured to:
generate, by running the program code, a control instruction for a second controlled device, where the first controlled device and the second controlled device form a communication connection;
send the control instruction for the second controlled device to the second controlled device, so that the second controlled device executes the control instruction for the second controlled device; and
receive an instruction execution result sent by the second controlled device, generate a control instruction for the first controlled device according to the instruction execution result, and execute the control instruction for the first controlled device, where the instruction execution result is obtained by the second controlled device after executing the control instruction for the second controlled device.
In one implementation, the second processing module 1202 is further configured to obtain work-mode switching information; control, according to the work-mode switching information, the first controlled device to switch its work mode; and send the work-mode switching information to the second controlled device, so that the second controlled device switches its work mode based on the work-mode switching information; the work-mode switching information instructs the first controlled device and the second controlled device to switch from a current work mode to a target work mode.
In one implementation, the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
the second processing module 1202 is configured to generate, according to the image detection result, a control instruction for the first controlled device, the control instruction for the first controlled device being a human-tracking instruction, and to control the motion state of the first controlled device according to the human-tracking instruction.
In one implementation, the second processing module 1202 is configured to: in response to the image detection result indicating that the human body has become smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body has become larger, generate a control instruction for controlling the first controlled device to remain still; in response to the image detection result indicating that the human body is on the left side of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is on the right side of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
In one implementation, the first controlled device and the second controlled device are connected through a wired connection.
Both the receiving module 1201 and the second processing module 1202 can be implemented by a processor in the first controlled device; the processor is at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor.
In addition, the functional modules in the embodiments of this application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software function module.
If the integrated unit is implemented in the form of a software function module and is not sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of this application. The aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
Specifically, the computer program instructions corresponding to another device control method in the embodiments of this application may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive; when the computer program instructions corresponding to the other device control method in the storage medium are read or executed by an electronic device, any device control method applied to the first controlled device in the foregoing embodiments is implemented.
Based on the same technical concept as the foregoing embodiments, referring to FIG. 13, which shows an electronic device provided by an embodiment of this application, the device may include a second memory 1301 and a second processor 1302, where:
the second memory 1301 is configured to store computer programs and data; and
the second processor 1302 is configured to execute a computer program stored in the second memory, so as to implement any device control method applied to the first controlled device in the foregoing embodiments.
In practical applications, the above second memory 1301 may be a volatile memory, such as a RAM; or a non-volatile memory, such as a ROM, a flash memory, an HDD, or an SSD; or a combination of the above types of memory, and it provides instructions and data to the second processor 1302.
The above second processor 1302 may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It is understandable that, for different electronic devices, other electronic components may also be used to implement the above processor functions, which is not specifically limited in the embodiments of this application.
In some embodiments, the functions or modules of the apparatus provided in the embodiments of this application can be used to execute the methods described in the above method embodiments; for their specific implementation, refer to the descriptions of the above method embodiments, which are not repeated here for brevity.
The above descriptions of the various embodiments tend to emphasize the differences between them; for their similarities, the embodiments may be referred to one another and, for brevity, the details are not repeated here.
The methods disclosed in the method embodiments provided in this application may be combined arbitrarily, provided there is no conflict, to obtain new method embodiments.
The features disclosed in the product embodiments provided in this application may be combined arbitrarily, provided there is no conflict, to obtain new product embodiments.
The features disclosed in the method or device embodiments provided in this application may be combined arbitrarily, provided there is no conflict, to obtain new method or device embodiments.
From the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the specific implementations described above. The specific implementations described above are merely illustrative rather than restrictive; under the inspiration of the present application, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present application.
Industrial Applicability
The embodiments of the present application disclose a device control method, a terminal, a controlled device, an electronic device, a medium, and a program. The method includes: acquiring, through a local agent program, program code for controlling operation of a first controlled device; and sending, through the local agent program, the program code to the first controlled device, so that the first controlled device runs the program code. With the technical solutions of the embodiments, program code can be acquired and pushed based on the local agent program, and the first controlled device can then be controlled based on that program code. Since the program code can be edited flexibly, the first controlled device can be controlled more flexibly.
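The acquire-and-push flow summarized above can be illustrated with a minimal local-agent sketch. The `LocalAgent` and `RecordingTransport` classes, the payload shape, and the method names are assumptions for illustration; the disclosure does not fix a transport or protocol:

```python
class LocalAgent:
    """Toy local agent: collects user-submitted program code and pushes it to a device."""

    def __init__(self, transport):
        self.transport = transport  # anything with a send(payload) method

    def push(self, program_code):
        # Wrap the code in a simple payload; a real agent would add auth, checksums, etc.
        payload = {"type": "program_code", "body": program_code}
        self.transport.send(payload)
        return payload


class RecordingTransport:
    """In-memory fake of the terminal-to-device link, for illustration only."""

    def __init__(self):
        self.sent = []

    def send(self, payload):
        self.sent.append(payload)
```

Separating the agent from its transport keeps the push logic testable without a real device on the other end.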

Claims (32)

  1. A device control method, comprising:
    acquiring, through a local agent program, program code for controlling operation of a first controlled device; and
    sending, through the local agent program, the program code to the first controlled device, so that the first controlled device runs the program code.
  2. The method according to claim 1, wherein the method further comprises:
    after sending the program code to the first controlled device, receiving feedback information from the first controlled device, the feedback information being generated after the first controlled device runs the program code.
  3. The method according to claim 2, wherein the method further comprises:
    after receiving the feedback information from the first controlled device, loading and/or displaying the feedback information on the terminal where the local agent program resides.
  4. The method according to claim 2 or 3, wherein the feedback information comprises an execution result of the program code.
  5. The method according to any one of claims 1 to 3, wherein the acquiring, through a local agent program, program code for controlling operation of the first controlled device comprises:
    acquiring, through the local agent program, the program code submitted by a user via a WEB page.
  6. A device control method, applied to a first controlled device, the method comprising:
    receiving program code for controlling operation of the first controlled device, sent by a terminal through a local agent program, the program code being collected by the local agent program; and
    running the program code.
  7. The method according to claim 6, wherein the method further comprises:
    running the program code to generate feedback information, and sending the feedback information to the terminal.
  8. The method according to claim 7, wherein the feedback information comprises an execution result of the program code.
  9. The method according to any one of claims 6 to 8, wherein the method further comprises:
    generating, by running the program code, a control instruction for a second controlled device, wherein the first controlled device and the second controlled device form a communication connection;
    sending the control instruction for the second controlled device to the second controlled device, so that the second controlled device executes the control instruction for the second controlled device; and
    receiving an instruction execution result sent by the second controlled device, generating a control instruction for the first controlled device according to the instruction execution result, and executing the control instruction for the first controlled device; wherein the instruction execution result is obtained by the second controlled device after executing the control instruction for the second controlled device.
  10. The method according to claim 9, wherein the method further comprises:
    acquiring working mode switching information, the working mode switching information instructing the first controlled device and the second controlled device to switch from a current working mode to a target working mode; and
    controlling, according to the working mode switching information, the first controlled device to switch its working mode, and sending the working mode switching information to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information.
  11. The method according to claim 9 or 10, wherein the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
    the generating a control instruction for the first controlled device according to the instruction execution result and executing the control instruction for the first controlled device comprises:
    generating the control instruction for the first controlled device according to the image detection result, the control instruction for the first controlled device being a human body tracking instruction; and
    controlling a motion state of the first controlled device according to the human body tracking instruction.
  12. The method according to claim 11, wherein the generating the control instruction for the first controlled device according to the image detection result comprises:
    in response to the image detection result indicating that the human body becomes smaller, generating a control instruction for controlling the first controlled device to move forward;
    in response to the image detection result indicating that the human body becomes larger, generating a control instruction for controlling the first controlled device to remain stationary;
    in response to the image detection result indicating that the human body is located on a left side of the first controlled device, generating a control instruction for controlling the first controlled device to turn left; and
    in response to the image detection result indicating that the human body is located on a right side of the first controlled device, generating a control instruction for controlling the first controlled device to turn right.
  13. The method according to any one of claims 6 to 12, wherein the first controlled device and the second controlled device are connected through a wired connection.
  14. A terminal, comprising an acquisition module and a first processing module, wherein
    the acquisition module is configured to acquire, through a local agent program, program code for controlling operation of a first controlled device; and
    the first processing module is configured to send, through the local agent program, the program code to the first controlled device, so that the first controlled device runs the program code.
  15. The terminal according to claim 14, wherein the first processing module is further configured to, after sending the program code to the first controlled device, receive feedback information from the first controlled device, the feedback information being generated after the first controlled device runs the program code.
  16. The terminal according to claim 15, wherein the first processing module is further configured to, after receiving the feedback information from the first controlled device, load and/or display the feedback information on the terminal where the local agent program resides.
  17. The terminal according to claim 15 or 16, wherein the feedback information comprises an execution result of the program code.
  18. The terminal according to any one of claims 14 to 16, wherein the acquisition module is configured to acquire, through the local agent program, the program code submitted by a user via a WEB page.
  19. A first controlled device, comprising a receiving module and a second processing module, wherein
    the receiving module is configured to receive program code for controlling operation of the first controlled device, sent by a terminal through a local agent program, the program code being collected by the local agent program; and
    the second processing module is configured to run the program code.
  20. The first controlled device according to claim 19, wherein the second processing module is further configured to run the program code to generate feedback information, and send the feedback information to the terminal.
  21. The first controlled device according to claim 20, wherein the feedback information comprises an execution result of the program code.
  22. The first controlled device according to any one of claims 19 to 21, wherein the second processing module is further configured to:
    generate, by running the program code, a control instruction for a second controlled device, wherein the first controlled device and the second controlled device form a communication connection;
    send the control instruction for the second controlled device to the second controlled device, so that the second controlled device executes the control instruction for the second controlled device; and
    receive an instruction execution result sent by the second controlled device, generate a control instruction for the first controlled device according to the instruction execution result, and execute the control instruction for the first controlled device; wherein the instruction execution result is obtained by the second controlled device after executing the control instruction for the second controlled device.
  23. The first controlled device according to claim 22, wherein the second processing module is further configured to acquire working mode switching information; control, according to the working mode switching information, the first controlled device to switch its working mode; and send the working mode switching information to the second controlled device, so that the second controlled device switches its working mode based on the working mode switching information; the working mode switching information instructing the first controlled device and the second controlled device to switch from a current working mode to a target working mode.
  24. The first controlled device according to claim 22 or 23, wherein the control instruction for the second controlled device is an image detection instruction, and the instruction execution result is an image detection result;
    the second processing module is configured to generate, according to the image detection result, the control instruction for the first controlled device, the control instruction for the first controlled device being a human body tracking instruction; and control a motion state of the first controlled device according to the human body tracking instruction.
  25. The first controlled device according to claim 24, wherein the second processing module is configured to: in response to the image detection result indicating that the human body becomes smaller, generate a control instruction for controlling the first controlled device to move forward; in response to the image detection result indicating that the human body becomes larger, generate a control instruction for controlling the first controlled device to remain stationary; in response to the image detection result indicating that the human body is located on a left side of the first controlled device, generate a control instruction for controlling the first controlled device to turn left; and in response to the image detection result indicating that the human body is located on a right side of the first controlled device, generate a control instruction for controlling the first controlled device to turn right.
  26. The first controlled device according to any one of claims 19 to 25, wherein the first controlled device and the second controlled device are connected through a wired connection.
  27. A terminal, comprising a processor and a memory configured to store a computer program runnable on the processor; wherein
    the processor is configured to execute the method according to any one of claims 1 to 5 when running the computer program.
  28. An electronic device, comprising a processor and a memory configured to store a computer program runnable on the processor; wherein
    the processor is configured to execute the method according to any one of claims 6 to 13 when running the computer program.
  29. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 5.
  30. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 6 to 13.
  31. A computer program, comprising computer-readable code, wherein when the computer-readable code runs in a terminal, a processor in the terminal executes the method according to any one of claims 1 to 5.
  32. A computer program, comprising computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device executes the method according to any one of claims 6 to 13.
PCT/CN2020/091915 2019-06-28 2020-05-22 Device control method, terminal, controlled device, electronic device, medium and program WO2020259154A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910578882.2A CN110297472A (zh) 2019-06-28 2019-06-28 Device control method, terminal, controlled device, electronic device and storage medium
CN201910578882.2 2019-06-28

Publications (1)

Publication Number Publication Date
WO2020259154A1 true WO2020259154A1 (zh) 2020-12-30

Family

ID=68029509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/091915 WO2020259154A1 (zh) 2019-06-28 2020-12-30 Device control method, terminal, controlled device, electronic device, medium and program

Country Status (3)

Country Link
CN (1) CN110297472A (zh)
TW (1) TWI743853B (zh)
WO (1) WO2020259154A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112860522A (zh) * 2021-03-02 2021-05-28 北京梧桐车联科技有限责任公司 Program running monitoring method, apparatus and device
CN114070659A (zh) * 2021-10-29 2022-02-18 深圳市优必选科技股份有限公司 Device locking method, apparatus and terminal device
CN114070659B (zh) * 2021-10-29 2023-11-17 深圳市优必选科技股份有限公司 Device locking method, apparatus and terminal device
CN115118699A (zh) * 2022-06-21 2022-09-27 国仪量子(合肥)技术有限公司 Data transmission method, apparatus, system, host computer and storage medium

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN110297472A (zh) * 2019-06-28 2019-10-01 上海商汤智能科技有限公司 设备控制方法、终端、受控设备、电子设备和存储介质
CN111262912B (zh) * 2020-01-09 2021-06-29 北京邮电大学 一种控制车辆运动的***、方法及装置
CN112001827A (zh) * 2020-09-25 2020-11-27 上海商汤临港智能科技有限公司 教具控制方法及装置、教学设备、存储介质
CN113903218B (zh) * 2021-10-21 2024-03-15 优必选(湖北)科技有限公司 一种教育编程方法、电子设备及计算机可读存储介质
CN115442450A (zh) * 2022-08-24 2022-12-06 山东浪潮科学研究院有限公司 可编程人工智能小车的云化共享方法及存储介质
CN115827516A (zh) * 2023-02-03 2023-03-21 北京融合未来技术有限公司 设备控制方法和装置、数据采集***、设备和介质

Citations (8)

Publication number Priority date Publication date Assignee Title
WO2004114032A1 * 2003-06-25 2004-12-29 Avecontrol Oy Signal processing method and signal processing system
US20090030546A1 * 2007-07-24 2009-01-29 Wah Hong Industrial Corp. Apparatus and method for positioning control
CN102981758A (zh) * 2012-11-05 2013-03-20 福州瑞芯微电子有限公司 Connection method between electronic devices
CN104808600A (zh) * 2014-01-26 2015-07-29 广东美的制冷设备有限公司 Adaptive control method and system for multiple control modes of a controlled terminal, and control terminal
CN204965043U (zh) * 2015-09-23 2016-01-13 苏州工业园区宅艺智能科技有限公司 Smart home control system based on a cloud platform
CN105871670A (zh) * 2016-05-20 2016-08-17 珠海格力电器股份有限公司 Control method, apparatus and system for terminal device
CN108958103A (zh) * 2018-06-25 2018-12-07 珠海格力电器股份有限公司 Control method, controlled method, apparatus, intelligent terminal and intelligent electric appliance
CN110297472A (zh) * 2019-06-28 2019-10-01 上海商汤智能科技有限公司 Device control method, terminal, controlled device, electronic device and storage medium

Family Cites Families (17)

Publication number Priority date Publication date Assignee Title
US6853867B1 * 1998-12-30 2005-02-08 Schneider Automation Inc. Interface to a programmable logic controller
US7950010B2 * 2005-01-21 2011-05-24 Sap Ag Software deployment system
US20060168575A1 * 2005-01-21 2006-07-27 Ankur Bhatt Defining a software deployment
US8914783B2 * 2008-11-25 2014-12-16 Fisher-Rosemount Systems, Inc. Software deployment manager integration within a process control system
JP5240141B2 (ja) * 2009-09-14 2013-07-17 株式会社リコー Program download system, program download method, image forming apparatus, program distribution server, and download program
TW201233096A (en) * 2011-01-31 2012-08-01 Fuither Tech Co Ltd Remote assistance service method for embedded operation system
CN102520665A (zh) * 2011-12-23 2012-06-27 中国科学院自动化研究所 Open robot teaching device and robot control system
CN104175308A (zh) * 2014-08-12 2014-12-03 湖南信息职业技术学院 Autonomous service robot
CN104932298A (zh) * 2015-04-28 2015-09-23 中国地质大学(武汉) Teaching robot controller
CN105045153A (zh) * 2015-07-31 2015-11-11 中国地质大学(武汉) Three-mode control system based on a mobile robot platform
CN106651384A (zh) * 2015-10-30 2017-05-10 阿里巴巴集团控股有限公司 Sample quality detection method, and detection data entry method, apparatus and system
CN105488815B (zh) * 2015-11-26 2018-04-06 北京航空航天大学 Real-time object tracking method supporting target size changes
CN105760824B (zh) * 2016-02-02 2019-02-01 北京进化者机器人科技有限公司 Moving human body tracking method and system
CN105760106B (zh) * 2016-03-08 2019-01-15 网易(杭州)网络有限公司 Smart home device interaction method and apparatus
US20170351226A1 * 2016-06-01 2017-12-07 Rockwell Automation Technologies, Inc. Industrial machine diagnosis and maintenance using a cloud platform
CN106737676B (zh) * 2016-12-28 2019-03-15 南京埃斯顿机器人工程有限公司 Script-based industrial robot programming system supporting secondary development
CN109166404A (zh) * 2018-10-12 2019-01-08 山东爱泊客智能科技有限公司 Method and apparatus for implementing self-programming control based on a shared controllable model

Also Published As

Publication number Publication date
CN110297472A (zh) 2019-10-01
TWI743853B (zh) 2021-10-21
TW202122944A (zh) 2021-06-16

Similar Documents

Publication Publication Date Title
WO2020259154A1 (zh) Device control method, terminal, controlled device, electronic device, medium and program
JP6126738B2 (ja) Hand gesture control method, apparatus and system
CN111283680B (zh) System and method for wireless remote control of a robot
US10591999B2 Hand gesture recognition method, device, system, and computer storage medium
WO2022127829A1 (zh) Self-moving robot, and path planning method, apparatus, device and storage medium therefor
WO2021027967A1 (zh) Route determination method, travelable device, and storage medium
CN109992111B (zh) Augmented reality extension method and electronic device
WO2022178985A1 (zh) Exception handling method and apparatus for a material-pushing robot, server, and storage medium
WO2019069436A1 (ja) Monitoring device, monitoring system, and monitoring method
US20180300552A1 Differential Tracking for Panoramic Images
JP2018149669A (ja) Learning device and learning method
JP2020026026A (ja) Information processing device, mediation device, simulation system, and information processing method
WO2023115927A1 (zh) Mapping method, system, device and storage medium for a cloud robot
JP2016224665A (ja) Communication terminal, communication system, communication control method, and program
WO2017012499A1 (zh) Unmanned aerial vehicle control method, apparatus and system
CN112346809A (zh) Web page image annotation method and apparatus, electronic device, and storage medium
Barbosa et al. ROS, Android and cloud robotics: How to make a powerful low cost robot
US20170026617A1 Method and apparatus for real-time video interaction by transmitting and displaying user interface corresponding to user input
Velamala et al. Development of ROS-based GUI for control of an autonomous surface vehicle
CN116483357A (zh) Construction method of a ROS-based online robot simulation and training platform
WO2022000757A1 (zh) AR-based robot Internet-of-Things interaction method, apparatus and medium
CN112233208B (zh) Robot state processing method and apparatus, computing device and storage medium
Annable et al. Nubugger: A visual real-time robot debugging system
WO2020067204A1 (ja) Training data creation method, machine learning model generation method, training data creation device, and program
JP2017159429A (ja) Robot control device, information processing device, and robot system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20832004

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20832004

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 13.09.2022)
