CN109408996B - Interaction method, device and system for intelligent equipment control and storage medium


Info

Publication number: CN109408996B
Application number: CN201811309227.9A
Authority: CN (China)
Prior art keywords: user, scene, interface, control, site
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN109408996A
Inventors: 张腾 (Zhang Teng), 罗亮 (Luo Liang)
Assignees (current and original): Beijing Kuangshi Technology Co Ltd; Beijing Kuangshi Robot Technology Co Ltd
Filing: application CN201811309227.9A filed by Beijing Kuangshi Technology Co Ltd and Beijing Kuangshi Robot Technology Co Ltd
Publications: CN109408996A (application), CN109408996B (grant)


Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F 30/00 Computer-aided design [CAD]
            • G06F 30/20 Design optimisation, verification or simulation
        • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
          • G06Q 10/00 Administration; Management
          • G06Q 10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
            • G06Q 10/043 Optimisation of two dimensional placement, e.g. cutting of clothes or wood

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Computer Hardware Design (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention provides an interaction method, device and system for intelligent device control, and a storage medium. The method comprises the following steps: providing a site modeling interface; generating a visualized site scene in response to a first operation of a user on the site modeling interface; providing a business process orchestration interface; and generating, in response to an operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene. This technical scheme greatly reduces the design complexity of the control scheme of the intelligent device, and thereby significantly improves the user experience of the scheme designer.

Description

Interaction method, device and system for intelligent equipment control and storage medium
Technical Field
The present invention relates to the field of artificial intelligence, and more particularly, to an interaction method, apparatus, system, and storage medium for intelligent device control.
Background
With the development of artificial intelligence technology, intelligent devices are increasingly applied in various industries. For example, an intelligent robot can be controlled automatically for use in a factory or a warehouse. However, it is difficult to apply an intelligent device directly to an application scenario.
In existing practice, it is generally necessary to first draw a map with a drawing tool such as CAD, then plan the business process with a tool such as Visio, and finally annotate the information so as to put the intelligent-device solution into operation.
This process is cumbersome, and implementing it across multiple separate tools is very difficult.
Disclosure of Invention
The present invention has been made in view of the above problems, and provides an interaction method, apparatus and system for intelligent device control, and a storage medium.
According to an aspect of the present invention, there is provided an interaction method for intelligent device control, including:
providing a site modeling interface;
generating a visualized site scene in response to a first operation of a user on the site modeling interface;
providing a business process orchestration interface; and
generating, in response to an operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene.
Illustratively, the method further comprises:
providing a simulation setting interface; and
generating, in real time and in response to an operation of the user on the simulation setting interface, dynamic simulation feedback information for the business process.
Illustratively, the generating dynamic simulation feedback information for the business process in real time in response to the operation of the user on the simulation setting interface includes:
determining, in response to the operation of the user on the simulation setting interface, a simulation scene of the intelligent device executing the business process; and
displaying the simulation scene.
Illustratively, the generating dynamic simulation feedback information for the business process in real time in response to the operation of the user on the simulation setting interface includes:
determining, in response to the operation of the user on the simulation setting interface, simulation parameter information of the intelligent device executing the business process; and
displaying the simulation parameter information.
Illustratively, the providing a site modeling interface includes:
providing one or more site controls on a first side of the site modeling interface for selection by the user; and
providing a site editing area on a second side of the site modeling interface.
The generating a visualized site scene in response to the first operation of the user on the site modeling interface then includes:
generating, in response to the user selecting one of the site controls and selecting, in the site editing area, a target position at which to arrange the site control, a site scene in the site editing area in which the site control is located at the corresponding target position.
Illustratively, the providing a site modeling interface further comprises:
displaying a menu bar on a third side of the site modeling interface, wherein the menu bar comprises editing controls for editing the placement poses of the site controls;
and the generating a visualized site scene in response to the first operation of the user on the site modeling interface includes:
editing the selected site control in the site scene accordingly in response to the user clicking an editing control.
Illustratively, the attributes of at least some of the site controls include road connection point information.
Illustratively, the site is a warehouse, and the site controls include a road control, a charging zone control, a station control, and/or a staging zone control.
Illustratively, the staging zone control includes a unidirectional route area for defining the direction of a travel route of the intelligent device.
Illustratively, the interaction method further comprises:
planning a travel route of the intelligent device in the site scene in response to a second operation of the user on the site modeling interface.
Illustratively, the interaction method further comprises:
exporting the site scene in the form of a file in response to a third operation of the user on the site modeling interface.
Illustratively, the interaction method further comprises:
importing a site scene file for editing in response to a third operation of the user on the site modeling interface.
Illustratively, the providing a business process orchestration interface includes:
displaying the site controls in the site scene on a first side of the business process orchestration interface; and
providing a process orchestration area on a second side of the business process orchestration interface.
The generating, in response to the operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene then includes:
generating a corresponding business process in the process orchestration area in response to the user selecting a site control in the site scene displayed on the first side, selecting a target position in the process orchestration area at which to arrange the site control, and inputting a job type.
Illustratively, the intelligent device is used for handling articles.
According to another aspect of the present invention, there is also provided an interaction apparatus for intelligent device control, including:
a first providing module for providing a site modeling interface;
a scene module for generating a visualized site scene in response to a first operation of a user on the site modeling interface;
a second providing module for providing a business process orchestration interface; and
a flow module for generating, in response to an operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene.
According to still another aspect of the present invention, there is also provided an interaction system for intelligent device control, including a processor and a memory, wherein the memory stores computer program instructions which, when executed by the processor, perform the interaction method for intelligent device control described above.
According to a further aspect of the present invention, there is also provided a storage medium on which program instructions are stored, the program instructions, when executed, being used to carry out the above interaction method for intelligent device control.
According to the interaction method, apparatus and system for intelligent device control and the storage medium, the design complexity of the control scheme of the intelligent device is greatly reduced, and the user experience of the scheme designer is thereby significantly improved.
The foregoing is merely an overview of the technical solutions of the present invention. In order that the technical means of the present invention may be more clearly understood and implemented in accordance with the contents of the specification, and in order that the above and other objects, features and advantages of the present invention may be more readily apparent, embodiments of the invention are described in detail below.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following more detailed description of embodiments of the present invention with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and constitute a part of this specification; together with the embodiments, they serve to explain the invention and do not limit it. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 shows a schematic block diagram of an example electronic device for implementing interaction methods and apparatus for smart device control in accordance with embodiments of the invention;
FIG. 2 shows a schematic flow chart of an interaction method for smart device control according to one embodiment of the invention;
FIG. 3 shows a schematic diagram of a site modeling interface, according to one embodiment of the invention;
FIG. 4 shows a schematic diagram of a business process orchestration interface according to one embodiment of the present invention;
FIG. 5 shows a schematic flow chart of an interaction method for smart device control according to another embodiment of the invention;
FIG. 6 shows a schematic diagram of a simulation setup interface in accordance with one embodiment of the invention;
FIG. 7 shows a schematic diagram of a simulation setup interface in accordance with another embodiment of the present invention;
FIG. 8 shows a schematic block diagram of an interaction apparatus for smart device control in accordance with one embodiment of the invention; and
FIG. 9 shows a schematic block diagram of an interactive system for smart device control, in accordance with one embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It is evident that the described embodiments are only some, and not all, of the embodiments of the present invention, and it should be understood that the present invention is not limited by the exemplary embodiments described herein. Based on the embodiments of the invention described in the present application, all other embodiments obtained by a person skilled in the art without inventive effort shall fall within the protection scope of the present invention.
First, an example electronic device 100 for implementing the interaction method and apparatus for smart device control according to an embodiment of the present invention is described with reference to fig. 1.
As shown in fig. 1, the electronic device 100 includes one or more processors 102 and one or more storage devices 104. Optionally, the electronic device 100 may also include an input device 106 and an output device 108. These components are interconnected by a bus system 110 and/or other forms of connection mechanisms (not shown). It should be noted that the components and structure of the electronic device 100 shown in fig. 1 are exemplary only and not limiting; the electronic device may have other components and structures as desired.
The processor 102 may be a central processing unit (CPU), a graphics processing unit (GPU) or another form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 100 to perform desired functions.
The storage device 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, and flash memory. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 102 to implement the client functions and/or other desired functions in the embodiments of the present invention described below. Various applications and various data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input and may include one or more of a keyboard, mouse, microphone, touch screen, and the like.
The output device 108 may output various information (e.g., images and/or sounds) to the outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
By way of example, an electronic device for implementing the interaction method and apparatus for intelligent device control according to embodiments of the present invention may be implemented on a device such as a personal computer or a remote server.
The interaction method, apparatus, system and storage medium for intelligent device control disclosed in the embodiments of the present application can be applied to various intelligent devices, such as intelligent robots and autonomous vehicles. The technical scheme of the present application helps an intelligent device put its services into operation in a specific application scenario. In the prior art, this process is quite cumbersome: not only must multiple tools be used, but the scene map and the business flow chart must also be checked against each other repeatedly. This consumes much of the designer's time and effort.
In order to solve the problems, the application provides an interaction method for intelligent device control. An interaction method 200 for smart device control according to an embodiment of the present application will be described below with reference to fig. 2. As shown in fig. 2, the interaction method 200 includes the following steps.
Step S210, providing a site modeling interface.
Step S220, generating a visualized site scene in response to a first operation of the user on the site modeling interface.
The site represents the location where the intelligent device performs its target task. For example, the site may be a supermarket, a factory building, or a warehouse. The site modeling interface is a human-computer interaction interface with which the user models the site. Through interaction with the user, a physical map of the site, i.e., a site scene, may be generated using the site modeling interface; the site scene may also be referred to as a site map or model map.
The intelligent device in the present application refers to a device capable of moving automatically. In some embodiments, the intelligent device may be a device for handling articles (e.g., goods or merchandise). For example, the intelligent device may include a mechanism for placing articles (e.g., a tray or a rack) and move the articles placed on that tray or rack to a destination. For convenience of description, the following embodiments take an intelligent robot as an example.
A site editing area may be included in the site modeling interface. Through the user's operations on the site modeling interface, a site scene corresponding to the first operation of the user may be generated in the site editing area. The site scene is visualized: what you see is what you get. The site scene may include a plurality of specific locations at which the intelligent device completes specific jobs. For example, a supermarket may include a checkout counter; the checkout counter may act as the end point of a travel route of the intelligent robot, which conveys merchandise to the checkout counter for checkout. As another example, a warehouse may include a station; the station may be the end point of one travel route of an intelligent robot, which delivers goods to the station so that the next process can be performed. In addition, for a warehouse, the site scene may include a charging zone, and an intelligent robot with insufficient battery can move to the charging zone to charge.
Step S230, providing a business process orchestration interface.
Step S240, generating, in response to an operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene.
The intelligent device accomplishes its target task by executing a series of business processes in the site. The business process is customized according to the specific application scenario. For example, in a warehouse, intelligent robots may be required to carry cargo or shelves. A business process orchestration interface is provided at step S230. The business process orchestration interface is also a human-computer interaction interface, with which the user orchestrates the business process according to the target task. Through interaction with the user, a well-defined business process may be orchestrated using the business process orchestration interface.
The business process is used to control the intelligent device to execute corresponding jobs, in order, in the site scene. The business process may include the locations at which the intelligent device executes jobs and the types of the jobs executed. It will be appreciated that the locations involved are locations in the site scene described above. Taking a supermarket as an example, the end point of one travel route of the intelligent device is a checkout counter. Taking a warehouse as an example, the end point of a travel route of the intelligent device may be a shelf, a station, or a charging zone.
It will be appreciated that a switch control, for example a button, may be included in the site modeling interface for switching to the business process orchestration interface. The user may click the button after completing site modeling, or at any other desired time. In response to the user's click on the button, the business process orchestration interface is provided. Similarly, the business process orchestration interface may also include a switch control for switching back to the site modeling interface, so that the user can return to the site modeling interface at any time by operating that control.
In this technical scheme, a site modeling interface and a business process orchestration interface are provided, in sequence, in an intelligent-device open platform, so that a designer can conveniently design both the site scene and the business process for the intelligent device. The intelligent-device open platform is thus an integrated platform: through it, the business process design for the intelligent device can be completed conveniently, so that the intelligent device is controlled to complete its target task in the site scene according to the expected business process. This saves the designer time and effort, and significantly improves the designer's user experience.
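As a concrete (and purely illustrative) picture of the data that the two interfaces manipulate, the following TypeScript sketch models a site scene and a business process. All names here (SiteControlKind, SiteScene, ProcessNode, and so on) are hypothetical assumptions for illustration; the patent does not specify a data model or an API.

```typescript
// Hypothetical data model for the two interfaces (illustrative only).

// Kinds of site controls that can be placed in the site editing area.
type SiteControlKind =
  | "road" | "chargingZone" | "station" | "stagingZone"
  | "obstacleZone" | "robot" | "door" | "shelf";

interface GridPos { x: number; y: number }

// A site control placed at a target position in the site scene.
interface PlacedControl {
  id: string;                        // unique ID, reused by the orchestration interface
  kind: SiteControlKind;
  pos: GridPos;
  rotationDeg?: 0 | 90 | 180 | 270;  // editable via the menu-bar editing controls
  roadConnectionPoint?: GridPos;     // for stations, charging zones, staging zones
}

// The visualized site scene: a physical map of the site.
interface SiteScene {
  widthCells: number;
  heightCells: number;
  controls: PlacedControl[];
}

// One node of a business process: where the robot goes and what job it does there.
interface ProcessNode {
  controlId: string;                            // references a PlacedControl.id
  jobType: "load" | "unload" | "pick" | "none";
}

// A business process is an ordered chain of nodes executed in the site scene.
interface BusinessProcess { name: string; nodes: ProcessNode[] }
```

Later sketches in this description reuse these types.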
Alternatively, the above method may be implemented online using a browser; the design may be supported on the basis of Software as a Service (SaaS) technology. Alternatively, the above method may be implemented on a local device using application software.
Optionally, the providing a site modeling interface in step S210 includes: providing one or more site controls on a first side of the site modeling interface for selection by the user; and providing a site editing area on a second side of the site modeling interface. Step S220, generating a visualized site scene in response to a first operation of the user on the site modeling interface, then includes: generating, in response to the user selecting one of the site controls and selecting a target position in the site editing area at which to arrange it, a site scene in the site editing area in which the site control is located at the corresponding target position. It should be appreciated that the first side and the second side of the site modeling interface may be any two non-overlapping areas of the interface, laid out according to user habits and/or interface aesthetics.
In this technical scheme, the site modeling interface provides site controls and a site editing area appropriate to the application scenario, and the user can generate the desired site scene through simple selection operations. This further simplifies the user's operations, saving effort and time.
FIG. 3 shows a schematic diagram of a site modeling interface according to one embodiment of the invention. As shown in fig. 3, a control section is provided on the right side of the site modeling interface. The control section includes a plurality of site controls for the user to select and place in the site scene. It is to be appreciated that the site controls can include not only controls for locations in the scene, e.g., a staging zone control or an obstacle zone control, but also controls for any device or other item in the scene, e.g., an intelligent robot control. A site editing area is provided on the left side of the site modeling interface, in which a warehouse scene is being edited.
The site modeling interface shown in fig. 3 is for a warehouse. The site controls include a road control, a charging zone control, a station control, a staging zone control, an obstacle zone control, an intelligent robot control, a door control, and a shelf control.
In this embodiment, the intelligent device is an intelligent robot, so an intelligent robot control is included in the site modeling interface. In the site scene, as the names imply, the position of a road control is a road on which the intelligent robot travels, a door exists at the position of a door control, and a shelf exists at the position of a shelf control. The position of an obstacle zone control is an area where the intelligent robot cannot travel. A station control is located where an actual job is performed in the warehouse, for example warehouse entry, picking, or warehouse exit. The job may be performed by the intelligent robot, or it may be a manual job; in the latter case the intelligent robot may travel to and wait at the station, and leave it after the task is completed manually, for example after a pallet carried on the intelligent robot has been loaded. The position of the staging zone control is the place where goods are actually transferred; it can be an area shared by the shelves and the intelligent robots, because an intelligent robot can stop underneath a shelf. Optionally, the staging zone control includes a unidirectional route area for defining the direction of a travel route of the intelligent device. In the site modeling interface shown in fig. 3, the shaded portion of the staging zone control is the unidirectional route area, marked with a route direction as indicated by the arrow. The staging zone is an area frequently visited by intelligent robots; because the staging zone control contains a unidirectional route area, collisions between several intelligent robots travelling at the same time are effectively avoided, ensuring that the robots complete their target tasks successfully. Finally, the intelligent robot can charge at the position of the charging zone control: when its battery level falls below a preset threshold, the intelligent robot can automatically travel to the position of the charging zone control to charge, and after charging is completed it can return to the business process and continue its job.
Optionally, the attributes of at least some of the site controls, such as the charging zone control, the station control, and the staging zone control described above, include road connection point information. In the site scene, such a control connects to the position of a road control in only one direction; that is, an intelligent robot travelling to the location of such a control can only enter at the point connected to the road. In the site scene shown in fig. 3, the road connection point of a station control is represented by a circle connecting it with the road.
Because the road connection points of the site controls are clearly marked, the intelligent robot is prevented from travelling along an incorrect route, and is ensured to complete its target task successfully. This, too, improves the designer's user experience.
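To make the role of the road connection point concrete, the following sketch (a hypothetical helper reusing the types above, not the patent's implementation) checks that a control carrying a connection point actually touches a road control at that point.

```typescript
// Hypothetical validation: a control's road connection point must coincide
// with, or be 4-adjacent to, a cell occupied by a road control.
function isConnectedToRoad(scene: SiteScene, control: PlacedControl): boolean {
  const cp = control.roadConnectionPoint;
  if (!cp) return true; // controls without a connection point need no check
  return scene.controls.some(
    (c) =>
      c.kind === "road" &&
      Math.abs(c.pos.x - cp.x) + Math.abs(c.pos.y - cp.y) <= 1
  );
}
```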
In the site editing area on the left side of fig. 3, the warehouse being edited is shown. The warehouse comprises two charging zones, two stations, a staging zone, three obstacle zones, and a section of road.
The user may select a site control in the control section of the site modeling interface and select a target position for placing it in the site editing area. In response to the user selecting the site control and the target position, a site scene is generated in the site editing area in which the site control is located at the corresponding target position.
For example, in response to the user directly dragging a site control from the control section to a target position in the site editing area, the site control is rendered at the target position, so that the corresponding element exists at that location of the warehouse. As another example, the user may select a site control in the control section and a target position in the site editing area separately, for example by click operations; in response to the user's clicks, a physical map of the warehouse is generated in which the element exists at the target position in the site editing area.
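The placement interaction just described can be pictured as a small handler that receives the selected control kind and target cell, however they were chosen (drag or two clicks). Again a hypothetical sketch reusing the earlier types; placeControl and its ID scheme are assumptions, not the platform's API.

```typescript
// Hypothetical handler: the user selected a site control in the control
// section and a target position in the site editing area.
function placeControl(
  scene: SiteScene,
  kind: SiteControlKind,
  target: GridPos
): SiteScene {
  // Reject placement outside the editing area.
  if (target.x < 0 || target.y < 0 ||
      target.x >= scene.widthCells || target.y >= scene.heightCells) {
    throw new Error("target position is outside the site editing area");
  }
  const placed: PlacedControl = {
    id: `${kind}-${scene.controls.length + 1}`, // simple unique ID
    kind,
    pos: target,
  };
  // Return a new scene in which the control exists at the target position.
  return { ...scene, controls: [...scene.controls, placed] };
}
```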
In this embodiment, a warehouse scene is generated. In terms of job requirements, a warehouse is a relatively simple scenario. The technical scheme not only makes it possible for intelligent devices to replace humans in completing various jobs in the warehouse, but also provides convenience for the designers of such solutions.
Optionally, the providing a site modeling interface in step S210 further includes: displaying a menu bar on a third side of the site modeling interface, the menu bar comprising editing controls for editing the placement poses of the site controls. Step S220, generating a visualized site scene in response to a first operation of the user on the site modeling interface, then includes: editing the selected site control in the site scene accordingly in response to the user clicking an editing control.
Again taking fig. 3 as an example, a menu bar is displayed on the upper side of the site modeling interface shown there. The menu bar includes various editing controls for editing the poses of the site controls; these editing controls can implement editing operations such as flipping and rotating a site control. The user may first select the site control to be edited in the warehouse being edited, and then click an editing control, for example a horizontal-flip control, to apply the corresponding editing operation to it.
Displaying a menu bar of common operations in the site modeling interface makes editing convenient for the user: various editing operations can be performed on the site controls easily, which improves the user's working efficiency.
Optionally, the interaction method further includes: planning a travel route of the intelligent device in the site scene in response to a second operation of the user on the site modeling interface. Taking the site modeling interface shown in fig. 3 as an example, a route-planning button is included in the upper menu bar. In response to the user clicking this button, and to the start point and end point the user selects in the warehouse scene being edited, a travel route of the intelligent robot may be planned accordingly.
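The patent does not prescribe a planning algorithm; one plausible realization of the route-planning operation is an ordinary breadth-first search over the grid of the site scene, treating obstacle-zone cells as impassable. The sketch below makes that assumption and reuses the earlier hypothetical types.

```typescript
// Hypothetical route planner: shortest path from start to end over the grid,
// avoiding obstacle-zone cells (breadth-first search).
function planRoute(scene: SiteScene, start: GridPos, end: GridPos): GridPos[] | null {
  const key = (p: GridPos) => `${p.x},${p.y}`;
  const blocked = new Set(
    scene.controls.filter((c) => c.kind === "obstacleZone").map((c) => key(c.pos))
  );
  const prev = new Map<string, GridPos | null>([[key(start), null]]);
  const queue: GridPos[] = [start];
  while (queue.length > 0) {
    const cur = queue.shift()!;
    if (cur.x === end.x && cur.y === end.y) {
      // Reconstruct the route by walking the predecessor links backwards.
      const path: GridPos[] = [];
      let p: GridPos | null = cur;
      while (p) {
        path.push(p);
        p = prev.get(key(p)) ?? null;
      }
      return path.reverse();
    }
    for (const [dx, dy] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
      const next = { x: cur.x + dx, y: cur.y + dy };
      if (next.x < 0 || next.y < 0 ||
          next.x >= scene.widthCells || next.y >= scene.heightCells) continue;
      if (blocked.has(key(next)) || prev.has(key(next))) continue;
      prev.set(key(next), cur);
      queue.push(next);
    }
  }
  return null; // no route exists between the selected start and end
}
```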
In many scenarios, the intelligent device needs to move to different locations to complete different jobs. Providing a unified interface for planning the travel routes of intelligent devices makes it convenient for the user to manage them. On the one hand, accidents such as collisions between intelligent devices in the site can be avoided. On the other hand, detours by the intelligent device can be avoided, saving its power and time and improving its working efficiency.
Optionally, the interaction method also supports import and export of files. The interaction method may comprise at least one of the following steps: exporting the site scene in the form of a file in response to a third operation of the user on the site modeling interface; and importing a site scene file for editing in response to a third operation of the user on the site modeling interface.
The file export operation may export the user-edited site scene directly to another computing device, for example the control system of the intelligent device. It will be appreciated that the site scene may be exported in the form of a map file, so that the control system of the intelligent device can control the device to complete its target task according to the site scene.
The file import operation may import a site scene file directly from another computing device; the site scene file may be a map file of a site scene. For example, if the site changes, the existing site scene may be imported from the control system of the intelligent device and then edited to generate a new site scene.
The interaction method can also support cloud management of site scene files. For example, a site scene file may first be imported locally and edited, and the edited site scene finally uploaded to the cloud in the form of a map file.
The import, export and cloud management operations facilitate the migration, editing and management of site scenes, and provide great convenience for the user.
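As a concrete illustration of what export and import might look like, the sketch below serializes the site scene to JSON and back. The map-file format is an assumption made here for illustration; the patent only requires that the scene be exported and imported as a file.

```typescript
// Hypothetical map-file format: the site scene serialized as JSON.
function exportScene(scene: SiteScene): string {
  return JSON.stringify(scene, null, 2); // export as a map file
}

function importScene(fileContents: string): SiteScene {
  const scene = JSON.parse(fileContents) as SiteScene;
  // Minimal sanity check before handing the scene to the editor.
  if (!Array.isArray(scene.controls)) {
    throw new Error("not a valid site scene file");
  }
  return scene;
}

// Cloud management could then be an upload of the exported file, e.g.
// (hypothetical endpoint):
// await fetch("https://example.invalid/scenes", { method: "POST", body: exportScene(scene) });
```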
Through steps S210 and S220 above, a visualized site scene is generated. Optionally, the providing a business process orchestration interface in step S230 includes: displaying the site controls in the generated site scene on a first side of the business process orchestration interface; and providing a process orchestration area on a second side of the business process orchestration interface. The business process orchestration interface may include a control section whose controls are exactly the site controls in the site scene generated through steps S210 and S220. To distinguish these site controls, each site control in the generated site scene may be given a unique identifier (ID), for example by modifying its attributes; each site control in the site scene is then included in the control section of the business process orchestration interface and can be distinguished by its ID. Step S240, generating, in response to the operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene, then includes: generating a corresponding business process in the process orchestration area in response to the user selecting a site control in the site scene displayed on the first side, selecting a target position in the process orchestration area at which to arrange it, and inputting a job type.
In this technical scheme, the business process orchestration interface provides, for the application scenario, the site controls of the site scene generated by the site modeling interface together with a process orchestration area, and the user can generate the desired business process through simple selection operations. This further simplifies the user's operations, saving effort and time.
FIG. 4 shows a schematic diagram of a business process orchestration interface according to one embodiment of the present invention. As shown in fig. 4, a control section is provided on the right side of the business process orchestration interface. The control section includes the site controls of the site scene in the site modeling interface shown in fig. 3. For example, "buffer B" represents the staging zone in the site scene of fig. 3, and "workstation A" and "workstation C" each represent one station in that scene. A process orchestration area is provided on the left side of the business process orchestration interface. The user can first create a business process via "new process" in the toolbar on the upper side of the interface. Then, for each job, the user can select a site control in the control section and drag it to the target position in the process orchestration area, connecting it with the related site controls. Finally, a chain of site controls is generated that describes, from one perspective, the travel path of the intelligent robot. Each site control in the chain may act as a node of the overall business process.
The intelligent robot completes different jobs at different positions, so different job types can be specified for different site controls. Still referring to fig. 4, the business process shown there describes the following procedure. The intelligent robot starts from buffer B and takes an empty shelf from it. Loading is carried out at workstation A; the loading job may be completed automatically by the intelligent robot or with manual assistance. The intelligent robot then delivers the goods to workstation C, where it completes the unloading job; likewise, unloading may be automatic or manually assisted. After the unloading job is completed, the intelligent robot returns the empty shelf to buffer B. Thus, in response to the user selecting site controls and the target positions at which to place them, and in response to the user inputting job types, a complete business process is generated in the process orchestration area.
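Orchestration as described here amounts to appending nodes (a location plus a job type) to a chain. The hypothetical sketch below reuses the earlier types and rebuilds the fig. 4 example; the control IDs ("bufferB", "workstationA", "workstationC") and job-type names are illustrative assumptions.

```typescript
// Hypothetical orchestration: each user selection appends one node to the chain.
function appendNode(
  process: BusinessProcess,
  controlId: string,
  jobType: ProcessNode["jobType"]
): BusinessProcess {
  return { ...process, nodes: [...process.nodes, { controlId, jobType }] };
}

// Rebuilding the fig. 4 example: take a shelf at buffer B, load at
// workstation A, unload at workstation C, return the shelf to buffer B.
let p: BusinessProcess = { name: "shelf transport", nodes: [] };
p = appendNode(p, "bufferB", "pick");
p = appendNode(p, "workstationA", "load");
p = appendNode(p, "workstationC", "unload");
p = appendNode(p, "bufferB", "none");
```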
FIG. 5 illustrates an interaction method 500 for intelligent device control according to another embodiment of the invention. Steps S510-S540 in the interaction method 500 are similar to steps S210-S240, respectively, in the interaction method 200 described above; for brevity, they are not described again here. The interaction method 500 further includes step S550 and step S560. In step S550, a simulation setting interface is provided. The number of intelligent robots and various other simulation parameters can be configured on the simulation setting interface, such as the number of shelves, station robot capacity, station productivity, and the share distribution of the business processes, where station robot capacity may denote the number of robots configured for a station, and the share distribution of the business processes may denote the proportion of each business process among all business processes of the site. In step S560, in response to the operation of the user on the simulation setting interface, dynamic simulation feedback information is generated in real time for the business process executed by the intelligent device; the simulation feedback information may be displayed in the simulation setting interface. In this embodiment, the intelligent-device open platform also provides a simulation setting interface, which can run an actual simulation based on the physical map of the site scene and the business process, making it possible for the user to evaluate the design according to the simulation results. The user can thus check at any time how the business process executes in the site scene and, depending on whether the execution meets expectations, return to step S510 or step S530 and modify the current site scene or business process until the expected effect is reached. In this way the design effect of the control scheme can be displayed visually, and the desired control scheme obtained. This not only makes it easy for the user to modify an existing design, but also ensures that the site scene and business process the user finally obtains are satisfactory.
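The simulation parameters enumerated above map naturally onto a configuration object. The following sketch shows one plausible shape for such a configuration; the field names and example values are assumptions, not the platform's actual schema.

```typescript
// Hypothetical simulation configuration for step S550.
interface SimulationConfig {
  robotCount: number;                         // number of intelligent robots
  shelfCount: number;                         // number of shelves in the scene
  stationRobotCapacity: Map<string, number>;  // robots configured per station
  stationThroughput: Map<string, number>;     // station productivity (jobs/hour)
  processShare: Map<string, number>;          // share per business process, sums to 1
}

const config: SimulationConfig = {
  robotCount: 8,
  shelfCount: 40,
  stationRobotCapacity: new Map([["workstationA", 2], ["workstationC", 2]]),
  stationThroughput: new Map([["workstationA", 60], ["workstationC", 45]]),
  processShare: new Map([["shelf transport", 1.0]]),
};
```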
Optionally, step S560, generating dynamic simulation feedback information in real time for the business process executed by the intelligent device in response to the operation of the user on the simulation setting interface, includes: determining, in response to the operation of the user on the simulation setting interface, a simulation scene of the intelligent device executing the business process; and displaying the simulation scene. FIG. 6 illustrates a schematic diagram of a simulation setting interface according to one embodiment of the present invention. The simulation setting interface includes a control for starting the simulation, for example a button in the upper toolbar. In response to the user clicking the button, a dynamic image may be displayed that shows the intelligent robot executing the business process shown in fig. 4 in the site scene shown in fig. 3. In this embodiment, the dynamic image can also display, in real time, the movement of the shelves in the site scene of fig. 3. A control for stopping the simulation, for example another button in the upper toolbar, can also be included in the simulation setting interface, so that the user can terminate the simulation at any time. It is to be appreciated that the start and stop controls may be implemented as the same control, which performs different functions depending on the state of the simulation. This simulation presents results from the scene perspective: the user can see visually how the intelligent device executes its target task, which better meets the user's needs.
Optionally, step S560, generating dynamic simulation feedback information in real time for the business process executed by the intelligent device in response to the operation of the user on the simulation setting interface, includes: determining, in response to the operation of the user on the simulation setting interface, simulation parameter information of the intelligent device executing the business process; and displaying the simulation parameter information. FIG. 7 shows a schematic diagram of a simulation setting interface according to another embodiment of the invention. Similarly, a control for starting or stopping the simulation, for example a button, is included in the upper toolbar of the simulation setting interface. In response to the user clicking the button to start the simulation, the current execution status of the business process by the intelligent device is displayed. The execution status may be represented by statistics of interest to the user; as shown in fig. 7, in this embodiment mainly certain job data of workstation A and workstation C are determined. This simulation presents results from the business-process perspective: the user can understand quantitatively how the intelligent device executes its target task, meeting the user's needs from another angle.
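For the business-process view, the feedback can be imagined as a tick-based loop that counts completed jobs per station. The sketch below is a deliberately crude timing model (one node per robot per tick, ignoring capacity and throughput), intended only to show where statistics such as the job data of workstation A and workstation C could come from; it is not the patent's simulator.

```typescript
// Hypothetical tick-based simulation producing per-station job statistics.
function simulate(
  process: BusinessProcess,
  config: SimulationConfig,
  ticks: number
): Map<string, number> {
  const completed = new Map<string, number>();
  if (process.nodes.length === 0) return completed;
  for (let t = 0; t < ticks; t++) {
    // Each robot advances one node per tick; robots are staggered by index.
    for (let r = 0; r < config.robotCount; r++) {
      const node = process.nodes[(t + r) % process.nodes.length];
      if (node.jobType !== "none") {
        completed.set(node.controlId, (completed.get(node.controlId) ?? 0) + 1);
      }
    }
  }
  return completed; // e.g. jobs completed at workstation A and workstation C
}
```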
Optionally, a switch control, for example a button, may also be included in the simulation setting interface, for example in the upper toolbar, for returning to step S510 or step S530. In response to the user clicking the button, the method may return to step S510 to re-edit the existing site scene, or to step S530 to re-orchestrate the existing business process. Referring back to figs. 6 and 7, the control "edit engineering drawing" is a button for returning to step S510, and the control "orchestration flow" is a button for returning to step S530. Optionally, a switch control for switching to the simulation setting interface may be included in the site modeling interface and/or the business process orchestration interface; in response to the user operating that control, the method may proceed to step S550 to perform a simulation. In this way the user's need to simulate the current design at any time can be met.
According to another aspect of the invention, an interaction apparatus for intelligent device control is also provided. Fig. 8 shows a schematic block diagram of an interaction apparatus 800 for intelligent device control according to an embodiment of the invention.
As shown in fig. 8, the interaction apparatus 800 includes a first providing module 810, a scene module 820, a second providing module 830, and a flow module 840. These modules may respectively perform the steps/functions of the interaction method for intelligent device control described above. Optionally, the intelligent device is used for handling articles. Only the main functions of the components of the interaction apparatus 800 are described below; details already described above are omitted.
The first providing module 810 is configured to provide a site modeling interface. The scene module 820 is configured to generate a visualized site scene in response to a first operation of a user on the site modeling interface. The second providing module 830 is configured to provide a business process orchestration interface. The flow module 840 is configured to generate, in response to an operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene.
Optionally, the first providing module 810 includes a control providing unit and a site editing area providing unit. The control providing unit is configured to provide one or more site controls on a first side of the site modeling interface for selection by the user. The site editing area providing unit is configured to provide a site editing area on a second side of the site modeling interface. The scene module 820 is specifically configured to generate, in response to the user selecting one of the site controls and selecting a target position in the site editing area at which to arrange it, a site scene in the site editing area in which the site control is located at the corresponding target position.
Optionally, the first providing module 810 further includes a menu providing unit configured to display a menu bar on a third side of the site modeling interface, the menu bar comprising editing controls for editing the placement poses of the site controls. The scene module 820 includes a scene editing unit configured to edit the selected site control in the site scene accordingly in response to the user clicking an editing control.
Optionally, the attributes of at least some of the site controls include road connection point information.
Optionally, the site is a warehouse, and the site controls include a road control, a charging zone control, a station control and/or a staging zone control.
Optionally, the staging zone control includes a unidirectional route area for defining the direction of a travel route of the intelligent device.
Optionally, the second providing module 830 includes a control providing unit and a process orchestration area providing unit. The control providing unit is configured to display the site controls in the site scene on a first side of the business process orchestration interface. The process orchestration area providing unit is configured to provide a process orchestration area on a second side of the business process orchestration interface. The flow module 840 is specifically configured to generate a corresponding business process in the process orchestration area in response to the user selecting a site control in the site scene displayed on the first side, selecting a target position in the process orchestration area at which to arrange it, and inputting a job type.
Optionally, the interaction apparatus 800 further comprises a third providing module 850 and a simulation module 860. The third providing module 850 is configured to provide a simulation setting interface. The simulation module 860 is configured to generate, in real time and in response to the operation of the user on the simulation setting interface, dynamic simulation feedback information for the business process.
Optionally, the simulation module 860 is specifically configured to determine, in response to the operation of the user on the simulation setting interface, a simulation scene of the intelligent device executing the business process, and to display the simulation scene.
Optionally, the simulation module 860 is specifically configured to determine, in response to the operation of the user on the simulation setting interface, simulation parameter information of the intelligent device executing the business process, and to display the simulation parameter information.
Optionally, the interaction apparatus 800 further comprises a route planning module configured to plan a travel route of the intelligent device in the site scene in response to a second operation of the user on the site modeling interface.
Optionally, the interaction apparatus 800 further includes a file export module configured to export the site scene in the form of a file in response to a third operation of the user on the site modeling interface.
Optionally, the interaction apparatus 800 further includes a file import module configured to import a site scene file for editing in response to a third operation of the user on the site modeling interface.
FIG. 9 shows a schematic block diagram of an interaction system 900 for intelligent device control according to one embodiment of the present invention. As shown in fig. 9, the system 900 includes an input device 910, a storage device 920, a processor 930, and an output device 940. Illustratively, the intelligent device is used for handling articles.
The input device 910 is configured to receive an operation instruction input by a user and collect data. The input device 910 may include one or more of a keyboard, mouse, microphone, touch screen, image capture device, and the like.
The storage device 920 stores computer program instructions for implementing the corresponding steps of the interaction method for intelligent device control according to an embodiment of the present invention.
The processor 930 is configured to execute the computer program instructions stored in the storage device 920 to perform the corresponding steps of the interaction method for intelligent device control according to an embodiment of the present invention, and to implement the corresponding modules of the interaction apparatus for intelligent device control according to an embodiment of the present invention.
In one embodiment of the invention, the computer program instructions, when executed by the processor 930, cause the system 900 to perform the following steps:
providing a site modeling interface;
generating a visualized site scene in response to a first operation of a user on the site modeling interface;
providing a business process orchestration interface; and
generating, in response to an operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene.
Illustratively, the step of providing a site modeling interface that the computer program instructions, when executed by the processor 930, cause the system 900 to perform includes:
providing one or more site controls on a first side of the site modeling interface for selection by the user; and
providing a site editing area on a second side of the site modeling interface.
The step of generating a visualized site scene in response to a first operation of the user on the site modeling interface that the computer program instructions, when executed by the processor 930, cause the system 900 to perform includes:
generating, in response to the user selecting one of the site controls and selecting a target position in the site editing area at which to arrange it, a site scene in the site editing area in which the site control is located at the corresponding target position.
Illustratively, the step of providing a site modeling interface that the computer program instructions, when executed by the processor 930, cause the system 900 to perform further includes:
displaying a menu bar on a third side of the site modeling interface, wherein the menu bar comprises editing controls for editing the placement poses of the site controls;
and the step of generating a visualized site scene in response to the first operation of the user on the site modeling interface includes:
editing the selected site control in the site scene accordingly in response to the user clicking an editing control.
Illustratively, the attributes of at least some of the site controls include road connection point information.
Illustratively, the site is a warehouse, and the site controls include a road control, a charging zone control, a station control, and/or a staging zone control.
Illustratively, the staging zone control includes a unidirectional route area for defining the direction of a travel route of the intelligent device.
The computer program instructions, when executed by the processor 930, cause the system 900 to further perform the step of: planning a travel route of the intelligent device in the site scene in response to a second operation of the user on the site modeling interface.
The computer program instructions, when executed by the processor 930, cause the system 900 to further perform the step of: exporting the site scene in the form of a file in response to a third operation of the user on the site modeling interface.
The computer program instructions, when executed by the processor 930, cause the system 900 to further perform the step of: importing a site scene file for editing in response to a third operation of the user on the site modeling interface.
Illustratively, the step of providing a business process orchestration interface that the computer program instructions, when executed by the processor 930, cause the system 900 to perform includes:
displaying the site controls in the site scene on a first side of the business process orchestration interface; and
providing a process orchestration area on a second side of the business process orchestration interface.
The step of generating, in response to the operation of the user on the business process orchestration interface, a business process that controls the intelligent device to execute in the site scene, which the computer program instructions, when executed by the processor 930, cause the system 900 to perform, includes:
generating a corresponding business process in the process orchestration area in response to the user selecting a site control in the site scene displayed on the first side, selecting a target position in the process orchestration area at which to arrange it, and inputting a job type.
Illustratively, the computer program instructions, when executed by the processor 930, cause the system 900 to further perform the steps of:
providing a simulation setting interface;
and responding to the operation of the user on the simulation setting interface, and generating dynamic simulation feedback information aiming at the business process in real time.
The step of causing the system 900 to perform, when the computer program instructions are executed by the processor 930, generating dynamic simulation feedback information for the business process in real time in response to the user's operation at the simulation setup interface includes:
responding to the operation of the user on the simulation setting interface, and determining a simulation scene of the intelligent equipment for executing the business process; and
and displaying the simulation scene.
The step of generating, in real time, dynamic simulation feedback information for the business process in response to the user's operation on the simulation setting interface, which the system 900 performs when the computer program instructions are executed by the processor 930, comprises:
determining, in response to the user's operation on the simulation setting interface, simulation parameter information of the smart device executing the business process; and
displaying the simulation parameter information.
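Illustratively, such real-time feedback may be realized by a timer-driven loop that advances a simulated device along its planned route and reports parameter information on each tick; the tick model and every identifier below are assumptions for illustration.

// Sketch: tick-driven simulation feedback. Each tick moves the simulated
// device one hop along the planned route and reports parameter information.
interface SimulationFeedback {
  hopIndex: number;      // progress along the route
  position: string;      // current node id
  elapsedSeconds: number;
}

function simulate(
  route: string[],
  secondsPerHop: number,
  onFeedback: (f: SimulationFeedback) => void
): void {
  if (route.length === 0) return; // nothing to simulate
  let hop = 0;
  const timer = setInterval(() => {
    onFeedback({
      hopIndex: hop,
      position: route[hop],
      elapsedSeconds: hop * secondsPerHop,
    });
    hop += 1;
    if (hop >= route.length) clearInterval(timer); // route finished
  }, secondsPerHop * 1000);
}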
Furthermore, according to still another aspect of the present invention, there is provided a storage medium on which program instructions are stored. When executed by a computer or a processor, the program instructions cause the computer or the processor to perform the respective steps of the interaction method for smart device control of the embodiments of the present invention and to implement the respective modules of the interaction apparatus for smart device control according to the embodiments of the present invention. Illustratively, the smart device is for handling items. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the foregoing. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
In one embodiment of the invention, the computer program instructions, when executed by a computer or processor, cause the computer or processor to perform the steps of:
providing a site modeling interface;
generating a visual site scene in response to a first operation of a user on the site modeling interface;
providing a business process orchestration interface; and
generating, in response to the user's operation on the business process orchestration interface, a business process that the smart device executes in the site scene.
Illustratively, the step of providing a site modeling interface, which the computer or processor performs when the computer program instructions are executed, comprises:
providing one or more site controls on a first side of the site modeling interface for selection by the user; and
providing a site editing area on a second side of the site modeling interface.
The step of generating a visual site scene in response to a first operation of the user on the site modeling interface, which the computer or processor performs when the computer program instructions are executed, comprises:
in response to the user selecting one of the site controls and a target position in the site editing area for arranging that site control, generating in the site editing area a site scene in which the site control is placed at the corresponding target position.
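Illustratively, this placement step may be sketched as follows in TypeScript, with the selected control instantiated at the chosen target position; the identifiers are illustrative only.

// Sketch: place the selected control kind at the chosen target position,
// producing an updated site scene.
interface PlacedControl {
  kind: string;
  x: number;
  y: number;
}

function placeControl(
  scene: PlacedControl[],
  kind: string,
  target: { x: number; y: number }
): PlacedControl[] {
  // Return a new scene rather than mutating, so the editor can support undo.
  return [...scene, { kind, x: target.x, y: target.y }];
}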
Illustratively, the step of providing a site modeling interface, which the computer or processor performs when the computer program instructions are executed, further comprises:
displaying a menu bar on a third side of the site modeling interface, wherein the menu bar comprises editing controls for editing the placement posture of the site controls;
and the step of generating a visual site scene in response to a first operation of the user on the site modeling interface comprises:
in response to the user clicking an editing control, editing the selected site control in the site scene accordingly.
Illustratively, the attributes of at least some of the site controls include information about road connection points.
Illustratively, the site is a warehouse, and the site controls include a road control, a charging zone control, a station control, and/or a staging zone control.
Illustratively, the staging zone control includes a unidirectional route area for defining the direction of the travel route of the smart device.
The computer program instructions, when executed by a computer or processor, cause the computer or processor to further perform the step of: planning a travel route of the smart device in the site scene in response to a second operation of the user on the site modeling interface.
The computer program instructions, when executed by a computer or processor, cause the computer or processor to further perform the step of: exporting the site scene in the form of a file in response to a third operation of the user on the site modeling interface.
The computer program instructions, when executed by a computer or processor, cause the computer or processor to further perform the step of: importing a site scene file for editing in response to a third operation of the user on the site modeling interface.
Illustratively, the step of providing a business process orchestration interface, which the computer or processor performs when the computer program instructions are executed, comprises:
displaying the site controls of the site scene on a first side of the business process orchestration interface; and
providing a flow orchestration area on a second side of the business process orchestration interface.
The step of generating, in response to the user's operation on the business process orchestration interface, a business process that controls the smart device to execute in the site scene, which the computer or processor performs when the computer program instructions are executed, comprises:
in response to the user selecting a site control in the site scene displayed on the first side, a target position in the flow orchestration area for arranging that site control, and a user-input job type, generating a corresponding business process in the flow orchestration area.
The computer program instructions, when executed by a computer or processor, cause the computer or processor to further perform the steps of:
providing a simulation setting interface; and
generating, in real time, dynamic simulation feedback information for the business process in response to the user's operation on the simulation setting interface.
The step of generating, in real time, dynamic simulation feedback information for the business process in response to the user's operation on the simulation setting interface, which the computer or processor performs when the computer program instructions are executed, comprises:
determining, in response to the user's operation on the simulation setting interface, a simulation scene in which the smart device executes the business process; and
displaying the simulation scene.
The step of generating, in real time, dynamic simulation feedback information for the business process in response to the user's operation on the simulation setting interface, which the computer or processor performs when the computer program instructions are executed, comprises:
determining, in response to the user's operation on the simulation setting interface, simulation parameter information of the smart device executing the business process; and
displaying the simulation parameter information.
The modules of the interaction apparatus for smart device control according to the embodiments of the present invention may be implemented by a processor executing computer program instructions stored in the memory of the interaction system for smart device control according to the embodiments of the present invention, or when the computer instructions stored in the computer-readable storage medium of the computer program product according to the embodiments of the present invention are executed by a computer.
The interaction method, apparatus, system, and storage medium for smart device control described above greatly reduce the design complexity of a control scheme for the smart device, and thereby significantly improve the user experience of the scheme designer.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present invention thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that, in order to streamline the disclosure and aid in understanding one or more of the various inventive aspects, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments. However, this method of disclosure should not be construed as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules in the interaction means for smart device control according to embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present invention, and the scope of the present invention is not limited thereto; any person skilled in the art can readily conceive of variations or substitutions within the scope of the present disclosure. The protection scope of the invention is subject to the protection scope of the claims.

Claims (17)

1. An interaction method for intelligent device control, comprising:
providing a site modeling interface;
generating a visual site scene in response to a first operation of a user on the site modeling interface;
providing a business process orchestration interface; and
generating, in response to the operation of the user on the business process orchestration interface, a business process for controlling the smart device to execute in the site scene;
wherein the business process is used for controlling the smart device to sequentially execute corresponding jobs in the site scene, and the business process comprises the positions at which the smart device executes the jobs and the types of the jobs executed;
wherein the providing a business process orchestration interface comprises:
displaying the site controls of the generated site scene on a first side of the business process orchestration interface; and
providing a flow orchestration area on a second side of the business process orchestration interface;
and the generating, in response to the operation of the user on the business process orchestration interface, a business process for controlling the smart device to execute in the site scene comprises:
generating, in response to the user selecting a site control in the site scene displayed on the first side and a target position in the flow orchestration area for arranging that site control, a corresponding business process in the flow orchestration area.
2. The interaction method of claim 1, wherein the method further comprises:
providing a simulation setting interface; and
generating, in real time, dynamic simulation feedback information for the business process in response to the operation of the user on the simulation setting interface.
3. The interaction method of claim 2, wherein the generating, in real time, dynamic simulation feedback information for the business process in response to the operation of the user on the simulation setting interface comprises:
determining, in response to the operation of the user on the simulation setting interface, a simulation scene in which the smart device executes the business process; and
displaying the simulation scene.
4. The interaction method of claim 2 or 3, wherein the generating, in real time, dynamic simulation feedback information for the business process in response to the operation of the user on the simulation setting interface comprises:
determining, in response to the operation of the user on the simulation setting interface, simulation parameter information of the smart device executing the business process; and
displaying the simulation parameter information.
5. The interaction method according to claim 1 or 2, wherein,
the providing a site modeling interface comprises:
providing one or more site controls on a first side of the site modeling interface for selection by the user; and
providing a site editing area on a second side of the site modeling interface;
and the generating a visual site scene in response to a first operation of a user on the site modeling interface comprises:
in response to the user selecting one of the site controls and a target position in the site editing area for arranging that site control, generating in the site editing area a site scene in which the site control is placed at the corresponding target position.
6. The interaction method of claim 5, wherein,
the providing a site modeling interface further comprises:
displaying a menu bar on a third side of the site modeling interface, wherein the menu bar comprises editing controls for editing the placement posture of the site controls;
and the generating a visual site scene in response to a first operation of a user on the site modeling interface comprises:
in response to the user clicking an editing control, editing the selected site control in the site scene accordingly.
7. The interaction method of claim 5, wherein the attributes of at least some of the site controls include information about road connection points.
8. The interaction method of claim 5, wherein the site is a warehouse and the site controls comprise a road control, a charging zone control, a station control, and/or a staging zone control.
9. The interaction method of claim 8, wherein the staging zone control comprises a unidirectional route area for defining the direction of a travel route of the smart device.
10. The interaction method of claim 1 or 2, wherein the interaction method further comprises:
planning a travel route of the smart device in the site scene in response to a second operation of the user on the site modeling interface.
11. The interaction method of claim 1 or 2, wherein the interaction method further comprises:
exporting the site scene in the form of a file in response to a third operation of the user on the site modeling interface.
12. The interaction method of claim 1 or 2, wherein the interaction method further comprises:
importing a site scene file for editing in response to a third operation of the user on the site modeling interface.
13. The interaction method according to claim 1 or 2, wherein,
the generating, in response to the operation of the user on the business process orchestration interface, a business process for controlling the smart device to execute in the site scene comprises:
generating, in response to the user selecting a site control in the site scene displayed on the first side, a target position in the flow orchestration area for arranging that site control, and a user-input job type, a corresponding business process in the flow orchestration area.
14. The interaction method of claim 1 or 2, wherein the smart device is for handling items.
15. An interaction apparatus for smart device control, comprising:
a first providing module, configured to provide a site modeling interface;
a scene module, configured to generate a visual site scene in response to a first operation of a user on the site modeling interface;
a second providing module, configured to provide a business process orchestration interface; and
a flow module, configured to generate, in response to the operation of the user on the business process orchestration interface, a business process for controlling the smart device to execute in the site scene;
wherein the business process is used for controlling the smart device to sequentially execute corresponding jobs in the site scene, and the business process comprises the positions at which the smart device executes the jobs and the types of the jobs executed;
the second providing module is specifically configured to display the site controls of the generated site scene on a first side of the business process orchestration interface, and to provide a flow orchestration area on a second side of the business process orchestration interface;
and the flow module is specifically configured to generate a corresponding business process in the flow orchestration area in response to the user selecting a site control in the site scene displayed on the first side and a target position in the flow orchestration area for arranging that site control.
16. An interaction system for smart device control, comprising a processor and a memory, wherein the memory has stored therein computer program instructions which, when executed by the processor, are adapted to carry out the interaction method for smart device control as claimed in any one of claims 1 to 14.
17. A storage medium having program instructions stored thereon which, when executed, perform the interaction method for smart device control of any one of claims 1 to 14.
CN201811309227.9A 2018-11-05 2018-11-05 Interaction method, device and system for intelligent equipment control and storage medium Active CN109408996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811309227.9A CN109408996B (en) 2018-11-05 2018-11-05 Interaction method, device and system for intelligent equipment control and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811309227.9A CN109408996B (en) 2018-11-05 2018-11-05 Interaction method, device and system for intelligent equipment control and storage medium

Publications (2)

Publication Number Publication Date
CN109408996A CN109408996A (en) 2019-03-01
CN109408996B CN109408996B (en) 2023-11-07

Family

ID=65471857

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811309227.9A Active CN109408996B (en) 2018-11-05 2018-11-05 Interaction method, device and system for intelligent equipment control and storage medium

Country Status (1)

Country Link
CN (1) CN109408996B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111339955A (en) * 2020-02-20 2020-06-26 杭州涂鸦信息技术有限公司 Visual intelligent network distribution method and system, storage medium and control method
CN112435149A (en) * 2020-12-03 2021-03-02 郑州捷安高科股份有限公司 Simulation method, device, equipment and storage medium based on scene guidance prompt
CN114691308A (en) * 2020-12-25 2022-07-01 第四范式(北京)技术有限公司 Online task arranging method and online task arranging device
CN113050938B (en) * 2021-03-08 2024-06-14 杭州海康机器人股份有限公司 Visual software development system, method, device and computer storage medium
CN115903692A (en) * 2022-11-04 2023-04-04 北京镁伽机器人科技有限公司 Control method and device for automation system, electronic device and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9198204B2 (en) * 2012-04-11 2015-11-24 Google Inc. Apparatus and method for seamless commissioning of wireless devices

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106326087A (en) * 2016-08-31 2017-01-11 北京光年无限科技有限公司 Webpage experiencing method based on robot operating system and system thereof
CN106533860A (en) * 2016-11-18 2017-03-22 深圳Tcl智能家庭科技有限公司 Intelligent household interaction software bus system and realization method
CN108227520A (en) * 2016-12-12 2018-06-29 李涛 A kind of control system and control method of the smart machine based on panorama interface
CN108388142A (en) * 2018-04-10 2018-08-10 百度在线网络技术(北京)有限公司 Methods, devices and systems for controlling home equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"一种可定制的智能家居***的设计与实现";倪海鸥;《西昌学院学报》;20180630;第32卷(第2期);全文 *

Also Published As

Publication number Publication date
CN109408996A (en) 2019-03-01

Similar Documents

Publication Publication Date Title
CN109408996B (en) Interaction method, device and system for intelligent equipment control and storage medium
Tsarouchi et al. On a human-robot collaboration in an assembly cell
Sturrock et al. Recent innovations in Simio
CN108960506A (en) A kind of robot dispatching method, device, server and storage medium
US11247334B2 (en) Task planning apparatus, task planning method, and non-transitory computer-readable storage medium
US20180330316A1 (en) Systems and Methods for Fulfilment Design & Optimization
Vieira et al. A two-level optimisation-simulation method for production planning and scheduling: the industrial case of a human–robot collaborative assembly line
Inglés-Romero et al. Dealing with run-time variability in service robotics: Towards a dsl for non-functional properties
Sadeghpour et al. A CAD-based model for site planning
Górski Building virtual reality applications for engineering with knowledge-based approach
Hermann et al. An integrated system to select, position, and simulate mobile cranes for complex industrial projects
Wu et al. How human-robot collaboration impacts construction productivity: An agent-based multi-fidelity modeling approach
Heumann et al. Humanizing architectural automation: a case study in office layouts
US20070198588A1 (en) Automatic Qualification of Plant Equipment
CN109241564A (en) A kind of Photographing On-line method and apparatus of computer room assets
CN112906081A (en) Method and device for planning warehouse layout
US20230108774A1 (en) AI Augmented Digital Platform And User Interface
CN115774946B (en) SLP-based three-dimensional facility planning and logistics analysis method, system and application
CN106033349A (en) Object position adjusting method and device
JP3394024B2 (en) Device placement device and device placement design method
JPH07141421A (en) Process control system utilizing computer wherein general-purpose-process forming block is used
CN113885531A (en) Method for moving robot, circuit, medium, and program
Kluska Support the decision processes in management through the automatic generation of simulation models of warehouses
CN113887071A (en) Process flow chart manufacturing method, system, storage medium and equipment
Singh et al. Decision Support System for Site Layout Planning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100094 Eris Robot 106 in the Main Hall of International Software Building, 9th Building, Phase I, Zhongguancun Software Park, Shangdi Information Industry Base, Haidian District, Beijing
Applicant after: Beijing Wide-sighted Robot Co.,Ltd.
Applicant after: BEIJING KUANGSHI TECHNOLOGY Co.,Ltd.
Address before: 100094 Eris Robot 106 in the Main Hall of International Software Building, 9th Building, Phase I, Zhongguancun Software Park, Shangdi Information Industry Base, Haidian District, Beijing
Applicant before: BEIJING ARES ROBOT TECHNOLOGY CO.,LTD.
Applicant before: BEIJING KUANGSHI TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information

Address after: 100094 Eris Robot 106 in the Main Hall of International Software Building, 9th Building, Phase I, Zhongguancun Software Park, Shangdi Information Industry Base, Haidian District, Beijing
Applicant after: BEIJING KUANGSHI ROBOT TECHNOLOGY Co.,Ltd.
Applicant after: BEIJING KUANGSHI TECHNOLOGY Co.,Ltd.
Address before: 100094 Eris Robot 106 in the Main Hall of International Software Building, 9th Building, Phase I, Zhongguancun Software Park, Shangdi Information Industry Base, Haidian District, Beijing
Applicant before: Beijing Wide-sighted Robot Co.,Ltd.
Applicant before: BEIJING KUANGSHI TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information

Address after: 100193 District 106-1, No. 9 Building, 8 Wangxi Road, Haidian District, Beijing
Applicant after: BEIJING KUANGSHI ROBOT TECHNOLOGY Co.,Ltd.
Applicant after: BEIJING KUANGSHI TECHNOLOGY Co.,Ltd.
Address before: 100094 Eris Robot 106 in the Main Hall of International Software Building, 9th Building, Phase I, Zhongguancun Software Park, Shangdi Information Industry Base, Haidian District, Beijing
Applicant before: BEIJING KUANGSHI ROBOT TECHNOLOGY Co.,Ltd.
Applicant before: BEIJING KUANGSHI TECHNOLOGY Co.,Ltd.

GR01 Patent grant