CN112904747A - Control system and control method based on intelligent sensing


Info

Publication number
CN112904747A
Authority
CN
China
Prior art keywords
information
unit
control
environment
sensing
Prior art date
Legal status
Pending
Application number
CN202110129330.0A
Other languages
Chinese (zh)
Inventor
张旻晋 (Zhang Minjin)
许达文 (Xu Dawen)
Current Assignee
Chengdu Shihaixintu Microelectronics Co ltd
Original Assignee
Chengdu Shihaixintu Microelectronics Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Shihaixintu Microelectronics Co ltd
Priority to CN202110129330.0A
Publication of CN112904747A
Current legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems, electric
    • G05B 19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a control system and a control method based on intelligent sensing. The main control unit controls the sensor unit to sense the environment and the control source to obtain original information, transmits the original information to the sensing unit to generate sensing information, transmits the sensing information to the characteristic induction unit to generate characteristic representation information, transmits the characteristic representation information to the control instruction generation unit to generate a task instruction, and transmits the task instruction to the execution control unit for execution. The main control unit coordinates information interaction among the units, performs environment cognition and target-source determination and locking by means of the sensor unit, the sensing unit and the characteristic induction unit, and generates task instructions directed at the environment and the control source's intent through the control instruction generation unit. The execution control unit executes tasks directed at the environment and the control source's intent.

Description

Control system and control method based on intelligent sensing
Technical Field
The invention relates to the field of computers, in particular to a control system and a control method based on intelligent perception.
Background
With the development of sensing technology, sensing equipment can capture environmental and stimulus signals with greater precision and over a wider range, in a growing variety of modalities, so it reflects the environment's information ever more richly. On this basis, a device's description and characterization of its environment can be more detailed and concrete. Meanwhile, with the development of artificial-intelligence technology, techniques such as perceptual characterization of environmental information, abstraction, and the induction and collection of knowledge are maturing, so a device's response to a control source, conditioned on the environment, can be more flexible, accurate and reliable.
On the basis of the above, the inventors propose a perception-based control system and method, described below with reference to a control system and method for the real or virtual world based on speech-recognition and gesture-recognition perception, though not limited thereto.
Intelligent perception algorithms such as deep learning give electronic equipment accurate perception capabilities, such as gesture recognition, voice recognition and text recognition, and provide more accurate perception information for the management and control of equipment. As the management and control of intelligent equipment matures, how to use machine-learning perception information to control equipment to complete corresponding tasks, correct erroneous behaviors and adjust instructions in time has become a research trend in the industry. The invention therefore provides a control method and a control system based on intelligent perception information, which can stably realize control of equipment tasks, and implementation of their functions, by intelligent information.
Disclosure of Invention
The technical problem the invention aims to solve is that existing intelligent equipment cannot directly use machine-learning perception information to control equipment to complete corresponding tasks, correct erroneous behaviors in time and adjust instructions. The invention aims to provide a control system and a control method based on intelligent perception that stably realize control of equipment tasks, and implementation of their functions, by intelligent information.
The invention is realized by the following technical scheme:
a control system based on intelligent sensing comprises a sensor unit, a sensing unit, a characteristic induction unit, a control instruction generation unit, a main control unit and an execution control unit;
the main control unit controls the sensor unit to sense the environment and the control source to obtain original information, transmits the original information to the sensing unit to generate sensing information, transmits the sensing information to the characteristic induction unit to generate characteristic representation information, transmits the characteristic representation information to the control instruction generation unit to generate a task instruction, and transmits the task instruction to the execution control unit to execute the task instruction;
the main control unit controls information interaction among the sensor unit, the sensing unit, the characteristic induction unit, the control instruction generation unit and the execution control unit, and performs environment cognition and target-source determination and locking by means of the sensor unit, the sensing unit and the characteristic induction unit;
the main control unit analyzes the characteristic representation information and, through the control instruction generation unit, generates a task instruction directed at the environment and the control source's intent;
the main control unit controls the cooperative execution among the units, and carries out tasks directed at the environment and the control source's intent through the sensing unit, the characteristic induction unit, the control instruction generation unit and the execution control unit.
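To make the data flow above concrete, the following is a minimal Python sketch of the six units and their wiring. It is an illustrative assumption only: the patent prescribes no implementation, and every class, field and method name here is hypothetical.

```python
# Minimal sketch of the unit pipeline described above; all names are
# illustrative assumptions, not part of the patent.
from dataclasses import dataclass


@dataclass
class RawInfo:
    modality: str   # e.g. "image", "audio"
    payload: bytes  # sensor output converted to a processable form


class SensorUnit:
    def sense(self) -> list:
        """Read the physical sensors and return RawInfo items."""
        raise NotImplementedError


class SensingUnit:
    def perceive(self, raw: list) -> dict:
        """Run perception algorithms (image/voice/gesture recognition)."""
        raise NotImplementedError


class FeatureInductionUnit:
    def induce(self, perception: dict) -> dict:
        """Summarize perception results into characteristic representation
        information (e.g. semantics, text, knowledge-graph entries)."""
        raise NotImplementedError


class ControlInstructionUnit:
    def generate(self, features: dict) -> str:
        """Produce a task instruction matching environment and intent."""
        raise NotImplementedError


class ExecutionControlUnit:
    def execute(self, instruction: str) -> None:
        """Drive the target device (robot, smart-home appliance, ...)."""
        raise NotImplementedError


class MasterControlUnit:
    """Coordinates information exchange among the five units above."""

    def __init__(self, sensor, sensing, induction, generator, executor):
        self.sensor, self.sensing, self.induction = sensor, sensing, induction
        self.generator, self.executor = generator, executor

    def step(self) -> None:
        # One pass of the pipeline: sense -> perceive -> induce ->
        # generate instruction -> execute.
        raw = self.sensor.sense()
        perception = self.sensing.perceive(raw)
        features = self.induction.induce(perception)
        instruction = self.generator.generate(features)
        self.executor.execute(instruction)
```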
Further, the sensor unit comprises an image sensor, a voice sensor and an infrared sensor, and is used for receiving the original information of the external environment and the control source and converting it into a form of information that the sensing unit can process;
the sensing unit comprises a perception calculation carrier (a processor in the control system) for the original information; it executes an intelligent sensing algorithm on the original information and generates sensing information for the environment and the control source, including image-recognition, voice-recognition and gesture-recognition results;
the characteristic induction unit comprises an induction calculation carrier (a processor in the control system) for the sensing information and a storage carrier (a memory in the control system) for the characteristic representation information of the environment and the control source; it executes an induction algorithm on the sensing information, and is used for inducing and storing depth information, gesture information, voice information, target-class information and shape information of the environment and the control source, and for summarizing these into characteristic representation information that the control instruction generation unit can identify;
the control instruction generation unit comprises an execution carrier (a processor in the control system) for the characteristic representation information; it executes a task-instruction generation algorithm on the characteristic representation information of the environment and the control source, and generates a task execution instruction directed at the intent of the environment and the control source;
the execution control unit comprises an execution carrier (a processor in the control system) for task instructions; it executes the generated task instruction and controls task execution by the execution device;
the main control unit comprises a calculation carrier for the intelligent-sensing-based control system; it manages and controls each unit of the control system, implements the control method based on intelligent sensing, and controls the connections and information transmission among the sensor unit, the sensing unit, the characteristic induction unit, the control instruction generation unit and the execution control unit according to the real-time state of the environment and the control source.
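The "carrier" wording above assigns each unit a compute or storage resource. As a sketch only, one plausible assignment, assuming the generic hardware named later in the embodiment (CPU, GPU, hardware AI accelerator, memory), could be recorded as a configuration table:

```python
# Hypothetical unit-to-carrier assignment; the patent only requires that
# each unit have some processor/memory carrier, not this particular split.
UNIT_CARRIERS = {
    "sensor_unit":                   "sensor front-ends (camera, microphone, infrared)",
    "sensing_unit":                  "GPU or hardware AI accelerator for perception inference",
    "characteristic_induction_unit": "CPU plus memory storing characteristic representation information",
    "control_instruction_unit":      "CPU running the task-instruction generation algorithm",
    "execution_control_unit":        "CPU driving the execution device",
    "master_control_unit":           "CPU orchestrating the other units",
}
```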
Further, the environments include a physical environment, a network environment, a system environment and a virtual environment; the original information includes optical information, electronic information, acoustic information, biological information and chemical information;
the forms of the characteristic representation information include knowledge graphs, semantics and text;
the characteristic representation information includes past, current and future characteristic representation information on a time line;
the control sources include a person, an object and the environment;
the targets driven by the execution control unit include robots, smart-home equipment, automatic-driving equipment, mobile equipment and interaction equipment.
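The enumerations just listed translate directly into types; a minimal sketch (Python enums whose members are taken from the lists above; the encoding itself is an assumption):

```python
from enum import Enum, auto


class Environment(Enum):
    PHYSICAL = auto()
    NETWORK = auto()
    SYSTEM = auto()
    VIRTUAL = auto()


class OriginalInfo(Enum):
    OPTICAL = auto()
    ELECTRONIC = auto()
    ACOUSTIC = auto()
    BIOLOGICAL = auto()
    CHEMICAL = auto()


class ControlSource(Enum):
    PERSON = auto()
    OBJECT = auto()
    ENVIRONMENT = auto()
```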
Further, a control method based on intelligent perception comprises the following steps:
Step 1: the sensor unit senses the original information of the environment and the control source and converts it into a form of information that the sensing unit can process;
Step 2: the sensing unit receives the original information converted in step 1, executes an intelligent sensing algorithm and generates sensing information;
Step 3: the characteristic induction unit receives the sensing information generated in step 2, executes a characteristic induction algorithm on it and generates characteristic representation information;
Step 4: the main control unit receives and analyzes the characteristic representation information generated in step 3, locks the environment and the control source according to it, analyzes the intent of the control source and plans the task to be executed;
Step 5: the control instruction generation unit receives the characteristic representation information generated in step 3, executes a task-instruction generation algorithm for the environment and the control source, and generates a task instruction directed at them;
Step 6: the execution control unit receives the task instruction generated in step 5, executes it and controls task execution by the target device.
Further, the sensor unit senses information of the environment and the control source in real time and continuously transmits the original information to the sensing unit for real-time perception; the characteristic induction unit updates the characteristic representation information accordingly, and the main control unit analyzes the characteristic representation information in real time to make and correct task decisions in real time, as the sketch after this paragraph illustrates.
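A sketch of that real-time loop, continuing the hypothetical classes from the earlier sketch; the loop rate and the decide() hook are assumptions, since the patent fixes neither:

```python
import time


def realtime_control_loop(mcu, period_s: float = 0.05) -> None:
    """Continuously sense, refresh the characteristic representation,
    and let the master control unit keep, revise, or withhold the task."""
    current_task = None
    while True:
        raw = mcu.sensor.sense()                       # real-time sensing
        features = mcu.induction.induce(mcu.sensing.perceive(raw))
        # Hypothetical decision hook: returns "execute", "correct" or "wait".
        decision = mcu.decide(features, current_task)
        if decision in ("execute", "correct"):
            # (Re)generate the task instruction against the new state.
            current_task = mcu.generator.generate(features)
            mcu.executor.execute(current_task)
        time.sleep(period_s)                           # "wait": keep sensing
```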
Compared with the prior art, the invention has the following advantages and beneficial effects:
the control system and the control method based on intelligent perception can effectively generate and adjust task execution according to the states of the environment and the control source, and can stably implement the control system task based on perception.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
FIG. 1 is a schematic diagram of a perception-based control system according to the present invention.
FIG. 2 is a schematic diagram of a method for implementing the perception-based control system of the present invention.
Fig. 3 is a schematic structural diagram of an article pick-up and delivery control system of a logistics apparatus based on voice perception and image perception according to embodiment 1 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to examples and accompanying drawings, and the exemplary embodiments and descriptions thereof are only used for explaining the present invention and are not meant to limit the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that: it is not necessary to employ these specific details to practice the present invention. In other instances, well-known structures, circuits, materials, or methods have not been described in detail so as not to obscure the present invention.
Throughout the specification, reference to "one embodiment," "an embodiment," "one example," or "an example" means: the particular features, structures, or characteristics described in connection with the embodiment or example are included in at least one embodiment of the invention. Thus, the appearances of the phrases "one embodiment," "an embodiment," "one example" or "an example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments or examples. Further, those of ordinary skill in the art will appreciate that the illustrations provided herein are for illustrative purposes and are not necessarily drawn to scale. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In the description of the present invention, it is to be understood that the terms "front", "rear", "left", "right", "upper", "lower", "vertical", "horizontal", "high", "low", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and therefore, are not to be construed as limiting the scope of the present invention.
Examples
Fig. 1 is a schematic structural diagram of a control system and a control method based on smart sensing according to the present invention, and as shown in fig. 1, the system 1 includes a sensor unit 11, a sensing unit 12, a characteristic induction unit 13, a control instruction generation unit 14, an execution control unit 15, and a main control unit 16.
The sensor unit 11 includes, but is not limited to, an acoustic sensor 111, a mechanical sensor 112, an optical sensor 113 and a biosensor 114 for sensing raw environmental information. The sensor unit 11 receives at least one kind of raw information from the environment and the control source and, through the corresponding sensor, converts it into raw information that the sensing unit 12 can receive and identify, including but not limited to electronic data such as voice, pictures, video and point clouds.
The sensing unit 12 includes, but is not limited to, a computational carrier for perception inference. It receives the raw signals generated by the sensor unit 11, performs perception inference algorithms including, but not limited to, image recognition, voice recognition, object detection, optical-flow detection and gesture-recognition inference, and generates perceptual information for the environmental information.
The characteristic induction unit 13 includes, but is not limited to, a calculation carrier for inducting perception information. It receives the perception information generated by the sensing unit 12, performs induction on the perception information for the environment and the control source, and generates characteristic representation information for both.
The control instruction generation unit 14 includes, but is not limited to, a computing carrier for generating task execution commands from characteristic representation information. It receives the characteristic representation information generated by the characteristic induction unit 13, performs task-command generation on it, and generates task execution commands corresponding to the environment and the intent of the control source.
The execution control unit 15 includes, but is not limited to, a computing carrier for task-command execution. It receives the task execution instruction generated by the control instruction generation unit 14 and controls task execution by the execution device.
The main control unit 16 includes, but is not limited to, a calculation carrier for system control. It receives the feedback and status signals of each unit and monitors in real time the status information for the environment and the control source held in the characteristic induction unit 13, so as to control the system's feedback and task decisions regarding the environment and the control source.
Fig. 2 is a flowchart of the implementation method of the perception-based control system provided by the invention; the steps are:
Step s1: the sensor unit senses the original information of the environment and the control source and converts it into a form of information that the sensing unit can process;
Step s2: the sensing unit receives the original information converted in step s1, executes an intelligent sensing algorithm and generates sensing information;
Step s3: the characteristic induction unit receives the sensing information generated in step s2 and executes a characteristic induction algorithm on it to generate characteristic representation information;
Step s4: the main control unit receives and analyzes the characteristic representation information generated in step s3, locks the environment and the control source according to it, analyzes the intent of the control source and plans the task to be executed;
Step s5: the control instruction generation unit receives the characteristic representation information generated in step s3, executes a task-instruction generation algorithm for the environment and the control source, and generates a task instruction directed at them;
Step s6: the execution control unit receives the task instruction generated in step s5, executes it and controls task execution by the target device.
In an embodiment of the invention, as shown in fig. 3, a logistics human-computer interaction device controls an express pick-up and delivery system through voice and visual perception; its structure and principle are as follows. When the system runs in the control source's scene, the steps are:
Step 1: the system's image perception sensor acquires scene environment information and raw information about the control source, i.e. the person, and transmits the scene's raw information to the sensing unit;
Step 2: the sensing unit performs person perception and environmental-space perception on the raw information, generating perception information that comprises perception information for the environment and person perception information;
Step 3: the perception information is transmitted to the characteristic induction unit, which inducts the perception information of the person and the environment and generates characteristic representation information of both;
Step 4: the main control module reads the characteristic representation information about the environment and the person, determines the person to be the control source and simultaneously locks onto that control source (a sketch of such locking follows this list);
Step 5: the main control module has the characteristic induction unit's characteristic representation information transmitted to the control instruction generation unit, which generates a task instruction, for the real-time environment, to listen for the control source's command in the pick-up and delivery interaction with the control source;
Step 6: the execution control unit acquires the task instruction and performs the operation of waiting for the control source's command in the logistics pick-up and delivery task.
In this flow, while the logistics pick-up and delivery equipment is controlled through voice recognition, control-source locking is performed on the basis of image-perceived environment information, and the controlled equipment accordingly waits for the control source's command.
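Step 4's determination and locking of the person as control source could look like the following sketch, which picks the nearest confidently detected person; the detection fields and thresholds are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class PersonDetection:
    track_id: int      # identity assigned by the perception tracker
    confidence: float  # detector confidence in [0, 1]
    distance_m: float  # estimated distance from the device


def lock_control_source(detections: List[PersonDetection],
                        min_conf: float = 0.8,
                        max_dist_m: float = 3.0) -> Optional[int]:
    """Return the track id of the person to lock as control source,
    or None if nobody is detected confidently enough and close enough."""
    candidates = [d for d in detections
                  if d.confidence >= min_conf and d.distance_m <= max_dist_m]
    if not candidates:
        return None
    return min(candidates, key=lambda d: d.distance_m).track_id
```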
In this embodiment, the control source issues voice information, and perception of that voice information controls further operation of the logistics pick-up equipment through the following steps (a sketch of this decision loop follows the list):
Step 1: an acoustic sensor of the sensor unit receives the raw voice signals of the environment and the control source and converts them into original information that the sensing unit can receive and process;
Step 2: the sensing unit receives the original information from step 1, executes voice-perception inference and generates voice perception information;
Step 3: the characteristic induction unit receives the voice perception information from step 2 and, using the characteristic representation information of the previous and current environment, performs characteristic representation processing on the environment and the voice information to generate characteristic representation information;
Step 4: the main control unit receives the characteristic representation information for the environment and the control source and decides whether to execute the task; if not, it returns to step 1, and if so, it proceeds to step 5;
Step 5: the control instruction generation unit receives the characteristic information from step 3 and updates and generates a pick-up and delivery operation instruction for the logistics task, directed at the environment and the control source's intent;
Step 6: the execution control unit receives the pick-up and delivery command generated in step 5 and controls the logistics equipment to complete the pick-up and delivery actions.
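A sketch of the decision loop in these six steps, again reusing the hypothetical classes from the earlier sketches; should_execute() stands in for the master control unit's step-4 decision and is an assumption:

```python
def voice_command_cycle(mcu) -> None:
    """One voice-driven pick-up/delivery cycle: loop back to sensing
    whenever the master control unit declines to act (step 4)."""
    while True:
        raw = mcu.sensor.sense()                        # step 1: acoustic capture
        perception = mcu.sensing.perceive(raw)          # step 2: voice inference
        features = mcu.induction.induce(perception)     # step 3: fuse with prior environment features
        if not mcu.should_execute(features):            # step 4: decision
            continue                                    # "no": back to step 1
        instruction = mcu.generator.generate(features)  # step 5: pick-up/delivery instruction
        mcu.executor.execute(instruction)               # step 6: drive the logistics device
        break
```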
In the above embodiment, the main control unit manages the sensor unit's reception of information in real time; environmental information such as images and voice received by the sensor unit is passed to the sensing unit, which perceives the different types of information to obtain perception results; these results are characterized and then passed to the main control unit for target locking and execution decisions.
In the above embodiment, when the main control unit's decision result is negative, it does not allow the characteristic representation information to pass to the control instruction generation unit; when the decision result is affirmative, it does. Once the characteristic representation information is allowed through, the control instruction generation unit generates a task execution command for the environment and the control source's intent and transmits the command to the execution control unit for execution.
In the above embodiment, the sensor unit senses information of the environment and the control source and transmits the original information to the sensing unit in real time for perception and for updating of the characteristic representation information; the main control unit analyzes the characteristic representation information in real time and makes and corrects task decisions in real time.
The image sensor in this embodiment may include, but is not limited to, a camera for collecting raw image information of the environment and the control source;
the acoustic sensor in this embodiment may include, but is not limited to, a microphone for collecting raw voice information of the environment and the control source;
the sensing unit in this embodiment may include, but is not limited to, computing carriers for image perception and speech recognition such as a processor, a hardware AI accelerator or a GPU;
the characteristic induction unit in this embodiment may include, but is not limited to, a processor such as a CPU for information integration and induction calculations;
the control instruction generation unit in this embodiment may include, but is not limited to, a processor such as a CPU for control-instruction generation calculations;
the execution control unit in this embodiment may include, but is not limited to, a processor such as a CPU for executing device task commands;
the main control unit in this embodiment may include, but is not limited to, a processor such as a CPU for controlling data transmission between units, control-source locking and decisions on execution schemes;
the execution device in this embodiment may include, but is not limited to, motion-execution devices such as motors and solenoid valves, and interactive devices such as displays and speakers.
The above control system for device pick-up and delivery based on image and voice perception can effectively perceive the environment and the control source in real time, realize automatic task planning and implementation for the device through induction of, and decisions on, the perceived information, and stably control the device's task execution in real time.
Although the above embodiments describe the control system and method based on intelligent perception in terms of controlling the pick-up and delivery operations of logistics equipment through visual perception, voice perception and the like, those skilled in the art should understand that the control system and method provided by the invention are applicable to other perception-based control systems as well.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (5)

1. A control system based on intelligent sensing is characterized by comprising a sensor unit, a sensing unit, a characteristic induction unit, a control instruction generation unit, a main control unit and an execution control unit;
the main control unit controls the sensor unit to sense the environment and the control source to obtain original information, transmits the original information to the sensing unit to generate sensing information, transmits the sensing information to the characteristic induction unit to generate characteristic representation information, transmits the characteristic representation information to the control instruction generation unit to generate a task instruction, and transmits the task instruction to the execution control unit to execute the task instruction;
the main control unit controls information interaction among the sensor unit, the sensing unit, the characteristic induction unit, the control instruction generation unit and the execution control unit, and performs environment cognition and target-source determination and locking by means of the sensor unit, the sensing unit and the characteristic induction unit;
the main control unit analyzes the characteristic representation information and, through the control instruction generation unit, generates a task instruction directed at the environment and the control source's intent;
the main control unit controls the cooperative execution among the units, and carries out tasks directed at the environment and the control source's intent through the sensing unit, the characteristic induction unit, the control instruction generation unit and the execution control unit.
2. The control system based on intelligent perception according to claim 1, characterized in that:
the sensor unit comprises an image sensor, a voice sensor and an infrared sensor, and is used for receiving the original information of an external environment and a control source and converting it into a form of information that the sensing unit can process;
the sensing unit comprises a perception calculation carrier (a processor in the control system) for the original information; it executes an intelligent sensing algorithm on the original information and generates sensing information for the environment and the control source, including image-recognition, voice-recognition and gesture-recognition results;
the characteristic induction unit comprises an induction calculation carrier (a processor in the control system) for the sensing information and a storage carrier (a memory in the control system) for the characteristic representation information of the environment and the control source; it executes an induction algorithm on the sensing information, and is used for inducing and storing depth information, gesture information, voice information, target-class information and shape information of the environment and the control source, and for summarizing these into characteristic representation information that the control instruction generation unit can identify;
the control instruction generation unit comprises an execution carrier (a processor in the control system) for the characteristic representation information; it executes a task-instruction generation algorithm on the characteristic representation information of the environment and the control source, and generates a task execution instruction directed at the intent of the environment and the control source;
the execution control unit comprises an execution carrier (a processor in the control system) for task instructions; it executes the generated task instruction and controls task execution by the execution device;
the main control unit comprises a calculation carrier for the intelligent-sensing-based control system; it manages and controls each unit of the control system, implements the control method based on intelligent sensing, and controls the connections and information transmission among the sensor unit, the sensing unit, the characteristic induction unit, the control instruction generation unit and the execution control unit according to the real-time state of the environment and the control source.
3. The control system based on intelligent perception according to claim 2, characterized in that: the environments include a physical environment, a network environment, a system environment and a virtual environment; the original information includes optical information, electronic information, acoustic information, biological information and chemical information;
the forms of the characteristic representation information include knowledge graphs, semantics and text;
the characteristic representation information includes past, current and future characteristic representation information on a time line;
the control sources include a person, an object and the environment;
the targets driven by the execution control unit include robots, smart-home equipment, automatic-driving equipment, mobile equipment and interaction equipment.
4. A control method based on intelligent perception is characterized by comprising the following steps:
Step 1: the sensor unit senses the original information of the environment and the control source and converts it into a form of information that the sensing unit can process;
Step 2: the sensing unit receives the original information converted in step 1, executes an intelligent sensing algorithm and generates sensing information;
Step 3: the characteristic induction unit receives the sensing information generated in step 2, executes a characteristic induction algorithm on it and generates characteristic representation information;
Step 4: the main control unit receives and analyzes the characteristic representation information generated in step 3, locks the environment and the control source according to it, analyzes the intent of the control source and plans the task to be executed;
Step 5: the control instruction generation unit receives the characteristic representation information generated in step 3, executes a task-instruction generation algorithm for the environment and the control source, and generates a task instruction directed at them;
Step 6: the execution control unit receives the task instruction generated in step 5, executes it and controls task execution by the target device.
5. The control method based on intelligent perception according to claim 4, wherein the sensor unit senses the information of the environment and the control source in real time and continuously transmits the original information to the sensing unit for real-time perception; the characteristic representation information of the characteristic induction unit is updated accordingly, and the main control unit analyzes the characteristic representation information in real time and makes and corrects task decisions in real time.
Application CN202110129330.0A, filed 2021-01-29 with priority date 2021-01-29: Control system and control method based on intelligent sensing. Published as CN112904747A; status pending.

Priority Applications (1)

Application Number: CN202110129330.0A; Priority Date: 2021-01-29; Filing Date: 2021-01-29; Title: Control system and control method based on intelligent sensing


Publications (1)

Publication Number: CN112904747A; Publication Date: 2021-06-04

Family

ID=76121636


Country Status (1)

CN (CN112904747A)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106325065A (en) * 2015-06-26 2017-01-11 北京贝虎机器人技术有限公司 Robot interactive behavior control method, device and robot
CN109318232A (en) * 2018-10-22 2019-02-12 佛山智能装备技术研究院 A kind of polynary sensory perceptual system of industrial robot
CN109346069A (en) * 2018-09-14 2019-02-15 北京赋睿智能科技有限公司 A kind of interactive system and device based on artificial intelligence
CN109460042A (en) * 2018-12-29 2019-03-12 北京经纬恒润科技有限公司 A kind of automatic Pilot control method and system
US20190341029A1 (en) * 2018-05-01 2019-11-07 Dell Products, L.P. Intelligent assistance using voice services
CN110488616A (en) * 2019-07-08 2019-11-22 深圳职业技术学院 Intelligent home control system and method based on Internet of Things
CN110716706A (en) * 2019-10-30 2020-01-21 华北水利水电大学 Intelligent human-computer interaction instruction conversion method and system


Similar Documents

Publication Publication Date Title
US11413763B2 (en) Charging robot and control method thereof
US11358285B2 (en) Robot and method of recognizing mood using the same
CN107835324B (en) Backlight brightness adjusting method and mobile terminal
US10131052B1 (en) Persistent predictor apparatus and methods for task switching
US11559902B2 (en) Robot system and control method of the same
EP2965267A2 (en) Adapting robot behavior based upon human-robot interaction
US20210089172A1 (en) Robot
KR20210004487A (en) An artificial intelligence device capable of checking automatically ventaliation situation and operating method thereof
US11738465B2 (en) Robot and controlling method thereof
KR20190102151A (en) Artificial intelligence server and method for providing information to user
US11376742B2 (en) Robot and method of controlling the same
KR20190085895A (en) Artificial intelligence device that can be controlled according to user gaze
US20210174187A1 (en) Ai apparatus and method for managing operation of artificial intelligence system
KR20210020312A (en) Robot and method for controlling same
US20190392810A1 (en) Engine sound cancellation device and engine sound cancellation method
CN112308006A (en) Sight line area prediction model generation method and device, storage medium and electronic equipment
KR20200128486A (en) Artificial intelligence device for determining user's location and method thereof
KR20210001529A (en) Robot, server connected thereto, and method for recognizing voice using robot
KR20190098936A (en) System and method for cooking robot
KR20210066207A (en) Artificial intelligence apparatus and method for recognizing object
KR20190094311A (en) Artificial intelligence robot and operating method thereof
CN112904747A (en) Control system and control method based on intelligent sensing
KR102229562B1 (en) Artificial intelligence device for providing voice recognition service and operating mewthod thereof
KR102314385B1 (en) Robot and contolling method thereof
KR20210056019A (en) Artificial intelligence device and operating method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210604)