CN108375911A - Equipment control method and device, storage medium and equipment - Google Patents

Equipment control method and device, storage medium and equipment

Info

Publication number
CN108375911A
CN108375911A (application CN201810057973.7A)
Authority
CN
China
Prior art keywords
user
control command
relevant device
presumptive area
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810057973.7A
Other languages
Chinese (zh)
Other versions
CN108375911B (en)
Inventor
陈铭进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201810057973.7A priority Critical patent/CN108375911B/en
Publication of CN108375911A publication Critical patent/CN108375911A/en
Application granted granted Critical
Publication of CN108375911B publication Critical patent/CN108375911B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2642Domotique, domestic, home control, automation, smart house
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]


Abstract

The invention provides a device control method, apparatus, storage medium, and device. The method includes: performing motion capture, by a Kinect device, on a user within a predetermined area to identify whether the user performs a preset action, and/or receiving a voice command issued by the user; and, if the user is captured performing the preset action and/or a voice command issued by the user is received, controlling the corresponding device according to the preset action and/or the voice command. With the scheme provided by the invention, smart home devices can be controlled by preset actions or voice commands, without having to use a control terminal, which improves the user experience.

Description

Device control method and apparatus, storage medium, and device
Technical field
The present invention relates to the field of control, and in particular to a device control method and apparatus, a storage medium, and a device.
Background technology
At present, there are many ways to control smart home devices, such as remote network control and voice control. However, smart home devices still have to be manipulated through control terminals such as mobile phones, control panels, and remote controllers; the operation remains relatively cumbersome and insufficiently intelligent.
Summary of the invention
A primary object of the present invention is to overcome the above defect of the prior art by providing a device control method and apparatus, a storage medium, and a device, so as to solve the problem in the prior art that controlling smart home devices is relatively cumbersome.
One aspect of the present invention provides a device control method, including: performing motion capture, by a Kinect device, on a user within a predetermined area to identify whether the user performs a preset action, and/or receiving a voice command issued by the user; and, if the user is captured performing the preset action and/or a voice command issued by the user is received, controlling the corresponding device according to the preset action and/or the voice command.
Optionally, the predetermined area includes more than one sub-region. Before performing motion capture on the user in the predetermined area by a Kinect device, the method further includes: detecting the position of the user within the predetermined area, so as to determine, from that position, the sub-region in which the user is located. Performing motion capture on the user in the predetermined area then includes: performing motion capture on the user by the Kinect device corresponding to the sub-region in which the user is located.
Optionally, detecting the position of the user within the predetermined area includes: detecting the user's position by an infrared sensor; and/or performing sound-source localization on the user by an array microphone; and/or determining the user's position by detecting the strength of a wireless signal emitted by a device carried by the user.
Optionally, controlling the corresponding device according to the preset action includes: identifying a first control command corresponding to the preset action; and sending the first control command to the corresponding device, so that the device executes the corresponding operation according to the first control command. Controlling the corresponding device according to the voice command includes: identifying the semantics of the voice command, so as to determine from the identified semantics a second control command corresponding to the voice command; and sending the second control command to the corresponding device, so that the device executes the corresponding operation according to the second control command.
Optionally, the first control command and/or the second control command is sent to the corresponding device through at least one of a LAN, WiFi, a router, and Bluetooth.
Optionally, the device includes at least one of an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven.
Another aspect of the present invention provides a device control apparatus, including: a capture unit, configured to perform motion capture on a user within a predetermined area by a Kinect device, so as to identify whether the user performs a preset action, and/or a receiving unit, configured to receive a voice command issued by the user; and a control unit, configured to control the corresponding device according to the preset action and/or the voice command if the capture unit captures the user performing the preset action and/or the receiving unit receives a voice command issued by the user.
Optionally, the predetermined area includes more than one sub-region, and the apparatus further includes a detection unit, configured to detect the position of the user within the predetermined area before the capture unit performs motion capture on the user by a Kinect device, so as to determine, from that position, the sub-region in which the user is located. The capture unit is further configured to perform motion capture on the user by the Kinect device corresponding to the sub-region in which the user is located.
Optionally, the detection unit detects the position of the user within the predetermined area by: detecting the user's position with an infrared sensor; and/or performing sound-source localization on the user with an array microphone; and/or determining the user's position by detecting the strength of a wireless signal emitted by a device carried by the user.
Optionally, the control unit controls the corresponding device according to the preset action by: identifying a first control command corresponding to the preset action, and sending the first control command to the corresponding device, so that the device executes the corresponding operation according to the first control command. The control unit controls the corresponding device according to the voice command by: identifying the semantics of the voice command to determine the corresponding second control command, and sending the second control command to the corresponding device, so that the device executes the corresponding operation according to the second control command.
Optionally, the control unit sends the first control command and/or the second control command to the corresponding device through at least one of a LAN, WiFi, a router, and Bluetooth.
Optionally, the device includes at least one of an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven.
Another aspect of the invention provides a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the steps of any of the foregoing methods.
A further aspect of the present invention provides a device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of any of the foregoing methods.
A further aspect of the present invention provides a device including any of the foregoing device control apparatuses.
Optionally, the device includes at least one of a gateway, a router, an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven.
According to the technical scheme of the present invention, motion capture is performed by a Kinect device on a user within a predetermined area, and/or a voice command issued by the user is received; if the user is captured performing a preset action and/or a voice command issued by the user is received, the corresponding device is controlled according to the preset action and/or the voice command. Smart home devices can thus be controlled by preset actions or voice commands, without using control terminals such as mobile phones, control panels, and remote controllers, which improves the user experience.
Description of the drawings
The drawings described herein are provided for a further understanding of the present invention and constitute a part of it; the illustrative embodiments of the present invention and their description serve to explain the invention and do not constitute an improper limitation on it. In the drawings:
Fig. 1 is a schematic diagram of one embodiment of the device control method provided by the present invention;
Fig. 2 is an example of at least one Kinect device arranged in a predetermined area according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of another embodiment of the device control method provided by the present invention;
Fig. 4 is a schematic structural diagram of one embodiment of the device control apparatus provided by the present invention;
Fig. 5 is a schematic structural diagram of another embodiment of the device control apparatus provided by the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the technical solutions of the present invention are described clearly and completely below with reference to specific embodiments of the invention and the corresponding drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the description, the claims, and the above drawings are used to distinguish similar objects, not to describe a specific order or precedence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present invention described herein can be implemented in sequences other than those illustrated or described. In addition, the terms "comprising" and "having" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device containing a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units not explicitly listed or inherent to the process, method, product, or device.
Fig. 1 is a schematic diagram of one embodiment of the device control method provided by the present invention. The method can be used to control the device itself and/or other devices. The device itself includes a gateway, a router, or an electric appliance (for example, an air conditioner, refrigerator, washing machine, television set, water heater, microwave oven, etc.); the other devices include at least one of an air conditioner, refrigerator, washing machine, television set, water heater, and microwave oven. The device controls the other devices by communicating with them, for example by networking with the above devices through a gateway or router.
As shown in Fig. 1, according to one embodiment of the present invention, the device control method includes at least step S110 and step S120.
In step S110, motion capture is performed on a user within a predetermined area by a Kinect device to identify whether the user performs a preset action, and/or a voice command issued by the user is received.
Specifically, preset actions for controlling different devices are configured in advance, and may be set by the user. Kinect motion-capture technology is used to perform motion capture on the user in the predetermined area: at least one Kinect device (such as a Kinect camera) is arranged in the predetermined area, motion capture is performed on the user by the at least one Kinect device, and whether the action made by the user is a preset action is identified. For example, as shown in Fig. 2, the invention is implemented in gateway C, which controls devices B1, B2, B3, and B4 in living area 1; Kinect devices A1, A2, A3, and A4 are installed at the four corners of living area 1. The Kinect device captures depth information, a preset program generates a three-dimensional point cloud of the action, which is then mapped onto a model in VR mode to track the human motion trajectory, and machine-learning techniques are used for action recognition.
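The last stage of the recognition pipeline described above (tracked pose → action recognition → preset action) can be sketched as a nearest-template matcher. This is a minimal illustration under stated assumptions: the gesture labels, template vectors, and device commands below are invented for the example and do not come from the patent, and a real system would classify full skeleton trajectories rather than short feature vectors.

```python
import math

# Hypothetical preset actions: each gesture label maps to a device command.
PRESET_ACTIONS = {
    "thumbs_up": ("curtain", "open"),
    "wave_left": ("air_conditioner", "power_on"),
}

# Illustrative pose templates (in practice, learned from captured point clouds).
TEMPLATES = {
    "thumbs_up": [0.0, 1.0, 0.2],
    "wave_left": [1.0, 0.0, 0.8],
}

def classify_gesture(pose, templates, threshold=0.5):
    """Match a normalized pose vector against gesture templates by Euclidean
    distance; return the closest label, or None if nothing is close enough."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = math.dist(pose, template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None

def handle_frame(pose):
    """Return the (device, command) pair for a captured pose, if any."""
    label = classify_gesture(pose, TEMPLATES)
    return PRESET_ACTIONS.get(label)
```

The distance threshold is what lets the system ignore arbitrary movement: a pose far from every template produces no command at all.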
The voice command issued by the user may be received through a microphone, for example through the array microphone built into the Kinect device, or through a voice Bluetooth remote controller.
In step S120, if the user is captured performing the preset action and/or a voice command issued by the user is received, the corresponding device is controlled according to the preset action and/or the voice command.
Specifically, the corresponding device includes the device itself and other devices. If the user is captured performing a preset action, the corresponding device is controlled according to that action: the first control command corresponding to the preset action is identified and sent to the corresponding device, so that the device executes the corresponding operation according to the first control command. The preset actions for controlling different devices and their corresponding first control commands are configured in advance; when the user is captured performing a preset action, the first control command corresponding to that action is obtained and sent to the corresponding device, and upon receiving it the device executes the corresponding operation.
Controlling the corresponding device according to the voice command includes: identifying the semantics of the voice command, so as to determine from the identified semantics the corresponding second control command; and sending the second control command to the corresponding device, so that the device executes the corresponding operation according to the second control command. The semantics corresponding to the second control commands used to control each device are configured in advance; when a voice command issued by the user is received, semantic recognition is performed on it, the corresponding second control command is determined from the identified semantics and sent to the corresponding device, and upon receiving it the device executes the corresponding operation.
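The semantics-to-command lookup described above might be sketched as follows. This is a hedged illustration: the utterance phrases, device names, and commands are assumptions for the example, and the upstream speech-to-text and semantic recognition (which the patent attributes to the Kinect array microphone or a Bluetooth remote) are taken as already done.

```python
# Hypothetical mapping from recognized semantics to the "second control
# command" for the relevant device; all entries are illustrative only.
SEMANTIC_COMMANDS = {
    "open the curtain": ("curtain", "open"),
    "turn on the air conditioner": ("air_conditioner", "power_on"),
    "turn off the light": ("light", "power_off"),
}

def resolve_voice_command(utterance):
    """Normalize a transcribed utterance (case and whitespace) and look up
    its second control command; return None for unrecognized speech."""
    key = " ".join(utterance.lower().split())
    return SEMANTIC_COMMANDS.get(key)
```

A production system would use fuzzy or intent-based matching rather than exact strings; the table form only shows where the preset semantics live.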
The first control command and/or the second control command may be sent to the corresponding device through at least one of a LAN, WiFi, a router, and Bluetooth. Specifically, the device to be controlled may be connected through at least one of these communication modes, so that the first control command and/or the second control command can be sent to it.
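Whatever the transport (LAN, WiFi, router, or Bluetooth), the command must be serialized into a payload the target device can parse. A minimal sketch, assuming a JSON-over-socket framing that the patent does not specify:

```python
import json

def build_command_packet(device_id, command, seq=0):
    """Serialize a control command into a JSON payload suitable for sending
    to a device over LAN/Wi-Fi (framing and transport are assumed)."""
    packet = {"device": device_id, "command": command, "seq": seq}
    return json.dumps(packet, sort_keys=True).encode("utf-8")

def parse_command_packet(raw):
    """Decode a received packet back into its (device, command) pair."""
    packet = json.loads(raw.decode("utf-8"))
    return packet["device"], packet["command"]
```

The sequence number is one common way to let a device discard duplicate or reordered commands on an unreliable link.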
Fig. 3 is a schematic diagram of another embodiment of the device control method provided by the present invention. As shown in Fig. 3, based on the above embodiment, the device control method further includes step S100.
In step S100, the position of the user within the predetermined area is detected, so as to determine, from that position, the sub-region in which the user is located.
Specifically, the predetermined area includes more than one sub-region, and a corresponding Kinect device may be arranged in each sub-region. Before motion capture is performed on the user by a Kinect device, the position of the user within the predetermined area is detected, and the sub-region in which the user is located is determined from it. Thus, in step S110, motion capture is performed on the user by the Kinect device corresponding to that sub-region; that is, only the Kinect device nearest the user is activated, rather than all Kinect motion-capture devices, which reduces energy consumption.
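The "activate only the nearest Kinect" selection above reduces to a minimum-distance search over the installed devices. A sketch under stated assumptions — the device ids mirror A1–A4 from Fig. 2, but the room coordinates are invented for the example:

```python
import math

# Illustrative 2-D coordinates for Kinects at the four corners of a 10 m room.
KINECT_POSITIONS = {
    "A1": (0.0, 0.0),
    "A2": (10.0, 0.0),
    "A3": (0.0, 10.0),
    "A4": (10.0, 10.0),
}

def nearest_kinect(user_pos, kinect_positions=KINECT_POSITIONS):
    """Return the id of the Kinect closest to the detected user position,
    so only that device is switched on for motion capture."""
    return min(kinect_positions,
               key=lambda k: math.dist(user_pos, kinect_positions[k]))
```

In practice the sub-region boundaries could be precomputed (a Voronoi partition of the Kinect positions) so the lookup does not have to scan every device per frame.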
Detecting the position of the user within the predetermined area may be achieved by at least one of the following:
(1) Detecting the user's position by an infrared sensor.
Since any substance whose temperature is above absolute zero emits infrared radiation, the user's position can be detected by an infrared sensor.
(2) Performing sound-source localization on the user by an array microphone.
The Kinect device is internally provided with an array microphone, i.e., at least two microphones. Sound is received by the microphone array formed by the at least two microphones, and the angle at which the sound source is incident on the array is then calculated by a preset algorithm, from which the direction of the sound source can be judged.
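For the two-microphone case, the incidence angle mentioned above can be recovered from the difference in arrival time between the microphones. This is a simplified far-field TDOA sketch, not the actual Kinect algorithm (the Kinect uses a four-microphone beamforming array); the geometry is the standard relation delay = spacing · sin(angle) / c:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate in air at room temperature

def arrival_angle(delay_s, mic_spacing_m):
    """Estimate the sound source's incidence angle in radians (0 = broadside,
    i.e., directly in front of the two-mic array) from the inter-microphone
    time delay, under a far-field plane-wave assumption."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp numerical overshoot before asin
    return math.asin(s)
```

A delay of zero means the source is equidistant from both microphones (straight ahead); the maximum physical delay, spacing / c, corresponds to a source directly off one end of the array.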
(3) Determining the user's position by detecting the strength of a wireless signal emitted by a device carried by the user.
If the user carries a device that can emit a wireless signal, such as a mobile phone, bracelet, or watch, the user can be located over the network according to the strength of the wireless signal emitted by that device.
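One common way to turn signal strength into a position estimate — assumed here, since the patent does not name a model — is the log-distance path-loss formula, which converts a received RSSI into an approximate distance from the receiver:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate the distance in meters to a wireless transmitter from its
    received signal strength, using the log-distance path-loss model.
    tx_power_dbm is the assumed RSSI measured at 1 m (device-specific);
    path_loss_exp is ~2.0 in free space, higher indoors."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

Distances from two or more receivers (e.g., the gateway and a router) could then be combined by trilateration to pick the user's sub-region; a single RSSI reading only bounds the range.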
The technical scheme of the present invention thus controls smart home devices by prescribed actions or voice commands. For example, when the user wants to open a curtain, a thumbs-up gesture alone can represent opening the curtain, so the curtain can be controlled to open. Therefore, with the technical scheme of the present invention, smart home devices can be controlled without using control terminals such as mobile phones, control panels, and remote controllers, which improves the user experience.
Fig. 4 is a schematic structural diagram of one embodiment of the device control apparatus provided by the present invention. The apparatus can be used to control the device itself and/or other devices. The device itself includes a gateway, a router, or an electric appliance (for example, an air conditioner, refrigerator, washing machine, television set, water heater, microwave oven, etc.); the other devices include at least one of an air conditioner, refrigerator, washing machine, television set, water heater, and microwave oven. The device controls the other devices by communicating with them, for example by networking with the above devices through a gateway and/or router.
As shown in Fig. 4, the device control apparatus 100 includes a capture unit 110 and/or a receiving unit 120, and further includes a control unit 130.
The capture unit 110 is configured to perform motion capture on a user within a predetermined area by a Kinect device, and/or the receiving unit 120 is configured to receive a voice command issued by the user. The control unit 130 is configured to control the corresponding device according to the preset action and/or the voice command if the capture unit captures the user performing a preset action and/or the receiving unit receives a voice command issued by the user.
Specifically, preset actions for controlling different devices are configured in advance, and may be set by the user. Kinect motion-capture technology is used to perform motion capture on the user in the predetermined area: at least one Kinect device (such as a Kinect camera) is arranged in the predetermined area, and the capture unit 110 performs motion capture on the user through the at least one Kinect device and identifies whether the action made by the user is a preset action. For example, as shown in Fig. 2, the invention is implemented in gateway C, which controls devices B1, B2, B3, and B4 in living area 1 (such as a room); Kinect devices A1, A2, A3, and A4 are installed at its four corners. The Kinect device captures depth information, a preset program generates a three-dimensional point cloud of the action, which is then mapped onto a model in VR mode to track the human motion trajectory, and machine-learning techniques are used for action recognition.
The receiving unit 120 may receive the voice command issued by the user through a microphone, for example the array microphone built into the Kinect device, or through a voice Bluetooth remote controller.
If the capture unit 110 captures the user performing the preset action and/or the receiving unit 120 receives a voice command issued by the user, the control unit 130 controls the corresponding device according to the preset action and/or the voice command.
Specifically, the corresponding device includes the device itself and other devices. If the capture unit 110 captures the user performing a preset action, the control unit 130 controls the corresponding device according to that action: it identifies the first control command corresponding to the preset action and sends it to the corresponding device, so that the device executes the corresponding operation according to the first control command. The preset actions for controlling different devices and their corresponding first control commands are configured in advance; when the capture unit 110 captures the user performing a preset action, the control unit 130 obtains the first control command corresponding to that action and sends it to the corresponding device, and upon receiving it the device executes the corresponding operation.
The control unit 130 controls the corresponding device according to the voice command by: identifying the semantics of the voice command, so as to determine the corresponding second control command from the identified semantics; and sending the second control command to the corresponding device, so that the device executes the corresponding operation according to it. The semantics corresponding to the second control commands used to control each device are configured in advance; when the receiving unit 120 receives a voice command issued by the user, the control unit 130 performs semantic recognition on it, determines the corresponding second control command from the identified semantics, and sends it to the corresponding device, and upon receiving it the device executes the corresponding operation.
The control unit 130 may send the first control command and/or the second control command to the corresponding device through at least one of a LAN, WiFi, a router, and Bluetooth. Specifically, the control unit 130 may connect the device to be controlled through at least one of these communication modes, so as to send it the first control command and/or the second control command.
Fig. 5 is a schematic structural diagram of another embodiment of the device control apparatus provided by the present invention. As shown in Fig. 5, based on the above embodiment, the device control apparatus 100 further includes a detection unit 102.
The detection unit 102 is configured to detect the position of the user within the predetermined area before the capture unit 110 performs motion capture on the user by a Kinect device, so that motion capture can be performed on the user by the Kinect device corresponding to the user's position.
Specifically, the predetermined area includes more than one sub-region, and a corresponding Kinect device may be arranged in each sub-region. Before the capture unit 110 performs motion capture on the user by a Kinect device, the detection unit 102 detects the position of the user within the predetermined area and determines from it the sub-region in which the user is located, so that the capture unit 110 performs motion capture on the user by the Kinect device corresponding to that sub-region; that is, only the Kinect device nearest the user is activated, rather than all Kinect motion-capture devices, which reduces energy consumption.
The detection unit 102 may detect the position of the user within the predetermined area by at least one of the following:
(1) Detecting the user's position by an infrared sensor.
Since any substance whose temperature is above absolute zero emits infrared radiation, the user's position can be detected by an infrared sensor.
(2) Performing sound-source localization on the user by an array microphone.
The Kinect device is internally provided with an array microphone, i.e., at least two microphones. Sound is received by the microphone array formed by the at least two microphones, and the angle at which the sound source is incident on the array is then calculated by a preset algorithm, from which the direction of the sound source can be judged.
(3) Determining the user's position by detecting the strength of a wireless signal emitted by a device carried by the user.
If the user carries a device that can emit a wireless signal, such as a mobile phone, bracelet, or watch, the user can be located over the network according to the strength of the wireless signal emitted by that device.
Corresponding to the device control method, the present invention also provides a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the steps of any of the foregoing methods.
Corresponding to the device control method, the present invention also provides a device including a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of any of the foregoing methods.
Corresponding to the above device control apparatus, the present invention also provides a device comprising any of the foregoing device control apparatuses. The device may specifically include at least one of: a gateway, a router, an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven. The device can control itself or other devices through the device control apparatus.
Accordingly, in the solution provided by the present invention, motion capture is performed on a user in a predetermined area through a Kinect device, and/or a voice command issued by the user is received; if the user is captured making a predetermined action and/or a voice command issued by the user is received, a relevant device is controlled according to the predetermined action and/or the voice command. Smart home devices can thus be controlled by predetermined actions or voice commands, without resorting to control terminals such as mobile phones, control panels, or remote controls, which improves the user experience.
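The overall control flow summarised above can be sketched end to end as follows; this is illustrative only, and the command names, device identifier, and `send()` transport are hypothetical placeholders (a real system would transmit over LAN, WiFi, or Bluetooth as described in the claims):

```python
# End-to-end sketch: a captured predetermined action is mapped to a
# "first control command", a recognised voice command to a "second
# control command", and the command is sent to the relevant device.
from typing import List, Optional

ACTION_COMMANDS = {"raise_hand": "AC_POWER_ON", "wave": "AC_POWER_OFF"}
VOICE_COMMANDS = {"turn on the air conditioner": "AC_POWER_ON"}


def send(device: str, command: str) -> str:
    # Placeholder for the LAN / WiFi / Bluetooth transport.
    return f"{device} <- {command}"


def handle_event(action: Optional[str] = None,
                 speech: Optional[str] = None) -> List[str]:
    sent = []
    if action in ACTION_COMMANDS:                 # first control command
        sent.append(send("air_conditioner", ACTION_COMMANDS[action]))
    if speech is not None:
        cmd = VOICE_COMMANDS.get(speech.lower())  # second control command
        if cmd:
            sent.append(send("air_conditioner", cmd))
    return sent


result = handle_event(action="raise_hand")
```

Keeping the action table and the voice table separate mirrors the patent's split between the first and second control commands, so either input path can be enabled on its own.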
The functions described herein may be implemented in hardware, in software executed by a processor, in firmware, or in any combination thereof. If implemented in software executed by a processor, the functions may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Other examples and implementations fall within the scope and spirit of the present invention and the appended claims. For example, owing to the nature of software, the functions described above may be implemented using software executed by a processor, hardware, firmware, hardwiring, or any combination of these. In addition, the functional units may all be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
In the several embodiments provided in this application, it should be understood that the disclosed technical content may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is merely a division by logical function; in actual implementation there may be other divisions, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as part of the control apparatus may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The above are merely embodiments of the present invention and are not intended to limit it; those skilled in the art may make various modifications and variations to the invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall fall within the scope of the claims of the present invention.

Claims (16)

1. A device control method, characterized by comprising:
performing motion capture on a user in a predetermined area through a Kinect device, so as to identify whether the user makes a predetermined action, and/or receiving a voice command issued by the user;
if the user is captured making the predetermined action and/or the voice command issued by the user is received, controlling a relevant device according to the predetermined action and/or the voice command.
2. The method according to claim 1, characterized in that the predetermined area includes more than one sub-region; before performing motion capture on the user in the predetermined area through a Kinect device, the method further comprises:
detecting the user's position within the predetermined area, so as to determine, according to that position, the sub-region in which the user is located within the predetermined area;
performing motion capture on the user in the predetermined area through a Kinect device comprises:
performing motion capture on the user through the Kinect device corresponding to the sub-region in which the user is located within the predetermined area.
3. The method according to claim 2, characterized in that detecting the user's position within the predetermined area comprises:
detecting the user's position by means of an infrared sensor;
and/or
performing sound source localization on the user by means of a microphone array;
and/or
determining the user's position by detecting the strength of a wireless signal emitted by a device carried by the user.
4. The method according to any one of claims 1-3, characterized in that
controlling the relevant device according to the predetermined action comprises:
identifying a first control command corresponding to the predetermined action;
sending the first control command to the relevant device, so that the relevant device performs a corresponding operation according to the first control command;
and/or
controlling the relevant device according to the voice command comprises:
identifying a semantic meaning corresponding to the voice command, so as to determine, according to the identified semantic meaning, a second control command corresponding to the voice command;
sending the second control command to the relevant device, so that the relevant device performs a corresponding operation according to the second control command.
5. The method according to claim 4, characterized in that sending the first control command and/or the second control command to the relevant device comprises:
sending the first control command and/or the second control command to the relevant device through at least one of a LAN, WiFi, a router, and Bluetooth.
6. The method according to any one of claims 1-5, characterized in that the device includes at least one of: an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven.
7. A device control apparatus, characterized by comprising:
a capture unit, configured to perform motion capture on a user in a predetermined area through a Kinect device so as to identify whether the user makes a predetermined action, and/or a receiving unit, configured to receive a voice command issued by the user;
a control unit, configured to, if the capture unit captures the user making the predetermined action and/or the receiving unit receives the voice command issued by the user, control a relevant device according to the predetermined action and/or the voice command.
8. The apparatus according to claim 7, characterized in that the predetermined area includes more than one sub-region; the apparatus further comprises:
a detection unit, configured to, before the capture unit performs motion capture on the user in the predetermined area through a Kinect device, detect the user's position within the predetermined area, so as to determine, according to that position, the sub-region in which the user is located within the predetermined area;
the capture unit is further configured to perform motion capture on the user through the Kinect device corresponding to the sub-region in which the user is located within the predetermined area.
9. The apparatus according to claim 8, characterized in that the detection unit detecting the user's position within the predetermined area comprises:
detecting the user's position by means of an infrared sensor;
and/or
performing sound source localization on the user by means of a microphone array;
and/or
determining the user's position by detecting the strength of a wireless signal emitted by a device carried by the user.
10. The apparatus according to any one of claims 7-9, characterized in that
the control unit controlling the relevant device according to the predetermined action comprises:
identifying a first control command corresponding to the predetermined action;
sending the first control command to the relevant device, so that the relevant device performs a corresponding operation according to the first control command;
and/or
the control unit controlling the relevant device according to the voice command comprises:
identifying a semantic meaning corresponding to the voice command, so as to determine, according to the identified semantic meaning, a second control command corresponding to the voice command;
sending the second control command to the relevant device, so that the relevant device performs a corresponding operation according to the second control command.
11. The apparatus according to claim 10, characterized in that the control unit sending the first control command and/or the second control command to the relevant device comprises:
sending the first control command and/or the second control command to the relevant device through at least one of a LAN, WiFi, a router, and Bluetooth.
12. The apparatus according to any one of claims 7-11, characterized in that the device includes at least one of: an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven.
13. A computer-readable storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the steps of the method according to any one of claims 1-6 are implemented.
14. A device, characterized by comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method according to any one of claims 1-6.
15. A device, characterized by comprising the device control apparatus according to any one of claims 7-12.
16. The device according to claim 14 or 15, characterized by comprising at least one of: a gateway, a router, an air conditioner, a refrigerator, a washing machine, a television set, a water heater, and a microwave oven.
CN201810057973.7A 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment Active CN108375911B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810057973.7A CN108375911B (en) 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810057973.7A CN108375911B (en) 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment

Publications (2)

Publication Number Publication Date
CN108375911A true CN108375911A (en) 2018-08-07
CN108375911B CN108375911B (en) 2020-03-27

Family

ID=63015171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810057973.7A Active CN108375911B (en) 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN108375911B (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070254604A1 (en) * 2006-05-01 2007-11-01 Kim Joon Sik Sound Communication Network
CN101198030A (en) * 2007-12-18 2008-06-11 北京中星微电子有限公司 Camera locating method and locating device of video monitoring system
CN101587606A (en) * 2008-05-21 2009-11-25 上海新联纬讯科技发展有限公司 Method and system for detecting staff flow in exhibition venue
CN102004900A (en) * 2010-11-04 2011-04-06 北京理工大学 Novel infrared index point coding manner applied to large-scale tracking system
CN102824092A (en) * 2012-08-31 2012-12-19 华南理工大学 Intelligent gesture and voice control system of curtain and control method thereof
CN102932212A (en) * 2012-10-12 2013-02-13 华南理工大学 Intelligent household control system based on multichannel interaction manner
CN104750085A (en) * 2015-04-23 2015-07-01 谢玉章 Intelligent hardware voice body control method
CN105137771A (en) * 2015-07-27 2015-12-09 上海斐讯数据通信技术有限公司 Intelligent household appliance control system and method based on mobile terminal
CN205594339U (en) * 2016-04-13 2016-09-21 南京工业职业技术学院 Intelligent house control system is felt to body
CN107272902A (en) * 2017-06-23 2017-10-20 深圳市盛路物联通讯技术有限公司 Smart home service end, control system and control method based on body feeling interaction


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Yixiong, Wang Chengliang, Yin Yunfei: "A Concise Course on New Mobile Device Technologies" (《移动设备新技术简明教程》), 31 August 2016 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109839827A (en) * 2018-12-26 2019-06-04 哈尔滨拓博科技有限公司 A kind of gesture identification intelligent home control system based on total space location information
CN109839827B (en) * 2018-12-26 2021-11-30 哈尔滨拓博科技有限公司 Gesture recognition intelligent household control system based on full-space position information
CN110613457A (en) * 2019-08-23 2019-12-27 珠海格力电器股份有限公司 Detection method and device
CN111306714A (en) * 2020-03-03 2020-06-19 青岛海尔空调器有限总公司 Air conditioner and control method thereof
CN112965592A (en) * 2021-02-24 2021-06-15 中国工商银行股份有限公司 Equipment interaction method, device and system
CN114488831A (en) * 2022-01-10 2022-05-13 江苏博子岛智能产业技术研究院有限公司 Internet of things intelligent home control system and method based on human-computer interaction
CN114488831B (en) * 2022-01-10 2023-09-08 锋芒科技南京有限公司 Internet of things household intelligent control system and method based on man-machine interaction

Also Published As

Publication number Publication date
CN108375911B (en) 2020-03-27

Similar Documents

Publication Publication Date Title
CN108375911A (en) Equipment control method and device, storage medium and equipment
CN109074819A (en) Preferred control method based on operation-sound multi-mode command and the electronic equipment using it
CN105116783B (en) control interface switching method and device
US11061385B2 (en) Method, apparatus and system for controlling device
CN105830397B (en) Method, portable operating device, system for switching a household appliance between a home mode and a non-home mode
US20180048482A1 (en) Control system and control processing method and apparatus
CN108702313A (en) Intelligent home voice control method, device, equipment and system
US10295972B2 (en) Systems and methods to operate controllable devices with gestures and/or noises
EP3857860B1 (en) System and method for disambiguation of internet-of-things devices
WO2020024546A1 (en) Auxiliary speech control method and device and air conditioner
CN109944019A (en) A kind of laundry cleaning systems based on the interconnection of full room
CN104615000B (en) Control method and device of control system of intelligent household equipment
CN109974235A (en) Method and device for controlling household appliance and household appliance
CN105204357A (en) Contextual model regulating method and device for intelligent household equipment
CN110475273A (en) Mesh network-building method and device for Mesh networking
CN105627513B (en) control method, device and system of air conditioner
CN110426962A (en) A kind of control method and system of smart home device
CN107101235A (en) Range hood control method and device and range hood
CN105700367A (en) A method and apparatus using bearing gestures to carry out remote control of smart household equipment
CN104062909A (en) Household appliance equipment and control device and method thereof
CN105245416A (en) Household appliance controlling method and device
CN108919658A (en) Smart machine control method and device
CN109147056A (en) Electric appliance control method and device, storage medium and mobile terminal
CN111884887A (en) Voice interaction method and device, storage medium and electronic device
CN110632854A (en) Voice control method and device, voice control node and system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant