CN115390468A - Device control method, device, electronic device and storage medium - Google Patents

Device control method, device, electronic device and storage medium

Info

Publication number
CN115390468A
CN115390468A (application CN202210929095.XA)
Authority
CN
China
Prior art keywords
target
moving direction
determining
detection area
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210929095.XA
Other languages
Chinese (zh)
Inventor
曾昭泽
宋志龙
姚沁
刘莹胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lumi United Technology Co Ltd
Original Assignee
Lumi United Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lumi United Technology Co Ltd filed Critical Lumi United Technology Co Ltd
Priority to CN202210929095.XA priority Critical patent/CN115390468A/en
Publication of CN115390468A publication Critical patent/CN115390468A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2642Domotique, domestic, home control, automation, smart house
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Embodiments of the present application provide a device control method and apparatus, an electronic device, and a storage medium, relating to the technical field of the Internet of Things. The method comprises: receiving a reflected signal formed by reflection off a target, and determining position data of the target from the reflected signal, where the position data indicates the position of the target in a detection area; tracking and locating the target in the detection area according to the position data, and determining the trajectory distribution of the target in the detection area; and computing the trajectory trend change from the trajectory distribution of the target in the detection area to determine the moving direction of the target, so that if the moving direction of the target satisfies a device linkage condition of a controlled device, an action corresponding to that moving direction is executed. The embodiments of the present application address the poor flexibility of device control in the related art.

Description

Device control method, device, electronic device and storage medium
Technical Field
The application relates to the technical field of internet of things, in particular to a device control method and device, an electronic device and a storage medium.
Background
With the development of Internet of Things technology, control of smart devices has become widely applied in the smart home field.
In the current smart home field, control of smart devices mostly relies on human presence detection. To control smart devices more intelligently, control based on recognizing the direction of human movement has also gradually appeared. However, the movement directions that can currently be recognized are rather limited, which makes control of smart devices inflexible and unable to meet user needs.
Therefore, how to improve the flexibility of device control remains to be solved.
Disclosure of Invention
Embodiments of the present application provide a device control method, a device control apparatus, an electronic device, and a storage medium, which can solve the problem of poor flexibility of device control in the related art. The technical solutions are as follows:
According to an aspect of an embodiment of the present application, a device control method includes: receiving a reflected signal formed by reflection off the target, and determining position data of the target from the reflected signal, where the position data indicates the position of the target in the detection area; tracking and locating the target in the detection area according to the position data, and determining the trajectory distribution of the target in the detection area; and computing the trajectory trend change from the trajectory distribution of the target in the detection area to determine the moving direction of the target, so that if the moving direction of the target satisfies a device linkage condition of the controlled device, the action corresponding to that moving direction is executed.
According to an aspect of an embodiment of the present application, a device control apparatus includes: a position determining module, configured to receive a reflected signal formed by reflection off the target and determine position data of the target from the reflected signal, where the position data indicates the position of the target in the detection area; a target tracking module, configured to track and locate the target in the detection area according to the position data and determine the trajectory distribution of the target in the detection area; and a movement determining module, configured to compute the trajectory trend change from the trajectory distribution of the target in the detection area and determine the moving direction of the target, so that if the moving direction of the target satisfies a device linkage condition of the controlled device, the action corresponding to that moving direction is executed.
According to an aspect of an embodiment of the present application, an electronic device includes at least one processor, at least one memory, and at least one communication bus, where the memory stores a computer program and the processor reads the computer program from the memory through the communication bus; when executed by the processor, the computer program implements the device control method described above.
According to an aspect of an embodiment of the present application, a storage medium has a computer program stored thereon, and the computer program, when executed by a processor, implements the device control method as described above.
According to an aspect of an embodiment of the present application, a computer program product includes a computer program stored in a storage medium; a processor of a computer device reads the computer program from the storage medium and executes it, so that the computer device implements the device control method described above.
The technical solution provided by the application has the following beneficial effects:
In this technical solution, the position data of the target is determined from the reflected signal formed by reflection off the target. The target is tracked and located in the detection area based on its position data, and its trajectory distribution in the detection area is determined. The trajectory trend change is then obtained by computation on the trajectory distribution, which determines the moving direction of the target, so that the controlled device can execute the action corresponding to that moving direction when it satisfies the device linkage condition. In other words, more moving directions of the target can be recognized based on the trajectory trend change, avoiding the limited set of human movement directions that can currently be recognized, and thereby effectively solving the poor flexibility of device control in the related art.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
FIG. 1 is a schematic illustration of an implementation environment according to the present application;
FIG. 2 is a flow chart illustrating a method of controlling a device according to an exemplary embodiment;
FIG. 3 is a flow diagram for one embodiment of step 310 in a corresponding embodiment of FIG. 2;
FIG. 4 is a schematic illustration of the coordinate transformation involved in step 313 of the corresponding embodiment of FIG. 2;
FIG. 5 is a schematic diagram illustrating different trajectory profiles in accordance with an exemplary embodiment;
FIG. 6 is a top view of a horizontal plane of a detection zone shown according to an exemplary embodiment;
FIG. 7 is a flow chart for one embodiment of step 330 in the corresponding embodiment of FIG. 2;
FIG. 8 is a schematic diagram illustrating a clustering process in accordance with an exemplary embodiment;
FIG. 9 is a schematic diagram illustrating the formation of a trajectory profile according to an exemplary embodiment;
FIG. 10 is a flowchart of one embodiment of step 335 in the corresponding embodiment of FIG. 7;
FIG. 11 is a flow diagram for one embodiment of step 350 of the corresponding embodiment of FIG. 2;
FIG. 12 is a detailed diagram of a method of controlling a device according to an application scenario;
FIG. 13 is a flow chart of the movement direction determination of an object in the horizontal direction in the application scenario of FIG. 12;
FIG. 14 is a flow chart of the determination of the direction of movement of an object in the vertical direction in the application scenario of FIG. 12;
fig. 15 is a block diagram showing a configuration of an apparatus control device according to an exemplary embodiment;
FIG. 16 is a diagram illustrating a hardware configuration of an electronic device in accordance with an exemplary embodiment;
fig. 17 is a block diagram illustrating a configuration of an electronic device according to an example embodiment.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
The following is a description and explanation of several terms involved in the present application:
FFT: Fast Fourier Transform.
K-MEANS: the K-means clustering algorithm.
DBF: Digital Beam Forming.
DBSCAN: Density-Based Spatial Clustering of Applications with Noise, a density-based clustering algorithm.
KALMAN: Kalman filtering.
As described above, the movement directions currently recognized for the human body are limited; for example, distance recognition and left-right recognition based on sensor technology offer poor control flexibility for smart devices and can hardly meet user needs.
Specifically, distance recognition based on sensor technology uses ultrasonic proximity sensing to acquire and determine the distance between a target and the sensor.
Left-right recognition based on sensor technology uses two separately installed infrared sensors (for example, infrared sensor A on the left and infrared sensor B on the right) to recognize whether the target moves left or right.
In addition, movement direction recognition based on infrared technology has weak anti-interference capability: it is easily disturbed by environmental factors such as temperature changes or ambient light changes, and it can hardly recognize the movement of multiple people accurately.
As can be seen from the above, the related art still suffers from poor flexibility in device control.
Therefore, the device control method provided by this application can effectively improve the flexibility of device control. The method is suitable for a device control apparatus, which can be deployed in an electronic device; for example, the electronic device may be a smart device configured with a millimeter-wave radar, such as a human presence sensor.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an implementation environment involved with a device control method. The implementation environment includes a user terminal 110, an intelligent device 130, a gateway 150, a server side 170, and a router 190.
Specifically, the user terminal 110 (which may also simply be called a terminal) can deploy (that is, install) a client associated with the smart device 130. The user terminal 110 may be an electronic device with display and control functions, such as a smartphone, tablet computer, notebook computer, desktop computer, or smart control panel, and is not limited herein.
The client is associated with the smart device 130: essentially, a user registers an account in the client and configures the smart device 130 there, for example by adding a device identifier for it. When the client runs on the user terminal 110, it can then provide the user with functions such as device display and device control for the smart device 130. The client may take the form of an application program or a web page, and correspondingly the client's device-display interface may be a program window or a web page, which is not limited herein.
The smart device 130 is deployed under the gateway 150 and communicates with the gateway 150 through its own communication module, so as to be controlled by the gateway 150. It should be understood that the smart device 130 refers to one of a plurality of smart devices 130, and is used only for illustration; that is, the embodiments of the present application do not limit the number or type of devices deployed under the gateway 150. In one application scenario, the smart device 130 accesses the gateway 150 through a local area network and is thereby deployed under it. This access process is as follows: the gateway 150 first establishes a local area network, and the smart device 130 joins it by connecting to the gateway 150. Such local area networks include, but are not limited to, ZigBee or Bluetooth. The smart device 130 may be a smart printer, smart fax machine, smart camera, smart air conditioner, smart door lock, or smart lamp, or an electronic device equipped with a communication module, such as a human presence sensor, door and window sensor, temperature and humidity sensor, water leak sensor, natural gas alarm, smoke alarm, wall switch, wall socket, wireless switch, wireless wall switch, cube controller, curtain motor, and the like.
The interaction between the user terminal 110 and the smart device 130 may be implemented through a local area network or a wide area network. In an application scenario, the user terminal 110 establishes a wired or wireless communication connection with the gateway 150 through the router 190, for example, the wired or wireless communication connection includes but is not limited to WIFI, so that the user terminal 110 and the gateway 150 are deployed in the same local area network, and further, the user terminal 110 may implement interaction with the smart device 130 through a local area network path. In another application scenario, the user terminal 110 establishes a wired or wireless communication connection with the gateway 150 through the server 170, for example, the wired or wireless communication connection includes but is not limited to 2G, 3G, 4G, 5G, WIFI, and the like, so that the user terminal 110 and the gateway 150 are deployed in the same wide area network, and further the user terminal 110 can interact with the smart device 130 through a wide area network path.
The server 170 may be a single server, a server cluster formed by multiple servers, or a cloud, cloud platform, or cloud computing center formed by multiple servers, so as to better provide background services to a large number of user terminals 110. For example, the background services include a movement direction identification service.
Taking the smart device 130 responsible for detection (i.e., the detection device) to be a human presence sensor as an example: the sensor receives the reflected signal, determines the target's position data from it, tracks and locates the target according to that position data, determines the target's trajectory distribution in the detection area, computes the trajectory trend change from that distribution to determine the target's moving direction, and sends the moving direction to the gateway 150.
The gateway 150 receives the moving direction of the target, and sends a device control command to the controlled intelligent device 130 (i.e., the controlled device) when the moving direction of the target satisfies the device linkage condition.
At this time, the controlled device can receive the device control command transmitted from the gateway 150 and execute an operation according to the movement direction of the target in response to the device control command.
Of course, in other embodiments, through interaction between the human presence sensor and the server 170, the sensor may instead send the signal to the server 170 and let the server 170 provide the movement direction identification service. The server then notifies the gateway 150 of the target's moving direction, so that the gateway can send a device control instruction to the controlled device based on that direction, and the controlled device executes the corresponding action according to the instruction.
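The gateway-side linkage step described above can be sketched as a small dispatch table. This is a hypothetical illustration only: the rule table, device names, and `send_command` callback are all invented for the example, not part of the patent.

```python
# Hypothetical sketch of the gateway-side linkage check: a rule maps a
# reported moving direction to an action on a controlled device.
LINKAGE_RULES = {
    "left_to_right": ("living_room_lamp", "turn_on"),
    "right_to_left": ("living_room_lamp", "turn_off"),
}

def on_moving_direction(direction, send_command):
    """If the reported direction satisfies a device linkage condition,
    dispatch the corresponding device control command."""
    rule = LINKAGE_RULES.get(direction)
    if rule is None:
        return None                      # no linkage condition matched
    device_id, action = rule
    send_command(device_id, action)      # e.g. forwarded over the local network
    return device_id, action

sent = []
on_moving_direction("left_to_right", lambda d, a: sent.append((d, a)))
# sent == [("living_room_lamp", "turn_on")]
```

In a real deployment the dispatch would of course be driven by user-configured automation rules rather than a hard-coded table.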
Referring to fig. 2, an embodiment of the present application provides a device control method, which is applicable to an electronic device, where the electronic device may be a server 170 in the implementation environment shown in fig. 1, and may also be an intelligent device configured with a millimeter wave radar.
In the following method embodiments, for convenience of description, the main execution body of each step of the method is taken as an electronic device as an example, but the configuration is not particularly limited.
As shown in fig. 2, the method may include the steps of:
step 310, receiving a reflection signal formed by the reflection of the target, and determining position data of the target according to the reflection signal.
Here, the target refers to any object that can move within a detection area of the detection device, for example, the any object may be a human, and may also be an animal, a robot, or the like. It should be noted that the detection device may be any type of sensor, and accordingly, the detection area refers to an area where the detection device can effectively detect the target.
Taking the detection device as an example of a human body sensor configured with a millimeter wave radar, the millimeter wave radar is configured with an antenna array, the antenna array includes a transmitting antenna and a receiving antenna, the human body sensor transmits a millimeter wave signal (i.e., a sensing signal) to a target through the transmitting antenna in a process of positioning the target in a detection area, receives an echo signal (i.e., a reflection signal) formed by the millimeter wave signal through reflection of the target through the receiving antenna, and can determine the position of the target in the detection area through correlation processing. Here, the detection region refers to a region in which the transmitting antenna of the millimeter wave radar can effectively transmit a millimeter wave signal.
Next, it is explained that the position data is used to indicate the position of the object in the detection area. In one possible implementation, the position data is represented by coordinates in a first coordinate system, which is a rectangular coordinate system of a horizontal plane in which the detection area is located. For example, taking a human body sensor as an example, if the position data is represented by coordinates (x, y), the coordinate value x represents the horizontal distance between the target and the horizontal plane where the human body sensor is located in the detection area, and the coordinate value y represents the vertical distance between the target and the horizontal plane where the human body sensor is located in the detection area.
In one possible implementation, as shown in fig. 3, the step 310 of determining the position data of the target according to the reflected signal may include the following steps:
and 311, performing spectrum analysis according to the reflected signals, and determining the movement data of the target.
The movement data of the target is different from the position data, and the position of the target in the detection area is described through coordinates in a second coordinate system, wherein the second coordinate system is a polar coordinate system of a horizontal plane where the detection area is located. In one possible implementation, the movement data of the target includes, but is not limited to: distance, speed, angle. For example, taking a human body sensor as an example, if the movement data is represented by coordinates (R, v, θ), a coordinate value R represents a radial distance between the target and the human body sensor on a horizontal plane on which the detection region is located, a coordinate value v represents a speed at which the target moves relative to the human body sensor, and a coordinate value θ represents an angle between a connection line between the target and the human body sensor and a y-axis of the horizontal plane on which the detection region is located, as shown in fig. 4.
The following describes a process of analyzing a spectrum of moving data of a target by taking a reflected signal as an echo signal as an example:
First, determine the time-domain signal S based on the echo-signal model X = AS + N; here the time-domain signal S depends on distance, velocity, and angle, and is written f(R, v, θ).
Second, perform an FFT on the time-domain signal S using the formula F(R, v, θ) = Σ_{n=0}^{N−1} f(R, v, θ) e^(−j2πkn/N), where N is the total number of sampling points and n indexes the n-th sampling point, to obtain the frequency-domain signal S′, written F(R, v, θ).
Third, compute the power spectrum of the frequency-domain signal S′ using P(R, v, θ) = |F(R, v, θ)|², obtaining the power-spectrum signal P(R, v, θ).
Fourth, take the peak of the power-spectrum signal as the movement data of the target, represented as (R, v, θ), where R is the distance, v the velocity, and θ the angle.
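The four steps above can be sketched in a reduced form. The sketch below assumes a single range dimension instead of the full (R, v, θ) spectrum, and uses a synthetic single-tone echo; the function name and signal parameters are illustrative, not from the patent.

```python
import numpy as np

def range_power_spectrum(echo, n_fft=None):
    """Sketch of steps two to four: FFT the time-domain echo, take the
    power spectrum, and report the peak bin (a 1-D stand-in for the
    (R, v, theta) peak described above)."""
    n_fft = n_fft or len(echo)
    spectrum = np.fft.fft(echo, n=n_fft)            # frequency-domain signal F
    power = np.abs(spectrum) ** 2                   # power spectrum P = |F|^2
    peak_bin = int(np.argmax(power[: n_fft // 2]))  # strongest reflection
    return power, peak_bin

# Synthetic echo: a single tone whose frequency would encode target range.
fs, n = 1000.0, 256
t = np.arange(n) / fs
echo = np.cos(2 * np.pi * 125.0 * t)
_, peak = range_power_spectrum(echo)
# 125 Hz falls exactly on bin 125 * 256 / 1000 = 32
```

In a real FMCW millimeter-wave pipeline this FFT would be applied across samples, chirps, and antennas to resolve range, velocity, and angle respectively.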
Step 313, converting the movement data of the target from the second coordinate system to the first coordinate system to obtain the position data of the target.
As described above, the movement data represents the target's position in the detection area with coordinates in the second coordinate system, while the position data uses coordinates in the first coordinate system. To describe the target's trajectory distribution in the detection area more accurately and effectively, the position data is obtained from the movement data; essentially, a coordinate-system conversion is performed between the two. This further improves the accuracy of movement direction recognition.
In one possible implementation, the first coordinate system is a rectangular coordinate system and the second coordinate system is a polar coordinate system. Continuing to refer to fig. 4, assuming that the target is located at point a, the movement data of the target is represented as (R, v, θ), in fig. 4, R represents the polar diameter of point a and θ represents the polar angle of point a. Then, the position data (X, Y) of the object can be obtained by the following calculation formula:
X = R × sin(θ);
Y = R × cos(θ).
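This polar-to-rectangular conversion is a one-liner; the sketch below applies the two formulas above, with θ measured from the y-axis as in Fig. 4 (the example values are illustrative).

```python
import math

def polar_to_position(r, theta_deg):
    """Convert movement data (R, theta) in the polar (second) coordinate
    system to position data (X, Y) in the rectangular (first) system,
    using X = R*sin(theta), Y = R*cos(theta)."""
    theta = math.radians(theta_deg)
    return r * math.sin(theta), r * math.cos(theta)

x, y = polar_to_position(5.0, 30.0)   # target 5 m away, 30 degrees off the y-axis
# x = 2.5, y ≈ 4.33
```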
and step 330, tracking and positioning the position of the target in the detection area according to the position data of the target, and determining the track distribution of the target in the detection area.
To avoid the limited set of recognizable human movement directions that makes smart device control inflexible, this embodiment recognizes the target's moving direction based on its trajectory distribution in the detection area. The trajectory distribution describes the different positions of the target in the detection area. It should be understood that different moving directions produce different positions of the target in the detection area, and hence different trajectory distributions; the moving direction can therefore be identified more accurately from the trajectory distribution, ensuring flexible smart device control.
In one possible implementation, trajectory tracking is implemented with filtering algorithms, such as the Kalman (KALMAN) algorithm, to find the different positions of the target in the detection area.
Therefore, the trajectory tracking algorithm can determine, from the target's position data, the target's successive positions in the detection area; connecting these positions forms the target's trajectory distribution in the detection area.
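A minimal constant-velocity Kalman filter illustrates the tracking step. This is a generic textbook sketch, not the patent's specific filter: the state is [x, y, vx, vy], and each noisy position measurement refines the track; all noise parameters are assumed.

```python
import numpy as np

class ConstantVelocityKalman:
    """Minimal 2-D constant-velocity Kalman filter: predict with the
    motion model, then correct with each position measurement."""
    def __init__(self, dt=0.1, q=1e-2, r=0.25):
        self.F = np.array([[1, 0, dt, 0],    # state transition: x += vx*dt
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0],     # we measure position only
                           [0, 1, 0, 0]], float)
        self.Q = q * np.eye(4)               # process noise (assumed)
        self.R = r * np.eye(2)               # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with measurement z = (x, y)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                    # smoothed (x, y) position

# Feed the filter a target moving steadily right and slightly up.
kf = ConstantVelocityKalman()
track = [tuple(kf.step((0.1 * k, 0.05 * k))) for k in range(20)]
```

The resulting list of smoothed positions, connected in order, is exactly the "trajectory distribution" the text describes.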
And 350, calculating the track trend change based on the track distribution of the target in the detection area, and determining the moving direction of the target.
First, the trajectory trend change essentially reflects the trend of the target's movement between different positions in the detection area.
Fig. 5 shows schematic diagrams of different trajectory distributions of the target in the detection area. In Fig. 5, the horizontal plane of the detection area corresponds to a rectangular coordinate system (the first coordinate system), whose X axis represents the horizontal direction of that plane and whose Y axis represents its vertical direction. Both trajectory distribution a and trajectory distribution b are curves running from the -X axis toward the +X axis while rising along the Y axis, but the upward trend of distribution a is more pronounced, indicating that the target tends to move upward, while the leftward trend of distribution b is more pronounced, indicating that the target tends to move leftward.
That is to say, the target's movement tendency cannot be read from the trajectory distribution alone; it is closely tied to the trajectory trend change. In this embodiment, therefore, the trajectory trend change corresponding to the trajectory distribution must be determined before the target's moving direction.
In one possible implementation, the algorithms for the trajectory trend change include at least the difference method, the difference-summation method, and the like. The trajectory trend change corresponding to the trajectory distribution can thus be determined through such an algorithm, and the moving direction of the target accurately reflected by it.
Fig. 6 shows a top view of the horizontal plane of the detection area. In Fig. 6, the X axis represents the horizontal direction of the horizontal plane of the detection area and the Y axis represents the vertical direction; the target can therefore move from left to right or from right to left in the horizontal direction, and from top to bottom or from bottom to top in the vertical direction.
Thus, in one possible implementation, the moving direction of the target includes: a first moving direction indicating that the target moves from left to right in the horizontal direction, a second moving direction indicating that the target moves from right to left in the horizontal direction, a third moving direction indicating that the target moves from bottom to top in the vertical direction, and a fourth moving direction indicating that the target moves from top to bottom in the vertical direction. Of course, in other embodiments, the moving direction of the target may be any combination of the above, for example: from left to right horizontally and from top to bottom vertically; from right to left horizontally and from top to bottom vertically; from left to right horizontally and from bottom to top vertically; or from right to left horizontally and from bottom to top vertically. This is not limited here.
With continued reference to Fig. 6, the inventor has realized that the detection device (e.g., a sensor) generally takes its central vertical line as a boundary; for example, the left side of the central vertical line of the detection device is defined as left and the right side as right. It can be understood that, while moving left or right in the horizontal direction, the target may or may not cross this central vertical line; equivalently, the target may or may not remain within a single quadrant of the first coordinate system throughout. Therefore, in one possible implementation, for the movement of the target in the horizontal direction, that is, for the first moving direction/the second moving direction, whether the target passes the detection device can be further distinguished. For example, the moving direction of the target is the first moving direction and the target passes the detection device, or the moving direction of the target is the second moving direction and the target does not pass the detection device. In this way, the accuracy of moving-direction recognition is further improved.
After the moving direction of the target is obtained, the detection device can send it to the gateway, which judges whether the moving direction satisfies the device linkage condition. If it does, the gateway can further control the controlled device to execute the action corresponding to the moving direction.
Through the above process, more moving directions of the target can be identified based on the trajectory trend change of the target, avoiding the problem that only a single moving direction of the human body can currently be recognized, and thereby effectively solving the poor flexibility of device control in the related art.
Referring to fig. 7, in an exemplary embodiment, step 330 may include the steps of:
in step 331, a plurality of position data of the target are determined.
Each piece of position data is determined according to the reflection signals received in the current time period.
Still taking the human body sensor as an example: as described above, one millimeter wave signal corresponds to one echo signal, from which, after correlation processing, one position of the target in the detection area can be determined; based on the multiple echo signals received within one time period, multiple positions of the target in the detection area can therefore be determined. Specifically, for the millimeter wave radar configured in the human body sensor, its transmitting antenna transmits multiple millimeter wave signals toward the target in each time period; correspondingly, the receiving antenna receives the echo signals formed by the target reflecting those millimeter wave signals, and after correlation processing, multiple positions of the target in the detection area, that is, multiple position data of the target, are obtained. It should be noted that the time period refers to the period during which the detection device receives multiple reflected signals (equivalently, transmits multiple sensing signals), and its duration can be adjusted flexibly according to the actual needs of the application scenario; for example, with a time period of 10 minutes, the millimeter wave radar transmits multiple millimeter wave signals in every 10 minutes. Of course, in other embodiments, the time period need not have a fixed duration but may instead be defined by a fixed number of reflected signals/sensing signals; for example, with the fixed number set to 10, the millimeter wave radar transmits 10 millimeter wave signals in each time period. This embodiment does not specifically limit this.
That is, for each time period, a plurality of positions of the object in the detection area may be determined.
Step 333, clustering the plurality of position data, and determining the centroid position of the target in the current time period.
In this embodiment, clustering refers to the process of grouping the multiple positions of the target in the detection area within the same time period into one category; through clustering, the positions of the target in the detection area across different time periods are thus divided into multiple categories. Each category corresponds to the centroid position of one time period; equivalently, the multiple positions of the target in the detection area within one time period are merged to form the centroid position for that time period. This reduces the computational load of tracking and positioning and speeds up moving-direction recognition.
In one possible implementation, the algorithms for implementing clustering include but are not limited to: the K-means algorithm, the DBSCAN algorithm, and the like.
Fig. 8 shows a schematic diagram of clustering the multiple positions within the same time period. As shown in Fig. 8, for a time period T, multiple positions 301 of the target in the detection area are determined before clustering, and the centroid position 302 for time period T is determined in the detection area after clustering.
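As a sketch of this step: with a single target and no clutter, the clustering degenerates to taking the mean of the detections in the time window (an assumption made here for brevity; K-means or DBSCAN would be used when multiple clusters or noise points are possible):

```python
# Collapse the positions detected in one time period into one centroid.
# Assumes all detections belong to a single target; with clutter, a real
# implementation would first cluster with K-means or DBSCAN.

def centroid(points):
    """points: non-empty list of (x, y) detections from one time period."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)

period_detections = [(1.0, 2.0), (1.2, 2.1), (0.8, 1.9)]
cx, cy = centroid(period_detections)
```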
Step 335: form the trajectory distribution of the target in the detection area according to the centroid positions of the target in different time periods.
Fig. 9 shows a schematic diagram of the trajectory distribution formed from centroid positions in different time periods. As shown in Fig. 9, in time period T_{n-1} the centroid position of the target in the detection area is 303, denoted (X_{n-1}, Y_{n-1}); as the target moves in the detection area, in time period T_n the centroid position of the target is 304, denoted (X_n, Y_n). From centroid position 303 and centroid position 304, the trajectory distribution 305 of the target in the detection area can be formed.
In one possible implementation, the trajectory distribution is implemented by a trajectory set: the trajectory set stores the centroid positions of the target in different time periods and thereby describes the trajectory distribution of the target in the detection area. The trajectory set may be a first-in-first-out queue, an array, a stack, or the like.
In one possible implementation, the trace sets are trace queues that follow a first-in-first-out principle.
Specifically, as shown in fig. 10, through steps 3351 to 3355, after the target enters the detection area, when the centroid position of the target in the current time period is obtained, if the trajectory queue is not full, the centroid position of the target in the current time period may be directly stored to the tail of the trajectory queue; and otherwise, if the track queue is full, deleting the centroid position of the target at the head of the track queue, and storing the centroid position of the target at the current time period to the tail of the track queue.
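Steps 3351 to 3355 above can be sketched with a bounded deque, whose automatic eviction of the head reproduces the full-queue behaviour (the queue length 5 is an assumed example value):

```python
# FIFO trajectory queue: when full, the oldest centroid is dropped from the
# head and the newest stored at the tail, as steps 3351-3355 describe.

from collections import deque

QUEUE_LEN = 5  # assumed example length
track_queue = deque(maxlen=QUEUE_LEN)

centroids = [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2), (5, 3)]
for c in centroids:
    track_queue.append(c)  # deque(maxlen=...) evicts the head when full

# After six appends into a 5-slot queue, (0, 0) has been evicted.
```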
Through the cooperation of the above embodiments, the trajectory distribution of the target in the detection area can be obtained accurately and used as the basis for the trajectory trend change calculation, enabling accurate judgment of the moving direction of the target and helping to improve the flexibility of device control.
Referring to FIG. 11, in an exemplary embodiment, step 350 may include the steps of:
step 351, calculating first trend data and/or second trend data according to the centroid positions of the targets in the trajectory distribution in different time periods.
The first trend data indicates the moving trend of the target in a first direction, and the second trend data indicates the moving trend of the target in a second direction. It should be noted that the first direction is the direction of the first coordinate axis of the first coordinate system, and the second direction is that of the second coordinate axis. For the rectangular coordinate system (i.e., the first coordinate system) corresponding to the horizontal plane of the detection area, the first direction is the horizontal direction along its X axis and the second direction is the vertical direction along its Y axis; the first trend data therefore reflects the moving trend of the target in the horizontal direction, and the second trend data that in the vertical direction.
The calculation of the first trend data and/or the second trend data is illustrated by way of example with a trajectory profile described by a trajectory queue following the first-in-first-out principle:
referring back to Fig. 10, in step 430 the first trend data and/or the second trend data are calculated from the centroid positions of the different time periods in the trajectory queue, using the following difference-summation formulas:

XR = Σ_{n=1}^{N-1} (XT_{n+1} - XT_n)

YR = Σ_{n=1}^{N-1} (YT_{n+1} - YT_n)

where XR denotes the first trend data and YR the second trend data; XT_{n+1} denotes the coordinate value on the X axis of the centroid position of the (n+1)-th time period T, and XT_n that of the n-th time period T; YT_{n+1} denotes the coordinate value on the Y axis of the centroid position of the (n+1)-th time period T, and YT_n that of the n-th time period T; and N denotes the length of the trajectory queue.
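A sketch of the difference-summation computation of the trend data: summing successive centroid differences telescopes to the net displacement over the queue, whose sign then indicates the dominant direction on each axis:

```python
# First/second trend data by difference summation over the queued centroids.
# Positive XR suggests left-to-right movement; positive YR, bottom-to-top.

def trend_data(queue):
    """queue: sequence of (x, y) centroid positions; returns (XR, YR)."""
    xr = sum(queue[i + 1][0] - queue[i][0] for i in range(len(queue) - 1))
    yr = sum(queue[i + 1][1] - queue[i][1] for i in range(len(queue) - 1))
    return xr, yr

xr, yr = trend_data([(0.0, 0.0), (0.5, 0.4), (1.2, 0.9), (2.0, 1.1)])
# Net displacement is +2.0 on X and +1.1 on Y.
```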
Further, the inventor has realized that when the target is a non-rigid maneuvering target, such as a human body, its movement may make the position data obtained by the detection device unstable, ultimately affecting the calculation accuracy of the first trend data and/or the second trend data. For this reason, with continued reference to Fig. 10, in this embodiment the method may further include the following step before step 430:
and step 410, smoothing the centroid positions of the targets in the trajectory queue in different time periods. Specifically, the smoothing process is performed according to the following calculation formula:
XT n =αXT n +(1-α)XT n-1
YT n =αYT n +(1-α)YT n-1
wherein XT n+1 Coordinate value, XT, of centroid position on X axis representing n +1 time period T n A coordinate value representing the centroid position of the nth time period T on the X axis;
YT n+1 coordinate value, YT, of the centroid position in the Y axis representing the (n + 1) th time period T n+1 A coordinate value indicating the centroid position in the Y axis for the nth time period T;
alpha is a filter coefficient and can be flexibly adjusted according to the actual needs of an application scene. For example, α is 0.3.
Based on this process, smoothing the centroid positions across time periods eliminates abnormal values, keeping the centroid positions in the trajectory queue as smooth and accurate as possible and improving the calculation accuracy of the first trend data and/or the second trend data.
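The smoothing recursion can be sketched as follows (α = 0.3 follows the example value given above; each smoothed value blends the new centroid with the previous smoothed one):

```python
# Exponential smoothing of queued centroids: x_n <- a*x_n + (1-a)*x_{n-1}.
# Damps jitter from a non-rigid target before the trend calculation.

ALPHA = 0.3  # filter coefficient, example value

def smooth(queue, alpha=ALPHA):
    """Return a smoothed copy of a list of (x, y) centroid positions."""
    out = [queue[0]]
    for x, y in queue[1:]:
        px, py = out[-1]
        out.append((alpha * x + (1 - alpha) * px,
                    alpha * y + (1 - alpha) * py))
    return out

# A spurious spike at (10, 0) is pulled back toward the running estimate:
smoothed = smooth([(0.0, 0.0), (10.0, 0.0), (1.0, 0.0)])
```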
It should be noted that, in the process of forming the trajectory distribution, if the target keeps moving after entering the detection area, the centroid positions of the target in successive time periods are obtained continuously through clustering and are stored into the trajectory queue in first-in-first-out order; once the trajectory queue is full, the calculation of the first trend data and/or the second trend data is triggered, that is, as shown in Fig. 10, step 410 is executed after step 3355. However, the inventors also realized that once the target stops moving in the detection area, or leaves it, no new centroid position is obtained; that is, there is no centroid position for the next time period relative to the current one. In that case, if the trajectory queue is not yet full, the calculation of the first trend data and/or the second trend data may never be triggered. For this reason, after step 3351, the method may further include: if there is no centroid position for the next time period, jump to step 410; otherwise, if there is one, jump to step 3353.
Based on the process, the real-time performance of the calculation of the first trend data and/or the second trend data is fully ensured, so that the continuous identification of the moving direction of the target in real time and accurately is realized.
Step 353: determine the moving direction of the target according to the first trend data and/or the second trend data.
On the one hand, the moving direction of the target is determined by comparing the first trend data with a first threshold.
Specifically, if the first trend data is greater than the first threshold, the moving direction of the target is a first moving direction, and the first moving direction is used for indicating that the target moves from left to right in the first direction; if the first trend data is not larger than the first threshold, the moving direction of the target is a second moving direction, and the second moving direction is used for indicating that the target moves from right to left in the first direction.
As described above, while moving left or right in the first direction, the target may pass the detection device or may instead keep moving within a single quadrant of the first coordinate system. Therefore, with respect to the first moving direction/the second moving direction, whether the target passes the detection device can be further distinguished. In one possible implementation, this is achieved by performing zero-crossing detection on the first trend data.
Specifically, zero-crossing detection is performed on the first trend data using the following rule:

Z = 1 if XT_{n+1} · XT_n < 0 for some adjacent pair of time periods in the trajectory distribution; otherwise Z = 0,

where Z denotes the zero-crossing detection result, with 1 meaning the target passes the detection device and 0 meaning it does not; XT_{n+1} denotes the coordinate value in the horizontal direction of the centroid position of time period T_{n+1} in the trajectory distribution, and XT_n that of time period T_n.
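A sketch of the zero-crossing test on the horizontal coordinates (taking x = 0 as the sensor's centre line, per the coordinate convention above):

```python
# Zero-crossing detection: Z = 1 when two successive centroid x values have
# opposite signs, i.e. the track crossed the sensor's centre line.

def crossed_zero(queue):
    """queue: sequence of (x, y) centroids; returns 1 if the x sign flips."""
    return int(any(queue[i][0] * queue[i + 1][0] < 0
                   for i in range(len(queue) - 1)))

z_pass = crossed_zero([(-2.0, 1.0), (-1.0, 1.1), (1.0, 1.2)])  # crosses x = 0
z_stay = crossed_zero([(1.0, 1.0), (2.0, 1.1), (3.0, 1.2)])    # stays one side
```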
On the other hand, the moving direction of the target is determined based on the comparison of the second trend data with the second threshold.
Specifically, if the second trend data is greater than the second threshold, the moving direction of the target is a third moving direction, and the third moving direction is used for indicating that the target moves from bottom to top in the second direction; if the second trend data is not greater than the second threshold, the moving direction of the target is a fourth moving direction, and the fourth moving direction is used for indicating that the target moves from top to bottom in the second direction.
Through the process, the track trend change calculation is carried out based on the track distribution of the target in the detection area, and the moving direction of the target can be accurately determined, so that the device can be flexibly controlled to execute the action corresponding to the moving direction of the target.
Fig. 12 is a schematic diagram of a specific implementation of the device control method in an application scenario. This application scenario applies to the implementation environment shown in Fig. 1, in which the smart device configured with the millimeter wave radar is a human body sensor and, accordingly, the target is a human body.
Taking a human body sensor as an example, the following description is made on the process of determining the moving direction of the target:
step 801, based on the millimeter wave radar configured by the human body sensor, a millimeter wave signal is transmitted to a person entering a detection area by a transmitting antenna of the millimeter wave radar, so that an echo signal formed by reflection of the person is received by a receiving antenna of the millimeter wave radar.
Step 802, after receiving the echo signal, the human body sensor performs correlation processing to obtain movement data of the human body, where the movement data at least includes distance, speed, angle, and the like.
Step 803, converting the movement data of the person from the polar coordinate system to the rectangular coordinate system to obtain position data of the person, wherein the position data is used for indicating the position of the person in the detection area.
And step 804, in the same time period, obtaining the centroid position of the person in the detection area in the time period from a plurality of positions of the person in the detection area through clustering.
Step 805, as the person moves in the detection area, a plurality of centroid positions of the person in the detection area are obtained correspondingly in different time periods, and the plurality of centroid positions are stored in the trajectory queue.
Step 806: smooth the centroid positions in the trajectory queue to form the trajectory distribution of the target in the detection area.
In step 807, the smoothed trajectory queue is subjected to trajectory trend change calculation to obtain first trend data XR and second trend data YR.
In step 808, the moving direction of the person in the detection area is determined according to the first trend data XR and the second trend data YR.
As shown in fig. 13, taking the first threshold value as 0 for example, if the first trend data XR is greater than 0 and the zero-crossing detection result =0, it is determined that the moving direction of the person is moving from left to right and does not pass through the human body sensor; if the first trend data XR is larger than 0 and the zero-crossing detection result =1, determining that the moving direction of the person moves from left to right and passes through the human body sensor; if the first trend data XR is smaller than 0 and the zero-crossing detection result =0, determining that the moving direction of the person is moving from right to left and does not pass through the human body sensor; if the first trend data XR is less than 0 and the zero crossing detection result =1, it is determined that the moving direction of the person is moving from right to left and passes through the body sensor.
As shown in fig. 14, taking the second threshold as 0 for example, if the second trend data YR is greater than 0, it is determined that the moving direction of the person is moving from bottom to top, that is, the person makes a far movement with respect to the human body sensor; if the second trend data YR is smaller than 0, it is determined that the movement direction of the person is moving from top to bottom, that is, the person makes an approaching motion with respect to the human body sensor.
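The threshold comparisons described above can be sketched as a small classifier (the direction labels are illustrative names, not from the patent; thresholds default to 0 as in the example):

```python
# Map trend data and the zero-crossing result to a moving direction.
# XR > 0: left to right; YR > 0: bottom to top (away from the sensor).

def classify(xr, yr, z, x_th=0.0, y_th=0.0):
    horizontal = "left_to_right" if xr > x_th else "right_to_left"
    vertical = "bottom_to_top" if yr > y_th else "top_to_bottom"
    passed_sensor = bool(z)
    return horizontal, vertical, passed_sensor

d1 = classify(2.0, 1.1, 1)    # rightward, away, passes the sensor
d2 = classify(-0.5, -0.2, 0)  # leftward, approaching, does not pass
```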
Thus, the human body sensor accurately determines the moving direction of the human body in the detection area.
In connection with the implementation environment shown in Fig. 1, the human body sensor sends the determined moving direction to the gateway, which judges whether the moving direction satisfies the device linkage condition and, if so, sends a corresponding device control command to the controlled device; the controlled device then performs the action corresponding to the person's moving direction in response to that command, that is, step 809 is performed.
For example, if it is determined that a person gradually approaches the human body sensor, the gateway controls a lamp to brighten gradually; if the person gradually moves away, the gateway controls the lamp to dim gradually. As another example, for a human body sensor arranged in an aisle: if it is determined that a person passes the sensor from left to right and the gateway determines that the user is in away-from-home mode, the switch, air conditioner, lamps and the like in the living room are turned off; if the person passes from right to left and the gateway determines that the user is in home mode, they are turned on.
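On the gateway side, the linkage examples above amount to a lookup from recognised direction to device action; a hypothetical sketch (the rule names and actions are illustrative, not from the patent):

```python
# Hypothetical gateway linkage table: moving direction -> device action.

LINKAGE_RULES = {
    "top_to_bottom": "light.brighten",  # person approaches the sensor
    "bottom_to_top": "light.dim",       # person moves away from the sensor
}

def linkage_action(direction):
    """Return the configured action, or None if no rule matches."""
    return LINKAGE_RULES.get(direction)

action = linkage_action("top_to_bottom")
no_rule = linkage_action("left_to_right")
```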
In this application scenario, the millimeter-wave-radar human positioning technology is combined with statistics on trajectory trend changes, so that multiple moving directions of the target (from left to right, from right to left, from top to bottom, from bottom to top, and whether the target passes the detection device) can be identified accurately and in real time. More personalized and more intelligent device control can thereby be realized, providing the user with a richer smart-home experience.
The following are embodiments of the apparatus of the present application that can be used to perform device control in accordance with the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the apparatus control method of the present application.
Referring to fig. 15, an embodiment of the present application provides an apparatus control device 1100, which includes, but is not limited to:
the position determining module 1110 is configured to receive the reflected signal, which is generated by reflecting the sensing signal by the target, and determine position data of the target according to the reflected signal, where the position data is used to indicate a position of the target in the detection area.
The target tracking module 1130 is configured to track and position the target according to the position data of the target, and determine the track distribution of the target in the detection area.
And a movement determining module 1150, configured to perform the trajectory trend change calculation based on the trajectory distribution of the target in the detection area and determine the moving direction of the target, so that the controlled device executes the action corresponding to the moving direction of the target if that moving direction satisfies the device linkage condition.
It should be noted that, when the device control apparatus provided in the foregoing embodiment performs device control, the division of the functional modules is merely illustrated, and in practical applications, the functions may be distributed to different functional modules according to needs, that is, the internal structure of the device control apparatus is divided into different functional modules to complete all or part of the functions described above.
In addition, the device control apparatus and the device control method provided by the above embodiments belong to the same concept, and the specific manner in which each module performs operations has been described in detail in the method embodiments, and is not described again here.
FIG. 16 shows a schematic of a structure of an electronic device according to an example embodiment. The electronic device is suitable for use in the server side 170 in the implementation environment shown in fig. 1.
It should be noted that the electronic device is only an example adapted to the present application and should not be considered as limiting its scope of use in any way. The hardware structure of the electronic device 2000 may vary greatly depending on configuration and performance. As shown in Fig. 16, the electronic device 2000 includes: a power supply 210, an interface 230, at least one memory 250, and at least one central processing unit (CPU) 270.
Specifically, the power supply 210 is used to provide operating voltages for various hardware devices on the electronic device 2000.
The interface 230 includes at least one wired or wireless network interface for interacting with external devices. For example, interaction between the smart device 130 and the server 170 in the implementation environment shown in FIG. 1 is performed.
Of course, in other examples of the present application, the interface 230 may further include at least one serial-to-parallel conversion interface 233, at least one input/output interface 235, at least one USB interface 237, and the like, as shown in Fig. 16; this is not limited here.
The storage 250 is used as a carrier for resource storage, and may be a read-only memory, a random access memory, a magnetic disk or an optical disk, etc., and the resources stored thereon include an operating system 251, an application 253, data 255, etc., and the storage manner may be a transient storage or a permanent storage.
The operating system 251 is used to manage and control the hardware devices and the application programs 253 on the electronic device 2000, so that the central processing unit 270 can operate on and process the mass data 255 in the memory 250; it may be Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
The application 253 is a computer program that performs at least one specific task on the operating system 251, and may include at least one module (not shown in fig. 16), each of which may respectively include a computer program for the electronic device 2000. For example, the device control apparatus can be regarded as an application 253 deployed in the electronic device 2000.
The data 255 may be a photograph, a picture, or the like stored in a magnetic disk, or may be a reflection signal, position data, or the like, and is stored in the memory 250.
The central processor 270 may include one or more processors and is configured to communicate with the memory 250 through at least one communication bus to read the computer programs stored in the memory 250 and thereby operate on and process the mass data 255. The device control method is accomplished, for example, by the central processor 270 reading a series of computer programs stored in the memory 250.
Furthermore, the present application can be implemented by hardware circuits or by hardware circuits in combination with software, and therefore, the implementation of the present application is not limited to any specific hardware circuits, software, or a combination of the two.
Referring to Fig. 17, an embodiment of the present application provides an electronic device 4000, which may include: a smart device configured with millimeter wave radar, a server, and the like.
In fig. 17, the electronic device 4000 includes at least one processor 4001, at least one communication bus 4002, and at least one memory 4003.
Processor 4001 is coupled to memory 4003, such as via communication bus 4002. Optionally, the electronic device 4000 may further include a transceiver 4004, and the transceiver 4004 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data. It should be noted that the transceiver 4004 is not limited to one in practical applications, and the structure of the electronic device 4000 is not limited to the embodiment of the present application.
The Processor 4001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or other Programmable logic device, transistor logic device, hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 4001 may also be a combination that performs a computational function, including, for example, a combination of one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
Communication bus 4002 may include a path that carries information between the aforementioned components. The communication bus 4002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus 4002 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 17, but this does not mean only one bus or one type of bus.
The memory 4003 may be a ROM (Read-Only Memory) or another type of static storage device capable of storing static information and instructions, a RAM (Random Access Memory) or another type of dynamic storage device capable of storing information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer; but it is not limited to these.
A computer program is stored in the memory 4003, and the processor 4001 reads the computer program stored in the memory 4003 through the communication bus 4002.
When executed by the processor 4001, the computer program implements the device control method of each of the above embodiments.
In addition, an embodiment of the present application provides a storage medium on which a computer program is stored; when executed by a processor, the computer program implements the device control method of the embodiments described above.
A computer program product is provided in an embodiment of the present application, the computer program product comprising a computer program stored in a storage medium. The processor of the computer device reads the computer program from the storage medium, and the processor executes the computer program, so that the computer device executes the device control method in each of the embodiments described above.
Compared with the prior art, the present application can accurately determine multiple moving directions of a target, so that when the moving direction of the target meets a device linkage condition, the controlled device executes the action corresponding to that moving direction, which significantly improves the flexibility of device control.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times, and their execution order is not necessarily sequential: they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principles of the present application, and such improvements and modifications shall also fall within the protection scope of the present application.

Claims (13)

1. An apparatus control method, characterized in that the method comprises:
receiving a reflected signal formed by reflection of a target, and determining position data of the target according to the reflected signal, wherein the position data is used for indicating the position of the target in a detection area;
tracking and positioning the position of the target in a detection area according to the position data of the target, and determining the track distribution of the target in the detection area;
and calculating a track trend change based on the track distribution of the target in the detection area, and determining the moving direction of the target, so that a controlled device executes an action corresponding to the moving direction of the target if the moving direction of the target meets a device linkage condition.
2. The method of claim 1, wherein tracking and locating the position of the target in the detection area according to the position data of the target and determining the track distribution of the target in the detection area comprises:
determining a plurality of said position data for said target, each said position data being determined from said reflected signal received during a current time period, wherein a time period refers to a period during which a plurality of the reflected signals are received;
clustering a plurality of the position data, and determining the centroid position of the target in the current time period;
and forming the trajectory distribution of the target in the detection area by the centroid positions of the target in different time periods.
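By way of illustration only — the patent discloses no source code — the per-period centroid of claim 2 can be sketched as a plain mean of the clustered position samples. The function name, the sample values, and the choice of an unweighted mean are all assumptions:

```python
from statistics import fmean

def centroid(positions):
    """Average a cluster of (x, y) position samples from one time period."""
    xs, ys = zip(*positions)
    return (fmean(xs), fmean(ys))

# Position samples collected during the current time period (illustrative values).
samples = [(1.0, 2.0), (1.2, 2.2), (0.8, 1.8)]
print(centroid(samples))  # approximately (1.0, 2.0)
```

A real implementation would first separate reflections belonging to different targets (the clustering step proper) before averaging; the sketch assumes that separation has already happened.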
3. The method of claim 2, wherein forming the track distribution of the target in the detection area from the centroid positions of the target over different time periods comprises:
if the track queue is not full, storing the centroid position of the target in the current time period to the tail of the track queue, wherein the track queue is used for indicating the track distribution of the target in the detection area; or
if the track queue is full, deleting the centroid position at the head of the track queue, and storing the centroid position of the target in the current time period to the tail of the track queue.
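The queue behavior of claim 3 — append to the tail when the queue is not full, otherwise drop the head and then append — matches a bounded FIFO. A minimal sketch using Python's `collections.deque`, with the queue length chosen arbitrarily:

```python
from collections import deque

TRACK_LEN = 4  # queue capacity is an assumption; the patent does not fix it

track = deque(maxlen=TRACK_LEN)  # the head is discarded automatically when full
for centroid in [(0, 0), (1, 0), (2, 1), (3, 1), (4, 2)]:
    track.append(centroid)

print(list(track))  # -> [(1, 0), (2, 1), (3, 1), (4, 2)]; (0, 0) was dropped
```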
4. The method of claim 1, wherein calculating the track trend change based on the track distribution of the target in the detection area and determining the moving direction of the target comprises:
calculating first trend data and/or second trend data according to the centroid positions of the target in the trajectory distribution in different time periods, wherein the first trend data is used for indicating the moving trend of the target in a first direction, and the second trend data is used for indicating the moving trend of the target in a second direction;
and determining the moving direction of the target according to the first trend data and/or the second trend data.
5. The method of claim 4, wherein prior to calculating the first trend data and/or the second trend data based on the centroid positions of the target in the track distribution over different time periods, the method further comprises:
smoothing the centroid positions of the target in the track distribution over different time periods, so that the track trend change is calculated from the processed centroid positions.
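As an illustrative sketch of the smoothing step of claim 5 — the patent does not say which smoothing method is used — a trailing moving average over the centroid sequence, with an assumed window size:

```python
def smooth(points, window=3):
    """Trailing moving-average smoothing of a centroid sequence.
    The window size is an assumption; the patent only says 'smoothing'."""
    out = []
    for i in range(len(points)):
        chunk = points[max(0, i - window + 1): i + 1]  # last `window` samples
        n = len(chunk)
        out.append((sum(p[0] for p in chunk) / n, sum(p[1] for p in chunk) / n))
    return out

raw = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0), (3.0, 3.0)]
print(smooth(raw))  # -> [(0.0, 0.0), (1.5, 0.0), (1.0, 1.0), (2.0, 2.0)]
```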
6. The method of claim 4, wherein determining the direction of movement of the target based on the first trend data and/or the second trend data comprises:
determining the moving direction of the target according to a comparison of the first trend data with a first threshold; or
determining the moving direction of the target according to a comparison of the second trend data with a second threshold.
7. The method of claim 6, wherein determining the direction of movement of the target based on the comparison of the first trend data to a first threshold comprises:
if the first trend data is greater than the first threshold, the moving direction of the target is a first moving direction, the first moving direction being used for indicating that the target moves from left to right in the first direction; or
if the first trend data is not greater than the first threshold, the moving direction of the target is a second moving direction, the second moving direction being used for indicating that the target moves from right to left in the first direction.
8. The method of claim 7, wherein determining the direction of movement of the target based on the comparison of the first trend data to the first threshold further comprises:
performing zero-crossing detection on the first trend data to obtain a zero-crossing detection result;
and determining whether the target passes through a detection device in the process of moving left and right in the first direction or not according to the zero-crossing detection result.
9. The method of claim 6, wherein determining the direction of movement of the target based on the comparison of the second trend data to a second threshold comprises:
if the second trend data is greater than the second threshold, the moving direction of the target is a third moving direction, the third moving direction being used for indicating that the target moves from bottom to top in the second direction; or
if the second trend data is not greater than the second threshold, the moving direction of the target is a fourth moving direction, the fourth moving direction being used for indicating that the target moves from top to bottom in the second direction.
10. The method of any one of claims 1 to 9, wherein determining position data of the target from the reflected signals comprises:
determining movement data of the target according to spectral analysis performed on the reflected signal;
and converting the moving data of the target from a second coordinate system to a first coordinate system to obtain the position data of the target.
11. A device control apparatus, characterized in that the apparatus comprises:
the position determining module is used for receiving a reflected signal formed by the reflection of a target and determining position data of the target according to the reflected signal, wherein the position data is used for indicating the position of the target in a detection area;
the target tracking module is used for tracking and positioning the position of the target in a detection area according to the position data of the target and determining the track distribution of the target in the detection area;
and the movement determining module is used for calculating a track trend change based on the track distribution of the target in the detection area and determining the moving direction of the target, so that a controlled device executes an action corresponding to the moving direction of the target if the moving direction of the target meets a device linkage condition.
12. An electronic device, comprising: at least one processor, at least one memory, and at least one communication bus, wherein,
the memory has a computer program stored thereon, and the processor reads the computer program in the memory through the communication bus;
the computer program, when executed by the processor, implements the device control method of any one of claims 1 to 10.
13. A storage medium on which a computer program is stored, the computer program realizing the device control method according to any one of claims 1 to 10 when executed by a processor.
CN202210929095.XA 2022-08-03 2022-08-03 Device control method, device, electronic device and storage medium Pending CN115390468A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210929095.XA CN115390468A (en) 2022-08-03 2022-08-03 Device control method, device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210929095.XA CN115390468A (en) 2022-08-03 2022-08-03 Device control method, device, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN115390468A true CN115390468A (en) 2022-11-25

Family

ID=84118067

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210929095.XA Pending CN115390468A (en) 2022-08-03 2022-08-03 Device control method, device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN115390468A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116300499A (en) * 2023-03-20 2023-06-23 深圳绿米联创科技有限公司 Equipment control method, device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US11435468B2 (en) Radar-based gesture enhancement for voice interfaces
US11686815B2 (en) Character recognition in air-writing based on network of radars
CN111399642B (en) Gesture recognition method and device, mobile terminal and storage medium
CN110741385B (en) Gesture recognition method and device, and positioning tracking method and device
Stephan et al. Radar-based human target detection using deep residual u-net for smart home applications
JP7447700B2 (en) Clustering device, method and electronic device for radar reflection points
CN113267773B (en) Millimeter wave radar-based accurate detection and accurate positioning method for indoor personnel
WO2019119195A1 (en) Target signal detection method and device, unmanned aerial vehicle, and agricultural unmanned aerial vehicle
CN110687816A (en) Intelligent household control system and method based on millimeter wave radar
CN111474537A (en) Radar personnel monitoring and measuring system and method
CN113064483A (en) Gesture recognition method and related device
CN115390468A (en) Device control method, device, electronic device and storage medium
Hyun et al. Human-vehicle classification scheme using doppler spectrum distribution based on 2D range-doppler FMCW radar
CN113918019A (en) Gesture recognition control method and device for terminal equipment, terminal equipment and medium
CN109581343A (en) Multi-radar network device and multi-direction object detection method
CN113537035A (en) Human body target detection method, human body target detection device, electronic device and storage medium
CN114047503B (en) Method and device for detecting moving object, electronic equipment and storage medium
JP7484492B2 (en) Radar-based attitude recognition device, method and electronic device
CN112731387A (en) Starting method and device for range hood, range hood and processor
CN116616747A (en) Gesture recognition method and device, electronic equipment and storage medium
CN111523619A (en) Target existence probability calculation method and device, electronic equipment and storage medium
TWI772208B (en) Method for counting number of people based on mmwave radar
CN115841707A (en) Radar human body posture identification method based on deep learning and related equipment
CN112180377B (en) Non-contact type man-machine interaction positioning method, tracking method, terminal and readable storage medium
CN114037044A (en) Person counting method, person counting device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination