WO2021147122A1 - Human body detection method and apparatus, terminal device, storage medium and electronic device - Google Patents

Human body detection method and apparatus, terminal device, storage medium and electronic device

Info

Publication number
WO2021147122A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
radar
human body
reflection area
position information
Prior art date
Application number
PCT/CN2020/074286
Other languages
English (en)
Chinese (zh)
Inventor
郭思佳
刘成
郭冠出
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司
Publication of WO2021147122A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications

Definitions

  • the present disclosure relates to the field of information processing technology, and in particular to a human body detection method, device, terminal device, computer-readable storage medium, and electronic equipment.
  • the radar can detect the distance and angle, but it is difficult to determine whether the detection object is a human body based on the distance and angle.
  • An embodiment of the first aspect of the present disclosure proposes a human body detection method, including: acquiring detection information of radars on a terminal device, where the number of radars is multiple, and the detection information includes a target distance and a radar signal reflection amount, the target distance being the distance between the radar signal reflection position and the radar; determining, according to the detection information and the position information of the radars, the position information of a first target object and the facing reflection area of the first target object, where the facing reflection area is the reflection area when the first target object directly faces the radar, and the first target object is the object located at the radar signal reflection position; and determining, according to the facing reflection area, whether the first target object is a human body.
  • The method further includes: when the first target object is a human body, tracking the first target object in real time, acquiring the detection information of the radars on the terminal device, and determining the position information of a second target object; determining, according to the position information of the second target object, whether the second target object is the hand of the first target object; and when the second target object is the hand of the first target object, tracking the second target object in real time and determining the gesture of the hand according to the change in the position information of the second target object.
  • The determining the position information of the first target object and the facing reflection area of the first target object according to the detection information and the position information of the radars includes: determining the position information of the first target object according to the target distance in the detection information of the radars, the position information of the radars, and a trilateration (three-point positioning) algorithm; determining the reflection area of each radar with respect to the first target object according to the target distance in the detection information of the radar, the radar signal reflection amount, and a preset reflectivity; and determining the facing reflection area according to the reflection area of each radar with respect to the first target object, the position information of the radars, and the position information of the first target object.
  • The determining whether the first target object is a human body according to the facing reflection area includes: determining whether the facing reflection area is within a preset area threshold range and whether the stationary time of the first target object is greater than a first time threshold; if the facing reflection area is within the preset area threshold range and the stationary time of the first target object is less than or equal to the first time threshold, determining that the first target object is a human body; if the facing reflection area is within the preset area threshold range and the stationary time of the first target object is greater than the first time threshold, determining that the first target object is a non-human body; and if the facing reflection area is outside the preset area threshold range, determining that the first target object is a non-human body.
  • The method further includes: in response to determining that the first target object is a human body, determining the age stage of the human body according to the facing reflection area.
  • After determining the age stage of the human body according to the facing reflection area in response to determining that the first target object is a human body, the method further includes: in response to determining that the age stage of the human body is a child, obtaining the length of time the first target object gazes at the terminal device; in response to the length of time being greater than a second time threshold, prompting the first target object to stop the gaze operation and starting timing; and in response to the timed duration being greater than a third time threshold, controlling the display screen of the terminal device to stop the display operation.
  • The determining whether the second target object is the hand of the first target object according to the position information of the second target object includes: determining the distance between the second target object and the first target object according to the position information of the second target object and the tracked position information of the first target object; determining whether the distance between the second target object and the first target object is greater than a preset distance threshold; and in response to the distance being less than or equal to the preset distance threshold, determining that the second target object is the hand of the first target object.
  • The method further includes: determining a control operation on the terminal device according to the gesture; controlling the terminal device according to the control operation and starting timing when the control operation is performed; and in response to the timed duration being greater than a fourth time threshold, reacquiring the detection information of the radars on the terminal device, determining the position information of the second target object, and determining the gesture of the hand.
  • the radar is a millimeter wave radar.
  • An embodiment of the second aspect of the present disclosure proposes a human body detection device, including: an acquisition module for acquiring detection information of radars on a terminal device, where the number of radars is multiple, the detection information includes a target distance and a radar signal reflection amount, and the target distance is the distance between the radar signal reflection position and the radar; and a determining module for determining, according to the detection information and the position information of the radars, the position information of a first target object and the facing reflection area of the first target object, where the facing reflection area is the reflection area when the first target object directly faces the radar and the first target object is the object located at the radar signal reflection position; the determining module is further used to determine, according to the facing reflection area, whether the first target object is a human body.
  • An embodiment of the third aspect of the present disclosure proposes a terminal device, including a terminal device body, and radars and a processor located in the terminal device body; the number of radars is multiple; the processor is connected to the radars and is used to implement the human body detection method described in the embodiment of the first aspect.
  • An embodiment of the fourth aspect of the present disclosure proposes a non-transitory computer-readable storage medium; when the instructions in the storage medium are executed by a processor, the human body detection method described above is implemented.
  • An embodiment of the fifth aspect of the present disclosure proposes an electronic device, including a memory and a processor; the memory stores computer instructions, and when the computer instructions are executed by the processor, the human body detection method described above is implemented.
  • Fig. 1 is a schematic flowchart of a human body detection method according to an embodiment of the present disclosure
  • Figure 2 is a schematic diagram of the structure of multiple radars embedded in a terminal device
  • FIG. 3 is a schematic flowchart of a human body detection method according to another embodiment of the present disclosure.
  • FIG. 4 is a schematic flowchart of a human body detection method according to another embodiment of the present disclosure.
  • Figure 5 is a schematic diagram of real-time tracking and monitoring of a hand
  • Fig. 6 is a schematic structural diagram of a human body detection device according to an embodiment of the present disclosure.
  • Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure.
  • Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
  • FIG. 1 is a schematic flowchart of a human body detection method provided by an embodiment of the disclosure.
  • the human body detection method includes the following steps:
  • Step 101: Obtain the detection information of the radars on the terminal device, where the number of radars is multiple; the detection information includes the target distance and the radar signal reflection amount; the target distance is the distance between the radar signal reflection position and the radar.
  • the detection information of the radar may include, but is not limited to, the target distance and the amount of radar signal reflection.
  • the target distance can be understood as the distance between the radar signal reflection position and the radar; the radar signal reflection amount can be understood as the radar signal strength.
  • the amount of radar signal reflection can be jointly determined by the reflectivity of the target object to the radar, the reflection area and the distance between the target object and the radar.
  • The reflection area is determined by the area of the target object itself and its position relative to the radar. For example, when the target object is to the side of the radar, the reflection area is smaller than when the target object is directly in front of the radar.
  • Different target objects have different reflectivities to radar. When the reflectivity is the same and the target distance is known, a different reflection area yields a different radar signal reflection amount.
  • When the reflectivity of the human body to the radar and the target distance are known, the reflection area of the human body directly facing the radar is greater than its reflection area when it is to the side of the radar, so the radar signal reflection amount when the human body directly faces the radar is greater than the amount when it is to the side, as the sketch below illustrates.
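To make this relation concrete, here is a minimal sketch that assumes a radar-equation-style model in which the reflection amount is proportional to reflectivity times reflection area divided by distance^4. The disclosure only states that these three quantities jointly determine the reflection amount, so the exact form of the model and the calibration constant k are assumptions of this example:

```python
def estimate_reflection_area(reflection_amount, target_distance, reflectivity, k=1.0):
    """Invert an assumed radar-equation-style model
        reflection_amount ~ k * reflectivity * reflection_area / target_distance**4
    to recover the reflection area from a measured reflection amount.
    k is a calibration constant of the radar front end (assumed value)."""
    return reflection_amount * target_distance ** 4 / (k * reflectivity)


# Example: with the same reflectivity and distance, a larger reflection amount
# corresponds to a larger reflection area.
print(estimate_reflection_area(reflection_amount=2.0e-6, target_distance=1.5, reflectivity=0.8))
```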
  • the radar may be a millimeter wave radar.
  • millimeter wave radar works in the millimeter wave band.
  • millimeter wave refers to the 30-300GHz frequency band (wavelength is 1-10mm).
  • the wavelength of millimeter wave is between centimeter wave and light wave, so millimeter wave has the advantages of microwave guidance and photoelectric guidance.
  • The millimeter-wave seeker has the characteristics of small size, light weight, and high spatial resolution.
  • Compared with optical seekers such as infrared, laser, and television seekers, the millimeter-wave seeker has a strong ability to penetrate fog, smoke, and dust, and can work in all weather conditions (except heavy rain).
  • Millimeter-wave radar can penetrate plastics, ceramics, and other materials without requiring openings, so it does not damage the appearance of the product itself.
  • In addition, the horizontal measurement angle is large (up to 80 degrees) and the measurement distance is long: up to two meters when detecting a human body, and up to one meter when detecting hand gestures.
  • Step 102: Determine the position information of the first target object and the facing reflection area of the first target object according to the detection information and the position information of the radars; the facing reflection area is the reflection area when the first target object directly faces the radar; the first target object is the object located at the radar signal reflection position.
  • Specifically, the position information of the first target object is determined according to the target distance in the detection information of the radars, the position information of the radars, and the trilateration algorithm; the reflection area of each radar with respect to the first target object is determined according to the target distance in the detection information of the radar, the radar signal reflection amount, and a preset reflectivity; and the facing reflection area is determined according to the reflection area of each radar with respect to the first target object, the position information of the radars, and the position information of the first target object.
  • the first target object may be an object located at a radar signal reflection position.
  • Take the case where the number of radars is three as an example, such as radars A, B, and C placed at positions s1, s2, and s3. In order to prevent the signals of radars A, B, and C from interfering with each other, radars A, B, and C can be set to work in different time slots.
  • radar A works in 1, 4, 7... time slots
  • radar B works in 2, 5, 8... time slots
  • radar C works in 3, 6, 9... time slots
  • For example, if the sampling frequency is 10 Hz, then according to the radar detection information obtained from radars A, B, and C, the distances from the target object measured at times t(n+1), t(n+2), and t(n+3) are d1, d2, and d3, respectively. According to the placement positions of radars A, B, and C and the distances between the radars and the target object, combined with the trilateration algorithm, the position s of the target object can be determined; that is, the position s of the target object can be obtained as the point for which the mean square value of the differences between the distances from s to the radar positions and the measured distances d1, d2, and d3 is minimized.
  • In addition, according to the target distance, the radar signal reflection amount, and the preset reflectivity, the reflection area of each radar with respect to the target object can be obtained; further, according to this reflection area, the placement positions of radars A, B, and C, and the position information of the target object, the facing reflection area of the target object with respect to the radar position can be determined.
  • Specifically, the azimuth relationship between each radar and the target object can be obtained. For example, on the display screen of the terminal device, the vertical line at radar A is taken as the normal, and the angle between the line connecting radar A and the target object and this normal is obtained. In addition, the relationship between the reflection area of a target object directly facing the radar and the reflection area of the same target object to the side of the radar (at an angle) can be obtained in advance. Therefore, knowing the reflection area of the target object with respect to radar A and the angle between the radar A-target line and the normal, this relationship can be used to convert the measured reflection area into the facing reflection area of the target object with respect to radar A. In the same way, the facing reflection areas of the target object with respect to radar B and radar C can be obtained; finally, the facing reflection areas with respect to radars A, B, and C are averaged to determine the facing reflection area of the target object. A minimal sketch of this positioning and averaging step is given below.
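The sketch below assumes that the radar positions s1, s2, s3 and the measured distances d1, d2, d3 are expressed in one common planar coordinate system, and that SciPy's least-squares solver stands in for the mean-square minimisation described above; none of the function or variable names come from the disclosure:

```python
import numpy as np
from scipy.optimize import least_squares


def locate_target(radar_positions, distances):
    """Trilateration: find the position s whose distances to radars A, B, C
    best match the measured target distances d1, d2, d3 in the mean-square sense."""
    radar_positions = np.asarray(radar_positions, dtype=float)   # s1, s2, s3
    distances = np.asarray(distances, dtype=float)               # d1, d2, d3

    def residuals(s):
        # Difference between the predicted and the measured distance for each radar.
        return np.linalg.norm(radar_positions - s, axis=1) - distances

    start = radar_positions.mean(axis=0)          # initial guess: radar centroid
    return least_squares(residuals, start).x      # estimated target position s


def facing_reflection_area(per_radar_facing_areas):
    """Average the facing reflection areas obtained from radars A, B, and C."""
    return float(np.mean(per_radar_facing_areas))


# Illustrative radar placement along a display edge (meters) and measured distances.
s = locate_target([(0.0, 0.0), (0.6, 0.0), (0.3, 0.4)], [1.20, 1.05, 1.10])
print(s)
```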
  • Step 103 Determine whether the first target object is a human body according to the facing reflection area.
  • Specifically, the facing reflection area can be compared with a preset area threshold range. According to the comparison result and whether the stationary time of the target object is greater than the first time threshold, it is determined whether the target object is a human body.
  • the specific implementation process is as follows:
  • Step 301 Determine whether the facing reflection area is within a preset area threshold, and whether the static time of the first target object is greater than the first time threshold.
  • Step 302 If the facing reflection area is within the preset area threshold range, and the static time of the first target object is less than or equal to the first time threshold, it is determined that the first target object is a human body.
  • Step 303 If the facing reflection area is within the preset area threshold range, and the static time of the first target object is greater than the first time threshold, it is determined that the first target object is a non-human body.
  • Step 304 If the facing reflection area is outside the preset area threshold range, it is determined that the first target object is a non-human body.
  • Specifically, the facing reflection area can be compared with the preset area threshold range, and the stationary time of the target object can be compared with the first time threshold. If the facing reflection area is within the preset area threshold range and the stationary time of the target object is less than or equal to the first time threshold, it can be determined that the target object is a human body; if the facing reflection area is within the preset area threshold range and the stationary time of the target object is greater than the first time threshold, it can be determined that the target object is a non-human body; if the facing reflection area is outside the preset area threshold range, for example, greater than the upper limit or less than the lower limit of the preset area threshold range, the target object is determined to be a non-human body. A minimal sketch of this decision rule follows below.
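A minimal sketch of steps 301-304, assuming the preset area threshold range is given as a (lower, upper) pair and times are in seconds; all parameter names and example values are assumptions:

```python
def classify_first_target(facing_area, stationary_time, area_range, first_time_threshold):
    """Steps 301-304: the target is a human body only if its facing reflection
    area lies inside the preset area threshold range and it has not stayed
    stationary longer than the first time threshold."""
    area_min, area_max = area_range
    if not (area_min <= facing_area <= area_max):
        return "non-human"      # area outside the preset threshold range (step 304)
    if stationary_time > first_time_threshold:
        return "non-human"      # human-sized but motionless too long, e.g. furniture (step 303)
    return "human"              # step 302


# Example: a plausible area with recent movement is classified as a human body.
print(classify_first_target(facing_area=0.6, stationary_time=12.0,
                            area_range=(0.2, 1.0), first_time_threshold=300.0))
```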
  • In this way, based on the radar detection information, the radar position information, and the position information of the target object, it can be accurately determined from the facing reflection area whether the current target object is a human body.
  • the age stage of the human body can be determined according to the front reflection area.
  • the age stage may include, but is not limited to, adults, children, and so on.
  • Generally, the taller the target object, the larger the corresponding facing reflection area.
  • Therefore, the facing reflection area corresponding to the height of the target object can be compared with the facing reflection area corresponding to a height of 1.2 meters. When the facing reflection area corresponding to the height of the target object is greater than the facing reflection area corresponding to a height of 1.2 meters, the age stage of the target object is determined to be an adult; when it is less than or equal to the facing reflection area corresponding to a height of 1.2 meters, the age stage of the target object is determined to be a child (see the sketch below).
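A minimal sketch of this comparison; the reference area corresponding to a 1.2 m height would come from prior calibration, and its numeric value is not given in the disclosure:

```python
def age_stage(facing_area, reference_area_for_1m2_height):
    """Compare the facing reflection area with the area a 1.2 m tall person
    would present: larger means adult, smaller or equal means child."""
    return "adult" if facing_area > reference_area_for_1m2_height else "child"
```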
  • In response to determining that the age stage is a child, the length of time the child gazes at the display screen of the terminal device can be obtained. If the length of time is greater than the second time threshold (e.g., 2 minutes), a prompt message appears on the display screen of the terminal device, prompting the child to stop the gaze operation, and timing starts. In response to the timed duration being greater than the third time threshold (e.g., 1 minute) while the child still has not stopped the gaze operation, the display screen of the terminal device stops the display operation.
  • In addition, the radar detection information can be used to determine the distance between the display screen of the terminal device and the child. When the distance is too small, a prompt message appears on the display screen of the terminal device, prompting that the child is too close to the display screen, and timing starts. If the timed duration is greater than the third time threshold and the child is still too close to the display screen, the display screen of the terminal device stops the display operation. A combined sketch of both protection flows follows below.
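The two protection flows above can be sketched as a single check that is run periodically. The threshold values and the terminal-side callbacks (prompt, is_condition_cleared, stop_display) are assumptions introduced only for illustration:

```python
import time


def protect_child(terminal, gaze_seconds, distance_m,
                  second_threshold=120.0,   # e.g. 2 minutes of continuous gazing
                  third_threshold=60.0,     # e.g. 1 minute to comply after the prompt
                  min_distance=0.5):        # assumed "too close" distance in meters
    """If the child has gazed too long or sits too close, show a prompt and start
    timing; if the condition persists beyond the third time threshold, stop the
    display operation of the terminal device."""
    if gaze_seconds <= second_threshold and distance_m >= min_distance:
        return                                   # nothing to do
    terminal.prompt("Please rest your eyes or move back from the screen")
    deadline = time.monotonic() + third_threshold
    while time.monotonic() < deadline:
        if terminal.is_condition_cleared():      # gaze stopped / distance restored
            return
        time.sleep(1.0)
    terminal.stop_display()                      # still non-compliant: stop displaying
```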
  • Further, in the embodiment of the present disclosure, the target object can be tracked in real time, the hand of the target object can be determined according to the radar detection information, and the gesture of the target object can then be determined. The specific implementation process is as follows:
  • Step 401 When the first target object is a human body, the first target object is tracked in real time, and the detection information of the radar on the terminal device is acquired, and the position information of the second target object is determined.
  • Specifically, when the first target object is a human body, the first target object is tracked and detected in real time.
  • The position information of the second target object can be determined according to the placement positions of the radars and the distances between the second target object and the radars.
  • When the distance between the second target object and the first target object is within a preset range (for example, 20 cm), the second target object is determined to be a part of the human body.
  • Step 402 Determine whether the second target object is the hand of the first target object according to the position information of the second target object.
  • Specifically, the tracked position information of the first target object can be compared with the position information of the second target object, and the comparison result is used as the distance between the second target object and the first target object.
  • Then, the distance between the second target object and the first target object is compared with a preset distance threshold (for example, an arm's length); in response to the distance being less than or equal to the preset distance threshold, the second target object is determined to be the hand of the first target object. A minimal sketch of this test follows below.
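A minimal sketch of this distance test; the arm-length value is an assumption, and the positions are whatever coordinates the trilateration step produced:

```python
import numpy as np


def is_hand_of_tracked_body(second_target_pos, tracked_body_pos, arm_length_m=0.7):
    """The second target object is taken to be the hand of the tracked first
    target object when their distance does not exceed a preset threshold
    (for example, an arm's length)."""
    gap = np.linalg.norm(np.asarray(second_target_pos, dtype=float) -
                         np.asarray(tracked_body_pos, dtype=float))
    return gap <= arm_length_m


print(is_hand_of_tracked_body((1.0, 0.2, 0.4), (1.1, 0.0, 0.0)))  # True: within reach
```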
  • Step 403 When the second target object is the hand of the first target object, track the second target object in real time, and determine the hand gesture according to the change of the position information of the second target object.
  • Specifically, the movement of the hand can be tracked and monitored in real time, and the user's three-dimensional gesture can be judged according to the trilateration method.
  • If the palm moves from left to right, it is judged as a rightward gesture; if it moves from right to left, it is judged as a leftward gesture; if the palm moves from top to bottom, it is judged as an upward gesture; if the palm moves from bottom to top, it is judged as a downward gesture; if the palm moves from front to back, it is judged as a forward gesture; if the palm moves from back to front, it is judged as a backward gesture. A minimal sketch of this direction judgment follows below.
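The sketch below maps the dominant displacement of the tracked hand positions to a gesture label. The coordinate axes and the direction-to-label table are assumptions of this example (natural axis labels are used here; in practice the table would be aligned with the mapping given above):

```python
import numpy as np

# Direction-to-label table: axis 0 is left/right, axis 1 is down/up, axis 2 is back/front.
GESTURE_LABELS = [("leftward", "rightward"),
                  ("downward", "upward"),
                  ("backward", "forward")]


def classify_gesture(positions):
    """Judge the 3D gesture from the change in the hand's position: take the
    displacement between the first and last tracked positions and use its
    dominant axis and sign to pick a label."""
    positions = np.asarray(positions, dtype=float)
    delta = positions[-1] - positions[0]
    axis = int(np.argmax(np.abs(delta)))
    return GESTURE_LABELS[axis][int(delta[axis] > 0)]


# Example: the palm moves mainly along +x, so the gesture is classified as rightward.
print(classify_gesture([(0.0, 0.0, 0.5), (0.1, 0.0, 0.5), (0.3, 0.02, 0.5)]))
```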
  • Further, the control operation on the terminal device can be determined according to the gesture; when the terminal device is controlled according to the control operation, timing starts. In response to the timed duration being greater than the fourth time threshold, the detection information of the radars on the terminal device is reacquired, the position information of the second target object is determined, and the gesture of the hand is determined. That is, after the terminal device performs the corresponding control according to a gesture, it responds to the next gesture instruction only after at least the fourth time threshold has elapsed since the control operation, and does not respond to gestures given within the fourth time threshold. A minimal sketch of this cooldown behaviour follows below.
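A minimal sketch of the cooldown behaviour; the threshold value and the callback interface are assumptions for illustration:

```python
import time


class GestureCooldown:
    """After a gesture has been acted on, ignore further gestures until the
    fourth time threshold has elapsed since that control operation."""

    def __init__(self, fourth_time_threshold_s=1.0):
        self.fourth_time_threshold_s = fourth_time_threshold_s
        self._last_control = float("-inf")

    def handle(self, gesture, control_terminal):
        now = time.monotonic()
        if now - self._last_control < self.fourth_time_threshold_s:
            return False                 # within the cooldown window: not acted on
        control_terminal(gesture)        # control the terminal device per the gesture
        self._last_control = now         # the control operation starts the timing
        return True


cooldown = GestureCooldown(fourth_time_threshold_s=1.0)
cooldown.handle("rightward", print)      # acted on
cooldown.handle("leftward", print)       # ignored: arrives within the threshold
```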
  • With the human body detection method of the embodiments of the present disclosure, the detection information of the radars on the terminal device is obtained, where the number of radars is multiple; the detection information includes the target distance and the radar signal reflection amount, and the target distance is the distance between the radar signal reflection position and the radar; according to the detection information and the position information of the radars, the position information of the first target object and the facing reflection area of the first target object are determined, where the facing reflection area is the reflection area when the first target object directly faces the radar and the first target object is the object located at the radar signal reflection position; and according to the facing reflection area, it is determined whether the first target object is a human body.
  • This method can accurately determine whether the current target object is a human body based on the radar detection information, the radar position information, the position information of the target object, and the facing reflection area; when the current target object is a human body, it can further recognize human gestures for human-computer interaction, distinguish the age stage of the human body, and protect eyesight when the age stage is a child.
  • Based on the above embodiments, an embodiment of the present disclosure also provides a human body detection device. Since the human body detection device provided by the embodiment of the present disclosure corresponds to the human body detection method provided by the above several embodiments, the foregoing implementations of the human body detection method are also applicable to the human body detection device provided in this embodiment and will not be described in detail here.
  • Fig. 6 is a schematic structural diagram of a human body detection device according to an embodiment of the present disclosure. As shown in FIG. 6, the human body detection device includes: an acquisition module 610 and a determination module 620.
  • The acquisition module 610 is used to acquire the detection information of the radars on the terminal device, where the number of radars is multiple; the detection information includes the target distance and the radar signal reflection amount, and the target distance is the distance between the radar signal reflection position and the radar. The determining module 620 is used to determine the position information of the first target object and the facing reflection area of the first target object according to the detection information and the position information of the radars; the facing reflection area is the reflection area when the first target object directly faces the radar; the first target object is the object located at the radar signal reflection position. The determining module 620 is further configured to determine whether the first target object is a human body according to the facing reflection area.
  • The human body detection device of the embodiment of the present disclosure acquires the detection information of the radars on the terminal device, where the number of radars is multiple; the detection information includes the target distance and the radar signal reflection amount, and the target distance is the distance between the radar signal reflection position and the radar; determines the position information of the first target object and the facing reflection area of the first target object according to the detection information and the position information of the radars, where the facing reflection area is the reflection area when the first target object directly faces the radar and the first target object is the object located at the radar signal reflection position; and determines, according to the facing reflection area, whether the first target object is a human body.
  • The device can thus accurately determine whether the current target object is a human body based on the radar detection information, the radar position information, the position information of the target object, and the facing reflection area; when the current target object is a human body, it can further recognize human gestures for human-computer interaction, distinguish the age stage of the human body, and protect eyesight when the age stage is a child.
  • the present disclosure also proposes a terminal device, as shown in FIG. 7, which is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure.
  • the terminal device 700 includes a terminal device body 710, a radar 720 and a processor 730 located in the terminal device body.
  • the number of radars is multiple; the processor 730 is connected to the radar 720 and is used to execute the human body detection method described in the above embodiment.
  • the present disclosure also proposes a non-transitory computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the above-mentioned human body detection method is realized.
  • FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
  • the electronic equipment includes:
  • the processor 1002 implements the human body detection method provided in the foregoing embodiment when executing the instruction.
  • the electronic equipment also includes:
  • the communication interface 1003 is used for communication between the memory 1001 and the processor 1002.
  • the memory 1001 is used to store computer instructions that can run on the processor 1002.
  • The memory 1001 may include a high-speed RAM memory, and may also include a non-volatile memory, for example, at least one disk memory.
  • the processor 1002 is configured to implement the human body detection method described in the foregoing embodiment when executing the program.
  • The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like.
  • the bus can be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in Figure 8, but it does not mean that there is only one bus or one type of bus.
  • If the memory 1001, the processor 1002, and the communication interface 1003 are integrated on a single chip, the memory 1001, the processor 1002, and the communication interface 1003 can communicate with each other through internal interfaces.
  • The processor 1002 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present disclosure.
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, the features defined with “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present disclosure, “plurality” means at least two, such as two, three, etc., unless otherwise specifically defined.
  • a "computer-readable medium” can be any device that can contain, store, communicate, propagate, or transmit a program for use by an instruction execution system, device, or device or in combination with these instruction execution systems, devices, or devices.
  • Computer-readable media include the following: an electrical connection (electronic device) with one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
  • The computer-readable medium may even be paper or another suitable medium on which the program is printed, because the program can be obtained electronically, for example, by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in the computer memory.
  • each part of the present disclosure can be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods can be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • For example, if implemented in hardware, it can be implemented by any one of the following technologies known in the art or a combination thereof: discrete logic circuits with logic gate circuits for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and so on.
  • a person of ordinary skill in the art can understand that all or part of the steps carried in the method of the foregoing embodiments can be implemented by a program instructing relevant hardware to complete.
  • The program can be stored in a computer-readable storage medium, and when executed, it includes one of the steps of the method embodiment or a combination thereof.
  • the functional units in the various embodiments of the present disclosure may be integrated into one processor, or each unit may exist alone physically, or two or more units may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software function module and sold or used as an independent product, it can also be stored in a computer readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention relates to a human body detection method and apparatus, a terminal device, a storage medium, and an electronic device. The method comprises: acquiring detection information of radars on a terminal device, where multiple radars are provided, the detection information comprises a target distance and a radar signal reflection amount, and the target distance is the distance between a radar signal reflection position and the radar (101); determining position information of a first target object and a facing reflection area of the first target object according to the detection information and position information of the radars, where the facing reflection area is a reflection area when the first target object directly faces the radar, and the first target object is an object located at the radar signal reflection position (102); and determining, according to the facing reflection area, whether the first target object is a human body (103).
PCT/CN2020/074286 2020-01-20 2020-02-04 Human body detection method and apparatus, terminal device, storage medium and electronic device WO2021147122A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010066140.4A CN111308463B (zh) 2020-01-20 2020-01-20 人体检测方法、装置、终端设备、存储介质及电子设备
CN202010066140.4 2020-01-20

Publications (1)

Publication Number Publication Date
WO2021147122A1 (fr) 2021-07-29

Family

ID=71156429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/074286 WO2021147122A1 (fr) 2020-01-20 2020-02-04 Human body detection method and apparatus, terminal device, storage medium and electronic device

Country Status (2)

Country Link
CN (1) CN111308463B (fr)
WO (1) WO2021147122A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115092785A (zh) * 2022-06-21 2022-09-23 无锡威孚高科技集团股份有限公司 基于毫米波雷达的儿童单独出入电梯轿厢预警方法及***
WO2023246121A1 (fr) * 2022-06-22 2023-12-28 青岛海尔空调器有限总公司 Procédé et dispositif de commande d'unité intérieure, et climatiseur
CN118250870A (zh) * 2024-05-28 2024-06-25 深圳市亮佳美照明有限公司 户外洗墙灯的节能控制方法、装置、设备及存储介质

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111766576B (zh) * 2020-06-29 2023-12-26 京东方科技集团股份有限公司 基于毫米波雷达检测的会议***方法及相关设备
CN112485779B (zh) * 2020-11-13 2023-09-08 珠海格力电器股份有限公司 雷达功率的控制方法、装置、电子设备和计算机可读介质
CN112485782B (zh) * 2020-11-24 2024-07-09 京东方科技集团股份有限公司 坐姿监测方法及装置、电子设备和可读存储介质
CN113687350B (zh) * 2021-08-24 2024-04-05 杭州海康威视数字技术股份有限公司 一种跌倒检测方法、装置、电子设备及存储介质
CN115059990A (zh) * 2022-03-01 2022-09-16 北京小米移动软件有限公司 空调控制方法、装置和存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102058411A (zh) * 2010-10-27 2011-05-18 中国人民解放军第四军医大学 一种多通道基于uwb雷达式生命探测仪
CN106231419A (zh) * 2016-08-30 2016-12-14 北京小米移动软件有限公司 操作执行方法及装置
CN109856617A (zh) * 2019-01-24 2019-06-07 珠海格力电器股份有限公司 基于微波雷达的拍摄方法、装置、处理器及终端
US20190205903A1 (en) * 2014-02-25 2019-07-04 Nec Corporation Information-processing device, data analysis method, and recording medium
CN110290353A (zh) * 2019-06-28 2019-09-27 Oppo广东移动通信有限公司 监控方法及装置、电子设备以及存储介质
CN110412378A (zh) * 2019-07-30 2019-11-05 北京经纬恒润科技有限公司 目标物体检测方法及装置

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AUPR301401A0 (en) * 2001-02-09 2001-03-08 Commonwealth Scientific And Industrial Research Organisation Lidar system and method
JP3730956B2 (ja) * 2002-12-11 2006-01-05 本田技研工業株式会社 移動体用送受信機の軸調整装置
JP4258328B2 (ja) * 2003-09-12 2009-04-30 オムロン株式会社 2周波ドップラ測距装置およびその装置を備えた検出システム
EP2281667B1 (fr) * 2005-09-30 2013-04-17 iRobot Corporation Robot compagnon pour interaction personnelle
CN102190081B (zh) * 2010-03-04 2013-09-04 南京航空航天大学 基于视觉的飞艇定点鲁棒控制方法
CN102306285B (zh) * 2011-08-15 2013-07-17 北京北大千方科技有限公司 人形识别方法
JP5929675B2 (ja) * 2012-09-28 2016-06-08 株式会社デンソーウェーブ レーザレーダ装置
JP6136524B2 (ja) * 2013-04-23 2017-05-31 株式会社デンソー レーダ装置、及び検査システム
KR102417610B1 (ko) * 2016-03-03 2022-07-07 삼성전자주식회사 근거리 초고주파 레이더를 이용한 코드 판독 방법 및 장치
JP6825835B2 (ja) * 2016-07-07 2021-02-03 日本無線株式会社 レーダ交通量計測装置及び方法
US10816658B2 (en) * 2016-09-07 2020-10-27 OmniPreSense Corporation Radar enabled weapon detection system
CN117310741A (zh) * 2017-01-03 2023-12-29 应诺维思科技有限公司 用于检测和分类物体的激光雷达***和方法
CN107202987A (zh) * 2017-05-31 2017-09-26 武汉大学 入侵目标检测定位方法及***
CN108710127B (zh) * 2018-04-19 2020-10-30 上海鹰觉科技有限公司 低空及海面环境下的目标检测识别方法及***
CN108880701A (zh) * 2018-05-14 2018-11-23 深圳市万普拉斯科技有限公司 调整天线辐射性能的方法、装置及移动终端
CN108919218A (zh) * 2018-06-07 2018-11-30 北京邮电大学 一种非接触式车内人数及位置判断的方法及装置
CN108888249A (zh) * 2018-06-07 2018-11-27 北京邮电大学 一种非接触式车内多人生命体征监测的方法及装置
CN109213162A (zh) * 2018-09-01 2019-01-15 哈尔滨工程大学 一种多传感器信息融合的水面无人艇水池自主靠泊离岸方法
CN109709529A (zh) * 2019-03-05 2019-05-03 深圳市镭神智能***有限公司 一种旋转棱镜和多线激光雷达测距***
CN109996175B (zh) * 2019-05-15 2021-04-30 苏州矽典微智能科技有限公司 室内定位***和方法
CN110267192B (zh) * 2019-05-24 2021-09-14 中国联合网络通信集团有限公司 定位方法及装置
CN110244298B (zh) * 2019-07-26 2021-09-10 北京东方至远科技股份有限公司 一种InSAR数据升降轨联合滑坡分析方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102058411A (zh) * 2010-10-27 2011-05-18 中国人民解放军第四军医大学 一种多通道基于uwb雷达式生命探测仪
US20190205903A1 (en) * 2014-02-25 2019-07-04 Nec Corporation Information-processing device, data analysis method, and recording medium
CN106231419A (zh) * 2016-08-30 2016-12-14 北京小米移动软件有限公司 操作执行方法及装置
CN109856617A (zh) * 2019-01-24 2019-06-07 珠海格力电器股份有限公司 基于微波雷达的拍摄方法、装置、处理器及终端
CN110290353A (zh) * 2019-06-28 2019-09-27 Oppo广东移动通信有限公司 监控方法及装置、电子设备以及存储介质
CN110412378A (zh) * 2019-07-30 2019-11-05 北京经纬恒润科技有限公司 目标物体检测方法及装置

Also Published As

Publication number Publication date
CN111308463B (zh) 2022-06-07
CN111308463A (zh) 2020-06-19

Similar Documents

Publication Publication Date Title
WO2021147122A1 (fr) Procédé et appareil de détection d'être humain, dispositif terminal, support de stockage et dispositif électronique
US10013056B2 (en) Dynamic eye tracking calibration
US9459694B2 (en) Cursor movement device
CN108958490B (zh) 电子装置及其手势识别方法、计算机可读存储介质
KR101850009B1 (ko) 레이더에 기반한 제스처 인식
WO2018036229A1 (fr) Procédé, dispositif et équipement de commande tactile de projection
US20150253428A1 (en) Determining positional information for an object in space
US20160274732A1 (en) Touchless user interfaces for electronic devices
CN111759214A (zh) 一种自动门开合控制方法
CN113370911B (zh) 车载传感器的位姿调整方法、装置、设备和介质
WO2017043056A1 (fr) Procédé d'aide à la conduite et dispositif et programme d'aide à la conduite utilisant ledit procédé
KR20220062400A (ko) 투사 방법 및 시스템
CN107479710A (zh) 智能镜及其控制方法、装置、设备及存储介质
CN106557209A (zh) 红外触摸屏触控信号的处理方法、装置及终端设备
CN109032354B (zh) 电子装置及其手势识别方法、计算机可读存储介质
CN113721232B (zh) 目标对象检测方法、装置、电子设备及介质
US9652081B2 (en) Optical touch system, method of touch detection, and computer program product
US20210055792A1 (en) Method and electronic device for eye-tracking
CN109358755B (zh) 用于移动终端的手势检测方法、装置和移动终端
CN105786165B (zh) 一种信息处理方法和电子设备
CN113064806B (zh) 控制方法和电子设备
CN204515684U (zh) 一种电子设备
US20240215788A1 (en) Collided position determination method, computer-readable storage medium, and robot
CN117148313A (zh) 一种数据处理方法、检测装置及电子设备
CN115519586A (zh) 机器人的悬崖检测方法、机器人及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20916164

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20916164

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/03/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20916164

Country of ref document: EP

Kind code of ref document: A1