CN113140089A - Behavior-assisted audio-visual sensor system - Google Patents

Behavior-assisted audio-visual sensor system

Info

Publication number
CN113140089A
CN113140089A
Authority
CN
China
Prior art keywords
edge computing
sensor
control system
background control
behavior
Prior art date
Legal status
Pending
Application number
CN202110456326.5A
Other languages
Chinese (zh)
Inventor
侯培民
刘俊辰
李涛
武鸿熙
袁晓明
Current Assignee
Shanghai Runpower Information Technology Co., Ltd.
Original Assignee
Shanghai Runpower Information Technology Co., Ltd.
Priority date
Filing date: 2021-04-26
Publication date: 2021-07-20
Application filed by Shanghai Runpower Information Technology Co., Ltd.
Priority to CN202110456326.5A
Publication of CN113140089A
Legal status: Pending

Classifications

    • G08B 13/02: Burglar, theft or intruder alarms; mechanical actuation
    • G08B 13/122: Mechanical actuation by the breaking or disturbance of stretched cords or wires for a perimeter fence
    • G16Y 20/00: Information sensed or collected by the things
    • G16Y 20/10: Information sensed or collected by the things relating to the environment, e.g. temperature; relating to location
    • G16Y 40/50: IoT characterised by the purpose of the information processing; safety; security of things, users, data or systems
    • G16Y 40/60: IoT characterised by the purpose of the information processing; positioning; navigation
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04N 23/661: Remote control of cameras or camera parts; transmitting camera control signals through networks, e.g. control via the Internet

Landscapes

  • Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Multimedia (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • Toxicology (AREA)
  • Computer Security & Cryptography (AREA)
  • Alarm Systems (AREA)

Abstract

The invention provides a behavior-assisted audiovisual sensor system comprising a background control system, a plurality of terminal sensor platforms, a remote control platform, an edge computing gateway and a plurality of depth cameras. The background control system is communicatively connected to and controls the terminal sensor platforms, each of which carries a behavior-assisted audio-visual sensor. Each sensor includes an edge computing gateway that is communicatively connected to and controls the depth cameras, and the gateway is connected to a voice loudspeaker. The background control system coordinates the coordinates and parameters of each BALSB sensor and exchanges data with other software. The invention can identify, track and locate personnel in a designated area, effectively improving the system's data acquisition and analysis capability.

Description

Behavior-assisted audio-visual sensor system
Technical Field
The invention relates to the technical field of the Internet of Things, and in particular to a behavior-assisted audio-visual sensor system.
Background
With the development of Internet of Things (IoT) technology, sensors are becoming increasingly intelligent; in particular, driven by intelligent driving technology, sensors are evolving toward the functions of human sensory organs. A composite sensor integrating vision, hearing and balance is a natural outcome of the new generation of the IoT. We define such a sensor as a Behavior Assisted Listening and watching Base Sensor (BALSB), and an IoT system composed of multiple BALSBs is called a behavior-assisted audio-visual sensing system (BALSB system).
Disclosure of Invention
The technical problem to be solved by the present invention is to provide a behavior-assisted audiovisual sensor system that addresses the problems mentioned in the background art.
The technical problem is solved by the following technical scheme: a behavior-assisted audiovisual sensor system comprising a background control system, a plurality of terminal sensor platforms, a remote control platform, an edge computing gateway and a plurality of depth cameras. The background control system is communicatively connected to and controls the terminal sensor platforms, each of which carries a behavior-assisted audio-visual sensor. Each sensor includes an edge computing gateway that is communicatively connected to and controls the depth cameras, and the gateway is connected to a voice loudspeaker. The background control system coordinates the coordinates and parameters of each BALSB sensor and exchanges data with other software.
The background control system communicates with the terminal sensor platforms and the remote control platform via Ethernet or 5G.
The edge computing gateway reads and processes the depth camera data, digitizing the depth map, the RGB map and the infrared map.
The edge computing gateway uses the depth camera data to realize the following functions: distance measurement and fence recognition for objects within a scene, person localization and tracking, human body motion capture, sound localization, speech parsing, and dialogue.
Compared with the prior art, the invention has the following beneficial effects: a virtual electronic fence can be set in a designated area, intrusions into the fence actively trigger an alarm, personnel can be identified, tracked and located, and the system's data acquisition and analysis capability is effectively improved.
Drawings
FIG. 1 is a system architecture diagram of the present invention.
FIG. 2 is a schematic diagram of the terminal sensor platform architecture according to the present invention.
Detailed Description
In the description of the present invention, it should be noted that, unless otherwise expressly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or an internal connection between two elements.
As shown in FIG. 1 and FIG. 2, the behavior-assisted audiovisual sensor system includes a background control system, a plurality of terminal sensor platforms, a remote control platform, an edge computing gateway and a plurality of depth cameras. The background control system is communicatively connected to and controls the terminal sensor platforms, each of which carries a behavior-assisted audio-visual sensor. Each sensor includes an edge computing gateway that is communicatively connected to and controls the depth cameras, and the gateway is connected to a voice loudspeaker. The background control system coordinates the coordinates and parameters of each BALSB sensor and exchanges data with other software.
The background control system communicates with the terminal sensor platforms and the remote control platform via Ethernet or 5G.
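For illustration only, the following minimal sketch shows how the background control system might push coordinates and virtual-fence parameters to one terminal gateway over the Ethernet or 5G link. The patent does not specify an application protocol; the plain TCP transport and the JSON field names (sensor_id, coordinates, fence) are assumptions made for this sketch.

```python
# Sketch: background control system pushes coordinates and parameters to a
# BALSB terminal gateway. Transport and message fields are assumptions; the
# patent only states that Ethernet or 5G is used.
import json
import socket

def push_sensor_config(gateway_host: str, gateway_port: int, config: dict) -> None:
    payload = json.dumps(config).encode("utf-8")
    with socket.create_connection((gateway_host, gateway_port), timeout=5) as sock:
        # Length-prefix the message so the gateway can frame it on receipt.
        sock.sendall(len(payload).to_bytes(4, "big") + payload)

if __name__ == "__main__":
    push_sensor_config(
        "192.168.1.50", 9000,  # hypothetical gateway address and port
        {
            "sensor_id": "BALSB-01",
            "coordinates": {"x": 12.5, "y": 3.0, "z": 2.8},   # metres, scene frame
            "fence": [[0, 0], [6, 0], [6, 4], [0, 4]],        # virtual fence polygon
        },
    )
```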
The edge computing gateway reads and processes the depth camera data, digitizing the depth map, the RGB map and the infrared map.
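As a sketch of this digitization step, the code below reads one depth frame, one RGB frame and one infrared frame into NumPy arrays on the gateway. The patent does not name a camera SDK; an Intel RealSense-class depth camera accessed through pyrealsense2 is assumed here, and the stream resolutions and formats are illustrative.

```python
# Sketch: gateway-side digitisation of the depth, RGB and infrared maps.
# Assumes a RealSense-style camera via pyrealsense2 (not specified by the patent).
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.infrared, 640, 480, rs.format.y8, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth_map = np.asanyarray(frames.get_depth_frame().get_data())  # uint16 depth units
    rgb_map = np.asanyarray(frames.get_color_frame().get_data())    # uint8 BGR image
    ir_map = np.asanyarray(frames.get_infrared_frame().get_data())  # uint8 grayscale
finally:
    pipeline.stop()
```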
The edge computing gateway uses the depth camera data to realize the following functions: distance measurement and fence recognition for objects within a scene, person localization and tracking, human body motion capture, sound localization, speech parsing, and dialogue.
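A minimal sketch of the fence-recognition function follows: given a tracked person's ground-plane position derived from the depth data, a point-in-polygon test decides whether the virtual fence has been entered. The fence polygon, coordinates and alarm hook are hypothetical; the patent does not prescribe a particular geometry test.

```python
# Sketch: virtual-fence intrusion check on a tracked person's ground-plane position.
from typing import List, Tuple

Point = Tuple[float, float]

def inside_fence(p: Point, fence: List[Point]) -> bool:
    """Ray-casting point-in-polygon test on ground-plane coordinates (metres)."""
    x, y = p
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def check_intrusion(person_xy: Point, fence: List[Point]) -> None:
    if inside_fence(person_xy, fence):
        # Hypothetical hook: drive the voice loudspeaker on the gateway and
        # report the event to the background control system.
        print(f"ALARM: person at {person_xy} entered the virtual fence")

check_intrusion((2.0, 1.5), [(0.0, 0.0), (6.0, 0.0), (6.0, 4.0), (0.0, 4.0)])
```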
The terminal sensor platform may be of a fixed-angle type: the radial angle of the sensor is fixed and the platform has no adjustable structure, minimizing cost. This type is suitable for environments with a fixed angle and a fixed field of view.
The terminal sensor platform may be longitudinally adjustable: the longitudinal angle of the sensor is adjusted by a stepper motor, giving the platform near-far angle control. This type is suitable for environments where the terminal scene spans a range of distances and requires scene control.
The terminal sensor platform may be omnidirectionally adjustable: the sensor is mounted on an omnidirectional electrically controlled pan-tilt head, and in combination with back-end software the scene can be covered in all directions. This type is suitable for complex scenes and environments requiring full-field tracking.
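The sketch below illustrates the angle control used by the adjustable platform variants, converting requested pan and tilt angles into stepper-motor step counts. The steps-per-revolution and microstepping values, and the print placeholder standing in for the motor driver, are assumptions for illustration.

```python
# Sketch: converting pan/tilt angle commands into stepper-motor microsteps.
STEPS_PER_REV = 200   # typical 1.8-degree stepper (assumption)
MICROSTEPPING = 16    # driver microstep setting (assumption)

def degrees_to_steps(delta_deg: float) -> int:
    """Convert a requested angle change into a signed microstep count."""
    return round(delta_deg / 360.0 * STEPS_PER_REV * MICROSTEPPING)

class PanTiltHead:
    """Tracks commanded pan/tilt angles and emits step counts for a driver."""
    def __init__(self) -> None:
        self.pan_deg = 0.0
        self.tilt_deg = 0.0

    def move_to(self, pan_deg: float, tilt_deg: float) -> None:
        pan_steps = degrees_to_steps(pan_deg - self.pan_deg)
        tilt_steps = degrees_to_steps(tilt_deg - self.tilt_deg)
        # A real implementation would send these counts to the motor driver here.
        print(f"pan: {pan_steps:+d} steps, tilt: {tilt_steps:+d} steps")
        self.pan_deg, self.tilt_deg = pan_deg, tilt_deg

PanTiltHead().move_to(pan_deg=45.0, tilt_deg=-10.0)
```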
The invention can arrange an electric stepping guide rail across the upper space of the application scene to move the equipment overhead, allowing fixed-point movement areas and prevention-and-control deployments to be designed.
The invention can also calibrate physical ground patterns in the scene to serve as grab points and fixed points for a displacement robot; by integrating the equipment with the displacement robot, automatically controlled movement and fixed-point deployment and control within the scene are realized.
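As a sketch of the fixed-point deployment described above, the example below maps calibrated ground-pattern IDs to scene coordinates and dispatches the displacement robot to a named point. The pattern IDs, coordinates and command format are hypothetical; in practice they would come from the scene calibration step.

```python
# Sketch: named fixed points tied to calibrated ground patterns for the
# displacement robot. All identifiers and coordinates are illustrative.
from dataclasses import dataclass

@dataclass
class FixedPoint:
    pattern_id: str    # calibrated ground pattern the robot docks on
    x: float           # scene coordinates, metres
    y: float
    heading_deg: float # sensor facing direction at this point

DEPLOYMENT_POINTS = {
    "entrance": FixedPoint("P-01", 1.0, 0.5, 90.0),
    "corridor": FixedPoint("P-02", 5.0, 2.0, 0.0),
}

def dispatch(robot_send, point_name: str) -> None:
    """Ask the displacement robot to move the BALSB platform to a named point."""
    p = DEPLOYMENT_POINTS[point_name]
    robot_send({"pattern_id": p.pattern_id, "x": p.x, "y": p.y, "heading": p.heading_deg})

dispatch(print, "entrance")  # 'print' stands in for the robot's command channel
```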
The foregoing shows and describes the general principles, principal features, and advantages of the present invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which are presented only to illustrate its principles; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.

Claims (4)

1. A behavior-assisted audiovisual sensor system comprising a background control system, characterized in that: the background control system is communicatively connected to and controls a plurality of terminal sensor platforms; the background control system is communicatively connected to a remote control platform; behavior-assisted audio-visual sensors are arranged on the terminal sensor platforms, each comprising an edge computing gateway; each edge computing gateway is communicatively connected to and controls a plurality of depth cameras, and a voice loudspeaker is connected to each edge computing gateway; the background control system is used for coordinating the coordinates and parameters of each BALSB sensor and for communicating and exchanging data with other software.
2. The behavior-assisted audiovisual sensor system according to claim 1, characterized in that the background control system communicates with the terminal sensor platforms and the remote control platform via Ethernet or 5G.
3. The behavior-assisted audiovisual sensor system according to claim 1 or 2, characterized in that the edge computing gateway reads and processes the depth camera data, digitizing the depth map, the RGB map and the infrared map.
4. The behavior-assisted audiovisual sensor system according to claim 3, characterized in that the edge computing gateway uses the depth camera data to realize the following functions: distance measurement and fence recognition for objects within a scene, person localization and tracking, human body motion capture, sound localization, speech parsing, and dialogue.
CN202110456326.5A (priority date 2021-04-26, filing date 2021-04-26) · Behavior-assisted audio-visual sensor system · published as CN113140089A · Status: Pending

Priority Applications (1)

Application Number: CN202110456326.5A · Priority Date: 2021-04-26 · Filing Date: 2021-04-26 · Title: Behavior-assisted audio-visual sensor system


Publications (1)

Publication Number: CN113140089A · Publication Date: 2021-07-20

Family

ID=76812371

Family Applications (1)

Application Number: CN202110456326.5A · Title: Behavior-assisted audio-visual sensor system · Priority Date: 2021-04-26 · Filing Date: 2021-04-26 · Status: Pending

Country Status (1)

Country: CN · Link: CN113140089A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101827260B1 (en) * 2016-09-29 2018-03-22 (유)원테크 Smart fence
CN110032540A (en) * 2019-04-11 2019-07-19 北京宙心科技有限公司 A kind of artificial intelligence edge calculations equipment
CN111844039A (en) * 2020-07-23 2020-10-30 上海上实龙创智能科技股份有限公司 Wisdom space system based on robot control
CN111898524A (en) * 2020-07-29 2020-11-06 江苏艾什顿科技有限公司 5G edge computing gateway and application thereof
CN212752301U (en) * 2020-08-22 2021-03-19 深圳云塔信息技术有限公司 But remote management's multi-functional edge calculates gateway



Legal Events

Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2021-07-20)