CN117130313A - Intelligent full scene central control system and control method - Google Patents

Intelligent full scene central control system and control method

Info

Publication number
CN117130313A
Authority
CN
China
Prior art keywords
module
control system
equipment
internet
central processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311216582.2A
Other languages
Chinese (zh)
Inventor
朱正
周镕鑫
陈茂森
赵学进
石璨
杜辰君
陈秋实
张蔚
单侪
田凤军
张杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN202311216582.2A priority Critical patent/CN117130313A/en
Publication of CN117130313A publication Critical patent/CN117130313A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00 Advertising or display means not otherwise provided for
    • G09F19/12 Advertising or display means not otherwise provided for using special optical effects
    • G09F19/18 Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/14 Details
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/54 Accessories
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/25 Pc structure of the system
    • G05B2219/25257 Microcontroller

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Marketing (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The invention belongs to the technical field of embedded technology and the technical field of small target recognition and tracking, and in particular relates to an intelligent full scene central control system and control method. The system comprises a main control system and a handheld controllable flickering light-emitting device, the handheld controllable flickering light-emitting device being an external device operated by the user. The main control system comprises a chassis housing, a central processing module, an image acquisition module, a weak small target recognition module, a target tracking projection module, a first wireless communication module and a power supply module; the image acquisition module, the weak small target recognition module, the target tracking projection module, the first wireless communication module and the power supply module are all connected to the central processing module. The handheld controllable flickering light-emitting device comprises a light source switch, a second wireless communication module and a flicker frequency control module. The intelligent full scene central control system and control method provided by the invention are simple to use, accurate and quick in control, low in learning cost and good in compatibility, and can adapt to most complex scenes, thereby overcoming the defects of the prior art and providing a new solution for building intelligent ecological scenes of the Internet of Things.

Description

Intelligent full scene central control system and control method
Technical Field
The invention belongs to the technical field of embedded technology and the technical field of small target recognition and tracking, and particularly relates to an intelligent full scene central control system and a control method.
Background
With the rapid development of Internet of Things technology, intelligent ecological application scenes such as smart homes and smart workshops are becoming increasingly diversified and integrated. Internet of Things products of all kinds emerge one after another, greatly enriching and facilitating people's lives and changing their living habits.
However, the current intelligent ecological scene still has some problems:
the operation and the use of the first and most of the internet of things products cannot be supported by the mobile phone terminal, and the real convenience cannot be achieved.
Second, most intelligent ecological scenes involve a large number of connected Internet of Things devices. Products from different companies often require different apps to be downloaded for control, and even for products from the same company, selecting and controlling a large number of Internet of Things devices is extremely cumbersome.
Third, the learning cost of current Internet of Things equipment is still high, which is unfriendly to groups such as the elderly and children.
Disclosure of Invention
The invention aims to provide an intelligent full scene central control system and control method that are simple to use, accurate and quick in control, low in learning cost, good in compatibility and able to adapt to most complex scenes, thereby overcoming the defects of the prior art and providing a new solution for the construction of intelligent ecological scenes of the Internet of Things.
The technical scheme adopted by the invention is as follows:
an intelligent full scene central control system and a control method thereof comprise a main control system and a handheld controllable flickering lighting device, wherein the handheld controllable flickering lighting device is an external device for a user to operate;
the main control system comprises a chassis housing, a central processing module, an image acquisition module, a weak small target recognition module, a target tracking projection module, a first wireless communication module and a power supply module; the image acquisition module, the weak small target recognition module, the target tracking projection module, the first wireless communication module and the power supply module are all connected to the central processing module, and the central processing module communicates with and controls the other Internet of Things devices in the scene through the wireless communication module;
the handheld controllable scintillation light-emitting device mainly acts on to emit light beams to form light spots at a distance, the device is small and exquisite, the function is simple, three keys are arranged outside the handheld controllable scintillation light-emitting device, one key is responsible for emitting a switch of the light beams, the other two keys control lamplight to flash according to different frequencies so as to represent instructions for determining and canceling, and the handheld controllable scintillation light-emitting device comprises a light source switch, a second wireless communication module and a scintillation frequency control module, and can set more types of scintillation frequencies according to specific requirements. When the beam switch is turned on, the device sends an instruction to the main control system through the second wireless communication module, and the main control system turns on the dormant image acquisition module. The device can be composed of a light source switch capable of accurately generating long-distance fine light spots, such as a high-power light source or a laser light source, a wireless communication module capable of sending instructions to a main control system, and a flicker frequency control module.
The central processing module may use a motherboard built around a multi-core high-speed CPU chip, or around an embedded processor chip running an operating system. It also comprises storage elements such as an SD card and computing units such as a DSP, and carries out the control operations of the whole system.
The human-computer interaction interface can either be sent directly to the central processing module by the Internet of Things device for storage and later retrieval, or be generated automatically by the central processing module from templates and the parameters that the Internet of Things device exposes for adjustment. For example, if the connected device is a lamp that only needs to be switched on and off, the user interface can be simplified to a single switch button; if the light brightness is also controlled, the interface can additionally display the current brightness together with controls for increasing and decreasing it, so that the user can perform the corresponding operations as needed.
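As an illustration of this template-based generation, the following sketch (an assumption, not the patent's implementation) maps a hypothetical parameter list reported by a device onto simple interface widgets; the parameter schema and widget names are invented for the example.

```python
# Illustrative sketch: build a projected-UI description from the adjustable
# parameters a device reports. Schema and widget names are hypothetical.
from typing import Any, Dict, List

def build_interface(device_name: str, parameters: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Map each adjustable parameter onto a simple widget template."""
    widgets = []
    for p in parameters:
        if p["type"] == "switch":
            widgets.append({"widget": "toggle", "label": p["name"]})
        elif p["type"] == "range":
            widgets.append({
                "widget": "slider",
                "label": p["name"],
                "min": p["min"],
                "max": p["max"],
                "value": p.get("current", p["min"]),
            })
    # Confirm/cancel map to the two flicker-frequency commands of the handheld device.
    widgets.append({"widget": "button", "label": "confirm"})
    widgets.append({"widget": "button", "label": "cancel"})
    return {"device": device_name, "widgets": widgets}

# Example: a lamp with on/off only yields a single toggle; adding a brightness
# range yields a toggle plus a slider.
lamp = build_interface("ceiling lamp", [
    {"type": "switch", "name": "power"},
    {"type": "range", "name": "brightness", "min": 0, "max": 100, "current": 60},
])
print(lamp)
```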
The image acquisition module may use a visible light camera or an infrared camera to capture images of the intelligent ecological application scene, and may use one or more groups of cameras; the imaging range of the cameras must be large enough to cover all areas of the space where Internet of Things devices are located. A numerical control three-dimensional rotating shaft structure may also be adopted so that, when a command is received, the whole environment is scanned and image data are acquired.
The weak small target recognition module can be implemented with a high-performance FPGA chip as the core of its board, or with a heterogeneous development platform such as ZYNQ, to ensure real-time image recognition; the algorithms may also be implemented directly on modules such as a CPU or DSP. The weak small target recognition algorithm mainly preprocesses, segments and recognizes the image information, and finds the position of the light spot by analysing local image features such as gray values and gradients. The module also determines the flicker frequency of the light spot from the image acquisition frame rate and transmits the necessary information to the central processing module.
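The patent does not give the detection code; the sketch below illustrates one way the preprocessing, segmentation and local gray-value analysis could be realized, assuming an OpenCV 4-style API. The threshold and blob-size limits are hypothetical tuning values, not values from the patent.

```python
# Illustrative sketch only: locate a small bright spot by local gray value.
import cv2

def find_spot(frame_bgr, threshold=220, min_area=1, max_area=200):
    """Return the (x, y) centroid of the brightest small blob, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (3, 3), 0)                           # preprocessing
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)   # segmentation
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_intensity = None, -1.0
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < min_area or area > max_area:                         # keep only small blobs
            continue
        moments = cv2.moments(contour)
        if moments["m00"] == 0:
            continue
        cx = moments["m10"] / moments["m00"]
        cy = moments["m01"] / moments["m00"]
        intensity = float(gray[int(cy), int(cx)])                      # local gray value
        if intensity > best_intensity:
            best, best_intensity = (cx, cy), intensity
    return best
```

In practice the module would run such a detector on every frame and feed the sequence of detections into the flicker-frequency analysis described in the detailed embodiment.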
The target tracking projection module adopts a numerical control three-dimensional four-rotating-shaft structure and a projection device, so that the projection device can project in all directions within the intelligent ecological scene. The four-rotating-shaft structure comprises an inner ring, a first rotating shaft, a second rotating shaft, an outer ring and a third rotating shaft: the inner ring is fixed to the lower surface of the chassis housing through the first rotating shaft, the outer ring is connected to a supporting shaft through the second rotating shaft, and the projection device is fixed to the tangential plane of the outer ring through the third rotating shaft. The projection device achieves 180-degree angular control in a plane through a fourth rotating shaft and can also rotate freely through a certain angle within the tangential plane; combined, these movements allow projection in all directions. The specific rotation angles of the four shafts are calculated by the central processing module, which sends instructions to control them.
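The patent states that the central processing module computes the shaft rotation angles but gives no formulas; the sketch below shows one plausible conversion from a target position (in a chassis-centred coordinate frame, for example taken from the region database of step S2) into pan and tilt angles for two of the shafts. The coordinate convention is an assumption.

```python
# Illustrative sketch, not the patent's method: derive pan/tilt angles that
# point the projector at a target position given in the chassis frame.
import math

def shaft_angles(target_xyz):
    """Return (pan_deg, tilt_deg) for a target at (x, y, z) metres."""
    x, y, z = target_xyz
    pan = math.degrees(math.atan2(y, x))                  # rotation about the vertical shaft
    horizontal_dist = math.hypot(x, y)
    tilt = math.degrees(math.atan2(z, horizontal_dist))   # elevation about the horizontal shaft
    return pan, tilt

# Example: a device region 3 m ahead, 1 m to the left, 0.5 m below the chassis.
print(shaft_angles((3.0, -1.0, -0.5)))
```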
The first wireless communication module comprises a Bluetooth communication module, a WIFI communication module and other technologies; the Bluetooth communication module and the WIFI communication module are in signal connection with both the Internet of Things devices and the control system, ensuring that the Internet of Things devices and the control system can interconnect and communicate.
An intelligent full scene central control method, comprising the following steps:
Step S1, starting the system: the user installs a new Internet of Things device at a specific position in the intelligent ecological scene, the Internet of Things device sends its basic information to the main control system of the invention through wireless communication technology, and the main control system processes and stores the corresponding information to complete initialization;
Step S2, building the database: the main control system adds the position information of the area where the Internet of Things device is located to a region library, so that the device can be used at any time;
Step S3, generating the interface: the main control system generates a human-computer interaction interface according to the parameters that the Internet of Things device exposes for adjustment;
Step S4, identifying the signal: when a user wants to control an Internet of Things device, the user only needs to emit a beam of light with the handheld controllable flickering light-emitting device so that the light spot lands on the device, and the main control system automatically identifies the position of the light spot through the weak small target recognition module;
Step S5, human-computer interaction: the human-computer interaction interface is projected onto the corresponding area in real time; the user then moves the light spot with the handheld controllable flickering light-emitting device, issues confirm and cancel commands by controlling the flicker frequency of the spot, and adjusts parameters at specific positions on the human-computer interaction interface;
Step S6, controlling the device: the main control system automatically detects the interface function corresponding to the position of the light spot and the operation being performed by the user, thereby controlling the corresponding device. A minimal end-to-end sketch of steps S4 to S6 follows.
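To make the interplay of steps S4 to S6 concrete, the sketch below strings the pieces together; the region database, spot detector, command decoder, projector and device transport are hypothetical stand-ins injected as callables, not interfaces defined by the patent.

```python
# Illustrative sketch of steps S4-S6, not the patent's code.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class Region:
    device_id: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, point: Tuple[float, float]) -> bool:
        x, y = point
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def lookup_region(regions: Dict[str, Region], spot: Tuple[float, float]) -> Optional[Region]:
    """Step S4 continued: map the detected spot onto the region database of step S2."""
    for region in regions.values():
        if region.contains(spot):
            return region
    return None

def control_pass(regions, detect_spot, decode_command, project_ui, send_to_device):
    """One pass of steps S4-S6; the callables are injected stand-ins."""
    spot = detect_spot()                        # S4: spot position in scene coordinates
    if spot is None:
        return
    region = lookup_region(regions, spot)
    if region is None:
        return
    project_ui(region.device_id)                # S5: project that device's interface
    command = decode_command()                  # flicker frequency -> "confirm"/"cancel"/None
    if command is not None:
        send_to_device(region.device_id, command)   # S6: control the corresponding device
```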
The invention has the technical effects that:
The intelligent full scene central control system replaces the currently widespread scheme of controlling Internet of Things devices through smartphone software, and avoids the inconvenience of searching and operating within a mobile phone caused by an excessive number and variety of Internet of Things devices. The user only needs to point directly at the target to be controlled and perform the required operation, while all other work is handed over entirely to the machine. The more Internet of Things devices are connected, the more apparent the advantages of this scheme become: it is convenient and fast, the learning cost is low, and the operating experience better matches people's everyday habits, being more humane and intelligent.
The intelligent full scene central control system is suitable for various intelligent ecological application scenes, such as smart homes and smart workshops, and has a broad market and wide prospects.
The weak small target recognition algorithm provided by the invention enables the module to recognize target information accurately and in real time, and to work stably in various complex environments, ensuring that user demands are communicated effectively.
The target tracking projection module adopts a numerical control three-dimensional rotating shaft structure, so that the projection device can project in all directions within the intelligent ecological scene. The module can project in real time onto the area where the target is located and, in combination with the other modules, achieve real-time and accurate command control.
Drawings
Fig. 1 is a schematic diagram of the present invention.
Fig. 2 is a schematic diagram of a central processing module according to the present invention.
Fig. 3 is a schematic diagram of a front view of a portion of a central control system according to the present invention.
Fig. 4 is a schematic rear view of the structure of fig. 3.
Fig. 5 is a right-side view of the structure of fig. 4.
Fig. 6 is a schematic top view of fig. 4.
Fig. 7 is a schematic diagram of the control method of the present invention.
Fig. 8 is a flow chart of the control method of the present invention.
In the drawings, the list of components represented by the various numbers is as follows:
1. a chassis housing; 2. an image acquisition module; 3. a first rotating shaft; 4. an outer ring; 5. a third rotating shaft; 6. a projection device.
Detailed Description
The present invention will be specifically described below with reference to examples in order to make its objects and advantages more apparent. It should be understood that the following text is intended to describe only one or more specific embodiments of the invention and is not intended to strictly limit the scope of the invention as claimed.
As shown in figs. 1-6, an intelligent full scene central control system comprises a main control system and a handheld controllable flickering light-emitting device, the handheld controllable flickering light-emitting device being an external device operated by the user;
the main control system comprises a chassis housing 1, a central processing module, an image acquisition module 2, a weak small target recognition module, a target tracking projection module, a first wireless communication module and a power supply module; the image acquisition module, the weak small target recognition module, the target tracking projection module, the first wireless communication module and the power supply module are all connected to the central processing module, and the central processing module communicates with and controls the other Internet of Things devices in the scene through the wireless communication module;
the hand-held controllable flashing lighting device mainly has the function of emitting beamlets to form light spots at a distance, is small and exquisite in device and simple in function, and is provided with three keys, wherein one key is responsible for a switch for emitting the beamlets, and the other two keys control lamplight to flash according to different frequencies so as to represent instructions for determining and canceling. The hand-held controllable flickering light emitting device comprises a light source switch, a second wireless communication module and a flickering frequency control module. When the beam switch is turned on, the device sends an instruction to the main control system through wireless communication, and the main control system turns on the dormant image acquisition module. The device can be composed of a light source switch capable of accurately generating long-distance fine light spots, such as a high-power light source or a laser light source, a wireless communication module capable of sending instructions to a main control system, and a flicker frequency control module.
As shown in fig. 1 and fig. 2, the central processing module includes a computing unit, an information processing unit and a storage unit; it also integrates a human-computer interaction system and a control unit, and is implemented with devices that meet the computing and information processing requirements. The central processing module may use a motherboard with a multi-core high-speed CPU chip as its core, or a motherboard with an embedded processor chip running an operating system as its core. It also comprises storage elements such as an SD card and computing units such as a DSP, and carries out the control operations of the whole system.
As shown in fig. 1 and fig. 2, the human-computer interaction interface can either be sent directly to the central processing module by the Internet of Things device for storage and later retrieval, or be generated automatically by the central processing module from templates and the parameters that the device exposes for adjustment. For example, if the connected device is a lamp that only needs to be switched on and off, the user interface can be simplified to a single switch button; if the light brightness is also controlled, the interface can additionally display the current brightness together with controls for increasing and decreasing it, so that the user can perform the corresponding operations as needed.
As shown in fig. 3, fig. 4 and fig. 5, the image acquisition module 2 may use a visible light camera or an infrared camera to capture images of the intelligent ecological application scene, and may use one or more groups of cameras; the imaging range of the cameras must be large enough to cover all areas of the space where Internet of Things devices are located. A numerical control three-dimensional rotating shaft structure may also be adopted so that, when a command is received, the whole environment is scanned and image data are acquired.
As shown in fig. 1, fig. 5 and fig. 6, the weak small target recognition module can be implemented with a high-performance FPGA chip as the core of its board, or with a heterogeneous development platform such as ZYNQ, to ensure real-time image recognition; the algorithms may also be implemented directly on modules such as a CPU or DSP. The weak small target recognition algorithm mainly preprocesses, segments and recognizes the image information, and finds the position of the light spot by analysing local image features such as gray values and gradients. The module also determines the flicker frequency of the light spot from the image acquisition frame rate and transmits the necessary information to the central processing module.
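The frame-rate-based frequency analysis is not detailed in the patent; the following sketch shows one simple way to estimate the flicker frequency from a window of per-frame spot detections and map it back onto the confirm/cancel commands. The window length, tolerance and command frequencies are assumptions, and the camera frame rate must exceed twice the highest flicker frequency for this to work.

```python
# Illustrative sketch only: estimate the spot's flicker frequency from the
# camera frame rate by counting on/off transitions in a window of frames.
def estimate_flicker_hz(spot_visible, frame_rate_hz):
    """spot_visible: list of booleans, one per frame, over the analysis window."""
    if len(spot_visible) < 2:
        return 0.0
    transitions = sum(
        1 for prev, cur in zip(spot_visible, spot_visible[1:]) if prev != cur
    )
    window_s = len(spot_visible) / frame_rate_hz
    # Two transitions (off -> on -> off) make up one full flicker cycle.
    return transitions / (2.0 * window_s)

def classify_command(freq_hz, confirm_hz=5.0, cancel_hz=2.0, tolerance=0.5):
    """Map an estimated frequency onto the assumed confirm/cancel frequencies."""
    if abs(freq_hz - confirm_hz) <= tolerance:
        return "confirm"
    if abs(freq_hz - cancel_hz) <= tolerance:
        return "cancel"
    return None
```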
Combined with the weak small target recognition algorithm, the weak small target recognition module can recognize target information accurately and in real time, can work stably in various complex environments, and ensures that user demands are communicated effectively.
As shown in fig. 3, fig. 4 and fig. 5, the target tracking projection module comprises a numerical control three-dimensional four-rotating-shaft structure and a projection device 6, so that the projection device can project in all directions within the intelligent ecological scene. The four-rotating-shaft structure comprises an inner ring, a first rotating shaft 3, a second rotating shaft, an outer ring 4 and a third rotating shaft 5: the inner ring is fixed to the lower surface of the chassis housing 1 through the first rotating shaft 3, the outer ring 4 is connected to a supporting shaft through the second rotating shaft, and the projection device 6 is fixed to the tangential plane of the outer ring 4 through the third rotating shaft 5, so that it can rotate around the ring plane. The projection device 6 achieves 180-degree angular control in a plane through a fourth rotating shaft and can also rotate through a certain angle within the tangential plane, which together enable projection in all directions. The specific rotation angles of the four shafts are calculated by the central processing module, which sends instructions to control them; the module can thus project in real time onto the area where the target is located and achieve accurate real-time control.
As shown in fig. 1, the first wireless communication module comprises technologies such as a Bluetooth communication module and a WIFI communication module; the Bluetooth communication module and the WIFI communication module are in signal connection with both the Internet of Things devices and the control system, ensuring that the Internet of Things devices and the control system can interconnect and communicate.
As shown in figs. 7-8, an intelligent full scene central control method comprises the following steps:
Step S1, starting the system: the user installs a new Internet of Things device at a specific position in the intelligent ecological scene, the Internet of Things device sends its basic information to the main control system of the invention through wireless communication technology, and the main control system processes and stores the corresponding information to complete initialization;
Step S2, building the database: the main control system adds the position information of the area where the Internet of Things device is located to a region library, so that the device can be used at any time;
Step S3, generating the interface: the main control system generates a human-computer interaction interface according to the parameters that the Internet of Things device exposes for adjustment;
Step S4, identifying the signal: when a user wants to control an Internet of Things device, the user only needs to emit a beam of light with the handheld controllable flickering light-emitting device so that the light spot lands on the device, and the main control system automatically identifies the position of the light spot through the weak small target recognition module;
Step S5, human-computer interaction: the human-computer interaction interface is projected onto the corresponding area in real time; the user then moves the light spot with the handheld controllable flickering light-emitting device, issues confirm and cancel commands by controlling the flicker frequency of the spot, and can adjust parameters at specific positions on the human-computer interaction interface;
Step S6, controlling the device: the main control system automatically detects the interface function corresponding to the position of the light spot and the operation being performed by the user, thereby controlling the corresponding device. This replaces the currently widespread scheme of controlling Internet of Things devices through smartphone software and avoids the inconvenience of searching and operating within a mobile phone caused by an excessive number and variety of Internet of Things devices: the user only needs to point directly at the target to be controlled and perform the required operation, while all other work is handed over entirely to the machine. The more Internet of Things devices are connected, the more apparent the advantages of this scheme become; the method is convenient and fast, the learning cost is low, and the operating experience better matches people's everyday habits, being more humane and intelligent.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications are to be regarded as falling within the scope of the present invention. Structures, devices and methods of operation not specifically described and illustrated herein are, unless otherwise indicated and limited, implemented according to conventional means in the art.

Claims (10)

1. An intelligent full scene central control system, characterized in that it comprises a main control system and a handheld controllable flickering light-emitting device, the handheld controllable flickering light-emitting device being an external device;
the main control system comprises a chassis housing (1), a central processing module, an image acquisition module (2), a weak small target recognition module, a target tracking projection module, a first wireless communication module and a power supply module; the image acquisition module, the weak small target recognition module, the target tracking projection module, the first wireless communication module and the power supply module are all connected to the central processing module, and the central processing module communicates with and controls the other Internet of Things devices in the scene through the wireless communication module;
the portable controllable flashing lighting device is characterized in that three keys are arranged outside the portable controllable flashing lighting device, one key is a switch for emitting light beams, the other two keys are respectively determined and cancelled instructions, and the portable controllable flashing lighting device comprises a light source switch, a second wireless communication module and a flashing frequency control module.
2. An intelligent full scene central control system according to claim 1, characterized in that: the central processing module comprises a computing unit, an information processing unit, a storage unit, a man-machine interaction system and a control unit.
3. An intelligent full scene central control system according to claim 2, characterized in that: during initialization the central processing module divides the whole intelligent ecological scene into a plurality of areas and stores them in the storage unit, and the central processing module acquires the position information and device information of the Internet of Things devices in the intelligent ecological scene, stores them and establishes a database.
4. An intelligent full scene central control system according to claim 2, characterized in that: the human-computer interaction interface is sent directly to the central processing module by the Internet of Things device for storage and retrieval, or is generated automatically by the central processing module according to the parameters of the Internet of Things device and templates.
5. An intelligent full scene central control system according to claim 1, characterized in that: the image acquisition module (2) adopts a plurality of groups of cameras to acquire images in the intelligent ecological application scene, and the image acquisition module (2) further comprises a numerical control three-dimensional rotating shaft structure.
6. An intelligent full scene central control system according to claim 1, characterized in that: the weak small target recognition module is composed of a microprocessor comprising an FPGA, ZYNQ, CPU or DSP module, and a weak small target recognition algorithm preprocesses, segments and recognizes the image information and transmits it to the central processing module.
7. An intelligent full scene central control system according to claim 1, characterized in that: the target tracking projection module comprises a numerical control three-dimensional four-rotating-shaft structure and projection equipment (6).
8. The intelligent full scene central control system according to claim 7, characterized in that: the numerical control three-dimensional four-rotating-shaft structure comprises an inner ring, a first rotating shaft (3), a second rotating shaft, an outer ring (4) and a third rotating shaft (5); the inner ring is fixed to the lower surface of the chassis housing (1) through the first rotating shaft (3); the outer ring (4) is connected to a supporting shaft through the second rotating shaft; the projection equipment (6) is fixed to the tangential plane of the outer ring (4) through the third rotating shaft (5); the projection equipment (6) achieves 180-degree angular control in a plane through a fourth rotating shaft; and the numerical control three-dimensional four-rotating-shaft structure is controlled by the central processing module.
9. An intelligent full scene central control system according to claim 1, characterized in that: the first wireless communication module comprises a Bluetooth communication module and a WIFI communication module, and the Bluetooth communication module and the WIFI communication module are both in signal connection with the Internet of things equipment and the control system.
10. An intelligent full scene central control method applied to the intelligent full scene central control system according to any one of claims 1-9, characterized in that the control method comprises the following steps:
Step S1, starting the system: the user installs a new Internet of Things device at a specific position in the intelligent ecological scene, the Internet of Things device sends its basic information to the main control system through wireless communication technology, and the main control system processes and stores the corresponding information to complete initialization;
Step S2, building the database: the main control system adds the position information of the area where the Internet of Things device is located to a region library, so that the device can be used at any time;
Step S3, generating the interface: the main control system generates a human-computer interaction interface according to the parameters that the Internet of Things device exposes for adjustment;
Step S4, identifying the signal: when a user wants to control an Internet of Things device, the user only needs to emit a beam of light with the handheld controllable flickering light-emitting device so that the light spot lands on the device, and the main control system automatically identifies the position of the light spot through the weak small target recognition module;
Step S5, human-computer interaction: the human-computer interaction interface is projected onto the corresponding area in real time; the user then moves the light spot with the handheld controllable flickering light-emitting device and issues confirm and cancel commands by controlling the flicker frequency of the spot;
Step S6, controlling the device: the main control system automatically detects the interface function corresponding to the position of the light spot and the operation being performed by the user, thereby controlling the corresponding device.
CN202311216582.2A 2023-09-20 2023-09-20 Intelligent full scene central control system and control method Pending CN117130313A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311216582.2A CN117130313A (en) 2023-09-20 2023-09-20 Intelligent full scene central control system and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311216582.2A CN117130313A (en) 2023-09-20 2023-09-20 Intelligent full scene central control system and control method

Publications (1)

Publication Number Publication Date
CN117130313A true CN117130313A (en) 2023-11-28

Family

ID=88858231

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311216582.2A Pending CN117130313A (en) 2023-09-20 2023-09-20 Intelligent full scene central control system and control method

Country Status (1)

Country Link
CN (1) CN117130313A (en)

Similar Documents

Publication Publication Date Title
US8872442B2 (en) Illumination system
EP3096184B1 (en) Method and device for controlling flash light and terminal
US8190278B2 (en) Method for control of a device
US9301372B2 (en) Light control method and lighting device using the same
US10097801B2 (en) Front projection eReader system
CN102333400B (en) Lighting remote control system
CN106062842B (en) Lighting system and controller and mobile subscriber terminal for controlling it
EP3033927B1 (en) Lighting control via a mobile computing device
WO2016104257A1 (en) Illumination device
CN102331884A (en) Projecting system with touch control projecting images
TWI499223B (en) Remote control system for pointing robot
JPH09265346A (en) Space mouse, mouse position detection device and visualization device
JP2009110688A (en) Illuminating device, illuminating method, and program
WO2020010577A1 (en) Micro projector having ai interaction function and projection method therefor
US11365871B2 (en) Human tracking to produce improved jobsite lighting
CN103092357B (en) A kind of implementation method of Scan orientation and projected keyboard device
CN101126965B (en) Optical movement sensing device and its sensing method
CN117130313A (en) Intelligent full scene central control system and control method
JP4296607B2 (en) Information input / output device and information input / output method
CN113641237A (en) Method and system for feature operation mode control in an electronic device
CN105657187A (en) Visible light communication method and system and equipment
CN107368200B (en) Remote control device
US10364946B1 (en) Smart bulb system
KR20180006580A (en) Smart clothes displaying pattern and color according to surrounding environment and method for controlling the same
CN217060959U (en) Interactive projection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination