CN113301315B - Projection system based on infrared touch screen frame - Google Patents

Projection system based on infrared touch screen frame

Info

Publication number
CN113301315B
CN113301315B
Authority
CN
China
Prior art keywords
infrared
neural network
touch screen
projector
identification mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110482250.3A
Other languages
Chinese (zh)
Other versions
CN113301315A (en)
Inventor
李栋 (Li Dong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangxi Jiawei Technology Corp ltd
Original Assignee
Guangxi Jiawei Technology Corp ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangxi Jiawei Technology Corp ltd
Priority to CN202110482250.3A
Publication of CN113301315A
Application granted
Publication of CN113301315B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/185 Electrical failure alarms
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B5/00 Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied
    • G08B5/22 Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied using electric transmission; using electromagnetic transmission
    • G08B5/36 Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied using electric transmission; using electromagnetic transmission using visible light sources
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Abstract

The invention relates to the technical field of short-focus projection, and particularly discloses a projection system based on an infrared touch screen frame, which comprises: a macro projector, a projection display screen, an ops computer, a first Wifi module, an infrared touch screen frame, a controller and a second Wifi module, wherein the first Wifi module is provided with a first antenna and the second Wifi module is provided with a second antenna. The first Wifi module is connected with the ops computer, the ops computer is connected with a projection control system of the macro projector, the second Wifi module is connected with the controller, and the controller is connected with an infrared control system of the infrared touch screen frame. The first Wifi module is wirelessly connected with the second Wifi module; the second Wifi module sends infrared touch screen data to the ops computer, and the data are processed by the ops computer, converted into control instructions for the macro projector and transmitted to the projection control system. Because the connection is made over Wifi, the use site stays neat and attractive, and the risk of a user tripping over a cable during use is avoided.

Description

Projection system based on infrared touch screen frame
Technical Field
The invention belongs to the technical field of short-focus projection, and particularly relates to a projection system based on an infrared touch screen frame.
Background
A macro projector, also called a short-focus projector, has a very small throw ratio, i.e. the ratio of the distance between the projector and the screen to the screen size. A macro projector can project a picture of about 80-120 inches over a short distance. Because the projection distance is so short, the projector does not have to be installed in front of the user, which prevents the projector light from shining directly into the speaker's eyes and keeps the speaker's shadow from being cast on the screen and blocking the picture. In addition, when short-focus interactive projection is combined with electronic equipment such as an infrared touch screen frame, a user can easily write and mark on the projected picture while operating the computer, realizing interaction with the computer and making presentations more vivid.
In addition, because the throw distance of a macro projector is very short, the projector is usually placed directly below the front of the projection display screen. To improve usability and reduce obstacles in front of the projection display screen, the macro projector may instead be placed above the front of the projection display screen, giving the user a larger range of movement. Accordingly, macro projectors that can be configured as required by the user have appeared in recent years (one type has an angle-adjustable lens; the other provides a lens for placement above or below the front of the projection display screen, with the lens used for projection selected as required), which greatly increases the adaptability of the macro projector.
Furthermore, macro projectors currently on the market are generally connected to an infrared touch screen frame in a wired manner: the infrared touch screen frame captures the user's actions on the projection, converts them into image data or action instructions, and the processing results are displayed on the projection screen. However, the connection cable has to be routed before use, which is not only troublesome but also unattractive, and the user may catch the cable during use, possibly breaking it.
Disclosure of Invention
The invention aims to provide a projection system based on an infrared touch screen frame, so as to overcome the defect that an existing macro projector is connected to the infrared touch screen frame in a wired mode and the user may catch the connecting cable.
In order to achieve the above object, the present invention provides a projection system based on an infrared touch screen frame, comprising: a macro projector, a projection display screen, an ops computer, a first Wifi module, an infrared touch screen frame, a controller and a second Wifi module, wherein the first Wifi module is provided with a first antenna, the second Wifi module is provided with a second antenna, the projection display screen is used for receiving a projection screen of the macro projector, and the infrared touch screen frame is arranged on the projection display screen;
the first Wifi module is connected with the ops computer, the ops computer is connected with a projection control system of the macro projector, the second Wifi module is connected with the controller, and the controller is connected with an infrared control system of the infrared touch screen frame to acquire the infrared touch screen data and send it to the second Wifi module;
the first Wifi module is wirelessly connected with the second Wifi module to receive the infrared touch screen data, the second Wifi module sends the infrared touch screen data to the ops computer, and the infrared touch screen data are converted into control instructions of the macro projector after being processed by the ops computer and transmitted to the projection control system.
Preferably, in the above technical scheme, the first antenna includes a first sub-antenna and a second sub-antenna, the ops computer and the first Wifi module are located in the macro projector, the first sub-antenna is located above the front side of the macro projector, and the second sub-antenna is located at the bottom of the macro projector.
Preferably, in the above technical solution, the system further comprises an automatic tracking camera and a first convolutional neural network module, wherein an infrared identification mark is arranged at the lower part of the front side of the infrared touch screen frame, and the automatic tracking camera is arranged at the front side of the macro projector and connected to the ops computer;
the automatic tracking camera collects infrared identification mark images at different angles and inputs the infrared identification mark images into a first convolution neural network of the first convolution neural network module for learning training so as to form the first convolution neural network module capable of identifying the infrared identification marks;
setting an initial position of the automatic tracking camera to be aligned with the infrared identification mark, and when the macro projector moves, the automatic tracking camera collects the image of the infrared identification mark in real time and inputs the image to the first convolutional neural network module to identify the real-time position of the infrared identification mark;
and the automatic tracking camera is rotated according to the real-time position, and the first Wifi module is selectively communicated with the first sub-antenna or the second sub-antenna according to the rotation angle.
Preferably, in the above technical solution, acquiring infrared identification mark images at different angles and inputting the acquired infrared identification mark images into the first convolutional neural network of the first convolutional neural network module for learning training specifically includes:
dividing a plurality of infrared identification mark images into a training set and a testing set;
inputting a training set into the first convolutional neural network for training so as to extract the characteristics of the infrared identification mark, acquiring the position of the infrared identification mark according to the characteristics of the infrared identification mark, and generating network parameters of the first convolutional neural network after the set iteration times are met;
and testing the network parameters of the first convolutional neural network by using a test set to obtain the optimal first convolutional neural network with optimal network parameters.
Preferably, in the above technical scheme, the infrared identification mark is a Wifi indicator light with a certain shape, and the Wifi indicator light is connected to the controller.
Preferably, in the above technical solution, the automatic tracking camera rotating by an angle according to the real-time position specifically includes: establishing coordinates in the image acquired in real time, the first convolutional neural network module acquiring the initial position of the infrared identification mark in the coordinates and storing it, acquiring the offset of the infrared identification mark in the coordinates in real time when the macro projector moves, and the automatic tracking camera rotating by an angle according to the offset.
Preferably, in the above technical solution, the system further comprises a connection port, and the connection port is connected to the ops computer and is disposed on the housing of the macro projector.
Preferably, in the above technical solution, the system further comprises a second convolutional neural network module and an alarm, the alarm is connected to the ops computer, and the Wifi indicator light is red when working normally and blue when working abnormally;
collecting images of the Wifi indicator lamp with different angles and different colors, inputting the images into a second convolutional neural network of the second convolutional neural network module for learning training to form the second convolutional neural network module capable of identifying the Wifi indicator lamp;
the automatic tracking camera acquires images of the Wifi indicator lamp in real time, inputs the images into the second convolutional neural network module to identify the color of the Wifi indicator lamp, and controls the alarm to give an alarm when the color is blue.
Compared with the prior art, the invention has the following beneficial effects:
1. according to the projection system based on the infrared touch screen frame, infrared touch screen data of the infrared touch screen frame are transmitted to an ops computer in a wireless Wifi mode for processing and conversion, and then transmitted to a macro projector for screen projection, conventional wiring harness arrangement is omitted in the wireless Wifi mode, and the projection system can be used only by successfully pairing the Wifi. The wiring harness is not required to be arranged, so that the use site is neat and attractive, and the defect that a user is stumbled by wires in the use process is avoided.
2. The ops computer and the first Wifi module are arranged in the macro projector, and the two sub-antennas are arranged at the top and the bottom of the macro projector. When the macro projector is located below the front of the projection display screen, the first sub-antenna is used; when the macro projector is located above the front of the projection display screen, the second sub-antenna is used. This not only keeps the Wifi signal between the macro projector and the infrared touch screen frame free from electronic interference, improving the speed of infrared touch screen data transmission and of screen projection, but also saves antenna power consumption because only one antenna at a time is used for Wifi transmission.
3. The invention uses a convolutional neural network to identify the position of the infrared identification mark, which gives high recognition efficiency, and controls the camera to track automatically by means of the coordinate offset so that the sub-antenna in use can be adjusted. Combining the automatic tracking technique with the image recognition technique gives a better result.
4. The invention can also identify the color of the Wifi indicator lamp with a convolutional neural network to determine the fault state of the second Wifi module and raise an alarm, giving the macro projection system a fault early-warning function.
Drawings
Fig. 1 is a block diagram of a projection system based on an infrared touch screen frame according to the present invention.
Fig. 2 is a flow chart of a method of selecting a first sub-antenna and a second sub-antenna according to the present invention.
Fig. 3 is a schematic diagram of the NIN network architecture according to the present invention.
Fig. 4 is a schematic diagram of Wifi signal propagation of the first sub-antenna according to the present invention.
Fig. 5 is a schematic diagram of Wifi signal propagation of the second sub-antenna according to the present invention.
Detailed Description
Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings, but it should be understood that the scope of the present invention is not limited to the specific embodiments.
As shown in fig. 1, the infrared touch screen frame based projection system in this embodiment includes: a macro projector 1, a projection display screen 2, an ops computer 3, a first Wifi module 4, an infrared touch screen frame 8, a controller 9, a second Wifi module 10 and an automatic tracking camera 5. The first Wifi module 4 is equipped with a first antenna 41, the second Wifi module 10 is equipped with a second antenna 101, the projection display screen 2 is used for receiving the picture projected by the macro projector 1, and the infrared touch screen frame 8 is arranged on the projection display screen 2.
It can be understood that the ops computer 3 is preferably a micro ops computer available on the market, the infrared touch screen frame 8 can also be a commonly used infrared multi-point touch screen frame on the market, the controller 9 is preferably a DPS controller, and the automatic tracking camera 5 only needs to be a common commercial model.
The first Wifi module 4 is connected with the ops computer 3, the ops computer 3 is connected with the projection control system 11 of the macro projector 1, the second Wifi module 10 is connected with the controller 9, and the controller 9 is connected with the infrared control system 81 of the infrared touch screen frame 8 to acquire infrared touch screen data and send it to the second Wifi module 10. The first Wifi module 4 is wirelessly connected with the second Wifi module 10 to receive the infrared touch screen data, the second Wifi module 10 sends the infrared touch screen data to the ops computer 3, and the infrared touch screen data are processed by the ops computer 3 and then converted into control instructions of the macro projector 1 to be transmitted to the projection control system 11.
In use, the first Wifi module 4 and the second Wifi module 10 are paired. The user performs a trigger action at the infrared touch screen frame 8; the infrared touch screen frame 8 captures the trigger action, converts it into an infrared touch screen data signal and transmits it to the controller 9; the controller 9 transmits the infrared touch screen data signal to the ops computer 3 through the second Wifi module 10 and the first Wifi module 4; the ops computer 3 converts it into control instructions for the macro projector 1 (including a control sub-instruction and a data display sub-instruction); and the projection control system 11 receives and processes the control instructions and projects the picture onto the projection display screen 2 through the lens.
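The data path just described can be illustrated with a minimal sketch. This is not code from the patent: the use of UDP datagrams between the paired Wifi modules, the address and port, the JSON event format and the instruction fields are all illustrative assumptions.

```python
import json
import socket

# Controller side (behind the second Wifi module): each touch event captured by
# the infrared touch screen frame is sent as a small JSON datagram over the
# paired Wifi link.
OPS_ADDR = ("192.168.4.1", 9100)   # assumed address of the ops computer

def send_touch_event(sock: socket.socket, x: float, y: float, action: str) -> None:
    event = {"x": x, "y": y, "action": action}        # e.g. "down", "move", "up"
    sock.sendto(json.dumps(event).encode("utf-8"), OPS_ADDR)

# ops computer side (behind the first Wifi module): receive touch events and
# convert each one into a control instruction for the projection control
# system, i.e. a control sub-instruction plus a data display sub-instruction.
def receive_and_convert(port: int = 9100) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _ = sock.recvfrom(1024)
        event = json.loads(data.decode("utf-8"))
        instruction = {
            "control": {"type": "touch", "action": event["action"]},
            "display": {"draw_at": (event["x"], event["y"])},
        }
        forward_to_projection_control(instruction)

def forward_to_projection_control(instruction: dict) -> None:
    # Hypothetical hand-off to the projection control system 11.
    print("to projection control system:", instruction)
```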
Furthermore, in this embodiment, the macro projector 1 and the ops computer 3 are designed as one integrated unit, which makes Wifi signal transmission more reasonable for a macro projector that may be placed either above or below the front of the projection display screen. To make transmission more energy-saving and less prone to electromagnetic interference after this integrated design, the first antenna 41 of the first Wifi module 4 is optimized: the first antenna 41 is divided into a first sub-antenna 411 and a second sub-antenna 412, the ops computer 3 is provided with a first convolutional neural network module 31, the ops computer 3 and the first Wifi module 4 are disposed in the macro projector 1, the first sub-antenna 411 is disposed above the front side of the macro projector 1, and the second sub-antenna 412 is disposed at the bottom of the macro projector 1. The lower part of the front side of the infrared touch screen frame 8 is provided with an infrared identification mark; the infrared identification mark is a Wifi indicator lamp with a certain shape and is connected with the controller 9; and the automatic tracking camera 5 is arranged on the front side of the macro projector 1 and is connected with the ops computer 3.
The method for selecting the first sub-antenna 411 and the second sub-antenna 412 in this embodiment is described in detail below to make it clear to those skilled in the art, and as shown in fig. 2, the method specifically includes:
step S1, training the first convolutional neural network module 31: the automatic tracking camera 5 collects images of the infrared identification marks at different angles and inputs the images into the first convolutional neural network of the first convolutional neural network module 31 for learning training, so as to form the first convolutional neural network module 31 capable of identifying the infrared identification marks.
Specifically, acquiring infrared identification mark images at different angles and inputting the infrared identification mark images into a first convolution neural network of a first convolution neural network module to perform learning training specifically comprises the following steps:
and S11, dividing the infrared identification mark images into a training set and a testing set, wherein the proportion of the training set to the testing set is 5:1.
S12, inputting the training set into a first convolution neural network for training to extract the characteristics of the infrared identification mark, acquiring the position of the infrared identification mark according to the characteristics of the infrared identification mark, and generating network parameters of the first convolution neural network after the set iteration times are met;
and S13, testing the network parameters of the first convolutional neural network by using the test set to obtain the optimal first convolutional neural network with optimal network parameters.
It can be understood that the infrared identification mark in this embodiment is a Wifi indicator light, and therefore, under any environment, the shape of the Wifi indicator light can be shot by the automatic tracking camera 5.
It can be understood that any convolutional neural network can be selected in this embodiment. For example, a multilayer perceptron convolutional neural network model based on the NIN (Network in Network) structure may be used, and a deep learning model is constructed on this basis. The overall NIN structure consists of several multilayer perceptron convolutional layers (Mlpconv) and a full connection layer. Each Mlpconv layer applies a micro-network, namely a multilayer perceptron, as the convolution operation over each local receptive field, and consists of one convolutional layer and two perceptron layers. A full connection layer is added after the last Mlpconv layer, and the recognition output of the picture target object is produced by a linear regression method. As shown in fig. 3, the deep learning model network architecture provided by the embodiment of the present invention for identifying and detecting a picture target object with high accuracy includes:
Input layer: the input target object picture is a three-channel true-color image, and the input data size is 32x32x3.
First Mlpconv layer: the convolution kernels are set to be 16 in size of 3x3x3, the perceptron is realized through 1x1 convolution, namely 1x1x16 and 1x1x32, and finally dimensionality reduction processing is carried out through pooling layers.
Second Mlpconv layer: the convolution kernels are set to be 32 in size of 3x3x16, the perceptron is realized by 1x1 convolution, namely 1x1x32 and 1x1x32, and finally dimensionality reduction processing is carried out through pooling layers.
Third Mlpconv layer: the convolution kernels are set to be 64 in size of 3x3x32, the perceptron is realized by 1x1 convolution, namely 1x1x64 and 1x1x64, and finally dimensionality reduction processing is carried out through a pooling layer.
Fourth Mlpconv layer: the convolution kernels are set to be 128 in size of 3x3x64, the perceptron is realized by 1x1 convolution, namely 1x1x128 and 1x1x128, and finally the dimensionality reduction processing is carried out through the pooling layer.
Full connection layer: the feature maps output by the convolutional and pooling layers carry higher-level features, and the two-dimensional feature map is mapped into a one-dimensional space by the full connection layer. 20x1 full connection parameters are set, the picture target object detection value is output by a linear regression method, a Leaky-ReLU function is used for nonlinear activation, and L1 regularization is applied.
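The layer specification above can be expressed as a short PyTorch sketch of an NIN-style model. It is not the authors' implementation: the 3x3 channel widths (16/32/64/128) and the 20-unit fully connected output follow the text, while the padding, the 2x2 max pooling, the exact 1x1 widths of the first block and the Leaky-ReLU slope are assumptions made to keep the dimensions consistent.

```python
import torch
import torch.nn as nn

class MlpConv(nn.Module):
    """One Mlpconv block: a 3x3 convolution followed by two 1x1 'perceptron'
    convolutions and a 2x2 max pooling step for dimensionality reduction."""
    def __init__(self, in_ch: int, conv_ch: int, mlp_ch: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, conv_ch, kernel_size=3, padding=1),
            nn.LeakyReLU(0.1),
            nn.Conv2d(conv_ch, mlp_ch, kernel_size=1),
            nn.LeakyReLU(0.1),
            nn.Conv2d(mlp_ch, mlp_ch, kernel_size=1),
            nn.LeakyReLU(0.1),
            nn.MaxPool2d(kernel_size=2, stride=2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)

class NinMarkDetector(nn.Module):
    """NIN-style model for detecting the infrared identification mark.
    Input: 32x32x3 true-colour image; output: a 20-unit detection vector
    produced by the fully connected layer."""
    def __init__(self, out_dim: int = 20):
        super().__init__()
        self.features = nn.Sequential(
            MlpConv(3, 16, 16),     # first Mlpconv layer
            MlpConv(16, 32, 32),    # second Mlpconv layer
            MlpConv(32, 64, 64),    # third Mlpconv layer
            MlpConv(64, 128, 128),  # fourth Mlpconv layer
        )
        # Four 2x2 poolings reduce the 32x32 input to 2x2 before the FC layer.
        self.fc = nn.Linear(128 * 2 * 2, out_dim)
        self.act = nn.LeakyReLU(0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.act(self.fc(x))

def l1_penalty(model: nn.Module, weight: float = 1e-5) -> torch.Tensor:
    """L1 regularisation term, as mentioned for the full connection layer."""
    return weight * sum(p.abs().sum() for p in model.parameters())

# One assumed training step on dummy data, using the L1 penalty:
model = NinMarkDetector()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
images = torch.rand(8, 3, 32, 32)    # a dummy batch of mark images
targets = torch.rand(8, 20)          # dummy position/detection targets
optimizer.zero_grad()
loss = nn.functional.mse_loss(model(images), targets) + l1_penalty(model)
loss.backward()
optimizer.step()
```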
Step S2, setting the initial position of the automatic tracking camera 5 to be aligned with the infrared identification mark; when the macro projector 1 moves, the automatic tracking camera 5 acquires the infrared identification mark image in real time and inputs it to the first convolutional neural network module 31 to identify the real-time position of the infrared identification mark.
Step S3, the automatic tracking camera 5 rotates an angle according to the real-time position, and the first Wifi module 4 selectively connects the first sub-antenna 411 or the second sub-antenna 412 according to the rotation angle.
Specifically, the automatic tracking camera 5 rotating by an angle according to the real-time position comprises the following steps: a coordinate system is established in the image acquired in real time; the first convolutional neural network module obtains the initial position coordinates of the infrared identification mark in the coordinate system, and the initial position coordinates are stored in the ops computer; when the macro projector moves, the offset of the infrared identification mark in the coordinate system is obtained in real time; the automatic tracking camera rotates by an angle according to the offset; and the first Wifi module selects the first sub-antenna or the second sub-antenna according to the rotation angle. The offset can be implemented as follows: a database relating coordinates to offsets is established, and the offset is obtained by looking up the relationship between the coordinates and the offset.
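The tracking and antenna-selection logic can be sketched as follows. The offset-to-angle lookup table and the 90-degree switching threshold are illustrative assumptions (the patent only states that a coordinate/offset database is consulted and that the sub-antenna is chosen from the rotation angle), and rotate_camera is a hypothetical motor command.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

# Assumed lookup table: vertical offset of the mark (pixels) -> camera rotation angle (degrees).
OFFSET_TO_ANGLE = {0: 0.0, 50: 15.0, 100: 45.0, 200: 120.0, 300: 160.0}

def rotation_angle_for_offset(dy: float) -> float:
    """Return the tabulated rotation angle whose offset entry is closest to
    the measured vertical offset of the identification mark."""
    key = min(OFFSET_TO_ANGLE, key=lambda k: abs(k - abs(dy)))
    return OFFSET_TO_ANGLE[key]

def rotate_camera(angle: float) -> None:
    # Hypothetical motor command for the auto-tracking camera.
    print(f"rotating auto-tracking camera by {angle:.1f} degrees")

def select_sub_antenna(initial: Point, current: Point, switch_angle: float = 90.0) -> str:
    """Rotate the camera from the mark's offset and pick a sub-antenna.

    A small rotation means the projector is still near its stored initial
    position (below the screen, first sub-antenna); a large rotation means it
    has been moved above the screen (second sub-antenna)."""
    angle = rotation_angle_for_offset(current.y - initial.y)
    rotate_camera(angle)
    return "second" if angle >= switch_angle else "first"

initial = Point(320, 420)   # stored initial mark position (projector below the screen)
current = Point(320, 80)    # mark position after the projector is moved above the screen
print(select_sub_antenna(initial, current))   # -> "second"
```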
Because the projection distance of the macro projector is short, it can be placed either above or below the front of the projection display screen. The first Wifi module is paired with the second Wifi module, and the initial position of the automatic tracking camera is aligned with the infrared identification mark. Once the infrared identification mark has been recognized by the first convolutional neural network module, the automatic tracking camera can track its position in real time. When the user adjusts the position of the macro projector according to the actual situation, for example moving it from an initial position below the front of the projection display screen to a position above the front of the projection display screen, the automatic tracking camera rotates in real time during the move, the ops computer sends the rotation angle signal to the first Wifi module, and the first Wifi module selects the second sub-antenna for communication according to the rotation angle. The second sub-antenna can then connect to the second antenna of the second Wifi module without the signal passing the control mainboard 20 of the macro projector, the projection display screen, the infrared touch screen frame or other electronic devices, so Wifi transmission suffers little interference, the Wifi transmission speed is improved, and display delays caused by interference from electronic equipment are avoided, as shown in fig. 4 and fig. 5.
In addition, in this embodiment, the automatic tracking camera 5 may also be a camera capable of measuring distance, and the rotation of the automatic tracking camera 5 according to the real-time position may instead be implemented as follows: a coordinate system is established in the image acquired in real time; the first convolutional neural network module obtains the initial position coordinates of the infrared identification mark in the coordinate system; the automatic tracking camera 5 obtains the distance from itself to the infrared identification mark; the initial position coordinates and the distance are stored in the ops computer; when the macro projector moves, the automatic tracking camera 5 obtains the real-time distance to the infrared identification mark and the real-time coordinates, from which its offset is obtained; the automatic tracking camera rotates by an angle according to the offset; and the first Wifi module selects the first sub-antenna or the second sub-antenna according to the rotation angle.
Further referring to fig. 1, the system further includes a connection port 6; the connection port 6 is connected to the ops computer 3 and is disposed on the housing of the macro projector 1, and it can be connected to an external computer to set the Wifi function or to transfer data.
Furthermore, the ops computer 3 is further provided with a second convolutional neural network module 32 and an alarm 7, and the alarm 7 is connected with the ops computer 3. In this embodiment, the Wifi indicator lamp lights red in normal operation and blue in abnormal operation.
Specifically, the images of the Wifi indicator lamps with different angles and different colors are collected and input into a second convolutional neural network of a second convolutional neural network module for learning training, so that the second convolutional neural network module capable of identifying the Wifi indicator lamps is formed.
The automatic tracking camera acquires images of the Wifi indicator lamp in real time and inputs them into the second convolutional neural network module to identify the color of the Wifi indicator lamp, and the ops computer controls the alarm to sound when the identified color is blue, reminding the user to maintain the device.
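A minimal sketch of this fault-check loop follows, assuming the second convolutional neural network is an NIN-style classifier (as sketched earlier) retrained to distinguish the red (normal) and blue (abnormal) indicator states; the OpenCV capture loop, the 32x32 input size, the class order and the alarm hook are illustrative assumptions.

```python
import time

import cv2      # assumed OpenCV capture path for the auto-tracking camera
import torch

CLASSES = ["red_normal", "blue_abnormal"]   # assumed class order of the second network

def check_wifi_indicator(model: torch.nn.Module, camera_index: int = 0,
                         period_s: float = 1.0) -> None:
    """Periodically classify the Wifi indicator colour and raise an alarm
    when the abnormal (blue) state is detected."""
    cap = cv2.VideoCapture(camera_index)
    model.eval()
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        patch = cv2.resize(frame, (32, 32))                        # model input size
        x = torch.from_numpy(patch).float().permute(2, 0, 1) / 255.0
        with torch.no_grad():
            pred = model(x.unsqueeze(0)).argmax(dim=1).item()
        if CLASSES[pred] == "blue_abnormal":
            trigger_alarm()
        time.sleep(period_s)

def trigger_alarm() -> None:
    # Hypothetical hook to the alarm 7 driven by the ops computer.
    print("ALARM: second Wifi module appears to be faulty")
```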
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. It is not intended to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable one skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims and their equivalents.

Claims (6)

1. A projection system based on an infrared touch screen frame, comprising: a macro projector, a projection display screen, an ops computer, a first Wifi module, an infrared touch screen frame, a controller and a second Wifi module, wherein the first Wifi module is provided with a first antenna, the second Wifi module is provided with a second antenna, the projection display screen is used for receiving a projection screen of the macro projector, and the infrared touch screen frame is arranged on the projection display screen;
the first Wifi module is connected with the ops computer, the ops computer is connected with a projection control system of the macro projector, the second Wifi module is connected with the controller, and the controller is connected with an infrared control system of the infrared touch screen frame to acquire data of the infrared touch screen and send the data to the second Wifi module;
the first Wifi module is wirelessly connected with the second Wifi module to receive data of the infrared touch screen, the second Wifi module sends the data of the infrared touch screen to the ops computer, and the data of the infrared touch screen is converted into a control instruction of the macro projector after being processed by the ops computer and is transmitted to the projection control system;
the first antenna comprises a first sub-antenna and a second sub-antenna, the ops computer and the first Wifi module are arranged in the macro projector, the first sub-antenna is arranged above the front side of the macro projector, and the second sub-antenna is arranged at the bottom of the macro projector;
the automatic tracking camera is arranged on the front side of the microspur projector and connected with the ops computer;
the automatic tracking camera collects infrared identification mark images at different angles and inputs the infrared identification mark images into a first convolution neural network of the first convolution neural network module for learning training so as to form the first convolution neural network module capable of identifying the infrared identification marks;
setting an initial position of the automatic tracking camera to be aligned with the infrared identification mark, and when the macro projector moves, the automatic tracking camera collects the image of the infrared identification mark in real time and inputs the image to the first convolutional neural network module to identify the real-time position of the infrared identification mark;
and the automatic tracking camera is rotated according to the real-time position, and the first Wifi module is selectively communicated with the first sub-antenna or the second sub-antenna according to the rotation angle.
2. The infrared touch screen frame-based projection system of claim 1, wherein acquiring infrared identification mark images at different angles and inputting the infrared identification mark images into the first convolutional neural network of the first convolutional neural network module for learning training specifically comprises:
dividing a plurality of infrared identification mark images into a training set and a testing set;
inputting a training set into the first convolution neural network for training to extract the characteristics of the infrared identification mark, acquiring the position of the infrared identification mark according to the characteristics of the infrared identification mark, and generating network parameters of the first convolution neural network after the set iteration times are met;
and testing the network parameters of the first convolutional neural network by using a test set to obtain the optimal first convolutional neural network with optimal network parameters.
3. The infrared touch screen frame-based projection system as claimed in claim 1, wherein the infrared identification mark is a Wifi indicator light with a certain shape, and the Wifi indicator light is connected to the controller.
4. The infrared touch screen frame-based projection system of claim 1, wherein the automatic tracking camera rotating by an angle according to the real-time position specifically comprises: establishing coordinates in the image acquired in real time, the first convolutional neural network module acquiring the initial position of the infrared identification mark in the coordinates and storing it, acquiring the offset of the infrared identification mark in the coordinates in real time when the macro projector moves, and the automatic tracking camera rotating by an angle according to the offset.
5. The infrared touch screen frame-based projection system of claim 3, further comprising a connection port, said connection port connected to said ops computer and located on a housing of said macro projector.
6. The infrared touch screen frame-based projection system of claim 3, further comprising a second convolutional neural network module and an alarm, wherein the alarm is connected to the ops computer, and the Wifi indicator lamp is red when working normally and blue when working abnormally;
collecting images of the Wifi indicator lamp with different angles and different colors, inputting the images into a second convolutional neural network of the second convolutional neural network module for learning training to form the second convolutional neural network module capable of identifying the Wifi indicator lamp;
the automatic tracking camera acquires images of the Wifi indicator lamp in real time, inputs the images into the second convolutional neural network module to identify the color of the Wifi indicator lamp, and controls the alarm to give an alarm when the images are blue.
CN202110482250.3A 2021-04-30 2021-04-30 Projection system based on infrared touch screen frame Active CN113301315B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110482250.3A CN113301315B (en) 2021-04-30 2021-04-30 Projection system based on infrared touch screen frame

Publications (2)

Publication Number Publication Date
CN113301315A CN113301315A (en) 2021-08-24
CN113301315B true CN113301315B (en) 2023-04-07

Family

ID=77321711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110482250.3A Active CN113301315B (en) 2021-04-30 2021-04-30 Projection system based on infrared touch screen frame

Country Status (1)

Country Link
CN (1) CN113301315B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101776971A (en) * 2009-01-10 2010-07-14 张海云 Multi-point touch screen device and positioning method
CN103455207A (en) * 2012-06-04 2013-12-18 爱国者数码科技有限公司 Projection system with interaction function
CN103914152A (en) * 2014-04-11 2014-07-09 周光磊 Recognition method and system for multi-point touch and gesture movement capturing in three-dimensional space

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130169887A1 (en) * 2012-01-03 2013-07-04 Aiptek International Inc. Projection Application Device Working Through a Wireless Video Output Device and Its Video Output Control Unit
US9640144B2 (en) * 2012-02-13 2017-05-02 Hitachi Maxell, Ltd. Projector, figure input/display apparatus, portable terminal, and program
TW201349029A (en) * 2012-05-21 2013-12-01 Everest Display Inc Interactive projection system and control method with light spot identification
CN202735987U (en) * 2012-07-26 2013-02-13 郑州信大捷安信息技术股份有限公司 Infrared positioning and wireless control type interactive projection system
KR20140028221A (en) * 2012-08-28 2014-03-10 삼성전자주식회사 Method and apparatus for setting electronic blackboard system
CN103024324B (en) * 2012-12-10 2016-06-22 Tcl通力电子(惠州)有限公司 A kind of short out-of-focus projection system
CN103230163A (en) * 2013-04-18 2013-08-07 无锡奇纬智能视膜科技有限公司 Multimedia infrared touch interactive table
CN203673430U (en) * 2013-08-20 2014-06-25 华东师范大学 Mirror plane interactive device
CN203405798U (en) * 2013-09-10 2014-01-22 深圳市摩拓触摸科技有限公司 Infrared interactive electronic whiteboard system
CN204145638U (en) * 2014-06-17 2015-02-04 王玉 A kind of intelligent wireless interaction display unit
CN104486604A (en) * 2014-12-30 2015-04-01 湖南巨手科技发展有限公司 Touch screen projection display device
CN105260021A (en) * 2015-10-15 2016-01-20 深圳市祈锦通信技术有限公司 Intelligent interactive projection system
CN106713880B (en) * 2016-12-02 2019-03-15 北京一数科技有限公司 A kind of portable intelligent optical projection system
CN209070293U (en) * 2018-11-09 2019-07-05 猫猫太史互联网科技(苏州)有限公司 Intelligent large-size screen monitors
CN110416701A (en) * 2019-08-05 2019-11-05 西安多小波信息技术有限责任公司 A kind of air communications antenna system and communication means based on flight attitude perception

Also Published As

Publication number Publication date
CN113301315A (en) 2021-08-24

Similar Documents

Publication Publication Date Title
US11425802B2 (en) Lighting system and method
US10371504B2 (en) Light fixture commissioning using depth sensing device
CN104427252B (en) Method and its electronic equipment for composograph
CN103019014B (en) Projection apparatus, and projection control method
US11579904B2 (en) Learning data collection device, learning data collection system, and learning data collection method
CN109344715A (en) Intelligent composition control method, device, electronic equipment and storage medium
CN106506749B (en) Mobile terminal with flash lamp component
JP6104143B2 (en) Device control system and device control method
CN106599929B (en) Virtual reality feature point screening space positioning method
CN110471403B (en) Method for guiding an autonomously movable machine by means of an optical communication device
CN103376921A (en) Laser labeling system and method
CN106598288A (en) Positioning system and method for laser pen mouse
CN105262538B (en) A kind of optical information positioning system
CN110471402A (en) The system and method that the machine for capableing of autonomous is guided
JP7297877B2 (en) Optical communication device and information transmission and reception method
CN109040729B (en) Image white balance correction method and device, storage medium and terminal
US20120002044A1 (en) Method and System for Implementing a Three-Dimension Positioning
CN113301315B (en) Projection system based on infrared touch screen frame
CN104363399A (en) Projection angle adjustment method of adaptive projection system
CN106023858B (en) Move the infrared projection advertisement interaction systems over the ground of anti-tampering formula
CN112153300A (en) Multi-view camera exposure method, device, equipment and medium
CN110213407A (en) A kind of operating method of electronic device, electronic device and computer storage medium
CN110532860A (en) The modulation of visible light bar code and recognition methods based on RGB LED lamp
CN106959747B (en) Three-dimensional human body measuring method and apparatus thereof
CN104076990A (en) Screen positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant