CN115016716B - Projection interaction method and system - Google Patents

Projection interaction method and system

Info

Publication number
CN115016716B
CN115016716B (application CN202210606437.4A)
Authority
CN
China
Prior art keywords: projection, sub, preset, sequence, region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210606437.4A
Other languages
Chinese (zh)
Other versions
CN115016716A (en)
Inventor
刘翔宇
危学涛
郭磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southern University of Science and Technology
Original Assignee
Southern University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southern University of Science and Technology
Priority to CN202210606437.4A
Publication of CN115016716A
Application granted
Publication of CN115016716B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application relates to a projection interaction method and system. The method comprises: acquiring a projection picture sequence of a target projection image; dividing the target projection image into sub-regions; encoding the position coordinates of each sub-region to obtain a coding sequence for each sub-region, and mapping each sub-region's coding sequence onto its projection picture sequence; before each projection picture is projected, adjusting the color intensity of a sub-region if the corresponding coding bit in that sub-region's coding sequence is the first preset coding bit; and projecting each projection picture. The method can improve the accuracy of target positioning.

Description

Projection interaction method and system
Technical Field
The present application relates to the field of projection technologies, and in particular, to a projection interaction method and system.
Background
With the development of projection technology, human-computer interaction (HCI) has become practical, and visual interaction (VI) is one of its main applications: a projected picture can typically be interacted with using hand motions.
In an existing human-computer interaction method, a projector embeds beacon information in the pixels of a projection picture, transmits a beacon signal through visible light modulation based on the embedded beacon information, and a photodiode receives the beacon signal to achieve target positioning.
However, in this method, the accuracy of target positioning is low due to abnormal brightness and flickering of the projected picture.
Disclosure of Invention
Based on the foregoing, it is necessary to provide a projection interaction method and system capable of improving positioning accuracy.
In a first aspect, the present application provides a projection interaction method. The method comprises: acquiring a projection picture sequence of a target projection image, the sequence comprising a plurality of consecutive projection pictures; dividing the target projection image into sub-regions; encoding the position coordinates of each sub-region to obtain a coding sequence for each sub-region, and mapping each sub-region's coding sequence onto its projection picture sequence; before each projection picture is projected, adjusting the color intensity of a sub-region if the coding bit corresponding to that picture in the sub-region's coding sequence is the first preset coding bit; and projecting each projection picture.
In one embodiment, encoding the position coordinates of each sub-region to obtain the coding sequence of each sub-region comprises: performing Gray coding on the position coordinates of each sub-region to obtain a Gray code sequence; and performing Manchester encoding on the Gray code sequence to obtain a Manchester code sequence for the position coordinates of each sub-region.
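As an illustrative sketch only (not code from the patent), the Gray-plus-Manchester coding of a sub-region coordinate might look as follows in Python. The bit width per axis and the Manchester convention (0 → chips 01, 1 → chips 10) are assumptions for the example:

```python
def gray_encode(n: int) -> int:
    # Reflected binary Gray code: adjacent coordinates differ in one bit,
    # which limits the damage of a single-bit reception error.
    return n ^ (n >> 1)

def to_bits(n: int, width: int) -> list:
    # Most-significant bit first.
    return [(n >> i) & 1 for i in reversed(range(width))]

def manchester_encode(bits: list) -> list:
    # Assumed convention: 0 -> [0, 1], 1 -> [1, 0]. Every bit produces a
    # transition, avoiding long constant runs that would read as flicker.
    chips = []
    for b in bits:
        chips.extend([1, 0] if b else [0, 1])
    return chips

def encode_position(x: int, y: int, bits_per_axis: int = 3) -> list:
    # For an 8x8 grid of sub-regions (as in FIG. 3), 3 bits per axis suffice.
    payload = (to_bits(gray_encode(x), bits_per_axis)
               + to_bits(gray_encode(y), bits_per_axis))
    return manchester_encode(payload)
```

For example, `encode_position(0, 0)` yields the 12-chip sequence `[0, 1]` repeated six times, since both Gray-coded coordinates are all zeros.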
In one embodiment, adjusting the color intensity of the sub-region when the coding bit corresponding to the projection picture in the sub-region's coding sequence is the first preset coding bit comprises: increasing the color intensity of the R channel of the sub-region by a first preset value, and decreasing the color intensity of the B channel of the sub-region by a second preset value, wherein the second preset value is a preset multiple of the first preset value.
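A minimal sketch of this per-sub-region modulation, assuming an RGB `uint8` frame; the concrete values `delta` (the first preset value) and `multiple` are illustrative choices, not values taken from the patent:

```python
import numpy as np

def embed_bit(frame: np.ndarray, region, bit: int,
              delta: int = 6, multiple: int = 2) -> np.ndarray:
    # frame: H x W x 3 uint8 RGB image; region: (y0, y1, x0, x1) sub-region bounds.
    # Work in int16 so the adjustment cannot wrap around at 0/255.
    out = frame.astype(np.int16)
    if bit == 1:  # the first preset coding bit, assumed here to be 1
        y0, y1, x0, x1 = region
        out[y0:y1, x0:x1, 0] += delta             # raise R-channel intensity
        out[y0:y1, x0:x1, 2] -= multiple * delta  # lower B channel by a preset multiple
    return np.clip(out, 0, 255).astype(np.uint8)
```

Raising R while lowering B in a fixed ratio keeps the perceived luminance change small, which is consistent with the document's goal of avoiding visible flicker.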
In a second aspect, the application further provides a projection interaction apparatus. The apparatus comprises:
an acquisition module for acquiring a projection picture sequence of a target projection image, the sequence comprising a plurality of consecutive projection pictures; a dividing module for dividing the target projection image into sub-regions; an encoding module for encoding the position coordinates of each sub-region, obtaining a coding sequence for each sub-region, and mapping each sub-region's coding sequence onto its projection picture sequence; a color intensity adjustment module for adjusting, before each projection picture is projected, the color intensity of a sub-region if the corresponding coding bit in that sub-region's coding sequence is the first preset coding bit; and a projection module for projecting each projection picture.
In a third aspect, the present application also provides a computer device comprising a memory storing a computer program and a processor that, when executing the computer program, performs the steps of the method of the first aspect: acquiring a projection picture sequence of a target projection image, the sequence comprising a plurality of consecutive projection pictures; dividing the target projection image into sub-regions; encoding the position coordinates of each sub-region to obtain a coding sequence for each sub-region, and mapping each sub-region's coding sequence onto its projection picture sequence; before each projection picture is projected, adjusting the color intensity of a sub-region if the corresponding coding bit in its coding sequence is the first preset coding bit; and projecting each projection picture.
In a fourth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the same steps.
In a fifth aspect, the present application also provides a computer program product comprising a computer program which, when executed by a processor, implements the same steps.
In a sixth aspect, the present application provides a projection interaction method. The method comprises: acquiring the color sensing signal output by a color sensor within a preset signal period; processing the color sensing signal output within the preset signal period to obtain a corresponding digital voltage signal; digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than a voltage threshold, the corresponding coding bit is determined to be the first preset coding bit, and if the voltage value is less than or equal to the voltage threshold, the corresponding coding bit is determined to be the second preset coding bit; and decoding the digitized sequence to obtain the corresponding position coordinates.
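The thresholding step can be sketched in a few lines of Python; treating the first preset coding bit as 1 and the second as 0 is an assumption for the example:

```python
def digitize(voltages, threshold):
    # Each sampled voltage maps to one coding bit: strictly above the
    # threshold -> first preset coding bit (taken as 1 here); less than
    # or equal to the threshold -> second preset coding bit (taken as 0).
    return [1 if v > threshold else 0 for v in voltages]
```

Note that a voltage exactly equal to the threshold is digitized as the second preset coding bit, matching the "less than or equal to" condition in the text.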
In one embodiment, acquiring the color sensing signal output by the color sensor within the preset signal period comprises: acquiring the analog current signals output by the color sensor; and determining, as the color sensing signal output within the preset signal period, the analog current signal with the longest receiving duration together with the analog current signals received between the current signal and that longest-duration signal.
In one embodiment, processing the color sensing signal output within the preset signal period to obtain the corresponding digital voltage signal comprises: converting the analog current signals corresponding to the receiving durations into voltage signals; and sampling, quantizing, and filtering the voltage signals to obtain the corresponding digital voltage signal.
In one embodiment, decoding the digitized sequence to obtain the corresponding position coordinates comprises: performing Manchester decoding on the digitized sequence to obtain a Manchester-decoded sequence; and performing Gray decoding on the Manchester-decoded sequence to obtain the position coordinates.
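A sketch of the receive-side decoding, as the inverse of the assumed transmit convention (0 → chips 01, 1 → chips 10); the chip-pair mapping is an assumption, not taken from the patent:

```python
def manchester_decode(chips: list) -> list:
    # Each pair of chips decodes to one bit; any other pair (00 or 11)
    # indicates a reception error.
    bits = []
    for i in range(0, len(chips), 2):
        pair = (chips[i], chips[i + 1])
        if pair == (1, 0):
            bits.append(1)
        elif pair == (0, 1):
            bits.append(0)
        else:
            raise ValueError("invalid Manchester chip pair: %r" % (pair,))
    return bits

def bits_to_int(bits: list) -> int:
    # Most-significant bit first.
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

def gray_decode(g: int) -> int:
    # Undo the reflected Gray code by cascading XORs of shifted copies.
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```

For instance, chips `[0, 1, 1, 0]` decode to bits `[0, 1]`, and Gray value 3 decodes to the binary coordinate 2.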
In one embodiment, the position coordinates comprise an R-channel position coordinate and a B-channel position coordinate, and the method further comprises: determining that the obtained position coordinate is valid if the R-channel position coordinate is the same as the B-channel position coordinate and matches the position coordinate of any one of the pre-stored sub-regions; otherwise, determining that the obtained position coordinate is invalid.
In one embodiment, the method further comprises: determining a sliding track by combining the position coordinates obtained over a plurality of consecutive preset signal periods.
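Combining consecutive decodes into a track can be sketched as follows; the choice to drop repeated coordinates and to mark invalid decodes with `None` is an illustrative assumption:

```python
def sliding_track(coords):
    # coords: position coordinates decoded over consecutive preset signal
    # periods, with None marking an invalid or missed decode.
    # Collapse consecutive duplicates: the sensor staying inside one
    # sub-region should contribute a single track point.
    track = []
    for c in coords:
        if c is not None and (not track or track[-1] != c):
            track.append(c)
    return track
```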
In a seventh aspect, the present application further provides a projection interaction apparatus. The apparatus comprises: an acquisition module for acquiring the color sensing signal output by a color sensor within a preset signal period; a processing module for processing the color sensing signal output within the preset signal period to obtain a corresponding digital voltage signal; a digitizing module for digitizing the digital voltage signal into a corresponding digitized sequence, wherein a voltage value greater than the voltage threshold is determined to be the first preset coding bit and a voltage value less than or equal to the voltage threshold is determined to be the second preset coding bit; and a decoding module for decoding the digitized sequence to obtain the corresponding position coordinates.
In an eighth aspect, the present application also provides a computer device comprising a memory storing a computer program and a processor that, when executing the computer program, performs the steps of the method of the sixth aspect: acquiring the color sensing signal output by a color sensor within a preset signal period; processing the signal to obtain a corresponding digital voltage signal; digitizing the digital voltage signal into a corresponding digitized sequence, wherein a voltage value greater than the voltage threshold is digitized as the first preset coding bit and a voltage value less than or equal to the threshold as the second preset coding bit; and decoding the digitized sequence to obtain the corresponding position coordinates.
In a ninth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the same steps.
In a tenth aspect, the present application also provides a computer program product comprising a computer program which, when executed by a processor, implements the same steps.
In an eleventh aspect, the present application provides a projection interaction system. The system comprises: a projection device, a color sensor, and a receiving processing device in communication with the color sensor.
The projection device is configured to: acquire a projection picture sequence of a target projection image, the sequence comprising a plurality of consecutive projection pictures; divide the target projection image into sub-regions; encode the position coordinates of each sub-region to obtain a coding sequence for each sub-region, and map each sub-region's coding sequence onto its projection picture sequence; before each projection picture is projected, adjust the color intensity of a sub-region if the corresponding coding bit in its coding sequence is the first preset coding bit; and project each projection picture;
the color sensor is configured to output a color sensing signal;
The receiving processing device is configured to: acquire the color sensing signal output by the color sensor within a preset signal period; process the color sensing signal to obtain a corresponding digital voltage signal; digitize the digital voltage signal into a corresponding digitized sequence, wherein a voltage value greater than the voltage threshold is digitized as the first preset coding bit and a voltage value less than or equal to the threshold as the second preset coding bit; and decode the digitized sequence to obtain the corresponding position coordinates.
According to the projection interaction method and system, after the projection device acquires the projection picture sequence of a target projection image, it divides the target projection image into sub-regions, encodes the position coordinates of each sub-region to obtain each sub-region's coding sequence, and maps each coding sequence onto the corresponding projection picture sequence. Before the projection device projects each projection picture, if the corresponding coding bit in a sub-region's coding sequence is the first preset coding bit, the color intensity of that sub-region is adjusted; the projection device then projects each projection picture. The receiving processing device in communication with the color sensor acquires the color sensing signal output by the color sensor within a preset signal period, processes it to obtain a corresponding digital voltage signal, and digitizes the digital voltage signal into a digitized sequence: a voltage value greater than the voltage threshold is digitized as the first preset coding bit, and a voltage value less than or equal to the threshold as the second preset coding bit. Decoding the digitized sequence then yields the corresponding position coordinates.
Encoding the position coordinates of each sub-region separately avoids the abnormal brightness and flickering of the projected picture, and adjusting the color intensity of the sub-regions in each projection picture allows the position coordinates recovered from the resulting color sensing signal to be highly accurate.
Drawings
FIG. 1 is a block diagram of a projection interaction system in one embodiment;
FIG. 2 is a flow diagram of a projection interaction method in one embodiment;
FIG. 3 is a schematic diagram of Gray coding for 8×8 sub-regions;
FIG. 4 is a schematic diagram of sub-region division and color intensity adjustment of sub-regions;
FIG. 5 is a schematic diagram of color space conversion modulation based on the YCbCr color space;
FIG. 6 is a diagram of beacon information for a sub-region;
FIG. 7 is a flow diagram of a projection interaction method in one embodiment;
FIG. 8 is a flow chart of a receiving processing device acquiring a color sensing signal in one embodiment;
FIGS. 9a-9c are schematic diagrams of test images;
FIGS. 10a-10c are schematic diagrams of the perceptual impact of different images and different frame rates on the human eye;
FIGS. 11a-11b are schematic diagrams of the effect of communication distance and viewing angle on the human-eye perception rate;
FIG. 12 is a graph of the effect of sub-region size on the beacon identification rate;
FIGS. 13a-13c are diagrams of the effect of different images and different frame rates on the beacon identification rate;
FIG. 14 is a graph of the effect of communication distance on the beacon identification rate;
FIG. 15 is a diagram of a positioning result;
FIG. 16 is a diagram of the effect of different background colors on positioning accuracy;
FIG. 17 is a block diagram of a projection interaction apparatus in one embodiment;
FIG. 18 is a block diagram of a projection interaction apparatus in another embodiment;
FIG. 19 is an internal structure diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Visual interaction is one of the main applications of human-computer interaction: a projected picture can be interacted with using the motions of a human hand. Existing smart-projector-based human-computer interaction methods, which aim to improve user experience and device performance, fall into three types of projection system.
The first type is the depth-camera-based projection system, which typically uses infrared structured light and time-of-flight (ToF) technology to acquire three-dimensional information about the interactive scene at the projector. However, this requires replacing existing projectors with ones that carry their own depth cameras, so the deployment requirements of the projection system are high. The second type is the ordinary-camera-based projection system, which adds a camera to the projector and uses the captured two-dimensional image information to interact with human motion. While ordinary cameras are easy to deploy on projectors, the heavy image processing load can hurt the projector's real-time performance and increases the complexity of the projection system. The third type is the non-visual-device-based projection system, which uses sound, pressure, acceleration, light, and other non-visual media to exchange data with the projector; the user's finger can wear a lightweight sensor to interact with the projector. Moreover, such a system only needs a low-cost additional hardware circuit, and its signal processing algorithms have low complexity.
Among these, the smart Digital Micromirror Device (DMD) projector based on visible light communication (VLC) is widely used because it requires no hardware additions or modifications; for example, a DMD projector can be used for classroom teaching and meeting projection.
During interaction between the projector and a user, the user can wear a lightweight receiving sensor, e.g., a photodiode (PD). The DMD projector embeds different beacon information (i.e., human-computer interaction data) for each pixel and modulates the transmitted beacon signal over a visible light channel. The PD-based receiving sensor receives the beacon signal sent by the projector and converts the received optical signal into an electrical signal; after the electrical signal is processed, the human-computer interaction data are recovered and the interaction is completed. Since the visible light channel is considered free from electromagnetic interference, it provides higher positioning accuracy. The PD-based receiver therefore has good real-time performance and can support the positioning and tracking of dynamic targets (e.g., mouse cursors and gestures), greatly extending the applications of the projector.
However, when a PD-based receiver is currently used for human-computer interaction, the accuracy of target positioning is low due to abnormal brightness and flickering of the projected picture.
Based on this, the present application provides a projection interaction system capable of improving positioning accuracy. As shown in fig. 1, a block diagram of the projection interaction system is provided; the system includes a projection device 102, a color sensor 104, and a receiving processing device 106.
After the projection device 102 obtains the projection picture sequence of a target projection image, it divides the target projection image into sub-regions and encodes the position coordinates of each sub-region to obtain each sub-region's coding sequence. Before each projection picture is projected, if the coding bit corresponding to a sub-region of the picture in its coding sequence is the first preset coding bit, the color intensity of that sub-region is adjusted. After the color sensor 104 captures the color sensing signal over the visible light channel, the receiving processing device 106 in communication with the color sensor 104 determines the corresponding position coordinates from the color sensing signal output within a preset signal period, thereby achieving positioning.
The projection device 102 may be composed of two subsystems: a light engine and a driver board. The light engine includes the optics, RGB LEDs, and a digital micromirror device (DMD) projector, which may be, for example, model DLPDLCR2000EVM; the driver board includes a DLPC2607 display controller and a DLPA1000 processor.
The projection device 102 may obtain the target projection image from a terminal, which may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, smart televisions, etc.; projection device 102 may also obtain a target projection image from a server; thereafter, the projection device 102 may generate a sequence of projection pictures of the target projection image.
The terminal or the server may also generate a projection image sequence of the target projection image and divide the target projection image into sub-areas, and further, the projection device 102 may obtain the projection image sequence of the target projection image and the target projection image after dividing the sub-areas from the terminal or the server.
The color sensor may comprise a photodiode array, e.g., an N×N photodiode array, where N may be 8 or another value. When N=8, each group of 16 photodiodes carries a blue, green, red, or no filter, and all photodiodes with the same color filter are connected in parallel to increase the induced current for the detected light intensity. A convex lens may be embedded in the color sensor to increase the intensity of the received optical signal and ease identification of the three primary colors. When the color sensor is used for human-computer interaction, it can be worn on a user's finger as a lightweight sensor or embedded in a projection command tool, e.g., a projector command pen; other packaging is also possible, and the embodiments of the present application are not limited in this respect.
In one embodiment, as shown in fig. 2, a projection interaction method is provided. The method is described as applied to the projection device 102 in fig. 1, and comprises the following steps:
Step 202: acquire a projection picture sequence of a target projection image.
Before projecting the target projection image, the projection device obtains a projection picture sequence of the target projection image. The sequence comprises a plurality of consecutive projection pictures, which can be understood as repeated projections of the target projection image.
Step 204, dividing the target projection image into sub-areas.
There are a plurality of sub-areas, and their number is related to the positioning accuracy: the smaller the sub-area size, the higher the positioning accuracy. For example, the sub-area size may be 5cm×5cm, 10cm×10cm or another value; the specific size can be set according to the practical application scenario, which is not limited in this embodiment.
Step 206, the position coordinates of each sub-region are encoded respectively to obtain a coding sequence of the position coordinates of each sub-region, and the coding sequence of each sub-region is made to correspond to the projection picture sequence of that sub-region.
Taking the lower left corner of the target projection image as the origin of coordinates, the world position coordinates of each sub-region can be obtained; the world position coordinates of a sub-region comprise a plurality of values, and the area these values delimit is the sub-region. After the world position coordinates of each sub-region are processed, the position coordinates of each sub-region are obtained as a sequence consisting of 0s and 1s. After the position coordinates of each sub-region are encoded respectively, the resulting coding sequence of the position coordinates of each sub-region is likewise a sequence of 0s and 1s, and encoding the position coordinates of each sub-region separately can improve the quality of the projection picture of each sub-region.
Wherein the projection picture sequence of the subarea comprises a plurality of continuous projection pictures of the subarea; the coding sequence of the sub-region corresponds to the projection picture sequence of the sub-region, and it is understood that when the projection pictures of the sub-region are projected in one period, the number of bits of the coding bits in the coding sequence of the sub-region is the same as the number of times the projection pictures of the sub-region are projected.
For example, the projection picture sequence of the target projection image may be [A1, A2, A3, A4], wherein each of A1 to A4 indicates a repeated projection of the target projection image. When the target projection image is repeatedly projected, each repeated projection is also divided into sub-regions, and the sub-region division is the same for every repeated projection. For example, suppose the coding sequence of a sub-region q1 is "0110": the first coded bit "0" corresponds to q1 in the target projection image corresponding to A1, the second coded bit "1" corresponds to q1 in the image corresponding to A2, the third coded bit "1" corresponds to q1 in the image corresponding to A3, and the fourth coded bit "0" corresponds to q1 in the image corresponding to A4.
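The bit-to-frame correspondence above can be sketched in a few lines of Python (the function name and frame labels are illustrative, not from the patent):

```python
def frames_for_subregion(code, frames):
    """Pair coded bit i of a sub-region with repeated projection i of the
    target image: bit i decides whether that frame's sub-region is modulated."""
    if len(code) != len(frames):
        raise ValueError("one coded bit per repeated projection")
    return list(zip(code, frames))

pairs = frames_for_subregion("0110", ["A1", "A2", "A3", "A4"])
print(pairs)  # [('0', 'A1'), ('1', 'A2'), ('1', 'A3'), ('0', 'A4')]
```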
In particular, the number of coded bits in the coding sequence of the sub-regions is related to the projection parameters, including the focal length of the projection device, the projection screen size, the liquid crystal display (LCD) panel size, the distance of the projection device from the projection screen, and the preset sub-region size.
For example, the maximum distance D_max of the projection device from the projection screen and the minimum distance D_min of the projection device from the projection screen satisfy formula (1):

D_max = (f_max / L_size) · S_size,  D_min = (f_min / L_size) · S_size    (1)

Where f_max is the maximum focal length of the projection device, f_min is the minimum focal length of the projection device, S_size is the projection screen size, and L_size is the LCD panel size.
In the case where D_max, f_max and L_size are known, or D_min, f_min and L_size are known, the projection screen size S_size can be obtained based on formula (1). Thus, given the preset sub-region size, the number m of equal parts of the long side of the projection screen and the number n of equal parts of the wide side of the projection screen can be determined, and further, the number of coded bits in the coding sequence of the sub-region can be determined by the following formula:

N_code = ⌈log2(m)⌉ + ⌈log2(n)⌉

where N_code is the number of coded bits in the coding sequence of a sub-region, and ⌈·⌉ represents rounding up.
For example, when each sub-region is 5cm×5cm in size, the long side of the projection screen may be divided into m equal parts and the wide side of the projection screen into n equal parts; with m = 20 and n = 15, the coding sequence of the sub-region contains ⌈log2(20)⌉ + ⌈log2(15)⌉ = 5 + 4 = 9 coded bits.
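The bit-count formula can be evaluated directly; a minimal sketch (the function name is illustrative):

```python
import math

def coded_bits(m, n):
    """Coded bits needed to Gray-address m x n sub-regions:
    ceil(log2 m) + ceil(log2 n)."""
    return math.ceil(math.log2(m)) + math.ceil(math.log2(n))

print(coded_bits(20, 15))  # 9: five bits for 20 columns, four for 15 rows
```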
Specifically, the projection device encodes the position coordinates of each sub-region respectively, and obtains a coding sequence of the position coordinates of each sub-region, including: gray coding is respectively carried out on the position coordinates of each subarea, and a Gray coding sequence is obtained; performing Manchester coding on the Gray coding sequence to obtain a Manchester coding sequence of the position coordinates of each subarea; wherein the number of bits of the encoded bits in the manchester encoded sequence is twice the number of bits of the encoded bits in the gray encoded sequence.
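The two encoding stages above can be sketched as follows; note that the Manchester bit mapping ('1' → "10", '0' → "01") is one common convention and an assumption here, not specified by the patent:

```python
def gray_encode(value, width):
    """Binary-reflected Gray code of `value`, zero-padded to `width` bits."""
    return format(value ^ (value >> 1), "0{}b".format(width))

def manchester_encode(bits):
    """Assumed convention: '1' -> '10', '0' -> '01' (doubles the bit count)."""
    return "".join("10" if b == "1" else "01" for b in bits)

# Adjacent sub-region indices differ in exactly one Gray-coded bit:
print(gray_encode(6, 4), gray_encode(7, 4))  # 0101 0100
print(manchester_encode(gray_encode(6, 4)))  # 01100110
```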
The number of consecutive projection pictures in the projection picture sequence of the target projection image is the same as the number of bits of the coding bits in the coding sequence of the sub-region, so that the number of consecutive projection pictures in the projection picture sequence of the target projection image can also be determined by determining the number of bits of the coding bits in the coding sequence of the sub-region.
When an existing PD-based receiver realizes human-machine interaction, abnormal brightness changes can occur in the projection picture. In the projection interaction method provided by the application, Gray coding the position coordinates of each sub-region separately ensures that the Gray coding sequences of adjacent sub-regions differ in only one coded bit, which avoids obvious boundary differences and brightness changes between adjacent sub-regions; Manchester coding the Gray coding sequence avoids the human-eye flicker perception caused by long runs of 0s or 1s, thereby improving the user experience.
Since the number of projection images to be Gray-coded grows as O(log2 n) with the number of sub-regions, n being the number of equal parts of the projection screen broadside, the formula for the number of coded bits in the coding sequence of the sub-regions is satisfied. As shown in fig. 3, a schematic diagram of the principle of Gray coding 8×8 sub-regions is provided, wherein black indicates a coded bit of "1" and white indicates a coded bit of "0".
Step 208, before projecting each projection picture, if the corresponding coding bit in the coding sequence of the sub-region of the projection picture is the first preset coding bit, adjusting the color intensity of the sub-region.
Since the coding sequence of the sub-regions is a sequence consisting of 0s and 1s, the first preset coded bit may be set to "1". Before the projection picture of each sub-region is projected, if the corresponding coded bit in the coding sequence of the sub-region is the first preset coded bit, the color intensity of the sub-region can be adjusted, so that target positioning accuracy can be improved based on the change in color intensity.
Specifically, if a corresponding coding bit in a coding sequence of a sub-region of a projection picture is a first preset coding bit, adjusting the color intensity of the sub-region includes: if the corresponding coding bit in the coding sequence of the sub-region of the projection picture is a first preset coding bit, the color intensity of the R channel of the sub-region is increased by a first preset value, and the color intensity of the B channel of the sub-region is decreased by a second preset value, wherein the second preset value is a preset multiple of the first preset value.
The preset multiple may be determined by combining the relation between the YCbCr color space and the (R, G, B) values in the RGB color space, and the requirement of the overall light intensity after adjusting the color brightness.
The YCbCr color space is a commonly used color coding scheme in video products such as cameras and digital televisions; it is not an absolute color space, but a compressed and offset version of the YUV color space (Y: signal luminance, U/V: signal chromaticity). In the YCbCr color space, Y refers to the luminance component, Cb refers to the blue chrominance component, and Cr refers to the red chrominance component.
Since the human eye is more sensitive to the Y component in an image/video, the human eye cannot perceive a luminance change in the transmitted image when only the chrominance components Cb and Cr vary. In the projection apparatus, the light intensity and color of each pixel in the projected image are generated by additive mixing of red, green and blue (the three primary colors). The conversion between the YCbCr color space and the RGB color space is as follows:

Y709 = 0.2126R + 0.7152G + 0.0722B
Cb = (B − Y709) / 1.8556
Cr = (R − Y709) / 1.5748    (2)

Further deriving equation (2) yields equation (3), as follows:

Cb = −0.1146R − 0.3854G + 0.5B
Cr = 0.5R − 0.4542G − 0.0458B    (3)

In formulas (2) and (3), Y709 is the perceived brightness, and the (R, G, B) values in the RGB color space are integers between 0 and 255. By changing the (R, G, B) values so as to satisfy both robust beacon communication and a brightness variation that is hardly noticeable to the human eye, that is, to keep the overall light intensity after adjusting the color unchanged, the constraint condition is set as follows according to formulas (2) and (3):

ΔY709 = 0.2126δ1 + 0.7152δ2 + 0.0722δ3 ≈ 0    (4)

where δ1, δ2 and δ3 are the adjustments applied to the R, G and B color intensities, respectively.
Considering the integer nature of the computer system, it is desirable to select appropriate values for δ1, δ2 and δ3 so that the luminance change of Y709 tends to 0, i.e., the perceived brightness remains unchanged, while the chrominance components Cb and Cr carry the color change.
From equation (4), it can be found that the color intensity of the G channel has the highest weight coefficient; if a nonzero δ2 were added to the color intensity of the G channel, equation (4) would become a complex multi-solution problem. Multiple (δ1, δ2, δ3) solutions would require a large amount of computation time at the color sensor and could not guarantee that the computed solution is the correct one.
Further, it can also be found from formula (4) that the weight coefficient of the color intensity of the R channel is almost 3 times that of the B channel. Therefore, the preset multiple may be set to 3: when the first preset value is δ and the preset multiple is 3, if the corresponding coded bit in the coding sequence of the sub-region of the projection picture is the first preset coded bit, the color intensity of the R channel of the sub-region is increased by δ, the color intensity of the B channel is decreased by 3δ, and the color intensity of the G channel remains unchanged, i.e., (δ1, δ2, δ3) = (δ, 0, −3δ).
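The (δ, 0, −3δ) adjustment and its near-zero effect on the BT.709 luminance can be sketched as follows; the function names, the clamping to [0, 255] and the choice δ=5 are illustrative assumptions:

```python
def modulate_pixel(r, g, b, coded_bit, delta=5):
    """Apply (d1, d2, d3) = (delta, 0, -3*delta) to an (R, G, B) pixel when the
    coded bit is '1'; results are clamped to the valid range [0, 255]."""
    if coded_bit == "1":
        r, b = r + delta, b - 3 * delta
    clamp = lambda v: max(0, min(255, v))
    return clamp(r), clamp(g), clamp(b)

def luma_shift(delta):
    """Resulting BT.709 luminance change; near 0 because 0.2126 ~ 3 x 0.0722."""
    return 0.2126 * delta - 0.0722 * 3 * delta

print(modulate_pixel(120, 200, 90, "1"))  # (125, 200, 75)
print(abs(luma_shift(5)) < 0.1)           # True
```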
If the corresponding coding bit in the coding sequence of the sub-region of the projection picture is not the first preset coding bit, or the corresponding coding bit in the coding sequence of the sub-region of the projection picture is the second preset coding bit, the projection device can not adjust the color intensity of the sub-region; the second preset encoding bit may be "0".
As shown in fig. 4, a schematic diagram of sub-region division and color intensity adjustment of sub-regions is provided, a projection device divides a target projection image into sub-regions, and after gray coding and manchester coding are performed on position coordinates of each sub-region, a coding sequence of the position coordinates of the sub-region can be obtained.
For example, in one projection picture of the projection picture sequence, the coded bits of the sub-regions in the lowest row are "100111110" in order. Before this projection picture is projected, for each sub-region of the lowest row whose coded bit is the first preset coded bit, δ is added to the color intensity of its R channel and 3δ is subtracted from the color intensity of its B channel; for each sub-region whose coded bit is not the first preset coded bit, the R-channel and B-channel color intensities are left unadjusted.
Since the communication distance affects the intensity of the color sensing signal received by the color sensor, it also affects the choice of the first preset value. The color of the projected image affects the first preset value as well; however, in a given application scenario the communication distance is usually determined in advance, so the following analyzes the effect of the projected image color on the first preset value.
For example, the color of the projected image may be configured as a monochrome image, the monochrome image color including: red, orange, yellow, green, blue, violet, white and black. Then, in the case of communication distance determination, for example, the communication distance is 2 meters, the influence of the different first preset value on each monochrome image is tested.
According to the test results, the larger the first preset value, the easier the position coordinates corresponding to the color channel are to decode. However, when the first preset value is greater than 7, the human eye can perceive the light intensity variation for most colors; in the case of a blue monochromatic image, the Purkinje effect still causes the human eye to perceive a change in light intensity. The first preset value may therefore be set to any value from 4 to 8, depending on factors such as the projection parameters, the projected image colors, and the ambient light conditions.
As shown in fig. 5, a color space conversion modulation schematic diagram based on the YCbCr color space is provided. As can be seen from fig. 5 and equation (4), when the parameter δ is set too large, the R-channel and B-channel position coordinates are easy to demodulate/decode, but the overall brightness or color change of the screen becomes perceptible to the human eye; if δ is set too small, the color sensor may fail to sense the change in color intensity, i.e., fail to acquire the color sensing signal, so that the R-channel and B-channel position coordinates cannot be resolved. The reference red/blue intensity is the unmodified reference color intensity.
Step 210, each projection screen is projected.
According to the projection interaction method, the projection device acquires the projection picture sequence of the target projection image and divides the target projection image into sub-regions; it then encodes the position coordinates of each sub-region respectively to obtain a coding sequence of the position coordinates of each sub-region, and makes the coding sequence of each sub-region correspond to the projection picture sequence of that sub-region. Before each projection picture is projected, if the corresponding coded bit in the coding sequence of a sub-region is the first preset coded bit, the color intensity of that sub-region is adjusted, after which the projection picture is projected. Because the color intensity of each sub-region is adjusted before its projection picture is projected, positioning can be realized based on the change in color intensity of each sub-region, which improves the positioning accuracy.
Before the projection device transmits the optical signal corresponding to the coding sequence of the position coordinates of each sub-region, it may add a sequence header before the coding sequence and a parity bit and a tail frame after it, so that the complete beacon information of each sub-region is formed.
For example, the projection device has an original display resolution of 640×340 pixels, a frame rate of 60Hz, and in the complete beacon information of each sub-region, the coded bits of the sequence header are five bits, which may be denoted as "11111"; nine bits are used to represent the code sequence of the position coordinates of the sub-region; the parity bits and the encoded bits of the tail frame are one bit each, which may each be denoted as "0"; wherein the sequence header and parity bits and tail frames may be used for synchronization and verification.
Since the coded bits of the sequence header span five bits, from the perspective of optical signal reception the optical signal corresponding to the first coded bit in the coding sequence of the sub-region lasts the longest. Further, after the optical signal corresponding to the last coded bit in the coding sequence of the sub-region is received, reception resumes from the optical signal corresponding to the first coded bit of the sub-region, and the reception duration of the optical signal corresponding to the first coded bit is longer than that of the optical signals corresponding to the other coded bits in the coding sequence.
As shown in fig. 6, a schematic diagram of the beacon information of a sub-region is provided. The sequence header lasts 5 time slots and each remaining bit lasts 1 time slot; in other words, the sequence header occupies 5 bits, the parity bit and the tail frame occupy 1 bit each, and each coded bit in the coding sequence occupies 1 bit. For example, when the coding sequence of a sub-region is 18 bits and the frame rate is 60Hz, the total beacon information the projection device transmits for the sub-region is 25 bits, which means that 25/60 seconds are required to transmit the beacon information of the sub-region.
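The 25-bit frame layout can be sketched as below. The parity convention is an assumption for illustration (even parity over the coded bits; the text's example simply shows "0"), and the function name is not from the patent:

```python
def build_beacon(code_bits):
    """Assemble a beacon frame: 5-bit header '11111', the Manchester-coded
    bits, 1 parity bit (even parity assumed here), and a 1-bit tail '0'."""
    parity = "1" if code_bits.count("1") % 2 else "0"
    return "11111" + code_bits + parity + "0"

frame = build_beacon("10" * 9)  # 18 Manchester-coded bits
print(len(frame))               # 25 bits -> 25/60 s per beacon at 60 Hz
```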
The projection picture sequence, the first preset value and the reception duration of the analog current signal may change with the projection screen size and the required positioning accuracy. If the projection device works in an image projection mode, the projected image usually stays for more than 2 seconds, so the projection device can transmit several complete beacon messages; if the projection device works in a video projection mode and some frames produce drastic dynamic image changes, the receiving processing device can wait 1-2 seconds and then perform position coordinate recognition again to achieve target positioning.
In one embodiment, as shown in fig. 7, a projection interaction method is provided, and the method is applied to the receiving processing device in fig. 1, for illustration, the method includes the following steps:
Step 702, acquiring a color sensing signal output by a color sensor in a preset signal period.
Step 704, processing the color sensing signal output in the preset signal period to obtain a corresponding digital voltage signal.
The color sensor can acquire the light signal transmitted by the projection device and, after processing the light signal, obtain a color sensing signal. The receiving processing device then acquires the color sensing signal output by the color sensor in a preset signal period, and processes the color sensing signal output in the preset signal period to obtain a corresponding digital voltage signal; the preset signal period is the period set for the color sensor to receive signals.
Step 706, digitizing the digital voltage signal into a corresponding digitized sequence, wherein if the voltage value of the digital voltage signal is greater than the voltage threshold, the digitized encoded bit of the digital voltage signal is determined as a first preset encoded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the code bit after the digital voltage signal is digitized as a second preset code bit.
Step 708, decoding the digitized sequence to obtain corresponding position coordinates.
The color sensor can be worn on the user's finger as a light sensor or embedded in the projection command tool, so that the corresponding position coordinates are used for indicating the position of the user's finger in the projection picture or the position of the projection command pen in the projection picture, and based on the position, the positioning of the user's finger or the positioning of the projection command tool can be realized, namely, the target positioning can be realized.
Specifically, the receiving processing device decodes the digitized sequence to obtain corresponding position coordinates, including: performing Manchester decoding on the digitized sequence to obtain a digitized sequence after Manchester decoding; the receiving processing device performs gray decoding on the digitized sequence after Manchester decoding to obtain position coordinates.
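The receive chain above (thresholding, Manchester decoding, Gray decoding) can be sketched as follows; the voltage values, the 0.5 threshold and the Manchester convention ('1' → "10") are illustrative assumptions:

```python
def digitize(voltages, threshold):
    """Step 706: a sample above the threshold becomes '1', otherwise '0'."""
    return "".join("1" if v > threshold else "0" for v in voltages)

def manchester_decode(bits):
    """Inverse of the '1' -> '10', '0' -> '01' convention assumed at the encoder."""
    return "".join("1" if bits[i:i + 2] == "10" else "0"
                   for i in range(0, len(bits), 2))

def gray_decode(bits):
    """Recover the sub-region index from its binary-reflected Gray code."""
    g = int(bits, 2)
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

raw = digitize([0.9, 0.1, 0.1, 0.8, 0.2, 0.9, 0.9, 0.1], threshold=0.5)
print(raw, gray_decode(manchester_decode(raw)))  # 10010110 14
```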
According to the projection interaction method, the receiving processing device acquires the color sensing signal output by the color sensor in the preset signal period and processes it to obtain a corresponding digital voltage signal. The digital voltage signal is digitized into a corresponding digitized sequence: if the voltage value of the digital voltage signal is greater than the voltage threshold, the digitized coded bit is determined to be the first preset coded bit; if the voltage value is less than or equal to the voltage threshold, the digitized coded bit is determined to be the second preset coded bit. The digitized sequence is then decoded to obtain the corresponding position coordinates. Because the color sensing signals are transmitted after color-based modulation, once the projection device projects the picture, the position coordinates that the receiving processing device obtains from the color sensing signals received through the changing color intensity yield a high positioning success rate.
Specifically, the receiving processing device obtains the color sensing signal output by the color sensor in the preset signal period, an optional implementation manner is shown in fig. 8, and fig. 8 is a schematic flow chart of obtaining the color sensing signal by the receiving processing device according to the embodiment, which may include the following steps:
step 802, an analog current signal output by a color sensor is obtained.
Wherein the color sensor receives a light signal from the projection device, the color sensor may convert the light signal to an analog current signal.
In step 804, the analog current signal with the longest reception duration, together with the analog current signals between it and the next analog current signal with the longest reception duration, are determined as the color sensing signal output in the preset signal period.
When the projection device transmits the optical signal corresponding to the coding sequence of each sub-region to the color sensor, it adds a sequence header before the coding sequence of each sub-region, so the reception duration of the first coded bit in the coding sequence of each sub-region can be determined to be the longest. Based on this, the receiving processing device can determine the analog current signal with the longest reception duration, together with the analog current signals between it and the next longest-duration signal, as the color sensing signal output by the color sensor in the preset signal period.
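Framing on the longest-duration run can be sketched as below; the run-length representation, function name and sample data are illustrative assumptions:

```python
def extract_period(runs):
    """runs: (duration, level) pairs recovered from the analog current signal.
    The longest run marks the 5-slot sequence header; one preset signal period
    spans from a header up to (but not including) the next header."""
    header = max(range(len(runs)), key=lambda i: runs[i][0])
    nxt = next((j for j in range(header + 1, len(runs))
                if runs[j][0] >= runs[header][0]), len(runs))
    return runs[header:nxt]

runs = [(1, 0), (5, 1), (1, 0), (1, 1), (1, 0), (5, 1), (1, 0)]
print(extract_period(runs))  # [(5, 1), (1, 0), (1, 1), (1, 0)]
```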
Specifically, processing the color sensing signal output in the preset signal period to obtain a corresponding digital voltage signal includes: converting the analog current signal corresponding to each reception duration into a digital voltage signal, and obtaining the corresponding digital voltage signal after sampling, quantizing and filtering.
The receiving processing device may comprise a microcontroller, for example an STM32. The color sensor sends the analog current signals to the microcontroller, which converts the analog current signal corresponding to each reception duration into a digital voltage signal via the USB-RS32, and samples, quantizes and filters each digital voltage signal to obtain the corresponding digital voltage signal.
In one mode, the microcontroller may send the digital voltage signal to the receiving processing device, which evaluates its voltage value: if the voltage value of the digital voltage signal is greater than the voltage threshold, the digitized coded bit is determined to be the first preset coded bit, which may be "1"; if the voltage value is less than or equal to the voltage threshold, the digitized coded bit is determined to be the second preset coded bit, which may be "0".
In another mode, the microcontroller may send the digital voltage signal to a server, which evaluates its voltage value: if the voltage value of the digital voltage signal is greater than the voltage threshold, the digitized coded bit is determined to be the first preset coded bit; if the voltage value is less than or equal to the voltage threshold, the digitized coded bit is determined to be the second preset coded bit. The server can then send the digitized sequence to the receiving processing device, which decodes it to obtain the corresponding position coordinates; alternatively, the server decodes the digitized sequence itself and sends the resulting position coordinates to the receiving processing device. If the application scenario is a conference room projection scenario, the server may be a local server set up in the conference room.
In one embodiment, another optional method embodiment is performed by the receiving processing device using the position coordinates, where the position coordinates include R-channel position coordinates and B-channel position coordinates, and the method includes the following step:
if the R channel position coordinate is the same as the B channel position coordinate and is the same as the position coordinate of any one of the pre-stored subareas, the receiving and processing equipment determines that the obtained position coordinate is valid, otherwise, the receiving and processing equipment determines that the obtained position coordinate is invalid.
During communication between the projection device and the color sensor, the transmitted data are affected by factors such as ambient light noise, the responsivity of the photodiode array in the color sensor, and the amplification factor of the operational amplifier in the color sensor. To improve the position coordinate recognition rate, a position coordinate is therefore determined jointly from the R-channel and B-channel position coordinates, which gives high accuracy. The cases where the R-channel and B-channel position coordinates differ, or where they are the same but differ from the position coordinates of every pre-stored sub-region, occur only with a certain probability.
For the case where the R-channel and B-channel position coordinates differ, the receiving processing device can instruct the color sensor to output the color sensing signal again and re-examine the relation between the R-channel and B-channel position coordinates. If the two still differ after a preset number of attempts, the obtained position coordinates are invalid, i.e., positioning has failed; the preset number may be 5 or another value.
For the case where the R-channel position coordinate is the same as the B-channel position coordinate but differs from the position coordinates of every pre-stored sub-region, assume that the probability of mis-identifying the R-channel position coordinate is P_E1 and the probability of mis-identifying the B-channel position coordinate is P_E2. Since the three primary colors are mutually orthogonal, there is no linear correlation between P_E1 and P_E2, so the probability that the R-channel and B-channel position coordinates are mis-identified simultaneously is P_E1 × P_E2. Compared with the single-channel case, P_E1 × P_E2 < P_E1 and P_E1 × P_E2 < P_E2, so making the positioning decision with the two-channel position coordinates improves the positioning accuracy.
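The dual-channel validity check can be sketched as below; the function name, the 20×15 coordinate grid and the example error probabilities are illustrative assumptions:

```python
def accept_fix(r_coord, b_coord, known_coords):
    """A position fix is valid only when the R-channel and B-channel
    coordinates agree and match a pre-stored sub-region coordinate."""
    return r_coord == b_coord and r_coord in known_coords

known = {(x, y) for x in range(20) for y in range(15)}   # 20 x 15 sub-regions
print(accept_fix((3, 7), (3, 7), known))    # True
print(accept_fix((3, 7), (3, 8), known))    # False: channels disagree, retry
print(accept_fix((25, 7), (25, 7), known))  # False: not a stored sub-region

# With independent channel errors, joint mis-identification is rarer than either:
p_e1, p_e2 = 0.05, 0.03
print(p_e1 * p_e2 < min(p_e1, p_e2))        # True
```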
Specifically, the receiving processing device may also determine the sliding track in combination with position coordinates corresponding to a plurality of consecutive preset signal periods.
After determining the sliding track in combination with the position coordinates corresponding to the continuous multiple preset signal periods, the receiving processing device may determine a processing operation on the target projection image based on the sliding track, for example, the processing operation may be to switch to a next image of the target projection image; the sliding track has a corresponding relation with the processing operation.
The color sensor can be worn on a finger of a user as a light sensor or can be embedded in the projection command tool, so that based on color sensing signals output by the color sensor, the receiving processing equipment can also determine gesture directions by combining position coordinates of a plurality of subareas, further determine processing operations on the target projection image, and the gesture directions have a corresponding relation with the processing operations, for example, when the gesture directions are upward, the processing operations on the target projection image are closing the target projection image.
Considering factors such as sub-region size, different images and frame rates, communication distance (namely the distance from the projection device to the projection screen) and viewing angle, the projection interaction method provided by the application was tested in terms of human-eye perception, beacon recognition rate and positioning accuracy; the beacon recognition rate can be understood as the position coordinate recognition rate. The tests were carried out in a laboratory, with the lower left corner of the projection screen as the origin of coordinates and the projection device at a height of 1.2m above the ground.
In the aspect of human eye perception test, experiments are carried out at 10:00 am and 8:00 pm, so that the influence of ambient light on a projection interaction method can be conveniently analyzed; wherein the distance between the observer and the projection screen was set to 2 meters and 5 participants were tested for human perception.
Specifically, when testing the effect of sub-region size on human eye perception, the projection device transmitted the coding sequence using yellow and black images as monochromatic background content; the frame rate of the projected images was 60 fps.
With sub-region sizes of 5 cm×5 cm, 15 cm×15 cm, and 25 cm×25 cm, the human eye could not perceive the brightness variation at any of the three sizes during the daytime. At night, with sub-region sizes of 5 cm×5 cm and 15 cm×15 cm, the human eye could perceive a brightness change under the yellow background image. The reason is that at night the projection screen is the main indoor light source, so most of the brightness received by the human eye comes from the screen; and because the projection device encodes the position coordinates of each sub-region separately, the brightness change between adjacent sub-regions may be insignificant while the coordinate data of non-adjacent sub-regions may differ significantly, which the human eye can perceive.
Specifically, when testing the impact of different images and different frame rates on human eye perception, the distance between the projection device and the projection screen was 1.5 m. Figs. 9a-9c provide schematic diagrams of the test images: fig. 9a is the Lena image, fig. 9b is the Peppers image, and fig. 9c is a frame of a cartoon video stream.
The application tested the Lena image transmitting the coding sequence at different frame rates, the Lena and Peppers images alternately transmitting the coding sequence, and the cartoon video stream transmitting the coding sequence, yielding the results shown in figs. 10a-10c.
Figs. 10a-10c provide schematic diagrams of the influence of different images and different frame rates on human eye perception: fig. 10a shows the perception rate when the Lena image transmits the coding sequence at different frame rates, fig. 10b shows the perception rate when the Lena and Peppers images alternately transmit the coding sequence, and fig. 10c shows the perception rate for the cartoon video stream in terms of abnormal picture brightness change (i.e., normal brightness change produced by the cartoon video stream itself), video stream buffer change, and color change.
As can be seen from fig. 10a, different frame rates do not cause the human eye to perceive a brightness change in the Lena image; from fig. 10b, alternately transmitting the coding sequence between the Lena and Peppers images does not cause human eye perception either; from fig. 10c, the human eye does not perceive any abnormality in terms of picture brightness change, video stream buffer change, or color change.
Specifically, when testing the influence of communication distance and viewing angle on human eye perception, the tests were divided into daytime and night; the Lena image was used to transmit the coding sequence at a frame rate of 60 fps, the distance between the projection device and the projection screen ranged from 1 m to 5 m, and the viewing angle of the observer ranged from -60° to 60°. A perception rate of 100% indicates that the human eye cannot perceive the brightness change.
Figs. 11a and 11b provide schematic diagrams of the perception rate versus communication distance and viewing angle: fig. 11a shows the perception rate at different communication distances, and fig. 11b shows the perception rate at different viewing angles.
As can be seen from fig. 11a, as the communication distance increases, the human eye can perceive the brightness change: a longer communication distance enlarges the projected picture, which reduces the brightness of the projection screen and makes it more easily disturbed by ambient light, possibly causing the human eye to perceive the brightness change. As can be seen from fig. 11b, different viewing angles have no effect on human eye perception at a distance of 1.5 m. Thus, based on figs. 11a and 11b, the communication distance is a more important factor affecting human eye perception than the viewing angle.
For the beacon recognition rate tests, the tests were divided into daytime and night, with the curtains kept closed to avoid interference from ambient light entering the room. Specifically, when testing the effect of sub-region size on the beacon recognition rate, a convex lens measuring 1.5 cm×1.5 cm was embedded in the color sensor.
Fig. 12 provides a schematic diagram of the effect of sub-region size on the beacon recognition rate. As fig. 12 shows, the beacon recognition rate drops as the sub-region size decreases, because a smaller sub-region makes the color sensor more likely to receive color intensities from multiple sub-regions at once, causing beacon recognition to fail. As the sub-region size increases, the beacon recognition rate also fluctuates, because the color sensor may sit on the boundary between two sub-regions, which likewise causes recognition to fail.
Specifically, when testing the influence of different images and different frame rates on the beacon recognition rate, the test conditions were the same as those used when testing their influence on human eye perception.
Figs. 13a-13c provide schematic diagrams of the effect of different images and different frame rates on the beacon recognition rate: fig. 13a shows the beacon recognition rate when the Lena image transmits the coding sequence at different frame rates, fig. 13b shows the beacon recognition rate when the Lena and Peppers images alternately transmit the coding sequence, and fig. 13c shows the beacon recognition rate for the cartoon video stream in terms of abnormal brightness change (i.e., normal brightness change produced by the cartoon video stream itself), video stream buffer change, and color change.
As can be seen from figs. 13a and 13b, different frame rates and different images have no effect on the beacon recognition rate. Fig. 13c shows that when the projection device encodes the position coordinates of each sub-region in the cartoon video stream, the position coordinates may ultimately fail to be recognized. Position coordinates are recognized from the color intensity changes on the R channel and the B channel, and because the video content of each frame of the cartoon video stream changes rapidly, the reference color intensities of the R and B channels change constantly; the receiving processing device therefore cannot set a reasonable voltage threshold, and position coordinate recognition fails.
Specifically, when testing the influence of communication distance on the beacon recognition rate, the sub-region size was 15 cm×15 cm. Fig. 14 provides a schematic diagram of the result: the beacon recognition rate decreases as the communication distance increases, because a longer communication distance reduces the light intensity received by the color sensor, so the useful light intensity is drowned in ambient light noise and position coordinate recognition fails.
For the positioning accuracy tests, the application tested the positioning accuracy at the edge and at the center of the projection screen, and plotted positioning points and tracking curves; 30 random positioning points were used in the test.
Fig. 15 provides a schematic diagram of the positioning results. As fig. 15 shows, with a sub-region size of 10 cm×10 cm, centimeter-level positioning accuracy can be achieved, with an average positioning error of 8 cm. The positioning accuracy at the edge of the projection screen is the same as at its center, so the provided projection interaction method achieves good target positioning and can effectively locate any point on the projection screen.
Also for the positioning accuracy tests, the application projected the images used for the daytime positioning results onto a yellow desktop and onto a white wall, and observed the influence of the different background color planes on positioning accuracy. Fig. 16 provides a schematic diagram of the results: different background color planes have almost no influence on positioning accuracy, so the projection interaction method provided by the application can perform effective human-machine interaction on different background planes. These tests show that the projection interaction method provided by the application achieves centimeter-level target positioning and broadens human-machine interaction for smart projectors based on visible light positioning.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, the execution order of the steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in these flowcharts may comprise multiple sub-steps or stages, which need not be completed at the same moment but may be executed at different moments, and need not be executed sequentially but may be executed in turn or alternately with at least part of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides a projection interaction device for realizing the above-mentioned projection interaction method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in one or more embodiments of the projection interaction device provided below may refer to the limitation of the projection interaction method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 17, there is provided a projection interaction apparatus, comprising: an acquisition module 1702, a partitioning module 1704, an encoding module 1706, a color intensity adjustment module 1708, and a projection module 1710, wherein:
An acquisition module 1702 configured to acquire a projection screen sequence of a target projection image, where the projection screen sequence includes a plurality of continuous projection screens; a dividing module 1704, configured to divide the target projection image into sub-regions; the encoding module 1706 is configured to encode the position coordinates of each sub-region respectively, obtain a coding sequence of the position coordinates of each sub-region, and correspond the coding sequence of the sub-region to a projection picture sequence of the sub-region; the color intensity adjustment module 1708 is configured to adjust the color intensity of the sub-region of the projection picture if the corresponding encoding bit in the encoding sequence of the sub-region of the projection picture is a first preset encoding bit before projecting each projection picture; the projection module 1710 is configured to project each projection screen.
In one embodiment, the encoding module 1706 is further configured to: gray coding is respectively carried out on the position coordinates of each subarea, and a Gray coding sequence is obtained; and carrying out Manchester encoding on the Gray code sequence to obtain the Manchester code sequence of the position coordinates of each subarea.
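The two-stage encoding performed by the encoding module (Gray coding of each sub-region's position coordinate, followed by Manchester coding) can be sketched as follows. The bit width and the Manchester convention (1 → `10`, 0 → `01`) are illustrative assumptions; the patent does not fix either.

```python
def gray_encode(n: int, width: int) -> str:
    """Convert an integer coordinate to a fixed-width Gray-code bit string."""
    return format(n ^ (n >> 1), f"0{width}b")

def manchester_encode(bits: str) -> str:
    """Manchester-encode a bit string; convention 1 -> '10', 0 -> '01' (assumed)."""
    return "".join("10" if b == "1" else "01" for b in bits)

def encode_position(x: int, y: int, width: int = 4) -> str:
    """Coding sequence for one sub-region's (x, y) position coordinate."""
    return manchester_encode(gray_encode(x, width) + gray_encode(y, width))
```

For example, coordinate value 3 Gray-encodes to `0010` at width 4, which Manchester-encodes to `01011001`.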
In one embodiment, the color intensity adjustment module 1708 is further configured to: if the corresponding coding bit in the coding sequence of the sub-region of the projection picture is a first preset coding bit, the color intensity of the R channel of the sub-region is increased by a first preset value, and the color intensity of the B channel of the sub-region is decreased by a second preset value, wherein the second preset value is a preset multiple of the first preset value.
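A minimal sketch of the color intensity adjustment for a first preset coding bit: the R channel rises by a first preset value and the B channel falls by a preset multiple of it. The concrete values (`delta_r = 2`, `multiple = 2`) and the clamping to the 0-255 range are illustrative assumptions.

```python
def adjust_subregion(pixel, delta_r=2, multiple=2):
    """Raise R by the first preset value and lower B by a preset multiple of it,
    clamped to the valid 0-255 range; values are illustrative, not from the patent."""
    r, g, b = pixel
    return (min(255, r + delta_r), g, max(0, b - multiple * delta_r))
```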
In one embodiment, as shown in fig. 18, there is provided a projection interaction apparatus, comprising: an acquisition module 1802, a processing module 1804, a digitizing module 1806, and a decoding module 1808, wherein:
An acquisition module 1802, configured to acquire a color sensing signal output by a color sensor in a preset signal period; the processing module 1804 is configured to process the color sensing signal output in the preset signal period to obtain a corresponding digital voltage signal; a digitizing module 1806, configured to digitize the digital voltage signal into a corresponding digitized sequence, where if the voltage value of the digital voltage signal is greater than the voltage threshold, the digitized encoded bit of the digital voltage signal is determined as a first preset encoded bit; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the code bit of the digital voltage signal after being digitized as a second preset code bit; the decoding module 1808 is configured to decode the digitized sequence to obtain corresponding position coordinates.
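The threshold digitization performed by the digitizing module 1806 can be sketched in a few lines: a voltage value above the threshold becomes the first preset coding bit, and a value at or below the threshold becomes the second. The voltage samples and threshold below are illustrative.

```python
def digitize(voltages, threshold):
    """Map each digital voltage sample to a coded bit:
    above the threshold -> '1' (first preset bit), otherwise '0' (second)."""
    return "".join("1" if v > threshold else "0" for v in voltages)
```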
In one embodiment, the acquisition module 1802 is further configured to: acquiring an analog current signal output by a color sensor; and determining the analog current signal with the longest receiving duration and the analog current signal between the analog current signal and the analog current signal with the longest receiving duration as the color sensing signal output in the preset signal period.
In one embodiment, the processing module 1804 is further configured to: convert the analog current signals corresponding to the receiving duration into voltage signals; and sample, quantize, and filter the voltage signals to obtain the corresponding digital voltage signals.
In one embodiment, the decoding module 1808 is further configured to: performing Manchester decoding on the digitized sequence to obtain a digitized sequence after Manchester decoding; gray decoding is carried out on the digitized sequence after Manchester decoding, and position coordinates are obtained.
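The decoding order mirrors the encoder: Manchester decoding first, then Gray decoding. A sketch, assuming the same Manchester convention as on the transmit side (1 → `10`, 0 → `01`):

```python
def manchester_decode(chips: str) -> str:
    """Recover the bit string from Manchester chip pairs (1 -> '10', 0 -> '01')."""
    bits = []
    for i in range(0, len(chips), 2):
        pair = chips[i:i + 2]
        if pair == "10":
            bits.append("1")
        elif pair == "01":
            bits.append("0")
        else:
            raise ValueError(f"invalid Manchester pair: {pair!r}")
    return "".join(bits)

def gray_decode(bits: str) -> int:
    """Convert a Gray-code bit string back to the integer position coordinate."""
    n = 0
    for b in bits:
        n = (n << 1) | (int(b) ^ (n & 1))  # binary bit = gray bit XOR previous binary bit
    return n
```

For example, the chips `01011001` Manchester-decode to `0010`, which Gray-decodes to coordinate value 3.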
In one embodiment, the processing module 1804 is further configured to: if the R channel position coordinate is the same as the B channel position coordinate and is the same as the position coordinate of any one of the pre-stored subareas, the obtained position coordinate is determined to be valid, otherwise, the obtained position coordinate is determined to be invalid.
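The validity check described above (the R channel coordinate equals the B channel coordinate and matches a pre-stored sub-region coordinate) is a simple conjunction; representing coordinates as tuples in a set is an assumption for illustration.

```python
def coordinate_valid(r_coord, b_coord, known_coords):
    """A coordinate is accepted only when both channels agree and the value
    matches one of the pre-stored sub-region coordinates."""
    return r_coord == b_coord and r_coord in known_coords
```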
In one embodiment, the processing module 1804 is further configured to: and determining the sliding track by combining the position coordinates corresponding to a plurality of continuous preset signal periods.
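One simple way to turn the position coordinates of consecutive preset signal periods into a sliding direction is to compare the first and last samples; the patent does not specify the exact rule, so this classification is only a sketch.

```python
def track_direction(points):
    """Classify the dominant slide direction from the first to the last sampled
    coordinate. Assumes the origin at the lower-left corner, y increasing upward."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```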
The modules in the projection interaction apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 19. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a projection interaction method.
It will be appreciated by those skilled in the art that the structure shown in FIG. 19 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
In one embodiment, a projection interactive system is provided, the system comprising: a projection device, a color sensor, a receiving processing device in communication with the color sensor,
Projection apparatus for: acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures; dividing the target projection image into subareas; respectively encoding the position coordinates of each subarea to obtain a coding sequence of the position coordinates of each subarea, and corresponding the coding sequence of the subarea to a projection picture sequence of the subarea; before each projection picture is projected, if the corresponding coding bit in the coding sequence of the sub-region of the projection picture is a first preset coding bit, adjusting the color intensity of the sub-region; projecting each projection picture;
a color sensor for outputting a color sensing signal;
A reception processing device for: acquiring a color sensing signal output by a color sensor in a preset signal period; processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals; the digital voltage signals are digitized into corresponding digitized sequences, wherein if the voltage value of the digital voltage signals is larger than a voltage threshold value, the digitized coded bits of the digital voltage signals are determined to be first preset coded bits; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the code bit of the digital voltage signal after being digitized as a second preset code bit; and decoding the digitized sequence to obtain corresponding position coordinates.
The user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take various forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, data processing logic units based on quantum computing, and the like, but are not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail, but they are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the application, and these all fall within the protection scope of the application. Accordingly, the protection scope of the application shall be subject to the appended claims.

Claims (10)

1. A projection interaction method, applied to a projection device, the method comprising:
acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures;
dividing the target projection image into subareas;
respectively encoding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region;
before projecting each projection picture, if the corresponding coding bit of the sub-region of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-region;
Projecting each projection picture; in the projection process of each projection picture, a color sensor acquires an optical signal transmitted by the projection equipment in the projection process of each projection picture, processes the optical signal to obtain a color sensing signal, and outputs the color sensing signal in a preset signal period, so that a receiving processing equipment acquires the color sensing signal output by the color sensor in the preset signal period; processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals; the digital voltage signals are digitized into corresponding digitized sequences, wherein if the voltage value of the digital voltage signals is larger than a voltage threshold value, the digitized coded bits of the digital voltage signals are determined to be first preset coded bits; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the code bit of the digital voltage signal after being digitized as a second preset code bit; and decoding the digitized sequence to obtain corresponding position coordinates.
2. The method according to claim 1, wherein the encoding the position coordinates of each of the sub-regions separately to obtain the encoded sequence of the position coordinates of each of the sub-regions includes:
Respectively carrying out Gray coding on the position coordinates of each subarea to obtain a Gray coding sequence;
and carrying out Manchester encoding on the Gray code sequence to obtain Manchester code sequences of position coordinates of the subareas.
3. The method according to claim 1 or 2, wherein if the corresponding coding bit of the sub-region of the projection picture in the coding sequence is the first preset coding bit, adjusting the color intensity of the sub-region comprises:
If the corresponding coding bit of the sub-region of the projection picture in the coding sequence is the first preset coding bit, increasing the color intensity of the R channel of the sub-region by a first preset value, and decreasing the color intensity of the B channel of the sub-region by a second preset value, wherein the second preset value is a preset multiple of the first preset value.
4. A projection interaction method, applied to a receiving processing device, the method comprising:
Acquiring a color sensing signal output by a color sensor in a preset signal period;
processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals;
The digital voltage signals are digitized into corresponding digitized sequences, wherein if the voltage value of the digital voltage signals is larger than a voltage threshold value, the digitized coded bits of the digital voltage signals are determined to be first preset coded bits; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the code bit of the digital voltage signal after being digitized as a second preset code bit;
and decoding the digitized sequence to obtain corresponding position coordinates.
5. The method of claim 4, wherein the acquiring the color sensing signal output by the color sensor during the preset signal period comprises:
Acquiring an analog current signal output by the color sensor;
And determining the analog current signal with the longest receiving duration and the analog current signal between the analog current signal and the analog current signal with the longest receiving duration as the color sensing signal output in the preset signal period.
6. The method according to claim 5, wherein the processing the color sensing signal output in the preset signal period to obtain a corresponding digital voltage signal includes:
converting the analog current signals corresponding to the receiving duration to digital voltage signals;
And after sampling, quantizing and filtering the digital voltage signals, obtaining the corresponding digital voltage signals.
7. The method according to any one of claims 4 to 6, wherein decoding the digitized sequence to obtain corresponding position coordinates comprises:
performing Manchester decoding on the digitized sequence to obtain a digitized sequence after Manchester decoding;
and Gray decoding is carried out on the digitized sequence after Manchester decoding, and the position coordinates are obtained.
8. The method of claim 7, wherein the position coordinates comprise R-channel position coordinates and B-channel position coordinates, the method further comprising:
And if the R channel position coordinate is the same as the B channel position coordinate and is the same as the position coordinate of any one of the pre-stored sub-areas, determining that the obtained position coordinate is valid, otherwise, determining that the obtained position coordinate is invalid.
9. The method of claim 8, wherein the method further comprises:
And determining the sliding track by combining the position coordinates corresponding to a plurality of continuous preset signal periods.
10. A projection interactive system, the system comprising: a projection device, a color sensor, and a reception processing device in communication with the color sensor,
The projection device is used for:
acquiring a projection picture sequence of a target projection image, wherein the projection picture sequence comprises a plurality of continuous projection pictures;
dividing the target projection image into subareas;
respectively encoding the position coordinates of each sub-region to obtain a coding sequence of the position coordinates of each sub-region, and corresponding the coding sequence of the sub-region to a projection picture sequence of the sub-region;
before projecting each projection picture, if the corresponding coding bit of the sub-region of the projection picture in the coding sequence is a first preset coding bit, adjusting the color intensity of the sub-region;
projecting each projection picture;
The color sensor is used for collecting light signals transmitted by the projection equipment in the projection process of each projection picture, processing the light signals, obtaining color sensing signals and outputting the color sensing signals in a preset signal period;
The reception processing device is configured to:
acquiring a color sensing signal output by the color sensor in a preset signal period;
processing the color sensing signals output in the preset signal period to obtain corresponding digital voltage signals;
The digital voltage signals are digitized into corresponding digitized sequences, wherein if the voltage value of the digital voltage signals is larger than a voltage threshold value, the digitized coded bits of the digital voltage signals are determined to be first preset coded bits; if the voltage value of the digital voltage signal is smaller than or equal to the voltage threshold value, determining the code bit of the digital voltage signal after being digitized as a second preset code bit;
and decoding the digitized sequence to obtain corresponding position coordinates.
CN202210606437.4A 2022-05-31 2022-05-31 Projection interaction method and system Active CN115016716B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210606437.4A CN115016716B (en) 2022-05-31 2022-05-31 Projection interaction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210606437.4A CN115016716B (en) 2022-05-31 2022-05-31 Projection interaction method and system

Publications (2)

Publication Number Publication Date
CN115016716A CN115016716A (en) 2022-09-06
CN115016716B true CN115016716B (en) 2024-07-05

Family

ID=83070926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210606437.4A Active CN115016716B (en) 2022-05-31 2022-05-31 Projection interaction method and system

Country Status (1)

Country Link
CN (1) CN115016716B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107728413A (en) * 2016-08-12 2018-02-23 深圳市光峰光电技术有限公司 Optical projection system and brightness adjustment method applied to the optical projection system
CN108332670A (en) * 2018-02-06 2018-07-27 浙江大学 Structured-light coding method fusing RGB-channel positive and negative Gray codes with fringe-block shifting

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6671003B1 (en) * 1999-06-30 2003-12-30 Thomson Licensing S.A. Automated calibration in a projection display apparatus
US6480175B1 (en) * 1999-09-17 2002-11-12 International Business Machines Corporation Method and system for eliminating artifacts in overlapped projections
DE10244821A1 (en) * 2002-09-26 2004-04-01 Philips Intellectual Property & Standards Gmbh projection system
JP5485574B2 (en) * 2009-03-26 2014-05-07 株式会社タムラ製作所 Projection system, projection method, projection program, and projection vector calculation apparatus
US9497447B2 (en) * 2011-06-15 2016-11-15 Scalable Display Technologies, Inc. System and method for color and intensity calibrating of a display system for practical usage
CN103809880B (en) * 2014-02-24 2017-02-08 清华大学 Man-machine interaction system and method
KR102420103B1 (en) * 2016-07-14 2022-07-13 삼성전자주식회사 Projection system with enhanced color and contrast
JP2018197824A (en) * 2017-05-24 2018-12-13 キヤノン株式会社 Projection device, information processing apparatus, and method for controlling them, and program
JP2020053710A (en) * 2018-09-21 2020-04-02 キヤノン株式会社 Information processing apparatus, projection device, control method of projection device, and program
CN110177265A (en) * 2019-06-11 2019-08-27 成都极米科技股份有限公司 Method for controlling projection, projection control and optical projection system
CN113556521B (en) * 2020-04-23 2024-03-26 青岛海信激光显示股份有限公司 Projection image correction method and projection system
CN113934089A (en) * 2020-06-29 2022-01-14 中强光电股份有限公司 Projection positioning system and projection positioning method thereof
CN112261396B (en) * 2020-10-26 2022-02-25 成都极米科技股份有限公司 Projection method, projection device, projection equipment and computer readable storage medium
CN112770095B (en) * 2021-01-28 2023-06-30 广州方硅信息技术有限公司 Panoramic projection method and device and electronic equipment


Similar Documents

Publication Publication Date Title
US11373338B2 (en) Image padding in video-based point-cloud compression CODEC
KR102010228B1 (en) Image processing apparatus, image processing method, and program
US10582196B2 (en) Generating heat maps using dynamic vision sensor events
US11671712B2 (en) Apparatus and methods for image encoding using spatially weighted encoding quality parameters
US8553931B2 (en) System and method for adaptively defining a region of interest for motion analysis in digital video
US10872434B2 (en) Image processing apparatus and method
US11143879B2 (en) Semi-dense depth estimation from a dynamic vision sensor (DVS) stereo pair and a pulsed speckle pattern projector
WO2019047985A1 (en) Image processing method and device, electronic device, and computer-readable storage medium
US9049369B2 (en) Apparatus, system and method for projecting images onto predefined portions of objects
CN108702477B (en) Image processing apparatus and method
CN104284119A (en) Apparatus, system and method for projecting images onto predefined portions of objects
CN112887728A (en) Electronic device, control method and system of electronic device
US9241141B1 (en) Projection block extraction
CN106888355A (en) Bit-rate controller and the method for limiting output bit rate
US9385807B2 (en) Light wave communication
US20180091799A1 (en) Robust disparity estimation in the presence of significant intensity variations for camera arrays
CN109191398B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
KR20190080732A (en) Estimation of illumination chromaticity in automatic white balancing
CN115016716B (en) Projection interaction method and system
US11025891B2 (en) Method and system for generating a two-dimensional and a three-dimensional image stream
US11221477B2 (en) Communication apparatus, communication system, and data communication method
CN113196742A (en) Method, system, and computer readable medium for image sensor communication using different transmit data sequence rates and receive frame rates
WO2020162293A1 (en) Image processing device, image processing method, and image processing system
US10102432B2 (en) Image recognition method
KR20190101833A (en) Electronic device for compressing image based on compression loss data associated with compression of a plurality of blocks into which image has been segmented and method for operating thefeof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant