CN112882612B - Display method, display device and display system - Google Patents

Display method, display device and display system

Info

Publication number
CN112882612B
Authority
CN
China
Prior art keywords
image
straight line
line segment
display device
quadrangle
Prior art date
Legal status
Active
Application number
CN202110039053.4A
Other languages
Chinese (zh)
Other versions
CN112882612A (en)
Inventor
王镜茹
胡风硕
Current Assignee
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd
Priority to CN202110039053.4A
Publication of CN112882612A
Application granted
Publication of CN112882612B

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/60: Analysis of geometric attributes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Position Input By Displaying (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the present application disclose a display method, a display device and a display system. One embodiment of the display method includes: acquiring an image captured by an image acquisition device; detecting the rectangular frame of a display device in the image; and displaying a mark at the corresponding display position of the display device according to the coordinates of the center point of the image in the coordinate system established from the rectangular frame. In this embodiment, calculating the projection point from images captured in real time replaces the conventional interaction mode in which a laser pen forms a spot on the display device, so problems such as an unclear projected image can be avoided.

Description

Display method, display device and display system
Technical Field
The present application relates to the field of display technology, and more particularly to a display method, a display device, a display system, a computer device and a computer-readable storage medium.
Background
Electronic interactive whiteboards are widely used in business offices, smart education, smart healthcare and other fields, and provide a convenient way to exchange information. However, because a liquid crystal screen contains two mutually orthogonal polarizers, when the light emitted by an ordinary laser pen is projected onto the screen, most of the light may be absorbed by the polarizers and fail to be displayed on the screen.
Disclosure of Invention
It is an object of the present application to provide a display method, a display device, a display system, a computer device, and a computer-readable storage medium, which solve at least one of the problems existing in the prior art.
In order to achieve the above object, the present application adopts the following technical solutions:
The first aspect of the present application provides a display method, which includes:
acquiring an image acquired by an image acquisition device;
detecting and obtaining a rectangular frame of display equipment in the image;
and displaying the mark at the corresponding display position of the display device according to the coordinates of the central point of the image under the coordinate system established by the rectangular frame.
According to the display method provided by the first aspect of the present application, calculating the projection point from images captured in real time replaces the conventional interaction mode in which a laser pen forms a spot on the display device, so problems such as an unclear projected image can be avoided.
In one possible implementation manner, the detecting the rectangular frame of the display device in the image includes:
performing straight line detection on the image to obtain a straight line segment contained in the image;
dividing each straight line segment into an upper frame group, a lower frame group, a left frame group and a right frame group;
respectively selecting one straight line segment from the upper frame group, the lower frame group, the left frame group and the right frame group to form straight line segment combinations, obtaining all straight line segment combinations, and connecting four straight line segments in each straight line segment combination into a quadrangle to obtain a plurality of quadrangles;
selecting a quadrangle of the corresponding display device from the plurality of quadrangles according to the intersection ratio of each side in each quadrangle and the straight line segment forming the side;
and performing perspective transformation on the quadrangle of the corresponding display device to obtain a rectangular frame of the display device in the image.
In one possible implementation manner, the selecting, from the plurality of quadrilaterals, the quadrilateral of the corresponding display device according to an intersection ratio of each edge in each quadrilateral and a straight line segment forming the edge includes:
and calculating the sum value of the intersection ratio of each side contained in each quadrangle and the straight line segments forming each side, and selecting the quadrangle with the largest sum value as the quadrangle of the corresponding display equipment.
In this implementation, the detected straight lines are divided into upper, lower, left and right groups according to their relative positions, and one straight line is then selected from each group to form a quadrangle; selecting, from all candidate quadrangles, the one with the best intersection-ratio score as the detected whiteboard frame is an innovative design.
In one possible implementation manner, the intersection ratio is calculated as:

iou = (l_A ∩ l_B) / (l_A ∪ l_B)

where iou denotes the intersection ratio, l_A denotes an edge, l_B denotes the straight line segment forming edge l_A, l_A ∩ l_B denotes the length of the overlapping portion of edge l_A and straight line segment l_B, and l_A ∪ l_B denotes the length of edge l_A and straight line segment l_B after they are merged.
In one possible implementation manner, the dividing each line segment into an upper frame group, a lower frame group, a left frame group and a right frame group includes:
taking a straight line segment with an included angle smaller than 45 degrees with the horizontal direction of the straight line segments contained in the image as a horizontal straight line segment and a straight line segment with an included angle larger than 45 degrees with the horizontal direction as a vertical straight line segment;
the method comprises the steps of dividing a horizontal straight line segment with a midpoint above a center point of an image into an upper frame group, dividing a horizontal straight line segment with a midpoint below the center point of the image into a lower frame group, dividing a vertical straight line segment with a midpoint at the left side of the center point of the image into a left frame group, and dividing a vertical straight line segment with a midpoint at the right side of the center point of the image into a right frame group.
In one possible implementation manner, the performing the straight line detection on the image to obtain the straight line segment included in the image includes:
binarizing the image according to the color of the rectangular frame of the display equipment to obtain a binarized image;
and carrying out linear detection on the binarized image to obtain a linear segment contained in the image.
In this implementation, binarization preprocessing of the acquired image reduces the influence of background interference on the detection of straight line segments.
In one possible implementation manner, the performing perspective transformation on the quadrangle of the corresponding display device includes:
calibrating a center point of the image;
after calibration, perspective transformation is carried out on the quadrangle of the corresponding display equipment;
displaying the mark at the corresponding display position of the display device according to the coordinates of the central point of the image in the coordinate system established by the rectangular frame comprises:
and determining a center point of the image in the image after perspective transformation of the quadrangle of the corresponding display equipment through the calibration detection, acquiring the coordinates of the center point of the image under a coordinate system established by the rectangular frame, and displaying the identification at the corresponding display position of the display equipment according to the coordinates.
The perspective transformation adopted in this implementation makes the position of the mark displayed on the display device more accurate.
In one possible implementation manner, displaying the identifier at the corresponding display position of the display device according to the coordinates of the central point of the image in the coordinate system established by the rectangular frame includes:
and determining the display position of the mark according to the coordinate average value of the coordinates of the central point of the current frame image under the coordinate system established by the rectangular frame and the coordinates of the central point of the previous N frame images under the coordinate system established by the corresponding rectangular frame.
In this implementation, for each frame, the projection position detected in the current frame is averaged with the positions detected in the preceding consecutive frames (for example, the previous four frames), and the averaged position is used as the final imaging point on the display device, which improves smoothness for the user.
A second aspect of the present application provides a display apparatus that performs the display method provided in the first aspect, the display apparatus comprising:
the acquisition module is used for acquiring the image acquired by the image acquisition device;
the detection module is used for detecting and obtaining a rectangular frame of the display equipment in the image;
and the display module is used for displaying the mark at the corresponding display position according to the coordinates of the central point of the image under the coordinate system established by the rectangular frame.
According to the display device provided by the second aspect of the present application, calculating the projection point from images captured in real time replaces the conventional interaction mode in which a laser pen forms a spot on the display device, so problems such as an unclear projected image can be avoided.
A third aspect of the present application provides a display system, comprising: the display device provided by the second aspect of the present application, and an image acquisition apparatus for acquiring an image.
In one possible embodiment, the image acquisition device is arranged at the end of a pen-shaped housing.
The pen-shaped housing in this implementation improves the accuracy of the displayed mark and is convenient for users.
In one possible implementation, the display device is an electronic whiteboard device.
A fourth aspect of the present application provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the display method provided in the first aspect of the present application when executing the program.
A fifth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the display method provided by the first aspect of the present application.
The beneficial effects of this application are as follows:
according to the technical scheme, the mode of calculating the projection point by shooting the image in real time is used for replacing the interactive mode of imaging on the display device by the traditional laser pen, and the problems that the display of the projection image is unclear and the like can be avoided.
Drawings
The embodiments of the present application are described in further detail below with reference to the accompanying drawings.
FIG. 1 shows a flow chart of a display method provided by an embodiment of the present application;
FIG. 2 shows a physical view of an acquired image provided by an embodiment of the present application;
FIG. 3 (a) illustrates a combination of straight line segments provided by an embodiment of the present application;
FIG. 3 (b) shows a quadrilateral formed by processing straight line segments provided by embodiments of the present application;
FIG. 4 shows a block diagram of a display system provided by an embodiment of the present application;
fig. 5 shows a structural diagram of a display device provided by an embodiment of the present application;
fig. 6 shows a schematic structural diagram of a computer system implementing an apparatus provided in an embodiment of the present application.
Detailed Description
In order to more clearly illustrate the present application, the present application is further described below with reference to examples and drawings. Like parts in the drawings are denoted by the same reference numerals. It is to be understood by persons skilled in the art that the following detailed description is intended to be illustrative, and not restrictive, and that this invention is not to be limited to the specific embodiments shown.
Electronic whiteboards are now the product of choice for many enterprises: their quick and convenient handwriting mode greatly improves work efficiency. In use, an electronic whiteboard is often paired with a laser pointer (laser pen); the visible laser emitted by the laser pen forms a marked projection on the electronic whiteboard, which is convenient for viewers.
However, since the electronic whiteboard is a liquid crystal display device, its structure contains two mutually orthogonal polarizers, so when the laser emitted by the laser pen is projected onto the electronic whiteboard, most of the light may be absorbed by the polarizers and cannot be displayed on the electronic whiteboard. Furthermore, looking directly at excessive laser light may cause temporary or permanent damage to a viewer's eyes.
Accordingly, to solve the above problems, an embodiment of the present application provides a display method in which a display mark is formed at the corresponding display position of the display device instead of using the laser light emitted by a laser pen, thereby avoiding problems such as unclear display.
Specifically, as shown in fig. 1, the display method includes:
s10, acquiring an image acquired by the image acquisition device.
The image acquisition device provided in this embodiment may be a miniature camera or the like.
For example, to facilitate forming the display mark, the miniature camera may be disposed at the end of a pen-shaped housing. For ease of description, the structure formed by the miniature camera and the housing is referred to as a "stylus"; the center point of the captured image is the position the stylus tip points at, i.e. the "stylus" is used to replace the existing "laser pen".
In a specific example, referring to fig. 2, fig. 2 shows a physical image acquired by the image acquisition device, and it can be seen that some other objects, such as "flowerpot", "cabinet", "window" and so on, are shot in the acquired image in addition to the display device.
S20, detecting and obtaining the rectangular frame of the display device in the image.
It is easy to understand that the rectangular frame of the display device in the present application is the outer frame of the display device's housing (the frame of the display device in fig. 2). It should be noted that most existing display devices are rectangular in appearance, so in this embodiment the frame of the display device is by default a rectangular frame. The display method provided in this embodiment may be adapted to the actual shape of the display device; for example, when the display device is circular in appearance, step S20 instead detects the circular frame of the display device in the image.
In order to capture the entire display device in the image, the image acquisition device needs to capture images from a position at some distance from the display device during actual operation.
With continued reference to fig. 2, besides the display device, objects such as the cabinet and the window also have rectangular shapes, and the display device needs to be distinguished from these noise shapes (cabinet and window) when it is extracted from the acquired image. Step S20 therefore further includes the following substeps:
s201, performing straight line detection on the image to obtain a straight line segment contained in the image;
It is easy to understand that, in the acquired image, besides the straight line segments of the display device frame, the background contains a large number of other interfering straight line segments, so the image generally needs to be preprocessed to filter out some of the noise segments and reduce interference with line detection. Specifically, in some embodiments, step S201 may include: binarizing the image according to the color of the rectangular frame of the display device to obtain a binarized image; and performing line detection on the binarized image to obtain the straight line segments contained in the image.
Binarizing an image sets the gray value of every point to 0 or 255, so that the whole image shows a clear black-and-white effect. Applying a suitable threshold to a gray-level image with 256 brightness levels yields a binarized image that still reflects the overall and local characteristics of the image. This benefits further processing: the set properties of the image then depend only on the positions of points whose pixel value is 0 or 255, and no longer on multi-level pixel values, which simplifies processing and reduces the amount of data to be processed and compressed.
In a specific example, with continued reference to fig. 2, the frame of the display device is black, while the colors of the other interfering line segments are complex and variable. The acquired image is therefore binarized first: a threshold is set for the color channels, for example 50, pixels whose R, G and B channel values are all smaller than the threshold are retained, and the three channel values of all other pixels are set to 255 (white). After the information other than near-black colors has been filtered out of the image, the result is fed into a straight-line-segment detection algorithm to obtain the straight line segments contained in the binarized image.
The straight line segments obtained in this way are the straight line segments contained in the binarized image, but they correspond to the straight line segments contained in the original image.
Preferably, a straight line segment detection algorithm (Line Segment Detector, LSD) may be employed to detect straight line segments in the binarized image.
Since the frame color of the display device is known to be black, the pixels whose R, G and B channel values are all below the threshold are retained, and the three channel values of the other pixels are set to white (255), which contrasts strongly with black. If the frame of the display device has another color, the setting is adjusted accordingly: for example, if the frame is red, the three channel values of the other pixels may be set to white or green; if the frame is white, the three channel values of the other pixels may be set to black.
In other words, the captured image is preprocessed for a display-device frame of a specific color (such as black): the darker pixels in the image are retained, and straight-line detection is then performed, which reduces the interference of other possible straight lines in the background.
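As an illustration only, the binarization and line-detection step might be sketched as follows in Python with OpenCV and NumPy. The channel threshold of 50 follows the example above; the function name and the assumption that the OpenCV build provides cv2.createLineSegmentDetector are assumptions made for this sketch, not part of the patent.

```python
import cv2
import numpy as np

def detect_line_segments(image_bgr, channel_threshold=50):
    """Keep near-black pixels, whiten everything else, then run LSD line detection."""
    # Pixels whose B, G and R values are all below the threshold are kept as black (0);
    # all other pixels are set to white (255), filtering most background clutter.
    dark = np.all(image_bgr < channel_threshold, axis=2)
    binarized = np.where(dark, 0, 255).astype(np.uint8)

    # LSD works on a single-channel image; the detected segments are expressed in the
    # coordinates of the original image.
    lsd = cv2.createLineSegmentDetector()
    segments, _, _, _ = lsd.detect(binarized)
    return [] if segments is None else segments.reshape(-1, 4)  # rows of (x1, y1, x2, y2)
```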
S202, dividing each straight line segment into an upper frame group, a lower frame group, a left frame group and a right frame group;
specifically, the detected end point coordinates (x 1 ,y 1 ),(x 2 ,y 2 ) The length of the straight line segment can be calculated by the following formula,
and screening out the straight line segments with the lengths smaller than a preset length threshold value in each straight line segment. The specific value of the threshold value may be determined according to the length, width, etc. of the display device, for example, the length of the display device is 0.5m, and the width is 0.3m, and the length threshold value may be designed to be 0.2m.
Then, the included angle between each straight line segment and the horizontal direction is calculated by the following formula
Dividing straight line segments with included angles smaller than 45 degrees into horizontal lines, or dividing the straight line segments into vertical lines to obtain horizontal and vertical straight line segment groups;
calculating the midpoint of each straight line segment, for a horizontal straight line segment group, dividing the straight line segment into an upper frame group if the midpoint of the straight line segment is above the center point of the image, and otherwise, dividing the straight line segment into a lower frame group; for a vertical straight line segment group, if the midpoint of the straight line segment is to the left of the center point of the image, the straight line segment is divided into a left frame group, otherwise, the straight line segment is divided into a right frame group.
It should be noted that if any of the four groups contains no straight line segment satisfying the above conditions, the frame quadrangle of the display device in the picture cannot contain the center point of the image; in other words, the "stylus" is not pointing at the screen of the display device, and no projection point is displayed on the screen of the display device at this time.
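As a rough illustration of step S202, the following Python sketch filters and groups the detected segments. The function name, the pixel-based length threshold min_length, and the use of image coordinates (where y grows downward, so "above the center" means a smaller y) are assumptions made for the example.

```python
import math

def group_segments(segments, image_width, image_height, min_length):
    """Split detected segments (x1, y1, x2, y2) into top/bottom/left/right frame groups."""
    cx, cy = image_width / 2.0, image_height / 2.0
    groups = {"top": [], "bottom": [], "left": [], "right": []}
    for x1, y1, x2, y2 in segments:
        if math.hypot(x2 - x1, y2 - y1) < min_length:   # drop short, noisy segments
            continue
        angle = math.degrees(math.atan2(abs(y2 - y1), abs(x2 - x1)))
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0       # segment midpoint
        if angle < 45:                                  # roughly horizontal
            groups["top" if my < cy else "bottom"].append((x1, y1, x2, y2))
        else:                                           # roughly vertical
            groups["left" if mx < cx else "right"].append((x1, y1, x2, y2))
    return groups
```

If any of the four lists comes back empty, the frame cannot enclose the image center and, as noted above, no projection point is shown.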
S203, respectively selecting one straight line segment from the upper frame group, the lower frame group, the left frame group and the right frame group to form straight line segment combinations, obtaining all straight line segment combinations, and connecting four straight line segments in each straight line segment combination into one quadrangle to obtain a plurality of quadrangles;
It should be noted that all the straight line segments contained in each frame group need to be traversed to form the combinations. For example, if the upper frame group contains straight line segments A1 and A2, the lower frame group contains straight line segments B1 and B2, the left frame group contains straight line segment C1, and the right frame group contains straight line segment D1, then the following quadrangles are obtained: A1B1C1D1, A1B2C1D1, A2B1C1D1 and A2B2C1D1.
As shown in fig. 3, fig. 3 (a) shows one straight line segment selected from each of the four frame groups. For convenience of description and understanding, the straight line segment in the upper frame group is denoted A1, the straight line segment in the lower frame group B1, the straight line segment in the left frame group C1, and the straight line segment in the right frame group D1. Fig. 3 (b) shows the quadrangle A1B1C1D1 formed by processing the straight line segments in fig. 3 (a).
However, as can be seen from fig. 3 (a), straight line segments A1, B1, C1 and D1 cannot directly form quadrangle A1B1C1D1; quadrangle A1B1C1D1 can only be formed after the straight line segments are cut or extended.
Taking fig. 3 as an example: straight line segment A1 has its left part cut off to obtain quadrangle edge A1; straight line segment B1 is extended to the left and right to obtain quadrangle edge B1; straight line segment C1 has its upper part extended and its lower part cut to obtain quadrangle edge C1; and straight line segment D1 is cut at the top and bottom to obtain quadrangle edge D1. Finally, the quadrangle A1B1C1D1 shown in fig. 3 (b) is formed.
S204, selecting a quadrangle of the corresponding display device from the quadrangles according to the intersection ratio of each side in each quadrangle and the straight line segment forming the side;
specifically, a sum value of the intersection ratio of each side included in each quadrangle and the straight line segment forming each side is calculated, and the quadrangle with the largest sum value is selected as the quadrangle of the corresponding display device.
In some embodiments, the intersection ratio of an edge l_A may be obtained by the following formula:

iou = (l_A ∩ l_B) / (l_A ∪ l_B)

where iou denotes the intersection ratio, l_A denotes an edge, l_B denotes the straight line segment forming edge l_A, l_A ∩ l_B denotes the length of the overlapping portion of edge l_A and straight line segment l_B, and l_A ∪ l_B denotes the length of edge l_A and straight line segment l_B after they are merged.
With continued reference to fig. 3, as can be seen from fig. 3 (a) and fig. 3 (b), the upper edge A1 of the quadrangle is formed by cutting the left part of straight line segment A1. For the upper edge, edge A1 is l_A and segment A1 is l_B, with l_A ⊆ l_B; that is, for the upper edge, l_A and l_B satisfy the complete-coincidence condition, and the intersection ratio of the upper edge is the ratio of the length of edge A1 to the length of segment A1.
Similarly, the right edge D1 of the quadrangle is formed by cutting the top and bottom of straight line segment D1. For the right edge, edge D1 is l_A and segment D1 is l_B, with l_A ⊆ l_B; that is, for the right edge, l_A and l_B satisfy the complete-coincidence condition, and the intersection ratio of the right edge is the ratio of the length of edge D1 to the length of segment D1.
Similarly, the left edge C1 of the quadrangle is formed by extending straight line segment C1 upward. For the left edge, edge C1 is l_A and segment C1 is l_B; as can be seen from fig. 3, for the left edge, l_A and l_B only partially coincide, and the intersection ratio of the left edge is the length of their overlapping portion divided by their merged length.
Similarly, the lower edge B1 of the quadrangle is formed by extending straight line segment B1 to the left and right. For the lower edge, edge B1 is l_A and segment B1 is l_B; l_A and l_B only partially coincide, and the intersection ratio of the lower edge is the length of their overlapping portion divided by their merged length.
The case where l_A and l_B do not coincide at all is not shown in fig. 3, but as those skilled in the art will appreciate, this is the case where an edge is formed entirely from the "extension" of a straight line segment.
Based on the calculation method of the intersection ratio, the intersection ratio of the four sides of the quadrangle is sequentially obtained, and the four intersection ratios are added to obtain the intersection ratio sum value of the four sides of the quadrangle.
The intersection-ratio sums of all the quadrangles are calculated, and the quadrangle with the largest sum is selected as the quadrangle of the corresponding display device.
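As a rough sketch of this selection rule, assuming that every quadrangle edge is collinear with the straight line segment it was cut or extended from (so the intersection ratio reduces to a one-dimensional interval overlap), and assuming a hypothetical helper make_edges that turns a (top, bottom, left, right) segment combination into the four edges of the closed quadrangle in the same order, the scoring might look like this:

```python
import itertools
import math

def interval_iou(edge, segment):
    """iou = (l_A ∩ l_B) / (l_A ∪ l_B) for two collinear segments."""
    ax1, ay1, ax2, ay2 = edge
    bx1, by1, bx2, by2 = segment
    dx, dy = ax2 - ax1, ay2 - ay1
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm                 # unit vector along the edge

    def proj(x, y):                               # 1-D coordinate along the edge direction
        return (x - ax1) * ux + (y - ay1) * uy

    a_lo, a_hi = sorted((proj(ax1, ay1), proj(ax2, ay2)))
    b_lo, b_hi = sorted((proj(bx1, by1), proj(bx2, by2)))
    overlap = max(0.0, min(a_hi, b_hi) - max(a_lo, b_lo))
    union = max(a_hi, b_hi) - min(a_lo, b_lo)
    return overlap / union if union > 0 else 0.0

def best_quadrangle(groups, make_edges):
    """Score every (top, bottom, left, right) combination and keep the highest sum."""
    best_edges, best_score = None, -1.0
    for combo in itertools.product(groups["top"], groups["bottom"],
                                   groups["left"], groups["right"]):
        edges = make_edges(combo)                 # four edges, ordered like the segments
        score = sum(interval_iou(e, s) for e, s in zip(edges, combo))
        if score > best_score:
            best_edges, best_score = edges, score
    return best_edges
```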
S205, performing perspective transformation on the quadrangle of the corresponding display device to obtain a rectangular frame of the display device in the image.
Specifically, performing perspective transformation on the quadrangle of the corresponding display device includes:
calibrating the center point of the image;
the center point of the image is the center point of the acquired image.
The position of the center point of the image is calibrated by adopting a special color (such as red).
After calibration, perspective transformation is carried out on the quadrangle of the corresponding display equipment, and then the rectangular frame of the display equipment in the image is obtained.
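A minimal sketch of the perspective transformation, assuming OpenCV is available, corners holds the four corner points of the selected quadrangle ordered top-left, top-right, bottom-right, bottom-left, and (w, h) is a chosen resolution for the rectified frame:

```python
import cv2
import numpy as np

def rectify_frame(image_bgr, corners, w, h):
    """Warp the detected quadrangle into an axis-aligned w x h rectangular frame."""
    src = np.float32(corners)
    dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    warped = cv2.warpPerspective(image_bgr, matrix, (w, h))
    # The calibrated (e.g. red) centre point travels with the image, so it can be
    # located again in `warped` and read off directly in rectified-frame coordinates.
    return warped, matrix
```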
S30, displaying the mark at the corresponding display position of the display device according to the coordinates of the central point of the image under the coordinate system established by the rectangular frame.
Specifically, by detecting the calibration mark, the center point of the image (i.e. the center point of the original image) is determined in the image obtained after perspective transformation of the quadrangle of the corresponding display device; the coordinates of the center point of the image in the coordinate system established from the rectangular frame are then acquired, and the mark is displayed at the corresponding display position of the display device according to these coordinates.
In a specific example, after the rectangular frame of the display device is obtained, a coordinate system is established according to the rectangular frame; the position of the pre-calibrated red coordinate point (i.e. the center point of the original image) can then be detected from the R, G and B channel values of the pixels, and the display position of the "stylus" mark on the display device is determined from the position of that red coordinate point.
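Continuing the example, the calibrated red point might be located in the rectified image by thresholding the colour channels and then scaled to the display resolution. The specific thresholds (200/80) and the centroid step are illustrative assumptions, not values taken from the patent:

```python
import cv2
import numpy as np

def locate_mark(warped_bgr, display_w, display_h):
    """Find the red calibration point in the rectified frame and map it to display coordinates."""
    b, g, r = cv2.split(warped_bgr)
    mask = (r > 200) & (g < 80) & (b < 80)        # strongly red, weakly blue/green pixels
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                               # the image centre is not on the screen
    u, v = xs.mean(), ys.mean()                   # centroid of the calibration mark
    h, w = warped_bgr.shape[:2]
    return int(u / w * display_w), int(v / h * display_h)
```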
To display the position of the mark on the display device accurately, in some embodiments the display position of the mark is determined from the average of the coordinates of the center point of the current frame image in the coordinate system established from the rectangular frame and the coordinates of the center points of the previous N frame images in the coordinate systems established from their corresponding rectangular frames.
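A small sketch of this smoothing rule, assuming the current position is averaged with the N most recent previous positions (N = 4 in the example above):

```python
from collections import deque

class MarkSmoother:
    """Average the current display position with the previous N positions."""

    def __init__(self, n_previous=4):
        self.history = deque(maxlen=n_previous + 1)   # current frame plus N previous frames

    def update(self, x, y):
        self.history.append((x, y))
        xs = [p[0] for p in self.history]
        ys = [p[1] for p in self.history]
        return sum(xs) / len(xs), sum(ys) / len(ys)
```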
According to the display method provided by this embodiment, calculating the projection point from images captured in real time replaces the conventional interaction mode in which a laser pen forms a spot on the display device, so problems such as an unclear projected image can be avoided.
As shown in fig. 4, another embodiment of the present application provides a display system including a display device and an image capturing apparatus for capturing an image;
as shown in fig. 5, the display device includes:
the acquisition module is used for acquiring the image acquired by the image acquisition device;
the detection module is used for detecting and obtaining a rectangular frame of the display equipment in the image;
and the display module is used for displaying the mark at the corresponding display position according to the coordinates of the central point of the image under the coordinate system established by the rectangular frame.
Specifically, the display device is an electronic whiteboard device. The image acquisition device is arranged at the end part of the pen-shaped shell.
It should be noted that, the principle and workflow of the display system provided in this embodiment are similar to those of the above display method, and the relevant points may be referred to the above description, which is not repeated herein.
As shown in fig. 6, a computer system suitable for implementing the display system provided in the above embodiment includes a central processing unit (CPU) that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage section into a random access memory (RAM). The RAM also stores various programs and data required for the operation of the computer system. The CPU, the ROM and the RAM are connected by a bus. An input/output (I/O) interface is also connected to the bus.
The following components are connected to the I/O interface: an input section including a keyboard, a mouse, etc.; an output section including a display such as a liquid crystal display (LCD) and a speaker; a storage section including a hard disk or the like; and a communication section including a network interface card such as a LAN card or a modem. The communication section performs communication processing via a network such as the Internet. Drives are also connected to the I/O interface as needed. Removable media such as magnetic disks, optical disks, magneto-optical disks and semiconductor memories are mounted on the drives as needed, so that computer programs read from them can be installed into the storage section as needed.
In particular, according to the present embodiment, the procedure described in the above flowcharts may be implemented as a computer software program. For example, the present embodiments include a computer program product comprising a computer program tangibly embodied on a computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such embodiments, the computer program may be downloaded and installed from a network via a communication portion, and/or installed from a removable medium.
The flowcharts and diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to the present embodiments. In this regard, each block in the flowchart or schematic diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the diagrams and/or flowchart illustration, and combinations of blocks in the diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules involved in this embodiment may be implemented in software or in hardware. The described modules may also be provided in a processor, for example described as: a processor comprising an acquisition module, a detection module and a display module. In some cases, the names of these modules do not limit the modules themselves; for example, the acquisition module may also be described as "a module for acquiring images".
On the other hand, the present embodiment also provides a nonvolatile computer storage medium, which may be the nonvolatile computer storage medium included in the apparatus in the above embodiment or may be a nonvolatile computer storage medium existing separately and not incorporated in the terminal. The above-described nonvolatile computer storage medium stores one or more programs, which when executed by an apparatus, enable the apparatus to implement the display method provided by the foregoing embodiment.
In the description of the present application, it should be noted that orientation or positional terms such as "upper" and "lower" indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description and do not indicate or imply that the apparatus or element in question must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present application. Unless otherwise expressly specified or limited, the terms "mounted", "connected" and "coupled" are to be construed broadly; for example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or a communication between two elements. The specific meanings of these terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
It should also be noted that in the description of the present application, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
It should be apparent that the foregoing examples of the present application are merely illustrative of the present application and not limiting of the embodiments of the present application, and that various other changes and modifications may be made by one of ordinary skill in the art based on the foregoing description, and it is not intended to be exhaustive of all embodiments, and all obvious changes and modifications that come within the scope of the present application are intended to be embraced by the technical solution of the present application.

Claims (12)

1. A display method, comprising:
acquiring an image acquired by an image acquisition device;
detecting and obtaining a rectangular frame of display equipment in the image;
displaying the mark at the corresponding display position of the display device according to the coordinates of the central point of the image under the coordinate system established by the rectangular frame;
the detecting to obtain the rectangular frame of the display device in the image includes:
performing straight line detection on the image to obtain a straight line segment contained in the image;
dividing each straight line segment into an upper frame group, a lower frame group, a left frame group and a right frame group;
respectively selecting one straight line segment from the upper frame group, the lower frame group, the left frame group and the right frame group to form straight line segment combinations, obtaining all straight line segment combinations, and connecting four straight line segments in each straight line segment combination into a quadrangle to obtain a plurality of quadrangles;
selecting a quadrangle of the corresponding display device from the plurality of quadrangles according to the intersection ratio of each side in each quadrangle and the straight line segment forming the side;
performing perspective transformation on the quadrangle of the corresponding display device to obtain a rectangular frame of the display device in the image;
the selecting a quadrangle of the corresponding display device from the plurality of quadrangles according to the intersection ratio of each side in each quadrangle and the straight line segment forming the side comprises:
and calculating the sum value of the intersection ratio of each side contained in each quadrangle and the straight line segments forming each side, and selecting the quadrangle with the largest sum value as the quadrangle of the corresponding display equipment.
2. The method according to claim 1, wherein the intersection ratio is calculated as:

iou = (l_A ∩ l_B) / (l_A ∪ l_B)

where iou denotes the intersection ratio, l_A denotes an edge, l_B denotes the straight line segment forming edge l_A, l_A ∩ l_B denotes the length of the overlapping portion of edge l_A and straight line segment l_B, and l_A ∪ l_B denotes the length of edge l_A and straight line segment l_B after they are merged.
3. The method of claim 1, wherein dividing each line segment into an upper frame group, a lower frame group, a left frame group, and a right frame group comprises:
taking a straight line segment with an included angle smaller than 45 degrees with the horizontal direction of the straight line segments contained in the image as a horizontal straight line segment and a straight line segment with an included angle larger than 45 degrees with the horizontal direction as a vertical straight line segment;
the method comprises the steps of dividing a horizontal straight line segment with a midpoint above a center point of an image into an upper frame group, dividing a horizontal straight line segment with a midpoint below the center point of the image into a lower frame group, dividing a vertical straight line segment with a midpoint at the left side of the center point of the image into a left frame group, and dividing a vertical straight line segment with a midpoint at the right side of the center point of the image into a right frame group.
4. The method of claim 1, wherein the performing the straight line detection on the image to obtain the straight line segment included in the image includes:
binarizing the image according to the color of the rectangular frame of the display equipment to obtain a binarized image;
and carrying out linear detection on the binarized image to obtain a linear segment contained in the image.
5. The method according to claim 1, wherein
the performing perspective transformation on the quadrangle of the corresponding display device includes:
calibrating a center point of the image;
after calibration, perspective transformation is carried out on the quadrangle of the corresponding display equipment;
displaying the mark at the corresponding display position of the display device according to the coordinates of the central point of the image in the coordinate system established by the rectangular frame comprises:
and determining a center point of the image in the image after perspective transformation of the quadrangle of the corresponding display equipment through the calibration detection, acquiring the coordinates of the center point of the image under a coordinate system established by the rectangular frame, and displaying the identification at the corresponding display position of the display equipment according to the coordinates.
6. The method of claim 1, wherein displaying the identification at the corresponding display position of the display device according to coordinates of the center point of the image in the coordinate system established with the rectangular frame comprises:
and determining the display position of the mark according to the coordinate average value of the coordinates of the central point of the current frame image under the coordinate system established by the rectangular frame and the coordinates of the central point of the previous N frame images under the coordinate system established by the corresponding rectangular frame.
7. A display device performing the method of any of claims 1-6, comprising:
the acquisition module is used for acquiring the image acquired by the image acquisition device;
the detection module is used for detecting and obtaining a rectangular frame of the display equipment in the image;
and the display module is used for displaying the mark at the corresponding display position according to the coordinates of the central point of the image under the coordinate system established by the rectangular frame.
8. A display system comprising the display device according to claim 7 and an image capturing means for capturing an image.
9. The system of claim 8, wherein the image capture device is disposed at an end of a pen-shaped housing.
10. The system of claim 8, wherein the display device is an electronic whiteboard device.
11. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1-6 when the program is executed by the processor.
12. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any of claims 1-6.
CN202110039053.4A 2021-01-12 2021-01-12 Display method, display device and display system Active CN112882612B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110039053.4A CN112882612B (en) 2021-01-12 2021-01-12 Display method, display device and display system

Publications (2)

Publication Number Publication Date
CN112882612A CN112882612A (en) 2021-06-01
CN112882612B (en) 2024-01-23

Family

ID=76044802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110039053.4A Active CN112882612B (en) 2021-01-12 2021-01-12 Display method, display device and display system

Country Status (1)

Country Link
CN (1) CN112882612B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101464751A (en) * 2009-01-06 2009-06-24 杭州华银视讯科技有限公司 Electronic white board system based on image detection and detection method thereof
CN101730876A (en) * 2007-05-26 2010-06-09 李汶基 Pointing device using camera and outputting mark
KR20100082171A (en) * 2009-01-08 2010-07-16 (주)유플로우 Apparatus for screen remote controlling
CN103777751A (en) * 2012-10-25 2014-05-07 三星电子株式会社 A method for displaying a cursor on a display and system performing the same
CN106325725A (en) * 2015-06-25 2017-01-11 中兴通讯股份有限公司 Touch screen control method and device, and mobile terminal
CN109117794A (en) * 2018-08-16 2019-01-01 广东工业大学 A kind of moving target behavior tracking method, apparatus, equipment and readable storage medium storing program for executing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7542072B2 (en) * 2004-07-28 2009-06-02 The University Of Maryland Device using a camera and light polarization for the remote displacement of a cursor on a display

Also Published As

Publication number Publication date
CN112882612A (en) 2021-06-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant