CN112882612A - Display method, display equipment and display system - Google Patents

Display method, display equipment and display system

Info

Publication number
CN112882612A
CN112882612A (application CN202110039053.4A)
Authority
CN
China
Prior art keywords
image
straight line
line segment
display
quadrangle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110039053.4A
Other languages
Chinese (zh)
Other versions
CN112882612B (en)
Inventor
王镜茹
胡风硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202110039053.4A priority Critical patent/CN112882612B/en
Publication of CN112882612A publication Critical patent/CN112882612A/en
Application granted granted Critical
Publication of CN112882612B publication Critical patent/CN112882612B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/28 Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Position Input By Displaying (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiment of the present application discloses a display method, a display device and a display system. One specific embodiment of the display method comprises: acquiring an image acquired by an image acquisition device; detecting a rectangular frame of the display device in the image; and displaying an identifier at the corresponding display position of the display device according to the coordinates of the center point of the image in a coordinate system established from the rectangular frame. In this embodiment, capturing images in real time and computing the projection point replaces the traditional interaction of imaging a laser pointer on the display device, which avoids problems such as an unclear projected image.

Description

Display method, display equipment and display system
Technical Field
The present application relates to the field of display technology, and more particularly to a display method, a display apparatus, a display system, a computer apparatus and a computer-readable storage medium.
Background
Electronic interactive whiteboards are widely used in business offices, smart education, smart healthcare and other fields, and provide a convenient means of information interaction. However, because a liquid crystal screen contains two orthogonal polarizers, when the light emitted by an ordinary laser pen is projected onto the screen, most of it may be absorbed by the polarizers and cannot be displayed on the screen.
Disclosure of Invention
An object of the present application is to provide a display method, a display device, a display system, a computer device, and a computer-readable storage medium, so as to solve at least one of the problems of the related art.
In order to achieve the above object, the present application adopts the following technical solutions:
a first aspect of the present application provides a display method, including:
acquiring an image acquired by an image acquisition device;
detecting a rectangular frame of the display device in the image;
and displaying the identifier at the corresponding display position of the display equipment according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame.
The display method provided by the first aspect of the application replaces the traditional interaction of imaging a laser pointer on the display device with capturing images in real time and computing the projection point, which avoids problems such as an unclear projected image.
In one possible implementation, detecting the rectangular frame of the display device in the image includes:
performing straight line detection on the image to obtain the straight line segments contained in the image;
dividing the straight line segments into an upper frame group, a lower frame group, a left frame group and a right frame group;
selecting one straight line segment from each of the upper, lower, left and right frame groups to form a straight line segment combination, obtaining all such combinations, and connecting the four straight line segments in each combination into a quadrangle to obtain a plurality of quadrangles;
selecting the quadrangle corresponding to the display device from the plurality of quadrangles according to the intersection ratio of each side of each quadrangle with the straight line segment forming that side;
and performing perspective transformation on the quadrangle corresponding to the display device to obtain the rectangular frame of the display device in the image.
In one possible implementation, selecting the quadrangle corresponding to the display device from the plurality of quadrangles according to the intersection ratio of each side with the straight line segment forming that side includes:
calculating, for each quadrangle, the sum of the intersection ratios of its sides with the straight line segments forming those sides, and selecting the quadrangle with the largest sum as the quadrangle corresponding to the display device.
In this implementation, the detected straight lines are divided into four groups (upper, lower, left and right) according to their relative positions, one straight line is then selected from each group to form a quadrangle, and a rule based on the intersection ratio scores is designed to select the best quadrangle among all candidates as the detected whiteboard border.
In one possible implementation, the intersection ratio (intersection over union, iou) is given by:
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
wherein iou represents the intersection ratio, l_A represents a side, l_B represents the straight line segment forming side l_A, l_A ∩ l_B represents the length of the overlapping part of side l_A and segment l_B, and l_A ∪ l_B represents their combined length.
In one possible implementation, dividing the straight line segments into an upper frame group, a lower frame group, a left frame group and a right frame group includes:
taking a straight line segment with an included angle of less than 45 degrees with the horizontal direction in the straight line segments contained in the image as a horizontal straight line segment, and taking a straight line segment with an included angle of more than 45 degrees with the horizontal direction as a vertical straight line segment;
dividing a horizontal straight line segment with a midpoint positioned above the center point of the image into an upper frame group, dividing a horizontal straight line segment with a midpoint positioned below the center point of the image into a lower frame group, dividing a vertical straight line segment with a midpoint positioned on the left side of the center point of the image into a left frame group, and dividing a vertical straight line segment with a midpoint positioned on the right side of the center point of the image into a right frame group.
In one possible implementation, performing straight line detection on the image to obtain the straight line segments contained in the image includes:
carrying out binarization on the image according to the color of a rectangular frame of the display equipment to obtain a binarized image;
and performing straight line detection on the binarized image to obtain the straight line segments contained in the image.
In this implementation, binarization preprocessing of the captured image reduces the influence of interfering straight line segments in the background.
In one possible implementation, the perspective transformation of the quadrangle of the corresponding display device includes:
calibrating the central point of the image;
after calibration, carrying out perspective transformation on the quadrangle of the corresponding display equipment;
the displaying the identifier at the corresponding display position of the display device according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame comprises:
and determining the central point of the image in the image after the quadrangle of the corresponding display equipment is subjected to perspective transformation through the detection of the calibration, acquiring the coordinate of the central point of the image under a coordinate system established by the rectangular frame, and displaying the identifier at the corresponding display position of the display equipment according to the coordinate.
The perspective transformation adopted in this implementation makes the position of the displayed identifier on the display device more accurate.
In a possible implementation manner, the displaying, according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame, the identifier at the corresponding display position of the display device includes:
and determining the display position of the identifier according to the coordinate average value of the coordinate of the center point of the current frame image in the coordinate system established by the rectangular frame and the coordinate of the center point of the previous N frame images in the coordinate system established by the corresponding rectangular frame.
In this implementation, for each frame, the average of the projection position detected in the current frame and the positions from the four consecutive preceding frames is used as the final imaging point on the display device, which improves smoothness for the user during use.
A second aspect of the present application provides a display apparatus that performs the display method provided by the first aspect, the display apparatus including:
the acquisition module is used for acquiring the image acquired by the image acquisition device;
the detection module is used for detecting and obtaining a rectangular frame of the display equipment in the image;
and the display module is used for displaying the identifier at the corresponding display position according to the coordinate of the central point of the image under the coordinate system established by the rectangular frame.
The display apparatus provided by the second aspect of the application replaces the traditional interaction of imaging a laser pointer on the display device with capturing images in real time and computing the projection point, which avoids problems such as an unclear projected image.
A third aspect of the present application provides a display system comprising the display apparatus provided by the second aspect of the application and an image acquisition device for acquiring images.
In a possible implementation, the image acquisition device is arranged at the end of a pen-shaped housing.
The pen-shaped housing in this implementation can improve the accuracy of the displayed identifier and is convenient for the user to hold and use.
In one possible implementation, the display device is an electronic whiteboard device.
A fourth aspect of the present application provides a computer device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the display method provided by the first aspect of the present application when executing the program.
A fifth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the display method provided by the first aspect of the present application.
The beneficial effects of the present application are as follows:
The technical solution provided by the present application replaces the traditional interaction of imaging a laser pointer on the display device with capturing images in real time and computing the projection point, which avoids problems such as an unclear projected image.
Drawings
The following describes embodiments of the present application in further detail with reference to the accompanying drawings.
FIG. 1 illustrates a flow chart of a display method provided by an embodiment of the present application;
FIG. 2 illustrates a pictorial view of a captured image provided by an embodiment of the present application;
FIG. 3(a) shows a combination of straight line segments provided by an embodiment of the present application;
FIG. 3(b) is a diagram illustrating a quadrilateral formed by a treated straight line segment provided by an embodiment of the present application;
FIG. 4 illustrates a block diagram of a display system provided by an embodiment of the present application;
fig. 5 illustrates a structure diagram of a display device provided by an embodiment of the present application;
fig. 6 shows a schematic structural diagram of a computer system implementing the apparatus provided in the embodiment of the present application.
Detailed Description
In order to more clearly explain the present application, the present application is further described below with reference to the embodiments and the accompanying drawings. Similar parts in the figures are denoted by the same reference numerals. It is to be understood by persons skilled in the art that the following detailed description is illustrative and not restrictive, and is not intended to limit the scope of the present application.
Electronic whiteboards are now the product of choice for most enterprises: handwriting on them is quick and convenient, which greatly improves work efficiency. In use, an electronic whiteboard is often paired with a laser designator (laser pen); the visible laser emitted by the pen forms a recognizable projected spot on the electronic whiteboard, which is convenient for viewers.
However, since the electronic whiteboard is a liquid crystal display device, its assembly contains two mutually orthogonal polarizers, so when the laser emitted by the laser pen is projected onto the electronic whiteboard, most of the light may be absorbed by the polarizers and cannot be displayed on the whiteboard. In addition, excessive direct viewing of the laser may cause temporary or even permanent damage to the viewer's eyes.
Therefore, to solve the above problems, an embodiment of the present application provides a display method that forms a display identifier at the corresponding display position of the display device instead of relying on the laser light emitted by a laser pointer, thereby avoiding problems such as unclear display.
Specifically, as shown in fig. 1, the display method includes:
and S10, acquiring the image acquired by the image acquisition device.
The image acquisition device provided by the embodiment can be a micro camera and the like.
For convenience of description, the structure formed by the miniature camera and its housing is referred to as a "stylus", and the center point of the captured image corresponds to the position the tip of the stylus is pointing at; in other words, the existing "laser pen" is replaced by the "stylus".
In a specific example, referring to fig. 2, fig. 2 shows a physical diagram captured by an image capturing device, and it can be known that some other objects, such as "flower pot", "cabinet", "window", etc., may be captured in the captured image besides the display device.
And S20, detecting the rectangular frame of the display device in the image.
It is easy to understand that the rectangular frame of the display device referred to in the present application is the peripheral frame of the display device (the frame of the display device in fig. 2). It should be noted that most existing display devices have a rectangular appearance, so in this embodiment the frame of the display device is assumed to be rectangular by default. The display method provided by this embodiment may be adapted to the shape of the actual display device; for example, when the display device is circular, step S20 is changed to detecting the circular frame of the display device in the acquired image.
In order to capture the entire image of the display device, the image capturing device is required to capture the image at a position far from the display device during actual operation.
With continued reference to fig. 2, besides the rectangular outline of the display device, shapes such as the cabinet and the window are also rectangular, so the outline of the display device must be extracted from the captured image and distinguished from these other noise shapes (cabinet, window, etc.). Step S20 therefore further includes the following sub-steps:
S201, performing straight line detection on the image to obtain the straight line segments contained in the image.
In the acquired image, besides the border line segments of the display device, there are usually a large number of other interfering line segments in the background, so the image generally needs to be preprocessed to filter out some of these noise segments and reduce the interference with line detection. Specifically, in some embodiments, step S201 may include: binarizing the image according to the color of the rectangular frame of the display device to obtain a binarized image; and performing straight line detection on the binarized image to obtain the straight line segments contained in the image.
In binarization, the gray value of each pixel in the image is set to 0 or 255, so that the whole image shows a clear black-and-white effect. In other words, an appropriate threshold is chosen to convert a 256-level grayscale image into a binary image that still reflects the overall and local features of the original. This is helpful for further processing: the relevant properties of the image then depend only on the positions of pixels with value 0 or 255 rather than on multi-level pixel values, so processing is simple and the amount of data to be processed and compressed is small.
In a specific example, referring to fig. 2, the frame of the display device is black while the colors of the other interfering line segments are complex and variable. The acquired image is therefore first binarized: a threshold is set for the color channels (for example, 50), pixels whose R, G and B values are all smaller than the threshold are retained, and the three channel values of all other pixels are set to 255 (white). After the approximately non-black information in the image has been filtered out in this way, the result is fed to a line segment detection algorithm to obtain the straight line segments contained in the binarized image.
It should be noted that the directly obtained straight line segments are straight line segments included in the binarized image, but the straight line segments included in the binarized image correspond to the straight line segments included in the original image.
Preferably, a Line Segment Detector (LSD) may be used to detect the line segments in the binarized image.
It should be noted that, because the border of the display device is known to be black, the pixels whose R, G and B values are all smaller than the threshold are retained and the three channel values of all other pixels are set to white (255), a color that contrasts strongly with black. If the border of the display device has another color, the replacement color can be chosen accordingly: for a red border, the three channel values of the other pixels may be set to white or green; for a white border, they may be set to black.
In short, for a display device border of a specific color (such as black), the captured image is preprocessed so that only the darker pixels are retained before straight line detection is performed, which reduces the interference caused by other straight lines that may exist in the background.
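As an illustration of this preprocessing, the following sketch (not part of the patent; the threshold value of 50 is taken from the example above, and the availability of OpenCV's LSD detector depends on the OpenCV build) keeps only the near-black pixels and then runs line segment detection on the binarized image:

```python
import cv2
import numpy as np

def detect_border_segments(image_bgr, threshold=50):
    """Keep near-black pixels (all channels below the threshold), set the rest to white,
    then detect straight line segments with OpenCV's LSD detector."""
    dark = np.all(image_bgr < threshold, axis=2)          # pixels close to the black bezel color
    binary = np.where(dark, 0, 255).astype(np.uint8)      # dark -> 0, everything else -> 255
    lsd = cv2.createLineSegmentDetector()                 # availability depends on the OpenCV build
    lines, _, _, _ = lsd.detect(binary)                   # each row: [x1, y1, x2, y2]
    return np.empty((0, 4)) if lines is None else lines.reshape(-1, 4)
```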
S202, dividing each straight line segment into an upper frame group, a lower frame group, a left frame group and a right frame group;
Specifically, the LSD algorithm outputs the endpoint coordinates (x1, y1) and (x2, y2) of each detected straight line segment, and the length of the segment can be calculated as:
length = √((x2 − x1)² + (y2 − y1)²)
and screening out the straight line segments with the length smaller than a preset length threshold value in each straight line segment. The specific value of the threshold value can be determined according to the length and the width of the display device, for example, the length of the display device is 0.5m, the width is 0.3m, and the length threshold value can be designed to be 0.2 m.
Then, the angle between each straight line segment and the horizontal direction is calculated as:
θ = arctan(|y2 − y1| / |x2 − x1|)
Dividing straight line segments with included angles smaller than 45 degrees into horizontal lines, otherwise, dividing the straight line segments into vertical lines, and obtaining horizontal and vertical straight line segment groups;
calculating the middle point of each straight line segment, and for the horizontal straight line segment group, if the middle point of the straight line segment is above the central point of the image, dividing the straight line segment into an upper frame group, and otherwise, dividing the straight line segment into a lower frame group; for a group of vertical straight line segments, a straight line segment is classified into a left frame group if its midpoint is to the left of the image center point, and into a right frame group otherwise.
It should be noted that if any of the four groups contains no straight line satisfying the conditions, the quadrangle of the display device's frame in the picture may not contain the center point of the image; that is, the "stylus" is not pointing at the screen of the display device, and in that case no projection point is displayed on the screen.
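A minimal sketch of step S202 (illustrative, not the patent's reference code; the pixel-space length threshold min_len is a placeholder, since the description derives its example value from the physical size of the display):

```python
import math

def group_segments(segments, img_w, img_h, min_len=50):
    """Filter short segments, then assign each remaining segment to the upper/lower/left/right
    frame group based on its angle to the horizontal and the position of its midpoint."""
    cx, cy = img_w / 2.0, img_h / 2.0
    groups = {"upper": [], "lower": [], "left": [], "right": []}
    for x1, y1, x2, y2 in segments:
        if math.hypot(x2 - x1, y2 - y1) < min_len:          # discard segments below the length threshold
            continue
        angle = math.degrees(math.atan2(abs(y2 - y1), abs(x2 - x1)))
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        if angle < 45:                                      # horizontal segment
            groups["upper" if my < cy else "lower"].append((x1, y1, x2, y2))
        else:                                               # vertical segment
            groups["left" if mx < cx else "right"].append((x1, y1, x2, y2))
    return groups
```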
S203, selecting a straight line segment from the upper frame group, the lower frame group, the left frame group and the right frame group respectively to form a straight line segment combination, obtaining all straight line segment combinations, and connecting four straight line segments in each straight line segment combination into a quadrangle to obtain a plurality of quadrangles;
it should be noted that all the included straight line segments in each bounding box group need to be traversed for combination, for example, the upper bounding boxThe group includes a straight line segment A1、A2(ii) a The lower frame group comprises a straight line segment B1、B2(ii) a The left frame group comprises a straight line segment C1(ii) a The frame group comprises a straight line segment D1(ii) a Then a polygon is obtained: a. the1B1C1D1、A1B2C1D1、A2B1C1D1、A2B2C1D1
As shown in fig. 3, fig. 3(a) shows one selected straight line segment from each of the four frame groups. For convenience of description and understanding, the straight line segment in the upper frame group is denoted A1, the one in the lower frame group B1, the one in the left frame group C1 and the one in the right frame group D1. Fig. 3(b) shows the quadrangle A1B1C1D1 formed after processing the straight line segments of fig. 3(a).
However, as shown in fig. 3(a), the straight line segments A1, B1, C1 and D1 cannot directly form the quadrangle A1B1C1D1; the quadrangle A1B1C1D1 is formed only after each straight line segment has been cut or extended.
Taking fig. 3 as an example: straight line segment A1 must have its left part cut off to obtain side A1 of the quadrangle; straight line segment B1 must be extended to the left and right to obtain side B1; straight line segment C1 must have its upper part extended and its lower part cut off to obtain side C1; and straight line segment D1 must be cut at the top and bottom to obtain side D1. Finally the quadrangle A1B1C1D1 shown in fig. 3(b) is formed.
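Cutting or extending the four segments so that they meet is equivalent to intersecting the infinite lines through them, so a candidate quadrangle can be built from the four pairwise intersections (upper with left, upper with right, lower with right, lower with left). A sketch of this construction (illustrative, not the patent's reference code):

```python
def line_intersection(seg_a, seg_b):
    """Intersection point of the infinite lines through two segments, or None if (nearly) parallel."""
    x1, y1, x2, y2 = seg_a
    x3, y3, x4, y4 = seg_b
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:
        return None
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / d
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / d
    return (px, py)

def quad_from_segments(upper, lower, left, right):
    """Corners in the order upper-left, upper-right, lower-right, lower-left."""
    corners = [line_intersection(upper, left), line_intersection(upper, right),
               line_intersection(lower, right), line_intersection(lower, left)]
    return None if any(c is None for c in corners) else corners
```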
S204, selecting a quadrangle corresponding to the display equipment from the quadrangles according to the intersection ratio of each side in each quadrangle and the straight line segment forming the side;
specifically, the sum of the intersection ratio of each side included in each quadrangle and the straight line segment forming each side is calculated, and the quadrangle with the largest sum is selected as the quadrangle corresponding to the display device.
In some embodiments, the intersection ratio of a side l_A can be obtained by the following formula:
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
wherein iou represents the intersection ratio, l_A represents a side of the quadrangle, l_B represents the straight line segment forming side l_A, l_A ∩ l_B represents the length of the overlapping part of l_A and l_B, and l_A ∪ l_B represents their combined length.
With continued reference to figs. 3(a) and 3(b): the upper side A1 of the quadrangle is formed from straight line segment A1 by cutting off its left part. For the upper side, the side A1 is l_A and the segment A1 is l_B, so l_A ⊆ l_B, i.e. the upper l_A and l_B satisfy the condition of complete coincidence, and the intersection ratio of the upper side is
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
namely the length of side A1 divided by the length of segment A1.
In the same way, the right side D1 of the quadrangle is formed from straight line segment D1 by cutting it at the top and bottom. For the right side, the side D1 is l_A and the segment D1 is l_B, so l_A ⊆ l_B, i.e. the right l_A and l_B satisfy the condition of complete coincidence, and the intersection ratio of the right side is
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
namely the length of side D1 divided by the length of segment D1.
In the same way, the left side C1 of the quadrangle is formed from straight line segment C1 by extending its upper part (and cutting its lower part). For the left side, the side C1 is l_A and the segment C1 is l_B; referring to fig. 3, it can be seen that here l_A and l_B only partially coincide, and the intersection ratio of the left side is
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
Similarly, the lower side B1 of the quadrangle is formed from straight line segment B1 by extending it to the left and right. For the lower side, the side B1 is l_A and the segment B1 is l_B; here l_A and l_B partially coincide, and the intersection ratio of the lower side is
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
Fig. 3 does not show the case in which l_A and l_B do not coincide at all, but this will be understood by those skilled in the art; it corresponds to a side formed entirely by "extension" of the straight line segment.
Based on this calculation, the intersection ratios of the four sides of a quadrangle are obtained in turn, and the four values are added to give the quadrangle's intersection ratio sum.
The intersection ratio sum is calculated for every candidate quadrangle, and the quadrangle with the largest sum is selected as the quadrangle corresponding to the display device.
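A sketch of this scoring and selection (illustrative, continuing the helper names used in the sketches above): each side is compared with its source segment by a one-dimensional intersection ratio measured along the side's direction, the four scores are summed, and the candidate with the largest sum is kept.

```python
import math

def edge_segment_iou(edge, segment):
    """1-D intersection ratio of a quadrangle side and the (roughly collinear) segment that formed it."""
    ex1, ey1, ex2, ey2 = edge
    dx, dy = ex2 - ex1, ey2 - ey1
    norm = math.hypot(dx, dy)
    if norm == 0:
        return 0.0
    ux, uy = dx / norm, dy / norm                           # unit direction along the side
    t = lambda x, y: (x - ex1) * ux + (y - ey1) * uy        # scalar coordinate along that direction
    a = sorted([t(ex1, ey1), t(ex2, ey2)])
    b = sorted([t(segment[0], segment[1]), t(segment[2], segment[3])])
    overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - overlap         # combined length of the two intervals
    return overlap / union if union > 0 else 0.0

def quad_edges(corners):
    ul, ur, lr, ll = corners                                # upper-left, upper-right, lower-right, lower-left
    return [ul + ur, ll + lr, ul + ll, ur + lr]             # upper, lower, left, right sides as (x1, y1, x2, y2)

def best_quadrilateral(candidates):
    """candidates: list of (corners, (upper_seg, lower_seg, left_seg, right_seg)) pairs."""
    def score(candidate):
        corners, segments = candidate
        return sum(edge_segment_iou(e, s) for e, s in zip(quad_edges(corners), segments))
    return max(candidates, key=score)
```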
S205, performing perspective transformation on the quadrangle of the corresponding display device to obtain a rectangular frame of the display device in the image.
Specifically, the perspective transformation of the quadrangle of the corresponding display device includes:
calibrating the central point of the image;
the center point of the image is the center point of the acquired image.
And calibrating the position of the central point of the image by using a special color (such as red).
And after calibration, carrying out perspective transformation on the quadrangle of the corresponding display equipment to obtain the rectangular frame of the display equipment in the image.
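A sketch of the rectification step using OpenCV (illustrative; the output size of 1920 by 1080 stands in for the display's resolution and is an assumption):

```python
import cv2
import numpy as np

def rectify_display(image, corners, out_w=1920, out_h=1080):
    """Warp the image so that the detected quadrangle becomes an axis-aligned rectangle.
    corners: upper-left, upper-right, lower-right, lower-left, as (x, y) pairs."""
    src = np.float32(corners)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    warped = cv2.warpPerspective(image, matrix, (out_w, out_h))
    return warped, matrix
```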
And S30, displaying the identifier at the corresponding display position of the display device according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame.
Specifically, by detecting the calibration, a center point of an image (i.e., a center point of an original image) is determined in an image after perspective transformation is performed on a quadrilateral of a corresponding display device, coordinates of the center point of the image in a coordinate system established by a rectangular frame are obtained, and a display identifier is displayed at a corresponding display position of the display device according to the coordinates.
In a specific example, after the rectangular frame of the display device has been obtained, a coordinate system is established from it. The position of the pre-calibrated red coordinate point (i.e., the position of the center point of the original image) can then be detected from the R, G and B channel values of the pixels, and the display position of the stylus's identifier on the display device can be determined from that red coordinate point.
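A sketch of locating the calibrated point in the rectified image (illustrative; the channel thresholds defining "approximately red" are assumptions):

```python
import numpy as np

def find_red_point(warped_bgr):
    """Centroid of approximately-red pixels in the rectified image, i.e. the calibrated
    center point expressed in the coordinate system of the rectangular frame."""
    b, g, r = warped_bgr[..., 0], warped_bgr[..., 1], warped_bgr[..., 2]
    mask = (r > 200) & (g < 80) & (b < 80)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                                   # calibration point not visible in the frame area
    return float(xs.mean()), float(ys.mean())
```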
In order to accurately display the position of the marker on the display device, in some embodiments, the display position of the marker is determined according to the coordinate average value of the coordinates of the center point of the current frame image in the coordinate system established by the rectangular frame and the coordinates of the center point of the previous N frame images in the coordinate system established by the corresponding rectangular frame.
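A sketch of this smoothing (illustrative; the description's example uses the four preceding frames, i.e. N = 4):

```python
from collections import deque

class MarkerSmoother:
    """Average the current projection point with those of the previous N frames."""
    def __init__(self, n=4):
        self.history = deque(maxlen=n)

    def update(self, point):
        points = list(self.history) + [point]
        self.history.append(point)
        u = sum(p[0] for p in points) / len(points)
        v = sum(p[1] for p in points) / len(points)
        return (u, v)
```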
The display method provided by this embodiment replaces the traditional interaction of imaging a laser pointer on the display device with capturing images in real time and computing the projection point, which avoids problems such as an unclear projected image.
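Putting the steps together, a minimal end-to-end sketch built from the illustrative helpers above (again an assumption-laden sketch, not the patent's reference implementation):

```python
import itertools
import cv2

def locate_pointer(frame_bgr, smoother):
    h, w = frame_bgr.shape[:2]
    segments = detect_border_segments(frame_bgr)                        # S201: binarize + line detection
    groups = group_segments(segments, w, h)                             # S202: four frame groups
    if any(len(g) == 0 for g in groups.values()):
        return None                                                     # stylus is not pointing at the screen
    candidates = []
    for up, low, left, right in itertools.product(groups["upper"], groups["lower"],
                                                  groups["left"], groups["right"]):   # S203
        corners = quad_from_segments(up, low, left, right)
        if corners is not None:
            candidates.append((corners, (up, low, left, right)))
    if not candidates:
        return None
    corners, _ = best_quadrilateral(candidates)                          # S204: largest intersection ratio sum
    marked = frame_bgr.copy()
    cv2.circle(marked, (w // 2, h // 2), 3, (0, 0, 255), -1)             # calibrate the image center in red
    warped, _ = rectify_display(marked, corners)                         # S205: perspective transformation
    point = find_red_point(warped)                                       # S30: center point in frame coordinates
    return None if point is None else smoother.update(point)            # smoothed display position
```

The returned coordinates are in the rectified frame; scaling them to the display's resolution gives the pixel position at which the identifier would be drawn.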
As shown in fig. 4, another embodiment of the present application provides a display system including a display device and an image capturing apparatus for capturing an image;
as shown in fig. 5, the display apparatus includes:
the acquisition module is used for acquiring the image acquired by the image acquisition device;
the detection module is used for detecting a rectangular frame of the display equipment in the obtained image;
and the display module is used for displaying the identifier at the corresponding display position according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame.
Specifically, the display device is an electronic whiteboard device. The image capturing device is arranged at the end of a pen-shaped housing.
It should be noted that the principle and the work flow of the display system provided in this embodiment are similar to those of the display method, and reference may be made to the above description for relevant parts, which are not described herein again.
As shown in fig. 6, a computer system suitable for implementing the display system provided by the above embodiments includes a central processing unit (CPU) that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage section into a random access memory (RAM). The RAM also stores the various programs and data necessary for the operation of the computer system. The CPU, the ROM and the RAM are connected to one another via a bus, and an input/output (I/O) interface is also connected to the bus.
The I/O interface is connected to an input section including a keyboard, a mouse and the like; an output section including a liquid crystal display (LCD), a speaker and the like; a storage section including a hard disk and the like; and a communication section including a network interface card such as a LAN card or a modem. The communication section performs communication processing via a network such as the Internet. A drive is also connected to the I/O interface as needed. A removable medium such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory is mounted on the drive as necessary, so that a computer program read from it can be installed into the storage section as needed.
In particular, the processes described in the above flowcharts may be implemented as computer software programs according to the present embodiment. For example, the present embodiments include a computer program product comprising a computer program tangibly embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section, and/or installed from a removable medium.
The flowchart and schematic diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to the present embodiments. In this regard, each block in the flowchart or schematic diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the schematic and/or flowchart illustration, and combinations of blocks in the schematic and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in this embodiment may be implemented by software or by hardware. The described modules may also be provided in a processor, which may for example be described as: a processor comprising an acquisition module, a detection module and a display module. The names of these modules do not in some cases constitute a limitation of the modules themselves; for example, the acquisition module may also be described as "a module for acquiring images".
On the other hand, the present embodiment also provides a nonvolatile computer storage medium, which may be the nonvolatile computer storage medium included in the apparatus in the foregoing embodiment, or may be a nonvolatile computer storage medium that exists separately and is not assembled into a terminal. The non-volatile computer storage medium stores one or more programs that, when executed by an apparatus, enable the apparatus to implement the display method provided by the foregoing embodiment.
In the description of the present application, it should be noted that the terms "upper", "lower", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, which are only for convenience in describing the present application and simplifying the description, and do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and operate, and thus, should not be construed as limiting the present application. Unless expressly stated or limited otherwise, the terms "mounted," "connected," and "connected" are intended to be inclusive and mean, for example, that they may be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
It is further noted that, in the description of the present application, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
It should be understood that the above-mentioned examples are given for the purpose of illustrating the present application clearly and not for the purpose of limiting the same, and that various other modifications and variations of the present invention may be made by those skilled in the art in light of the above teachings, and it is not intended to be exhaustive or to limit the invention to the precise form disclosed.

Claims (14)

1. A display method, comprising:
acquiring an image acquired by an image acquisition device;
detecting to obtain a rectangular frame of the display device in the image;
and displaying the identifier at the corresponding display position of the display equipment according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame.
2. The method of claim 1, wherein the detecting a rectangular bezel of a display device in the image comprises:
performing straight line detection on the image to obtain a straight line segment contained in the image;
dividing each straight line segment into an upper frame group, a lower frame group, a left frame group and a right frame group;
respectively selecting a straight line segment from the upper frame group, the lower frame group, the left frame group and the right frame group to form straight line segment combinations, obtaining all straight line segment combinations, and connecting four straight line segments in each straight line segment combination into a quadrangle to obtain a plurality of quadrangles;
selecting a quadrangle corresponding to the display device from the plurality of quadrangles according to the intersection ratio of each side in each quadrangle to the straight line segment forming the side;
and carrying out perspective transformation on the quadrangle corresponding to the display equipment to obtain a rectangular frame of the display equipment in the image.
3. The method according to claim 2, wherein the selecting the quadrangle of the corresponding display device from the plurality of quadrangles according to the intersection ratio of each side of each quadrangle to the straight line segment forming the side comprises:
and calculating the sum of the intersection ratio of each side contained in each quadrangle and the straight line segment forming each side, and selecting the quadrangle with the maximum sum as the quadrangle corresponding to the display equipment.
4. The method of claim 2, wherein the cross-over ratio is:
iou = (l_A ∩ l_B) / (l_A ∪ l_B)
wherein iou represents the intersection ratio, l_A represents a side, l_B represents the straight line segment forming side l_A, l_A ∩ l_B represents the length of the overlapping part of side l_A and segment l_B, and l_A ∪ l_B represents their combined length.
5. The method of claim 2, wherein the dividing each straight line segment into an upper set of frames, a lower set of frames, a left set of frames, and a right set of frames comprises:
taking a straight line segment with an included angle of less than 45 degrees with the horizontal direction in the straight line segments contained in the image as a horizontal straight line segment, and taking a straight line segment with an included angle of more than 45 degrees with the horizontal direction as a vertical straight line segment;
dividing a horizontal straight line segment with a midpoint positioned above the center point of the image into an upper frame group, dividing a horizontal straight line segment with a midpoint positioned below the center point of the image into a lower frame group, dividing a vertical straight line segment with a midpoint positioned on the left side of the center point of the image into a left frame group, and dividing a vertical straight line segment with a midpoint positioned on the right side of the center point of the image into a right frame group.
6. The method of claim 2, wherein the detecting the straight line of the image to obtain the straight line segment included in the image comprises:
carrying out binarization on the image according to the color of a rectangular frame of the display equipment to obtain a binarized image;
and carrying out linear detection on the binary image to obtain a linear segment contained in the image.
7. The method of claim 2,
the perspective transformation of the quadrangle of the corresponding display device comprises:
calibrating the central point of the image;
after calibration, carrying out perspective transformation on the quadrangle of the corresponding display equipment;
the displaying the identifier at the corresponding display position of the display device according to the coordinate of the central point of the image in the coordinate system established by the rectangular frame comprises:
and determining the central point of the image in the image after the quadrangle of the corresponding display equipment is subjected to perspective transformation through the detection of the calibration, acquiring the coordinate of the central point of the image under a coordinate system established by the rectangular frame, and displaying the identifier at the corresponding display position of the display equipment according to the coordinate.
8. The method according to claim 1, wherein the displaying the identifier at the corresponding display position of the display device according to the coordinates of the central point of the image in the coordinate system established by the rectangular frame comprises:
and determining the display position of the identifier according to the coordinate average value of the coordinate of the center point of the current frame image in the coordinate system established by the rectangular frame and the coordinate of the center point of the previous N frame images in the coordinate system established by the corresponding rectangular frame.
9. A display device for performing the method of any one of claims 1-8, comprising:
the acquisition module is used for acquiring the image acquired by the image acquisition device;
the detection module is used for detecting and obtaining a rectangular frame of the display equipment in the image;
and the display module is used for displaying the identifier at the corresponding display position according to the coordinate of the central point of the image under the coordinate system established by the rectangular frame.
10. A display system comprising a display device as claimed in claim 9 and an image acquisition apparatus for acquiring images.
11. The system of claim 10, wherein the image capture device is disposed at an end of a pen-shaped housing.
12. The system of claim 10, wherein the display device is an electronic whiteboard device.
13. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-8 when executing the program.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-8.
CN202110039053.4A 2021-01-12 2021-01-12 Display method, display device and display system Active CN112882612B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110039053.4A CN112882612B (en) 2021-01-12 2021-01-12 Display method, display device and display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110039053.4A CN112882612B (en) 2021-01-12 2021-01-12 Display method, display device and display system

Publications (2)

Publication Number Publication Date
CN112882612A true CN112882612A (en) 2021-06-01
CN112882612B CN112882612B (en) 2024-01-23

Family

ID=76044802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110039053.4A Active CN112882612B (en) 2021-01-12 2021-01-12 Display method, display device and display system

Country Status (1)

Country Link
CN (1) CN112882612B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060023111A1 (en) * 2004-07-28 2006-02-02 The University Of Maryland Device using a camera and light polarization for the remote displacement of a cursor on a display
CN101464751A (en) * 2009-01-06 2009-06-24 杭州华银视讯科技有限公司 Electronic white board system based on image detection and detection method thereof
CN101730876A (en) * 2007-05-26 2010-06-09 李汶基 Pointing device using camera and outputting mark
KR20100082171A (en) * 2009-01-08 2010-07-16 (주)유플로우 Apparatus for screen remote controlling
CN103777751A (en) * 2012-10-25 2014-05-07 三星电子株式会社 A method for displaying a cursor on a display and system performing the same
CN106325725A (en) * 2015-06-25 2017-01-11 中兴通讯股份有限公司 Touch screen control method and device, and mobile terminal
CN109117794A (en) * 2018-08-16 2019-01-01 广东工业大学 A kind of moving target behavior tracking method, apparatus, equipment and readable storage medium storing program for executing

Also Published As

Publication number Publication date
CN112882612B (en) 2024-01-23

Similar Documents

Publication Publication Date Title
CN112348815B (en) Image processing method, image processing apparatus, and non-transitory storage medium
US10083522B2 (en) Image based measurement system
US10657600B2 (en) Systems and methods for mobile image capture and processing
US8120665B2 (en) Image processing method and apparatus, digital camera, and recording medium recording image processing program
US8265393B2 (en) Photo-document segmentation method and system
US10198661B2 (en) System for determining alignment of a user-marked document and method thereof
US7342572B2 (en) System and method for transforming an ordinary computer monitor into a touch screen
US20040165786A1 (en) System and method for converting whiteboard content into an electronic document
US9495735B2 (en) Document unbending systems and methods
AU2011250829B2 (en) Image processing apparatus, image processing method, and program
US20120320427A1 (en) Image processing method, image processing device and scanner
AU2011250827B2 (en) Image processing apparatus, image processing method, and program
CN102236784A (en) Screen area detection method and system
CN109803172B (en) Live video processing method and device and electronic equipment
US20180182088A1 (en) Automatic Detection, Counting, and Measurement of Lumber Boards Using a Handheld Device
CN106295644A (en) Symbol Recognition and device
CN104240259B (en) High photographing instrument voucher intelligence cutting edge correction system and method based on contours segmentation
CN112433641B (en) Implementation method for automatic calibration of desktop prop interaction system of multiple RGBD depth sensors
CN103400387A (en) Method and device for adsorbing line segment in image, as well as method and device for constructing polygon
CN104933430B (en) A kind of Interactive Image Processing method and system for mobile terminal
CN112882612B (en) Display method, display device and display system
CN115457055A (en) Illuminance meter value identification method, electronic device, and storage medium
CN109101960A (en) Identity text information detection method and device
EP3872707A1 (en) Automatic detection, counting, and measurement of lumber boards using a handheld device
JP2014119965A (en) Information terminal equipment and program and recognition system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant