CN113395482A - Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof - Google Patents


Info

Publication number
CN113395482A
Authority
CN
China
Prior art keywords
color
video
intelligent
playing
risk
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010170850.1A
Other languages
Chinese (zh)
Inventor
张大庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pinghu Laidun Optical Instrument Manufacturing Co ltd
Original Assignee
Pinghu Laidun Optical Instrument Manufacturing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pinghu Laidun Optical Instrument Manufacturing Co., Ltd.
Priority to CN202010170850.1A
Publication of CN113395482A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The invention discloses a color-related intelligent two-dimensional video device comprising a video generation unit. The video generation unit is used for concatenating a plurality of clearest images generated by a microscope imaging system into a video, wherein the clearest images respectively correspond to different color values of a variable color light. The color-related intelligent two-dimensional video device may further include an intelligent recognition unit 520, a marking unit 530, a display unit 540, and an intelligent play control unit 550. The invention has at least the following advantages: the clearest images captured under the microscope field of view at different color values of the variable color light are stitched, displayed on a screen, and combined into a video, so that the full view of the studied object under the microscopic field of view can be conveniently examined and regions of interest can be conveniently studied.

Description

Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof
Technical Field
The invention relates to the technical field of imaging, and in particular to a color-related intelligent two-dimensional video device based on images captured by a microscope and a color-related intelligent two-dimensional video playing method thereof.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
Microscopes are used to study images of various organisms or cells at the microscopic scale, have a wide range of applications, and are among the most important instruments in biology and medicine. A microscope is a tool for observing microscopic objects, but its field of view is very limited: only a small area can be observed at a time. For a relatively large object or region, the full appearance of the object cannot be obtained directly from the microscope, so the microscope must be adjusted repeatedly to obtain images of different areas, and panoramic observation within a single field of view is usually not possible.
In addition, a microscope relies on illumination to reveal an object, and in practice the variable color light best suited to each object differs; owing to the structure of a conventional microscope itself, it is difficult to obtain a clear photograph directly. Without a clear image, all subsequent study becomes meaningless.
It should be noted that the above background description is only for the sake of clarity and complete description of the technical solutions of the present invention and for the understanding of those skilled in the art. Such solutions are not considered to be known to the person skilled in the art merely because they have been set forth in the background section of the invention.
Disclosure of Invention
In order to overcome the defects in the prior art, embodiments of the present invention provide a color-related intelligent two-dimensional video device and a color-related intelligent two-dimensional video playing method thereof, so that the texture and the full view of different areas of an object under study can be viewed panoramically under variable color light of different colors generated by the microscope imaging system.
The embodiment of the application discloses: a color-related intelligent two-dimensional video device comprising a video generation unit for concatenating a plurality of clearest images generated by a microscope imaging system into a video, wherein the clearest images respectively correspond to different color values of a variable color light.
Further, the color-related intelligent two-dimensional video device further comprises an intelligent recognition unit, coupled to the video generation unit, for recognizing the content of the video according to the different color values to generate a low-risk segment and a high-risk segment, wherein the high-risk segment includes a tumor tissue image.
Further, the color-related intelligent two-dimensional video device further comprises a marking unit, coupled to the intelligent recognition unit, for adding a marker to a specific position of the high-risk segment of the video to generate a high-risk marked segment, wherein the specific position corresponds to the tumor tissue image.
Further, the color-related intelligent two-dimensional video device further comprises a display unit, coupled to the video generation unit, for playing the video so that an operator can quickly view the plurality of clearest images in the video, which respectively correspond to the different color values of the variable color light.
Further, the color-related intelligent two-dimensional video device further comprises an intelligent play control unit, coupled to the marking unit and the display unit, for controlling the display unit to play the low-risk segment of the video at a first play speed and to play the high-risk marked segment of the video at a second play speed, wherein the first play speed is greater than the second play speed.
The embodiment of the application also discloses: a color-related intelligent two-dimensional video playing method comprising the following step: concatenating a plurality of clearest images generated by a microscope imaging system into a video, wherein the clearest images respectively correspond to different color values of a variable color light.
Further, the method further comprises: recognizing the content of the video according to the different color values to generate a low-risk segment and a high-risk segment, wherein the high-risk segment includes a tumor tissue image.
Further, the method further comprises: adding a marker to a specific position of the high-risk segment of the video to generate a high-risk marked segment, wherein the specific position corresponds to the tumor tissue image.
Further, the method further comprises: playing the video so that an operator can quickly view the plurality of clearest images in the video, which respectively correspond to the different color values of the variable color light.
Further, the method further comprises: playing the low-risk segment of the video at a first play speed and playing the high-risk marked segment of the video at a second play speed, wherein the first play speed is greater than the second play speed.
By means of the above technical scheme, the invention has the following beneficial effects. The clearest images captured under the microscope field of view at different color values of the variable color light are stitched, displayed on a screen, and combined into a video, so that the full view of the studied object under the microscopic field of view can be conveniently examined and regions of interest can be conveniently studied. In addition, the color-related intelligent two-dimensional video device and the color-related intelligent two-dimensional video playing method thereof can automatically adjust the playing speed according to the content of the video: when the video is detected to be a low-risk segment, the low-risk segment is played at a faster speed; when the video is detected to be a high-risk segment including a tumor tissue image, the high-risk segment and its markers are played at a slower speed, so that the observer can easily and clearly see the tumor tissue and its specific position.
In order to make the aforementioned and other objects, features and advantages of the invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic diagram of a microscope imaging system in an embodiment of the invention.
FIG. 2 is a block diagram of a microscope imaging system in an embodiment of the invention.
FIG. 3 is a schematic view of an image captured by the imaging unit.
FIG. 4 is a block diagram of a color-related intelligent two-dimensional video device according to a first embodiment of the invention.
FIG. 5 is a block diagram of a color-related intelligent two-dimensional video device according to a second embodiment of the invention.
FIG. 6 is a flowchart of a video playing method according to an embodiment of the invention.
Reference numerals of the above figures: 10. microscope imaging system; 110. object stage (carrier platform); 120. variable color light emitting source; 130. user input interface; 140. control unit; 150. imaging unit; 160. calculating unit; 170. determining unit; 410. video generation unit; OB1. target object; R_RANGE, G_RANGE, B_RANGE. three color ranges; ΔR, ΔG, ΔB. three color adjustment values; R_MAX, G_MAX, B_MAX. three maximum color values; R_MIN, G_MIN, B_MIN. three minimum color values; AREA11-AREA35. areas; S610-S650. steps; 40, 50. color-related intelligent two-dimensional video devices; 520. intelligent recognition unit; 530. marking unit; 540. display unit; 550. intelligent play control unit.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, in the description of the present invention, the terms "first", "second", and the like are used for descriptive purposes only and for distinguishing similar objects, and are not to be construed as indicating or implying relative importance. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
Please refer to fig. 1 and fig. 2. Fig. 1 is a schematic diagram of a microscope imaging system 10 in an embodiment of the invention, and fig. 2 is a block diagram of the microscope imaging system 10 in an embodiment of the invention. As shown in fig. 1 and fig. 2, the microscope imaging system 10 includes an object stage 110, a variable color light emitting source 120, a user input interface 130, a control unit 140, an imaging unit 150, a calculating unit 160, and a determining unit 170. The object stage 110 is used for placing an object OB1 to be observed; the variable color light emitting source 120 is located above the object stage 110 and is used for providing a variable color light to the object OB1. The user input interface 130 is used for a user to input three color ranges R_RANGE, G_RANGE, B_RANGE and three color adjustment values ΔR, ΔG, ΔB of the variable color light, wherein the three color ranges R_RANGE, G_RANGE, B_RANGE respectively include three maximum color values R_MAX, G_MAX, B_MAX and three minimum color values R_MIN, G_MIN, B_MIN. The control unit 140 is coupled to the variable color light emitting source 120 and the user input interface 130, and is configured to adjust the variable color light emitting source 120 according to the three color ranges R_RANGE, G_RANGE, B_RANGE and the three color adjustment values ΔR, ΔG, ΔB input by the user. The imaging unit 150 is used to capture the object OB1 and generate a plurality of images per second. When the control unit 140 adjusts the variable color light of the variable color light emitting source 120 according to the three color ranges R_RANGE, G_RANGE, B_RANGE and the three color adjustment values ΔR, ΔG, ΔB, the imaging unit 150 captures a plurality of images at the different color values of the variable color light. The calculating unit 160 is coupled to the imaging unit 150 and calculates the sharpness of the plurality of images. The determining unit 170 is coupled to the calculating unit 160 and determines a clearest image from the plurality of images.
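For readers who prefer a concrete representation, the following Python sketch, which is not part of the patent disclosure, shows one way the per-channel ranges and adjustment values entered through the user input interface 130 could be organized; the class names ColorRange and SweepSettings are placeholders chosen here.

```python
# Not part of the patent disclosure: a small sketch of how the inputs received
# through the user input interface 130 could be organized. The class names
# ColorRange and SweepSettings are placeholders chosen for this illustration.
from dataclasses import dataclass

@dataclass
class ColorRange:
    maximum: int  # e.g. R_MAX
    minimum: int  # e.g. R_MIN

@dataclass
class SweepSettings:
    r_range: ColorRange
    g_range: ColorRange
    b_range: ColorRange
    delta_r: int  # corresponds to the adjustment value for R
    delta_g: int  # corresponds to the adjustment value for G
    delta_b: int  # corresponds to the adjustment value for B

# Example: a full-range sweep in steps of 255 per channel, as in the embodiment below.
settings = SweepSettings(ColorRange(255, 0), ColorRange(255, 0), ColorRange(255, 0),
                         255, 255, 255)
print(settings)
```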
In one possible embodiment, the variable color light emitting source 120 may comprise a multi-color LED light source, and the multi-color LED light source comprises at least R, G, B LED lamps, and the modulation frequency of the variable color light emitting source 120 is 40 kHz.
In a possible embodiment, the three color ranges R_RANGE, G_RANGE, B_RANGE include three maximum color values R_MAX, G_MAX, B_MAX and three minimum color values R_MIN, G_MIN, B_MIN. In this case, the control unit 140 first adjusts the variable color light of the variable color light emitting source 120 to the three maximum color values R_MAX, G_MAX, B_MAX, and then, after every predetermined time interval, gradually decreases the variable color light according to one of the three color adjustment values until the variable color light is adjusted to the three minimum color values R_MIN, G_MIN, B_MIN. For example, the user can set all three maximum color values of the three color ranges R_RANGE, G_RANGE, B_RANGE of the variable color light to 255, all three minimum color values to 0, and all three color adjustment values ΔR, ΔG, ΔB to 255. The control unit 140 first adjusts the color values (R, G, B) of the variable color light emitting source 120 to (255, 255, 255), and the imaging unit 150 captures a plurality of images of the object OB1 corresponding to the color values (255, 255, 255). Then, after every predetermined time interval (for example, every 30 seconds, or after the imaging unit 150 has captured all the images of the object OB1), the control unit 140 decrements one color value of the variable color light emitting source 120 by 255 (one of the three color adjustment values ΔR, ΔG, ΔB) each time, sequentially adjusting the color values to (0,255,255), (255,0,255), (255,255,0), (0,0,255), (0,255,0), (255,0,0), and (0,0,0), until all three color values of the variable color light emitting source 120 are adjusted to 0 (the three minimum color values R_MIN, G_MIN, B_MIN). At each of these color values, the imaging unit 150 likewise captures a plurality of images of the object OB1. The three maximum color values, the three minimum color values, the three color adjustment values, and the predetermined time interval above are only exemplary and are not limitations of the present invention; these setting values can be designed differently according to actual requirements and still fall within the scope of the invention.
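As an illustration only, the following Python sketch enumerates the color values such a sweep would visit when each channel steps from its maximum down to its minimum by its adjustment value. The helper name color_sweep and the exact ordering of the triples are choices made here, not part of the patent, which only fixes the start (all maxima) and end (all minima) of the sweep.

```python
# Illustration only, not the patent's algorithm: enumerate the color values of a
# sweep from the per-channel maxima down to the minima in steps of the adjustment
# values. Helper name and triple ordering are assumptions made for this sketch.
from itertools import product

def color_sweep(max_rgb, min_rgb, delta_rgb):
    levels = []
    for mx, mn, d in zip(max_rgb, min_rgb, delta_rgb):
        # d == 0 means the channel is held constant at its maximum
        levels.append([mx] if d == 0 else list(range(mx, mn - 1, -d)))
    # Order the triples from the all-maximum value down to the all-minimum value.
    return sorted(product(*levels), key=sum, reverse=True)

for rgb in color_sweep((255, 255, 255), (0, 0, 0), (255, 255, 255)):
    print(rgb)  # prints 8 color values, starting with (255, 255, 255) and ending with (0, 0, 0)
```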
It is to be noted that only one of the color values is decremented by its color adjustment value ΔR, ΔG, or ΔB at a time when the color is changed. In general, when the color value (R, G, B) of the variable color light is (255,255,255), the corresponding color is white, and all three LED lamps (R_LED, G_LED, B_LED) emit light together; when the color value is (0,255,255), the corresponding color is cyan, and the two LED lamps G_LED and B_LED emit light; when the color value is (255,0,255), the corresponding color is magenta, and the two LED lamps R_LED and B_LED emit light; when the color value is (255,255,0), the corresponding color is yellow, and the two LED lamps R_LED and G_LED emit light; when the color value is (0,0,255), the corresponding color is blue, and only the B_LED lamp emits light; when the color value is (0,255,0), the corresponding color is green, and only the G_LED lamp emits light; when the color value is (255,0,0), the corresponding color is red, and only the R_LED lamp emits light; when the color value is (0,0,0), the corresponding color is black, and none of the LED lamps emits light.
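A purely illustrative helper, not defined in the patent, makes the same mapping explicit by reporting which LED lamps are driven for a given color value:

```python
# Purely illustrative helper, not defined in the patent: report which of the three
# LED lamps are driven for a given color value of the variable color light.
def lit_leds(rgb):
    names = ("R_LED", "G_LED", "B_LED")
    return [name for name, value in zip(names, rgb) if value > 0]

print(lit_leds((0, 255, 255)))  # ['G_LED', 'B_LED'] -> cyan
print(lit_leds((0, 0, 255)))    # ['B_LED']          -> blue
print(lit_leds((0, 0, 0)))      # []                 -> black, no lamp lit
```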
Referring to fig. 3, fig. 3 is a schematic diagram of an image captured by the imaging unit 150. As shown in fig. 3, the imaging unit 150 divides the object OB1 into m x n regions (e.g., 3 x 5 regions AREA11 to AREA35, each of size 16 x 16) and captures p block images for each region (e.g., the first region AREA11), wherein the p block images of a region are captured under the same color value of the variable color light. In the above example, in which the three maximum color values R_MAX, G_MAX, B_MAX are all 255, the three minimum color values R_MIN, G_MIN, B_MIN are all 0, and the three color adjustment values ΔR, ΔG, ΔB are all 255 (8 color values in total), 1600 (8 x 200) block images of the first region AREA11 are captured, and these 1600 block images correspond to the different color values (255,255,255), (0,255,255), (255,0,255), (255,255,0), (0,0,255), (0,255,0), (255,0,0), and (0,0,0). By analogy, 1600 block images are captured for the second region AREA12, and so on, until 1600 block images have been captured for every region (all of the 3 x 5 regions AREA11 to AREA35). That is, a total of m x n x p block images are obtained at each color value; in the above example, a total of 3 x 5 x 200 block images are captured at each color value.
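A small arithmetic check of the worked example above (the variable names below are introduced here for clarity and are not symbols used by the patent):

```python
# Arithmetic check of the example: m = 3, n = 5 regions, p = 200 block images per
# region per color value, and 8 color values in the sweep.
m, n, p, num_colors = 3, 5, 200, 8

per_region_total = num_colors * p       # 1600 block images captured for AREA11 alone
per_color_total = m * n * p             # 3000 block images captured at each color value
grand_total = per_region_total * m * n  # 24000 block images over all regions and colors

print(per_region_total, per_color_total, grand_total)  # 1600 3000 24000
```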
It should be noted that m, n, and p are only exemplary and not limiting. The setting values can be designed into different values according to actual requirements, and the scope of the invention is also covered by the invention.
It should also be noted that, in the above embodiment, every one of the m x n regions uses the same three color ranges R_RANGE, G_RANGE, B_RANGE and the same three color adjustment values ΔR, ΔG, ΔB to capture its block images; this is only an example and is not a limitation of the present invention. In other embodiments, different regions may use different three color ranges R_RANGE, G_RANGE, B_RANGE (including three maximum color values R_MAX, G_MAX, B_MAX and three minimum color values R_MIN, G_MIN, B_MIN) or different three color adjustment values ΔR, ΔG, ΔB to capture their block images. In this case, different three color ranges and/or different three color adjustment values are input for the respective regions. For example, the three maximum color values R_MAX, G_MAX, B_MAX of the region AREA11 may be set to 255, 255, 255, its three minimum color values R_MIN, G_MIN, B_MIN to 0, 0, 0, and its three color adjustment values ΔR, ΔG, ΔB to 255, 255, 255; the three maximum color values of the region AREA12 may be set to 255, 0, 210, its three minimum color values to 51, 0, 180, and its three color adjustment values to 255, 0, 15; the three maximum color values of the region AREA13 may be set to 150, 180, 0, its three minimum color values to 75, 150, 0, and its three color adjustment values to 15, 5, 0; and so on. In this way, different three color ranges R_RANGE, G_RANGE, B_RANGE (different three maximum color values R_MAX, G_MAX, B_MAX and different three minimum color values R_MIN, G_MIN, B_MIN) can be applied to different regions.
In one possible embodiment, the three color ranges R_RANGE, G_RANGE, B_RANGE of the variable color light may include three maximum color values R_MAX, G_MAX, B_MAX and three corresponding decrement counts N_R, N_G, N_B. For example, the user may set the three maximum color values of the three color ranges to 255, 255, 255, the three decrement counts N_R, N_G, N_B to 3, 0, 2, and the three color adjustment values ΔR, ΔG, ΔB to 15, 0, 51. The control unit 140 first adjusts the variable color light of the variable color light emitting source 120 to (255, 255, 255), and the imaging unit 150 captures a plurality of images of the object OB1 corresponding to the color value (255, 255, 255). Then, after every predetermined time interval (for example, every 30 seconds, or after the imaging unit 150 has captured all the images of the object OB1), the control unit 140 decrements the variable color light emitting source 120 by one of the color adjustment values ΔR, ΔG, ΔB each time, sequentially adjusting the color values to (240,255,255), (225,255,255), (210,255,255), (255,255,204), (255,255,153), (240,255,255), (240,255,204), (240,255,153), (225,255,255), (225,255,204), (225,255,153), (210,255,255), (210,255,204), (210,255,153), until the color value of the variable color light emitting source 120 is adjusted to (210,255,153). In this process, the color value R takes the values 255, 240, 225, 210 in order (3 decrements in total), the color value G stays at 255 (0 decrements), and the color value B takes the values 255, 204, 153 in order (2 decrements in total).
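The following hedged sketch (the helper name sweep_by_count is a placeholder introduced here) enumerates each distinct color value implied by the "maximum value plus decrement count" form of the ranges. Note that it lists every combination exactly once, whereas the order of adjustment described above revisits some values along the way.

```python
# Hedged sketch, not the patent's algorithm: enumerate each distinct color value
# implied by a "maximum value plus decrement count" range definition.
from itertools import product

def sweep_by_count(max_rgb, n_steps, delta_rgb):
    levels = [[mx - i * d for i in range(n + 1)]
              for mx, n, d in zip(max_rgb, n_steps, delta_rgb)]
    return list(product(*levels))

values = sweep_by_count((255, 255, 255), (3, 0, 2), (15, 0, 51))
print(len(values))            # 12 = (3 + 1) x (0 + 1) x (2 + 1) distinct color values
print(values[0], values[-1])  # (255, 255, 255) ... (210, 255, 153)
```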
In another possible embodiment, the three color ranges of the variable color light may instead include the three minimum color values and three corresponding increment counts; this also falls within the scope of the present invention.
In the determination process, the determining unit 170 first takes a default first image as the main image to be finally output. Then, for each region, the calculating unit 160 calculates the sharpness of the p block images; if the sharpness of a specific block image is greater than the highest sharpness of that region on the main image, the highest sharpness value is updated and the determining unit 170 replaces the default block image of that region on the main image with the specific block image. For example, for the color value (255,255,255), the determining unit 170 first takes the first block images of all the regions AREA11 to AREA35 as the default main image to be finally output, where the first block images of all the regions correspond to the color value (255,255,255). Then, for the first region AREA11, the calculating unit 160 calculates the sharpness of its 200 block images; whenever the sharpness of a specific block image is greater than the highest sharpness of the first region AREA11 on the main image, the highest sharpness value is updated, and the determining unit 170 replaces the default block image (i.e., the first block image) of the first region AREA11 on the main image with that specific block image. When the sharpness of all 200 block images of the first region AREA11 has been compared, the determining unit 170 has determined a clearest block image for that region. By analogy, the calculating unit 160 calculates the sharpness of the 200 block images of every region AREA11 to AREA35, the determining unit 170 determines the clearest block image of each region, and these clearest block images are combined into the finally output main image. Thus, there is one clearest main image for each color value; in the first example, the invention generates 8 clearest main images (one for each of the 8 color values). Each main image is stitched together from the clearest block images of the regions AREA11 to AREA35.
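The per-region selection can be pictured with the following Python sketch. It is not the patent's implementation: the patent does not name a sharpness metric, so the variance of the Laplacian is used here purely as a common stand-in, and OpenCV and NumPy are assumed to be available.

```python
# Sketch only: pick the sharpest candidate block per region and stitch the winners
# into one main image. The sharpness metric is an assumption, not from the patent.
import numpy as np
import cv2

def sharpness(block):
    gray = cv2.cvtColor(block, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def clearest_main_image(blocks_per_region, grid_shape, block_size=16):
    """blocks_per_region maps (row, col) to the p candidate block images of that region."""
    rows, cols = grid_shape
    main = np.zeros((rows * block_size, cols * block_size, 3), dtype=np.uint8)
    for (r, c), candidates in blocks_per_region.items():
        best = max(candidates, key=sharpness)  # keep the block with the highest sharpness
        main[r * block_size:(r + 1) * block_size,
             c * block_size:(c + 1) * block_size] = best
    return main

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 3 x 5 grid with 4 candidate 16 x 16 blocks per region (p would be 200 above)
    blocks = {(r, c): [rng.integers(0, 256, (16, 16, 3), dtype=np.uint8) for _ in range(4)]
              for r in range(3) for c in range(5)}
    print(clearest_main_image(blocks, (3, 5)).shape)  # (48, 80, 3)
```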
Referring to fig. 4, fig. 4 is a block diagram of a color-related intelligent two-dimensional video device 40 according to a first embodiment of the invention. The color-related intelligent two-dimensional video device 40 includes a video generation unit 410 for concatenating a plurality of clearest images generated by the microscope imaging system 10 into a video, wherein the clearest images respectively correspond to different color values of a variable color light.
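As a minimal illustration of this step (the output file name, frame rate, and codec below are assumptions, not values specified by the patent), the clearest main images, one per color value, can be written out as consecutive frames of a single video, for example with OpenCV:

```python
# Minimal illustration of the video generation step; file name, frame rate and
# codec are assumptions made for this sketch, not values from the patent.
import cv2

def images_to_video(clearest_images, out_path="color_sweep.avi", fps=1.0):
    height, width = clearest_images[0].shape[:2]
    writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"MJPG"), fps, (width, height))
    for frame in clearest_images:  # one frame per color value of the variable color light
        writer.write(frame)
    writer.release()
    return out_path
```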
Referring to fig. 5, fig. 5 is a block diagram of a color-related intelligent two-dimensional video device 50 according to a second embodiment of the invention. In the present embodiment, the color-related intelligent two-dimensional video device 50 includes a video generation unit 410, an intelligent recognition unit 520, a marking unit 530, a display unit 540, and an intelligent play control unit 550. The intelligent recognition unit 520 is coupled to the video generation unit 410 and is configured to recognize the content of the video according to the different color values to generate a low-risk segment and a high-risk segment, wherein the high-risk segment includes a tumor tissue image. The marking unit 530 is coupled to the intelligent recognition unit 520 and is configured to add a marker to a specific location of the high-risk segment of the video to generate a high-risk marked segment, wherein the specific location corresponds to the tumor tissue image. The display unit 540 is coupled to the video generation unit 410 and is configured to play the video so that an operator can quickly view the plurality of clearest images in the video, which respectively correspond to the color values of the variable color light. The intelligent play control unit 550 is coupled to the marking unit 530 and the display unit 540 and is configured to control the display unit 540 to play the low-risk segment of the video at a first play speed and the high-risk marked segment of the video at a second play speed, wherein the first play speed is greater than the second play speed.
In other words, the color-related intelligent two-dimensional video device 50 can automatically adjust the playing speed according to the content of the video: when the video is detected to be a low-risk segment, the low-risk segment is played at a faster speed; when the video is detected to be a high-risk segment including a tumor tissue image, the high-risk segment and its markers are played at a slower speed, so that the observer can easily and clearly see the tumor tissue and its specific position.
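A sketch of the play-speed rule alone is given below; how segments are classified is left abstract here (the segment labels are assumed to be given), and the two speed values are placeholders rather than values fixed by the patent.

```python
# Sketch of the play-speed rule only: low-risk segments play faster than high-risk
# marked segments. Classification of segments is assumed to be provided elsewhere.
def playback_plan(segments, first_speed=4.0, second_speed=0.5):
    """segments: list of (segment_id, risk) pairs with risk in {'low', 'high'}.
    Returns (segment_id, speed) pairs; the first play speed is greater than the second."""
    return [(seg, first_speed if risk == "low" else second_speed)
            for seg, risk in segments]

print(playback_plan([("clip_1", "low"), ("clip_2", "high"), ("clip_3", "low")]))
```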
Referring to fig. 6, fig. 6 is a flowchart of a video playing method according to an embodiment of the present invention. The method comprises the following steps:
S610: concatenating a plurality of clearest images generated by a microscope imaging system into a video, wherein the clearest images respectively correspond to different color values of the variable color light.
S620: recognizing the content of the video according to the different color values to generate a low-risk segment and a high-risk segment, wherein the high-risk segment includes a tumor tissue image.
S630: adding a marker to a specific location of the high-risk segment of the video to generate a high-risk marked segment, wherein the specific location corresponds to the tumor tissue image.
S640: playing the video so that an operator can quickly view the plurality of clearest images in the video, which respectively correspond to the different color values of the variable color light.
S650: playing the low-risk segment of the video at a first play speed and playing the high-risk marked segment of the video at a second play speed, wherein the first play speed is greater than the second play speed.
By means of the above technical scheme, the invention has the following beneficial effects. The clearest images captured under the microscope field of view at different color values of the variable color light are stitched, displayed on a screen, and combined into a video, so that the full view of the studied object under the microscopic field of view can be conveniently examined and regions of interest can be conveniently studied. In addition, the color-related intelligent two-dimensional video device and the color-related intelligent two-dimensional video playing method thereof can automatically adjust the playing speed according to the content of the video: when the video is detected to be a low-risk segment, the low-risk segment is played at a faster speed; when the video is detected to be a high-risk segment including a tumor tissue image, the high-risk segment and its markers are played at a slower speed, so that the observer can easily and clearly see the tumor tissue and its specific position.
The principles and implementations of the invention have been explained herein using specific embodiments; the description of the embodiments is only intended to help in understanding the method and core idea of the invention. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A color-related intelligent two-dimensional video device, comprising:
a video generation unit, for concatenating a plurality of clearest images generated by a microscope imaging system into a video, wherein the clearest images respectively correspond to different color values of a variable color light.
2. The color-related intelligent two-dimensional video device according to claim 1, further comprising:
an intelligent recognition unit, coupled to the video generation unit, for recognizing the content of the video according to the different color values to generate a low-risk segment and a high-risk segment, wherein the high-risk segment includes a tumor tissue image.
3. The color-related intelligent two-dimensional video device according to claim 2, further comprising:
a marking unit, coupled to the intelligent recognition unit, for adding a marker to a specific location of the high-risk segment of the video to generate a high-risk marked segment, wherein the specific location corresponds to the tumor tissue image.
4. The color-related intelligent two-dimensional video device according to claim 3, further comprising:
a display unit, coupled to the video generation unit, for playing the video so that an operator can quickly view the plurality of clearest images in the video, wherein the images respectively correspond to the different color values of the variable color light.
5. The color-related intelligent two-dimensional video device according to claim 4, further comprising:
an intelligent play control unit, coupled to the marking unit and the display unit, for controlling the display unit to play the low-risk segment of the video at a first play speed and to play the high-risk marked segment of the video at a second play speed, wherein the first play speed is greater than the second play speed.
6. A color-related intelligent two-dimensional video playing method, comprising the following step:
concatenating a plurality of clearest images generated by a microscope imaging system into a video, wherein the clearest images respectively correspond to different color values of a variable color light.
7. The color-related intelligent two-dimensional video playing method according to claim 6, further comprising:
recognizing the content of the video according to the different color values to generate a low-risk segment and a high-risk segment, wherein the high-risk segment includes a tumor tissue image.
8. The color-related intelligent two-dimensional video playing method according to claim 7, further comprising:
adding a marker to a specific location of the high-risk segment of the video to generate a high-risk marked segment, wherein the specific location corresponds to the tumor tissue image.
9. The color-related intelligent two-dimensional video playing method according to claim 8, further comprising:
playing the video so that an operator can quickly view the plurality of clearest images in the video, wherein the images respectively correspond to the different color values of the variable color light.
10. The color-related intelligent two-dimensional video playing method according to claim 9, further comprising:
playing the low-risk segment of the video at a first play speed and playing the high-risk marked segment of the video at a second play speed, wherein the first play speed is greater than the second play speed.
CN202010170850.1A 2020-03-12 2020-03-12 Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof Pending CN113395482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010170850.1A CN113395482A (en) 2020-03-12 2020-03-12 Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010170850.1A CN113395482A (en) 2020-03-12 2020-03-12 Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof

Publications (1)

Publication Number Publication Date
CN113395482A 2021-09-14

Family

ID=77615661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010170850.1A Pending CN113395482A (en) 2020-03-12 2020-03-12 Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof

Country Status (1)

Country Link
CN (1) CN113395482A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040252884A1 (en) * 2003-06-12 2004-12-16 Fuji Xerox Co., Ltd. Methods for multisource color normalization
CN101076724A (en) * 2004-04-14 2007-11-21 美国医软科技公司 Liver disease diagnosis system, method and graphical user interface
WO2017169233A1 (en) * 2016-03-29 2017-10-05 ソニー株式会社 Imaging processing device, imaging processing method, computer program and electronic device
CN108227173A (en) * 2016-12-22 2018-06-29 阿诺德和里克特电影技术公司 Electron microscope


Similar Documents

Publication Publication Date Title
US7454065B2 (en) Specific point detecting method and device
US11494960B2 (en) Display that uses a light sensor to generate environmentally matched artificial reality content
US10820786B2 (en) Endoscope system and method of driving endoscope system
CN105979238A (en) Method for controlling global imaging consistency of multiple cameras
TW200838357A (en) Method and system for detecting effect of lighting device
CN116506993A (en) Light control method and storage medium
CN105791783B (en) Camera imaging color adjusting method and system
CN110956642A (en) Multi-target tracking identification method, terminal and readable storage medium
KR20130033331A (en) Sensibility lighting control apparatus and method
CN104795021A (en) LED display screen self-color-regulation method
CN109922574A (en) Light efficiency adjustment control method, system and the storage medium of LED Landscape Lighting
CN113395482A (en) Color-related intelligent two-dimensional video device and two-dimensional video playing method thereof
CN110290313B (en) Method for guiding automatic focusing equipment to be out of focus
KR20150111627A (en) control system and method of perforamance stage using indexing of objects
CN113395508A (en) Color-related intelligent 3D video device and intelligent 3D video playing method thereof
CN113391439A (en) Color-related microscope imaging system and control method thereof
US7812862B2 (en) White balance adjustment method for a digital image capturing device
KR20170081519A (en) Apparatus for controling lighting using facial expression and method thereof
TW202327286A (en) Interference mitigation based on selected signal patterns
US20200394757A1 (en) Method for marking an image region in an image of an image sequence
CN110874862A (en) System and method for three-dimensional reconstruction
CN110874863A (en) Three-dimensional reconstruction method and system for three-dimensional reconstruction
EP3140982B1 (en) Device with a camera and a screen
CN109862256B (en) Device and method for visually positioning belt fiber
CN113395507A (en) Intelligent 3D video device with brightness correlation and intelligent 3D video playing method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210914