CN112468845A - Processing method and processing device for screen projection picture - Google Patents

Processing method and processing device for screen projection picture

Info

Publication number
CN112468845A
CN112468845A (application CN202011279096.1A)
Authority
CN
China
Prior art keywords
area
picture
screen projection
screen
areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011279096.1A
Other languages
Chinese (zh)
Inventor
盛伟明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202011279096.1A
Publication of CN112468845A
Legal status: Pending

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/23 - Processing of content or additional data; Elementary server operations; Server middleware
    • H04N 21/234 - Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N 21/2343 - Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H04N 21/4104 - Peripherals receiving signals from specially adapted client devices
    • H04N 21/4122 - Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 - Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 - Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 - Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 - End-user applications
    • H04N 21/485 - End-user interface for client configuration

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a method and a device for processing a screen projection picture, and belongs to the technical field of communication. The method for processing the screen projection picture comprises the following steps: dividing the screen projection picture into at least two areas according to the picture importance degree; determining an area identification for each of the at least two areas; and coding and transmitting the screen projection picture according to the area identification. The at least two areas comprise at least one or a combination of: a subtitle area, a background area, a static scene area and a dynamic scene area. The method and the device can improve the smoothness of the picture during screen projection.

Description

Processing method and processing device for screen projection picture
Technical Field
The application belongs to the technical field of communication, and particularly relates to a method and a device for processing a screen projection picture.
Background
The screen projection technology enables picture synchronization between two or more electronic devices. In the related art, screen projection may be realized in a push mode or a screen recording mode. In the screen recording mode, the electronic device serving as the screen projection terminal continuously records its screen and synchronously sends the recorded picture to the electronic device serving as the screen-projected terminal for playing. In the above process, the screen projection terminal needs to encode and transmit the screen projection picture, and the screen-projected terminal needs to decode and play the screen projection picture. The screen projection terminal is in communication connection with the screen-projected terminal so as to transmit and receive the screen projection picture.
In the process of implementing the present application, the inventors found that the related art has at least the following problem: when a screen projection device in the related art projects a screen, the smoothness of picture playback is not ideal, and audio-video desynchronization or picture stuttering easily occurs.
Disclosure of Invention
The embodiment of the application aims to provide a method and a device for processing a screen projection picture.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a method for processing a screen projection image, including: dividing a screen projection picture into at least two areas according to the importance degree of the picture; respectively determining a region identifier for each of at least two regions; coding and sending a screen projection picture according to the area identification; the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
In a second aspect, an embodiment of the present application provides another method for processing a screen projection image, where the screen projection image is divided into at least two regions according to importance of the image, and each of the at least two regions is identified by a determined region identifier, and the method includes: receiving a screen projection picture; decoding and playing the screen projection picture according to the area identification; the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
In a third aspect, an embodiment of the present application provides an apparatus for processing a screen projection image, including: the division identification module is used for dividing the screen projection picture into at least two areas according to the picture importance degree and respectively determining area identifications for each area of the at least two areas; the coding sending module is used for coding and sending the screen projection picture according to the area identifier; the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
In a fourth aspect, an embodiment of the present application provides another apparatus for processing a screen projection picture, where the screen projection picture is divided into at least two regions according to picture importance, and a region identifier is determined for each of the at least two regions, and the processing apparatus includes: the decoding playing module is used for receiving the screen projection picture, and decoding and playing the screen projection picture according to the area identifier; the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
In a fifth aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect or implement the steps of the method according to the second aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium on which a program or instructions are stored, which when executed by a processor, implement the steps of the method according to the first aspect or implement the steps of the method according to the second aspect.
According to the embodiment of the application, the screen projection picture is divided according to the picture importance degree, and an area identification is determined for each area in the screen projection picture. Therefore, the screen projection picture can be processed according to the area identification; for example, it can be coded and transmitted, or decoded and played, according to the area identification. Through this processing mode, the embodiment of the application can process different areas in the screen projection picture in different ways, so as to completely retain important areas (such as static scene areas or dynamic scene areas) and, where necessary, discard non-important areas (such as subtitle areas or background areas). In summary, on the basis of ensuring the definition of the screen projection picture, the embodiment of the application can improve the smoothness of transmission and playing of the screen projection picture and effectively avoid audio-video desynchronization or picture stuttering.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a first flowchart illustrating steps of a method for processing a screen projection picture according to an embodiment of the present application;
FIG. 2 is a second flowchart illustrating steps of a method for processing a screen projection picture according to an embodiment of the present application;
FIG. 3 is a third flowchart illustrating steps of a method for processing a screen projection picture according to an embodiment of the present application;
FIG. 4 is a fourth flowchart illustrating steps of a method for processing a screen projection picture according to an embodiment of the present application;
FIG. 5 is a fifth flowchart illustrating steps of a method for processing a screen projection picture according to an embodiment of the present application;
FIG. 6 is a first flowchart illustrating steps of another method for processing a screen projection picture according to an embodiment of the present application;
FIG. 7 is a second flowchart illustrating steps of another method for processing a screen projection picture according to an embodiment of the present application;
FIG. 8 is a flowchart illustrating steps of a method for processing a screen projection picture according to an embodiment of the present application;
FIG. 9 is a block diagram of the components of an electronic device according to an embodiment of the present application;
FIG. 10 is a block diagram of another electronic device according to an embodiment of the present application.
The correspondence between the reference numerals and the component names in FIG. 9 is as follows:
100: processing apparatus for a screen projection picture, 110: division identification module, 120: encoding and sending module, 130: signal monitoring module, 400: electronic device, 410: processor, 420: memory.
The correspondence between the reference numerals and the component names in FIG. 10 is as follows:
200: processing apparatus for a screen projection picture, 210: decoding and playing module, 400: electronic device, 410: processor, 420: memory.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the application are capable of operating in sequences other than those illustrated or described herein. In addition, "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
The method for processing a screen projection image, the apparatus 100 for processing a screen projection image, the apparatus 200 for processing a screen projection image, the electronic device 400 and the readable storage medium provided in the embodiments of the present application are described in detail below with reference to fig. 1 to 10.
As shown in fig. 1, an embodiment of the present application provides a method for processing a screen projection picture, which includes:
S102, dividing a screen projection picture into at least two areas according to the picture importance degree;
S104, determining an area identification for each of the at least two areas;
S106, coding and sending the screen projection picture according to the area identification.
For screen projection products in the related art, if the resources or bandwidth of the communication network between the screen projection terminal and the screen-projected terminal are insufficient, the fluency of the screen-projected terminal in playing the screen projection picture is affected. In addition, if the encoding processing performance of the screen projection terminal is low, or the decoding processing performance of the screen-projected terminal is low, the fluency of the screen-projected terminal in playing the screen projection picture is also affected. The above situations may cause audio-video desynchronization or picture stuttering during screen projection.
Therefore, in order to improve the fluency of the screen projection picture during playing, the embodiment of the application provides a method for processing the screen projection picture at the screen projection terminal. The screen projection terminal of the embodiment of the application may be an electronic device such as a mobile phone, a personal computer or a tablet computer. The screen projection terminal divides the screen projection picture into at least two areas according to the picture importance degree, and determines an area identification for each of the at least two areas. Therefore, the embodiment of the application can encode and transmit the screen projection picture according to the area identification.
According to the method and the device, different areas in the screen projection picture, including the subtitle area, the background area, the static scene area and the dynamic scene area, are processed in different ways, so that important areas in the screen projection picture can be fully retained during screen projection, and non-important areas can be discarded when necessary. Therefore, on the basis of ensuring the definition of the screen projection picture, the embodiment of the application can improve the smoothness of transmission and playing of the screen projection picture and effectively avoid audio-video desynchronization or picture stuttering.
For example, the screen projection picture may be divided into a background area and a scene area, a first area identification may be determined for the background area, and a second area identification may be determined for the scene area. The picture displayed in the scene area generally has a greater influence on the viewing experience, while the picture displayed in the background area has a smaller influence and therefore a lower relative importance. In the embodiment of the application, the importance degrees of different areas in the screen projection picture are marked by the area identifications, so that when the communication network bandwidth is insufficient or the decoding performance of the screen-projected terminal is poor, part of the content or data of the unimportant background area can be discarded to keep the picture smooth.
For example, the present embodiment may divide the screen projection picture into a background area, a scene area and a subtitle area. The scene area may be further divided into a static scene area and a dynamic scene area. The dynamic scene area is a dynamic picture display area for displaying, for example, characters or animals, and the static scene area is a static picture display area for displaying, for example, nearby scenery or props. Accordingly, according to the picture importance degree, the present embodiment determines a first area identification for the background area, a second area identification for the subtitle area, a third area identification for the static scene area, and a fourth area identification for the dynamic scene area. In this way, by means of the area identifications, the present embodiment can apply different processing to pictures with different importance degrees, such as the background area, the static scene area, the dynamic scene area and the subtitle area, so as to ensure that dynamic pictures such as characters, animals and subtitles are projected clearly, and appropriately discard the picture of the background area to keep the picture smooth.
In this embodiment, dividing the screen projection picture into areas according to picture importance makes it possible to discard non-important picture content while ensuring that important picture content is projected clearly, thereby obtaining a good screen projection effect.
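As a concrete illustration of the region division and identifier assignment described above, the following Python sketch partitions a captured frame into background, dynamic scene and subtitle regions and tags each with a numeric importance identifier. The coordinates, the identifier values and the function names are illustrative assumptions, not part of the patent; a real implementation would derive the regions from scene analysis of the recorded screen.

```python
from dataclasses import dataclass

# Assumed numeric identifiers: a higher value marks a more important region.
BACKGROUND, SUBTITLE, STATIC_SCENE, DYNAMIC_SCENE = 0, 1, 2, 3

@dataclass
class Region:
    x: int        # top-left corner of the region, in pixels
    y: int
    width: int
    height: int
    area_id: int  # one of the importance identifiers above

def divide_frame(frame_width: int, frame_height: int) -> list[Region]:
    """Toy partition: the whole frame as background, the centre as a dynamic
    scene, and a bottom strip as subtitles."""
    subtitle_h = frame_height // 8
    return [
        Region(0, 0, frame_width, frame_height, BACKGROUND),
        Region(frame_width // 4, frame_height // 4,
               frame_width // 2, frame_height // 2, DYNAMIC_SCENE),
        Region(0, frame_height - subtitle_h, frame_width, subtitle_h, SUBTITLE),
    ]

regions = divide_frame(1920, 1080)
```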
In some embodiments of the present application, determining a region identifier for each of at least two regions respectively includes: for each of the at least two regions, a region identifier corresponding to the picture importance of each region is determined.
In other words, as shown in fig. 2, the method for processing a screen projection picture according to the present embodiment includes:
S202, dividing a screen projection picture into at least two areas according to the picture importance degree;
S204, determining, for each of the at least two areas, an area identification corresponding to the picture importance degree of that area;
S206, coding and transmitting the screen projection picture according to the area identification.
For example, the at least two areas of the present embodiment include a background area and a scene area. The area identification corresponding to the background area is a first identification, the area identification corresponding to the scene area is a second identification, and the picture importance degree represented by the first identification is lower than that represented by the second identification. The present embodiment determines the area identification corresponding to each area according to the picture importance degree of that area, so as to ensure that the screen projection picture of each area is processed reasonably.
In some embodiments of the present application, the area identifier is a numerical identifier, and the numerical identifier of each area is positively correlated to the importance level of the picture of each area.
In other words, the present embodiment directly marks the picture importance degree of each area with a numerical value. For example, the area identification of the dynamic scene area has a higher value than that of the static scene area, the area identification of the subtitle area has a higher value than that of the background area, and the area identifications of the static scene area and the dynamic scene area each have a higher value than that of the background area.
Through the embodiment, each area in the screen projection picture can be quickly, efficiently and conveniently represented according to the picture importance degree, so that smooth transmission and playing of the screen projection picture are further guaranteed.
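A minimal sketch of such an ordering, using hypothetical numeric values, is given below; the assertions encode only the relations stated above (dynamic scene above static scene, and subtitle, static scene and dynamic scene each above background), and the concrete numbers are an assumption.

```python
# Hypothetical numeric area identifications ordered by picture importance.
AREA_ID = {"background": 0, "subtitle": 1, "static_scene": 2, "dynamic_scene": 3}

assert AREA_ID["dynamic_scene"] > AREA_ID["static_scene"] > AREA_ID["background"]
assert AREA_ID["subtitle"] > AREA_ID["background"]
```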
In some embodiments of the present application, encoding and sending the screen projection picture according to the area identification specifically includes: converting the screen projection picture into a data bit string; packaging the data bit string into a network abstraction layer unit with a head structure; adding an area identification field corresponding to the area identification in the head structure of the network abstraction layer unit; and sending the screen projection picture.
In other words, as shown in fig. 3, the method for processing a screen projection picture according to the present embodiment includes:
S302, dividing a screen projection picture into at least two areas according to the picture importance degree;
S304, determining an area identification for each of the at least two areas;
S306, converting the screen projection picture into a data bit string;
S308, packaging the data bit string into a network abstraction layer unit with a head structure;
S310, adding an area identification field corresponding to the area identification in the head structure of the network abstraction layer unit;
S312, sending the screen projection picture.
In the present embodiment, the H.26x video coding standard is used to encode, transmit, decode and play the screen projection picture. Encoding or decoding with the H.26x video coding standard employs a layered structure; that is, the functionality of the H.26x video coding standard is divided into two layers. The video coding layer (Video Coding Layer, VCL) is responsible for efficient representation of the video content. The network abstraction layer (Network Abstraction Layer, NAL) is responsible for packing, unpacking and abstracting the video data packets. The network abstraction layer unit (Network Abstraction Layer Unit, NALU) is the smallest packet unit of the network abstraction layer. In encoding or decoding with the H.26x video coding standard, the data bit string (String of Data Bits, SODB) is the most original encoded data. The raw byte sequence payload (Raw Byte Sequence Payload, RBSP) is formed by padding trailing bits after the data bit string for byte alignment. The video coding layer finally outputs the compressed and encoded data bit string. Before transmission or storage, the data of the video coding layer is mapped or encapsulated into network abstraction layer units: the network abstraction layer packs the data bit string into a raw byte sequence payload and then adds a head structure to form a network abstraction layer unit. Each network abstraction layer unit is a variable-length byte string containing certain syntax elements, and specifically includes a one-byte head structure for indicating the data type and a raw byte sequence payload of an integer number of bytes.
In the embodiment, use is made of the fact that the network abstraction layer unit has a head structure, and an area identification field corresponding to the area identification is added to the head structure. The area identification fields corresponding to the area identifications of different areas in the screen projection picture differ from each other. The head structure is used to indicate the data type, and when the network abstraction layer unit is decompressed, the head structure itself does not need to be decompressed. Therefore, by adding the area identification field corresponding to the area identification in the head structure, the screen-projected terminal can obtain the area attribute or the picture importance degree corresponding to a network abstraction layer unit before decompressing that unit. Therefore, the screen-projected terminal can decompress and play the screen projection picture in a targeted manner.
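The sketch below illustrates this idea of carrying the area identification in an easily readable place ahead of the payload. Note that the standard one-byte H.264-style NALU header has no such field; the extra byte after the header, the start-code handling and the helper names are assumptions made purely for illustration and are not the patent's concrete syntax.

```python
START_CODE = b"\x00\x00\x00\x01"

def build_nal_unit(rbsp: bytes, nal_unit_type: int, area_id: int) -> bytes:
    """Prefix the raw byte sequence payload with a start code, a standard-style
    one-byte header (forbidden bit 0, nal_ref_idc 3) and an assumed extra byte
    carrying the area identification."""
    header = bytes([(3 << 5) | (nal_unit_type & 0x1F)])
    area_field = bytes([area_id & 0xFF])   # assumed area identification field
    return START_CODE + header + area_field + rbsp

def read_area_id(nal_unit: bytes) -> int:
    """Recover the assumed area identification byte without touching the payload."""
    return nal_unit[len(START_CODE) + 1]

unit = build_nal_unit(b"\x88\x84\x21\x40", nal_unit_type=1, area_id=3)
assert read_area_id(unit) == 3
```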
In some embodiments of the present application, determining at least one area to be sent among the at least two areas specifically includes: determining the at least one area to be sent among the at least two areas according to the signal strength of the communication network.
In other words, as shown in fig. 4, the method for processing a screen projection picture according to the present embodiment includes:
S402, dividing a screen projection picture into at least two areas according to the picture importance degree;
S404, determining an area identification for each of the at least two areas;
S406, converting the screen projection picture into a data bit string;
S408, packaging the data bit string into a network abstraction layer unit with a head structure;
S410, adding an area identification field corresponding to the area identification in the head structure of the network abstraction layer unit;
S412, determining at least one area to be sent among the at least two areas according to the signal strength of the communication network;
S414, sending the network abstraction layer unit of the at least one area to be sent according to the area identification field corresponding to the area identification of the at least one area to be sent.
The present embodiment monitors the signal strength of the communication network in real time. When the signal strength of the communication network is poor, the screen projection terminal appropriately discards, according to the area identification fields in the network abstraction layer units, those network abstraction layer units whose area identification fields indicate low importance.
In some embodiments of the present application, sending the screen projection picture specifically includes: determining at least one area to be sent among the at least two areas; and sending the network abstraction layer unit of the at least one area to be sent according to the area identification field corresponding to the area identification of the at least one area to be sent.
In other words, as shown in fig. 5, the method for processing a screen projection picture according to the present embodiment includes:
S502, dividing a screen projection picture into at least two areas according to the picture importance degree;
S504, determining an area identification for each of the at least two areas;
S506, converting the screen projection picture into a data bit string;
S508, packaging the data bit string into a network abstraction layer unit with a head structure;
S510, adding an area identification field corresponding to the area identification in the head structure of the network abstraction layer unit;
S512, determining at least one area to be sent among the at least two areas;
S514, sending the network abstraction layer unit of the at least one area to be sent according to the area identification field corresponding to the area identification of the at least one area to be sent.
The purpose of the present embodiment is to have the screen projection terminal transmit the screen projection picture in a targeted manner. For example, if the communication network bandwidth is insufficient or the decoding performance of the screen-projected terminal is poor, the present embodiment selects and transmits the network abstraction layer units of the subtitle area, the static scene area and the dynamic scene area, and discards the network abstraction layer unit of the background area.
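The selection step described in the two embodiments above can be pictured with the small filter below: when the link is reported as weak, units whose (assumed) area identification byte falls below a threshold are dropped before transmission. The byte offset, the threshold and the helper names are illustrative assumptions rather than the patent's concrete rules.

```python
from typing import Iterable

BACKGROUND = 0  # assumed lowest-importance identifier, eligible for discarding

def read_area_id(nal_unit: bytes) -> int:
    # assumed layout: 4-byte start code, 1-byte header, then the area byte
    return nal_unit[5]

def select_units(units: Iterable[bytes], signal_is_good: bool,
                 min_area_id_when_weak: int = 1) -> list[bytes]:
    """Keep every unit when the network is good; otherwise keep only units
    whose area identification meets the importance threshold."""
    if signal_is_good:
        return list(units)
    return [u for u in units if read_area_id(u) >= min_area_id_when_weak]

kept = select_units([b"\x00\x00\x00\x01\x61\x00payload",
                     b"\x00\x00\x00\x01\x61\x03payload"],
                    signal_is_good=False)   # only the area_id == 3 unit survives
```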
As shown in fig. 6, an embodiment of the present application provides another method for processing a screen projection picture, which includes:
S602, receiving a screen projection picture;
S604, decoding and playing the screen projection picture according to the area identification.
The method for processing the screen projection picture is applicable to the screen-projected terminal. The screen-projected terminal of the embodiment of the application may be an electronic device such as a mobile phone, a personal computer, a tablet computer or a television. On the screen projection terminal side, the screen projection picture has been divided into at least two areas according to the picture importance degree, and an area identification has been determined for each of the at least two areas. The at least two areas comprise at least one or a combination of: a subtitle area, a background area, a static scene area and a dynamic scene area. Correspondingly, the screen-projected terminal receives the screen projection picture from the screen projection terminal, and decodes and plays the screen projection picture according to the area identifications made by the screen projection terminal.
Because the area identifications corresponding to the areas in the screen projection picture represent the attributes or importance degrees of the areas, the screen-projected terminal of the embodiment of the application can perform decoding processing in different ways on different areas of the screen projection picture. Therefore, on the basis of ensuring the definition of the screen projection picture, the embodiment of the application can improve the smoothness of transmission and playing of the screen projection picture and effectively avoid audio-video desynchronization or picture stuttering.
In some embodiments of the present application, decoding and playing the screen projection picture according to the area identification includes: acquiring an area identification field corresponding to the area identification from the head structure of the network abstraction layer unit; determining the picture importance degree of the screen projection picture according to the area identification field; when the screen projection picture is an important picture, decoding the network abstraction layer unit; when the screen projection picture is a non-important picture, replacing the network abstraction layer unit with a generic data unit; and playing the screen projection picture.
In other words, as shown in fig. 7, the method for processing a screen projection picture according to the present embodiment includes:
S702, receiving a screen projection picture;
S704, acquiring an area identification field corresponding to the area identification from the head structure of the network abstraction layer unit;
S706, determining the picture importance degree of the screen projection picture according to the area identification field;
S708, when the screen projection picture is an important picture, decoding the network abstraction layer unit;
S710, when the screen projection picture is a non-important picture, replacing the network abstraction layer unit with a generic data unit;
S712, playing the screen projection picture.
With this embodiment, when the screen-projected terminal receives a network abstraction layer unit marked with an area identification field, it can decode and play according to the actual performance of the device. When decoding is difficult for the screen-projected terminal, discarding the network abstraction layer units of low importance can be considered, namely replacing such a network abstraction layer unit with a generic data unit, where the generic data unit contains data characterizing a generic color. Replacing the network abstraction layer unit with the generic data unit improves decoding efficiency without obviously reducing the definition of the screen projection picture.
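The receiver-side choice just described can be sketched as follows. The stub decode_unit() stands in for a real H.26x decoder, the byte offset of the area identification is the same assumption as in the earlier sketches, and the grey fill value is an arbitrary placeholder for the "generic color" data; none of these names come from the patent.

```python
GENERIC_GRAY = (128, 128, 128)  # placeholder fill colour for regions that are skipped

def decode_unit(nal_unit: bytes) -> bytes:
    """Stub standing in for the real video decoder."""
    return nal_unit[6:]  # pretend the remaining bytes are already raw pixels

def handle_unit(nal_unit: bytes, decoder_is_fast: bool,
                important_threshold: int = 2):
    """Decode important units; replace low-importance ones with a generic fill."""
    area_id = nal_unit[5]                     # assumed area identification byte
    if decoder_is_fast or area_id >= important_threshold:
        return decode_unit(nal_unit)
    return {"fill": GENERIC_GRAY}             # generic data unit: no decoding needed

print(handle_unit(b"\x00\x00\x00\x01\x61\x00" + b"\x00" * 4, decoder_is_fast=False))
```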
As shown in fig. 8, in some embodiments of the present application, a method for processing a screen projection picture includes:
S802, the screen projection terminal starts screen projection and records the screen;
S804, the screen projection terminal divides the recorded picture into areas;
S806, the screen projection terminal marks each area with a corresponding importance index according to the importance of the area;
S808, the screen projection terminal encodes the picture and, utilizing the head structure characteristic of the network abstraction layer unit in the H.26x technology, adds an importance index field;
S810, the screen projection terminal encodes the importance index of each area into the importance index field of the corresponding network abstraction layer unit;
S812, judging whether the signal strength of the communication network is good;
if the determination result is yes, executing S814, and if the determination result is no, executing S822;
S814, sending the network abstraction layer unit;
S816, the screen-projected terminal receives the network abstraction layer unit;
S818, judging whether the performance of the screen-projected terminal is good;
if the determination result is yes, executing S820, and if the determination result is no, executing S824;
S820, decoding the network abstraction layer unit;
S822, judging whether the picture area is important;
if the determination result is yes, executing S814, and if the determination result is no, executing S824;
S824, discarding the network abstraction layer unit;
S826, the screen-projected terminal replaces the network abstraction layer unit with a substitute color value.
In the above embodiment, the screen projection terminal encodes the importance index of the corresponding area into the importance index field of the network abstraction layer unit, and the screen-projected terminal decompresses and plays the network abstraction layer unit according to the importance index field.
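For readers who prefer code to a flowchart, the toy loop below walks one frame through the sender-side branches of Fig. 8 (S812, S822, S814, S824). Every identifier, the fake encoder and the assumed unit layout are illustrative only; the decision structure is the point, not the byte format.

```python
import random

BACKGROUND, SUBTITLE, STATIC_SCENE, DYNAMIC_SCENE = 0, 1, 2, 3
START_CODE = b"\x00\x00\x00\x01"

def encode(region_pixels: bytes) -> bytes:
    return region_pixels  # stand-in for the real H.26x encoder

def build_unit(payload: bytes, area_id: int) -> bytes:
    return START_CODE + bytes([0x61]) + bytes([area_id]) + payload  # assumed layout

def sender_step(region_payloads: dict, signal_good: bool) -> list:
    """Return the units actually transmitted for one recorded frame."""
    sent = []
    for area_id, pixels in region_payloads.items():
        unit = build_unit(encode(pixels), area_id)
        if signal_good or area_id > BACKGROUND:  # S812 / S822
            sent.append(unit)                    # S814: send
        # else: S824, the low-importance unit is discarded
    return sent

frame = {BACKGROUND: b"\x10" * 8, SUBTITLE: b"\x20" * 8, DYNAMIC_SCENE: b"\x30" * 8}
print(len(sender_step(frame, signal_good=random.random() > 0.5)))  # 3 or 2
```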
As shown in fig. 9, an embodiment of the present application provides a processing apparatus 100 for a screen projection picture, where the processing apparatus 100 is used for a screen projection terminal and includes: a division identification module 110 and an encoding and sending module 120. The division identification module 110 is configured to divide the screen projection picture into at least two areas according to the picture importance degree, and determine an area identification for each of the at least two areas. The encoding and sending module 120 is configured to encode and send the screen projection picture according to the area identification. The at least two areas comprise at least one or a combination of: a subtitle area, a background area, a static scene area and a dynamic scene area.
As shown in fig. 9, in an embodiment of the present application, the processing apparatus 100 for a screen projection picture further includes: a signal monitoring module 130. The signal monitoring module 130 is configured to monitor the signal strength of the communication network. The encoding and sending module 120 is further configured to determine at least one area to be sent among the at least two areas according to the signal strength.
As shown in fig. 10, an embodiment of the present application provides a processing apparatus 200 for a screen projection picture, where the processing apparatus 200 is used for a screen-projected terminal. The screen projection picture is divided into at least two areas according to the picture importance degree, and an area identification is determined for each of the at least two areas. The processing apparatus 200 includes a decoding and playing module 210, where the decoding and playing module 210 is configured to receive the screen projection picture, and decode and play the screen projection picture according to the area identification.
The processing apparatus 100 or 200 for a screen projection picture in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine or a self-service machine, and the like; the embodiments of the present application are not particularly limited.
The processing apparatus 100 or 200 for a screen projection picture in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The processing device 100 or 200 for screen projection images provided in this embodiment of the application can implement each process implemented in the method embodiments of fig. 1 to fig. 8, and for avoiding repetition, details are not described here again.
As shown in fig. 9 and fig. 10, an electronic device 400 provided in an embodiment of the present application includes a processor 410, a memory 420, and a program or an instruction stored on the memory 420 and executable on the processor 410, where the program or the instruction when executed by the processor 410 implements the steps of the method for processing a screen shot according to any embodiment of the present application.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
The embodiment of the present application provides a readable storage medium, on which a program or an instruction is stored, and when the program or the instruction is executed by a processor, the steps of the method for processing a screen projection image according to any embodiment of the present application are implemented.
The processor is the processor in the electronic device of the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A method for processing a screen projection picture is characterized by comprising the following steps:
dividing the screen projection picture into at least two areas according to the picture importance degree;
respectively determining a region identifier for each of the at least two regions;
coding and sending the screen projection picture according to the area identification;
the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
2. The method for processing a screen projection image according to claim 1, wherein the determining a region identifier for each of the at least two regions respectively comprises:
and respectively determining the area identifications respectively corresponding to the picture importance degrees of the areas for the areas in the at least two areas.
3. The method for processing a screen projection picture according to claim 2, wherein the area identifiers are numerical identifiers, and the numerical identifier of each area is positively correlated with the picture importance degree of that area.
4. The method for processing the screen projection picture according to any one of claims 1 to 3, wherein the encoding and sending the screen projection picture according to the area identifier specifically comprises:
converting the screen projection picture into a data bit string;
packaging the data bit string into a network abstraction layer unit with a head structure;
adding an area identification field corresponding to the area identification in the head structure of the network abstraction layer unit;
and sending the screen projection picture.
5. The method for processing a screen projection picture according to claim 4,
the determining at least one region to be sent in the at least two regions specifically includes: determining the at least one region to be transmitted in the at least two regions according to the signal strength of the communication network; and/or
The sending the screen projection picture specifically comprises:
determining at least one region to be transmitted in the at least two regions;
and sending the network abstraction layer unit of the at least one region to be sent according to the region identification field corresponding to the region identification of the at least one region to be sent.
6. A method for processing a screen projection picture, wherein the screen projection picture is divided into at least two areas according to picture importance, and each area in the at least two areas is respectively identified by a determined area identifier, the method comprising:
receiving the screen projection picture;
decoding and playing the screen projection picture according to the area identification;
the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
7. The method for processing the screen projection picture according to claim 6, wherein the screen projection picture is encapsulated in a network abstraction layer unit with a head structure, and the decoding and playing the screen projection picture according to the area identifier specifically comprises:
acquiring an area identification field corresponding to the area identification from the head structure of the network abstraction layer unit;
determining the picture importance degree of the screen projection picture according to the area identification field;
when the screen projection picture is an important picture, decoding the network abstraction layer unit;
when the screen projection picture is a non-important picture, replacing the network abstraction layer unit with a generic data unit;
and playing the screen projection picture.
8. A processing device for projecting a screen, comprising:
the screen projection system comprises a division identification module, a display module and a display module, wherein the division identification module is used for dividing a screen projection picture into at least two areas according to the picture importance degree and respectively determining area identifications for each of the at least two areas;
the coding sending module is used for coding and sending the screen projection picture according to the area identification;
the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
9. The apparatus for processing a projected picture according to claim 8, further comprising:
a signal monitoring module for monitoring signal strength of a communication network;
the code transmission module is further configured to determine at least one region to be transmitted in the at least two regions according to the signal strength.
10. A processing device for a screen projection picture, wherein the screen projection picture is divided into at least two areas according to picture importance degrees, and an area identifier is determined for each of the at least two areas, the processing device comprising:
the decoding playing module is used for receiving the screen projection picture and decoding and playing the screen projection picture according to the area identification;
the at least two regions comprise at least one or a combination of: subtitle area, background area, static scene area, and dynamic scene area.
CN202011279096.1A 2020-11-16 2020-11-16 Processing method and processing device for screen projection picture Pending CN112468845A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011279096.1A CN112468845A (en) 2020-11-16 2020-11-16 Processing method and processing device for screen projection picture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011279096.1A CN112468845A (en) 2020-11-16 2020-11-16 Processing method and processing device for screen projection picture

Publications (1)

Publication Number Publication Date
CN112468845A (en) 2021-03-09

Family

ID=74836927

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011279096.1A Pending CN112468845A (en) 2020-11-16 2020-11-16 Processing method and processing device for screen projection picture

Country Status (1)

Country Link
CN (1) CN112468845A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106850515A (en) * 2015-12-07 2017-06-13 ***通信集团公司 A kind of data processing method and video acquisition device, decoding apparatus
CN109168032A (en) * 2018-11-12 2019-01-08 广州酷狗计算机科技有限公司 Processing method, terminal, server and the storage medium of video data
CN110022463A (en) * 2019-04-11 2019-07-16 重庆紫光华山智安科技有限公司 Video interested region intelligent coding method and system are realized under dynamic scene
CN110784745A (en) * 2019-11-26 2020-02-11 科大讯飞股份有限公司 Video transmission method, device, system, equipment and storage medium
CN110856019A (en) * 2019-11-20 2020-02-28 广州酷狗计算机科技有限公司 Code rate allocation method, device, terminal and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106850515A (en) * 2015-12-07 2017-06-13 ***通信集团公司 A kind of data processing method and video acquisition device, decoding apparatus
CN109168032A (en) * 2018-11-12 2019-01-08 广州酷狗计算机科技有限公司 Processing method, terminal, server and the storage medium of video data
CN110022463A (en) * 2019-04-11 2019-07-16 重庆紫光华山智安科技有限公司 Video interested region intelligent coding method and system are realized under dynamic scene
CN110856019A (en) * 2019-11-20 2020-02-28 广州酷狗计算机科技有限公司 Code rate allocation method, device, terminal and storage medium
CN110784745A (en) * 2019-11-26 2020-02-11 科大讯飞股份有限公司 Video transmission method, device, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
JP7388465B2 (en) processing equipment
CN110868600B (en) Target tracking video plug-flow method, display method, device and storage medium
EP3833033B1 (en) Transmission apparatus, reception apparatus, and reception method
JP5553945B2 (en) Bitstream subset instructions
US11638066B2 (en) Method, device and computer program for encapsulating media data into a media file
WO2015076277A1 (en) Transmission apparatus, transmission method, reception apparatus and reception method
US10708611B2 (en) Systems and methods for signaling of video parameters and information associated with caption services
KR20100042632A (en) Video indexing method, and video indexing device
CN109640089B (en) Image coding and decoding method and device
JP2013527676A (en) Method and apparatus for encoding 3D video data, and decoding method and apparatus
CN106791829B (en) Method and equipment for establishing virtual reference frame
CN112468845A (en) Processing method and processing device for screen projection picture
JP2017123618A (en) Transmitter, transmission method, receiver and receiving method
JP2005515685A (en) Robust signal coding
JP5886341B2 (en) Transmitting apparatus, transmitting method, receiving apparatus, and receiving method
EP3549349B1 (en) A decoder, encoder, computer program and method
JP6989037B2 (en) Transmitter, transmitter, receiver and receiver
JP6822536B2 (en) Transmitter, transmitter, receiver and receiver
JP6350638B2 (en) Transmitting apparatus, transmitting method, receiving apparatus, and receiving method
JP2016076957A (en) Transmitter, transmission method, receiver and reception method
CN116017053A (en) Video playing method based on WebSocket and related equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210309

RJ01 Rejection of invention patent application after publication