WO2022168516A1 - Information Processing Device and Method - Google Patents
Information Processing Device and Method
- Publication number
- WO2022168516A1 (PCT/JP2022/000078)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wireless communication
- unit
- error
- communication channel
- information
- Prior art date
Links
- 230000010365 information processing Effects 0.000 title claims abstract description 57
- 238000000034 method Methods 0.000 title abstract description 105
- 238000004891 communication Methods 0.000 claims abstract description 193
- 230000005540 biological transmission Effects 0.000 claims abstract description 122
- 238000003672 processing method Methods 0.000 claims abstract description 11
- 238000012545 processing Methods 0.000 claims description 66
- 238000012544 monitoring process Methods 0.000 claims description 25
- 238000001514 detection method Methods 0.000 claims description 16
- 230000007774 longterm Effects 0.000 claims description 4
- 230000007423 decrease Effects 0.000 abstract 1
- 230000008569 process Effects 0.000 description 80
- 238000005516 engineering process Methods 0.000 description 36
- 238000013139 quantization Methods 0.000 description 34
- 230000015654 memory Effects 0.000 description 21
- 238000004364 calculation method Methods 0.000 description 19
- 238000001914 filtration Methods 0.000 description 19
- 238000010586 diagram Methods 0.000 description 17
- 230000003044 adaptive effect Effects 0.000 description 14
- 230000008707 rearrangement Effects 0.000 description 13
- 238000009825 accumulation Methods 0.000 description 9
- 230000001131 transforming effect Effects 0.000 description 9
- 238000006243 chemical reaction Methods 0.000 description 6
- 230000002146 bilateral effect Effects 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 230000001934 delay Effects 0.000 description 3
- 230000004044 response Effects 0.000 description 3
- 230000003111 delayed effect Effects 0.000 description 2
- 238000011156 evaluation Methods 0.000 description 2
- 238000010295 mobile communication Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000001902 propagating effect Effects 0.000 description 2
- 230000010267 cellular communication Effects 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 230000006866 deterioration Effects 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000002265 prevention Effects 0.000 description 1
- 230000000644 propagated effect Effects 0.000 description 1
- 238000011084 recovery Methods 0.000 description 1
- 230000000717 retained effect Effects 0.000 description 1
- 230000002441 reversible effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/164—Feedback from the receiver or from the transmission channel
- H04N19/166—Feedback from the receiver or from the transmission channel concerning the amount of transmission errors, e.g. bit error rate [BER]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/107—Selection of coding mode or of prediction mode between spatial and temporal predictive coding, e.g. picture refresh
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/164—Feedback from the receiver or from the transmission channel
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
Definitions
- The present disclosure relates to an information processing device and method, and more particularly to an information processing device and method that make it possible to suppress an increase in the period during which the image quality of a decoded image is degraded due to an error occurring on the receiving side when encoded data of a moving image is transmitted.
- 5G 5th generation mobile communication system
- IMT International Mobile Telecommunications
- Use cases are defined according to the application. For example, a use case that enables large-capacity data transmission (eMBB: enhanced Mobile Broadband) and a use case that enables highly reliable, low-delay data transmission (URLLC: Ultra Reliable Low Latency Communication) are specified.
- eMBB enhanced mobile broadband
- URLLC Ultra Reliable Low Latency Communication
- The delay time requirements differ between these use cases. For eMBB, the delay requirement for the radio section is 4 ms; for URLLC, it is 0.5 ms.
- The present disclosure has been made in view of this situation, and makes it possible to suppress an increase in the period during which the image quality of the decoded image is degraded due to an error occurring on the receiving side when transmitting encoded data of a moving image.
- An information processing apparatus according to one aspect of the present technology includes: an error information acquisition unit that acquires error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel; and an encoding control unit that controls encoding of the moving image based on the error information acquired by the error information acquisition unit.
- An information processing method according to one aspect of the present technology acquires error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel, and controls encoding of the moving image based on the acquired error information.
- An information processing apparatus according to another aspect of the present technology includes: a data receiving unit that receives encoded data of a moving image transmitted via a first wireless communication channel; and an error information transmission unit that transmits error information, which is information indicating an error related to the encoded data received by the data receiving unit, to the transmission source of the encoded data via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel.
- An information processing method according to another aspect of the present technology receives encoded data of a moving image transmitted via a first wireless communication channel, and transmits error information, which is information indicating an error related to the encoded data, to the transmission source of the encoded data via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel.
- In the information processing apparatus and method according to one aspect of the present technology, error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel, is acquired, and encoding of the moving image is controlled based on the acquired error information.
- In the information processing apparatus and method according to another aspect of the present technology, encoded data of a moving image transmitted via a first wireless communication channel is received, and error information, which is information indicating an error related to the encoded data, is transmitted to the transmission source of the encoded data via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel.
- FIG. 1 is a diagram showing a main configuration example of an image transmission system.
- FIG. 2 is a diagram illustrating an example of delays when dealing with errors.
- FIG. 3 is a diagram showing a main configuration example of an image transmission system to which the present technology is applied.
- FIG. 4 is a block diagram showing a main configuration example of an image encoding device.
- FIG. 5 is a block diagram showing a main configuration example of an encoding unit.
- Further drawings include a block diagram showing a main configuration example of an image decoding device; a block diagram showing a main configuration example of a decoding unit; flowcharts explaining examples of the flow of image encoding processing, image decoding processing, and encoding control processing; diagrams explaining examples of video encoding, intra stripes, code amounts, and encoding control; additional diagrams showing main configuration examples of the image transmission system; and a block diagram showing a main configuration example of a computer.
- Non-Patent Document 1 (above)
- Non-Patent Document 2: Recommendation ITU-T H.264 (04/2017), "Advanced video coding for generic audiovisual services", April 2017
- Non-Patent Document 3: Recommendation ITU-T H.265 (02/2018), "High efficiency video coding", February 2018
- Non-Patent Document 4: Benjamin Bross, Jianle Chen, Shan Liu, Ye-Kui Wang, "Versatile Video Coding (Draft 7)", JVET-P2001-vE, Joint Video Experts Team (JVET) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 16th Meeting: Geneva, CH, 1-11 Oct 2019
- Non-Patent Document 5: Satoshi Nagata, Kazuaki Takeda, Daisuke Umeda, Hideaki Takahashi, Kenichiro Aoyagi, "3GPP Release 15 Standardization Technology Outline", https://www.nttd
- The content described in the above-mentioned non-patent documents and patent documents also serves as a basis for determining the support requirements.
- For example, even if the Quad-Tree Block Structure and the QTBT (Quad Tree Plus Binary Tree) Block Structure described in the above non-patent documents are not directly described in the embodiments, they are within the scope of disclosure of the present technology and meet the support requirements of the claims.
- Likewise, technical terms such as parsing, syntax, and semantics are within the scope of disclosure of the present technology and meet the support requirements of the claims even if they are not directly described in the embodiments.
- In this specification, unless otherwise specified, the term "block" (when not indicating a processing unit) used to denote a partial area of an image (picture) or a processing unit indicates an arbitrary partial area within a picture, and its size, shape, characteristics, and the like are not limited.
- the "block” includes TB (Transform Block), TU (Transform Unit), PB (Prediction Block), PU (Prediction Unit), SCU (Smallest Coding Unit), CU (Coding Unit), LCU (Largest Coding Unit), CTB (Coding Tree Block), CTU (Coding Tree Unit), sub-blocks, macro-blocks, tiles, or slices.
- the block size may be specified not only directly but also indirectly.
- the block size may be specified using identification information that identifies the size.
- the block size may be designated by a ratio or a difference from the size of a reference block (for example, LCU or SCU).
- For example, when transmitting information that specifies the block size, the above-mentioned information that indirectly specifies the size may be used as that information. Doing so can reduce the amount of the information and, in some cases, improve coding efficiency.
- This specification of the block size also includes specification of a range of block sizes (for example, specifying an allowable range of block sizes).
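The indirect size specification described above can be illustrated with a small sketch. The size table, parameter names, and the log2-difference encoding relative to the LCU are illustrative assumptions, not something defined in this disclosure.

```python
# Sketch: resolving a block size that is signaled indirectly, either as an
# identifier into a known size table or as a log2 difference from the LCU
# size. The table values and parameter names are illustrative assumptions.

SIZE_TABLE = {0: 4, 1: 8, 2: 16, 3: 32, 4: 64}  # id -> block side length

def resolve_block_size(lcu_size, size_id=None, log2_diff_from_lcu=None):
    """Return the block side length from an indirect specification."""
    if size_id is not None:
        return SIZE_TABLE[size_id]             # identification information
    if log2_diff_from_lcu is not None:
        return lcu_size >> log2_diff_from_lcu  # difference from the LCU size
    return lcu_size                            # default: the LCU itself

print(resolve_block_size(64, size_id=2))             # -> 16
print(resolve_block_size(64, log2_diff_from_lcu=2))  # -> 16
```

Signaling a small identifier or a log2 difference generally takes fewer bits than signaling the absolute size, which is how the indirect specification can reduce the amount of information.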
- <Delay in response to error in image transmission system> Conventionally, various systems have been developed as image transmission systems for transmitting image data. For example, systems that transmit moving images and the like using wireless communication have been developed. Since image data such as moving images generally has a large data size, encoding (compressing) the data before transmission has been considered.
- The image transmission system 10 shown in FIG. 1 has an encoder 11 on the transmission side (i.e., the transmission source side) and a decoder 12 on the reception side (i.e., the transmission destination side).
- Image data is encoded by the encoder 11 .
- the encoded data (bit stream) is then transmitted to the decoder 12 via the wireless network 21 .
- the bitstream is decoded by the decoder 12 and output as image data (decoded image).
- If an error occurs during this transmission, the decoder 12 cannot obtain the decoded image. Moreover, if the transmitted image data is a moving image and the frames following the frame in which the error occurred are inter-coded, the error propagates to those subsequent frames, so there was a risk that decoded images would continue to be unobtainable for the subsequent frames as well.
- Therefore, error feedback has been used to control bitstream transmission, that is, the encoding of the image data.
- When the decoder 12 fails in reception or decoding, it transmits error information indicating the error to the encoder 11 via the wireless network 21. After acquiring the error information, the encoder 11 performs encoding so that the error does not propagate to subsequent frames.
- This allows the decoder 12 to obtain a correct decoded image earlier.
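The error-feedback behaviour described above can be sketched as follows. The frame-type labels and the choice to force an intra-coded frame upon feedback are illustrative assumptions about how encoding "so that the error does not propagate" might be realized.

```python
# Sketch: the encoder normally inter-codes frames (predicting from the
# previous frame), but after receiving error feedback it forces an
# intra-coded frame so the error chain at the decoder is broken.
# Frame-type names and the policy are illustrative assumptions.

def choose_frame_type(frame_index, error_reported):
    if frame_index == 0 or error_reported:
        return "I"   # intra: decodable without reference to earlier frames
    return "P"       # inter: predicted from the previous frame, so an error
                     # in that frame would propagate into this one

# Error detected while frame 2 was in flight; feedback arrives before frame 3.
types = [choose_frame_type(i, error_reported=(i == 3)) for i in range(6)]
print(types)  # -> ['I', 'P', 'P', 'I', 'P', 'P']
```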
- 3GPP: Third Generation Partnership Project
- IMT: International Mobile Telecommunications
- The specifications of the 5th generation mobile communication system (hereinafter also referred to as 5G) have been studied and drawn up by the 3GPP.
- Use cases are defined according to the application. For example, a use case that enables large-capacity data transmission (eMBB: enhanced Mobile Broadband) and a use case that enables highly reliable, low-delay data transmission (URLLC: Ultra Reliable Low Latency Communication) are specified.
- eMBB: enhanced Mobile Broadband (large-capacity data transmission)
- URLLC Ultra Reliable Low Latency Communication
- eMBB high-capacity use case
- The delay time requirements differ between these use cases. For eMBB, the delay requirement for the radio section is 4 ms; for URLLC, it is 0.5 ms.
- In the large-capacity use case (eMBB), however, the network delay can be larger than in the low-latency use case (URLLC).
- If error information is transmitted over such a channel, the timing of encoding control based on the error information may be delayed. If the timing of this encoding control is delayed, the time until the decoder 12 can obtain a correct decoded image may increase.
- each frame is encoded on the transmitting side
- the encoded data is sequentially transmitted from the transmitting side to the receiving side, and decoded on the receiving side.
- an error is mixed in the packet at time t1 and the error is detected at time t2 on the receiving side.
- When the error is notified via the wireless network 21 of the large-capacity use case (eMBB), the notification reaches the transmitting side only at time t3 (for example, 10 ms later) because of the network delay.
- The encoding control based on the error is therefore applied only from the frame following the frame P2 being processed at time t3. Consequently, in the example of FIG. 2, decoded images for three frames cannot be obtained (are lost). When frames (decoded images) cannot be obtained, the image quality of the moving image (decoded moving image) is degraded.
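How the feedback delay translates into lost frames can be worked out with simple arithmetic. The 120 fps frame interval assumed below is illustrative, since the frame rate is not stated here; the 10 ms and 0.5 ms delays correspond to the eMBB-like and URLLC-like cases.

```python
import math

# Sketch: with inter coding, every frame whose encoding starts before the
# error feedback arrives is still chained to the corrupted frame, so it is
# lost along with the errored frame itself. The frame interval (120 fps)
# and delay values are illustrative assumptions.

def frames_lost(feedback_delay_ms, frame_interval_ms):
    # the errored frame itself + frames already committed before feedback
    return 1 + math.ceil(feedback_delay_ms / frame_interval_ms)

interval = 1000 / 120  # ~8.33 ms per frame at 120 fps (assumed)
print(frames_lost(10.0, interval))  # -> 3 (high-delay feedback, eMBB-like)
print(frames_lost(0.5, interval))   # -> 2 (low-delay feedback, URLLC-like)
```

Under these assumptions the high-delay path loses three frames, while low-delay feedback confines the loss to the vicinity of the errored frame, which is the motivation for the second channel.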
- Therefore, the error information is transmitted over a wireless communication channel that is different from, and has a lower delay than, the wireless communication channel used for bitstream transmission.
- For example, error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel is acquired, and encoding of the moving image is controlled based on the acquired error information.
- For example, an information processing device includes: an error information acquisition unit that acquires error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel; and an encoding control unit that controls encoding of the moving image based on the error information acquired by the error information acquisition unit.
- Also, for example, encoded data of a moving image transmitted via a first wireless communication channel is received, and error information, which is information indicating an error related to the encoded data, is transmitted to the transmission source of the encoded data via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel.
- For example, an information processing device includes: a data receiving unit that receives encoded data of a moving image transmitted via a first wireless communication channel; and an error information transmission unit that transmits error information, which is information indicating an error related to the encoded data received by the data receiving unit, to the transmission source of the encoded data via a second wireless communication channel capable of transmission with a lower delay than the first wireless communication channel.
- FIG. 3 is a diagram showing a main configuration example of an image transmission system to which the present technology is applied.
- the image transmission system 100 shown in FIG. 3 is a system for transmitting moving images.
- the image transmission system 100 has an image encoding device 111 and an image decoding device 112 .
- the image encoding device 111 and the image decoding device 112 are communicably connected to each other via a wireless network 121 .
- the image encoding device 111 and the image decoding device 112 are communicably connected to each other via a wireless network 122 .
- the image encoding device 111 acquires image data of a moving image to be transmitted, encodes it, and generates the encoded data (bitstream).
- the image encoding device 111 transmits the bitstream to the image decoding device 112 via the wireless network 121 .
- the image decoding device 112 receives and decodes the bitstream.
- the image decoding device 112 outputs image data of a decoded image (decoded moving image) obtained by the decoding.
- the wireless network 121 is a wireless communication channel capable of large-capacity data transmission (high transmission data rate) compared to the wireless network 122 .
- The specifications of the wireless network 121 are arbitrary, but it must provide a transmission data rate sufficient to carry the bitstream of the image data.
- When an error occurs in reception or decoding, the image decoding device 112 transmits error information indicating the error to the image encoding device 111 via the wireless network 122.
- the image encoding device 111 receives the error information.
- the image encoding device 111 controls encoding of moving images based on the received error information and the like. For example, the image encoding device 111 performs encoding so that the error does not propagate to subsequent frames.
- the wireless network 122 is a wireless communication path capable of data transmission with high reliability and low delay compared to the wireless network 121.
- The specifications of the wireless network 122 are arbitrary, but its delay must be lower than that of the wireless network 121.
- the wireless networks 121 and 122 are wireless communication paths with different frequency bands (channels).
- a 5G large capacity use case eMBB
- a 5G low latency use case URLLC
- the wireless network 121 is a 5G high-capacity use case (eMBB) wireless communication channel
- the wireless network 122 is a 5G low-latency use case (URLLC) wireless communication channel.
- eMBB 5G high-capacity use case
- URLLC 5G low-latency use case
- the image encoding device 111 can also monitor the state of the wireless network 121 and obtain QoE (Quality of Experience) information, which is a subjective evaluation of the wireless network 121 .
- the image encoding device 111 can control the encoding of moving images also based on the QoE information.
- This QoE information may be any information. For example, as in the method described in Non-Patent Document 5, information such as radio disconnections during communication and handover failures, collected from terminals using the MDT (Minimization of Drive Tests) mechanism, may be included in this QoE information.
- the image transmission system 100 may have any number of image encoding devices 111 and image decoding devices 112 . Also, the image transmission system 100 may have devices other than the image encoding device 111 and the image decoding device 112 . Furthermore, the image transmission system 100 may have wireless communication channels other than the wireless networks 121 and 122 .
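As a rough illustration of the two-channel arrangement, the sketch below simulates a high-capacity channel and a low-delay channel as delayed message queues. The class name, queue mechanics, and delay values are assumptions made for the simulation only, not part of the disclosed system.

```python
import heapq

# Sketch: two simulated wireless channels with different one-way delays.
# The bitstream travels encoder -> decoder over the high-capacity channel;
# error information travels back over the low-delay channel.
# Delay values are illustrative.

class Channel:
    def __init__(self, delay_ms):
        self.delay_ms = delay_ms
        self.in_flight = []  # heap of (arrival_time_ms, payload)

    def send(self, now_ms, payload):
        heapq.heappush(self.in_flight, (now_ms + self.delay_ms, payload))

    def receive(self, now_ms):
        out = []
        while self.in_flight and self.in_flight[0][0] <= now_ms:
            out.append(heapq.heappop(self.in_flight)[1])
        return out

embb = Channel(delay_ms=4.0)   # bitstream path (like wireless network 121)
urllc = Channel(delay_ms=0.5)  # error-feedback path (like wireless network 122)

embb.send(0.0, "frame-0 bitstream")
urllc.send(0.0, "error in frame-0")
print(urllc.receive(1.0))  # -> ['error in frame-0'] (already delivered)
print(embb.receive(1.0))   # -> [] (still in flight)
```

The point of the asymmetry is that feedback arrives well before the next bitstream round-trip would, letting the encoder react within roughly one frame.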
- FIG. 4 is a block diagram showing a main configuration example of the image encoding device 111 in FIG.
- Note that FIG. 4 shows the main elements, such as processing units and data flows, and does not necessarily show everything. That is, the image encoding device 111 may include processing units not shown as blocks in FIG. 4, and there may be processing or data flows not shown as arrows or the like in FIG. 4.
- the image encoding device 111 has an encoding unit 211, a communication unit 212, and an encoding control unit 213.
- the communication unit 212 has a data transmission unit 221 , a network state monitoring unit 222 and an error information monitoring unit 223 .
- the encoding unit 211 encodes the image data (moving image to be transmitted) input to the image encoding device 111 and generates the encoded data (bitstream).
- Any encoding method may be used. For example, AVC (Advanced Video Coding) described in Non-Patent Document 2, HEVC (High Efficiency Video Coding) described in Non-Patent Document 3, or VVC (Versatile Video Coding) described in Non-Patent Document 4 may be applied. Of course, encoding schemes other than these can also be applied.
- the encoding unit 211 supplies the generated bitstream to the communication unit 212 (the data transmission unit 221 thereof).
- the communication unit 212 performs processing related to communication.
- the data transmission unit 221 acquires the bitstream supplied from the encoding unit 211.
- the data transmission unit 221 transmits the acquired bitstream to the image decoding device 112 via the wireless network 121 (eMBB).
- the network status monitoring unit 222 monitors the status of the wireless network 121 and obtains QoE information about the network.
- the network status monitoring unit 222 supplies the obtained QoE information to the encoding control unit 213.
- The error information monitoring unit 223 monitors error information transmitted from the image decoding device 112 via the wireless network 122 (URLLC). When error information is transmitted from the image decoding device 112, the error information monitoring unit 223 receives the error information via the wireless network 122. That is, the error information monitoring unit 223 acquires, from the image decoding device 112, error information concerning the encoded data of the moving image transmitted via the wireless network 121, the error information being transmitted via the wireless network 122, which is capable of transmission with a lower delay than the wireless network 121. The error information monitoring unit 223 supplies the received error information to the encoding control unit 213.
- URLLC wireless network 122
- the encoding control unit 213 controls encoding processing executed by the encoding unit 211 .
- the encoding control unit 213 controls encoding processing executed by the encoding unit 211 by supplying the encoding unit 211 with encoding control information specifying an encoding method, parameters, and the like.
- the encoding control unit 213 acquires error information supplied from the error information monitoring unit 223 and controls the encoding unit 211 based on the error information. For example, when acquiring error information, the encoding control unit 213 causes the encoding unit 211 to perform encoding processing so that the error indicated by the error information does not propagate to subsequent frames.
- the encoding control unit 213 acquires the QoE information supplied from the network state monitoring unit 222, and controls the encoding unit 211 based on the QoE information. For example, the encoding control unit 213 causes the encoding unit 211 to perform encoding processing so as to improve the communication status of the wireless network 121 .
- FIG. 5 is a block diagram showing a main configuration example of the encoding section 211 in FIG.
- Note that FIG. 5 shows the main elements, such as processing units and data flows, and does not necessarily show everything.
- That is, the encoding unit 211 may include processing units not shown as blocks in FIG. 5, and there may be processing or data flows not shown as arrows or the like in FIG. 5.
- the encoding unit 211 has a rearrangement buffer 251, a calculation unit 252, a coefficient conversion unit 253, a quantization unit 254, an encoding unit 255, and an accumulation buffer 256.
- the encoding unit 211 also has an inverse quantization unit 257 , an inverse coefficient transform unit 258 , a calculation unit 259 , an in-loop filter unit 260 and a frame memory 261 .
- the encoding section 211 has a prediction section 262 and a rate control section 263 .
- the prediction section 262 has an inter prediction section 271 and an intra prediction section 272 .
- Each frame (input image) of a moving image is input to the encoding unit 211 in its reproduction order (display order).
- the rearrangement buffer 251 acquires and holds (stores) each input image in its reproduction order (display order).
- the rearrangement buffer 251 rearranges the input image in encoding order (decoding order) or divides the input image into processing unit blocks.
- the rearrangement buffer 251 supplies each processed input image to the calculation unit 252 .
- The calculation unit 252 subtracts the predicted image supplied from the prediction unit 262 from the image corresponding to the processing unit block supplied from the rearrangement buffer 251 to derive residual data, and supplies the residual data to the coefficient conversion unit 253.
- the coefficient conversion unit 253 acquires residual data supplied from the calculation unit 252 .
- The coefficient conversion unit 253 also applies a coefficient transform to the residual data by a predetermined method to derive transform coefficient data. Any method can be used for this coefficient transform processing; for example, it may be an orthogonal transform.
- the coefficient transform unit 253 supplies the derived transform coefficient data to the quantization unit 254 .
- the quantization unit 254 acquires transform coefficient data supplied from the coefficient transform unit 253 . Also, the quantization unit 254 quantizes the transform coefficient data to derive quantized coefficient data. At that time, the quantization section 254 performs quantization at a rate designated by the rate control section 263 . The quantization section 254 supplies the derived quantization coefficient data to the encoding section 255 and the inverse quantization section 257 .
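The quantization performed here, together with the inverse quantization described later, can be illustrated numerically. The uniform scalar quantizer with a single step size below is a simplification; real codecs such as AVC/HEVC derive the step from a quantization parameter chosen by rate control.

```python
# Sketch: uniform scalar quantization of transform coefficients and the
# corresponding inverse quantization. A larger step (coarser quantization,
# as chosen by rate control) yields fewer bits but more distortion.
# This is a simplification of what real codecs do.

def quantize(coeffs, step):
    return [round(c / step) for c in coeffs]

def dequantize(levels, step):
    return [q * step for q in levels]

coeffs = [100.0, -37.5, 12.0, -3.0, 1.2]
levels = quantize(coeffs, step=8)
print(levels)                 # -> [12, -5, 2, 0, 0]
print(dequantize(levels, 8))  # -> [96, -40, 16, 0, 0]
```

Note that small coefficients quantize to zero and the reconstruction only approximates the input; this loss is the price of the rate reduction, and inverse quantization (as in the inverse quantization unit 257) can only undo the scaling, not the rounding.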
- the encoding unit 255 acquires the quantization coefficient data supplied from the quantization unit 254.
- the encoding unit 255 also acquires filter-related information such as filter coefficients supplied from the in-loop filter unit 260 . Furthermore, the encoding unit 255 acquires information on the optimum prediction mode supplied from the prediction unit 262 .
- the encoding unit 255 entropy-encodes (lossless-encodes) the information, generates a bit string (encoded data), and multiplexes it.
- This entropy encoding method is arbitrary.
- the encoding unit 255 can apply CABAC (Context-based Adaptive Binary Arithmetic Coding) or CAVLC (Context-based Adaptive Variable Length Coding) as this entropy encoding.
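A full CABAC or CAVLC coder is involved, but the flavor of lossless entropy coding can be shown with the much simpler unsigned Exp-Golomb code, which video codecs also use for syntax elements. This sketch is illustrative only and is not the entropy coder the encoding unit 255 applies.

```python
def exp_golomb_encode(n: int) -> str:
    # Unsigned Exp-Golomb: a run of zeros, then the binary form of n+1.
    # Small (frequent) values get short codewords.
    b = bin(n + 1)[2:]
    return "0" * (len(b) - 1) + b

def exp_golomb_decode(bits: str) -> tuple[int, str]:
    # Count leading zeros, read that many + 1 info bits, return the
    # decoded value and the remaining bitstring.
    zeros = 0
    while bits[zeros] == "0":
        zeros += 1
    value = int(bits[zeros:2 * zeros + 1], 2) - 1
    return value, bits[2 * zeros + 1:]
```

So 0, 1, 2, 3 encode to `1`, `010`, `011`, `00100`: the code is prefix-free and losslessly reversible, which is the property the encoding unit relies on.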
- the encoding unit 255 supplies the encoded data derived as described above to the accumulation buffer 256 .
- the accumulation buffer 256 temporarily holds the encoded data obtained by the encoding unit 255 .
- the accumulation buffer 256 supplies the retained encoded data to the data transmission section 221 as, for example, a bit stream at a predetermined timing.
- the inverse quantization unit 257 acquires the quantization coefficient data supplied from the quantization unit 254.
- the inverse quantization unit 257 inversely quantizes the quantized coefficient data to derive transform coefficient data.
- This inverse quantization processing is the inverse processing of the quantization processing executed in the quantization section 254 .
- the inverse quantization unit 257 supplies the derived transform coefficient data to the inverse coefficient transform unit 258 .
- the inverse coefficient transform unit 258 acquires transform coefficient data supplied from the inverse quantization unit 257 .
- the inverse coefficient transform unit 258 performs inverse coefficient transform on the transform coefficient data by a predetermined method to derive residual data.
- This inverse coefficient transforming process is the inverse process of the coefficient transforming process executed in the coefficient transforming section 253 .
- for example, when the coefficient transforming unit 253 performs orthogonal transform processing on the residual data, the inverse coefficient transforming unit 258 performs inverse orthogonal transform processing, which is the reverse of that orthogonal transform processing, on the transform coefficient data.
- the inverse coefficient transformer 258 supplies the derived residual data to the calculator 259 .
- the calculation unit 259 acquires the residual data supplied from the inverse coefficient transform unit 258 and the predicted image supplied from the prediction unit 262 .
- the calculation unit 259 adds the residual data and the predicted image corresponding to the residual data to derive a local decoded image.
- the calculation unit 259 supplies the derived locally decoded image to the in-loop filter unit 260 and the frame memory 261 .
- the in-loop filter unit 260 acquires the local decoded image supplied from the calculation unit 259. Also, the in-loop filter unit 260 acquires an input image (original image) supplied from the rearrangement buffer 251 .
- the information input to the in-loop filter unit 260 is arbitrary, and information other than the above may be input. For example, the prediction mode, motion information, code amount target value, quantization parameter qP, picture type, block (CU, CTU, etc.) information, and the like may be input to the in-loop filter unit 260 as necessary.
- the in-loop filter unit 260 appropriately performs filtering on the local decoded image.
- the in-loop filter unit 260 also uses the input image (original image) and other input information for the filtering process as necessary.
- the in-loop filter unit 260 can apply a bilateral filter as its filtering process.
- the in-loop filter unit 260 can apply a deblocking filter (DBF (DeBlocking Filter)) as its filtering process.
- the in-loop filter section 260 can apply an adaptive offset filter (SAO (Sample Adaptive Offset)) as its filtering process.
- the in-loop filter section 260 can apply an adaptive loop filter (ALF (Adaptive Loop Filter)) as its filtering process.
- the in-loop filter section 260 can combine and apply a plurality of these filters as filtering. Which filter to apply and in what order to apply are arbitrary and can be selected as appropriate.
- the in-loop filter unit 260 applies four in-loop filters, a bilateral filter, a deblocking filter, an adaptive offset filter, and an adaptive loop filter, in this order as filtering.
- the filtering process executed by the in-loop filtering unit 260 is arbitrary and is not limited to the above example.
- the in-loop filter unit 260 may apply a Wiener filter or the like.
- the in-loop filter unit 260 supplies the filtered locally decoded image to the frame memory 261 .
- the in-loop filter unit 260 supplies the filter-related information to the encoding unit 255 .
- the frame memory 261 executes processing related to storage of image-related data. For example, the frame memory 261 acquires the local decoded image supplied from the calculation unit 259 and the filtered locally decoded image supplied from the in-loop filter unit 260 and holds (stores) them. Also, the frame memory 261 reconstructs a decoded image for each picture using the local decoded image and holds it (stores it in the buffer in the frame memory 261). The frame memory 261 supplies the decoded image (or part thereof) to the prediction section 262 in response to a request from the prediction section 262 .
- the prediction unit 262 executes processing related to prediction image generation. For example, the prediction unit 262 acquires an input image (original image) supplied from the rearrangement buffer 251 . For example, the prediction unit 262 acquires a decoded image (or part thereof) read from the frame memory 261 .
- the inter prediction unit 271 of the prediction unit 262 refers to the decoded image of another frame as a reference image, performs inter prediction and motion compensation, and generates a predicted image. Also, the intra prediction unit 272 of the prediction unit 262 performs intra prediction with reference to the decoded image of the current frame as a reference image to generate a predicted image.
- the prediction unit 262 evaluates the predicted images generated in each prediction mode, and selects the optimum prediction mode based on the evaluation results. Then, the prediction section 262 supplies the predicted image generated in the optimum prediction mode to the calculation section 252 and the calculation section 259 . Also, the prediction unit 262 supplies information about the optimum prediction mode selected by the above process to the encoding unit 255 as necessary.
- the prediction unit 262 (the inter prediction unit 271 and the intra prediction unit 272 thereof) can also perform prediction under the control of the encoding control unit 213.
- the prediction unit 262 can acquire coding control information supplied from the coding control unit 213 and execute intra prediction and inter prediction according to the coding control information.
- the rate control unit 263 controls the rate of the quantization operation of the quantization unit 254, based on the code amount of the encoded data accumulated in the accumulation buffer 256, so that neither overflow nor underflow occurs.
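A minimal sketch of such buffer-based rate control follows: the quantization parameter (QP) is raised when the accumulation buffer fills (risking overflow) and lowered when it drains (risking underflow). The thresholds, step size, and QP range here are illustrative assumptions, not the patent's rate control algorithm.

```python
def adjust_qp(qp, buffer_bits, buffer_size,
              target_fill=0.5, step=2, qp_min=0, qp_max=51):
    # Coarse rate control: a larger QP means coarser quantization and
    # fewer bits per frame; a smaller QP means finer quantization.
    fill = buffer_bits / buffer_size
    if fill > target_fill + 0.1:        # buffer too full: cut bitrate
        qp = min(qp + step, qp_max)
    elif fill < target_fill - 0.1:      # buffer draining: spend more bits
        qp = max(qp - step, qp_min)
    return qp
```

This closes the loop between the accumulation buffer 256 and the quantization unit 254: each frame's measured code amount feeds back into the rate used for the next frame.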
- FIG. 6 is a block diagram showing a main configuration example of the image decoding device 112 in FIG.
- FIG. 6 shows main components such as the processing unit and data flow, and what is shown in FIG. 6 is not necessarily all. That is, in the image decoding device 112, there may be processing units not shown as blocks in FIG. 6, or there may be processes or data flows not shown as arrows or the like in FIG.
- the image decoding device 112 has a communication unit 311, a decoding control unit 312, and a decoding unit 313.
- Communication unit 311 has data reception unit 321 , reception error detection unit 322 , and error information transmission unit 323 .
- the communication unit 311 performs processing related to communication.
- the data receiving unit 321 receives a bitstream transmitted from the image encoding device 111 via the wireless network 121 (eMBB).
- the data receiver 321 supplies the received bitstream to the decoder 313 .
- the reception error detection unit 322 monitors the reception status of the data reception unit 321 and detects errors (reception errors) that occur in the data reception unit 321 . When detecting a reception error, the reception error detection section 322 supplies error information indicating the reception error to the error information transmission section 323 . The reception error detection section 322 also supplies the error detection result (information indicating whether or not a reception error has been detected, etc.) to the decoding control section 312 .
- the error information transmission unit 323 transmits error information to the image encoding device 111 via the wireless network 122 (URLLC). This error information is transmitted to the image encoding device 111 via the wireless network 122 (URLLC) and received by the error information monitoring unit 223 .
- the error information transmitting unit 323 transmits the error information, which is information indicating an error related to the encoded data received by the data receiving unit 321, to the source of the encoded data via the wireless network 122, which is capable of lower-delay transmission than the wireless network 121.
- the error information transmission section 323 acquires error information indicating a reception error supplied from the reception error detection section 322 . Also, the error information transmission unit 323 acquires error information indicating a decoding error, supplied from the decoding unit 313 . The error information transmission unit 323 transmits the acquired error information to the image encoding device 111 .
- the error information transmitted by the error information transmission unit 323 may include information indicating an error during reception of encoded data. Also, the error information transmitted by the error information transmission unit 323 may include information indicating an error during decoding of encoded data. Of course, the error information transmitted by the error information transmission unit 323 may contain both information, or may contain information indicating other errors.
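As an illustration of what such an error information message might carry, the following sketch defines a small serializable record covering both reception and decoding errors. The field names and JSON encoding are assumptions for this example; the patent does not specify a wire format.

```python
import dataclasses
import json

@dataclasses.dataclass
class ErrorInfo:
    kind: str      # "reception" or "decoding" (either or both may be reported)
    frame_id: int  # identifies where in the stream the error occurred

def pack(info: ErrorInfo) -> bytes:
    # Serialize for transmission over the low-delay (URLLC) channel.
    return json.dumps(dataclasses.asdict(info)).encode()

def unpack(payload: bytes) -> ErrorInfo:
    return ErrorInfo(**json.loads(payload))
```

Keeping the message this small matters: the whole point of sending it over the URLLC channel is that it reaches the encoder quickly enough to change the encoding of upcoming frames.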
- the decoding control unit 312 controls decoding processing executed by the decoding unit 313 .
- the decoding control unit 312 controls the decoding process executed by the decoding unit 313 by supplying the decoding unit 313 with decoding control information specifying the decoding method, parameters, and the like.
- the decoding control unit 312 acquires the error detection result supplied from the reception error detection unit 322, and controls the decoding unit 313 based on the error detection result.
- the decoding unit 313 acquires the bitstream supplied from the data receiving unit 321.
- the decoding unit 313 decodes the bitstream to generate image data of a decoded image (decoded moving image to be transmitted).
- the decoding unit 313 outputs the image data to the outside of the image decoding device 112 .
- the decoding unit 313 can execute this decoding process under the control of the decoding control unit 312 .
- the decoding section 313 supplies error information indicating the decoding error to the error information transmitting section 323 .
- FIG. 7 is a block diagram showing a main configuration example of the decoding unit 313 in FIG.
- FIG. 7 shows main elements such as the processing unit and data flow, and what is shown in FIG. 7 is not necessarily all. That is, in the decoding unit 313, there may be processing units not shown as blocks in FIG. 7, or there may be processes or data flows not shown as arrows or the like in FIG.
- the decoding unit 313 includes an accumulation buffer 351, a decoding unit 352, an inverse quantization unit 353, an inverse coefficient transform unit 354, a calculation unit 355, an in-loop filter unit 356, a rearrangement buffer 357, a frame memory 358 and a prediction unit 359 .
- the accumulation buffer 351 acquires and holds (stores) the bitstream supplied from the data receiving unit 321 .
- the accumulation buffer 351 extracts encoded data included in the accumulated bitstream at a predetermined timing or when predetermined conditions are met, and supplies the extracted data to the decoding unit 352 .
- the decoding unit 352 acquires encoded data supplied from the accumulation buffer 351 .
- the decoding unit 352 decodes the acquired encoded data.
- the decoding unit 352 applies entropy decoding (lossless decoding) such as CABAC or CAVLC, for example. That is, the decoding unit 352 decodes the encoded data by a decoding method corresponding to the encoding method of the encoding process executed by the encoding unit 255 .
- the decoding unit 352 decodes the encoded data and derives quantized coefficient data.
- the decoding unit 352 supplies the derived quantization coefficient data to the inverse quantization unit 353 .
- when an error (decoding error) occurs in the decoding process, the decoding unit 352 generates error information indicating the decoding error and supplies it to the error information transmitting unit 323 .
- the inverse quantization unit 353 performs inverse quantization processing on the quantized coefficient data to derive transform coefficient data. This inverse quantization processing is the inverse processing of the quantization processing executed in the quantization section 254 .
- the inverse quantization unit 353 supplies the derived transform coefficient data to the inverse coefficient transform unit 354 .
- the inverse coefficient transform unit 354 acquires transform coefficient data supplied from the inverse quantization unit 353 .
- the inverse coefficient transform unit 354 performs inverse coefficient transform processing on the transform coefficient data to derive residual data.
- This inverse coefficient transforming process is the inverse process of the coefficient transforming process executed in the coefficient transforming section 253 .
- the inverse coefficient transforming unit 354 supplies the derived residual data to the computing unit 355 .
- the calculation unit 355 acquires the residual data supplied from the inverse coefficient transform unit 354 and the predicted image supplied from the prediction unit 359 .
- the calculation unit 355 adds the residual data and the predicted image corresponding to the residual data to derive a local decoded image.
- the calculation unit 355 supplies the derived locally decoded image to the in-loop filter unit 356 and the frame memory 358 .
- the in-loop filter unit 356 acquires the local decoded image supplied from the calculation unit 355.
- the in-loop filter unit 356 appropriately performs filtering on the local decoded image.
- the in-loop filter unit 356 can apply a bilateral filter as its filtering process.
- the in-loop filter unit 356 can apply a deblocking filter (DBF (DeBlocking Filter)) as its filtering process.
- the in-loop filter unit 356 can apply an adaptive offset filter (SAO (Sample Adaptive Offset)) as its filtering process.
- the in-loop filter unit 356 can apply an adaptive loop filter (ALF (Adaptive Loop Filter)) as its filtering process.
- the in-loop filter unit 356 can combine and apply a plurality of these filters as filtering. Which filter to apply and in what order are optional and can be selected as appropriate.
- the in-loop filter unit 356 applies four in-loop filters, a bilateral filter, a deblocking filter, an adaptive offset filter, and an adaptive loop filter, in this order as filtering.
- the filtering process executed by the in-loop filtering unit 356 is arbitrary and is not limited to the above example.
- the in-loop filter unit 356 may apply a Wiener filter or the like.
- the in-loop filter unit 356 executes filter processing corresponding to the filter processing executed by the in-loop filter unit 260 .
- the in-loop filter unit 356 supplies the filtered locally decoded image to the rearrangement buffer 357 and the frame memory 358 .
- the rearrangement buffer 357 receives the locally decoded image supplied from the in-loop filter unit 356 and holds (stores) it.
- the rearrangement buffer 357 reconstructs a decoded image for each picture using the local decoded image and holds it (stores it in the buffer).
- the rearrangement buffer 357 rearranges the obtained decoded images from decoding order to reproduction order.
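The rearrangement from decoding order to reproduction (display) order can be sketched by sorting on a picture order count (POC). The tuple representation here is an illustrative assumption about how the buffer might track ordering.

```python
def to_display_order(decoded_frames):
    # `decoded_frames`: list of (picture_order_count, frame) tuples in
    # the order they were decoded; the rearrangement buffer re-emits
    # them sorted by POC, i.e. in reproduction order.
    return [frame for _, frame in sorted(decoded_frames, key=lambda t: t[0])]
```

For example, a stream decoded in the order I (POC 0), P (POC 2), B (POC 1) is displayed as I, B, P: the B frame needed the later P frame as a reference, so decoding order and display order differ.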
- the rearrangement buffer 357 outputs the decoded image group rearranged in order of reproduction to the outside of the image decoding device 112 as moving image data.
- the frame memory 358 acquires the local decoded image supplied from the calculation unit 355, reconstructs the decoded image for each picture unit, and stores it in the buffer within the frame memory 358. Also, the frame memory 358 acquires the in-loop filtered locally decoded image supplied from the in-loop filter unit 356, reconstructs the decoded image for each picture, and stores it in the buffer in the frame memory 358.
- the frame memory 358 appropriately supplies the stored decoded image (or part thereof) to the prediction unit 359 as a reference image.
- the prediction unit 359 acquires the decoded image (or part thereof) read from the frame memory 358.
- the prediction unit 359 performs prediction processing in the prediction mode adopted during encoding, and generates a prediction image by referring to the decoded image as a reference image.
- the prediction unit 359 supplies the generated prediction image to the calculation unit 355 .
- the encoding unit 211 acquires the image data of the moving image to be transmitted in step S201.
- step S202 the encoding unit 211 encodes the image data acquired in step S201 according to the encoding control of the encoding control unit 213 to generate a bitstream.
- step S203 the data transmission unit 221 transmits the bitstream generated in step S202 to the image decoding device 112 via the wireless network 121 (eMBB).
- step S204 the network state monitoring unit 222 monitors the state of the wireless network 121 and appropriately supplies QoE information to the coding control unit 213.
- step S205 the error information monitoring unit 223 monitors transmission of error information via the wireless network 122 .
- the error information monitoring unit 223 receives the error information and supplies it to the encoding control unit 213 .
- step S206 the encoding control unit 213 controls the encoding process executed in step S202 based on the processing results (monitoring results) of steps S204 and S205.
- step S207 the encoding control unit 213 determines whether or not to end the image encoding process. If it is determined that the moving image encoding is continuing and the image encoding process is not to end, the process returns to step S201, and the subsequent processes are repeated.
- when it is determined in step S207 that the image encoding process is to end, the image encoding process ends.
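The sender-side flow of steps S201 through S207 can be sketched as a loop with the step-specific operations injected as callables. The function signature and callable interfaces are assumptions for illustration; they are not APIs defined by the patent.

```python
def encode_session(frames, encode, transmit, monitor_network,
                   poll_error_info, control):
    # One pass of the image encoding process (steps S201-S207).
    for frame in frames:              # S201: acquire image data
        bitstream = encode(frame)     # S202: encode under current control
        transmit(bitstream)           # S203: send over the eMBB channel
        qoe = monitor_network()       # S204: monitor wireless network state
        error_info = poll_error_info()  # S205: monitor URLLC error channel
        control(qoe, error_info)      # S206: update encoding control
    # S207: the loop ends when there are no more frames to encode
```

The key structural point is that S204-S206 run every iteration, so error feedback received on the low-delay channel can alter how the very next frame is encoded.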
- the data receiving unit 321 receives the bitstream transmitted from the image encoding device 111 via the wireless network 121 (eMBB) in step S301.
- step S302 the reception error detection unit 322 monitors the reception process in step S301, and if a reception error occurs, detects the reception error.
- step S303 the decoding control unit 312 controls the process (decoding process) in step S304, which will be described later, based on the reception error detection result in step S302.
- step S304 the decoding unit 313 decodes the bitstream received in step S301 according to the decoding control in step S303, and generates image data of the decoded moving image. This image data is output to the outside of the image decoding device 112 .
- step S305 if a decoding error occurs in the decoding process in step S304, the decoding unit 313 detects the decoding error.
- step S306 the error information transmission unit 323 determines whether an error has been detected. That is, error information transmitting section 323 determines whether or not a reception error has been detected in step S302, and whether or not a decoding error has been detected in step S305. If an error is detected, that is, if at least one of a reception error and a decoding error is detected, the process proceeds to step S307.
- step S307 the error information transmission unit 323 transmits error information indicating the detected error to the image encoding device 111 via the wireless network 122 (URLLC).
- step S307 the process proceeds to step S308.
- if it is determined in step S306 that no error has been detected, that is, that neither a reception error nor a decoding error has been detected, the process of step S307 is skipped and the process proceeds to step S308.
- step S308 the error information transmission unit 323 determines whether or not to end the image decoding process. If it is determined that bitstream transmission is continuing and the image decoding process is not to end, the process returns to step S301, and the process is repeated thereafter.
- when it is determined in step S308 that the image decoding process is to end, the image decoding process ends.
- in this case, the loss of decoded images can be limited to two frames.
- <Second Embodiment> <Intra stripe> Any encoding control method may be used to prevent the error from propagating to subsequent frames. For example, when a technique called intra stripe is applied in image coding, this intra stripe may be used.
- each frame of a moving image includes an intra frame (I) which is a frame to which intra encoding is performed and an inter frame (P) which is a frame to which inter encoding is performed.
- all frames are interframes (P), and a partial area of each frame is set as an intra area for intra coding.
- This intra area is also called an intra stripe.
- the position of the intra-stripe (intra-region) is moved frame by frame so that it circulates in a predetermined number of frames. For example, the frame is divided into N and one partial area is set as the intra area. Then, the intra area is moved to the next partial area one by one for each frame, and returns to the original position after N frames.
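The cyclic movement of the intra stripe described above reduces to simple modular arithmetic: with the frame divided into N stripes, the intra area's index advances by one per frame and wraps after N frames. The function below is a sketch of that schedule, not the patent's control logic.

```python
def intra_stripe_index(frame_number: int, n_stripes: int) -> int:
    # The intra area occupies one of N partial areas and moves one
    # position per frame, returning to the start after N frames, so
    # every region of the picture is intra-refreshed once per cycle.
    return frame_number % n_stripes
```

With N = 4, frames 0..5 place the stripe at positions 0, 1, 2, 3, 0, 1: after four frames the whole picture has been refreshed, which bounds how long a transmission error can persist in the decoded image.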
- the code amount of each frame can be smoothed compared to the example of B in FIG. As a result, an increase in buffer capacity can be suppressed, and an increase in delay can be suppressed.
- such control may reduce the image quality of the decoded image in the intra area. Therefore, even when a decoded image for one frame is obtained, its image quality may be reduced. As a result, the period during which the image quality of the decoded image is reduced, due to an error occurring on the receiving side during transmission of the encoded data, may increase.
- encoding control may be performed so as to return the intra-stripe position to the initial position.
- the encoding control unit 213 may return the position of the intra area to the initial position.
- the intra area is a partial area of the frame composed of a plurality of blocks arranged in the vertical direction of the frame, and is moved to the right frame by frame.
- the encoding control unit 213 may return the position of the intra area to the left end of the frame.
- the encoding control unit 213 controls the encoding unit 211 to move the intra-stripe position of Pic1 to the initial position (left end of the frame).
- an intra-stripe decoded image can be obtained without reducing image quality. Therefore, when a decoded image for one frame is obtained, a frame image whose image quality is not reduced can be obtained. Therefore, it is possible to suppress an increase in the period during which the image quality of the decoded image is reduced due to the occurrence of an error on the receiving side when the encoded data of the image is transmitted.
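The error-triggered reset described above can be sketched as a small extension of the cyclic schedule: normally the stripe advances by one, but when error information arrives the position jumps back to the initial (leftmost) stripe so a full refresh restarts from the frame edge. The function below is an illustrative sketch of this control, with position 0 assumed to be the left end.

```python
def next_stripe_position(prev_position: int, n_stripes: int,
                         error_detected: bool) -> int:
    # Normal operation: the intra stripe advances cyclically.
    # On error feedback: reset to the initial position (left end),
    # so the subsequent N frames refresh the whole picture in order.
    if error_detected:
        return 0
    return (prev_position + 1) % n_stripes
```

The benefit of the reset is that the refresh sweep after an error always starts from a known edge, so exactly N frames later the entire decoded picture is clean again.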
- step S401 the encoding control unit 213 determines whether or not an error has been detected. If it is determined that an error has been detected, the process proceeds to step S402.
- step S402 the encoding control unit 213 controls the encoding unit 211 to return the intra stripe to the left end (initial position) of the frame.
- step S402 ends, the encoding control process ends, and the process proceeds to step S207 in FIG.
- if it is determined in step S401 that no error has been detected, the process of step S402 is skipped, the encoding control process ends, and the process proceeds to step S207 in FIG.
- the intra-stripe boundary may be mode-constrained so that error data is not propagated from the error area.
- VVC described in Non-Patent Document 4
- each frame of a moving image includes an intra frame (I), which is a frame to which intra encoding is performed, and an inter frame (P), which is a frame to which inter encoding is performed.
- it may be controlled to insert an intra frame.
- the moving image to be transmitted includes an intra-frame that is an intra-encoded frame.
- the encoding control unit 213 may set the frame to be encoded next as an intra frame.
- the encoding control unit 213 controls the encoding unit 211 to set Pic1 as an intra frame.
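This intra-frame insertion control reduces to a per-frame decision: when error feedback has arrived, the next frame is encoded as an intra frame (I); otherwise inter coding (P) continues. The sketch below is illustrative; real control would also weigh the bitrate cost of an I frame.

```python
def choose_frame_type(error_detected: bool) -> str:
    # On error feedback, force the next frame to be intra-coded (I),
    # which breaks the prediction chain and stops error propagation;
    # otherwise continue inter coding (P) for efficiency.
    return "I" if error_detected else "P"
```

Compared with the intra-stripe reset, this recovers the whole picture in a single frame, at the cost of a one-frame spike in code amount that the rate control must absorb.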
- step S431 the encoding control unit 213 determines whether or not an error has been detected. If it is determined that an error has been detected, the process proceeds to step S432.
- step S432 the encoding control unit 213 controls the encoding unit 211 to insert intra frames.
- step S432 ends, the encoding control process ends, and the process proceeds to step S207 in FIG.
- if it is determined in step S431 that no error has been detected, the process of step S432 is skipped, the encoding control process ends, and the process proceeds to step S207 in FIG.
- bitstream transmission and error information transmission may be performed in the same channel (same frequency band).
- bitstream transmission takes place on the downlink 511 of the wireless network 501 and the transmission of the error information takes place on the uplink 512 of the same wireless network 501 (i.e. the same frequency band).
- bitstream transmission can be performed by large-capacity use-case (eMBB) communication, and error information transmission can be performed by low-delay use-case (URLLC) communication.
- that is, the first wireless communication channel that transmits the bitstream may be the downlink of the same frequency band as the second wireless communication channel that transmits the error information. In other words, the second wireless communication channel may be the uplink of the same frequency band as the first wireless communication channel. For this error information transmission, a wireless communication path that satisfies the requirements of URLLC (Ultra Reliable Low Latency Communication) of the communication system may be used.
- control may be performed to stop eMBB communication during the error period.
- bitstream transmission and error information transmission may be performed in different network slices.
- network slicing allows a network to be virtually divided into multiple network slices, each of which can be used independently. Such a function may be used to perform bitstream transmission and error information transmission.
- bitstream transmission is performed in a certain network slice 551 of the 5G network 541, and error information transmission is performed in another network slice 552 of the same 5G network 541.
- bitstream transmission can be performed by large-capacity use-case (eMBB) communication
- error information transmission can be performed by low-delay use-case (URLLC) communication.
- that is, the first wireless communication channel that transmits the bitstream may be a network slice different from the second wireless communication channel that transmits the error information. In other words, the second wireless communication channel may be a network slice different from the first wireless communication channel, and may be a wireless communication path that satisfies the requirements of URLLC (Ultra Reliable Low Latency Communication).
- bitstream transmission and error information transmission may be performed in communication paths of different wireless communication standards.
- bitstream transmission is performed in a wireless network 571
- error information transmission is performed in a wireless network 572 with a communication standard different from that of the wireless network 571.
- the wireless network 571 may be, for example, a wireless communication path conforming to the IMT (International Mobile Telecommunications)-Advanced standard (hereinafter also referred to as 4G). Also, the wireless network 571 may be a wireless communication path conforming to LTE (Long Term Evolution) established by 3GPP (Third Generation Partnership Project). Furthermore, the wireless network 571 may be a wireless communication channel (hereinafter also referred to as Wi-Fi (registered trademark)) using the IEEE (Institute of Electrical and Electronics Engineers) 802.11 standard. Of course, the wireless network 571 may be a communication path other than these communication standards. On the other hand, in the wireless network 572, for example, a 5G wireless communication path may be used.
- bitstream transmission can be performed by large-capacity communication, and error information can be transmitted by low-delay use case (URLLC) communication.
- that is, the first wireless communication channel that transmits the bitstream may be a wireless communication channel conforming to the IMT (International Mobile Telecommunications)-Advanced standard defined by the International Telecommunication Union, to LTE (Long Term Evolution) established by the 3GPP (Third Generation Partnership Project), or to the IEEE (Institute of Electrical and Electronics Engineers) 802.11 standard.
- the second wireless communication channel that transmits the error information may be a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of a wireless communication system that satisfies the regulation IMT-2020 defined by the International Telecommunication Union.
- ⁇ Computer> The series of processes described above can be executed by hardware or by software.
- a program that constitutes the software is installed in the computer.
- the computer includes, for example, a computer built into dedicated hardware and a general-purpose personal computer capable of executing various functions by installing various programs.
- FIG. 21 is a block diagram showing an example of the hardware configuration of a computer that executes the series of processes described above by a program.
- in the computer shown in FIG. 21, a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903 are interconnected via a bus 904.
- An input/output interface 910 is also connected to the bus 904 .
- An input unit 911 , an output unit 912 , a storage unit 913 , a communication unit 914 and a drive 915 are connected to the input/output interface 910 .
- the input unit 911 consists of, for example, a keyboard, mouse, microphone, touch panel, input terminals, and the like.
- the output unit 912 includes, for example, a display, a speaker, an output terminal, and the like.
- the storage unit 913 is composed of, for example, a hard disk, a RAM disk, a nonvolatile memory, or the like.
- the communication unit 914 is composed of, for example, a network interface.
- Drive 915 drives removable media 921 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
- the CPU 901 loads, for example, a program stored in the storage unit 913 into the RAM 903 via the input/output interface 910 and the bus 904, and executes it, whereby the above-described series of processes is performed.
- the RAM 903 also appropriately stores data necessary for the CPU 901 to execute various processes.
- The program executed by the computer can be provided by being recorded on the removable medium 921 as a package medium or the like, for example.
- In that case, the program can be installed in the storage unit 913 via the input/output interface 910 by loading the removable medium 921 into the drive 915.
- This program can also be provided via wired or wireless transmission media such as local area networks, the Internet, and digital satellite broadcasting.
- the program can be received by the communication unit 914 and installed in the storage unit 913 .
- this program can be installed in the ROM 902 or the storage unit 913 in advance.
- This technology can be applied to any image encoding/decoding method.
- This technology can be applied to any configuration.
- For example, the present technology can be applied to various electronic devices, such as transmitters and receivers (for example, television receivers and mobile phones) in satellite broadcasting, distribution on the Internet, and distribution to terminals by cellular communication, or devices (for example, hard disk recorders and cameras) that record images on media such as optical discs, magnetic discs, and flash memories and reproduce images from these storage media.
- Further, for example, the present technology can be implemented as part of the configuration of a device, such as a processor (for example, a video processor) as a system LSI (Large Scale Integration) or the like, a module (for example, a video module) using a plurality of processors or the like, a unit (for example, a video unit) using a plurality of modules or the like, or a set (for example, a video set) in which other functions are further added to the unit.
- the present technology can also be applied to a network system configured by a plurality of devices.
- the present technology may be implemented as cloud computing in which a plurality of devices share and jointly process via a network.
- Further, for example, this technology can also be implemented in a cloud service that provides services related to images (moving images) to arbitrary terminals such as computers, AV (Audio Visual) equipment, portable information processing terminals, and IoT (Internet of Things) devices.
- In this specification, a system means a set of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device housing a plurality of modules in one housing, are both systems.
- Systems, devices, processing units, etc., to which this technology is applied can be used in any field, such as transportation, medical care, crime prevention, agriculture, livestock industry, mining, beauty, factories, home appliances, weather, and nature monitoring. Moreover, their uses are arbitrary.
- this technology can be applied to systems and devices used to provide viewing content. Further, for example, the present technology can also be applied to systems and devices used for traffic, such as traffic condition supervision and automatic driving control. Further, for example, the technology can be applied to systems and devices that serve security purposes. Also, for example, the present technology can be applied to systems and devices used for automatic control of machines and the like. Furthermore, for example, the technology can be applied to systems and devices used in agriculture and animal husbandry. The present technology can also be applied to systems and devices that monitor natural conditions such as volcanoes, forests, oceans, and wildlife. Further, for example, the technology can be applied to systems and devices used for sports.
- Embodiments of the present technology are not limited to the above-described embodiments, and various modifications are possible without departing from the gist of the present technology.
- a configuration described as one device may be divided and configured as a plurality of devices (or processing units).
- the configuration described above as a plurality of devices (or processing units) may be collectively configured as one device (or processing unit).
- Part of the configuration of one device (or processing unit) may be included in the configuration of another device (or other processing unit) as long as the configuration and operation of the system as a whole are substantially the same.
- the above-described program may be executed on any device.
- the device should have the necessary functions (functional blocks, etc.) and be able to obtain the necessary information.
- each step of one flowchart may be executed by one device, or may be executed by a plurality of devices.
- the plurality of processes may be executed by one device, or may be shared by a plurality of devices.
- a plurality of processes included in one step can also be executed as processes of a plurality of steps.
- the processing described as multiple steps can also be collectively executed as one step.
- A program executed by a computer may be configured such that the processing of the steps described in the program is executed in chronological order according to the order described in this specification, executed in parallel, or executed individually at necessary timings such as when a call is made. In other words, as long as no contradiction arises, the processing of each step may be executed in an order different from the order described above. Furthermore, the processing of the steps describing this program may be executed in parallel with the processing of other programs, or may be executed in combination with the processing of other programs.
- the present technology can also take the following configuration.
- (1) An information processing apparatus comprising: an error information acquisition unit that acquires error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel; and an encoding control unit that controls encoding of the moving image based on the error information acquired by the error information acquisition unit.
- (2) The information processing apparatus according to (1), wherein the moving image is intra-encoded with a part of each frame set as an intra region when the frame is encoded;
- the position of the intra region is moved in a predetermined direction for each frame so as to cycle in a predetermined number of frames; and
- the encoding control unit returns the position of the intra region to its initial position when the error information acquisition unit acquires the error information.
- (3) The information processing apparatus according to (2), wherein the intra region is a partial region of the frame composed of a plurality of blocks arranged in the vertical direction of the frame; the position of the intra region is shifted rightward for each frame, with the left end of the frame being the initial position; and
- when the error information acquisition unit acquires the error information, the encoding control unit returns the position of the intra region to the left end of the frame.
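The intra-region refresh cycle described in (2) and (3) can be sketched as follows. This is an illustrative sketch only, not the patent's actual implementation; the class and attribute names are hypothetical. A stripe of intra-coded blocks sweeps rightward one column per frame, and an error report snaps it back to the initial position (the left edge) so a clean refresh restarts immediately.

```python
class IntraStripeController:
    """Hypothetical controller for a cycling intra-stripe position."""

    def __init__(self, num_columns: int):
        self.num_columns = num_columns  # stripe positions in one refresh cycle
        self.position = 0               # left edge of the frame is the initial position

    def next_position(self, error_reported: bool) -> int:
        """Return the stripe column to intra-code for the next frame."""
        if error_reported:
            # Error information arrived: restart the refresh from the left edge.
            self.position = 0
        col = self.position
        # Move one column rightward each frame, wrapping after num_columns frames.
        self.position = (self.position + 1) % self.num_columns
        return col
```

With four columns, the stripe visits columns 0, 1, 2, 3 and wraps; an error at any point forces the next frame's stripe back to column 0, so the whole picture is re-intra-coded within one cycle.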
- (4) The information processing apparatus according to (1), wherein the moving image includes an intra frame, which is an intra-encoded frame; and
- when the error information acquisition unit acquires the error information, the encoding control unit sets the frame to be encoded next as the intra frame.
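The frame-type decision in (4) amounts to a small rule, sketched below under the assumption of a periodic GOP structure (the function name and `gop_size` parameter are illustrative, not from the patent): normally an intra frame appears only at the start of each GOP, but receipt of error information forces the very next frame to be intra-coded so the error stops propagating.

```python
def next_frame_type(frame_index: int, gop_size: int, error_received: bool) -> str:
    """Return 'I' for an intra frame, 'P' for an inter frame."""
    if error_received:
        # Error information acquired: refresh the whole picture immediately.
        return "I"
    # Otherwise, intra-code only at the periodic GOP boundary.
    return "I" if frame_index % gop_size == 0 else "P"
```

The trade-off noted elsewhere in the document applies: the forced intra frame costs bitrate, but only when an error has actually been reported over the low-delay channel.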
- (5) The information processing apparatus according to any one of (1) to (4), wherein the error information includes information indicating an error in receiving the encoded data.
- (6) The information processing apparatus according to any one of (1) to (5), wherein the error information includes information indicating an error during decoding of the encoded data.
- (7) The information processing apparatus according to any one of (1) to (6), further comprising an encoding unit that encodes the moving image and generates the encoded data,
- wherein the encoding control unit controls the encoding unit based on the error information acquired by the error information acquisition unit.
- (8) The information processing apparatus according to any one of (1) to (7), further comprising a wireless communication channel state monitoring unit that monitors the state of the first wireless communication channel,
- wherein the encoding control unit further controls encoding of the moving image based on the state of the first wireless communication channel monitored by the wireless communication channel state monitoring unit.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel and the second wireless communication channel are communication channels in different frequency bands.
- (10) The information processing apparatus according to (9), wherein the first wireless communication channel is a communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and the second wireless communication channel is a communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
- (11) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel is a downlink in the same frequency band as the second wireless communication channel and is a wireless communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and
- the second wireless communication channel is an uplink in the same frequency band as the first wireless communication channel and is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
- (12) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel is a network slice different from the second wireless communication channel and is a wireless communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and
- the second wireless communication channel is a network slice different from the first wireless communication channel and is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
- (13) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel and the second wireless communication channel are communication channels of different wireless communication standards.
- (14) The information processing apparatus according to (13), wherein the first wireless communication channel is a wireless communication channel conforming to the IMT (International Mobile Telecommunications)-Advanced standard defined by the International Telecommunication Union, a wireless communication channel conforming to LTE (Long Term Evolution) established by 3GPP (Third Generation Partnership Project), or a wireless communication channel using the IEEE (Institute of Electrical and Electronics Engineers) 802.11 standard, and
- the second wireless communication channel is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of a wireless communication system that satisfies the IMT-2020 specification defined by the International Telecommunication Union.
- (15) An information processing method comprising: acquiring error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel; and controlling encoding of the moving image based on the acquired error information.
- (16) An information processing apparatus comprising: a data receiving unit that receives encoded data of a moving image transmitted via a first wireless communication channel; and an error information transmission unit that transmits error information, which is information indicating an error related to the encoded data received by the data receiving unit, to the transmission source of the encoded data via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel.
- (17) The information processing apparatus according to (16), further comprising a reception error detection unit that detects a reception error of the encoded data received by the data receiving unit,
- wherein the error information includes information indicating the reception error detected by the reception error detection unit.
- (18) The information processing apparatus according to (16) or (17), wherein the error information includes information indicating a decoding error related to decoding of the encoded data received by the data receiving unit.
- (19) The information processing apparatus according to (18), further comprising a decoding unit that decodes the encoded data received by the data receiving unit,
- wherein the error information transmission unit acquires the information indicating the decoding error supplied from the decoding unit and transmits the error information including the information indicating the decoding error.
- (20) An information processing method comprising: receiving encoded data of a moving image transmitted via a first wireless communication channel; and transmitting error information, which is information indicating an error related to the encoded data, to the transmission source of the encoded data via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel.
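The receiving-side method of (20) can be sketched as a small feedback loop. This is a minimal sketch under stated assumptions, not the patent's implementation: the channel objects are hypothetical stand-ins, and a CRC-32 check stands in for whatever reception-error detection the data receiving unit performs. On error, the error information is handed to a callback representing the low-delay second channel.

```python
import zlib


def receive_and_report(packet: bytes, expected_crc: int, report_error) -> bool:
    """Process one packet of encoded data received on the first channel.

    On a detected error, call report_error(info) — standing in for the error
    information transmission unit sending over the low-delay second channel.
    Returns True if the packet was accepted.
    """
    if zlib.crc32(packet) != expected_crc:
        # Reception error detected by the (hypothetical) CRC check.
        report_error({"type": "reception_error"})
        return False
    # ... actual decoding would happen here; a decode failure would likewise
    # be reported as {"type": "decoding_error"} on the low-delay channel.
    return True
```

The point of the design is that the report path is independent of the data path: even while the high-bandwidth first channel is congested, the error reaches the encoder quickly enough for it to reset the intra refresh or insert an intra frame.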
- 100 image transmission system, 111 image encoding device, 112 image decoding device, 121 wireless network, 122 wireless network, 211 encoding unit, 212 communication unit, 213 encoding control unit, 221 data transmission unit, 222 network status monitoring unit, 223 error information monitoring unit, 255 encoding unit, 262 prediction unit, 271 inter prediction unit, 272 intra prediction unit, 311 communication unit, 312 decoding control unit, 313 decoding unit, 321 data reception unit, 322 reception error detection unit, 323 error information transmission unit, 352 decoding unit, 359 prediction unit, 501 wireless network, 511 downlink, 512 uplink, 541 5G network, 551 and 552 network slices, 571 wireless network, 572 wireless network, 900 computer
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
Description
1. Delay in error handling
2. First embodiment (image transmission system)
3. Second embodiment (encoding control 1)
4. Third embodiment (encoding control 2)
5. Fourth embodiment (another example of the image transmission system)
6. Appendix
<Documents supporting technical content and terminology>
The scope disclosed in the present technology includes not only the content described in the embodiments but also the content described in the following non-patent documents and patent documents that were publicly known at the time of filing.
Non-Patent Document 2: Recommendation ITU-T H.264 (04/2017) "Advanced video coding for generic audiovisual services", April 2017
Non-Patent Document 3: Recommendation ITU-T H.265 (02/18) "High efficiency video coding", February 2018
Non-Patent Document 4: Benjamin Bross, Jianle Chen, Shan Liu, Ye-Kui Wang, "Versatile Video Coding (Draft 7)", JVET-P2001-vE, Joint Video Experts Team (JVET) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 16th Meeting: Geneva, CH, 1-11 Oct 2019
Non-Patent Document 5: Satoshi Nagata et al., "3GPP Release 15 Standardization Technology Overview", https://www.nttdocomo.co.jp/binary/pdf/corporate/technology/rd/technical_journal/bn/vol26_3/vol26_3_007jp.pdf
Patent Document 1: Japanese Patent Application Laid-Open No. 2010-062946
Conventionally, various systems have been developed as image transmission systems that transmit image data. For example, systems that transmit moving images and the like using wireless communication have been developed. In general, image data such as moving images has a large data size, so it has been considered to encode (compress) the data before transmission.
Therefore, the error information is transmitted via wireless communication that is different from the wireless communication channel used for transmitting the bitstream and that has a lower delay than the wireless communication channel used for transmitting the bitstream.
<Image Transmission System>
FIG. 3 is a diagram showing a main configuration example of an image transmission system to which the present technology is applied. The image transmission system 100 shown in FIG. 3 is a system that transmits moving images. As shown in FIG. 3, the image transmission system 100 includes an image encoding device 111 and an image decoding device 112. The image encoding device 111 and the image decoding device 112 are communicably connected to each other via a wireless network 121. The image encoding device 111 and the image decoding device 112 are also communicably connected to each other via a wireless network 122.
FIG. 4 is a block diagram showing a main configuration example of the image encoding device 111 in FIG. 3.
FIG. 5 is a block diagram showing a main configuration example of the encoding unit 211 in FIG. 4.
FIG. 6 is a block diagram showing a main configuration example of the image decoding device 112 in FIG. 3.
FIG. 7 is a block diagram showing a main configuration example of the decoding unit 313 in FIG. 6.
Next, processing executed in the image transmission system 100 will be described. An example of the flow of the image encoding processing executed by the image encoding device 111 will be described with reference to the flowchart in FIG. 8.
Next, an example of the flow of the image decoding processing executed by the image decoding device 112 will be described with reference to the flowchart in FIG. 9.
<Intra Stripe>
The method of encoding control for preventing an error from propagating to subsequent frames is arbitrary. For example, when a technique called intra stripe is applied in image encoding, this intra stripe may be used.
Therefore, when error information is acquired, encoding control may be performed so as to return the position of the intra stripe to its initial position.
An example of the flow of the encoding control processing executed in step S206 of FIG. 8 in that case will be described with reference to the flowchart in FIG. 15.
<Insertion of Intra Frames>
For example, as shown in A of FIG. 11, assume that each frame of a moving image is either an intra frame (I), which is a frame subjected to intra coding, or an inter frame (P), which is a frame subjected to inter coding. In that case, as shown in FIG. 16, control may be performed so as to insert an intra frame.
An example of the flow of the encoding control processing executed in step S206 of FIG. 8 in that case will be described with reference to the flowchart in FIG. 17.
<Other Configurations of the Image Transmission System 1>
The configuration of the image transmission system 100 is not limited to the example in FIG. 3. For example, as shown in FIG. 18, the bitstream transmission and the error information transmission may be performed on the same channel (the same frequency band).
Also, for example, as shown in FIG. 19, the bitstream transmission and the error information transmission may be performed in different network slices. For example, in 5G, network slicing allows a network to be virtually divided into a plurality of network slices, each of which can be used separately. Such a function may be used to transmit the bitstream and the error information.
Also, for example, as shown in FIG. 20, the bitstream transmission and the error information transmission may be performed on communication channels of different wireless communication standards.
<Computer>
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program that constitutes the software is installed in a computer. Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
The present technology can be applied to any image encoding/decoding method.
Systems, devices, processing units, etc., to which the present technology is applied can be used in any field, such as transportation, medical care, crime prevention, agriculture, livestock industry, mining, beauty, factories, home appliances, weather, and nature monitoring. Their uses are also arbitrary.
Embodiments of the present technology are not limited to the above-described embodiments, and various modifications are possible without departing from the gist of the present technology.
(1) An information processing apparatus comprising: an error information acquisition unit that acquires error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel; and
an encoding control unit that controls encoding of the moving image based on the error information acquired by the error information acquisition unit.
(2) The information processing apparatus according to (1), wherein the moving image is intra-encoded with a part of each frame set as an intra region when the frame is encoded,
the position of the intra region is moved in a predetermined direction for each frame so as to cycle in a predetermined number of frames, and
the encoding control unit returns the position of the intra region to its initial position when the error information acquisition unit acquires the error information.
(3) The information processing apparatus according to (2), wherein the intra region is a partial region of the frame composed of a plurality of blocks arranged in the vertical direction of the frame,
the position of the intra region is moved rightward for each frame with the left end of the frame as the initial position, and
the encoding control unit returns the position of the intra region to the left end of the frame when the error information acquisition unit acquires the error information.
(4) The information processing apparatus according to (1), wherein the moving image includes an intra frame, which is an intra-encoded frame, and
the encoding control unit sets the frame to be encoded next as the intra frame when the error information acquisition unit acquires the error information.
(5) The information processing apparatus according to any one of (1) to (4), wherein the error information includes information indicating an error in receiving the encoded data.
(6) The information processing apparatus according to any one of (1) to (5), wherein the error information includes information indicating an error during decoding of the encoded data.
(7) The information processing apparatus according to any one of (1) to (6), further comprising an encoding unit that encodes the moving image and generates the encoded data,
wherein the encoding control unit controls the encoding unit based on the error information acquired by the error information acquisition unit.
(8) The information processing apparatus according to any one of (1) to (7), further comprising a wireless communication channel state monitoring unit that monitors the state of the first wireless communication channel,
wherein the encoding control unit further controls encoding of the moving image based on the state of the first wireless communication channel monitored by the wireless communication channel state monitoring unit.
(9) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel and the second wireless communication channel are communication channels in different frequency bands.
(10) The information processing apparatus according to (9), wherein the first wireless communication channel is a communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and
the second wireless communication channel is a communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
(11) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel is a downlink in the same frequency band as the second wireless communication channel and is a wireless communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and
the second wireless communication channel is an uplink in the same frequency band as the first wireless communication channel and is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
(12) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel is a network slice different from the second wireless communication channel and is a wireless communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and
the second wireless communication channel is a network slice different from the first wireless communication channel and is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
(13) The information processing apparatus according to any one of (1) to (8), wherein the first wireless communication channel and the second wireless communication channel are communication channels of different wireless communication standards.
(14) The information processing apparatus according to (13), wherein the first wireless communication channel is a wireless communication channel conforming to the IMT (International Mobile Telecommunications)-Advanced standard defined by the International Telecommunication Union, a wireless communication channel conforming to LTE (Long Term Evolution) established by 3GPP (Third Generation Partnership Project), or a wireless communication channel using the IEEE (Institute of Electrical and Electronics Engineers) 802.11 standard, and
the second wireless communication channel is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of a wireless communication system that satisfies the IMT-2020 specification defined by the International Telecommunication Union.
(15) An information processing method comprising: acquiring error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel; and
controlling encoding of the moving image based on the acquired error information.
(16) An information processing apparatus comprising: a data receiving unit that receives encoded data of a moving image transmitted via a first wireless communication channel; and
an error information transmission unit that transmits error information, which is information indicating an error related to the encoded data received by the data receiving unit, to the transmission source of the encoded data via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel.
(17) The information processing apparatus according to (16), further comprising a reception error detection unit that detects a reception error of the encoded data received by the data receiving unit,
wherein the error information includes information indicating the reception error detected by the reception error detection unit.
(18) The information processing apparatus according to (16) or (17), wherein the error information includes information indicating a decoding error related to decoding of the encoded data received by the data receiving unit.
(19) The information processing apparatus according to (18), further comprising a decoding unit that decodes the encoded data received by the data receiving unit,
wherein the error information transmission unit acquires the information indicating the decoding error supplied from the decoding unit and transmits the error information including the information indicating the decoding error.
(20) An information processing method comprising: receiving encoded data of a moving image transmitted via a first wireless communication channel; and
transmitting error information, which is information indicating an error related to the encoded data, to the transmission source of the encoded data via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel.
Claims (20)
- An information processing apparatus comprising: an error information acquisition unit that acquires error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel; and an encoding control unit that controls encoding of the moving image based on the error information acquired by the error information acquisition unit.
- The information processing apparatus according to claim 1, wherein the moving image is intra-encoded with a part of each frame set as an intra region when the frame is encoded, the position of the intra region is moved in a predetermined direction for each frame so as to cycle in a predetermined number of frames, and the encoding control unit returns the position of the intra region to its initial position when the error information acquisition unit acquires the error information.
- The information processing apparatus according to claim 2, wherein the intra region is a partial region of the frame composed of a plurality of blocks arranged in the vertical direction of the frame, the position of the intra region is moved rightward for each frame with the left end of the frame as the initial position, and the encoding control unit returns the position of the intra region to the left end of the frame when the error information acquisition unit acquires the error information.
- The information processing apparatus according to claim 1, wherein the moving image includes an intra frame, which is an intra-encoded frame, and the encoding control unit sets the frame to be encoded next as the intra frame when the error information acquisition unit acquires the error information.
- The information processing apparatus according to claim 1, wherein the error information includes information indicating an error in receiving the encoded data.
- The information processing apparatus according to claim 1, wherein the error information includes information indicating an error during decoding of the encoded data.
- The information processing apparatus according to claim 1, further comprising an encoding unit that encodes the moving image and generates the encoded data, wherein the encoding control unit controls the encoding unit based on the error information acquired by the error information acquisition unit.
- The information processing apparatus according to claim 1, further comprising a wireless communication channel state monitoring unit that monitors the state of the first wireless communication channel, wherein the encoding control unit further controls encoding of the moving image based on the state of the first wireless communication channel monitored by the wireless communication channel state monitoring unit.
- The information processing apparatus according to claim 1, wherein the first wireless communication channel and the second wireless communication channel are communication channels in different frequency bands.
- The information processing apparatus according to claim 9, wherein the first wireless communication channel is a communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and the second wireless communication channel is a communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
- The information processing apparatus according to claim 1, wherein the first wireless communication channel is a downlink in the same frequency band as the second wireless communication channel and is a wireless communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and the second wireless communication channel is an uplink in the same frequency band as the first wireless communication channel and is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
- The information processing apparatus according to claim 1, wherein the first wireless communication channel is a network slice different from the second wireless communication channel and is a wireless communication channel that satisfies the eMBB (enhanced Mobile Broadband) requirements of a wireless communication system that satisfies the IMT (International Mobile Telecommunications)-2020 specification defined by the International Telecommunication Union, and the second wireless communication channel is a network slice different from the first wireless communication channel and is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of the wireless communication system.
- The information processing apparatus according to claim 1, wherein the first wireless communication channel and the second wireless communication channel are communication channels of different wireless communication standards.
- The information processing apparatus according to claim 13, wherein the first wireless communication channel is a wireless communication channel conforming to the IMT (International Mobile Telecommunications)-Advanced standard defined by the International Telecommunication Union, a wireless communication channel conforming to LTE (Long Term Evolution) established by 3GPP (Third Generation Partnership Project), or a wireless communication channel using the IEEE (Institute of Electrical and Electronics Engineers) 802.11 standard, and the second wireless communication channel is a wireless communication channel that satisfies the URLLC (Ultra Reliable Low Latency Communication) requirements of a wireless communication system that satisfies the IMT-2020 specification defined by the International Telecommunication Union.
- An information processing method comprising: acquiring error information transmitted from a receiving device, which receives encoded data of a moving image transmitted via a first wireless communication channel, via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel; and controlling encoding of the moving image based on the acquired error information.
- An information processing apparatus comprising: a data receiving unit that receives encoded data of a moving image transmitted via a first wireless communication channel; and an error information transmission unit that transmits error information, which is information indicating an error related to the encoded data received by the data receiving unit, to the transmission source of the encoded data via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel.
- The information processing apparatus according to claim 16, further comprising a reception error detection unit that detects a reception error of the encoded data received by the data receiving unit, wherein the error information includes information indicating the reception error detected by the reception error detection unit.
- The information processing apparatus according to claim 16, wherein the error information includes information indicating a decoding error related to decoding of the encoded data received by the data receiving unit.
- The information processing apparatus according to claim 18, further comprising a decoding unit that decodes the encoded data received by the data receiving unit, wherein the error information transmission unit acquires the information indicating the decoding error supplied from the decoding unit and transmits the error information including the information indicating the decoding error.
- An information processing method comprising: receiving encoded data of a moving image transmitted via a first wireless communication channel; and transmitting error information, which is information indicating an error related to the encoded data, to the transmission source of the encoded data via a second wireless communication channel capable of lower-delay transmission than the first wireless communication channel.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022579393A JPWO2022168516A1 (ja) | 2021-02-08 | 2022-01-05 | |
US18/263,386 US20240080457A1 (en) | 2021-02-08 | 2022-01-05 | Information processing apparatus and information processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021017957 | 2021-02-08 | ||
JP2021-017957 | 2021-02-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022168516A1 true WO2022168516A1 (ja) | 2022-08-11 |
Family
ID=82742294
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/000078 WO2022168516A1 (ja) | 2021-02-08 | 2022-01-05 | 情報処理装置および方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240080457A1 (ja) |
JP (1) | JPWO2022168516A1 (ja) |
WO (1) | WO2022168516A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004193850A (ja) * | 2002-12-10 | 2004-07-08 | Sony Corp | 符号化制御方法及び符号化制御プログラム |
US20080165246A1 (en) * | 2007-01-06 | 2008-07-10 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling intra-refreshing in a video telephony communication system |
WO2013033676A1 (en) * | 2011-09-02 | 2013-03-07 | Microsoft Corporation | Video refresh with error propagation tracking and error feedback from receiver |
WO2013033679A1 (en) * | 2011-09-02 | 2013-03-07 | Microsoft Corporation | Video refresh using error-free reference frames |
WO2014002385A1 (ja) * | 2012-06-25 | 2014-01-03 | 日本電気株式会社 | 映像符号化/復号装置、方法、プログラム |
-
2022
- 2022-01-05 WO PCT/JP2022/000078 patent/WO2022168516A1/ja active Application Filing
- 2022-01-05 JP JP2022579393A patent/JPWO2022168516A1/ja active Pending
- 2022-01-05 US US18/263,386 patent/US20240080457A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004193850A (ja) * | 2002-12-10 | 2004-07-08 | Sony Corp | 符号化制御方法及び符号化制御プログラム |
US20080165246A1 (en) * | 2007-01-06 | 2008-07-10 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling intra-refreshing in a video telephony communication system |
WO2013033676A1 (en) * | 2011-09-02 | 2013-03-07 | Microsoft Corporation | Video refresh with error propagation tracking and error feedback from receiver |
WO2013033679A1 (en) * | 2011-09-02 | 2013-03-07 | Microsoft Corporation | Video refresh using error-free reference frames |
WO2014002385A1 (ja) * | 2012-06-25 | 2014-01-03 | 日本電気株式会社 | 映像符号化/復号装置、方法、プログラム |
Non-Patent Citations (1)
Title |
---|
SATOSHI NAGATA: "3GPP Release 15 Standardization Technology Overview", NTT DOCOMO TECHNICAL JOURNAL, vol. 26, no. 3, 1 November 2018 (2018-11-01), pages 37 - 46, XP055956407 * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022168516A1 (ja) | 2022-08-11 |
US20240080457A1 (en) | 2024-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10694202B2 (en) | Indication of bilateral filter usage in video coding | |
CN101822057B (zh) | 编码和解码视频数据的方法和装置 | |
EP2829064B1 (en) | Parameter determination for exp-golomb residuals binarization for lossless intra hevc coding | |
JP7517350B2 (ja) | 画像処理装置および方法 | |
KR20170108112A (ko) | 팔레트 코딩 모드를 위한 이스케이프 컬러 코딩 | |
US20130114732A1 (en) | Video and data processing using even-odd integer transforms | |
KR20220062085A (ko) | 비디오 처리에서 양자화 파라미터 시그널링 | |
US20210385456A1 (en) | Image processing apparatus and method | |
JP7494858B2 (ja) | 画像処理装置および方法 | |
EP2801194B1 (en) | Quantization matrix (qm) coding based on weighted prediction | |
US9264715B2 (en) | Moving image encoding method, moving image encoding apparatus, and computer-readable medium | |
WO2022168516A1 (ja) | 情報処理装置および方法 | |
TW202101997A (zh) | 影像處理裝置及方法 | |
WO2019188464A1 (ja) | 画像符号化装置、画像符号化方法、画像復号装置、および画像復号方法 | |
EP3624450A1 (en) | Wavefront parallel processing of luma and chroma components | |
WO2022044845A1 (ja) | 画像処理装置および方法 | |
EP4060995A1 (en) | Image processing device, bit stream generation method, coefficient data generation method, and quantization coefficient generation method | |
US20220312041A1 (en) | Method and apparatus for signaling decoding data using high level syntax elements | |
JP7517348B2 (ja) | 画像処理装置および方法 | |
US20240163437A1 (en) | Image processing device and method | |
WO2023223830A1 (ja) | 送信装置および方法、管理装置および方法、受信装置および方法、プログラム、並びに画像伝送システム | |
WO2023053957A1 (ja) | 画像処理装置および方法 | |
US20230396794A1 (en) | Systems and methods for motion vector predictor list improvements | |
Kazemi | End-to-end distortion modeling and channel adaptive optimization of mixed layer multiple description coding scheme | |
US20220021899A1 (en) | Image encoding apparatus, image encoding method, image decoding apparatus, and image decoding method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22749395 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022579393 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18263386 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22749395 Country of ref document: EP Kind code of ref document: A1 |