JPWO2020023139A5 - Google Patents
- Publication number
- JPWO2020023139A5 (application JP2021503725A)
- Authority
- JP
- Japan
- Prior art keywords
- depth
- pixel
- scene
- pixel value
- perimeter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Claims (15)
1. A system comprising:
a memory; and
a depth ray shader stored in the memory,
wherein the depth ray shader is executable to:
receive a depth map defining a depth associated with each of a plurality of pixels forming a three-dimensional (3D) scene;
define a color gradient between a first pixel value and a second pixel value,
wherein each successive step in the color gradient between the first pixel value and the second pixel value is associated with a corresponding depth of increasing magnitude; and
provide instructions executable by a graphics engine to apply a depth ray layer to a select portion of the 3D scene defined by a perimeter,
wherein the depth ray layer alters values of pixels interior to the perimeter while pixels exterior to the perimeter are unaltered by the depth ray layer, and
wherein the altered pixels interior to the perimeter are indicative of a shape of at least one background object and each assume a pixel value selected from the color gradient that corresponds to the depth associated with the pixel by the depth map.
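The core operation claimed above (recoloring pixels inside a perimeter along a two-color gradient keyed to a depth map, while pixels outside the perimeter stay untouched) can be sketched as follows. This is an illustrative sketch only: the function name `apply_depth_ray_layer`, the NumPy array representation, and the per-region depth normalization are assumptions, not the patented implementation.

```python
import numpy as np

def apply_depth_ray_layer(image, depth_map, mask, color_near, color_far):
    """Recolor masked pixels by interpolating between two RGB colors
    according to normalized depth; pixels outside the mask are unchanged."""
    out = image.astype(np.float64).copy()
    depths = depth_map[mask]
    # Normalize depth within the selected region to [0, 1]; the epsilon
    # guards against a region of constant depth.
    d_min, d_max = depths.min(), depths.max()
    t = (depth_map - d_min) / max(d_max - d_min, 1e-9)
    # Each successive gradient step corresponds to a greater depth.
    gradient = (1.0 - t)[..., None] * np.asarray(color_near, float) \
               + t[..., None] * np.asarray(color_far, float)
    out[mask] = gradient[mask]
    return out.astype(image.dtype)

# Tiny 2x2 scene: left column inside the perimeter, right column outside.
image = np.full((2, 2, 3), 200, dtype=np.uint8)
depth = np.array([[0.0, 0.0], [1.0, 1.0]])
mask = np.array([[True, False], [True, False]])
result = apply_depth_ray_layer(image, depth, mask, (255, 0, 0), (0, 0, 255))
```

In this toy run, the two masked pixels take the near and far gradient colors according to their depths, and the two unmasked pixels keep their original value.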
2. The system of claim 1, wherein the select portion of the 3D scene is bounded by a perimeter surrounding at least one foreground object,
wherein the foreground object is defined by pixels interior to the perimeter and is excluded from the depth ray layer.
3. The system of claim 2, wherein the depth ray layer alters pixel values of at least one background object in the 3D scene.
4. The system of claim 1, wherein the select portion of the 3D scene includes a transparent foreground object,
wherein the depth ray layer alters pixel values interior to boundaries of the transparent foreground object to indicate the shape of at least one background object.
5. The system of claim 1, wherein the depth ray shader defines the color gradient based on pixel values of at least one foreground object within the select portion of the 3D scene,
wherein each value of the defined color gradient provides a threshold contrast ratio with the at least one foreground object.
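The claims do not specify how the threshold contrast ratio is computed. One common formulation, sketched here purely as an assumption, is the WCAG 2.x relative-luminance contrast ratio, used to keep only gradient candidates that contrast sufficiently with the foreground color; all names (`filter_gradient`, etc.) are illustrative.

```python
def _channel(c):
    # Linearize one sRGB channel (0-255) per the WCAG 2.x definition.
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05).
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

def filter_gradient(candidates, foreground, threshold=4.5):
    """Keep only gradient values meeting the contrast threshold
    against the foreground color."""
    return [c for c in candidates if contrast_ratio(c, foreground) >= threshold]

# White passes against a black foreground; dark gray does not.
kept = filter_gradient([(255, 255, 255), (30, 30, 30)], (0, 0, 0))
```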
6. The system of claim 1, wherein the depth ray shader defines the color gradient based on user-selectable color preferences.
7. The system of claim 1, wherein the color gradient defines a range of pixel values of a same hue that vary from one another in at least one of tint, tone, and shade.
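The single-hue gradient of claim 7 can be illustrated by varying only the HSV value channel, producing steps that differ in shade while the hue stays fixed. A minimal sketch, assuming HSV-based shading; the helper name `single_hue_gradient` is hypothetical.

```python
import colorsys

def single_hue_gradient(hue, steps):
    """Return `steps` RGB triples of one hue, darkening step by step
    so that deeper pixels map to darker shades."""
    colors = []
    for i in range(steps):
        value = 1.0 - i / max(steps - 1, 1)  # deeper step -> darker shade
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
        colors.append((round(r * 255), round(g * 255), round(b * 255)))
    return colors

palette = single_hue_gradient(2 / 3, 5)  # five shades of blue
```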
8. A method comprising:
receiving a depth map defining a depth associated with each of a plurality of pixels forming a three-dimensional (3D) scene;
defining a color gradient between a first pixel value and a second pixel value,
wherein each successive step in the color gradient between the first pixel value and the second pixel value is associated with a corresponding depth of increasing magnitude; and
providing instructions executable by a graphics engine to apply a depth ray layer to a select portion of the 3D scene defined by a perimeter,
wherein the depth ray layer alters values of pixels interior to the perimeter while pixels exterior to the perimeter are unaltered by the depth ray layer, and
wherein the altered pixels interior to the perimeter are indicative of a shape of at least one background object and each assume a pixel value included in the color gradient that corresponds to the depth associated with the pixel by the depth map.
9. The method of claim 8, wherein the select portion of the 3D scene is bounded by a perimeter surrounding at least one foreground object,
wherein the foreground object is defined by pixels interior to the perimeter and is excluded from the depth ray layer.
10. The method of claim 8, wherein the depth ray layer alters pixel values of at least one background object in the 3D scene.
11. The method of claim 8, wherein the select portion of the 3D scene includes a transparent foreground object,
wherein the depth ray layer alters pixel values interior to boundaries of the transparent foreground object to indicate the shape of at least one background object.
12. The method of claim 8, wherein defining the color gradient further comprises defining the color gradient based on pixel values of at least one foreground object within the select portion of the 3D scene,
wherein each value of the defined color gradient provides a threshold contrast ratio with the at least one foreground object.
13. The method of claim 8, wherein defining the color gradient further comprises defining the color gradient based on user-selectable color preferences.
14. The method of claim 8, wherein the color gradient defines a range of pixel values of a same hue that vary from one another in at least one of tint, tone, and shade.
15. A tangible computer-readable storage medium storing computer-executable instructions for executing a computer process, the computer process comprising:
receiving a depth map defining a depth associated with each of a plurality of pixels forming a three-dimensional (3D) scene;
defining a color gradient between a first pixel value and a second pixel value,
wherein each successive step in the color gradient between the first pixel value and the second pixel value is associated with a corresponding depth of increasing magnitude; and
providing instructions executable by a graphics engine to apply a depth ray layer to a select portion of the 3D scene defined by a perimeter,
wherein the depth ray layer alters values of pixels interior to the perimeter while pixels exterior to the perimeter are unaltered by the depth ray layer, and
wherein the altered pixels interior to the perimeter are indicative of a shape of at least one background object and each assume a pixel value included in the color gradient that corresponds to the depth associated with the pixel by the depth map.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/042,862 | 2018-07-23 | ||
US16/042,862 US10643398B2 (en) | 2018-07-23 | 2018-07-23 | Depth ray layer for reduced visual noise |
PCT/US2019/037845 WO2020023139A1 (en) | 2018-07-23 | 2019-06-19 | Depth ray layer for reduced visual noise |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2021532470A JP2021532470A (en) | 2021-11-25 |
JPWO2020023139A5 true JPWO2020023139A5 (en) | 2022-05-30 |
JP7422734B2 JP7422734B2 (en) | 2024-01-26 |
Family
ID=67138193
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2021503725A Active JP7422734B2 (en) | 2018-07-23 | 2019-06-19 | Depth ray layer for visual noise reduction |
Country Status (6)
Country | Link |
---|---|
US (2) | US10643398B2 (en) |
EP (1) | EP3827413A1 (en) |
JP (1) | JP7422734B2 (en) |
KR (1) | KR20210037674A (en) |
CN (1) | CN112534479A (en) |
WO (1) | WO2020023139A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020113202A1 (en) * | 2018-11-30 | 2020-06-04 | University Of Southern California | Double-blinded, randomized trial of augmented reality low-vision mobility and grasp aid |
US11423621B1 (en) * | 2020-05-21 | 2022-08-23 | Facebook Technologies, Llc. | Adaptive rendering in artificial reality environments |
Family Cites Families (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6127990A (en) | 1995-11-28 | 2000-10-03 | Vega Vista, Inc. | Wearable display and methods for controlling same |
US6064354A (en) | 1998-07-01 | 2000-05-16 | Deluca; Michael Joseph | Stereoscopic user interface method and apparatus |
TW464817B (en) | 1999-08-18 | 2001-11-21 | Ibm | Technique for creating audience-specific views of documents |
KR100519779B1 (en) * | 2004-02-10 | 2005-10-07 | 삼성전자주식회사 | Method and apparatus for high speed visualization of depth image-based 3D graphic data |
JP4804120B2 (en) * | 2005-11-17 | 2011-11-02 | 株式会社バンダイナムコゲームス | Program, information storage medium, and image generation system |
US8584028B2 (en) * | 2006-10-31 | 2013-11-12 | Microsoft Corporation | Adaptable transparency |
US7969438B2 (en) * | 2007-01-23 | 2011-06-28 | Pacific Data Images Llc | Soft shadows for cinematic lighting for computer graphics |
US8213711B2 (en) * | 2007-04-03 | 2012-07-03 | Her Majesty The Queen In Right Of Canada As Represented By The Minister Of Industry, Through The Communications Research Centre Canada | Method and graphical user interface for modifying depth maps |
US8665258B2 (en) | 2008-09-16 | 2014-03-04 | Adobe Systems Incorporated | Generating a depth map based on a single image |
JP5527856B2 (en) * | 2008-09-25 | 2014-06-25 | コーニンクレッカ フィリップス エヌ ヴェ | 3D image data processing |
US8867820B2 (en) * | 2009-10-07 | 2014-10-21 | Microsoft Corporation | Systems and methods for removing a background of an image |
US20110109617A1 (en) * | 2009-11-12 | 2011-05-12 | Microsoft Corporation | Visualizing Depth |
US8493383B1 (en) * | 2009-12-10 | 2013-07-23 | Pixar | Adaptive depth of field sampling |
US8902229B2 (en) * | 2010-01-13 | 2014-12-02 | Samsung Electronics Co., Ltd. | Method and system for rendering three dimensional views of a scene |
EP2539759A1 (en) | 2010-02-28 | 2013-01-02 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
US8467601B2 (en) * | 2010-09-15 | 2013-06-18 | Kyran Daisy | Systems, methods, and media for creating multiple layers from an image |
US8824834B2 (en) * | 2011-09-23 | 2014-09-02 | Adobe Systems Incorporated | Adaptive sampling guided by multilateral filtering |
US9424767B2 (en) | 2012-06-18 | 2016-08-23 | Microsoft Technology Licensing, Llc | Local rendering of text in image |
US9619911B2 (en) | 2012-11-13 | 2017-04-11 | Qualcomm Incorporated | Modifying virtual object display properties |
US9536345B2 (en) * | 2012-12-26 | 2017-01-03 | Intel Corporation | Apparatus for enhancement of 3-D images using depth mapping and light source synthesis |
US9230368B2 (en) | 2013-05-23 | 2016-01-05 | Microsoft Technology Licensing, Llc | Hologram anchoring and dynamic positioning |
US9953556B2 (en) | 2013-10-04 | 2018-04-24 | University Of Manitoba | Color correction method for optical see-through displays |
CN105979900B (en) * | 2014-02-04 | 2020-06-26 | 皇家飞利浦有限公司 | Visualization of depth and position of blood vessels and robot-guided visualization of blood vessel cross-sections |
AU2014202574A1 (en) | 2014-05-13 | 2015-12-03 | Canon Kabushiki Kaisha | Positioning of projected augmented reality content |
US9659381B2 (en) | 2015-01-26 | 2017-05-23 | Daqri, Llc | Real time texture mapping for augmented reality system |
US10297068B2 (en) * | 2017-06-06 | 2019-05-21 | Adshir Ltd. | Method for ray tracing augmented objects |
US10511895B2 (en) | 2015-10-09 | 2019-12-17 | Warner Bros. Entertainment Inc. | Cinematic mastering for virtual reality and augmented reality |
US10074160B2 (en) * | 2016-09-30 | 2018-09-11 | Disney Enterprises, Inc. | Point cloud noise and outlier removal for image-based 3D reconstruction |
US10122994B2 (en) * | 2016-11-11 | 2018-11-06 | Disney Enterprises, Inc. | Object reconstruction from dense light fields via depth from gradients |
US10467798B2 (en) * | 2016-12-19 | 2019-11-05 | Canon Medical Systems Corporation | Rendering a global illumination image from a volumetric medical imaging data set |
US10432944B2 (en) * | 2017-08-23 | 2019-10-01 | Avalon Holographics Inc. | Layered scene decomposition CODEC system and methods |
JP6942566B2 (en) * | 2017-08-30 | 2021-09-29 | キヤノン株式会社 | Information processing equipment, information processing methods and computer programs |
- 2018-07-23 US US16/042,862 patent/US10643398B2/en active Active
- 2019-06-19 WO PCT/US2019/037845 patent/WO2020023139A1/en unknown
- 2019-06-19 CN CN201980049112.7A patent/CN112534479A/en active Pending
- 2019-06-19 KR KR1020217004087A patent/KR20210037674A/en unknown
- 2019-06-19 JP JP2021503725A patent/JP7422734B2/en active Active
- 2019-06-19 EP EP19735132.3A patent/EP3827413A1/en active Pending
- 2020-04-06 US US16/841,035 patent/US11182977B2/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5990903A (en) | Method and apparatus for performing chroma key, transparency and fog operations | |
US9407790B2 (en) | Method for noise-robust color changes in digital images | |
US8330769B2 (en) | System and method for monochromatic tinting using saturation maps | |
CN1731449A (en) | A method of image restoration | |
JP2009093182A5 (en) | ||
JP7327732B2 (en) | Image data interpolation method | |
CN103617596A (en) | Image color style transformation method based on flow pattern transition | |
US11328399B2 (en) | Method and apparatus, and storage medium for processing style image | |
CN110288670B (en) | High-performance rendering method for UI (user interface) tracing special effect | |
CN104504666A (en) | Tone mapping method based on Laplacian pyramid | |
Seo et al. | Image recoloring using linear template mapping | |
JPWO2020023139A5 (en) | ||
WO2020023139A9 (en) | Depth ray layer for reduced visual noise | |
CN117037724B (en) | Picture display method, device and equipment of ink screen and storage medium | |
CN116630510B (en) | Method, equipment and medium for generating related cone gradual change texture | |
TWI747006B (en) | Picture display method, picture processing method and system | |
US20140093167A1 (en) | Recoloring images of a web page according to a representative color | |
CA2674104A1 (en) | Method and graphical user interface for modifying depth maps | |
US9721328B2 (en) | Method to enhance contrast with reduced visual artifacts | |
CN114596213A (en) | Image processing method and device | |
KR102550124B1 (en) | MultI-pass Layer Type Cartoon Shader Method for Webtoons | |
Zhang et al. | Aesthetic enhancement of landscape photographs as informed by paintings across depth layers | |
JP7417166B2 (en) | Depth map accuracy improvement device, method, and program | |
CN111724449B (en) | Image processing method, device and equipment | |
CN114841852A (en) | Image processing method, device, equipment and storage medium |