CN110689601A - Scatter elimination algorithm suitable for AR virtual soft package synthesis - Google Patents

Scatter elimination algorithm suitable for AR virtual soft package synthesis

Info

Publication number
CN110689601A
CN110689601A (application CN201910946759.1A)
Authority
CN
China
Prior art keywords
dxy
point
scattered
image
scatter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910946759.1A
Other languages
Chinese (zh)
Inventor
徐耀华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaoxing Xiujia Technology Co Ltd
Original Assignee
Shaoxing Xiujia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaoxing Xiujia Technology Co Ltd filed Critical Shaoxing Xiujia Technology Co Ltd
Priority to CN201910946759.1A priority Critical patent/CN110689601A/en
Publication of CN110689601A publication Critical patent/CN110689601A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/04Architectural design, interior design

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a scatter elimination algorithm suitable for AR virtual soft furnishing synthesis, which comprises the following steps: performing an image processing operation on the scatter points within each selected region, in which the points are first eroded and then dilated; locating the scatter points whose inner edges in the selected regions join into lines, blocks, or bands, computing the Gaussian weighted sum of the surrounding pixels one by one along the periphery of each scatter point, and adsorbing the scatter point into a region; when a commodity is replaced, extracting the region selection information to obtain an image M, applying mean filtering to image M, then using image M as a transparency mask to synthesize the scene with the replacement commodity material. The algorithm eliminates scatter points to the maximum extent, with a scatter elimination rate of 98% and a scatter misjudgment rate of 10%. The scatter points are reduced intelligently by computer, and once they are reduced the edge seams are no longer obvious; the feathering fusion technique in step three weakens the visual influence of stray points while letting the rendered material blend better into the scene.

Description

Scatter elimination algorithm suitable for AR virtual soft package synthesis
Technical Field
The invention relates to the technical field of AR virtual soft furnishing synthesis, and in particular to a scatter elimination algorithm suitable for AR virtual soft furnishing synthesis.
Background
In an AR virtual soft furnishing synthesis system, regions such as wall surfaces, floors and windows need to be distinguished. The region information is calibrated manually through other tools of the system module, and the system records it in a mask.png file, in which each multiple of 10 among the gray values represents one region. For example: wall regions are represented by gray values 10-50, the background wall by 60, the floor by 220, the window by 250, and so on, as shown in FIG. 1;
Different gray values in the selection mask represent different selected ranges. However, between the selected regions, and around objects with complex edges such as potted plants, there are many pixels whose gray values do not belong to any selected region; these pixels are called scatter points. The operation of picking out a selected region can hardly avoid flaws, whose severity varies with the scene. In particular, at the seams between regions, if the region information is masked out, the stray scatter points along the edges stand out and can be seen clearly, as shown in FIG. 2;
scatter points cannot be avoided, the scatter points are inevitably introduced in the manual processing, and even if the manually generated template has no scatter points, new scatter points can be generated when the zooming scene is adapted to the mobile phone screen. The scatter points can not be seen from the left side of the upper image, and the right image is a schematic diagram which omits the region selection information and highlights the scatter points. The number of scattered points is different from the number of selected areas and the complexity of the selected areas, the number of scattered points in fig. 2 reaches 1900, and many scattered points can cause thin black/white edges to appear on the edges after the clothes changing, as shown in fig. 3, the visual experience is influenced, although the thin black/white edges are not obvious, the small flaw can be highlighted when the high-definition screen or the large screen is projected, and the visual experience is directly influenced.
Disclosure of Invention
In view of these problems, the invention provides a scatter elimination algorithm suitable for AR virtual soft furnishing synthesis. By applying different algorithmic processing to scatter points of different forms, scatter points can be eliminated to the maximum extent, with a scatter elimination rate of 98% and a scatter misjudgment rate of 10%. Stray points are reduced intelligently by computer, narrowing the gaps at the edge seams; once the scatter points are reduced, the edge seams become less obvious. Step three is invoked each time a commodity replacement operation is performed; its feathering fusion technique weakens the visual influence of stray points and lets the rendered material blend better into the scene.
The invention provides a scatter elimination algorithm suitable for AR virtual soft furnishing synthesis, which comprises the following steps:
Step one: preprocess the scattered stray points in the selected regions. When a scene is loaded for the first time, let i ∈ {10, 20, …, 250} represent the value of each selected region, and process i cyclically in sequence starting from 10;
Take out selected region i and sequentially read the pixel value m(x,y) of each point in the mask to obtain M(x,y), expressed as shown in formula (1):
M(x,y) = 255, when m(x,y) = i; M(x,y) = 0, otherwise  (1)
The M(x,y) values are then stored in a new picture, denoted image M. Image M is subjected to an image processing operation of erosion followed by dilation (a morphological opening), and the result is finally written back to the mask: where the value of M(x,y) is 255, set m(x,y) = i;
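Step one can be sketched in NumPy as follows. This is an illustrative reading rather than the patented implementation: the 3×3 structuring element and the hand-rolled erosion/dilation are assumptions, since the patent does not specify the kernel used for the open operation:

```python
import numpy as np

def binary_erode(img):
    # 3x3 erosion: a pixel survives only if its whole 3x3 neighbourhood is 255
    h, w = img.shape
    p = np.pad(img, 1, constant_values=0)
    out = img.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out = np.minimum(out, p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w])
    return out

def binary_dilate(img):
    # 3x3 dilation: a pixel becomes 255 if any neighbour in its 3x3 window is 255
    h, w = img.shape
    p = np.pad(img, 1, constant_values=0)
    out = img.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out = np.maximum(out, p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w])
    return out

def preprocess_mask(mask):
    out = mask.copy()
    for i in range(10, 251, 10):                          # i in {10, 20, ..., 250}
        M = np.where(mask == i, 255, 0).astype(np.uint8)  # formula (1)
        M = binary_dilate(binary_erode(M))                # erode first, then dilate
        out[M == 255] = i                                 # write back m(x,y) = i
    return out
```

Opening with a 3×3 element removes isolated specks from each region's binary image M, while solid region blocks survive intact.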
Step two: first locate the scatter points, then compute the Gaussian weighted sum of the surrounding pixels one by one along the periphery of each scatter point, and adsorb the scatter point into a region according to the distance in the normal direction;
Specifically: the mask.png picture is processed point by point, and each scatter point found is then processed further. A 25×3 two-dimensional array Dxy is first established and initialized by setting Dxy[k][0] = 100, Dxy[k][1] = 0, Dxy[k][2] = 0,
(the accompanying formula, given as an image in the original and presumed to define the Gaussian weights for the 5×5 window, is not reproduced)
If the coordinates of a scatter point are (x, y), the data inside the 5×5 rectangular window centered on the scatter point are operated on: let l, n ∈ [-2, 2] with l, n integers; when the point (x+l, y+n) in the window lies in a selected region, i.e. m(x+l,y+n) % 10 = 0, set e(l,n) = 1, otherwise e(l,n) = 0; Dxy is then calculated from formula (2):
(formula (2) is given as an image in the original; it accumulates the Gaussian weighted sums Dxy[k][1] and Dxy[k][2] over the window points with e(l,n) = 1)
After Dxy[k][1] and Dxy[k][2] are calculated, a value of k is selected and formula (3) is solved:
Dxy[k][0] = |Dxy[k][2] - g(x,y)| / Dxy[k][1]  (3)
Formula (3) is used to find the minimum value Dxy[k_min][0]; the mark value of the scatter point is modified to m(x,y) = k_min × 10, and the mask is then saved.
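Step two can be sketched as follows. The sketch rests on one plausible reading of the image-only formulas, so treat its details as assumptions: k is taken to index the 25 candidate region values k·10, Dxy[k][1] to accumulate the Gaussian weights of window pixels belonging to region k·10, Dxy[k][2] their Gaussian-weighted gray values from the original effect map g, and σ = 1.0 is chosen arbitrarily for the Gaussian:

```python
import numpy as np

def eliminate_scatter(mask, effect, sigma=1.0):
    # Gaussian weights over the 5x5 window (l, n in [-2, 2])
    ax = np.arange(-2, 3)
    w = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2.0 * sigma ** 2))
    h, wd = mask.shape
    out = mask.copy()
    ys, xs = np.nonzero(mask % 10 != 0)            # scatter points: m % 10 != 0
    for y, x in zip(ys, xs):
        d1 = np.zeros(26)                          # Dxy[k][1]: weight sums
        d2 = np.zeros(26)                          # Dxy[k][2]: weighted gray sums
        for n in range(-2, 3):
            for l in range(-2, 3):
                yy, xx = y + n, x + l
                if (0 <= yy < h and 0 <= xx < wd
                        and mask[yy, xx] > 0
                        and mask[yy, xx] % 10 == 0):   # e(l,n) = 1
                    k = mask[yy, xx] // 10             # region index, k in [1, 25]
                    d1[k] += w[n + 2, l + 2]
                    d2[k] += w[n + 2, l + 2] * effect[yy, xx]
        best_k, best = 0, 100.0                    # Dxy[k][0] initialised to 100
        for k in range(1, 26):
            if d1[k] > 0:
                d0 = abs(d2[k] - effect[y, x]) / d1[k]   # formula (3)
                if d0 < best:
                    best, best_k = d0, k
        if best_k:
            out[y, x] = best_k * 10                # adsorb: m(x,y) = k_min * 10
    return out
```

A scatter point is thus adsorbed into whichever surrounding region scores lowest under formula (3); points with no region pixel in their window are left unchanged.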
Step three: when a commodity is replaced, first extract the region selection information using step one to obtain image M; apply mean filtering to image M, then use image M as a transparency mask and synthesize the scene with the replacement commodity material according to formula (4):
C'(x,y) = C(x,y) × (1 - α(x,y)) + S(x,y) × α(x,y)  (4)
wherein C(x,y) and S(x,y) are the gray values of the scene and of the commodity material at point (x, y); α(x,y) is the transparency at point (x, y), and M(x,y) is the gray value of image M at point (x, y).
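Step three can be illustrated with the sketch below. Two details are assumptions, since the patent leaves them unspecified: the mean filter uses a 3×3 box kernel, and the transparency is taken as α(x,y) = M(x,y)/255:

```python
import numpy as np

def mean_filter3(img):
    # 3x3 box (mean) filter with edge replication -- the feathering step
    h, w = img.shape
    p = np.pad(img.astype(np.float64), 1, mode='edge')
    acc = np.zeros((h, w))
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            acc += p[dy:dy + h, dx:dx + w]
    return acc / 9.0

def composite(scene, material, M):
    # formula (4): C' = C * (1 - alpha) + S * alpha, alpha from the feathered mask
    alpha = mean_filter3(M) / 255.0
    return scene * (1.0 - alpha) + material * alpha
```

Because the mask is blurred before compositing, the material-to-scene transition is feathered across the seam instead of switching abruptly, which is what suppresses the residual black/white fringes.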
A further improvement lies in that: g(x,y) in formula (3) of step two is the gray value of point (x, y) read from the original effect map.
A further improvement lies in that: after Dxy[k][1] and Dxy[k][2] in formula (2) of step two are calculated, the values of k are taken in sequence over the range k ∈ [1, 25], and formula (3) is then solved.
A further improvement lies in that: when the mask.png picture is processed point by point in step two, a point is determined to be a scatter point when m(x,y) % 10 ≠ 0.
The invention has the beneficial effects that: by applying different algorithmic processing to scatter points of different forms, scatter points can be eliminated to the maximum extent, with a scatter elimination rate of 98% and a scatter misjudgment rate of 10%. Stray points are reduced intelligently by computer, narrowing the gaps at the edge seams; once the scatter points are reduced, the edge seams become less obvious. Step three is invoked each time a commodity replacement operation is performed; its feathering fusion technique weakens the visual influence of stray points and lets the rendered material blend better into the scene.
Drawings
FIG. 1 is a diagram illustrating a mask picture of a selected area of a scene according to the background art of the present invention;
FIG. 2 is a schematic diagram of a selected area mask picture of a scene according to the background art after edge clutter and scatter are highlighted;
FIG. 3 is a schematic diagram illustrating the effect of black/white edges caused by non-eliminated scatter dots in the background art of the present invention;
FIG. 4 is a schematic diagram illustrating the reloading effect after the scatter is eliminated in the embodiment of the invention;
FIG. 5 is a schematic diagram of edge detection according to an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating a substantial effect of eliminating scatter in an embodiment of the present invention;
FIG. 7 is a diagram illustrating the original effect of selected pictures according to an embodiment of the present invention;
FIG. 8 is an effect diagram rendered after scatter elimination on the selected picture.
Detailed Description
In order to further the understanding of the present invention, a detailed description is given below with reference to specific embodiments, which are intended only to explain the invention and are not to be construed as limiting its scope.
As shown in FIGS. 4 to 8, this embodiment provides a scatter elimination algorithm suitable for AR virtual soft furnishing synthesis, which comprises the following steps:
Step one: preprocess the scattered stray points in the selected regions. When a scene is loaded for the first time, let i ∈ {10, 20, …, 250} represent the value of each selected region, and process i cyclically in sequence starting from 10;
Take out selected region i and sequentially read the pixel value m(x,y) of each point in the mask to obtain M(x,y), expressed as shown in formula (1):
M(x,y) = 255, when m(x,y) = i; M(x,y) = 0, otherwise  (1)
The M(x,y) values are then stored in a new picture, denoted image M. Image M is subjected to an image processing operation of erosion followed by dilation (a morphological opening), and the result is finally written back to the mask: where the value of M(x,y) is 255, set m(x,y) = i;
Step two: first locate the scatter points, then compute the Gaussian weighted sum of the surrounding pixels one by one along the periphery of each scatter point, and adsorb the scatter point into a region according to the distance in the normal direction;
Specifically: the mask.png picture is processed point by point, and when m(x,y) % 10 ≠ 0 the point is determined to be a scatter point. Each scatter point is then processed further: let l, n ∈ [-2, 2] with l, n integers; a 25×3 two-dimensional array Dxy is established and initialized by setting Dxy[k][0] = 100, Dxy[k][1] = 0, Dxy[k][2] = 0,
(the accompanying formula, given as an image in the original and presumed to define the Gaussian weights for the 5×5 window, is not reproduced)
If the coordinates of a scatter point are (x, y), the data inside the 5×5 rectangular window centered on the scatter point are operated on: when the point (x+l, y+n) in the window lies in a selected region, i.e. m(x+l,y+n) % 10 = 0, set e(l,n) = 1, otherwise e(l,n) = 0; Dxy is then calculated from formula (2):
(formula (2) is given as an image in the original; it accumulates the Gaussian weighted sums Dxy[k][1] and Dxy[k][2] over the window points with e(l,n) = 1)
After Dxy[k][1] and Dxy[k][2] are calculated, a value of k is selected from the range k ∈ [1, 25] and formula (3) is solved:
Dxy[k][0] = |Dxy[k][2] - g(x,y)| / Dxy[k][1]  (3)
wherein g(x,y) is the gray value of point (x, y) read from the original effect map;
Formula (3) is used to find the minimum value Dxy[k_min][0]; the mark value of the scatter point is modified to m(x,y) = k_min × 10, and the mask is then saved.
Step three: when a commodity is replaced, first extract the region selection information using step one to obtain image M; apply mean filtering to image M, then use image M as a transparency mask and synthesize the scene with the replacement commodity material according to formula (4):
C'(x,y) = C(x,y) × (1 - α(x,y)) + S(x,y) × α(x,y)  (4)
wherein C(x,y) and S(x,y) are the gray values of the scene and of the commodity material at point (x, y); α(x,y) is the transparency at point (x, y), and M(x,y) is the gray value of image M at point (x, y).
FIG. 7 is the picture selected in this embodiment. After processing by the scatter elimination algorithm above, the rendered effect picture of FIG. 8 is obtained. FIG. 8 shows 7 replaced commodities, namely wallpaper, curtain, wall decoration, wall painting, sofa cloth, floor and ceiling lamp, across 10 design selection regions;
After light and shadow rendering, the effect picture of FIG. 8 is obtained. Careful observation shows that no stray-point seam flaws are visible at the seams between the commodities and the scatter points are essentially completely eliminated, so the algorithm has a good scatter elimination effect.
By applying different algorithmic processing to scatter points of different forms, scatter points can be eliminated to the maximum extent, with a scatter elimination rate of 98% and a scatter misjudgment rate of 10%. Stray points are reduced intelligently by computer, narrowing the gaps at the edge seams; once the scatter points are reduced, the edge seams become less obvious. Step three is invoked each time a commodity replacement operation is performed; its feathering fusion technique weakens the visual influence of stray points and lets the rendered material blend better into the scene.
The foregoing illustrates and describes the principles, general features, and advantages of the present invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which are presented in the specification only to illustrate its principle; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.

Claims (4)

1. A scatter elimination algorithm suitable for AR virtual soft furnishing synthesis, characterized by comprising the following steps:
Step one: preprocess the scattered stray points in the selected regions. When a scene is loaded for the first time, let i ∈ {10, 20, …, 250} represent the value of each selected region, and process i cyclically in sequence starting from 10;
Take out selected region i and sequentially read the pixel value m(x,y) of each point in the mask to obtain M(x,y), expressed as shown in formula (1):
M(x,y) = 255, when m(x,y) = i; M(x,y) = 0, otherwise  (1)
The M(x,y) values are then stored in a new picture, denoted image M. Image M is subjected to an image processing operation of erosion followed by dilation, and the result is finally written back to the mask: where the value of M(x,y) is 255, set m(x,y) = i;
Step two: first locate the scatter points, then compute the Gaussian weighted sum of the surrounding pixels one by one along the periphery of each scatter point, and adsorb the scatter point into a region according to the distance in the normal direction;
Specifically: the mask.png picture is processed point by point, and each scatter point found is then processed further. A two-dimensional array Dxy is first established and initialized by setting Dxy[k][0] = 100, Dxy[k][1] = 0, Dxy[k][2] = 0. If the coordinates of a scatter point are (x, y), the data inside the 5×5 rectangular window centered on the scatter point are operated on: let l, n ∈ [-2, 2] with l, n integers; when the point (x+l, y+n) in the window lies in a selected region, i.e. m(x+l,y+n) % 10 = 0, set e(l,n) = 1, otherwise e(l,n) = 0; Dxy is then calculated from formula (2):
(formula (2) is given as an image in the original; it accumulates the Gaussian weighted sums Dxy[k][1] and Dxy[k][2] over the window points with e(l,n) = 1)
After Dxy[k][1] and Dxy[k][2] are calculated, a value of k is selected and formula (3) is solved:
Dxy[k][0] = |Dxy[k][2] - g(x,y)| / Dxy[k][1]  (3)
Formula (3) is used to find the minimum value Dxy[k_min][0]; the mark value of the scatter point is modified to m(x,y) = k_min × 10, and the mask is then saved.
Step three: when a commodity is replaced, first extract the region selection information using step one to obtain image M; apply mean filtering to image M, then use image M as a transparency mask and synthesize the scene with the replacement commodity material according to formula (4):
C'(x,y) = C(x,y) × (1 - α(x,y)) + S(x,y) × α(x,y)  (4)
wherein C(x,y) and S(x,y) are the gray values of the scene and of the commodity material at point (x, y); α(x,y) is the transparency at point (x, y), and M(x,y) is the gray value of image M at point (x, y).
2. The scatter elimination algorithm for AR virtual soft furnishing synthesis according to claim 1, characterized in that: g(x,y) in formula (3) of step two is the gray value of point (x, y) read from the original effect map.
3. The scatter elimination algorithm for AR virtual soft furnishing synthesis according to claim 1, characterized in that: after Dxy[k][1] and Dxy[k][2] in formula (2) of step two are calculated, the values of k are taken in sequence over the range k ∈ [1, 25], and formula (3) is then solved.
4. The scatter elimination algorithm for AR virtual soft furnishing synthesis according to claim 1, characterized in that: when the mask.png picture is processed point by point in step two, a point is determined to be a scatter point when m(x,y) % 10 ≠ 0.
CN201910946759.1A 2019-10-07 2019-10-07 Scatter elimination algorithm suitable for AR virtual soft package synthesis Pending CN110689601A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910946759.1A CN110689601A (en) 2019-10-07 2019-10-07 Scatter elimination algorithm suitable for AR virtual soft package synthesis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910946759.1A CN110689601A (en) 2019-10-07 2019-10-07 Scatter elimination algorithm suitable for AR virtual soft package synthesis

Publications (1)

Publication Number Publication Date
CN110689601A true CN110689601A (en) 2020-01-14

Family

ID=69111368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910946759.1A Pending CN110689601A (en) 2019-10-07 2019-10-07 Scatter elimination algorithm suitable for AR virtual soft package synthesis

Country Status (1)

Country Link
CN (1) CN110689601A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2157544A1 (en) * 2008-08-01 2010-02-24 Julius-Maximilians-Universität Würzburg System for adaptive removal of speckle noise in digital images and generation of a colour composite product based on automated analysis of speckle characteristics
CN105139394A (en) * 2015-08-19 2015-12-09 杭州电子科技大学 Noise image quality evaluation method combining reconstruction with noise scatter histograms
CN107330980A (en) * 2017-07-06 2017-11-07 重庆邮电大学 A kind of virtual furnishings arrangement system based on no marks thing
CN108416700A (en) * 2018-02-05 2018-08-17 湖南城市学院 A kind of interior decoration design system based on AR virtual reality technologies
CN109949400A (en) * 2019-03-22 2019-06-28 南京可居网络科技有限公司 Shadow estimation and reconstructing method suitable for the virtual soft dress synthesis of AR
CN109960872A (en) * 2019-03-22 2019-07-02 南京可居网络科技有限公司 The virtual soft dress collocation management system of AR and its working method
CN110120030A (en) * 2019-03-28 2019-08-13 河南农业大学 Processing method, application, computer-readable medium and the disease occurring area measuring method of wheat diseases generation image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Yunfeng et al., "Research on the Decoration Industry Based on AR and VR Technology", Technology Wind (《科技风》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111986113A (en) * 2020-08-20 2020-11-24 浙江理工大学 Optical image shadow eliminating method and system
CN111986113B (en) * 2020-08-20 2024-03-22 浙江理工大学 Optical image shadow elimination method and system

Similar Documents

Publication Publication Date Title
CN112348815B (en) Image processing method, image processing apparatus, and non-transitory storage medium
US11386528B2 (en) Denoising filter
US9153068B2 (en) Clipless time and lens bounds for improved sample test efficiency in image rendering
Kovesi MATLAB and Octave functions for computer vision and image processing
US7889913B2 (en) Automatic compositing of 3D objects in a still frame or series of frames
Rematas et al. Image-based synthesis and re-synthesis of viewpoints guided by 3d models
US6791540B1 (en) Image processing apparatus
US9142043B1 (en) System and method for improved sample test efficiency in image rendering
CN112132164B (en) Target detection method, system, computer device and storage medium
US8275170B2 (en) Apparatus and method for detecting horizon in sea image
WO2007145654A1 (en) Automatic compositing of 3d objects in a still frame or series of frames and detection and manipulation of shadows in an image or series of images
CN116342519A (en) Image processing method based on machine learning
CN110689601A (en) Scatter elimination algorithm suitable for AR virtual soft package synthesis
JP2014106713A (en) Program, method, and information processor
CN115601616A (en) Sample data generation method and device, electronic equipment and storage medium
Trapp et al. Occlusion management techniques for the visualization of transportation networks in virtual 3D city models
CN107231551B (en) A kind of image detecting method and device
US6377279B1 (en) Image generation apparatus and image generation method
Čejka et al. Tackling problems of marker-based augmented reality under water
Subhasri et al. SUPER PIXEL BASED VIRTUAL TEXTURE MAPPING OF IMAGE SYNTHESIS.
WO2011058626A1 (en) Image processing device and slide show display
Zheng et al. Efficient screen space anisotropic blurred soft shadows
Zhang The application of directional derivative in the design of animation characters and background elements
JP2004054635A (en) Picture processor and its method
KR101470497B1 (en) Apparatus for Generating Non-Photorealistic Image using Direction Sensor and Method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200114