CN110276774A - Object drawing method, apparatus, terminal, and computer-readable storage medium - Google Patents

Object drawing method, apparatus, terminal, and computer-readable storage medium

Info

Publication number
CN110276774A
Authority
CN
China
Prior art keywords
dimensional
coordinate
image
terminal
dimensional image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910565646.7A
Other languages
Chinese (zh)
Other versions
CN110276774B (en)
Inventor
邓健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910565646.7A
Publication of CN110276774A
Application granted
Publication of CN110276774B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T7/50 Depth or shape recovery
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application belongs to the technical field of image processing, and in particular relates to an object drawing method, apparatus, terminal, and computer-readable storage medium. The object drawing method includes: obtaining a first depth image of the real environment captured by a camera assembly, and establishing a three-dimensional coordinate system based on the real environment according to the first depth image; obtaining a two-dimensional image captured by the camera assembly of the entire surface of the object to be drawn and a second depth image corresponding to the two-dimensional image, while simultaneously obtaining real-time pose data of the terminal; performing edge extraction on the two-dimensional image to obtain the edge feature points corresponding to the two-dimensional image; and calculating the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generating a figure of the object to be drawn from the coordinates, where the figure includes a three-dimensional perspective view and/or a plan view of the object to be drawn. This improves the efficiency of drawing three-dimensional perspective views and plan views of objects.

Description

Object drawing method, apparatus, terminal, and computer-readable storage medium
Technical field
The application belongs to the technical field of image processing, and in particular relates to an object drawing method, apparatus, terminal, and computer-readable storage medium.
Background
When expressing a three-dimensional object, the object can be expressed by drawing its three-dimensional perspective view or its plan view.
At present, when a user wants to draw a three-dimensional perspective view or a plan view of an object, the object must first be measured with tools such as rulers and then drawn by hand or with computer graphics software, which makes drawing inefficient.
Summary of the invention
Embodiments of the present application provide an object drawing method, apparatus, terminal, and computer-readable storage medium, which can solve the technical problem of low efficiency in drawing three-dimensional perspective views or plan views of objects.
A first aspect of the embodiments of the present application provides an object drawing method applied to a terminal, comprising:
obtaining a first depth image of the real environment captured by a camera assembly, and establishing a three-dimensional coordinate system based on the real environment according to the first depth image;
obtaining a two-dimensional image captured by the camera assembly of the entire surface of an object to be drawn and a second depth image corresponding to the two-dimensional image, while simultaneously obtaining real-time pose data of the terminal;
performing edge extraction on the two-dimensional image to obtain edge feature points corresponding to the two-dimensional image;
calculating the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generating a figure of the object to be drawn from the coordinates; the figure comprises a three-dimensional perspective view of the object to be drawn and/or a plan view of the object to be drawn.
A second aspect of the embodiments of the present application provides an object drawing apparatus configured in a terminal, comprising:
an establishing unit, configured to obtain a first depth image of the real environment captured by a camera assembly and to establish a three-dimensional coordinate system based on the real environment according to the first depth image;
an obtaining unit, configured to obtain a two-dimensional image captured by the camera assembly of the entire surface of an object to be drawn and a second depth image corresponding to the two-dimensional image, while simultaneously obtaining real-time pose data of the terminal;
an extraction unit, configured to perform edge extraction on the two-dimensional image to obtain edge feature points corresponding to the two-dimensional image;
a drawing unit, configured to calculate the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and to generate a figure of the object to be drawn from the coordinates; the figure comprises a three-dimensional perspective view of the object to be drawn and/or a plan view of the object to be drawn.
A third aspect of the embodiments of the present application provides a terminal comprising a camera assembly, a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above method when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program, wherein the steps of the above method are implemented when the computer program is executed by a processor.
In the embodiments of the present application, after the three-dimensional coordinate system based on the real environment has been established, the terminal obtains the two-dimensional image captured by its camera assembly of the entire surface of the object to be drawn and the second depth image corresponding to the two-dimensional image, while simultaneously obtaining its own real-time pose data. Edge extraction is then performed on the two-dimensional image to obtain its edge feature points, so that the terminal can calculate the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and can generate the three-dimensional perspective view and/or plan view of the object to be drawn from those coordinates. In other words, to draw a three-dimensional perspective view or a plan view of an object, the user only needs to photograph the entire surface of the object with the terminal's camera assembly, and the terminal then generates the view automatically; there is no need to first measure the object in the field with rulers and then draw it by hand or with computer graphics software. This improves the efficiency of drawing three-dimensional perspective views and plan views of objects.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings needed for the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the application and are therefore not to be regarded as limiting its scope; those of ordinary skill in the art can obtain other relevant drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a first implementation of the object drawing method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of the establishment of the three-dimensional coordinate system provided by an embodiment of the present application;
Fig. 3 is a schematic flowchart of a specific implementation of step 104 of the object drawing method provided by an embodiment of the present application;
Fig. 4 is a first schematic diagram of the display interface of the terminal provided by an embodiment of the present application;
Fig. 5 is a schematic flowchart of a second implementation of the object drawing method provided by an embodiment of the present application;
Fig. 6 is a second schematic diagram of the display interface of the terminal provided by an embodiment of the present application;
Fig. 7 is a schematic structural diagram of the object drawing apparatus provided by an embodiment of the present application;
Fig. 8 is a schematic structural diagram of the terminal provided by an embodiment of the present application.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present application clearer, the application is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the application, not to limit it. In the description of the present application, the terms "first", "second", and so on are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
At present, to draw a three-dimensional perspective view or a plan view of an object, the object generally must first be measured with tools such as rulers and then drawn by hand or with computer graphics software. This way of drawing not only involves a large workload but is also susceptible to measurement errors, so the drawn three-dimensional perspective view or plan view may not correspond exactly to the actual object; drawing is therefore both inefficient and imprecise.
In the embodiments of the present application, after the three-dimensional coordinate system based on the real environment has been established, the terminal obtains the two-dimensional image captured by its camera assembly of the entire surface of the object to be drawn and the second depth image corresponding to the two-dimensional image, while simultaneously obtaining its own real-time pose data. Edge extraction is then performed on the two-dimensional image to obtain its edge feature points, so that the terminal can calculate the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and can generate the three-dimensional perspective view and/or plan view of the object to be drawn from those coordinates. In other words, to draw a three-dimensional perspective view or a plan view of an object, the user only needs to photograph the entire surface of the object with the terminal's camera assembly, and the terminal then generates the view automatically, with no need to first measure the object in the field with rulers and then draw it by hand or with computer graphics software. This improves both the efficiency and the precision of drawing three-dimensional perspective views and plan views of objects.
Fig. 1 is a schematic flowchart of an implementation of the object drawing method provided by an embodiment of the present application. The method is applied to a terminal, can be executed by the object drawing apparatus configured in the terminal, and is suited to situations in which the efficiency and precision of drawing three-dimensional perspective views and/or plan views of objects need to be improved. The terminal may be a mobile terminal such as a smartphone, tablet computer, or learning machine, and can be equipped with a camera assembly. The object drawing method may include steps 101 to 104.
Step 101: obtain a first depth image of the real environment captured by the camera assembly, and establish a three-dimensional coordinate system based on the real environment according to the first depth image.
In the embodiments of the present application, when an object is to be drawn, the first depth image of the real environment captured by the camera assembly is obtained first in order to establish the three-dimensional coordinate system based on the real environment and so complete the initialization of measurement.
The camera assembly may include a depth camera and an RGB camera. The depth camera collects depth images, and the RGB camera collects two-dimensional plane images (two-dimensional images).
The gray value of each pixel of a depth image can be used to characterize the distance from the camera of a point in the scene.
In some embodiments of the application, the resolution of the depth camera may equal the resolution of the RGB camera, so that each pixel of the two-dimensional image can be given accurate depth information.
In some embodiments of the application, the depth camera may be a time-of-flight (TOF) camera.
In some embodiments of the application, the camera assembly may instead be a 3D camera that outputs a depth image and a two-dimensional plane image simultaneously.
When drawing an object with the terminal, a camera application may first be opened to start the camera assembly and collect the first depth image of the real environment. When the first depth image returned by the depth camera is received, the point in the real environment corresponding to any one piece of valid depth data in the first depth image is taken as the coordinate origin, and a three-dimensional coordinate system based on the real environment is established; this serves as the reference coordinate system for the coordinate calculations performed while drawing the object.
For example, as shown in Fig. 2, any point on the sofa may be taken as the coordinate origin of the three-dimensional coordinate system based on the real environment.
It should be noted that, in the embodiments of the present application, once the three-dimensional coordinate system has been established, the camera assembly has completed its pre-drawing initialization and the drawing of the object can begin. Since this initialization requires neither plane recognition nor moving the terminal to capture multiple images, the application initializes quickly when drawing objects and can achieve the effect of "opening in a second".
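The initialization described above (take the point behind any valid depth measurement in the first depth image as the world origin) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the convention that a zero value marks an invalid depth pixel is an assumption.

```python
def init_world_origin(depth_image):
    """Pick the first pixel with valid (non-zero) depth as the origin
    of the environment coordinate system. Returns its (row, col, depth)
    so later measurements can be expressed relative to this point.
    Hypothetical helper for illustration only."""
    for r, row in enumerate(depth_image):
        for c, d in enumerate(row):
            if d > 0:           # zero depth = no valid measurement
                return (r, c, d)
    return None                 # no valid depth in this frame

# A tiny 3x3 depth map; 0 marks invalid pixels.
depth = [[0, 0, 0],
         [0, 1.5, 2.0],
         [0, 0, 0]]
print(init_world_origin(depth))  # -> (1, 1, 1.5)
```

Because any valid depth point may serve as the origin, the first one found is as good as any other; all later coordinates are relative to it.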
Step 102: obtain the two-dimensional image captured by the camera assembly of the entire surface of the object to be drawn and the second depth image corresponding to the two-dimensional image, while simultaneously obtaining the real-time pose data of the terminal.
In the embodiments of the present application, while the camera assembly captures the two-dimensional image of the entire surface of the object to be drawn and the corresponding second depth image, the terminal may be moved around the object so that its entire surface is photographed, or the object may be moved around the terminal for the same purpose. In some embodiments, the terminal and the object to be drawn may even move at the same time; it is only necessary that the terminal can photograph the entire surface of the object, and the application places no restriction on the motion states of the terminal and the object to be drawn.
Correspondingly, in the embodiments of the present application, during the acquisition of the real-time pose data of the terminal: if the terminal changes position relative to the origin of the three-dimensional coordinate system while the object to be drawn does not, the real-time pose data refer to the pose of the terminal relative to the origin of the three-dimensional coordinate system; if the terminal does not change position relative to the origin while the object does, the real-time pose data refer to the pose of the terminal relative to the object to be drawn; and if both the terminal and the object change position relative to the origin, the real-time pose data refer to both the pose of the terminal relative to the origin of the three-dimensional coordinate system and the pose of the terminal relative to the object to be drawn.
Optionally, obtaining the pose data of the terminal may include: once the three-dimensional coordinate system has been established, obtaining the six-degree-of-freedom pose data of the terminal in real time with an inertial measurement unit (IMU).
An object in space has six degrees of freedom: freedom of translation along the three rectangular coordinate axes x, y, and z, and freedom of rotation about these three axes. To determine the position of the object completely, all six degrees of freedom must be known.
An inertial measurement unit (IMU) is a device that measures the three-axis angular rate and acceleration of an object. In general, an IMU contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration of the object along the three independent axes of the carrier coordinate system, and the gyroscopes detect the angular rate of the carrier relative to the navigation coordinate system. By measuring the angular rate and acceleration of the object in three-dimensional space, the attitude of the object can be calculated. Therefore, in this application, an IMU may be used when obtaining the pose data of the terminal relative to the origin of the three-dimensional coordinate system.
In some embodiments of the application, the pose of the terminal relative to the object to be drawn may refer to the pose of the terminal relative to a specified point on the surface of the object to be drawn.
In some embodiments of the application, when obtaining the pose of the terminal relative to a specified point on the surface of the object to be drawn, an IMU may also be placed at the specified point, so that the six-degree-of-freedom pose of the terminal relative to that point is obtained in real time.
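As a rough illustration of how pose data are derived from an IMU's gyroscope and accelerometer readings, the sketch below integrates angular rate and forward acceleration in a planar (2-DoF plus heading) simplification. Real six-degree-of-freedom fusion also handles gravity compensation, bias, and drift correction; the function name and the two-component sample format are hypothetical, not from the patent.

```python
import math

def integrate_imu(samples, dt):
    """Very simplified planar dead reckoning from IMU samples.
    Each sample is (yaw_rate, ax_body): angular rate about the
    vertical axis and forward acceleration in the body frame.
    Returns (x, y, yaw). Illustrative only."""
    x = y = yaw = v = 0.0
    for yaw_rate, ax in samples:
        yaw += yaw_rate * dt          # orientation from the gyroscope
        v += ax * dt                  # speed from the accelerometer
        x += v * math.cos(yaw) * dt   # position from speed + heading
        y += v * math.sin(yaw) * dt
    return x, y, yaw

# Two seconds of constant 1 m/s^2 forward acceleration, no rotation:
print(integrate_imu([(0.0, 1.0)] * 2, 1.0))  # -> (3.0, 0.0, 0.0)
```

The same integration idea, extended to three accelerometer and three gyroscope axes, yields the six-degree-of-freedom pose the text refers to.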
Step 103: perform edge extraction on the two-dimensional image to obtain the edge feature points corresponding to the two-dimensional image.
Since only the edges of the object to be drawn need to be considered when drawing its three-dimensional perspective view or its plan view, the edge feature points needed for drawing can be obtained by performing edge extraction on the two-dimensional image.
For example, if the object to be drawn is a cuboid, the edge feature points are the points on each edge of the cuboid.
Edge extraction may be performed with, among others, a differential detection algorithm, the Roberts gradient algorithm, or the Sobel edge detection algorithm.
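Of the algorithms just listed, Sobel edge detection is easy to sketch: convolve the image with two 3x3 gradient kernels and keep the pixels whose gradient magnitude exceeds a threshold. A minimal pure-Python sketch follows; the threshold value and the list-of-lists image format are illustrative choices, not the patent's.

```python
def sobel_edges(img, thresh):
    """Sobel edge extraction on a grayscale image (list of lists).
    Returns the (row, col) positions whose gradient magnitude exceeds
    `thresh`; these play the role of the 'edge feature points'."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient
    h, w = len(img), len(img[0])
    points = []
    for r in range(1, h - 1):          # skip the 1-pixel border
        for c in range(1, w - 1):
            gx = sum(gx_k[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(gy_k[i][j] * img[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            if (gx * gx + gy * gy) ** 0.5 > thresh:
                points.append((r, c))
    return points

# Vertical step edge: left half dark, right half bright.
img = [[0, 0, 255, 255]] * 4
print(sobel_edges(img, 100))  # -> [(1, 1), (1, 2), (2, 1), (2, 2)]
```

In practice a library routine (for example OpenCV's Sobel filter) would be used instead of this loop, but the computation is the same.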
Step 104: calculate the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generate the figure of the object to be drawn from the coordinates; the figure includes a three-dimensional perspective view of the object to be drawn and/or a plan view of the object to be drawn.
The plan views may include the six views and the three views.
In the embodiments of the present application, after the three-dimensional coordinate system based on the real environment has been established, the terminal obtains the two-dimensional image captured by its camera assembly of the entire surface of the object to be drawn and the second depth image corresponding to the two-dimensional image, while simultaneously obtaining its own real-time pose data. Edge extraction is then performed on the two-dimensional image to obtain its edge feature points, so that the terminal can calculate the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and can generate the three-dimensional perspective view and/or plan view of the object to be drawn from those coordinates. In other words, to draw a three-dimensional perspective view or a plan view of an object, the user only needs to photograph the entire surface of the object with the terminal's camera assembly, and the terminal then generates the view automatically, with no need to first measure the object in the field with rulers and then draw it by hand or with computer graphics software. This improves both the efficiency and the precision of drawing three-dimensional perspective views and plan views of objects.
In some embodiments of the application, as shown in Fig. 3, step 104, calculating the coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, may include steps 301 and 302.
Step 301: determine the pixel coordinates of the edge feature points on the two-dimensional image according to their positions on the two-dimensional image and the corresponding depth values.
In the embodiments of the present application, pixel coordinates are the coordinates formed by combining the relative positional relationship between the pixels of the two-dimensional image with the depth value of each pixel.
For example, a pixel coordinate system may be established with the pixel in the lower-left corner of the two-dimensional image as the two-dimensional coordinate origin, and the two-dimensional coordinates of each pixel of the image determined; the two-dimensional coordinates are then combined with the depth value of each pixel to obtain the pixel coordinates of every pixel in the two-dimensional image.
Step 302: map the pixel coordinates to coordinates in the three-dimensional coordinate system according to the parameter information of the camera assembly and the real-time pose data.
Mapping the pixel coordinates to coordinates in the three-dimensional coordinate system according to the parameter information of the camera assembly and the pose data includes: determining, from the parameter information of the camera assembly and the pose data of the terminal, the mapping matrix between the pixel coordinates of the two-dimensional image on the terminal display interface and the coordinates in the three-dimensional coordinate system, and mapping the pixel coordinates to coordinates in the three-dimensional coordinate system with that mapping matrix.
The parameter information of the camera assembly includes the intrinsic and extrinsic parameters of the camera assembly. The intrinsic parameters include the equivalent focal lengths fx and fy along the u axis and the v axis, and the coordinates u0 and v0 of the principal point of the image plane.
It should be noted that the mapping of pixel coordinates to coordinates in the three-dimensional coordinate system according to the parameter information and pose data of the camera assembly may use a mapping matrix that is common in the related art.
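Under the standard pinhole model, the mapping the text describes (pixel position plus depth, the intrinsics fx, fy, u0, v0, and the terminal pose, to coordinates in the environment's three-dimensional coordinate system) can be sketched as below. The patent does not give its mapping matrix, so this uses the usual back-projection followed by a rigid transform; the rotation R and translation t standing in for the pose are an assumption.

```python
def pixel_to_world(u, v, depth, fx, fy, u0, v0, R, t):
    """Back-project a pixel (u, v) with measured depth into the
    environment coordinate system. fx, fy, u0, v0 are the camera
    intrinsics named in the text; R (3x3 nested lists) and t
    (3-tuple) represent the camera pose. A sketch under the usual
    pinhole model, not the patent's actual mapping matrix."""
    # Pinhole model: camera-frame coordinates from pixel + depth.
    xc = (u - u0) * depth / fx
    yc = (v - v0) * depth / fy
    cam = (xc, yc, depth)
    # World point = R * cam + t.
    return tuple(sum(R[i][k] * cam[k] for k in range(3)) + t[i]
                 for i in range(3))

# Identity pose: the principal-point pixel maps straight ahead.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(pixel_to_world(320, 240, 2.0, 500, 500, 320, 240, I, (0, 0, 0)))
# -> (0.0, 0.0, 2.0)
```

Applying this to every edge feature point of a frame yields the coordinates in the three-dimensional coordinate system that step 302 requires.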
In the embodiments of the present application, once the coordinates in the three-dimensional coordinate system of every edge feature point of the two-dimensional image have been obtained, the coordinates of every point of the three-dimensional perspective view of the object to be drawn have in effect been established. In other words, as soon as those coordinates are available, the three-dimensional perspective view of the object can be generated from them; and once the three-dimensional perspective view has been obtained, it can be projected along any projection direction to produce a plan view of the object to be drawn.
To make the generated plan views better match the user's habits, in some embodiments of the application, generating the plan view of the object to be drawn from the coordinates may include: receiving a projection direction selection instruction for the three-dimensional perspective view, and generating the six views or the three views of the object to be drawn according to that instruction.
For example, as shown in Fig. 4, after the three-dimensional perspective view of the object to be drawn has been obtained, the three-dimensional perspective view 42 of the object may be shown on the display interface 41 of the terminal. A projection direction selection instruction triggered by the user on the display interface 41 is received, and the six views or three views of the object are generated according to the projection direction (the arrow direction shown in Fig. 4) indicated by the instruction.
For example, the projection direction selection instruction may be triggered by a touch gesture to determine the projection direction for the plan view of the object to be drawn.
The six views may include the front, rear, left, right, top, and bottom views; the three views may include the front, left, and top views.
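Once the edge feature points have world coordinates, a plan view reduces to an orthographic projection: dropping one coordinate of each 3D point gives a front, side, or top view, and the six views are the same projections taken from both directions along each axis. The view-to-axis mapping below is an illustrative convention, not the patent's.

```python
def plan_view(points_3d, direction):
    """Project 3D edge points orthographically to one plan view.
    Dropping the y axis gives a front view, x a side view, and z a
    top view, under an illustrative axis convention. Returns sorted
    2D points with duplicates removed (many 3D points collapse onto
    one 2D point)."""
    drop = {"front": 1, "side": 0, "top": 2}[direction]
    keep = [i for i in range(3) if i != drop]
    return sorted({(p[keep[0]], p[keep[1]]) for p in points_3d})

# The eight corners of a unit cube project to a unit square.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
print(plan_view(cube, "top"))  # -> [(0, 0), (0, 1), (1, 0), (1, 1)]
```

Selecting the projection direction by touch gesture, as in Fig. 4, would simply choose which axis is dropped (and, for the six views, the sign of the viewing direction).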
Building on the above embodiments, as shown in Fig. 5, the object drawing method may further include steps 501 to 505.
Step 501: obtain the first depth image of the real environment captured by the camera assembly, and establish a three-dimensional coordinate system based on the real environment according to the first depth image.
Step 502: obtain the multiple frames of sub two-dimensional images captured by the camera assembly of the entire surface of the object to be drawn, together with the sub depth images corresponding to the sub two-dimensional images, while simultaneously obtaining the real-time pose data of the terminal.
Step 503: splice the sub two-dimensional images and the sub depth images according to the real-time pose data of the terminal to obtain the two-dimensional image and the second depth image corresponding to the two-dimensional image.
The camera assembly captures images by generating one frame for each acquisition of external optical signals; in general, the acquisition frequency is 30 frames per second. When the entire surface of the object to be drawn is photographed, the captured material therefore usually comprises multiple frames rather than a single frame, so the acquisition in step 102 above may consist of obtaining the multiple frames of sub two-dimensional images of the entire surface of the object together with their corresponding sub depth images.
In addition, while the camera assembly captures these frames, the terminal moves relative to the object to be drawn, and its motion trajectory is not a regular circle or straight line. The frames therefore need to be spliced according to the real-time pose data, with the redundant portions removed, to obtain the two-dimensional image described above; correspondingly, the sub depth images must be spliced in the same way to obtain the second depth image.
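The splicing-with-redundancy-removal step can be sketched as merging the per-frame edge points (already expressed in the shared environment frame via each frame's pose) and discarding near-duplicate points. The tolerance-based rounding below is a crude illustrative stand-in for the patent's unspecified redundancy removal.

```python
def stitch_frames(frames, tol=0.01):
    """Merge edge points from several frames into one set, removing
    the redundant overlap between consecutive shots. Points are
    assumed to already be in the shared environment frame (each
    frame transformed by its own pose, as described in the text).
    Rounding to `tol` treats near-identical points as duplicates."""
    merged = {}
    for frame in frames:
        for p in frame:
            key = tuple(round(c / tol) for c in p)
            merged.setdefault(key, p)   # keep the first observation
    return list(merged.values())

f1 = [(0.0, 0.0, 1.0), (0.5, 0.0, 1.0)]
f2 = [(0.5, 0.001, 1.0), (1.0, 0.0, 1.0)]   # first point overlaps f1
print(len(stitch_frames([f1, f2])))  # -> 3
```

A real system would splice full images and depth maps, not just edge points, but the pose-aligned merge-and-deduplicate structure is the same.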
Step 504: identify the object to be drawn in the two-dimensional image, perform edge extraction on the object to be drawn in the two-dimensional image, and obtain edge feature points corresponding to the two-dimensional image.
For example, a target detection algorithm may be used to identify the object to be drawn in the two-dimensional image. Common target detection algorithms include the local binary pattern (LBP) algorithm, histogram-of-oriented-gradients features combined with a support vector machine model, and convolutional neural network models.
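A minimal sketch of the edge extraction of step 504, using a central-difference gradient magnitude in place of a production detector such as Canny; the function name and threshold below are assumptions for illustration only:

```python
import numpy as np

def edge_feature_points(gray, threshold=0.25):
    """Return (row, col) edge feature points of a grayscale image.

    Computes a central-difference gradient magnitude and keeps pixels
    whose normalized response exceeds the threshold.
    """
    g = gray.astype(float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]   # horizontal intensity change
    gy[1:-1, :] = g[2:, :] - g[:-2, :]   # vertical intensity change
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()
    rows, cols = np.nonzero(mag > threshold)
    return list(zip(rows.tolist(), cols.tolist()))
```

On an image containing a sharp brightness step, the returned points cluster along the step, which is the behavior the edge-feature extraction relies on.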
Step 505: calculate coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generate a pattern of the object to be drawn according to the coordinates; the pattern includes a three-dimensional drawing of the object to be drawn and/or a plan view of the object to be drawn.
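The coordinate calculation of step 505 can be sketched as pinhole back-projection of each edge pixel using its depth value, followed by a pose transform into the environment's coordinate system. The intrinsics matrix `K` and the (R, t) pose representation are assumptions about how the camera assembly's parameter information and the real-time pose data might be supplied:

```python
import numpy as np

def edge_points_to_world(edge_pixels, depth, K, R, t):
    """Map 2-D edge feature points into the environment's 3-D coordinate system.

    edge_pixels: iterable of (row, col) pixel positions.
    depth:       H x W second depth image (metres per pixel).
    K:           3x3 pinhole intrinsics [[fx,0,cx],[0,fy,cy],[0,0,1]].
    (R, t):      real-time pose mapping camera coordinates to world coordinates.
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    world = []
    for r, c in edge_pixels:
        z = depth[r, c]
        # back-project through the pinhole model ...
        cam = np.array([(c - cx) * z / fx, (r - cy) * z / fy, z])
        # ... then move into the environment frame using the pose
        world.append(R @ cam + t)
    return np.array(world)
```

A pixel at the principal point with depth 2 m maps to (0, 0, 2) in camera coordinates before the pose transform is applied.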
For implementations of the above steps 501 and 505, reference may be made to the above steps 101 and 104, which are not repeated here.
In practical applications, when the object to be drawn is drawn, the user usually moves the terminal around the object to be drawn, so as to capture the two-dimensional image of the entire surface of the object to be drawn and the second depth image corresponding to the two-dimensional image.
For example, where the above terminal is a mobile phone, a user who sees an artistic sculpture and wants to draw it can use the mobile phone to shoot around the entire surface of the sculpture, obtain the two-dimensional image of the sculpture and the second depth image corresponding to the two-dimensional image, and in turn obtain the three-dimensional drawing and/or plan view of the sculpture. During shooting, however, the user may be unable to confirm whether the entire surface of the sculpture has been photographed without leaving any area unphotographed.
In addition, when the structure of the object to be drawn is relatively symmetrical, for example when an ellipsoidal object to be drawn is photographed, the user may likewise be unable to confirm whether shooting of the entire surface of the object to be drawn has been completed.
Therefore, in some embodiments of the present application, the object drawing method may further include: displaying a movement guidance mark on the shooting interface of the terminal according to the real-time pose data of the terminal, where the movement guidance mark changes as the position of the terminal changes, so that the user can judge from the guidance mark whether shooting of the entire surface of the object to be drawn has been completed.
For example, the above movement guidance mark may be used to guide the terminal to shoot 360 degrees around the object to be drawn with the vertical direction as the axis, and to guide the terminal to shoot the object to be drawn upward and downward along the vertical direction, so as to obtain the second depth image and the two-dimensional image.
Here, shooting 360 degrees around the object to be drawn with the vertical direction as the axis includes capturing the front view, left view, rear view and right view of the object to be drawn, and shooting upward and downward along the vertical direction includes capturing the bottom view and top view of the object to be drawn.
Specifically, as shown in Fig. 6, the above movement guidance mark may include a vertical axis 61 with an upward arrow and a downward arrow, and a guide line 62 with an arrow. When the end of the guide line 62 without an arrow coincides with the end with an arrow, it indicates that the front view, left view, rear view and right view of the object to be drawn have been captured; when the upward arrow of the vertical axis 61 disappears, it indicates that the bottom view of the object to be drawn has been captured; when the downward arrow of the vertical axis 61 disappears, it indicates that the top view of the object to be drawn has been captured.
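One hypothetical way to drive such a guidance mark from the real-time pose data is to track which azimuth sectors around the object have been covered and whether the upward and downward shots have been taken; all names and angle thresholds below are illustrative assumptions rather than the patent's design:

```python
class GuidanceMark:
    """Track shooting coverage from real-time pose data (illustrative sketch)."""

    def __init__(self, sectors=36):
        self.sectors = sectors
        self.covered = set()      # azimuth sectors already shot
        self.top_done = False     # downward shot (top view) finished
        self.bottom_done = False  # upward shot (bottom view) finished

    def update(self, yaw_deg, pitch_deg):
        # bucket the horizontal orientation into one of `sectors` bins
        self.covered.add(int(yaw_deg % 360) * self.sectors // 360)
        if pitch_deg <= -60:      # camera looking down steeply
            self.top_done = True
        if pitch_deg >= 60:       # camera looking up steeply
            self.bottom_done = True

    @property
    def ring_closed(self):
        # analogue of the two ends of guide line 62 coinciding
        return len(self.covered) == self.sectors

    def arrows(self):
        # which arrows of vertical axis 61 are still displayed
        return {'up': not self.bottom_done, 'down': not self.top_done}
```

After the terminal has reported yaw values spanning all sectors, the ring closes; once a steep upward pose is reported, the upward arrow can be hidden.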
It should be noted that the above is merely illustrative. In some embodiments of the present application, the guidance mark may also be a guidance mark of another shape, as long as it can guide the user to complete shooting of the entire surface of the object to be drawn.
In addition, when the object to be drawn has a direction in which it cannot be photographed, the view in that direction may be directly abandoned, triggered by the user, and shooting considered complete; that is, the entire surface of the object to be drawn may be regarded as having been photographed.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and does not constitute any limitation on the implementation process of the embodiments of the present application.
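Generating a plan view from the 3-D edge-point coordinates of step 505 can be sketched as an orthographic projection that drops one world axis per view; which two axes each of the six views keeps is an assumption for illustration, not a detail from the patent:

```python
import numpy as np

# Which world axes survive in each orthographic view (assumed layout:
# x = right, y = forward, z = up).
VIEW_AXES = {
    'front': (0, 2), 'rear': (0, 2),    # keep x, z
    'left':  (1, 2), 'right': (1, 2),   # keep y, z
    'top':   (0, 1), 'bottom': (0, 1),  # keep x, y
}

def plan_view(points_3d, view):
    """Orthographically project (N, 3) edge points for one named view."""
    a, b = VIEW_AXES[view]
    return points_3d[:, [a, b]]
```

Selecting a projection direction, as in the projection-direction selection instruction of the claims, then amounts to choosing one of the keys of `VIEW_AXES` (or three of them for a three-view drawing).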
Fig. 7 shows a schematic structural diagram of an object plotting device 700 provided by an embodiment of the present application. The object plotting device is configured in a terminal and includes an establishing unit 701, an acquiring unit 702, an extraction unit 703 and an image-drawing unit 704.
The establishing unit 701 is configured to obtain a first depth image of the real environment captured by the camera assembly, and establish a three-dimensional coordinate system based on the real environment according to the first depth image.
The acquiring unit 702 is configured to obtain a two-dimensional image obtained by the camera assembly shooting the entire surface of the object to be drawn and a second depth image corresponding to the two-dimensional image, and simultaneously obtain real-time pose data of the terminal.
The extraction unit 703 is configured to perform edge extraction on the two-dimensional image to obtain edge feature points corresponding to the two-dimensional image.
The image-drawing unit 704 is configured to calculate coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generate a pattern of the object to be drawn according to the coordinates; the pattern includes a three-dimensional drawing of the object to be drawn and/or a plan view of the object to be drawn.
It should be noted that, for convenience and brevity of description, the specific working process of the object plotting device 700 described above may refer to the corresponding processes of the methods described with reference to Figs. 1 to 6, and details are not described here again.
As shown in Fig. 8, the present application provides a terminal for implementing the above object drawing method. The terminal may be a smart phone, a tablet computer, a personal computer (PC), a learning machine, or the like, and may include a processor 81, a memory 82, one or more input devices 83 (only one is shown in Fig. 8), one or more output devices 84 (only one is shown in Fig. 8) and a camera assembly 85. The processor 81, the memory 82, the input device 83, the output device 84 and the camera assembly 85 are connected by a bus 86.
It should be understood that, in the embodiments of the present application, the processor 81 may be a central processing unit; the processor may also be another general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 83 may include a virtual keyboard, a trackpad, a fingerprint sensor (for acquiring fingerprint information of the user and directional information of the fingerprint), a microphone, etc.; the output device 84 may include a display, a loudspeaker, etc.
The memory 82 may include a read-only memory and a random access memory, and provides instructions and data to the processor 81. Part or all of the memory 82 may also include a non-volatile random access memory. For example, the memory 82 may also store information on the device type.
The memory 82 stores a computer program that can run on the processor 81, for example a program of the object drawing method. When executing the computer program, the processor 81 implements the steps in the above embodiments of the object drawing method, for example steps 101 to 104 shown in Fig. 1. Alternatively, when executing the computer program, the processor 81 implements the functions of the modules/units in the above device embodiments, for example the functions of the units 701 to 704 shown in Fig. 7.
The computer program may be divided into one or more modules/units, which are stored in the memory 82 and executed by the processor 81 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, the instruction segments being used to describe the execution process of the computer program in the terminal performing the drawing of the object. For example, the computer program may be divided into an establishing unit, an acquiring unit, an extraction unit and an image-drawing unit, with the specific functions of each unit as follows: the establishing unit is configured to obtain a first depth image of the real environment captured by the camera assembly, and establish a three-dimensional coordinate system based on the real environment according to the first depth image; the acquiring unit is configured to obtain a two-dimensional image obtained by the camera assembly shooting the entire surface of the object to be drawn and a second depth image corresponding to the two-dimensional image, and simultaneously obtain real-time pose data of the terminal; the extraction unit is configured to perform edge extraction on the two-dimensional image to obtain edge feature points corresponding to the two-dimensional image; the image-drawing unit is configured to calculate coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generate a pattern of the object to be drawn according to the coordinates; the pattern includes a three-dimensional drawing of the object to be drawn and/or a plan view of the object to be drawn.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional units and modules is illustrated by example. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the above device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the above integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other, and are not intended to limit the scope of protection of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, the description of each embodiment has its own emphasis. For parts that are not detailed or recorded in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art may realize that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementation should not be considered beyond the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed device/terminal and method may be implemented in other ways. For example, the device/terminal embodiments described above are merely schematic. For example, the division of the above modules or units is only a logical function division; in actual implementation there may be another division manner, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or in the form of a software functional unit.
If the above integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of the present application may also be completed by instructing relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, certain intermediate forms, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, computer-readable media do not include electric carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of the technical features thereof can be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be included within the scope of protection of the present application.

Claims (10)

1. An object drawing method applied to a terminal, characterized by comprising:
obtaining a first depth image of a real environment captured by a camera assembly, and establishing, according to the first depth image, a three-dimensional coordinate system based on the real environment;
obtaining a two-dimensional image obtained by the camera assembly shooting the entire surface of an object to be drawn and a second depth image corresponding to the two-dimensional image, and simultaneously obtaining real-time pose data of the terminal;
performing edge extraction on the two-dimensional image to obtain edge feature points corresponding to the two-dimensional image;
calculating coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generating a pattern of the object to be drawn according to the coordinates, wherein the pattern comprises a three-dimensional drawing of the object to be drawn and/or a plan view of the object to be drawn.
2. The drawing method according to claim 1, wherein the generating the plan view of the object to be drawn according to the coordinates comprises:
receiving a projection-direction selection instruction for the three-dimensional drawing, and generating six views or a three-view drawing of the object to be drawn according to the projection-direction selection instruction.
3. The drawing method according to claim 1 or 2, wherein the calculating coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image comprises:
determining pixel coordinates of the edge feature points on the two-dimensional image according to positions of the edge feature points on the two-dimensional image and corresponding depth values;
mapping the pixel coordinates to coordinates in the three-dimensional coordinate system according to parameter information of the camera assembly and the real-time pose data.
4. The drawing method according to claim 1, wherein the obtaining a two-dimensional image obtained by the camera assembly shooting the entire surface of the object to be drawn and a second depth image corresponding to the two-dimensional image comprises:
obtaining multiple frames of sub two-dimensional images obtained by the camera assembly shooting the entire surface of the object to be drawn, and sub depth images corresponding to the sub two-dimensional images;
splicing the sub two-dimensional images and the sub depth images according to the real-time pose data of the terminal to obtain the two-dimensional image and the second depth image corresponding to the two-dimensional image;
and wherein the performing edge extraction on the two-dimensional image comprises:
identifying the object to be drawn in the two-dimensional image, and performing edge extraction on the object to be drawn in the two-dimensional image to obtain the edge feature points corresponding to the two-dimensional image.
5. The drawing method according to claim 1, wherein the drawing method further comprises:
displaying a movement guidance mark on a shooting interface of the terminal according to the real-time pose data of the terminal, wherein the movement guidance mark changes with a change of the position of the terminal.
6. The drawing method according to claim 1, wherein the obtaining real-time pose data of the terminal comprises:
when establishment of the three-dimensional coordinate system is completed, obtaining six-degree-of-freedom pose data of the terminal in real time by using an inertial measurement unit (IMU).
7. An object plotting device configured in a terminal, characterized by comprising:
an establishing unit, configured to obtain a first depth image of a real environment captured by a camera assembly, and establish, according to the first depth image, a three-dimensional coordinate system based on the real environment;
an acquiring unit, configured to obtain a two-dimensional image obtained by the camera assembly shooting the entire surface of an object to be drawn and a second depth image corresponding to the two-dimensional image, and simultaneously obtain real-time pose data of the terminal;
an extraction unit, configured to perform edge extraction on the two-dimensional image to obtain edge feature points corresponding to the two-dimensional image;
an image-drawing unit, configured to calculate coordinates of the edge feature points in the three-dimensional coordinate system according to the real-time pose data and the second depth image, and generate a pattern of the object to be drawn according to the coordinates, wherein the pattern comprises a three-dimensional drawing of the object to be drawn and/or a plan view of the object to be drawn.
8. The plotting device according to claim 7, wherein:
the image-drawing unit is further configured to receive a projection-direction selection instruction for the three-dimensional drawing, and generate six views or a three-view drawing of the object to be drawn according to the projection-direction selection instruction.
9. A terminal, comprising a camera assembly, a memory, a processor and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 6.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 6.
CN201910565646.7A 2019-06-26 2019-06-26 Object drawing method, device, terminal and computer-readable storage medium Active CN110276774B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910565646.7A CN110276774B (en) 2019-06-26 2019-06-26 Object drawing method, device, terminal and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910565646.7A CN110276774B (en) 2019-06-26 2019-06-26 Object drawing method, device, terminal and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN110276774A true CN110276774A (en) 2019-09-24
CN110276774B CN110276774B (en) 2021-07-23

Family

ID=67963487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910565646.7A Active CN110276774B (en) 2019-06-26 2019-06-26 Object drawing method, device, terminal and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN110276774B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110838160A (en) * 2019-10-30 2020-02-25 广东优世联合控股集团股份有限公司 Building image display mode conversion method, device and storage medium
CN111063015A (en) * 2019-12-13 2020-04-24 重庆首厚智能科技研究院有限公司 Method and system for efficiently drawing point locations
CN112150527A (en) * 2020-08-31 2020-12-29 深圳市慧鲤科技有限公司 Measuring method and device, electronic device and storage medium
CN112197708A (en) * 2020-08-31 2021-01-08 深圳市慧鲤科技有限公司 Measuring method and device, electronic device and storage medium
CN112348890A (en) * 2020-10-27 2021-02-09 深圳技术大学 Space positioning method and device and computer readable storage medium
CN116704129A (en) * 2023-06-14 2023-09-05 维坤智能科技(上海)有限公司 Panoramic view-based three-dimensional image generation method, device, equipment and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1559055A (en) * 2001-09-26 2004-12-29 Image generating apparatus, image generating method, and computer program
CN103500467A (en) * 2013-10-21 2014-01-08 深圳市易尚展示股份有限公司 Constructive method of image-based three-dimensional model
US20140160123A1 (en) * 2012-12-12 2014-06-12 Microsoft Corporation Generation of a three-dimensional representation of a user
CN106651755A (en) * 2016-11-17 2017-05-10 宇龙计算机通信科技(深圳)有限公司 Panoramic image processing method and device for terminal and terminal
CN107562226A (en) * 2017-09-15 2018-01-09 广东虹勤通讯技术有限公司 A kind of 3D drafting systems and method
CN108170297A (en) * 2017-09-11 2018-06-15 南京睿悦信息技术有限公司 Real-time six degree of freedom VR/AR/MR equipment localization methods
CN108225258A (en) * 2018-01-09 2018-06-29 天津大学 Based on inertance element and laser tracker dynamic pose measuring apparatus and method
CN108304119A (en) * 2018-01-19 2018-07-20 腾讯科技(深圳)有限公司 object measuring method, intelligent terminal and computer readable storage medium
CN108564616A (en) * 2018-03-15 2018-09-21 中国科学院自动化研究所 Method for reconstructing three-dimensional scene in the rooms RGB-D of fast robust
CN108629830A (en) * 2018-03-28 2018-10-09 深圳臻迪信息技术有限公司 A kind of three-dimensional environment method for information display and equipment
CN108924412A (en) * 2018-06-22 2018-11-30 维沃移动通信有限公司 A kind of image pickup method and terminal device
CN109102537A (en) * 2018-06-25 2018-12-28 中德人工智能研究院有限公司 A kind of three-dimensional modeling method and system of laser radar and the combination of ball curtain camera
CN109186461A (en) * 2018-07-27 2019-01-11 南京阿凡达机器人科技有限公司 A kind of measurement method and measuring device of cabinet size

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1559055A (en) * 2001-09-26 2004-12-29 Image generating apparatus, image generating method, and computer program
US20140160123A1 (en) * 2012-12-12 2014-06-12 Microsoft Corporation Generation of a three-dimensional representation of a user
CN103500467A (en) * 2013-10-21 2014-01-08 深圳市易尚展示股份有限公司 Constructive method of image-based three-dimensional model
CN106651755A (en) * 2016-11-17 2017-05-10 宇龙计算机通信科技(深圳)有限公司 Panoramic image processing method and device for terminal and terminal
CN108170297A (en) * 2017-09-11 2018-06-15 南京睿悦信息技术有限公司 Real-time six degree of freedom VR/AR/MR equipment localization methods
CN107562226A (en) * 2017-09-15 2018-01-09 广东虹勤通讯技术有限公司 A kind of 3D drafting systems and method
CN108225258A (en) * 2018-01-09 2018-06-29 天津大学 Based on inertance element and laser tracker dynamic pose measuring apparatus and method
CN108304119A (en) * 2018-01-19 2018-07-20 腾讯科技(深圳)有限公司 object measuring method, intelligent terminal and computer readable storage medium
CN108564616A (en) * 2018-03-15 2018-09-21 中国科学院自动化研究所 Method for reconstructing three-dimensional scene in the rooms RGB-D of fast robust
CN108629830A (en) * 2018-03-28 2018-10-09 深圳臻迪信息技术有限公司 A kind of three-dimensional environment method for information display and equipment
CN108924412A (en) * 2018-06-22 2018-11-30 维沃移动通信有限公司 A kind of image pickup method and terminal device
CN109102537A (en) * 2018-06-25 2018-12-28 中德人工智能研究院有限公司 A kind of three-dimensional modeling method and system of laser radar and the combination of ball curtain camera
CN109186461A (en) * 2018-07-27 2019-01-11 南京阿凡达机器人科技有限公司 A kind of measurement method and measuring device of cabinet size

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
凯德: "《Pro/ENGINEER技术应用从业通》", 30 September 2008 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110838160A (en) * 2019-10-30 2020-02-25 广东优世联合控股集团股份有限公司 Building image display mode conversion method, device and storage medium
CN111063015A (en) * 2019-12-13 2020-04-24 重庆首厚智能科技研究院有限公司 Method and system for efficiently drawing point locations
CN112150527A (en) * 2020-08-31 2020-12-29 深圳市慧鲤科技有限公司 Measuring method and device, electronic device and storage medium
CN112197708A (en) * 2020-08-31 2021-01-08 深圳市慧鲤科技有限公司 Measuring method and device, electronic device and storage medium
CN112150527B (en) * 2020-08-31 2024-05-17 深圳市慧鲤科技有限公司 Measurement method and device, electronic equipment and storage medium
CN112348890A (en) * 2020-10-27 2021-02-09 深圳技术大学 Space positioning method and device and computer readable storage medium
CN112348890B (en) * 2020-10-27 2024-01-23 深圳技术大学 Space positioning method, device and computer readable storage medium
CN116704129A (en) * 2023-06-14 2023-09-05 维坤智能科技(上海)有限公司 Panoramic view-based three-dimensional image generation method, device, equipment and storage medium
CN116704129B (en) * 2023-06-14 2024-01-30 维坤智能科技(上海)有限公司 Panoramic view-based three-dimensional image generation method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN110276774B (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN110276774A (en) Object drawing method, device, terminal and computer-readable storage medium
CN110006343B (en) Method and device for measuring geometric parameters of object and terminal
CN112652016B (en) Point cloud prediction model generation method, pose estimation method and pose estimation device
CN105143907B (en) Alignment system and method
AU2013224660B2 (en) Automated frame of reference calibration for augmented reality
CN110246147A (en) Vision inertia odometer method, vision inertia mileage counter device and mobile device
CN109003325A (en) A kind of method of three-dimensional reconstruction, medium, device and calculate equipment
CN109242961A (en) A kind of face modeling method, apparatus, electronic equipment and computer-readable medium
CN106980368A (en) A kind of view-based access control model calculating and the virtual reality interactive device of Inertial Measurement Unit
CN108805917A (en) Sterically defined method, medium, device and computing device
CN110378994A (en) Human face model building and Related product
CN109840949A (en) Augmented reality image processing method and device based on optical alignment
CN110276317A (en) A kind of dimension of object detection method, dimension of object detection device and mobile terminal
CN103136744A (en) Apparatus and method for calculating three dimensional (3D) positions of feature points
CN111833447A (en) Three-dimensional map construction method, three-dimensional map construction device and terminal equipment
CN110378947A (en) 3D model reconstruction method, device and electronic equipment
US20230298280A1 (en) Map for augmented reality
CN108898669A (en) Data processing method, device, medium and calculating equipment
CN111161398A (en) Image generation method, device, equipment and storage medium
CN112657176A (en) Binocular projection man-machine interaction method combined with portrait behavior information
CN110222651A (en) A kind of human face posture detection method, device, terminal device and readable storage medium storing program for executing
CN110310325A (en) A kind of virtual measurement method, electronic equipment and computer readable storage medium
US10296098B2 (en) Input/output device, input/output program, and input/output method
CN113724309A (en) Image generation method, device, equipment and storage medium
CN110378948A (en) 3D model reconstruction method, device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant