CN110209325A - 3D scene display control method, system and device - Google Patents

3D scene display control method, system and device

Info

Publication number
CN110209325A
Authority
CN
China
Prior art keywords
video camera
selection area
display control
roaming
roaming video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910377060.8A
Other languages
Chinese (zh)
Inventor
吴智敏
郭建伟
胡颖
张永涛
李江明
俞翔
黄仝宇
汪刚
宋一兵
侯玉清
刘双广
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gosuncn Technology Group Co Ltd
Original Assignee
Gosuncn Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gosuncn Technology Group Co Ltd filed Critical Gosuncn Technology Group Co Ltd
Priority to CN201910377060.8A priority Critical patent/CN110209325A/en
Publication of CN110209325A publication Critical patent/CN110209325A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a 3D scene display control method, comprising: in response to a selection instruction, obtaining the selection area corresponding to the selection instruction in a 3D scene; rotating a roaming camera so that the roaming camera is aimed at the center point of the selection area; moving the roaming camera toward the selection area so that the selection area appears magnified on the current screen; judging whether any boundary point of the selection area falls outside the field of view of the roaming camera; if so, stopping the movement of the roaming camera; if not, continuing to move the roaming camera. The invention also discloses a 3D scene display control system and a 3D scene display control device. With the embodiments of the present invention, display control of a 3D scene becomes simple, convenient and efficient to operate.

Description

3D scene display control method, system and device
Technical field
The present invention relates to the field of 3D display technology, and more particularly to a 3D scene display control method, system and device.
Background art
In the prior art, when viewing the image of a specified point, the user usually moves the mouse to the point to be viewed and magnifies the image with the mouse wheel or the keyboard. In a typical 3D scene interaction scheme, for example, the viewing angle of the 3D scene is controlled by operations such as keyboard arrow keys and mouse panning, zooming the 3D scene in and out is achieved with the mouse wheel, and other controls over the 3D scene are applied through UI widgets. In other words, to magnify part of a 3D scene the user has to combine keyboard keys, the mouse and additional UI widgets, and this way of interacting is not natural. Therefore, when a roaming task is carried out in a 3D scene with prior-art interaction (for example, when a particular detail of the scene needs to be examined), several controls must be used in combination, which makes the operation rather complex.
Summary of the invention
The purpose of the embodiments of the present invention is to provide a 3D scene display control method, system and device, so that display control of a 3D scene is simple, convenient and efficient to operate.
To achieve the above object, an embodiment of the present invention provides a 3D scene display control method, comprising:
in response to a selection instruction, obtaining the selection area corresponding to the selection instruction in the 3D scene;
rotating a roaming camera so that the roaming camera is aimed at the center point of the selection area;
moving the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
judging whether any boundary point of the selection area falls outside the field of view of the roaming camera;
if so, stopping the movement of the roaming camera; if not, continuing to move the roaming camera.
Compared with the prior art, the 3D scene display control method provided by the embodiments of the present invention first generates the selection area in response to the selection instruction; that is, the user only needs to issue a selection instruction and does not have to manipulate any other control. The roaming camera then automatically approaches the selection area so that the selection area appears magnified on the current screen, achieving the effect of zooming in on the selection area. In addition, because of the camera's perspective, distant objects look small and near objects look large, so as the roaming camera keeps approaching the selection area in the 3D scene, the selection area looks larger and larger from the camera's point of view. Viewing the 3D scene with a roaming camera and magnifying the selection area on the basis of a single selection instruction requires no combination of multiple controls, so display control of the 3D scene becomes simple, convenient and efficient.
As an improvement of the above scheme, the selection instruction is a long-press instruction issued by a pointing device; obtaining the selection area corresponding to the selection instruction then specifically comprises:
obtaining a first touch point corresponding to the press-down operation of the pointing device and a second touch point corresponding to its lift-up operation;
constructing the selection area from the first touch point and the second touch point.
As an improvement of the above scheme, the selection area is a rectangular area or a circular area; wherein,
when the selection area is a rectangular area, the first touch point and the second touch point are two non-adjacent vertices of the rectangular area;
when the selection area is a circular area, the first touch point and the second touch point are boundary points of the circular area, and the straight line connecting the first touch point and the second touch point is a diameter of the circular area.
As an improvement of the above scheme, the field of view of the roaming camera is a rectangular pyramid with the roaming camera at its apex, and the base of the rectangular pyramid is the display area of the current screen.
As an improvement of the above scheme, moving the roaming camera toward the selection area specifically comprises:
moving the roaming camera toward the selection area along a preset straight line, wherein the preset straight line is the line connecting the initial position of the roaming camera with the center point of the selection area.
As an improvement of the above scheme, while the roaming camera is being moved toward the selection area, the method further comprises:
applying uniform interpolation to the position and attitude of the roaming camera.
As an improvement of the above scheme, the method further comprises:
moving the roaming camera away from the selection area, so that the selection area appears reduced on the current screen.
To achieve the above object, an embodiment of the present invention further provides a 3D scene display control system, comprising:
a selection area acquisition module, configured to obtain, in response to a selection instruction, the selection area corresponding to the selection instruction in the 3D scene;
a roaming camera control module, configured to rotate the roaming camera so that it is aimed at the center point of the selection area, and to move the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
a judgment module, configured to judge whether any boundary point of the selection area falls outside the field of view of the roaming camera; if so, to stop the movement of the roaming camera; if not, to continue moving the roaming camera.
As an improvement of the above scheme, the selection instruction is a long-press instruction issued by a pointing device; the selection area acquisition module is then specifically configured to:
obtain a first touch point corresponding to the press-down operation of the pointing device and a second touch point corresponding to its lift-up operation;
construct the selection area from the first touch point and the second touch point.
To achieve the above object, an embodiment of the present invention further provides a 3D scene display control device, comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, wherein the processor implements the 3D scene display control method described in any of the above embodiments when executing the computer program.
Description of the drawings
Fig. 1 is a flowchart of a 3D scene display control method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the video perspective geometry model used for video projection onto a 3D map according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the position of the selection area in a 3D scene display control method provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of roaming-camera imaging in a 3D scene display control method provided by an embodiment of the present invention;
Fig. 5 is a structural diagram of a 3D scene display control system 10 provided by an embodiment of the present invention;
Fig. 6 is a structural diagram of a 3D scene display control device 20 provided by an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art on the basis of the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Embodiment one
Referring to Fig. 1, Fig. 1 is a flowchart of a 3D scene display control method provided by an embodiment of the present invention; the method comprises:
S1. In response to a selection instruction, obtain the selection area corresponding to the selection instruction in the 3D scene;
S2. Rotate the roaming camera so that the roaming camera is aimed at the center point of the selection area;
S3. Move the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
S4. Judge whether any boundary point of the selection area falls outside the field of view of the roaming camera;
S5. If so, stop moving the roaming camera; if not, continue moving the roaming camera.
It is worth noting that the 3D scene described in the embodiments of the present invention is a 3D map; the embodiments of the present invention do not apply to 2D scenes. Real geographic space is three-dimensional and cannot be fully represented by a 2D map. For example, the roaming camera has both high and low vantage points; with a 2D map, the elevation information of the roaming camera would be filtered out. Moreover, the roaming camera can be controlled in both the horizontal and the vertical direction, whereas with a 2D map only the horizontal control parameters of a dome camera could be calculated, and only roughly.
The roaming camera described in the embodiments of the present invention makes use of video projection onto a 3D map. In this technique, a projection plane is established according to the position and attitude of the video projection point. As shown in Fig. 2, the local coordinate system O1-xyz represents the position and attitude of the projection point, and a, b, c, d define the constructed projection plane, which is a rectangle that stands in for the monitoring-video picture in the geometric calculation. The coordinates of point O1 are the position of the dome camera in the real world (geographic space), which is also the position of the video projection point. The projected position of each mesh vertex of the 3D map on the projection plane is then calculated; if the projected position falls inside the region abcd, the video data is draped over that mesh vertex as texture information. The 3D scene of the embodiments of the present invention also contains a camera that projects the 3D scene constructed in the computer onto the display screen. This camera is virtual and exists only in the computer's 3D scene; by changing its spatial position, the picture that the 3D scene presents on the display screen changes continuously. This camera is referred to herein as the "roaming camera".
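By way of illustration only, the projection test just described can be sketched as follows. This is a minimal sketch and not the patented implementation; the parameterization of the plane by a center point, two in-plane unit axes and half-extents, the helper name project_vertex, and all numeric values are assumptions of the example. A mesh vertex receives video texture only if the ray from O1 through the vertex hits the rectangle a-b-c-d.

    import numpy as np

    def project_vertex(o1, vertex, plane_center, u_axis, v_axis, half_w, half_h):
        """Intersect the ray from the projection point O1 through a 3D-map mesh vertex
        with the rectangular projection plane a-b-c-d, and report whether the hit point
        lies inside the rectangle, i.e. whether the vertex receives video texture."""
        normal = np.cross(u_axis, v_axis)              # plane normal
        direction = vertex - o1
        denom = float(np.dot(direction, normal))
        if abs(denom) < 1e-9:                          # ray runs parallel to the plane
            return False, None
        t = float(np.dot(plane_center - o1, normal)) / denom
        if t <= 0.0:                                   # plane lies behind the projection point
            return False, None
        hit = o1 + t * direction
        u = float(np.dot(hit - plane_center, u_axis))  # coordinate along the plane width
        v = float(np.dot(hit - plane_center, v_axis))  # coordinate along the plane height
        inside = abs(u) <= half_w and abs(v) <= half_h
        return inside, (u, v)

    # Example with assumed values: projector at the origin, plane 5 m away along +x.
    o1 = np.array([0.0, 0.0, 3.0])
    plane_center = np.array([5.0, 0.0, 3.0])
    u_axis = np.array([0.0, 1.0, 0.0])                 # unit vector spanning the plane width
    v_axis = np.array([0.0, 0.0, 1.0])                 # unit vector spanning the plane height
    vertex = np.array([20.0, 1.5, 0.0])                # one mesh vertex of the 3D map
    inside, uv = project_vertex(o1, vertex, plane_center, u_axis, v_axis, 2.0, 1.2)
    print(inside, uv)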
Specifically, in step S1, in response to a selection instruction, the selection area corresponding to the selection instruction is obtained in the 3D scene. Preferably, the selection instruction is a long-press instruction issued by a pointing device: a first touch point corresponding to the press-down operation of the pointing device and a second touch point corresponding to its lift-up operation are obtained, and the selection area is constructed from the first touch point and the second touch point. The pointing device may be a mouse, a stylus or any other device with a pointing function.
Preferably, the selection area is a rectangular area or a circular area. When the selection area is a rectangular area, the first touch point and the second touch point are two non-adjacent vertices of the rectangular area; when the selection area is a circular area, the first touch point and the second touch point are boundary points of the circular area, and the straight line connecting the first touch point and the second touch point is a diameter of the circular area.
Specifically, referring to Fig. 3, Fig. 3 is a schematic diagram of the position of the selection area in a 3D scene display control method provided by an embodiment of the present invention. The user long-presses the mouse button and drags the mouse; the two positions at which the mouse button is pressed down and lifted up (i.e. the first touch point and the second touch point) define a rectangular area ABCD.
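As a minimal sketch of this construction (illustrative only; the coordinate values and the helper names Rect and rect_from_touch_points are not taken from the patent), the two touch points are treated as opposite corners of an axis-aligned screen rectangle:

    from dataclasses import dataclass

    @dataclass
    class Rect:
        """Axis-aligned screen rectangle ABCD built from two touch points."""
        left: float
        top: float
        right: float
        bottom: float

        def center(self):
            return ((self.left + self.right) / 2.0, (self.top + self.bottom) / 2.0)

        def corners(self):
            # A, B, C, D in screen coordinates, clockwise from the top-left corner
            return [(self.left, self.top), (self.right, self.top),
                    (self.right, self.bottom), (self.left, self.bottom)]

    def rect_from_touch_points(press, release):
        """press = position where the mouse button went down, release = where it was
        lifted (screen pixels); the two points become non-adjacent vertices of ABCD."""
        (x1, y1), (x2, y2) = press, release
        return Rect(min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

    # Example: long-press at (320, 180), drag, release at (560, 340)
    selection = rect_from_touch_points((320, 180), (560, 340))
    print(selection.center(), selection.corners())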
Specifically, in step S2, the roaming camera is rotated so that it is aimed at the center point of the selection area. Preferably, the field of view of the roaming camera is a rectangular pyramid with the roaming camera at its apex, and the base of the rectangular pyramid is the display area of the current display screen.
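Aiming the camera at the center point amounts to the usual look-at construction; the sketch below is only an illustration of that step under assumed values (the positions, the world-up axis and the function name look_at are inventions of the example, and degenerate cases such as looking straight along the up axis are not handled):

    import numpy as np

    def look_at(camera_pos, target, world_up=np.array([0.0, 0.0, 1.0])):
        """Build the rotation that aims the roaming camera at a target point.
        Returns a 3x3 matrix whose columns are the camera's right, up and forward
        axes, with forward pointing from the camera toward the target."""
        forward = target - camera_pos
        forward = forward / np.linalg.norm(forward)
        right = np.cross(forward, world_up)
        right = right / np.linalg.norm(right)
        up = np.cross(right, forward)
        return np.column_stack((right, up, forward))

    # Aim the roaming camera at the center point of the selection area (assumed values).
    camera_pos = np.array([100.0, 50.0, 80.0])
    region_center = np.array([150.0, 90.0, 0.0])
    rotation = look_at(camera_pos, region_center)
    print(rotation)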
Specifically, in step S3, the roaming camera is moved toward the selection area, so that the selection area appears magnified on the current screen. It is worth noting that "magnified" here means that the display area occupied by the selection area on the screen becomes larger; the actual size of the selection area remains fixed.
Preferably, moving the roaming camera toward the selection area specifically comprises: moving the roaming camera toward the selection area along a preset straight line, wherein the preset straight line is the line connecting the initial position of the roaming camera with the center point of the selection area.
The rectangular area ABCD is merely a rectangular frame on the display screen; by itself it does not determine which region of the 3D map has been framed. For this purpose, referring to Fig. 4, the embodiment of the present invention constructs a rectangular pyramid o-ABCD from the position of the roaming camera (the coordinate system o-xyz represents the position and attitude of the roaming camera) and the selected rectangular area ABCD, extends the pyramid with the camera position as its apex, and computes the intersection of the extended pyramid with the 3D map. While the roaming camera moves closer, the display positions of these four intersection points are kept near the edge of the screen, so that the region they enclose is displayed as large as possible; in this way the camera magnifies the details of the 3D map according to the position and size of the selection area. Because of the camera's perspective, distant objects look small and near objects look large, so as the roaming camera keeps approaching the selection area in the 3D scene, the selection area looks larger and larger from the camera's point of view (its actual size, of course, does not change).
As shown in Fig. 4, S1S2S3S4 is the projection plane (the display screen) and ABCD is the area framed by the user (the framed region on the screen). Assuming the user has framed the rectangular area ABCD, the projection relationship tells us that what the user has actually selected is the region A1B1C1D1 on the 3D map (the intersection of the extended rectangular pyramid with the 3D map); in other words, the user wishes to take a closer look at the region A1B1C1D1. The purpose of re-locating the region A1B1C1D1 is precisely to respond to the user's frame selection (which is screen-based) while telling the computer which region of the 3D map the user has really framed. A1, B1, C1 and D1 are the four intersection points (all four do not necessarily exist), and the four intersection points are not necessarily coplanar.
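One way to picture this re-location step is to cast a ray from the camera through each corner of the screen rectangle and intersect it with the map. The sketch below is only an illustration under simplifying assumptions: it relies on the inverse view-projection matrix that a rendering engine would normally provide, it intersects the rays with a flat ground plane z = 0 as a stand-in for the real 3D-map surface, and the function names are invented for the example.

    import numpy as np

    def screen_corner_ray(corner_px, screen_w, screen_h, inv_view_proj, camera_pos):
        """Turn a screen-space corner of the selection rectangle into a world-space ray
        direction from the roaming camera, via the inverse view-projection matrix."""
        ndc_x = 2.0 * corner_px[0] / screen_w - 1.0    # pixel -> normalized device coords
        ndc_y = 1.0 - 2.0 * corner_px[1] / screen_h
        far_point = inv_view_proj @ np.array([ndc_x, ndc_y, 1.0, 1.0])
        far_point = far_point[:3] / far_point[3]
        direction = far_point - camera_pos
        return direction / np.linalg.norm(direction)

    def intersect_ground(camera_pos, ray_dir, ground_z=0.0):
        """Intersect the ray with the horizontal plane z = ground_z.
        (A real 3D map would be intersected with its terrain mesh instead.)"""
        if abs(ray_dir[2]) < 1e-9:
            return None                                # ray never reaches the ground
        t = (ground_z - camera_pos[2]) / ray_dir[2]
        return camera_pos + t * ray_dir if t > 0 else None

    def frame_selection_on_map(corners_px, screen_w, screen_h, inv_view_proj, camera_pos):
        """Extend the rectangular pyramid o-ABCD through the four screen corners and
        return the map intersections A1, B1, C1, D1 (None where a corner ray misses)."""
        return [intersect_ground(camera_pos,
                                 screen_corner_ray(c, screen_w, screen_h,
                                                   inv_view_proj, camera_pos))
                for c in corners_px]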
Further, while the roaming camera is being moved toward the selection area, uniform interpolation is applied to the position and attitude of the roaming camera, so that the picture observed by the roaming camera changes smoothly and the user's 3D motion sickness is avoided.
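A common way to realize such uniform interpolation is to interpolate the position linearly and the attitude spherically over the frames of the move. The following is a generic sketch of that idea only; the quaternion convention, helper names and example endpoints are assumptions of the sketch and are not taken from the patent.

    import numpy as np

    def lerp(a, b, t):
        """Linear interpolation, used here for the camera position."""
        return (1.0 - t) * a + t * b

    def slerp(q0, q1, t):
        """Spherical linear interpolation of the camera attitude (unit quaternions)."""
        q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
        dot = float(np.dot(q0, q1))
        if dot < 0.0:                       # take the shorter arc
            q1, dot = -q1, -dot
        if dot > 0.9995:                    # nearly identical: plain lerp is stable enough
            q = lerp(q0, q1, t)
            return q / np.linalg.norm(q)
        theta = np.arccos(dot)
        return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

    def camera_poses(pos0, pos1, q0, q1, steps):
        """Yield uniformly interpolated camera poses (position, attitude) between the
        start and end of the move, one pose per rendered frame, for a smooth approach."""
        for i in range(steps + 1):
            t = i / steps
            yield lerp(pos0, pos1, t), slerp(q0, q1, t)

    # Example with assumed endpoints: identity attitude to a 90-degree yaw in 4 steps.
    for pos, q in camera_poses(np.array([100.0, 50.0, 80.0]), np.array([140.0, 82.0, 30.0]),
                               np.array([1.0, 0.0, 0.0, 0.0]),
                               np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)]),
                               steps=4):
        print(pos, q)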
Specifically, in steps S4 to S5, it is judged whether any boundary point of the selection area falls outside the field of view of the roaming camera. If so, the movement of the roaming camera is stopped; at this moment the selection area is displayed magnified on the display screen. If not, the roaming camera continues to move.
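The stop condition can be checked by projecting the boundary points with the camera's view-projection matrix and testing whether they remain inside normalized device coordinates. The sketch below is one possible formulation, not the patented one; the margin parameter, which keeps the region just inside the screen edge, is an assumption of the example.

    import numpy as np

    def in_field_of_view(point, view_proj, margin=0.98):
        """Project a boundary point of the selection area with the roaming camera's
        view-projection matrix and test whether it stays inside the view frustum."""
        clip = view_proj @ np.append(point, 1.0)
        if clip[3] <= 0.0:                     # point is behind the camera
            return False
        ndc = clip[:3] / clip[3]
        return abs(ndc[0]) <= margin and abs(ndc[1]) <= margin

    def should_stop(boundary_points, view_proj):
        """Stop moving the roaming camera as soon as any boundary point of the
        selection area would leave the field of view."""
        return any(not in_field_of_view(p, view_proj) for p in boundary_points)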
Further, the method also comprises: moving the roaming camera away from the selection area, so that the selection area appears reduced on the current screen. Alternatively, when the view of the selection area needs to be re-scaled, the roaming camera can be rotated and a more distant view can then be framed.
In a specific implementation, suppose the user wishes to observe a particular region of the 3D map. The user can frame that region by dragging, and the roaming camera will automatically move its viewpoint closer to a suitable position at which the field of view just covers the framed region. Meanwhile, during the approach to the framed region, uniform interpolation is applied to the position and attitude of the roaming camera, so that the picture observed by the camera changes smoothly and the user's 3D motion sickness is avoided.
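Putting steps S1 to S5 together, the whole behavior can be sketched as a small control loop. The code below is a self-contained illustration only: it replaces the rectangular viewing pyramid with a circular cone of an assumed half-angle to keep the check short, uses a fixed step size, and invents all names and numbers for the example.

    import numpy as np

    def out_of_view(cam_pos, forward, point, half_fov_rad):
        """True if point falls outside a simplified conical field of view
        (a stand-in for the rectangular viewing pyramid of the roaming camera)."""
        d = point - cam_pos
        d = d / np.linalg.norm(d)
        return np.arccos(np.clip(float(np.dot(d, forward)), -1.0, 1.0)) > half_fov_rad

    def approach_selection(cam_pos, region_center, boundary_points,
                           half_fov_rad=np.radians(30.0), step=1.0, max_steps=1000):
        """S2-S5: aim at the region center, then step toward it along the straight line
        from the start position to the center, stopping as soon as any boundary point
        of the selection area would leave the field of view."""
        forward = region_center - cam_pos
        forward = forward / np.linalg.norm(forward)        # S2: aim the roaming camera
        for _ in range(max_steps):
            next_pos = cam_pos + step * forward            # S3: move closer
            if any(out_of_view(next_pos, forward, p, half_fov_rad)
                   for p in boundary_points):              # S4: boundary check
                break                                      # S5: stop moving
            cam_pos = next_pos
        return cam_pos

    # Example: rectangular region A1B1C1D1 on the map plane z = 0 (assumed coordinates).
    region = [np.array(p, dtype=float) for p in [(140, 80, 0), (160, 80, 0),
                                                 (160, 100, 0), (140, 100, 0)]]
    center = sum(region) / 4.0
    print(approach_selection(np.array([100.0, 50.0, 80.0]), center, region))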
Compared with the prior art, the 3D scene display control method provided by this embodiment of the present invention first generates the selection area in response to the selection instruction; the user only needs to issue a selection instruction and does not have to manipulate any other control. The roaming camera then automatically approaches the selection area so that the selection area appears magnified on the current screen, achieving the effect of zooming in on the selection area. Viewing the 3D scene with a roaming camera and magnifying the selection area on the basis of the selection instruction requires no combination of multiple controls, so display control of the 3D scene becomes simple, convenient and efficient.
Embodiment two
Referring to Fig. 5, Fig. 5 is a structural diagram of a 3D scene display control system 10 provided by an embodiment of the present invention, comprising:
a selection area acquisition module 11, configured to obtain, in response to a selection instruction, the selection area corresponding to the selection instruction in the 3D scene;
a roaming camera control module 12, configured to rotate the roaming camera so that it is aimed at the center point of the selection area, and to move the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
a judgment module 13, configured to judge whether any boundary point of the selection area falls outside the field of view of the roaming camera; if so, to stop the movement of the roaming camera; if not, to continue moving the roaming camera.
Preferably, the selection instruction is a long-press instruction issued by a pointing device; the selection area acquisition module 11 is then specifically configured to obtain a first touch point corresponding to the press-down operation of the pointing device and a second touch point corresponding to its lift-up operation, and to construct the selection area from the first touch point and the second touch point.
Preferably, the selection area is a rectangular area or a circular area. When the selection area is a rectangular area, the first touch point and the second touch point are two non-adjacent vertices of the rectangular area; when the selection area is a circular area, the first touch point and the second touch point are boundary points of the circular area, and the straight line connecting the first touch point and the second touch point is a diameter of the circular area.
Preferably, the field of view of the roaming camera is a rectangular pyramid with the roaming camera at its apex, and the base of the rectangular pyramid is the display area of the current screen.
Preferably, the roaming camera control module 12 is specifically configured to move the roaming camera toward the selection area along a preset straight line, wherein the preset straight line is the line connecting the initial position of the roaming camera with the center point of the selection area. While moving the roaming camera toward the selection area, the roaming camera control module 12 further applies uniform interpolation to the position and attitude of the roaming camera.
Further, the roaming camera control module 12 is also configured to move the roaming camera away from the selection area, so that the selection area appears reduced on the current screen.
For the specific working process of the 3D scene display control system 10, please refer to the description of the 3D scene display control method in Embodiment one above; details are not repeated here.
Compared with the prior art, in the 3D scene display control system 10 provided by this embodiment of the present invention, the selection area acquisition module 11 first generates the selection area in response to the selection instruction; the user only needs to issue a selection instruction and does not have to manipulate any other control. The roaming camera control module 12 then controls the roaming camera to approach the selection area automatically, so that the selection area appears magnified on the current screen, achieving the effect of zooming in on the selection area. Viewing the 3D scene with a roaming camera and magnifying the selection area on the basis of the selection instruction requires no combination of multiple controls, so display control of the 3D scene becomes simple, convenient and efficient.
Embodiment three
Referring to Fig. 6, Fig. 6 is a structural diagram of a 3D scene display control device 20 provided by an embodiment of the present invention. The 3D scene display control device 20 of this embodiment comprises a processor 21, a memory 22 and a computer program stored in the memory 22 and executable on the processor 21. When executing the computer program, the processor 21 implements the steps of each of the above 3D scene display control method embodiments, for example steps S1 to S5 shown in Fig. 1. Alternatively, when executing the computer program, the processor 21 implements the functions of the modules/units in each of the above apparatus embodiments, for example the selection area acquisition module 11.
Illustratively, the computer program may be divided into one or more modules/units, which are stored in the memory 22 and executed by the processor 21 to carry out the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments describe the execution process of the computer program in the 3D scene display control device 20. For example, the computer program may be divided into a selection area acquisition module 11, a roaming camera control module 12 and a judgment module 13, whose specific functions are as follows:
the selection area acquisition module 11 is configured to obtain, in response to a selection instruction, the selection area corresponding to the selection instruction in the 3D scene;
the roaming camera control module 12 is configured to rotate the roaming camera so that it is aimed at the center point of the selection area, and to move the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
the judgment module 13 is configured to judge whether any boundary point of the selection area falls outside the field of view of the roaming camera; if so, to stop the movement of the roaming camera; if not, to continue moving the roaming camera.
The 3D scene display control device 20 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer or a cloud server. The 3D scene display control device 20 may include, but is not limited to, the processor 21 and the memory 22. Those skilled in the art will appreciate that the schematic diagram is only an example of the 3D scene display control device 20 and does not constitute a limitation on the 3D scene display control device 20, which may include more or fewer components than shown, combine certain components, or use different components; for example, the 3D scene display control device 20 may further include input/output devices, network access devices, buses and the like.
The processor 21 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor or any conventional processor. The processor 21 is the control center of the 3D scene display control device 20 and connects the various parts of the entire 3D scene display control device 20 through various interfaces and lines.
The memory 22 may be used to store the computer program and/or modules. The processor 21 implements the various functions of the 3D scene display control device 20 by running or executing the computer program and/or modules stored in the memory 22 and by calling the data stored in the memory 22. The memory 22 may mainly comprise a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required by at least one function (for example a sound playback function or an image playback function), and the data storage area may store data created according to the use of the device (such as audio data or a phone book). In addition, the memory 22 may comprise high-speed random access memory, and may also comprise non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
If the integrated modules/units of the 3D scene display control device 20 are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the processes of the above method embodiments by instructing the relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by the processor 21 it can implement the steps of each of the above method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include any entity or apparatus capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunication signal, a software distribution medium, and so on. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in certain jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electric carrier signals and telecommunication signals.
It should be noted that the apparatus embodiments described above are merely exemplary. The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, in the drawings of the apparatus embodiments provided by the present invention, the connection relationships between modules indicate that they have communication connections, which may be specifically implemented as one or more communication buses or signal lines. Persons of ordinary skill in the art can understand and implement the embodiments without creative effort.
The above are preferred embodiments of the present invention. It should be noted that persons skilled in the art may make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (10)

1. A 3D scene display control method, characterized by comprising:
in response to a selection instruction, obtaining the selection area corresponding to the selection instruction in a 3D scene;
rotating a roaming camera so that the roaming camera is aimed at the center point of the selection area;
moving the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
judging whether any boundary point of the selection area falls outside the field of view of the roaming camera;
if so, stopping the movement of the roaming camera; if not, continuing to move the roaming camera.
2. The 3D scene display control method according to claim 1, characterized in that the selection instruction is a long-press instruction issued by a pointing device; obtaining the selection area corresponding to the selection instruction then specifically comprises:
obtaining a first touch point corresponding to the press-down operation of the pointing device and a second touch point corresponding to its lift-up operation;
constructing the selection area from the first touch point and the second touch point.
3. The 3D scene display control method according to claim 2, characterized in that the selection area is a rectangular area or a circular area; wherein,
when the selection area is a rectangular area, the first touch point and the second touch point are two non-adjacent vertices of the rectangular area;
when the selection area is a circular area, the first touch point and the second touch point are boundary points of the circular area, and the straight line connecting the first touch point and the second touch point is a diameter of the circular area.
4. The 3D scene display control method according to claim 1, characterized in that the field of view of the roaming camera is a rectangular pyramid with the roaming camera at its apex, and the base of the rectangular pyramid is the display area of the current screen.
5. The 3D scene display control method according to claim 1, characterized in that moving the roaming camera toward the selection area specifically comprises:
moving the roaming camera toward the selection area along a preset straight line, wherein the preset straight line is the line connecting the initial position of the roaming camera with the center point of the selection area.
6. The 3D scene display control method according to claim 1, characterized in that, while the roaming camera is being moved toward the selection area, the method further comprises:
applying uniform interpolation to the position and attitude of the roaming camera.
7. The 3D scene display control method according to claim 1, characterized in that the method further comprises:
moving the roaming camera away from the selection area, so that the selection area appears reduced on the current screen.
8. A 3D scene display control system, characterized by comprising:
a selection area acquisition module, configured to obtain, in response to a selection instruction, the selection area corresponding to the selection instruction in a 3D scene;
a roaming camera control module, configured to rotate the roaming camera so that it is aimed at the center point of the selection area, and to move the roaming camera toward the selection area so that the selection area appears magnified on the current screen;
a judgment module, configured to judge whether any boundary point of the selection area falls outside the field of view of the roaming camera; if so, to stop the movement of the roaming camera; if not, to continue moving the roaming camera.
9. The 3D scene display control system according to claim 8, characterized in that the selection instruction is a long-press instruction issued by a pointing device; the selection area acquisition module is then specifically configured to:
obtain a first touch point corresponding to the press-down operation of the pointing device and a second touch point corresponding to its lift-up operation;
construct the selection area from the first touch point and the second touch point.
10. A 3D scene display control device, characterized by comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, wherein the processor implements the 3D scene display control method according to any one of claims 1 to 7 when executing the computer program.
CN201910377060.8A 2019-05-07 2019-05-07 3D scene display control method, system and device Pending CN110209325A (en)

Priority Applications (1)

Application Number: CN201910377060.8A
Publication: CN110209325A (en)
Priority Date: 2019-05-07
Filing Date: 2019-05-07
Title: 3D scene display control method, system and device

Publications (1)

Publication Number: CN110209325A (en)
Publication Date: 2019-09-06

Family

ID=67785536

Family Applications (1)

Application Number: CN201910377060.8A
Publication: CN110209325A (en), status Pending
Title: 3D scene display control method, system and device

Country Status (1)

Country Link
CN (1) CN110209325A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049266A (en) * 2012-12-17 2013-04-17 天津大学 Mouse operation method of Delta 3D (Three-Dimensional) scene navigation
CN105635650A (en) * 2014-10-29 2016-06-01 北京同步科技有限公司 Three-dimensional video monitoring method and three-dimensional video monitoring system
CN106981101A (en) * 2017-03-09 2017-07-25 衢州学院 A kind of control system and its implementation for realizing three-dimensional panorama roaming
CN108255932A (en) * 2017-12-07 2018-07-06 石化盈科信息技术有限责任公司 The roaming browsing method and system of digital factory based on three-dimensional digital platform
CN108594996A (en) * 2018-04-16 2018-09-28 微幻科技(北京)有限公司 The method and device of automatic visual angle adjustment in a kind of virtual roaming
CN108579083A (en) * 2018-04-27 2018-09-28 腾讯科技(深圳)有限公司 Virtual scene display method, apparatus, electronic device and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110908558A (en) * 2019-10-30 2020-03-24 维沃移动通信(杭州)有限公司 Image display method and electronic equipment
CN110908558B (en) * 2019-10-30 2022-10-18 维沃移动通信(杭州)有限公司 Image display method and electronic equipment
CN111698425A (en) * 2020-06-22 2020-09-22 四川易热科技有限公司 Method for realizing consistency of real scene roaming technology
CN112076470A (en) * 2020-08-26 2020-12-15 北京完美赤金科技有限公司 Virtual object display method, device and equipment
CN113617029A (en) * 2021-08-03 2021-11-09 网易(杭州)网络有限公司 Display control method, device, equipment and medium in game
CN113835607A (en) * 2021-08-19 2021-12-24 南京奥拓电子科技有限公司 Method and device for viewing scene in display terminal and storage medium
CN113835607B (en) * 2021-08-19 2024-01-16 南京奥拓电子科技有限公司 Method, device and storage medium for viewing scene in display terminal

Similar Documents

Publication Publication Date Title
CN110209325A (en) 3D scene display control method, system and device
WO2020119684A1 (en) 3d navigation semantic map update method, apparatus and device
CN106548516B (en) Three-dimensional roaming method and device
US9443353B2 (en) Methods and systems for capturing and moving 3D models and true-scale metadata of real world objects
JP6203406B2 (en) System and method for determining plane spread in an augmented reality environment
JP7178416B2 (en) METHOD AND DEVICE FOR PROCESSING VIRTUAL RESOURCES IN GAME SCENES
WO2017092303A1 (en) Virtual reality scenario model establishing method and device
US20210074245A1 (en) Image display method and apparatus, storage medium, and electronic device
US11880956B2 (en) Image processing method and apparatus, and computer storage medium
JP6747292B2 (en) Image processing apparatus, image processing method, and program
US10969949B2 (en) Information display device, information display method and information display program
EP2157545A1 (en) Entertainment device, system and method
JP2016534461A (en) Method and apparatus for representing a physical scene
CN109640070A (en) A kind of stereo display method, device, equipment and storage medium
CN103970518A (en) 3D rendering method and device for logic window
JP2015088819A (en) Imaging simulation device
US20220277520A1 (en) Information processing apparatus, information processing method, and storage medium
JP5980393B1 (en) Terminal device
CN112973121B (en) Reflection effect generation method and device, storage medium and computer equipment
CN113750522A (en) Game skill processing method and device and electronic equipment
CN108986230A (en) Image aspects management method, device and electronic equipment
US9881419B1 (en) Technique for providing an initial pose for a 3-D model
CN111589151A (en) Method, device, equipment and storage medium for realizing interactive function
JP2005228110A (en) Information processing method and apparatus
CN108564399B (en) Value attribute setting method and device, recommendation method and device for stadium seats

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication

Application publication date: 20190906