CN111857520A - 3D visual interactive display method and system based on digital twins - Google Patents

3D visual interactive display method and system based on digital twins Download PDF

Info

Publication number
CN111857520A
Authority
CN
China
Prior art keywords
camera
digital twin
debugging
interactive display
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010550791.0A
Other languages
Chinese (zh)
Inventor
孙赫成
李伟
郭金明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Xirui Digital Technology Co ltd
Original Assignee
Guangdong Xirui Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Xirui Digital Technology Co ltd filed Critical Guangdong Xirui Digital Technology Co ltd
Priority to CN202010550791.0A priority Critical patent/CN111857520A/en
Publication of CN111857520A publication Critical patent/CN111857520A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/14 Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04802 3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a 3D visual interactive display method based on digital twins, which comprises the following steps: creating a virtual 3D scene, and introducing a corresponding digital twin body containing actual object data information into the virtual 3D scene; establishing a debugging camera in front of each digital twin body, and determining the moving range of the debugging camera according to the data information of the corresponding actual object; numbering each debugging camera, and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera; and switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin according to the operation of the user. The invention can clearly display the digital twin body to be observed.

Description

3D visual interactive display method and system based on digital twins
Technical Field
The invention relates to the technical field of digital twins, in particular to a 3D visual interactive display method and system based on digital twins.
Background
At present, a digital wave driven by new technologies such as the Internet of Things, big data and artificial intelligence is sweeping the globe, and the physical world and its corresponding digital world have formed two major systems that develop in parallel and interact with each other. The digital world exists to serve the physical world, and the physical world becomes more efficient and orderly because of it. Digital twin technology has emerged in this context and is gradually extending from manufacturing to urban space, profoundly influencing urban planning, construction and development.
However, existing 3D visual interaction is based purely on ordinary models and is mostly carried out by controlling a camera, with the model displayed merely as an object in the scene. As a result, the content that can be expressed is limited and the model to be understood cannot be clearly displayed.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a 3D visual interactive display method and system based on digital twins.
The technical scheme of the invention is realized as follows:
A 3D visual interactive display method based on digital twins comprises the following steps:
creating a virtual 3D scene, and introducing digital twins each containing actual object data information into the virtual 3D scene;
establishing a debugging camera in front of each digital twin body, and determining the moving range of the debugging camera according to the data information of the corresponding actual object; wherein the position of the current debugging camera is within the moving range;
numbering and creating links for each debugging camera, and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera;
switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin according to the operation of the user; and after the position is switched, the moving range of the camera is the moving range of the debugging camera corresponding to the number input by the user.
Further, the step of determining the range of motion of the camera according to the data information of the corresponding actual object includes:
determining a center point of the digital twin and related dimensional information of the digital twin, the related dimensional information including length, width, height and/or radius information;
and determining a central point of the debugging camera according to the central point of the digital twin body, and then determining an activity boundary of the debugging camera according to the relevant dimension information and the central point of the debugging camera.
Further, the step of displaying the corresponding digital twin according to the operation of the user includes:
and obtaining moving axis values in four directions of the operated mouse in real time through a 3D engine, then determining the offset of the camera in real time according to the moving axis values to control the camera to move, and displaying the digital twin in real time according to the vision of the current camera.
Further, the step of displaying the corresponding digital twin according to the operation of the user further includes:
and controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse, and displaying the digital twin in real time according to the vision of the current camera.
Furthermore, in the step of controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse and displaying the digital twin body in the vision of the current camera,
stopping the forward input of the mouse wheel when the distance between the camera and the digital twin is less than 0.5 × aver;
stopping the backward input of the mouse wheel when the distance between the camera and the three-dimensional model is greater than 2 × aver;
wherein aver is the average value of the length, the width and the height of the digital twin.
A 3D visual interactive display system based on digital twins comprises
The scene establishing module is used for establishing a virtual 3D scene and introducing a corresponding digital twin body containing actual object data information into the virtual 3D scene;
the debugging camera establishing module is used for establishing a debugging camera in front of each digital twin body and determining the moving range of the debugging camera according to the data information of the corresponding actual object; wherein the position of the current debugging camera is within the moving range;
the debugging camera information recording module is used for numbering each debugging camera and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera;
the interactive display module is used for switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin body according to the operation of the user; and after the position is switched, the moving range of the camera is the moving range of the debugging camera corresponding to the number input by the user.
The debugging camera establishing module is further specifically used for determining a central point of the digital twin and relevant dimension information of the digital twin, wherein the relevant dimension information comprises length, width, height and/or radius information; and the central point of the debugging camera is determined according to the central point of the digital twin body, and then the activity boundary of the debugging camera is determined according to the relevant dimension information and the central point of the debugging camera.
Further, the interactive display module comprises
And the camera moving display unit is used for acquiring moving axis values in four directions of the operated mouse in real time through the 3D engine, then determining the offset of the camera in real time according to the moving axis values to control the camera to move, and displaying the digital twin in real time according to the vision of the current camera.
Further, the interactive display module further comprises
And the camera zooming and displaying unit is used for controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse and displaying the digital twin in real time according to the vision of the current camera.
Further, the camera zooming display unit is also used for stopping the forward input of the mouse wheel when the distance between the camera and the digital twin is less than 0.5 × aver, and stopping the backward input of the mouse wheel when the distance between the camera and the three-dimensional model is more than 2 × aver; wherein aver is the average value of the length, the width and the height of the digital twin.
Compared with the prior art, the invention has the following advantages: with the method and system of the invention, the digital twin body containing actual object data information is displayed in the virtual scene, the camera in the virtual scene is switched so as to jump to the position in front of the designated digital twin body, and the digital twin body is then displayed according to the operation of the user, so that the digital twin body to be observed can be clearly displayed. Moreover, the activity range of the camera after the jump is the activity range of the debugging camera corresponding to the model.
By operating the mouse and its scroll wheel, the camera can rotate around and zoom in on the digital twin body, and the moving range of the camera is determined by calculating the relevant size information (such as length, width and height) of the digital twin body, which makes the camera convenient for the user to operate.
The invention performs 3D visual interaction based on models obtained through digital twinning. Such a model matches the actual environment scene and can record information about the actual object. Switching cameras within the model to be displayed makes the lens move more smoothly, and the moving distance of the camera for the observed object is set by calculating the length, width and height proportions of the actual object, so that the observed object can be clearly displayed.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flow chart of a digital twin-based 3D visual interactive display method according to an embodiment of the present invention;
Fig. 2 is a structural block diagram of a digital twin-based 3D visual interactive display system according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the invention discloses a 3D visual interactive display method based on digital twins, which includes the following steps:
s1, creating a virtual 3D scene, and introducing digital twins each containing actual object data information into the virtual 3D scene;
s2, creating a debugging camera in front of each digital twin, and determining the moving range of the debugging camera according to the data information of the corresponding actual object; wherein the position of the current debugging camera is within the moving range;
s3, numbering and creating links for each debugging camera, and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera;
S4, switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin according to the operation of the user; and after the position is switched, the moving range of the camera is the moving range of the debugging camera corresponding to the number input by the user.
Referring to fig. 2, an embodiment of the present invention further provides a digital twin-based 3D visual interactive display system, which includes
The scene establishing module is used for establishing a virtual 3D scene and introducing a corresponding digital twin body containing actual object data information into the virtual 3D scene;
the debugging camera establishing module is used for establishing a debugging camera in front of each digital twin body and determining the moving range of the debugging camera according to the data information of the corresponding actual object; wherein the position of the current debugging camera is within the moving range;
the debugging camera information recording module is used for numbering each debugging camera and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera;
The interactive display module is used for switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin body according to the operation of the user; and after the position is switched, the moving range of the camera is the moving range of the debugging camera corresponding to the number input by the user.
In the invention, the 3D visual interactive display method based on the digital twin can be executed by the 3D visual interactive display system based on the digital twin as the acting subject, or by the individual components of the system. Specifically, S1 may be performed by the scene establishing module, S2 by the debugging camera establishing module, S3 by the debugging camera information recording module, and S4 by the interactive display module.
In step S1, a virtual 3D scene may be created using the 3D engine; the virtual 3D scene is an empty scene, free of any model, created according to the environment in which the real objects are located. After the virtual 3D scene is created, the digital twins containing the data information of the actual objects to be observed, that is, three-dimensional models containing actual object data information, are imported into the virtual 3D scene. In order to ensure that the environment of each digital twin is the same as the real environment, after the digital twins are introduced, the virtual 3D scene is refined so that it corresponds 1:1 to the real environment and objects.
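By way of illustration only (this sketch is not part of the original disclosure), the import of digital twins carrying actual-object data into the virtual 3D scene can be expressed in an engine-agnostic way roughly as follows; the class names, fields and example values are assumptions made for this example.

```python
# Illustrative, engine-agnostic sketch of a digital twin carrying actual-object
# data and of importing it into a virtual 3D scene. All names and values are
# assumptions for this example, not part of the original disclosure.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    name: str                      # identifier of the twin
    center: tuple                  # (x, y, z) centre point in scene coordinates
    length: float                  # dimensions taken from the actual object
    width: float
    height: float
    real_object_data: dict = field(default_factory=dict)  # recorded data of the actual object

@dataclass
class VirtualScene:
    twins: list = field(default_factory=list)

    def import_twin(self, twin: DigitalTwin) -> None:
        """Import a digital twin containing actual-object data into the scene."""
        self.twins.append(twin)

# Example: a 1:1 twin of a piece of equipment with some recorded real-object data.
scene = VirtualScene()
scene.import_twin(DigitalTwin("pump_01", (0.0, 0.0, 0.0), 2.0, 1.0, 1.5,
                              {"temperature_c": 65.2, "rpm": 1450}))
```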
In step S2, in the virtual 3D scene created by the 3D engine, a debugging camera is created in front of the digital twin to be exhibited and a link is created for it, so that the camera can later switch to the vision of the debugging camera via the created link in order to observe the digital twin. When a plurality of digital twins need to be shown, a debugging camera is created in front of each digital twin to be shown, so that the user can carefully observe every digital twin of interest.
After the debugging camera is created, its moving range needs to be determined according to the data information of the corresponding actual object, so as to avoid the camera ending up too close to or too far away from the digital twin body for the user to observe it when the user operates the camera to move.
Specifically, the step of determining the range of motion of the camera according to the data information of the corresponding actual object includes:
s201, determining a central point of a digital twin and relevant size information of the digital twin, wherein the relevant size information comprises length, width, height and/or radius information;
s202, determining a central point of the debugging camera according to the central point of the digital twin body, and then determining an activity boundary of the debugging camera according to the related dimension information and the central point of the debugging camera.
Correspondingly, the debugging camera establishing module is further specifically configured to determine a central point of the digital twin and relevant dimension information of the digital twin, where the relevant dimension information includes length, width, height, and/or radius information; and the central point of the debugging camera is determined according to the central point of the digital twin body, and then the activity boundary of the debugging camera is determined according to the relevant dimension information and the central point of the debugging camera.
In the embodiment of the invention, the activity boundary of the debugging camera, and likewise the activity boundary of the camera after it is switched to the view angle of the debugging camera, are related to the relevant size and the central point of the digital twin. When the central point of the debugging camera is the central point of the digital twin body, the debugging camera can travel a full circle around the digital twin body, allowing the user to observe it carefully. Of course, the central point of the debugging camera is not necessarily the center of the digital twin body; it can also be calculated from the length, width and height of the model according to a preset proportion, because digital twins differ in shape and therefore in central point.
When the activity range of the debugging camera is related to the size of the digital twin body, the camera cannot be moved so far away during operation that observation becomes inconvenient for the user. According to the size of the digital twin body, namely its length, width, height and/or radius information, the minimum distance from the central point at which the debugging camera can observe the digital twin body completely is determined; the range within this minimum distance is the activity range of the debugging camera and of the switched camera. This does not affect the user's observation of the digital twin body and makes it easier for the user to operate the camera and observe the digital twin body carefully.
Specifically, the boundary of the debugging camera may be calculated from the relevant size of the digital twin according to a certain proportion, or generated automatically according to the size of the digital twin. For example, the length, width and height of the digital twin body are acquired by the 3D engine, the average value aver of these three numbers is obtained, and the circular range centered on the digital twin body with 2 × aver as the radius is used as the moving range of the debugging camera. Of course, the moving range of the debugging camera may also be set by the user according to experience, and is not limited herein.
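Purely as an illustration of the example just given (aver as the mean of the length, width and height, and a moving range of radius 2 × aver around the digital twin), the calculation can be sketched as follows; the function and variable names are assumptions of this sketch.

```python
# Sketch of the example movement-range rule: aver is the mean of the twin's
# length, width and height, and the debugging camera may move within a radius
# of 2 * aver around the twin's centre. Names are illustrative only.
def movement_range_radius(length: float, width: float, height: float) -> float:
    """Return the radius of the debugging camera's moving range."""
    aver = (length + width + height) / 3.0
    return 2.0 * aver

# For a digital twin measuring 2.0 x 1.0 x 1.5, aver = 1.5, so the camera may
# move within a radius of 3.0 scene units around the twin's centre point.
radius = movement_range_radius(2.0, 1.0, 1.5)
```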
In step S3, each debugging camera is numbered and a link is created for it, and the related information of each debugging camera is recorded, so that when the user inputs a number or clicks the model to be displayed, the background can index the debugging camera selected by the user according to the user instruction and switch to the position and view angle of that debugging camera via the link. After switching to the camera at the debugging camera position, the moving range is the moving range of the debugging camera, the camera position is the position of the debugging camera recorded at that time, and the digital twin body corresponding to the debugging camera can be observed by operating the camera.
In the step S4, the view angle of the camera is switched to the view angle of the debugging camera corresponding to the number according to the number input by the user, and the digital twin to be displayed is observed. The number input by the user can be the number of the debugging camera directly input by the user in the command box; a link may also be created on the digital twin, and when the user clicks on the link, the camera may be directly switched to the debugging camera corresponding to the model.
For example, each digital twin to be shown in the simulated 3D scene may be numbered consistently with the number of its corresponding debugging camera, and a button whose name is that number is added above the digital twin; when the user clicks the button, the number is passed directly to the controller, which then jumps to the debugging camera with that number, completing the switching of the camera. The user thus only needs to click the button of the displayed digital twin body to complete the camera switch, so that the digital twin body to be observed is clearly displayed.
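The recording and switching just described can be sketched, again for illustration only, as a registry keyed by camera number; the data structures and the way the main camera adopts the debugging camera's position and moving range are assumptions of this example rather than a prescribed implementation.

```python
# Illustrative sketch of recording each debugging camera's related information
# (number, corresponding twin, current position, moving range) and of switching
# the main camera by number. Field and function names are assumptions.
from dataclasses import dataclass

@dataclass
class DebugCamera:
    number: int          # number assigned to the debugging camera
    twin_name: str       # digital twin the debugging camera corresponds to
    position: tuple      # current position of the debugging camera
    range_radius: float  # moving range (radius around the twin's centre)

@dataclass
class MainCamera:
    position: tuple = (0.0, 0.0, 0.0)
    range_radius: float = 0.0
    target_twin: str = ""

registry: dict = {}  # camera number -> recorded DebugCamera information

def register(cam: DebugCamera) -> None:
    registry[cam.number] = cam

def switch_to(number: int, main_camera: MainCamera) -> None:
    """Jump the main camera to the debugging camera selected by the user's number."""
    cam = registry[number]
    main_camera.position = cam.position          # take over the recorded position
    main_camera.range_radius = cam.range_radius  # and its moving range
    main_camera.target_twin = cam.twin_name

# Example: the button above "pump_01" passes its number to the controller.
register(DebugCamera(1, "pump_01", (0.0, 1.0, -3.0), 3.0))
main = MainCamera()
switch_to(1, main)
```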
Specifically, the step of displaying the corresponding digital twin according to the operation of the user includes:
s401, obtaining moving axis values of the operated mouse in four directions in real time through a 3D engine, then determining the offset of the camera in real time according to the moving axis values to control the camera to move, and displaying the digital twin in real time according to the vision of the current camera.
S402, controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse, and displaying the digital twin body in real time according to the vision of the current camera.
Correspondingly, the interactive display module comprises a camera mobile display unit and a camera zoom display unit, wherein
And the camera moving display unit is used for acquiring moving axis values in four directions of the operated mouse in real time through the 3D engine, then determining the offset of the camera in real time according to the moving axis values to control the camera to move, and displaying the digital twin in real time according to the vision of the current camera.
And the camera zooming and displaying unit is used for controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse and displaying the digital twin in real time according to the vision of the current camera.
Accordingly, S4 is executed by the interactive display module as the action subject, and may also be executed by each component in the interactive display module. Specifically, S401 may be performed by the camera moving display unit, and S402 may be performed by the camera zooming display unit.
In the embodiment of the invention, interactive display is realized mainly by controlling the mouse. When the user operates the scroll wheel of the mouse, the distance of the current camera can be controlled; when the user operates the camera to move, the camera can be moved, after the movement command is triggered, by moving the mouse (for example, by holding down the mouse button and moving the mouse on the desktop), and the current digital twin is then displayed from the vision of the current camera.
Specifically, when the camera is operated to move, the 3D engine acquires the moving axis values of the operated mouse in four directions in real time, the offset of the camera is then determined in real time from these axis values so as to control the camera to move, and the digital twin is displayed on the display terminal in real time from the vision of the current camera for the user to observe. When the camera is operated to zoom, the distance of the camera is controlled according to the change value of the scroll wheel of the operated mouse, and the digital twin is likewise displayed on the display terminal in real time from the vision of the current camera. In this way, by operating the movement and rotation of the camera, the digital twin body to be observed can be clearly displayed for the user to view.
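As an illustration of this step, the sketch below converts per-frame mouse axis values into a camera offset and keeps the camera inside its moving range; the sensitivity factor and the spherical clamping are assumptions of the example, since such details are not fixed by the description above.

```python
# Sketch of moving the camera from mouse axis values while keeping it inside the
# moving range. The sensitivity value and the spherical clamp are assumptions.
import math

def move_camera(cam_pos, twin_center, dx, dy, range_radius, sensitivity=0.01):
    """Offset the camera by the mouse axis values (dx, dy), clamped to the range."""
    x, y, z = cam_pos
    x += dx * sensitivity
    y += dy * sensitivity
    # Keep the camera within the sphere of radius range_radius around the twin.
    dist = math.dist((x, y, z), twin_center)
    if dist > range_radius:
        cx, cy, cz = twin_center
        scale = range_radius / dist
        x = cx + (x - cx) * scale
        y = cy + (y - cy) * scale
        z = cz + (z - cz) * scale
    return (x, y, z)

# Example: a small mouse movement to the right and upwards.
new_pos = move_camera((0.0, 1.0, -3.0), (0.0, 0.0, 0.0), dx=12, dy=-5, range_radius=3.0)
```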
In step S402, the step of controlling the distance of the camera according to the magnitude of the change value of the scroll wheel of the operated mouse and displaying the digital twin from the vision of the current camera includes:
stopping the forward input of the mouse wheel when the distance between the camera and the digital twin is less than 0.5 × aver;
stopping the backward input of the mouse wheel when the distance between the camera and the three-dimensional model is greater than 2 × aver;
wherein aver is the average value of the length, the width and the height of the digital twin.
Correspondingly, the camera zooming display unit is also used for stopping the forward input of the mouse wheel when the distance between the camera and the digital twin is less than 0.5 × aver, and stopping the backward input of the mouse wheel when the distance between the camera and the three-dimensional model is more than 2 × aver; wherein aver is the average value of the length, the width and the height of the digital twin.
In the embodiment of the invention, the length, width and height of the digital twin body can be obtained through the 3D engine, and the average value aver of these three numbers is then calculated by a preset calculation method. The distance between the current view angle of the camera and the digital twin is calculated from the change value of the mouse scroll wheel. Assuming that the input value of the mouse wheel lies between -1 and 1, the forward input of the mouse wheel is stopped when the distance between the camera and the digital twin is less than 0.5 × aver (that is, forward input of the wheel becomes ineffective and only backward input is accepted), and the backward input of the mouse wheel is stopped when the distance between the camera and the digital twin is greater than 2 × aver (that is, backward input of the wheel becomes ineffective and only forward input is accepted). In this way the maximum and minimum distances to which the camera can move are limited, providing an optimal observation distance for the user.
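The wheel limits described in this embodiment can be sketched as follows; the zoom speed and the sign convention (positive wheel values move the camera closer) are assumptions of this illustration, not requirements of the embodiment.

```python
# Sketch of the wheel-zoom limits: wheel input is assumed to lie in [-1, 1],
# forward input is ignored once the camera is closer than 0.5 * aver, and
# backward input is ignored once it is farther than 2 * aver.
def apply_wheel(distance: float, wheel: float, aver: float, zoom_speed: float = 0.5) -> float:
    """Return the new camera-to-twin distance after one wheel input."""
    if wheel > 0 and distance < 0.5 * aver:   # too close: forward input is ineffective
        return distance
    if wheel < 0 and distance > 2.0 * aver:   # too far: backward input is ineffective
        return distance
    return max(distance - wheel * zoom_speed, 0.0)  # positive wheel moves the camera closer
```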
In summary, the digital twin containing actual object data information is displayed in the virtual scene, the camera in the virtual scene is switched so as to jump to the position in front of the designated digital twin, and the digital twin is then displayed according to the operation of the user, so that the digital twin to be observed can be clearly displayed. Moreover, the activity range of the camera after the jump is the activity range of the debugging camera corresponding to the model.
Specifically, by operating the mouse and its scroll wheel, the camera can be rotated around and zoomed relative to the digital twin body, and the moving range of the camera is determined by calculating the relevant size information (such as length, width and height) of the digital twin body, so that the camera can be operated by the user conveniently.
The invention performs 3D visual interaction based on models obtained through digital twinning. Such a model matches the actual environment scene and can record information about the actual object. Switching cameras within the model to be displayed makes the lens move more smoothly, and the moving distance of the camera for the observed object is set by calculating the length, width and height proportions of the actual object, so that the observed object can be clearly displayed.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A 3D visual interactive display method based on digital twins, characterized by comprising the following steps:
creating a virtual 3D scene, and introducing digital twins each containing actual object data information into the virtual 3D scene;
establishing a debugging camera in front of each digital twin body, and determining the moving range of the debugging camera according to the data information of the corresponding actual object; wherein the position of the current debugging camera is within the moving range;
numbering and creating links for each debugging camera, and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera;
switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin according to the operation of the user; and after the position is switched, the moving range of the camera is the moving range of the debugging camera corresponding to the number input by the user.
2. The digital twin-based 3D visual interactive display method according to claim 1, wherein the step of determining the moving range of the debugging camera based on the data information of the corresponding real object comprises:
Determining a center point of the digital twin and related dimensional information of the digital twin, the related dimensional information including length, width, height and/or radius information;
and determining a central point of the debugging camera according to the central point of the digital twin body, and then determining an activity boundary of the debugging camera according to the relevant dimension information and the central point of the debugging camera.
3. The digital twin-based 3D visual interactive display method according to claim 1, wherein the step of displaying the corresponding digital twin according to a user's operation includes:
and obtaining moving axis values in four directions of the operated mouse in real time through a 3D engine, then determining the offset of the camera in real time according to the moving axis values to control the camera to move, and displaying the digital twin in real time according to the vision of the current camera.
4. The digital twin-based 3D visual interactive display method according to claim 1, wherein the step of displaying the corresponding digital twin according to a user's operation further includes:
and controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse, and displaying the digital twin in real time according to the vision of the current camera.
5. The digital twin-based 3D visual interactive display method according to claim 4, wherein in the step of controlling the distance of the camera according to the magnitude of the change value of the scroll wheel of the operated mouse and displaying the digital twin from the vision of the current camera,
stopping the forward input of the mouse wheel when the distance between the camera and the digital twin is less than 0.5 × aver;
stopping the backward input of the mouse wheel when the distance between the camera and the three-dimensional model is greater than 2 × aver;
wherein aver is the average value of the length, the width and the height of the digital twin.
6. A 3D visual interactive display system based on digital twins, characterized by comprising
The scene establishing module is used for establishing a virtual 3D scene and introducing a corresponding digital twin body containing actual object data information into the virtual 3D scene;
the debugging camera establishing module is used for establishing a debugging camera in front of each digital twin body and determining the moving range of the debugging camera according to the data information of the corresponding actual object; wherein the position of the current debugging camera is within the moving range;
the debugging camera information recording module is used for numbering each debugging camera and recording related information of each debugging camera; the related information comprises the serial number of the debugging camera, a digital twin body corresponding to the debugging camera, the position information of the current debugging camera and the moving range of the debugging camera;
The interactive display module is used for switching the visual angle of the camera to the visual angle of the debugging camera corresponding to the number according to the number input by the user, and then displaying the corresponding digital twin body according to the operation of the user; and after the position is switched, the moving range of the camera is the moving range of the debugging camera corresponding to the number input by the user.
7. The digital twin based 3D visual interactive display system according to claim 1 wherein the commissioning camera establishing module is further specifically configured to determine a central point of the digital twin and related dimensional information of the digital twin, the related dimensional information including length, width, height and/or radius information; and the central point of the debugging camera is determined according to the central point of the digital twin body, and then the activity boundary of the debugging camera is determined according to the relevant dimension information and the central point of the debugging camera.
8. The digital twin based 3D visual interactive display system of claim 1 wherein the interactive display module comprises
And the camera moving display unit is used for acquiring moving axis values in four directions of the operated mouse in real time through the 3D engine, then determining the offset of the camera in real time according to the moving axis values to control the camera to move, and displaying the digital twin in real time according to the vision of the current camera.
9. The digital twin based 3D visual interactive display system of claim 1 wherein the interactive display module further comprises
And the camera zooming and displaying unit is used for controlling the distance of the camera according to the change value of the scroll wheel of the operated mouse and displaying the digital twin in real time according to the vision of the current camera.
10. The digital twin based 3D visual interactive display system according to claim 9, wherein the camera zooming display unit is further configured to stop the mouse wheel from inputting forward when the distance between the camera and the digital twin is less than 0.5 × aver, and stop the mouse wheel from inputting backward when the distance between the camera and the three-dimensional model is more than 2 × aver; wherein aver is the average value of the length, the width and the height of the digital twin.
CN202010550791.0A 2020-06-16 2020-06-16 3D visual interactive display method and system based on digital twins Pending CN111857520A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010550791.0A CN111857520A (en) 2020-06-16 2020-06-16 3D visual interactive display method and system based on digital twins

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010550791.0A CN111857520A (en) 2020-06-16 2020-06-16 3D visual interactive display method and system based on digital twins

Publications (1)

Publication Number Publication Date
CN111857520A (en) 2020-10-30

Family

ID=72987175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010550791.0A Pending CN111857520A (en) 2020-06-16 2020-06-16 3D visual interactive display method and system based on digital twins

Country Status (1)

Country Link
CN (1) CN111857520A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007041876A (en) * 2005-08-03 2007-02-15 Samii Kk Image display device and image display program
US20130201188A1 (en) * 2012-02-06 2013-08-08 Electronics And Telecommunications Research Institute Apparatus and method for generating pre-visualization image
US20170154469A1 (en) * 2015-12-01 2017-06-01 Le Holdings (Beijing) Co., Ltd. Method and Device for Model Rendering
US20180336729A1 (en) * 2017-05-19 2018-11-22 Ptc Inc. Displaying content in an augmented reality system
CN109819233A (en) * 2019-01-21 2019-05-28 哈工大机器人(合肥)国际创新研究院 A kind of digital twinned system based on virtual image technology
CN110347131A (en) * 2019-07-18 2019-10-18 中国电子科技集团公司第三十八研究所 The digital twinned system of facing to manufacture
CN110505464A (en) * 2019-08-21 2019-11-26 佳都新太科技股份有限公司 A kind of number twinned system, method and computer equipment
CN110753218A (en) * 2019-08-21 2020-02-04 佳都新太科技股份有限公司 Digital twinning system and method and computer equipment
CN110880139A (en) * 2019-09-30 2020-03-13 珠海随变科技有限公司 Commodity display method, commodity display device, terminal, server and storage medium
CN111091611A (en) * 2019-12-25 2020-05-01 青岛理工大学 Workshop digital twin oriented augmented reality system and method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115237302A (en) * 2021-06-30 2022-10-25 达闼机器人股份有限公司 Scene switching method, device, medium and electronic equipment based on digital twins
CN115423951A (en) * 2022-11-07 2022-12-02 南京朝鹿鸣科技有限公司 Water supply and drainage visualization method based on digital twinning
CN115423951B (en) * 2022-11-07 2023-09-26 南京朝鹿鸣科技有限公司 Water supply and drainage visualization method based on digital twin
CN116958410A (en) * 2023-06-15 2023-10-27 湖南视比特机器人有限公司 3D camera virtual environment platform simulation method based on phase deflection operation
CN117539368A (en) * 2024-01-09 2024-02-09 广州开得联智能科技有限公司 Interaction method, device, equipment and readable storage medium
CN117539368B (en) * 2024-01-09 2024-05-03 广州开得联智能科技有限公司 Interaction method, device, equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN111857520A (en) 3D visual interactive display method and system based on digital twins
JP6193554B2 (en) Robot teaching apparatus having a three-dimensional display unit
EP1310844B1 (en) Simulation device
US5623583A (en) Three-dimensional model cross-section instruction system
JP2019519387A (en) Visualization of Augmented Reality Robot System
Neves et al. Application of mixed reality in robot manipulator programming
CN106204713B (en) Static merging processing method and device
CN106056655B (en) A kind of editable virtual camera system and method
CN109472872A (en) Shop floor status monitoring method and system
CN109191593A (en) Motion control method, device and the equipment of virtual three-dimensional model
US10964104B2 (en) Remote monitoring and assistance techniques with volumetric three-dimensional imaging
Pai et al. Virtual planning, control, and machining for a modular-based automated factory operation in an augmented reality environment
CN110824956A (en) Simulation interaction system of nuclear power plant control room
JP2008090498A (en) Image processing method and image processor
CN109118584B (en) Method for controlling an automation system, control system and computer program product
Krings et al. Design and evaluation of ar-assisted end-user robot path planning strategies
Eroglu et al. Design and evaluation of a free-hand vr-based authoring environment for automated vehicle testing
CN113021329B (en) Robot motion control method and device, readable storage medium and robot
US6798416B2 (en) Generating animation data using multiple interpolation procedures
CN106393081B (en) Method for controlling robot, terminal and the system of human-computer interaction
CN108401462A (en) Information processing method and system, cloud processing device and computer program product
JP2017211697A (en) Three-dimensional data display device, three-dimensional data display method, and program
CN108499109B (en) Method for realizing real-time unilateral scaling of article based on UE engine
CN112506347A (en) Mixed reality interaction method for monitoring machining process
CN111784797A (en) Robot networking interaction method, device and medium based on AR

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination