WO2008026817A1 - System and method for realizing virtual reality contents of 3-dimension using ubiquitous sensor network - Google Patents

System and method for realizing virtual reality contents of 3-dimension using ubiquitous sensor network

Info

Publication number
WO2008026817A1
WO2008026817A1 · PCT/KR2007/002362
Authority
WO
WIPO (PCT)
Prior art keywords
contents
event
unit
space
data
Prior art date
Application number
PCT/KR2007/002362
Other languages
French (fr)
Inventor
Sang Moo Hyun
Keun Pyo Kim
Original Assignee
Qtel Soft Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qtel Soft Co., Ltd. filed Critical Qtel Soft Co., Ltd.
Priority to JP2008558217A priority Critical patent/JP2009528646A/en
Publication of WO2008026817A1 publication Critical patent/WO2008026817A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0007 Image acquisition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Definitions

  • The present invention relates to a system and method for realizing 3-dimensional virtual reality contents using a ubiquitous sensor network, and more specifically to a system and method capable of providing a solution that integrates technologies for realizing various virtual realities, services them, and displays environmental information such as temperature, illuminance, and humidity, together with the operating state of contents in the virtual reality, in real time.
  • A VRML (Virtual Reality Modeling Language) code is a 3-dimensional image realization programming language made to realize an internet-based 3-dimensional image and is a language for presenting information, like an HTML code.
  • In other words, the VRML code is a language for describing a 3-dimensional image space, etc., which can interact with a user (client).
  • The VRML code makes visual images and registers them on a web so that a user can view, move, and rotate objects on a 3-dimensional screen.
  • Such a VRML can be viewed by means of a VRML dedicated viewer.
  • Currently, mainly used viewers do not comprise a control panel that can control a change in the camera position or the positions of each of a plurality of objects, so they cannot finely control each object according to the position of the camera.
  • As a result, such viewers have not been used for producing VRML contents that require control or arrangement of a plurality of objects.
  • The technologies for realizing such virtual reality are realized by a 3-dimensional data authoring tool, causing a problem that each of the produced 3-dimensional contents can be confirmed only through its own dedicated viewer.
  • The virtual reality realized by using a real space as a model simply provides only a spatial image and does not provide the environmental information or state of the real space in real time, causing a problem that the lively feeling of the spot is not conveyed to a user.
  • An object of the present invention is to provide a system and method for realizing 3-dimensional virtual reality contents using a ubiquitous sensor network, capable of providing a solution that integrates technologies for realizing various virtual realities and services them, and that detects environmental information and the change state of a real space in real time and provides them.
  • The present invention comprises: a data converting unit converting raw data realizing a real space by means of a general 3-dimensional data authoring tool into VRML files and converting image data of the real space acquired by rotating a camera into VR data files; a virtual reality contents producing unit producing object contents and space contents from the VRML files converted in the data converting unit and producing panorama VR contents and photo object VR contents from the VR data; an event detecting unit detecting the environmental state of the real space and the change state of the object contents, space contents, and panorama VR contents of the space; an event contents producing unit producing the environmental state and change state of the real space detected by the event detecting unit as 3-dimensional event contents;
  • a contents managing unit storing the object contents, the space contents, the panorama VR contents, and the photo object VR contents produced from the virtual reality contents producing unit and the event contents produced from the event contents producing unit and providing the contents on a local or a web; and a user terminal having a viewer program for displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents provided from the contents managing unit.
  • the present invention further comprises an Html producing unit converting the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents into Html files to display them on the web.
  • the event detecting unit comprises: a plurality of sensing units detecting the environmental state including at least one of temperature, illuminance, and humidity of the real space and the change state of the contents including at least one of the movements of the object contents, the space contents, and the panorama VR contents; and an event data transmitting unit transmitting the environmental state and the change state detected from the sensing units to the event contents producing unit.
  • The event contents producing unit comprises: an event data receiving unit receiving the environmental state and change state detected by the event detecting unit; and
  • an event contents managing unit analyzing the environmental state and change state received in the event data receiving unit to generate the event contents through the control of numerical information and caption establishment according to the change in the corresponding environmental information, the object contents, and the space contents.
  • the contents managing unit comprises: a 3-dimensional content managing unit storing the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents as compression files and managing them; and a viewer data installing unit installing a viewer program capable of displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents on a user terminal accessing the web.
  • The event detecting unit and the event contents producing unit use any one of Zigbee, Bluetooth, and wireless LAN to transceive data. Also, the event contents producing unit and the contents managing unit use any one of serial communication and USB communication to transceive data.
  • The present invention comprises the steps of: allowing an event contents producing unit producing event contents to detect event data produced according to the change in the 3-dimensional object contents, space contents, and panorama VR contents produced in a virtual reality contents producing unit and the change in the environmental state of a real space;
  • the present invention further comprises converting the event contents produced according to the change in the object contents, the space contents, and the panorama VR contents and the change in the environmental state of the real space into Html files to display them on the web.
  • The event contents change numerical information and captions according to the change in the object contents, the space contents, and the panorama VR contents.
  • The present invention has an advantage that the lively feeling of the spot can be provided to a user by providing the environment of the virtual reality space realized using the real space as a model, or the change state of the contents, in real time.
  • The present invention has a further advantage that it produces the 3-dimensional contents together with Html files of those contents, so that they can be easily confirmed on the web.
  • FIG. 1 is a block view showing a constitution of a system for realizing 3-dimensional virtual reality contents using a ubiquitous sensor network according to the present invention;
  • FIG. 2 is a block view showing a constitution of the virtual reality contents producing unit of FIG. 1;
  • FIG. 3 is a block view showing a constitution of the event contents producing unit of FIG. 1;
  • FIG. 4 is a block view showing a constitution of a system for installing a user viewer on the user terminal of FIG. 1;
  • FIG. 5 is a flow chart showing a method for realizing 3-dimensional virtual reality contents using a ubiquitous sensor network according to the present invention; and
  • FIG. 6 is a flow chart showing a method for realizing the contents using the event data of FIG. 5.
  • FIG. 1 is a block view showing a constitution of a system for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network according to the present invention.
  • The system comprises: a data converting unit 100 converting raw data realized for producing 3-dimensional contents into VRML files, panorama VR data files, or photo object VR data files; a virtual reality contents producing unit 200 producing 3-dimensional virtual reality contents using the files converted in the data converting unit 100; an event detecting unit 500 detecting the environmental state of a real space and the change state of object contents and space contents of the space; an event contents producing unit 600 producing the environmental state and change state of the real space detected by the event detecting unit 500 as 3-dimensional event contents; a contents managing unit 300 storing the 3-dimensional virtual reality contents produced in the virtual reality contents producing unit 200 and the event contents produced in the event contents producing unit 600, and providing them on a local or a web; and a user terminal 400 having a viewer program for displaying the provided contents.
  • The data converting unit 100 converts the raw data realized by a general 3-dimensional data authoring tool (for example, 3D Studio, 3D Photo VR Editor, etc.) into the VRML files and converts image data acquired by rotating a camera into the VR data files. That is, the raw data realized using the general 3-dimensional authoring tool are converted into VRML data and extracted as the VRML files.
  • the data converting unit 100 extracts panorama VR data and photo object VR data from the image data acquired from the camera.
  • The panorama VR data are produced by stitching the image data acquired by rotating the camera (for example, about 30 sheets of photograph data photographed while rotating the camera) with a general panorama program.
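The stitching above relies on neighbouring photographs overlapping so the panorama program can match features between them. A minimal sketch of the capture arithmetic follows; only the ~30-shot count comes from the text, and the 40-degree lens field of view is an assumption for illustration:

```python
def panorama_shot_plan(num_shots, h_fov_deg):
    """Return (rotation step in degrees, overlap fraction between
    neighbouring frames) for a single-row cylindrical panorama capture."""
    step = 360.0 / num_shots                  # rotation between shots
    overlap = (h_fov_deg - step) / h_fov_deg  # fraction of each frame shared
    if overlap <= 0:
        raise ValueError("lens FOV too narrow: neighbouring frames would not overlap")
    return step, overlap

# With the ~30 photographs mentioned above and an assumed 40-degree lens:
step, overlap = panorama_shot_plan(30, 40.0)
# step = 12.0 degrees; each frame shares 70% of its width with its neighbour
```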
  • The photo object VR data are produced by converting the image data acquired by rotating a body, i.e., an object (for example, about 30 sheets of photographs taken while rotating the object).
  • FIG. 2 is a block view showing in more detail a constitution of the virtual reality contents producing unit 200 of FIG. 1.
  • the virtual reality contents producing unit 200 comprises a VRML producing unit 210 producing the object contents and the space contents from the VRML files that are converted and extracted from the data converting unit 100, an image contents producing unit 220 producing the panorama VR contents and the photo object VR contents from the panorama VR data and the photo object VR data; and a data storing unit 230 storing the contents produced from the VRML producing unit 210 and the image contents producing unit 220 according to each file format.
  • The VRML producing unit 210 analyzes the object data and the scene data from the VRML files extracted in the data converting unit 100, and produces the object contents from the analyzed object data by inserting a rotation, an edition, a trigger establishment, a background color of the object, a lighting control, and an animation into the object data.
  • The VRML producing unit 210 converts the object contents and the space contents into the Html files to provide them on the web, and produces the space contents from the analyzed scene data by establishing an octree space division, a viewpoint animation, a link of the object, and a billboard.
  • The VRML producing unit 210 reads out the object (body) data from the raw data realized in the 3-dimensional data authoring tool, using the VRML basic data files (for example, a library), in order to process the portion required for producing the 3-dimensional object contents.
  • a process of producing the 3-dimensional contents is performed using the read out object data.
  • The establishment of automatic rotation, addition, and deletion of the object (body), etc., is performed, as is the establishment of trigger addition or deletion of the object, etc.
  • The contents are produced through the background color, lighting control, and insertion of a necessary picture, etc., for the object, and the 3-dimensional object contents are completed by inserting the animation, etc., into all or a portion of the 3-dimensional contents, as needed.
  • the object contents are produced as the object file format.
  • Similarly, the VRML producing unit 210 reads out the scene data from the raw data realized in the 3-dimensional data authoring tool, using the VRML basic data files (for example, a library), in order to process the portion required for producing the 3-dimensional space contents.
  • the space contents are produced by establishing the octree space divisional establishment, the viewpoint animation establishment, the link of the object, and the billboard from the scene data.
  • The octree space divisional establishment is a scheme dividing a 3-dimensional space into regular hexahedrons (cubes).
  • The scheme is a space divisional scheme that prevents waste of unnecessary memory in a space with relatively few objects by finally leaving only one plane in the divided space. This is used to solve problems such as 3-dimensional rear-face removal and collision.
  • One regular hexahedron is called a node (different from the node concept in the object file and the VRML file), and the max polygons value indicates the maximum number of polygons one node can hold.
  • The number of max polygons is properly controlled and, at the same time, the number of overall nodes is determined.
  • The 3-dimensional scene space can be divided by the number of max polygons as well as by a sub-divisional numerical value.
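The octree scheme above can be sketched as follows. This is a minimal illustration, not the patent's implementation: polygons are represented by their centroid points, and the two constants stand in for the max polygons value and the sub-divisional numerical value mentioned in the text:

```python
from dataclasses import dataclass, field

MAX_POLYGONS = 64   # max polygons one node may hold (tunable)
MAX_DEPTH = 4       # the "sub-divisional numerical value"

@dataclass
class Node:
    center: tuple       # center of this cubic region
    half: float         # half of the cube's edge length
    polygons: list      # polygon centroids assigned to this node
    children: list = field(default_factory=list)

def subdivide(node, depth=0):
    """Split a node into 8 child cubes while it holds too many polygons."""
    if len(node.polygons) <= MAX_POLYGONS or depth >= MAX_DEPTH:
        return
    cx, cy, cz = node.center
    h = node.half / 2
    for dx in (-h, h):
        for dy in (-h, h):
            for dz in (-h, h):
                child = Node((cx + dx, cy + dy, cz + dz), h, [])
                # a polygon goes to the child cube containing its centroid
                # (centroids exactly on a boundary may land in two children)
                child.polygons = [p for p in node.polygons
                                  if all(abs(p[i] - child.center[i]) <= h
                                         for i in range(3))]
                node.children.append(child)
                subdivide(child, depth + 1)
    node.polygons.clear()   # interior nodes no longer store polygons
```

Tightening `MAX_POLYGONS` produces more, smaller nodes, which is the trade-off the text describes between the max polygons count and the overall node count.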
  • a gravity base value is set in the scene data to set the collision and gravity so that the camera does not fall below the set value.
  • Since the radius of the camera plays an important role in setting the collision function and the viewpoint value, it should be properly selected according to the dimension value of the scene data. When the collision function is operated, the camera cannot pass through a space (in particular, a door or passage) smaller than the radius of the camera. The Z-axis radius of the camera indicates the distance that the camera moves at once. Therefore, the radius of the camera should be properly determined according to the dimension value of the scene data so that navigation can be performed at a proper speed.
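The two roles of the camera radius described above (collision clearance and per-step travel distance) can be shown in a small sketch; the function names and the strict inequality at the boundary are illustrative assumptions:

```python
def can_pass(opening_width, camera_radius):
    """With the collision function on, the camera cannot pass through an
    opening (a door or passage) smaller than its own radius."""
    return opening_width > camera_radius

def step_camera_z(z, camera_z_radius, forward=True):
    """The camera's Z-axis radius is the distance moved at once, so it
    directly sets the navigation speed."""
    return z + camera_z_radius if forward else z - camera_z_radius
```

A radius tuned to the scene's dimension value therefore keeps doors passable while keeping each navigation step neither crawling nor teleporting.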
  • the space contents are produced by adding or deleting viewpoint animation, establishing the collision and gravity effects through the navigation function, and establishing an anchor with an object, a billboard, and an automatic rotation of space, etc., in the scene data.
  • the space contents are produced as the space contents file format.
  • the anchor performs a function of connecting other places on the 3- dimensional space like the link concept of the HTML.
  • The billboard performs a function of making an object face a certain direction in the 3-dimensional space irrespective of the position and direction of the camera. That is, one surface of the object should always be perpendicular to the direction of the camera. It is mainly used when displaying a signboard or a tree, etc., in the 3-dimensional space.
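Keeping a surface perpendicular to the camera direction reduces, for the common signboard/tree case, to recomputing one yaw angle per frame. A sketch under the assumption of a Y-up coordinate system (the axis convention is not stated in the text):

```python
import math

def billboard_yaw(obj_pos, cam_pos):
    """Yaw angle (radians, about the vertical Y axis) that turns a flat
    object so its front face stays perpendicular to the line of sight
    from the camera."""
    dx = cam_pos[0] - obj_pos[0]
    dz = cam_pos[2] - obj_pos[2]
    return math.atan2(dx, dz)

# A signboard at the origin viewed from straight ahead (+Z) needs no turn;
# a camera due +X requires a quarter turn.
```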
  • the space contents and the object contents produced as each file format are output and then stored in the data storing unit 230.
  • the contents are converted into the Html files to be able to be displayed on the web.
  • The image contents producing unit 220 receives the panorama VR data stitched through the general panorama program from the data converting unit 100 to realize horizontal (cylindrical) panorama VR and horizontal-vertical (spherical) panorama VR contents. Also, the producing unit 220 produces the panorama VR contents including color and brightness control of the panorama VR data, a teleport establishment between the panorama VR data, a caption establishment, image quality improvement, an automatic point in time, a left and right automatic rotation, and an image zoom in/zoom out function.
  • The image contents producing unit 220 produces the photo object VR contents including an image edition, a caption establishment of the photo object VR, image quality improvement, a left and right automatic rotation, and an image zoom in/zoom out function by using an OVR (Object Virtual Reality) image function of the photo object VR data files input from the data converting unit 100.
  • The photo object VR contents and the panorama VR contents produced in the image contents producing unit 220 are converted into the Html files so that they can be displayed on the web.
  • FIG. 3 is a block view showing in more detail a constitution for producing the event contents.
  • The event detecting unit 500 is a component that detects the environmental state of the real space and the change state of the object contents and space contents of the space.
  • the event detecting unit 500 comprises a plurality of sensing units 510 and 511 detecting the environmental state including at least one of temperature, humidity, and illuminance of the real space and the change state of the contents including at least one of the movements of the object contents and the space contents; and an event data transmitting unit 520 transmitting the environmental state and the change state detected from the plurality of sensing units 510 and 511 to the event contents producing unit 600.
  • The change (increase or decrease) state of the environment, such as the temperature, humidity, and illuminance of the real space, or the change state of the object contents or the space contents produced in the virtual reality contents producing unit 200 (for example, a door or a window capable of being opened and closed, or equipment whose operation can be detected through on/off, etc.) is detected and provided through the plurality of sensing units 510 and 511.
  • For example, the change in the environmental state of the spot is provided by supplying the current temperature, periodically detected by a temperature sensor, as numerical data in a certain form such as hexadecimal digits.
  • The space contents such as a door or a window, etc., are installed with a sensor capable of recognizing opening and closing, so that it can be confirmed when the door or the window is opened or closed.
  • When the door is closed, data "0" is transmitted, and when the door is opened, data "1" is transmitted, so that the opening and closing of the door can be confirmed.
  • The event contents producing unit 600 is a component producing the environmental state and the change state of the real space detected by the event detecting unit 500 as the 3-dimensional event contents.
  • The unit 600 comprises an event data receiving unit 610 receiving the environmental state and change state of the contents detected by the event detecting unit 500, an event contents managing unit 620 analyzing the environmental state and change state received in the event data receiving unit 610 to generate the event contents, and an Html producing unit 630 converting the produced event contents into the Html files.
  • The event data receiving unit 610 uses any one of Zigbee, Bluetooth, and wireless LAN to perform wireless data communication with the event detecting unit 500.
  • The unit 610 uses the Zigbee to perform the wireless data communication.
  • The Zigbee, which is a new protocol capable of building a low-power, small-scale wireless network, is defined in the IEEE 802.15.4 standard.
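Whatever the radio link, the transmitting unit 520 and the receiving unit 610 must agree on an application payload layout. The text does not specify one, so every field choice below is a hypothetical illustration of how such an event frame could be packed and unpacked:

```python
import struct

# Hypothetical payload carried over Zigbee/Bluetooth/wireless LAN:
# sensor id (1 byte), event type (1 byte), value (unsigned 16-bit),
# all big-endian. None of these fields appear in the text.
FMT = ">BBH"

EVT_TEMPERATURE, EVT_HUMIDITY, EVT_ILLUMINANCE, EVT_OPEN_CLOSE = range(4)

def pack_event(sensor_id, event_type, value):
    """Serialize one sensor event into a 4-byte frame."""
    return struct.pack(FMT, sensor_id, event_type, value)

def unpack_event(frame):
    """Recover (sensor_id, event_type, value) from a received frame."""
    return struct.unpack(FMT, frame)
```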
  • In order to display the case where the environmental state such as temperature, humidity, and illuminance is changed, the event contents managing unit 620 newly produces the 3-dimensional event contents or changes the current state to modify the event contents through the control of the numerical information and the change in the caption establishment.
  • the event contents managing unit 620 produces the event contents by changing the caption establishment in order to display the change in the object contents or the space contents detected by the on/off operation.
  • Whether events such as changes in the temperature, humidity, and illuminance of the spot are generated is detected for the panorama VR contents produced in the image contents producing unit 220, making it possible to provide the event contents whose change is detected.
  • In the panorama VR contents, a certain region capable of displaying the numerical data, like a hotspot, is established, and the event contents such as the numerical data can be displayed in the established region.
  • the Html producing unit 630 converts the produced event contents into the Html files to display them on the web.
  • The event contents produced in the event contents managing unit 620 are converted into the HTML data in the Html producing unit 630 according to a selection of a user and can then be output. Also, the event contents can be output in the produced format without being converted into the HTML data.
  • The contents managing unit 300 uploads the object contents, the space contents, the panorama VR contents, and the photo object VR contents produced in the virtual reality contents producing unit 200 and the event contents file produced in the event contents producing unit 600, and manages them so that they can be provided on a local or a web.
  • FIG. 4 is a block view showing a constitution of the contents managing unit 300.
  • The contents managing unit 300 comprises a 3-dimensional content managing unit 310 storing the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents as compression files and managing them, and a viewer data installing unit 320 installing the viewer program on a user terminal accessing the web.
  • the contents managing unit 300 is a server system.
  • the user terminal 400 has the viewer program for displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents provided from the contents managing unit 300.
  • the user terminal 400 is a PC.
  • the 3-dimensional contents managing unit 310 in the contents managing unit 300 confirms whether the viewer program is installed to be able to display the 3-dimensional contents data on the user terminal 400.
  • The viewer data installing unit 320 in the contents managing unit 300 installs the viewer program in the user terminal 400 and, if the installation of the viewer program is completed, stores the 3-dimensional contents data included in the Html.
  • FIG. 5 is a flow chart showing a method for realizing virtual reality contents of 3-dimension in real time according to the present invention and FIG. 6 is a flow chart showing a method for realizing the contents using the event data of FIG. 5.
  • The event contents producing unit 600 and the virtual reality contents producing unit 200 producing the 3-dimensional contents detect and analyze the event data and the raw data for realizing the 3-dimensional contents from the data converting unit 100 and the event detecting unit 500 (S100 and S200).
  • the virtual reality contents producing unit 200 analyzes the VRML data to sort (S300) them into the object data and the scene data.
  • The object data and scene data sorted in the step S300 are each edited through the VRML basic data files (for example, a library) to produce (S400) the object contents and the space contents, and the contents produced in the step S400 are uploaded to the contents managing unit 300 to be registered and then stored (S500).
  • the step S400 further comprises a step of converting the produced contents into the Html files to display them on the web.
  • The object contents produced in the step S400 are produced by reading out the object data using the VRML basic data files and applying a rotation, an edition, a trigger establishment, a background color of the object, a lighting control, and an animation insertion to the read-out object data.
  • The space contents produced in the step S400 are produced by reading out the scene data using the VRML basic data files and establishing a space divisional establishment, a viewpoint animation establishment, a link of the object, and a billboard function in the read-out scene data.
  • After performing the step S500, the contents managing unit 300 detects (S600) the user terminal 400 accessing the local or the web and judges (S700) whether the viewer program for displaying the object and space contents is installed.
  • The contents managing unit 300 displays (S800) the object contents, the space contents, and the event contents to be described later, and transmits and outputs (S900) them to the user terminal 400.
  • The virtual reality contents producing unit 200 analyzes the image data to sort them into the panorama VR data and the photo object VR data, produces the panorama VR contents and the photo object VR contents, and uploads the produced contents to the contents managing unit 300 to be registered and then stored.
  • the event contents producing unit 600 analyzes each of the detected event data (S210 and S220).
  • the event data analyzed in the step S220 are produced as the event contents in the event contents producing unit 600 according to the change in the object contents and the space contents and the change in the environmental state of the real space (S230).
  • For a change in the environmental state such as temperature, humidity, and illuminance, the 3-dimensional event contents are newly produced, or the current state is changed to modify the event contents through the control of the numerical information and the caption establishment.
  • the event contents are produced through the caption establishment, an automatic performance of a touch sensor, etc.
  • the produced event contents are converted into the Html files to display them on the web.
  • The event contents produced in the step S230 are stored (S240) in the contents managing unit 300.
  • The contents managing unit 300 displays the object contents, the space contents, and the event contents (S800) and transmits and outputs (S900) them to the user terminal 400.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a system and method for realizing 3-dimensional virtual reality contents using a ubiquitous sensor network, capable of providing a solution that integrates technologies for realizing various virtual realities, services them, and displays environmental information such as temperature, illuminance, and humidity, together with the operating state of contents in the virtual reality, in real time. In order to accomplish this, the present invention comprises: a virtual reality contents producing unit producing 3-dimensional contents from raw data; an event detecting unit detecting the change state of the contents; an event contents producing unit producing the environmental state and change state of the real space detected by the event detecting unit as 3-dimensional event contents; a contents managing unit storing the contents produced in the virtual reality contents producing unit and the event contents produced in the event contents producing unit and providing the contents on a local or a web; and a user terminal having a viewer program for displaying the contents provided from the contents managing unit. Therefore, the present invention has an advantage that the lively feeling of the spot can be provided to a user by providing the environment of the virtual reality space realized using the real space as a model, or the change state of the contents, in real time.

Description

RO/KR 11.01.2007
SYSTEM AND METHOD FOR REALIZING VIRTUAL REALITY CONTENTS OF 3-DIMENSION USING UBIQUITOUS SENSOR NETWORK
Technical Field
The present invention relates to a system and method for realizing 3-dimensional virtual reality contents using a ubiquitous sensor network, and more specifically to a system and method capable of providing a solution that integrates technologies for realizing various virtual realities, services them, and displays environmental information such as temperature, illuminance, and humidity, together with the operating state of contents in the virtual reality, in real time.
Background Art
Generally, a VRML (Virtual Reality Modeling Language) code is a 3-dimensional image realization programming language made to realize an internet-based 3-dimensional image and is a language for presenting information, like an HTML code.
In other words, the VRML code is a language for describing a 3-dimensional image space, etc., which can interact with a user (client). The VRML code makes visual images and registers them on a web so that a user can view, move, and rotate objects on a 3-dimensional screen, as well as making interaction between the objects.
Such a VRML can be viewed by means of a VRML dedicated viewer. Currently, mainly used viewers do not comprise a control panel that can control a change in the camera position or the positions of each of a plurality of objects, so they cannot finely control each object according to the position of the camera. As a result, such viewers have not been used for producing VRML contents that require control or arrangement of a plurality of objects.
Also, as the technologies for realizing the virtual reality, there are a 3-dimension-based avatar, an object, a scene, an image-based panorama VR, and a photo object VR, etc.
However, the technologies for realizing such virtual reality are realized by a 3-dimensional data authoring tool, causing a problem that each of the produced 3-dimensional contents can be confirmed only through each dedicated viewer. Also, the virtual reality realized by using a real space as a model simply provides only a spatial image and does not provide the environmental information or state of the real space, etc., in real time, causing a problem that the lively feeling of the spot is not provided to a user.
Disclosure
Technical Problem
The present invention is proposed to solve the above problems. An object of the present invention is to provide a system and method for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network capable of providing a solution that can integrate technologies for realizing various virtual realities and service them, and detecting environmental information and the change state of a real space in real time and providing them.
Technical Solution

In order to accomplish the object, the present invention comprises: a data converting unit converting raw data realizing a real space by means of a general 3-dimensional data authoring tool into VRML files and converting them into VR data files by rotating image data of a real space acquired from a camera; a virtual reality contents producing unit producing object contents and space contents from the VRML files converted from the data converting unit and producing panorama VR contents and photo object VR contents from the VR data; an event detecting unit detecting the environmental state of the real space and the change state of object contents, space contents, and panorama VR contents of the space; an event contents producing unit producing the environmental state and change state of the real space detected from the event detecting unit as 3-dimensional event contents;
a contents managing unit storing the object contents, the space contents, the panorama VR contents, and the photo object VR contents produced from the virtual reality contents producing unit and the event contents produced from the event contents producing unit and providing the contents on a local or a web; and a user terminal having a viewer program for displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents provided from the contents managing unit.
Also, the present invention further comprises an Html producing unit converting the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents into Html files to display them on the web.
Also, the event detecting unit comprises: a plurality of sensing units detecting the environmental state including at least one of temperature, illuminance, and humidity of the real space and the change state of the contents including at least one of the movements of the object contents, the space contents, and the panorama VR contents; and an event data transmitting unit transmitting the environmental state and the change state detected from the sensing units to the event contents producing unit.
Also, the event contents producing unit comprises: an event data receiving unit receiving the environmental state and change state detected from the event
detecting unit; and an event contents managing unit analyzing the environmental state and change state received in the event data receiving unit to generate the event contents through the control of numerical information and caption establishment according to the change in the corresponding environmental information, the object contents, and the space contents.
Also, the contents managing unit comprises: a 3-dimensional content managing unit storing the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents as compression files and managing them; and a viewer data installing unit installing a viewer program capable of displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents on a user terminal accessing the web.
Also, the event detecting unit and the event contents producing unit use any one of Zigbee, Bluetooth, and wireless LAN to transceive data. Also, the event contents producing unit and the contents managing unit use any one of serial communication and USB communication to transceive data.
Also, the present invention comprises the steps of: allowing an event contents producing unit producing event contents to detect event data produced according to the change in 3-dimensional object contents, space contents, and panorama VR contents produced from a virtual reality contents producing unit and
the change in the environmental state of a real space; producing the event contents according to the change in the object contents and the space contents and the change in the environmental state of the real space and storing the produced event contents in a contents managing unit by allowing the event contents producing unit to analyze the detected event data; and transmitting the event contents, the object contents, the space contents, and the panorama VR contents to a user terminal and outputting them by allowing the contents managing unit to detect the user terminal accessing a local or a web.
Also, the present invention further comprises converting the event contents produced according to the change in the object contents, the space contents, and the panorama VR contents and the change in the environmental state of the real space into Html files to display them on the web.
Also, the event contents change numerical information and caption according to the change in the object contents, the space contents, and the panorama VR contents.
Advantageous Effects
The present invention has an advantage that the lively feeling of the spot can be provided to a user by providing the environment of the virtual reality space realized using the real space as a model or the change state of the contents in real time.
Also, the present invention has an advantage that it produces the Html files of the produced 3-dimensional contents so that they can be easily confirmed on the web.
Description of Drawings
The above and other objects, features and advantages of the present invention will become apparent from the following description of preferred embodiments given in conjunction with the accompanying drawings, in which:

FIG. 1 is a block view showing a constitution of a system for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network according to the present invention;
FIG. 2 is a block view showing a constitution of a virtual reality contents producing unit of FIG. 1 ; FIG. 3 is a block view showing a constitution of an event contents producing unit of FIG. 1 ;
FIG. 4 is a block view showing a constitution of a system for installing a user viewer on a user terminal of FIG. 1 ;
FIG. 5 is a flow chart showing a method for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network according to the present invention; and
FIG. 6 is a flow chart showing a method for realizing the contents using the event data of FIG. 5.
Best Mode
Hereinafter, the preferred embodiments of the present invention will be described in detail with reference to accompanying drawings.
FIG. 1 is a block view showing a constitution of a system for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network according to the present invention. As shown in FIG. 1, the system comprises: a data converting unit 100 converting raw data realized for producing 3-dimension contents into VRML files, panorama VR data files, or photo object VR data files; a virtual reality contents producing unit 200 producing 3-dimensional virtual reality contents using the files converted from the data converting unit 100; an event detecting unit 500 detecting the environmental state of a real space and the change state of object contents and space contents of the space; an event contents producing unit 600 producing the environmental state and change state of the real space detected from the event detecting unit 500 as 3-dimensional event contents; a contents managing unit 300 storing the 3-dimensional virtual reality contents produced in the virtual reality contents producing unit 200 and the
event contents producing unit 600 and managing them to be able to display them on a local region or a web region; and a user terminal 400 having a viewer program to display the 3-dimensional contents provided from the contents managing unit 300. The data converting unit 100 converts the raw data realized by a general 3-dimensional data authoring tool (for example, 3D Studio, 3D Photo VR Editor, etc.) into the VRML files and converts them into the VR data files by rotating image data acquired from a camera. That is, the raw data realized using the general 3-dimensional authoring tool are converted into the VRML data so that they are extracted as the VRML files.
Also, the data converting unit 100 extracts panorama VR data and photo object VR data from the image data acquired from the camera. Herein, the panorama VR data are obtained by stitching the image data acquired by rotating the camera (for example, about 30 sheets of photograph data photographed by rotating the camera) with a general panorama program.
Also, the photo object VR data are obtained by converting the image data acquired by rotating a body, i.e., an object (for example, data acquired by rotating about 30 sheets of photograph photographing a body (object)). The VRML data, the panorama VR data, and the photo object VR data
converted in the data converting unit 100 are input to the virtual reality contents producing unit 200.
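As a rough sketch of the geometry behind such stitching, assuming the camera is rotated in equal steps through a full circle (as with the roughly 30 photographs mentioned above), each shot's placement angle and the minimum per-image field of view can be computed as follows; the overlap margin is an assumed parameter, not a value from the text.

```python
def panorama_yaws(num_shots):
    """Yaw angle (degrees) at which each photograph sits in a cylindrical
    panorama, assuming the camera was rotated in equal steps through a
    full 360 degrees (e.g. about 30 shots -> 12-degree steps)."""
    step = 360.0 / num_shots
    return [i * step for i in range(num_shots)]

def min_horizontal_fov(num_shots, overlap_deg=5.0):
    """Minimum per-image horizontal field of view so that adjacent shots
    overlap enough for the stitcher to match them; the overlap margin is
    an assumed parameter, not a value from the text."""
    return 360.0 / num_shots + overlap_deg
```

With 30 shots the step is 12 degrees, so each photograph must cover at least about 17 degrees horizontally under the assumed 5-degree overlap.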
FIG. 2 is a block view showing in more detail a constitution of the virtual reality contents producing unit 200 of FIG. 1. In FIG. 2, the virtual reality contents producing unit 200 comprises a VRML producing unit 210 producing the object contents and the space contents from the VRML files that are converted and extracted from the data converting unit 100, an image contents producing unit 220 producing the panorama VR contents and the photo object VR contents from the panorama VR data and the photo object VR data; and a data storing unit 230 storing the contents produced from the VRML producing unit 210 and the image contents producing unit 220 according to each file format.
The VRML producing unit 210 analyzes the object data and the scene data from the VRML files extracted from the data converting unit 100, wherein the object data of the analyzed data produce the object contents by inserting a rotation, an edition, a trigger establishment, a background color of the object, a lighting control, and an animation in the object data.
Also, the VRML producing unit 210 converts the object contents and the space contents into the Html files to provide them on the web, wherein the scene data of the analyzed data produce the space contents by establishing an octree space divisional establishment, a viewpoint animation establishment, a link
of the object, a billboard, a gravity value, a collision value, and an automatic rotation in the scene data.
Herein, the VRML producing unit 210 reads out the object (body) data using the raw data realized from the 3-dimensional data authoring tool as the VRML basic data files (for example, library) in order to process a portion required for producing the 3-dimensional object contents.
At this time, a process of producing the 3-dimensional contents is performed using the read out object data. The establishment of automatic rotation, addition, and deletion of the object (body), etc., is performed, and the establishment of trigger addition or deletion of the object, etc., is performed.
Also, the contents are produced through the background color, lighting control, and insertion of a necessary picture, etc., of the object, and the 3-dimensional object contents are produced by inserting the animation, etc., in all or a portion of the 3-dimensional contents, as needed. The object contents are produced as the object file format.
Also, the VRML producing unit 210 reads out the scene data using the raw data realized from the 3-dimensional data authoring tool as the VRML basic data files (for example, library) in order to process a portion required for producing the 3-dimensional space contents.
In this case, the space contents are produced by establishing the octree space divisional establishment, the viewpoint animation establishment, the link of the object, and the billboard from the scene data.
The octree space divisional establishment is a scheme dividing a 3-dimensional space into regular hexahedrons. The scheme is a space divisional scheme preventing a waste of unnecessary memory in a space with relatively fewer objects by leaving finally only one plane in the divided space. This is used to solve the problems of the 3-dimensional rear removal and collision, etc.
In the octree space division, one regular hexahedron is called a node (different from the node concept in the object file and the VRML file), wherein max polygons indicates the maximum number of polygons that one node can have.
That is, if a node contains more polygons than this number, the space is recursively divided again.
Accordingly, when dividing the 3-dimensional space, firstly, the number of max polygons is properly controlled and at the same time, the number of overall nodes is determined. Also, the 3-dimensional scene space can be divided by the number of max polygons as well as a sub-divisional numerical value.
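The division rule described above — split a cubic node into eight children whenever it holds more than the max polygons — can be sketched as follows. Polygons are reduced to their center points for brevity, which is a simplification of this sketch: a production engine would clip real triangles against child boundaries rather than risk a boundary point landing in two children.

```python
class OctreeNode:
    """Sketch of the octree division described above: a cubic node is
    recursively split into eight child cubes whenever it holds more than
    max_polygons polygons (here, polygon center points)."""

    def __init__(self, center, half, points, max_polygons, depth=0, max_depth=8):
        self.center, self.half = center, half
        self.points = points
        self.children = []
        if len(points) > max_polygons and depth < max_depth:
            q = half / 2.0  # child half-width
            for dx in (-q, q):
                for dy in (-q, q):
                    for dz in (-q, q):
                        c = (center[0] + dx, center[1] + dy, center[2] + dz)
                        # points falling inside this child cube
                        inside = [p for p in points
                                  if all(abs(p[i] - c[i]) <= q for i in range(3))]
                        self.children.append(
                            OctreeNode(c, q, inside, max_polygons,
                                       depth + 1, max_depth))

    def leaf_count(self):
        """Number of leaf nodes, i.e. cubes that were not subdivided."""
        return 1 if not self.children else sum(ch.leaf_count()
                                               for ch in self.children)
```

Raising max_polygons yields fewer, larger nodes; lowering it subdivides more finely, which is the trade-off the text describes between node count and polygons per node.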
Also, since a 3-dimensional graphic engine is used on the web, the size of the file is very small. Therefore, since it does not completely sense the collision between the camera and the object as in a general 3-dimensional graphic game
engine, a gravity base value is set in the scene data to set the collision and gravity so that the camera does not fall below the set value.
Also, since the radius of the camera plays an important role in setting the collision function and the viewpoint value, it should be properly selected according to a dimension value of the scene data. When the collision function is operated, the camera cannot pass through a space (in particular, a door or passage) smaller than the radius of the camera, and the Z-axis radius of the camera indicates a distance that the camera moves at once. Therefore, the radius of the camera should be properly determined according to the dimension value of the scene data so that navigation can be performed at a proper speed.
Also, the space contents are produced by adding or deleting viewpoint animation, establishing the collision and gravity effects through the navigation function, and establishing an anchor with an object, a billboard, and an automatic rotation of space, etc., in the scene data. The space contents are produced as the space contents file format.
The anchor performs a function of connecting other places on the 3- dimensional space like the link concept of the HTML.
The billboard performs a function of making an object look in a certain direction on the 3-dimensional space irrespective of the position and direction of the camera. That is, one surface of the object should be always perpendicular to
the direction of the camera. It is mainly used when displaying a signboard or a tree, etc., on the 3-dimensional space.
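The billboard behavior can be sketched as a rotation about the vertical axis. Assuming a Y-up coordinate system and a quad whose front face looks down +Z when unrotated (both conventions are assumptions, since the text does not fix them):

```python
import math

def billboard_yaw(object_pos, camera_pos):
    """Rotation (radians) about the vertical Y axis that turns a billboard
    at object_pos so its front face points toward the camera -- the
    behaviour described above for signboards and trees."""
    dx = camera_pos[0] - object_pos[0]
    dz = camera_pos[2] - object_pos[2]
    return math.atan2(dx, dz)
```

Applying this yaw every frame keeps the billboard surface perpendicular to the camera direction as the camera navigates the scene.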
Therefore, the space contents and the object contents produced as each file format are output and then stored in the data storing unit 230. The contents are converted into the Html files to be able to be displayed on the web.
Meanwhile, the image contents producing unit 220 receives the stitched panorama VR data through the general panorama program from the data converting unit 100 to realize a horizontal (cylindrical shape) panorama VR and horizontal-vertical (spherical shape) panorama VR contents. Also, the producing unit 220 produces the panorama VR contents including color and brightness control of the panorama VR data, a teleport establishment between the panorama
VR data, a caption establishment, image quality improvement, an automatic point in time, a left and right automatic rotation, and an image zoom in/zoom out function.
Also, the image contents producing unit 220 produces the photo object VR contents including an image edition, a caption establishment of the photo object
VR, image quality improvement, a left and right automatic rotation, and an image zoom in/zoom out function by using an OVR (Object Virtual Reality) image function of the photo object VR data files input from the data converter 100.
Also, the photo object VR contents and the panorama VR contents produced in the image contents producing unit 220 are converted into the Html
files to be able to be displayed on the web and the files of the photo object contents and the panorama VR contents produced as each file format are stored in the data storing unit 230.
FIG. 3 is a block view showing in more detail a constitution for producing the event contents. As shown in FIGS. 1 and 3, an event detecting unit 500 is a constitution detecting the environmental state of the real space and the change state of object contents and space contents of the space. For example, the event detecting unit 500 comprises a plurality of sensing units 510 and 511 detecting the environmental state including at least one of temperature, humidity, and illuminance of the real space and the change state of the contents including at least one of the movements of the object contents and the space contents; and an event data transmitting unit 520 transmitting the environmental state and the change state detected from the plurality of sensing units 510 and 511 to the event contents producing unit 600. The change (increase or decrease) state of the environment such as temperature, humidity, and illuminance, etc., of the real space or the change state of the object contents or the space contents (for example, a door and a window capable of being opened and closed, an equipment capable of detecting an operation through on/off, etc.), etc., produced in the virtual reality contents producing unit 200 are detected and provided through the plurality of sensing units
510 and 511 in real time, making it possible to provide lively feeling to a user.
In other words, the change in the environmental state of the spot is provided by providing current temperature periodically detected from a temperature sensor as a numerical data in a certain form such as hexadecimal digits, etc.
Also, the space contents such as a door or a window, etc., are installed with a sensor capable of recognizing opening and closing so that it can be confirmed when the door or the window is opened and closed. In other words, when the door is closed, data "0" is transmitted, and when the door is opened, data "1" is transmitted, so that the opening and closing of the door can be confirmed.
Also, in the case of the object contents operated by using a power supply in the real space, whether the object contents are operated can be confirmed through the on/off of the power supply.
Also, the event contents producing unit 600 is a constitution producing the environmental state and the change state of the real space detected from the event detecting unit 500 as the 3-dimensional event contents. The unit 600 comprises an event data receiving unit 610 receiving the environmental state and change state of the contents detected from the event detecting unit 500, an event contents managing unit 620 analyzing the environmental state and change state received in the event data receiving unit 610 to generate the event contents
through the change in a numerical information establishment and the change in a caption establishment according to the change in the corresponding environmental information, the object contents, and the space contents, and an Html producing unit 630 converting the event contents produced from the event contents managing unit 620 into the Html files to display them on the web.
The event data receiving unit 610 uses any one of Zigbee, Bluetooth, and wireless LAN to perform wireless data communication with the event detecting unit
500. Preferably, the unit 610 uses the Zigbee to perform the wireless data communication. The Zigbee, which is a new protocol capable of building a low-power, small-scale wireless network, is defined in the IEEE 802.15.4 standard.
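Since an IEEE 802.15.4 frame is at most 127 bytes, event reports over such a link are typically packed compactly. The layout below — sensor id, event type, and a value scaled by ten — is purely an illustrative assumption, not part of the standard or of the invention:

```python
import struct

# Hypothetical compact event payload for a low-rate IEEE 802.15.4/Zigbee
# link: sensor id (1 byte), event type (1 byte), value scaled by 10 in a
# signed 16-bit field.  This field layout is an illustrative assumption.
EVENT_FMT = ">BBh"  # big-endian: uint8, uint8, int16

def pack_event(sensor_id, event_type, value):
    """Pack one sensor event into a 4-byte payload."""
    return struct.pack(EVENT_FMT, sensor_id, event_type,
                       int(round(value * 10)))

def unpack_event(payload):
    """Recover (sensor_id, event_type, value) on the receiving side."""
    sensor_id, event_type, raw = struct.unpack(EVENT_FMT, payload)
    return sensor_id, event_type, raw / 10.0
```

The event data receiving unit 610 would unpack such payloads before handing the readings to the event contents managing unit 620.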
In order to display the case where the environmental state such as temperature, humidity, and illuminance, etc., is changed, the event contents managing unit 620 newly produces the 3-dimensional event contents or changes a current state to modify the event contents through the control of the numerical information and the change in the caption establishment.
Also, the event contents managing unit 620 produces the event contents by changing the caption establishment in order to display the change in the object contents or the space contents detected by the on/off operation.
Meanwhile, the panorama VR contents produced in the image contents producing unit 220 detects whether the events such as temperature, humidity, and
illuminance, etc., of the spot are generated, making it possible to provide the event contents whose change is detected. In other words, in the case of the panorama VR contents, a certain region capable of displaying the numerical data like a hotspot is established, wherein the event contents such as the numerical data, etc., can be displayed in the established region.
The Html producing unit 630 converts the produced event contents into the Html files to display them on the web.
Meanwhile, the event contents produced in the event contents managing unit 620 are converted into the HTML data in the Html producing unit 630 according to a selection of a user and can be then output. Also, the event contents can be output in the produced format without being converted into the HTML data.
Referring again to FIG. 1 , the contents managing unit 300 uploads the object contents, the space contents, the panorama VR contents, and the photo object VR contents produced from the virtual reality contents producing unit 200 and the event contents file produced from the event contents producing unit 600 to manage them to be provided on a local or a web.
FIG. 4 is a block view showing a constitution of the contents managing unit
300 and a constitution of a system for installing a user viewer in a user terminal
400. In FIG. 4, the contents managing unit 300 comprises a 3-dimensional content managing unit 310 storing the object contents, the space contents, the panorama
VR contents, the photo object VR contents, and the event contents as compression files and managing them, and a viewer data installing unit 320 installing a viewer program capable of displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents on the user terminal accessing the web. Preferably, the contents managing unit 300 is a server system.
The user terminal 400 has the viewer program for displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents provided from the contents managing unit 300. Preferably, the user terminal 400 is a PC.
When the user terminal 400 is connected to the contents managing unit 300 through an internet explorer 410 installed in the user terminal 400 in order to receive the 3-dimensional contents data provided on the web, the 3-dimensional contents managing unit 310 in the contents managing unit 300 confirms whether the viewer program is installed to be able to display the 3-dimensional contents data on the user terminal 400.
If the viewer program is not installed in the user terminal 400, the viewer data installing unit 320 in the contents managing unit 300 installs the viewer program in the user terminal 400 and if the installation of the viewer program is completed, it stores the 3-dimensional contents data included in the Html in a
temporary folder 420 of the user terminal 400 and displays them on the internet explorer 410.
Also, when the display by the viewer program is completed, the unit 320 deletes the files stored in the temporary folder 420.

FIG. 5 is a flow chart showing a method for realizing virtual reality contents of 3-dimension in real time according to the present invention and FIG. 6 is a flow chart showing a method for realizing the contents using the event data of FIG. 5.
Referring to FIGS. 1 to 6, the event contents producing unit 600 and the virtual reality contents producing unit 200 producing the 3-dimensional contents detect and analyze the event data and the raw data for realizing the 3-dimensional contents from the data converting unit 100 and the event detecting unit 500 (S100 and S200).
According to the judgment results, when the analyzed data are the raw data for realizing the 3-dimensional contents as the VRML data produced from the 3-dimensional data authoring tool, the virtual reality contents producing unit 200 analyzes the VRML data to sort (S300) them into the object data and the scene data.
The object data and scene data sorted in the step S300 are edited through the VRML basic data files (for example, library) to produce (S400) the object contents and the space contents, and it uploads the contents produced in
the step S400 to the contents managing unit 300 to register and then store (S500) them.
Also, the step S400 further comprises a step of converting the produced contents into the Html files to display them on the web. The object contents produced in the step S400 are produced by reading out the object data using the VRML basic data files and through a rotation, an edition, a trigger establishment, a background color of the object, a lighting control, and an animation insertion in the read out object data.
Also, the space contents produced in the step S400 are produced by reading out the scene data using the VRML basic data files and establishing a space divisional establishment, a viewpoint animation establishment, a link of the object, and a billboard function in the read out scene data.
After performing the step S500, the contents managing unit 300 detects (S600) the user terminal 400 accessing the local or the web to judge (S700) whether the viewer program for displaying the object and space contents is installed.
According to the judgment results of the step S700, if the viewer program is installed in the user terminal 400, the contents managing unit 300 displays (S800) the object contents and the space contents and the event contents to be described later and transmits and outputs (S900) them to the user terminal 400.
Also, if the raw data is the image data acquired from the camera, the virtual reality contents producing unit 200 analyzes the image data to sort them into the panorama VR data and the photo object VR data so that it produces the panorama
VR contents and the photo object VR contents and uploads the produced contents to the virtual reality contents managing unit 300 to register and then store them.
Meanwhile, if the event data generated according to the change in the space contents and the 3-dimensional object contents produced from the virtual reality contents producing unit 200 and the change in the environmental state of the real space are detected in the step S200, the event contents producing unit 600 analyzes each of the detected event data (S210 and S220).
The event data analyzed in the step S220 are produced as the event contents in the event contents producing unit 600 according to the change in the object contents and the space contents and the change in the environmental state of the real space (S230). In other words, in order to display the case where the environmental state such as temperature, humidity, and illuminance, etc., is changed, the 3-dimensional event contents are newly produced or the current state is changed to modify the event contents through the control of the numerical information and the caption establishment. Also, in order to display the change in the space contents or the object
contents detected by the on/off operation, the event contents are produced through the caption establishment, an automatic performance of a touch sensor, etc.
Also, the produced event contents are converted into the Html files to display them on the web.
The event contents produced in the step S230 are stored (S240) in the contents managing unit 300. The contents managing unit 300 displays the object contents, the space contents, and the event contents (S800) to transmit and output
(S900) the event contents produced in the step S230 and the object contents and the space contents produced in the step S400 to the user terminal 400.
Although the preferred embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes might be made in this embodiment without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims

What is claimed is:
1. A system for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network comprising: a data converting unit converting raw data realizing a real space by means of a general 3-dimensional data authoring tool into VRML files and converting them into VR data files by rotating image data of a real space acquired from a camera; a virtual reality contents producing unit producing object contents and space contents from the VRML files converted from the data converting unit and producing panorama VR contents and photo object VR contents from the VR data; an event detecting unit detecting the environmental state of the real space and the change state of object contents, space contents, and panorama VR contents of the space; an event contents producing unit producing the environmental state and change state of the real space detected from the event detecting unit as 3-dimensional event contents; a contents managing unit storing the object contents, the space contents, the panorama VR contents, and the photo object VR contents produced from the virtual reality contents producing unit and the event contents produced from the event contents producing unit and providing the contents on a local or a web; and
a user terminal having a viewer program for displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents provided from the contents managing unit.
2. The system as claimed in claim 1 , further comprising an Html producing unit converting the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents into Html files to display them on the web.
3. The system as claimed in claim 1 or 2, wherein the event detecting unit comprises: a plurality of sensing units detecting the environmental state including at least one of temperature, illuminance, and humidity of the real space and the change state of the contents including at least one of the movements of the object contents, the space contents, and the panorama VR contents; and an event data transmitting unit transmitting the environmental state and the change state detected from the sensing units to the event contents producing unit.
4. The system as claimed in claim 1 or 2, wherein the event contents producing unit comprises:
an event data receiving unit receiving the environmental state and change state detected by the event detecting unit; and
an event contents managing unit analyzing the environmental state and change state received by the event data receiving unit to generate the event contents by controlling numerical information and caption establishment according to changes in the corresponding environmental information, the object contents, the space contents, and the panorama VR contents.
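For illustration only, the numerical-information and caption control described in claim 4 could look like the following sketch; the caption format, units, and the 30 °C alert threshold are assumptions not stated in the claim:

```python
def make_event_caption(space_name: str, temperature_c: float, humidity_pct: float) -> str:
    # Build the caption shown with the event contents from current
    # environmental values; wording and threshold are illustrative.
    caption = f"{space_name}: {temperature_c:.1f} °C, {humidity_pct:.0f}% RH"
    if temperature_c > 30.0:  # hypothetical alert threshold
        caption += " [high temperature]"
    return caption
```

The event contents managing unit would re-run such a routine whenever the event data receiving unit delivers a changed environmental state, so captions track the real space.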
5. The system as claimed in claim 1 or 2, wherein the contents managing unit comprises:
a 3-dimensional contents managing unit storing and managing the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents as compressed files; and
a viewer data installing unit installing, on a user terminal accessing the web, a viewer program capable of displaying the object contents, the space contents, the panorama VR contents, the photo object VR contents, and the event contents.
6. The system as claimed in claim 1 or 2, wherein the event detecting unit and the event contents producing unit use any one of Zigbee, Bluetooth, and wireless LAN to transmit and receive data.
7. The system as claimed in claim 1 or 2, wherein the event contents producing unit and the contents managing unit use any one of serial communication and USB communication to transmit and receive data.
8. A method for realizing virtual reality contents of 3-dimension using a ubiquitous sensor network, comprising the steps of:
a) allowing an event contents producing unit, which produces event contents, to detect event data produced according to changes in 3-dimensional object contents, space contents, and panorama VR contents produced by a virtual reality contents producing unit and according to changes in the environmental state of a real space;
b) allowing the event contents producing unit to analyze the event data detected in step a), to produce the event contents according to the changes in the object contents and the space contents and the change in the environmental state of the real space, and to store the produced event contents in a contents managing unit; and
c) allowing the contents managing unit to detect a user terminal accessing locally or on the web, and to transmit the event contents, the object contents, the space contents, and the panorama VR contents to the user terminal and output them.
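For illustration only, steps a) to c) of claim 8 can be sketched as a minimal in-memory pipeline; the class and function names below are hypothetical stand-ins for the units named in the claim, and the sample event data is invented:

```python
class ContentsManager:
    # In-memory stand-in for the contents managing unit of the method.
    def __init__(self) -> None:
        self._store: dict = {}

    def save(self, key: str, contents: dict) -> None:
        self._store[key] = contents

    def serve(self, key: str) -> dict:
        # Step c): return stored contents to an accessing user terminal.
        return self._store[key]

def produce_event_contents(event_data: dict) -> dict:
    # Step b): analyze detected event data and attach a caption.
    return {**event_data, "caption": f"temperature {event_data['temperature_c']} C"}

# Step a): event data detected over the sensor network (sample values).
event_data = {"source": "node-01", "temperature_c": 27.0}
manager = ContentsManager()
manager.save("event-001", produce_event_contents(event_data))
served = manager.serve("event-001")
```

The real system would replace the dictionary store with the compressed-file storage of claim 5 and serve the contents to the viewer program over the local connection or the web.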
9. The method as claimed in claim 8, wherein step b) comprises converting the produced event contents into Html files to display them on the web.
10. The method as claimed in claim 8, wherein the event contents change numerical information and captions according to changes in the object contents, the space contents, and the panorama VR contents.
PCT/KR2007/002362 2006-09-01 2007-05-14 System and method for realizing virtual reality contents of 3-dimension using ubiquitous sensor network WO2008026817A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008558217A JP2009528646A (en) 2006-09-01 2007-05-14 Realization system and realization method of 3D virtual reality content using ubiquitous sensor network

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2006-0084369 2006-09-01
KR1020060084369A KR100661052B1 (en) 2006-09-01 2006-09-01 System and method for realizing virtual reality contents of 3-dimension using ubiquitous sensor network

Publications (1)

Publication Number Publication Date
WO2008026817A1 true WO2008026817A1 (en) 2008-03-06

Family

ID=37815457

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2007/002362 WO2008026817A1 (en) 2006-09-01 2007-05-14 System and method for realizing virtual reality contents of 3-dimension using ubiquitous sensor network

Country Status (3)

Country Link
JP (1) JP2009528646A (en)
KR (1) KR100661052B1 (en)
WO (1) WO2008026817A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101062961B1 (en) * 2009-01-07 2011-09-06 광주과학기술원 System and Method for authoring contents of augmented reality, and the recording media storing the program performing the said method
KR101503994B1 (en) * 2012-02-29 2015-03-19 숭실대학교산학협력단 Socical Commerce Platform System Based Panorama Virtual Reality
US20130314508A1 (en) * 2012-05-25 2013-11-28 Takayuki Arima Management for super-reality entertainment
KR101734655B1 (en) 2015-06-26 2017-05-25 동서대학교산학협력단 360 VR VFX 360 VR content diligence VFX post-production method applied using projection mapping in the manufacturing process
WO2018030784A1 (en) * 2016-08-12 2018-02-15 민상규 Device for user experience using artificial intelligence
KR102567150B1 (en) * 2023-01-10 2023-08-29 (주)메타버즈 A Metaverse Environment Building Method that is Directly Expressed in the Web Page Environment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030056303A * 2001-12-28 2003-07-04 Electronics and Telecommunications Research Institute Immersive virtual environment system based on Internet environment, projection and stepper
KR20040055503A * 2002-12-21 2004-06-26 Electronics and Telecommunications Research Institute Query method in motion database by using 3 dimensional body movement
KR20060003808A * 2004-07-06 2006-01-11 Fujitsu Limited Server system, user terminal, service providing method and service providing system using the server system and the user terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003085116A (en) * 2001-09-07 2003-03-20 Katsutoshi Takifuji Virtual space information system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120188256A1 (en) * 2009-06-25 2012-07-26 Samsung Electronics Co., Ltd. Virtual world processing device and method
US20120146894A1 (en) * 2010-12-09 2012-06-14 Electronics And Telecommunications Research Institute Mixed reality display platform for presenting augmented 3d stereo image and operation method thereof
US10585976B2 (en) 2015-04-24 2020-03-10 Korea Institute Of Science And Technology Device and method for representing HTML elements having 3-dimensional information on web
CN109544675A (en) * 2018-10-11 2019-03-29 广东电网有限责任公司 Threedimensional model status visualization method based on holographic data processing transformer equipment
CN109544675B (en) * 2018-10-11 2023-04-28 广东电网有限责任公司 Three-dimensional model state visualization method based on holographic data processing transformation equipment
CN112738625A (en) * 2020-12-24 2021-04-30 广东九联科技股份有限公司 Video image enhancement method and device based on set top box
CN112738625B (en) * 2020-12-24 2023-03-31 广东九联科技股份有限公司 Video image enhancement method and device based on set top box
WO2023069016A1 (en) * 2021-10-21 2023-04-27 Revez Motion Pte. Ltd. Method and system for managing virtual content

Also Published As

Publication number Publication date
JP2009528646A (en) 2009-08-06
KR100661052B1 (en) 2006-12-22

Similar Documents

Publication Publication Date Title
WO2008026817A1 (en) System and method for realizing virtual reality contents of 3-dimension using ubiquitous sensor network
JP7187446B2 (en) augmented virtual reality
CN101542536A (en) System and method for compositing 3D images
KR101697713B1 (en) Method and apparatus for generating intelligence panorama VR(virtual reality) contents
WO2019017582A1 (en) Method and system for collecting cloud sourcing-based ar content templates and automatically generating ar content
CN113115110B (en) Video synthesis method and device, storage medium and electronic equipment
CN108809800B (en) Multimedia data processing method, equipment and system thereof
KR102435185B1 (en) How to create 3D images based on 360° VR shooting and provide 360° VR contents service
KR100573983B1 (en) System and method for realizing virtual reality contents of 3-dimension
JP2023512131A (en) Apparatus for multi-angle screen coverage analysis
CN117751374A (en) Distributed command execution in a multi-location studio environment
JP2000505219A (en) 3D 3D browser suitable for the Internet
CN109863746B (en) Immersive environment system and video projection module for data exploration
KR101425672B1 (en) Building Information Modeling Based Communication System, Building Information Modeling Based Communication Server, and Building Information Modeling Based Communication Method in Mobile Terminal and Recording Medium Thereof
EP1517328A1 (en) Information editing device, information editing method, and computer program product
KR20240044097A System and method for realizing virtual reality contents of 3-dimension using ubiquitous network
KR20000054155A (en) System for emboding dynamic image of it when selected object in three dimensions imagination space
Hudson-Smith Digital urban-the visual city
KR100370869B1 (en) The method of a three dimensional virtual operating simulation
KR20000050196A (en) Three dimensions imagination system for displaying viewing direction and changing image of object by viewing direction, method for emboding it
KR100403943B1 (en) System for reconstructing and editing image of object in screen of three dimensions imagination space
Giertsen et al. An open system for 3D visualisation and animation of geographic information
KR100283617B1 (en) 3D Virtual Space Construction System
KR20000054149A (en) Imagination traveling system of three dimensions
KR102620333B1 (en) System for Controlling Media Art Content by using Viewing data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07746511

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008558217

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07746511

Country of ref document: EP

Kind code of ref document: A1