CN109840042A - Method for displaying virtual objects in mixed reality, and associated terminal and system - Google Patents

Method for displaying virtual objects in mixed reality, and associated terminal and system

Info

Publication number
CN109840042A
CN109840042A CN201811437333.5A
Authority
CN
China
Prior art keywords
terminal
window
data element
virtual objects
mixed reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811437333.5A
Other languages
Chinese (zh)
Inventor
塞德里克·弗卢里
珍·卡尔蒂尼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orange SA
Original Assignee
Orange SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orange SA filed Critical Orange SA
Publication of CN109840042A publication Critical patent/CN109840042A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14 Display of multiple viewports
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/14 Solving problems related to the presentation of information to be displayed
    • G09G2340/145 Solving problems related to the presentation of information to be displayed related to small screens
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 Aspects of interface with display user

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method of displaying at least one virtual object in mixed reality, the method comprising the following steps performed by a first terminal: a sending step (E410) of sending at least one position data element of a pointer located in a part of a first window of a computer application, the first window being displayed by a second terminal; a receiving step (E420) of receiving at least one data element relating to the part of the first window; and a processing step (E430) of processing the at least one data element relating to the part of the first window, so as to display the at least one virtual object in mixed reality on the basis of the at least one data element relating to the part of the first window.

Description

Method for displaying virtual objects in mixed reality, and associated terminal and system
Technical field
The present invention relates to the general field of mixed reality, sometimes also referred to as hybrid reality, and relates more particularly to a method of displaying at least one virtual object in mixed reality.
Background
Users of terminals such as computers often need to use several computer applications at the same time, for example a word-processing application together with an email application.
However, a terminal screen of small size often cannot display the windows of all of the applications in use simultaneously in a manner that enables the user to see them clearly.
For example, the display of some applications may be reduced to an icon, in which case the user no longer has any visual feedback from those applications and needs to perform a control operation in order to display them again.
There is therefore a need for a solution that improves the simultaneous display of multiple applications.
Summary of the invention
The present invention provides a method of displaying at least one virtual object in mixed reality, the method comprising the following steps performed by a first terminal:
a sending step of sending at least one position data element of a pointer located in a part of a first window of a computer application, the first window being displayed by a second terminal;
a receiving step of receiving at least one data element relating to the part of the first window; and
a processing step of processing the at least one data element relating to the part of the first window, so as to display the at least one virtual object in mixed reality on the basis of the at least one data element relating to the part of the first window.
The user of the first terminal can thus extend the display of the computer applications in use beyond the display device of the second terminal. Several virtual objects corresponding to those applications can be displayed simultaneously.
In a particular embodiment, the at least one virtual object is a second window of the computer application.
In a particular embodiment, the method further comprises the following steps performed by the first terminal:
an obtaining step of obtaining at least one position data element, in the real environment in which the first terminal and the second terminal are located, of the display device of the second terminal displaying the first window; and
an obtaining step of obtaining at least one position data element of the pointer, the at least one position data element of the pointer being a position data element of the pointer relative to the display device.
In a particular embodiment, the sending step is performed upon detecting a command to display the at least one virtual object.
In a particular embodiment, the receiving step comprises receiving:
at least one capture of the part of the first window, in digital image format; and/or
at least one context data element relating to the part of the first window; and/or
at least one metadata element relating to the part of the first window.
In a particular embodiment, the first terminal is a mixed reality head-mounted display or a pair of mixed reality glasses.
In a particular embodiment, the method further comprises a reduction step, performed by the second terminal, of reducing the part of the first window.
In a particular embodiment, the method further comprises a determination step, performed by the second terminal, of determining the part of the first window from the at least one position data element of the pointer.
The invention also provides a mixed reality terminal adapted to perform the method described above.
The invention further provides a system adapted to perform the method described above and including a terminal as described above.
In certain embodiments, the various steps of the mixed reality display method of the invention for displaying at least one virtual object are determined by computer program instructions.
The invention therefore also provides a computer program on a data medium, the program including instructions adapted to perform the steps of the mixed reality display method of the invention for displaying at least one virtual object.
The program may use any programming language, and it may be in the form of source code, object code, or code intermediate between source code and object code, such as a partially compiled form, or in any other desired form.
The invention also provides a computer-readable data medium including the instructions of a computer program as described above.
The data medium may be any entity or device capable of storing the program. For example, the medium may comprise a storage device such as a read-only memory (ROM), e.g. a compact disc (CD) ROM or a microelectronic-circuit ROM, or indeed a magnetic recording device such as a hard disk.
Alternatively, the data medium may be a transmissible medium such as an electrical or optical signal that can be conveyed via an electrical or optical cable, by radio, or by other means. In particular, the program of the invention may be downloaded from an Internet-type network.
Optionally, the data medium may be an integrated circuit in which the program is incorporated, the circuit being adapted to perform, or to be used in the performance of, the method in question.
Brief description of the drawings
Other features and advantages of the present invention appear from the following description given with reference to the accompanying drawings, which show embodiments having no limiting character. In the figures:
Figure 1 is a diagram of a mixed reality display system for displaying at least one virtual object in an embodiment of the invention;
Figures 2A and 2B are diagrams showing a first terminal and a second terminal of a mixed reality display system for displaying at least one virtual object in an embodiment of the invention;
Figures 3A and 3B are diagrams respectively of the first terminal and of the second terminal of the system of Figure 1;
Figures 4 and 5 are flow charts showing the main steps of mixed reality display methods for displaying at least one virtual object in embodiments of the invention; and
Figure 6 is a diagram showing the screen of the second terminal of a mixed reality display system for displaying at least one virtual object in an embodiment of the invention.
Detailed description of embodiments
Figure 1 is a diagram showing a mixed reality display system 100 for displaying at least one virtual object, the system being adapted to perform a mixed reality display method for displaying at least one virtual object in an embodiment of the invention.
Mixed reality is a technology that enables a virtual world to be merged with the real world in order to produce a new environment and displays in which physical objects of the real world and digital objects of the virtual world coexist and can interact, for example in real time. Mixed reality makes it possible to display virtual objects by means of a display device so that they are superimposed on the real world.
The system 100 comprises a first terminal 110 and may also comprise a second terminal 120. The first terminal 110 and the second terminal 120 can communicate with each other via a telecommunications network 130, for example an Internet-type network (e.g. WiFi), a Bluetooth network, or a fixed or mobile telephone network (of 3G, 4G or similar type).
Figure 2A shows an example of the first terminal 110 and the second terminal 120, in which the first terminal 110 is a pair of mixed reality glasses worn by a user and the second terminal 120 is a computer. The screen of the second terminal displays the windows F1 and F2 of two computer applications. The user U of the first terminal 110 and of the second terminal 120 can position the pointer P of the first terminal 110 in a part of one of the two windows F1 by gazing at that part.
The user U can also perform a movement of grabbing that part of the first window F1, the grab movement being interpreted by the first terminal 110 as a command to display that part of the first window F1.
The second terminal 120 can then send data relating to the pointed-at window F1 to the first terminal 110, so that the first terminal 110 can display a new window F3 on the basis of that data (see Figure 2B). The second terminal 120 can then stop displaying the window F1, or can reduce the window F1.
The user U can then move the new window F3 freely. The user can also interact with the new window F3.
The first terminal 110 is typically a pair of mixed reality glasses, such as "Microsoft HoloLens" (registered trademark) glasses, or a mixed reality head-mounted display. In a variant, the first terminal 110 may be a mobile terminal, such as a smartphone-type portable telephone or a digital tablet.
The second terminal 120 is, for example, a fixed or mobile terminal such as a computer, typically a personal computer, a smartphone-type mobile telephone, a digital tablet, or a digital television.
The first terminal 110 comprises a pointer module, a sender module for sending position data, a receiver module for receiving first-window-part data, and a processor module for processing real-environment data and first-window-part data. In addition, the first terminal 110 may comprise an obtaining module for obtaining position data, an acquisition module for acquiring real-environment data, and/or a display device for displaying virtual objects.
In one example, the display device is a transparent or semi-transparent screen on which virtual objects can be displayed, the virtual objects typically taking the form of holograms. The transparent screen can be positioned in front of the user's eyes so that the user sees both the virtual objects displayed on the screen and the real environment through the screen. A transparent screen of this kind is typically mounted on mixed reality glasses or on a mixed reality head-mounted display.
In another example, the display device is a screen capable of displaying images of the real world, the images typically being acquired by the acquisition module, with virtual objects being superimposable on those images.
In examples, the acquisition module is a moving or stationary camera and/or a depth sensor.
The sender module and/or the receiver module may comprise one or more remote communication submodules, e.g. a WiFi submodule, and/or one or more short-range communication submodules, such as a near-field communication (NFC) submodule or a Bluetooth submodule.
The second terminal 120 may further comprise a display device, a receiver module for receiving position data, a processor module for processing position data, a capture module for capturing the first window part, a storage module for storing first-window-part data, a reduction module for reducing the first window part, and/or a sender module for sending first-window-part data.
As shown in Figure 3A, the first terminal 110 presents the conventional architecture of a computer. In particular, the first terminal 110 comprises a processor 300, a ROM 302, a rewritable non-volatile memory 304 (e.g. of electrically erasable programmable read-only memory (EEPROM) type or "NAND flash" type), a rewritable volatile memory 306 (of random access memory (RAM) type), and a communication interface 308.
In an embodiment of the invention, the ROM 302 of the first terminal 110 constitutes a data medium that can be read by the processor 300 and that stores a computer program P1. In a variant, the computer program P1 is stored in the rewritable non-volatile memory 304.
The computer program P1 defines functional and software modules that, in this example, are configured to perform the steps of a mixed reality display method for displaying at least one virtual object in an embodiment of the invention. These functional modules rely on or control the above-mentioned hardware elements 300, 302, 304, 306 and 308 of the first terminal 110. In particular, in this example they comprise the pointer module, the sender module, the receiver module, and the processor module, and may also comprise the obtaining module and/or the acquisition module.
In addition, as shown in Figure 3B, the second terminal 120 likewise presents the conventional architecture of a computer, comprising a processor 310, a ROM 312, a rewritable non-volatile memory 314 (e.g. of EEPROM or NAND flash type), a rewritable volatile memory 316 (of RAM type), and a communication interface 318.
As with the first terminal 110, in an embodiment of the invention the ROM 312 (or the rewritable non-volatile memory 314) of the second terminal 120 constitutes a data medium that can be read by the processor 310 and that stores a computer program P2.
The computer program P2 defines functional and software modules that, in this example, are configured to implement the steps of a mixed reality display method for displaying at least one virtual object in an embodiment of the invention. In particular, they comprise the receiver module, the processor module, the capture module, the storage module, the reduction module, and the sender module.
The functions of these various modules are described in detail below with reference to the method steps described with reference to Figures 4 and 5.
Figure 4 shows a mixed reality method of displaying at least one virtual object in an embodiment of the invention.
The method is performed by a first terminal, for example the first terminal 110 described with reference to Figures 1 and 3A.
In a step E410, the sender module of the first terminal 110 sends at least one position data element DP of the pointer P located in a part of a first window F1 of a computer application, the first window F1 being displayed by the second terminal 120.
In a step E420, the receiver module of the first terminal 110 receives at least one data element DPF1 relating to the part of the first window F1.
In a step E430, the processor module processes the at least one data element DPF1 relating to the part of the first window F1, so as to display a virtual object in mixed reality on the basis of the at least one data element DPF1.
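Purely as an illustrative sketch (the transport, the function names, and the example values are assumptions and are not part of the patent), the three steps E410, E420 and E430 on the first terminal's side can be pictured as follows:

```python
# Hypothetical sketch of the first terminal's side of the method.
# `send`, `receive` and `render` stand in for the sender module, the
# receiver module, and the processor/display modules respectively.

def display_virtual_object(send, receive, render):
    """E410: send the pointer position data element DP;
    E420: receive the data element(s) DPF1 for the window part;
    E430: process DPF1 and display the virtual object."""
    send({"pointer": {"HAP": 0.20, "LAP": 0.25}})  # E410 (example values)
    dpf1 = receive()                               # E420
    return render(dpf1)                            # E430
```

The stubs make the ordering of the exchange explicit; any real implementation would route `send`/`receive` over the telecommunications network 130.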
Figure 5 shows a mixed reality display method for displaying at least one virtual object in another embodiment of the invention.
The method is performed by a mixed reality display system for displaying at least one virtual object, for example the system 100 described with reference to Figure 1.
In a step F500, the display device of the second terminal 120 displays the first window F1, possibly together with one or more other windows F2.
In a step E502, the first terminal 110 obtains one or more position data elements DPE, in the real environment ER in which the first terminal 110 and the second terminal 120 are located, of the display device of the second terminal 120 displaying the first window F1. This step E502 is performed by the acquisition module and the obtaining module of the first terminal 110.
The term "real environment" ER designates the space in which the second terminal 120 and the first terminal 110 are located during the obtaining step E502. Typically, the real environment ER takes the form of a room in a building.
The acquisition module of the first terminal 110 scans the real environment ER in three dimensions. The obtaining module then models a virtual environment EV in three dimensions, using three-dimensional coordinates, on the basis of the data from the acquisition module.
The term "virtual environment" EV is used herein to designate a digital space associated with a frame of reference (typically a rectangular frame of reference) representing the real environment ER in which the first terminal 110 and the second terminal 120 are located during the obtaining step E502, the representation being expressed in three-dimensional coordinates.
The modeling makes it possible to obtain one or more position data elements DPE of the display device of the second terminal 120. More precisely, during the modeling, the obtaining module can obtain one or more sets of three-dimensional coordinates of the display device of the second terminal 120 in the virtual environment EV corresponding to the real environment ER.
In the example where the display device is a rectangular screen E, the three-dimensional coordinates of two opposite corners of the screen E can be obtained, for example the bottom-left corner A and the top-right corner B of the screen E (see Figure 6), or indeed the top-left and bottom-right corners. The height HAB and the width LAB of the screen E can then be derived from the three-dimensional coordinates of the two opposite corners of the screen E.
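As one possible illustration of that derivation (the axis convention and function name are assumptions; the patent only states that HAB and LAB can be derived from the two corners), with the vertical direction of the virtual environment taken as the z axis:

```python
# Hypothetical derivation of the screen's width LAB and height HAB from
# the 3D coordinates of the bottom-left corner A and top-right corner B,
# assuming the vertical direction of the virtual environment is the z axis.

def screen_dimensions(corner_a, corner_b):
    """Return (LAB, HAB) of screen E from corners A and B."""
    ax, ay, az = corner_a
    bx, by, bz = corner_b
    hab = abs(bz - az)                              # height HAB
    lab = ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5  # width LAB
    return lab, hab
```

If the screen were tilted relative to the vertical, the projection onto the screen plane's own axes would be needed instead; this sketch covers only the upright case.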
The user of the first terminal 110 positions the pointer P of the pointer module of the first terminal 110 on a part PF1 of the first window F1.
Then, in a step E504, the first terminal 110 can obtain at least one position data element DP giving the position of the pointer, the pointer position data element DP possibly being a data element giving the position of the pointer relative to the display device. This step E504 may be performed by the obtaining module of the first terminal 110, and possibly also by the acquisition module.
More precisely, the acquisition module of the first terminal 110 can obtain one or more sets of three-dimensional coordinates of the pointer P in the virtual environment EV corresponding to the real environment ER. The sets of three-dimensional coordinates obtained can be used to calculate the position data element DP of the pointer relative to the display device.
In the example of Figure 6, in which the display device is a rectangular screen E, several position data elements DP of the pointer P relative to a particular point of the screen can be obtained, for example a vertical component HAP and a horizontal component LAP corresponding to the algebraic distances of the pointer P from a point located at a corner of the screen (usually the origin of the rectangular frame of reference), typically the bottom-left corner A of the screen.
Then, in a step E510, the sender module of the first terminal 110 sends the position data element DP of the pointer P, which can be received by the receiver module of the second terminal 120 (step F510).
This sending step E510 may be performed when the first terminal 110 detects a command C to display a virtual object (step E506).
In this example, the display command C is a gesture made by the user U, typically a grab movement, for example a movement of grabbing the part PF1 of the first window F1, as shown in Figure 2A. The acquisition module of the first terminal 110 can then acquire one or more images or a video including the user's gesture. The processor module of the first terminal 110 can then analyze the images or video in order to detect the gesture.
In another example, the display command C is a voice command detected by analyzing audio data acquired by the first terminal 110.
In another example, the display command C is an action on a control peripheral of the first terminal 110, which peripheral may for example be a button or a touch surface. When the first terminal 110 is a pair of mixed reality glasses, the peripheral may be located on a temple arm of the glasses.
After the receiver module of the second terminal 120 has received the position data element DP of the pointer P (step F510), the processor module of the second terminal 120 acts, in a step F512, to determine the part PF1 of the first window F1 on the basis of the position data element DP of the pointer P.
In examples, the part PF1 of the first window F1 is a pane, a bar, or a tab of the first window F1. The part PF1 of the first window F1 is thus typically a navigation tab or an address bar. In another example, the part PF1 of the first window F1 consists of the entirety of the first window F1.
More precisely, in a substep of step F512, the processor module compares the position data element DP with one or more position data elements of one or more parts of each window displayed by the display device of the second terminal 120. The processor module can then find the part PF1 of the first window F1 on which the user has placed the pointer P of the pointer module of the first terminal 110.
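The comparison in this substep amounts to a hit test. A minimal sketch (the rectangle representation and the names are assumptions chosen for illustration):

```python
# Hypothetical hit test for substep F512: given the pointer's on-screen
# position and the bounding rectangles of the parts (panes, bars, tabs...)
# of each displayed window, return the part containing the pointer.

def find_window_part(pointer_xy, parts):
    """parts maps a part identifier to (x_min, y_min, x_max, y_max)."""
    x, y = pointer_xy
    for part_id, (x0, y0, x1, y1) in parts.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return part_id
    return None  # pointer is outside every known window part
```

With overlapping windows, a real window manager would test parts in z-order (topmost first); the sketch assumes non-overlapping rectangles.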
This comparison may optionally be preceded by a substep of determining the position of the pointer P on the screen.
In the example of Figure 6, in which the display device is a rectangular screen E, the receiver module can receive in step F510 the height HAB of the screen E, the width LAB of the screen E, the vertical component HAP corresponding to the algebraic distance of the pointer P from a point located at a corner of the screen E (usually the bottom-left corner), and the horizontal component LAP corresponding to the algebraic distance of the pointer P from the same point. The coordinates (X, Y) of the pointer P can then be obtained from the following formulas:
X = Rx * LAP / LAB; and
Y = Ry * HAP / HAB;
where Rx and Ry denote the resolution of the screen.
Thus, the coordinates (X, Y) of the pointer P can be obtained even though the first terminal 110 does not know the resolution of the screen of the second terminal 120.
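Applying the two formulas on the second terminal's side, which knows its own resolution Rx x Ry, gives a direct computation; a minimal sketch (variable names are illustrative):

```python
# Hypothetical application of X = Rx*LAP/LAB and Y = Ry*HAP/HAB.
# The second terminal knows its own resolution (rx, ry); the first
# terminal only needs to send HAP, LAP, HAB and LAB, all measured in
# the virtual environment's units (e.g. metres).

def pointer_pixel_coords(hap, lap, hab, lab, rx, ry):
    """Return the pointer's pixel coordinates (X, Y) on screen E."""
    x = rx * lap / lab
    y = ry * hap / hab
    return x, y
```

For example, with a 1920 x 1080 screen 0.5 m wide and 0.4 m tall, a pointer 0.25 m to the right of and 0.2 m above the bottom-left corner maps to the centre of the screen.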
In a kind of modification, step E502, E504, E506, E510, F510 and/or F512, and second terminal are not executed 120 processor module determines according to one or more position data elements of the pointer of the pointer module of second terminal 120 A part of PF1 of one window F1.In this variant, the pointer module of second terminal may, for example, be computer mouse.
Once it is determined that a part of PF1 of first window F1, so that it may by the handling module of second terminal 120 with number A part of PF1 (step F514) of picture format crawl first window F1.
Moreover, one or more context data element DC relevant to a part of PF1 of first window F1 and/or with The relevant one or more associated metadata elements MET of a part of PF1 of one window F1, can be by the memory module of second terminal 120 It is stored in the rewritable nonvolatile memory 314 of second terminal 120 (step F516).
Each context data element DC can be the data element of description computer applied algorithm, and therefore can refer to Show the type, title and version number of computer applied algorithm.Each context data element also can indicate that display size, language Speech, used connection type etc..
For text processor application program, each context data element is also possible to describe the data element of document, And it can for example specify the author of document.
In addition, each associated metadata elements MET can be a part of PF1 shown about the display equipment of second terminal 120 Data element.Therefore, for word processor application, each associated metadata elements may include first window this one The current page number of the text, display that are shown in part, selected font, font size, Contents of clipboard etc..
Moreover, in step F518, the reduction module of the second terminal 120 reduces the portion of the first window F1 displayed by the display device of the second terminal 120. When the portion PF1 of the first window F1 consists of the entire first window F1, the first window F1 is reduced to the size of an icon displayed by the display device of the second terminal 120. In one variant, the second terminal 120 stops displaying the first window F1.
In step F520, the sending module of the second terminal 120 sends one or more data elements DPF1 relating to the portion PF1 of the first window F1 determined in step F512. For example, during step F520, the grab, in digital image format, of the portion PF1 of the first window F1 and/or the context data elements and/or the metadata elements can be sent.
After the receiving module of the first terminal 110 has received the data DPF1 relating to the portion PF1 of the first window F1 (step E520), the processor module of the first terminal 110 runs in step E530 to process the received data elements DPF1, so as to display a virtual object OV in mixed reality on the basis of the data elements DPF1. The three-dimensional coordinates of the virtual object OV can be defined during processing step E530.
In step E532, the display device of the first terminal 110 displays the virtual object OV.
More precisely, the display device of the first terminal 110 can display the virtual object OV in such a way that the user U of the first terminal 110 sees, on the first terminal 110, the virtual object OV superimposed on the real environment ER.
When the first terminal 110 is a pair of mixed-reality glasses or a mixed-reality head-mounted display, the display device is a transparent or translucent screen on which the virtual object OV is displayed in step E532. The screen is mounted on the glasses or head-mounted display so that, when the user U wears the glasses or head-mounted display, the user sees through the screen both the virtual object OV displayed on the screen and the real environment ER.
When the first terminal 110 is a mobile terminal such as a mobile telephone or a digital tablet, the display device is a screen which runs in step E532 to display the virtual object OV superimposed on an image of the real environment ER, typically acquired by the camera of the mobile terminal.
For example, the virtual object OV is displayed in such a way that, for the user U of the first terminal 110, it appears spaced apart from the display surface of the display device of the second terminal 120. The virtual object OV can thus appear somewhere around that display surface, for example above, below, to the left and/or to the right of it. The three-dimensional coordinates of the virtual object OV can then be defined, during processing step E530, as a function of the position data elements DPE of the display device of the second terminal 120 obtained in step E502.
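This placement, where OV appears offset from the second terminal's display surface, can be sketched as a simple function of the display's position data DPE (an illustration under the assumption that DPE provides the display's center and size in the first terminal's 3D frame; the axes, units and parameter names are hypothetical):

```python
def place_virtual_object(display_center, display_size, side="above", gap=0.5):
    """Derive 3D coordinates for the virtual object OV so that it appears
    next to the display surface of the second terminal, offset by `gap`.

    display_center: (x, y, z) of the display surface's center.
    display_size:   (width, height) of the display surface.
    side: "above", "below", "left" or "right" of the display.
    """
    x, y, z = display_center
    w, h = display_size
    offsets = {
        "above": (0.0, h / 2 + gap, 0.0),
        "below": (0.0, -(h / 2 + gap), 0.0),
        "left": (-(w / 2 + gap), 0.0, 0.0),
        "right": (w / 2 + gap, 0.0, 0.0),
    }
    dx, dy, dz = offsets[side]
    return (x + dx, y + dy, z + dz)

# OV placed just above a 1.5 x 1.0 display sitting 2 units in front of U.
ov_position = place_virtual_object((0.0, 0.0, -2.0), (1.5, 1.0), side="above")
```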
In one example, the displayed virtual object OV is the grab, in digital image format, of the portion PF1 of the first window. In one variant, the virtual object OV can be a video.
In another example, the processor module creates the virtual object OV on the basis of the context data elements and/or the metadata elements. During the time taken by this processing, i.e. before the virtual object OV is displayed, the display device of the first terminal 110 can display the grab of the portion PF1 of the first window.
The virtual object can thus be, for example, a second window of a computer application program suitable for display in mixed reality (see for example the window F3 of Figure 2B), typically in hologram form, or a three-dimensional representation (typically in hologram form) corresponding to the portion PF1 of the first window and provided by a service of a computer application program suitable for display in mixed reality.
When the virtual object OV is a second window, the processor module displays the one or more metadata elements and can use the one or more context data elements to define the shape and ergonomics of the second window. Thus, in the example where the portion PF1 of the first window F1 is a window, pane, banner or navigation tab, the virtual object OV can be a second window that resembles the portion PF1 and displays substantially the same content as the portion PF1.
In other examples, the content, shape and/or ergonomics of the virtual object OV can differ from the content, shape and/or ergonomics of the portion PF1. Thus, when the portion PF1 of the first window F1 is a two-dimensional image, the virtual object OV can be a picture frame containing that image.
In step E534, the user U can interact with the virtual object OV via the first terminal 110. This interaction can include moving the virtual object OV and/or modifying the content of the virtual object.
For example, if the virtual object OV is a video, modifying its content can consist in playing the video back.
In another example, when the virtual object OV is a second word-processor window, the user can modify its content using a virtual keyboard of the first terminal 110.
The content modifications can then be sent to the second terminal 120 for synchronization purposes.
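The synchronization mentioned above can be pictured as sending back only the modification itself, which the second terminal then applies to its own copy of the content (a much-simplified sketch; a real system would also have to address transport and concurrent edits):

```python
def apply_modification(content: str, position: int, inserted: str) -> str:
    """Apply a text insertion, received from the first terminal, to the
    second terminal's copy of the shared content."""
    return content[:position] + inserted + content[position:]

# The first terminal edits the second word-processor window via its virtual
# keyboard; only (position, inserted text) is sent to the second terminal.
synced = apply_modification("Hello world", position=5, inserted=",")
```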
The steps of this method can be repeated in order to display several virtual objects OV.

Claims (12)

1. A method of displaying at least one virtual object (OV, F3) in mixed reality, characterized in that the method comprises the following steps executed by a first terminal (110):
a sending step (E410, E510) of sending at least one position data element (DP) of a pointer (P) positioned on a portion (PF1) of a first window (F1) of a computer application program, the first window (F1) being displayed by a second terminal (120);
a receiving step (E420, E520) of receiving at least one data element (DPF1) relating to the portion (PF1) of the first window (F1); and
a processing step (E430, E530) of processing the at least one data element (DPF1) relating to the portion (PF1), so as to display the at least one virtual object (OV, F3) in mixed reality on the basis of the at least one data element (DPF1) relating to the portion (PF1).
2. The method according to claim 1, wherein the at least one virtual object is a second window (F3) of the computer application program.
3. The method according to claim 1 or claim 2, further comprising the following steps executed by the first terminal (110):
an obtaining step (E502) of obtaining at least one position data element (DPE) of the display device (E) of the second terminal (120) displaying the first window (F1), in the real environment (ER) in which the first terminal (110) and the second terminal (120) are positioned; and
an obtaining step (E504) of obtaining at least one position data element (DP) of the pointer (P), the at least one position data element (DP) of the pointer (P) being a position data element (DP) of the pointer (P) relative to the display device (E).
4. The method according to any one of claims 1 to 3, wherein the sending step (E510) is executed when a command (C) to display the at least one virtual object (OV, F3) is detected (E506).
5. The method according to any one of claims 1 to 4, characterized in that the receiving step comprises receiving:
at least one grab, in digital image format, of the portion (PF1) of the first window (F1); and/or
at least one context data element relating to the portion (PF1) of the first window (F1); and/or
at least one metadata element relating to the portion (PF1) of the first window (F1).
6. The method according to any one of claims 1 to 5, wherein the first terminal (110) is a mixed-reality head-mounted display or mixed-reality glasses.
7. The method according to any one of claims 1 to 6, further comprising a reduction step (F518), executed by the second terminal (120), of reducing the portion (PF1) of the first window (F1).
8. The method according to claim 7 and any one of claims 2 to 4, further comprising a determination step (F512), executed by the second terminal (120), of determining the portion (PF1) of the first window (F1) from the at least one position data element (DP) of the pointer (P).
9. A mixed-reality terminal (110), characterized in that the mixed-reality terminal (110) is adapted to carry out the method according to any one of claims 1 to 6.
10. A computer program (P1), characterized in that the computer program (P1) comprises instructions for executing the steps of the method according to any one of claims 1 to 6 when the program (P1) is executed by a computer.
11. A computer-readable data medium, characterized in that the computer-readable data medium stores a computer program (P1) comprising instructions for executing the steps of the method according to any one of claims 1 to 6.
12. A system (100), characterized in that the system (100) is adapted to carry out the method according to claim 7 or claim 8, and comprises the terminal (110) according to claim 9.
CN201811437333.5A 2017-11-28 2018-11-28 Method for displaying virtual objects in mixed reality, and associated terminal and system Pending CN109840042A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1761321 2017-11-28
FR1761321A FR3074331A1 (en) 2017-11-28 2017-11-28 METHOD FOR DISPLAYING MIXED REALITY OF AT LEAST ONE VIRTUAL OBJECT, TERMINAL AND ASSOCIATED SYSTEM

Publications (1)

Publication Number Publication Date
CN109840042A true CN109840042A (en) 2019-06-04

Family

ID=60955308

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811437333.5A Pending CN109840042A (en) Method for displaying virtual objects in mixed reality, and associated terminal and system

Country Status (4)

Country Link
US (1) US10810801B2 (en)
EP (1) EP3489811B1 (en)
CN (1) CN109840042A (en)
FR (1) FR3074331A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3067546A1 (en) * 2017-06-19 2018-12-14 Orange METHODS OF OPERATOR IDENTIFICATION OF EMBRITTING FRAMES, AND OPERATOR MEMBERSHIP VERIFICATION, COMMUNICATION DEVICE AND COMMUNICATION GATEWAY
TWI790430B (en) * 2020-04-13 2023-01-21 宏碁股份有限公司 Augmented reality system and method for displaying virtual screen using augmented reality glasses
US11175791B1 (en) * 2020-09-29 2021-11-16 International Business Machines Corporation Augmented reality system for control boundary modification
JP2022137622A (en) * 2021-03-09 2022-09-22 キヤノン株式会社 Wearable terminal, control method for the same, and program
US20230376161A1 (en) * 2022-05-19 2023-11-23 Microsoft Technology Licensing, Llc Mouse cursor and content migration between 3d space and physical flat displays

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9858720B2 (en) * 2014-07-25 2018-01-02 Microsoft Technology Licensing, Llc Three-dimensional mixed-reality viewport

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010257123A (en) * 2009-04-23 2010-11-11 Sony Corp Information processing device, information processing method, and program
US20120032955A1 (en) * 2009-04-23 2012-02-09 Kouichi Matsuda Information processing apparatus, information processing method, and program
US20150205106A1 (en) * 2014-01-17 2015-07-23 Sony Computer Entertainment America Llc Using a Second Screen as a Private Tracking Heads-up Display
US20160147492A1 (en) * 2014-11-26 2016-05-26 Sunny James Fugate Augmented Reality Cross-Domain Solution for Physically Disconnected Security Domains

Also Published As

Publication number Publication date
EP3489811B1 (en) 2023-08-16
US20190164347A1 (en) 2019-05-30
EP3489811A1 (en) 2019-05-29
FR3074331A1 (en) 2019-05-31
US10810801B2 (en) 2020-10-20

Similar Documents

Publication Publication Date Title
US11144177B2 (en) Application execution method by display device and display device thereof
CN109840042A (en) Method for displaying virtual objects in mixed reality, and associated terminal and system
US11320957B2 (en) Near interaction mode for far virtual object
US11042294B2 (en) Display device and method of displaying screen on said display device
CN109164964B (en) Content sharing method and device, terminal and storage medium
EP3129871B1 (en) Generating a screenshot
US20190340821A1 (en) Multi-surface object re-mapping in three-dimensional use modes
EP3686723A1 (en) User terminal device providing user interaction and method therefor
US20140362002A1 (en) Display control device, display control method, and computer program product
US20150063785A1 (en) Method of overlappingly displaying visual object on video, storage medium, and electronic device
US9519355B2 (en) Mobile device event control with digital images
US10198831B2 (en) Method, apparatus and system for rendering virtual content
US20160070460A1 (en) In situ assignment of image asset attributes
CN113806054A (en) Task processing method and device, electronic equipment and storage medium
WO2016178850A1 (en) Gpu operation for content rendering
KR20190063853A (en) Method and apparatus for moving an input field
KR20210083016A (en) Electronic apparatus and controlling method thereof
US11093041B2 (en) Computer system gesture-based graphical user interface control
CN112789830A (en) A robotic platform for multi-mode channel-agnostic rendering of channel responses
KR101730381B1 (en) Method, system and non-transitory computer-readable recording medium for controlling scroll based on context information
EP3635527B1 (en) Magnified input panels
KR102496603B1 (en) Method for selecting location to execution screen of application
CN117575882A (en) Method and device for updating vertex buffer area and storage medium
JP2013046410A (en) Method for browsing and/or executing instructions via information-correlated and instruction-correlated image and storage medium therefor
WO2019058539A1 (en) Program, information processing method, and information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination