CN110140100A - Three-dimensional augmented reality object user interface functionality - Google Patents

Three-dimensional augmented reality object user interface functionality

Info

Publication number
CN110140100A
CN110140100A (application CN201880005791.3A)
Authority
CN
China
Prior art keywords
dimensional
cube
movement
translation
user interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201880005791.3A
Other languages
Chinese (zh)
Other versions
CN110140100B (en)
Inventor
F·A·里昂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Morgi Laboratory Ltd
Merge Labs Inc
Original Assignee
Morgi Laboratory Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Morgi Laboratory Ltd
Priority claimed from PCT/US2018/012110 (WO2018126281A1)
Publication of CN110140100A
Application granted
Publication of CN110140100B
Legal status: Active (anticipated expiration tracked)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/0317 Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]

Abstract

The invention discloses an apparatus comprising a processor, memory, and a three-dimensional object bearing at least two unique fiducial markers. The processor executes instructions that cause it to generate a three-dimensional environment including a user interface element for interaction, to detect rotational motion of the three-dimensional physical object using the at least two unique fiducial markers, and to update the user interface element in the three-dimensional environment based on the rotational motion of the three-dimensional physical object.
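The core loop claimed in the abstract, detecting a rotation from tracked fiducial markers and propagating it to a user interface element, can be sketched as follows. This is a minimal two-dimensional illustration under my own assumptions (the marker positions are hypothetical detector outputs, and `DialElement` is an invented stand-in for the patent's user interface element), not the patent's implementation:

```python
import math

def marker_angle(m1, m2):
    """Angle (radians) of the line joining two detected marker centers."""
    return math.atan2(m2[1] - m1[1], m2[0] - m1[0])

def rotation_between(prev, curr):
    """Rotation of the object between two frames, inferred from two tracked markers."""
    delta = marker_angle(*curr) - marker_angle(*prev)
    # Normalize to (-pi, pi] so a small physical turn never reads as a huge one.
    while delta <= -math.pi:
        delta += 2 * math.pi
    while delta > math.pi:
        delta -= 2 * math.pi
    return delta

class DialElement:
    """Hypothetical UI element whose value tracks the object's rotation."""
    def __init__(self):
        self.value = 0.0

    def update(self, rotation_rad):
        self.value += math.degrees(rotation_rad)

# Two marker centers at frame t and frame t+1: the object rotated 90 degrees.
prev = [(0.0, 0.0), (1.0, 0.0)]
curr = [(0.0, 0.0), (0.0, 1.0)]
dial = DialElement()
dial.update(rotation_between(prev, curr))
print(round(dial.value, 1))  # → 90.0
```

A real system would estimate a full 3D pose from the markers rather than a planar angle, but the update path (detect markers, difference the pose, apply to the UI element) is the same shape.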

Description

Three-dimensional augmented reality object user interface functionality
Copyright and trade dress notice
A portion of the disclosure of this patent document contains material which is subject to copyright protection. This patent document may show and/or describe matter which is or may become trade dress of the owner. The copyright and trade dress owner has no objection to the facsimile reproduction by anyone of the patent disclosure as it appears in Patent and Trademark Office patent files or records, but otherwise reserves all copyright and trade dress rights whatsoever.
Cross-reference to related applications
This patent claims priority to the following provisional patent applications:
U.S. Provisional Patent Application No. 62/441,525, filed January 2, 2017, entitled "Augmented Reality Fiducial Markers," which is incorporated herein by reference.
U.S. Provisional Patent Application No. 62/469,292, filed March 9, 2017, entitled "Three-Dimensional Augmented Reality Objects and Related Functions," which is incorporated herein by reference.
Technical field
This disclosure relates to augmented and virtual reality and, more particularly, to user interfaces and interaction with augmented reality and virtual reality environments and objects.
Background technique
Since approximately 2012, augmented and virtual reality have become ubiquitous in news and technology reporting. Over the past thirty to forty years, however, both have gone through recurring cycles of popularity: interest surges for a few years, gradually fades, and then revives again some years later. The main reason the technology generates excitement but cannot sustain it is its excessive cost.
Augmented reality (AR) is a blend of the real world and virtual elements generated by a computer system. This blending may occur in the visual, audio, or haptic fields perceived by the user. AR has proven useful in a wide range of applications, including sports, entertainment, advertising, tourism, and education. As the technology advances, it is expected to see more and more use in those fields and in a broad range of others.
Films and media of the 1980s and 1990s added luster to the coming technological revolution that virtual reality was expected to bring. But the systems necessary to use virtual reality typically cost several thousand dollars. As a result, the public never widely adopted the technology.
Even now, when ubiquitous motion sensors and high-quality small screens have been dramatically reduced in price by modern smartphones, virtual reality and augmented reality remain relatively obscure. One key problem remains the public's practical use of virtual and augmented reality at scale: how do people interact with a virtual or augmented reality environment? In the not-too-distant future, everyone may own haptic suits and haptic gloves that provide physical feedback simulating the virtual or augmented reality environment the user occupies or experiences. But systems of that type are still years away.
The most common interaction systems today are hand-held controllers. The problem with these systems is that they all cost hundreds of dollars and, in general, the controllers are not included in the cost of the associated headset. Nor do those prices include the cost of the computer required to use them. As a result, a user who expects to do more than merely "see" virtual or augmented reality must pay on the order of one thousand to several thousand dollars to enjoy a complete augmented reality or virtual reality experience.
What is needed is an inexpensive but high-precision system or device that can be tracked using widely available technology so as to become a controller or user interface extension for augmented and virtual reality devices. Phone-based AR and VR systems have attempted this by including a one-button remote control in their overall package. At prices below 100 dollars, these are certainly more attractive to general audiences, but still too expensive for most of the public. Better, cheaper technology should be possible, and should provide a high-quality user experience enabling detailed interaction across multiple devices. Such technology should also serve as a detailed control scheme for augmented and virtual reality environments, without requiring complicated systems nested with trackers.
Description of the drawings
Fig. 1 is a system for interacting with an augmented reality environment using a three-dimensional object;
Fig. 2 is an example set of sides of a cube that may be used to interact with an augmented reality environment;
Fig. 3, composed of Figs. 3A-3H, is a series of cubes, each including different elements that may be used to interact with an augmented reality environment;
Fig. 4 is a flowchart of a process for interacting with an augmented reality environment;
Fig. 5 is a flowchart of a process for dynamically updating a three-dimensional object in response to changes in an augmented reality environment;
Fig. 6 is an example of a computing device engaged in computer-vision detection and tracking of a three-dimensional object;
Fig. 7 is an example of a computing device replacing a three-dimensional object detected in an augmented reality environment with a rendered three-dimensional object;
Fig. 8 is a screen display of a computing device showing a three-dimensional physical object that can translate along and rotate about three axes;
Fig. 9 is a screen display of a computing device showing a physical three-dimensional object replaced with a rendered stand-in for that object;
Fig. 10 is an example of a rendered object replacing a three-dimensional physical object in an augmented reality display, where the physical object incorporates dynamics associated with the rendered object.
Throughout this description, elements appearing in figures are assigned three-digit reference designators, where the most significant digit is the figure number and the two least significant digits are specific to the element. An element not described in conjunction with a figure may be presumed to have the same characteristics as a previously described element having a reference designator with the same least significant digits.
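The numbering convention above can be illustrated with a tiny helper (a sketch; the function names are mine, not the patent's): designator 137, for instance, names element 37 of Fig. 1.

```python
def figure_of(designator):
    """Most significant digit of a three-digit reference designator: the figure number."""
    assert 100 <= designator <= 999, "designators are three-digit numbers"
    return designator // 100

def element_id(designator):
    """Two least significant digits: the element-specific part of the designator."""
    return designator % 100

print(figure_of(137), element_id(137))  # → 1 37  (camera 137 belongs to Fig. 1)
```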
Detailed description
Description of apparatus
Referring now to Fig. 1, a system 100 for interacting with an augmented reality environment using a three-dimensional object is shown. The system 100 includes a computing device 130 and a three-dimensional object 150. The system optionally includes a VR/AR headset 140. Multiple computing devices may be used, but only one is required.
The computing device 130 includes a central processing unit (CPU) 131, a graphics processing unit (GPU) 132, an input-output (I/O) interface 133, a network interface 134, memory 135, storage 136, a camera 137, and a display 138.
The CPU 131 may execute instructions associated with an operating system for the computing device 130, as well as instructions associated with one or more applications suitable for enabling the functions described herein. The CPU 131 may be or include one or more microprocessors, microcontrollers, digital signal processors, application-specific integrated circuits (ASICs), or systems-on-a-chip (SOCs). The CPU 131 may be a processor designed specifically for operations on visual, graphical, or audio data, or may be a general-purpose processor. Although identified as a central processing unit, the CPU 131 may in fact be multiple processors, for example a multi-core processor or a series of processors joined by a bus, to increase the overall throughput or capabilities of the CPU 131. To perform the tracking described herein, the CPU may be, in whole or in part, an integrated "motion chip" specially designed to implement three-dimensional object tracking.
The GPU 132 may execute instructions suitable for enabling the functions described herein. In particular, the GPU 132 may be used in connection with particular image-related operations which the GPU 132 is uniquely suited to perform, such as rendering or the complex mathematical computations associated with object detection and computer vision. The GPU 132 may be anything that the CPU 131 may be. However, the GPU 132 is distinct in that it is a specialized processor designed for handling visual data, particularly vector and shading operations, that performs faster memory operations and access, and that may perform specialized lighting operations within a rendered three-dimensional environment. The memory and instruction sets within the GPU 132 are specifically designed for operating upon graphical data. In this way, the GPU 132 may be especially suited to operating upon image data or to quickly and efficiently performing the complex mathematical operations described herein. Like the CPU 131, the GPU 132 is shown as a single graphics processing unit, but may in fact be one or more graphics processing units in a so-called multi-core format, or linked by a bus or other connection, which may together be applied to a single set or to multiple sets of processing operations.
The I/O interface 133 may include one or more general-purpose wired interfaces (e.g., a universal serial bus (USB), a high-definition multimedia interface (HDMI)) and one or more connectors for storage devices such as hard disk drives, flash drives, or proprietary storage solutions.
The I/O interface 133 may also be used to communicate with, and to direct the operation of, optional external sensors such as additional cameras, lights, infrared lights, or other systems used in computer-vision detection of the three-dimensional object 150 and in other processes described herein.
The network interface 134 may include radio-frequency circuits, analog circuits, digital circuits, one or more antennas, and other hardware, firmware, and software necessary for network communications with external devices. The network interface 134 may include both wired and wireless connections. For example, the network may include a cellular telephone network interface, a wireless local area network (LAN) interface, and/or a wireless personal area network (PAN) interface. A cellular telephone network interface may use one or more cellular data protocols. A wireless LAN interface may use a WiFi wireless communication protocol or another wireless local area network protocol. A wireless PAN interface may use a limited-range wireless communication protocol such as Bluetooth or some other public or proprietary wireless personal area network protocol.
The network interface 134 may include one or more specialized processors to perform functions such as encryption/decryption, compression/decompression, and encoding/decoding as necessary for communicating with external devices using the selected communication protocol. The network interface 134 may alternatively rely, in whole or in part, on the CPU 131 to perform some or all of these functions.
The memory 135 may include a combination of non-volatile and/or volatile memory, including read-only memory (ROM); static, dynamic, and/or magnetoresistive random access memory (SRAM, DRAM, and MRAM, respectively); and non-volatile writable memory such as flash memory.
The memory 135 may store software programs and routines for execution by the CPU 131 or GPU 132 (or both together). These stored software programs may include operating system software. The operating system may include functions to support the I/O interface 133 or the network interface 134, such as protocol stacks, encoding/decoding, compression/decompression, and encryption/decryption. The stored software programs may include applications or "apps" that cause the computing device to perform some or all of the processes and functions described herein. The words "memory" and "storage" as used herein explicitly exclude transitory media, including propagating waveforms and transitory signals.
The storage 136 may be or include non-volatile memory such as hard disk drives, flash memory devices designed for long-term storage, writable media, and other proprietary storage media, for example media designed for the long-term storage of image data.
The camera 137 is an electronic device capable of capturing ambient light so as to generate images of the objects within its field of view. The camera 137 is shown as a single camera, but may be a dual-lens or multi-lens camera. Likewise, the word "camera" is used generically to describe the camera 137, but the camera 137 may include infrared illumination, a flash or other bright light source, infrared cameras, depth sensors, light sensors, or similar camera-like devices capable of detecting three-dimensional objects within range of the camera 137 or capturing images of them. Although the camera 137 is described as a visual imaging camera, it may in fact be, or additionally include, other capabilities suitable for performing tracking. For example, technologies such as LIDAR and sonar use lasers and/or sound to perform object tracking. Although neither technology involves a "camera" per se, both may be used to augment, or entirely to perform, object tracking in three-dimensional space.
The display 138 is an electronic device incorporating electrically activated components that form images visible on the display. The display 138 may be a backlit display (such as an LCD) or a natively emissive display (such as an OLED). The display 138 is shown as a single display, but may in fact be one or more displays. Other displays may be used, such as augmented-reality light-field displays (which project light into three-dimensional space, or appear to do so) or other types of projectors (actual and virtual).
The display 138 may have associated lenses for focusing eyes upon the display 138, and may be presented to a viewer's eyes as a split-screen display, particularly in cases in which the computing device 130 is a part of the VR/AR headset 140.
In some cases, one or more additional computing devices (like the computing device 130) may be connected through the network interface 134, which may be a wired interface such as Ethernet or universal serial bus (USB), or a wireless interface such as 802.11x, LTE, or another wireless protocol discussed herein, enabling the additional computing devices to perform some or all of the operations discussed herein. For example, the CPU 131 and GPU 132 of the computing device 130 may be less powerful than those of a connected system (e.g., a multi-core processor or group of multi-core processors, or a single powerful GPU or a group of interconnected GPUs), such that a connected computing device is better able to perform processor-intensive tasks. Alternatively, a capture device in the form of a VR or AR headset (including a camera and associated processor and memory), or a simple mobile device including a display and camera, may be distinct from a rendering device, such as a desktop computer or another computing device more capable of performing some or all of the functions described below. In some embodiments, one or more additional computing devices may be used to perform more processor-intensive tasks, with those tasks offloaded through the I/O interface 133 or the network interface 134.
The VR/AR headset 140 is an optional component that may house, enclose, connect to, or otherwise be associated with the computing device 130. The VR/AR headset 140 may itself be attached to a more powerful computing device, or the VR/AR headset 140 may be a standalone device that itself performs the functions of the computing device 130 discussed herein.
Although not required for the functions described herein, a VR/AR headset 140 may be used to obtain a more immersive augmented reality or virtual reality experience. When used as an augmented reality headset, the VR/AR headset 140 may include an outward-facing camera that provides the wearer with real-time images of the exterior of the VR/AR headset 140, with augmented reality objects interspersed on the display 138. Alternatively, if no VR/AR headset 140 is present, a mobile device, tablet, or other hand-held combination of display and camera may serve as a "portal" through which the augmented or virtual reality may be seen. Although the discussion herein is generally in terms of "augmented reality," it should be understood that, where the phrase "augmented reality" is used, it also encompasses so-called "virtual reality," "mixed reality," and other experiences that combine any real object with a three-dimensional virtual environment or experience.
The three-dimensional object 150 is a physical object located somewhere in the world or held in a particular position by a user. The three-dimensional object 150 has characteristics suited to detection using computer vision techniques and, preferably, characteristics that are robust to use in different positions (e.g., close up, at arm's length, across a room) and that are quickly detected when presented to the computing device 130 and camera 137.
The three-dimensional object 150 is preferably a cube, but other shapes may be used. A cube has several characteristics that make it particularly suitable for these purposes. Notably, it has only six sides, and each of the six sides can be unique and distinguishable from the others. For example, relying upon illuminated or surface color, only six distinguishable colors are needed. This enables computer vision algorithms to easily detect which side is facing the camera 137. Similarly, computer-readable (or merely distinguishable) patterns can be applied to each side of the cube without requiring more than a total of six faces. If the number of faces increases, the difficulty of detecting a particular side (and the complexity of distinguishing it from other sides or non-sides) also increases. In addition, as more sides are added, the total surface area of each "side" decreases, making computer-vision side detection (especially at varying distances from the camera) more difficult, because only so many unique patterns or colors can fit on a smaller side.
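The color-based face identification described above can be sketched as a nearest-reference-color classifier. The reference colors and face names below are my own illustrative assumptions (a real system would calibrate them against the physical cube and the camera's color response), not values from the patent:

```python
# Hypothetical reference colors (RGB) for the six faces of the cube.
FACE_COLORS = {
    "front": (255, 0, 0), "back": (0, 255, 0), "left": (0, 0, 255),
    "right": (255, 255, 0), "top": (255, 0, 255), "bottom": (0, 255, 255),
}

def nearest_face(observed_rgb):
    """Classify the visible face as the reference color nearest to the
    observed color (squared Euclidean distance in RGB space)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(FACE_COLORS, key=lambda face: dist2(FACE_COLORS[face], observed_rgb))

# A slightly off-red observation still resolves to the red face.
print(nearest_face((250, 10, 12)))  # → front
```

Because only six colors must be told apart, the classifier tolerates substantial noise in the observed color, which is part of why a cube is a forgiving target for computer vision.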
Similarly, if a shape with fewer sides is used (such as a triangular pyramid), computer vision may be able to see only one side at a time, and when the pyramid is rotated in any direction, the computer cannot easily predict which side is being presented to the camera. Thus, the direction of rotation cannot easily be detected. Also, fewer individual markings can be carried on the three-dimensional object 150, because it has fewer sides to accommodate them. This further complicates computer-vision detection.
Another benefit of a cube is that its six sides map to interactions that people readily understand in the three-dimensional world. Specifically, the sides map well to up, down, left, right, front, and back. Thus, when a cube is held facing a user, the person's experience of the cube corresponds well, both virtually and physically, to his or her experience of the real world. This makes the transition to an augmented reality or virtual reality environment easier.
Nonetheless, a three-dimensional object with any number of sides may be used. But cubes have unique attributes that make them better suited to certain applications, particularly hand-held applications. Further, wherever a "cube" is identified herein, any three-dimensional object with four or more faces may be substituted.
Finally, although primarily described in this application as passive, the three-dimensional object may include a computing device of its own, with varying levels of power, complexity, and function. In some cases, the three-dimensional object may include cameras or infrared cameras, lights, positional and rotational sensors, Bluetooth, RFID, WiFi, and/or other systems for detecting its own position relative to exterior spaces or devices (e.g., the computing device 130) and for communicating that information to the computing device 130. In some cases, the three-dimensional object may take over some or all of the functions of tracking its position, orientation, or rotation relative to the computing device 130 or to the environment in which it operates (e.g., a space, or external sensors, cameras, or lights).
Fig. 2 shows an example set of sides of a cube 200 that may be used to interact with an augmented reality environment. Fig. 2 is only one example of a possible cube 200. As discussed above, other shapes may be used, and virtually any type of computer-recognizable image may be used on each face. Alternatively, as discussed above, illuminated colors, depth engravings on each face (for detection by depth-sensing systems), illumination patterns (e.g., lights of specific shapes or patterns), and other detection techniques may be used.
The cube 200 includes six faces 201, 202, 203, 204, 205, and 206. To enable computer vision algorithms to work at different depths (e.g., near the camera, within a few inches; at arm's length, within 20-40 inches; and at greater distances, within several feet), the selected images have certain specific characteristics. The cube is shown with its faces unfolded flat in order to show the characteristics of the cube 200. When formed, the cube 200 will be a cube, preferably made from a relatively solid but compressible material. Preferred materials include foams, polymers, metals, and similarly sturdy and resilient materials. In the cases discussed below in which electronic components are integrated into the cube 200, it may be made of injection-molded plastic, foam, or other materials, so long as they are durable in normal use and protect those components.
First, each image has relatively large-scale components that are easily distinguishable at some distance from the camera (such as arm's length or farther). For face 201, this shape is a diamond (or a square, depending on how the cube is held) and an associated large white bar graphic. In some cases, the bar graphic may include copyright information or other information relevant to the cube. In the case of face 202, the large shape is a central circle surrounded by a partial ring. For face 203, the shape is an oblong connected to the top of a pyramid on the "right" side. For face 204, the shape is three large triangles with a white triangle between them. For face 205, the shape is an octagon with two lines passing, or nearly passing, through it. Finally, for face 206, the large shape is a three-sided view of a cube with a series of lines extending from the "top" of the face. Trademarks appear on faces 203 and 206.
These large shapes are (1) easily detected by computer vision techniques and (2) distinguishable from one another at approximately arm's length (20-40 inches). This is important because, when a three-dimensional object such as the cube 200 is held in a user's hand, as it commonly is, it is used at approximately arm's length. But a user may also sometimes move the device closer. When the cube is held at arm's length, the fine details of each face 201-206 may be difficult to detect. The large-scale images are therefore included on each face so that computer vision techniques can use them for detection at those distances, where they remain easily distinguished. In addition, when the cube is held at close range, the fine details enable computer vision to detect subtle movements and, when the actual three-dimensional object is replaced by a virtual object in the virtual or augmented reality world, to maintain the corresponding stability of the image in the virtual environment.
But the cube 200 also includes smaller elements designed for detection by computer vision techniques at closer depths. When the cube 200 is held closer to the associated device, the detecting camera may not even be able to see the complete large-scale image on each face and, absent further imagery, might be unable to determine which face is visible. For these situations, smaller lines and shapes are interspersed on each face of the cube 200. These can be seen on each of the faces 201-206. Note, further, that the small lines and shapes are intentionally quite different from face to face. For example, wavy lines appear almost exclusively on face 202, while "plus sign" shapes appear only on face 204. Triangles appear only on face 205, and half-moon shapes appear almost exclusively on face 206. Face 203 has the most diversity, but is still easily distinguished from the other faces (even at close range for computer vision algorithms, especially when the algorithms' search is limited to matching one of the six possible faces that may be presented to the camera). The smaller lines and shapes on each face 201-206 are presented at a variety of rotational orientations, in order to speed recognition of those lines and shapes at a variety of viewing angles and viewing distances.
As a result, from virtually any angle and under multiple common lighting conditions (e.g., dark, light), detection at a minimum of two distances is possible with a comparatively low-resolution camera. This technique of incorporating at least two (or more) sizes of fiducial markers, overlaid upon one another within the same fiducial marker for use at different detection depths, is referred to herein as a "multi-layer fiducial marker." The use of multiple multi-layer fiducial markers makes interaction with the cube 200 (and other objects incorporating similar multi-layer fiducial markers) in an augmented reality environment robust to occlusion (such as by the holder's hand or fingers) and to rapid movement, and provides strong tracking for complex interactions with the cube 200. In particular, through the use of multi-layer fiducial markers, high-quality rotational and positional tracking can be performed at multiple depths (e.g., very close to the viewing device, at arm's length, or across a desk).
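The depth-dependent behavior of a multi-layer fiducial marker can be sketched as a simple layer-selection rule. The pixel thresholds below are assumed for illustration (real values would depend on the camera resolution, field of view, and the physical size of the fine details), and the function is my own sketch of the idea, not the patent's algorithm:

```python
# Sketch: choose which layer of a multi-layer fiducial marker should drive
# pose estimation, based on how large the face appears on the sensor.
# The large shapes carry tracking when the object is far (face small in
# pixels); the fine lines and shapes take over when the object is near.
def pick_layer(face_width_px, fine_threshold_px=180):
    """Return the marker layer to trust for the current apparent face size."""
    return "fine-layer" if face_width_px >= fine_threshold_px else "coarse-layer"

print(pick_layer(60))   # face small on sensor, object far → coarse-layer
print(pick_layer(400))  # face large on sensor, object near → fine-layer
```

Because one layer or the other is always resolvable, the tracked pose does not degrade as the object moves through the near-to-far range, which is the stability property the next paragraph contrasts with single-layer markers.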
What is unique about this use of multi-layer fiducial markers is their stability from near the camera to far from it, a stability markedly different from that of markers with only a single detection depth or layer. When a user moves an object bearing single-layer fiducial markers away from the observing camera, it becomes increasingly difficult for the observing device (e.g., the camera in a computing device) to detect the object's orientation and position. Alternatively, if the object is designed to be viewed at a distance, its orientation and position become increasingly difficult to track as it is moved nearer to the camera. As a result, in either case the object appears to jump, jitter, or become untrackable. With multi-layer fiducial markers, by contrast, the tracking and stability of the resulting replacement augmented reality or virtual reality object can be maintained in the virtual or augmented reality world across the multiple distances at which the object may be held from the camera.
Generating these multilayer fiducial markers on multiple faces of a three-dimensional object, so that the multilayer markers are uniform across the object, has proven difficult in the art. Traditionally, fiducial markers have been symbols on a single face or a single object, such as a QR code. These fiducial markers are typically printed on stickers or paper and placed on the object by hand, and they are usually single-faced. However, the alignment of each fiducial marker on each face, relative to the other faces, is important for accurate tracking (at multiple depths) as the object rotates. If the faces are misaligned (for example, aligned differently than the computer vision algorithm expects from a "perfect" three-dimensional representation of the object), the tracking and stability of the augmented reality object in the virtual three-dimensional scene will be greatly degraded. The object may be seen skipping between several nearby positions, and when resting on one or more badly aligned faces, the object may appear to float unnaturally or (for example) hover above a person's hand.
As a result, the multilayer fiducial markers of the cube 200 here cannot be created by stickers or by reproducing images. Instead, they may be created by injection molding the entire object. Preferably, the "light" regions of the cube 200 are raised approximately 2-5 millimeters above the "dark" regions of each face. Because this is accomplished with injection molding, the raised regions (which may be dyed a lighter color, painted a lighter color, or otherwise lightened) are precisely aligned during the molding process. In this way, every resulting cube 200 is identical. A subsequent computer model can then be based on one of these injection-molded cubes. This is far better than stickers, direct printing, and other in-plane techniques, because it makes the fiducial markers uniform across every cube. The computer model of each cube 200 is therefore also uniform, and in a given virtual reality or augmented reality scene, the image stability of the object substituted for the cube is equally uniform, without the jitter seen with non-injection-molded three-dimensional objects.
In certain cases, a single face 201-206 is presented squarely to the camera (with its associated image presented to the computing device for face identification), or the cube is held so that multiple faces are visible to the camera. In the former case, it is easy to detect which face is toward the camera, because that face is fully visible. In the latter case, the orientation of the most forward face can usually be determined, and that information can be combined with partial views of the partially visible sides to quickly make a sound determination of which of the faces 201-206 are visible and of their orientations.
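One common way to make the "most forward face" determination is to compare each face's outward normal with the viewing direction. The sketch below, with assumed axis-aligned normals for faces 201-206, is a hypothetical illustration of that idea, not the specific algorithm of this disclosure:

```python
# Sketch (assumed geometry): pick which cube face most directly faces the camera
# by comparing outward face normals against the camera's view direction.

FACE_NORMALS = {
    201: (0, 0, 1), 202: (0, 0, -1),
    203: (0, 1, 0), 204: (0, -1, 0),
    205: (1, 0, 0), 206: (-1, 0, 0),
}

def front_face(view_dir):
    """Return the face whose outward normal points most directly back at the camera."""
    # The camera looks along view_dir, so the facing face's normal opposes it;
    # the most negative dot product (negated, then maximized) wins.
    def facing(item):
        n = item[1]
        return -(n[0] * view_dir[0] + n[1] * view_dir[1] + n[2] * view_dir[2])
    return max(FACE_NORMALS.items(), key=facing)[0]

print(front_face((0, 0, -1)))  # camera looking down -Z sees face 201
```

When several faces are partially visible, the same dot-product score ranks them, which matches the text's idea of combining the most forward face with partial side views.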
Surfaces colored white and black, or reflective and black, are purposely chosen because, for computer vision techniques, strong contrast is the easiest to detect and enables rapid matching and classification of the faces 201-206. If a pattern like that of the cube 200 is used, the surface (or some surfaces, such as the white surfaces) of the cube 200 may be made reflective so that they form further contrast with the dark portions. Alternatively, some or all of the cube 200 may be coated with an anti-reflective coating or material so that reflections or ambient lighting do not interfere with the computer vision detection and tracking techniques. Bright, high-contrast colors, such as iridescent colors, may also be used. Ultraviolet paint (for ultraviolet emitters and ultraviolet cameras) or phosphorescent coatings may be used with corresponding sensors.
All of the foregoing enables fine-grained position, orientation, and rotation tracking of the cube 200 by computer vision techniques when viewed at multiple distances from the viewing camera. When close, computer vision techniques can determine the specific position and orientation of the object under many lighting conditions, against various backgrounds, and through movement and rotation. When held at intermediate distances, the object can still be tracked in position, orientation, rotation, and other motion because of the multilayer nature of the fiducial markers used. With this high level of tracking capability, the cube 200 can be replaced in an augmented reality scene by another rendered three-dimensional object. Interactions with the cube 200 can be translated into the augmented reality environment (for example, shown in an AR headset or on a mobile device) and, specifically, translated to the rendered object in the scene, with the cube 200 serving as its real-world proxy.
Although shown as a series of high-contrast multilayer fiducial markers, other kinds of markers may be used, such as active markers, or inside-out tracking performed by the cube itself, alone or in conjunction with the computing device 130.
Fig. 3, made up of Figs. 3A-3H, shows a series of cubes 350a-350h, each incorporating a different element that can be used for interaction with an augmented reality environment.
The cube 350a in Fig. 3A includes a key 352a. The key 352a is shown as very large and protruding from the exterior of the cube 350a. However, the key 352a may be a small key beneath the outer surface of the cube 350a, a capacitive key, or merely an actuatable switch. The key 352a may not be a "key" at all, but may instead be an internal sensor or pressure-detecting sensor of the cube 350a, enabling the cube 350a to detect when a certain amount of pressure is applied to its exterior. The sensor may have sufficient granularity to detect pressure specifically on a single side of the cube 350a. Interaction with the cube 350a that includes such pressure can therefore be detected by a relatively simple processor operating within the cube 350a (with the functionality provided thereby). That information may be transmitted from the cube 350a to an associated computing device 130 (Fig. 1).
The computing device 130 may be programmed to operate based on a particular application so as to react in a specific way. For example, a press of the key 352a, or sensed pressure, may act as a "click" in a user interface operation. Alternatively, a press of the key 352a, or sensed pressure, may act as a weapon firing or an object operation (for example, a door opening) in a game or other three-dimensional environment. Data may be transmitted wirelessly between the cube 350a and the associated computing device 130 (Fig. 1) (for example, by Bluetooth, or via WiFi or RFID data).
There may be multiple keys 352a, one or more on each face, or a series of pressure sensors near the exterior or in the interior of the cube 350a. Sensed pressure, or each key, may be associated with a particular face of the cube 350a. In this way, the interaction of a press of the key 352a, or of sensed pressure, with a particular face can be associated with a specific interaction. Pressing one face may enable a paintbrush tool (or a secondary interface for interacting with a tool selector), while interactions with other faces may be used to select different colors or brush sizes. As discussed more fully below, translation and rotation of the cube itself may alternate between colors or brushes or, in other contexts, between other options in a user interface.
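The per-face mapping of presses to user interface actions, as in the paintbrush example above, might be sketched as a simple dispatch table running on the computing device. The face numbers and action names below are hypothetical:

```python
# Hypothetical dispatch table: a press reported on a given cube face is
# translated into a user-interface action. Face numbers and action names
# are illustrative assumptions, not part of the disclosure.

FACE_ACTIONS = {
    1: "enable_paintbrush",   # pressing face 1 brings up the paintbrush tool
    2: "next_color",          # other faces cycle colors or brush sizes
    3: "prev_color",
    4: "next_brush_size",
}

def handle_press(face: int) -> str:
    """Translate a press on a cube face into a user-interface action."""
    return FACE_ACTIONS.get(face, "ignore")

print(handle_press(1))  # enable_paintbrush
print(handle_press(6))  # ignore (no action bound to that face)
```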
The key 352a may not be a key at all, but may instead be computer vision that detects the state of a face of the cube 350a. If applied pressure deforms a face sufficiently that it meets some compression or distortion threshold, that deformation can be detected by a computer vision algorithm, and a key "press" can therefore be registered by computer vision running on the computing device 130 (Fig. 1) without any actual key in the cube 350a and, perhaps importantly, without incorporating any electronics, battery power, or processing capability into the cube 350a itself. The "press" can be acted upon entirely on the computing device 130, while providing functionality very similar to that of the actual physical keys or pressure sensors discussed above. Because of details visible and invisible on the faces of the cube, computer vision techniques can even localize the position of a compression on a cube face to a particular quadrant or portion of the cube. An interactive interface for each face of the cube can therefore be created and used in a virtual or augmented reality environment without relying on physical keys at all.
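The vision-only "press" reduces to a threshold test on the measured deformation. The sketch below assumes a hypothetical compression threshold of 15%; a real threshold would depend on the cube's material and the camera's measurement noise:

```python
# Sketch of registering a "press" from visually measured face deformation.
# The 15% threshold is an assumed value for illustration only.

DEFORM_THRESHOLD = 0.15  # assumed fractional compression that counts as a press

def detect_press(rest_extent: float, observed_extent: float) -> bool:
    """Register a vision-only key press when a face is compressed past the threshold.

    rest_extent: the face's expected (undeformed) extent from the cube model.
    observed_extent: the extent measured by the computer vision pipeline.
    """
    compression = 1.0 - observed_extent / rest_extent
    return compression >= DEFORM_THRESHOLD

print(detect_press(100.0, 80.0))  # 20% compression -> press registered
print(detect_press(100.0, 95.0))  # 5% compression -> no press
```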
The cube 350b in Fig. 3B includes a lamp 352b, and may include several other lamps (not labeled). The lamp 352b can be used for simple object tracking, such as a computer vision application detecting the orientation or position of the cube 350b in the space in front of the camera. A rendered three-dimensional object can then replace the actual physical cube 350b in the augmented reality scene. Multiple lamps of multiple different colors may, however, be used to identify a particular side, face, or edge of the associated cube 350b. As described above, and as discussed more fully below, the identification of particular faces (rather than merely the presence of the object) readily enables the cube 350b to operate together with the computing device 130 (Fig. 1) as a physical object useful for user interface interactions, which may be presented on the display 138 of the computing device 130.
The lamp 352b is shown as a single lamp at the center of a particular face. However, the lamp 352b may in fact be several lamps arranged in a particular pattern around a face. Alternatively, the lamp 352b may be presented to the camera 137 in a particular form through selective transparency in the faces of the cube 350b or through the use of light guides. The presentation of a specific pattern (such as the pattern shown in Fig. 2) enables detection of a particular face of the cube 350b, and also enables detection of the relative position, orientation, and overall position of the cube 350b as it is held or placed on a desk or near the computing device 130. This enables fine-grained control through translation and rotation of the cube 350b, such that even small movements or rotations of the cube can be detected by computer vision techniques. Different lighting patterns or colors may be used on each face (or pairs of faces) to enable detection of the interactions and rotations described herein.
The lamp 352b may also be dynamic, with the cube 350b including a light-level detector or camera to detect the lighting level in the room. The lamp 352b can react to the room's illumination level, increasing the brightness of the light to compensate if the room is very bright, and reducing the brightness if the room is very dark.
Alternatively, the camera of the observing computing device 130 (Fig. 1), or the cube 350b itself, may detect that the background behind the cube 350b contains particular colors that make it more difficult for the computing device's computer vision operations to detect the cube 350b. In response, the cube 350b may be instructed to change the color of the lamp 352b so as to stand out better from the background (for example, if the background is black and white, the cube 350b may be instructed to shift to orange and blue, because orange is more easily detected against that background). If the background is detected to be very "cluttered," the cube 350b may be instructed to have the lamp 352b select a uniform, simple pattern (for example, a checkerboard). If the detected background is very simple (for example, a single solid color such as white), the cube 350b may be instructed to present a more complex pattern that avoids white entirely. A multicolor LED array may be used for this purpose, and may be paired with a simple processing element in the cube 350b that operates on its own instructions or on instructions from the external computing device 130 (Fig. 1).
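The background-aware color choice can be framed as maximizing worst-case contrast: pick the lamp color whose distance from every detected background color is largest. The palette and the squared-RGB-distance metric below are illustrative assumptions:

```python
# Sketch: choose a lamp color that maximizes worst-case contrast against the
# detected background colors. Palette values and the distance metric are
# assumptions for illustration, not from the disclosure.

PALETTE = {
    "orange": (255, 140, 0),
    "blue":   (0, 64, 255),
    "white":  (255, 255, 255),
}

def best_led_color(background_colors, palette=PALETTE):
    """Return the palette color with the largest worst-case squared-RGB distance
    to the set of detected background colors."""
    def worst_case(rgb):
        return min(sum((a - b) ** 2 for a, b in zip(rgb, bg))
                   for bg in background_colors)
    return max(palette, key=lambda name: worst_case(palette[name]))

# A black-and-white background defeats a white lamp (zero worst-case contrast),
# while orange remains distinct from both extremes.
print(best_led_color([(0, 0, 0), (255, 255, 255)]))  # orange
```

This mirrors the orange-and-blue example above: white scores zero against a white background region, so a high-contrast hue wins instead.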
The cube 350c in Fig. 3C includes a touch interface 352c. The touch interface 352c may be a capacitive touch sensor or panel, a resistive touch sensor or panel, or some other type of touch interface. The touch interface 352c may be a single point (for example, capable only of detecting whether a touch occurred), or may have sufficient granularity to detect the location of a touch on a surface (such as an entire face of the cube 350c) or the region of the surface touched. The touch interface 352c may be so-called "multi-touch," capable of detecting multiple simultaneous touch interactions. The touch interface 352c may distinguish "hard" touches (incorporating more pressure) from "light" touches (incorporating less pressure). The touch interface 352c may cover the entire surface of one or more faces of the cube 350c. The touch interface 352c is shown covering only a portion of one face of the cube 350c, but a touch interface may be present on every face, on a subset of the faces, or on only one face. The touch interface 352c may be powered by a battery and associated processor of the cube 350c.
The touch interface 352c can support interactions with the faces of the cube 350c such as swipes, multi-finger swipes, more complex mouse-like gestures, click-like interactions, or interactions along one or more surfaces of the cube 350c. For example, a particular action using the touch interface 352c may comprise one or more gestures performed on different faces of the cube 350c. For example, two fingers, each swiping in a different direction and each on a different face of the cube, may instruct the associated computing device to perform one action, while swipes on two other faces may instruct the associated computing device to perform a different action. A set of swipes, multiple swipes, or multiple clicks on two faces may switch between zoom levels, and the same action on two different faces may select certain aspects of a user interface. Just as with simultaneous or individual touches on multiple faces, one simple action may perform one operation, while simultaneous touches on other faces perform another.
For example, simultaneous touches on two opposing faces (or simultaneous touches detected with sufficient force) may serve as a "grab" action in a three-dimensional environment, selecting and "grabbing" a virtual or augmented reality object so that it can be moved or interacted with. To the user of the cube 350c, this action "feels" very much like grabbing the object, such as a remote control, a broom handle, a staff, or a sword in the augmented reality environment. During interaction with the augmented reality environment, the user may be required to maintain the opposing touches in order to maintain the "grab" on the selected or picked-up object. For example, to hold a sword or gun in a game, touches may be required on all four (or three) faces making up a perimeter of the cube, much as a person might "hold" such a weapon in reality. Releasing one or two of the four faces may cause the virtual weapon to fall from the person's hand. Alternatively, a loosening of the person's grip to some degree (detected by a force sensor) may release the weapon even though "touches" are still registered on all four or all three faces.
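The opposing-face "grab" test reduces to checking whether any touched face's opposite face is also touched at the same time. The face pairing below is an assumption about how the six faces might be numbered:

```python
# Sketch of the "grab" interaction: simultaneous touches on two opposite faces
# are treated as grasping the virtual object. The face numbering is assumed.

OPPOSITE = {1: 2, 2: 1, 3: 4, 4: 3, 5: 6, 6: 5}

def is_grab(touched_faces: set) -> bool:
    """True if any pair of opposite faces is touched simultaneously."""
    return any(OPPOSITE[f] in touched_faces for f in touched_faces)

print(is_grab({1, 2}))  # True: thumb and finger on opposite faces
print(is_grab({1, 3}))  # False: adjacent faces only
```

A perimeter-hold variant (sword or gun grip) would instead require a full ring of faces, e.g. `{3, 4, 5, 6} <= touched_faces`, and release when any ring face is lifted.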
The cube 350d in Fig. 3D includes a haptic element 352d. The haptic element 352d may be an electric motor incorporating a small weight surrounded by a coil; when current passes through the coil, the coil can "vibrate" such that the internal weight rotates about the motor or a central axis. A similar linear-actuation haptic motor intermittently strikes a weight along an axis to simulate an "impact" or resistance (a feeling more like a "tap" than a "rumble"). The iPhone 6s was the first widely available commercial device to incorporate a linear-actuation haptic motor, in the form of a "taptic" engine. Multiple haptic elements 352d may be used for the different "feelings" simulated by the cube 350d. These are merely two examples.
The haptic element 352d may operate together with an augmented reality environment shown and generated on a computing device that views the cube 350d and replaces it with some augmented reality object, to better simulate that object. For example, if a beating heart visually replaces the cube 350d on the display of the computing device viewing the cube, the haptic element 352d may generate a soft "thump" or beat or vibration to mimic the associated heartbeat. The rhythm can be matched to the rhythm shown on the display to the viewer's eyes. In this way, the immersiveness of the associated human heart can be increased. The human heart is not merely displayed at the position of the cube 350d the viewer is holding; the cube can also be felt "beating" in the user's hand. Firing a gun, or striking someone with a sword, can be felt as an "impact" generated by the haptic element 352d. A purring virtual "pet" can be perceived through a vibration generated by the haptic element 352d (for example, the purr of a virtual cat). Again, this may correspond to the visual data presented on the display of the associated computing device viewing the cube 350d (for example, the purring of the cat).
Similarly, the haptic element 352d may be used to appropriately simulate multiple virtual "objects" within the cube. For example, multi-sided marble labyrinths have existed for many years. These labyrinths generally comprise a steel ball that moves along the wooden corridors of the maze, and a user must rotate the entire labyrinth in certain directions to navigate it properly within a certain time, or must restart when the ball mis-moves (for example, falls into a restart hole in the maze). Such a virtual labyrinth can visually substitute for the cube 350d on the computing device display in an augmented reality environment. As the steel ball moves through the labyrinth, appropriate haptics can be simulated (such as striking a wall, falling through a hole, or shifting weight distribution). In this way, a particular part of the labyrinth may feel heavier (where it holds the ball), or the cube may feel as though it is being "bumped" when the ball strikes a side or otherwise moves through the labyrinth. The cube 350d can perform these and similar haptic actions.
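Synchronizing the haptic element with an on-screen rhythm amounts to scheduling pulse times from the displayed rate. A minimal sketch, assuming the computing device communicates a beats-per-minute value to the cube:

```python
# Sketch: derive haptic pulse times from a displayed heartbeat rate, so the felt
# "thump" matches the rendered heart. The BPM interface is an assumed design.

def heartbeat_pulses(bpm: float, duration_s: float):
    """Times (in seconds) at which the haptic element should fire to match
    an on-screen heartbeat of the given rate, over the given duration."""
    period = 60.0 / bpm
    t, pulses = 0.0, []
    while t < duration_s:
        pulses.append(round(t, 3))
        t += period
    return pulses

print(heartbeat_pulses(60, 3))  # [0.0, 1.0, 2.0]
```

The same scheduling idea extends to the labyrinth example, with pulse times driven by physics events (wall hits, ball drops) instead of a fixed rate.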
The cube 350e in Fig. 3E includes a speaker 352e. The speaker 352e may be multiple speakers, one or more per face, or may be a single speaker 352e. The speaker may be powered by a battery in the cube 350e. The speaker 352e may perform simple actions, such as playing music or sounds as instructed by the user of the associated computing device.
But sounds can also be synchronized with what is happening on the display of the computing device associated with the cube 350e. For example, if the cube 350e is replaced by an augmented reality kitten, the cube may play "meow" sounds, purring sounds, or other vocalizations matching the kitten's actions. Thus, when the viewer sees the augmented reality or virtual kitten meow, the sound may come from the cube itself rather than from the mobile device, the VR/AR headset, or a nearby computer speaker. In effect, anything the cube is "replaced" by in the augmented reality environment that would produce any kind of sound may have associated sounds, noises, music, and the like. The speaker 352e on the cube 350e can emit those sounds, noises, or music. This, again, further increases immersion.
The cube 350f in Fig. 3F includes a temperature element 352f. The temperature element 352f may be a device capable of raising or lowering its external temperature, typically using low voltages, to simulate the augmented reality or virtual reality object shown on the display of the associated computing device. For example, if the cube 350f is replaced on the display by an ice cube, a room-temperature "ice cube" can feel wrong; for the cube 350f, feeling cold to the touch is more fitting. The temperature element 352f can therefore adjust its temperature accordingly. Even if the temperature element 352f cannot reach actual freezing temperatures (such as an ice cube would have), significantly lowering its temperature will still increase the immersiveness of the experience of holding a virtual reality or augmented reality ice cube. Fine-grained control may or may not be feasible, especially at low voltages, but fine-grained control is not required to increase immersion.
Similarly, if a player plays a "wizard" in an augmented reality or virtual reality game and begins casting a "flame" spell at some enemy, the cube may serve as the source of the fireball, or of the related flame magic, in the player's hand. In that case, when the flame spell begins or is launched, a sense of warmth emanating from the cube into the player's palm or fingers will increase the immersiveness of the experience for that user. A multi-touch or multi-face action involving the touch interface 352c may launch the flame spell together with the heating of the cube 350f (for example, as directed by software interacting with the temperature element 352f and software on the associated computing device).
These and many other applications of the temperature element 352f make the temperature of the cube 350f better correspond to the visual image shown in place of the cube 350f on the display of the viewing computing device, which improves the overall augmented reality experience of the cube 350f for the user (especially the person holding the cube in hand).
The cube 350g in Fig. 3G includes an airbag 352g. The airbag 352g may be a single airbag or multiple airbags, or may not actually be an airbag at all but a series of electrically retractable and extendable elements (for example, one per face, or four or five per face). Likewise, one airbag or multiple airbags may be used on each face of the cube 350g. Although described as an airbag, electromagnetic actuators, levers, electric pistons, and other similar systems may also be used.
The airbag 352g can be controlled by electronics on the cube 350g, responsive to instructions from the computing device, to fill or empty the airbag (or extend or retract the electronic elements) so as to deform the cube 350g. The deformation can be controlled by the computing device to better correspond to the shape of the object shown on the computing device.
For example, when a user in an augmented reality environment grips the real-world cube 350g as the pistol grip of a virtual or augmented reality pistol, the cube 350g may deform by evacuating the two airbags on opposing faces and inflating the airbags on the adjacent (and opposing) faces, so that the cube takes on a more elongated shape, more like a pistol grip.
Alternatively, if a virtual or augmented reality heart is shown on the computing device display, a series of six airbags 352g (one per face) may all be inflated to make the cube rounder. This makes the cube 350g feel more like a heart and less like a cube. As described above, the haptic element 352d may simultaneously generate a "beat" felt in the now-rounder cube 350g, increasing the overall similarity of the virtual and real experiences.
The cube 350h in Fig. 3H includes an electrode 352h. The electrode 352h is labeled as a single electrode, but it may in fact be a series or number of electrodes or similar electrical devices on each face of the cube 350h, or multiple electrodes on each face of the cube 350h. Research into specific voltages applied to electrodes (especially small electrodes) has shown that, under specific voltages applied directly to the skin, nerve endings associated with touch, pressure, heat, or pain can be stimulated in such a way as to closely mimic the desired nerve responses without actually producing the underlying sensation (such as touch, pressure, heat, pain, and so on).
Thus, low currents can pass into the user's hand, or the skin of the user's hand, while holding the cube 350h, to simulate specific "feelings" of the cube 350h using only low current. By applying appropriate voltages, the current can simulate textures (for example, fur, spikes, cold stone or metal, and the like). The electrode 352h (or multiple electrodes) can therefore be used to simulate a wide range of experiences for the holder of the cube 350h.
Although each of the cubes 350a-350h has been discussed in turn, any of the elements discussed may be combined with one another in a single cube 350. Thus, the haptic element 352d may be combined with the touch interface 352c and/or with the electrode 352h. Each element is discussed individually to convey its intended use, but the elements may also be combined. Likewise, each element may be provided on one of the cube's faces or on up to all six, or in combined forms, such as a touch interface 352c and a lamp 352b on each face, or any other permutation or combination. Each of these options, by which an application can interact with the holder of the cube 350, may be described as "dynamics." As used herein, dynamics is similar to haptics, but is an intentionally broader term, encompassing the use of one or more of the elements 352a-352h discussed above to create an overall dynamic experience for the holder of the cube 350. Accordingly, the various elements 352a-352h may be referred to as "dynamic elements."
For example, when the touch interface 352c detects that the cube 350 is being gripped and an augmented reality sword is used to strike a virtual enemy, the haptic element 352d can generate an appropriate "clang" or "impact" feeling in response to each strike. This can further immerse a person in wielding the "virtual" weapon. Similarly, each time the key 352a is pressed (or pressure is sensed), the speaker 352e can produce audible feedback associated with gunfire, to better simulate firing a gun. When the gun fires rapidly for a period of time, the temperature element 352f can heat up in response, making rapid fire feel more like a real gun. Likewise, the airbag 352g can change the shape of the cube 350 to better feel like a pistol grip. Although these examples are made with reference to game-based weapons, virtually any other option is available, so long as the associated element or elements can, through clever use, simulate the specific augmented reality object to some degree.
Communication between the computing device and the cube 350 may use Bluetooth, WiFi, near-field, RFID, infrared, or any other communication protocol appropriate given the bandwidth and power consumption requirements. Low-power alternatives are generally preferred, so as to conserve energy for whichever elements actually perform the functions discussed.
Description of Processes
Referring now to Fig. 4, a flow chart of a process for interacting with an augmented reality environment is shown. The flow chart has a start 405 and an end 495, but the process is cyclical in nature, as shown by the dashed return arrow. The process may occur many times while a computing device is viewing and tracking a cube or other three-dimensional object.
After the start 405, the process begins with the generation of a three-dimensional environment at 410. The environment is generated on the display of the computing device. The three-dimensional environment may completely replace reality (for example, a virtual reality environment), may supplement reality by "augmenting" it (for example, augmented reality), or may include only one or more specific elements. This replacement and/or supplementation takes the form of a rendered three-dimensional environment or of objects within the environment. Thus, for example, a user in virtual reality may suddenly appear, visually, to be on the Temple Mount in Jerusalem or on the shore of Lake Como in Italy, or at a wholly imaginary location in an immersive game, in a story-based environment, or elsewhere.
A user in augmented reality typically remains in his or her current location, with a camera built into the augmented reality headset or device (such as a mobile phone) serving as a "window" into the augmented reality world. In the augmented reality world, the user first sees his or her current location, but additional objects, people, or other elements can be added. Thus, a person may sit in his or her office, but when viewing through an augmented reality computing device, a fairy may float near the office wall, or an announcer may stand in a nearby hallway narrating to the user of the augmented reality device.
Augmented reality typically attempts to merge the real and the unreal so that the display appears as normal as possible, but more cartoon-like or game-like experiences are also possible. To this end, more advanced augmented and virtual reality systems rely upon LIDAR, infrared cameras and scanners, and other similar technologies to physically map the three-dimensional features of the current environment. In this way, the exact dimensions and shape of a space can be determined, and any augmented reality objects, people, or other elements can be integrated more accurately. For example, an image can replace an actual wall without "cutting corners" or appearing to hover in mid-air. When a person is behind furniture, he or she can be rendered appropriately so that the perspective does not appear violated. These and other capabilities are possible, depending on the robustness of the associated computing device rendering the three-dimensional environment.
Against this backdrop, most augmented reality or virtual reality environments in the prior art are primarily, if not specifically, vision-dependent. Some more sophisticated systems also include controllers that can be tracked by the headset itself or by external tracking devices. In this way, such systems can track a controller in the user's hand. These controllers bear keys that enable some basic interaction. One such system's tracking, however, tracks only the light emitted by a single ball of a unique color (multiple balls can therefore be tracked simultaneously). Precisely because the balls are round, each "ball" has no side, top, or bottom. Their position can be tracked, but not their orientation.
Likewise, another commercial controller includes keys and an external ring that surrounds the holder's hand and emits trackable infrared light. In this way, the position and orientation of the holder's hand can be tracked. That tracking capability, however, requires an external camera (or several) to track the movement of those handheld controllers.
On the contrary, the use of cube described herein being that cube (or other three dimensional objects) is presented to calculating in next step The camera 420 of equipment.In the most common cases, the camera will be mobile device (such as) on camera, quilt As " portal ", augmented reality environment is experienced by " portal ".Camera does not haveEqual complication systems Equipment.On the contrary, it is only the equipment that most people has possessed, and the equipment does not include for detecting specific infrared markers Specialized hardware or other professional components.
Similarly, although three dimensional object described above can combine the multiple elements that can increase immersion experience, It is that it can be simple as having there are six the cube of unique reference mark.Only with two or three unique reference marks Object may be sufficient.As used herein, phrase " unique reference mark " does not clearly include using as reference mark collection Multiple single lamps, infrared ray or other.In the understanding to this patent, entire controller, such as utilize array of lampAn actually reference mark.If being not on several lamps of known location (usually also more It is more), computer vision technique can not just be knownPosition, orientation or the relative position of controller.Therefore,On single lamp be not reference mark-it be a lamp.Multiple lamps constitute single unique base together Fiducial mark remembers (phrase as used in this patent).
In other words, the phrase "unique fiducial marker" refers to a single marker that is inherently complete in itself and that can be used to distinguish one face or entire side of a controller or three-dimensional object (not a single point) from another face or side. Further, the unique fiducial marker can itself be used to determine the position of the object bearing it. As seen in this application, one approach is to create a six-sided cube with one unique fiducial marker on each side. Other known configurations of similar AR and VR controllers rely on infrared lights on the controller. Although accurate, each of those lights is not "complete," because a single light is insufficient to distinguish one face or side of the ____ controller from another face or side. As a group, the lights can jointly be used to derive orientation and position information, but even with only two lights, no single one of them defines any face or side.
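The distinction drawn above can be sketched in a few lines: a single "complete" fiducial marker identifies both which face is visible and how that face is rotated, something no individual point light can do. The marker IDs, corner labels, and helper below are illustrative assumptions, not part of the patent:

```python
# Sketch: a "complete" fiducial marker identifies its face and its
# in-plane rotation by itself. Marker IDs and corner labels are
# hypothetical values chosen for illustration.

# Each cube face carries one unique marker ID.
FACE_FOR_MARKER = {11: "front", 12: "back", 13: "left",
                   14: "right", 15: "top", 16: "bottom"}

def identify_face(marker_id, detected_corners, canonical_corners):
    """Return (face, rotation_degrees) for one detected marker.

    detected_corners: corner labels in the order the detector reported them
    canonical_corners: the same labels in the marker's reference order
    """
    face = FACE_FOR_MARKER[marker_id]
    # The cyclic shift between detected and canonical corner order gives
    # the marker's in-plane rotation in 90-degree steps.
    shift = detected_corners.index(canonical_corners[0])
    return face, shift * 90

face, rot = identify_face(14, ["c", "d", "a", "b"], ["a", "b", "c", "d"])
# face == "right", rot == 180: one marker yields both face and orientation.
```

A single tracked light, by comparison, yields only one 2D point per frame, which is why a group of such lights is treated here as one fiducial marker.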
The use of unique faces (each face containing a unique fiducial marker) is especially important because it reduces the total investment required to experience immersive virtual or augmented reality that includes a "controller," and it enables additional functionality that cannot be obtained without far more complex VR and AR headsets or systems and controllers.
Although the discussion here concerns multilayer unique fiducial markers (in the form of black-and-white, high-contrast images on the faces of a three-dimensional object), in some cases other computer-controlled detection techniques can be used for some aspects of tracking the position, rotation, and orientation of the three-dimensional object. For example, the unique fiducial marker may rely on an edge or corner detection technique, such as a three-dimensional object with a unique color on each side or corner. With one color on each corner, the combination of unique colors in a particular arrangement can be used to determine the particular face associated with those sides, and to determine orientation (for example, an orange corner at the lower right of the cube and a purple corner at the upper left means the cube faces in this particular direction, at a distance derived from the detected size of the corner colors).
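The corner-color alternative described above can be sketched as a lookup: the set of corner colors identifies the face, and the position of one reference color gives a coarse in-plane orientation. The color assignments below are hypothetical, not taken from the patent:

```python
# Sketch of the corner-color detection alternative. The per-face color
# assignments and the choice of "orange" as reference are assumptions.

FACE_BY_CORNER_COLORS = {
    frozenset(["orange", "purple", "green", "blue"]): "front",
    frozenset(["red", "yellow", "cyan", "white"]): "back",
}

def face_and_orientation(corners):
    """corners: dict of position -> color, e.g. {'upper_left': 'purple', ...}"""
    face = FACE_BY_CORNER_COLORS[frozenset(corners.values())]
    # The position of one reference color (assumed present on this face)
    # indicates the in-plane orientation of the face.
    reference = next((pos for pos, col in corners.items() if col == "orange"),
                     None)
    return face, reference

face, ref = face_and_orientation({
    "upper_left": "purple", "upper_right": "green",
    "lower_left": "blue", "lower_right": "orange",
})
# As in the text: orange at lower right plus purple at upper left pins
# down which face is seen and which way it is turned.
```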
Similarly, the colors or markers can be passive or active, including paint, reflective materials, and the like, or may rely on light that escapes the surface of the three-dimensional object only in certain directions and/or patterns and/or colors, or on room light. For example, a unique multilayer fiducial marker may be only black and white, but the white may be generated by light passing out of the three-dimensional object. Alternatively or additionally, the light may be color-coded so that each face is a uniquely colored light while the pattern on each face or corner is identical. Alternatively, the pattern may differ on each face or corner while the color is identical.
Similarly, the position, orientation, and rotation of the three-dimensional object can be detected, at least in part, using other techniques. Those techniques include external tracking of the three-dimensional object (for example, the object includes a marker detector and camera for tracking its own position, together with communication capabilities for an associated external device), light-based detection, and simultaneous detection of multiple or several sides using multiple external cameras. Motion, rotation, and gravity sensors may be included in the three-dimensional object itself to perform or augment the tracking of the three-dimensional object.
Next, at 430, the camera of the computing device identifies the three-dimensional object and begins tracking its position, orientation, and movement. At this stage, not only is the three-dimensional object identified as the object to be tracked, but the specific side, face, or fiducial marker (and its orientation: up or down, left or right) is identified by the computing device. Orientation matters because the associated software also understands that if the user rotates the object in a given direction, a particular face will next be presented to the camera of the computing device, and that face can cause the associated virtual or augmented reality rendered object to react accordingly. At 430, tracking of position, orientation, and movement (including rotation) begins, performed by software in combination with the camera. As described above, the camera can be used to perform the tracking, but the object may instead track itself and report its position, orientation, and movement to the associated computing device. Alternatively, the object and the computing device may each perform some or all of the processes involved in tracking.
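The tracking state kept at step 430 can be sketched minimally: the software records the visible face and, from cube geometry, can predict which face a given rotation will present next. The die-style face adjacency below is an assumed example geometry:

```python
# Minimal sketch of the step-430 tracking state. The adjacency table is
# a standard cube layout assumed for illustration; the patent does not
# prescribe a particular face arrangement.

NEXT_FACE = {  # (visible_face, rotation) -> face presented to the camera next
    ("front", "roll_forward"): "top",
    ("front", "roll_backward"): "bottom",
    ("front", "turn_left"): "right",
    ("front", "turn_right"): "left",
}

class CubeTracker:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.visible_face = None

    def observe(self, position, face):
        """Record one camera observation of the cube."""
        self.position, self.visible_face = position, face

    def predict_next_face(self, rotation):
        """Which face will face the camera after the given rotation."""
        return NEXT_FACE[(self.visible_face, rotation)]

t = CubeTracker()
t.observe((0.1, 0.0, 0.5), "front")
# Rolling the cube forward will next present its top face to the camera,
# which is what lets the software react before the new face is even seen.
```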
Now, the three-dimensional object (such as a cube) can be associated with some aspect of the virtual reality environment or augmented reality user interface shown on the display. The association can be as simple as the three-dimensional object being "you" (the user of the computing device) in the virtual or augmented reality environment displayed on the computing device. Alternatively, the three-dimensional object can be a stand-in for a weapon, gun, ball, map, compass, or other kind of object. Alternatively, the three-dimensional object can be associated with a menu, an operation, a volume setting, the user's "view" or perspective of the augmented reality environment, the pages of a virtual or augmented reality book, or other similar aspects of the virtual or augmented reality environment or of an object.
The association can occur automatically. For example, the user can load a particular game, application, or experience. Upon loading, the game, application, or experience can begin using the camera of the computing device. The game, application, or experience may expect to see a cube or other three-dimensional object. Accordingly, it can continuously scan camera frames for the expected three-dimensional object. Once the object is found, the software can automatically associate it with a particular aspect of the user interface.
For example, the object can become a starfighter floating in space, and movement of the object can cause the starfighter to move in a similar way, mirroring the user's movement of the object. Rolling the object forward may cause the starfighter to fly downward or increase speed. Rolling the object backward may cause the starfighter to rise or slow down.
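The starfighter mapping just described can be sketched as a simple update rule; the gain value and sign convention are made-up illustrations, not the patent's implementation:

```python
# Sketch of the starfighter example: a forward roll dives / speeds up,
# a backward roll climbs / slows. The gain of 0.5 units of speed per
# degree of roll is a hypothetical value.

def update_fighter(speed, pitch_delta_deg, gain=0.5):
    """pitch_delta_deg > 0 means the cube was rolled forward."""
    speed = max(0.0, speed + gain * pitch_delta_deg)
    climbing = pitch_delta_deg < 0
    return speed, climbing

speed, climbing = update_fighter(10.0, 4.0)    # forward roll: faster, diving
speed, climbing = update_fighter(speed, -8.0)  # backward roll: slower, climbing
```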
In other cases, the association can be selected manually (for example, by interacting with a menu on the display of the computing device), or it can be enabled through interaction with the three-dimensional object itself. For example, clicking, squeezing, or moving the object in a specific way (for example, spelling a "Z" in the air) can cause the object to control a "zoom" function in the interface, to control the volume of a related application, or to select a paintbrush in an application. The movements and/or actions can be predefined by the application itself, or they can be user-programmable. In this way, the object can serve as a "mouse" or as some other interactive element for any number of applications. For example, clicking and twisting (rotating about the Y axis) can cause the object to act as (and visually appear in the display of the related application as) a volume knob. When it is turned to the right, the volume may increase. When it is turned to the left, the volume may decrease, much like a standard volume knob, while throughout, the user is actually holding only a cube whose six faces bear different fiducial markers.
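The volume-knob behavior can be sketched as a mapping from yaw (rotation about the Y axis) to a clamped volume level; the degrees-per-unit scale factor is an assumption for illustration:

```python
# Sketch of the "volume knob" association: twisting the cube about the
# Y axis raises or lowers the volume. The scale factor is hypothetical.

def knob_volume(volume, yaw_delta_deg, degrees_per_unit=4.0):
    """Positive yaw_delta_deg = twist to the right = volume up."""
    volume += yaw_delta_deg / degrees_per_unit
    return min(100.0, max(0.0, volume))

v = knob_volume(50.0, 40.0)   # right twist raises the volume to 60.0
v = knob_volume(v, -80.0)     # left twist lowers it to 40.0
```

Throughout, the physical object never changes; only the user-interface element associated with its rotation does.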
Once the three-dimensional object is associated with a particular user interface element at 440, movement of the object can be detected at 450. The movement can take essentially any form. For example, it can be a pivot, a rotation about multiple axes, a translation to either side, up or down, or toward or "away from" the user (or the display or camera). The movement can be fast or slow (and its speed can be detected and may matter, depending on the augmented reality object or function associated with the three-dimensional object).
The movement can also be dynamic, such as when the object is thrown into the air, between users, or at a target. Because simple computer vision techniques can track the three-dimensional object at multiple depths (for example, using multilayer fiducial markers), the object can be tracked reliably at distances close to the user before a throw and farther from the user after the throw. In some cases, multiple three-dimensional objects can be used as part of a game built around throwing or passing the objects.
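One way to read the multilayer-marker idea is that different layers of the marker remain decodable at different apparent sizes, so tracking survives a throw. The pixel-width thresholds and layer names below are hypothetical:

```python
# Sketch of depth-spanning tracking with a multilayer fiducial marker:
# close up, a fine inner layer is resolvable; after a throw, only a
# coarse outer layer is. Thresholds are made-up illustrative values.

def usable_layer(marker_pixel_width):
    """Pick which layer of a multilayer fiducial marker to decode."""
    if marker_pixel_width >= 120:
        return "fine_inner_layer"    # close range, e.g. before the throw
    if marker_pixel_width >= 30:
        return "coarse_outer_layer"  # far range, e.g. after the throw
    return None                      # too small in frame to track

layer = usable_layer(200)  # cube held in hand
layer = usable_layer(45)   # cube mid-flight across the room
```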
Because object tracking in general has existed for some time, the movements most relevant to this application are those that involve tracking a specific single face, or multiple faces, of the three-dimensional object. Most commonly, that will be rotation about one or more axes. But it may also be tracking which face is currently being squeezed or clicked, or which face is held in a particular user's hand (or elsewhere). For example, if face X is detected as visible, and the three-dimensional object is assumed to be held in the right hand, then face Y is most likely the one held closest to the skin of the user's hand. When interaction with the object occurs in the virtual or augmented reality environment, this information can be used to provide dynamics to that face or to the face closest to it (for example, heating or impact).
The detected movement can be used to update the user interface and/or the three-dimensional object itself at 460. Specifically, the step 440 of associating the three-dimensional object with the user interface may serve as an earlier step of identifying, automatically or as a selective action, which aspect of the user interface and/or three-dimensional object will be the subject of the updates at 460. Thus, for example, a volume interaction can be selected at 440, in which case the movement detected at 450 can be used to update the volume. Alternatively, if a color selector is associated with the three-dimensional object at 440, then rotation of the three-dimensional object detected at 450 can cause a change in the color of the paint used (for example, by a paintbrush used by the user and/or represented by the three-dimensional object in the augmented or virtual reality environment). If the three-dimensional object is associated at 440 with an avatar, race car, or spaceship in a virtual reality or augmented reality game, then the movement detected at 450 (such as a forward rotation) can cause that augmented or virtual reality object to increase speed, decrease speed, jump, or perform other movements.
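The color-selector case can be sketched as rotation sweeping the brush hue around the color wheel; the one-to-one degrees-of-rotation-to-degrees-of-hue mapping is an assumed choice:

```python
# Sketch of the 440/450/460 color-selector example: detected rotation of
# the cube updates the paintbrush hue. The 1:1 mapping is hypothetical.

def hue_from_rotation(base_hue_deg, rotation_delta_deg):
    """Map a detected rotation onto the 0-359 degree color wheel."""
    return (base_hue_deg + rotation_delta_deg) % 360

hue = hue_from_rotation(30, 90)     # rotating shifts the paint color
hue = hue_from_rotation(hue, -150)  # counter-rotation wraps back past 0
```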
At decision step 465, the computing device tracking the movement of the three-dimensional physical object determines whether the particular movement has ended. This can occur through a deselection or a completed selection performed with the three-dimensional object (for example, a click, a swipe, or a similar action), or through a timeout (for example, 4 seconds without change, followed by a particular action or selection of a user interface element). If the particular movement is not complete ("No" at 465), the process continues detecting movement at 450.
If the particular movement has ended ("Yes" at 465), the process continues at decision step 475 to determine whether the overall interaction is complete. Here, the application, game, or other virtual or augmented reality environment operating as software on the computing device can check whether the overall process is finished. This may be very simple, such as the game being over or the user no longer navigating a maze or the like. But it may also be complex, such as the user having deselected a paintbrush tool in a paint-like application without yet exiting the application. If that is the case ("No" at 475), the computing device can associate the three-dimensional object with some other aspect of the user interface at 440, and the process can begin again. For example, the user has deselected the paintbrush but has now selected a spray-paint tool. The overall process is not yet complete, but the particular interaction initially tracked is over.
If the interaction is over ("Yes" at 475), the computing device can determine at decision step 485 whether the entire process has ended. At this step, the software may simply be closed, or the mobile device or other computing device put away. If so ("Yes" at 485), the process completes at end point 495. If not ("No" at 485), the three-dimensional object may have been lost because the camera was blocked, may have moved out of the field of view, or may have become unavailable. The process may then continue at 430, identifying the object and its position, and may proceed from there.
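The decision chain of steps 450/465/475/485 can be sketched as one transition function over the flowchart's step numbers; the handling is schematic, not the patent's implementation:

```python
# Sketch of the Fig. 4 loop: detect movement (450), test movement-end
# (465), test interaction-end (475), test process-end (485), then
# re-associate (440) or re-identify the object (430) as needed.

def next_step(step, signal):
    """One transition of the loop. `signal` answers the decision at the
    current step (True = 'Yes')."""
    if step == 450:                       # movement observed; test if it ended
        return 465
    if step == 465:                       # movement ended?
        return 475 if signal else 450
    if step == 475:                       # whole interaction complete?
        return 485 if signal else 440     # else associate another UI element
    if step == 485:                       # entire process ended?
        return 495 if signal else 430     # else re-identify the object
    return step

# A movement ends, but the interaction continues with a new element:
s = next_step(465, True)   # movement done -> 475
s = next_step(s, False)    # interaction not done -> back to 440
```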
Fig. 5 is a flowchart of a process of updating the dynamics of a three-dimensional object in response to changes in an augmented reality environment. The flowchart has a start 505 and an end 595, but as indicated, the process is essentially cyclical. The process can occur many times while the computing device views and tracks the cube or other three-dimensional object.
The process starts at step 510 by rendering a three-dimensional environment (or object), such as a virtual reality or augmented reality environment. This is discussed above. The rendering device can be a computing device such as a VR/AR headset, a mobile device, or a tablet computer.
At step 520, the three-dimensional object can be presented to the computing device, which can identify it as such. As described above, the object may include one or more fiducial markers, lights, or other identifiable aspects. To generate dynamics in the three-dimensional object, the object does not necessarily require multiple fiducial markers, but it can have them.
Then, at step 530, the three-dimensional object can be associated with a three-dimensional environment object. Thus, in virtual or augmented reality, the object can be associated with an environment object automatically or by user action/selection. At this point, the actual, real three-dimensional object seen on the display of the computing device can be replaced on that display by an augmented reality or virtual reality object (for example, a heart, a starfighter, a personal avatar, a gun, and so on). In an augmented reality environment, the rest of reality will continue to be displayed normally, but the object (for example, the heart) will appear to be held in the user's hand in place of the cube or other three-dimensional object.
The computing device can communicate (for example, via Bluetooth or otherwise) with the three-dimensional object, which includes one or more of the dynamics-generating elements discussed with reference to Fig. 3. At 540, the augmented reality heart can begin to "beat" on the display of the computing device. Meanwhile, the computing device can instruct the haptic element 352d to begin "beating" or operating so as to simulate a heartbeat matching the rhythm shown on the display. Further, it can instruct the temperature element 352f to slightly raise the temperature of the three-dimensional object to better simulate a human heart. Finally, the air bladders 352g can all be inflated so that the object feels more "round" and thus feels more like a human heart when held in the user's hands.
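The instructions issued at 540 to the object's dynamic elements can be sketched as a small command set; the command format and the specific values are illustrative assumptions layered on the elements named above:

```python
# Sketch of the step-540 instructions to the object's dynamic elements
# (haptic 352d, temperature 352f, air bladders 352g of Fig. 3). The
# tuple command format and the numeric values are hypothetical.

def heart_dynamics(bpm=72):
    """Build the command set that makes the cube feel like the rendered heart."""
    return [
        ("haptic_352d", "pulse", bpm),            # beat in rhythm with display
        ("temperature_352f", "raise_c", 1.5),     # slightly warm the object
        ("air_bladders_352g", "inflate", "all"),  # rounder, heart-like shape
    ]

commands = heart_dynamics(bpm=80)
# Each tuple would then be sent to the object, e.g. over Bluetooth.
```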
At 550, the dynamics of the three-dimensional object are updated as instructed at 540. As described above, any combination of the available dynamics can in fact be used together to generate different sensations or haptics for the user, especially a user holding the three-dimensional object.
If any additional dynamics are desired ("Yes" at decision step 555) (for example, the heart stops beating in a dramatic way to indicate cardiac arrest), instructions can be received from the software operating on the computing device at 540, and the object's dynamics can be updated again at 550.
If no other dynamics are to be updated ("No" at 555), the process can end at 595 until the next cycle in which object dynamics are desired.
Fig. 6 is an example of a computing device 630 engaged in computer vision detection and tracking of a three-dimensional object 650. The computing device 630 is shown as the back of a mobile device or the front of an augmented reality or virtual reality headset. The computing device 630 includes a camera 637 that captures images of what is in front of the computing device 630.
One of the objects in front of the computing device 630 is the three-dimensional object 650. The three-dimensional object can be a six-faced cube that includes a unique fiducial marker on each face, so that in addition to position, orientation can be tracked by the camera 637.
Fig. 7 is an example of the computing device 730 replacing the detected three-dimensional object 650 (Fig. 6) with a rendered three-dimensional object 750 (such as a person) in an augmented reality environment. Fig. 7 is identical to Fig. 6, except that the rendered three-dimensional object 750 of the computing device 730 replaces the three-dimensional object 650 of Fig. 6 in the rendered environment; the description of the identical elements is not repeated here. The rendered three-dimensional object 750 can be rendered at the same position and orientation as the three-dimensional object 650 and, as described below, the rendered three-dimensional object 750 can move in the same manner as the three-dimensional object 650.
Fig. 8 is a screen display 838 of a computing device 830, showing a three-dimensional physical object 850 that can be rotated about three axes. The three-dimensional physical object 850, detected by the camera 737, can appear on the display 838. Because the object 850 has a unique fiducial marker on each face, its orientation can be detected, and usually multiple of its sides are visible at once. Rotation and orientation can be tracked using only an image camera 737 (such as RGB, black-and-white, or ultraviolet).
Fig. 9 is a screen display 938 of a computing device 930, showing a rendered three-dimensional object 950 substituting for the physical three-dimensional object 850. Here, the rendered three-dimensional object 950 on the display 938 replaces the actual three-dimensional object 850 captured by the camera 737. The display 938 can present the virtual environment or reality in which the rendered three-dimensional object 950 is placed. Moreover, rotation can be tracked, along with the other functions described herein.
Fig. 10 is an example of replacing a three-dimensional physical object 1050 with a rendered object 1050' in the augmented reality display 1038 of a computing device 1030, where the physical object includes dynamics associated with the rendered object 1050'.
As described above, the dynamics can be any number of things, or groups of things, generated by the various elements 352a-352h (Fig. 3). The dynamics of the rendered three-dimensional object 1050', shown as a heart, may include a heartbeat, heat, and a roundness of the cube based on the shape formed by the air bladders. As a result, the real-world three-dimensional physical object 1050 can "feel" the way the three-dimensional object 1050' appears on the display 1038. The dynamics can be updated to correspond to the object or to provide feedback from other interactions with the environment shown on the display 1038.
Conclusion
Throughout this specification, the embodiments and examples shown should be considered as exemplars, rather than limitations on the disclosed or claimed apparatus and processes. Although many of the examples presented here involve specific combinations of method acts or system elements, it should be understood that those acts and those elements can be combined in other ways to accomplish the same objectives. With regard to flowcharts, more and fewer steps may be taken, and the steps as shown may be combined or further refined to achieve the methods described herein. Acts, elements, and features discussed only in connection with one embodiment are not intended to be excluded from a similar role in other embodiments.
As used herein, "plurality" means two or more. As used herein, a "set" of items may include one or more of such items. As used herein, whether in the written description or the claims, the terms "comprising," "including," "carrying," "having," "containing," "involving," and the like are to be understood as open-ended, that is, to mean including but not limited to. Only the transitional phrases "consisting of" and "consisting essentially of," respectively, are closed or semi-closed transitional phrases with respect to claims. Use of ordinal terms such as "first," "second," "third," etc. in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed, but is used merely as a label to distinguish one claim element having a certain name from another element having the same name (but for the use of the ordinal term). As used herein, "and/or" means that the listed items are alternatives, but the alternatives also include any combination of the listed items.

Claims (20)

1. An apparatus comprising a processor, memory, and a three-dimensional physical object bearing at least two unique fiducial markers, the processor executing instructions that cause the processor to:
generate a three-dimensional environment, the three-dimensional environment including a user interface element for interacting with the three-dimensional environment;
detect movement of the three-dimensional physical object using the at least two unique fiducial markers; and
update the user interface element in the three-dimensional environment based on the movement of the three-dimensional physical object.
2. The apparatus of claim 1, wherein the three-dimensional physical object is a cube having six unique fiducial markers, with a unique fiducial marker on each face of the cube.
3. The apparatus of claim 1, wherein the user interface element pertains to a setting that can be adjusted up and down, and wherein movement of the three-dimensional object adjusts the setting upward with translation or rotation in one direction, and adjusts the setting downward with translation or rotation in a different direction.
4. The apparatus of claim 1, wherein the user interface element is one element of a plurality of sequentially arranged elements, and wherein movement of the three-dimensional object by translation or rotation in one direction causes the user interface element to be updated to a later element in the plurality of sequential elements, and by translation or rotation in a different direction causes the user interface element to be updated to an earlier element in the plurality of sequential elements.
5. The apparatus of claim 1, wherein the user interface element is a size of one or more aspects of the three-dimensional environment relative to a viewer, and wherein movement of the three-dimensional object by translation or rotation in one direction causes the size to increase, and by translation or rotation in a different direction causes the size to decrease.
6. The apparatus of claim 1, wherein the user interface element is a position of a viewer within the three-dimensional environment, and wherein movement of the three-dimensional object causes the position to be updated within the three-dimensional environment in response to translation or rotation of the three-dimensional physical object.
7. The apparatus of claim 1, wherein the three-dimensional environment includes a user avatar that can move at variable speeds, and wherein translation or rotation of the three-dimensional physical object at a first speed causes the avatar to move at an associated speed, and translation or rotation of the three-dimensional physical object at a different speed causes the avatar to move at another associated speed.
8. A method of interacting with a three-dimensional environment, comprising:
generating a three-dimensional environment, the three-dimensional environment including a user interface element for interacting with the three-dimensional environment;
detecting, using a camera, movement of a three-dimensional physical object bearing at least two unique fiducial markers; and
updating the user interface element in the three-dimensional environment based on the movement of the three-dimensional physical object.
9. The method of claim 8, wherein the three-dimensional physical object is a cube having six unique fiducial markers, with a unique fiducial marker on each face of the cube.
10. The method of claim 8, wherein the user interface element pertains to a setting that can be adjusted up and down, and wherein movement of the three-dimensional object adjusts the setting upward with translation or rotation in one direction, and adjusts the setting downward with translation or rotation in a different direction.
11. The method of claim 8, wherein the user interface element is one element of a plurality of sequentially arranged elements, and wherein movement of the three-dimensional object by translation or rotation in one direction causes the user interface element to be updated to a later element in the plurality of sequential elements, and by translation or rotation in a different direction causes the user interface element to be updated to an earlier element in the plurality of sequential elements.
12. The method of claim 8, wherein the user interface element is a size of one or more aspects of the three-dimensional environment relative to a viewer, and wherein movement of the three-dimensional object by translation or rotation in one direction causes the size to increase, and by translation or rotation in a different direction causes the size to decrease.
13. The method of claim 8, wherein the user interface element is a position of a viewer within the three-dimensional environment, and wherein movement of the three-dimensional object causes the position to be updated within the three-dimensional environment in response to translation or rotation of the three-dimensional physical object.
14. The method of claim 8, wherein the three-dimensional environment includes a user avatar that can move at variable speeds, and wherein translation or rotation of the three-dimensional physical object at a first speed causes the avatar to move at an associated speed, and translation or rotation of the three-dimensional physical object at a different speed causes the avatar to move at another associated speed.
15. A system, comprising:
a computing device including a processor and memory;
a camera in communication with the computing device; and
a three-dimensional physical object including at least two unique fiducial markers;
wherein the processor is configured to:
generate a three-dimensional environment, the three-dimensional environment including a user interface element for interacting with the three-dimensional environment;
detect, using the camera, movement of the three-dimensional physical object bearing the at least two unique fiducial markers, to determine absolute and relative positions of the two unique fiducial markers as the three-dimensional physical object translates or rotates; and
update the user interface element in the three-dimensional environment based on the movement of the three-dimensional physical object indicated by the absolute and relative positions of the two unique fiducial markers.
16. The system of claim 15, wherein the at least two unique fiducial markers are multilayer fiducial markers.
17. The system of claim 16, wherein the processor performs motion tracking of the three-dimensional object to detect the movement.
18. The system of claim 17, wherein the computer vision system uses a first layer of the multilayer fiducial markers to detect movement of the three-dimensional object at a first depth, and uses a second layer of the multilayer fiducial markers to detect movement of the three-dimensional object at a second depth.
19. The system of claim 15, wherein each of the six unique fiducial markers is at least one of: a high-contrast image, inclusive of a particular color, a light-emitting element of a particular color, a reflective material of a particular color, and a three-dimensional feature with an associated image.
20. The system of claim 15, wherein the computing device is one of: a mobile phone, a tablet computer, a portable computer, a virtual reality headset, an augmented reality headset, and a digital camera.
CN201880005791.3A 2017-01-02 2018-01-02 Three-dimensional augmented reality object user interface functionality Active CN110140100B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201762441525P 2017-01-02 2017-01-02
US62/441525 2017-01-02
US201762469292P 2017-03-09 2017-03-09
US62/469292 2017-03-09
PCT/US2018/012110 WO2018126281A1 (en) 2017-01-02 2018-01-02 Three-dimensional augmented reality object user interface functions

Publications (2)

Publication Number Publication Date
CN110140100A true CN110140100A (en) 2019-08-16
CN110140100B CN110140100B (en) 2020-02-28

Family

ID=67569131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880005791.3A Active CN110140100B (en) 2017-01-02 2018-01-02 Three-dimensional augmented reality object user interface functionality

Country Status (2)

Country Link
EP (1) EP3563568A4 (en)
CN (1) CN110140100B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110955988A (en) * 2019-10-30 2020-04-03 郑州飞机装备有限责任公司 Modeling method for solving dynamic characteristics of hanging ejection mechanism by using ADAMS
CN111506379A (en) * 2020-04-17 2020-08-07 成都安易迅科技有限公司 Interface control method and device, storage medium and electronic equipment
US11334215B1 (en) 2021-04-09 2022-05-17 Htc Corporation Method of generating user-interactive object, multimedia system, and non-transitory computer-readable medium
US11449099B2 (en) 2014-07-16 2022-09-20 Ddc Technology, Llc Virtual reality viewer and input mechanism

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005116805A1 (en) * 2004-05-28 2005-12-08 National University Of Singapore An interactive system and method
US20070171194A1 (en) * 2005-12-22 2007-07-26 Francois Conti Workspace expansion controller for human interface systems
CN103793936A (en) * 2012-10-31 2014-05-14 波音公司 Automated frame of reference calibration for augmented reality
CN104508600A (en) * 2012-07-27 2015-04-08 日本电气方案创新株式会社 Three-dimensional user-interface device, and three-dimensional operation method
US20150130836A1 (en) * 2013-11-12 2015-05-14 Glen J. Anderson Adapting content to augmented reality virtual objects
CN104834897A (en) * 2015-04-09 2015-08-12 东南大学 System and method for enhancing reality based on mobile platform
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7295220B2 (en) * 2004-05-28 2007-11-13 National University Of Singapore Interactive system and method
US20050289590A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Marketing platform
WO2010124333A1 (en) * 2009-04-29 2010-11-04 Jumbuck Entertainment Ltd Computer input device and computer interface system
US20110065496A1 (en) * 2009-09-11 2011-03-17 Wms Gaming, Inc. Augmented reality mechanism for wagering game systems
GB2527471A (en) * 2013-05-31 2015-12-23 Anki Inc Mobile agents for manipulating, moving, and/or reorienting components

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11449099B2 (en) 2014-07-16 2022-09-20 Ddc Technology, Llc Virtual reality viewer and input mechanism
CN110955988A (en) * 2019-10-30 2020-04-03 郑州飞机装备有限责任公司 Modeling method for solving dynamic characteristics of hanging ejection mechanism by using ADAMS
CN110955988B (en) * 2019-10-30 2024-04-02 郑州飞机装备有限责任公司 Modeling method for solving dynamic characteristics of hanging ejection mechanism using ADAMS
CN111506379A (en) * 2020-04-17 2020-08-07 成都安易迅科技有限公司 Interface control method and device, storage medium and electronic equipment
US11334215B1 (en) 2021-04-09 2022-05-17 Htc Corporation Method of generating user-interactive object, multimedia system, and non-transitory computer-readable medium

Also Published As

Publication number Publication date
EP3563568A4 (en) 2020-11-11
EP3563568A1 (en) 2019-11-06
CN110140100B (en) 2020-02-28

Similar Documents

Publication Publication Date Title
US10228773B2 (en) Three-dimensional augmented reality object user interface functions
US11762478B2 (en) Virtual or augmediated topological sculpting, manipulation, creation, or interaction with devices, objects, materials, or other entities
US11826636B2 (en) Depth sensing module and mobile device including the same
JP5525923B2 (en) Image processing program, image processing apparatus, image processing system, and image processing method
US11517821B2 (en) Virtual reality control system
CN104704535A (en) Augmented reality system
CN106055090A (en) Virtual reality and augmented reality control with mobile devices
CN110140100A (en) Three-dimensional enhanced reality object user&#39;s interface function
CN110168475A (en) User&#39;s interface device is imported into virtual reality/augmented reality system
JP2010253277A (en) Method and system for controlling movements of objects in video game
JP2010257461A (en) Method and system for creating shared game space for networked game
CN108986188A (en) AR video generation device
JP6554139B2 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
JP6826626B2 (en) Viewing program, viewing method, and viewing terminal
Loviscach Playing with all senses: Human–Computer interface devices for games
US20240074816A1 (en) Sterilizable image manipulation device
JP7412613B1 (en) Information processing systems and programs
JP7412617B1 (en) Information processing systems and programs
US20230149805A1 (en) Depth sensing module and mobile device including the same
CN117234333A (en) VR object selection method, VR object selection device, electronic device and readable storage medium
JP2024047008A (en) Information Processing System
JP2024047006A (en) Information processing system and program
JP2021051762A (en) Viewing program, viewing method, and viewing terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40011262
Country of ref document: HK