WO2018094289A1 - Remote placement of digital content to facilitate an augmented reality system - Google Patents

Remote placement of digital content to facilitate an augmented reality system

Info

Publication number
WO2018094289A1
WO2018094289A1 (PCT/US2017/062426)
Authority
WO
WIPO (PCT)
Prior art keywords
digital content
view
perspective
content
augmented reality
Prior art date
Application number
PCT/US2017/062426
Other languages
English (en)
Inventor
Wolfram GAUGLITZ
John FORTKORT
Original Assignee
Picpocket, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Picpocket, Inc. filed Critical Picpocket, Inc.
Publication of WO2018094289A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • the present disclosure relates generally to augmented reality systems, and more specifically to systems, methods and applications which facilitate the remote association, storage and retrieval of digital media at a specific geographic location within an augmented reality system.
  • U.S. 8,963,957 (Skarulis), entitled "SYSTEMS AND METHODS FOR AN AUGMENTED REALITY PLATFORM", discloses a two-step process by which augmented reality (AR) content is created and displayed to a user using AR software disposed on a client device.
  • the client device is configured to receive information from sensors embedded therein, including a GPS, an accelerometer, and a compass.
  • AR content is created by aligning an instance of media with a view of reality through the use of the client device.
  • a marker which represents at least a portion of the view of reality which is related to the medium, is then generated.
  • Metadata is generated for the medium and marker using data from the sensors of the client device. This metadata includes the geographical location, yaw, pitch, and roll of the client device at the time of marker creation, and also includes the relationship between the marker and the medium.
  • the medium, marker and metadata are then sent to a data repository.
  • the second step of the Skarulis process occurs when a user of a client device (which may be the same as, or different from, the device used to create the AR content) uses the device to observe a view of reality which has AR content associated with it.
  • the AR software matches a marker associated with the AR content to the view of reality, as by matching one or more patterns in the marker to one or more patterns in the view of reality.
  • the AR software then overlays the associated medium over the view of reality based on the relationship between the medium and the marker, and displays the resulting augmented view of reality.
  • the AR software may use arrows or other features to instruct the user on how to position the device so that the AR content may be observed.
  • Google Maps is a mapping service developed by Google which offers street maps, 360° panoramic street level views (available through its "Street View" feature), real-time traffic conditions and route planning capabilities.
  • Apple Maps, Mapquest, and various other products and services offer similar features and capabilities.
  • FIG. 1 is an example of a Google Street View URL which captures the address, GPS coordinates and other relevant parameters which may be used to describe a perspective view.
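FIG. 1 is described as a Street View URL encoding the GPS coordinates and other parameters of a perspective view. A minimal sketch of recovering such parameters from a URL is shown below; the '@lat,lng,(zoom)a,(fov)y,(heading)h,(tilt)t' segment format is an assumption modeled on publicly observable Street View URLs, not a format specified by this document.

```python
import re

def parse_street_view_url(url):
    """Extract camera parameters from a Street View-style URL.

    The '@lat,lng,...a,...y,...h,...t' segment is an illustrative
    approximation of the URL format FIG. 1 refers to; returns None
    when no such segment is present.
    """
    m = re.search(
        r"@(-?\d+\.?\d*),(-?\d+\.?\d*),(?:\d+a,)?(\d+\.?\d*)y,(\d+\.?\d*)h,(\d+\.?\d*)t",
        url,
    )
    if not m:
        return None
    lat, lng, fov, heading, tilt = map(float, m.groups())
    # heading is the compass orientation; tilt corresponds to pitch
    return {"lat": lat, "lng": lng, "fov": fov, "heading": heading, "tilt": tilt}
```

Such a parse would supply the location and orientation parameters that the systems described below associate with a perspective view.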
  • FIG. 2 is an example of a Google Street View image described by said URL.
  • a method for viewing digital content in an augmented reality (AR) system.
  • the method comprises (a) receiving, from a computational device equipped with a display, viewpoint data which specifies a perspective view from a geographic location, wherein said view has augmented reality content associated with it; (b) obtaining, from mapping software, an image of a particular view associated with the geographic location; (c) obtaining, from an augmented reality system, digital content corresponding to the particular view of the geographic location; and (d) displaying, on the display of the computational device, the image with the digital content superimposed over it.
  • a method is provided for placing digital content in an augmented reality (AR) system.
  • the method comprises (a) receiving, from a computational device equipped with a display, viewpoint data which specifies a perspective view from a geographic location, wherein said view has augmented reality content associated with it; (b) obtaining, from mapping software, an image of a particular view associated with the geographic location; (c) receiving digital content and associated placement data, wherein said placement data indicates the prescribed location and orientation of the digital content at the perspective view; and (d) displaying, on the display of the computational device, the image with the digital content superimposed over it at the prescribed location and orientation specified in the placement data.
  • a method for determining the placement of digital content in an augmented reality (AR) system.
  • the method comprises (a) providing mapping software which displays at least one of a series of frames related to a real-world location, wherein each frame contains a unique view of the location; (b) providing ghosting software which ghosts digital content onto a view of a real world location which is observed by a user; (c) receiving, from a computational device equipped with a display, input which specifies the location and orientation of the computational device; (d) in response to the received location and orientation of the computational device, using the ghosting software to superimpose, over a real-world view on the display of the computational device which corresponds to the received location and orientation, a ghosted version of digital content corresponding to the received location and orientation; (e) identifying a frame in the mapping software which corresponds to the real-world view; (f) using the identified frame to determine parameters used to superimpose the ghosted content on the real-world view; and (g) displaying, on the display of the computational device, the real-world view with the ghosted content superimposed on it.
  • a system for augmenting a view of reality.
  • the system comprises (a) a client perspective module stored on a client device, the client device comprising at least one processor, the client perspective module configured to, when executed by the at least one processor, superimpose a first medium over a first view of reality, receive one or more of a change in transparency of the superimposed first medium, a change in size of the superimposed first medium, and a change in position of the superimposed first medium, and send the first medium, a first marker and first metadata to a depository;
  • a client viewer module stored on the client device, the client viewer module configured to, when executed by the at least one processor, receive a second medium, a second marker, and second metadata from the depository, match the second marker to at least a portion of a second view of reality, and superimpose the second medium over the at least a portion of the second view of reality to generate an augmented view of reality;
  • a mapping module which, in response to user input on the client device which includes a specified location, displays at least one of a series of frames related to that location, wherein each frame contains a unique view of the location;
  • a matching module which matches the augmented view of reality generated by the client viewer module to a corresponding frame in the mapping module; and (e) a display module which displays the augmented view of reality.
  • a system for generating content for an augmented reality system.
  • the system comprises (a) a mapping solution selected from the group consisting of mapping software, mapping applications and mapping services; (b) a browser which accesses said mapping solution and which renders perspective views of a location therefrom; (c) a content placement module which (a) superimposes digital content on a perspective view rendered by the browser from said mapping solution, thereby yielding a composite image, and (b) extracts information from the composite image, wherein the extracted information specifies the location of the superimposed digital content relative to the perspective view; and (d) an augmented reality system which receives composite images generated by the content placement module and which ghosts the corresponding digital content over a real-world view corresponding to the perspective view.
  • a system for generating content for an augmented reality system.
  • the system comprises (a) a mapping solution selected from the group consisting of mapping software, mapping applications and mapping services; (b) an extraction module which accesses said mapping solution and which extracts perspective views of locations therefrom; (c) a content placement module which (a) receives extracted perspective views from said extraction module, (b) superimposes digital content on said extracted perspective views, thereby yielding composite images, and (b) extracts information from the composite images, wherein the extracted information extracted from each composite image specifies the location of the corresponding superimposed digital content relative to the corresponding perspective view; and (d) an augmented reality system which receives composite images generated by the content placement module and which, for each received composite image, ghosts the digital content of the composite image over the real-world view corresponding to the perspective view of the composite image.
  • an augmented reality system which serves up digital content to a viewer installed on a host device.
  • the system comprises (a) a host device equipped with location awareness and orientation awareness functionalities; (b) a database of composite images, wherein each composite image in the database includes a perspective view of a real-world location and digital content to be ghosted over the perspective view, and wherein each composite image has embedded therein parameters which describe the real-world location corresponding to the perspective view and the orientation required for a device to assume relative to the real-world location in order to recall the digital content; and (c) an augmented reality viewer which is installed on said host device and which is in communication with said database.
  • said augmented reality viewer monitors the location and orientation of the host device and, upon determining that the host device is at a location and orientation corresponding to a perspective view of an image in said database of composite images, serves up the corresponding digital content.
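The monitoring behavior described above can be sketched as a pose-matching scan over the composite-image database; the distance and heading tolerances, the field names, and the equirectangular distance approximation are all illustrative assumptions rather than details taken from this document.

```python
import math

def matches_view(device, view, pos_tol_m=15.0, heading_tol_deg=20.0):
    """Return True when the host device's location and orientation are
    close enough to a composite image's recorded perspective view.
    Tolerances are illustrative; a real system would tune them."""
    # Equirectangular approximation is adequate over such short distances.
    dlat = math.radians(device["lat"] - view["lat"])
    dlng = math.radians(device["lng"] - view["lng"]) * math.cos(math.radians(view["lat"]))
    dist_m = 6371000.0 * math.hypot(dlat, dlng)
    # Smallest signed difference between two compass headings.
    dheading = abs((device["heading"] - view["heading"] + 180) % 360 - 180)
    return dist_m <= pos_tol_m and dheading <= heading_tol_deg

def serve_content(device, composite_db):
    """Scan the database and return the digital content whose perspective
    view matches the device pose, or None when nothing matches."""
    for entry in composite_db:
        if matches_view(device, entry["view"]):
            return entry["content"]
    return None
```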
  • a tangible, non-transient, computer readable medium having programming instructions recorded therein which, when executed by at least one computer processor, implement any of the foregoing systems or methodologies.
  • AR: augmented reality
  • AR offerings lack a convenient means for correction of the location and orientation of AR content. Due to placement errors, the movement of real world objects, or natural phenomena, the placement of AR content may require correction from time to time. Frequently, such corrections may have to be implemented by parties other than the original content creator or provider. However, such corrections are often complicated by the fact that it may not be apparent to the party making the corrections what the original or intended placement and orientation of the AR content was.
  • Still other AR offerings lack the means by which an author of AR content, or an authorized third party, may conveniently place that content from a remote location.
  • This shortcoming impedes the implementation of AR technologies by adding a hurdle to such systems, namely, the need to physically visit a location in order to associate AR content with it.
  • this shortcoming creates a situation in which less visited venues, such as those that are more remote from population centers, tend to be underserved by AR content creators and providers.
  • existing AR offerings may provide insufficient incentive to others to visit such remote locations (as may be desirable, for example, to drive foot traffic to local businesses), provide content for these locations, or consume content that has already been provided.
  • Systems and methodologies are disclosed herein which utilize mapping software (such as, for example, Google Street View) images and/or stock photos to facilitate the remote placement of digital content in a (preferably marker-based) augmented reality (AR) system.
  • a particular instance of digital content may be associated with a particular view in mapping software such as, for example, a frame from the 'street view' mode of Google Maps.
  • the content may be overlaid or "ghosted" onto that same background image on the user's device for content placement purposes, thus imitating the view that is visible when the user is actually at that very location.
  • an associated device such as, for example, a mobile handheld device such as a mobile phone, or on a desktop, laptop or tablet computer
  • Any relevant information which is required to explicitly designate the location of the "ghosted" content in its real-world surroundings may include, for example, one or more of the GPS coordinates of the location, the device's orientation, handset yaw, pitch and/or roll, altitude, or other such parameters.
  • this information may be sourced from mapping software or a related service (such as, for example, Google Street View) and/or may be estimated given the first-person perspective of the viewer (as, for example, in the case of the device's expected yaw, pitch and roll).
  • This approach allows a user at the intended real-world location or address to recall the image using an identifying frame (which is representative of how and where the image was to be ghosted), along with the relevant metadata which was used to remotely ghost the image.
  • metadata may include, for example, the object's designated GPS coordinates and the handset's intended orientation, yaw, pitch, roll, elevation and altitude, or any other such parameters as may be useful or necessary to implement the systems and methodologies described herein.
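The metadata parameters enumerated above can be grouped into a simple serializable record; the field names and the JSON encoding below are illustrative choices, not drawn from the claims.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PlacementMetadata:
    """Parameters the text lists for recalling remotely ghosted content:
    designated GPS coordinates plus the handset's intended orientation."""
    lat: float
    lng: float
    heading: float    # intended compass orientation, degrees
    yaw: float
    pitch: float
    roll: float
    elevation: float
    altitude: float

    def to_json(self):
        # Stable key order so the payload is reproducible byte-for-byte.
        return json.dumps(asdict(self), sort_keys=True)

    @classmethod
    def from_json(cls, payload):
        return cls(**json.loads(payload))
```

A record like this is what the embedding approaches described below would attach to the digital content itself.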
  • the views created by the underlying mapping software may be used as a reference from which object recognition software may identify precisely how AR content was (or is) "ghosted" at that particular location. For example, these views may be utilized to discern the size and orientation of the ghosted content relative to the background image.
  • the AR content may be viewed through, for example, camera software or a view finder installed on the user's device, and may appear only when a user running the software/application/viewer meets some or all of the criteria (although preferably not any co-location criteria) that determine whether or not an image should be visible to the user (see, e.g., U.S. 8,963,957 (Skarulis), entitled "SYSTEMS AND METHODS FOR AN AUGMENTED REALITY PLATFORM").
  • mapping software (such as, for example, Google's Street View software) will have the GPS coordinates and other parameters associated with a particular view (such as, for example, the yaw, pitch, roll, altitude and compass orientation of the view) already documented or evident.
  • the systems and methodologies disclosed herein may be extended to other image sources through suitable modifications.
  • these systems and methodologies may utilize third-party images (such as, for example, stock photos) as references from which object recognition software may identify precisely how AR content was (or is) "ghosted" at a particular location.
  • the images provided may not have all of the necessary information (such as, for example, the yaw, pitch, roll and compass orientation of the view) captured and associated within their metadata as may be required to discern the size and orientation of the ghosted content relative to the background image.
  • this information may be added for POIs (points of interest), landmarks or general locations where the supporting information is already known or may be readily divined.
  • object recognition software may be utilized to establish, at least to some degree of approximation, the GPS point of origin for an image or other image parameters.
  • Acceptable metadata may then be written directly to the image itself (for example, by the addition to, or the modification of, existing metadata), or the image may be manipulated (preferably in a visually unperceivable manner) through the use of steganography to associate all of the necessary metadata with the image.
  • a method to facilitate content consumption in a (preferably marker-based) AR system.
  • the method incentivizes users to capture or consume digital content (which may be, for example, photos, images, audio or video files, digital collectibles, documents, records, constructs, characters, trading cards, or other types of media, and which may include instances of licensed content) through notification schemes and constrained availability.
  • the method may be utilized to drive users to a particular geographic location by offering a limited set or quantity (N) of digital media which may be consumed or captured at that location.
  • a promoter of a business, event or location may offer five (5) Pokemon-style characters or ten (10) virtual trading cards or licensed digital properties (which may include, for example, Disney characters or images from National Geographic) which may be captured or collected there. As each instance of digital content is collected, it is depleted from the available pool, so that no more instances of that particular content exist after the first quantity N have been captured or consumed.
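The depleting pool described above reduces to a counter that refuses captures once the first N instances are gone; a lock guards against two users capturing the last instance simultaneously. Class and parameter names are illustrative.

```python
import threading

class LimitedContentPool:
    """A pool of N collectible instances tied to a location; each capture
    depletes the pool until no instances of the content remain."""
    def __init__(self, content_id, quantity):
        self.content_id = content_id
        self._remaining = quantity
        self._lock = threading.Lock()

    def capture(self, user_id):
        """Return True if `user_id` captured an instance, False if the
        pool is already depleted."""
        with self._lock:
            if self._remaining == 0:
                return False
            self._remaining -= 1
            return True

    @property
    def remaining(self):
        return self._remaining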
  • This method may be used, for example, to drive customers to nearby business establishments, or to shape the time and flow of consumer foot traffic to the same.
  • a method is provided to facilitate crowdsourced content placement in a (preferably marker-based) AR system.
  • the method incentivizes users to place AR content or media (which may be any of the types of digital content or media disclosed herein) in a particular location or geographic region through notification schemes and the constrained availability of the content or media.
  • the method may be utilized to drive users to place AR content of a specified type, and at a specified location or within a specified geographic region or geofence, by offering digital content or media to only the first N users who place AR content or media at the target location.
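Rewarding only the first N users who place AR content at the target location can be sketched as follows; the reward object and the boolean geofence check are hypothetical placeholders for whatever incentive and geofencing logic a deployment would use.

```python
class PlacementIncentive:
    """Grant a reward to only the first N users who place AR content
    inside the target location or geofence."""
    def __init__(self, reward, max_rewards):
        self.reward = reward
        self.max_rewards = max_rewards
        self._rewarded = []  # user ids, in order of qualifying placement

    def register_placement(self, user_id, inside_geofence):
        """Return the reward for a qualifying placement, else None."""
        if not inside_geofence or user_id in self._rewarded:
            return None
        if len(self._rewarded) >= self.max_rewards:
            return None  # the first N rewards are already claimed
        self._rewarded.append(user_id)
        return self.reward
```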
  • This method may be used, for example, to incentivize users to create AR content in locations or geographic regions which might otherwise be underserved as a result of, for example, lower population densities.
  • such software may be server based, client based, or various combinations of the two.
  • the software may be implemented as a front end consisting of a distributed application or client, and a back end implemented as a server-based application.
  • a plug-in to the mapping software may be utilized to permit the visual overlay of a (preferably isometric) shape.
  • This shape may be further fashioned in the x, y and z directions to encompass any sized or shaped slice to coincide with an area (or areas) of interest.
  • the slice may be a semi-transparent visual overlay which shows how the slice intercepts the target.
  • Such a visual overlay may be utilized to place an image within a volumetric geofence (herein referred to as a "geospace") for the purposes of implementing a ghosting or AR algorithm in the systems and methodologies disclosed herein.
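For a containment test, a "geospace" as described above can be approximated by an axis-aligned latitude/longitude/altitude box; this is a simplifying assumption, since the text contemplates arbitrarily fashioned x, y and z slices.

```python
from dataclasses import dataclass

@dataclass
class Geospace:
    """A volumetric geofence reduced to an axis-aligned box:
    a lat/lng rectangle with an altitude range."""
    lat_min: float
    lat_max: float
    lng_min: float
    lng_max: float
    alt_min: float
    alt_max: float

    def contains(self, lat, lng, alt):
        """True when the point lies inside the geospace volume."""
        return (self.lat_min <= lat <= self.lat_max
                and self.lng_min <= lng <= self.lng_max
                and self.alt_min <= alt <= self.alt_max)
```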
  • mapping software may be provided in which each view of reality (for example, each view in the mapping software) is equipped with (or associated with) at least one marker, and in which at least one set of metadata is associated with the at least one marker.
  • This mapping software may then be utilized to implement a variety of AR or ghosting schemes, including those disclosed herein.
  • the mapping software may run in the background, and the resources of the host device may be utilized to continually track the location and orientation of the host device.
  • resources may include, for example, location awareness functionalities (such as, for example, the ability to determine the location of the device by ascertaining its GPS coordinates, through triangulation of cell towers or markers, or by reference to RF or magnetic maps), orientation determining means, accelerometers, compasses, and the like.
  • Viewer software on the host device may then ascertain when the location and orientation of the host device coincides with a frame in the mapping software, and may serve up AR content associated with that frame (or associated with the metadata that is associated with that frame).
  • mapping software may be utilized to compare a marker in such mapping software to a depiction of the marker in an instance of digital media (such as, for example, a stock photo). This may allow appropriate metadata for the digital media to be ascertained.
  • a browser plug-in or extension is provided that may be used with mapping software, applications or services (hereinafter referred to collectively as "mapping solutions") of the type disclosed herein such that digital content may be directly superimposed on a perspective view in a tab where the mapping solution is running.
  • from the resulting perspective view with the superimposed digital content, the plug-in may extract any and all available, relevant information from the mapping solution (such as, for example, the size, orientation and placement of the digital content relative to the street view and markers associated with same, as well as the GPS coordinates of the street view and the orientation, yaw, pitch, roll and altitude/elevation of same) for purposes of remotely "ghosting" (placing) the digital content.
  • a standalone software application is provided which extracts and then imports a relevant perspective view from a mapping solution, along with relevant, available parameters which describe the view's GPS coordinates, orientation, yaw, pitch, roll and altitude/elevation, into the tool in order to mimic what a mobile device would capture if the "ghosting" (or placement) of digital content was being carried out, in-person, at the actual location itself.
  • the placement, size and orientation of any superimposed digital content (whether selected by an individual user or determined through the application of a rules engine and/or logic) relative to the perspective view would also be captured and described. All the aforementioned parameters may be used within the standalone tool for purposes of communicating and sharing the same with the "ghosting" service with regard to how said digital content is to be stored and/or recalled.
  • all of the relevant information used to describe how digital content is superimposed within a specific perspective view, as well as any and all relevant parameters needed to describe the location and orientation of the user and/or the handset or mobile device relative to the perspective view and to recall the digital content, may be embedded within the metadata of the digital content itself.
  • An augmented reality viewer may recall (view) the digital content simply by interpreting its metadata, thus allowing the digital content (object) to exist independently - without a database or repository to link the information with the digital content.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method is provided for facilitating the placement of digital content in an augmented reality (AR) system. The method comprises: providing mapping software which, in response to user input comprising a specified location, displays at least one of a series of frames associated with that location, each frame containing a unique view of the location; receiving, from a computational device equipped with a display, input which specifies a location; using the mapping software to generate a set of frames associated with the input location; ghosting digital content onto at least one of the frames of the set, so as to generate at least one ghosted frame; and displaying the ghosted frame or frames on the display of the computational device.
PCT/US2017/062426 2016-11-17 2017-11-17 Remote placement of digital content to facilitate an augmented reality system WO2018094289A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662423720P 2016-11-17 2016-11-17
US62/423,720 2016-11-17

Publications (1)

Publication Number Publication Date
WO2018094289A1 (fr) 2018-05-24

Family

ID=62146837

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/062426 WO2018094289A1 (fr) 2016-11-17 2017-11-17 Remote placement of digital content to facilitate an augmented reality system

Country Status (1)

Country Link
WO (1) WO2018094289A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109711891A (zh) * 2018-12-28 2019-05-03 深圳瀚德理想视界科技有限公司 A *** and method for realizing autonomous advertisement placement based on IR and AR technologies

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140063061A1 (en) * 2011-08-26 2014-03-06 Reincloud Corporation Determining a position of an item in a virtual augmented space
US20160217624A1 (en) * 2013-01-30 2016-07-28 F3 & Associates Coordinate Geometry Augmented Reality Process
US20160292850A1 (en) * 2011-09-30 2016-10-06 Microsoft Technology Licensing, Llc Personal audio/visual system



Similar Documents

Publication Publication Date Title
US11532140B2 (en) Audio content of a digital object associated with a geographical location
TWI790380B (zh) Self-supervised training of a depth estimation system
Schmalstieg et al. Augmented Reality 2.0
US20180270609A1 (en) System And Method For Inserting Messages Displayed To A User When Viewing A Venue
US9767615B2 (en) Systems and methods for context based information delivery using augmented reality
US10325410B1 (en) Augmented reality for enhancing sporting events
JP5582548B2 (ja) Method for displaying virtual information in a view of a real environment
US9558559B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
US9613455B1 (en) Local georeferenced data
US20180374267A1 (en) Method and System to Enhance Spectator Experience
US10147399B1 (en) Adaptive fiducials for image match recognition and tracking
US20160063671A1 (en) A method and apparatus for updating a field of view in a user interface
EP2823255B1 (fr) Method of communication and of information in augmented reality
US20130317912A1 (en) Advertising in Augmented Reality Based on Social Networking
CN112330819B (zh) Interaction method and apparatus based on virtual items, and storage medium
US9600720B1 (en) Using available data to assist in object recognition
WO2011084720A2 (fr) Method and system for an augmented reality information search engine and associated product monetization
US9646418B1 (en) Biasing a rendering location of an augmented reality object
Lee et al. Tideland animal ar: Superimposing 3d animal models to user defined targets for augmented reality game
US20220058875A1 (en) System and method for creating geo-located augmented reality communities
Rigby et al. Augmented reality challenges for cultural heritage
US9619940B1 (en) Spatial filtering trace location
Pawade et al. Augmented reality based campus guide application using feature points object detection
TW201126451A (en) Augmented-reality system having initial orientation in space and time and method
WO2018094289A1 (fr) Remote placement of digital content to facilitate an augmented reality system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17872449

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.08.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17872449

Country of ref document: EP

Kind code of ref document: A1