WO2015136144A1 - Method and apparatus for superimposing images upon a map - Google Patents

Method and apparatus for superimposing images upon a map

Info

Publication number
WO2015136144A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
captured
map
location
representation
Prior art date
Application number
PCT/FI2014/050187
Other languages
French (fr)
Inventor
Roope Rainisto
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy
Priority to PCT/FI2014/050187
Publication of WO2015136144A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3667 Display of a road map
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G09B29/006 Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes
    • G09B29/007 Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes using computer methods
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids
    • G09B29/106 Map spot or coordinate position indicators; Map reading aids using electronic means

Definitions

  • a method, apparatus and computer program product are provided in accordance with an example embodiment of the present invention in order to provide context information for one or more images.
  • the method, apparatus and computer program product of an example embodiment may provide the context information in a visual manner, such as by causing a map to be presented with a representation of an image superimposed thereupon.
  • the method, apparatus and computer program product of an example embodiment of the present invention may leverage, however, the information associated with the image, such as information regarding the location at which the image was captured, the direction along which the image was captured, and the level of zoom associated with the image, in order to determine the location and size of the representation of the image to be superimposed upon the map.
  • the method, apparatus and computer program product of an example embodiment may reduce the risk that the representations of the images will significantly overlap with one another and/or obscure substantial portions of the underlying map, while providing a user with a visual representation of additional context information so as to facilitate the user's identification and selection of an image in an efficient and intuitive manner.
  • Figure 1 depicts an apparatus 10 that may be specifically configured in accordance with an example embodiment of the present invention.
  • the apparatus may be embodied by or associated with a variety of electronic devices including a mobile terminal, such as a personal digital assistant (PDA), mobile telephone, smartphone, companion device (for example, a smart watch), pager, mobile television, gaming device, laptop computer, camera, tablet computer, touch surface, video recorder, audio/video player, radio, electronic book, positioning device (for example, a global positioning system (GPS) device), or any combination of the aforementioned, and other types of voice and text communications systems.
  • the apparatus may be embodied by or associated with a fixed computing device, such as a computer workstation, a personal computer, a server or the like.
  • the apparatus of the embodiment of Figure 1 may include or otherwise be in communication with a processor 12, a memory device 14, a user interface 16 and optionally a communication interface and/or an image capturing device, e.g., a camera.
  • the processor 12 (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor) may be in communication with the memory device 14 via a bus for passing information among components of the apparatus.
  • the memory device may be non-transitory and may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device may be an electronic storage device (for example, a computer readable storage medium) comprising gates configured to store data (for example, bits) that may be retrievable by a machine (for example, a computing device like the processor).
  • the memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention.
  • the memory device could be configured to buffer input data for processing by the processor.
  • the memory device could be configured to store instructions for execution by the processor.
  • the memory device may store a plurality of images and associated information, e.g., metadata, as well as map data from which an image of a map is constructed.
  • the images and/or the map data may be stored remotely and accessed by the processor, such as via a communication interface.
  • the apparatus 10 may be embodied by various devices. However, in some embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (for example, chips) including materials, components and/or wires on a structural assembly (for example, a circuit board). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processor 12 may be embodied in a number of different ways.
  • the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor may include one or more processing cores configured to perform independently.
  • a multi-core processor may enable multiprocessing within a single physical package.
  • the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
  • the processor 12 may be configured to execute instructions stored in the memory device 14 or otherwise accessible to the processor.
  • the processor may be configured to execute hard coded functionality.
  • the processor may represent an entity (for example, physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly.
  • the processor when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein.
  • the processor when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor may be a processor of a specific device (for example, the client device 10 and/or a network entity) configured to employ an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein.
  • the processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
  • the apparatus 10 of the illustrated embodiment also includes or is in communication with a user interface 16. The user interface, such as a display, may be in communication with the processor 12 to provide output to the user and, in some embodiments, to receive an indication of a user input, such as in an instance in which the user interface includes a touch screen display.
  • the user interface may also include a keyboard, a mouse, a joystick, touch areas, soft keys, one or more microphones, a plurality of speakers, or other input/output mechanisms.
  • the processor may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as a display and, in some embodiments, a plurality of speakers, a ringer, one or more microphones and/or the like.
  • the processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more user interface elements through computer program instructions (for example, software and/or firmware) stored on a memory accessible to the processor (for example, memory device 14, and/or the like).
  • the apparatus 10 of the illustrated embodiment may also optionally include a communication interface that may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a communications device in communication with the apparatus.
  • the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
  • the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s).
  • the communication interface may alternatively or also support wired communication.
  • the communication interface may include a communication modem and/or other hardware and/or software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the apparatus 10 of some embodiments may include or be associated with a camera, a video recorder or other image capturing device that is in communication with the processor 12.
  • the image capturing device may be any means for capturing an image, such as still images, video images or the like, for storage, display or transmission including, for example, an imaging sensor.
  • the image capturing device may include a digital camera including an imaging sensor capable of capturing an image.
  • the image capturing device may include all hardware, such as a lens, an imaging sensor and/or other optical device(s), and software necessary for capturing an image.
  • the image capturing device may include only the hardware needed to view an image, while the memory stores instructions for execution by the processor in the form of software necessary to capture, store and process an image.
  • the image capturing device may further include a processing element such as a co-processor which assists the processor in processing image data and an encoder and/or decoder for compressing and/or decompressing image data.
  • the encoder and/or decoder may encode and/or decode according to a predefined format, such as a JPEG standard format.
  • the image that is captured may be stored for future viewings and/or manipulations in the memory of the apparatus and/or in a memory external to the apparatus.
  • the apparatus may include means, such as the processor 12 or the like, for identifying a location at which an image was captured, a direction along which the image was captured and a level of zoom associated with the image.
  • the location may be identified in various manners including, for example, latitude and longitude coordinates, an address, an offset from a predefined point of reference or the like.
  • the location at which an image was captured may be automatically determined by the image capturing device at the time at which the image is captured, such as based upon a GPS device embodied by or otherwise associated with the image capturing device.
  • the location may be provided by the user based upon user input at a time coincident with the capture of the image or at some time following the capture of the image.
  • the direction along which the image was captured identifies the direction in which the image capturing device was pointed at the time that the image was captured.
  • the direction may be defined in terms of an azimuth angle and, as described below, may optionally also include an elevation angle.
  • the direction along which the image was captured may be determined automatically by the image capturing device at the time at which the image was captured, such as based upon the reading of a compass, gyroscope or the like that is embodied by or otherwise associated with the image capturing device.
  • the direction may be provided by the user of the image capturing device at the time at which the image is captured or at some time thereafter.
  • the level of zoom associated with the image may also be defined in various manners including, for example, the zoom ratio of the image capturing device at the time at which the image was captured. As such, the level of zoom provides an indication as to whether the image capturing device was zoomed in at the time that the image was captured or zoomed out at the time that the image was captured. Thus, larger or greater levels of zoom are associated with an instance in which the image capturing device is zoomed in and lesser or smaller levels of zoom are associated with instances in which the image capture device is zoomed out.
  • the level of zoom may be determined automatically by the image capturing device or may be based upon user input defining the level of zoom at the time that the image was captured.
  • the location, direction and level of zoom of an image may be identified in various manners.
  • the apparatus, such as the processor 12, may be configured to identify the location, the direction and the level of zoom of the image from information provided by the image capturing device, such as based upon the information automatically captured by the image capturing device and/or provided by a user of the image capturing device.
  • the location, the direction, the level of zoom and optionally other parameters associated with the image may be stored in association with the image.
  • metadata that defines the location, the direction, the level of zoom and optionally other parameters may be associated with the image and stored therewith; a minimal sketch of reading such metadata appears at the end of this list.
  • the apparatus, such as the processor, may be configured to identify the location, the direction and the level of zoom associated with the image by retrieving the metadata associated with the respective image.
  • the apparatus 10 may also include means, such as the processor 12 or the like, for determining a location and size of a representation of the image to be superimposed upon a map.
  • the apparatus, such as the processor, may be configured to determine the location and size of the representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image.
  • three different images captured from the same location and along the same general direction at different levels of zoom are superimposed upon a map.
  • image 30 has a broader field of view and a correspondingly lower level of zoom
  • image 32 has an intermediate field of view and an intermediate level of zoom
  • image 34 has a smaller field of view and a larger level of zoom.
  • Figure 4 depicts the location 36 from which the image capturing device captured the three images. Additionally, diverging lines 38 fan outwardly from the location at which the images were captured to illustrate the broadest field of view, that is, the field of view associated with image 30, and to generally represent the direction along which the images are captured.
  • the direction at which image 32 was captured, that is, the direction that the image capturing device was pointed at the time that image 32 was captured, is slightly to the left and along a line that is rotated counter-clockwise relative to the direction at which image 30 was captured.
  • the direction at which image 34 was captured is slightly to the right and along a line that is rotated clockwise relative to the directions at which images 30 and 32 were captured.
  • the apparatus 10 may determine the location of the representation of the image to be in the same direction in which the image capturing device was pointed at the time that the respective image was captured.
  • image 32 is located slightly to the left of image 30
  • image 34 is located slightly to the right of image 32 and slightly to the right of the center of image 30.
  • the apparatus, such as the processor, of an example embodiment may determine the location of the representation of the image such that the representation of the image is located along and, in an example embodiment, centered about a line that extends in the direction along which the respective image was captured.
  • in determining the location of the representation of the image to be superimposed upon the map, the apparatus 10, such as the processor 12, of an example embodiment may also be configured to determine the location of the representation of the image to be superimposed upon the map to be at a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image.
  • the distance at which the representation of the image is located relative to the location at which the image was captured increases as the image is captured at greater levels of zoom and decreases as the image is captured at lesser levels of zoom.
  • an image having a greater level of zoom may be positioned proximate the location upon the underlying map that is the subject of the zoomed image, such as shown with respect to the representation of image 34 that is located proximate that portion of the observatory depicted in the underlying map that is the subject of the zoomed in image.
  • the distance at which the representation of the image is superimposed upon the map from the location at which the image was captured may be proportional to the level of zoom at which the image was captured.
  • the location of the representation of the image upon the map relative to the location at which the image was captured may have other types of direct relationships to the level of zoom associated with the capture of the image in other embodiments.
  • image 32 is captured with a greater level of zoom than image 30 and, as such, is located further from the location at which the images were captured than the representation of image 30.
  • image 34 was captured at a greater level of zoom than either image 32 or image 30 such that the representation of image 34 is located further from the location at which the images were captured than the representations of either image 32 or image 30.
  • the apparatus 10, such as the processor 12, of an example embodiment may be configured to determine the size of the representation of the image so as to have an inverse relationship to the level of zoom associated with the image.
  • the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom and may conversely increase as the image is captured at lesser levels of zoom.
  • image 32 is captured at a greater level of zoom than image 30 and, as such, may have a smaller size than image 30.
  • image 34 may be captured at an even greater level of zoom than image 32 and image 30 and, as such, may be represented in a manner that is smaller than either images 32 or 30.
  • the apparatus, such as the processor, may determine the size of the representation of the image to be superimposed upon the map in a manner that is inversely proportional to the level of zoom associated with the image.
  • the inverse relationship between the size of the representation of the image to be superimposed upon the map and the level of zoom associated with the image may, however, take other forms in other example embodiments.
  • the apparatus 10 may also include means, such as the processor 12, the user interface 16 or the like, for causing a map to be presented with the representation of the image superimposed thereupon.
  • the map data from which the representation of the map is constructed for display may be stored in memory 14 or may be provided by an external database accessible by the processor via, for example, a communication interface.
  • the apparatus, such as the processor, the user interface or the like, may be configured to cause any of a wide variety of different representations of maps to be presented upon a display.
  • the map may be a two-dimensional depiction of streets and other geographical features with the names of various cities or other regions denoted on the map.
  • the map may be based upon images, such as shown in Figure 3 and provided by various commercial systems including, for example, the HERE 3D Map system.
  • the apparatus 10, such as the processor 12, of an example embodiment may cause the representation of the image to project outwardly in a third dimension from the map with the images positioned at a distance and with a size relative to the underlying map and the location at which the images were captured that has been determined as described above.
  • the representations of the images are superimposed upon the map such that the images face the location from which the images were captured.
  • the representations of the images are also presented in a manner that the images appear to stand upright and to extend outwardly from the underlying map, such as in the manner of postcards standing on an edge.
  • the images may be highlighted relative to the underlying map and the representations of the images may not obscure as much of the underlying map as in an instance in which the images were laid flat or disposed in the same plane as the map.
  • the method, apparatus and computer program product of an example embodiment may provide additional context for the images simply by the manner in which the images are presented such that a person viewing the images superimposed upon the map may efficiently and intuitively interpret the images relative to the underlying map and obtain information regarding the relative direction of different features and the level of detail regarding various features.
  • the manner in which the location and size of the representations of the images are determined and the manner in which the representations of the images are superimposed upon the map may permit more images that were captured at the same location to be visible at one time without significant overlap since the representations of at least some of the images are spaced from the location at which the images were captured and are not stacked one upon another, thereby providing a viewer of the images and the underlying map with ready access to additional images at different levels of detail.
  • while the representations of the images 30, 32 and 34 depicted in the example embodiment of Figures 3 and 4 are planar, the representations of the images may have other shapes, such as curved shapes, in other embodiments.
  • the representations of the images may be concave with the concavity of the images facing the location from which the images were captured.
  • while the representations of the images 30, 32 and 34 appear to extend orthogonally from the underlying map, such as by appearing to extend only in the z-direction that projects orthogonally outward from an xy plane defined by the map, the representations of the images may be tilted so as to define an acute angle with respect to the underlying map, such as by being tilted forwardly or rearwardly, in other embodiments.
  • in an embodiment, the information regarding the direction along which the image was captured includes not only an azimuth angle, but also an elevation angle; a non-zero elevation angle defines an instance in which the direction in which the image capturing device was pointed included an upward or downward component at the time that the image was captured.
  • the apparatus 10, such as the processor 12, the user interface 16 or the like, may be configured to cause the representation of the image that projects outwardly in the third dimension from the underlying map to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
  • the representation of the image may be leaned backward by an amount that is based upon, such as an amount that is proportional to or equal to, the elevation angle, so as to be representative of an image captured by an image capturing device that was at least partially upwardly facing.
  • the representation of the image may be leaned forwardly by an amount that is based upon, such as an amount that is proportional to or equal to, the elevation angle, so as to be representative of an image captured by an image capturing device that was at least partially downwardly facing.
  • the tilting of the representations of the images in response to a non-zero elevation angle may provide additional context information and may further facilitate the user's efficient and intuitive review and interpretation of the representations of the images superimposed upon the map.
  • while representations of still images may be presented upon a map as described above, representations of other types of images, such as video images, may be presented in a similar manner.
  • a thumbnail or other representation of a video may be presented upon the map based upon the location, direction and level of zoom associated with the video such that a user may access the video by selecting, e.g., by clicking upon, the representation of the video.
  • in an instance in which the location, the direction and/or the level of zoom vary during capture of a video, the representation of the video may be determined in various manners, including by being based upon an average value of the parameter(s) that vary, the initial value of the parameter(s) that vary or the final value of the parameter(s) that vary; a minimal sketch of this choice appears at the end of this list.
  • Figure 2 is a flowchart of an apparatus 10, method and computer program product according to example embodiments of the invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other communication devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device 14 of an apparatus employing an embodiment of the present invention and executed by a processor 12 of the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (for example, hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks.
  • These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks.
  • blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
  • certain ones of the operations above may be modified or further amplified. Furthermore, in some embodiments, additional optional operations may be included. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
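
To make the identification step described above concrete, the following minimal sketch reads the capture location, direction and level of zoom for a single image. The metadata key names, units and the CaptureParams type are assumptions standing in for whatever EXIF-style tags or sidecar data a real image capturing device records; they are not part of the patent.

```python
from dataclasses import dataclass
from typing import Any, Mapping

@dataclass
class CaptureParams:
    lat: float            # degrees; where the image was captured
    lon: float            # degrees
    azimuth_deg: float    # direction the device was pointed, clockwise from north
    elevation_deg: float  # 0 = level; positive = upward, negative = downward
    zoom_ratio: float     # 1.0 = zoomed out; larger values = zoomed in

def identify_capture_params(metadata: Mapping[str, Any]) -> CaptureParams:
    """Identify the location, direction and level of zoom for one image.

    `metadata` stands in for the information stored with the image by the
    image capturing device; the key names are hypothetical. The elevation
    angle and zoom ratio are treated as optional, defaulting to a level
    camera with no zoom.
    """
    return CaptureParams(
        lat=float(metadata["gps_latitude"]),
        lon=float(metadata["gps_longitude"]),
        azimuth_deg=float(metadata["compass_azimuth"]),
        elevation_deg=float(metadata.get("compass_elevation", 0.0)),
        zoom_ratio=float(metadata.get("zoom_ratio", 1.0)),
    )

params = identify_capture_params({
    "gps_latitude": 60.1699, "gps_longitude": 24.9384,
    "compass_azimuth": 45.0, "zoom_ratio": 2.0,
})
```

The resulting parameters feed the placement and orientation sketches shown later in the Brief Summary.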
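The detailed description above also names three candidate reductions for a video parameter that varies during capture: its average, its initial value, or its final value. The sketch below illustrates that choice; the function name and the per-frame sampling are assumptions introduced for illustration.

```python
from statistics import fmean
from typing import Sequence

def reduce_parameter(samples: Sequence[float], strategy: str = "average") -> float:
    """Collapse a parameter that varied while a video was captured (for
    example, the zoom ratio sampled once per frame) into the single value
    used to place and size the video's representation upon the map."""
    if not samples:
        raise ValueError("no samples recorded for this parameter")
    if strategy == "average":
        # A plain mean suffices for the zoom ratio; an angular parameter
        # such as the azimuth would need a circular mean to handle
        # wraparound at 360 degrees.
        return fmean(samples)
    if strategy == "initial":
        return samples[0]
    if strategy == "final":
        return samples[-1]
    raise ValueError(f"unknown strategy: {strategy!r}")

zoom_per_frame = [1.0, 1.5, 2.0, 3.0]
print(reduce_parameter(zoom_per_frame, "average"))  # 1.875
```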

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Ecology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Automation & Control Theory (AREA)
  • Image Processing (AREA)

Abstract

A method, apparatus, and computer program product are provided in order to provide context information in a visual manner to a user regarding one or more images. In terms of a method and for a respective image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image may be identified. The method may also include determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. The method may further include causing a map to be presented with the representation of the image superimposed thereupon.

Description

METHOD AND APPARATUS FOR SUPERIMPOSING IMAGES UPON A MAP
TECHNOLOGICAL FIELD
An example embodiment of the present invention relates generally to a method, apparatus and computer program product for superimposing an image upon a map and, more particularly, to a method, apparatus and computer program product for
superimposing an image upon a map in a manner that is based, at least in part, upon a level of zoom associated with the image.
BACKGROUND
Large numbers of images are captured every day of a wide variety of people, places and things. These images may include both still images and video images. With the ever increasing number of images available for review, context information associated with the image is of elevated importance so as to facilitate the identification of one or more images that the user desires to view or focus upon from among many more images that have been captured. Context information may be provided in various manners including information regarding the time and date on which an image was captured, the location at which the image was captured, the subject of the image, such as a person or place, etc.
In regards to context information in the form of the location at which an image was captured, the context information may be visually presented in the form of a map. In this regard, reduced size representations, e.g., thumbnails, of the images that were captured at a location that is included in the map may be displayed on the map. In this regard, the reduced size representations of the images may be positioned upon the map so as to coincide with the location at which the images were captured. Thus, a user may review the map and identify the images to be further considered based upon the location at which the images were captured as represented by their relative locations upon the map.
With the proliferation of images, however, it is common that many images are captured at the same location or at locations that are very near one another. As such, the reduced size representations of the images may largely overlap one another such that a user may have difficulty identifying and reviewing the individual images captured at a respective location. Moreover, the reduced size representations of the images may largely obscure portions of the map by almost fully covering certain portions of the map at which many images were captured, thereby also making it difficult for the user to associate the images with a particular location upon the map.
BRIEF SUMMARY
A method, apparatus, and computer program product are therefore provided in accordance with an example embodiment in order to provide context information to a user regarding one or more images. In this regard, the method, apparatus, and computer program product of an example embodiment may provide the context information in a visual manner so as to facilitate the identification by a user of one or more images of interest. Moreover, the method, apparatus and computer program product of an example embodiment may facilitate the superimposition of representations of the images upon the map in a manner that permits context information to be provided for a plurality of images with less risk of the representations of the images significantly overlapping one another or substantially obscuring a portion of a map.
In an example embodiment, a method is provided that includes identifying, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image. The method of this example embodiment also includes determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. In this example embodiment, the method also includes causing a map to be presented with the representation of the image superimposed thereupon.
In regards to determining the location and size of the representation of the image to be superimposed upon the map, the method of an example embodiment may determine the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image. In this regard, the distance from the location at which the image was captured may increase as the image is captured at greater levels of zoom. Also, in regards to determining the location and size of the representation of the image to be superimposed upon the map, the method of an example embodiment may determine the size of the representation of the image to be superimposed upon the map so as to have an inverse relationship to the level of zoom associated with the image. In this regard, the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom.
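
As a concrete illustration of these direct and inverse relationships, the following minimal sketch (not the claimed implementation) places a representation on the map: the offset from the capture location grows with the level of zoom while the rendered size shrinks with it. The proportionality constants, the flat-earth offset approximation and all names are assumptions introduced for illustration.

```python
import math

# Assumed proportionality constants; the patent text does not specify values.
BASE_DISTANCE_M = 50.0    # map-plane offset per unit of zoom ratio, in meters
BASE_SIZE_PX = 200.0      # rendered size at zoom ratio 1.0, in pixels
EARTH_RADIUS_M = 6371000.0

def representation_placement(lat, lon, azimuth_deg, zoom_ratio):
    """Return (latitude, longitude, size) for an image's representation.

    The distance from the capture location is directly proportional to the
    zoom ratio, and the rendered size is inversely proportional to it,
    mirroring the direct and inverse relationships described above.
    """
    distance_m = BASE_DISTANCE_M * zoom_ratio   # direct relationship
    size_px = BASE_SIZE_PX / zoom_ratio         # inverse relationship

    # Small-offset (flat-earth) approximation: step distance_m away from the
    # capture location along the capture direction, with the azimuth measured
    # clockwise from north.
    bearing = math.radians(azimuth_deg)
    dlat = (distance_m * math.cos(bearing)) / EARTH_RADIUS_M
    dlon = (distance_m * math.sin(bearing)) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon), size_px

# Three images captured from one spot at increasing zoom, as in Figures 3
# and 4: each representation lands farther away and is drawn smaller.
for zoom in (1.0, 2.0, 4.0):
    print(representation_placement(60.1699, 24.9384, 45.0, zoom))
```

Under this scheme, representations of images captured at the same location spread out along the capture direction rather than stacking, which is what reduces the overlap described in the background above.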
The method of an example embodiment may cause the representation of the image to be superimposed upon the map such that the representation of the image is caused to project out in the third dimension from the map. In an embodiment in which the direction along which the image was captured includes both azimuth and elevation angles, the method may cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
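
A companion sketch of the third-dimension presentation follows: the representation stands upright on the map plane like a postcard on edge, faces back toward the capture location, and is tilted by the elevation angle when that angle is non-zero. The Euler-angle convention and the function name are assumptions introduced for illustration.

```python
def representation_orientation(azimuth_deg, elevation_deg):
    """Orient an upright image representation on the map.

    Assumes a renderer in which the map lies in the xy plane and z projects
    orthogonally out of it. With a zero elevation angle the representation
    stands perpendicular to the map and faces the capture location; a
    positive elevation angle (camera pointed upward) leans it backward and
    a negative one (camera pointed downward) leans it forward.
    """
    yaw_deg = (azimuth_deg + 180.0) % 360.0  # face back toward the capture point
    tilt_deg = elevation_deg                 # lean by the elevation angle itself
    return {"yaw_deg": yaw_deg, "tilt_deg": tilt_deg}

print(representation_orientation(azimuth_deg=45.0, elevation_deg=15.0))
# {'yaw_deg': 225.0, 'tilt_deg': 15.0}
```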
In another example embodiment, an apparatus is provided that includes at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least identify, for an image, a location at which the image was captured, a direction along which the image was captured, and a level of zoom associated with the image. The at least one memory and the computer program code are also configured to, with the processor, cause the apparatus of this example embodiment to determine a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of this example embodiment to cause a map to be presented with the representation of the image superimposed thereupon.
The at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of an example embodiment to determine the location and size of the representation of the image to be superimposed upon the map by determining the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image. In this embodiment, the distance from the location at which the image was captured may increase as the image is captured at greater levels of zoom. The at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of an example embodiment to determine the location and size of the representation of the image to be superimposed upon the map by determining the size of the representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image. In this example embodiment, the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom.
The at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of an example embodiment to cause the representation of the image to be superimposed upon the map such that the representation of the image is caused to project out in the third dimension from the map. In an
embodiment in which the direction along which the image was captured includes both azimuth and elevation angles, the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a nonzero value.
In a further example embodiment, a computer program product is provided that includes at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein with the computer-executable program code portions including program code instructions configured, upon execution, to identify, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image. The computer-executable program code portions of this example embodiment also include program code instructions configured to determine a location and size of the representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. The computer-executable program code portions of this example embodiment also include program code instructions configured, upon execution, to cause a map to be presented with the representation of the image superimposed thereupon.
The program code instructions configured to determine the location and size of the representation of the image to be superimposed upon the map may include program code instructions configured to determine the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image. In this example embodiment, the distance from the location at which the image was captured may increase as the image is captured at greater levels of zoom. The program code instructions configured to determine the location and size of the representation of the image to be superimposed upon the map in accordance with an example embodiment may include program code instructions configured to determine the size of the representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image. In this example embodiment, the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom.
The program code instructions of an example embodiment that are configured to cause the representation of the image to be superimposed upon the map may be further configured to cause the representation of the image to be superimposed upon the map so as to project out in the third dimension from the map. In an embodiment in which the direction along which the image was captured includes both azimuth and elevation angles, the program code instructions may be further configured to cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a nonzero value.
In yet another example embodiment, an apparatus is provided that includes means for identifying, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image. The apparatus of this example embodiment also includes means for determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. The apparatus of this example embodiment also includes means for causing the map to be presented with the
representation of the image superimposed thereupon.
BRIEF DESCRIPTION OF THE DRAWINGS
Having thus described certain embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Figure 1 is a block diagram of an apparatus that may be specifically configured in accordance with an example embodiment of the present invention;
Figure 2 is a flow chart illustrating operations performed, such as by the apparatus of Figure 1, in accordance with an example embodiment of the present invention;
Figure 3 is a screen display of a map having a representation of three images superimposed thereupon in accordance with an example embodiment of the present invention; and

Figure 4 is a representation of the screen display of Figure 3 which depicts the location at which the three images were captured and the direction along which the three images were captured relative to the location and size of the representations of the three images superimposed upon the map in accordance with an example embodiment of the present invention.
DETAILED DESCRIPTION
Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms "data," "content," "information," and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.
Additionally, as used herein, the term 'circuitry' refers to (a) hardware-only circuit implementations (for example, implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device. As defined herein, a "computer-readable storage medium," which refers to a non-transitory physical storage medium (for example, volatile or non-volatile memory device), can be differentiated from a "computer-readable transmission medium," which refers to an electromagnetic signal.
A method, apparatus and computer program product are provided in accordance with an example embodiment of the present invention in order to provide context information for one or more images. In this regard, the method, apparatus and computer program product of an example embodiment may provide the context information in a visual manner, such as by causing a map to be presented with a representation of an image superimposed thereupon. The method, apparatus and computer program product of an example embodiment of the present invention may leverage, however, the information associated with the image, such as information regarding the location at which the image was captured, the direction along which the image was captured, and the level of zoom associated with the image, in order to determine the location and size of the representation of the image to be superimposed upon the map. As such, the method, apparatus and computer program product of an example embodiment may reduce the risk that the representations of the images will significantly overlap with one another and/or obscure substantial portions of the underlying map, while providing a user with a visual
representation of additional context information so as to facilitate the user's identification and selection of an image in an efficient and intuitive manner.
Figure 1 depicts an apparatus 10 that may be specifically configured in accordance with an example embodiment of the present invention. The apparatus may be embodied by or associated with a variety of electronic devices including a mobile terminal, such as a personal digital assistant (PDA), mobile telephone, smartphone, companion device, for example, a smart watch, pager, mobile television, gaming device, laptop computer, camera, tablet computer, touch surface, video recorder, audio/video player, radio, electronic book, positioning device (for example, global positioning system (GPS) device), or any combination of the aforementioned, and other types of voice and text
communications systems. Alternatively, the apparatus may be embodied by or associated with a fixed computing device, such as a computer workstation, a personal computer, a server or the like.
Regardless of the manner in which the apparatus 10 is embodied, the apparatus of the embodiment of Figure 1 may include or otherwise be in communication with a processor 12, a memory device 14, a user interface 16 and optionally a communication interface and/or an image capturing device, e.g., a camera. In some embodiments, the processor (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor) may be in communication with the memory device via a bus for passing information among components of the apparatus. The memory device may be non-transitory and may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device may be an electronic storage device (for example, a computer readable storage medium) comprising gates configured to store data (for example, bits) that may be retrievable by a machine (for example, a computing device like the processor). The memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention. For example, the memory device could be configured to buffer input data for processing by the processor. Additionally or alternatively, the memory device could be configured to store instructions for execution by the processor. Still further, the memory device may store a plurality of images and associated information, e.g., metadata, as well as map data from which an image of a map is constructed. Alternatively, the images and/or the map data may be stored remotely and accessed by the processor, such as via a communication interface.
As noted above, the apparatus 10 may be embodied by various devices. However, in some embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (for example, chips) including materials, components and/or wires on a structural assembly (for example, a circuit board). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
The processor 12 may be embodied in a number of different ways. For example, the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. As such, in some embodiments, the processor may include one or more processing cores configured to perform independently. A multi-core processor may enable multiprocessing within a single physical package. Additionally or alternatively, the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
In an example embodiment, the processor 12 may be configured to execute instructions stored in the memory device 14 or otherwise accessible to the processor. Alternatively or additionally, the processor may be configured to execute hard-coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor may represent an entity (for example, physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor may be a processor of a specific device (for example, the apparatus 10 and/or a network entity) configured to employ an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein. The processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
The apparatus 10 of the illustrated embodiment also includes or is in
communication with a user interface 16. The user interface, such as a display, may be in communication with the processor 12 to provide output to the user and, in some embodiments, to receive an indication of a user input, such as in an instance in which the user interface includes a touch screen display. In some embodiments, the user interface may also include a keyboard, a mouse, a joystick, touch areas, soft keys, one or more microphones, a plurality of speakers, or other input/output mechanisms. In an example embodiment, the processor may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as a display and, in some embodiments, a plurality of speakers, a ringer, one or more microphones and/or the like. The processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more user interface elements through computer program instructions (for example, software and/or firmware) stored on a memory accessible to the processor (for example, memory device 14, and/or the like).
The apparatus 10 of the illustrated embodiment may also optionally include a communication interface that may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a communications device in communication with the apparatus. In this regard, the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. Additionally or alternatively, the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s). In some environments, the communication interface may alternatively or also support wired communication. As such, for example, the communication interface may include a communication modem and/or other hardware and/or software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
Although the images may be stored in memory 14 or received from another remote storage device via the communication interface, the apparatus 10 of some embodiments may include or be associated with a camera, a video recorder or other image capturing device that is in communication with the processor 12. The image capturing device may be any means for capturing an image, such as still images, video images or the like, for storage, display or transmission including, for example, an imaging sensor. For example, the image capturing device may include a digital camera including an imaging sensor capable of capturing an image. As such, the image capturing device may include all hardware, such as a lens, an imaging sensor and/or other optical device(s), and software necessary for capturing an image. Alternatively, the image capturing device may include only the hardware needed to view an image, while the memory stores instructions for execution by the processor in the form of software necessary to capture, store and process an image. In an example embodiment, the image capturing device may further include a processing element such as a co-processor which assists the processor in processing image data and an encoder and/or decoder for compressing and/or decompressing image data.
The encoder and/or decoder may encode and/or decode according to a predefined format, such as a JPEG standard format. The image that is captured may be stored for future viewings and/or manipulations in the memory of the apparatus and/or in a memory external to the apparatus.
Referring now to Figure 2, the operations performed, such as by the apparatus 10 of Figure 1, in order to superimpose an image upon a map in accordance with an example embodiment of the present invention are illustrated. As shown in block 20 of Figure 2, the apparatus may include means, such as the processor 12 or the like, for identifying a location at which an image was captured, a direction along which the image was captured and a level of zoom associated with the image. The location may be identified in various manners including, for example, latitude and longitude coordinates, an address, an offset from a predefined point of reference or the like.
The location at which an image was captured may be automatically determined by the image capturing device at the time at which the image is captured, such as based upon a GPS device embodied by or otherwise associated with the image capturing device.
Alternatively, the location may be provided by the user based upon user input at a time coincident with the capture of the image or at some time following the capture of the image. The direction along which the image was captured identifies the direction in which the image capturing device was pointed at the time that the image was captured. As such, the direction may be defined in terms of an azimuth angle and, as described below, may optionally also include an elevation angle. The direction along which the image was captured may be determined automatically by the image capturing device at the time at which the image was captured, such as based upon the reading of a compass, gyroscope or the like that is embodied by or otherwise associated with the image capturing device.
Alternatively, the direction may be provided by the user of the image capturing device at the time at which the image is captured or at some time thereafter.
The level of zoom associated with the image may also be defined in various manners including, for example, the zoom ratio of the image capturing device at the time at which the image was captured. As such, the level of zoom provides an indication as to whether the image capturing device was zoomed in at the time that the image was captured or zoomed out at the time that the image was captured. Thus, larger or greater levels of zoom are associated with an instance in which the image capturing device is zoomed in and lesser or smaller levels of zoom are associated with instances in which the image capturing device is zoomed out. The level of zoom may be determined automatically by the image capturing device or may be based upon user input defining the level of zoom at the time that the image was captured.
The location, direction and level of zoom of an image may be identified in various manners. In an instance in which the apparatus 10 is configured to cause a representation of the image to be superimposed upon a map at or near the time at which the image is captured, the apparatus, such as the processor 12, may be configured to identify the location, the direction and the level of zoom of the image from information provided by the image capturing device, such as based upon the information automatically captured by the image capturing device and/or provided by a user of the image capturing device.
Alternatively, in an instance in which an image that was previously captured has been stored, such as in memory 14, the location, the direction, the level of zoom and optionally other parameters associated with the image may be stored in association with the image. For example, metadata may be associated with the image and stored therewith which defines the location, the direction, the level of zoom and optionally other parameters associated with the image. In this embodiment, the apparatus, such as the processor, may be configured to identify the location, the direction and the level of zoom associated with the image by retrieving the metadata associated with the respective image.
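By way of illustration only, this identification operation may be sketched as follows in Python; the metadata keys and the CaptureInfo container below are illustrative assumptions rather than part of any embodiment, and a practical implementation might instead read such values from EXIF fields or comparable metadata:

    from dataclasses import dataclass

    @dataclass
    class CaptureInfo:
        latitude: float    # degrees; the location at which the image was captured
        longitude: float   # degrees
        azimuth: float     # degrees clockwise from north; the direction of capture
        elevation: float   # degrees above (+) or below (-) horizontal; optional
        zoom: float        # zoom ratio of the image capturing device at capture time

    def identify_capture_info(metadata: dict) -> CaptureInfo:
        """Identify the location, direction and level of zoom of an image
        from metadata stored in association with the image."""
        return CaptureInfo(
            latitude=float(metadata["latitude"]),
            longitude=float(metadata["longitude"]),
            azimuth=float(metadata.get("azimuth", 0.0)),
            elevation=float(metadata.get("elevation", 0.0)),
            zoom=float(metadata.get("zoom", 1.0)),
        )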
As shown in block 22 of Figure 2, the apparatus 10 may also include means, such as the processor 12 or the like, for determining a location and size of a representation of the image to be superimposed upon a map. In this regard, the apparatus, such as the processor, may be configured to determine the location and size of the representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image. As shown in Figure 3, three different images captured from the same location and along the same general direction at different levels of zoom are superimposed upon a map. In this regard, image 30 has a broader field of view and a correspondingly lower level of zoom, while image 32 has an intermediate field of view and an intermediate level of zoom and image 34 has a smaller field of view and a larger level of zoom.
By way of further explanation of the same example embodiment, Figure 4, in which the underlying map has been dimmed to permit the images 30, 32 and 34 to be more readily considered, depicts the location 36 from which the image capturing device captured the three images. Additionally, diverging lines 38 fan outwardly from the location at which the images were captured to illustrate the broadest field of view, that is, the field of view associated with image 30, and to generally represent the direction along which the images are captured. As will be noted from a comparison of the images to the map, the direction along which image 32 was captured, that is, the direction in which the image capturing device was pointed at the time that image 32 was captured, is slightly to the left and along a line that is rotated counter-clockwise relative to the direction along which image 30 was captured. Similarly, the direction along which image 34 was captured is slightly to the right and along a line that is rotated clockwise relative to the directions along which images 30 and 32 were captured.
In regards to determining the location of the representation of the image to be superimposed upon the map, the apparatus 10, such as the processor 12, of an example embodiment may determine the location of the representation of the image to be in the same direction in which the image capturing device was pointed at the time that the respective image was captured. Thus, image 32 is located slightly to the left of image 30 and image 34 is located slightly to the right of image 32 and slightly to the right of the center of image 30. Thus, the apparatus, such as the processor, of an example embodiment may determine the location of the representation of the image such that the representation of the image is located along and, in an example embodiment, centered about a line that extends in the direction along which the respective image was captured.
In determining the location of the representation of the image to be superimposed upon the map, the apparatus 10, such as the processor 12, of an example embodiment may also be configured to determine the location of the representation of the image to be superimposed upon the map to be at a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image. Thus, the distance at which the representation of the image is located relative to the location at which the image was captured increases as the image is captured at greater levels of zoom and decreases as the image is captured at lesser levels of zoom. In an example embodiment, an image having a greater level of zoom may be positioned proximate the location upon the underlying map that is the subject of the zoomed image, such as shown with respect to the representation of image 34 that is located proximate that portion of the observatory depicted in the underlying map that is the subject of the zoomed-in image. In one example embodiment, the distance at which the representation of the image is superimposed upon the map from the location at which the image was captured may be proportional to the level of zoom at which the image was captured. However, the location of the representation of the image upon the map relative to the location at which the image was captured may have other types of direct relationships to the level of zoom associated with the capture of the image in other embodiments.
With reference to the example embodiment depicted in Figures 3 and 4, image 32 is captured with a greater level of zoom than image 30 and, as such, is located further from the location at which the images were captured than the representation of image 30.
Similarly, image 34 was captured at a greater level of zoom than either image 32 or image 30 such that the representation of image 34 is located further from the location at which the images were captured than the representations of either image 32 or image 30.
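To make this direct relationship concrete, the following simplified sketch places the representation along the capture direction at a distance proportional to the level of zoom; the planar map frame and the scale constant metres_per_zoom_unit are illustrative assumptions, and proportionality is only one of the direct relationships contemplated above:

    import math

    def placement_on_map(x, y, azimuth_deg, zoom, metres_per_zoom_unit=25.0):
        """Place the representation along the direction of capture, at a
        distance from (x, y), the capture location in map metres, that
        increases in direct (here, linear) proportion to the level of zoom."""
        distance = metres_per_zoom_unit * zoom  # direct relationship to zoom
        theta = math.radians(azimuth_deg)
        # Assumed map frame: north is +y and east is +x.
        return (x + distance * math.sin(theta), y + distance * math.cos(theta))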
In regards to determining the size of the representation of the image to be superimposed upon the map, the apparatus 10, such as the processor 12, of an example embodiment may be configured to determine the size of the representation of the image so as to have an inverse relationship to the level of zoom associated with the image. In this regard, the size of the representation of the image to be superimposed upon the map may decrease as the image is captured at greater levels of zoom and may conversely increase as the image is captured at lesser levels of zoom. With respect to the example embodiment of Figures 3 and 4, image 32 is captured at a greater level of zoom than image 30 and, as such, may have a smaller size than image 30. Furthermore, image 34 may be captured at an even greater level of zoom than image 32 and image 30 and, as such, may be represented in a manner that is smaller than either image 32 or image 30. In one example embodiment, the apparatus, such as the processor, may determine the size of the representation of the image to be superimposed upon the map in a manner that has an inverse proportional relationship to the level of zoom associated with the image.
Alternatively, the inverse relationship between the size of the representation of the image to be superimposed upon the map and the level of zoom associated with the image may be defined to have other types of inverse relationships in other example embodiments.
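A matching sketch of the sizing rule, again assuming simple inverse proportionality (one of the several possible inverse relationships noted above, with illustrative base dimensions):

    def representation_size(base_width_px, base_height_px, zoom):
        """Scale the representation inversely with the level of zoom so that
        an image captured while zoomed in is superimposed at a smaller size."""
        if zoom <= 0:
            raise ValueError("zoom ratio must be positive")
        scale = 1.0 / zoom  # inverse proportional relationship
        return (base_width_px * scale, base_height_px * scale)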
As shown in block 24 of Figure 2, the apparatus 10 may also include means, such as the processor 12, the user interface 16 or the like, for causing a map to be presented with the representation of the image superimposed thereupon. The map data from which the representation of the map is constructed for display may be stored in memory 14 or may be provided by an external database accessible by the processor via, for example, a communication interface. The apparatus, such as the processor, the user interface or the like, may be configured to cause any of a wide variety of different representations of maps to be presented upon a display. For example, the map may be a two-dimensional depiction of streets and other geographical features with the names of various cities or other regions denoted on the map. Alternatively, the map may be based upon images, such as shown in Figure 3 and provided by various commercial systems including, for example, the HERE 3D Map system.
In regards to causing the map to be presented, the apparatus 10, such as the processor 12, of an example embodiment may cause the representation of the image to project outwardly in a third dimension from the map with the images positioned at a distance and with a size relative to the underlying map and the location at which the images were captured that has been determined as described above. In the illustrated embodiment, the representations of the images are superimposed upon the map such that the images face the location from which the images were captured. As shown in the example embodiment of Figures 3 and 4, the representations of the images are also presented in a manner that the images appear to stand upright and to extend outwardly from the underlying map, such as in the manner of postcards standing on an edge. By causing the representations of the images to be presented in this manner, the images may be highlighted relative to the underlying map and the representations of the images may not obscure as much of the underlying map as in an instance in which the images were laid flat or disposed in the same plane as the map. Moreover, by determining the location and size of the representations of the images to be superimposed upon the map based upon the location, the direction and the level of zoom associated with the images, the method, apparatus and computer program product of an example embodiment may provide additional context for the images simply by the manner in which the images are presented such that a person viewing the images superimposed upon the map may efficiently and intuitively interpret the images relative to the underlying map and obtain information regarding the relative direction of different features and the level of detail regarding various features. Further, the manner in which the location and size of the representations of the images are determined and the manner in which the representations of the images are superimposed upon the map may permit more images that were captured at the same location to be visible at one time without significant overlap since the representations of at least some of the images are spaced from the location at which the images were captured and are not stacked one upon another, thereby providing a viewer of the images and the underlying map with ready access to additional images at different levels of detail.
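The "postcard standing on edge" geometry may be sketched as follows; the map frame (z projecting orthogonally out of the map plane) and the corner convention are illustrative assumptions rather than the prescribed rendering of any embodiment:

    import math

    def postcard_corners(center_xy, width, height, azimuth_deg):
        """Corners of an upright quad standing on the map plane (z = 0),
        oriented perpendicular to the direction of capture so that its face
        is turned back toward the location from which the image was taken."""
        cx, cy = center_xy
        theta = math.radians(azimuth_deg)
        # In-plane unit vector perpendicular to the capture direction.
        px, py = math.cos(theta), -math.sin(theta)
        half = width / 2.0
        bottom_left = (cx - half * px, cy - half * py, 0.0)
        bottom_right = (cx + half * px, cy + half * py, 0.0)
        top_left = (cx - half * px, cy - half * py, height)
        top_right = (cx + half * px, cy + half * py, height)
        return bottom_left, bottom_right, top_left, top_right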
Although the representations of the images 30, 32 and 34 depicted in the example embodiment of Figures 3 and 4 are planar, the representations of the images may have other shapes, such as curved shapes in other embodiments. For example, the representations of the images may be concave with the concavity of the images facing the location from which the images were captured.
Additionally or alternatively, while the representations of the images 30, 32 and 34 appear to extend orthogonally from the underlying map, such as by appearing to extend only in the z-direction that projects orthogonally outward from an xy plane defined by the map, the representations of the images may be tilted so as to define an acute angle with respect to the underlying map, such as by being tilted forwardly or rearwardly in other embodiments. For example, in an instance in which the information regarding the direction along which the image was captured includes not only an azimuth angle, but also an elevation angle, a non-zero elevation angle will define an instance in which the direction in which the image capturing device was pointed included an upward or downward component at the time that the image was captured. In this example
embodiment, the apparatus 10, such as the processor 12, the user interface 16 or the like, may be configured to cause the representation of the image that projects outwardly in the third dimension from the underlying map to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
For example, in an instance in which the elevation angle has a positive value, representative of the image capturing device being tilted upwardly at the time at which the image was captured, the representation of the image may be leaned backward by an amount that is based upon, such as an amount that is proportional to or equal to, the elevation angle, so as to be representative of an image captured by an image capturing device that was at least partially upwardly facing. Alternatively, in an instance in which the elevation angle has a negative value, representative of the image capturing device being tilted downwardly at the time at which the image was captured, the representation of the image may be leaned forward by an amount that is based upon, such as an amount that is proportional to or equal to, the elevation angle, so as to be representative of an image captured by an image capturing device that was at least partially downwardly facing. As such, in this example embodiment, the tilting of the representations of the images in response to a non-zero elevation angle may provide additional context information and may further facilitate the user's efficient and intuitive review and interpretation of the representations of the images superimposed upon the map.
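As a simplified sketch of this tilting, assuming the lean angle is taken equal to the elevation angle (a proportional mapping would work the same way), the top edge of the upright quad computed above may be rotated about its bottom edge:

    import math

    def tilted_top_corners(top_left, top_right, elevation_deg, azimuth_deg):
        """Lean the quad backward (positive elevation) or forward (negative
        elevation) by rotating its top edge about its bottom edge."""
        phi = math.radians(elevation_deg)
        theta = math.radians(azimuth_deg)
        # Unit vector pointing away from the capture location in the map plane.
        ax, ay = math.sin(theta), math.cos(theta)

        def lean(corner):
            x, y, z = corner
            # A point at height z moves horizontally by z*sin(phi) along the
            # capture direction and ends up at height z*cos(phi).
            return (x + z * math.sin(phi) * ax,
                    y + z * math.sin(phi) * ay,
                    z * math.cos(phi))

        return lean(top_left), lean(top_right)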
As described above, representations of still images may be presented upon a map. However, representations of other types of images, such as video images, may be presented upon the map in other embodiments. For example, a thumbnail or other representation of a video may be presented upon the map based upon the location, direction and level of zoom associated with the video such that a user may access the video by selecting, e.g., by clicking upon, the representation of the video. In an instance in which the location, direction or level of zoom associated with the video changes during the course of the video, the representation of the video may be determined in various manners including by being based upon an average value of the parameter(s) that vary, the initial value of the parameter(s) that vary or the final value of the parameter(s) that vary.
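One possible sketch of collapsing such a time-varying parameter into a single representative value, with the selection strategy left as an explicit, assumed argument:

    def representative_value(samples, strategy="average"):
        """Collapse a time-varying parameter of a video (for example, the
        per-frame level of zoom) into the single value used to determine
        the location and size of the video's representation."""
        if not samples:
            raise ValueError("no samples recorded for this parameter")
        if strategy == "average":
            return sum(samples) / len(samples)
        if strategy == "initial":
            return samples[0]
        if strategy == "final":
            return samples[-1]
        raise ValueError(f"unknown strategy: {strategy}")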
As described above, Figure 2 is a flowchart of an apparatus 10, method and computer program product according to example embodiments of the invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other communication devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device 14 of an apparatus employing an embodiment of the present invention and executed by a processor 12 of the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other
programmable apparatus (for example, hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks. These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer- readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart blocks. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer- implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks.
Accordingly, blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
In some embodiments, certain ones of the operations above may be modified or further amplified. Furthermore, in some embodiments, additional optional operations may be included. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings.
Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
identifying, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image;
determining a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image; and
causing a map to be presented with the representation of the image superimposed thereupon.
2. A method according to Claim 1 wherein determining the location and size comprises determining the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image.
3. A method according to Claim 2 wherein the distance from the location at which the image was captured increases as the image is captured at greater levels of zoom.
4. A method according to any one of Claims 1 to 3 wherein determining the location and size comprises determining the size of the representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image.
5. A method according to Claim 4 wherein the size of the representation of the image to be superimposed upon the map decreases as the image is captured at greater levels of zoom.
6. A method according to any one of Claims 1 to 5 wherein causing the map to be presented comprises causing the representation of the image to project outwardly in a third dimension from the map.
7. A method according to Claim 6 wherein the direction along which the image was captured comprises azimuth and elevation angles, and wherein causing the representation of the image to project outwardly in the third dimension comprises causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
8. An apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least:
identify, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image;
determine a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image; and
cause a map to be presented with the representation of the image superimposed thereupon.
9. An apparatus according to Claim 8 wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to determine the location and size by determining the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image.
10. An apparatus according to Claim 9 wherein the distance from the location at which the image was captured increases as the image is captured at greater levels of zoom.
11. An apparatus according to any one of Claims 8 to 10 wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to determine the location and size by determining the size of the
representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image.
12. An apparatus according to Claim 11 wherein the size of the representation of the image to be superimposed upon the map decreases as the image is captured at greater levels of zoom.
13. An apparatus according to any one of Claims 8 to 12 wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to cause the map to be presented by causing the representation of the image to project outwardly in a third dimension from the map.
14. An apparatus according to Claim 13 wherein the direction along which the image was captured comprises azimuth and elevation angles, and wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to cause the representation of the image to project outwardly in the third dimension by causing the representation of the image to be tilted relative to the map in an instance in which the elevation angle has a non-zero value.
15. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein, the computer-executable program code portions comprising program code instructions configured, upon execution, to:
identify, for an image, a location at which the image was captured, a direction along which the image was captured and a level of zoom associated with the image;

determine a location and size of a representation of the image to be superimposed upon a map based upon the location at which the image was captured, the direction along which the image was captured and the level of zoom associated with the image; and
cause a map to be presented with the representation of the image superimposed thereupon.
16. A computer program product according to Claim 15 wherein the program code instructions configured to determine the location and size comprise program code instructions configured to determine the location of the representation of the image to be superimposed upon the map to be a distance from the location at which the image was captured that has a direct relationship to the level of zoom associated with the image.
17. A computer program product according to Claim 16 wherein the distance from the location at which the image was captured increases as the image is captured at greater levels of zoom.
18. A computer program product according to any one of Claims 15 to 17 wherein the program code instructions configured to determine the location and size comprise program code instructions configured to determine the size of the representation of the image to be superimposed upon the map to have an inverse relationship to the level of zoom associated with the image.
19. A computer program product according to Claim 18 wherein the size of the representation of the image to be superimposed upon the map decreases as the image is captured at greater levels of zoom.
20. A computer program product according to any one of Claims 15 to 19 wherein the program code instructions configured to cause the map to be presented comprise program code instructions configured to cause the representation of the image to project outwardly in a third dimension from the map.
PCT/FI2014/050187 2014-03-14 2014-03-14 Method and apparatus for superimposing images upon a map WO2015136144A1 (en)

Priority Applications (1)

Application: PCT/FI2014/050187 (WO2015136144A1, en); Priority Date: 2014-03-14; Filing Date: 2014-03-14; Title: Method and apparatus for superimposing images upon a map

Publications (1)

Publication Number: WO2015136144A1

Family ID: 50439408

Family Applications (1)

Application: PCT/FI2014/050187 (WO2015136144A1, en); Priority Date: 2014-03-14; Filing Date: 2014-03-14; Title: Method and apparatus for superimposing images upon a map

Country Status (1): WO

Citations (4)

* Cited by examiner, † Cited by third party

US20090125234A1 * (priority 2005-06-06, published 2009-05-14) Tomtom International B.V.: Navigation Device with Camera-Info
US20090240431A1 * (priority 2008-03-24, published 2009-09-24) Google Inc.: Panoramic Images Within Driving Directions
US20130162665A1 * (priority 2011-12-21, published 2013-06-27) James D. Lynch: Image view in mapping
WO2013098065A1 * (priority 2011-12-30, published 2013-07-04) Navteq B.V.: Path side image on map overlay


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14715341; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 14715341; Country of ref document: EP; Kind code of ref document: A1)