WO2007113462A1 - Content processing - Google Patents

Content processing

Info

Publication number
WO2007113462A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
metadata
images
persons
processing means
Prior art date
Application number
PCT/GB2007/000790
Other languages
French (fr)
Inventor
David Roger Wisely
Rory Stewart Turnbull
Original Assignee
British Telecommunications Public Limited Company
Priority date
Filing date
Publication date
Application filed by British Telecommunications Public Limited Company
Publication of WO2007113462A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/00127: Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N 1/00132: Connection or combination of a still picture apparatus with another apparatus in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N 1/00137: Transmission
    • H04N 1/0014: Transmission via e-mail
    • H04N 1/00148: Storage
    • H04N 1/32: Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 1/32037: Automation of particular transmitter jobs, e.g. multi-address calling, auto-dialing
    • H04N 1/32101: Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 1/32128: Display, printing, storage or transmission of additional information attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • H04N 2201/00: Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/32: Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N 2201/3201: Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N 2201/3204: Display, printing, storage or transmission of additional information of data relating to a user, sender, addressee, machine or electronic recording medium
    • H04N 2201/3205: Display, printing, storage or transmission of additional information of identification information, e.g. name or ID code
    • H04N 2201/3225: Display, printing, storage or transmission of additional information of data relating to an image, a page or a document
    • H04N 2201/3253: Position information, e.g. geographical position at time of capture, GPS data
    • H04N 2201/3274: Storage or retrieval of prestored additional information
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval of still image data
    • G06F 16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/95: Retrieval from the web
    • G06F 16/958: Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Automation & Control Theory (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

To enable media content to be distributed and stored, metadata associated with a captured image is analysed, together with other data, to determine a context for the image. This enables the media file to be named, stored and/or distributed to selected groups of users in accordance with the context of the file.

Description

Content Processing
The present invention relates to content processing and more particularly to processing the content of captured or stored media for cataloguing and sharing such media.
Large amounts of media content are generated by the use of digital cameras and video cameras. Such content is often transferred to a personal computer (PC) by the user, who then uses programs such as Adobe™ Photo Album to store video or photographic content. This requires the user to name the file in some way if it is to be usable as a reminder of the content of the media. If the user wishes to share the media with family or friends, it is then necessary to prepare pictures for on-line viewing, possibly by changing size or resolution, uploading them to a sharing website and sending a link to the location of the media by e-mail. For non-computer-literate friends, or those absent from home for example, it may be necessary for the user to print out copies of pictures or transfer a copy of the media file to DVD so that it can be shared with others.
This process is time-consuming and relies on the user to add detail as to location, content and other contextual information, because the user is often the only person aware of such information. This makes automating such activities difficult.
On-line photo sharing, for example, is discussed in US 2003/0063770. This disclosure describes a plurality of nodes which store digital images and an on-line peer server which uploads metadata associated with each digital image. This enables a search to be performed on a central server for the location of images having appropriate metadata, so that the images can be retrieved from whichever of the nodes they may be stored on.
In US 2003/0078837 there is a discussion of extracting information from the content of digital images. The specification notes techniques such as face detection and recognition, people detection, and the association of date and time information with specific events such as Christmas celebrations.
EP 1513080 provides an image storage database with associated metadata which categorises image date, location and subject to enable subsequent searching of the metadata to locate approximately matched images based on the searcher's definitions. All of these prior art schemes are at least partially reliant on the way in which the user who captured the images categorises them, and require significant user intervention to correctly associate the images.
According to the invention, there is provided a content processing method in which captured media images are associated with context to effect distribution and/or storage of the media, the method being performed at a processing means and comprising: receiving from a media capture device one or more images and associated image metadata; comparing the received image metadata with further data made available to the processing means thereby to provide updated image metadata; and storing and/or distributing the or each image to one or more other processing means on the basis of said updated image metadata.
The image metadata can include date and/or time metadata and can be compared at said processing means with further data which indicates the location of an identified user over a period of time to provide updated image metadata which represents location information associated with said image.
Alternatively or additionally, the image metadata can include location information generated by the media capture device.
The method can comprise using the location information to identify one or more persons, other than the identified user, having an association with said location and distributing the or each image, or a link thereto, to the one or more persons. Prior to distributing the or each image, the processing means can determine the presence of the one or more persons on a data network and distribute the or each image only to those connected to the network. Prior to distributing the or each image, the processing means may determine the receiving capability of the one or more persons and distribute the or each image to the one or more persons in a format adapted to their respective receiving capability.
The image metadata may include presence information indicative of one or more communications devices in the vicinity of the media capture device when the image was captured, said presence information being compared at the processing means with data identifying persons on the basis of their communications devices thereby to infer the identity of persons present when the image was captured. The method may further comprise distributing the or each image, or a link thereto, to the or each person identified as being present.
According to a further aspect of the invention, there is provided a method of distributing data representing an image or video to one or more known persons, the method being performed at a processing means and comprising establishing a data link with a media capture device, receiving therefrom one or more images or videos and associated metadata generated at the media capture device, and, for a given image or video, identifying, on the basis of the received metadata and stored data relating the metadata to a location, one or more persons associated with said location, distribution subsequently being effected to distribute the image or video to the or each identified person.
According to a further aspect of the invention, there is provided a system for processing captured media images to determine the context of the captured images, the system including processing means arranged to receive from a media capture device one or more images and associated image metadata; means for comparing the received image metadata with further data made available to the processing means thereby to provide updated image metadata; and means for distributing and/or storing the or each image on the basis of said updated image metadata.
In the disclosure, there is described a content processing method in which captured media images are associated with context by comparing image capture data, location data and known associations between persons to effect distribution and storage of media in dependence upon image content of the captured media images. Metadata for each image can be created using the date, time and location of the capture as well as information derived from nearby devices or from post-capture processing (e.g. image recognition) operations. The location data may be derived from location signals received directly by the device, for example GPS signals or signals derived from mobile telephony networks. Such data may be enhanced by comparison between captured images having closely related time and date metadata. Images may be further analysed for the presence of known associates of a system user by comparing objects such as people with known objects associated with the user. The user's profile (hobbies, friends, diary, home locations etc.) may be used for further, more detailed, processing, for example using events from a diary to add context to captured multimedia. Distribution of captured images may be effected by determining, from a selected list of recipients, the capability of each recipient to receive images and transmitting or dispatching the images in an appropriate format for each recipient.
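Purely as an illustration (the patent itself discloses no source code), the following sketch shows one way the metadata creation and enrichment described above could be modelled. All class names, fields and the calendar lookup are hypothetical assumptions rather than anything taken from the specification.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ImageMetadata:
    """Metadata uploaded with a captured image (hypothetical structure)."""
    captured_at: datetime                      # date/time recorded by the capture device
    gps_location: Optional[str] = None         # e.g. "51.5007,-0.1246" if the device has GPS
    nearby_devices: list[str] = field(default_factory=list)  # device IDs seen at capture time
    inferred_location: Optional[str] = None    # filled in by the processing means
    inferred_event: Optional[str] = None       # filled in by the processing means

def enrich(meta: ImageMetadata,
           calendar: dict[datetime, tuple[str, str]]) -> ImageMetadata:
    """Update image metadata using 'further data' available to the processing means.

    `calendar` maps event start times to (event name, location); in a real system
    this might come from the user's diary or mobile phone location records.
    """
    if meta.gps_location:
        meta.inferred_location = meta.gps_location
    # Fall back to (or confirm against) the user's whereabouts on the capture date;
    # if several events share the date, the last one found wins in this sketch.
    for start, (event, location) in sorted(calendar.items()):
        if start.date() == meta.captured_at.date():
            meta.inferred_event = event
            meta.inferred_location = meta.inferred_location or location
    return meta
```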
A content processing method in accordance with the invention will now be described, by way of example only, with reference to the accompanying drawings, of which: Figure 1 shows an overview of the system used by the method; Figure 2 shows a flow chart used by the system of figure 1; and
Figures 3 and 4 are schematic representations of a data store used by the servers of figure 1.
Referring to the drawings, and in particular to figure 1, content processing software 1, which may be network-based (for example with a service provider) or may be included in a personal computer, receives uploaded content from a camera, mobile phone or other device. This receiving may be by means of the user simply inserting a camera card into the PC, or by automated uploading of data by way of Bluetooth, an infrared connection, a wireless internet connection, a cellular link or a standard download cable.
The user may choose options such as automatic transmission of photographs associated with a particular group of associates to all members of that particular group, in which case no further action is required of the user for the processing and distribution of the downloaded photographs. Alternatively, the user may have chosen to have more control over the distribution of images in which case, before or after context processing of the video content, the user may be offered the opportunity to select one or more groups to whom the image could be distributed. Simple rules may be created by the network or user to set preferences for media sharing - e.g. "when I take a photograph including anyone on my buddy list then it is automatically sent to everyone on that list".
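As a minimal sketch of such a rule, assuming the set of people recognised in an image and the buddy list are already available (both names below are hypothetical), the "send to everyone on the list" behaviour could be expressed as:

```python
def recipients_for(image_people: set[str], buddy_list: set[str]) -> set[str]:
    """If anyone on the buddy list appears in the image, send the image to
    everyone on that list; otherwise send it to nobody (hypothetical rule)."""
    return set(buddy_list) if image_people & buddy_list else set()

# Jane is on the buddy list and appears in the photo, so the whole list is returned.
print(recipients_for({"Jane", "Bill"}, {"Jane", "Tom", "Alison"}))
```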
The capture device, for example a camera or camera phone 2, may be loaded with a software program which automatically associates metadata with each captured image and, if fitted with Bluetooth or another receiving device, may also capture location and presence information from, for example, other mobile phones present in the vicinity, or GPS data available to it. From the metadata, the context of the captured media is analysed by the content processing program 1, and other information directly available to the user is used to further enhance the context data generated. Such information may be contained in the user's calendar, from which an inference as to the user's location and the events taking place can be drawn.
Analysis of the metadata by a presence server 3, comparing it with contact data in the user's electronic address book, may enable the presence of other people known to the user to be determined, and on-line access to contacts' calendar information may be used to confirm the expected presence or absence of other people at the same event. Such other services available on-line to network-based servers may include access to mobile phone location records, where permitted, to determine, confirm or enhance context information relating to captured images. Further, image recognition techniques may also be used to compare video information with stored images of contacts or objects to assist in the determination of presence, location and/or event information, and such presence may be used to repeat certain processing loops to determine that user's presence information, location and the like.
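A simple sketch of this presence inference step, assuming the capture device reports identifiers of nearby devices and the address book maps such identifiers to contact names (all identifiers and the mapping are invented for illustration):

```python
def infer_present_contacts(nearby_device_ids: list[str],
                           address_book: dict[str, str]) -> list[str]:
    """Match identifiers of devices seen near the capture device (e.g. Bluetooth
    addresses or phone numbers) against the user's address book to infer who was
    present when the image was captured."""
    return [address_book[dev] for dev in nearby_device_ids if dev in address_book]

book = {"AA:BB:CC:DD:EE:FF": "Jane", "+447700900123": "Bill"}
# One of the two devices seen nearby belongs to a known contact.
print(infer_present_contacts(["AA:BB:CC:DD:EE:FF", "11:22:33:44:55:66"], book))  # ['Jane']
```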
Having generated additional metadata to associate with the captured image, the user's preferred storage and distribution is activated within the content processing service. Thus, if the user has set particular values for image distribution, say, photographs of a netball team at a netball match to be distributed to all members of the team, the content processing function 1 may cause the upload of the image to a content database 4, and a content server 5 may use the associated metadata to cause distribution of the file so uploaded.
Thus if one member of the distribution list is known to be using a laptop or personal computer 6, which may be determined from network presence information, the content server may send an email message to that user either with the picture as an attachment or with a hyperlink to the file in the content database 4.
It will be noted that although the content database 4 is shown as a single entity in Figure 1, captured images may be stored in any accessible database with appropriate internet access, so that the context of a captured image may determine one or more locations at which the image file should be stored.
Other users' receiving apparatus and location may also determine the format and pre-processing of captured images for distribution. Thus, if one user has a mobile phone capable of displaying small images, the image file may be processed into an MMS message for transmission to that user. If another user in a particular group is not contactable by email or mobile phone but needs to receive a copy of a captured image (say, granddad should always get pictures of his granddaughter), the content server may cause the captured image to be printed for posting to that user.
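The choice between e-mail, MMS and print described above can be pictured as a small dispatch step; the recipient record and the returned action strings below are hypothetical stand-ins for the real delivery mechanisms:

```python
def dispatch(image_ref: str, recipient: dict) -> str:
    """Choose a delivery action based on a recipient's receiving capability
    and network presence (sketch only)."""
    capability = recipient.get("capability")
    if capability == "pc" and recipient.get("online"):
        return f"email {recipient['name']} a hyperlink to {image_ref}"
    if capability == "mobile":
        return f"send {recipient['name']} an MMS with a resized copy of {image_ref}"
    # No electronic contact details: fall back to a printed copy sent by post.
    return f"queue a print of {image_ref} for posting to {recipient['name']}"

print(dispatch("content-db/4711.jpg", {"name": "Granddad", "capability": None}))
```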
Now referring also to figure 2, the process may be considered as operating in the following manner. A media capture device 2, such as a camera, captures an image 21 and generates a file for transfer to another device. At the same time, depending on its capability, the device also captures as much context information 22 as possible, for example by using GPS to capture its location, Bluetooth to discover other devices in its vicinity, or localised information beacons which may be present in public buildings and communicate by Bluetooth or infra-red. Wireless LAN facilities in a building may also be used to assist determination of location.
Having captured an image or images, the media device 2 may be programmed to look for an opportunity to transfer the captured image file, with its metadata, to its base PC. This may be done in any of a number of ways, for example by communicating the information by way of a GPRS-enabled telephone or by locating a wireless LAN hotspot; once such a connection is available, the captured media content and context data are uploaded 23.
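A device-side sketch of this capture-then-upload behaviour is shown below; `camera` and `uploader` are hypothetical interfaces standing in for the device's sensors and whatever GPRS or wireless LAN link becomes available:

```python
import time

def capture_and_upload(camera, uploader, poll_seconds: int = 60) -> None:
    """Bundle a captured image with whatever context is available, then wait for
    an upload opportunity and transfer both (steps 21-23 of figure 2, sketched)."""
    image = camera.capture()
    context = {
        "time": camera.clock(),             # timestamp from the device clock
        "gps": camera.gps_fix(),            # None if no GPS signal is available
        "nearby": camera.bluetooth_scan(),  # identifiers of devices in the vicinity
    }
    while not uploader.link_available():    # look for a transfer opportunity
        time.sleep(poll_seconds)
    uploader.send(image, context)           # upload media content and context data
```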
When the uploaded data is received by the content processing software (reference numeral 1 of Figure 1), it and the content server 5 collect additional context data 24 using mobile phone records, image recognition, calendar data and the like, as hereinbefore mentioned. This enables the completion of metadata 25 to associate with the captured media, and its storage 26.
The user's menu of options is then consulted to determine the distribution criteria for the particular media, based on the metadata which defines the context of the captured media. Assuming that the user has determined that the image should be shared 27, the sharing group is determined and individual user context and preferences may be discovered, so that the captured media file may be reformatted and information sent to the recipient group in dependence upon the recipients' contexts and preferences. Turning then to figures 3 and 4, the user may store data relating specifically to the distribution of captured images. Such a data store may be populated automatically or semi-automatically from a user's existing calendar information, with the user adding additional information when necessary.
Thus, when a captured media image is received, the context of the image is determined as far as possible from the metadata and the calendar information of the owner. The content processing may continue to look for additional information which may allow the image to be categorised and named. Thus, the metadata captured by the camera may include the identity of other mobile devices, such as phones, at the same location. By comparing the phone number with phone numbers (or similar identifying data) in the database, the presence of other parties at the location may be determined. Thus, if "Jane" is identified as being in the vicinity of the camera at the time of image capture, looking into Jane's public calendar (column 7 providing an address) could confirm the location and/or the event. An intelligent search can be carried out using a dictionary of corresponding words or phrases to determine a possible title for an event.
Thus, if the user's own calendar refers to Jane's Wedding and a captured image features Jane, it is possible to produce an intelligent title such as Jane at her Wedding. Similarly, if the captured image features another person, say Bill, the context processing may produce a title such as "Bill at Jane's Wedding."
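A toy version of this titling rule, assuming the calendar entry follows the "Jane's Wedding" pattern and the featured person has already been identified (both assumptions, not part of the specification):

```python
def suggest_title(person_in_image: str, calendar_event: str) -> str:
    """Build a title from an identified person and the owner's calendar entry,
    as in the "Bill at Jane's Wedding" example (simplified, hypothetical rule)."""
    event_owner = calendar_event.split("'")[0]      # "Jane's Wedding" -> "Jane"
    if person_in_image == event_owner:
        event_noun = calendar_event.split()[-1]     # -> "Wedding"
        return f"{person_in_image} at her {event_noun}"
    return f"{person_in_image} at {calendar_event}"

print(suggest_title("Jane", "Jane's Wedding"))  # Jane at her Wedding
print(suggest_title("Bill", "Jane's Wedding"))  # Bill at Jane's Wedding
```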
Having determined that the user has set up a special group (W1) relating to Jane's Wedding, and has requested distribution of images to that group, the context processing function 1 deposits the file and metadata automatically on the website set up for Jane's Wedding, shown in column 2 of Figure 4, which gives relationships between the various groups, locations of websites and membership relationships.
In figure 3, some categories of data which may be used by the system are shown associated with people in the user's contacts. Thus, each contact may have an associated telephone number which may be used (in the case of a mobile number) to assist both in locating the individual and in distributing images to that person.
A "JPG" link (Column 4) provides an indicator of the location of a comparison picture for the individual so that the potential for identification of people in a captured media image is increased. There may of course be a reverse link between photos used for comparison and individuals in the data base so that, if the content processing or presence server picks out from (say) "MyPhotos" a potential presence of a contact in the image, it can be cross related through the data present with groups and the like.
Column 5 may contain address information or a link to address information so that if it is necessary to post a print of a captured media image, this may be done.
Column 6 contains data defining the group or groups to which an individual belongs so that the user may define the image distribution arrangement. Thus, as shown, Jane belongs to (at least) B1, B2, G3 and of course W1 (previously defined as Jane's Wedding), so that pictures involving Jane or her wedding might be distributed to members of the W1 group and/or placed on the W1 website.
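The per-contact record and the group lookup sketched in figures 3 and 4 might be modelled as below; the field names mirror the columns discussed in the text but are otherwise hypothetical:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Contact:
    """One row of the data store sketched in figure 3 (hypothetical layout)."""
    name: str
    phone: Optional[str] = None      # column 3: used for locating the person and for MMS
    photo_ref: Optional[str] = None  # column 4: "JPG" link to a comparison picture
    address: Optional[str] = None    # column 5: postal address for printed copies
    groups: set[str] = field(default_factory=set)  # column 6: e.g. {"B1", "W1"}
    preferences: str = ""            # final column: user-entered or network-derived

def members_of(group: str, contacts: list[Contact]) -> list[Contact]:
    """Resolve a distribution group (e.g. W1, Jane's Wedding) to its members."""
    return [c for c in contacts if group in c.groups]

contacts = [Contact("Jane", phone="+447700900001", groups={"B1", "B2", "G3", "W1"}),
            Contact("Bill", groups={"W1"})]
print([c.name for c in members_of("W1", contacts)])  # ['Jane', 'Bill']
```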
Calendar information is held to facilitate location and/or event identification, as previously mentioned, and in the final column of figure 3 preferences are noted which may be either entered by the user or gleaned from network data for the particular individual.
Thus the invention provides for the automatic or semi-automatic identification of captured media images and their titling, distribution, storage in appropriate places and conversion into appropriate formats for distribution.
As a further benefit, because there is commonality between groups, it is possible for images captured by other members of the same group to be added into the same distribution set (for example by adding the images to a common website address), so that an event may be shared between those present at the event and those not present but having an interest therein.
An example of such an event may be as follows, in the creation and sharing of the record of a wedding. In addition to any official photography or videoing, typically many guests take photos and videos; traditionally some of these might be shared independently on the web or printed out and sent to the couple. In this case, however, some guests are using video cameras and cameras with context-content processing software and network links. Tom's camera is able to find out the exact location of the pictures by means of the GPS on his mobile phone (the two communicating over Bluetooth). Tom's camera uploads the pictures from the hotel where the reception is taking place (as it is equipped with wireless LAN). When the photos reach the network content processing server, it is able to see Tom's calendar and that he is scheduled to be at "Mary's Wedding". From Tom's buddy list and address book this is resolved to Mary Wilson and her partner Herbie Farrow. Stored pictures of Mary, Herbie and Tom's friends are matched to the captured pictures, as well as any information gleaned by Tom's camera from mobile phones close by when the picture was taken. The pictures are then stored, with the created metadata, on the network server and Tom is sent a message asking what he wants to do with the pictures. He indicates that he wants to share them with everyone on his buddy list who is also a buddy of Mary or Herbie. The network server then determines the presence of this group, establishing who is contactable and on what network, and re-formats the pictures for each user. So Alison gets four MMS picture messages, while Helen gets an email with a web-link to an online album as she has left all her devices at home!
Later, Mark, who has also taken some pictures and video sequences, is able to link his content with Tom's. Eventually, Mary (back from the honeymoon) is able to see all the content and, with the help of a menu-wizard, to create a summary set of pictures and videos from all the contributors and to share this with everyone who attended the wedding.

Claims

1. A content processing method in which captured media images are associated with context to effect distribution and/or storage of the media, the method being performed at a processing means and comprising: receiving from a media capture device one or more images and associated image metadata; comparing the received image metadata with further data made available to the processing means thereby to provide updated image metadata; and storing and/or distributing the or each image to one or more other processing means on the basis of said updated image metadata.
2. A content processing method as claimed in claim 1 in which the image metadata includes date and/or time metadata and is compared at said processing means with further data which indicates the location of an identified user over a period of time to provide updated image metadata which represents location information associated with said image.
3. A content processing method as claimed in claim 1 in which the image metadata includes location information generated by the media capture device.
4. A content processing method as claimed in claim 2 or claim 3, comprising using the location information to identify one or more persons, other than the identified user, having an association with said location and distributing the or each image, or a link thereto, to the one or more persons.
5. A content processing method as claimed in claim 4, wherein prior to distributing the or each image, the processing means determines the presence of the one or more persons on a data network and distributes the or each image only to those connected to the network.
6. A content processing method as claimed in claim 4 or claim 5, wherein prior to distributing the or each image, the processing means determines the receiving capability of the one or more persons and distributes the or each image to the one or more persons in a format adapted to their respective receiving capability.
7. A content processing method as claimed in claim 1 in which the image metadata includes presence information indicative of one or more communications devices in the vicinity of the media capture device when the image was captured, said presence information being compared at the processing means with data identifying persons on the basis of their communications devices thereby to infer the identity of persons present when the image was captured.
8. A content processing method as claimed in claim 7, further comprising distributing the or each image, or a link thereto, to the or each person identified as being present.
9. A method of distributing data representing an image or video to one or more known persons, the method being performed at a processing means and comprising establishing a data link with a media capture device, receiving therefrom one or more images or videos and associated metadata generated at the media capture device, and, for a given image or video, identifying, on the basis of the received metadata and stored data relating the metadata to a location, one or more persons associated with said location, distribution subsequently being effected to distribute the image or video to the or each identified person.
10. A computer program stored on a computer readable medium, the computer program, when executed on a processor, causing the processor to carry out the method of any one of the preceding claims.
11. A system for processing captured media images to determine the context of the captured images, the system including processing means arranged to receive from a media capture device one or more images and associated image metadata; means for comparing the received image metadata with further data made available to the processing means thereby to provide updated image metadata; and means for distributing and/or storing the or each image on the basis of said updated image metadata.
PCT/GB2007/000790 2006-03-30 2007-03-07 Content processing WO2007113462A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP06251776 2006-03-30
EP06251776.8 2006-03-30

Publications (1)

Publication Number Publication Date
WO2007113462A1 (en)

Family

ID=36829804

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2007/000790 WO2007113462A1 (en) 2006-03-30 2007-03-07 Content processing

Country Status (1)

Country Link
WO (1) WO2007113462A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040008872A1 (en) * 1996-09-04 2004-01-15 Centerframe, Llc. Obtaining person-specific images in a public venue
GB2367158A (en) * 2000-09-26 2002-03-27 6S Ltd Archiving and retrieving items based on episodic memory of groups of people
EP1280329A2 (en) * 2001-06-26 2003-01-29 Eastman Kodak Company An electronic camera and system for transmitting digital image files over a communication network
US20040239765A1 (en) * 2003-05-29 2004-12-02 Casio Computer Co., Ltd. Photographed image transmitting apparatus
GB2403099A (en) * 2003-06-20 2004-12-22 Hewlett Packard Development Co Sharing image items

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
COUNTS S, FELLHEIMER E: "Supporting Social Presence through Lightweight Photo Sharing On and Off the Desktop", PROCEEDINGS OF THE 2004 SIGCHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 24 April 2004 (2004-04-24), Vienna, Austria, pages 599 - 606, XP002396527, ISBN: 1-58113-702-8, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/990000/985768/p599-counts.pdf?key1=985768&key2=1600776511&coll=&dl=ACM&CFID=15151515&CFTOKEN=6184618> [retrieved on 20060825] *
CRABTREE A, RODDEN T, MARIANI J: "Collaborating around Collections: Informing the continued Development of Photoware", PROCEEDINGS OF THE 2004 ACM CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK, 6 November 2004 (2004-11-06), Chicago, Illinois, USA, pages 396 - 405, XP002396534, ISBN: 1-58113-810-5, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/1040000/1031673/p396-crabtree.pdf?key1=1031673&key2=5850776511&coll=&dl=ACM&CFID=15151515&CFTOKEN=6184618> [retrieved on 20060825] *
GIRGENSOHN A, ADCOCK J, WILCOX L: "Leveraging Face Recognition Technology to Find and Organize Photos", PROCEEDINGS OF THE 6TH ACM SIGMM INTERNATIONAL WORKSHOP ON MULTIMEDIA INFORMATION RETRIEVAL, 15 October 2004 (2004-10-15), New York, USA, pages 99 - 106, XP002396526, ISBN: 1-58113-940-3, Retrieved from the Internet <URL:http://delivery.acm.org/10.1145/1030000/1026728/p99-girgensohn.pdf?key1=1026728&key2=8059676511&coll=&dl=ACM&CFID=15151515&CFTOKEN=6184618> [retrieved on 20060825] *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009070841A1 (en) * 2007-12-05 2009-06-11 It Au0801806Rsity Of Technology Social multimedia management
EP2211529A1 (en) 2009-01-21 2010-07-28 Samsung Electronics Co., Ltd. Method for sharing file between control point and media server in a dlna system, and system thereof
US8319837B2 (en) 2009-01-21 2012-11-27 Samsung Electronics Co., Ltd Method for sharing file between control point and media server in a DLNA system, and system thereof
WO2012112449A1 (en) * 2011-02-18 2012-08-23 Google Inc. Automatic event recognition and cross-user photo clustering
US10140552B2 (en) 2011-02-18 2018-11-27 Google Llc Automatic event recognition and cross-user photo clustering
US11263492B2 (en) 2011-02-18 2022-03-01 Google Llc Automatic event recognition and cross-user photo clustering
US9355387B1 (en) 2011-03-17 2016-05-31 Google Inc. System and method for event management and information sharing
US8914483B1 (en) 2011-03-17 2014-12-16 Google Inc. System and method for event management and information sharing
US9143601B2 (en) 2011-11-09 2015-09-22 Microsoft Technology Licensing, Llc Event-based media grouping, playback, and sharing
US9280545B2 (en) 2011-11-09 2016-03-08 Microsoft Technology Licensing, Llc Generating and updating event-based playback experiences
US11036782B2 (en) 2011-11-09 2021-06-15 Microsoft Technology Licensing, Llc Generating and updating event-based playback experiences
EP2777012A4 (en) * 2011-11-09 2015-01-14 Microsoft Corp Event-based media grouping, playback, and sharing
EP2777012A2 (en) * 2011-11-09 2014-09-17 Microsoft Corporation Event-based media grouping, playback, and sharing
US10270824B2 (en) 2012-06-27 2019-04-23 Google Llc System and method for event content stream
US9391792B2 (en) 2012-06-27 2016-07-12 Google Inc. System and method for event content stream
US9954916B2 (en) 2012-06-27 2018-04-24 Google Llc System and method for event content stream
US10115118B2 (en) 2012-10-23 2018-10-30 Google Llc Obtaining event reviews
US9418370B2 (en) 2012-10-23 2016-08-16 Google Inc. Obtaining event reviews
DE102013007248A1 (en) 2013-04-19 2014-10-23 Rainer Lutze A procedure for collecting, aggregating, updating and disseminating post-paid, assessable, content-correctable and verifiable messages on public events of general interest
US10769362B2 (en) 2013-08-02 2020-09-08 Symbol Technologies, Llc Method and apparatus for capturing and extracting content from documents on a mobile device
US10409858B2 (en) 2013-08-02 2019-09-10 Shoto, Inc. Discovery and sharing of photos between devices
US10140257B2 (en) 2013-08-02 2018-11-27 Symbol Technologies, Llc Method and apparatus for capturing and processing content from context sensitive documents on a mobile device
US11146520B2 (en) 2015-09-28 2021-10-12 Google Llc Sharing images and image albums over a communication network
US10476827B2 (en) 2015-09-28 2019-11-12 Google Llc Sharing images and image albums over a communication network
KR102230433B1 (en) * 2016-12-07 2021-03-24 어드밴스드 뉴 테크놀로지스 씨오., 엘티디. Method and device for implementing service operations based on images
EP3553675A4 (en) * 2016-12-07 2019-10-23 Alibaba Group Holding Limited Picture-based method and apparatus for implementing service operations
KR20190089994A (en) * 2016-12-07 2019-07-31 알리바바 그룹 홀딩 리미티드 Method and device for implementing service operations based on images
US10432728B2 (en) 2017-05-17 2019-10-01 Google Llc Automatic image sharing with designated users over a communication network
US11212348B2 (en) 2017-05-17 2021-12-28 Google Llc Automatic image sharing with designated users over a communication network
US11778028B2 (en) 2017-05-17 2023-10-03 Google Llc Automatic image sharing with designated users over a communication network

Similar Documents

Publication Publication Date Title
WO2007113462A1 (en) Content processing
US9936086B2 (en) Wireless image distribution system and method
JP4668200B2 (en) Method and system for automatically sharing digital images over a communication network
KR101384931B1 (en) Method, apparatus or system for image processing
US20090300109A1 (en) System and method for mobile multimedia management
JP5016101B2 (en) Digital photo content information service
US8655976B2 (en) Digital file distribution in a social network system
US10003762B2 (en) Shared image devices
US10459968B2 (en) Image processing system and image processing method
US20080052349A1 (en) Methods and System for Aggregating Disparate Batches of Digital Media Files Captured During an Event for the Purpose of Inclusion into Public Collections for Sharing
JP2009259238A (en) Storage device for image sharing and image sharing system and method
JP2014503091A (en) Friends and family tree for social networking
JP2009259239A (en) Storage device for image sharing and image sharing and method
FI115364B (en) Imaging profile for digital imaging
JP2011175426A (en) Communication terminal device in which electronic mail is available
JP2006053758A (en) Image classification device, method and program
JP2006113726A (en) Database service system on web
JP2017037572A (en) Message information providing method, server for deposition service of message information and application program
KR20150061727A (en) Method for providing photograph related service

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07712848

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07712848

Country of ref document: EP

Kind code of ref document: A1