AU2013254921A1 - Method, apparatus and system for determining a label for a group of individuals represented in images - Google Patents


Info

Publication number
AU2013254921A1
Authority
AU
Australia
Prior art keywords
group
individuals
images
event
social
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2013254921A
Inventor
Nicholas Grant Fulton
Alex Penev
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2013254921A priority Critical patent/AU2013254921A1/en
Publication of AU2013254921A1 publication Critical patent/AU2013254921A1/en
Abandoned legal-status Critical Current


Landscapes

  • Image Analysis (AREA)

Abstract

METHOD, APPARATUS AND SYSTEM FOR DETERMINING A LABEL FOR A GROUP OF INDIVIDUALS REPRESENTED IN IMAGES

A method of determining a label for a group of individuals is disclosed. An image collection (185) is received, a plurality of images of the image collection representing one or more individuals. At least two of the individuals are associated together to determine a group based on appearance of the associated individuals in common ones of the images. One or more social events captured by the common images are identified, based at least on metadata associated with the common images, each social event corresponding to at least one event attribute. A group attendance profile is determined based on the event attributes of the social events attended by the individuals in the group and using the metadata. A label is determined for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.

8018089v3 (PO89043_SpeciAs Filed)

Description

METHOD, APPARATUS AND SYSTEM FOR DETERMINING A LABEL FOR A GROUP OF INDIVIDUALS REPRESENTED IN IMAGES

FIELD OF INVENTION

The present invention relates to digital photography, and in particular, to categorising groups of individuals based on their attendance behaviour captured in image collections. The present invention also relates to a method and apparatus for determining a social category label, and to a computer program product including a computer readable medium having recorded thereon a computer program for determining a social category label.

DESCRIPTION OF BACKGROUND ART

Digital cameras enjoy widespread use today. Such devices use one or more sensors to capture light from a scene and convert the captured light to a digital image, commonly as Joint Photographic Experts Group (JPEG) or RAW format files. The files may be later transferred to other devices. The popularity and ease-of-use of modern cameras allow users to capture hundreds or even thousands of photos at a single event, and consequently amass very large personal image collections that span numerous events over many years. Typical tasks that users can perform with their captured images include printing images to physical media (e.g. photo books), digital sharing of images with friends and relatives via electronic channels such as email and social media, and publishing images online to the general public.

Users often find it difficult to determine which images should be used for a task. The large number of images in a user's collection means that selection of the images requires significant effort and can discourage the user from undertaking the task. Secondly, users often also find it difficult to choose suitable recipients, such as determining which particular friends or relatives would most like to see which particular images.
The large number of acquaintances, particularly online acquaintances, many people share today means that personalising a selection of images for an audience requires significant effort and can discourage the user from sharing images directly to specific friends. Thirdly, users often also find it difficult to add metadata to images, such as determining what descriptions could be used to label particular images or events. The large number of images and events means that labelling requires significant effort and can discourage the user from adding descriptions.

Some conventional image selection methods allow users to select the best images to share. However, such conventional image selection methods do not provide support for choosing recipients and adding metadata to images.

SUMMARY OF THE INVENTION

It is an object of the present invention to substantially overcome, or at least ameliorate, one or more disadvantages of existing arrangements. Disclosed are arrangements which seek to address the above disadvantages of existing arrangements by analysing image collections to determine social category labels of individuals captured in the images, which helps with recipient selection and with automatic metadata labelling.
According to one aspect of the present disclosure there is provided a method of determining a label for a group of individuals, the method comprising:
receiving an image collection, a plurality of images of said image collection representing one or more individuals;
associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images;
identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute;
determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and
determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
According to another aspect of the present disclosure there is provided an apparatus for determining a label for a group of individuals, the apparatus comprising:
means for receiving an image collection, a plurality of images of said image collection representing one or more individuals;
means for associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images;
means for identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute;
means for determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and
means for determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
According to another aspect of the present disclosure there is provided a system for determining a label for a group of individuals, the system comprising:
a memory for storing data and a computer program;
a processor coupled to the memory for executing the computer program, said computer program comprising instructions for:
receiving an image collection, a plurality of images of said image collection representing one or more individuals;
associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images;
identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute;
determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and
determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
According to still another aspect of the present disclosure there is provided a computer readable medium having a computer program stored thereon for determining a label for a group of individuals, the program comprising:
code for receiving an image collection, a plurality of images of said image collection representing one or more individuals;
code for associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images;
code for identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute;
code for determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and
code for determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.

Other aspects of the invention are also disclosed.

BRIEF DESCRIPTION OF THE DRAWINGS

One or more embodiments of the invention will now be described with reference to the following drawings, in which:

Figs. 1A and 1B collectively form a schematic block diagram representation of a computer system upon which arrangements described can be practiced;
Fig. 1C shows a schematic block diagram of a software architecture for use in the described arrangements;
Fig. 2 is a schematic flow diagram showing a method of determining a label for a group of individuals represented in images of an image collection;
Fig. 3A shows an image collection comprising images representing at least two events;
Fig. 3B shows a graph representing relationship between events during a time period;
Fig. 3C shows an example event represented by images captured of several individuals;
Fig. 3D shows event attributes that may describe the event shown in Fig. 3C;
Fig. 4A shows a network graph of the faces of individuals whose interactions are captured in an image collection, and shows several groups of individuals;
Fig. 4B lists social categories that may describe groups such as those in Fig. 4A;
Fig. 5 shows an example group attendance profile for a group of individuals, such as a group in Fig. 4A; and
Fig. 6 shows an example of a predetermined group attendance profile associated with a social category, such as a social category from Fig. 4B.

DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

Where reference is made in any one or more of the accompanying drawings to steps and/or features which have the same reference numerals, those steps and/or features have for the purposes of this description the same function(s) or operation(s), unless the contrary intention appears.

Methods and apparatus for categorising groups of individuals into predetermined social categories, based on shared group attendance behaviour of those individuals represented in an image collection, are described below. A user's personal image collection typically contains images representing faces of numerous individuals that the user knows and interacts with, such as family, friends and acquaintances of the user. In the described methods, the individuals may be automatically determined in the image collection using methods such as facial recognition. A user's personal image collection also contains events that have been photographed over time. Examples of events that are often photographed include vacations, birthday parties, dinners, picnics, weddings and so on. Therefore, a typical image collection contains images representing both a plurality of events and a plurality of individuals known to the user who have attended the events.
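An image collection of this kind thus pairs per-image metadata (capture time, location) with the recognised faces in each photo. As a purely illustrative sketch of such a record (the specification does not prescribe a data model, and all names here are assumptions of this sketch):

```python
from dataclasses import dataclass, field

@dataclass
class Image:
    """Minimal illustrative record for one photo in a collection.
    The field names are hypothetical, not taken from the specification."""
    timestamp: float            # capture time, e.g. hours since some epoch
    faces: list = field(default_factory=list)   # identifiers of recognised individuals
    location: str = ""          # optional capture location from metadata
```

Later steps in the description (event identification, group detection, profile generation) can all be read as operations over lists of such records.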
Network graphs of individuals who likely know each other may be created based on an analysis of images in an image collection. Determining that two individuals know each other may be achieved by counting interactions of the individuals in the images of the image collection, where two individuals are assumed to interact if the two individuals co-attend an event or co-appear in images together. A resulting network graph may be used to discover the key individuals represented in images of a collection. The key individuals are typically family and closest friends of an owner of the image collection, since such individuals are photographed most frequently.

Community detection methods related to graph theory may be used to determine groups of densely interacting individuals in a collection of images. Densely interacting individuals are individuals in a group who interact regularly with other individuals in the same group. One such group of individuals is typically the user's nuclear family, including parents, children and/or siblings of the user.

Age-estimation and gender-estimation of faces associated with individuals of a family group may be used to identify particular roles within the family, and to answer questions such as "who is the user's mother?". For example, an individual one generation older and female may be identified as the mother of the user. Similar questions such as "who is the user's father?", "who are the grandparents?", and so on may also be answered using similar methods for identifying individuals in an image. The well-defined roles within family units allow age-estimation and gender-estimation methods to achieve high accuracy. Unfortunately, age-estimation and gender-estimation methods are typically not suitable for identifying non-family roles or for dealing with non-family groups of individuals, due to a lack of such well-defined roles.
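The interaction counting and community detection described above can be sketched as follows. This is an illustrative sketch only: the specification does not fix an algorithm, the function names and the `min_count` threshold are assumptions of this sketch, and connected components of a thresholded co-appearance graph stand in for the more sophisticated community detection methods the description alludes to.

```python
from collections import defaultdict
from itertools import combinations

def build_interaction_graph(images, min_count=2):
    """Count pairwise co-appearances of individuals across images and keep
    an edge only when a pair co-appears at least min_count times.
    `images` is a list of face-identifier lists, one list per photo."""
    counts = defaultdict(int)
    for faces in images:
        for a, b in combinations(sorted(set(faces)), 2):
            counts[(a, b)] += 1
    graph = defaultdict(set)
    for (a, b), n in counts.items():
        if n >= min_count:
            graph[a].add(b)
            graph[b].add(a)
    return graph

def detect_groups(graph):
    """Naive stand-in for community detection: connected components of
    the thresholded interaction graph, found by depth-first search."""
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            cur = stack.pop()
            if cur in component:
                continue
            component.add(cur)
            stack.extend(graph[cur] - component)
        seen |= component
        groups.append(component)
    return groups
```

With this sketch, individuals who appear together only once (a passer-by in a single photo, say) never form an edge, so they do not pull unrelated people into the same group.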
A typical user is likely to have many different social groups of individuals represented in images of their image collection who are not family. For example, a social group may be labelled as "my circle of close friends", "my workmates", "friends I met overseas", "friends I met through my spouse", "friends I met through my hobby", "families of my daughter's school friends", and so on. The described methods use community detection on a network graph of individuals in an image collection to detect groups of individuals.

The described methods may also be used to determine to which social category the groups of individuals belong. The described methods determine which social category the groups of individuals belong to using attributes of shared experiences and attendance behaviour of a group, as captured in an image collection. In accordance with the described methods, groups of individuals are determined and the groups of individuals are categorised into one or more predetermined social categories using analysis of shared attendance behaviour. The shared attendance behaviour is referred to herein as an attendance behaviour profile.

Figs. 1A and 1B depict a computer system 100, upon which the various arrangements described can be practiced.

As seen in Fig. 1A, the computer system 100 includes a server computer module 101 communicating with user camera devices 181-A, 181-B, 181-C and 181-D via a communications network 120. In the example of Fig. 1A, the user camera devices include a personal computer 181-A, a tablet device 181-B, a mobile phone 181-C and/or a digital camera 181-D, which may all be used by a user to capture images for an image collection 185 as seen in Fig. 1C. Input devices such as a keyboard 102, a mouse pointer device 103, a scanner 126, a camera 127, and a microphone 180, are connected to the computer module 101.
Output devices including a printer 115, a display device 114 and loudspeakers 117 are also connected to the computer module 101. An external Modulator-Demodulator (Modem) transceiver device 116 may be used by the computer module 101 for communicating to and from the communications network 120 via a connection 121. The communications network 120 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN. Where the connection 121 is a telephone line, the modem 116 may be a traditional "dial-up" modem. Alternatively, where the connection 121 is a high capacity (e.g., cable) connection, the modem 116 may be a broadband modem. A wireless modem may also be used for wireless connection to the communications network 120.

The computer module 101 typically includes at least one processor unit 105, and a memory unit 106. For example, the memory unit 106 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The computer module 101 also includes a number of input/output (I/O) interfaces including: an audio-video interface 107 that couples to the video display 114, loudspeakers 117 and microphone 180; an I/O interface 113 that couples to the keyboard 102, mouse 103, scanner 126, camera 127 and optionally a joystick or other human interface device (not illustrated); and an interface 108 for the external modem 116 and printer 115. In some implementations, the modem 116 may be incorporated within the computer module 101, for example within the interface 108. The computer module 101 also has a local network interface 111, which permits coupling of the computer system 100 via a connection 123 to a local-area communications network 122, known as a Local Area Network (LAN). As illustrated in Fig.
1A, the local communications network 122 may also couple to the wide network 120 via a connection 124, which would typically include a so-called "firewall" device or device of similar functionality. The local network interface 111 may comprise an Ethernet circuit card, a Bluetooth™ wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practiced for the interface 111.

The I/O interfaces 108 and 113 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 109 are provided and typically include a hard disk drive (HDD) 110. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 112 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 100.

The components 105 to 113 of the computer module 101 typically communicate via an interconnected bus 104 and in a manner that results in a conventional mode of operation of the computer system 100 known to those in the relevant art. For example, the processor 105 is coupled to the system bus 104 using a connection 118. Likewise, the memory 106 and optical disk drive 112 are coupled to the system bus 104 by connections 119. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or like computer systems.

The described methods may be implemented using the computer system 100 wherein the processes of Figs.
2 to 6, to be described, may be implemented as one or more software application programs 133 executable within the computer system 100. In particular, the steps of the described methods are effected by instructions 131 (see Fig. 1B) in the software 133 that are carried out within the computer system 100. The software instructions 131 may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the described methods and a second part and the corresponding code modules manage an interface between the first part and a user.

The software may be stored in a computer readable medium, including the storage devices described below, for example. The software 133 is typically stored in the HDD 110 or the memory 106. The software is loaded into the computer system 100 from the computer readable medium, and then executed by the computer system 100. Thus, for example, the software 133 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 125 that is read by the optical disk drive 112. A computer readable medium having such software or computer program recorded on the computer readable medium is a computer program product. The use of the computer program product in the computer system 100 preferably effects an advantageous apparatus for implementing the described methods.

In some instances, the application programs 133 may be supplied encoded on one or more CD-ROMs 125 and read via the corresponding drive 112, or alternatively may be read from the networks 120 or 122. Still further, the software can also be loaded into the computer system 100 from other computer readable media.
Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 100 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 101. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 101 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.

The second part of the application programs 133 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 114, or to be downloaded via the communications network 120 for representation on a display of one of the user camera devices 181-A, 181-B, 181-C and 181-D. Through manipulation of one of the user camera devices 181-A, 181-B, 181-C and 181-D, a user of the computer system 100 and the application may manipulate the interfaces in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via loudspeakers or the like and user voice commands input via a microphone or the like.

Fig.
1B is a detailed schematic block diagram of the processor 105 and a "memory" 134. The memory 134 represents a logical aggregation of all the memory modules (including the HDD 109 and semiconductor memory 106) that can be accessed by the computer module 101 in Fig. 1A.

When the computer module 101 is initially powered up, a power-on self-test (POST) program 150 executes. The POST program 150 is typically stored in a ROM 149 of the semiconductor memory 106 of Fig. 1A. A hardware device such as the ROM 149 storing software is sometimes referred to as firmware. The POST program 150 examines hardware within the computer module 101 to ensure proper functioning and typically checks the processor 105, the memory 134 (109, 106), and a basic input-output systems software (BIOS) module 151, also typically stored in the ROM 149, for correct operation. Once the POST program 150 has run successfully, the BIOS 151 activates the hard disk drive 110 of Fig. 1A. Activation of the hard disk drive 110 causes a bootstrap loader program 152 that is resident on the hard disk drive 110 to execute via the processor 105. This loads an operating system 153 into the RAM memory 106, upon which the operating system 153 commences operation. The operating system 153 is a system level application, executable by the processor 105, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.

The operating system 153 manages the memory 134 (109, 106) to ensure that each process or application running on the computer module 101 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 100 of Fig. 1A must be used properly so that each process can run effectively.
Accordingly, the aggregated memory 134 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 100 and how such is used.

As shown in Fig. 1B, the processor 105 includes a number of functional modules including a control unit 139, an arithmetic logic unit (ALU) 140, and a local or internal memory 148, sometimes called a cache memory. The cache memory 148 typically includes a number of storage registers 144-146 in a register section. One or more internal busses 141 functionally interconnect these functional modules. The processor 105 typically also has one or more interfaces 142 for communicating with external devices via the system bus 104, using a connection 118. The memory 134 is coupled to the bus 104 using a connection 119.

The application program 133 includes a sequence of instructions 131 that may include conditional branch and loop instructions. The program 133 may also include data 132 which is used in execution of the program 133. The instructions 131 and the data 132 are stored in memory locations 128, 129, 130 and 135, 136, 137, respectively. Depending upon the relative size of the instructions 131 and the memory locations 128-130, a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 130. Alternately, an instruction may be segmented into a number of parts each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 128 and 129.

In general, the processor 105 is given a set of instructions which are executed therein. The processor 105 waits for a subsequent input, to which the processor 105 reacts by executing another set of instructions.
Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 102, 103, data received from one of the user camera devices 181-A, 181-B, 181-C and 181-D across one of the networks 120, 122, data retrieved from one of the storage devices 106, 109, or data retrieved from a storage medium 125 inserted into the corresponding reader 112, all depicted in Fig. 1A. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 134.

The described arrangements use input variables 154, which are stored in the memory 134 in corresponding memory locations 155, 156, 157. The described arrangements produce output variables 161, which are stored in the memory 134 in corresponding memory locations 162, 163, 164. Intermediate variables 158 may be stored in memory locations 159, 160, 166 and 167.

Referring to the processor 105 of Fig. 1B, the registers 144, 145, 146, the arithmetic logic unit (ALU) 140, and the control unit 139 work together to perform sequences of micro-operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 133. Each fetch, decode, and execute cycle comprises:

a fetch operation, which fetches or reads an instruction 131 from a memory location 128, 129, 130;

a decode operation in which the control unit 139 determines which instruction has been fetched; and

an execute operation in which the control unit 139 and/or the ALU 140 execute the instruction.

Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 139 stores or writes a value to a memory location 132.

Each step or sub-process in the processes of Figs.
2A to 6 is associated with one or more segments of the program 133 and is performed by the register section 144, 145, 146, the ALU 140, and the control unit 139 in the processor 105 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 133.

As described above, the described methods are performed on higher level devices such as the server computer module 101, which may have reasonably large processing resources. The described methods may also be performed on desktop computers and the like with similar processing resources. Nevertheless, the methods to be described may also be performed on lower level devices, in which processing resources are relatively limited, such as the tablet device 181-B, the mobile phone 181-C and/or the digital camera 181-D.

The described methods may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub-functions of the described methods. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.

Fig. 1C shows a schematic block diagram of a software architecture for use in the described arrangements. The user camera devices, such as the computer 181-A, tablet 181-B, mobile phone 181-C or camera 181-D, are used by a user to capture digital images for an image collection 185. The image collection 185 may reside on one or more physical storage modules, such as a hard disk drive of the computer 181-A or even the hard disk drive 110 of the server computer module 101, which the user can access. The image collection is made available to an event analysis module 187. The event analysis module 187 may be formed by one or more code modules of the software application program 133.
The event analysis module 187 is configured to determine events 188 represented in the image collection 185, and will be described in detail below with reference to Fig. 3B. Each event may have one or more event attributes 189. The event attributes 189 will be described in detail below with reference to Fig. 3D. An event attribute 189 may be one of a time value, a location, an event type and a day of the week. The images of the image collection 185 are also analysed by the software application program 133 to determine faces 186 of individuals represented in the images of the image collection 185 using face detection and facial recognition. Alternatively, the user may analyse the images of the image collection 185 using the computer 181-A, for example, to determine faces 186 of individuals represented in the images of the image collection 185. The resulting faces 186 and events 188 are used by a group identification module 190. The group identification module 190 may be formed by one or more code modules of the software application program 133 and will be described in more detail below in relation to Fig. 4A. The group identification module 190 is configured to determine groups of individuals 191 of interest. As also seen in Fig. 1C, a group profile generation module 192 is configured to determine a group attendance profile 193 for a group. A social category labelling module 196 is configured to use one of the groups 191 and an associated group attendance profile 193 to determine a group label 197. The group label 197 may also be referred to as a "social category" label, as described below. The social category labelling module 196 may be formed by one or more code modules of the software application program 133. The social category labelling module 196 may determine the group label 197 by comparing the group attendance profile 193 to one or more predetermined group attendance profiles 194 associated with predetermined social categories 195 of interest.
The group attendance profiles 193 and predetermined group attendance profiles 194 are described in detail below with reference to Figs. 5 and 6. Fig. 2 is a schematic flow diagram showing a method 200 of determining a label for a group of individuals represented in images of an image collection. The method 200 may be implemented as one or more software code modules of the software application program 133 resident in the hard disk drive 110 and being controlled in its execution by the processor 105. The method 200 will be described by way of example where the method 200 is used to determine a group label 197 associated with a social category 195 for a group of individuals 191 represented in the image collection 185. The method 200 begins at receiving step 201, where the processor 105 is used for receiving the image collection 185 comprising a plurality of images, a plurality of images of the image collection 185 representing one or more individuals. The image collection 185 may be uploaded by the user from the computer 181-A, for example, to the server computer module 101 where the image collection 185 may be stored in the memory 106. The images of the image collection 185 may have been captured using any one or more of the user camera devices such as the computer 181-A, the tablet device 181-B, the mobile phone 181-C and/or the digital camera 181-D. In one arrangement, the images of the image collection 185 may be uploaded to the server computer module 101 directly from one of the user camera devices 181-A, 181-B, 181-C and/or 181-D upon the images being captured using the particular user camera device 181-A, 181-B, 181-C and/or 181-D. At the next identifying step 202, the event analysis module 187, under execution of the processor 105, is used for identifying one or more social events 188 captured by images of the image collection 185, along with event attributes 189.
The event attributes 189 are determined by the event analysis module 187 and describe properties of an event. The event attributes may include references to images of the image collection 185 representing a corresponding event. The event attributes may also include individuals detected in particular images, the time and location related to capture of a particular image, and event-type classification labels describing one or more images collectively. The social events may be identified based at least on metadata associated with the images, with each of the identified social events corresponding to at least one event attribute. Then at associating step 203, the group identification module 190, under execution of the processor 105, is used for associating at least two of the individuals represented in the images together to determine a group based on appearance of the associated individuals in common ones of the images. As described in detail below, the individuals are associated together if the associated individuals attend the same event. The faces 186 of the individuals represented in the images of the collection 185 associated with the various events 188 can be used by the group identification module 190 at step 203 to automatically determine groups of individuals 191 of interest. In one arrangement, a group 191 of individuals may be specified manually by the user at step 203, using the computer 181-A, for example.
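By way of illustration only, the association step above may be sketched as follows. The data layout (a mapping from event identifiers to the individuals detected at each event) and the function name are assumptions for the sketch, not part of the described arrangement, and a simple connected-components pass over co-attendance stands in for the clustering techniques described later.

```python
from collections import defaultdict

def associate_groups(event_attendees):
    """Associate individuals who co-attend events by merging the attendee
    sets of events that share at least one individual (a union-find based
    connected-components pass; illustrative only)."""
    parent = {}

    def find(x):
        # Path-halving union-find over individual identifiers.
        while parent.setdefault(x, x) != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for attendees in map(list, event_attendees.values()):
        for person in attendees:
            find(person)            # register every attendee
        for other in attendees[1:]:
            union(attendees[0], other)

    groups = defaultdict(set)
    for person in parent:
        groups[find(person)].add(person)
    return sorted(sorted(g) for g in groups.values())

# Example: events #1 and #2 share an attendee, event #3 is separate.
events = {1: {"A", "B"}, 2: {"B", "C"}, 3: {"D", "E"}}
print(associate_groups(events))  # [['A', 'B', 'C'], ['D', 'E']]
```

In practice the co-appearance analysis is finer-grained than whole-event merging, as described below in relation to Fig. 4A; the sketch only shows the principle of deriving candidate groups from shared attendance.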
As described below, in one arrangement, the group attendance profile 193 comprises one or more histograms each representing a distribution of event attributes of events attended by the individuals in the group. Since any individual captured in any image from an event 188 is assumed to have attended the event 188, any individual represented in the images of the image collection 185 exhibits an attendance behaviour. Over multiple events an individual may exhibit consistent behaviour and, across groups of individuals 191, some groups of individuals may exhibit unique or particularly identifiable group behaviour. For example, a certain group of individuals may attend events on particular days of the week, or at particular times, in particular locations, of particular event-type description, and so on. The group attendance profile 193 is determined at step 204 using the event-based attributes of events attended by at least some of the individuals in the group determined at step 203. The group attendance profile 193 will be further described below in relation to Fig. 5. Creating a profile of a group of individuals increases the signal-to-noise ratio over creating profiles of single individuals. As will be described below in relation to Fig. 5, creating attendance profiles of groups rather than single individuals increases the performance of the described methods. The method 200 then proceeds to comparing step 205, where the social category labelling module 196, under execution of the processor 105, is used for comparing the group attendance profile 193 determined at step 204 for the group of individuals to one or more predetermined group attendance profiles 194, each predetermined group attendance profile corresponding to one of a plurality of predetermined social categories 195. The predetermined group attendance profiles 194 may be stored in the hard disk drive 110.
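A minimal sketch of building such a histogram-based group attendance profile follows. The attribute keys ('day_of_week', 'location', 'event_type') and the event dictionaries are illustrative assumptions; the described arrangement may use any event attributes 189.

```python
from collections import Counter

def group_attendance_profile(events):
    """Build a profile as one histogram per event attribute, from a list
    of event records (illustrative dicts keyed by attribute name)."""
    profile = {"day_of_week": Counter(),
               "location": Counter(),
               "event_type": Counter()}
    for event in events:
        for attribute, histogram in profile.items():
            histogram[event[attribute]] += 1
    # Normalise each histogram to a distribution so that profiles of
    # groups with different event counts remain comparable.
    return {
        attribute: {value: count / sum(histogram.values())
                    for value, count in histogram.items()}
        for attribute, histogram in profile.items()
    }

attended = [
    {"day_of_week": "Sat", "location": "outdoor", "event_type": "picnic"},
    {"day_of_week": "Sat", "location": "outdoor", "event_type": "picnic"},
    {"day_of_week": "Mon", "location": "indoor", "event_type": "dinner"},
    {"day_of_week": "Sat", "location": "outdoor", "event_type": "outing"},
]
profile = group_attendance_profile(attended)
print(profile["day_of_week"])  # {'Sat': 0.75, 'Mon': 0.25}
```

Each returned distribution corresponds to one of the histograms 501 to 504 discussed in relation to Fig. 5.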
At decision step 206, if a sufficiently strong match between the group attendance profile 193 determined at step 204 and one or more of the predetermined group attendance profiles 194 was found at step 205, then the method 200 proceeds to social category label determining step 207. Otherwise, the method 200 concludes. In one arrangement, a predetermined minimal confidence threshold may be used at step 205 to compare the group attendance profile 193 with the predetermined group attendance profiles 194. At step 207, the social category labelling module 196, under execution of the processor 105, is used for determining a group label 197 for the group of individuals based on similarity of the group attendance profile 193 to the predetermined group attendance profiles 194, each predetermined group attendance profile corresponding to a social category. The group label 197 may be stored in the memory 106. Alternatively, the group label 197 may be communicated to one of the devices 181-A, 181-B, 181-C or 181-D via the communication interface 108 and communications network 120. The method 200 then terminates. The method 200 or steps 204 to 207 may be repeated multiple times for multiple groups 191 of individuals to determine multiple group labels 197. The method 200 will now be described further by way of example with reference to Fig. 5 and Fig. 6, where the group label 197 "sports friends" is determined for the group of individuals 191 based on the social category 195 "sports friends". The social category 195 "sports friends" may refer to individuals with whom the user plays a team sport. The "sports friends" category may be associated with multiple predetermined group attendance profiles 194. Fig. 6 shows an example of a predetermined group attendance profile 600, which is one of the predetermined group attendance profiles 194.
The predetermined group attendance profile 600 represents an expected attendance behaviour of a group of "sports friends" that is suitable for team sports such as soccer, baseball or cricket. The predetermined "sports friends" group attendance profile 600 comprises histograms 601, 602, 603 and 604, each representing an event attribute 189. The predetermined "sports friends" group attendance profile 600 indicates that the expected attendance behaviour of such a group of individuals is to attend events in the early afternoon (e.g. between 12-4PM), as indicated by a "time of day" histogram 601. The predetermined "sports friends" group attendance profile 600 also indicates that the shared attendance behaviour for events attended by such a group of individuals is that the events are biased towards events located outdoors, as indicated by "location" histogram 602, and usually occurring on Saturdays but sometimes Sundays, as indicated by "day of the week" histogram 603. The predetermined "sports friends" group attendance profile 600 also indicates that the events attended by such a group of individuals align strongly with two event-type descriptions (e.g. "outing" and "picnic", as indicated by "event type" histogram 604). In the example of Figs. 5 and 6, the event analysis module 187, under execution of the processor 105, is used for identifying one or more social events 188 captured by images of the image collection 185 associated with the group of "sports friends", along with event attributes 189, as at step 202 of the method 200. In the present example, a group 191 of seven individuals is then determined by the group identification module 190 as at step 203. The seven individuals are determined to have attended events {#5, 8, 20, 33, 70 and 71}.
Consequently, as at step 204, the group profile generation module 192 uses the group 191 and the events 188 to determine a group attendance profile 500 for the group 191 of seven individuals as shown in Fig. 5. The group attendance profile 500 corresponds to one of the group attendance profiles 193. The group attendance profile 500 of Fig. 5 is then compared with predetermined group attendance profiles 194, such as the predetermined "sports friends" group attendance profile 600 of Fig. 6, as at step 205 of the method 200. The group attendance profile 500 of Fig. 5 may also be compared with predetermined group attendance profiles 194 corresponding to social categories 195 such as "Hobby friends", "Workmates" and "Overseas relatives". In the example of Figs. 5 and 6, it is also determined that a sufficiently high match is achieved between the group attendance profile 500 of Fig. 5 and the predetermined "sports friends" group attendance profile 600 of Fig. 6 corresponding to the social category "sports friends". A group label 197 is then determined for the group of seven individuals, as at step 207, based on similarity of the group attendance profile 500 of Fig. 5 to the predetermined group attendance profile 600 of Fig. 6. As the predetermined group attendance profile 600 is associated with the group label "sports friends", the group of seven individuals is labelled as the "sports friends" social category. Fig. 3A shows an image collection 185 comprising images 301 representing two events 300 and 311. The events 300 and 311 are represented in Fig. 3A as folders. Each of the two events 300 and 311 corresponds to events 188 as determined by the event analysis module 187. Each of the events 300 and 311 has been identified, as at step 202, from the image collection 185 comprising the images 301. The images 301 representing the events 300 and 311 may be organised in a hierarchy of folders as shown in Fig. 3A.
However, separation of images 301 into the events 300 and 311 represented by the folders in Fig. 3A may also be entirely logical. For example, the events 300 and 311 represented by the folders in Fig. 3A may exist as database identifiers independent of where the images 301 are stored. Many users keep their image collections 185 organised in terms of time-based or location-based events. Any suitable segmentation algorithm that can automatically determine event boundaries between events may be used to separate the events 300 and 311. Some user camera devices also automatically partition captured images into events, such as by creating a separate event for each day of capture. Fig. 3B shows a graph 312 representing the relationship between events during a time period. Two events 302 and 303 are labelled and shown to be separated by a time gap. Event 302 has a longer duration and larger quantity of captured images compared to event 303. For example, the event 302 may be a full-day hike to a mountain while the shorter event 303 may be a three-hour long birthday party. The event analysis module 187 is configured to identify the individual events 188, such as event 302 and event 303, using a combination of time and location metadata. For example, one event segmentation strategy is to create an event boundary at every time gap larger than a predetermined threshold. Fig. 3C shows an example event 304 capturing several individuals {A, B and C}. Image 305 shows A in a cityscape; images 306 and 307 show {B and C}, respectively; image 308 shows both A and B co-appearing in the same image; and image 309 has no recognised individuals. Since the event 304 is a single event, it is possible that individuals {A, B and C} all know each other, and very likely that individuals {A and B} know each other since individuals {A and B} appear next to each other in the same image 308. Fig.
3D shows event attributes 310 that may describe the event 304 of Fig. 3C. The attributes 310 correspond to the event attributes 189 shown in Fig. 1C and are used by the group profile generation module 192. As illustrated by the example of Fig. 3D, the event attributes 310 are nominal or numeric values. In one arrangement, the event attributes 189 include: (i) references to images and individuals in the corresponding event; (ii) time-based values such as the hours of the event (e.g. 1-7PM), the day of week (e.g. Saturday), day of year (e.g. "January 1, New Year's Day"), seasonal properties (e.g. "winter", "rainy"), and proximity to personal events (e.g. two days until the user's birthday); (iii) location-based values such as geographic identifiers (e.g. town, postcode, country, address, GPS coordinate) or scene recognition labels (e.g. "indoor", "outdoor", "beach", "home", "city"); (iv) semantic event-type classification labels associated with the event (e.g. "vacation", "wedding", "picnic", "activity with friends"). The attributes 310 can be determined by the event analysis module 187 using any suitable method. For example, face recognition may be used to determine the identity of the individuals {A, B and C} in the images 305 to 309; image EXIF metadata may be used to determine timestamps and global positioning system (GPS) coordinates; gazetteer databases may be used to obtain location details; a scene detection image processing method may be used to determine if images are indoor-or-outdoor, urban-or-nature, and so on; and an event classification method may be used to determine the confidence levels that the event in question is a vacation, wedding, birthday party, picnic, etc. Fig. 4A shows a network graph 403 of the faces 186 of individuals whose interactions are captured in an image collection 185.
The group identification module 190 is configured to analyse the co-appearance of individuals to determine groups of individuals 191, such as the example groups 400, 401 and 402. The group 400 contains five individuals with relatively dense interactions amongst the group. The group 401 shows a "clique" where every individual of the group 401 is connected to every other individual of the group 401. The group 402 shows a pair of individuals, with one individual 404 disconnected from any other individual outside the group 402. The individuals {A, B and C} shown in the event 304 in Fig. 3C may form one such group in a network graph such as the network graph 403. Determining the groups of individuals 191 used in the method 200, such as the example group 400, is primarily a clustering process. In some arrangements, the groups of individuals 191 may be specified manually, while in some arrangements the groups of individuals may be automatically determined by the group identification module 190 using any suitable clustering technique such as a clique detection algorithm, affinity propagation (AP) clustering algorithm, spectral clustering algorithm, hierarchical clustering algorithm or partitional clustering algorithm. Fig. 4B shows a table 450 containing a set of social categories that may describe groups of individuals such as the groups 400, 401 and 402 shown in Fig. 4A. Although known clustering techniques can derive groups of individuals such as the groups 400, 401 and 402, the method 200 is concerned with determining appropriate group labels 197 for the groups of individuals. In one arrangement, the appropriate group labels are determined using a set of predetermined social categories 195, such as the social categories listed under the heading "Social category" in table 450 of Fig. 4B. The social categories listed in table 450 cover a large number of possible types of social groups expected to be found in a typical image collection.
The social categories listed in table 450 include "Major circle of friends", "Minor circle of friends", "Friends of my spouse", "Friends of my parents", "Friends of my sibling", "Friends of my children", "Ex-pat friends", "Hobby friends", "Sports friends", "Workmates", "University colleagues", "School colleagues", "Friends met on holidays", "Extended family", and "Relatives living overseas". Each predetermined social category 195 is associated with predetermined group attendance profiles 194 as listed under the heading "Predetermined profiles" in the table 450. As shown in the table 450, some profiles (such as profile 1) may be shared by multiple social categories. An example of such a predetermined profile is given in Fig. 6. The set of social categories listed in the table 450 is used by the social category labelling module 196 implemented in the method 200. As described above, the example group attendance profile 500 may be used in the method 200. The example group attendance profile 500 of Fig. 5 is determined for a group of seven individuals who attended events {#5, 8, 20, 33, 70 and 71}. In the example of Fig. 5, events #8 and #71 were indoor dinners on weekdays and events {#5, 20, 33 and 70} were outdoor social sports games on Saturdays. The profile 500 of Fig. 5 captures the particular attendance behaviour of the seven individuals of the group of seven individuals. For example, a "time of day" histogram 501 indicates that the individuals in the group of seven individuals mainly attended events in the early afternoon (e.g. the outdoor games events) but also that many images were captured during the evening time (e.g. the two dinner events). A location histogram 502, which in the example of Fig. 5 is an "indoor" or "outdoor" ratio, indicates that the group of seven individuals mostly attended outdoor events.
In some arrangements, location attributes may specify additional data, such as city name as in the example attributes 310 shown in Fig. 3D. In the example of Fig. 5, a day of week histogram 503 indicates that the group of seven individuals mostly attended events on Saturday, with some activity on a Monday and a Friday (e.g. the two dinners). The proportions of event-types as represented by "event-type" histogram 504, tabulating the event classifications of the events {#5, 8, 20, 33, 70, 71}, indicate that the group of seven individuals attended events that appeared to be "dinner", "birthday party", "outing" and "picnic" (respectively), with the "picnic" classification having the highest density. For example, the outdoor social events {#5, 20, 33 and 70} may have been classified as "picnics" by the event analysis module 187, resulting in the largest density for the "picnic" classification. In one arrangement, attendance at a particular one of the events {#5, 8, 20, 33, 70 and 71} by individuals of the group may be disregarded in determining the group attendance profile 500 where only a small number of the seven individuals in the group attend the particular social event. In some arrangements, the event analysis module 187 performs image processing to determine the classification. For example, the event analysis module 187 may classify an event as a "picnic" if the images captured of the event comprise more than four detected faces, the images are captured over a period which is less than five hours in duration, and 80% of the captured images have some minimum portion of grass and sky colours or textures. Overall, the group attendance profile 500 of Fig. 5 indicates that the group of seven individuals primarily attends park-based "picnic" events on Saturday mornings, and has attended a small portion of dinners, and no other event types on other days of the week.
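The example "picnic" rule above can be sketched directly. The image records, the 'grass_sky_fraction' feature and the 0.2 "minimum portion" value are assumptions for the sketch (the specification leaves the minimum portion and the face-counting reading open; total faces across the event is one reading):

```python
def classify_picnic(images, duration_hours):
    """Example heuristic: an event is a 'picnic' if its images contain
    more than four detected faces in total, the event lasted under five
    hours, and at least 80% of the images show some minimum portion of
    grass and sky colours or textures."""
    total_faces = sum(img["faces"] for img in images)
    nature_ratio = sum(
        1 for img in images
        if img["grass_sky_fraction"] >= 0.2  # assumed "minimum portion"
    ) / len(images)
    return total_faces > 4 and duration_hours < 5 and nature_ratio >= 0.8

images = [
    {"faces": 3, "grass_sky_fraction": 0.5},
    {"faces": 2, "grass_sky_fraction": 0.4},
    {"faces": 1, "grass_sky_fraction": 0.1},  # mostly non-nature shot
    {"faces": 0, "grass_sky_fraction": 0.6},
    {"faces": 2, "grass_sky_fraction": 0.7},
]
print(classify_picnic(images, duration_hours=3))  # True
```

The same three-condition structure generalises to other event-type rules, such as the "vacation" duration rule discussed below.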
In one arrangement, the group attendance profiles 193 and predetermined group attendance profiles 194 include several independent histograms (or "distributions"), such as time-of-day histograms 501 and 601, location histograms 502 and 602, day-of-week histograms 503 and 603, and event-type histograms 504 and 604. The histograms enable comparison between profiles by the social category labelling module 196. In some arrangements, the group attendance profile 193 and the predetermined group attendance profile 194 may be determined based on image attributes. In such arrangements, the group attendance profile 193 and the predetermined group attendance profile 194 represent the image attributes. For example, the group attendance profiles 193 and predetermined group attendance profiles 194 may include a histogram representing a distribution of detected objects and desirable objects to match, where the histogram is determined using object detection. For example, the event analysis module 187 may detect objects in the images of some events, such as "house", "car", "dog", "boat" and so on, and include object identifiers as event attributes 189. Such an object-based inclusion can produce more accurate group labels 197 by allowing for detection of particularly discriminatory objects, such as "soccer ball", "airplane", "wedding dress", and so on. In some arrangements, the group attendance profile 193 is determined by the group profile generation module 192 using event attributes 189 of particular events that satisfy certain criteria. For example, one criterion may be that at least a minimum number of individuals of a group were detected at an event 188. An event that only one group member attended may contribute negatively or nil to the group attendance profile 193. Such an exclusion can produce more accurate group labels 197 by ignoring events that do not help define the behaviour of the group.
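The minimum-attendance criterion just described may be sketched as a simple event filter (the event records, field names and default threshold are illustrative assumptions):

```python
def filter_events_for_group(events, group, min_attendees=2):
    """Keep only events at which at least `min_attendees` members of the
    group were detected. Events attended by a single member then
    contribute nothing to the group attendance profile."""
    return [
        event for event in events
        if len(set(event["attendees"]) & set(group)) >= min_attendees
    ]

events = [
    {"id": 5, "attendees": ["A", "B", "C"]},
    {"id": 8, "attendees": ["A"]},          # only one member: excluded
    {"id": 20, "attendees": ["B", "C"]},
]
kept = filter_events_for_group(events, group=["A", "B", "C"])
print([e["id"] for e in kept])  # [5, 20]
```

The filtered event list would then feed the profile construction step in place of the full list.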
Another criterion for inclusion or exclusion of an event in the determination of a group attendance profile 193 may be that a particular individual attended, or did not attend, the event. For example, a group attendance profile 193 may be determined using the events in which a particular individual was always present. The particular individual who was always present may be the user, or the spouse of the user. A different group attendance profile 193 may be determined for the same group of individuals 191 using the events in which a particular individual was always absent. In some arrangements, the group attendance profile 193 may be determined using weighted contributions of event attributes 189. For example, two particular individuals may be considered to be particularly weak members in a group of individuals 191. The two individuals may be considered as weak members on the basis that the two members are also members of other groups and may therefore not be unique to any group. In such an example, the events attended by the two individuals may contribute less to determination of the group attendance profile 193. In one arrangement, the method 200 may comprise the step of assigning a weight to each individual of a group based on criteria such as whether a particular individual is unique to the group as described above. An individual that is unique to a particular group is assigned a higher weighting since that individual is more representative of the particular group. The weight assigned to each of the individuals may be used in determining the group attendance profile 193 and may be used as the basis for removing one of the individuals from the group based on the weight assigned to the removed individual.
For example, an individual with a low weighting may be removed from a particular group since the events attended by the low-weighted individual contribute little to determination of the group attendance profile 193. Such an inclusion or exclusion can produce more accurate group labels 197 by reducing noise from uninformative events and focusing on specifically discriminatory events or individuals of great interest. In some arrangements, the contribution of one of the social events to determining the group attendance profile 193 may be based on event attributes. The group attendance profile 193 may be determined by weighting contributions of event attributes 189 based on at least one event attribute value. For example, a group attendance profile 193 may be determined by assigning a lower weighting to event attributes 189 from older events (e.g., from many years ago) and a higher weight to attributes from recent events. The older events may be determined by a time-based event attribute. Such time-based weighting can produce more accurate group labels 197 by allowing the particular group to evolve and change category over time as the behaviour and attendance pattern of the group changes over time. The behavioural change is more likely observed in younger individuals, and capturing a change in attendance behaviour over time may be used to create more accurate and up-to-date group attendance profiles 193. In accordance with the method 200, the determined group attendance profile 193, such as the example group attendance profile 500, may then be used by the social category labelling module 196 to determine the likely group label 197. Determining the likely group label 197 is performed by comparing the group attendance profile 193 against predetermined group attendance profiles 194 associated with the example social categories 195 of interest.
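One way the time-based weighting described above might be realised is an exponential decay on event age. The half-life parameter, the event records and the function names are assumptions for the sketch; the specification does not prescribe a particular decay scheme.

```python
def event_weight(event_age_years, half_life_years=2.0):
    """Exponential-decay weight for an event's attribute contributions:
    recent events count fully, events `half_life_years` old count half,
    and so on. The half-life is an assumed parameter."""
    return 0.5 ** (event_age_years / half_life_years)

def weighted_histogram(events, attribute):
    """Accumulate a histogram of `attribute`, weighting each event's
    contribution by its recency (illustrative 'age_years' field)."""
    histogram = {}
    for event in events:
        weight = event_weight(event["age_years"])
        value = event[attribute]
        histogram[value] = histogram.get(value, 0.0) + weight
    return histogram

events = [
    {"age_years": 0.0, "event_type": "picnic"},
    {"age_years": 2.0, "event_type": "picnic"},
    {"age_years": 4.0, "event_type": "dinner"},
]
print(weighted_histogram(events, "event_type"))  # {'picnic': 1.5, 'dinner': 0.25}
```

The resulting weighted histogram can be normalised and compared in the same way as the unweighted profile histograms.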
The example predetermined group attendance profile 600 is associated with a predetermined social category 195, such as the category "sports friends" of table 450 in Fig. 4B. The predetermined group attendance profile 600 represents expected patterns of attendance behaviour corresponding to the social category. In the example of Fig. 6, a time of day histogram 601 indicates that a group is expected to attend events almost exclusively between 12 noon and 4PM, with little or no activity outside those hours. The location histogram 602 of Fig. 6 indicates that almost all events attended by a group should be classified as "outdoors" for the "sports friends". Further, a day of week histogram 603 indicates that events should occur almost exclusively on Saturdays and, to a lesser extent, on Sundays. The event-type histogram 604 of Fig. 6 indicates that the events attended by a group should be almost exclusively classified as "outing" or "picnic" events for a social category such as "sports friends". The precise event-type taxonomy depends on the implementation of an event classifier subcomponent of the event analysis module 187. In one arrangement, the event classifier subcomponent may be configured to classify event-types, such as "vacation" and "non-vacation", where a "vacation" is any event longer than two days and "non-vacation" is any event shorter than two days. A more sophisticated event classifier subcomponent may be configured to classify more event-types. In some arrangements, the predetermined social categories 195 or their associated predetermined group attendance profiles 194 may include a target group size. For example, the social category "sports friends" may be configured to correspond to a social group of several friends known to the user via social sport events. The "sports friends" social group may be people with whom the user either plays sport or watches sport.
The "sports friends" social group may be the soccer team of the user, or may be parents of other children with whom the child of the user plays soccer and the user watches. Therefore, the predetermined group attendance profile 194 associated with capturing the behaviour of a group of individuals may stipulate constraints such as a minimum group size of four individuals so that groups of two or three individuals cannot match the predetermined group attendance profile. For example, a group comprising two individuals is unlikely to be a social group of "university friends" where a larger group size would be expected. Accordingly, the group label 197 is determined based on the size of the group. Using such group constraints can further improve the accuracy of group labels 197 by reducing false positives. In some arrangements, the predetermined group attendance profiles 194 may be hierarchically organised so that a generic social category 195 such as "sports friends" as shown in the table 450 may be further refined to sub-categories. Additional profile matches may be performed by the social category labelling module 196 when a high-level category is matched. For example, the category "sports friends" may stipulate that a group attends outdoor events on weekends. Further, the category "sports friends" may have sub-categories "baseball friends", "football friends" and "sailing friends". The sub-category "baseball friends" may require that the majority of events occur during summer seasons and are located in parks where images captured of the events contain substantial proportions of green colours and grass-like textures. As another example, the "football friends" sub-category may differ from the "baseball friends" sub-category by stipulating winter season rather than summer season.
In contrast, the "sailing friends" sub-category may instead stipulate that the majority of events occur over water rather than in a park, where the images captured of the event contain substantial proportions of blue colours and water-like textures. Such hierarchical category organisation can produce more accurate group labels 197 by reducing noise and false positive matches and by requiring the method 200 to match increasingly specific profiles 194, such that a very specific profile (e.g., "sailing friends") cannot be matched unless a higher order profile was matched first (e.g., "sports friends").

The example group attendance profile 500 shown in Fig. 5 and the example predetermined group attendance profile 600 shown in Fig. 6 can be compared by the social category labelling module 196. In some arrangements, the comparison of the group attendance profile 500 and the predetermined group attendance profile 600 is a mathematical comparison producing a numeric similarity score. In some arrangements, the similarity score is determined using a weighted sum of the individual similarity scores of each component. For example, a weighted sum may be determined over the similarity between the time of day distributions represented by the histograms 501 and 601, the similarity of the location distributions represented by the histograms 502 and 602 (or the geographical distance between location coordinates), the similarity of the day of week distributions represented by the histograms 503 and 603, and the similarity of the event-type distributions represented by the histograms 504 and 604. Any suitable method of determining the individual similarity of corresponding distributions may be used, including the Dot Product, Cosine Similarity, Mutual Information, Earth Mover's Distance, Kullback-Leibler Divergence, Chi-squared Test, Area Under Curve overlap, Kolmogorov-Smirnov Test, and so on.
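As a minimal sketch of such a weighted-sum comparison, assuming each profile component (time of day, day of week, event type, and so on) is stored as a histogram and using Cosine Similarity, one of the measures listed above, for each component. The function names and component keys are illustrative assumptions.

```python
import math

def cosine_similarity(p, q):
    """Cosine similarity between two histograms given as equal-length lists."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norm if norm else 0.0

def profile_similarity(group_profile, predetermined_profile, weights):
    """Weighted sum of per-component histogram similarities.

    Each profile is a dict mapping a component name (e.g. "time_of_day",
    "day_of_week", "event_type") to a histogram; the weights dict assigns
    each component's contribution to the overall similarity score.
    """
    return sum(
        weights[name] * cosine_similarity(group_profile[name],
                                          predetermined_profile[name])
        for name in weights
    )
```

With weights summing to 1, identical profiles score 1.0 and entirely disjoint histograms score 0.0; any of the other listed measures could be substituted for the per-component similarity function.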
Any suitable numeric confidence threshold may be used to determine if a match between a group attendance profile 193 and a predetermined group attendance profile 194 is sufficiently strong. When a match is sufficiently strong, the group of individuals 191 under consideration may be labelled with the social category group label 197 associated with the matching predetermined group attendance profile 194, in accordance with the method 200 as described above. The determined group label 197 can be used thereafter to improve numerous image services and image-related tasks, such as tasks associated with search and retrieval of images, organisation of images, classification of images, selection of images, sharing of images, recommendations, advertising, and so on.

Industrial Applicability

The arrangements described are applicable to the computer and data processing industries and particularly for image processing. The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.

In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.
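The threshold-based labelling step described above can be sketched as follows. The function names, the default 0.8 threshold, and the pluggable similarity function are illustrative assumptions only; the arrangement leaves the threshold value open.

```python
def label_group(group_profile, predetermined_profiles, similarity_fn, threshold=0.8):
    """Return the social-category label of the best-matching predetermined
    profile, provided the match meets a confidence threshold.

    `predetermined_profiles` maps a label (e.g. "sports friends") to its
    predetermined group attendance profile. Returns None when no match is
    sufficiently strong, in which case the group is left unlabelled.
    """
    best_label, best_score = None, threshold
    for label, profile in predetermined_profiles.items():
        score = similarity_fn(group_profile, profile)
        if score >= best_score:
            best_label, best_score = label, score
    return best_label
```

Any numeric similarity function, such as a weighted sum of per-component histogram similarities, can be supplied as `similarity_fn`; raising the threshold trades recall for fewer false-positive labels.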

Claims (14)

1. A method of determining a label for a group of individuals, the method comprising: receiving an image collection, a plurality of images of said image collection representing one or more individuals; associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images; identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute; determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
2. The method according to claim 1, wherein the individuals are associated together if the associated individuals attend the same event.
3. The method according to claim 1, wherein the group attendance profile represents a distribution of event attributes of events attended by the individuals in the group.
4. The method according to claim 1, wherein the event attribute may be one of a time value, a location, an event type and a day of the week.
5. The method according to claim 1, wherein the group attendance profile is determined based on image attributes.
6. The method according to claim 1, wherein the group attendance profile is determined using object detection.
7. The method according to claim 1, further comprising assigning a weight to the individuals when determining the group attendance profile.
8. The method according to claim 1, further comprising removing one of the individuals from the group based on a weight assigned to the removed individual.
9. The method according to claim 1, wherein attendance at one of said social events by said individuals of the group is disregarded where only a small number of individuals in the group attend said one social event.
10. The method according to claim 1, further comprising weighting a contribution of one of said social events to the group attendance profile based on event attributes.
11. The method according to claim 1, wherein the label is determined based on the size of the group.
12. An apparatus for determining a label for a group of individuals, the apparatus comprising: means for receiving an image collection, a plurality of images of said image collection representing one or more individuals; means for associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images; means for identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute; means for determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and means for determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
13. A system for determining a label for a group of individuals, the system comprising: a memory for storing data and a computer program; a processor coupled to the memory for executing the computer program, said computer program comprising instructions for: receiving an image collection, a plurality of images of said image collection representing one or more individuals; associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images; identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute; determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
14. A computer readable medium having a computer program stored thereon for determining a label for a group of individuals, the program comprising: code for receiving an image collection, a plurality of images of said image collection representing one or more individuals; code for associating at least two of the individuals together to determine a group based on appearance of the associated individuals in common ones of the images; code for identifying one or more social events captured by the common images, based at least on metadata associated with the common images, each said social event corresponding to at least one event attribute; code for determining a group attendance profile based on the event attributes of the social events attended by the individuals in the group and using said metadata; and code for determining a label for the group of individuals based on similarity of the group attendance profile to predetermined group attendance profiles, each predetermined group attendance profile corresponding to a social category.
AU2013254921A 2013-11-07 2013-11-07 Method, apparatus and system for determining a label for a group of individuals represented in images Abandoned AU2013254921A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2013254921A AU2013254921A1 (en) 2013-11-07 2013-11-07 Method, apparatus and system for determining a label for a group of individuals represented in images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2013254921A AU2013254921A1 (en) 2013-11-07 2013-11-07 Method, apparatus and system for determining a label for a group of individuals represented in images

Publications (1)

Publication Number Publication Date
AU2013254921A1 true AU2013254921A1 (en) 2015-05-21

Family

ID=53171726

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2013254921A Abandoned AU2013254921A1 (en) 2013-11-07 2013-11-07 Method, apparatus and system for determining a label for a group of individuals represented in images

Country Status (1)

Country Link
AU (1) AU2013254921A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108446330A (en) * 2018-02-13 2018-08-24 北京数字新思科技有限公司 Promotion object processing method and device and computer-readable storage medium
CN108446330B (en) * 2018-02-13 2022-05-13 北京明略昭辉科技有限公司 Promotion object processing method and device and computer-readable storage medium

Similar Documents

Publication Publication Date Title
Xu et al. Geolocalized modeling for dish recognition
JP5848336B2 (en) Image processing device
US9430719B2 (en) System and method for providing objectified image renderings using recognition information from images
JP5537557B2 (en) Semantic classification for each event
Li et al. GPS estimation for places of interest from social users' uploaded photos
US8953895B2 (en) Image classification apparatus, image classification method, program, recording medium, integrated circuit, and model creation apparatus
US8897505B2 (en) System and method for enabling the use of captured images through recognition
Taskiran et al. ViBE: A compressed video database structured for active browsing and search
US7809192B2 (en) System and method for recognizing objects from images and identifying relevancy amongst images and information
US7809722B2 (en) System and method for enabling search and retrieval from image files based on recognized information
US9298982B2 (en) System and method for computing the visual profile of a place
US20110184953A1 (en) On-location recommendation for photo composition
US20110184949A1 (en) Recommending places to visit
US20130111373A1 (en) Presentation content generation device, presentation content generation method, presentation content generation program, and integrated circuit
Joshi et al. Inferring generic activities and events from image content and bags of geo-tags
Yuan et al. Mining compositional features from GPS and visual cues for event recognition in photo collections
Chen et al. Clues from the beaten path: Location estimation with bursty sequences of tourist photos
Cao et al. Image annotation within the context of personal photo collections using hierarchical event and scene models
Boiarov et al. Large scale landmark recognition via deep metric learning
Adams et al. Extraction of social context and application to personal multimedia exploration
Guo et al. Multigranular event recognition of personal photo albums
Tankoyeu et al. Event detection and scene attraction by very simple contextual cues
Guo et al. Event recognition in personal photo collections using hierarchical model and multiple features
Yang et al. Segmentation and recognition of multi-model photo event
AU2013254921A1 (en) Method, apparatus and system for determining a label for a group of individuals represented in images

Legal Events

Date Code Title Description
MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application