WO2011151946A1 - Content classification system, content generation/classification device, content classification device, classification method and program - Google Patents
- Publication number
- WO2011151946A1 (PCT/JP2011/000630; JP2011000630W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- classification
- state
- image data
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
- G06F16/353—Clustering; Classification into predefined classes
Definitions
- the present invention relates to a technique for classifying generated content.
- Techniques of this kind are known from Patent Documents 1 to 3.
- Patent Document 1 classifies an image group with continuous shooting dates into one group.
- Patent Document 2 uses a point at which the interval between shooting dates and times changes greatly as a delimiter, and classifies the images before that delimiter and the images after it into different groups.
- Patent Document 3 stores GPS (Global Positioning System) information representing the latitude and longitude of the shooting position for each image data item, and classifies the images into a group whose shooting positions are separated from a reference position, such as the user's home, by a predetermined distance or more, and a group whose shooting positions are not so separated.
- In Patent Documents 1 to 3, however, there is a problem that images taken at different events can be classified into one group.
- For example, images taken at an event A and at a different event B can end up classified into one group.
- the present invention has been made in view of such a problem, and an object thereof is to provide a content classification system that increases the possibility of classifying a plurality of generated contents for each event.
- In order to solve this problem, a content classification system according to the present invention is a content classification system including a content generation device that sequentially generates content, wherein the content generation device comprises: a detection unit that repeatedly detects which of a first state, in which the device exists at a predetermined position, and a second state, in which the device does not exist at the predetermined position, the device is in; and a classification unit that performs processing of classifying two contents generated by the content generation device into different groups if a change has occurred in the state detected by the detection unit between the times at which the two contents were generated, and into the same group if no change has occurred.
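The claimed rule can be summarized in a minimal editorial sketch (the function and variable names below are ours, not from the claims): two contents fall into the same group exactly when no detected state change lies between their generation times.

```python
from datetime import datetime

def same_group(gen_prev, gen_curr, state_change_times):
    """True when no home/away state change was detected between the
    generation times of the two contents (i.e. same event period)."""
    return not any(gen_prev < t <= gen_curr for t in state_change_times)

# A state change (going out) detected at 6:32 on February 22, 2010:
changes = [datetime(2010, 2, 22, 6, 32)]

# Two contents generated before the change share a group; a content
# generated after the change lands in a different group.
print(same_group(datetime(2010, 2, 21, 18, 10),
                 datetime(2010, 2, 21, 19, 30), changes))  # True
print(same_group(datetime(2010, 2, 21, 19, 30),
                 datetime(2010, 2, 22, 7, 10), changes))   # False
```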
- The content classification system according to the present invention, having the above configuration, can increase the possibility of classifying a plurality of generated contents for each event.
- FIG. 1 is a block diagram illustrating a functional configuration of a main part of the digital camera 100 according to Embodiment 1.
- FIG. 2 is a diagram illustrating a data configuration and an example of contents of an image information table 10 used by the digital camera 100.
- FIG. 3 is a flowchart showing the update process of take-out date/time information and bring-in date/time information by the time information update unit 119.
- FIG. 4 is a flowchart showing classification processing by a classification processing unit 120.
- FIG. 5 is a diagram illustrating an example of the generation timing of a plurality of image data and the timing of state changes of the digital camera 100.
- FIG. 6 is a diagram for explaining how take-out date/time information and bring-in date/time information are updated.
- It is a diagram showing the data structure and example contents of a take-out date/time table 20 and a bring-in date/time table 30.
- It is a flowchart showing classification processing by a classification processing unit 308. FIG. 10 is a block diagram showing the system configuration of the content classification system 1000.
- In Embodiment 1, a digital camera 100 including a content generation/classification device 110 will be described as an embodiment of the content generation/classification device according to the present invention.
- FIG. 1 is a block diagram illustrating a functional configuration of a main part of the digital camera 100 according to the first embodiment.
- As shown in FIG. 1, the digital camera 100 includes a release button 101, a display unit 102, a time measuring unit 103, and a content generation and classification device 110.
- the release button 101 is a button used by the user to give a shooting instruction, and has a function of sending a predetermined input signal to the content generation and classification apparatus 110 in response to a pressing operation by the user.
- the display unit 102 includes a liquid crystal display (LCD) and has a function of displaying captured images and the like according to instructions from the content generation and classification apparatus 110.
- the time measuring unit 103 is a so-called clock and has a function of measuring the current date and time.
- The content generation/classification device 110 has a function of classifying image data, captured and generated based on input signals from the release button 101, for each event period, and includes a storage unit 111, a position storage unit 112, a generation unit 113, a position calculation unit 116, a detection unit 117, and a classification unit 118.
- Here, the event period refers to a period during which the same state continues, between a state in which the digital camera 100 exists at a predetermined position and a state in which it does not. In this embodiment, the predetermined position is assumed to be the home of the user of the digital camera 100.
- the content generation / classification device 110 classifies the generated image data for each event period, and as a result, increases the possibility of classifying the generated image data for each event.
- the content generation / classification device 110 includes a processor and a memory.
- When the processor executes a program stored in the memory, each function of the generation unit 113, the position calculation unit 116, the detection unit 117, and the classification unit 118 is realized.
- the storage unit 111 is realized by a recording medium such as a memory or a hard disk, and has a function of storing generated image data, an image information table 10 (see FIG. 2) described later, and the like.
- The position storage unit 112 is realized by a recording medium such as a memory or a hard disk, and has a function of storing information (hereinafter referred to as "position information") indicating the latitude and longitude of a predetermined position (in this example, the home of the user of the digital camera 100).
- The generation unit 113 has a function of storing, in the storage unit 111, image data captured and generated based on an input signal from the release button 101 together with information indicating the generation date and time of the image data (hereinafter referred to as "generation date/time information"), and includes an imaging unit 114 and an imaging control unit 115.
- The imaging unit 114 includes a lens, a charge coupled device (CCD), and an A/D (analog-to-digital) converter, and sends generated image data (a luminance data group for 480 pixels) to the imaging control unit 115. The image data is generated by condensing light incident from a subject onto the CCD through the lens, converting the light into an electrical signal with the CCD, and converting the electrical signal into a digital signal with the A/D converter.
- The imaging control unit 115 has a function of instructing the imaging unit 114 to perform imaging, and of storing in the storage unit 111 the image data received from the imaging unit 114 together with generation date/time information indicating the date and time acquired from the time measuring unit 103.
- In detail, the imaging control unit 115 stores the image data in the storage unit 111 in association with identification information of the image data (hereinafter referred to as the "data number"), and registers the generation date/time information in association with the data number in the image information table 10 stored in the storage unit 111. As a result, the image data and the generation date/time information are associated with each other via the data number.
- The data numbers are assigned so that the initial value at the start of operation of the digital camera 100 is "1", with consecutive numbers used in the order of image data generation.
- the position calculation unit 116 includes a GPS antenna, and has a function of repeatedly calculating the latitude and longitude of the location where the device is present based on a signal from a GPS satellite received via the GPS antenna.
- Based on the position information stored in the position storage unit 112 and the calculation result by the position calculation unit 116, the detection unit 117 has a function of repeatedly detecting which of the first state, in which the device exists at the home, and the second state, in which the device does not exist at the home, the device is in.
- Specifically, the detection unit 117 detects the first state when the difference between the latitude and longitude of the user's home indicated by the position information and the latitude and longitude of the place where the device exists, as calculated by the position calculation unit 116, is within a predetermined value (for example, 1 second of arc), and detects the second state otherwise.
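As a hedged sketch of this detection (the threshold unit and all names below are assumptions beyond the "1 second" example in the text), the comparison can be written as:

```python
ARC_SECOND_DEG = 1.0 / 3600.0  # one second of arc, expressed in degrees

def detect_state(home, current, threshold=ARC_SECOND_DEG):
    """Return "first" when both the latitude and longitude of the current
    position are within the threshold of the stored home position,
    otherwise "second" (the device is away from home)."""
    within = (abs(home[0] - current[0]) <= threshold and
              abs(home[1] - current[1]) <= threshold)
    return "first" if within else "second"

home = (35.6586, 139.7454)  # hypothetical home latitude/longitude
print(detect_state(home, (35.6586, 139.7454)))   # "first": at home
print(detect_state(home, (35.7000, 139.7454)))   # "second": away from home
```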
- The classification unit 118 has a function of classifying each image data item stored in the storage unit 111 for each event period based on the detection results received from the detection unit 117, and includes a time information update unit 119 and a classification processing unit 120.
- the classification unit 118 also has a function of causing the display unit 102 to display a thumbnail of image data based on the classification result when a predetermined user operation is received from an operation unit (not shown).
- Each time the time information update unit 119 receives a detection result from the detection unit 117, it has a function of updating the take-out date/time information or the bring-in date/time information in accordance with how the state indicated by the currently received detection result has changed from the state indicated by the previously received detection result. The take-out date/time information and bring-in date/time information are assumed to be stored in the storage unit 111.
- The take-out date/time information indicates the date and time at which it was detected that the device no longer exists at the home, and the bring-in date/time information indicates the date and time at which it was detected that the device came to exist at the home.
- a method for updating take-out date / time information or bring-in date / time information will be described later (see FIG. 3).
- The classification processing unit 120 has a function of classifying image data newly stored in the storage unit 111, based on the take-out date/time information and bring-in date/time information stored in the storage unit 111 and the generation date/time information of the previously stored image data registered in the image information table 10. This classification method will be described later (see FIG. 4).
- FIG. 2 is a diagram showing a data configuration and example contents of the image information table 10 used by the digital camera 100.
- the image information table 10 is information configured by associating a data number 11, generation date / time information 12, and a group number 13 for each image data.
- The data number 11 is identification information of the corresponding image data; here an example is shown in which the initial value at the start of operation of the digital camera 100 is "1" and consecutive values are used in the order of generation of the image data.
- the generation date / time information 12 is data indicating the generation date / time of the corresponding image data
- The group number 13 is identification information of the group into which the corresponding image data is classified; here, too, the initial value at the start of operation of the digital camera 100 is "1", and consecutive values are used.
- For example, FIG. 2 shows that the generation date and time of the image data with data number "1" is "19:30 on January 1, 2010", and that this image data was classified into the group with group number "1".
- each image data having a data number “2” to “4” is classified into the same group having a group number “2”.
- This indicates that these image data items were captured and generated within the same event period.
- On the other hand, the image data with data number "1" and the image data with data numbers "2" to "4" are classified into different groups, that is, they were captured and generated within different event periods.
- the image information table 10 is updated when the generation unit 113 generates image data, and is referenced and updated when the classification unit 118 classifies the image data.
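The rows of the image information table 10 can be mocked up as a simple list of records. This is an editorial sketch with assumed field names; the rows beyond data number "1" use made-up dates, since the text only gives that row's generation date and time.

```python
from datetime import datetime

# Each row associates a data number 11, generation date/time information 12,
# and a group number 13, mirroring the structure described above.
image_info_table = [
    {"data_number": 1, "generated": datetime(2010, 1, 1, 19, 30), "group": 1},
    {"data_number": 2, "generated": datetime(2010, 1, 2, 10, 0),  "group": 2},
    {"data_number": 3, "generated": datetime(2010, 1, 2, 10, 5),  "group": 2},
    {"data_number": 4, "generated": datetime(2010, 1, 2, 11, 0),  "group": 2},
]

def group_of(table, data_number):
    """Look up the group number registered for a given data number."""
    for row in table:
        if row["data_number"] == data_number:
            return row["group"]
    return None

print(group_of(image_info_table, 3))  # 2
```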
- FIG. 3 is a flowchart showing the update processing of take-out date / time information and bring-in date / time information by the time information update unit 119.
- When the time information update unit 119 receives a detection result from the detection unit 117 (step S1), it acquires the current date and time from the time measuring unit 103 and determines whether or not the detection result received this time indicates the first state (step S2).
- When the detection result received this time indicates the second state (step S2: NO), the time information update unit 119 determines whether or not the detection result received last time from the detection unit 117 indicates the first state (step S3); if the second state is indicated (step S3: NO), the update process is terminated without updating the take-out date/time information and the bring-in date/time information.
- When the previously received detection result indicates the first state (step S3: YES), the time information update unit 119 updates the take-out date/time information stored in the storage unit 111 so that it indicates the date and time acquired from the time measuring unit 103 (step S4), and terminates the update process.
- The take-out date/time information is updated in step S4 because the state indicated by the detection result has changed from the first state to the second state, from which it can be determined that the user of the digital camera 100 has gone out with the digital camera 100.
- When the detection result received this time indicates the first state in step S2 (step S2: YES), the time information update unit 119 determines whether or not the detection result received last time from the detection unit 117 indicates the first state (step S5); if the first state is indicated (step S5: YES), the update process is terminated without updating the take-out date/time information and the bring-in date/time information.
- This is because the state indicated by the detection result remains unchanged in the first state, so it can be determined that the user of the digital camera 100 continues to be at home.
- When the previously received detection result indicates the second state (step S5: NO), the time information update unit 119 updates the bring-in date/time information stored in the storage unit 111 so that it indicates the date and time acquired from the time measuring unit 103 (step S6), and terminates the update process.
- The bring-in date/time information is updated in step S6 because the state indicated by the detection result has changed from the second state to the first state, from which it can be determined that the user of the digital camera 100 has returned home with the digital camera 100.
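The FIG. 3 flow reduces to a small state machine. The sketch below is an editorial illustration with assumed names; it keeps only the decision structure of steps S2 to S6.

```python
from datetime import datetime

def update_times(prev_state, curr_state, now, times):
    """Update the take-out / bring-in date/time records (a dict here)
    according to the detected state transition."""
    if prev_state == "first" and curr_state == "second":
        times["take_out"] = now   # step S4: the user went out with the camera
    elif prev_state == "second" and curr_state == "first":
        times["bring_in"] = now   # step S6: the user returned home
    # no transition (step S3: NO / step S5: YES): nothing is updated
    return times

times = {"take_out": datetime(2010, 2, 19, 10, 36),
         "bring_in": datetime(2010, 2, 19, 21, 20)}
update_times("first", "second", datetime(2010, 2, 22, 6, 32), times)
print(times["take_out"])  # 2010-02-22 06:32:00
```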
- FIG. 4 is a flowchart showing the classification processing by the classification processing unit 120.
- the image data generated by the imaging unit 114 and the generation date / time information are stored in the storage unit 111 in association with each other by the imaging control unit 115 (step S11). That is, the image data generated by the imaging unit 114 is stored in the storage unit 111 in association with the data number of the image data, and the data number and generation date / time information are registered in the image information table 10.
- When the process of step S11 is performed, the classification processing unit 120 reads the generation date/time information of the previously generated image data, that is, the generation date/time information corresponding to the data number obtained by subtracting 1 from the latest data number, from the image information table 10 (step S12), and reads the take-out date/time information from the storage unit 111 (step S13).
- The classification processing unit 120 determines whether or not the generation date and time indicated by the generation date/time information read in step S12, that is, the generation date and time of the previously generated image data, is later than the take-out date and time indicated by the take-out date/time information read in step S13 (step S14). When it is later than the take-out date and time (step S14: YES), the bring-in date/time information is read from the storage unit 111 (step S15).
- The classification processing unit 120 then determines whether or not the generation date and time of the previously generated image data is later than the bring-in date and time indicated by the bring-in date/time information read in step S15 (step S16). When it is later than the bring-in date and time (step S16: YES), the image data generated this time and stored in the storage unit 111 in step S11 is classified into the same group as the previously generated image data (step S17), and the classification process is terminated.
- In detail, in step S17, the same group number as that of the previously generated image data is registered in the group number field of the image information table 10 corresponding to the latest data number.
- When the generation date and time of the previously generated image data is earlier than the take-out date and time in step S14 (step S14: NO), or earlier than the bring-in date and time in step S16 (step S16: NO), the image data generated this time and stored in the storage unit 111 in step S11 is classified into a new group different from that of the previously generated image data (step S18), and the classification process is terminated.
- In detail, in step S18, a value obtained by incrementing the group number of the previously generated image data by one is registered in the group number field of the image information table 10 corresponding to the latest data number.
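Steps S14 to S18 above can be sketched as a single comparison (function name and the sample group number are ours, not from the text): the previous image's generation time must be later than both the take-out and the bring-in date/time for the new image to join the same group.

```python
from datetime import datetime

def classify(prev_generated, take_out, bring_in, prev_group):
    """Return the group number assigned to the newly generated image."""
    if prev_generated > take_out and prev_generated > bring_in:
        return prev_group        # step S17: same event period
    return prev_group + 1        # step S18: open a new group

# P1's case: the previous image (18:10, Feb 21) is later than both the
# take-out (10:36, Feb 19) and bring-in (21:20, Feb 19) times -> same group.
print(classify(datetime(2010, 2, 21, 18, 10),
               datetime(2010, 2, 19, 10, 36),
               datetime(2010, 2, 19, 21, 20), 5))  # 5

# P2's case: the previous image (19:30, Feb 21) is earlier than the updated
# take-out time (6:32, Feb 22) -> new group.
print(classify(datetime(2010, 2, 21, 19, 30),
               datetime(2010, 2, 22, 6, 32),
               datetime(2010, 2, 19, 21, 20), 5))  # 6
```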
- FIG. 5 is a diagram illustrating an example of the generation timing of a plurality of image data and the timing of the state change of the digital camera 100.
- FIG. 6 is a diagram for explaining how the take-out date / time information and the bring-in date / time information are updated.
- FIGS. 7 and 8 are diagrams for explaining how the image information table 10 is updated.
- the time T1 shown in FIG. 5 is the timing when the user goes out with the digital camera 100
- the time T2 is the timing when the user who goes out at the time T1 returns with the digital camera 100
- Time T3 indicates the timing at which the user who returned home at time T2 goes out again with the digital camera 100.
- the image data P1 to P11 indicate image data photographed and generated by the digital camera 100.
- The image data P1 is image data captured and generated at an event E1, in which the user of the digital camera 100 participated, on February 21, 2010.
- The image data P2 to P9 are image data captured and generated at an event E2 held from February 22 to 24, 2010.
- The image data P10 is image data captured and generated at an event E3 on February 25, 2010, and the image data P11 is image data captured and generated at an event E4 on the same day, different from the event E3.
- n−1 to n+9 described in the rectangles indicating the image data P1 to P11 in the figure indicate data numbers (n is an integer).
- First, the imaging control unit 115 stores the image data (P1) generated by the imaging unit 114 in the storage unit 111 in association with the generation date/time information (in this example, "19:30 on February 21, 2010") (step S11 in FIG. 4).
- Next, the classification processing unit 120 reads the generation date/time information of the previously generated image data (in this example, "18:10 on February 21, 2010") from the image information table 10 (step S12), and reads the take-out date/time information (in this example, indicating "10:36 on February 19, 2010" as shown in FIG. 6(a)) from the storage unit 111 (step S13).
- Since the generation date and time indicated by the generation date/time information read in step S12 (18:10 on February 21, 2010) is later than the take-out date and time indicated by the take-out date/time information read in step S13 (10:36 on February 19, 2010) (step S14: YES), the bring-in date/time information (in this example, indicating "21:20 on February 19, 2010" as shown in FIG. 6(a)) is read from the storage unit 111 (step S15).
- Since the generation date and time indicated by the generation date/time information read in step S12 (18:10 on February 21, 2010) is later than the bring-in date and time indicated by the bring-in date/time information read in step S15 (21:20 on February 19, 2010) (step S16: YES), the classification processing unit 120 classifies the image data (P1) into the same group as the previously generated image data (step S17) and terminates the classification process.
- As a result, "m−1", the same group number as that of the image data with data number "n−2" generated before the image data P1, is registered as the group number of the image data P1 whose data number is "n−1".
- <Update process at time T1> Since the user goes out with the digital camera 100 at time T1, the detection unit 117 detects, based on the position information stored in the position storage unit 112 and the calculation result by the position calculation unit 116, that the device is in the second state of not existing at the home.
- Until then, the detection unit 117 had been detecting that the device is in the first state of existing at the home.
- When the time information update unit 119 receives the detection result from the detection unit 117 (step S1 in FIG. 3), it acquires the current date and time (in this example, "6:32 on February 22, 2010") from the time measuring unit 103.
- Since the detection result received this time indicates the second state (step S2: NO) and the detection result received last time from the detection unit 117 indicates the first state, in which the device exists at the home (step S3: YES), the time information update unit 119 updates the take-out date/time information stored in the storage unit 111 so that it indicates the date and time acquired from the time measuring unit 103 (6:32 on February 22, 2010) (step S4), and terminates the update process.
- As a result, the take-out date/time information is updated from "10:36 on February 19, 2010" as shown in FIG. 6(a) to "6:32 on February 22, 2010" as shown in FIG. 6(b).
- Next, the imaging control unit 115 stores the image data (P2) generated by the imaging unit 114 in the storage unit 111 in association with the generation date/time information (in this example, "7:10 on February 22, 2010") (step S11 in FIG. 4).
- Next, the classification processing unit 120 reads the generation date/time information of the previously generated image data (in this example, "19:30 on February 21, 2010") from the image information table 10 (step S12), and reads the take-out date/time information (in this example, indicating "6:32 on February 22, 2010" as shown in FIG. 6(b)) from the storage unit 111 (step S13).
- Since the generation date and time (19:30 on February 21, 2010) indicated by the generation date/time information read in step S12 is earlier than the take-out date and time indicated by the take-out date/time information read in step S13 (6:32 on February 22, 2010) (step S14: NO), the image data (P2) is classified into a new group (step S18), and the classification process is terminated.
- As a result, "m", obtained by incrementing by one the group number "m−1" of the image data P1 generated before the image data P2, is registered in the image information table 10 as the group number of the image data P2 whose data number is "n".
- <Classification processing of image data P3 to P9> The image data P3 to P9 are processed in the same manner as the above <classification processing of image data P1>, and the image data P3 to P9 are classified into the same group as the image data P2.
- As a result, the group number "m", the same group as that of the image data P2, is registered in the image information table 10 for the image data P3 to P9 whose data numbers are "n+1" to "n+7".
- In this way, since the digital camera 100 is not brought back home during the period from the generation of the image data P2 to the generation of the image data P9, the image data P2 to P9 are classified as image data captured within one event period extending over several days.
- <Update process at time T2> Since the user returns home with the digital camera 100 at time T2, the detection unit 117 detects, based on the position information stored in the position storage unit 112 and the calculation result by the position calculation unit 116, that the device is in the first state of existing at the home.
- When the time information update unit 119 receives the detection result from the detection unit 117 (step S1 in FIG. 3), it acquires the current date and time (in this example, "16:20 on February 24, 2010") from the time measuring unit 103.
- The time information update unit 119 then updates the bring-in date/time information stored in the storage unit 111 so that it indicates the date and time acquired from the time measuring unit 103 (16:20 on February 24, 2010) (step S6), and the update process ends.
- Next, the imaging control unit 115 stores the image data (P10) generated by the imaging unit 114 in the storage unit 111 in association with the generation date/time information (in this example, "5:10 on February 25, 2010") (step S11 in FIG. 4).
- Next, the classification processing unit 120 reads the generation date/time information of the previously generated image data (in this example, "15:10 on February 24, 2010") from the image information table 10 (step S12), and reads the take-out date/time information (in this example, indicating "6:32 on February 22, 2010" as shown in FIG. 6(c)) from the storage unit 111 (step S13).
- Since the generation date and time indicated by the generation date/time information read in step S12 (15:10 on February 24, 2010) is later than the take-out date and time indicated by the take-out date/time information read in step S13 (6:32 on February 22, 2010) (step S14: YES), the bring-in date/time information (in this example, indicating "16:20 on February 24, 2010" as shown in FIG. 6(c)) is read from the storage unit 111 (step S15).
- Since the generation date and time indicated by the generation date/time information read in step S12 (15:10 on February 24, 2010) is earlier than the bring-in date and time indicated by the bring-in date/time information read in step S15 (16:20 on February 24, 2010) (step S16: NO), the image data (P10) is classified into a new group (step S18), and the classification process is terminated.
- As a result, "m+1", obtained by incrementing by one the group number "m" of the image data P9 generated before the image data P10, is registered in the image information table 10 as the group number of the image data P10 whose data number is "n+8".
- The take-out date/time information is then updated from "6:32 on February 22, 2010" as shown in FIG. 6(c) to "6:20 on February 25, 2010".
- The image data P11 is processed in the same manner as the above <classification processing of image data P2>, and the image data P11 is classified into a group different from that of the image data P10.
- As a result, "m+2", obtained by incrementing by one the group number "m+1" of the image data P10 generated before the image data P11, is registered in the image information table 10 as the group number of the image data P11 whose data number is "n+9".
- In this way, since the digital camera 100 is taken out of the home again during the period from the generation of the image data P10 to the generation of the image data P11, the image data P10 and the image data P11 are classified as image data captured within different event periods.
- As described above, the digital camera 100 can increase the possibility of classifying image data for each event. That is, image data (P2 to P9) generated at an event that takes place over a plurality of days, like the event E2, can be classified into one group. Further, as with the events E3 and E4, even when a plurality of events occur on the same day, the image data (P10, P11) generated at each event can be classified for each event.
- FIG. 9 is a diagram illustrating a display example of the classification result.
- FIG. 9(a) shows a group selection screen SC1 displayed on the display unit 102 by the classification unit 118, which has received a predetermined user operation from an operation unit (not shown) immediately after the image data P11 in the example of FIG. 5 is classified.
- the group selection screen SC1 includes icons i1 to i4 indicating the respective groups to which the respective image data shown in FIG. 5 belong.
- The icon i1 indicates the group with group number "m−1", the icon i2 the group with group number "m", the icon i3 the group with group number "m+1", and the icon i4 the group with group number "m+2".
- On each icon, the generation date and generation location (at home or away from home) of the image data belonging to the group indicated by that icon are displayed. This allows the user to easily identify the event period corresponding to each of the icons i1 to i4.
- FIG. 9B shows a thumbnail screen SC2 displayed on the display unit 102 by the classification unit 118 that has received a user operation for selecting the icon i2 in FIG. 9A from an operation unit (not shown).
- Since the number of items of image data P2 to P9 belonging to the group having the group number “m” is eight, eight thumbnails corresponding to those image data are displayed on the thumbnail screen SC2 side by side, from the upper left, in the shooting order of the corresponding image data.
- In this example, up to nine thumbnails are displayed on one thumbnail screen. Therefore, when ten or more image data items belong to one group, a thumbnail screen composed of the thumbnails of the image data whose shooting order is first to ninth may first be displayed, and a thumbnail screen including the thumbnails of the tenth and subsequent image data may then be displayed in accordance with a user operation.
- As described above, since the thumbnails of the image data belonging to each group can be displayed for each group corresponding to each event period, the user can easily find desired image data.
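The paging behavior described above (at most nine thumbnails per screen, shown in shooting order, with later pages displayed in response to a user operation) can be sketched as follows; the function name and the data are illustrative assumptions, not from the source.

```python
# Sketch of the thumbnail paging described above: at most nine thumbnails
# per screen, shown in shooting order. Names and data are illustrative.
THUMBS_PER_SCREEN = 9

def thumbnail_pages(image_ids):
    """Split image ids (already sorted by shooting order) into screens."""
    return [image_ids[i:i + THUMBS_PER_SCREEN]
            for i in range(0, len(image_ids), THUMBS_PER_SCREEN)]

# Group "m" in the example holds the eight items P2..P9, so one screen:
group_m = [f"P{n}" for n in range(2, 10)]
print(thumbnail_pages(group_m))      # one page of eight thumbnails

# With ten or more items, the tenth and later move to a second screen.
eleven = [f"P{n}" for n in range(1, 12)]
pages = thumbnail_pages(eleven)
print(len(pages[0]), len(pages[1]))  # 9 2
```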
- (Embodiment 2) In the first embodiment, an example has been described in which a single digital camera 100 performs everything from the generation of image data to its classification. In the second embodiment, a content classification system 1000 will be described as an embodiment of a content classification system according to the present invention, in which an apparatus (content classification apparatus) different from the digital camera that generates the image data classifies the image data generated by the digital camera.
- FIG. 10 is a block diagram showing a system configuration of the content classification system 1000 according to the second embodiment.
- the content classification system 1000 includes a digital camera 200 and a content classification device 300.
- the digital camera 200 includes a content generation device 210 instead of the content generation and classification device 110 of the digital camera 100 according to the first embodiment.
- the content generation apparatus 210 includes a generation unit 113, a content storage unit 211, a wireless communication unit 212, and a transmission processing unit 213.
- The generation unit 113 is the same as that of the content generation and classification device 110 according to the first embodiment.
- The content storage unit 211 is realized by a recording medium such as a memory or a hard disk and, like the storage unit 111 according to the first embodiment, has a function of storing generated image data. It differs from the storage unit 111 in that it stores an image information table (hereinafter referred to as the “modified image information table”) that is a slightly modified version of the image information table 10, and in that it does not store take-out date/time information or bring-in date/time information.
- The modified image information table is a table obtained by removing the group number from the data items of the image information table 10 according to Embodiment 1 shown in FIG. 2. That is, although not specifically illustrated, the modified image information table is information configured by associating, for each image data item, the data number 11 with the generation date/time information 12. Hereinafter, the pair of the data number 11 and the generation date/time information 12 is also referred to as a “record”.
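The record structure just described, a data number paired with generation date/time information and no group number column, can be sketched as follows; the field names are illustrative assumptions.

```python
# Sketch of one record of the modified image information table: only a
# data number and generation date/time, with no group number column.
# Field names are illustrative, not from the source.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ModifiedImageRecord:
    data_number: int          # data number 11
    generated_at: datetime    # generation date/time information 12

record = ModifiedImageRecord(data_number=1,
                             generated_at=datetime(2010, 2, 21, 19, 30))
print(record.data_number, record.generated_at.isoformat())
```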
- the wireless communication unit 212 is a circuit that transmits and receives radio waves, and is realized by, for example, a wireless LAN adapter that conforms to the IEEE 802.11 standard.
- The wireless communication unit 212 has a function of transmitting a response signal each time it receives a so-called beacon signal repeatedly transmitted from the content classification device 300.
- When receiving a beacon signal in a non-connected state, the wireless communication unit 212 notifies the transmission processing unit 213 to that effect and, in accordance with an instruction from the transmission processing unit 213, has a function of transmitting to the content classification device 300 a connection request signal including the SSID (Service Set Identifier) set in advance in the content classification device 300. Further, the wireless communication unit 212 establishes a connection with the content classification device 300 by receiving a connection permission signal from the content classification device 300 that has received the connection request signal, and has a function of transmitting each image data item and the like to the content classification device 300 based on an instruction from the transmission processing unit 213.
- The SSID included in the connection request signal may be acquired from a beacon signal repeatedly transmitted by the content classification device 300, as is conventionally done, or may be stored in the digital camera 200 in advance.
- The transmission processing unit 213 manages, for each image data item stored in the content storage unit 211, whether or not transmission to the content classification device 300 has been completed, extracts the record for each untransmitted image data item from the modified image information table, and has a function of transmitting each extracted record, each untransmitted image data item, and its data number to the content classification device 300 via the wireless communication unit 212.
- The transmission processing unit 213 performs this transmission in accordance with an instruction from the user. That is, when receiving from the wireless communication unit 212 a notification that a beacon signal has been received, the transmission processing unit 213 causes the display unit 102 to display a message inquiring of the user whether or not to transmit each untransmitted image data item stored in the content storage unit 211 to the content classification device 300. When a user operation indicating that the data should be transmitted is accepted from an operation unit (not shown), the transmission processing unit 213 instructs the wireless communication unit 212 to transmit a connection request signal and, after a connection with the content classification device 300 is established, transmits each image data item and the like.
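The bookkeeping performed by the transmission processing unit 213, tracking which stored images have already been sent and extracting the records of the untransmitted ones, can be sketched as follows; this is a simplified in-memory model, and all names and data are assumptions.

```python
# Simplified sketch of the transmission processing unit 213's bookkeeping:
# it remembers which image data items were already sent and, on request,
# yields the untransmitted items together with their records.
# All names and data are illustrative.

def untransmitted(records, sent_numbers):
    """records: {data_number: generation date/time string};
    sent_numbers: set of data numbers already sent to the classifier."""
    return {n: rec for n, rec in records.items() if n not in sent_numbers}

records = {1: "2010-02-21T19:30", 2: "2010-02-22T07:10", 3: "2010-02-22T09:00"}
sent = {1}

to_send = untransmitted(records, sent)
print(sorted(to_send))               # data numbers 2 and 3 remain to be sent
sent.update(to_send)                 # mark them as transmitted afterwards
print(untransmitted(records, sent))  # nothing left
```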
- The content classification device 300 is realized by, for example, a personal computer (PC) including a display and, as shown in the figure, includes a wireless communication unit 301, a data storage unit 302, a display unit 303, a detection unit 304, an acquisition unit 305, and a classification unit 306.
- the wireless communication unit 301 is a circuit that transmits and receives radio waves, and operates as a so-called access point based on, for example, the IEEE 802.11 standard.
- The wireless communication unit 301 repeatedly transmits a so-called beacon signal and has a function of notifying the detection unit 304 that a response signal has been received each time it receives one.
- When the wireless communication unit 301 receives a connection request signal including the SSID of its own device, it transmits a connection permission signal, and it has a function of receiving each image data item and the like transmitted from the digital camera 200 and sending them to the acquisition unit 305.
- The data storage unit 302 is realized by a recording medium such as a memory or a hard disk, and stores each image data item received from the digital camera 200 via the wireless communication unit 301 as well as the image information table 10 described in the first embodiment (see FIG. 2).
- the display unit 303 is a liquid crystal display (LCD), for example, and has a function of displaying a thumbnail screen or the like (see FIG. 9) similar to that described in the first embodiment in accordance with an instruction from the classification unit 306.
- Based on whether or not it receives from the wireless communication unit 301 the notification that a response signal has been received, the detection unit 304 has a function of repeatedly detecting which of a first state, in which the digital camera 200 exists at home, and a second state, in which it does not exist at home, the digital camera 200 is in.
- The detection unit 304 detects the first state when the response signal is received, detects the second state when it is not received, and sends the detection result to the classification unit 306.
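The detection rule just stated reduces to a simple predicate evaluated each polling cycle; this sketch models it with a boolean, and all names are illustrative assumptions.

```python
# Sketch of the detection unit 304's rule: each polling cycle, the camera
# is judged to be in the first state (at home) exactly when a response to
# the latest beacon was received, and in the second state otherwise.
FIRST_STATE, SECOND_STATE = "first", "second"

def detect(response_received: bool) -> str:
    return FIRST_STATE if response_received else SECOND_STATE

# A run of polling cycles: the camera leaves home after two cycles and
# returns before the last one.
history = [detect(r) for r in (True, True, False, False, True)]
print(history)
```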
- The acquisition unit 305 has a function of storing each image data item received from the wireless communication unit 301 in the data storage unit 302 in association with its data number, and of registering each record received together with each image data item in the image information table 10 of the data storage unit 302.
- The classification unit 306 has the same function as the classification unit 118 according to the first embodiment, that is, a function of classifying each image data item stored in the data storage unit 302 for each event period based on the detection result received from the detection unit 304, and includes a time information update unit 307 and a classification processing unit 308.
- Similarly to the classification unit 118, the classification unit 306 also has a function of causing the display unit 303 to display thumbnails of each image data item based on the classification result when a predetermined user operation is received from an operation unit (not shown).
- The time information update unit 307 basically has the same function as the time information update unit 119, but differs from it in that it manages each take-out date/time and each bring-in date/time occurring after the classification processing unit 308 last finished classifying the image data stored in the data storage unit 302 by the acquisition unit 305. That is, whereas the time information update unit 119 according to the first embodiment manages the take-out date/time and the bring-in date/time for only one generation, the time information update unit 307 according to the present embodiment may manage take-out date/times and bring-in date/times for a plurality of generations.
- The classification processing unit 308 basically has the same function as the classification processing unit 120, but differs from it in that, after the acquisition unit 305 has finished storing in the data storage unit 302 each image data item received from the digital camera 200, it classifies each image data item in order of generation time.
- FIG. 11A is a diagram showing a data configuration and example contents of the take-out date / time table 20.
- The take-out date/time table 20 is information including take-out date/time information 21, registered in chronological order, each item of which indicates a time at which it was detected that the digital camera 200 no longer exists at the user's home after the classification processing unit 308 completed the previous classification of each image data item.
- FIG. 11B is a diagram showing a data configuration and example contents of the bring-in date / time table 30.
- The bring-in date/time table 30 is information composed of bring-in date/time information 31, registered in chronological order, each item of which indicates a time at which it was detected that the digital camera 200 exists at the user's home after the classification processing unit 308 completed the previous classification of each image data item.
- In the present embodiment, the process in step S4 is changed so that take-out date/time information indicating the date/time acquired from the time measuring unit 103 is added to the take-out date/time table 20, and the process in step S6 is changed so that bring-in date/time information indicating the date/time acquired from the time measuring unit 103 is added to the bring-in date/time table 30.
- FIG. 12 is a flowchart showing classification processing by the classification processing unit 308.
- First, the classification processing unit 308 determines whether there is unclassified image data (step S21). Specifically, if a record in which no group number is registered exists in the image information table 10, the classification processing unit 308 determines that there is unclassified image data (step S21: YES); if no such record exists, it determines that there is no unclassified image data (step S21: NO).
- When there is unclassified image data, the classification processing unit 308 selects, from among the unclassified image data, the image data with the oldest generation date/time (hereinafter also referred to as the “target image data”), and reads its generation date/time information and the generation date/time information of the image data generated immediately before the target image data from the image information table 10 (steps S22 and S23).
- the classification processing unit 308 reads each take-out date / time information from the take-out date / time table 20 (step S24).
- The classification processing unit 308 then determines whether the take-out date/time indicated by any of the take-out date/time information read in step S24 is included between the generation date/time, read in step S23, of the image data generated immediately before and the generation date/time of the target image data read in step S22 (step S25).
- When no take-out date/time is included (step S25: NO), the classification processing unit 308 reads each bring-in date/time information item from the bring-in date/time table 30 (step S26).
- The classification processing unit 308 then determines whether the bring-in date/time indicated by any of the bring-in date/time information read in step S26 is included between the generation date/time, read in step S23, of the image data generated immediately before and the generation date/time of the target image data read in step S22 (step S27).
- When no bring-in date/time is included either (step S27: NO), the classification processing unit 308 classifies the target image data into the same group as the image data generated immediately before, similarly to the process of step S17 described in the first embodiment (step S28), and performs the processing from step S21 again.
- On the other hand, when any take-out date/time is included in step S25 (step S25: YES), or when any bring-in date/time is included in step S27 (step S27: YES), the classification processing unit 308 classifies the target image data into a new group, similarly to the process of step S18 described in the first embodiment (step S29), and performs the processing from step S21 again.
- When there is no unclassified image data (step S21: NO), the classification processing unit 308 ends the classification process.
- When the classification process ends, the time information update unit 307 deletes each take-out date/time information item registered in the take-out date/time table 20 and each bring-in date/time information item registered in the bring-in date/time table 30.
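Steps S21 to S29 amount to walking the unclassified images in ascending order of generation time and starting a new group whenever a take-out or bring-in date/time falls between consecutive generation times. A minimal sketch, assuming in-memory lists in place of the tables; all function and variable names are illustrative.

```python
# Minimal sketch of the classification loop of FIG. 12 (steps S21 to S29).
from datetime import datetime

def classify(gen_times, takeouts, bringins, first_group=0):
    """gen_times: generation date/times in ascending order (S22/S23);
    takeouts, bringins: tables 20 and 30 (read in S24/S26).
    Returns one group number per image."""
    boundaries = sorted(takeouts + bringins)
    groups, current, prev = [], first_group, None
    for t in gen_times:
        if prev is not None and any(prev < b < t for b in boundaries):
            current += 1              # S25/S27: YES -> new group (S29)
        groups.append(current)        # otherwise same group (S28)
        prev = t
    return groups

d = lambda day, h, m: datetime(2010, 2, day, h, m)
# The 19:30 image stays with the 18:10 image; the next-morning 7:10 image
# starts a new group because the 6:32 take-out lies in between.
print(classify([d(21, 18, 10), d(21, 19, 30), d(22, 7, 10)],
               takeouts=[d(22, 6, 32)], bringins=[]))
```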
- A case will be described as an example in which the content classification device 300 receives, from the digital camera 200, the image data P1 to P11 generated at the timings shown in FIG. 5 and the records of the modified image information table for these image data, stores the image data P1 to P11 in the data storage unit 302, and registers each record in the image information table 10.
- First, since there is unclassified image data P1 to P11 (step S21 in FIG. 12: YES), the classification processing unit 308 reads from the image information table 10 the generation date/time information of the image data P1, which has the oldest generation date/time (in this example, indicating “February 21, 2010, 19:30”), and the generation date/time information of the image data generated immediately before that image data (in this example, indicating “February 21, 2010, 18:10”) (steps S22 and S23).
- Next, the classification processing unit 308 reads each take-out date/time information item (indicating “February 19, 2010, 10:36”, “February 22, 2010, 6:32”, and “February 25, 2010, 6:20”, as shown in FIG. 11A) from the take-out date/time table 20 (step S24).
- Since no take-out date/time is included between the generation date/time indicated by the generation date/time information read in step S23 (February 21, 2010, 18:10) and the generation date/time indicated by the generation date/time information read in step S22 (February 21, 2010, 19:30) (step S25: NO), the classification processing unit 308 reads each bring-in date/time information item (“February 19, 2010, 21:20” and “February 24, 2010, 16:20”, as shown in FIG. 11B) from the bring-in date/time table 30 (step S26).
- Since no bring-in date/time is included between the generation date/time read in step S23 (February 21, 2010, 18:10) and the generation date/time read in step S22 (February 21, 2010, 19:30) (step S27: NO), the classification processing unit 308 classifies the image data P1 into the same group as the image data generated immediately before (step S28).
- Specifically, as the group number of the image data P1, whose data number in the image information table 10 is “n − 1”, the same group number “m − 1” as that of the image data with the data number “n − 2” generated immediately before the image data P1 is registered.
- Subsequently, since there is unclassified image data P2 to P11 (step S21: YES), the classification processing unit 308 reads from the image information table 10 the generation date/time information of the image data P2, which has the oldest generation date/time (in this example, indicating “February 22, 2010, 7:10”), and the generation date/time information of the image data generated immediately before it (in this example, indicating “February 21, 2010, 19:30”) (steps S22 and S23).
- Next, the classification processing unit 308 reads each take-out date/time information item (“February 19, 2010, 10:36”, “February 22, 2010, 6:32”, and “February 25, 2010, 6:20”) from the take-out date/time table 20 (step S24).
- Since the take-out date/time (February 22, 2010, 6:32) is included between the generation date/time read in step S23 (February 21, 2010, 19:30) and the generation date/time read in step S22 (February 22, 2010, 7:10) (step S25: YES), the classification processing unit 308 classifies the image data P2 into a new group (step S29).
- Specifically, as the group number of the image data P2, which has the data number “n” in the image information table 10, “m”, obtained by incrementing by one the group number “m − 1” of the image data P1 generated immediately before, is registered.
- <Classification processing of image data P3 to P9> The image data P3 to P9 are processed in the same manner as in <Classification processing of image data P1> above, and the image data P3 to P9 are classified into the same group as the image data P2.
- Specifically, in the image information table 10, the same group number “m” as that of the image data P2 is registered for the image data P3 to P9 having the data numbers “n + 1” to “n + 7”.
- Subsequently, since there is unclassified image data P10 and P11 (step S21 in FIG. 12: YES), the classification processing unit 308 reads from the image information table 10 the generation date/time information of the image data P10, which has the oldest generation date/time (in this example, indicating “February 25, 2010, 5:10”), and the generation date/time information of the image data generated immediately before it (in this example, indicating “February 24, 2010, 15:10”) (steps S22 and S23).
- Next, the classification processing unit 308 reads each take-out date/time information item (indicating “February 19, 2010, 10:36”, “February 22, 2010, 6:32”, and “February 25, 2010, 6:20”, as shown in FIG. 11A) from the take-out date/time table 20 (step S24).
- Since no take-out date/time is included between the generation date/time read in step S23 (February 24, 2010, 15:10) and the generation date/time read in step S22 (February 25, 2010, 5:10) (step S25: NO), the classification processing unit 308 reads each bring-in date/time information item (“February 19, 2010, 21:20” and “February 24, 2010, 16:20”, as shown in FIG. 11B) from the bring-in date/time table 30 (step S26).
- Since the bring-in date/time (February 24, 2010, 16:20) is included between the generation date/time read in step S23 (February 24, 2010, 15:10) and the generation date/time read in step S22 (February 25, 2010, 5:10) (step S27: YES), the classification processing unit 308 classifies the image data P10 into a new group (step S29).
- Specifically, as the group number of the image data P10, which has the data number “n + 8” in the image information table 10, “m + 1”, obtained by incrementing by one the group number “m” of the image data P9 generated immediately before, is registered.
- The image data P11 is processed in the same manner as in <Classification processing of image data P2> above, and the image data P11 is classified into a group different from that of the image data P10.
- Specifically, as the group number of the image data P11, whose data number in the image information table 10 is “n + 9”, “m + 2”, obtained by incrementing by one the group number “m + 1” of the image data P10 generated immediately before, is registered.
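The decisions traced above for P1, P2, P10, and P11 all reduce to checking whether any table entry falls strictly between consecutive generation times. The sketch below replays just those boundary checks with the dates from the example; the first bring-in entry is assumed to be February 19, 21:20 (the source rendering is ambiguous), and the generation time used for P11 is a hypothetical fill-in, since the text only says it follows the 6:20 take-out.

```python
# Replaying the boundary checks of the worked example. The first bring-in
# time and P11's generation time are assumptions, as noted above.
from datetime import datetime

d = lambda day, h, m: datetime(2010, 2, day, h, m)

takeouts = [d(19, 10, 36), d(22, 6, 32), d(25, 6, 20)]  # table 20
bringins = [d(19, 21, 20), d(24, 16, 20)]               # table 30
events = sorted(takeouts + bringins)

def new_group(prev_gen, cur_gen):
    """True when a take-out or bring-in falls between the two generation
    times, i.e. step S25 or S27 answers YES and step S29 is taken."""
    return any(prev_gen < e < cur_gen for e in events)

# (previous image, target image) generation-time pairs traced in the text.
checks = {
    "P1":  new_group(d(21, 18, 10), d(21, 19, 30)),  # same group (S28)
    "P2":  new_group(d(21, 19, 30), d(22, 7, 10)),   # new group (S29)
    "P10": new_group(d(24, 15, 10), d(25, 5, 10)),   # new group (S29)
    "P11": new_group(d(25, 5, 10),  d(25, 10, 0)),   # new group (S29)
}
print(checks)
```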
- (Modification 1) In the content classification system 1000 according to the second embodiment, an example has been described in which the content classification device 300 detects, based on whether or not a response signal is received from the digital camera 200, which of the first state, in which the digital camera 200 exists in the user's home, and the second state, in which it does not exist in the home, the digital camera 200 is in.
- In Modification 1, the digital camera is modified to be equipped with an IC (Integrated Circuit) tag, and the content classification device detects the state of the digital camera (the first state or the second state) based on whether or not a signal from the IC tag is received.
- FIG. 13 is a block diagram showing a system configuration of a content classification system 1100 according to the first modification.
- the content classification system 1100 includes a digital camera 400 and a personal computer (PC) 500 including a content classification device 510.
- The digital camera 400 and the PC 500 can be connected by a USB (Universal Serial Bus) cable 1. The USB cable 1 conforms to, for example, the USB 2.0 standard, and its cable length is, for example, about 50 cm.
- the digital camera 400 includes a content generation device 410 instead of the content generation device 210 of the digital camera 200 according to the second embodiment, and further includes a USB interface unit 401.
- When the USB interface unit 401 detects a connection with the PC 500 via a USB interface unit 501 described later, it notifies the content generation device 410 to that effect, and it has a function of transmitting image data and the like to the PC 500 based on an instruction from the content generation device 410.
- the content generation device 410 includes an IC tag 411 and a transmission processing unit 412 instead of the wireless communication unit 212 and the transmission processing unit 213 of the content generation device 210 according to the second embodiment.
- The IC tag 411 includes an LF antenna for receiving an LF (Low Frequency) band signal and a UHF antenna for transmitting a UHF (Ultra High Frequency) band signal, and has a function of transmitting a UHF band signal including the identification information of its own device each time it receives an LF band signal.
- The transmission processing unit 412 basically has the same function as the transmission processing unit 213 according to the second embodiment, but differs from it in that, when it receives from the USB interface unit 401 the notification that a connection with the PC 500 has been detected, it transmits to the PC 500, via the USB interface unit 401, each image data item not yet transmitted to the PC 500, each data number, and each record corresponding to each untransmitted image data item extracted from the modified image information table.
- More specifically, when the transmission processing unit 412 receives from the USB interface unit 401 the notification that a connection with the PC 500 has been detected, it causes the display unit 102 to display a message inquiring of the user whether or not to transmit to the PC 500 each untransmitted image data item stored in the content storage unit 211, and it transmits each image data item when a user operation to transmit it is received from an operation unit (not shown).
- As shown in FIG. 13, the PC 500 includes a USB interface unit 501 and a content classification device 510.
- the USB interface unit 501 has a function of detecting connection with the digital camera 400 via the USB interface unit 401, receiving each image data transmitted from the digital camera 400, and sending it to the acquisition unit 513.
- the content classification device 510 includes an IC tag reader 511, a detection unit 512, and an acquisition unit 513 instead of the wireless communication unit 301, the detection unit 304, and the acquisition unit 305 of the content classification device 300 according to the second embodiment.
- The IC tag reader 511 includes an LF antenna for transmitting an LF band signal and has a function of repeatedly transmitting the LF band signal. Note that the receivable range of the LF band signal is limited to a relatively narrow range; in this modification, when the digital camera 400 approaches within a range of about 1.5 m of the content classification device 510 with no obstacle in between, the digital camera 400 can receive this LF band signal.
- The IC tag reader 511 also includes a UHF antenna for receiving a UHF band signal, and has a function of notifying the detection unit 512 to that effect each time it receives a UHF band signal including the identification information of the digital camera 400.
- The detection unit 512 basically has the same function as the detection unit 304 according to Embodiment 2, but differs from it in that it detects whether the digital camera 400 is in the first state or the second state based on whether or not it receives from the IC tag reader 511 the notification that a UHF band signal including the identification information of the digital camera 400 has been received.
- The acquisition unit 513 basically has the same function as the acquisition unit 305 according to the second embodiment, but differs from it in that it receives each image data item transmitted from the digital camera 400 from the USB interface unit 501.
- (Modification 2) In the content classification system 1100 according to Modification 1, an example has been described in which the PC 500 detects, based on whether or not a UHF band signal including the identification information of the digital camera 400 is received, which of the first state, in which the digital camera 400 exists in the user's home, and the second state, in which it does not exist in the home, the digital camera 400 is in.
- FIG. 14 is a block diagram showing a system configuration of a content classification system 1200 according to the second modification.
- the content classification system 1200 includes a digital camera 450 and a PC 550 including a content classification device 552.
- The digital camera 450 and the PC 550 can be connected via the USB cable 1, as in Modification 1.
- The digital camera 450 includes a content generation device 451 instead of the content generation device 410 of the digital camera 400 according to Modification 1, and the content generation device 451 includes each component of the content generation device 410 except the IC tag 411.
- the PC 550 includes a USB interface unit 551 and a content classification device 552 instead of the USB interface unit 501 and the content classification device 510 of the PC 500 according to the first modification.
- The USB interface unit 551 further has a function of detecting connection to and disconnection from the digital camera 450 via the USB interface unit 401 and notifying the detection unit 553 to that effect.
- The content classification device 552 includes a detection unit 553 instead of the detection unit 512 of the content classification device 510 according to Modification 1, and further includes each component of the content classification device 510 other than the IC tag reader 511.
- The detection unit 553 basically has the same function as the detection unit 512 according to Modification 1, but differs from it in that it detects whether the digital camera 450 is in the first state or the second state based on the notifications from the USB interface unit 551 that a connection has been detected and that a disconnection has been detected.
- That is, after receiving the notification that a connection has been detected, the detection unit 553 repeatedly detects the first state until it receives the notification that a disconnection has been detected; after receiving the notification that a disconnection has been detected, it repeatedly detects the second state until it receives the notification that a connection has been detected.
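The rule just described is a simple two-state latch driven by connect and disconnect notifications; a minimal sketch, with all names illustrative:

```python
# Sketch of the detection unit 553's behavior: it latches to the first
# state on a "connected" notification and to the second state on a
# "disconnected" notification, reporting the latched state when polled.
class ConnectionStateDetector:
    def __init__(self):
        self.state = "second"        # camera assumed absent initially

    def on_notification(self, event: str):
        if event == "connected":
            self.state = "first"     # camera present (first state)
        elif event == "disconnected":
            self.state = "second"    # camera absent (second state)

    def detect(self) -> str:
        # Repeated detections between notifications return the same state.
        return self.state

det = ConnectionStateDetector()
det.on_notification("connected")
print(det.detect(), det.detect())    # first first
det.on_notification("disconnected")
print(det.detect())                  # second
```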
- Note that the IC tag 411 according to Modification 1 has been described as transmitting a UHF band signal when receiving an LF band signal; however, the IC tag may instead transmit an LF band signal or a UHF band signal without receiving an LF band signal. In this case, the IC tag reader 511 according to Modification 1 need not transmit an LF band signal and only needs to be able to receive the signal transmitted from the IC tag according to this modification.
- In the second embodiment, the content classification device 300 has been described as detecting the state (the first state or the second state) of the digital camera 200 and classifying each image data item. However, as in the first embodiment, the digital camera 200 may be modified so as to detect the state of the device itself and classify each image data item. Hereinafter, a digital camera according to this modification is referred to as a “modified digital camera”.
- In this case, the content classification device 300 may be modified to function as a content management device that stores, in the data storage unit 302, each image data item acquired from the modified digital camera via the wireless communication unit 301.
- The content management device may be realized not only by a personal computer but also by providing a wireless communication function to a device having a function of storing data (for example, a computer such as a NAS (Network Attached Storage) or a server, or an HDD (Hard Disk Drive) recorder).
- Each image data item may also be acquired via USB.
- In the first embodiment, a content management device that acquires and stores each image data item from the digital camera 100 has not been particularly described; however, the digital camera 100 may be modified to be connectable wirelessly or by wire, and a content classification system composed of the digital camera according to this modification and such a content management device may be configured.
- "Home" or "going out" is displayed as the generation location of the image data belonging to the group indicated by each icon included in the group selection screen SC1 (see FIG. 9A). Instead, the shooting position (latitude, longitude), a specific place (prefecture name, city name), an area name, or the like may be displayed.
- The take-out date/time information and the bring-in date/time information have been described as being stored separately, but a plurality of pieces of information indicating the dates and times at which the state of the digital camera changed may be stored without particular distinction.
- In that case, however, it becomes impossible to display, for each icon included in the group selection screen SC1, the generation location (home or away from home) of the image data belonging to the group indicated by that icon.
- The USB cable has been described as an example of the connection means for connecting the digital camera and the PC, but a cable other than a USB cable (for example, an RS-232C cable) may be used.
- The number of digital cameras including the content generation device has been described as one, but there may be a plurality of digital cameras. In that case, the content generation device of each digital camera also transmits identification information of its own device, set in advance, when transmitting each piece of image data and the like, and the classification processing unit of the content classification device according to Embodiment 2 and each modification may perform the classification process for the content generation device of each digital camera based on the received identification information.
- The content storage unit 211 may be realized by a memory card (non-volatile memory) removable from the digital camera 400. A memory card reader may be connected to the PC 500, and each piece of image data may be acquired, using this reading device, from the memory card removed from the digital camera 400. In this case, the USB interface units (401, 501) are unnecessary.
- The transmission processing unit 213 of the content generation device 210 included in the digital camera 200 according to Embodiment 2 has been described as instructing the wireless communication unit 212 to transmit a connection request signal after receiving a user operation. However, upon receiving from the wireless communication unit 212 a notification that a beacon signal has been received, it may automatically instruct transmission of a connection request signal without waiting for a user operation, and may transmit each piece of image data and the like after a connection with the content classification device 300 has been established.
- Similarly, upon receiving from the USB interface unit 401 a notification that a connection with the PC according to each modification has been detected, the transmission processing unit 412 of the content generation device included in the digital camera according to each modification may transmit each piece of image data and the like without waiting for a user operation.
- The transmission processing unit 213 of the digital camera 200 has been described as managing, for each piece of image data stored in the content storage unit 211, whether or not transmission to the content classification device 300 has been completed. The digital camera 200 and the content classification device 300 may be modified so that this management is performed by the content classification device 300.
- In this case, the content classification device according to this modification requests the digital camera according to this modification to transmit a list of the data numbers of the image data stored in the content storage unit 211.
- Upon receiving this request, the digital camera according to this modification extracts each data number from the modified image information table and transmits the list of extracted data numbers to the content classification device according to this modification.
- Upon receiving the list of data numbers, the content classification device according to this modification obtains the latest data number in the image information table 10 stored in the data storage unit 302 and determines whether a data number equal to or later than the latest data number plus 1 is included in the received list.
- If such a number is included, untransmitted image data remains in the digital camera according to this modification.
- The content classification device therefore requests the digital camera according to this modification to transmit each piece of image data associated with a data number equal to or later than the latest data number plus 1.
- In response to this request, the digital camera transmits the corresponding image data (that is, each piece of untransmitted image data), the data numbers, and the record for each piece of image data in the modified image information table to the content classification device according to this modification.
- Upon receiving each piece of image data and the like, the content classification device according to this modification stores the received image data in the data storage unit 302 and registers each received record in the image information table 10, as described in Embodiment 2.
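The data-number exchange described in this modification amounts to a simple incremental synchronization keyed on monotonically increasing data numbers. A minimal sketch follows; the function and variable names are illustrative, not taken from the specification:

```python
def numbers_to_request(camera_numbers, device_latest):
    """Device side: given the camera's list of stored data numbers and the
    latest data number already registered in the image information table,
    return the data numbers that still need to be transmitted (every number
    from device_latest + 1 onward that the camera holds)."""
    return sorted(n for n in camera_numbers if n >= device_latest + 1)

def sync(camera_store, device_store):
    """One round of the request/transfer protocol: the device asks for the
    camera's number list, works out what is missing, and the camera sends
    the corresponding image data and records, which the device registers."""
    missing = numbers_to_request(camera_store.keys(), max(device_store, default=0))
    for n in missing:
        device_store[n] = camera_store[n]  # store image data and its record
    return missing
```

For example, if the device already holds data numbers 1 through 3 and the camera holds 1 through 5, a single `sync` round transfers only numbers 4 and 5.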
- Each functional component of the content generation and classification device 110 described in the embodiments realizes its function by cooperating with a processor included in the content generation and classification device 110; likewise, each functional component of the content classification devices 300, 510, and 552 realizes its function by cooperating with a processor included in the respective device.
- A program for causing a processor to execute the update process and the classification process described in the embodiments (see FIGS. 3, 4, and 12) can be recorded on a recording medium, or circulated and distributed via various communication paths.
- Such recording media include an IC card, an optical disc, a flexible disk, a ROM, a flash memory, and the like.
- The circulated and distributed program is used by being stored in a processor-readable memory or the like in a device, and each function of the content generation and classification device and the content classification device shown in the embodiments is realized by the processor executing the program.
- A content classification system according to one aspect of the present invention is a content classification system including a content generation device that sequentially generates content, comprising: detection means for repeatedly detecting whether the content generation device is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and classification means for performing a classification process that classifies two contents generated by the content generation device into different groups when a change has occurred in the state detected by the detection means between the respective generation times of the two contents, and into the same group when no change has occurred.
- With this system, two contents generated at an event A held at a place other than the predetermined position can be classified into the same group, while a content generated at event A and a content generated at an event B held at the predetermined position can be classified into different groups.
- According to this content classification system, by setting the predetermined position to a place where a user carrying the content generation device is likely to stop between two events, the likelihood of classifying contents on a per-event basis can be increased.
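The classification rule above can be sketched in a few lines; representing generation times and detected state-change times as sorted numbers is an assumption made for illustration:

```python
from bisect import bisect_left, bisect_right

def classify(generation_times, state_change_times):
    """Assign a group index to each content, in generation order.
    Two consecutive contents share a group exactly when no detected state
    change (present <-> absent) lies between their generation times;
    otherwise a new group starts."""
    changes = sorted(state_change_times)
    groups, gid = [], 0
    for i, t in enumerate(generation_times):
        if i > 0:
            prev = generation_times[i - 1]
            # number of state changes strictly between prev and t
            if bisect_left(changes, t) - bisect_right(changes, prev) > 0:
                gid += 1
        groups.append(gid)
    return groups
```

For instance, `classify([10, 20, 30, 40], [25])` yields `[0, 0, 1, 1]`: the single state change at time 25 splits the sequence into two event groups.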
- The classification means may perform the classification process on each pair of two contents generated consecutively by the content generation device.
- With this configuration, the contents generated during each period in which the content generation device remained in the same state, whether at the predetermined position or away from it (that is, during each event period), are classified into the same group.
- When the user of the content generation device participates in two events A and B held in sequence, it is generally considered highly likely that, after participating in event A and before participating in event B, the user will return home or go out again with the content generation device.
- Accordingly, when the user goes out with the content generation device, the contents generated at an event A held over several days away from home can be classified into the same group, while the contents generated at an event B held at home after the user returns with the content generation device, or held away from home after the user goes out again, can each be classified into a group different from the group for event A.
- The content generation device may comprise first communication means for transmitting a first radio signal receivable within a predetermined range, and content storage means for storing content information including generated content and generation time information of the content; the content classification system may further include a content classification device installed at the predetermined position, comprising the detection means and the classification means, second communication means capable of receiving the first radio signal, and acquisition means for acquiring each piece of content information stored in the content storage means; the detection means may detect the first state when the second communication means receives the first radio signal and the second state when it does not receive the first radio signal within a predetermined time; and the classification process by the classification means may be a process of managing each content constituting each piece of content information acquired by the acquisition means according to the group to which the content belongs.
- With this configuration, the content classification device detects whether the content generation device is in the first state or the second state depending on whether it receives the first radio signal, which is receivable only when the content generation device is present within a predetermined range of the content classification device.
- If this predetermined range is set to a relatively narrow range, the state of the content generation device can be detected appropriately, and the likelihood of classification on a per-event basis can be increased.
- The second communication means may further repeatedly transmit a second radio signal receivable within the predetermined range; the first communication means may further be capable of receiving the second radio signal and may transmit the first radio signal upon receiving the second radio signal; and the second communication means may measure the predetermined time with reference to the time at which it transmitted the second radio signal.
- With this configuration as well, by setting the predetermined range to a relatively narrow range, it is possible to appropriately detect whether the content generation device is in the first state or the second state and to increase the likelihood of classification on a per-event basis.
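The probe-and-timeout detection described here, transmit the second signal and then wait a fixed time for the first signal, can be sketched as follows. The callback interface is an assumption; real IC-tag readers expose their own APIs:

```python
import time

def detect_state(send_probe, poll_reply, timeout_s=1.0, poll_interval_s=0.01):
    """Transmit the probe (second radio signal) and wait for the reply
    (first radio signal), timing the predetermined period from the moment
    the probe was sent. Returns 'first' if a reply arrives in time (device
    present within range), otherwise 'second' (device absent)."""
    send_probe()
    sent_at = time.monotonic()  # reference point: the transmission time
    while time.monotonic() - sent_at < timeout_s:
        if poll_reply():
            return "first"
        time.sleep(poll_interval_s)
    return "second"
```

Timing from the transmission of the probe, rather than from an arbitrary clock tick, matches the claim language: the predetermined time is measured "with reference to the time at which the second radio signal was transmitted."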
- The first communication means may be an IC tag that repeatedly transmits, as the first radio signal, a signal including identification information of the content generation device, and the second communication means may be an IC tag reader.
- With this configuration, the detection of whether the content generation device is in the first state or the second state uses the existing mechanism of an IC tag and an IC tag reader, and can therefore be implemented relatively easily.
- The content generation device may comprise the detection means, the classification means, and receiving means for receiving a specific radio signal, and the detection means may perform the detection based on the radio signal received by the receiving means.
- With this configuration, the content generation device that generates the content also performs the classification, so everything from content generation to classification can be performed by a single device.
- The radio signal may be a signal transmitted from a GPS satellite; the content generation device may further comprise position storage means for storing position information indicating the latitude and longitude of the predetermined position, and position acquisition means for calculating, based on the radio signal received by the receiving means, the latitude and longitude indicating the position of the device itself; and the detection means may detect the first state when the difference between the latitude and longitude indicated by the position information stored in the position storage means and the latitude and longitude calculated by the position acquisition means is equal to or less than a predetermined value, and detect the second state when the difference is greater than the predetermined value.
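The latitude/longitude comparison in this GPS variant reduces to a distance threshold. A sketch using the haversine formula follows; the 100 m threshold is an illustrative choice, not a value from the specification:

```python
import math

def in_first_state(home, current, threshold_m=100.0):
    """Return True (first state) iff the great-circle distance between the
    stored home position and the GPS-derived current position, each given
    as (latitude, longitude) in degrees, is within the threshold;
    otherwise the device is in the second state."""
    (lat1, lon1), (lat2, lon2) = home, current
    r = 6371000.0  # mean Earth radius in metres
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= threshold_m
```

A great-circle distance is used here instead of comparing raw coordinate differences, since a fixed latitude/longitude delta corresponds to different ground distances at different latitudes.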
- The content generation device may further comprise content storage means for storing content information including generated content and generation time information of the content; the content classification system may include a content management device installed at the predetermined position, comprising transmission means for repeatedly transmitting the radio signal and acquisition means for acquiring each piece of content information stored in the content storage means; the radio signal may be receivable only within a predetermined range; the detection means may detect the first state when the receiving means receives the radio signal and the second state when the receiving means does not receive the radio signal within a predetermined time; and the classification process by the classification means may be a process of managing each content constituting each piece of content information acquired by the acquisition means according to the group to which the content belongs.
- With this configuration, the content generation device detects whether it is in the first state or the second state depending on whether it receives the radio signal, which is receivable only when the device is present within a predetermined range of the content management device.
- If this predetermined range is set to a relatively narrow range, the state of the content generation device can be detected appropriately, and the likelihood of classification on a per-event basis can be increased.
- The content classification system may include a content classification device installed at the predetermined position, comprising the detection means and the classification means, and the content generation device and the content classification device may each comprise connection means for wired connection with the other device.
- With this configuration, by using a short cable of about several meters for the wired connection, it is possible to appropriately detect whether the content generation device is in the first state or the second state and to increase the likelihood of classification on a per-event basis.
- The content generation device may be a digital camera that images a subject and generates image data, and the content classification device may be a computer installed at the home of a user who owns the digital camera.
- With this configuration, when the user goes out with the content generation device (digital camera), the image data captured at an event A held over several days away from home can be classified into one group, and when the user returns home with the content generation device, the image data captured at an event B held at home can be classified into a group different from the group for event A.
- The content classification system may include a content classification device installed at the predetermined position, comprising the detection means and the classification means; a plurality of the content generation devices may exist; and the detection means may perform the detection, and the classification means may perform the classification process on each pair, for each content generation device.
- With this configuration, by setting the predetermined position to a place where the users of the plurality of content generation devices are all likely to stop, the likelihood of classifying the contents generated by each content generation device on a per-event basis can be increased.
- For example, when the members of a family (father, mother, son, daughter) are the users of the respective content generation devices, the family's home may be set as the predetermined position; in a school setting, a classroom at the school may be set; and when the employees of a company are the users of the respective content generation devices, the office of the company may be set as the predetermined position.
- A content generation and classification device according to one aspect of the present invention comprises: content storage means for storing content; generation means for sequentially generating content and storing it in the content storage means; detection means for repeatedly detecting whether the device itself is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and classification means for performing a classification process that classifies two contents stored in the content storage means into different groups when a change has occurred in the state detected by the detection means between the respective generation times of the two contents, and into the same group when no change has occurred.
- With this device, two contents generated at an event A held at a place other than the predetermined position can be classified into the same group, while a content generated at event A and a content generated at an event B held at the predetermined position can be classified into different groups.
- According to this content generation and classification device, by setting the predetermined position to a place where the user who owns the device is likely to stop between two events, the likelihood of classification on a per-event basis can be increased.
- The content generation and classification device may further comprise position storage means for storing position information indicating the latitude and longitude of the predetermined position, and position calculation means for calculating, using GPS, the latitude and longitude indicating the position of the device itself, and the detection means may detect the first state when the difference between the latitude and longitude indicated by the position information stored in the position storage means and the latitude and longitude calculated by the position calculation means is equal to or less than a predetermined value, and detect the second state when the difference is greater than the predetermined value.
- With this configuration, whether the device is in the first state or the second state can be detected appropriately using the existing GPS mechanism, and the device can therefore be implemented relatively easily.
- A content classification device according to one aspect of the present invention comprises: content storage means for storing content sequentially generated by an external device; detection means for repeatedly detecting whether the external device is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and classification means for performing a classification process that classifies two contents generated by the external device into different groups when a change has occurred in the state detected by the detection means between the respective generation times of the two contents, and into the same group when no change has occurred.
- With this device, two contents generated at an event A held at a place other than the predetermined position can be classified into the same group, while a content generated at event A and a content generated at an event B held at the predetermined position can be classified into different groups.
- According to this content classification device, by setting the predetermined position to a place where a user carrying the external device is likely to stop between two events, the likelihood of classification on a per-event basis can be increased.
- The classification method according to the present invention is realized by, for example, the content generation and classification device 110 and the content classification devices 300, 510, and 552 described in the embodiments (see, in particular, the classification procedures shown in FIGS. 4 and 12).
- The content classification system, content generation and classification device, and content classification device according to the present invention are useful for automatically classifying a plurality of contents.
Abstract
Description
In Embodiment 1, a digital camera 100 including a content generation and classification device 110, which is one embodiment of the content generation and classification device according to the present invention, will be described.
First, the configuration of the digital camera 100 including the content generation and classification device 110 according to this embodiment will be described with reference to FIG. 1.
Next, the data used by the digital camera 100 will be described with reference to FIG. 2.
Next, the operation of the digital camera 100 will be described with reference to FIGS. 3 and 4.
FIG. 3 is a flowchart showing the update process for the take-out date/time information and the bring-in date/time information performed by the time information update unit 119.
FIG. 4 is a flowchart showing the classification process performed by the classification processing unit 120.
The operation of the digital camera 100 described above will be explained using the specific examples shown in FIGS. 5 to 8.
The image data (P1) generated by the imaging unit 114 and its generation date/time information (in this example, indicating "19:30 on February 21, 2010") are associated with each other by the imaging control unit 115 and stored in the storage unit 111 (step S11 in FIG. 4).
In this example, the user goes out with the digital camera 100 at time T1, so the detection unit 117 detects, based on the position information stored in the position storage unit 112 and the calculation result of the position calculation unit 116, that the device is in the second state of not being at home.
The image data (P2) generated by the imaging unit 114 and its generation date/time information (in this example, indicating "7:10 on February 22, 2010") are associated with each other by the imaging control unit 115 and stored in the storage unit 111 (step S11 in FIG. 4).
Image data P3 to P9 are processed in the same manner as in <Classification of image data P1> above, and image data P3 to P9 are classified into the same group as image data P2.
In this example, the user returns home with the digital camera 100 at time T2, so the detection unit 117 detects, based on the position information stored in the position storage unit 112 and the calculation result of the position calculation unit 116, that the device is in the first state of being at home.
The image data (P10) generated by the imaging unit 114 and its generation date/time information (in this example, indicating "5:10 on February 25, 2010") are associated with each other by the imaging control unit 115 and stored in the storage unit 111 (step S11 in FIG. 4).
In this example, the user goes out again with the digital camera 100 at time T3 (in this example, "6:20 on February 25, 2010"), so the same processing as in <Update process at time T1> above is performed.
Image data P11 is processed in the same manner as in <Classification of image data P2> above, and image data P11 is classified into a group different from that of image data P10.
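The walk-through above can be reproduced with a short script. The generation times are the ones given in the example; the exact times of T1 and T2 and the generation time of P11 are not stated, so the values below are assumptions placed consistently with the narrative:

```python
from datetime import datetime

# generation date/time of each image (from the example; P11 assumed)
shots = {
    "P1":  datetime(2010, 2, 21, 19, 30),
    "P2":  datetime(2010, 2, 22, 7, 10),
    "P9":  datetime(2010, 2, 24, 15, 10),
    "P10": datetime(2010, 2, 25, 5, 10),
    "P11": datetime(2010, 2, 25, 7, 0),   # assumed: some time after T3
}
# detected state changes: T1 and T2 are assumed to lie between the
# surrounding shots; T3 is given as 6:20 on February 25, 2010
changes = [
    datetime(2010, 2, 21, 22, 0),   # T1: went out (assumed time)
    datetime(2010, 2, 25, 0, 0),    # T2: came home (assumed time)
    datetime(2010, 2, 25, 6, 20),   # T3: went out again
]

def group(shots, changes):
    """Consecutive shots share a group unless a state change falls
    between their generation times (the rule of FIG. 4)."""
    names = sorted(shots, key=shots.get)
    gid, result = 0, {names[0]: 0}
    for prev, cur in zip(names, names[1:]):
        if any(shots[prev] < c < shots[cur] for c in changes):
            gid += 1
        result[cur] = gid
    return result

g = group(shots, changes)
# → P1 alone; P2 and P9 together; P10 alone; P11 alone
```

This reproduces the grouping described above: P1 by itself, P2 through P9 in one group, and P10 and P11 each in their own group.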
FIG. 9 shows a display example of the classification results.
Embodiment 1 described an example in which a single digital camera 100 performs everything from generation to classification of image data. Embodiment 2 describes a content classification system 1000, one embodiment of the content classification system according to the present invention, in which a device separate from the digital camera that generates image data (a content classification device) classifies the image data generated by the digital camera.
The configuration of the content classification system 1000 according to this embodiment will be described with reference to FIG. 10.
First, the configuration of the digital camera 200 will be described.
Next, the configuration of the content classification device 300 will be described.
Next, the data used by the content classification device 300 will be described with reference to FIG. 11.
Next, the operation of the content classification device 300 will be described.
The update process for the take-out date/time table 20 and the bring-in date/time table 30 performed by the time information update unit 307 is the same as the update process for the take-out and bring-in date/time information performed by the time information update unit 119 of Embodiment 1 (see FIG. 3), with the processing of steps S4 and S6 slightly modified.
FIG. 12 is a flowchart showing the classification process performed by the classification processing unit 308.
The classification process performed by the content classification device 300 described above will be explained along the flowchart of FIG. 12, using the specific examples shown in FIGS. 5, 7, 8, and 11.
In this example, unclassified image data P1 to P11 exist (step S21 in FIG. 12: YES), so the classification processing unit 308 reads from the image information table 10 the generation date/time information of image data P1, which has the oldest generation date/time among them (indicating "19:30 on February 21, 2010" in this example), and the generation date/time information of the image data generated immediately before it (indicating "18:10 on February 21, 2010" in this example) (steps S22 and S23).
In this example, unclassified image data P2 to P11 exist (step S21: YES), so the classification processing unit 308 reads from the image information table 10 the generation date/time information of image data P2, which has the oldest generation date/time among them (indicating "7:10 on February 22, 2010" in this example), and the generation date/time information of the image data generated immediately before it (indicating "19:30 on February 21, 2010" in this example) (steps S22 and S23).
Image data P3 to P9 are processed in the same manner as in <Classification of image data P1> above, and image data P3 to P9 are classified into the same group as image data P2.
In this example, unclassified image data P10 and P11 exist (step S21 in FIG. 12: YES), so the classification processing unit 308 reads from the image information table 10 the generation date/time information of image data P10, which has the oldest generation date/time among them (indicating "5:10 on February 25, 2010" in this example), and the generation date/time information of the image data generated immediately before it (indicating "15:10 on February 24, 2010" in this example) (steps S22 and S23).
Image data P11 is processed in the same manner as in <Classification of image data P2> above, and image data P11 is classified into a group different from that of image data P10.
Embodiment 2 described an example in which the content classification device 300, based on whether a response signal has been received from the digital camera 200, detects whether the digital camera 200 is in the first state of being at the user's home or the second state of not being at home.
The configuration of the content classification system 1100 according to Modification 1 will be described with reference to FIG. 13.
The update process for the take-out date/time table 20 and the bring-in date/time table 30 and the classification process performed by the classification unit 306 of the content classification device 510 are as described in Embodiment 2, and their description is therefore omitted.
Modification 1 described an example in which the PC 500, based on whether a UHF-band signal including the identification information of the digital camera 400 has been received, detects whether the digital camera 400 is in the first state of being at the user's home or the second state of not being at home.
The configuration of the content classification system 1200 according to Modification 2 will be described with reference to FIG. 14.
The update process for the take-out date/time table 20 and the bring-in date/time table 30 and the classification process performed by the classification unit 306 of the content classification device 552 are as described in Embodiment 2, and their description is therefore omitted.
The content classification system, content generation and classification device, and content classification device according to the present invention have been described above based on the embodiments and modifications (hereinafter also simply referred to as "the embodiments"); however, modifications such as the following are also possible, and the present invention is of course not limited to the content classification system, content generation and classification device, and content classification device exactly as shown in the embodiments above.
100, 200, 400, 450 Digital camera
101 Release button
102, 303 Display unit
103 Clock unit
110 Content generation and classification device
111 Storage unit
112 Position storage unit
113 Generation unit
114 Imaging unit
115 Imaging control unit
116 Position calculation unit
117, 304, 512, 553 Detection unit
118, 306 Classification unit
119, 307 Time information update unit
120, 308 Classification processing unit
210, 410, 451 Content generation device
211 Content storage unit
212, 301 Wireless communication unit
213, 412 Transmission processing unit
300, 510, 552 Content classification device
302 Data storage unit
305, 513 Acquisition unit
401, 501, 551 USB interface unit
411 IC tag
500, 550 PC
511 IC tag reader
1000, 1100, 1200 Content classification system
Claims (16)
1. A content classification system including a content generation device that sequentially generates content, the system comprising: detection means for repeatedly detecting whether the content generation device is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and classification means for performing a classification process that, when a change has occurred in the state detected by the detection means between the respective generation times of two contents generated by the content generation device, classifies the two contents into different groups, and, when no change has occurred in the detected state, classifies the two contents into the same group.

2. The content classification system according to claim 1, wherein the classification means performs the classification process on each pair of two contents generated consecutively by the content generation device.

3. The content classification system according to claim 2, wherein the content generation device comprises: first communication means for transmitting a first radio signal receivable within a predetermined range; and content storage means for storing content information composed of generated content and generation time information of the content; the content classification system further includes a content classification device installed at the predetermined position, comprising the detection means and the classification means, second communication means capable of receiving the first radio signal, and acquisition means for acquiring each piece of content information stored in the content storage means; the detection means detects the first state when the second communication means receives the first radio signal, and detects the second state when the second communication means does not receive the first radio signal within a predetermined time; and the classification process by the classification means is a process of managing each content constituting each piece of content information acquired by the acquisition means according to the group to which the content belongs.

4. The content classification system according to claim 3, wherein the second communication means further repeatedly transmits a second radio signal receivable within the predetermined range; the first communication means is further capable of receiving the second radio signal and transmits the first radio signal upon receiving the second radio signal; and the second communication means measures the predetermined time with reference to the time at which it transmitted the second radio signal.

5. The content classification system according to claim 3, wherein the first communication means is an IC tag that repeatedly transmits, as the first radio signal, a signal including identification information of the content generation device, and the second communication means is an IC tag reader.

6. The content classification system according to claim 2, wherein the content generation device comprises the detection means, the classification means, and receiving means for receiving a specific radio signal, and the detection means performs the detection based on the radio signal received by the receiving means.

7. The content classification system according to claim 6, wherein the radio signal is a signal transmitted from a GPS (Global Positioning System) satellite; the content generation device further comprises position storage means for storing position information indicating the latitude and longitude of the predetermined position, and position acquisition means for calculating, based on the radio signal received by the receiving means, the latitude and longitude indicating the position of the device itself; and the detection means detects the first state when the difference between the latitude and longitude indicated by the position information stored in the position storage means and the latitude and longitude calculated by the position acquisition means is equal to or less than a predetermined value, and detects the second state when the difference is greater than the predetermined value.

8. The content classification system according to claim 6, wherein the content generation device further comprises content storage means for storing content information composed of generated content and generation time information of the content; the content classification system includes a content management device installed at the predetermined position, comprising transmission means for repeatedly transmitting the radio signal and acquisition means for acquiring each piece of content information stored in the content storage means; the radio signal is receivable only within a predetermined range; the detection means detects the first state when the receiving means receives the radio signal, and detects the second state when the receiving means does not receive the radio signal within a predetermined time; and the classification process by the classification means is a process of managing each content constituting each piece of content information acquired by the acquisition means according to the group to which the content belongs.

9. The content classification system according to claim 2, wherein the content classification system includes a content classification device installed at the predetermined position, comprising the detection means and the classification means; the content generation device and the content classification device each comprise connection means for wired connection with the other device; and the detection means detects the first state when a wired connection by the connection means of its own device is detected, and detects the second state when a wired connection by the connection means of its own device is not detected within a predetermined time.

10. The content classification system according to claim 2, wherein the content generation device is a digital camera that images a subject and generates image data, and the content classification device is a computer installed at the home of a user who owns the digital camera.

11. The content classification system according to claim 2, wherein the content classification system includes a content classification device installed at the predetermined position, comprising the detection means and the classification means; a plurality of the content generation devices exist; and the detection means performs the detection, and the classification means performs the classification process on each pair, for each content generation device.

12. A content generation and classification device comprising: content storage means for storing content; generation means for sequentially generating content and storing it in the content storage means; detection means for repeatedly detecting whether the device itself is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and classification means for performing a classification process that, when a change has occurred in the state detected by the detection means between the respective generation times of two contents stored in the content storage means, classifies the two contents into different groups, and, when no change has occurred in the detected state, classifies the two contents into the same group.

13. The content generation and classification device according to claim 12, further comprising: position storage means for storing position information indicating the latitude and longitude of the predetermined position; and position calculation means for calculating, using GPS, the latitude and longitude indicating the position of the device itself, wherein the detection means detects the first state when the difference between the latitude and longitude indicated by the position information stored in the position storage means and the latitude and longitude calculated by the position calculation means is equal to or less than a predetermined value, and detects the second state when the difference is greater than the predetermined value.

14. A content classification device comprising: content storage means for storing content sequentially generated by an external device; detection means for repeatedly detecting whether the external device is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and classification means for performing a classification process that, when a change has occurred in the state detected by the detection means between the respective generation times of two contents generated by the external device, classifies the two contents into different groups, and, when no change has occurred in the detected state, classifies the two contents into the same group.

15. A classification method used in a content classification system including a content generation device that sequentially generates content, the method comprising: a detection step of repeatedly detecting whether the content generation device is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and a classification step of classifying two contents generated by the content generation device into different groups when a change has occurred in the state detected in the detection step between the respective generation times of the two contents, and into the same group when no change has occurred in the detected state.

16. A program for causing a computer in a content classification system including a content generation device that sequentially generates content to perform a classification process, the program comprising: a detection step of repeatedly detecting whether the content generation device is in a first state of being present at a predetermined position or in a second state of not being present at the predetermined position; and a classification step of classifying two contents generated by the content generation device into different groups when a change has occurred in the state detected in the detection step between the respective generation times of the two contents, and into the same group when no change has occurred in the detected state.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012518210A JP5512810B2 (ja) | 2010-05-31 | 2011-02-04 | コンテンツ分類システム、コンテンツ生成分類装置、コンテンツ分類装置、分類方法及びプログラム |
CN201180001290.6A CN102959540B (zh) | 2010-05-31 | 2011-02-04 | 内容分类***、内容生成分类装置、内容分类装置、分类方法 |
US13/144,155 US20120179641A1 (en) | 2010-05-31 | 2011-02-04 | Content classification system, content generation classification device, content classification device, classification method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-123798 | 2010-05-31 | ||
JP2010123798 | 2010-05-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011151946A1 true WO2011151946A1 (ja) | 2011-12-08 |
Family
ID=45066340
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000630 WO2011151946A1 (ja) | 2010-05-31 | 2011-02-04 | コンテンツ分類システム、コンテンツ生成分類装置、コンテンツ分類装置、分類方法及びプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20120179641A1 (ja) |
JP (1) | JP5512810B2 (ja) |
CN (1) | CN102959540B (ja) |
WO (1) | WO2011151946A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011165176A (ja) * | 2010-01-13 | 2011-08-25 | Canon Inc | 画像管理装置、画像管理装置の制御方法、及びプログラム |
JP2019164788A (ja) * | 2018-03-14 | 2019-09-26 | 株式会社Spectee | 情報処理装置、情報処理方法、プログラム及び画像情報表示システム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012247840A (ja) * | 2011-05-25 | 2012-12-13 | Sony Corp | 近隣人物特定装置、近隣人物特定方法、近隣人物特定プログラム及び近隣人物特定システム |
US10224026B2 (en) * | 2016-03-15 | 2019-03-05 | Sony Corporation | Electronic device, system, method and computer program |
US10447974B2 (en) * | 2017-03-13 | 2019-10-15 | Quanta Computer Inc. | System for determining device location data in a data center |
CN109683899B (zh) * | 2017-10-18 | 2022-04-08 | 中移(苏州)软件技术有限公司 | 一种软件集成方法及装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000217057A (ja) * | 1998-11-18 | 2000-08-04 | Casio Comput Co Ltd | 撮影画像検索装置、電子カメラ装置及び撮影画像検索方法 |
JP2005174060A (ja) * | 2003-12-12 | 2005-06-30 | Matsushita Electric Ind Co Ltd | 画像分類装置ならびに画像分類方法およびプログラム |
JP2006344005A (ja) * | 2005-06-09 | 2006-12-21 | Sony Corp | 情報処理装置、情報処理方法、およびプログラム |
JP2008250605A (ja) * | 2007-03-30 | 2008-10-16 | Sony Corp | コンテンツ管理装置、画像表示装置、撮像装置、および、これらにおける処理方法ならびに当該方法をコンピュータに実行させるプログラム |
JP2009518704A (ja) * | 2005-11-22 | 2009-05-07 | イーストマン コダック カンパニー | 地図分類方法及び地図分類システム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3982605B2 (ja) * | 2000-09-29 | 2007-09-26 | カシオ計算機株式会社 | 撮影画像管理装置、撮影画像管理方法及び撮影画像管理プログラム |
TW595145B (en) * | 2003-03-21 | 2004-06-21 | Benq Corp | Method and related apparatus for reducing cell phone transmission power consumption by longer discrete receiving time interval |
US7945535B2 (en) * | 2004-12-13 | 2011-05-17 | Microsoft Corporation | Automatic publishing of digital content |
US7647248B2 (en) * | 2004-12-17 | 2010-01-12 | International Business Machines Corporation | Shopping environment including detection of unpaid items in proximity to an exit |
JP4086041B2 (ja) * | 2005-01-27 | 2008-05-14 | コニカミノルタオプト株式会社 | 撮像装置 |
JP2007241377A (ja) * | 2006-03-06 | 2007-09-20 | Sony Corp | 検索システム、撮像装置、データ保存装置、情報処理装置、撮像画像処理方法、情報処理方法、プログラム |
JP2007258856A (ja) * | 2006-03-22 | 2007-10-04 | Hitachi Ltd | 携帯端末および情報処理装置、データ送受信システム。 |
JP4922848B2 (ja) * | 2007-06-28 | 2012-04-25 | オリンパス株式会社 | 携帯端末装置及びネットワーク接続制御方法 |
2011
- 2011-02-04 US US13/144,155 patent/US20120179641A1/en not_active Abandoned
- 2011-02-04 CN CN201180001290.6A patent/CN102959540B/zh active Active
- 2011-02-04 JP JP2012518210A patent/JP5512810B2/ja not_active Expired - Fee Related
- 2011-02-04 WO PCT/JP2011/000630 patent/WO2011151946A1/ja active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011165176A (ja) * | 2010-01-13 | 2011-08-25 | Canon Inc | Image management apparatus, control method of image management apparatus, and program |
JP2019164788A (ja) * | 2018-03-14 | 2019-09-26 | 株式会社Spectee | Information processing device, information processing method, program, and image information display system |
Also Published As
Publication number | Publication date |
---|---|
CN102959540A (zh) | 2013-03-06 |
US20120179641A1 (en) | 2012-07-12 |
JP5512810B2 (ja) | 2014-06-04 |
JPWO2011151946A1 (ja) | 2013-07-25 |
CN102959540B (zh) | 2017-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5512810B2 (ja) | Content classification system, content generation/classification device, content classification device, classification method, and program | |
CN1835569B (zh) | Time-shifted image distribution system, time-shifted image distribution method, time-shifted image request device, and image server | |
US9196307B2 (en) | Geo-location video archive system and method | |
US20070228159A1 (en) | Inquiry system, imaging device, inquiry device, information processing method, and program thereof | |
US9092456B2 (en) | Method and system for reconstructing image having high resolution | |
EP3321880B1 (en) | Monitoring system, photography-side device, and verification-side device | |
CN102077581A (zh) | Data receiving device, data transmitting device, and control method and program therefor | |
JP2012247841A (ja) | Nearby-person identification device, nearby-person identification method, nearby-person identification program, and nearby-person identification system | |
JP2012247840A (ja) | Nearby-person identification device, nearby-person identification method, nearby-person identification program, and nearby-person identification system | |
EP2213979A1 (en) | Map display device, map display method, and imaging device | |
US20230179811A1 (en) | Information processing apparatus, information processing method, imaging apparatus, and image transfer system | |
CN102209177A (zh) | Transmission device, transmission method, and program | |
US10630894B2 (en) | Notification system, wearable device, information processing apparatus, control method thereof, and computer-readable storage medium | |
CN111566639A (zh) | Image classification method and device | |
US8502874B2 (en) | Image recording apparatus and control method | |
JP2012084052A (ja) | Imaging device, control method, and program | |
CN103546677A (zh) | Interactive system and interactive method | |
CN106326328A (zh) | Image transmission system, image transmission device, and image transmission method | |
JP2009265850A (ja) | Data transfer method, data transmitting device, data receiving device, and computer program | |
JP5181616B2 (ja) | Information provision request system and information provision request method | |
JP6809558B2 (ja) | Server and billing method | |
WO2024067428A1 (zh) | High-resolution, high-frame-rate imaging method and image processing device | |
JP2008245108A (ja) | Digital camera | |
JP2003281255A (ja) | Staff dispatch method and program | |
KR20120045559A (ko) | Electronic auction history recording system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201180001290.6; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 13144155; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2012518210; Country of ref document: JP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11789361; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 11789361; Country of ref document: EP; Kind code of ref document: A1 |