HK1184245B - Detecting recurring events in consumer image collections - Google Patents
Description
Technical Field
The present invention relates generally to the field of digital image processing and, more particularly, to a method of identifying a set of digital images describing a recurring event in a user image collection.
Background
With the popularity of digital cameras and camera phones, people capture large numbers of images and videos to record events important to them. Interesting portions of these events are then shared online for access by the user's social network. Large collections of digital media that accumulate over time contain rich information that can be used to learn about individual users and groups of people. Time information is often advantageous for the management and retrieval of information, and enhances search and browsing applications. Analyzing the content and timing of a user's media in a collection spanning years can reveal important dates and occasions of interest to the user. This knowledge can enable the organization of personal collections shared with contacts, as well as personalized and timely advertising. For example, if evidence from a user's personal photo collection indicates that he/she regularly vacations during a school break in March, the images in that group can be linked to vacations of previous years for appropriate organization. Travel and travel-related advertisements can be targeted during the planning stages of this time period, and these images can be shared with the contacts with whom the user often shares such images.
Attempts to identify events using a generic calendar of important dates can detect only a limited number of events and cannot detect user-specific special dates (e.g., a birthday). In addition, such methods assume that all users in a region celebrate the same holidays, when in fact different calendars would be required for different populations. Beyond calendar differences due to culture, the user's location also contributes local events to the calendar, such as the Lilac Festival of Rochester, N.Y., or the International Balloon Fiesta of Albuquerque, N.M. To address these problems, there has been work on associating images taken by users with their personal calendars (e.g., "Image Annotation Using Personal Calendars as Context", Gallagher et al., ACM Intl. Conf. on Multimedia 2008). However, the entries on a personal calendar typically relate to appointments and work tasks unrelated to picture-taking.
There has been work on grouping images into events. U.S. Pat. No. 6,606,411 to Loui and U.S. Pat. No. 6,351,556 to Loui disclose algorithms for clustering image content by temporal events and sub-events. According to U.S. Pat. No. 6,606,411, an event has a consistent color distribution, and therefore its photographs are likely to have been taken with the same backdrop. For each sub-event, a single color and texture representation is computed for all background areas taken together. The above two patents teach how to cluster images and videos in a digital image collection into temporal events and sub-events. The terms "event" and "sub-event" are used in an objective sense to indicate the products of a computer-mediated procedure that attempts to match a user's subjective perception of specific occurrences (corresponding to events) and divisions of those occurrences (corresponding to sub-events). Another method of automatically organizing images into events is disclosed in U.S. Pat. No. 6,915,011 to Loui et al. The detected events are arranged chronologically on a timeline from earliest to latest.
Using the above approaches, the amount of browsing required by a user to locate a particular event can be reduced by viewing representations of events along the timeline rather than thumbnails of individual images. However, because related events (e.g., successive birthdays) are separated by long time intervals, their event groups lie far apart on the timeline and are not easily viewed as a group. Therefore, there is a need to detect groups of images that are semantically related to each other but separated by long time differences.
Disclosure of Invention
According to the present invention there is provided a method of detecting recurring events in a collection of digital images taken over a predetermined period of time, comprising using a processor to:
(a) analyze the digital image collection to produce a multi-dimensional representation of the distribution of image capture activity over time; and
(b) detect recurring events by identifying spatial clusters in the multi-dimensional representation. The similarity between events may also be considered in the clustering process.
In the present invention, an architecture is described that mines recurring events from a user's collection spanning multiple years. The collection is described in terms of "events" represented in a suitable multi-dimensional space. Candidate groups are filtered based on event characteristics, using density-based clustering at different neighborhood sizes, to reduce the number of false matches within a group. An event signature based on event category, location, and temporal features is created to characterize each event. The present invention detects individual special dates such as birthdays and anniversaries, seasonal events, and celebrated holidays, customized to the user's personal collection.
For example, the invention is applicable to two commonly occurring types of calendar-based recurring events in a user's collection: events that typically occur near the same date each year, such as birthdays, anniversaries, and some holidays; and events that are not strictly tied to a calendar date. Although events of the second type also have similar temporal characteristics, their exact dates are usually not the same year after year. These include holidays that do not fall on an exact date, such as those held on a particular weekday of a particular week of a month (e.g., Labor Day in the United States; Mother's Day), and those determined by lunar or solar calendar computations (e.g., many Asian religious and cultural holidays). In addition, there are regular vacations (e.g., school holidays), parties and gatherings (celebrating calendar-based events but moved to a convenient weekend rather than held on the exact date), and sporting events, which are also of the type not strictly tied to a calendar or to a particular date.
The organization and retrieval of images and videos is a problem for the typical user. It is beneficial for users to be able to browse an overview of the important events in their collection. Techniques disclosed in the prior art allow images in a collection to be classified into events, but cannot correlate related events when they are separated in time. The invention enables the efficient detection of recurring events that typically occur around the same date each year, as well as events that are not strictly tied to calendar dates. This includes individual special dates customized to the user whose collection is being analyzed, such as birthdays and anniversaries, seasonal activities, celebrated holidays, and the like.
Drawings
FIG. 1 is a block diagram of a system embodying the present invention;
FIG. 2 is a general flow diagram of the method of the present invention;
FIG. 3 is a more detailed flow diagram of event signature generation shown in block 130 of FIG. 2;
FIG. 4 illustrates one particular example of a two-dimensional representation of an event generated in accordance with the present invention;
FIG. 5 illustrates one particular example of a three-dimensional representation of an event generated in accordance with the present invention;
fig. 6A and 6B illustrate two examples of displaying an organized collection showing groups of recurring events detected in the collection.
Detailed Description
The present invention may be implemented in computer systems known to those skilled in the art. In the following description, some embodiments of the present invention will be described as software programs. Those skilled in the art will readily recognize that the equivalent of this method can also be constructed as hardware or software within the scope of the present invention.
Since image processing algorithms and systems are known, the present description will be directed in particular to algorithms and systems being part of, or cooperating more directly with, the method in accordance with the present invention. Other aspects of the algorithms and systems, and the hardware or software for generating and processing the image signals included therein, not specifically shown or described herein, may be selected from systems, algorithms, components, and elements known in the art. All software implementations therein are conventional and do not go beyond the routine skill in the art in view of the description set forth in the specification that follows.
The present invention may be implemented in computer hardware and computerized equipment. For example, the method may be performed on digital cameras, multimedia smartphones, digital printers, internet servers, kiosks, and personal computers. Referring to FIG. 1, there is shown a computer system for implementing the present invention. While a computer system is shown for purposes of illustrating the preferred embodiment, the invention is not limited to the computer system shown, but may be used in any electronic processing system such as found in a digital camera, home computer, kiosk, or any other system for processing digital images. The computer 10 includes a microprocessor-based unit 20 (also referred to herein as a processor) for receiving and processing software programs and for performing other processing functions. The memory unit 30 stores user-provided and computer-generated data that can be accessed by the processor 20 when running a computer program. A display device (e.g., a monitor) 70 is electrically connected to the computer 10 and displays information and data associated with the software through, for example, a graphical user interface. A keyboard 60 is also connected to the computer 10. As an alternative to using a keyboard for input, a mouse may also be used to move a selector on the display device 70 and to select the option covered by the selector, as is known in the art. An input device 50, such as a Compact Disc (CD) and DVD, may be embedded in the computer 10 for inputting software programs and other information into the computer 10 and the processor 20. Additionally, computer 10 may be programmed to store software programs internally, as is known in the art. In addition, media files (e.g., images, music, video, etc.) 
may be transferred to the memory unit 30 of the computer 10 by using an input device 50 such as a memory card, a flash drive, a CD, and a DVD, or by directly connecting a camera (e.g., a camera, a cell phone, a video recorder, etc.) to the computer 10 as an input device. Computer 10 may have a network connection, such as a telephone line or wireless connection 80, to an external network, such as a local area network or the internet. Software programs and media files may be transferred to computer 10 from other computers or networks through a network connection.
It should also be noted that the present invention can be implemented as a combination of software or hardware and is not limited to devices that are physically connected or located at the same physical location. One or more of the devices shown in fig. 1 may be remotely located and connected via a network. One or more of the devices may be connected in a wireless manner, for example by a radio frequency link, either directly or via a network.
The invention has been described in detail with particular reference to certain preferred embodiments thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention. In the detailed description that follows, the term "image" also includes video in the collection.
Referring to FIG. 2, a digital image collection 110 spanning years of a user's life is stored in the memory unit 30 of the computer 10. In a preferred embodiment, the digital image collection 110 spans a time of at least 5 years. The other modules in the figure are implemented as software programs and executed by the processor 20 of the computer 10. The digital image collection 110 is provided to an event clustering algorithm 120, which groups the images in the digital image collection 110 into temporal events. In a preferred embodiment, the event and sub-event detectors described in U.S. Pat. No. 6,606,411 to Loui and U.S. Pat. No. 6,351,556 to Loui are used. According to U.S. Pat. No. 6,606,411, an event has a consistent color distribution, and therefore its photographs are likely to have been taken with the same backdrop. For each sub-event, a single color and texture representation is computed for all background areas taken together. The above two patents teach how to cluster images and videos in a digital image collection into temporal events and sub-events. The terms "event" and "sub-event" are used in an objective sense to indicate the products of a computer-mediated procedure that attempts to match a user's subjective perception of specific occurrences (corresponding to events) and divisions of those occurrences (corresponding to sub-events). Briefly summarized, a collection of images is classified into one or more events by determining one or more largest time differences based on time and/or date clustering of the images, and segmenting the images into events at one or more boundaries, wherein the one or more boundaries correspond to the one or more largest time differences. For each event, sub-events (if any) can be determined by comparing the color histogram information of successive images, as described in U.S. Pat. No. 6,351,556.
This is achieved by dividing the image into a number of blocks and then calculating the color histogram for each block. Sub-event boundaries are detected using a block-based histogram correlation procedure, as described in U.S. patent No. 6,351,556 to Loui. Another method of automatically organizing images into events is disclosed in U.S. patent No. 6,915,011 to Loui et al, which is incorporated herein by reference. Briefly summarized, according to one aspect of the invention described above, an event clustering method uses foreground and background segmentation for clustering images from a group into similar events. First, each image is divided into a plurality of blocks, thereby providing a block-based image. Using block-by-block comparison, each block-based image is segmented into a plurality of regions including at least a foreground and a background. One or more luminance, color, position or size features are extracted from the regions, and the extracted features are used to assess and compare the similarity of regions comprising foreground and background in successive images in the group. Subsequently, a measure of the overall similarity between successive images is calculated, providing image distances between successive images, and event clusters are demarcated according to the image distances.
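The block-based histogram comparison described above can be sketched as follows. This is a minimal illustration in Python assuming RGB images as NumPy arrays; the block count, bin count, and the histogram-intersection similarity measure are illustrative choices, not the patented parameters:

```python
import numpy as np

def block_histograms(image, blocks=4, bins=8):
    """Divide an H x W x 3 uint8 image into blocks x blocks tiles and
    compute a normalized 3-D color histogram for each tile."""
    h, w, _ = image.shape
    hists = []
    for i in range(blocks):
        for j in range(blocks):
            tile = image[i * h // blocks:(i + 1) * h // blocks,
                         j * w // blocks:(j + 1) * w // blocks]
            hist, _ = np.histogramdd(tile.reshape(-1, 3).astype(float),
                                     bins=(bins, bins, bins),
                                     range=((0, 256), (0, 256), (0, 256)))
            hists.append(hist.ravel() / hist.sum())
    return hists

def histogram_similarity(hists_a, hists_b):
    """Mean histogram intersection over corresponding blocks
    (1.0 = identical color content, 0.0 = no overlap)."""
    return float(np.mean([np.minimum(ha, hb).sum()
                          for ha, hb in zip(hists_a, hists_b)]))
```

A sub-event boundary would then be declared between two consecutive images whose similarity falls below a chosen threshold; the threshold value itself is not specified here.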
Referring to FIG. 2, the events detected by the event clustering algorithm 120 are represented in a multi-dimensional space 140. In one embodiment, a two-dimensional space is used, as shown in FIG. 4. Referring to FIG. 4, each event forms an event point 310 (also referred to in this application in the plural as event points) in a space defined by the year number on the y-axis 320 and the day of the year on the x-axis 330. Years are simply numbered in chronological order; e.g., if the collection spans 2005-2010, 2005 corresponds to year 1, 2006 to year 2, and so on, with 2010 corresponding to year 6. Days of the year are counted starting with January 1 as day 1. If February 29 occurs in a year, it is typically ignored in the count, so that dates in successive years correspond to the same day of the year. Representing events in this two-dimensional space places potentially recurring events in close spatial proximity to each other, so a spatial clustering approach can be used to find groups of events. Events may also be represented in a multi-dimensional space with dimension n greater than 2. Other event characteristics, such as event type or event size, can be used as additional axes to further position events spatially by their characteristics. In another embodiment, the week of the year and the day of the week are used as the x-axis and y-axis, respectively. FIG. 5 illustrates an example of an embodiment that creates a three-dimensional representation by using the year as the z-axis. Weeks of the year are numbered starting with the first week of the year as week 1; since the first and last weeks may be partial weeks, the last week of a year can be week 53. Days of the week are numbered sequentially from -3 to +3 (including 0), starting on Monday.
Such a representation is useful for detecting recurring events tied to days of the week, such as school sports tournaments, weekly gatherings, Easter, Thanksgiving, and other holidays.
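The mapping from a capture date to the two coordinate systems above can be sketched as follows; a minimal Python illustration, where the leap-day handling and the -3..+3 day-of-week numbering follow the text, and the use of ISO week numbers is an assumption:

```python
from datetime import date

def is_leap(year):
    """Gregorian leap-year test."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def event_point(d, first_year):
    """Map a capture date to (day_of_year, year_number), ignoring
    Feb 29 so the same calendar date aligns across years; years are
    numbered 1, 2, ... from the first year of the collection."""
    doy = d.timetuple().tm_yday
    if is_leap(d.year):
        if (d.month, d.day) == (2, 29):
            doy -= 1          # treat Feb 29 as Feb 28
        elif d.month > 2:
            doy -= 1          # shift post-February dates back by one
    return (doy, d.year - first_year + 1)

def event_point_weekly(d):
    """Alternative axes: (week of year, day of week), with days
    numbered -3 (Monday) through +3 (Sunday), 0 being Thursday."""
    _, iso_week, iso_weekday = d.isocalendar()  # weekday: Mon=1..Sun=7
    return (iso_week, iso_weekday - 4)
```

With these coordinates, a birthday celebrated every March 1 lands on the same x value in every row of the (day, year) plane, which is what makes the spatial clustering of the next step possible.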
Referring to fig. 2, spatial clustering 150 is performed on the event representations in the multi-dimensional space generated in step 140. In a preferred embodiment, a density-based clustering method (Data Mining: Concepts and Techniques, Han and Kamber, Elsevier, 2006, pp. 418-420) is used to generate spatial clusters. The algorithm grows regions of sufficiently high point density into clusters. In our embodiment, the neighborhood around any given central event point (x, y) is defined as (x ± 2, y ± 2) to detect events closely tied to the calendar date. Core points having more than a threshold number of points (5 points in this embodiment) in their neighborhood are identified. The density-based clustering algorithm iteratively collects the points that are directly density-reachable from these core points, terminating when no new points can be added. To detect recurring events that are not strictly tied to the calendar, a larger neighborhood (x ± 7, y ± 7) is selected around the central event point, with the same threshold (5 points) used to define core points. However, only event points 310 that pass the event signature filtering described in the next paragraph are counted in the neighborhood of any given event point 310.
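The density-based clustering step can be sketched in a DBSCAN-like form. This is a simplified illustration assuming unique (x, y) tuples and a rectangular neighborhood; the dx, dy, and min_pts defaults mirror the calendar-locked parameters in the text, and the threshold is treated here as "at least min_pts" rather than "more than":

```python
def density_cluster(points, dx=2, dy=2, min_pts=5):
    """DBSCAN-style clustering over (x, y) event points using a
    rectangular neighborhood (x +/- dx, y +/- dy). Returns a list of
    clusters, each a sorted list of points; sparse points are dropped."""
    def neighbors(p):
        return [q for q in points
                if abs(q[0] - p[0]) <= dx and abs(q[1] - p[1]) <= dy]

    visited, clusters = set(), []
    for p in points:
        if p in visited:
            continue
        if len(neighbors(p)) < min_pts:   # not a core point; may still
            continue                      # join a cluster via expansion
        cluster, frontier = set(), [p]
        while frontier:                   # grow region from core points
            q = frontier.pop()
            if q in visited:
                continue
            visited.add(q)
            cluster.add(q)
            q_nbrs = neighbors(q)
            if len(q_nbrs) >= min_pts:    # q is itself a core point
                frontier.extend(q_nbrs)
        clusters.append(sorted(cluster))
    return clusters
```

The event-signature filter of the next paragraph would plug into `neighbors` by discarding candidate points whose signatures do not match the center point.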
Referring to fig. 2, filtering based on event signatures 130 can be used to refine the spatial clustering 150. This additional step is particularly beneficial when larger neighborhoods are used or when recurring events within a single year are being detected. The event signatures 130 are used as filters to determine whether points can be considered to be in the neighborhood of any given central event point 310. The event signatures 130 capture the commonality of features between events and can be derived from content-based analysis at the image level and/or from analysis at the event level. In one embodiment, three main features obtained at the event level are used to perform the event signature-based filtering shown in FIG. 3: day of week, event category, and location, which show good correlation among events from the same recurring group.
Referring to FIG. 3, while the neighborhood of a central event point 205 is being considered, the other event points 310 in its neighborhood 210 are processed one by one as follows. The event category match 220 determines whether the potential neighboring event point 210 has the same event category label as the central point 205. In a preferred embodiment, the method described in "Event Classification in Personal Image Collections", Das and Loui, Workshop on Media Information Analysis for Personal and Social Applications, IEEE Intl. Conf. ICME 2009, is used to assign each event a broad category ("vacation", "party", "sports", or "family time"). In this approach, a variety of high-level visual and temporal features that exhibit good correlation with event type are used to train a Bayesian belief network for event classification, which computes the posterior probability of an event category given the input features.
The location matching module 230 checks whether the potential neighboring event point 210 is co-located with the central event point 205. The location where an event occurs is an important factor in determining whether it forms a recurring group with other events; many recurring groups contain events that occur at the same location. In the absence of GPS information, event locations are matched using SIFT features as described in "Event-based location matching for Consumer Image Collections", Das et al., Proceedings of the ACM Intl. Conf. In this method, events in a user's image collection are matched by location by automatically matching distinctive objects appearing in the scene using SIFT features. Using this approach, if there is a positive scene match between two events, their locations are considered matched. It should be noted that if two events cannot be matched using the scene-based approach mentioned above, this does not mean that the events could not have been captured at the same location: the events may fail to match simply because no unique or distinctive object was captured in the images of both events. However, a positive match does explicitly indicate that the events were captured at the same location. When GPS-based locations are available, they are used to determine whether two events occurred at the same location. However, even in this case, a negative match does not exclude the possibility of the events belonging to the same recurring group. Users may make periodic trips to a particular area, forming a recurring group, yet visit different specific locations within that area; the area associated with a recurring group may be very broad, such as Florida during spring break.
In contrast, users may distinguish event locations at a finer granularity than the town they are in; e.g., a user may consider "school" a different location than "home" even though both are in the same town. In some cases, the location information may simply be irrelevant. For example, birthday parties are usually celebrated in the user's hometown, but some may be held at home while others are held at some particular venue. Thus, only positive matches are included in the event signature comparison.
The day of the week is used as part of the event signature-based filter because, from studying users' media collections, it was found that there is significant correlation between members of the same recurring event group and the day of the week on which the events occur; e.g., events from the same group may all occur on Sundays. Many holidays depend on the day of the week, e.g., Easter and Thanksgiving (U.S.). Typically, there is more photographic activity near or during the weekend. Given the distribution of such events, Friday through Monday are kept as distinct labels, while Tuesday through Thursday are merged into a single "workday" label. The day-of-week match 240 determines whether two events have the same day-of-week label as described above. For multi-day events, any overlap of days of the week is considered a match.
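The day-of-week labeling and matching just described can be sketched as follows; a minimal Python illustration, where the label names and the use of Python's Monday=0 weekday convention are illustrative assumptions:

```python
def day_of_week_label(weekday):
    """Map a weekday index (Mon=0 .. Sun=6, as in date.weekday()) to a
    filter label: Friday through Monday stay distinct, while Tuesday
    through Thursday collapse into a single 'workday' label."""
    names = ["monday", "tuesday", "wednesday", "thursday",
             "friday", "saturday", "sunday"]
    return "workday" if 1 <= weekday <= 3 else names[weekday]

def day_of_week_match(days_a, days_b):
    """Two (possibly multi-day) events match if any of their
    day-of-week labels overlap."""
    return bool({day_of_week_label(d) for d in days_a} &
                {day_of_week_label(d) for d in days_b})
```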
Features derived from content-based analysis of the images in an event may also be included in the event signature. One example is person-based matching, where existing face recognition technology (e.g., Omron's "OKAO Vision" face detection technology) is used to determine whether a common person appears in both events. Matching a common object may provide another matching criterion. A common scene classification of the images in both events (e.g., beach, urban scene, or field) may also be used as a matching criterion.
The event signature comparison module 250 produces the final determination of whether the potential neighboring event point 210 should be considered to be in the neighborhood of the central event point 205. The features included in the event signature 130 are not combined into a single value, because a mismatch may be meaningless in a given context: for any of the three features 220, 230, 240 discussed above, a mismatch is not necessarily significant, whereas a positive match is meaningful and is counted. A positive match on any of the three features is assigned equal weight. For example, two events that occur on the same day of the week, have the same event category, and have the same location receive an event signature match score of 3, while two events that occur on the same day of the week but have different event categories and no location match receive a score of only 1. Events with a score of at least 1 pass the event signature filtering process. Thus, for any given event, the points considered to be in its neighborhood are those that occur on the same day of the week, or have the same event category, or were photographed at the same location, within the given time window.
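The scoring rule above can be sketched as follows; a minimal Python illustration where the dictionary keys are hypothetical field names, `location` is `None` when no positive location match could be established, and only positive matches add to the score, as the text specifies:

```python
def signature_score(center, candidate):
    """Sum of equally weighted positive matches over the three
    event-level features: category, location, and day-of-week label.
    Mismatches (and unknown locations) contribute nothing."""
    score = 0
    if center["category"] == candidate["category"]:
        score += 1
    if (center["location"] is not None and
            center["location"] == candidate["location"]):
        score += 1
    if center["day_of_week"] == candidate["day_of_week"]:
        score += 1
    return score

def passes_filter(center, candidate):
    """An event passes the signature filter with a score of at least 1."""
    return signature_score(center, candidate) >= 1
```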
The clusters resulting from the spatial clustering process 150 are output as the recurring events 160 detected in the collection over multiple years. These recurring events are interpreted according to the coordinate axes used in the multi-dimensional representation. The images belonging to each recurring event are indexed so that they are linked to the other images in the group. The recurring events are displayed to the user in a view of the organized multi-year collection. An event may be represented by a representative image or by a collection of images from the event. Referring to figs. 6A and 6B, two common organized views are shown: the timeline view 440 of fig. 6A and the calendar view 450 of fig. 6B. Non-recurring events 420 are displayed on the timeline and in the calendar according to the dates on which they occurred. Recurring events 400 are presented with icons 425, 430 that link to the previous and next events of the recurring group. For example, a person's 2010 birthday event would be linked to their birthdays in 2009 and 2011. This form allows the user to conveniently access related events that are separated by large time differences.
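The previous/next linking within a recurring group can be sketched as follows; a minimal Python illustration, where representing each event as a (year_index, event_id) pair is a hypothetical layout chosen for the example:

```python
def link_recurring(group):
    """Given the events of one recurring group as (year_index, event_id)
    pairs, link each occurrence to the previous and next occurrence in
    chronological order (None at the ends of the group)."""
    ordered = sorted(group)
    return {ev: {"prev": ordered[i - 1] if i > 0 else None,
                 "next": ordered[i + 1] if i + 1 < len(ordered) else None}
            for i, ev in enumerate(ordered)}
```

A display layer could then render each event's icons 425 and 430 from the "prev" and "next" entries.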
Parts list
10 computer
20 processor
30 memory
50 input device
60 keyboard
70 monitor
80 network connection
110 digital image collection spanning years
120 event clustering module
130 event signature based filtering module
140 step of presenting events in an n-dimensional space
150 spatial clustering module
160 detected recurring events
205 central event point
210 potential neighboring event point
220 event category matching step
230 position matching step
240 day-of-week matching step
250 event signature comparison step
310 event point
320 years as y-axis
330 day of the year as the x-axis
400 recurring event
420 non-recurring event
425 icon representing the previous event in the recurring group
430 icon representing the next event in the recurring group
440 timeline view of the organized collection
450 calendar view of the organized collection
Claims (20)
1. A method of detecting recurring events in a digital image collection taken over a predetermined period of time, comprising using a processor to:
(a) analyzing a digital image collection to produce a multi-dimensional representation of the distribution of image capture activity over time by:
(a1) identifying events in the digital image collection based at least in part on content of the digital images, each event including at least one digital image and each digital image having capture time metadata; and
(a2) characterizing the identified events in a multi-dimensional representation based at least in part on a distribution of capture times of the digital images of the identified events, the multi-dimensional representation comprising a representation having a first time axis and a second time axis different from the first time axis; and
(b) detecting recurring events by identifying spatial clusters in the multi-dimensional representation.
2. The method of claim 1, wherein the content comprises color content of digital images in a digital image collection.
3. The method of claim 1, wherein the predetermined period of time comprises a plurality of years, and wherein operation (b) comprises identifying spatial clusters that span the plurality of years.
4. The method of claim 1, wherein the predetermined period of time comprises one year or less, and wherein operation (b) comprises identifying spatial clusters that occur within the predetermined period of time.
5. The method of claim 1, wherein the first time axis is a day-of-the-year axis and the second time axis is a year axis.
6. The method of claim 1, wherein the first time axis is a week-of-the-year axis and the second time axis is a day-of-the-week axis.
7. The method of claim 1, wherein detecting recurring events by identifying spatial clusters in the multi-dimensional representation comprises applying an event signature-based filter to at least one identified event.
8. The method of claim 2, wherein identifying an event based at least in part on content of a digital image comprises:
dividing a first digital image of the digital images into a plurality of blocks;
determining a color histogram for at least some of the blocks of the first digital image;
dividing a second digital image of the digital images into a plurality of blocks, wherein the capture time of the second digital image immediately follows the capture time of the first digital image;
determining a color histogram for each of the blocks of the second digital image that correspond to the blocks in the first digital image; and
the color histogram of the block of the first digital image is compared to the color histogram of the corresponding block of the second digital image.
9. The method of claim 1, wherein identifying an event further comprises identifying an event based at least in part on foreground and background segments of digital images in the set of digital images.
10. The method of claim 1, wherein the organized set of digital images is displayed to a user.
11. A system for detecting recurring events in a digital image collection taken over a predetermined period of time, comprising:
a set of digital images taken over a predetermined period of time;
apparatus for analyzing a digital image collection to produce a multi-dimensional representation of the distribution of image capture activity over time, wherein the apparatus for analyzing the digital image collection comprises:
means for identifying events in a set of digital images based at least in part on content of the digital images, each event comprising at least one digital image, and each digital image having capture time metadata; and
means for characterizing the identified events in a multi-dimensional representation based at least in part on a distribution of capture times of the digital images of the identified events, the multi-dimensional representation comprising a representation having a first time axis and a second time axis different from the first time axis; and
means for detecting recurring events by identifying spatial clusters in the multi-dimensional representation.
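Purely as an illustration (claim 11 does not prescribe a clustering algorithm), detecting recurring events as spatial clusters on a (day-of-year, year) plane might look like the one-dimensional sweep below; the `window` and `min_years` thresholds are assumptions introduced for this sketch:

```python
from datetime import date

def to_plane(d):
    """Map a capture date onto a two-axis representation:
    (day of the year, year)."""
    return (d.timetuple().tm_yday, d.year)

def recurring_clusters(event_dates, window=7, min_years=2):
    """Group events whose day-of-year coordinates lie within `window`
    days of each other; keep clusters spanning >= `min_years` distinct
    years, i.e. candidate recurring events."""
    if not event_dates:
        return []
    points = sorted(to_plane(d) for d in event_dates)
    clusters, current = [], [points[0]]
    for p in points[1:]:
        if p[0] - current[-1][0] <= window:
            current.append(p)
        else:
            clusters.append(current)
            current = [p]
    clusters.append(current)
    return [c for c in clusters if len({year for _, year in c}) >= min_years]
```

Note this simple sweep does not handle clusters that wrap the year boundary (late December into early January); a production implementation would treat the day-of-year axis as circular.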
12. The system of claim 11, wherein the content comprises color content of digital images in the digital image collection.
13. The system of claim 11, wherein the predetermined period of time comprises a plurality of years, and wherein the identified spatial clusters span the plurality of years.
14. The system of claim 11, wherein the predetermined time period comprises one year or less, and wherein the identified spatial clusters span the predetermined time period.
15. The system of claim 11, wherein the first time axis is a day of the year axis and the second time axis is a year axis.
16. The system of claim 11, wherein the first time axis is a week of the year axis and the second time axis is a day of the week axis.
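For concreteness, the two axis pairs recited in claims 15 and 16 correspond to ordinary calendar coordinates. One way to compute them with Python's standard library (shown only as an illustration; the claims do not require any particular calendar convention) is:

```python
from datetime import date

def day_of_year_coords(d):
    """Claim-15-style axes: (day of the year, year)."""
    return (d.timetuple().tm_yday, d.year)

def week_weekday_coords(d):
    """Claim-16-style axes: (week of the year, day of the week).
    Uses ISO numbering: weeks 1-53, Monday = 1."""
    _, iso_week, iso_weekday = d.isocalendar()
    return (iso_week, iso_weekday)
```

A weekly-recurring event (e.g. a Saturday soccer game) clusters along the day-of-the-week axis under the claim-16 mapping, while an annual event (e.g. a birthday) clusters along the day-of-the-year axis under the claim-15 mapping.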
17. The system of claim 13, wherein means for detecting recurring events by identifying spatial clusters in the multi-dimensional representation comprises means for applying an event signature-based filter to at least one identified event.
18. The system of claim 12, wherein the means for identifying an event in the set of digital images based at least in part on content of the digital images comprises:
means for dividing a first digital image of the digital images into a plurality of blocks;
means for determining a color histogram for at least some of the blocks of the first digital image;
means for dividing a second digital image of the digital images into a plurality of blocks, wherein the capture time of the second digital image immediately follows the capture time of the first digital image;
means for determining a color histogram for each of the blocks of the second digital image that correspond to the blocks in the first digital image; and
means for comparing the color histograms of the blocks of the first digital image with the color histograms of the corresponding blocks of the second digital image.
19. The system of claim 11, wherein the means for identifying an event in the digital image collection further comprises means for identifying an event based at least in part on a foreground segment and a background segment of digital images in the digital image collection.
20. The system of claim 11, wherein the organized set of digital images is displayed to a user.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/862,806 | 2010-08-25 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1184245A HK1184245A (en) | 2014-01-17 |
| HK1184245B true HK1184245B (en) | 2018-09-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN103069420B (en) | Detecting recurring events in a user image collection | |
| CN102770874B (en) | Adaptive event timeline in a user image collection | |
| US8611677B2 (en) | Method for event-based semantic classification | |
| US20100121852A1 (en) | Apparatus and method of albuming content | |
| US20160283483A1 (en) | Providing selected images from a set of images | |
| CN102804178B (en) | Detecting critical events in a user's image collection | |
| JP2013520725A5 (en) | ||
| US20210365490A1 (en) | Method for ranking and selecting events in media collections | |
| US9665773B2 (en) | Searching for events by attendants | |
| US20150006545A1 (en) | System for ranking and selecting events in media collections | |
| Li et al. | Automatic summarization for personal digital photos | |
| RU2660599C1 (en) | Method of video data indexing for facet classification | |
| JP7279776B2 (en) | Event management device and method | |
| HK1184245B (en) | Detecting recurring events in consumer image collections | |
| HK1184245A (en) | Detecting recurring events in consumer image collections | |
| CN121365166A (en) | Data collection recommendation method and apparatus, electronic device, and readable storage medium | |
| Ryu et al. | A priority queue-based hierarchical photo clustering method using photo timestamps | |
| Das et al. | Collaborative content synchronization through an event-based framework |