SEALINCMedia conference report
Harnessing Social Output to Enrich Cultural Heritage
The 2nd SEALINCMedia symposium took place on Friday 23 November 2012 at the National Library of the Netherlands (KB) in The Hague. It brought together representatives of all participating institutions to share the experiences and results of the first project year.
Conference report by Lotte Belice Baltussen & Julia Vytopil
The title of the symposium was ‘ICT Advances for Cultural Heritage going Digital, Online and Social’. Speakers from both the cultural heritage and computer science domains gave plenary presentations in which two of the SEALINCMedia use case demonstrators were discussed: Accurator, a niche-sourcing application, and SocialZap, in which television content is enriched with social media. State-of-the-art solutions for multimedia content recommendation, indexing and search applied to cultural heritage use cases were presented at the demo market.
Accurator: from crowdsourcing to niche-sourcing
"LizzyJongsma of @Rijksmuseum presents #Sealinc Use case in @KB_Nederland" (tweet by Hildelies Balk)
After a welcome and introduction by our host of the day, the National Library, and by project leader Alan Hanjalic (TU Delft), the first session of the day was kicked off by Lizzy Jongma of the Rijksmuseum. She introduced the ‘Accurator’ prototype, which allows the public to annotate (or: tag) the Rijksmuseum’s print collection. There are 1 million objects in the Rijksmuseum collection, 160,000 of which are part of the print room collection. The Rijksmuseum has provided this collection with rich cultural-historical metadata, but specific information about the depicted content (e.g. fish, flowers, nautical history) is lacking. The museum therefore wanted to ask very specific parts of ‘the crowd’, people with extensive niche knowledge of these topics, to help enrich the existing metadata. To this end, the Accurator prototype was developed.
The research and technical teams (TU Delft, VU University Amsterdam, CWI) then addressed the various issues and questions involved in developing Accurator. It was vital to create an application that stimulates annotators to add the most specific and useful tags possible. They implemented a system to personalise the types of annotations people can add, so that their skills and expertise are matched with the desired types of metadata. A study was held in which people were presented with different types of tags, and with tags of varying quality and quantity. It turned out that people happily add information, but that they are not motivated to remove faulty tags added by others. A personal note: whether the latter is actually problematic remains to be seen. Asking people who are helping to enrich collections to ‘critique’ the work of others might not be the most productive activity. In our experience, focusing on rewarding and stimulating high-quality tags will yield the best results.
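To make the niche-sourcing idea concrete, here is a minimal, hypothetical sketch (in Python) of routing objects that need a particular kind of metadata to annotators whose self-declared expertise matches those topics. The data model, object IDs and names below are invented for illustration and are not the actual Accurator implementation.

```python
# Hypothetical niche-sourcing routing: match objects that need specific
# metadata (e.g. fish, flowers, nautical history) to annotators whose
# expertise overlaps those topics. All identifiers are invented examples.
def match_tasks(objects, annotators):
    """Yield (annotator, object) pairs where expertise overlaps the needed topics."""
    for obj in objects:
        needed = set(obj["needs"])
        for person in annotators:
            if needed & set(person["expertise"]):
                yield person["name"], obj["id"]

if __name__ == "__main__":
    prints = [{"id": "print-001", "needs": ["fish", "nautical history"]}]
    experts = [{"name": "annotator_a", "expertise": ["flowers"]},
               {"name": "annotator_b", "expertise": ["fish", "maritime"]}]
    print(list(match_tasks(prints, experts)))   # -> [('annotator_b', 'print-001')]
```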
"Exciting moment.... Mieke and Myriam are starting the demo #accurator #rijksmuseum #sealinc project...." (tweet by Lizzy Jongma)
"RT @lottebelice: Mieke and Myriam now presenting the Accurator demo of the Rijksmuseum http://bit.ly/UO8lAH #crowdsourcing #sealinc" (tweet by Lizzy Jongma)
"RT @lottebelice: For those who don't have access to the Accurator demo or want to create an account, here's a screenshot :) #sealinc http://pic.twitter.com/Hoe7u3l2" (tweet by Lizzy Jongma)
Accurator 2.0: linking historical newspapers with AV content through common events
Jacco van Ossenbruggen (CWI/VU University) provided a sneak peek at the possibilities for the next-generation Accurator, in which creating links between collections is central. The specific focus is asking the crowd (or rather: niches) to create event-based annotations of relations between the National Library and Sound & Vision collections. The challenge here is to link two different types of collections: the historical newspapers of the KB and the time-based audiovisual collection of Sound & Vision. Jacco presented various examples, such as a newsreel item about the return home of Olympic swimming champion Nel van Vliet and newspaper articles about the same event. This offers great possibilities not just for enriching these collections, but also for connecting with other applications and projects, such as the just-released ‘Here was the news’ app of the National Library, and the Agora project.
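As a rough illustration of what event-based linking between the two collections could look like, the sketch below matches a newspaper article to a newsreel item when their dates are close and they share event-related keywords. The records, field names and thresholds are assumptions made for this example, not the project's actual approach.

```python
# Hypothetical event-based linking of KB newspaper articles to Sound & Vision
# newsreel items: pair records whose dates are close and whose event-related
# keywords overlap. All data and identifiers below are invented.
from datetime import date

def shared_keywords(a, b):
    """Return the keywords two records have in common."""
    return set(a["keywords"]) & set(b["keywords"])

def link_by_event(articles, newsreels, max_day_gap=3, min_overlap=2):
    """Propose article-newsreel pairs that likely describe the same event."""
    links = []
    for art in articles:
        for reel in newsreels:
            day_gap = abs((art["date"] - reel["date"]).days)
            overlap = shared_keywords(art, reel)
            if day_gap <= max_day_gap and len(overlap) >= min_overlap:
                links.append((art["id"], reel["id"], sorted(overlap)))
    return links

if __name__ == "__main__":
    articles = [{"id": "kb:article-1", "date": date(1948, 8, 9),
                 "keywords": ["nel van vliet", "olympics", "swimming", "homecoming"]}]
    newsreels = [{"id": "senv:newsreel-7", "date": date(1948, 8, 10),
                  "keywords": ["nel van vliet", "homecoming", "amsterdam"]}]
    print(link_by_event(articles, newsreels))
```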
SocialZap: enriching television content with social media
"Bouke Huurnink: Imagine you are a television producer.... On SocialZap, enriching tv content with social media #sealinc @sealincmedia" (tweet by Julia Vytopil)
"Bouke Huurnink: SocialZap will segment interesting pieces of tv programmes (Zap Points) automatically, based on Tweets #sealinc" (tweet by Lotte Belice)
[Slides: ‘Pitch’-type presentation for the SocialZap demonstrator at the SEALINCMedia Symposium, 23 November 2012]
Bouke Huurnink from the Netherlands Institute for Sound & Vision introduced the SocialZap project, which enriches television content with contextual information and content-related metadata from social media.
When television programmes are broadcast, the viewers tweet about remarkable moments and content. By using social media to enrich the content, producers and broadcasters will benefit from the users’ input.
Specifically, by analysing these programme-related tweets SocialZap can identify “zap points”: the most interesting moments in an episode. These zap points can then be used to fragment the episode for online broadcasting on the programme portal.
To illustrate this, the example of the popular Dutch daily TV show “De Wereld Draait Door” was used. During an episode, viewers tweet about what they see, for instance a poisonous frog leaping off the interview table. The number of tweets mentioning “frog” and “#dwdd” peaks right after the item about the frog starts. In addition, computer vision is used to accurately pinpoint where the frog appears.
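The tweet-volume side of this idea can be sketched very simply: count programme-related tweets per minute and flag minutes whose count rises well above the recent baseline. The window, threshold and simulated tweets below are illustrative assumptions, not the actual SocialZap implementation.

```python
# Illustrative "zap point" detection from tweet volume: flag minutes whose
# tweet count clearly exceeds the trailing average. Data is simulated.
from collections import Counter

def tweet_peaks(tweet_minutes, window=5, factor=2.0, min_count=10):
    """Return minutes whose tweet count exceeds `factor` x the trailing average."""
    counts = Counter(tweet_minutes)            # minute offset -> number of tweets
    peaks = []
    for minute in sorted(counts):
        recent = [counts.get(minute - i, 0) for i in range(1, window + 1)]
        baseline = sum(recent) / window
        if counts[minute] >= min_count and counts[minute] > factor * max(baseline, 1):
            peaks.append(minute)
    return peaks

if __name__ == "__main__":
    # Simulated minute offsets of tweets containing e.g. "frog" and "#dwdd":
    tweets = [3] * 2 + [12] * 2 + [13] * 30 + [14] * 25 + [20] * 3
    print(tweet_peaks(tweets))   # flags the burst around minute 13
```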
Svetlana Kordumova from the University of Amsterdam elaborated on the importance of selecting the most important frames from videos. Computer vision techniques can be used to identify and label concepts, such as the frog, in video fragments. By learning a visual model for the topic of interest (what do a poisonous frog and someone handling it look like?), the relevant part of the video can be marked and shared on portals like Uitzending Gemist and programme websites. This information is also used to identify zap points in the episode for online airing.
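On the visual side, once a concept detector produces a score per frame, turning those scores into candidate segments is straightforward. The sketch below assumes per-frame scores are already available from some visual model; the threshold, frame rate and scores are invented for the example and do not describe the project's actual pipeline.

```python
# Illustrative grouping of per-frame concept scores (e.g. a "frog" detector)
# into candidate segments. Scores, threshold and frame rate are invented.
def concept_segments(scores, fps=25, threshold=0.7, min_frames=10):
    """Group consecutive frames scoring above `threshold` into segments.

    Returns (start_seconds, end_seconds) tuples for runs of at least `min_frames`.
    """
    segments, start = [], None
    for i, score in enumerate(scores + [0.0]):      # sentinel closes the last run
        if score >= threshold and start is None:
            start = i
        elif score < threshold and start is not None:
            if i - start >= min_frames:
                segments.append((start / fps, i / fps))
            start = None
    return segments

if __name__ == "__main__":
    # Forty high-scoring frames simulate a stretch where the detector fires.
    frame_scores = [0.1] * 50 + [0.9] * 40 + [0.2] * 30
    print(concept_segments(frame_scores))   # roughly (2.0, 3.6) seconds
```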
Christoph Kofler from TU Delft demonstrated work on connecting real-time tweets to video fragments in order to identify concepts. For this part of the project, three main challenges remain: how to extract the concepts, how to reduce noise in the video, and how to handle the varying appearances of concepts in video. Furthermore, combining tweets with additional data such as closed captions is promising.
Other challenges still ahead include determining whether tweets have a visual counterpart, integrating ASR (automatic speech recognition), and identifying the intent of tweets (e.g. sarcasm). Currently, the software is being integrated at the Institute for Sound & Vision, where it is analysing episodes. The first complete version of the demonstrator is expected to be ready in Q1 2013.
"Now Archana speaks on trust in crowdsourcing. Go Archana! #sealinc #kb" (tweet by Lizzy Jongma)