Power behind MediaEval 2013
The workshop produced working notes proceedings containing 96 short working papers, which have been published by CEUR-WS.org.
The purpose of MediaEval is to offer tasks to the multimedia community that support research progress on multimedia challenges with a social or a human aspect. The tasks are autonomous and each is run by a different team of task organizers. My role within MediaEval is to guide the process of running tasks, which involves providing feedback to task organizers and sending out the cues that keep the tasks running smoothly and on time. Today I did a quick count that revealed that during the 2013 season, I wrote 1529 personal emails containing the keyword "MediaEval".
What makes MediaEval work, however, cannot be expressed in numbers. Rather, it is the dedication and intensive effort of a large group of people, who propose and organize tasks and carry out the logistics that make the workshop come together. My motivation to continue MediaEval year after year stems largely from an underlying sense of awe at what these people do: both the work that I am aware of and also the many things they do behind the scenes that remain largely invisible. These people are the power behind MediaEval. Here I represent them with the picture above, which shows power plugs arranged by Xavi Anguera from Telefonica with the assistance of Bart Thomee from Yahoo! Research. The process involved a combination of precision car driving and applied electrical engineering.
In the airplane back from Barcelona yesterday, I finished processing the responses that we received from the participant surveys (collected during the workshop), input from the organizers meeting (held on Sunday after the workshop), and feedback that people gave me verbally during ACM Multimedia (last week). These points are summarized below.
Thus endeth MediaEval 2013, but at the same time beginneth the season of MediaEval 2014. Hope to have you aboard.
Community Feedback from the MediaEval 2013 Multimedia Benchmarking Season + Workshop
The most important feedback point this year concerned the new structure of the workshop, which was very well received. This year the workshop was faster paced and we introduced poster sessions. We were happy that people liked the short talks and that the poster sessions were considered to be useful and productive. There is a clear preference for more discussion time at the workshop, both in the presentation sessions and in the poster sessions. An idea for the future is to separate passive poster time (posters are hanging and people can look at them, but the presenter need not be present) from active poster time (the presenter is standing at the poster).
The number one most frequent request was for MediaEval to provide more detailed information. This request was made with respect to a range of areas: descriptions of the tasks should always strive to be maximally explicit; descriptions of the evaluation methods should be detailed and available in a timely manner; task overview talks at the workshop should contain examples and descriptions that allow a general audience (i.e., people who did not participate in the task) to understand the task easily.
Other suggestions were to increase consistency checking and to continue to promote industry involvement. Finally, there were requests for more time to prepare presentations and for explicitly inviting (and supporting) groups to present demos with their posters.
The organizers meeting on Sunday was the source of additional feedback. Task organization requires a huge amount of time and dedication from task organization teams, and it is important that this effort is distributed as evenly as possible across the year and across people. In general, tasks would benefit from additional practical guidance on organization, including task management and evaluation methodologies. Since MediaEval is a decentralized system, the source of this guidance must be people with past experience in task organization and communication between tasks. Here, the bi-weekly telcos for organizers are an important tool.
In the coming year, the awards and sponsorship committee can expect an expanded role. The outreach to early-career researchers and to researchers located outside of Europe (in the form of travel grants) is seen by the organizers to be not merely a "nice-to-have", but rather a central part of MediaEval's mission. There is solid consensus about the usefulness of the MediaEval Distinctive Mentions (MDMs). MDMs are peer-to-peer certificates awarded by task organizers to each other or to the participants of their tasks. The MDMs allow the community to send public messages between members, and especially to point out participant submissions that are highly innovative or have particularly high potential (although they may not have been top scorers according to the official evaluation metric). It is important to make clear that the MediaEval Distinctive Mention is not an "award", since the process by which they are chosen is intentionally kept very informal. In the coming year, we will be investigating whether MediaEval should introduce a five-year impact award that would be more formal in nature. The peer-to-peer MDMs will be maintained, although an effort will be made to make them increasingly transparent.
In general we were satisfied with the process used to produce the proceedings. Having groups do an online check of their metadata was helpful. If future years also involve proceedings with 50+ papers, we will need to further streamline the schedule for submission---with the ultimate goal of having the proceedings online at the moment that the workshop opens.