Thursday, April 21, 2016

Horizons: Multimedia Technologies that Protect Privacy

The Survey on Future Media for the new H2020 Work Programme gave me 500 characters apiece to answer a series of critical questions. I'm listing the questions and my answers below. I'm taking this as my chance to pull out all the stops: extreme caution meets idealism. Did I use my characters wisely?

Describe which area the new research and innovation work programme of H2020 should look at when addressing the future of Media.

Non-Obvious Relationship Awareness (NORA) is a set of data mining techniques that finds relationships between people and events that no one would expect to exist. European citizens sharing images or videos online have no way of knowing what sorts of information they are revealing about themselves. We need innovative research on media processing techniques that protect people's privacy by warning them about the information they are sharing, and that obfuscate media to make it safe for sharing.
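
To make "non-obvious" concrete, here is a toy sketch in Python (my own illustration, not an actual NORA implementation) of how two seemingly harmless records can be linked through a shared attribute. All names, coordinates, and records are hypothetical:

    from collections import defaultdict

    # Records a citizen might consider harmless in isolation:
    # photo geotags shared online, plus a public address registry.
    photo_metadata = [
        {"user": "alice", "gps": (52.16, 4.49), "time": "08:02"},
        {"user": "bob",   "gps": (52.16, 4.49), "time": "08:05"},
    ]
    public_registry = [
        {"address": "Rapenburg 70, Leiden", "gps": (52.16, 4.49)},
    ]

    # Index every record by the attribute they unknowingly share.
    by_location = defaultdict(list)
    for record in photo_metadata + public_registry:
        by_location[record["gps"]].append(record)

    # Buckets holding records from different sources expose a
    # relationship no single record reveals on its own: two users
    # photographed at the same address at the same time of day.
    for gps, records in by_location.items():
        if len(records) > 1:
            print(f"non-obvious link at {gps}: {records}")

Real NORA systems operate at far larger scale and across far messier data, but the principle is the same: the risk lies in the combination, not in any single item.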

What difference would projects in the area you propose make for Europe's society and citizens?

Projects in this area would contribute to safeguarding the fundamental right of European citizens to privacy and to the protection of personal data. Today, privacy protection focuses on "obvious" personal information. This protection means nothing when the same information is obtainable in "non-obvious" form. European citizens need tools to understand the dangers of sharing media in cyberspace, and tools that can support them in making informed decisions and protecting themselves.

What are the main technological and Media ecosystem related breakthroughs to achieve the foreseen scenario?

The Media ecosystem in question is the whole of cyberspace. The breakthrough that we need is techniques to predict the impact of data that has not yet entered the system. We need techniques that can obfuscate images and videos in ways that defeat sophisticated machine learning algorithms, such as deep learning techniques. These technologies must be designed from the beginning to be understandable and acceptable to the general population: protection only works if it is used.
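
As a rough sketch of what such obfuscation could look like, the following applies a minimal gradient-sign perturbation (in the spirit of the fast gradient sign method) against a toy linear recognizer. Everything here, model and data alike, is a stand-in; a real system would have to target deployed deep networks:

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.normal(size=64)      # weights of a toy linear "recognizer"
    image = rng.random(64)       # flattened stand-in for an image, in [0, 1]

    def confidence(x):
        """Recognizer's confidence that x shows the private attribute."""
        return float(w @ x)

    # For a linear model, the gradient of the confidence with respect to
    # the input is simply w, so stepping against sign(w) lowers the
    # recognizer's confidence while changing no pixel by more than epsilon.
    epsilon = 0.05
    obfuscated = np.clip(image - epsilon * np.sign(w), 0.0, 1.0)

    print(f"before: {confidence(image):.3f}  after: {confidence(obfuscated):.3f}")

Defeating actual deep networks is far harder: perturbations must survive recompression and resizing, and must not degrade the image for human viewers. That is exactly the research gap.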

What kind of technology(ies) will be involved?

The technologies involved are image, text, audio, and video processing algorithms. These algorithms will re-synthesize users' multimedia content so that it still fulfills its intended function, but with a reduced risk of leaking private information. The technology must go beyond big data and become aware of hypothetical future data. As yet unheard of: technology capable of protecting users' privacy against the inference of non-obvious relations, while remaining understandable to the people whom it is intended to serve.
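
One concrete instance of such re-synthesis, as a sketch only: blur a privacy-sensitive region so the photo keeps its function while degrading machine recognition. This assumes the Pillow imaging library; the file names and box coordinates are hypothetical, and in practice a detector (not shown) would locate the region:

    from PIL import Image, ImageFilter

    def obfuscate_region(path, box, radius=12):
        """Blur one region of an image; box is (left, top, right, bottom)."""
        image = Image.open(path)
        region = image.crop(box)
        blurred = region.filter(ImageFilter.GaussianBlur(radius))
        image.paste(blurred, box)
        return image

    # Blur a face found at these (hypothetical) pixel coordinates,
    # keeping the rest of the holiday scene intact and shareable.
    safe = obfuscate_region("holiday.jpg", box=(120, 40, 220, 160))
    safe.save("holiday_safe.jpg")

Simple blurring can often be defeated by modern recognizers, which is why genuine re-synthesis, replacing content rather than merely degrading it, is the harder research goal.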

Describe your vision on the future of Media in 5 years' time?

People will begin to worry about large companies claiming to own (and attempting to sell back) digital versions of their past selves, forgotten on distant servers. The realization will grow that it is not enough to have a device that takes amazing images and videos; you also need a device that allows you to save and enjoy those images in the years to come. An understanding will emerge that a rich digital media chronicle of one's own life contributes to health, happiness, and wellbeing.

Describe your vision on the future of Media in 10 years' time?

Social images circling the globe will give people unprecedented insight into the human condition. People living in both developed and developing countries will rebel against anyone in the human race living under conditions of constant fear and the constant threat of hunger. The world will change. If protecting privacy means that people need to stop sharing images and videos altogether, the opportunity to fulfill this idealistic vision is missed. The future of Media is bright, but only if it can be kept safe.

At the end of the day, multimedia is about making the world healthy, happy, and complete. At the end of this exercise I have concluded that the horizon stretches even further than 2020.