Program Chair

Systems and Applications

Research in multimedia systems is generally concerned with understanding fundamental trade-offs between competing resource requirements, developing practical techniques and heuristics for complex optimization and allocation strategies, and demonstrating innovative mechanisms and frameworks for realizing modern multimedia applications. Developing such applications may involve the design and implementation of computer systems, networks, and end devices, as well as the performance, resource, adaptability, and usability issues associated with the target system as a whole.

Authors' Advocate

Judith Redi
Delft University of Technology
Area: Multimedia Systems and Middleware
Area Chairs
Yonggang Wen
Nanyang Technological University
Cheng-Hsin Hsu
National Tsing Hua University

The Multimedia Systems and Middleware area targets applications, mechanisms, algorithms, and tools that enable the design and development of efficient, robust, and scalable multimedia systems. In general, it includes solutions at various levels in the software and hardware stacks.

We call for submissions that explore the design of architectures and software for large-scale and/or distributed multimedia systems, multimedia in pervasive computing applications, as well as mobile multimedia. This includes tools and middleware to build multimedia applications, such as content adaptation and transcoding, stream processing, scheduling and synchronization, and cloud multimedia systems.

Multimedia architectures and systems are continuously evolving, and hardware technology changes influence middleware and applications. We therefore also solicit submissions of new research on host-device interaction in heterogeneous systems, applications for transactional memory, and multi-level memory architectures (e.g., RAM – SSDs – spinning disks) for operating systems and storage functions.

Topics of interest include, but are not limited to:

  • Efficient implementations of processing frameworks for multimedia workloads
  • System and middleware implementation with graphics processing units (GPUs), network processors, and field-programmable gate arrays (FPGAs)
  • Multimedia systems over new generation computing platforms, e.g., social, cloud, Software-Defined Everything (SDx), and human-based computing
  • Middleware systems for mobile multimedia, especially for emerging wireless technologies, e.g., 5G mobile Internet, Low-Power Wide-Area Network (LPWAN), and LiFi
  • Real-time multimedia systems and middleware
  • Embedded multimedia systems and middleware
  • QoS and QoE of multimedia systems and middleware
  • System prototypes, deployment, and measurements
  • Energy-efficiency for multimedia systems and middleware
  • Multimedia storage systems
  • Immersive and virtual world systems and middleware

Submitted contributions should either provide new and interesting theoretical or algorithmic insights into specific aspects of multimedia systems, or contain a comprehensive evaluation of a complete multimedia system.

Area: Multimedia Transport and Delivery
Area Chairs
Christian Timmerer
Klagenfurt University
Vishy Swaminathan
Adobe Systems
Ali C. Begen
Networked Media and Ozyegin University

The Multimedia Transport and Delivery area invites research concerned with the mechanisms used to move multimedia content through public networks such as the Internet, as well as the placement and movement of multimedia content within content delivery networks (CDNs), P2P networks, clouds, and clusters. Most multimedia transport today is no longer content-agnostic: mechanisms may adapt, filter out, or combine content, and they may even organize the transport based on content type or semantics.

Topics of interest in this area include, but are not limited to:

  • New deployment concepts, such as network function virtualization and software defined networking in the context of multimedia transport and delivery
  • Adaptive media streaming over HTTP and other (non-)standard protocols (e.g., HTTP/2, QUIC, WebRTC, MPTCP) for media transport and delivery
  • New research addressing bufferbloat and new congestion management methods including AQM strategies
  • Performance improvements due to new forms of host-device interaction, including the benefits of new interconnects, transactional memory, and SSD controllers allowing random memory access
  • Transport of multimodal data types and other interactive applications, including the relation of transport and delivery with scalable media adaptation, scaling, compression, and coding
  • Multimedia content-aware pre-fetching and caching, multimedia analysis and recommendations for media distribution and caching

Contributions are expected to provide new theoretical or experimental insights into transport and delivery mechanisms, enhancements to one or more system components, or complete systems or applications.

Area: Multimedia Telepresence and Virtual/Augmented/Mixed Reality
Area Chairs
Jonathan Ventura
University of Colorado
Pål Halvorsen
University of Oslo & Simula Research Laboratory

Telepresence and virtual/augmented reality have long been grand challenges for researchers and industry alike. High-resolution 3D telepresence can dramatically improve the sense of presence in interpersonal and group communication. This is paramount for supporting the non-verbal and subconscious communication that is currently lost in video and audio conferencing environments. Realistic virtual/augmented reality enables a wide spectrum of important applications, including tele-medicine, training for hazardous situations, scientific visualization, and engineering prototyping. Addressing the challenges of telepresence and virtual/augmented reality requires the development of new media representation, processing, and streaming techniques, as well as innovations in human-computer interaction.

Topics include but are not limited to:

  • Multi-camera coding and streaming
  • 3D video coding
  • Image-based rendering for virtual/augmented environments
  • Virtual/augmented reality user interface design and evaluation
  • Haptic interfaces for virtual/augmented reality
  • Virtual-world design and authoring tools
  • 3D sound rendering in virtual/augmented environments
  • Multi-viewpoint stereo for group telepresence
  • Automated group telepresence capture and control
  • Distributed multi-user virtual/augmented reality systems
  • Real-time bandwidth adaptation for VR and telepresence
  • Innovative VR and telepresence applications
  • Quality of experience models and evaluation for VR and telepresence

Area: Mobile and Wearable Multimedia
Area Chairs
Vivek K. Singh
Rutgers University
Roger Zimmermann
National University of Singapore
Nic Lane
UCL & Nokia Bell Labs

With a multitude of sensors, including accelerometers, GPS receivers, multiple video and still cameras, microphones, and speakers, mobile devices are arguably the truest embodiment of multimedia. Furthermore, as processing power, sensor quality, and display resolution continue to improve at an exponential pace, the potential of these devices seems unbounded. At the same time, however, much slower growth in battery life and communication capacity creates distinct systems-level challenges that must be addressed to tap that potential. Additionally, wearable devices such as fitness trackers open new possibilities and applications in the fields of healthcare, education, and computational social science.

Topics include but are not limited to:

  • Media streaming to/from mobile devices
  • Innovative uses of location, accelerometer, and multiple sensors for novel applications
  • Mobility models and simulations in service of multimedia research
  • Performance and quality-of-experience studies of multimedia applications in a mobile context
  • Low-power and power-aware media services and multimedia applications
  • Real-time interactive mobile-to-mobile conferencing
  • Crowdsourcing within mobile multimedia applications
  • Touch-based interfaces for multimedia applications
  • Vehicle-based multimedia systems
  • Peer-to-peer mobile media and applications
  • Wearable device systems and applications