Call for papers

  • Submission deadline: July 11, 2022 (11:59 AM Pacific Time)
  • Author notification: August 17, 2022 (11:59 AM Pacific Time)
  • Camera-ready submission: August 22, 2022 (11:59 AM Pacific Time)

Topics

Topics of interest include the following:

  • Model optimization to reduce energy consumption while maintaining high performance.
  • Design of innovative architectures and operators for data-intensive scenarios.
  • Applications of large-scale pre-training techniques.
  • Distributed training approaches and architectures.
  • HPC and massively parallel architectures for Deep Learning.
  • Model pruning, gradient compression, and quantization techniques to reduce training/inference time.
  • Strategies to reduce memory/data transmission footprint.
  • Methods to estimate computational costs/energy/power.
  • Design, implementation and efficient use of hardware accelerators.
  • Models/methods that can foster diversity and inclusion in research through the adoption of computationally efficient procedures for DL.
  • Analysis of computational cost and consequent social impact of large model training.

Submission and review

We invite submissions of papers describing work in the domains suggested above or in closely related areas.

Reviewing of the submissions will be double-blind. Accepted submissions will be presented as either orals or posters at the workshop and published in the ECCV 2022 Workshops proceedings.

Papers are limited to 14 pages, including figures and tables, in the ECCV style. Additional pages containing only cited references are allowed. Please refer to the following files for detailed formatting instructions:

Papers that are not properly anonymized, do not use the template, or exceed 14 pages (excluding references) will be rejected without review.

Note also that the template has changed since ECCV 2020. We therefore strongly urge authors to use this new template rather than templates from older conferences.

Submission site

The submission site is open: https://cmt3.research.microsoft.com/CADL2022/.