News

Workshop Summary

Gradients and derivatives are integral to machine learning, as they enable gradient-based optimization. In many real applications, however, models rest on algorithmic components that implement discrete decisions or rely on discrete intermediate representations and structures. These discrete steps are intrinsically non-differentiable and therefore break the flow of gradients. Learning the parameters of such models with gradient-based methods requires making these non-differentiable components differentiable. This can be done with care, notably by using smoothing or relaxations to construct differentiable proxies for these components. With the advent of modular deep learning frameworks, these ideas have become more popular than ever across machine learning, generating in a short time span a multitude of “differentiable everything” approaches, impacting topics as varied as rendering, sorting and ranking, convex optimization, shortest paths, dynamic programming, physics simulations, neural architecture search, top-k selection, graph algorithms, weakly- and self-supervised learning, and many more.
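To make the smoothing idea concrete, here is a minimal illustrative sketch in JAX (not code from the workshop or any particular library; the function names are hypothetical). It contrasts a hard argmax-based selection, whose gradient with respect to the scores is zero almost everywhere, with a temperature-controlled softmax relaxation that admits an informative gradient:

```python
import jax
import jax.numpy as jnp

def hard_select(scores, values):
    # Discrete decision: return the value whose score is largest.
    # argmax is piecewise constant in `scores`, so the gradient
    # with respect to `scores` is zero almost everywhere.
    return values[jnp.argmax(scores)]

def soft_select(scores, values, temperature=1.0):
    # Differentiable proxy: a softmax-weighted average of `values`.
    # As temperature -> 0, this approaches the hard selection.
    weights = jax.nn.softmax(scores / temperature)
    return jnp.sum(weights * values)

scores = jnp.array([1.0, 2.0, 0.5])
values = jnp.array([10.0, 20.0, 30.0])

print(jax.grad(hard_select)(scores, values))  # [0. 0. 0.]: no signal
print(jax.grad(soft_select)(scores, values))  # nonzero, informative
```

Lowering the temperature trades gradient informativeness against fidelity to the hard decision, which is the central trade-off in most such relaxations.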

This workshop will provide a forum for anything differentiable, bringing together academic and industry researchers to highlight challenges and developments, provide unifying ideas, discuss practical implementation choices, and explore future directions.


Keynote Speakers


Organizers


Call for Papers

This workshop encourages submissions on novel research results, benchmarks, frameworks, and work-in-progress research on differentiating through conventionally non-differentiable operations. Submissions should be 4-page papers (excluding references) submitted to OpenReview. The review process will not be public.

Scope

The technical topics of interest at this workshop include (but are not limited to):

The workshop does not cover “differentiable programming”, i.e., the programming paradigm of automatic differentiation and its technical implementations. Instead, the workshop covers cases where vanilla automatic differentiation fails or does not yield meaningful gradients.
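As a hedged illustration of this failure mode (a JAX sketch, not code from the workshop): automatic differentiation runs without error on a hard threshold, but the gradient it returns is zero almost everywhere and therefore carries no learning signal; a straight-through estimator is one common surrogate.

```python
import jax
import jax.numpy as jnp

def hard_step(x):
    # Heaviside step: autodiff handles it, but the derivative is
    # zero wherever it is defined, so the gradient is uninformative.
    return jnp.where(x > 0.0, 1.0, 0.0)

def ste_step(x):
    # Straight-through estimator: forward pass uses the hard step,
    # backward pass pretends the step was the identity map.
    return x - jax.lax.stop_gradient(x) + jax.lax.stop_gradient(hard_step(x))

print(jax.grad(hard_step)(0.3))  # 0.0: zero gradient, no learning signal
print(jax.grad(ste_step)(0.3))   # 1.0: surrogate gradient flows
```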


Contact

Contact the organizers: mail@differentiable.xyz