Information for Archivists & Librarians

Distill is a publication of the Distill Working Group, a group of machine learning researchers primarily based out of San Francisco, California.

Distill articles are published on a rolling basis. That is, each article is published immediately upon completion of review, rather than waiting for the next issue.

Volumes & Issues

To better integrate into library systems, we also organize our articles into volumes and issues. At the end of each month, the articles published that month are collated into an issue; at the end of the year, the issues for that year are collated into a volume. This is standard practice for electronic journals with rolling publication.

Our first issue was designated Vol 1, Issue 9 so that the issue number always matches the month of publication.
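For illustration, here is a minimal sketch of that date-to-issue mapping, assuming Volume 1 corresponds to 2016 (so that the first issue, published in September 2016, is Issue 9); the function name and start-year constant are hypothetical and not part of any Distill tooling.

```python
from datetime import date

# Assumed start year: Volume 1 corresponds to 2016, so the first
# issue (September 2016) becomes Vol 1, Issue 9.
FIRST_VOLUME_YEAR = 2016

def volume_and_issue(published: date) -> tuple[int, int]:
    """Map a publication date to its (volume, issue) pair.

    The volume increments each calendar year and the issue number
    is simply the month of publication.
    """
    volume = published.year - FIRST_VOLUME_YEAR + 1
    issue = published.month
    return volume, issue

# Example: an article published in September 2016 falls in Vol 1, Issue 9.
assert volume_and_issue(date(2016, 9, 1)) == (1, 9)
```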

Issues To Date

Vol 1: Issues 9, 10, 12
Vol 2: Issues 3, 4, 11, 12
Vol 3: Issues 3, 7, 8
Vol 4: Issues 2, 3, 4, 8, 9, 11
Vol 5: Issues 1, 2, 3, 4, 5, 6, 8, 9, 11, 12
Vol 6: Issues 1, 2, 3, 4, 5, 7, 9

Archiving & Preservation

Distill takes seriously its responsibility to safeguard the articles it publishes.

We publish entirely in accepted web standards adopted by major companies, and our data is globally replicated. Articles are also archived with the Internet Archive for independent preservation. In the future, we intend to investigate joining LOCKSS.

We’d be delighted for your institution to archive copies of Distill’s articles. Because our publications are Creative Commons licensed, you do not need permission, but please feel free to contact admin@distill.pub if any questions come up.
