Although “every company is a software company”, not everybody is a software developer, at least not yet. Our lives and jobs, today more than ever, require us to use and write scripts and software. This is particularly true in an environment devoted to research and innovation, as academia is: tools change often, and just as often there is no GUI involved, which creates an unsustainable entry threshold for outsiders and slows down research as a whole. Portability can be a true nightmare when your specialisation is as far from software development as it can be.

Docker has been adopted in academia not just for how powerful, flexible and adaptable it is, but first of all for its simplicity. Having given Docker training to biologists and biotechnologists, I found that, although the concept of containerisation is initially hard to grasp, what won their hearts was the simplicity and speed with which they could pick it up and apply it in their daily work. It is also worth remembering those situations in which broadband speed and reliability cannot be taken for granted: here too Docker provides some nice, often overlooked, functionality.

The life science community has recently been drawn to bioinformatics and computational biology, fields of research that only very recently found their place in education. While new researchers are often trained in coding and data analysis, the whole community benefits from tools and platforms that shorten the time to results. Docker now has an important place in a number of projects: like a virtual machine with pre-installed software, it enables reproducibility of analyses, a major concern in science, while remaining easily portable, which is essential for collaborations. Even more, scientists can now share their code, packaging and distributing it themselves with very little effort.
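As a rough illustration of how little effort the packaging step can take, here is a minimal sketch of a Dockerfile for a hypothetical analysis script (the base image, script name, and image tag are all assumptions, not from the talk):

```dockerfile
# Minimal sketch: package a single Python analysis script into an image.
# "analysis.py" and the base image choice are hypothetical examples.
FROM python:3.11-slim
WORKDIR /app
COPY analysis.py .
ENTRYPOINT ["python", "analysis.py"]
```

Building and sharing it uses standard Docker commands such as `docker build -t myanalysis .`. For the low-bandwidth situations mentioned above, `docker save myanalysis | gzip > myanalysis.tar.gz` writes the image to a file that can be moved on physical media, and `docker load < myanalysis.tar.gz` restores it on another machine without any registry or network access.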
Speaker: Alice Minotto, Earlham Institute