Julia and Flux: Modernizing Machine Learning

Authors

  • Dhairya Gandhi, Julia Computing
  • Mike Innes, Julia Computing
  • Elliot Saba, Julia Computing
  • Keno Fischer, Julia Computing
  • Viral Shah, Julia Computing

DOI:

https://doi.org/10.5753/compbr.2019.39.4527

Keywords:

Machine Learning, Artificial Intelligence, Performance

Abstract

This paper presents two cutting-edge technologies, the Julia programming language and the Flux framework, that facilitate the development of Machine Learning applications without sacrificing execution performance.


References

Mike Innes, Stefan Karpinski, Viral Shah, David Barber, Pontus Stenetorp, Tim Besard, James Bradbury, Valentin Churavy, Simon Danisch, Alan Edelman, et al. On Machine Learning and Programming Languages, 2018. URL [link].

Mike Innes. What is Differentiable Programming?, 2019. URL [link].

Mike Innes, James Bradbury, Keno Fischer, Dhairya Gandhi, Neethu Mariya Joy, Tejan Karmali, Matt Kelly, Avik Pal, Marco Rudilosso, Elliot Saba, Viral Shah, and Deniz Yuret. Building a Language and Compiler for Machine Learning, 2018. URL [link].

Published

2019-04-01

How to Cite

Gandhi, D., Innes, M., Saba, E., Fischer, K., & Shah, V. (2019). Julia and Flux: Modernizing Machine Learning. Brazil Computing, (39), 41–45. https://doi.org/10.5753/compbr.2019.39.4527

Issue

No. 39 (2019)
Section

Papers