Two Years in the Life of a Kotlin Deep Learning Library: From Prototype to Release 4
The black-and-white story of how Alexey smeared Kotlin on top of the TensorFlow computational kernel, and what came of it. A few years ago he was young, had no gray in his beard, and didn't want much: just to train and fine-tune neural networks in a JVM language at the speed of TensorFlow or PyTorch. And, of course, to get all the MLOps out of the box rather than writing his own!
Before he knew it, two years of working on deep learning for Kotlin as a member of the JetBrains Kotlin for Data Science team had passed, and the path led from an MVP of one class and three methods to a multi-module project with hundreds of classes and tests, dozens of tutorials and articles, and a few thousand users. KotlinDL became a project with 30 contributors from all over the world: Poland, China, Iran, India, Germany, Canada, and Russia. A project on top of which users build multiplatform libraries and game engines with AI elements, or simply dabble in detecting objects from their cameras.
This talk is not about deep learning as such, but about the uneasy path of growing a JVM library for the data science ecosystem from scratch, and about overcoming difficulties at the intersection of the native and snake worlds, where neither static types nor industrial Java beans have set foot. We'll talk about how to set tasks for people you don't pay, and about how to parse model weights; about how to properly use the resources of the company you work for to develop an OSS project; about the difference between JNI and JavaCPP; and about finding the truth in the depths of the GitHub issues of the titans on whose shoulders it all stands.
If it's a smoothie, it's one with a cream of steel.