Tensor library Nx 0.1.0 makes Elixir fit for ML


Almost a year after the development team first presented the Nx project, the first public release of the library is now available on GitHub. Nx 0.1.0 is intended to enable Elixir developers to combine functional programming with tensor compilers in order to open up new areas of application such as numerical calculations and machine learning.

Although the Nx development team around José Valim and Sean Moriarity was aware that a functional programming language like Elixir can be problematic for numerical computing, especially given immutability when working with large blocks of memory, the potential benefits were too tempting to pass up. The Elixir programming language, which recently introduced semantic recompilation with version 1.13, and its underlying Erlang VM promise low latency in distributed, fault-tolerant systems, so that flexibly scalable and easily maintainable applications can be built.

The Nx library, which is designed for multidimensional arrays (tensors) and just-in-time compilation of numerical Elixir on both CPUs and GPUs, is modeled on NumPy and JAX from the Python ecosystem. JAX in particular is fundamentally aligned with functional programming and immutability principles.

On this basis, Nx is meant to exploit the advantages of functional programming at higher levels of abstraction as well, such as numerical definitions (defn) and computation graphs. Within defn, a subset of Elixir tailored to numerical computing, regular Elixir operators can be used with Nx and are translated into the corresponding tensor operations. This provides developers with language features and data types such as macros, the pipe operator, pattern matching, and maps.

defmodule MyModule do
  import Nx.Defn

  defn softmax(t) do
    # Subtract the maximum for numerical stability
    normalized = t - Nx.reduce_max(t)
    Nx.exp(normalized) / Nx.sum(Nx.exp(normalized))
  end
end

When called, the code from the preceding listing is compiled, based on the types and shapes of its arguments, for example using Google's XLA compiler, into optimized code that can be executed on CPUs, GPUs, or Cloud TPUs. Multi-dimensional tensors can be created with Nx from unsigned integers (u8, u16, u32, u64), signed integers (s8, s16, s32, s64), floats (f16, f32, f64), and brain floats (bf16).
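For illustration, a tensor with an explicit type can be created and passed to a numerical definition such as the softmax from the earlier listing. This is a minimal sketch; `MyModule` refers to the module defined above:

```elixir
# Create a 2x2 tensor of 8-bit unsigned integers
t = Nx.tensor([[1, 2], [3, 4]], type: {:u, 8})

# Operators inside defn map to tensor operations; outside defn,
# the Nx functions can be called directly
Nx.sum(t)

# Calling the numerical definition compiles it for the argument's
# type and shape on first use
MyModule.softmax(Nx.tensor([1.0, 2.0, 3.0]))
```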

In order to expand Nx for productive use in machine learning projects, the team has implemented streaming functions in recent months, which allow a computation to be loaded onto GPUs and TPUs once while it receives batches of inputs. The functions are intended to make inference more efficient, but should also be suitable for distributed learning.
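A streaming workflow might look roughly like the following sketch, which assumes the Nx.Defn.stream/Nx.Stream API described in the Nx documentation; exact signatures may differ between versions:

```elixir
defmodule Streamed do
  import Nx.Defn

  # The accumulator carries state across batches; here it simply
  # counts how many batches have been processed
  defn sum_batch(batch, acc) do
    {Nx.sum(batch), acc + 1}
  end
end

# Load the computation onto the device once, described by a
# template of the expected input shape and type...
stream = Nx.Defn.stream(&Streamed.sum_batch/2, [Nx.template({4}, :f32), 0])

# ...then feed it batches of input as they arrive
Nx.Stream.send(stream, Nx.tensor([1.0, 2.0, 3.0, 4.0]))
Nx.Stream.recv(stream)
Nx.Stream.done(stream)
```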

With a view to recurrent neural networks (RNNs) and models for speech recognition, semantic parsing, sign language translation, and comparable areas of application, the Nx makers have extended numerical definitions with while loops, which enable both static and dynamic unrolling of loops.
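Inside defn, such a loop is written with the while construct; the following factorial example is adapted from the Nx documentation:

```elixir
defmodule Loops do
  import Nx.Defn

  defn factorial(x) do
    # while threads a tuple of loop state through each iteration;
    # the condition is re-evaluated against that state
    {factorial, _} =
      while {factorial = 1.0, x}, Nx.greater(x, 1) do
        {factorial * x, x - 1}
      end

    factorial
  end
end
```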

In order to test the Nx library and its automatic differentiation engine against numerous neural networks, Sean Moriarity developed Axon. Axon provides a functional API that can serve as a low-level API for numerical definitions (defn) and at the same time lays the groundwork for further decoupled APIs for model creation, optimization, and training, so that Nx-based neural networks can be used with Elixir. The following listing shows the model of a convolutional neural network (CNN) implemented with Axon for training on and classifying the CIFAR-10 data set:

Axon.input(input_shape)
|> Axon.conv(32, kernel_size: {3, 3}, activation: :relu)
|> Axon.batch_norm()
|> Axon.max_pool(kernel_size: {2, 2})
|> Axon.conv(64, kernel_size: {3, 3}, activation: :relu)
|> Axon.batch_norm()
|> Axon.max_pool(kernel_size: {2, 2})
|> Axon.flatten()
|> Axon.dense(64, activation: :relu)
|> Axon.dropout(rate: 0.5)
|> Axon.dense(10, activation: :softmax)
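Such a model could then be trained roughly as follows. This is a hedged sketch using Axon's training-loop API; `model` stands for the CNN pipeline from the listing above, and `train_data` is a placeholder for a stream of input/label batches:

```elixir
# Build a supervised training loop with cross-entropy loss and the
# Adam optimizer, then run it over the batched training data
model
|> Axon.Loop.trainer(:categorical_cross_entropy, Axon.Optimizers.adam(0.001))
|> Axon.Loop.run(train_data, epochs: 10, compiler: EXLA)
```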

Further information on Nx 0.1.0 and the entire project can be found in the blog post on the release of the first production-ready version as well as in the original announcement of Nx (Numerical Elixir). The project's public repositories can be viewed on GitHub.
