@nickgnd
Last active January 24, 2023 22:06
Livebook to reproduce a strange behaviour when running a model after binding a variable
# elixir ./slow_model_run.ex
# Title: Debugging: slow 2nd model execution after binding a variable
Mix.install(
  [
    {:exla, "~> 0.4"},
    {:nx, "~> 0.4"},
    {:axon, "~> 0.4.1"}
  ],
  config: [nx: [default_backend: EXLA.Backend]]
)
# ── Generate the dataset ──

key = Nx.Random.key(42)

{inputs, _new_key} = Nx.Random.normal(key, 0, 1, shape: {1000, 2}, type: :f32)

# Random 0/1 labels, one-hot encoded to {1000, 2} via broadcasting Nx.equal/2
labels =
  0..999
  |> Enum.map(fn _ -> Enum.random([0, 1]) end)
  |> Nx.tensor()
  |> Nx.reshape({:auto, 1})
  |> Nx.equal(Nx.tensor([0, 1]))

batch_size = 25

train_inputs = Nx.to_batched(inputs, batch_size)
train_labels = Nx.to_batched(labels, batch_size)
train_batches = Stream.zip(train_inputs, train_labels)
# ── Define the model and training ──

model =
  Axon.input("data")
  |> Axon.dense(100, activation: :sigmoid)
  |> Axon.dense(30, activation: :sigmoid)
  |> Axon.dense(2, activation: :softmax)

loop = Axon.Loop.trainer(model, :categorical_cross_entropy, Axon.Optimizers.rmsprop(0.001))
epochs = 100

# Time two consecutive runs before binding the variable
{time_before_1, _} = :timer.tc(fn -> Axon.Loop.run(loop, train_batches, %{}, epochs: epochs, compiler: EXLA) end)
{time_before_2, _} = :timer.tc(fn -> Axon.Loop.run(loop, train_batches, %{}, epochs: epochs, compiler: EXLA) end)
# ── Bind a variable to randomly generated data ──

# 10,000 random floats, stringified and parsed back into a list of maps
# (:random is deprecated in favour of :rand; kept as in the original repro)
_var =
  0..9999
  |> Enum.map(fn _i -> :random.uniform() end)
  |> Enum.join("\n")
  |> String.split("\n")
  |> Enum.with_index(1)
  |> Enum.map(fn {loss, index} ->
    %{loss: String.to_float(loss), epoch: index, type: "training"}
  end)
# ── Re-run the same loop --> slower ──

{time_after, _} = :timer.tc(fn -> Axon.Loop.run(loop, train_batches, %{}, epochs: epochs, compiler: EXLA) end)

IO.puts("Before binding var - 1st run: #{time_before_1 / 1_000_000}s")
IO.puts("Before binding var - 2nd run: #{time_before_2 / 1_000_000}s")
IO.puts("After binding var - 3rd run: #{time_after / 1_000_000}s")

2nd loop run is slower after binding a variable

Mix.install(
  [
    {:exla, "~> 0.4"},
    {:nx, "~> 0.4"},
    {:axon, "~> 0.4.1"}
  ],
  config: [nx: [default_backend: EXLA.Backend]]
)

Generate the dataset

key = Nx.Random.key(42)

{inputs, _new_key} = Nx.Random.normal(key, 0, 1, shape: {1000, 2}, type: :f32)

labels =
  Enum.map(0..999, fn _ -> Enum.random([0, 1]) end)
  |> Nx.tensor()
  |> Nx.reshape({:auto, 1})
  |> Nx.equal(Nx.tensor([0, 1]))
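
The Nx.equal/2 step one-hot encodes the labels: the {1000, 1} column broadcasts against [0, 1], yielding a {1000, 2} matrix. A minimal sketch of the same trick on two labels:

# 0 maps to [1, 0], 1 maps to [0, 1]
Nx.tensor([[0], [1]]) |> Nx.equal(Nx.tensor([0, 1]))
# => u8 tensor [[1, 0], [0, 1]]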

batch_size = 25

train_inputs = Nx.to_batched(inputs, batch_size)
train_labels = Nx.to_batched(labels, batch_size)
train_batches = Stream.zip(train_inputs, train_labels)
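
Each element of train_batches is then an {inputs, labels} pair of {25, 2} tensors (1000 rows / 25 per batch = 40 batches); a quick sanity check, assuming the bindings above:

{first_inputs, first_labels} = Enum.at(train_batches, 0)
{Nx.shape(first_inputs), Nx.shape(first_labels)}
# => {{25, 2}, {25, 2}}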

Define the model and training

model =
  Axon.input("data")
  |> Axon.dense(100, activation: :sigmoid)
  |> Axon.dense(30, activation: :sigmoid)
  |> Axon.dense(2, activation: :softmax)
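
The trainer below initializes and compiles this model on its first run. As a side check, the initial parameters can also be built explicitly; this sketch assumes Axon's default auto-generated name "dense_0" for the first dense layer:

{init_fn, _predict_fn} = Axon.build(model)
params = init_fn.(Nx.template({1, 2}, :f32), %{})
Nx.shape(params["dense_0"]["kernel"])
# => {2, 100}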

loop = Axon.Loop.trainer(model, :categorical_cross_entropy, Axon.Optimizers.rmsprop(0.001))
epochs = 100

Axon.Loop.run(loop, train_batches, %{}, epochs: epochs, compiler: EXLA)

Bind a variable to randomly generated data

_var =
  0..9999
  |> Enum.map(fn _i -> :random.uniform() end)
  |> Enum.join("\n")
  |> String.split("\n")
  |> Enum.with_index(1)
  |> Enum.map(fn {loss, index} ->
    %{loss: String.to_float(loss), epoch: index, type: "training"}
  end)
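
This cell stringifies 10,000 random floats with Enum.join/2 only to split and parse them back, so _var ends up as a plain list of maps. A hypothetical equivalent without the join/split round-trip, shown only for clarity (it uses the non-deprecated :rand and is not the code that triggers the slowdown):

_var_equivalent =
  1..10_000
  |> Enum.map(fn index ->
    %{loss: :rand.uniform(), epoch: index, type: "training"}
  end)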

Re-run the same loop --> slower

Axon.Loop.run(loop, train_batches, %{}, epochs: epochs, compiler: EXLA)