06 · Define train_loop_config
The `train_loop_config` is a simple dictionary of hyperparameters that Ray passes into your training loop (`train_loop_ray_train`). It acts as the bridge between the `TorchTrainer` and your per-worker training code. Anything defined here becomes available inside the `config` argument of `train_loop_ray_train`.
In this example we define:

- `num_epochs` → how many full passes through the dataset to run.
- `global_batch_size` → the total batch size across all workers (Ray will split this evenly across GPUs).
You can add other parameters here (like `learning_rate`, `embedding_dim`, etc.) and they'll automatically be accessible in your training loop via `config["param_name"]`.
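As a minimal sketch (plain Python, no Ray required), the loop below shows how values from the config dictionary surface inside the training loop's `config` argument; `learning_rate` and its default value are illustrative assumptions, not keys this example actually defines:

```python
# Sketch: how train_loop_config values surface inside the training loop.
# learning_rate is a hypothetical extra parameter, shown here only to
# illustrate config.get() with a fallback default.
def train_loop_ray_train(config):
    num_epochs = config["num_epochs"]                  # required key
    learning_rate = config.get("learning_rate", 1e-3)  # optional, with default
    for epoch in range(num_epochs):
        # the real forward/backward/optimizer steps would go here
        print(f"epoch {epoch}: lr={learning_rate}")

train_loop_ray_train({"num_epochs": 2, "global_batch_size": 128})
```

Because the loop only ever reads from `config`, adding a new hyperparameter is a one-line change to the dictionary, with no change to the `TorchTrainer` wiring.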
# 06. Define the configuration dictionary passed into the training loop
# train_loop_config is provided to TorchTrainer and injected into
# train_loop_ray_train(config) as the "config" argument.
# → Any values defined here are accessible inside the training loop.
train_loop_config = {
    "num_epochs": 2,          # Number of full passes through the dataset
    "global_batch_size": 128  # Effective batch size across ALL workers
                              # (Ray will split this evenly per worker, e.g.
                              #  with 8 workers → 16 samples/worker/step)
}
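To make the even split concrete, here is the same arithmetic as a small standalone helper; `per_worker_batch_size` is our own illustrative name, not part of Ray's API:

```python
# Standalone sketch of the even split Ray applies to global_batch_size.
# per_worker_batch_size is a hypothetical helper, not a Ray API.
def per_worker_batch_size(config, num_workers):
    global_bs = config["global_batch_size"]
    # Ray divides the global batch across workers; keep it evenly divisible
    # so every worker sees the same number of samples per step.
    assert global_bs % num_workers == 0, "global batch must divide evenly"
    return global_bs // num_workers

config = {"num_epochs": 2, "global_batch_size": 128}
print(per_worker_batch_size(config, num_workers=8))  # → 16
```

This is why the comment above calls 128 the *effective* batch size: gradients are computed on 16-sample micro-batches per GPU, but after synchronization each optimizer step reflects all 128 samples.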