All NeuralForecast models are “global”: they are trained on all the series in the input pd.DataFrame Y_df. For now, however, the optimization objective is “univariate”, as it does not consider interactions between the output predictions across time series. Like the StatsForecast library, core.NeuralForecast lets you explore collections of models efficiently, and it contains functions for convenient wrangling of input and output prediction pd.DataFrames.

First, we load the AirPassengers dataset so that you can run all the examples.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# The Auto* model classes used in the examples below live in neuralforecast.auto
from neuralforecast.auto import (
    AutoRNN, AutoLSTM, AutoGRU, AutoTCN, AutoDeepAR, AutoDilatedRNN, AutoBiTCN,
    AutoMLP, AutoNBEATS, AutoNBEATSx, AutoNHITS, AutoDLinear, AutoNLinear,
    AutoTiDE, AutoDeepNPTS, AutoTFT, AutoVanillaTransformer, AutoInformer,
    AutoAutoformer, AutoFEDformer, AutoPatchTST, AutoiTransformer,
    AutoTimesNet, AutoStemGNN,
)
from neuralforecast.tsdataset import TimeSeriesDataset
from neuralforecast.utils import AirPassengersDF as Y_df

# Split train/test and declare the time series dataset
Y_train_df = Y_df[Y_df.ds <= '1959-12-31']  # 132 monthly observations for train
Y_test_df = Y_df[Y_df.ds > '1959-12-31']    # 12 monthly observations for test
dataset, *_ = TimeSeriesDataset.from_df(Y_train_df)
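
To connect this with the core class described above, here is a minimal sketch of running a small collection of automatic models through core.NeuralForecast on the train split; the configurations are illustrative, chosen only to keep the example fast:

from neuralforecast import NeuralForecast

models = [
    AutoMLP(h=12, config=dict(max_steps=2, val_check_steps=1, input_size=12),
            num_samples=1, cpus=1),
    AutoNHITS(h=12, config=dict(max_steps=2, val_check_steps=1, input_size=12,
                                mlp_units=3 * [[8, 8]]), num_samples=1, cpus=1),
]
nf = NeuralForecast(models=models, freq='M')  # train all models globally
nf.fit(df=Y_train_df)
Y_hat_df = nf.predict()  # one forecast column per model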

1. Automatic Forecasting

A. RNN-Based



AutoRNN

 AutoRNN (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoRNN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoRNN(h=12, config=config, num_samples=1, cpus=1)

model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoRNN(h=12, config=None, num_samples=1, cpus=1, backend='optuna')
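
The config dictionaries in these examples pin every hyperparameter to a single value; in a real search the dictionary usually mixes fixed values with ray.tune samplers. A minimal sketch, with illustrative sampled ranges:

from ray import tune

config = dict(
    max_steps=2,                                # fixed value
    val_check_steps=1,                          # fixed value
    input_size=tune.choice([-1, 12]),           # sampled per trial
    encoder_hidden_size=tune.choice([8, 16]),   # sampled per trial
    learning_rate=tune.loguniform(1e-4, 1e-1),  # sampled per trial
)
model = AutoRNN(h=12, config=config, num_samples=4, cpus=1)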


AutoLSTM

 AutoLSTM (h, loss=MAE(), valid_loss=None, config=None,
           search_alg=BasicVariantGenerator(), num_samples=10,
           refit_with_val=False, cpus=4, gpus=0, verbose=False,
           alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoLSTM.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoLSTM(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoLSTM(h=12, config=None, backend='optuna')
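
With backend='optuna', the parameter table above notes that config may instead be a function receiving an optuna trial and returning a configuration dict. A minimal sketch under that contract:

def optuna_config(trial):
    # The trial object samples the search space; fixed entries stay as-is
    return dict(
        max_steps=2,
        val_check_steps=1,
        input_size=-1,
        encoder_hidden_size=trial.suggest_categorical('encoder_hidden_size', [8, 16]),
    )

model = AutoLSTM(h=12, config=optuna_config, num_samples=2, backend='optuna')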


AutoGRU

 AutoGRU (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoGRU.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoGRU(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoGRU(h=12, config=None, backend='optuna')
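
The callbacks argument follows the chosen backend's callback protocol (see the ray and optuna references in the parameter tables). As a sketch for the optuna backend, assuming callbacks are forwarded to the study as in the optuna recipe linked above, a callback is any callable taking (study, trial):

def log_trial(study, trial):
    # Report each finished trial's objective value and sampled parameters
    print(f'trial {trial.number}: value={trial.value} params={trial.params}')

model = AutoGRU(h=12, config=None, num_samples=2, backend='optuna',
                callbacks=[log_trial])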


AutoTCN

 AutoTCN (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoTCN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoTCN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTCN(h=12, config=None, backend='optuna')


AutoDeepAR

 AutoDeepAR (h, loss=DistributionLoss(), valid_loss=MQLoss(), config=None,
             search_alg=BasicVariantGenerator(), num_samples=10,
             refit_with_val=False, cpus=4, gpus=0, verbose=False,
             alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | DistributionLoss | DistributionLoss() | Instantiated train loss class from losses collection.
valid_loss | MQLoss | MQLoss() | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoDeepAR.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, lstm_hidden_size=8)
model = AutoDeepAR(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDeepAR(h=12, config=None, backend='optuna')
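
Unlike the point-forecast classes above, AutoDeepAR defaults to a probabilistic train loss (DistributionLoss) validated with MQLoss. A sketch overriding the distribution and prediction levels, with illustrative values and reusing the config defined above:

from neuralforecast.losses.pytorch import DistributionLoss, MQLoss

model = AutoDeepAR(h=12,
                   loss=DistributionLoss(distribution='StudentT', level=[80, 90]),
                   valid_loss=MQLoss(level=[80, 90]),
                   config=config, num_samples=1, cpus=1)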


AutoDilatedRNN

 AutoDilatedRNN (h, loss=MAE(), valid_loss=None, config=None,
                 search_alg=BasicVariantGenerator(), num_samples=10,
                 refit_with_val=False, cpus=4, gpus=0, verbose=False,
                 alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoDilatedRNN.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=-1, encoder_hidden_size=8)
model = AutoDilatedRNN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDilatedRNN(h=12, config=None, backend='optuna')


AutoBiTCN

 AutoBiTCN (h, loss=MAE(), valid_loss=None, config=None,
            search_alg=BasicVariantGenerator(), num_samples=10,
            refit_with_val=False, cpus=4, gpus=0, verbose=False,
            alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoBiTCN.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoBiTCN(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoBiTCN(h=12, config=None, backend='optuna')

B. MLP-Based



AutoMLP

 AutoMLP (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoMLP.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoMLP(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoMLP(h=12, config=None, backend='optuna')
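
The alias parameter only renames a model, which helps when comparing several searches over the same class side by side; a brief sketch with illustrative hyperparameter values:

# Two AutoMLP searches distinguished by alias in downstream outputs
model_small = AutoMLP(h=12, config=dict(max_steps=2, val_check_steps=1,
                                        input_size=12, hidden_size=8),
                      num_samples=1, cpus=1, alias='AutoMLP_small')
model_large = AutoMLP(h=12, config=dict(max_steps=2, val_check_steps=1,
                                        input_size=12, hidden_size=64),
                      num_samples=1, cpus=1, alias='AutoMLP_large')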


AutoNBEATS

 AutoNBEATS (h, loss=MAE(), valid_loss=None, config=None,
             search_alg=BasicVariantGenerator(), num_samples=10,
             refit_with_val=False, cpus=4, gpus=0, verbose=False,
             alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoNBEATS.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12,
              mlp_units=3*[[8, 8]])
model = AutoNBEATS(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNBEATS(h=12, config=None, backend='optuna')


AutoNBEATSx

 AutoNBEATSx (h, loss=MAE(), valid_loss=None, config=None,
              search_alg=BasicVariantGenerator(), num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoNBEATSx.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12,
              mlp_units=3*[[8, 8]])
model = AutoNBEATSx(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNBEATSx(h=12, config=None, backend='optuna')


AutoNHITS

 AutoNHITS (h, loss=MAE(), valid_loss=None, config=None,
            search_alg=BasicVariantGenerator(), num_samples=10,
            refit_with_val=False, cpus=4, gpus=0, verbose=False,
            alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoNHITS.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12, 
              mlp_units=3 * [[8, 8]])
model = AutoNHITS(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNHITS(h=12, config=None, backend='optuna')
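
The comments above mention each class's default_config. A sketch that starts from AutoNHITS's default ray search space and pins a few entries; this assumes default_config is a plain dict of tune samplers, so copy it before mutating:

config = AutoNHITS.default_config.copy()
config['max_steps'] = 2        # pin entries we do not want searched
config['val_check_steps'] = 1
model = AutoNHITS(h=12, config=config, num_samples=1, cpus=1)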


AutoDLinear

 AutoDLinear (h, loss=MAE(), valid_loss=None, config=None,
              search_alg=BasicVariantGenerator(), num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoDLinear.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoDLinear(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDLinear(h=12, config=None, backend='optuna')


AutoNLinear

 AutoNLinear (h, loss=MAE(), valid_loss=None, config=None,
              search_alg=BasicVariantGenerator(), num_samples=10,
              refit_with_val=False, cpus=4, gpus=0, verbose=False,
              alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoNLinear.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoNLinear(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoNLinear(h=12, config=None, backend='optuna')


AutoTiDE

 AutoTiDE (h, loss=MAE(), valid_loss=None, config=None,
           search_alg=BasicVariantGenerator(), num_samples=10,
           refit_with_val=False, cpus=4, gpus=0, verbose=False,
           alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoTiDE.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoTiDE(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTiDE(h=12, config=None, backend='optuna')


AutoDeepNPTS

 AutoDeepNPTS (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=BasicVariantGenerator(), num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoDeepNPTS.default_config
config = dict(max_steps=2, val_check_steps=1, input_size=12)
model = AutoDeepNPTS(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoDeepNPTS(h=12, config=None, backend='optuna')

C. Transformer-Based



AutoTFT

 AutoTFT (h, loss=MAE(), valid_loss=None, config=None,
          search_alg=BasicVariantGenerator(), num_samples=10,
          refit_with_val=False, cpus=4, gpus=0, verbose=False,
          alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoTFT.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoTFT(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTFT(h=12, config=None, backend='optuna')


AutoVanillaTransformer

 AutoVanillaTransformer (h, loss=MAE(), valid_loss=None, config=None,
                         search_alg=BasicVariantGenerator(),
                         num_samples=10, refit_with_val=False, cpus=4,
                         gpus=0, verbose=False, alias=None, backend='ray',
                         callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoVanillaTransformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoVanillaTransformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoVanillaTransformer(h=12, config=None, backend='optuna')


AutoInformer

 AutoInformer (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=BasicVariantGenerator(), num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoInformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoInformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoInformer(h=12, config=None, backend='optuna')


AutoAutoformer

 AutoAutoformer (h, loss=MAE(), valid_loss=None, config=None,
                 search_alg=BasicVariantGenerator(), num_samples=10,
                 refit_with_val=False, cpus=4, gpus=0, verbose=False,
                 alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoAutoformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=8)
model = AutoAutoformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoAutoformer(h=12, config=None, backend='optuna')


AutoFEDformer

 AutoFEDformer (h, loss=MAE(), valid_loss=None, config=None,
                search_alg=BasicVariantGenerator(), num_samples=10,
                refit_with_val=False, cpus=4, gpus=0, verbose=False,
                alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoFEDformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=64)
model = AutoFEDformer(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoFEDformer(h=12, config=None, backend='optuna')


AutoPatchTST

 AutoPatchTST (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=BasicVariantGenerator(), num_samples=10,
               refit_with_val=False, cpus=4, gpus=0, verbose=False,
               alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoPatchTST.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=16)
model = AutoPatchTST(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoPatchTST(h=12, config=None, backend='optuna')


AutoiTransformer

 AutoiTransformer (h, n_series, loss=MAE(), valid_loss=None, config=None,
                   search_alg=BasicVariantGenerator(), num_samples=10,
                   refit_with_val=False, cpus=4, gpus=0, verbose=False,
                   alias=None, backend='ray', callbacks=None)

Class for automatic hyperparameter optimization. It builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

Note that the success of this hyperparameter optimization relies heavily on a strong correlation between the validation and test periods.

Parameter | Type | Default | Details
h | int | | Forecast horizon.
n_series | int | | Number of time series.
loss | MAE | MAE() | Instantiated train loss class from losses collection.
valid_loss | NoneType | None | Instantiated valid loss class from losses collection.
config | NoneType | None | Dictionary with a ray.tune-defined search space, or a function that takes an optuna trial and returns a configuration dict.
search_alg | BasicVariantGenerator | BasicVariantGenerator() | For ray, see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html; for optuna, see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html.
num_samples | int | 10 | Number of hyperparameter optimization steps/samples.
refit_with_val | bool | False | Refit of the best model should preserve val_size.
cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune.
gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune.
verbose | bool | False | Track progress.
alias | NoneType | None | Custom name of the model.
backend | str | ray | Backend to use for searching the hyperparameter space; either 'ray' or 'optuna'.
callbacks | NoneType | None | List of functions to call during the optimization process. For ray, see https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html; for optuna, see https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html.
# Use your own config or AutoiTransformer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=16)
model = AutoiTransformer(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoiTransformer(h=12, n_series=1, config=None, backend='optuna')
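
The `search_alg` argument swaps the default random/grid search for any other ray tune searcher, e.g. Bayesian optimization. A hedged sketch (assumes the optional `hyperopt` package is installed; the search space is illustrative):

from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch

config = dict(max_steps=1, val_check_steps=1,
              input_size=tune.choice([12, 24]), hidden_size=tune.choice([16, 32]))
model = AutoiTransformer(h=12, n_series=1, config=config,
                         search_alg=HyperOptSearch(), num_samples=1, cpus=1)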

D. CNN-Based


source

AutoTimesNet

 AutoTimesNet (h, loss=MAE(), valid_loss=None, config=None,
               search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e24e20>,
               num_samples=10, refit_with_val=False, cpus=4, gpus=0,
               verbose=False, alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization; it builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | `<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e24e20>` | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html<br>For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process.<br>ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html<br>optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoTimesNet.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=32)
model = AutoTimesNet(h=12, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTimesNet(h=12, config=None, backend='optuna')
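
As the `valid_loss` row above notes, the loss used for hyperparameter selection can differ from the training loss. A small sketch, training with MAE while selecting configurations by validation MSE (both losses exist in neuralforecast.losses.pytorch):

from neuralforecast.losses.pytorch import MAE, MSE

config = dict(max_steps=1, val_check_steps=1, input_size=12, hidden_size=32)
model = AutoTimesNet(h=12, loss=MAE(), valid_loss=MSE(),
                     config=config, num_samples=1, cpus=1)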

E. Multivariate


source

AutoStemGNN

 AutoStemGNN (h, n_series, loss=MAE(), valid_loss=None, config=None,
              search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e35660>,
              num_samples=10, refit_with_val=False, cpus=4, gpus=0,
              verbose=False, alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization; it builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | `<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e35660>` | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html<br>For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process.<br>ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html<br>optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoStemGNN.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoStemGNN(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoStemGNN(h=12, n_series=1, config=None, backend='optuna')
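
Setting `refit_with_val=True` makes the final refit of the best configuration preserve the validation window. A sketch, under the assumption that `fit` accepts a `val_size` argument as in the other Auto models:

config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoStemGNN(h=12, n_series=1, config=config,
                    refit_with_val=True, num_samples=1, cpus=1)
model.fit(dataset=dataset, val_size=12)  # hold out the last 12 observations for validation
y_hat = model.predict(dataset=dataset)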

source

AutoHINT

 AutoHINT (cls_model, h, loss, valid_loss, S, config,
           search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator
           object at 0x7f4020ef36d0>, num_samples=10, cpus=4, gpus=0,
           refit_with_val=False, verbose=False, alias=None, backend='ray',
           callbacks=None)

*Class for Automatic Hyperparameter Optimization; it builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| | Type | Default | Details |
|---|---|---|---|
| cls_model | PyTorch/PyTorchLightning model | | See the neuralforecast.models collection. |
| h | int | | Forecast horizon |
| loss | PyTorch module | | Instantiated train loss class from losses collection. |
| valid_loss | PyTorch module | | Instantiated valid loss class from losses collection. |
| S | np.ndarray | | Hierarchical aggregation (summing) matrix, e.g. S_df.values in the example below. |
| config | dict or callable | | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | `<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020ef36d0>` | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html<br>For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process.<br>ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html<br>optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Perform a simple hyperparameter optimization with
# NHITS and then reconcile with HINT
from neuralforecast.losses.pytorch import GMM, sCRPS
from neuralforecast.models import HINT, NHITS

# Assumes a hierarchical setup defined earlier: `S_df` (summing matrix as a
# DataFrame), `hint_dataset` (dataset including the aggregate series) and
# `quantiles` (e.g. [0.25, 0.5, 0.75]).
base_config = dict(max_steps=1, val_check_steps=1, input_size=8)
base_model = AutoNHITS(h=4, loss=GMM(n_components=2, quantiles=quantiles),
                       config=base_config, num_samples=1, cpus=1)
model = HINT(h=4, S=S_df.values,
             model=base_model, reconciliation='MinTraceOLS')

model.fit(dataset=dataset)
y_hat = model.predict(dataset=hint_dataset)

# Perform a joint hyperparameter optimization over the
# NHITS + HINT reconciliation configurations
from ray import tune

nhits_config = {
       "learning_rate": tune.choice([1e-3]),                                     # Initial learning rate
       "max_steps": tune.choice([1]),                                            # Number of SGD steps
       "val_check_steps": tune.choice([1]),                                      # Number of steps between validations
       "input_size": tune.choice([5 * 12]),                                      # input_size = multiplier * horizon
       "batch_size": tune.choice([7]),                                           # Number of series in windows
       "windows_batch_size": tune.choice([256]),                                 # Number of windows in batch
       "n_pool_kernel_size": tune.choice([[2, 2, 2], [16, 8, 1]]),               # MaxPool's kernel size
       "n_freq_downsample": tune.choice([[168, 24, 1], [24, 12, 1], [1, 1, 1]]), # Interpolation expressivity ratios
       "activation": tune.choice(['ReLU']),                                      # Type of non-linear activation
       "n_blocks":  tune.choice([[1, 1, 1]]),                                    # Blocks per each of the 3 stacks
       "mlp_units":  tune.choice([[[512, 512], [512, 512], [512, 512]]]),        # Two 512-unit layers per block, for each stack
       "interpolation_mode": tune.choice(['linear']),                            # Type of multi-step interpolation
       "random_seed": tune.randint(1, 10),
       "reconciliation": tune.choice(['BottomUp', 'MinTraceOLS', 'MinTraceWLS'])
    }
model = AutoHINT(h=4, S=S_df.values,
                 cls_model=NHITS,
                 config=nhits_config,
                 loss=GMM(n_components=2, level=[80, 90]),
                 valid_loss=sCRPS(level=[80, 90]),
                 num_samples=1, cpus=1)
model.fit(dataset=dataset)
y_hat = model.predict(dataset=hint_dataset)

source

AutoTSMixer

 AutoTSMixer (h, n_series, loss=MAE(), valid_loss=None, config=None,
              search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e128f0>,
              num_samples=10, refit_with_val=False, cpus=4, gpus=0,
              verbose=False, alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization; it builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | `<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e128f0>` | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html<br>For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process.<br>ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html<br>optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoTSMixer.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoTSMixer(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTSMixer(h=12, n_series=1, config=None, backend='optuna')
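
Auto models can also be run through the core.NeuralForecast wrapper, which handles the input/output DataFrame wrangling. A minimal sketch on the monthly train split loaded at the top of this page:

from neuralforecast import NeuralForecast

nf = NeuralForecast(models=[AutoTSMixer(h=12, n_series=1, config=config,
                                        num_samples=1, cpus=1)], freq='M')
nf.fit(df=Y_train_df)
Y_hat_df = nf.predict()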

source

AutoTSMixerx

 AutoTSMixerx (h, n_series, loss=MAE(), valid_loss=None, config=None,
               search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e84d30>,
               num_samples=10, refit_with_val=False, cpus=4, gpus=0,
               verbose=False, alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization; it builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | `<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020e84d30>` | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html<br>For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process.<br>ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html<br>optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoTSMixerx.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoTSMixerx(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoTSMixerx(h=12, n_series=1, config=None, backend='optuna')

source

AutoMLPMultivariate

 AutoMLPMultivariate (h, n_series, loss=MAE(), valid_loss=None, config=None,
                      search_alg=<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020f8b100>,
                      num_samples=10, refit_with_val=False, cpus=4, gpus=0,
                      verbose=False, alias=None, backend='ray', callbacks=None)

*Class for Automatic Hyperparameter Optimization; it builds on top of ray to give access to a wide variety of hyperparameter optimization tools, ranging from classic grid search to Bayesian optimization and the HyperBand algorithm.

The validation loss to be optimized is defined by the config['loss'] dictionary value; the config also contains the rest of the hyperparameter search space.

It is important to note that the success of this hyperparameter optimization heavily relies on a strong correlation between the validation and test periods.*

| | Type | Default | Details |
|---|---|---|---|
| h | int | | Forecast horizon |
| n_series | int | | Number of time series. |
| loss | MAE | MAE() | Instantiated train loss class from losses collection. |
| valid_loss | NoneType | None | Instantiated valid loss class from losses collection. |
| config | NoneType | None | Dictionary with ray.tune defined search space or function that takes an optuna trial and returns a configuration dict. |
| search_alg | BasicVariantGenerator | `<ray.tune.search.basic_variant.BasicVariantGenerator object at 0x7f4020f8b100>` | For ray see https://docs.ray.io/en/latest/tune/api_docs/suggestion.html<br>For optuna see https://optuna.readthedocs.io/en/stable/reference/samplers/index.html. |
| num_samples | int | 10 | Number of hyperparameter optimization steps/samples. |
| refit_with_val | bool | False | Refit of best model should preserve val_size. |
| cpus | int | 4 | Number of cpus to use during optimization. Only used with ray tune. |
| gpus | int | 0 | Number of gpus to use during optimization, default all available. Only used with ray tune. |
| verbose | bool | False | Track progress. |
| alias | NoneType | None | Custom name of the model. |
| backend | str | ray | Backend to use for searching the hyperparameter space, can be either 'ray' or 'optuna'. |
| callbacks | NoneType | None | List of functions to call during the optimization process.<br>ray reference: https://docs.ray.io/en/latest/tune/tutorials/tune-metrics.html<br>optuna reference: https://optuna.readthedocs.io/en/stable/tutorial/20_recipes/007_optuna_callback.html |
# Use your own config or AutoMLPMultivariate.default_config
config = dict(max_steps=1, val_check_steps=1, input_size=12)
model = AutoMLPMultivariate(h=12, n_series=1, config=config, num_samples=1, cpus=1)

# Fit and predict
model.fit(dataset=dataset)
y_hat = model.predict(dataset=dataset)

# Optuna
model = AutoMLPMultivariate(h=12, n_series=1, config=None, backend='optuna')
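
The comments in the examples above refer to each class's `default_config`. Assuming it is a plain dict of ray.tune samplers (hedged; the attribute layout may differ across versions), you can copy and narrow it before tuning:

config = dict(AutoMLPMultivariate.default_config)  # copy the default search space
config['max_steps'] = 1                            # override entries, e.g. for a quick smoke test
model = AutoMLPMultivariate(h=12, n_series=1, config=config, num_samples=1, cpus=1)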

TESTS