Arcana Models: sequence_to_sequence

Submodules

arcana.models.sequence_to_sequence.seq2seq_factory module

Factory class for creating Seq2Seq models.

class arcana.models.sequence_to_sequence.seq2seq_factory.Seq2SeqFactory(config)

Bases: object

Factory class for creating Seq2Seq models.

count_parameters()

Count the number of trainable parameters in a model.

Returns:

num_params (int) – The number of trainable parameters
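A parameter count of this kind is conventionally implemented by summing the element counts of all tensors that require gradients. The following is a generic sketch, not necessarily the arcana implementation:

```python
import torch.nn as nn

def count_parameters(model: nn.Module) -> int:
    """Count trainable parameters: sum of element counts of all
    tensors with requires_grad=True."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# A Linear(4, 3) layer has a 3x4 weight plus a 3-element bias: 15 parameters.
layer = nn.Linear(4, 3)
print(count_parameters(layer))  # 15
```

Frozen parameters (those with `requires_grad=False`) are excluded from the count.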

create_additive_model()

Create an additive model. The method takes no arguments; it uses the configuration dictionary supplied when the Seq2SeqFactory is instantiated.

Returns:

seq2seq (Seq2Seq) – The additive model
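"Additive" attention conventionally refers to the Bahdanau-style score `v^T tanh(W_q q + W_k k)`. The sketch below illustrates that mechanism in isolation; it is an assumption about what the additive model uses internally, not arcana's actual implementation:

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score = v^T tanh(W_q q + W_k k)."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.w_q = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.w_k = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query: torch.Tensor, keys: torch.Tensor):
        # query: (batch, hidden); keys: (batch, seq, hidden)
        scores = self.v(torch.tanh(self.w_q(query).unsqueeze(1) + self.w_k(keys)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)  # (batch, seq)
        context = torch.bmm(weights.unsqueeze(1), keys)      # (batch, 1, hidden)
        return context.squeeze(1), weights

attn = AdditiveAttention(hidden_dim=8)
context, weights = attn(torch.randn(2, 8), torch.randn(2, 5, 8))
print(context.shape, weights.shape)  # torch.Size([2, 8]) torch.Size([2, 5])
```

The softmax guarantees the attention weights over each source sequence sum to one.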

create_multihead_model()

Create a multihead model.

Returns:

seq2seq (Seq2Seq) – The multihead model
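Multi-head attention of the kind a multihead model builds on is available directly in PyTorch as torch.nn.MultiheadAttention. The following self-attention sketch is illustrative only; arcana's internal decoder wiring may differ:

```python
import torch
import torch.nn as nn

# Self-attention over a batch of sequences:
# batch_size=2, seq_length=5, embed_dim=16, split across 4 heads.
attn = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
query = torch.randn(2, 5, 16)
out, attn_weights = attn(query, query, query)  # query = key = value
print(out.shape)           # torch.Size([2, 5, 16])
print(attn_weights.shape)  # torch.Size([2, 5, 5]) (averaged over heads)
```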

print_weights(layer)

Print the weights of a layer.

Parameters:

layer (torch.nn.Module) – The layer whose weights will be printed
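A minimal sketch of such a weight printer, assuming it simply iterates over the layer's named parameters (the arcana version may format the output differently):

```python
import torch.nn as nn

def print_weights(layer: nn.Module) -> None:
    """Print each named parameter tensor of a layer."""
    for name, param in layer.named_parameters():
        print(f"{name}: {param.data}")

print_weights(nn.Linear(2, 2))  # prints the 'weight' and 'bias' tensors
```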

arcana.models.sequence_to_sequence.sequence_to_sequence module

Sequence-to-sequence model for time series forecasting.

class arcana.models.sequence_to_sequence.sequence_to_sequence.Seq2Seq(*args: Any, **kwargs: Any)

Bases: Module

Seq2Seq module

forward(source, target, source_lengths, teacher_forcing_ratio, start_position)

Forward pass for the seq2seq model. The forward pass proceeds as follows:

  1. Get the encoder outputs.
  2. Iterate over the target sequence in windows of a specific length.
  3. Get the prediction from the decoder.
  4. Concatenate the exogenous variables with the prediction.
  5. Store the prediction in the output tensor.

Parameters:
  • source (torch.Tensor) – source tensor (batch_size, seq_length, input_size)

  • target (torch.Tensor) – target tensor (batch_size, seq_length, output_size)

  • source_lengths (torch.Tensor) – source lengths (batch_size)

  • teacher_forcing_ratio (float) – teacher forcing ratio

  • start_position (int) – start position of the prediction

Returns:

outputs (torch.Tensor) – outputs (num_quantiles, batch_size, seq_length, output_size)
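The teacher_forcing_ratio controls, at each decoding step, whether the decoder is fed the ground-truth target or its own previous prediction. The helper below is a generic sketch of that decision (the function name and variables are illustrative, not arcana's API):

```python
import random
import torch

def choose_decoder_input(prediction: torch.Tensor,
                         target_step: torch.Tensor,
                         teacher_forcing_ratio: float) -> torch.Tensor:
    """With probability teacher_forcing_ratio, feed the ground-truth target
    step back into the decoder; otherwise feed the model's own prediction."""
    if random.random() < teacher_forcing_ratio:
        return target_step  # teacher forcing: use ground truth
    return prediction       # free running: use own prediction

# A ratio of 1.0 always selects the target; 0.0 always selects the prediction.
pred, tgt = torch.zeros(2, 3), torch.ones(2, 3)
assert torch.equal(choose_decoder_input(pred, tgt, 1.0), tgt)
assert torch.equal(choose_decoder_input(pred, tgt, 0.0), pred)
```

Ratios strictly between 0 and 1 mix both regimes across decoding steps, which typically stabilizes training early on while still exposing the model to its own errors.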

Module contents