Chambon2018 documentation

This page details the implementation of the Chambon2018 model proposed by Chambon et al. (2018).

To train the model, run the `train -experiment chambon2018` command.

physioex.train.networks.chambon2018.Chambon2018Net

Bases: SleepModule

Source code in physioex/train/networks/chambon2018.py
class Chambon2018Net(SleepModule):
    def __init__(self, module_config: dict = module_config):
        """
        The Chambon2018Net class extends SleepModule. This class is a wrapper for the core Chambon2018 network to be trained inside physioex.

        Args:
            module_config (dict): A dictionary containing the module configuration. Defaults to `module_config`.

        Attributes:
            All attributes are inherited from `SleepModule`.
        """
        super(Chambon2018Net, self).__init__(Net(module_config), module_config)

    def compute_loss(
        self,
        embeddings,
        outputs,
        targets,
        log: str = "train",
        log_metrics: bool = False,
    ):
        """
        Computes the loss for the Chambon2018Net model. This is necessary because Chambon2018 is a multi-input-single-output model, while the base class is a multi-input-multi-output (sequence-to-sequence) model.

        Args:
            embeddings (torch.Tensor): The embeddings tensors.
            outputs (torch.Tensor): The model output tensors.
            targets (torch.Tensor): The target tensors.
            log (str): The logging information. Defaults to "train".
            log_metrics (bool): Whether to log metrics. Defaults to False.

        Returns:
            torch.Tensor: The computed loss value.
        """
        batch_size, n_class = outputs.size()
        outputs = outputs.reshape(batch_size, 1, n_class)
        embeddings = embeddings.reshape(batch_size, 1, -1)

        return super().compute_loss(embeddings, outputs, targets, log, log_metrics)
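
The reshape above can be sketched in isolation: a single-output batch of shape `(batch_size, n_class)` is lifted to a sequence of length one, `(batch_size, 1, n_class)`, so the sequence-to-sequence base class can consume it unchanged. A minimal sketch in plain PyTorch (the sizes are illustrative, not from physioex):

```python
import torch

# Illustrative sizes: 4 sequences, 5 sleep stages, 16-dim embeddings.
batch_size, n_class, feat_dim = 4, 5, 16
outputs = torch.randn(batch_size, n_class)       # one prediction per sequence
embeddings = torch.randn(batch_size, feat_dim)   # one embedding per sequence

# Lift both to sequences of length one, as compute_loss does.
outputs_seq = outputs.reshape(batch_size, 1, n_class)
embeddings_seq = embeddings.reshape(batch_size, 1, -1)

print(outputs_seq.shape)     # torch.Size([4, 1, 5])
print(embeddings_seq.shape)  # torch.Size([4, 1, 16])
```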

__init__(module_config=module_config)

The Chambon2018Net class extends SleepModule. This class is a wrapper for the core Chambon2018 network to be trained inside physioex.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module_config` | `dict` | A dictionary containing the module configuration. | `module_config` |
Source code in physioex/train/networks/chambon2018.py
def __init__(self, module_config: dict = module_config):
    """
    The Chambon2018Net class extends SleepModule. This class is a wrapper for the core Chambon2018 network to be trained inside physioex.

    Args:
        module_config (dict): A dictionary containing the module configuration. Defaults to `module_config`.

    Attributes:
        All attributes are inherited from `SleepModule`.
    """
    super(Chambon2018Net, self).__init__(Net(module_config), module_config)

compute_loss(embeddings, outputs, targets, log='train', log_metrics=False)

Computes the loss for the Chambon2018Net model. This is necessary because Chambon2018 is a multi-input-single-output model, while the base class is a multi-input-multi-output (sequence-to-sequence) model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `embeddings` | `Tensor` | The embeddings tensors. | required |
| `outputs` | `Tensor` | The model output tensors. | required |
| `targets` | `Tensor` | The target tensors. | required |
| `log` | `str` | The logging prefix. | `'train'` |
| `log_metrics` | `bool` | Whether to log metrics. | `False` |

Returns:

| Type | Description |
| --- | --- |
| `torch.Tensor` | The computed loss value. |

Source code in physioex/train/networks/chambon2018.py
def compute_loss(
    self,
    embeddings,
    outputs,
    targets,
    log: str = "train",
    log_metrics: bool = False,
):
    """
    Computes the loss for the Chambon2018Net model. This is necessary because Chambon2018 is a multi-input-single-output model, while the base class is a multi-input-multi-output (sequence-to-sequence) model.

    Args:
        embeddings (torch.Tensor): The embeddings tensors.
        outputs (torch.Tensor): The model output tensors.
        targets (torch.Tensor): The target tensors.
        log (str): The logging information. Defaults to "train".
        log_metrics (bool): Whether to log metrics. Defaults to False.

    Returns:
        torch.Tensor: The computed loss value.
    """
    batch_size, n_class = outputs.size()
    outputs = outputs.reshape(batch_size, 1, n_class)
    embeddings = embeddings.reshape(batch_size, 1, -1)

    return super().compute_loss(embeddings, outputs, targets, log, log_metrics)

physioex.train.networks.chambon2018.Net

Bases: Module

Source code in physioex/train/networks/chambon2018.py
class Net(nn.Module):
    def __init__(self, module_config=module_config):
        """
        The Net class extends nn.Module. This class implements the core network proposed by Chambon et al. in 2018.
        The network consists of an epoch encoder, a concatenation layer, and a classification layer.
        Multiple epochs are concatenated and fed to a linear classifier, which predicts the sleep stage of the middle epoch of the sequence.

        Args:
            module_config (dict): A dictionary containing the module configuration. Defaults to `module_config`.

        Attributes:
            epoch_encoder (SleepStagerChambon2018): The epoch encoder.
            clf (nn.Linear): The linear classifier.
            drop (nn.Dropout): A dropout module to prevent overfitting.
        """
        super().__init__()

        print(module_config["in_channels"])
        self.epoch_encoder = SleepStagerChambon2018(
            n_chans=module_config["in_channels"],
            sfreq=module_config["sfreq"],
            n_outputs=module_config["n_classes"],
            n_times=module_config["n_times"],
            return_feats=True,
        )

        self.clf = nn.Linear(
            self.epoch_encoder.len_last_layer * module_config["seq_len"],
            module_config["n_classes"],
        )

        self.drop = nn.Dropout(0.5)

    def forward(self, x):
        """
        Implements the forward pass of the module.

        Args:
            x (torch.Tensor): The input tensor.

        Returns:
            torch.Tensor: The output tensor of the module.
        """
        x, y = self.encode(x)
        return y

    def encode(self, x: torch.Tensor):
        """
        Encodes the input x using the epoch encoder; returns both the encodings and the classification outcome.

        Args:
            x (torch.Tensor): The input tensor.

        Returns:
            tuple: A tuple containing the encoded input tensor and the output tensor of the module.
        """
        batch_size, seqlen, nchan, nsamp = x.size()

        x = x.reshape(-1, nchan, nsamp)

        x = self.epoch_encoder(x)

        x = x.reshape(batch_size, -1)

        y = self.drop(x)
        y = self.clf(y)

        return x, y
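
The data flow of `encode` (flatten the sequence dimension, encode each epoch independently, concatenate the per-epoch features, classify) can be reproduced with a stand-in encoder. `TinyEncoder` below is a hypothetical replacement for `SleepStagerChambon2018`, used only to make the sketch self-contained; the shapes mirror the source above:

```python
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Stand-in for SleepStagerChambon2018 with return_feats=True."""
    def __init__(self, n_chans: int, n_samples: int, feat_dim: int = 8):
        super().__init__()
        self.len_last_layer = feat_dim
        self.fc = nn.Linear(n_chans * n_samples, feat_dim)

    def forward(self, x):  # x: (n_epochs, n_chans, n_samples)
        return self.fc(x.flatten(start_dim=1))

batch, seq_len, n_chans, n_samples, n_classes = 2, 3, 1, 3000, 5
enc = TinyEncoder(n_chans, n_samples)
clf = nn.Linear(enc.len_last_layer * seq_len, n_classes)

x = torch.randn(batch, seq_len, n_chans, n_samples)
feats = enc(x.reshape(-1, n_chans, n_samples))  # (batch * seq_len, feat_dim)
feats = feats.reshape(batch, -1)                # (batch, seq_len * feat_dim)
y = clf(feats)                                  # one stage prediction per sequence

print(y.shape)  # torch.Size([2, 5])
```

Note that `y` carries a single class vector per sequence, which is exactly why `compute_loss` in `Chambon2018Net` has to lift the output back to a length-one sequence.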

__init__(module_config=module_config)

The Net class extends nn.Module. This class implements the core network proposed by Chambon et al. in 2018. The network consists of an epoch encoder, a concatenation layer, and a classification layer. Multiple epochs are concatenated and fed to a linear classifier, which predicts the sleep stage of the middle epoch of the sequence.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `module_config` | `dict` | A dictionary containing the module configuration. | `module_config` |

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `epoch_encoder` | `SleepStagerChambon2018` | The epoch encoder. |
| `clf` | `Linear` | The linear classifier. |
| `drop` | `Dropout` | A dropout module to prevent overfitting. |

Source code in physioex/train/networks/chambon2018.py
def __init__(self, module_config=module_config):
    """
    The Net class extends nn.Module. This class implements the core network proposed by Chambon et al. in 2018.
    The network consists of an epoch encoder, a concatenation layer, and a classification layer.
    Multiple epochs are concatenated and fed to a linear classifier, which predicts the sleep stage of the middle epoch of the sequence.

    Args:
        module_config (dict): A dictionary containing the module configuration. Defaults to `module_config`.

    Attributes:
        epoch_encoder (SleepStagerChambon2018): The epoch encoder.
        clf (nn.Linear): The linear classifier.
        drop (nn.Dropout): A dropout module to prevent overfitting.
    """
    super().__init__()

    print(module_config["in_channels"])
    self.epoch_encoder = SleepStagerChambon2018(
        n_chans=module_config["in_channels"],
        sfreq=module_config["sfreq"],
        n_outputs=module_config["n_classes"],
        n_times=module_config["n_times"],
        return_feats=True,
    )

    self.clf = nn.Linear(
        self.epoch_encoder.len_last_layer * module_config["seq_len"],
        module_config["n_classes"],
    )

    self.drop = nn.Dropout(0.5)

forward(x)

Implements the forward pass of the module.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | The input tensor. | required |

Returns:

| Type | Description |
| --- | --- |
| `torch.Tensor` | The output tensor of the module. |

Source code in physioex/train/networks/chambon2018.py
def forward(self, x):
    """
    Implements the forward pass of the module.

    Args:
        x (torch.Tensor): The input tensor.

    Returns:
        torch.Tensor: The output tensor of the module.
    """
    x, y = self.encode(x)
    return y

encode(x)

Encodes the input x using the epoch encoder; returns both the encodings and the classification outcome.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `x` | `Tensor` | The input tensor. | required |

Returns:

| Type | Description |
| --- | --- |
| `tuple` | A tuple containing the encoded input tensor and the output tensor of the module. |

Source code in physioex/train/networks/chambon2018.py
def encode(self, x: torch.Tensor):
    """
    Encodes the input x using the epoch encoder; returns both the encodings and the classification outcome.

    Args:
        x (torch.Tensor): The input tensor.

    Returns:
        tuple: A tuple containing the encoded input tensor and the output tensor of the module.
    """
    batch_size, seqlen, nchan, nsamp = x.size()

    x = x.reshape(-1, nchan, nsamp)

    x = self.epoch_encoder(x)

    x = x.reshape(batch_size, -1)

    y = self.drop(x)
    y = self.clf(y)

    return x, y
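
The final `reshape(batch_size, -1)` lays the per-epoch feature vectors out in sequence order, so the linear classifier sees each epoch of the context window at a fixed position. A quick check in plain PyTorch (sizes and values illustrative):

```python
import torch

batch, seq_len, feat = 2, 3, 4
# Per-epoch features as returned by the epoch encoder: one row per epoch.
feats = torch.arange(batch * seq_len * feat, dtype=torch.float32)
feats = feats.reshape(batch * seq_len, feat)

# The reshape in `encode` concatenates each sequence's epoch features in order...
flat = feats.reshape(batch, -1)  # (batch, seq_len * feat)

# ...which is equivalent to explicitly concatenating the per-epoch slices.
per_seq = feats.reshape(batch, seq_len, feat)
explicit = torch.cat([per_seq[:, i] for i in range(seq_len)], dim=1)
assert torch.equal(flat, explicit)
```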