DeepEM Playground Template

This template provides guidelines for contributing your work to the DeepEM Playground. The sections below give a short overview of its structure.

Overview

Jupyter Notebooks

Each use case consists of two main notebooks.

These notebooks serve as an interface between deep learning (DL) experts and electron microscopy (EM) experts. To ensure consistency and simplify the learning process for EM researchers, the notebooks should follow a standardized structure.

Please update the markdown cells in the notebooks to describe your specific use case.

To assist you:

Before submitting your use case, please remove all color formatting.

Project Structure

The deepEM/ folder contains a lightweight library for implementing your use case.

Your custom code implementation should be placed in the src/ folder.

For library documentation and available functions, please see below.

Model Configuration

The DeepEM library manages model parameters through a configuration file, configs/parameters.json.

Tunable vs. Non-Tunable Parameters

All parameters—both tunable and non-tunable—must be well-documented.
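To make the distinction concrete, here is a hedged sketch of what a parameter entry might look like. The field names (`value`, `tunable`, `description`) are illustrative assumptions; the actual schema is defined in configs/README.md and may differ.

```python
import json

# Hypothetical example of a configs/parameters.json entry; the real
# schema in configs/README.md may use different field names.
config = {
    "learning_rate": {
        "value": [0.001, 0.0001],   # tunable: several candidates to sweep over
        "tunable": True,
        "description": "Step size for the optimizer.",
    },
    "num_workers": {
        "value": 4,                  # non-tunable: fixed for all runs
        "tunable": False,
        "description": "Number of data-loader worker processes.",
    },
}
print(json.dumps(config, indent=2))
```

Documenting every entry this way lets EM researchers see at a glance which knobs the sweep will turn and which values stay fixed.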

For detailed documentation, see configs/README.md.


DeepEM Documentation

The DeepEM library provides a simple, PyTorch-based framework for implementing, training, tuning, and applying deep learning models. It provides the following modules:
  1. A simple Logger class for monitoring training and tuning of the model.
  2. A ModelTuner module for automatic hyperparameter tuning.
  3. An AbstractModelTrainer, which implements basic learning concepts such as early stopping and model checkpointing.
  4. An AbstractModel for implementing the deep learning model, based on the torch.nn.Module class.
  5. An AbstractInference module for running inference on single or multiple files.
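The abstract classes above follow Python's standard abstract-base-class pattern: the library defines the workflow, and you fill in the use-case-specific pieces. The following is a minimal, self-contained illustration of that pattern; the class and method names are stand-ins, not the actual deepEM API.

```python
from abc import ABC, abstractmethod

# Illustrative stand-in for the pattern deepEM uses; the names here
# are hypothetical, not the real deepEM classes.
class AbstractModel(ABC):
    @abstractmethod
    def forward(self, x):
        """Compute the model output for input x."""
        ...

class MyUseCaseModel(AbstractModel):
    # The subclass must implement every @abstractmethod before it
    # can be instantiated.
    def forward(self, x):
        return x * 2  # placeholder for a real network

model = MyUseCaseModel()
print(model.forward(3))
```

Attempting to instantiate a subclass that leaves an @abstractmethod unimplemented raises a TypeError, which is how the library enforces the contract described in the note below.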

This approach not only simplifies implementation for DL experts but also establishes a standardized workflow, reducing the learning curve for EM researchers. It enables them to modify the application of the use case simply by changing the training data, without requiring any code changes.

Note: To contribute your work to the DeepEM Playground, you need to implement every @abstractmethod within all abstract classes. All other classes function as helper classes and should only be altered if absolutely necessary.

Legend

Methods highlighted in color are abstract methods that must be implemented by the DL specialist. All other methods are helper methods and can be overridden if necessary.



AbstractModel Class

The AbstractModel class is a base class for deep learning models, extending torch.nn.Module. This class provides methods for resetting model parameters, performing the forward pass, and making predictions. It is intended to be inherited by specific model classes to define the architecture and training logic.

Class Overview

AbstractModel serves as a foundational class that defines the essential methods every deep learning model in the playground must provide.

Constructor

__init__()

Initializes the AbstractModel class, which extends torch.nn.Module.

This constructor serves as a base class for all deep learning models, providing methods for resetting model parameters (e.g., between successive runs during hyperparameter search), performing the forward pass, and making predictions.
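The parameter-reset idea can be sketched as follows. This is a hypothetical, torch-free illustration of why resetting matters between hyperparameter-search runs; the real AbstractModel resets torch.nn.Module parameters and its method names may differ.

```python
import random

# Hypothetical sketch: between hyperparameter-search runs the weights
# are re-initialized so each run starts from the same fresh state.
class TinyModel:
    def __init__(self, seed=0):
        self.seed = seed
        self.reset_parameters()

    def reset_parameters(self):
        # Re-seeding makes the re-initialization reproducible.
        rng = random.Random(self.seed)
        self.weights = [rng.random() for _ in range(4)]

m = TinyModel(seed=42)
fresh = list(m.weights)
m.weights = [0.0] * 4     # pretend a training run changed the weights
m.reset_parameters()      # restore the fresh initialization
assert m.weights == fresh
```

Without such a reset, a later sweep run would silently continue from the previous run's weights and the comparison between hyperparameter settings would be invalid.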


AbstractModelTrainer Class

The AbstractModelTrainer class is an abstract base class designed to facilitate the training, validation, and testing of deep learning models. It provides a structured workflow for model training, including dataset handling, logging, checkpointing, and early stopping mechanisms. It allows for automated hyperparameter tuning by leveraging deepEM.ModelTuner.ModelTuner.

Class Overview

AbstractModelTrainer manages the entire training pipeline, ensuring modularity and flexibility. It is intended to be subclassed, requiring concrete implementations for setting up the model, datasets, optimizer, and scheduler.

This class supports GPU acceleration, integrates with the deepEM.Logger.Logger for tracking training progress, and provides dataset loaders for training, validation, and testing. Subclasses must define the specific architecture and training behavior of the model.
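The early-stopping mechanism mentioned above can be sketched in a few lines. This is a minimal illustration of the concept, not the trainer's actual implementation; the patience value and function name are assumptions.

```python
# Hedged sketch of the early-stopping idea AbstractModelTrainer implements;
# the real trainer's attributes and method names may differ.
def train_with_early_stopping(val_losses, patience=2):
    """Return the epoch index at which training would stop."""
    best = float("inf")
    bad_epochs = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            bad_epochs = 0       # improvement: reset the counter
        else:
            bad_epochs += 1      # no improvement this epoch
            if bad_epochs >= patience:
                return epoch     # stop: patience exhausted
    return len(val_losses) - 1   # ran all epochs without triggering

# Validation loss stops improving after epoch 2, so with patience=2
# training stops at epoch 4.
print(train_with_early_stopping([1.0, 0.8, 0.7, 0.75, 0.9], patience=2))
```

Checkpointing pairs naturally with this: the best_model.pth saved by the Logger corresponds to the epoch where `best` was last updated.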

Constructor

__init__(data_path, logger, resume_from_checkpoint=None)

Initializes the trainer class for training, validating, and testing models.

Args:


AbstractInference Class

AbstractInference is an abstract base class for performing model inference. It provides methods for model loading, inference execution, and result storage. It is used within the 2_Inference.ipynb notebook.
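Since the constructor takes a batch_size, inference over many files is presumably chunked into batches. The loop below is an illustrative sketch of that idea only; the helper name and file names are hypothetical, not the deepEM API.

```python
# Sketch of batched inference over files, the kind of loop
# AbstractInference wraps; names here are illustrative only.
def batched(items, batch_size):
    """Yield successive batch_size-sized chunks of items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

files = ["img_0.tif", "img_1.tif", "img_2.tif", "img_3.tif", "img_4.tif"]
batches = list(batched(files, batch_size=2))
print(batches)  # three batches: 2 + 2 + 1 files
```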

Constructor

__init__(model_path: str, data_path: str, batch_size: int)

Initializes the inference pipeline with model and data paths.

Args:



Logger Class

Logger is a class that provides logging functionality, including checkpoint saving, hyperparameter tracking, and resource monitoring. It prints to the console and also saves log files to disk.

Class Overview

Each time a notebook is run, the Logger creates a log directory at logs/{datafolder}-{currentdatetime}. Within this directory, it creates one subfolder per run: hyperparameter sweep runs are named Sweep_{idx}, the training run is named TrainingRun, and evaluations are saved in the Evaluate subfolder. Each training run subfolder contains the following directories:

  1. checkpoints to store the latest_model.pth as well as the best_model.pth.
  2. plots to store the training and validation curves.
  3. samples to store qualitative visualizations during validation.
Additionally, each training run subfolder stores a hyperparameters.json with the hyperparameters used, as well as a log.txt that saves the logger's output to a file.
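The directory layout described above can be reproduced with a few lines of pathlib code. This is a hypothetical sketch of the structure only; the real Logger may construct paths differently, and the data-folder name here is made up.

```python
from datetime import datetime
from pathlib import Path
import tempfile

# Hypothetical sketch of the log layout described above; the real
# Logger may build these paths differently.
base = Path(tempfile.mkdtemp())
run_dir = base / "logs" / f"mydata-{datetime.now():%Y%m%d-%H%M%S}" / "TrainingRun"
for sub in ("checkpoints", "plots", "samples"):
    (run_dir / sub).mkdir(parents=True, exist_ok=True)
(run_dir / "hyperparameters.json").write_text("{}")
(run_dir / "log.txt").write_text("")

print(sorted(p.name for p in run_dir.iterdir()))
```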

Constructor

__init__(data_path: str) -> None

Initializes the Logger and creates a timestamped log directory.


ModelTuner Class

The ModelTuner class provides a framework for performing hyperparameter tuning on machine learning models using grid search. The class can be extended to implement other search strategies (e.g., random search or Bayesian optimization).

Class Overview

The ModelTuner class automates hyperparameter tuning on the ModelTrainer class. It provides a simple interface between DL specialists and EM specialists (the users). While DL experts define their model parameters within configs/parameters.json, ipywidgets allow code-independent input by the EM specialists.

The class supports loading and updating configuration files, creating interactive widgets for parameter tuning, performing grid search, and logging the best hyperparameters based on validation performance.

It helps streamline the hyperparameter tuning process, optimizing model performance efficiently.
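The grid search that ModelTuner performs amounts to evaluating every combination of candidate values and keeping the best one. The following is a minimal, self-contained sketch of that procedure; the parameter names and the toy scoring function are illustrative assumptions, not the deepEM API.

```python
from itertools import product

# Minimal grid-search sketch of what ModelTuner automates; in the real
# library the evaluate step would train and validate a model.
def grid_search(param_grid, evaluate):
    """Try every combination in param_grid and return the best one."""
    best_score, best_params = float("inf"), None
    keys = list(param_grid)
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = evaluate(params)  # e.g. validation loss of a short run
        if score < best_score:
            best_score, best_params = score, params
    return best_params, best_score

# Toy example: the "loss" is lowest for the smallest lr and batch size.
grid = {"lr": [0.1, 0.01], "batch_size": [16, 32]}
best, score = grid_search(grid, lambda p: p["lr"] + p["batch_size"] / 1000)
print(best)
```

Grid search is exhaustive, which is why the class overview notes it can be extended with cheaper strategies such as random search.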

Constructor

__init__(model_trainer, data_path, logger)

Initializes the ModelTuner class with the given model trainer, data path, and logger. It also loads the configuration and sets hyperparameter tuning options.

Args:


Utility Functions

Utility Functions provide helper methods for file handling, logging, and UI components.

They can be imported from deepEM.Utils.

Methods