PyCaret + SKORCH: Build PyTorch Neural Networks using Minimal Code





A low-code guide to building PyTorch neural networks with PyCaret

Photo by Danist Soh on Unsplash

In almost every machine learning project, we train and evaluate multiple machine learning models. This often involves writing multiple lines of imports, many function calls, and print statements to train individual models and compare the results across them. The code becomes a mess when comparing different models with cross-validation loops or ensembling the models. Over time, it gets even messier when we move from classification models to regression models or vice versa. We end up copying snippets of code from one place to another, creating chaos! We can easily avoid this chaos by just importing PyCaret!

PyCaret is a low-code machine learning library that allows you to create, train, and test ML models via a unified API for regression and classification problems. PyCaret also covers the various steps of a machine learning project, from data preparation to model deployment, with a minimal amount of code. It can work with any model or library that follows the Scikit-Learn API, such as Scikit-Learn models, XGBoost, LightGBM, CatBoost, etc. All in all, the library is an easy-to-use productivity booster that enables fast experiments and helps you focus more on the business problem at hand.

Now, what if you want to add neural networks to your models-to-try list? You need to write snippets for training and testing loops in a framework like PyTorch, convert NumPy arrays to tensors and back to get existing code working, and write a whole new set of evaluation functions. One small TypeError and you end up changing a part of the code you have written over time, which might create more issues you never anticipated. You end up spending more time updating the code than experimenting with different models and solving the problem.

What if you could use the same PyCaret with neural networks, with very minimal changes?

Yes, you heard it right! SKORCH makes it possible! SKORCH is a Scikit-Learn wrapper for PyTorch that makes it possible to train neural networks with a sklearn-like API, which is exactly what PyCaret expects!

Many resources explain how to use PyCaret to build ML models with a single line of code, and there are also tutorials and examples for using SKORCH to build neural networks. It is recommended to go through these tutorials before jumping into the next parts of this blog. The notebook containing the code can be referred to in parallel.

In this blog, we will see

  • how to build a neural network with SKORCH
  • how to train a neural network with PyCaret
  • how to tune neural networks with PyCaret

During these steps, we will also see how to compare the performance of neural networks with other models using one-liners from PyCaret.

For this tutorial, we are going to use the “electrical_grid” dataset, a binary classification problem with 12 input features and one target variable. This dataset is among the built-in datasets that ship with PyCaret and can be loaded by importing and calling the get_data function, as shown:

from pycaret.datasets import get_data
data = get_data('electrical_grid')

The output should look something like this in a notebook,

electrical_grid data from PyCaret (Image by Author)

How to Build Neural Networks with SKORCH

PyTorchModel

SKORCH works with PyTorch models; we can create a simple model to start with, just as we would when using pure PyTorch. We will build a three-layer MLP as shown below, with 12 inputs to the first layer, since we have 12 features in the data, and two outputs, one for each class.

import torch.nn as nn

class Net(nn.Module):
    def __init__(self, num_inputs=12, num_units_d1=200, num_units_d2=100):
        super(Net, self).__init__()
        # two hidden dense layers with ReLU activations and dropout
        self.dense0 = nn.Linear(num_inputs, num_units_d1)
        self.nonlin = nn.ReLU()
        self.dropout = nn.Dropout(0.5)
        self.dense1 = nn.Linear(num_units_d1, num_units_d2)
        # output layer with two units, one per class
        self.output = nn.Linear(num_units_d2, 2)
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, X, **kwargs):
        X = self.nonlin(self.dense0(X))
        X = self.dropout(X)
        X = self.nonlin(self.dense1(X))
        X = self.softmax(self.output(X))
        return X

Skorch Classifier

Now that we have defined the network architecture, we have to instantiate a sklearn-compatible neural network with SKORCH. Since we are working on a classification task, this is done by importing NeuralNetClassifier from skorch.

from skorch import NeuralNetClassifier

net = NeuralNetClassifier(
    module=Net,
    max_epochs=30,
    lr=0.1,
    batch_size=32,
    train_split=None
)

The most important argument is module, which takes the nn.Module we defined; in this case, we pass Net to it. Similarly, we can pass other arguments: max_epochs, indicating how many epochs to train the model for; lr, the learning rate for the optimizer; and batch_size, to set the mini-batch size. We will stick to the default SGD optimizer used in Skorch; this can be changed to any custom optimizer, as explained here.
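For instance, swapping in a different optimizer only takes an extra argument. A minimal sketch (the variable name net_adam and the learning rate are illustrative, not from the original):

import torch.optim as optim

# same classifier, but with Adam instead of the default SGD
net_adam = NeuralNetClassifier(
    module=Net,
    max_epochs=30,
    lr=0.001,  # Adam typically uses a smaller learning rate than SGD
    batch_size=32,
    optimizer=optim.Adam,
    train_split=None
)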

By default, Skorch holds out an internal validation set using an 80/20 split, so 80% of the samples are used for training and 20% for validation. This can be disabled by passing train_split=None, which we are going to do, since we will use PyCaret to train the models and PyCaret already applies cross-validation during training.

How to Train a Neural Network with PyCaret

DataFrameTransformer

Now that we have initialized a SKORCH NN model, we can train it with PyCaret. One thing to remember here is that PyCaret inherently works with pandas DataFrames during its various operations, but passing a DataFrame directly to the Skorch model fails (more about it here). Therefore, in addition to the model, we need to construct a sklearn Pipeline nn_pipe with a skorch.helper.DataFrameTransformer that transforms the input passed by PyCaret into the required format. Thanks to IncubatorShokuhou for identifying this.

from sklearn.pipeline import Pipeline
from skorch.helper import DataFrameTransformer

nn_pipe = Pipeline(
    [
        # convert the DataFrame passed by PyCaret into the format skorch expects
        ("transform", DataFrameTransformer()),
        ("net", net),
    ]
)

PyCaret Setup

Now we are finally ready with our model and can train it, using PyCaret instead of the Skorch API as decided. Let’s have a quick primer on PyCaret. Before any model is trained with PyCaret, we need to initiate an experiment by calling the setup function, which performs all the preprocessing steps; these can, of course, be controlled with the arguments passed to the function.

from pycaret.classification import *

target = "stabf"

clf1 = setup(data = data,
             target = target,
             train_size = 0.8,
             fold = 5,
             session_id = 123,
             log_experiment = True,
             experiment_name = 'electrical_grid_1',
             silent = True)

We pass the data, the target column name, train_size, and the number of folds used in cross-validation as the main arguments to setup. We set log_experiment to True to track the experiments with MLflow, and set an experiment_name so that we can come back and refer to the results at a later stage. Also, we set the silent argument to True to avoid the “press enter to continue” step during the setup phase.

PyCaret Train ML Model

To train an ML model, we call the create_model function with the acronym of the model we want to train. In this case, we can do create_model("rf") to train a Random Forest classifier.

rf_model = create_model("rf")

PyCaret now instantiates the RandomForestClassifier with default parameters, trains the model on the five folds, and prints the results as shown.

Image by Author

The trained RandomForestClassifier is now stored in the variable rf_model, which we will use later.
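As a quick aside, a trained model can also be scored on the held-out test split created by setup using PyCaret’s predict_model; a minimal sketch (the variable name holdout_results is illustrative):

# evaluate the trained model on the hold-out set created by setup()
holdout_results = predict_model(rf_model)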

PyCaret Train Skorch Model

Now that we know how PyCaret works, let’s train the Skorch model that we have instantiated. The good thing about PyCaret is that create_model accepts any sklearn-API-compatible object. Therefore, we can pass the nn_pipe we have created above to the create_model function, as below.
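The call itself is again a one-liner; the variable name skorch_model matches how the trained model is referred to below:

skorch_model = create_model(nn_pipe)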

Image by Author

Again, create_model takes the Skorch model, trains it on the five folds, and prints the results. The trained Skorch model is now saved in the variable skorch_model.

Comparing models

Now we can compare both of these models by passing them as a list to the compare_models function in PyCaret. The compare_models function trains both models on the cross-validation folds and records the mean CV scores to compare the models on the desired metric. We can set the desired metric by passing a string to the sort parameter; let us use “AUC” in our example.

best_model = compare_models(include=[skorch_model, rf_model], sort="AUC")
Image by Author

How to Tune Neural Networks with PyCaret

Setting up the Hyperparameter Grid

Now that we know how to train a neural network, let’s look at how to tune its hyperparameters. In this case, we will tune the number of neurons in the hidden dense layers, the learning rate, the optimizer, and the number of epochs. For this, we need to create a custom hyperparameter grid as below.

from torch import optim

custom_grid = {
    'net__max_epochs': [20, 30],
    'net__lr': [0.01, 0.05, 0.1],
    'net__module__num_units_d1': [50, 100, 150],
    'net__module__num_units_d2': [50, 100, 150],
    'net__optimizer': [optim.Adam, optim.SGD, optim.RMSprop]
}

Notice that the component names and the parameters are separated by a double underscore “__”, because that is how sklearn-style estimators expect nested parameters to be passed so that the right component receives each value. Here, “net” refers to the pipeline step holding the NeuralNetClassifier we created, and “module” refers to the Net module we created for the PyTorch model.
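A minimal sketch of how this routing works outside of tuning (the values are illustrative):

# the Pipeline routes "net__*" to the NeuralNetClassifier step, and skorch
# routes "module__*" on to the underlying Net module
nn_pipe.set_params(net__lr=0.05, net__module__num_units_d1=100)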

If you want to know the full list of parameters of a model, you can view them by executing estimator.get_params().keys(), where the estimator is the variable skorch_model in this case. skorch_model.get_params().keys() prints the following output:

Image by Author

Notice that the parameters net__module__num_units_d1/2 are not present in the printed output, since they are not arguments of the skorch_model itself but of the underlying Net module; therefore, they need to be set with the net__module__ prefix.

Tuning

After setting up the hyperparameter grid, we call the tune_model function from PyCaret, passing the model and the custom_grid as arguments. Here as well, we can select from the various tuning algorithms/libraries supported by PyCaret, which are listed in the documentation.

tuned_skorch_model = tune_model(skorch_model, custom_grid=custom_grid)

That’s it! The model tuning will now start. Be patient; this might take some time depending on the network size, dataset size, etc. Once the tuning is finished, you will again see the cross-validation scores, as below.

Image by Author

The tune_model() function returns the tuned model, which is now available in the tuned_skorch_model variable.

Comparing models

Again, we can compare these models by passing them as a list to the compare_models function in PyCaret.

best_model = compare_models(include=[tuned_skorch_model, skorch_model, rf_model], sort="AUC")
Image by Author

Unfortunately, there is no option to pass the names of the models as an argument, so both tuned_skorch_model and skorch_model are named NeuralNetClassifier (the name is automatically identified by PyCaret). I assume the second row is the tuned model and the third one is the untuned one.

Also, this blog is focused on showing how we can use Skorch with PyCaret to build PyTorch models with minimal code; it is not focused on improving the model performance in particular.

Note:

  1. Here the compare_models function is used after create_model and tune_model, but PyCaret suggests using compare_models as a first step to compare all the available models. The purpose here, however, is to explain how to use Skorch+PyTorch, so it is done differently in this case. In general, to compare different models after training, we can write a custom wrapper function that takes a list of models and the data to be evaluated and returns the comparison scores, as sketched below.
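A minimal sketch of such a wrapper, assuming sklearn-style estimators; the function name compare_trained_models, the metric, and the X/y variables are illustrative, not part of PyCaret:

from sklearn.metrics import accuracy_score

def compare_trained_models(models, X, y):
    # score already-trained, sklearn-compatible models on held-out data
    scores = {}
    for m in models:
        scores[type(m).__name__] = accuracy_score(y, m.predict(X))
    return scores

# e.g. compare_trained_models([rf_model, skorch_model], X_holdout, y_holdout)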

MLFlow Dashboard

Since we have set log_experiment to True in the PyCaret setup, all the experiments are logged with MLflow without us doing anything. Voila! To look at all the experiments, open a terminal in the working directory and type mlflow ui; you should then see the dashboard in your browser by clicking on the URL shown in your terminal.

Image by Author

The above picture shows all the experiments we ran in the notebook.

Conclusion

We saw how we can use Skorch and PyCaret to train neural networks with minimal code, with the advantage of using the same API for training traditional ML models and neural networks. In addition, we can compare the performance of all the models with a single line of code and have all the metrics saved to MLflow with just an argument! I highly recommend going through the official PyCaret and Skorch documentation to learn more about how to use these amazing libraries and make ML experimentation easier!

References

  1. https://pycaret.org/
  2. https://www.analyticsvidhya.com/blog/2020/05/pycaret-machine-learning-model-seconds/
  3. https://github.com/skorch-dev/skorch
  4. https://towardsdatascience.com/skorch-pytorch-models-trained-with-a-scikit-learn-wrapper-62b9a154623e


