Merge pull request #92 from giotto-ai/docs_notebooks_typing
Docs notebooks typing
matteocao authored Jun 2, 2022
2 parents 9d4bc1b + dd46255 commit bb0c449
Showing 54 changed files with 2,418 additions and 1,854 deletions.
8 changes: 7 additions & 1 deletion docs/source/modules/analysis.rst
@@ -1,8 +1,14 @@
Analysis
========

Interpretability
----------------

.. automodule:: gdeep.analysis.interpretability
:members:


Decision Boundary
-----------------

.. automodule:: gdeep.analysis.decision_boundary
:members:
5 changes: 5 additions & 0 deletions docs/source/modules/utility.rst
@@ -4,3 +4,8 @@ Utility
.. automodule:: gdeep.utility
:members:

Persistence Gradient
--------------------

.. automodule:: gdeep.utility.optimisation
:members:
@@ -8,9 +8,18 @@
"# Ensemble learning\n",
"#### Author: Matteo Caorsi\n",
"\n",
"It is possible, in giotto-deep, to use the ensemble models of the library `ensemble-pxytorch` together with all the functionalities of `gdeep`.\n",
"It is possible, in giotto-deep, to use the ensemble models of the library `ensemble-pytorch` together with all the functionalities of `gdeep`.\n",
"\n",
"In this short tutorial we will explain this concept with a simple example."
"## Scope\n",
"\n",
"Ensemble techniques put together the predictions of different models and decide which is the best answer. It is a bit like having an ensemble of experts giving opinions, with the person in charge taking the final decision. In this example, we will try out the `VotingClassifier`, i.e. an ensemble method that decides on the best prediction based on the majority of the experts' votes.\n",
"\n",
"## Content\n",
"These are the main steps we will follow:\n",
" 1. Load your data\n",
" 2. Define a single expert\n",
" 3. Wrap the ensemble model\n",
" 4. Train the ensemble"
]
},
{
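The voting idea described in the Scope cell above can be illustrated with a few lines of plain PyTorch. This is only a toy sketch of hard majority voting over three hypothetical experts; torchensemble's `VotingClassifier` itself aggregates the experts' predicted class probabilities, but the intuition is the same.

```
import torch

# Toy illustration of majority voting over three hypothetical "experts".
# Each row is a sample, each column is the class label predicted by one expert.
expert_votes = torch.tensor([
    [0, 1, 1],   # two experts say class 1 -> the ensemble answers 1
    [0, 0, 1],   # two experts say class 0 -> the ensemble answers 0
    [1, 1, 1],   # unanimous -> 1
])

# torch.mode along the expert dimension picks the most frequent label per sample.
majority, _ = torch.mode(expert_votes, dim=1)
print(majority)  # tensor([1, 0, 1])
```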
@@ -23,14 +32,16 @@
"# imports\n",
"from torch import nn\n",
"from torch.utils.tensorboard import SummaryWriter\n",
"from gdeep.models import FFNet\n",
"from torch.optim.lr_scheduler import ExponentialLR\n",
"from torch.optim import SGD, Adam\n",
"from torchensemble import VotingClassifier\n",
"\n",
"from gdeep.data.datasets import DatasetBuilder, DataLoaderBuilder\n",
"from gdeep.trainer import Trainer\n",
"from gdeep.utility.optimisation import SAMOptimizer\n",
"from torch.optim.lr_scheduler import ExponentialLR\n",
"from torch.optim import SGD, Adam\n",
"from gdeep.utility import ensemble_wrapper"
"from gdeep.models import FFNet\n",
"from gdeep.utility import ensemble_wrapper\n",
"from gdeep.visualisation import Visualiser"
]
},
{
@@ -40,8 +51,8 @@
"source": [
"# Initialize the tensorboard writer\n",
"\n",
"In order to analyse the reuslts of your models, you need to start tensorboard.\n",
"On the terminal, move inside the `/example` folder. There run the following command:\n",
"In order to analyse the results of your models, you need to start tensorboard.\n",
"On the terminal, move inside the `/examples` folder. There run the following command:\n",
"\n",
"```\n",
"tensorboard --logdir=runs\n",
@@ -65,7 +76,9 @@
"id": "a956ff40-cc51-49bb-bd72-34a05f06ea99",
"metadata": {},
"source": [
"# Load your data"
"# Load your data\n",
"\n",
"In this example we use a tabular dataset for a classification task. The dataset is a point cloud representing two entangled tori, and the model needs to classify each point as belonging to one torus or the other."
]
},
{
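The data-loading cell itself is collapsed in this diff. Purely as a stand-in (the notebook presumably builds the point cloud with the `DatasetBuilder`/`DataLoaderBuilder` imported above), here is one way to generate a labelled point cloud of two linked tori with plain PyTorch and expose it as a `DataLoader` named `dl_tr`, the name used later in the torchensemble snippet:

```
import math
import torch
from torch.utils.data import TensorDataset, DataLoader

def torus(n, R=1.0, r=0.3):
    """Sample n points on a torus with major radius R and minor radius r."""
    theta = 2 * math.pi * torch.rand(n)
    phi = 2 * math.pi * torch.rand(n)
    x = (R + r * torch.cos(theta)) * torch.cos(phi)
    y = (R + r * torch.cos(theta)) * torch.sin(phi)
    z = r * torch.sin(theta)
    return torch.stack([x, y, z], dim=1)

# First torus lies in the xy-plane; the second is rotated into the xz-plane
# and shifted along x so that the two are linked.
t1 = torus(500)
t2 = torus(500)[:, [0, 2, 1]] + torch.tensor([1.0, 0.0, 0.0])

points = torch.cat([t1, t2])                                     # (1000, 3) point cloud
labels = torch.cat([torch.zeros(500), torch.ones(500)]).long()   # which torus each point belongs to

dl_tr = DataLoader(TensorDataset(points, labels), batch_size=32, shuffle=True)
```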
@@ -89,7 +102,7 @@
"source": [
"# Define a single estimator of the ensemble\n",
"\n",
"You can define it as you would normally do with any neural network"
"You can define a single estimator of an ensemble as you would normally do with any other neural network"
]
},
{
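The cell defining the estimator is collapsed above (the notebook presumably uses the `FFNet` class imported at the top). As a minimal sketch in plain `torch.nn`, with purely illustrative layer sizes, a single expert for this 3D, two-class problem could look like:

```
from torch import nn

class Expert(nn.Module):
    """A small feed-forward classifier: a 3D point in, two class scores out."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, 16),
            nn.ReLU(),
            nn.Linear(16, 2),
        )

    def forward(self, x):
        return self.net(x)
```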
@@ -139,7 +152,7 @@
"id": "ecd8167e-9b7b-43a4-9b2b-9b28b303def4",
"metadata": {},
"source": [
"What you would have done instead:\n",
"What you would have done instead, giving up many giotto-deep capabilities:\n",
"\n",
"```\n",
"model = VotingClassifier(\n",
@@ -148,7 +161,7 @@
" cuda=False\n",
")\n",
"model.set_optimizer(\"Adam\")\n",
"model.fit(train_loader=dl_tr,epochs=1`\n",
"model.fit(train_loader=dl_tr,epochs=1)\n",
"```\n"
]
},
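For completeness, the plain-torchensemble snippet above can be made runnable by combining it with the `Expert` class and the `dl_tr` loader sketched earlier; the `n_estimators` value and the learning rate are arbitrary choices for illustration:

```
from torchensemble import VotingClassifier

# Plain torchensemble baseline: several copies of the expert vote on each point.
model = VotingClassifier(
    estimator=Expert,   # torchensemble instantiates the class for each member
    n_estimators=5,
    cuda=False,
)
model.set_optimizer("Adam", lr=1e-3)
model.fit(train_loader=dl_tr, epochs=1)
```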
@@ -157,8 +170,8 @@
"id": "8f6d96a5-8de4-4d2d-8048-dcc0e8a2bed2",
"metadata": {},
"source": [
"# Train your model\n",
"You can easily train your esnsemble model as you would train any other model in giotto-deep."
"# Train your ensemble of models\n",
"You can easily train your ensemble model as you would train any other model in giotto-deep: initialise the `Trainer` class and run it."
]
},
{
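The training cell is collapsed in this diff. As a rough sketch only: the snippet below assumes that `wrapped_model` is the ensemble returned by `ensemble_wrapper` in the collapsed cell above (a hypothetical name) and that `Trainer` follows the constructor and `train()` pattern of other giotto-deep tutorials; the exact signatures may differ from the notebook.

```
from torch import nn
from torch.optim import Adam
from torch.utils.tensorboard import SummaryWriter
from gdeep.trainer import Trainer

writer = SummaryWriter()
loss_fn = nn.CrossEntropyLoss()

# Assumed signature: model, a pair of (train, validation) dataloaders,
# a loss function and a tensorboard writer; dl_tr is reused for validation
# here only to keep the sketch self-contained.
pipe = Trainer(wrapped_model, (dl_tr, dl_tr), loss_fn, writer)
pipe.train(Adam, 1)  # assumed arguments: optimizer class, number of epochs
```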
@@ -188,7 +201,7 @@
"source": [
"# Visualise the model graph\n",
"\n",
"You can integractively visualise your model by checking it on tensorboard after these few lines:"
"You can interactively visualise your ensemble of models by checking it on tensorboard after these few lines are executed:"
]
},
{
@@ -198,11 +211,11 @@
"metadata": {},
"outputs": [],
"source": [
"from gdeep.visualisation import Visualiser\n",
"\n",
"# initialise the visualiser\n",
"vs = Visualiser(pipe)\n",
"\n",
"vs.plot_data_model() # send the graph of the model to tb"
"# send the graph of the model to tensorboard\n",
"vs.plot_data_model() "
]
},
{
