Support for Keras 3 #317

Merged: 24 commits, Apr 10, 2024
4 changes: 2 additions & 2 deletions README.md
@@ -8,7 +8,7 @@ Scikit-Learn compatible wrappers for Keras Models.

## Why SciKeras

SciKeras is derived from and API compatible with `tf.keras.wrappers.scikit_learn`. The original TensorFlow (TF) wrappers are not actively maintained,
SciKeras is derived from and API compatible with `keras.wrappers.scikit_learn`. The original TensorFlow (TF) wrappers are not actively maintained,
and [will be removed](https://github.com/tensorflow/tensorflow/pull/36137#issuecomment-726271760) in a future release.

An overview of the advantages and differences as compared to the TF wrappers can be found in our
@@ -38,7 +38,7 @@ runtime or SciKeras will throw an error when you try to import it.

The current version of SciKeras depends on `scikit-learn>=1.0.0` and `TensorFlow>=2.7.0`.

### Migrating from `tf.keras.wrappers.scikit_learn`
### Migrating from `keras.wrappers.scikit_learn`

Please see the [migration](https://www.adriangb.com/scikeras/stable/migration.html) section of our documentation.

26 changes: 13 additions & 13 deletions docs/source/advanced.rst
@@ -112,7 +112,7 @@ offer an easy way to compile and tune compilation parameters. Examples:

.. code:: python

from tensorflow.keras.optimizers import Adam
from keras.optimizers import Adam

def model_build_fn():
model = Model(...)
@@ -280,7 +280,7 @@ Optimizer
.. code:: python

from scikeras.wrappers import KerasClassifier
from tensorflow import keras
import keras

clf = KerasClassifier(
model=model_build_fn,
@@ -305,7 +305,7 @@ Losses

.. code:: python

from tensorflow.keras.losses import BinaryCrossentropy, CategoricalCrossentropy
from keras.losses import BinaryCrossentropy, CategoricalCrossentropy

clf = KerasClassifier(
...,
@@ -322,7 +322,7 @@ Additionally, SciKeras supports routed parameters to each individual loss, or to

.. code:: python

from tensorflow.keras.losses import BinaryCrossentropy, CategoricalCrossentropy
from keras.losses import BinaryCrossentropy, CategoricalCrossentropy

clf = KerasClassifier(
...,
@@ -348,7 +348,7 @@ Here are several supported use cases:

.. code:: python

from tensorflow.keras.metrics import BinaryAccuracy, AUC
from keras.metrics import BinaryAccuracy, AUC

clf = KerasClassifier(
...,
@@ -388,7 +388,7 @@ SciKeras can route parameters to callbacks.

clf = KerasClassifier(
...,
callbacks=tf.keras.callbacks.EarlyStopping
        callbacks=keras.callbacks.EarlyStopping,
callbacks__monitor="loss",
)
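As a minimal sketch of what this routing does (using a hypothetical stand-in class rather than the real `keras.callbacks.EarlyStopping`, so the snippet has no Keras dependency), the `callbacks__` prefix is stripped and the remainder is passed as a constructor keyword:

```python
class EarlyStopping:
    """Stand-in for keras.callbacks.EarlyStopping; the real class accepts
    `monitor` the same way, but this avoids a Keras dependency here."""
    def __init__(self, monitor="val_loss"):
        self.monitor = monitor


def build_callback(callback_cls, params, prefix="callbacks__"):
    """Sketch of the routing convention: strip `prefix` from each routed
    parameter name and pass the remainder to the callback's constructor."""
    kwargs = {
        key[len(prefix):]: value
        for key, value in params.items()
        if key.startswith(prefix)
    }
    return callback_cls(**kwargs)


cb = build_callback(EarlyStopping, {"callbacks__monitor": "loss"})
print(cb.monitor)  # -> loss
```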

@@ -399,21 +399,21 @@ Just like metrics and losses, callbacks support several syntaxes to compile them
# for multiple callbacks using dict syntax
clf = KerasClassifier(
...,
callbacks={"bl": tf.keras.callbacks.BaseLogger, "es": tf.keras.callbacks.EarlyStopping}
        callbacks={"bl": keras.callbacks.BaseLogger, "es": keras.callbacks.EarlyStopping},
callbacks__es__monitor="loss",
)
    # or using list syntax
clf = KerasClassifier(
...,
callbacks=[tf.keras.callbacks.BaseLogger, tf.keras.callbacks.EarlyStopping]
        callbacks=[keras.callbacks.BaseLogger, keras.callbacks.EarlyStopping],
callbacks__1__monitor="loss", # EarlyStopping(monitor="loss")
)

Keras callbacks are event-based and are triggered depending on the methods they implement.
For example:

.. code:: python

from tensorflow import keras
import keras

class MyCallback(keras.callbacks.Callback):

@@ -433,9 +433,9 @@ simply use the ``fit__`` or ``predict__`` routing prefixes on your callback:

clf = KerasClassifier(
...,
callbacks=tf.keras.callbacks.Callback, # called from both fit and predict
fit__callbacks=tf.keras.callbacks.Callback, # called only from fit
predict__callbacks=tf.keras.callbacks.Callback, # called only from predict
callbacks=keras.callbacks.Callback, # called from both fit and predict
fit__callbacks=keras.callbacks.Callback, # called only from fit
predict__callbacks=keras.callbacks.Callback, # called only from predict
)
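To make the fit/predict split concrete, here is a pure-Python sketch (illustrative only, not SciKeras's actual implementation) of how these three parameter names could be resolved for each phase:

```python
def callbacks_for(phase, params):
    """Sketch: plain `callbacks` applies to both fit and predict, while
    `fit__callbacks` / `predict__callbacks` apply only to their phase."""
    resolved = []
    if "callbacks" in params:
        resolved.append(params["callbacks"])
    phase_key = f"{phase}__callbacks"
    if phase_key in params:
        resolved.append(params[phase_key])
    return resolved


params = {"callbacks": "shared", "fit__callbacks": "fit-only"}
print(callbacks_for("fit", params))      # -> ['shared', 'fit-only']
print(callbacks_for("predict", params))  # -> ['shared']
```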

Any routed constructor parameters must also use the corresponding prefix to get routed correctly.
@@ -449,7 +449,7 @@ which tells SciKeras to pass that parameter as a positional argument instead of

.. code:: python

from tensorflow import keras
import keras

class Schedule:
"""Exponential decay lr scheduler.
2 changes: 1 addition & 1 deletion docs/source/index.rst
@@ -3,7 +3,7 @@ Welcome to SciKeras's documentation!

The goal of SciKeras is to make it possible to use Keras/TensorFlow with sklearn.
This is achieved by providing a wrapper around Keras that has a Scikit-Learn interface.
SciKeras is the successor to ``tf.keras.wrappers.scikit_learn``, and offers many
SciKeras is the successor to ``keras.wrappers.scikit_learn``, and offers many
improvements over the TensorFlow version of the wrappers. See :ref:`Migration<Migration>` for more details.

SciKeras tries to make things easy for you while staying out of your way.
6 changes: 3 additions & 3 deletions docs/source/migration.rst
@@ -1,7 +1,7 @@
.. _Migration:

=================================================
Migrating from ``tf.keras.wrappers.scikit_learn``
Migrating from ``keras.wrappers.scikit_learn``
=================================================

.. contents::
@@ -10,7 +10,7 @@ Migrating from ``tf.keras.wrappers.scikit_learn``

Why switch to SciKeras
----------------------
SciKeras has several advantages over ``tf.keras.wrappers.scikit_learn``:
SciKeras has several advantages over ``keras.wrappers.scikit_learn``:

* Full compatibility with the Scikit-Learn API, including grid searches, ensembles, transformers, etc.
* Support for Functional and Subclassed Keras Models.
@@ -29,7 +29,7 @@ SciKeras is largely backwards compatible with the existing wrappers. For most ca

.. code:: diff

- from tensorflow.keras.wrappers.scikit_learn import KerasClassifier, KerasRegressor
- from keras.wrappers.scikit_learn import KerasClassifier, KerasRegressor
+ from scikeras.wrappers import KerasClassifier, KerasRegressor


4 changes: 2 additions & 2 deletions docs/source/notebooks/AutoEncoders.md
@@ -53,15 +53,15 @@ warnings.filterwarnings("ignore", message="Setting the random state for TF")
```python
import numpy as np
from scikeras.wrappers import KerasClassifier, KerasRegressor
from tensorflow import keras
import keras
```

## 2. Data

We load the MNIST dataset from Keras. The dataset consists of images of handwritten digits.

```python
from tensorflow.keras.datasets import mnist
from keras.datasets import mnist
import numpy as np


6 changes: 3 additions & 3 deletions docs/source/notebooks/Basic_Usage.md
@@ -66,7 +66,7 @@ warnings.filterwarnings("ignore", message="Setting the random state for TF")
```python
import numpy as np
from scikeras.wrappers import KerasClassifier, KerasRegressor
from tensorflow import keras
import keras
```

## 2. Training a classifier and making predictions
@@ -100,7 +100,7 @@ do for binary classification. The second option is usually reserved for when
you have >2 classes.

```python
from tensorflow import keras
import keras


def get_clf(meta, hidden_layer_sizes, dropout):
@@ -355,7 +355,7 @@ This is exactly the same logic that allows access to estimator parameters in `sk

This feature is useful in several ways. For one, it allows you to set those parameters in the model definition. Furthermore, it allows you to set parameters in an `sklearn GridSearchCV` as shown below.

To differentiate paramters like `callbacks` which are accepted by both `tf.keras.Model.fit` and `tf.keras.Model.predict` you can add a `fit__` or `predict__` routing suffix respectively. Similar, the `model__` prefix may be used to specify that a paramter is destined only for `get_clf`/`get_reg` (or whatever callable you pass as your `model` argument).
To differentiate parameters like `callbacks` which are accepted by both `keras.Model.fit` and `keras.Model.predict`, you can add a `fit__` or `predict__` routing prefix respectively. Similarly, the `model__` prefix may be used to specify that a parameter is destined only for `get_clf`/`get_reg` (or whatever callable you pass as your `model` argument).
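The bucketing this describes can be sketched in a few lines (the helper `split_routed` is hypothetical, for illustration only, not a SciKeras API):

```python
def split_routed(params):
    """Sketch: bucket parameters by their `model__` / `fit__` / `predict__`
    routing prefix; unprefixed parameters are shared."""
    buckets = {"model": {}, "fit": {}, "predict": {}, "shared": {}}
    for key, value in params.items():
        dest, sep, rest = key.partition("__")
        if sep and dest in buckets:
            buckets[dest][rest] = value
        else:
            buckets["shared"][key] = value
    return buckets


routed = split_routed({
    "model__hidden_layer_sizes": (100,),
    "fit__batch_size": 32,
    "predict__batch_size": 256,
    "loss": "binary_crossentropy",
})
print(routed["fit"])  # -> {'batch_size': 32}
```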

For more information on parameter routing with special prefixes, see the [Advanced Usage Docs](https://www.adriangb.com/scikeras/stable/advanced.html#routed-parameters).

2 changes: 1 addition & 1 deletion docs/source/notebooks/Benchmarks.md
@@ -50,7 +50,7 @@ warnings.filterwarnings("ignore", message="Setting the random state for TF")
```python
import numpy as np
from scikeras.wrappers import KerasClassifier, KerasRegressor
from tensorflow import keras
import keras
```

## 2. Dataset
2 changes: 1 addition & 1 deletion docs/source/notebooks/DataTransformers.md
@@ -65,7 +65,7 @@ warnings.filterwarnings("ignore", message="Setting the random state for TF")
```python
import numpy as np
from scikeras.wrappers import KerasClassifier, KerasRegressor
from tensorflow import keras
import keras
```

## 2. Data transformer interface
2 changes: 1 addition & 1 deletion docs/source/notebooks/MLPClassifier_MLPRegressor.md
@@ -58,7 +58,7 @@ warnings.filterwarnings("ignore", message="Setting the random state for TF")
```python
import numpy as np
from scikeras.wrappers import KerasClassifier, KerasRegressor
from tensorflow import keras
import keras
```

## 2. Defining the Keras Model
2 changes: 1 addition & 1 deletion docs/source/notebooks/Meta_Estimators.md
@@ -50,7 +50,7 @@ warnings.filterwarnings("ignore", message="Setting the random state for TF")
```python
import numpy as np
from scikeras.wrappers import KerasClassifier, KerasRegressor
from tensorflow import keras
import keras
```

## 2. Defining the Keras Model
10 changes: 5 additions & 5 deletions docs/source/notebooks/sparse.md
@@ -55,7 +55,7 @@ import numpy as np
from scikeras.wrappers import KerasRegressor
from sklearn.preprocessing import OneHotEncoder
from sklearn.pipeline import Pipeline
from tensorflow import keras
import keras
```

## Data
@@ -89,19 +89,19 @@ def get_clf(meta) -> keras.Model:
## Pipelines

Here is where it gets interesting.
We make two Scikit-Learn pipelines that use `OneHotEncoder`: one that uses `sparse=False` to force a dense matrix as the output and another that uses `sparse=True` (the default).
We make two Scikit-Learn pipelines that use `OneHotEncoder`: one that uses `sparse_output=False` to force a dense matrix as the output and another that uses `sparse_output=True` (the default).

```python
dense_pipeline = Pipeline(
[
("encoder", OneHotEncoder(sparse=False)),
("encoder", OneHotEncoder(sparse_output=False)),
("model", KerasRegressor(get_clf, loss="mse", epochs=5, verbose=False))
]
)

sparse_pipeline = Pipeline(
[
("encoder", OneHotEncoder(sparse=True)),
("encoder", OneHotEncoder(sparse_output=True)),
("model", KerasRegressor(get_clf, loss="mse", epochs=5, verbose=False))
]
)
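Independent of SciKeras, a back-of-the-envelope comparison (NumPy and SciPy only, with made-up sizes) shows why the sparse representation saves memory for one-hot data:

```python
import numpy as np
from scipy import sparse

# One-hot encode 100_000 samples of a 40-category feature, roughly what the
# OneHotEncoder in the pipelines above would produce.
rng = np.random.default_rng(0)
categories = rng.integers(0, 40, size=100_000)

dense = np.zeros((categories.size, 40), dtype=np.float64)
dense[np.arange(categories.size), categories] = 1.0
sp = sparse.csr_matrix(dense)

dense_bytes = dense.nbytes
sparse_bytes = sp.data.nbytes + sp.indices.nbytes + sp.indptr.nbytes
print(dense_bytes / sparse_bytes)  # dense is roughly an order of magnitude larger
```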
@@ -153,7 +153,7 @@ You might be able to save even more memory by changing the output dtype of `OneH
```python
sparse_pipeline_uint8 = Pipeline(
[
("encoder", OneHotEncoder(sparse=True, dtype=np.uint8)),
("encoder", OneHotEncoder(sparse_output=True, dtype=np.uint8)),
("model", KerasRegressor(get_clf, loss="mse", epochs=5, verbose=False))
]
)
2 changes: 1 addition & 1 deletion docs/source/quickstart.rst
@@ -13,7 +13,7 @@ it on a toy classification dataset using SciKeras

import numpy as np
from sklearn.datasets import make_classification
from tensorflow import keras
import keras

from scikeras.wrappers import KerasClassifier

20 changes: 7 additions & 13 deletions pyproject.toml
@@ -31,24 +31,18 @@ version = "0.12.0"

[tool.poetry.dependencies]
importlib-metadata = {version = ">=3", python = "<3.8"}
python = ">=3.8.0,<3.12.0"
scikit-learn = ">=1.0.0"
packaging = ">=0.21"
tensorflow = {version = ">=2.12.0,<2.13.0", optional = true}
tensorflow-cpu = {version = ">=2.12.0,<2.13.0", optional = true}
tensorflow-metal = {markers = "sys_platform == \"darwin\" and platform_machine == \"arm64\"", version = "^1.1.0"}
python = ">=3.9.0,<4"
scikit-learn = ">=1.4.1.post1"

Review comment (on `scikit-learn = ">=1.4.1.post1"`): This is the very latest version. Is it really necessary, or is an earlier version sufficient?

adriangb (owner, author): An earlier version may suffice but given that they’ve made several breaking changes within this major version I’d like to just set it to something that works for now to save time with this update. We can always relax it later.

Reviewer: OK
keras = { git = "https://github.com/keras-team/keras.git", rev = "master" }
tensorflow = { version = ">=2.16.0", optional = true }
tensorflow-cpu = { version = ">=2.16.0", optional = true }

[tool.poetry.extras]
tensorflow = ["tensorflow"]
tensorflow-cpu = ["tensorflow-cpu"]

[tool.poetry.dependencies.tensorflow-io-gcs-filesystem]
# see https://github.com/tensorflow/tensorflow/issues/60202
version = ">=0.23.1,<0.32"
markers = "sys_platform == 'win32'"
tensorflow = ["tensorflow"]
test = ["tensorflow"]

[tool.poetry.dev-dependencies]
tensorflow = ">=2.12.0,<2.13.0"
coverage = {extras = ["toml"], version = ">=6.4.2"}
insipid-sphinx-theme = ">=0.3.2"
ipykernel = ">=6.15.1"
22 changes: 4 additions & 18 deletions scikeras/__init__.py
@@ -5,27 +5,13 @@
try:
import importlib.metadata as importlib_metadata
except ModuleNotFoundError:
import importlib_metadata # python <3.8
import importlib_metadata # type: ignore # python <3.8

__version__ = importlib_metadata.version("scikeras")
__version__ = importlib_metadata.version("scikeras") # type: ignore

import keras as _keras

MIN_TF_VERSION = "2.7.0"
TF_VERSION_ERR = f"SciKeras requires TensorFlow >= {MIN_TF_VERSION}."

from packaging import version # noqa: E402

try:
from tensorflow import __version__ as tf_version
except ImportError: # pragma: no cover
raise ImportError("TensorFlow is not installed. " + TF_VERSION_ERR) from None
else:
if version.parse(tf_version) < version.parse(MIN_TF_VERSION): # pragma: no cover
raise ImportError(TF_VERSION_ERR) from None

import tensorflow.keras as _keras # noqa: E402

from scikeras import _saving_utils # noqa: E402
from scikeras import _saving_utils

_keras.Model.__reduce__ = _saving_utils.pack_keras_model
_keras.Model.__deepcopy__ = _saving_utils.deepcopy_model
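These two assignments patch Keras models to be picklable and deep-copyable. The general technique can be sketched with a stand-in class (hypothetical `Model`, `pack_model`, and `unpack_model`, not the real `keras.Model` or SciKeras's `_saving_utils`):

```python
import copy
import pickle


class Model:
    """Stand-in for keras.Model: pretend this class needs a custom
    save/load round-trip to be serialized."""
    def __init__(self, weights):
        self.weights = weights


def unpack_model(weights):
    # Reconstruct the model from its serialized state.
    return Model(weights)


def pack_model(model):
    # __reduce__ must return a (callable, args) pair that rebuilds the object.
    return (unpack_model, (list(model.weights),))


# Patch the class at import time, analogous to what the lines above do.
Model.__reduce__ = pack_model

m = Model([1.0, 2.0])
clone = pickle.loads(pickle.dumps(m))  # uses the patched __reduce__
twin = copy.deepcopy(m)                # deepcopy also falls back to __reduce__
print(clone.weights, twin.weights)  # -> [1.0, 2.0] [1.0, 2.0]
```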