
At first code to test in getting_started
holgern committed Feb 24, 2020
1 parent 5d4d181 commit e566626
Showing 2 changed files with 23 additions and 22 deletions.
5 changes: 1 addition & 4 deletions build_tools/travis/test_docs.sh
@@ -3,8 +3,5 @@
 set -e
 set -x
 if [[ "$SDIST" != "true" ]]; then
-    pushd .
-    cd doc
-    make doctest
-    popd
+    make test-doc
 fi
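The diff replaces the Sphinx `make doctest` run inside `doc/` with a single top-level `make test-doc` call. The target itself is not part of this commit, so the following is only a plausible sketch of what such a Makefile rule might look like, assuming a pytest-based doctest runner over the reST sources (the target name is the only thing the commit confirms; the recipe is an assumption):

```make
# Hypothetical sketch: the real top-level Makefile is not shown in
# this commit. Collect doctests from all reST files under doc/.
test-doc:
	pytest $(shell find doc -name '*.rst' | sort)
```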
40 changes: 22 additions & 18 deletions doc/getting_started.rst
@@ -1,8 +1,10 @@
-.. currentmodule:: skopt
 
 ===============
 Getting started
 ===============
+
+.. currentmodule:: skopt
+
 Scikit-Optimize, or ``skopt``, is a simple and efficient library to
 minimize (very) expensive and noisy black-box functions. It implements
 several methods for sequential model-based optimization. ``skopt`` aims
@@ -27,27 +29,29 @@ Finding a minimum
 Find the minimum of the noisy function ``f(x)`` over the range ``-2 < x < 2``
 with :class:`skopt`::
 
-    import numpy as np
-    from skopt import gp_minimize
-
-    def f(x):
-        return (np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2)) *
-                np.random.randn() * 0.1)
-
-    res = gp_minimize(f, [(-2.0, 2.0)])
+    >>> import numpy as np
+    >>> from skopt import gp_minimize
+    >>> np.random.seed(123)
+    >>> def f(x):
+    ...     return (np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2)) *
+    ...             np.random.randn() * 0.1)
+    >>>
+    >>> res = gp_minimize(f, [(-2.0, 2.0)], n_calls=20)
+    >>> print("x*=%.2f f(x*)=%.2f" % (res.x[0], res.fun))
+    x*=0.85 f(x*)=-0.05
 
 For more control over the optimization loop you can use the :class:`skopt.Optimizer`
 class::
 
-    from skopt import Optimizer
-
-    opt = Optimizer([(-2.0, 2.0)])
-
-    for i in range(20):
-        suggested = opt.ask()
-        y = f(suggested)
-        opt.tell(suggested, y)
-        print('iteration:', i, suggested, y)
+    >>> from skopt import Optimizer
+    >>> opt = Optimizer([(-2.0, 2.0)])
+    >>>
+    >>> for i in range(20):
+    ...     suggested = opt.ask()
+    ...     y = f(suggested)
+    ...     res = opt.tell(suggested, y)
+    >>> print("x*=%.2f f(x*)=%.2f" % (res.x[0], res.fun))
+    x*=-0.10 f(x*)=-0.06
 
 For more read our :ref:`sphx_glr_auto_examples_bayesian-optimization.py` and the other
 `examples <auto_examples/index.html>`_.
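The ask/tell pattern the new doctest relies on — note that the revised loop keeps the return value of `opt.tell(...)` and reads `res.x`/`res.fun` from it — can be illustrated without skopt installed. The `RandomOptimizer` below is a hypothetical toy stand-in, not part of skopt: it only mimics the `ask`/`tell` interface, with `tell` returning the best observation seen so far, the way the doctest's final `print` assumes.

```python
import random

class RandomOptimizer:
    """Toy stand-in for skopt.Optimizer (hypothetical, illustration only)."""

    def __init__(self, bounds):
        self.bounds = bounds
        self.xs, self.ys = [], []

    def ask(self):
        # Propose one point sampled uniformly within each dimension's bounds.
        return [random.uniform(lo, hi) for lo, hi in self.bounds]

    def tell(self, x, y):
        # Record the observation and return the best point seen so far,
        # loosely mirroring how skopt's tell() returns a result object.
        self.xs.append(x)
        self.ys.append(y)
        best = min(range(len(self.ys)), key=self.ys.__getitem__)
        return {"x": self.xs[best], "fun": self.ys[best]}

random.seed(0)
opt = RandomOptimizer([(-2.0, 2.0)])
for i in range(20):
    suggested = opt.ask()
    y = (suggested[0] - 0.5) ** 2  # toy objective with its minimum at x = 0.5
    res = opt.tell(suggested, y)
print("best x=%.2f f=%.4f" % (res["x"][0], res["fun"]))
```

Keeping `res` from the last `tell` call is what lets the loop end with a single summary line instead of printing every iteration, which is exactly the change the diff makes to the `Optimizer` example.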
