1.5.6 #455

Merged: 21 commits, Nov 8, 2016

Changes from all commits
8 changes: 3 additions & 5 deletions .gitchangelog.rc
@@ -76,7 +76,7 @@ ignore_regexps = [
##
section_regexps = [
('New', [
r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
]),
('Changes', [
r'^[cC]hg\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$',
@@ -87,7 +87,6 @@ section_regexps = [

('Other', None ## Match all lines
),

]


@@ -147,7 +146,7 @@ tag_filter_regexp = r'^v[0-9]+\.[0-9]+(\.[0-9]+)?$'
##
## This label will be used as the changelog Title of the last set of changes
## between last valid tag and HEAD if any.
unreleased_version_label = "%%__version__%% (unreleased)"
unreleased_version_label = "Unreleased"


## ``output_engine`` is a callable
@@ -178,7 +177,6 @@ unreleased_version_label = "%%__version__%% (unreleased)"
## Examples:
## - makotemplate("restructuredtext")
##

#output_engine = rest_py
#output_engine = mustache("restructuredtext")
output_engine = mustache("markdown")
@@ -189,4 +187,4 @@ output_engine = mustache("markdown")
##
## This option tells git-log whether to include merge commits in the log.
## The default is to include them.
include_merge = True
include_merge = True
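For reference, the two section patterns touched in this file can be sanity-checked quickly; the commit subjects below are invented for illustration and are not part of the PR.

```python
# Quick check of how the section_regexps above route commit subjects
# into the "New" and "Changes" changelog sections.
import re

new_re = r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$'
chg_re = r'^[cC]hg\s*:\s*((dev|use?r|pkg|test|doc)\s*:\s*)?([^\n]*)$'

assert re.match(new_re, "new: doc: added usage notes")        # -> "New" section
assert re.match(chg_re, "chg: pkg: version bump on paramz")   # -> "Changes" section
assert re.match(new_re, "fix: installation #451") is None     # falls through to "Other"
```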
4 changes: 4 additions & 0 deletions .travis.yml
@@ -28,6 +28,10 @@ before_install:
install:
- echo $PATH
- source install_retry.sh
- if [[ "$TRAVIS_OS_NAME" == "osx" ]];
then
conda install --yes pandoc;
fi;
- pip install codecov
- pip install coveralls
- pip install pypandoc
159 changes: 134 additions & 25 deletions CHANGELOG.md
@@ -1,34 +1,97 @@
# Changelog

## v1.5.5 (2016-10-03)
## v1.5.6 (2016-11-07)

### New

* Added polynomial basis kernel tests and import. [mzwiessele]

* Gitchangelogrc. [mzwiessele]

### Changes

* Added polynomial basis func kernel. [mzwiessele]

### Fix

* Installation #451. [Max Zwiessele]

* Pandoc install under travis osx. [mzwiessele]

* Pandoc install under travis osx. [mzwiessele]

* Pypi changing to pypi.org. [mzwiessele]

### Other

* Bump version: 1.5.4 → 1.5.5. [Max Zwiessele]
* Bump version: 1.5.5 → 1.5.6. [mzwiessele]

* Merge pull request #448 from thangbui/devel. [Max Zwiessele]

## v1.5.4 (2016-10-03)
Added pep.py -- Sparse Gaussian processes using Power Expectation Propagation

### New
* Renamed pep test scripts. [Thang Bui]

* Added deployment pull request instructions for developers. [mzwiessele]
* Fixed seed in pep test script #448. [Thang Bui]

* Using gitchangelog to keep track of changes and log new features. [mzwiessele]
* Added tests. [Thang Bui]

* Added pep.py -- Sparse Gaussian processes using Power Expectation Propagation. [Thang Bui]

This allows interpolation between FITC (EP or alpha = 1), and Titsias's variational (VarDTC, VFE when alpha = 0).

* Merge pull request #452 from SheffieldML/setupreq. [Max Zwiessele]

fix: Installation #451

* Merge pull request #447 from SheffieldML/polinomial. [Max Zwiessele]

Polynomial

* Merge branch 'devel' into polinomial. [mzwiessele]

* Merge pull request #449 from SheffieldML/deploy. [Max Zwiessele]

Deploy

* Update setup.py. [Mike Croucher]

* Merge pull request #446 from SheffieldML/devel. [Max Zwiessele]

newest patch fixing some issues

* Merge branch 'devel' of github.com:SheffieldML/GPy into devel. [mzwiessele]

* Merge branch 'deploy' into devel. [Max Zwiessele]

* Merge pull request #442 from SheffieldML/devel. [Max Zwiessele]

New Major for GPy

* Merge pull request #426 from SheffieldML/devel. [Max Zwiessele]

some fixes from issues and beckdaniels warped gp improvements


## v1.5.5 (2016-10-03)

### Other

* Bump version: 1.5.4 → 1.5.5. [Max Zwiessele]


## v1.5.4 (2016-10-03)

### Changes

* Version update on paramz. [Max Zwiessele]

* Fixed naming in variational priors. [Max Zwiessele]

* Changelog update. [mzwiessele]

### Fix

* Bug in dataset (in fn download_url) which wrongly interprets the Content-Length meta data, and just takes first character. [Michael T Smith]

* What's new update fix #425 in changelog. [mzwiessele]

### Other

* Bump version: 1.5.3 → 1.5.4. [Max Zwiessele]
@@ -39,25 +102,14 @@

* Merge branch 'kurtCutajar-devel' into devel. [mzwiessele]

* Bump version: 1.5.2 → 1.5.3. [mzwiessele]

* Merge branch 'devel' into kurtCutajar-devel. [mzwiessele]

* Bump version: 1.5.1 → 1.5.2. [mzwiessele]

* Minor readme changes. [mzwiessele]

* Bump version: 1.5.0 → 1.5.1. [mzwiessele]

* Bump version: 1.4.3 → 1.5.0. [mzwiessele]

* Bump version: 1.4.2 → 1.4.3. [mzwiessele]
## v1.5.3 (2016-09-06)

* Bump version: 1.4.1 → 1.4.2. [mzwiessele]
### Other

* Merge branch 'devel' of github.com:SheffieldML/GPy into devel. [mzwiessele]
* Bump version: 1.5.2 → 1.5.3. [mzwiessele]

* [kern] fix #440. [mzwiessele]
* Merge branch 'devel' into kurtCutajar-devel. [mzwiessele]

* [doc] cleanup. [mzwiessele]

@@ -92,6 +144,63 @@
* Added core code for GpSSM and GpGrid. [kcutajar]


## v1.5.2 (2016-09-06)

### New

* Added deployment pull request instructions for developers. [mzwiessele]

### Other

* Bump version: 1.5.1 → 1.5.2. [mzwiessele]

* Minor readme changes. [mzwiessele]


## v1.5.1 (2016-09-06)

### Fix

* What's new update fix #425 in changelog. [mzwiessele]

### Other

* Bump version: 1.5.0 → 1.5.1. [mzwiessele]


## v1.5.0 (2016-09-06)

### New

* Using gitchangelog to keep track of changes and log new features. [mzwiessele]

### Other

* Bump version: 1.4.3 → 1.5.0. [mzwiessele]


## v1.4.3 (2016-09-06)

### Changes

* Changelog update. [mzwiessele]

### Other

* Bump version: 1.4.2 → 1.4.3. [mzwiessele]


## v1.4.2 (2016-09-06)

### Other

* Bump version: 1.4.1 → 1.4.2. [mzwiessele]

* Merge branch 'devel' of github.com:SheffieldML/GPy into devel. [mzwiessele]

* [kern] fix #440. [mzwiessele]


## v1.4.1 (2016-09-06)

### Other
2 changes: 1 addition & 1 deletion GPy/__version__.py
@@ -1 +1 @@
__version__ = "1.5.5"
__version__ = "1.5.6"
1 change: 1 addition & 0 deletions GPy/inference/latent_function_inference/__init__.py
@@ -67,6 +67,7 @@ def __setstate__(self, state):
from .expectation_propagation import EP, EPDTC
from .dtc import DTC
from .fitc import FITC
from .pep import PEP
from .var_dtc_parallel import VarDTC_minibatch
from .var_gauss import VarGauss
from .gaussian_grid_inference import GaussianGridInference
93 changes: 93 additions & 0 deletions GPy/inference/latent_function_inference/pep.py
@@ -0,0 +1,93 @@
from .posterior import Posterior
from ...util.linalg import jitchol, tdot, dtrtrs, dtrtri, pdinv
from ...util import diag
import numpy as np
from . import LatentFunctionInference
log_2_pi = np.log(2*np.pi)

class PEP(LatentFunctionInference):
    '''
    Sparse Gaussian processes using Power-Expectation Propagation
    for regression: alpha -> 0 gives VarDTC and alpha = 1 gives FITC.

    Reference: A Unifying Framework for Sparse Gaussian Process Approximation using
    Power Expectation Propagation, https://arxiv.org/abs/1605.07066
    '''
    const_jitter = 1e-6

    def __init__(self, alpha):
        super(PEP, self).__init__()
        self.alpha = alpha

    def inference(self, kern, X, Z, likelihood, Y, mean_function=None, Y_metadata=None):
        assert mean_function is None, "inference with a mean function not implemented"

        num_inducing, _ = Z.shape
        num_data, output_dim = Y.shape

        # make sure the noise is not heteroscedastic
        sigma_n = likelihood.gaussian_variance(Y_metadata)
        if sigma_n.size > 1:
            raise NotImplementedError("no hetero noise with this implementation of PEP")

        Kmm = kern.K(Z)
        Knn = kern.Kdiag(X)
        Knm = kern.K(X, Z)
        U = Knm

        # factor Kmm
        diag.add(Kmm, self.const_jitter)
        Kmmi, L, Li, _ = pdinv(Kmm)

        # compute beta_star, the effective noise precision
        LiUT = np.dot(Li, U.T)
        sigma_star = sigma_n + self.alpha * (Knn - np.sum(np.square(LiUT), 0))
        beta_star = 1./sigma_star

        # Compute and factor A
        A = tdot(LiUT*np.sqrt(beta_star)) + np.eye(num_inducing)
        LA = jitchol(A)

        # back substitute to get b, P, v
        URiy = np.dot(U.T*beta_star, Y)
        tmp, _ = dtrtrs(L, URiy, lower=1)
        b, _ = dtrtrs(LA, tmp, lower=1)
        tmp, _ = dtrtrs(LA, b, lower=1, trans=1)
        v, _ = dtrtrs(L, tmp, lower=1, trans=1)
        tmp, _ = dtrtrs(LA, Li, lower=1, trans=0)
        P = tdot(tmp.T)

        alpha_const_term = (1.0-self.alpha) / self.alpha

        # compute log marginal
        log_marginal = -0.5*num_data*output_dim*np.log(2*np.pi) + \
            -np.sum(np.log(np.diag(LA)))*output_dim + \
            0.5*output_dim*(1+alpha_const_term)*np.sum(np.log(beta_star)) + \
            -0.5*np.sum(np.square(Y.T*np.sqrt(beta_star))) + \
            0.5*np.sum(np.square(b)) + 0.5*alpha_const_term*num_data*np.log(sigma_n)

        # compute dL_dR
        Uv = np.dot(U, v)
        dL_dR = 0.5*(np.sum(U*np.dot(U, P), 1) - (1.0+alpha_const_term)/beta_star + np.sum(np.square(Y), 1) - 2.*np.sum(Uv*Y, 1) \
            + np.sum(np.square(Uv), 1))*beta_star**2

        # Compute dL_dKmm
        vvT_P = tdot(v.reshape(-1, 1)) + P
        dL_dK = 0.5*(Kmmi - vvT_P)
        KiU = np.dot(Kmmi, U.T)
        dL_dK += self.alpha * np.dot(KiU*dL_dR, KiU.T)

        # Compute dL_dU
        vY = np.dot(v.reshape(-1, 1), Y.T)
        dL_dU = vY - np.dot(vvT_P, U.T)
        dL_dU *= beta_star
        dL_dU -= self.alpha * 2.*KiU*dL_dR

        dL_dthetaL = likelihood.exact_inference_gradients(dL_dR)
        dL_dthetaL += 0.5*alpha_const_term*num_data / sigma_n
        grad_dict = {'dL_dKmm': dL_dK, 'dL_dKdiag': dL_dR * self.alpha, 'dL_dKnm': dL_dU.T, 'dL_dthetaL': dL_dthetaL}

        # construct a posterior object
        post = Posterior(woodbury_inv=Kmmi-P, woodbury_vector=v, K=Kmm, mean=None, cov=None, K_chol=L)

        return post, log_marginal, grad_dict
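For orientation, a minimal sketch of driving the new PEP inference object directly through its `inference` method; the data, inducing points and alpha value are arbitrary choices for illustration and are not part of the PR.

```python
import numpy as np
import GPy
from GPy.inference.latent_function_inference import PEP

np.random.seed(0)
X = np.random.rand(100, 1)
Y = np.sin(6*X) + 0.1*np.random.randn(100, 1)
Z = np.random.rand(10, 1)                      # inducing inputs

kern = GPy.kern.RBF(1)
lik = GPy.likelihoods.Gaussian(variance=0.01)

# alpha interpolates between VarDTC (alpha -> 0) and FITC (alpha = 1)
pep = PEP(alpha=0.5)
post, log_marginal, grads = pep.inference(kern, X, Z, lik, Y)
print(log_marginal, sorted(grads.keys()))
```

Wiring the object into a sparse GP model as its inference method is assumed to follow the same pattern as the other classes in this package and is not shown here.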
2 changes: 1 addition & 1 deletion GPy/kern/__init__.py
@@ -32,7 +32,7 @@
from .src.splitKern import SplitKern,DEtime
from .src.splitKern import DEtime as DiffGenomeKern
from .src.spline import Spline
from .src.basis_funcs import LogisticBasisFuncKernel, LinearSlopeBasisFuncKernel, BasisFuncKernel, ChangePointBasisFuncKernel, DomainKernel
from .src.basis_funcs import LogisticBasisFuncKernel, LinearSlopeBasisFuncKernel, BasisFuncKernel, ChangePointBasisFuncKernel, DomainKernel, PolynomialBasisFuncKernel
from .src.grid_kerns import GridRBF

from .src.sde_matern import sde_Matern32
20 changes: 20 additions & 0 deletions GPy/kern/src/basis_funcs.py
@@ -102,6 +102,26 @@ def _K(self, X, X2):
        phi2 = phi2[:, None]
        return phi1.dot(phi2.T)

class PolynomialBasisFuncKernel(BasisFuncKernel):
    def __init__(self, input_dim, degree, variance=1., active_dims=None, ARD=True, name='polynomial_basis'):
        """
        A polynomial basis function kernel. The feature map phi(X) holds the
        monomials X**0, X**1, ..., X**degree, so this kernel is the inner
        product of polynomial feature expansions of its two inputs.
        """
        self.degree = degree
        super(PolynomialBasisFuncKernel, self).__init__(input_dim, variance, active_dims, ARD, name)

    @Cache_this(limit=3, ignore_args=())
    def _phi(self, X):
        phi = np.empty((X.shape[0], self.degree+1))
        for i in range(self.degree+1):
            phi[:, [i]] = X**i
        return phi

class LinearSlopeBasisFuncKernel(BasisFuncKernel):
    def __init__(self, input_dim, start, stop, variance=1., active_dims=None, ARD=False, name='linear_segment'):
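A short usage sketch for the new kernel (not part of the PR; the data and degree are arbitrary): a degree-2 polynomial basis function kernel in a plain GP regression model.

```python
import numpy as np
import GPy
from GPy.kern import PolynomialBasisFuncKernel

np.random.seed(1)
X = np.linspace(-1, 1, 50)[:, None]
Y = 0.5*X**2 - X + 0.05*np.random.randn(50, 1)

kern = PolynomialBasisFuncKernel(input_dim=1, degree=2)   # basis: 1, x, x**2
m = GPy.models.GPRegression(X, Y, kern)
m.optimize()
print(m)
```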