Gaussian Processes

New module implementing tasks related to Gaussian Processes.

2013-03-14 18:40 IJMC: Begun.

gaussianprocess.logLikelihood(*arg, **kw)[source]

Compute log likelihood using Gaussian Process techniques.

INPUTS:

(fitparams, function, arg1, arg2, ... , depvar, cov_func, cov_args, ncov_params)

OR:

(fitparams, function, arg1, arg2, ... , depvar, cov_func, cov_args, ncov_params, kw)

OR:

(allparams, (args1, args2, ..), npars=(npar1, npar2, ...))

where ‘allparams’ is an array concatenation of each function’s input parameters.

fitparams : sequence

Parameters used to compute the likelihood. The first values (length ‘ncov_params’) are used to compute the covariance matrix via ‘cov_func’, and the subsequent values are passed to ‘function’.

cov_func : function

Function to generate covariance matrix (e.g. squaredExponentialKernel()).

cov_args : tuple

Arguments to be passed to ‘cov_func’.

ncov_params: scalar

Number (N) of parameters used to call ‘cov_func’. These must be the first N values of ‘fitparams’!

Computes the likelihood using full covariance matrices, i.e.:

(2 pi)^(-N/2) * |C|^(-1/2) * exp(-0.5 * r.T * C^-1 * r),  where N = len(r)

where r is the residual vector

(depvar - function(fitparams, arg1, ...)).ravel()

‘function’ need not output a 1-D vector, but its output must have the same shape as ‘depvar’. The difference (the residual) will be np.ravel()’ed, and the result of the ravel() operation must be a vector whose length equals the 1-D size of the covariance matrix.

OPTIONS:
gaussprior, uniformprior: must be the same length as “params”,

which is the concatenation of the parameters passed to ‘cov_func’ and ‘function’. Elements must be either ‘None’ or 2-tuples of (mean, std.dev.).

ngaussprior: must be the same length as “params”, which is the

concatenation of the parameters passed to ‘cov_func’ and ‘function’! Elements must be either ‘None’ or 3-tuples of form: (indices, mean_vector, covariance_matrix).

jointpars : also valid

SEE_ALSO:
phasecurves.devfunc() (for a frequentist version without

all these extra covariance-matrix hyperparameters).
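For illustration, here is a minimal self-contained sketch of the quantity this routine computes (the helper name below is hypothetical, not the module's actual implementation), using a Cholesky factorization for numerical stability:

```python
import numpy as np

def gp_loglikelihood(residuals, cov):
    """Log of (2 pi)^(-N/2) |C|^(-1/2) exp(-0.5 r.T C^-1 r),
    computed via a Cholesky factorization of C.
    (Hypothetical helper, for illustration only.)"""
    r = np.ravel(residuals)
    n = r.size
    L = np.linalg.cholesky(cov)            # C = L L^T
    alpha = np.linalg.solve(L, r)          # solves L a = r
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (n * np.log(2.0 * np.pi) + logdet + alpha @ alpha)

# With an identity covariance this reduces to an ordinary chi-squared term:
r = np.array([0.5, -0.3, 0.1])
C = np.eye(3)
ll = gp_loglikelihood(r, C)
```

The Cholesky route avoids forming C^-1 explicitly, which matters once the covariance matrix becomes large or poorly conditioned.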

gaussianprocess.negLogLikelihood(*arg, **kw)[source]

Returns negative log likelihood, for minimization routines.

See logLikelihood() for syntax and details.
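As a usage sketch, the negated likelihood plugs straight into a minimizer such as scipy.optimize.minimize. The toy likelihood below is a hypothetical stand-in (a Gaussian with unknown mean and unit variance), since the real call signature mirrors logLikelihood() above:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for gaussianprocess.logLikelihood (illustration only):
data = np.array([0.9, 1.1, 1.0, 0.8, 1.2])

def loglikelihood(params):
    mu = params[0]
    r = data - mu
    return -0.5 * (data.size * np.log(2.0 * np.pi) + np.sum(r**2))

def neglogilikelihood_pattern(params):
    # Same pattern as negLogLikelihood: simply negate for minimization.
    return -loglikelihood(params)

fit = minimize(neglogilikelihood_pattern, x0=[0.0])
# fit.x[0] converges to the maximum-likelihood mean (the sample mean here).
```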

gaussianprocess.squaredExponentialKernel(params, x, x1=None)[source]

Construct a squared-exponential kernel with specified parameters.

INPUTS:
x : 1D NumPy array

Input independent data.

params : sequence

params[0] – h : scalar

Amplitude scaling

params[1] – L : scalar

Length scaling

params[2] – s : OPTIONAL scalar

White noise component from whiteNoiseKernel() (optional!)

x1 : 1D NumPy array, optional

Second set of input locations. If None, ‘x’ is used, giving a square kernel matrix.

NOTES:

Computes k(x, x’) = h^2 exp(- [ (x - x’) / 2L ]^2 )

RETURNS:

k, the kernel matrix.

REFERENCE:

Roberts et al. 2012, Eq. 13.

EXAMPLE:
import gaussianprocess as gp
import numpy as np
import pylab as py

x0 = np.arange(50.)  # independent data
k = gp.squaredExponentialKernel([1, 5], x0)
sample_draw = np.random.multivariate_normal(np.zeros(x0.size), k, 1).ravel()
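For reference, a standalone sketch that follows the formula stated in NOTES above, k(x, x') = h^2 exp(-[(x - x') / 2L]^2); this is an illustration, not necessarily the module's exact implementation:

```python
import numpy as np

def se_kernel(params, x, x1=None):
    """Sketch of k(x, x') = h^2 * exp(-[(x - x') / (2 L)]^2).
    params = (h, L[, s]); the optional s adds a white-noise diagonal.
    (Illustrative re-implementation, not the module's own code.)"""
    h, L = params[0], params[1]
    x = np.asarray(x)
    x1 = x if x1 is None else np.asarray(x1)
    dx = x[:, None] - x1[None, :]                   # pairwise differences
    k = h**2 * np.exp(-(dx / (2.0 * L))**2)
    if len(params) > 2 and k.shape[0] == k.shape[1]:
        k = k + params[2]**2 * np.eye(k.shape[0])   # optional white noise
    return k

k = se_kernel([1.0, 5.0], np.arange(50.0))
```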
gaussianprocess.whiteNoiseKernel(s, x)[source]

Construct a white-noise kernel (diagonal matrix) with specified sigma.

INPUTS:
x : 1D NumPy array, or scalar

Input independent data, or size of desired matrix.

s : scalar

Sigma [i.e., sqrt(variance) ] of the desired white noise.

NOTES:

Computes k(i, j) = delta_{ij} s^2

RETURNS:

k, the kernel matrix.

REFERENCE:

Roberts et al. 2012, Eq. 12.

EXAMPLE:
import gaussianprocess as gp
import numpy as np
import pylab as py

x0 = np.arange(50.)  # independent data
k = gp.whiteNoiseKernel(1, x0)
sample_draw = np.random.multivariate_normal(np.zeros(x0.size), k, 1).ravel()
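A standalone sketch of the same diagonal construction, k(i, j) = delta_ij * s^2 (illustrative only, not the module's own code):

```python
import numpy as np

def white_noise_kernel(s, x):
    """Sketch of k(i, j) = delta_ij * s^2.
    'x' may be a data array (its size sets the dimension) or a scalar size.
    (Illustrative re-implementation.)"""
    n = x.size if isinstance(x, np.ndarray) else int(x)
    return s**2 * np.eye(n)

k = white_noise_kernel(2.0, np.arange(5.0))
```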
