Import all necessary modules from Shogun
from modshogun import RealFeatures, BinaryLabels, GaussianKernel, Math
from modshogun import ProbitLikelihood, ZeroMean, LaplacianInferenceMethod, GaussianProcessBinaryClassification
Generate some easy toy data: three isotropic 2D Gaussians, with labels +1 and -1, and plot it. Test data is a mesh on the 2D plane
from numpy import asarray, hstack, ones, shape, linspace, reshape
from numpy.random import randn
from matplotlib.pyplot import plot, contour, pcolor
n=30
mean_a1=asarray([0,0])
mean_a2=asarray([2,2])
mean_b=asarray([1,1])
std_dev=0.5
# transpose so that each column is one example (Shogun's convention)
X1=(randn(n,2)*std_dev+mean_a1).T
X2=(randn(n,2)*std_dev+mean_a2).T
X3=(randn(n,2)*std_dev+mean_b).T
X=hstack((X1,X2,X3))
Y=-ones(shape(X)[1])
Y[:2*n]+=2  # first 2*n points (clusters a1, a2) get label +1, remaining n get -1
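The labelling trick above is compact; a standalone check of what it produces (plain numpy, independent of the notebook state):

```python
import numpy as np

n = 30
# start with -1 everywhere, then shift the first 2*n entries up to +1
Y = -np.ones(3 * n)
Y[:2 * n] += 2

counts = (int((Y == 1).sum()), int((Y == -1).sum()))
```

So the first two clusters carry the positive class and the third the negative one.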
# generate all pairs in 2d range of training data
import itertools
n_test=60
P=linspace(X[0,:].min()-1, X[0,:].max()+1, n_test)
Q=linspace(X[1,:].min()-1, X[1,:].max()+1, n_test)
X_test=asarray(list(itertools.product(P, Q))).T
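The `itertools.product` call enumerates every (P, Q) pair with Q varying fastest; a small sketch showing that the same 2 x n_test**2 test matrix can be built with numpy's `meshgrid` alone (illustrative values, not the notebook's grid):

```python
import itertools
import numpy as np

n_test = 5
P = np.linspace(-1.0, 1.0, n_test)
Q = np.linspace(-1.0, 1.0, n_test)

# all pairs, Q varying fastest; transpose puts one example per column
X_test_it = np.asarray(list(itertools.product(P, Q))).T

# the same grid built with numpy alone
PP, QQ = np.meshgrid(P, Q, indexing="ij")
X_test_np = np.vstack((PP.ravel(), QQ.ravel()))
```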
# plot training data
plot(X[0,:2*n],X[1,:2*n], 'ro')
_=plot(X[0,2*n:],X[1,2*n:], 'bo')
Convert data into Shogun representation and print dimensions to be sure the data was passed in correctly
labels=BinaryLabels(Y)
feats_train=RealFeatures(X)
feats_test=RealFeatures(X_test)
print(feats_train.get_num_features(), feats_train.get_num_vectors())
print(feats_test.get_num_features(), feats_test.get_num_vectors())
Specify a Shogun GP (probit GP classification with Laplace approximation) with fixed hyper-parameters and pass it the data
kernel_sigma=2.5
kernel=GaussianKernel(10, kernel_sigma)  # arguments: cache size, kernel width
mean=ZeroMean()
lik=ProbitLikelihood()
inf=LaplacianInferenceMethod(kernel, feats_train, mean, labels, lik)
gp = GaussianProcessBinaryClassification(inf)
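The `ProbitLikelihood` used above squashes the latent GP function f through the standard normal CDF, so p(y=+1|f) = Phi(f). A minimal numpy sketch of that mapping (illustration only, not part of the Shogun API):

```python
import numpy as np
from math import erf, sqrt

def probit(f):
    """p(y = +1 | f) = Phi(f), the standard normal CDF."""
    return np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in f])

f = np.array([-2.0, 0.0, 2.0])  # example latent function values
p = probit(f)
# symmetry of the probit likelihood: p(y = -1 | f) = Phi(-f) = 1 - Phi(f)
```

Large positive latent values map to probabilities near 1, large negative ones near 0, and f = 0 sits exactly on the class boundary at p = 0.5.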
Train the GP, perform inference, and plot the predictive distribution with the decision boundary
gp.train()
predictions=gp.apply_binary(feats_test)
Y_test=predictions.get_values()
Y_test=reshape(Y_test, (n_test,n_test))
plot(X[0,:2*n],X[1,:2*n], 'ro')
plot(X[0,2*n:],X[1,2*n:], 'bo')
contour(P,Q,Y_test, levels=[0])  # decision boundary
_=pcolor(P,Q,Y_test)