@m-pilia
Created November 14, 2019 10:39
Example sampling of a Gaussian Process posterior
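The script conditions a zero-mean Gaussian process with a squared exponential kernel on a handful of noisy observations (X₁, f₁) and draws sample functions at test inputs X₂. The posterior mean μ and covariance Σ computed below are the standard Gaussian conditioning formulas, with K_{ij} = K(X_i, X_j):

\[
\mu = K_{12}^\top \left(K_{11} + \sigma_n^2 I\right)^{-1} f_1,
\qquad
\Sigma = K_{22} - K_{12}^\top \left(K_{11} + \sigma_n^2 I\right)^{-1} K_{12}.
\]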
using Distributions
using LinearAlgebra
using Plots
using Random
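# Optionally fix the RNG seed for reproducible samples, e.g. Random.seed!(0)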
pyplot();  # use the PyPlot backend for Plots
# Training points: first column is the input x, second column the observed value f(x)
P = [
1.0 1.0
2.0 3.0
4.0 -2.0
7.0 5.0
8.0 3.0
]
# Number of training points and of test (prediction) points
N₁ = size(P, 1)
N₂ = 500;
# I.i.d. observation noise standard deviation
σₙ = 0.05;
# Training targets and inputs
f₁ = P[:, 2]
X₁ = P[:, 1]
# Test inputs on a regular grid over [0, 10]
X₂ = collect(range(0, stop=10, length=N₂));
# Small random jitter added to the posterior covariance for numerical stability
ϵ = 1e-12 * rand() * I;
# Build the conditional (posterior) distribution
# Squared exponential kernel
k(x, y) = exp(-0.5 * norm(x - y)^2);
# Gram matrix between two sets of points
K(X, Y) = [k(X[i], Y[j]) for i in 1:size(X, 1), j in 1:size(Y, 1)];
# Inverse of the noisy training covariance
Σ₁₁⁻¹ = inv(K(X₁, X₁) + σₙ^2 * I);
Σ₁₂ = K(X₁, X₂);
Σ₂₂ = K(X₂, X₂);
# Posterior mean and covariance at the test points
μ = Σ₁₂' * Σ₁₁⁻¹ * f₁;
Σ = Σ₂₂ - Σ₁₂' * Σ₁₁⁻¹ * Σ₁₂ + ϵ;
# Pointwise posterior standard deviation, used for the confidence ribbon
σ = sqrt.(diag(Σ));
# Wrap Σ as Hermitian to discard numerical asymmetry before building the distribution
d = MvNormal(μ, Matrix(Hermitian(Σ)));
# Sample some functions
f₂ = rand(d, 5);
# Plot the sampled functions, the posterior mean with a ±2σ ribbon, and the training points
p = plot(X₂, f₂);
plot!(X₂, μ, linewidth=2, ribbon=(2σ, 2σ), fillalpha=0.2, c="grey", label="μ ± 2σ")
scatter!(X₁, f₁, label="P")
display(p);
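Explicitly inverting K(X₁, X₁) + σₙ²I is fine for five training points, but the numerically stabler route is a Cholesky factorization with triangular solves. A minimal sketch of that variant, reusing k, K, X₁, X₂, f₁ and σₙ from the script above (the _chol names are chosen here just for illustration):
# Cholesky-based posterior, a stabler alternative to the explicit inverse above;
# assumes K, X₁, X₂, f₁ and σₙ from the script are in scope
C = cholesky(Hermitian(K(X₁, X₁) + σₙ^2 * I))
α = C \ f₁                      # (K₁₁ + σₙ²I)⁻¹ f₁ via two triangular solves
μ_chol = K(X₁, X₂)' * α         # posterior mean, equal to μ above
V = C.L \ K(X₁, X₂)             # L⁻¹ K₁₂, where K₁₁ + σₙ²I = L Lᵀ
Σ_chol = K(X₂, X₂) - V' * V     # posterior covariance, equal to Σ above (up to the ϵ jitter)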