Gaussian process change¶
Description¶
This cost function detects changes in the mean and scale of a Gaussian time series. Formally, for a signal \(\{y_t\}_t\) on an interval \(I\),
\[c(y_{I}) = |I| \log\det\widehat{\Sigma}_I\]
where \(\widehat{\Sigma}_I\) is the empirical covariance matrix of the sub-signal \(\{y_t\}_{t\in I}\).
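For intuition, in the univariate case (d = 1) the empirical covariance matrix reduces to the empirical variance \(\widehat{\sigma}_I^2\) of the sub-signal, and the cost simplifies to
\[c(y_{I}) = |I| \log\widehat{\sigma}_I^2,\]
so segments whose samples are widely dispersed around their mean are penalized.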
Usage¶
Start with the usual imports and create a signal.
import numpy as np
import matplotlib.pyplot as plt
import ruptures as rpt
# creation of data
n, dim = 500, 3 # number of samples, dimension
n_bkps, sigma = 3, 5 # number of change points, noise standard deviation
signal, bkps = rpt.pw_constant(n, dim, n_bkps, noise_std=sigma)
Then create a CostNormal instance and print the cost of the sub-signal signal[50:150].
c = rpt.costs.CostNormal().fit(signal)
print(c.error(50, 150))
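To relate this number to the formula in the description, the following sketch evaluates \(|I| \log\det\widehat{\Sigma}_I\) directly with NumPy. This is a minimal illustration, not the library implementation, which may slightly regularize the covariance for numerical stability, so the two values can differ marginally.
# manual evaluation of |I| * log det(Sigma_hat) on signal[50:150]
sub = signal[50:150]
sigma_hat = np.cov(sub.T, bias=True)  # empirical covariance matrix (dim x dim)
print(sub.shape[0] * np.log(np.linalg.det(sigma_hat)))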
You can also compute the sum of costs for a given list of change points.
print(c.sum_of_costs(bkps))
print(c.sum_of_costs([10, 100, 200, 250, n]))
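The sum of costs is simply the segment costs added over the consecutive intervals delimited by the change points; a minimal sketch of that equivalence (assuming, as above, that the list ends with the number of samples n):
# equivalent to c.sum_of_costs(bkps): sum the cost of each segment
segments = zip([0] + bkps[:-1], bkps)
print(sum(c.error(start, end) for start, end in segments))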
In order to use this cost class in a change point detection algorithm (inheriting from BaseEstimator), either pass a CostNormal instance (through the argument 'custom_cost') or set model="normal".
c = rpt.costs.CostNormal()
algo = rpt.Dynp(custom_cost=c)
# is equivalent to
algo = rpt.Dynp(model="normal")
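For completeness, a short end-to-end sketch using the signal generated above: fit the detection algorithm and retrieve the estimated change points (here the true number of change points n_bkps is assumed known).
algo = rpt.Dynp(model="normal").fit(signal)
my_bkps = algo.predict(n_bkps=n_bkps)  # estimated change point indexes, ending with n
print(my_bkps)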
Code explanation¶
class ruptures.costs.CostNormal
Maximum Gaussian likelihood.