PCA
PCA = principal components analysis, which is just computing the eigenvectors of the covariance or correlation matrix of a set of values over time -- i.e., how different state values (e.g., neural network activations) change in relation to all the other state variables over time. The first principal component is the eigenvector that captures the greatest amount of variance in these changes, the second captures the next greatest amount, and so on. Hebbian learning in a neural network does much the same thing, actually.
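To make that concrete, here is a minimal, self-contained sketch in plain Go (not using the pca package -- the data values, variable names, and the power-iteration method are all just for illustration) that builds a covariance matrix from a few observations and pulls out its leading eigenvector, i.e., the first principal component:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// rows = observations over time, cols = state variables (e.g., unit activations)
	data := [][]float64{
		{0.1, 0.9, 0.2},
		{0.4, 0.7, 0.3},
		{0.8, 0.2, 0.9},
		{0.9, 0.1, 0.8},
	}
	n, d := len(data), len(data[0])

	// column means, used to center the data
	mean := make([]float64, d)
	for _, row := range data {
		for j, v := range row {
			mean[j] += v / float64(n)
		}
	}

	// covariance matrix: cov[i][j] = average of (x_i - mean_i) * (x_j - mean_j)
	cov := make([][]float64, d)
	for i := range cov {
		cov[i] = make([]float64, d)
		for j := range cov[i] {
			for _, row := range data {
				cov[i][j] += (row[i] - mean[i]) * (row[j] - mean[j]) / float64(n-1)
			}
		}
	}

	// power iteration: repeatedly multiply a vector by cov and normalize --
	// this converges to the eigenvector with the largest eigenvalue,
	// which is the first principal component.
	pc := make([]float64, d)
	for i := range pc {
		pc[i] = 1
	}
	for iter := 0; iter < 100; iter++ {
		next := make([]float64, d)
		for i := 0; i < d; i++ {
			for j := 0; j < d; j++ {
				next[i] += cov[i][j] * pc[j]
			}
		}
		norm := 0.0
		for _, v := range next {
			norm += v * v
		}
		norm = math.Sqrt(norm)
		for i := range next {
			next[i] /= norm
		}
		pc = next
	}
	fmt.Println("first principal component:", pc)
}
```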
- Docs: pca

Here's how to run a PCA on a column of an etable and project the data onto the first two components:

```go
pc := &pca.PCA{}
pc.TableCol(rels, "Hidden", metric.Covariance64) // does the PCA on the "Hidden" column of the rels table, stores results in pc
prj := &etable.Table{}
// next, project the data onto the first two eigenvectors (components 0 and 1):
pc.ProjectColToTable(prj, rels, "Hidden", "TrialName", []int{0, 1})
ss.ConfigPCAPlot(ss.PCAPlot, prj) // ss.PCAPlot here is assumed to be an *eplot.Plot2D field on the Sim
```
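The resulting prj table should have one row per row of the source table, with the TrialName labels carried over and Prjn0, Prjn1 columns holding the projection of each Hidden pattern onto the first and second eigenvectors respectively -- these are the column names that the plot configuration below refers to.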
And here's how you can configure the plot:

```go
func (ss *Sim) ConfigPCAPlot(plt *eplot.Plot2D, dt *etable.Table) {
	nm, _ := dt.MetaData["name"]
	plt.Params.Title = "Family Trees PCA Plot: " + nm
	plt.Params.XAxisCol = "Prjn0" // first component goes on the X axis
	plt.SetTable(dt)
	plt.Params.Lines = false // scatter plot: points only, no connecting lines
	plt.Params.Points = true
	// order of params: on, fixMin, min, fixMax, max
	plt.SetColParams("TrialName", true, true, 0, false, 0) // string column -- shown as point labels
	plt.SetColParams("Prjn0", false, true, 0, false, 0)    // off as a Y value -- it is the X axis
	plt.SetColParams("Prjn1", true, true, 0, false, 0)     // second component on the Y axis
}
```
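With this in place, calling ss.ConfigPCAPlot(ss.PCAPlot, prj) after the projection step above and then updating the plot in the GUI shows each trial as a labeled point in the space of the first two principal components, so trials with similar Hidden representations end up near each other.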