Deep Temporal Memory #13
-
Hi, I think a big issue with Triadic Memory right now is memory usage. I am guessing the deep temporal memory example used over 8 GB of RAM, which is a lot for memorizing roughly 2 KB worth of data. The problem is essentially equivalent to storing large amounts of 3D voxels, so I would suggest storing the memories in a sparse voxel octree. I also think it is important to emphasize the generalization aspects, to differentiate Triadic Memory from an expensive lookup table. An obvious way of doing this would be to predict noisy data, e.g. the evolution of a noisy Lorenz attractor. Anyway, just some ideas!
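For illustration, here is a minimal sketch of a sparse voxel octree mapping (x, y, z) addresses to counters, assuming the address cube fits in a power-of-two side (depth 10 covers 1024). The class and method names are hypothetical and not taken from the TriadicMemory code base:

```python
# Minimal sparse voxel octree sketch: (x, y, z) -> counter.
# Nodes allocate their 8 child slots lazily, so empty regions cost nothing.

class OctreeNode:
    __slots__ = ("children", "value")
    def __init__(self):
        self.children = None   # list of 8 child slots, allocated on demand
        self.value = 0         # counter, meaningful only at leaf depth

class SparseVoxelOctree:
    def __init__(self, depth=10):      # 2**10 = 1024 covers side 1000
        self.depth = depth
        self.root = OctreeNode()

    def _child_index(self, x, y, z, level):
        # Pick one of 8 octants from the current bit of each coordinate.
        bit = self.depth - 1 - level
        return ((x >> bit & 1) << 2) | ((y >> bit & 1) << 1) | (z >> bit & 1)

    def increment(self, x, y, z):
        node = self.root
        for level in range(self.depth):
            if node.children is None:
                node.children = [None] * 8
            i = self._child_index(x, y, z, level)
            if node.children[i] is None:
                node.children[i] = OctreeNode()
            node = node.children[i]
        node.value += 1

    def get(self, x, y, z):
        node = self.root
        for level in range(self.depth):
            if node.children is None:
                return 0
            node = node.children[self._child_index(x, y, z, level)]
            if node is None:
                return 0
        return node.value
```

Whether this beats a flat array depends on occupancy: the per-node pointer overhead can exceed the savings once most voxels are populated.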
-
A triadic memory instance with n = 1000 and population p = 10 can store 1M triples. In a sparse SDR representation, each SDR can be stored as p = 10 indices of about 10 bits each, i.e. 100 bits, so 1M triples translate to 3 × 100 × 1,000,000 = 300,000,000 bits (about 38 MB). The memory usage for this configuration is 954 MB, so we're looking at a storage efficiency of 4 percent. Assuming a dense SDR representation (as realized by the brain), each triple takes 3 × 1000 bits and the efficiency would be 40 percent. Consider that this is a redundant storage which does not require indexing...

At full capacity, one third of memory locations are 0, one third are 1, and one third hold values from 2 to about 11. Would a sparse voxel octree improve memory usage when 2/3 of possible values are populated?

Great idea to use the memory as a fast lookup mechanism for solutions of differential equations -- your brain must be doing something similar when catching a ball or riding a bike. What would be a simple proof of concept?
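One minimal version might be: integrate the Lorenz system, add noise, discretize each state into an SDR, feed consecutive SDRs into the temporal memory, and check whether the predictions track the clean trajectory. A rough Python sketch of the data-generation half; all parameter choices and the bucketing scheme are illustrative assumptions, not an existing TriadicMemory interface:

```python
import numpy as np

# Generate a noisy Lorenz trajectory and discretize each state into a
# sparse code (p active bits out of n), as a possible generalization test.

def lorenz_trajectory(steps=10000, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
    xyz = np.array([1.0, 1.0, 1.0])
    out = np.empty((steps, 3))
    for t in range(steps):
        x, y, z = xyz
        xyz = xyz + dt * np.array([sigma * (y - x), x * (rho - z), x * y - beta * z])
        out[t] = xyz
    return out

def to_sdr(state, lo, hi, n=1000, p=10):
    """Map a 3D state to p active bits by bucketing each coordinate."""
    u = (state - lo) / (hi - lo)                       # scale into [0, 1)
    base = (u * (n // 3)).astype(int) + np.array([0, n // 3, 2 * (n // 3)])
    # One anchor bit per coordinate, padded with neighbors up to population p
    # (a simple illustrative encoding, not a tuned one).
    idx = {(b + k) % n for b in base for k in range(p // 3 + 1)}
    return sorted(idx)[:p]

traj = lorenz_trajectory()
noisy = traj + np.random.normal(scale=0.5, size=traj.shape)
lo, hi = traj.min(axis=0), traj.max(axis=0)
sdrs = [to_sdr(s, lo, hi) for s in noisy]
# Feed consecutive SDRs into a temporal memory and test whether predictions
# stay close to the clean trajectory despite the added noise.
```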
-
Just posted a first implementation of a Deep Temporal Memory -- a predictive engine with eight levels of hierarchy.
https://github.com/PeterOvermann/TriadicMemory/blob/main/Mathematica/Notebooks/Deep%20Temporal%20Memory%20-%20Introduction.pdf
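To give a flavor of what "eight levels of hierarchy" means, here is a heavily simplified sketch of stacking predictive levels, where each level's prediction becomes the next level's input. It only illustrates the wiring of a hierarchy and does not reproduce the algorithm in the notebook; DummyLevel is a hypothetical stand-in for a real sequence predictor:

```python
# Sketch of an N-level predictive hierarchy (illustrative wiring only).

class DummyLevel:
    def __init__(self):
        self.prev = None
    def predict_and_learn(self, x):
        # Placeholder predictor: echo the previous input as the "prediction".
        pred, self.prev = self.prev, x
        return pred if pred is not None else x

class DeepTemporalMemoryStack:
    def __init__(self, depth=8, make_level=DummyLevel):
        self.levels = [make_level() for _ in range(depth)]
    def step(self, x):
        signal = x
        for level in self.levels:
            signal = level.predict_and_learn(signal)
        return signal
```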