\acresetall
In this paper, we have presented a model of the learning of \ac{MSI} in the \ac{SC}.
We have shown that enhancement of neural responses due to spatial and feature-based attention can arise naturally from statistical self-organization.
It is especially interesting that the model presented here is an extension of a previous model that considered only bottom-up multisensory processing:
First, a simple and natural extension of a model's domain, if successful, corroborates that model; our success at including attentional input alongside sensory input in our model of learning \ac{MSI} in the \ac{SC} therefore strengthens it.
Second, the extension requires no new mechanisms and treats attentional and sensory input identically, showing that the effects it reproduces can be explained by the same basic mechanisms as stimulus-driven \ac{MSI}.
This view fits well with previous work by \citet{krauzlis-et-al-2014}, who argued that, at the network level, attention may be seen not as a cause but as an effect of the ecological requirements on information processing.
Finally, the probabilistic origins of the network underlying our model suggest an elegant functional interpretation of both the specialization of neurons for different modality combinations and spatial and feature-based attention:
According to this interpretation, self-organization produces a latent-variable model in which each neuron's activity represents the probability of a specific hypothesis about the position of a stimulus and its quality in terms of modality combination.
Attentional input can then be seen either as additional evidence for, or as a prior on, one of the dimensions of the latent variable.
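As an illustrative sketch of this reading (with notation introduced here purely for exposition: $\ell$ for stimulus location, $m$ for the modality combination, $s$ for sensory input, and $a$ for attentional input), treating attention as a prior corresponds to
\begin{equation*}
  P(\ell, m \mid s, a) \propto P(s \mid \ell, m)\, P(\ell, m \mid a),
\end{equation*}
whereas treating it as additional evidence, conditionally independent of $s$ given the latent variable, corresponds to
\begin{equation*}
  P(\ell, m \mid s, a) \propto P(s \mid \ell, m)\, P(a \mid \ell, m)\, P(\ell, m).
\end{equation*}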