Parameters of Combinatorial Neural Codes
Motivated by recent developments in the mathematical theory of neural codes, we study the structure of error-correcting codes for the binary asymmetric channel. These are also known as combinatorial neural codes and can be seen as the discrete version of neural receptive field codes. We introduce two notions of discrepancy between binary vectors, which are not metric functions in general but nonetheless capture the mathematics of the binary asymmetric channel. In turn, these lead to two new fundamental parameters of combinatorial neural codes, both of which measure the probability that the maximum likelihood decoder fails. We then derive various bounds for the cardinality and weight distribution of a combinatorial neural code in terms of these new parameters, giving examples of codes meeting the bounds with equality.
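The discrepancy functions and code parameters themselves are defined in the full paper; as a rough illustration of the channel model underlying them, the following Python sketch simulates a binary asymmetric channel and estimates the maximum-likelihood decoding failure probability for a small code. The crossover probabilities (a 0 received as 1 with probability p, a 1 received as 0 with probability q, with p ≠ q), the toy code C, and all helper names are illustrative assumptions, not the paper's constructions.

```python
import random

# Binary asymmetric channel (illustrative values, not from the paper):
# a 0 flips to 1 with probability P_01, a 1 flips to 0 with probability P_10.
P_01, P_10 = 0.05, 0.20

def transmit(x, rng=random):
    """Pass a binary tuple through the binary asymmetric channel."""
    return tuple(
        (1 - b) if rng.random() < (P_01 if b == 0 else P_10) else b
        for b in x
    )

def likelihood(x, y):
    """P(y received | x sent) under the channel above."""
    prob = 1.0
    for xb, yb in zip(x, y):
        if xb == 0:
            prob *= P_01 if yb == 1 else 1 - P_01
        else:
            prob *= P_10 if yb == 0 else 1 - P_10
    return prob

def ml_decode(code, y):
    """Maximum likelihood decoding: return the codeword most likely to yield y."""
    return max(code, key=lambda x: likelihood(x, y))

# A toy combinatorial neural code: an arbitrary subset of {0,1}^5.
C = [(0, 0, 0, 0, 0), (1, 1, 0, 1, 0), (0, 1, 1, 0, 1), (1, 0, 1, 1, 1)]

# Monte Carlo estimate of the probability that the ML decoder fails,
# the quantity the paper's new parameters are designed to measure.
trials, failures = 10_000, 0
for _ in range(trials):
    x = random.choice(C)
    if ml_decode(C, transmit(x)) != x:
        failures += 1
print(f"estimated ML failure probability: {failures / trials:.3f}")
```

Note that when p ≠ q the channel is not symmetric, so ML decoding is not equivalent to minimizing Hamming distance; this is why non-metric discrepancy functions, rather than the Hamming metric, are the natural tool here.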