Optimal Combinatorial Neural Codes with Matched Metric δ_r: Characterization and Constructions

12/15/2021
by Aixian Zhang, et al.

Motivated by theoretical neuroscience, G. Cotardo and A. Ravagnani in <cit.> introduced a class of asymmetric binary codes called combinatorial neural codes (CN codes for short), equipped with a "matched metric" δ_r, the asymmetric discrepancy, in place of the Hamming distance d_H used for ordinary error-correcting codes. They also presented Hamming, Singleton and Plotkin bounds for CN codes with respect to δ_r, and asked how to construct CN codes with large size |C| and large δ_r(C). In this paper we first show that a binary code C attains one of the above bounds for δ_r(C) if and only if C attains the corresponding bound for d_H and r is sufficiently close to 1. This means that all optimal CN codes come from the usual optimal codes. Secondly, we present several constructions of CN codes with good and flexible parameters (n, K, δ_r(C)) by using bent functions.
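The abstract does not reproduce the definition of δ_r or the constructions themselves, but the bent functions it mentions are standard objects: Boolean functions on an even number of variables whose Walsh–Hadamard spectrum is flat. As a hedged illustration (not the paper's construction), the sketch below checks the textbook Maiorana–McFarland example f(x1,x2,x3,x4) = x1x2 ⊕ x3x4 for bentness by verifying that every Walsh coefficient has absolute value 2^(n/2) = 4.

```python
from itertools import product

def f(x):
    # Maiorana-McFarland bent function on 4 variables: x1*x2 XOR x3*x4
    return (x[0] & x[1]) ^ (x[2] & x[3])

def walsh_spectrum(f, n):
    # W_f(a) = sum over x in {0,1}^n of (-1)^(f(x) XOR <a,x>),
    # where <a,x> is the inner product mod 2
    spectrum = []
    for a in product((0, 1), repeat=n):
        s = sum((-1) ** (f(x) ^ (sum(ai & xi for ai, xi in zip(a, x)) % 2))
                for x in product((0, 1), repeat=n))
        spectrum.append(s)
    return spectrum

spec = walsh_spectrum(f, 4)
# f is bent iff |W_f(a)| = 2^(n/2) for every a
print(all(abs(w) == 4 for w in spec))
```

Any function with this flat-spectrum property is maximally far from all affine functions, which is what makes bent functions useful for building codes with good distance (or discrepancy) parameters.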
