Seq2Seq-SC: End-to-End Semantic Communication Systems with Pre-trained Language Model
While semantic communication is expected to bring unprecedented communication efficiency compared to classical communication, many challenges must be resolved to realize its potential. In this work, we present a realistic semantic network, dubbed seq2seq-SC, which is compatible with 5G NR and can work with a generalized text dataset by utilizing a pre-trained language model. We also employ a performance metric (SBERT) that can accurately measure semantic similarity, and we show that seq2seq-SC achieves superior performance while extracting semantically meaningful information.
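The SBERT metric scores a transmitted sentence against its reconstruction by embedding both with a sentence-level encoder and taking the cosine similarity of the embeddings. The sketch below illustrates only the cosine-similarity step on placeholder embedding vectors; the actual encoder model and embedding dimensionality used in the paper are not specified here and would come from a pre-trained SBERT model.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors.

    With SBERT, u and v would be sentence embeddings of the
    transmitted and reconstructed sentences; a score near 1.0
    indicates high semantic similarity.
    """
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Placeholder embeddings standing in for SBERT outputs (assumption:
# real embeddings are high-dimensional vectors from the encoder).
sent_tx = [0.2, 0.8, 0.1]
sent_rx = [0.25, 0.75, 0.05]
print(round(cosine_similarity(sent_tx, sent_rx), 3))
```

In practice the embeddings would be produced by a pre-trained sentence encoder (e.g. via the `sentence-transformers` package), and the score above would serve as the semantic-similarity figure of merit.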