Seq2Seq-SC: End-to-End Semantic Communication Systems with Pre-trained Language Model

10/27/2022
by   Ju-Hyung Lee, et al.

While semantic communication is expected to bring unprecedented communication efficiency compared to classical communication, many challenges must be resolved before its potential can be realized. In this work, we present a realistic semantic network, dubbed seq2seq-SC, which is compatible with 5G NR and can work with generalized text datasets by utilizing a pre-trained language model. We also employ a performance metric (SBERT) that can accurately measure semantic similarity, and we show that seq2seq-SC achieves superior performance while extracting semantically meaningful information.
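SBERT-style metrics score semantic similarity as the cosine similarity between sentence embeddings. The sketch below illustrates only that scoring step; the embedding vectors are placeholders (in practice they would come from a real SBERT model, e.g. via the sentence-transformers package), and the variable names are illustrative, not from the paper.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Placeholder embeddings standing in for SBERT outputs of a transmitted
# sentence and the sentence recovered at the receiver. With a real model,
# these would be produced by something like model.encode(sentence).
sent_a = [0.20, 0.70, 0.10]   # hypothetical embedding of the transmitted sentence
sent_b = [0.25, 0.65, 0.12]   # hypothetical embedding of the received sentence

score = cosine_similarity(sent_a, sent_b)  # near 1.0 => semantically similar
```

A score close to 1.0 indicates the received sentence preserves the meaning of the transmitted one, even if the surface wording differs.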
