Through in-context learning (ICL), large-scale language models are effec...
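As a minimal illustration of the ICL setup this abstract refers to: the pre-trained model is conditioned on a few labeled demonstrations followed by a query, with no gradient updates. The sentiment task, demonstrations, and labels below are illustrative placeholders, not taken from the paper.

```python
# A minimal in-context learning prompt: a few demonstrations plus a query.
# The completion the LM produces for the final "Sentiment:" slot is the
# prediction; no fine-tuning is involved. All examples here are made up.
demonstrations = [
    ("The movie was a delight from start to finish.", "positive"),
    ("I walked out halfway through.", "negative"),
]
query = "The plot dragged, but the acting was superb."

prompt = ""
for text, label in demonstrations:
    prompt += f"Review: {text}\nSentiment: {label}\n\n"
prompt += f"Review: {query}\nSentiment:"

print(prompt)  # pass this string to a pre-trained LM to obtain the label
```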
Out-of-distribution (OOD) detection aims to discern outliers from the in-distribution...
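As one concrete instance of this setup (the standard maximum-softmax-probability baseline, not necessarily the method this abstract goes on to propose), inputs whose top-class confidence falls below a threshold are flagged as outliers:

```python
# Maximum softmax probability (MSP) baseline for OOD detection: a low
# top-class probability is treated as evidence the input is an outlier.
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def is_ood(logits, threshold=0.7):
    msp = softmax(logits).max(axis=-1)  # confidence of the predicted class
    return msp < threshold              # True -> flagged as out-of-distribution

logits = np.array([[4.0, 0.5, 0.2],    # peaked -> in-distribution
                   [1.1, 1.0, 0.9]])   # flat   -> flagged as OOD
print(is_ood(logits))                   # [False  True]
```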
Large-scale pre-trained language models (PLMs) are well-known for being ...
Despite the recent explosion of research interest, in-context learning and ...
Transformer-based pre-trained language models (PLMs) have dramatically i...
We propose a novel method that enables us to determine words that deserv...
Main memory (DRAM) significantly impacts the power and energy utilizatio...
With the recent success and popularity of pre-trained language models (L...
In an attempt to combine extractive and abstractive summarization, Sente...
We propose a simple approach to train better Korean word representations...
We present a latent variable model for predicting the relationship betwe...
Most existing recursive neural network (RvNN) architectures utilize only...
We propose a method of stacking multiple long short-term memory (LSTM) l...
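For reference, a generic stacked-LSTM sentence encoder in PyTorch, where each layer consumes the hidden-state sequence of the layer below. The vocabulary size and dimensions are illustrative placeholders, and this sketch shows only plain stacking, not necessarily the stacking method the abstract proposes.

```python
# Generic stacked LSTMs: with num_layers > 1, torch.nn.LSTM feeds the
# hidden-state sequence of layer k-1 as the input sequence of layer k.
import torch
import torch.nn as nn

class StackedLSTMEncoder(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=300,
                 hidden_dim=512, num_layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)

    def forward(self, token_ids):
        x = self.embed(token_ids)           # (batch, seq_len, embed_dim)
        outputs, (h_n, c_n) = self.lstm(x)  # outputs come from the top layer
        return outputs, h_n

enc = StackedLSTMEncoder()
ids = torch.randint(0, 10000, (2, 7))       # batch of 2 toy sentences
out, h = enc(ids)
print(out.shape)                             # torch.Size([2, 7, 512])
```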
We present a novel neural architecture for the Argument Reasoning Compre...
The Layout-Aware Data Scheduler (LADS) data transfer tool identifies and addresses...
Future terabit networks are expected to dramatically improve big data...
Word embedding has become a fundamental component of many NLP tasks such...