Domain adaptation (DA) aims to alleviate the domain shift between source...
We present a simple yet novel parameterized form of linear mapping to ac...
Reparameterization aims to improve the generalization of deep neural net...
Understanding and modelling the performance of neural architectures is k...
Evaluating neural network performance is critical to deep neural network...
Predicting neural architecture performance is a challenging task and is ...
Lifelong object re-identification incrementally learns from a stream of ...
In this paper, we investigate open-set recognition with domain shift, wh...
Systematicity, i.e., the ability to recombine known parts and rules to f...
We propose a simple but effective source-free domain adaptation (SFDA) m...
Most meta-learning approaches assume the existence of a very large set o...
Anderson mixing has been heuristically applied to reinforcement learning...
Domain adaptation (DA) aims to alleviate the domain shift between source...
Neural architecture search automates neural network design and has achie...
Neural architecture search (NAS) has achieved remarkable results in deep...
In real scenarios, state observations that an agent observes may contain...
Domain adaptation (DA) aims to transfer the knowledge learned from a sou...
Despite the empirical success of neural architecture search (NAS) in dee...
GANs largely increase the potential impact of generative models. Theref...
Unsupervised domain adaptation (UDA) aims to transfer the knowledge lear...
Deep neural networks have recently become a popular solution to keyword ...
Humans are capable of learning new tasks without forgetting previous one...
Class-incremental learning of deep networks sequentially increases the n...
Neural Architecture Search (NAS) is attractive for automatically produci...
An effective and efficient architecture performance evaluation scheme is...