Multi-task learning (MTL) creates a single machine learning model called...
Although multi-task deep neural network (DNN) models have computation an...
Hamiltonian Monte Carlo (HMC) is a powerful algorithm to sample latent v...
Personalization in Federated Learning (FL) aims to modify a collaborativ...
Homophily and heterophily are intrinsic properties of graphs that descri...
Subgraph representation learning based on Graph Neural Network (GNN) has...
Tree-structured multi-task architectures have been employed to jointly t...
This paper investigates deep neural network (DNN) compression from the p...
Training wide and deep neural networks (DNNs) requires large amounts of s...
Multi-task learning (MTL) jointly learns a set of tasks. It is a promisi...
Hard parameter sharing in multi-task learning (MTL) allows tasks to shar...
Graph Neural Networks (GNNs) are a new and increasingly popular family o...
It is extremely challenging for parallel applications to achieve the optimal p...
When capturing images in low-light conditions, the images often suffer f...
Continuous representations have been widely adopted in recommender syste...
Convolutional Neural Networks (CNNs) are being actively explored for safe...
We introduce the idea of Data Readiness Level (DRL) to measure the relat...