Zero-shot inference is a powerful paradigm that enables the use of large...
The quality of training data impacts the performance of pre-trained larg...
Machine learning models – including prominent zero-shot models – are oft...
Recent work has shown that language models' (LMs) prompt-based learning...
Weak supervision overcomes the label bottleneck, enabling efficient deve...
The ability to generalize to unseen domains is crucial for machine learn...
An important class of techniques for resonant anomaly detection in high...
Weak supervision (WS) is a rich set of techniques that produce pseudolab...
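The pseudolabeling idea behind weak supervision can be illustrated with a minimal sketch. The labeling functions below are hypothetical heuristics invented for illustration, not taken from any framework mentioned above: each heuristic votes on an example (or abstains), and the majority vote becomes the pseudolabel.

```python
# Minimal weak-supervision sketch: noisy labeling functions (LFs) vote on
# each example; the majority label is the pseudolabel. None = abstain.
from collections import Counter

def lf_contains_good(text):   # hypothetical positive-keyword heuristic
    return 1 if "good" in text else None

def lf_contains_bad(text):    # hypothetical negative-keyword heuristic
    return 0 if "bad" in text else None

def lf_exclaim(text):         # hypothetical punctuation heuristic
    return 1 if text.endswith("!") else None

LFS = [lf_contains_good, lf_contains_bad, lf_exclaim]

def pseudolabel(text):
    votes = [v for lf in LFS if (v := lf(text)) is not None]
    if not votes:
        return None  # every labeling function abstained
    return Counter(votes).most_common(1)[0][0]

print(pseudolabel("good movie!"))  # -> 1
print(pseudolabel("bad acting"))   # -> 0
```

Real WS frameworks replace the majority vote with a learned label model that estimates each heuristic's accuracy and correlations, but the aggregation idea is the same.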
Creating large-scale high-quality labeled datasets is a major bottleneck...
Large language models (LLMs) transfer well to new tasks out-of-the-box s...
Weak supervision (WS) is a powerful method to build labeled datasets for...
Foundation models offer an exciting new paradigm for constructing models...
Weak supervision (WS) frameworks are a popular way to bypass hand-labeli...
Labeling data for modern machine learning is expensive and time-consumin...
Our goal is to enable machine learning systems to be trained interactive...
Knowledge graph (KG) embeddings learn low-dimensional representations of...
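One well-known low-dimensional KG embedding formulation (a TransE-style scorer, shown here purely as an illustration) treats a triple (head, relation, tail) as plausible when head + relation lands near tail in the embedding space:

```python
import math

# Illustrative TransE-style scoring function: a triple (h, r, t) is
# plausible when the translated head h + r is close to the tail t.
# Score is the negated Euclidean distance, so 0 is the maximum.
def transe_score(h, r, t):
    return -math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# An exactly satisfied triple scores 0; worse triples score more negatively.
print(transe_score([1.0, 0.0], [0.0, 1.0], [1.0, 1.0]))
```

Training then pushes scores of observed triples above those of corrupted (negative) triples; the embedding vectors themselves are the learned parameters.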
A popular way to estimate the causal effect of a variable x on y from ob...
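A standard estimator in this observational setting, sketched here under strong illustrative assumptions (linear relationships and no unobserved confounding; all variable names and coefficients are synthetic), adjusts for an observed confounder z by regressing y on both x and z:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
z = rng.normal(size=n)                       # observed confounder
x = 0.8 * z + rng.normal(size=n)             # x depends on z
y = 2.0 * x + 1.5 * z + rng.normal(size=n)   # true effect of x on y is 2.0

# Naive regression of y on x alone absorbs z's influence and is biased.
naive = np.polyfit(x, y, 1)[0]

# Adjusting for z: the coefficient on x in a joint least-squares fit.
X = np.column_stack([x, z, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
adjusted = beta[0]

print(f"naive: {naive:.2f}, adjusted: {adjusted:.2f}")
```

With this data-generating process the naive slope overshoots (roughly 2.7) while the adjusted coefficient recovers a value near the true effect of 2.0; the adjustment is only valid when all confounders are observed.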
Weak supervision is a popular method for building machine learning model...
Since manually labeling training data is slow and expensive, recent indu...
Labeling training data is a key bottleneck in the modern machine learnin...
As machine learning models continue to increase in complexity, collectin...
Hyperbolic embeddings offer excellent quality with few dimensions when e...
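The geometry that makes low-dimensional hyperbolic embeddings work can be seen in the Poincaré ball distance, a standard formula (this snippet is an illustration, not tied to any particular embedding method above):

```python
import math

def poincare_dist(u, v):
    """Distance between two points inside the unit Poincaré ball."""
    sq_norm = lambda w: sum(c * c for c in w)
    diff = sq_norm([a - b for a, b in zip(u, v)])
    return math.acosh(1 + 2 * diff / ((1 - sq_norm(u)) * (1 - sq_norm(v))))

# Distances blow up near the boundary, giving exponentially more "room"
# than Euclidean space -- useful for embedding trees and hierarchies.
print(poincare_dist([0.0, 0.0], [0.5, 0.0]))  # ln(3) ~ 1.0986
print(poincare_dist([0.0, 0.0], [0.99, 0.0]))
```

For a point at radius r on a geodesic through the origin the distance is 2 artanh(r) = ln((1+r)/(1-r)), so capacity grows exponentially as points approach the boundary.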
In this work, we investigate the problem of constructing codes capable o...
After being trained, classifiers must often operate on data that has bee...