Fine-tuning vision-language models (VLMs) like CLIP to downstream tasks ...
We introduce an adaptive method with formal quality guarantees for weak ...
Alfred is the first system for programmatic weak supervision (PWS) that ...
We develop a rigorous mathematical analysis of zero-shot learning with a...
As post hoc explanation methods are increasingly being leveraged to expl...
We propose a new strategy for applying large pre-trained language models...
We introduce compositional soft prompting (CSP), a parameter-efficient l...
PromptSource is a system for creating, sharing, and using natural langua...
Machine learning practitioners often have access to a spectrum of data: ...
Large language models have recently been shown to attain reasonable zero...
In situations where explanations of black-box models may be useful, the ...
Programmatic weak supervision creates models without hand-labeled traini...
In many practical few-shot learning problems, even though labeled exampl...
Zero-shot learning relies on semantic class representations such as attr...
Labeling training data is one of the most costly bottlenecks in developi...
Labeling training data is increasingly the largest bottleneck in deployi...
Curating labeled training data has become the primary bottleneck in mach...
A fundamental challenge in developing high-impact machine learning techn...