
Semi/weakly-supervised learning

On extracting nutrition from bullshit

I’m not yet sure what this is, but I’ve seen these words invoked in machine learning problems with a partially-observed model, where you hope to simultaneously learn the parameters of the label-generation process and the observation process. So if I have a bunch of crowd-sourced labels for my data and wish to use them to train a classifier, but I suspect my crowd is a little unreliable, then I do “weakly supervised” learning by learning both the true labels and the crowd whimsy process, as a kind of hierarchical model of informative sampling (e.g. MZMG15); a sketch of one classic such model follows.

Or I might assume no explicit model for the crowd whimsy, but simply that similar data should not be too differently labelled, a.k.a. label propagation, which uses graph clustering to infer data labels (ZhGh02, ZBLW03); see the second sketch below.
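Not the MZMG15 model, but a minimal numpy sketch of the classic Dawid-Skene-style EM for this setting (function name and details are my own illustration): each annotator gets a confusion matrix, and we alternate between re-estimating annotator reliabilities and the posterior over the true labels.

```python
import numpy as np

def dawid_skene(labels, n_classes, n_iter=50, eps=1e-6):
    """EM for a Dawid-Skene-style model of unreliable annotators.

    labels: (n_items, n_workers) int array; -1 marks a missing label.
    Returns an (n_items, n_classes) posterior over the true labels.
    """
    n_items, n_workers = labels.shape
    # Initialise the posterior from per-item vote counts (soft majority vote).
    post = np.full((n_items, n_classes), eps)
    for w in range(n_workers):
        seen = labels[:, w] >= 0
        post[seen, labels[seen, w]] += 1.0
    post /= post.sum(1, keepdims=True)
    for _ in range(n_iter):
        # M-step: class prior, plus one confusion matrix per worker,
        # conf[w, true_class, reported_class].
        prior = post.mean(0)
        conf = np.full((n_workers, n_classes, n_classes), eps)
        for w in range(n_workers):
            for l in range(n_classes):
                conf[w, :, l] += post[labels[:, w] == l].sum(0)
        conf /= conf.sum(2, keepdims=True)
        # E-step: posterior over true labels given the learned reliabilities.
        logpost = np.tile(np.log(prior), (n_items, 1))
        for w in range(n_workers):
            seen = labels[:, w] >= 0
            logpost[seen] += np.log(conf[w][:, labels[seen, w]]).T
        post = np.exp(logpost - logpost.max(1, keepdims=True))
        post /= post.sum(1, keepdims=True)
    return post
```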

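For the graph route, scikit-learn ships an off-the-shelf implementation. A minimal sketch on toy two-moons data (the dataset and hyperparameters here are my own, purely illustrative):

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelPropagation

# Two interleaved half-moons: 200 points, of which we keep only
# 2 labels per class; scikit-learn marks "unlabelled" with -1.
X, y_true = make_moons(n_samples=200, noise=0.05, random_state=0)
y = np.full_like(y_true, -1)
for c in (0, 1):
    y[np.where(y_true == c)[0][:2]] = c

# Diffuse the few known labels along a k-NN graph over the data.
model = LabelPropagation(kernel="knn", n_neighbors=7).fit(X, y)
print("fraction recovered:", (model.transduction_ == y_true).mean())
```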
Other methods?

Here’s one practical thingy:

snorkel:

Snorkel is a system for rapidly creating, modeling, and managing training data, currently focused on accelerating the development of structured or “dark” data extraction applications for domains in which large labeled training sets are not available or easy to obtain.

Today’s state-of-the-art machine learning models require massive labeled training sets—which usually do not exist for real-world applications. Instead, Snorkel is based around the new data programming paradigm, in which the developer focuses on writing a set of labeling functions, which are just scripts that programmatically label data. The resulting labels are noisy, but Snorkel automatically models this process—learning, essentially, which labeling functions are more accurate than others—and then uses this to train an end model (for example, a deep neural network in TensorFlow).

Surprisingly, by modeling a noisy training set creation process in this way, we can take potentially low-quality labeling functions from the user, and use these to train high-quality end models. We see Snorkel as providing a general framework for many weak supervision techniques, and as defining a new programming model for weakly-supervised machine learning systems.
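A minimal sketch of that data programming workflow, using the snorkel package’s labeling-function API (note this API postdates the blurb above and may differ between releases; the spam example and its heuristics are mine):

```python
import pandas as pd
from snorkel.labeling import labeling_function, PandasLFApplier
from snorkel.labeling.model import LabelModel

ABSTAIN, HAM, SPAM = -1, 0, 1

@labeling_function()
def lf_contains_link(x):
    # Heuristic: messages containing URLs are usually spam; otherwise abstain.
    return SPAM if "http" in x.text.lower() else ABSTAIN

@labeling_function()
def lf_short_reply(x):
    # Heuristic: very short messages tend to be genuine replies.
    return HAM if len(x.text.split()) <= 4 else ABSTAIN

df = pd.DataFrame({"text": [
    "win cash now http://spam.example",
    "ok see you then",
    "claim a prize at http://also.spam",
]})

# Apply every labeling function to every row: an (n_rows, n_lfs) label matrix.
L = PandasLFApplier([lf_contains_link, lf_short_reply]).apply(df)

# The generative LabelModel estimates how accurate each labeling function is
# and emits probabilistic labels to train an end model (e.g. a neural net) on.
label_model = LabelModel(cardinality=2)
label_model.fit(L, n_epochs=200, seed=0)
probs = label_model.predict_proba(L)
```

The probabilistic labels in `probs` then stand in for the hand-labelled training set you didn’t have.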

Refs

BHRR17
Bach, S. H., He, B., Ratner, A., & Ré, C. (2017) Learning the Structure of Generative Models without Labeled Data. In Proceedings of the 34th International Conference on Machine Learning, Sydney, Australia
DeBL05
Delalleau, O., Bengio, Y., & Le Roux, N. (2005) Efficient Nonparametric Function Induction in Semi-Supervised Learning. In Proc. Artificial Intelligence and Statistics
JHMJ16
Jung, A., Hero III, A. O., Mara, A., & Jahromi, S. (2016) Semi-Supervised Learning via Sparse Label Propagation. arXiv:1612.01414 [cs, stat].
KTSL14
Karpathy, A., Toderici, G., Shetty, S., Leung, T., Sukthankar, R., & Fei-Fei, L. (2014) Large-Scale Video Classification with Convolutional Neural Networks. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition (pp. 1725–1732). Washington, DC, USA: IEEE Computer Society DOI.
KuRa16
Kumar, A., & Raj, B. (2016) Audio Event Detection Using Weakly Labeled Data. In Proceedings of the 2016 ACM on Multimedia Conference (pp. 1038–1047). New York, NY, USA: ACM DOI.
KuRa17
Kumar, A., & Raj, B. (2017) Deep CNN Framework for Audio Event Recognition using Weakly Labeled Web Data. arXiv:1707.02530 [cs].
LiTa15
Li, Z., & Tang, J. (2015) Weakly Supervised Deep Metric Learning for Community-Contributed Image Retrieval. IEEE Transactions on Multimedia, 17(11), 1989–1999. DOI.
MZMG15
Misra, I., Zitnick, C. L., Mitchell, M., & Girshick, R. (2015) Seeing through the Human Reporting Bias: Visual Classifiers from Noisy Human-Centric Labels. In Proceedings of CVPR.
PCMY00
Papandreou, G., Chen, L.-C., Murphy, K., & Yuille, A. L. (n.d.) Weakly- and Semi-Supervised Learning of a Deep Convolutional Network for Semantic Image Segmentation.
RBEF17
Ratner, A., Bach, S. H., Ehrenberg, H., Fries, J., Wu, S., & Ré, C. (2017) Snorkel: Rapid Training Data Creation with Weak Supervision. Proceedings of the VLDB Endowment, 11(3), 269–282. DOI.
RDWS16
Ratner, A. J., De Sa, C. M., Wu, S., Selsam, D., & Ré, C. (2016) Data Programming: Creating Large Training Sets, Quickly. In D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, & R. Garnett (Eds.), Advances in Neural Information Processing Systems 29 (pp. 3567–3575). Curran Associates, Inc.
VHBB17
Varma, P., He, B., Bajaj, P., Banerjee, I., Khandwala, N., Rubin, D. L., & Ré, C. (2017) Inferring Generative Model Structure with Static Analysis. In Advances In Neural Information Processing Systems.
WWZY15
Wu, F., Wang, Z., Zhang, Z., Yang, Y., Luo, J., Zhu, W., & Zhuang, Y. (2015) Weakly Semi-Supervised Deep Learning for Multi-Label Image Annotation. IEEE Transactions on Big Data, 1(3), 109–122. DOI.
ZBLW03
Zhou, D., Bousquet, O., Lal, T. N., Weston, J., & Schölkopf, B. (2003) Learning with Local and Global Consistency. In Proceedings of the 16th International Conference on Neural Information Processing Systems (pp. 321–328). Cambridge, MA, USA: MIT Press
ZhGh02
Zhu, X., & Ghahramani, Z. (2002) Learning from Labeled and Unlabeled Data with Label Propagation.