# Active Learning (Machine Learning) - Wikipedia - Uncertainty sampling

![rw-book-cover|200x400](https://readwise-assets.s3.amazonaws.com/static/images/article1.be68295a7e40.png)

## Metadata

- Author: **Uncertainty sampling**
- Full Title: Active Learning (Machine Learning) - Wikipedia
- Category: #articles
- URL: https://en.wikipedia.org/wiki/Active_learning_(machine_learning)#Scenarios

## Highlights

- Membership Query Synthesis: This is where the learner generates its own instances from an underlying natural distribution. For example, if the dataset consists of pictures of humans and animals, the learner could send a clipped image of a leg to the teacher and query whether this appendage belongs to an animal or a human. This is particularly useful if the dataset is small.
  Pool-Based Sampling: In this scenario, instances are drawn from the entire data pool and assigned a confidence score, a measurement of how well the learner “understands” the data. The system then selects the instances for which it is least confident and queries the teacher for their labels (a minimal least-confidence sketch appears after the highlights).
  Stream-Based Selective Sampling: Here, each unlabeled data point is examined one at a time, with the machine evaluating the informativeness of each item against its query parameters. The learner decides for itself whether to assign a label or query the teacher for each data point.
- Tags: #ai
- Balance exploration and exploitation: the choice of examples to label is seen as a dilemma between exploration and exploitation over the data space representation. This strategy manages the trade-off by modelling the active learning problem as a contextual bandit problem. For example, Bouneffouf et al. propose a sequential algorithm named Active Thompson Sampling (ATS), which, in each round, assigns a sampling distribution on the pool, samples one point from this distribution, and queries the oracle for this sample point's label (an illustrative round-structure sketch appears after the highlights).
- Tags: #ai
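Below is a minimal sketch of the pool-based scenario using the least-confidence criterion described in the first highlight. It assumes a scikit-learn-style classifier exposing `predict_proba`; the data, variable names (`X_labeled`, `y_labeled`, `X_pool`), and the helper `least_confident_query` are illustrative, not part of the article.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def least_confident_query(model, X_pool, n_queries=1):
    """Return indices of the pool points the model is least confident about."""
    proba = model.predict_proba(X_pool)        # class probabilities per pool point
    confidence = proba.max(axis=1)             # confidence = top-class probability
    return np.argsort(confidence)[:n_queries]  # lowest confidence first

# Toy data: fit on the labeled set, then pick the least-confident pool points
# that would be sent to the teacher/oracle for labeling.
rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(20, 5))
y_labeled = np.arange(20) % 2
X_pool = rng.normal(size=(200, 5))

model = LogisticRegression().fit(X_labeled, y_labeled)
query_idx = least_confident_query(model, X_pool, n_queries=5)
print("Query the teacher for pool indices:", query_idx)
```

In a full loop, the newly labeled points would be appended to the labeled set and the model refit before the next query round.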
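The sketch below only mirrors the round structure described for Active Thompson Sampling (build a sampling distribution over the pool, draw one point from it, query the oracle for its label); it is not the algorithm of Bouneffouf et al. The Gaussian perturbation of a logistic model's weights is an assumed stand-in for proper posterior sampling, used only to keep the example self-contained.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ats_style_round(model, X_pool, n_weight_samples=30, scale=0.5, rng=None):
    """One round: score pool points by disagreement across sampled weight
    vectors, normalize the scores into a distribution, and sample one point."""
    if rng is None:
        rng = np.random.default_rng()
    w, b = model.coef_.ravel(), model.intercept_[0]
    preds = []
    for _ in range(n_weight_samples):
        w_s = w + rng.normal(scale=scale, size=w.shape)        # perturbed weights
        b_s = b + rng.normal(scale=scale)
        preds.append(1.0 / (1.0 + np.exp(-(X_pool @ w_s + b_s))))
    scores = np.var(preds, axis=0)          # disagreement per pool point
    dist = scores / scores.sum()            # sampling distribution over the pool
    return rng.choice(len(X_pool), p=dist)  # point whose label the oracle is asked for

rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(20, 5))
y_labeled = np.arange(20) % 2
X_pool = rng.normal(size=(200, 5))

model = LogisticRegression().fit(X_labeled, y_labeled)
query_idx = ats_style_round(model, X_pool, rng=rng)
print("Query the oracle for pool index:", query_idx)
```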