Learning from Examples and Membership Queries with Structured Determinations

Authors: Prasad Tadepalli, Stuart Russell

Abstract

It is well known that prior knowledge or bias can speed up learning, at least in theory. However, it has proved difficult to make constructive use of prior knowledge so that approximately correct hypotheses can be learned efficiently. In this paper, we consider a particular form of bias which consists of a set of “determinations.” A set of attributes is said to determine a given attribute if the latter is purely a function of the former. The bias is tree-structured if there is a tree of attributes such that the attribute at any node is determined by its children, where the leaves correspond to input attributes and the root corresponds to the target attribute for the learning problem. The set of allowed functions at each node is called the basis. The tree-structured bias restricts the target functions to those representable by a read-once formula (a Boolean formula in which each variable occurs at most once) of a given structure over the basis functions. We show that efficient learning is possible using a given tree-structured bias from random examples and membership queries, provided that the basis class itself is learnable and obeys some mild closure conditions. The algorithm uses a form of controlled experimentation in order to learn each part of the overall function, fixing the inputs to the other parts of the function at appropriate values. We present empirical results showing that when a tree-structured bias is available, our method significantly improves upon knowledge-free induction. We also show that there are hard cryptographic limitations to generalizing these positive results to structured determinations in the form of a directed acyclic graph.
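The controlled-experimentation idea can be illustrated with a toy read-once formula. The sketch below is not the paper's algorithm; the target formula, variable names, and fixing values are illustrative assumptions. To learn one subtree, the learner fixes the inputs of the sibling subtrees at values that make the root function pass the subtree's output through unchanged, then recovers the subtree's truth table via membership queries:

```python
from itertools import product

# Hypothetical target: a read-once formula with known tree structure
#   f = OR( AND(x1, x2), AND(x3, x4) )
# The learner sees f only as a membership-query oracle.
def oracle(x1, x2, x3, x4):
    return (x1 and x2) or (x3 and x4)

def learn_subtree(oracle, my_vars, fixed):
    """Recover the truth table of one subtree by controlled experimentation.

    `fixed` pins the inputs of the sibling subtrees (here x3 = x4 = 0,
    which forces the right AND to 0, so the root OR passes the left
    subtree's value through); we then query the oracle on every setting
    of this subtree's own inputs."""
    table = {}
    for bits in product([0, 1], repeat=len(my_vars)):
        args = dict(fixed)
        args.update(zip(my_vars, bits))
        table[bits] = oracle(args["x1"], args["x2"], args["x3"], args["x4"])
    return table

# Learn the left node AND(x1, x2) with the right subtree neutralized.
left = learn_subtree(oracle, ["x1", "x2"], {"x3": 0, "x4": 0})
# left is exactly the truth table of AND over (x1, x2)
```

Because each variable occurs only once in a read-once formula, every subtree's inputs are disjoint from its siblings', which is what makes this kind of isolation possible; the paper's contribution is showing it can be done efficiently for a general learnable basis class.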

Keywords: determinations, tree-structured bias, declarative bias, prior knowledge, read-once formulas, queries, controlled experimentation, pac-learning

Paper URL: https://doi.org/10.1023/A:1007421315813