REN, K.,  GARDNER-MEDWIN, A.R.
Department of Physiology, University College London, London WC1E 6BT, UK
A PROPER KNOWLEDGE MEASURE AND ITS IMPLICATIONS FOR LEARNING
   Learning increases knowledge of the environment, in the sense that an animal becomes better able to choose actions that lead to beneficial consequences.  We define knowledge as the ability to predict, with appropriate confidence, the outcome of events or the consequences of actions.  It has clear biological value, but has not generally been held to have a well-defined measure.  We show first that, within certain constraints, a satisfactory measure of knowledge, or of its lack, does exist.  We define assessment procedures and show that the results are invariant with respect to the details of these procedures.  Furthermore, this invariance renders our measure uniquely satisfactory amongst the possible options.  The measures and procedures apply equally to knowledge of individual facts, of sets of interconnected facts and their implications, or of stochastic events.
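   The abstract does not state the measure explicitly; the following is a hedged sketch of one natural candidate consistent with the storage argument in the next paragraph, namely the logarithmic (surprisal) score, with the symbols p, q, x* and N introduced here purely for illustration.

```latex
% Hedged sketch (not the authors' stated definition): nescience as the log score.
% If probability p(x) is assigned to each possible outcome x and the actual
% outcome is x*, the nescience incurred is
\[
  N(x^{*}) \;=\; -\log_{2} p(x^{*}) \qquad \text{(bits)} .
\]
% Averaged over outcomes drawn from the true distribution q, the expected nescience
\[
  \bar{N} \;=\; -\sum_{x} q(x)\,\log_{2} p(x)
  \;\ge\; -\sum_{x} q(x)\,\log_{2} q(x) \;=\; H(q),
\]
% with equality if and only if p = q (Gibbs' inequality): expected nescience is
% minimised only by reporting the true probabilities, the "proper" property
% referred to in the title.
```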
   Quantitative treatment of knowledge can lead to neurobiological insights.  An animal that predicts events with a high level of accuracy and confidence (i.e. a high estimated probability for the actual outcome) has low nescience (our formal measure for missing knowledge).  This means that it can store information identifying the actual events more compactly: minimum required storage capacity is directly proportional to average nescience for the events.  With stochastic events, low nescience implies that an animal's confidence judgements are close to the true probabilities.  This is precisely what is required if confidence judgements are to form the basis for optimal selection between different courses of action.  Low nescience can also lead to quicker and more accurate stimulus identification, where this involves sequential matching of predicted inputs to sensory signals.
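   The claims about storage and calibration can be checked numerically.  The sketch below assumes, as above, that nescience for an observed outcome is minus the base-2 logarithm of the probability the animal assigned to it; the function names and the three-outcome distributions are illustrative and do not appear in the abstract.

```python
"""Numerical sketch of the storage and calibration claims, under the
assumption that nescience = -log2(assigned probability of the actual outcome).
The three-outcome example is purely illustrative."""
import math

def nescience(p_assigned: float) -> float:
    """Nescience (bits) incurred when the actual outcome had assigned probability p."""
    return -math.log2(p_assigned)

def expected_nescience(true_p, assigned_p):
    """Average nescience when outcomes follow true_p but confidence is assigned_p."""
    return sum(q * nescience(p) for q, p in zip(true_p, assigned_p))

true_p        = [0.5, 0.25, 0.25]   # true outcome probabilities
calibrated    = [0.5, 0.25, 0.25]   # confidence judgements match the truth
miscalibrated = [0.8, 0.1, 0.1]     # over-confident in the first outcome

# Calibration: expected nescience is lowest when assigned probabilities are true.
assert expected_nescience(true_p, calibrated) < expected_nescience(true_p, miscalibrated)

# Storage: with calibrated confidence, average nescience equals the source
# entropy, i.e. the minimum average number of bits needed to record which
# outcome actually occurred (Shannon's source-coding bound).
entropy = -sum(q * math.log2(q) for q in true_p)
assert abs(expected_nescience(true_p, calibrated) - entropy) < 1e-12

print(f"calibrated   : {expected_nescience(true_p, calibrated):.3f} bits/event")
print(f"miscalibrated: {expected_nescience(true_p, miscalibrated):.3f} bits/event")
print(f"entropy bound: {entropy:.3f} bits/event")
```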
   Mechanisms by which such knowledge can be acquired and represented in neural systems, and by which it can lead to improved efficiency, will be illustrated with neural network models.