Talk:Symbolic Logic:Learning:Symbolic Replacement

I have been guilty of being vague about what the input is that I am learning from.

If the input is the output of perception by visual and other senses, then it won't just be a string of symbols. Instead it will be structured information representing objects.

If the input is conversation with other people, then the communication is naturally broken up into sentences. Individual words are learnt first, in the context of all perceived input.

If the input is a whole bunch of C++ programs, humans don't understand the syntax of these programs in a vacuum. If you don't know at least the groupings of characters, there is not much information available to you just by looking at the ordering of characters.

Learning is a very hard problem. It may be defined as: for some data D, find the function f and data x such that f(x) = D with the least complexity. If we have a complexity function, the result is J in,


 * $$C = \{(f, x): f(x) = D\} \! $$
 * $$J \in C \land complexity(J) = \min \{complexity(k) : k \in C\} \! $$
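The definition above can be sketched as a brute-force search. This is only a toy illustration: the candidate pairs and the complexity measure (input length plus bytecode size) are assumptions made for the example, and in general the search space is intractable.

```python
D = [1, 4, 9, 16, 25]  # the observed data

# Hypothetical candidate (function, input) pairs to search over.
candidates = [
    (lambda x: [i * i for i in x], [1, 2, 3, 4, 5]),  # "square each element"
    (lambda x: x, [1, 4, 9, 16, 25]),                 # "just memorise D"
]

def complexity(f, x):
    # Crude proxy for description length: size of the input plus
    # size of the function's compiled bytecode.
    return len(x) + len(f.__code__.co_code)

# C = {(f, x) : f(x) = D}
C = [(f, x) for (f, x) in candidates if f(x) == D]

# J is a member of C of minimal complexity.
J = min(C, key=lambda fx: complexity(*fx))
```

Note that both candidates reproduce D exactly; the complexity function is what decides between "explaining" the data and merely memorising it.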

This is second order logic, as the function $$f \!$$ is a variable here. If general learning were possible, cryptography would be impossible. Learning can only be possible in simple situations. Learning based on continuous functions may be achieved using feedback (as in a neural network).
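The continuous case with feedback can be sketched in a few lines: gradient descent fitting y = w·x to example pairs. The data, learning rate, and iteration count are illustrative assumptions, not a prescription.

```python
# Example pairs generated by y = 2x; the learner does not know the 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0      # initial guess for the weight
lr = 0.05    # learning rate (illustrative)

for _ in range(200):
    # Gradient of the mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # Feedback: move w against the gradient.
    w -= lr * grad

print(round(w, 3))  # → 2.0
```

The error signal feeding back into the parameter is the essential mechanism; a neural network does the same thing with many parameters at once.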

So I am hoping that there is a simple set of rules that apply in particular situations to give learning in the non-continuous symbolic case.

Angluin's algorithm learns the syntax of DFAs, but I am not sure if that helps me much. I was really looking for something simpler but more flexible.

So I need to sort my head out and clarify what the goal really is here.

Need to learn from much richer sources of information:

 * Functions, inputs and outputs.
 * Sequence of events in a real-world environment: a combination of language input and perception inputs.

Learning should start with simple association.
 * Finger points to a dog. "Dog" word is spoken.
 * Steal a cookie, get whacked with a rolling pin.
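Simple association like the examples above can be sketched as co-occurrence counting. The event stream and the percept/word pairing scheme here are illustrative assumptions.

```python
from collections import Counter

# Paired events: something perceived, something heard at the same time.
events = [
    ("sees:dog", "hears:dog"),
    ("sees:dog", "hears:dog"),
    ("sees:cat", "hears:cat"),
    ("sees:dog", "hears:good-boy"),
]

# Count how often each (percept, word) pair co-occurs.
assoc = Counter(events)

def most_associated(percept):
    # Return the word most strongly associated with the percept.
    pairs = [(word, n) for (p, word), n in assoc.items() if p == percept]
    return max(pairs, key=lambda wn: wn[1])[0]

print(most_associated("sees:dog"))  # → hears:dog
```

No syntax is involved yet; the "meaning" of "dog" is just the percept it most reliably co-occurs with.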

Mixture of stated facts and facts learned by induction.

Syntax built up from simple examples first.

Syntax gives structure information (e.g. like XML, though not necessarily in that syntax).

Characterization of simple learning.

Within a limited framework, iterate over possible functions. For example, if you don't know that associativity is left to right in a mathematical expression,


 * 6 - 5 - 4 = 6 - (5 - 4) = 5

But the example gives,


 * 6 - 5 - 4 = (6 - 5) - 4 = -3
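The two readings of 6 - 5 - 4 correspond to a left fold versus a right fold over the operands. A minimal sketch (`foldr` is a hypothetical helper written here for the example):

```python
from functools import reduce

nums = [6, 5, 4]

# Left-associative: (6 - 5) - 4
left = reduce(lambda a, b: a - b, nums)

def foldr(f, xs):
    # Right fold: combine from the right end of the list.
    acc = xs[-1]
    for x in reversed(xs[:-1]):
        acc = f(x, acc)
    return acc

# Right-associative: 6 - (5 - 4)
right = foldr(lambda a, b: a - b, nums)

print(left, right)  # → -3 5
```

A learner seeing the worked example 6 - 5 - 4 = -3 can rule out the right-associative hypothesis, which is exactly the "iterate possible functions" idea.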

Semantic meaning is identified with functions.

A probability function for the next event, based on the action taken and previous events.
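Such a probability function can be estimated by counting outcomes. A minimal sketch, with a made-up history (echoing the cookie example above) standing in for real experience:

```python
from collections import defaultdict, Counter

# (previous event, action, next event) triples observed so far.
history = [
    ("hungry", "steal-cookie", "whacked"),
    ("hungry", "ask-nicely",   "cookie"),
    ("hungry", "steal-cookie", "whacked"),
    ("hungry", "ask-nicely",   "nothing"),
    ("hungry", "ask-nicely",   "cookie"),
]

# Count outcomes conditioned on (previous event, action).
counts = defaultdict(Counter)
for prev, action, nxt in history:
    counts[(prev, action)][nxt] += 1

def p_next(prev, action, nxt):
    # Estimated P(next event | previous event, action).
    c = counts[(prev, action)]
    total = sum(c.values())
    return c[nxt] / total if total else 0.0

print(p_next("hungry", "steal-cookie", "whacked"))  # → 1.0
```

This is just conditional frequency estimation; the same table could feed a decision rule for choosing actions.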

What is the role of the neural net?