Abstract
Linear regression problems deal with the solution of an overdetermined set of linear equations under different assumptions about the noise in the experimental data. Some parameterized techniques, designed for a compact treatment of these problems, are briefly reviewed. Then the GeTLS method is introduced and applied, as a learning law, to a novel linear neuron, GeTLS EXIN. Some numerical considerations follow, together with software examples. This neuron yields very good results, especially for large systems of equations.
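As background for the problem the abstract describes, here is a minimal sketch (not the GeTLS EXIN learning law itself) contrasting ordinary least squares with the classical batch SVD solution of total least squares for an overdetermined system Ax ≈ b; all variable names and the test data are illustrative assumptions.

```python
import numpy as np

# Hypothetical example: OLS vs. total least squares (TLS) for an
# overdetermined system A x ≈ b. OLS assumes noise only in b; TLS
# (via the SVD of the augmented matrix [A | b]) allows noise in both.

rng = np.random.default_rng(0)
m, n = 100, 2                                # more equations than unknowns
x_true = np.array([2.0, -1.0])
A = rng.normal(size=(m, n))
b = A @ x_true + 0.01 * rng.normal(size=m)   # noisy right-hand side

# OLS: minimizes ||A x - b||_2
x_ols, *_ = np.linalg.lstsq(A, b, rcond=None)

# TLS: right singular vector of [A | b] for the smallest singular value
_, _, Vt = np.linalg.svd(np.hstack([A, b[:, None]]))
v = Vt[-1]
x_tls = -v[:n] / v[n]

print("OLS:", x_ols)
print("TLS:", x_tls)
```

With small noise both estimates agree closely with the true parameters; they diverge as the noise in A grows, which is the regime the reviewed techniques address.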