Radial basis function (RBF) and Volterra series (VS) nonlinear predictors are examined with a view to reducing their complexity while maintaining prediction performance. A geometrical interpretation is presented, indicating that while many choices of reduced state predictor exist, some are better than others in terms of the numerical conditioning of the solution. Two algorithms are developed using signal subspace concepts to find reduced state solutions which are 'close to' the minimum norm solution and which share its numerical properties. The performance of these algorithms is assessed using chaotic time series as test signals. The conclusion is drawn that the so-called Direct Method, which uses only the eigenstructure to characterise the signal subspace, offers the best performance.
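The signal subspace idea underlying the reduced state solutions can be illustrated with a minimal sketch (the variable names, dimensions, and noise model below are illustrative assumptions, not the paper's notation): when the regressor matrix is nearly rank-deficient, projecting onto its dominant singular directions yields a minimum norm least-squares solution with far better numerical conditioning than an unrestricted solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression: design matrix X (standing in for RBF/Volterra
# regressors) whose columns are nearly rank-deficient, and a target d.
n, p, r = 200, 12, 4                       # samples, regressors, signal rank
basis = rng.standard_normal((p, r))
X = rng.standard_normal((n, r)) @ basis.T  # rank-r signal subspace
X += 1e-6 * rng.standard_normal((n, p))    # small perturbation off the subspace
d = X @ rng.standard_normal(p)

# Truncated SVD: keep only the r dominant singular directions (the
# "signal subspace"), discarding the ill-conditioned directions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
w_min = Vt[:r].T @ ((U[:, :r].T @ d) / s[:r])  # minimum norm solution

# Any w = w_min + v, with v orthogonal to the signal subspace, fits the
# data almost equally well, but w_min has the smallest norm.
resid = np.linalg.norm(X @ w_min - d)
```

Restricting the solution to the dominant subspace is what keeps the reduced predictor well conditioned: the near-zero singular values that would otherwise amplify noise in the coefficients are simply never inverted.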