On the PLS algorithm for multiple regression (PLS1)

Yoshio Takane*, Sebastien Loisel

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Partial least squares (PLS) was first introduced by Wold in the mid-1960s as a heuristic algorithm for solving linear least squares (LS) problems. No optimality property of the algorithm was known at the time. Since then, however, a number of interesting properties have been established for the PLS algorithm for regression analysis (called PLS1). This paper shows that the PLS estimator for a specific dimensionality S is a kind of constrained LS estimator confined to a Krylov subspace of dimensionality S. Links to the Lanczos bidiagonalization and conjugate gradient methods are also discussed from a perspective somewhat different from that of previous authors.
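
The chapter's central claim, that the S-component PLS1 estimator coincides with the LS estimator constrained to the Krylov subspace generated by X'X and X'y, can be checked numerically. The sketch below is an illustration under that reading, not the authors' code: it runs a standard NIPALS-style PLS1 and a Krylov-constrained least squares fit (variable names, test data, and tolerances are assumptions) and compares the resulting coefficient vectors.

```python
# Minimal sketch (illustrative only): S-component PLS1 coefficients vs. the
# LS estimator constrained to the Krylov subspace K_S(X'X, X'y).
import numpy as np

def pls1(X, y, S):
    """PLS1 (NIPALS) regression coefficients with S components."""
    Xs, ys = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(S):
        w = Xs.T @ ys
        w /= np.linalg.norm(w)            # weight vector
        t = Xs @ w                        # score vector
        tt = t @ t
        p = Xs.T @ t / tt                 # X loading
        c = ys @ t / tt                   # y loading
        Xs = Xs - np.outer(t, p)          # deflate X
        ys = ys - t * c                   # deflate y
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)        # b_PLS

def krylov_ls(X, y, S):
    """LS estimator constrained to span{X'y, (X'X)X'y, ..., (X'X)^(S-1)X'y}."""
    A, b = X.T @ X, X.T @ y
    K = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(S)])
    Q, _ = np.linalg.qr(K)                        # orthonormal basis of K_S
    g, *_ = np.linalg.lstsq(X @ Q, y, rcond=None) # LS fit within the subspace
    return Q @ g

# Illustrative data (assumed): centered X and y, a few component counts.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
y = X @ rng.standard_normal(8) + 0.1 * rng.standard_normal(50)
X -= X.mean(axis=0); y -= y.mean()
for S in (1, 2, 3):
    print(S, np.allclose(pls1(X, y, S), krylov_ls(X, y, S)))
```

For each S the two coefficient vectors agree to numerical precision, which is the constrained-LS characterization of PLS1 discussed in the chapter.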

Original language: English
Title of host publication: The Multiple Facets of Partial Least Squares and Related Methods
Publisher: Springer
Pages: 17-28
Number of pages: 12
ISBN (Electronic): 9783319406435
ISBN (Print): 9783319406411
Publication status: Published - 16 Oct 2016

Publication series

Name: Springer Proceedings in Mathematics & Statistics
Publisher: Springer
Volume: 173
ISSN (Print): 2194-1009
ISSN (Electronic): 2194-1017

Keywords

  • Conjugate gradients
  • Constrained principal component analysis (CPCA)
  • Krylov subspace
  • Lanczos bidiagonalization
  • NIPALS
  • PLS1 algorithm

ASJC Scopus subject areas

  • General Mathematics
