### Abstract

Partial least squares (PLS) was first introduced by Wold in the mid-1960s as a heuristic algorithm for solving linear least squares (LS) problems. No optimality property of the algorithm was known at the time. Since then, however, a number of interesting properties have been established for the PLS algorithm for regression analysis (called PLS1). This paper shows that the PLS estimator for a specific dimensionality S is a kind of constrained LS estimator confined to a Krylov subspace of dimensionality S. Links to the Lanczos bidiagonalization and conjugate gradient methods are also discussed, from a perspective somewhat different from that of previous authors.
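The equivalences described in the abstract can be checked numerically. The sketch below is illustrative and not taken from the chapter: it assumes a standard NIPALS-style PLS1 implementation and random data, and verifies that the S-component PLS1 coefficient vector coincides with (a) the LS solution constrained to the Krylov subspace K_S(X'X, X'y) and (b) S steps of conjugate gradients applied to the normal equations.

```python
# Illustrative sketch (not from the chapter): check numerically that the
# S-component PLS1 estimator equals the LS estimator constrained to the
# Krylov subspace K_S(X'X, X'y), and equals S conjugate-gradient steps
# on the normal equations X'X b = X'y.
import numpy as np

rng = np.random.default_rng(0)
n, p, S = 50, 8, 3
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

def pls1(X, y, S):
    """PLS1 via NIPALS-style deflation; returns coefficients for original X."""
    Xk = X.copy()
    W, P, q = [], [], []
    for _ in range(S):
        w = Xk.T @ y                 # weight vector (y need not be deflated)
        w /= np.linalg.norm(w)
        t = Xk @ w                   # score vector
        tt = t @ t
        p_k = Xk.T @ t / tt          # loading vector
        q_k = (y @ t) / tt           # y-loading
        Xk = Xk - np.outer(t, p_k)   # deflate X
        W.append(w); P.append(p_k); q.append(q_k)
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)

def cg_normal_eqs(A, b, S):
    """S steps of conjugate gradients on A x = b, starting from x = 0."""
    x = np.zeros_like(b)
    r = b.copy()
    d = r.copy()
    for _ in range(S):
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return x

A, b = X.T @ X, X.T @ y

# LS restricted to the Krylov subspace K_S(A, b) = span{b, Ab, ..., A^(S-1) b}:
# minimize ||y - X K c|| over c; the coefficient vector is then K c.
K = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(S)])
c, *_ = np.linalg.lstsq(X @ K, y, rcond=None)
beta_krylov = K @ c

beta_pls = pls1(X, y, S)
beta_cg = cg_normal_eqs(A, b, S)

print("PLS1 vs Krylov-constrained LS:", np.allclose(beta_pls, beta_krylov))
print("PLS1 vs conjugate gradients:  ", np.allclose(beta_pls, beta_cg))
```

In exact arithmetic all three coincide: the S-step CG iterate minimizes the residual sum of squares over the same Krylov subspace that the PLS1 weight vectors span.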

| Language | English |
| --- | --- |
| Title of host publication | The Multiple Facets of Partial Least Squares and Related Methods |
| Publisher | Springer |
| Pages | 17-28 |
| Number of pages | 12 |
| ISBN (Electronic) | 9783319406435 |
| ISBN (Print) | 9783319406411 |
| DOIs | 10.1007/978-3-319-40643-5_2 |
| Publication status | Published - 16 Oct 2016 |

### Publication series

| Name | Springer Proceedings in Mathematics & Statistics |
| --- | --- |
| Publisher | Springer |
| Volume | 173 |
| ISSN (Print) | 2194-1009 |
| ISSN (Electronic) | 2194-1017 |

### Keywords

- Conjugate gradients
- Constrained principal component analysis (CPCA)
- Krylov subspace
- Lanczos bidiagonalization
- NIPALS
- PLS1 algorithm

### ASJC Scopus subject areas

- Mathematics (all)

### Cite this

Takane, Yoshio; Loisel, Sebastien. **On the PLS algorithm for multiple regression (PLS1).** In *The Multiple Facets of Partial Least Squares and Related Methods* (pp. 17-28). (Springer Proceedings in Mathematics & Statistics; Vol. 173). Springer. https://doi.org/10.1007/978-3-319-40643-5_2

Research output: Chapter in Book/Report/Conference proceeding › Chapter
