GLS Kernel Regression for Network-Structured Data

Edward Antonian, Gareth Peters, Michael John Chantler, Hongxuan Yan

Research output: Working paper › Preprint


In this paper we consider the problem of predicting a signal $y_t$, defined across the $N$ nodes of a fixed graph, over a series of $T$ regularly sampled time points. At each moment the graph signal is postulated to be a function of a set of global features $x_t$, which exist independently of the graph structure. We extend recent work in the field by designing an $O(N^3 + T^3)$ kernel regression algorithm in which the assumption that each observation is independent and identically distributed is relaxed. The result is a computationally tractable class of GLS models for graph regression. Specific attention is paid to the case where the graph signal is modelled as an AR(1) vector autoregressive process whilst also exhibiting cross-correlation amongst nodes. We use the Laplace approximation to provide a lower bound on the marginal out-of-sample prediction uncertainty in a form that is tractable for large problems. Finally, the method is compared against other learning approaches on real data taken from a network of air quality monitoring stations across California, where the goal is to predict the concentrations of various pollutants 24 hours in advance given weather and environmental conditions. We find that kernel graph regression algorithms (both GLS and ordinary) outperform competing techniques, whilst also providing a Laplace-approximation lower bound on the output uncertainty.
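The $O(N^3 + T^3)$ scaling mentioned in the abstract is the hallmark of kernel regression with a Kronecker-factored kernel: rather than inverting the full $NT \times NT$ Gram matrix, one eigendecomposes an $N \times N$ node kernel and a $T \times T$ feature kernel separately. The sketch below illustrates this trick for the simpler i.i.d.-noise (non-GLS) case; the regularised-Laplacian graph kernel, the RBF feature kernel, and all parameter values are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, d = 6, 50, 3  # nodes, time points, feature dimension (toy sizes)

# Toy graph: ring adjacency; node kernel from the regularised Laplacian
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[(i + 1) % N, i] = 1.0
L = np.diag(A.sum(axis=1)) - A
K_g = np.linalg.inv(L + 0.1 * np.eye(N))      # N x N graph kernel

# Global features x_t and an RBF kernel over them
X = rng.normal(size=(T, d))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_t = np.exp(-0.5 * sq)                       # T x T feature kernel

Y = rng.normal(size=(N, T))                   # synthetic graph signal
sigma2 = 0.5                                  # noise variance

# O(N^3 + T^3): eigendecompose each factor, never the NT x NT kernel
lam_g, U_g = np.linalg.eigh(K_g)
lam_t, U_t = np.linalg.eigh(K_t)

# Solve (K_t kron K_g + sigma2 I) vec(alpha) = vec(Y) in matrix form,
# using (A kron B) vec(M) = vec(B M A^T) with column-stacked vec
Z = U_g.T @ Y @ U_t
Z /= lam_g[:, None] * lam_t[None, :] + sigma2
alpha = U_g @ Z @ U_t.T

F = K_g @ alpha @ K_t                         # fitted values at training inputs

# Sanity check against the direct NT x NT solve (feasible only at toy scale)
K_full = np.kron(K_t, K_g)
v = np.linalg.solve(K_full + sigma2 * np.eye(N * T), Y.flatten(order="F"))
assert np.allclose(alpha.flatten(order="F"), v)
```

The GLS extension studied in the paper replaces the $\sigma^2 I$ noise term with a structured covariance (e.g. AR(1) in time with cross-node correlation); the same factorised eigendecomposition idea is what keeps that case tractable.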
Original language: English
Publication status: Published - 30 Aug 2021


  • Graph Signal Processing
  • Regression
  • Kernel


