Support Vector Regression (SVR) and its variants are widely used regression algorithms that have demonstrated high generalization ability. This research proposes a new SVR-based regressor: the v-minimum absolute deviation distribution regression (v-MADR) machine. Instead of merely minimizing structural risk, as v-SVR does, v-MADR aims to achieve better generalization performance by minimizing both the absolute regression deviation mean and the absolute regression deviation variance, which accounts for both the positive and negative regression deviations of the sample points. For optimization, we propose a dual coordinate descent (DCD) algorithm for small-sample problems and an averaged stochastic gradient descent (ASGD) algorithm for large-scale problems. Furthermore, we study the statistical properties of v-MADR, which lead to a bound on the expected error. Experimental results on both artificial and real datasets indicate that v-MADR achieves significantly better generalization performance with less training time than the widely used v-SVR, LS-SVR, v-TSVR, and linear ε-SVR. Finally, we open-source the code of v-MADR at https://github.com/AsunaYY/v-MADR for wider dissemination.
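The two quantities named in the abstract can be made concrete. As a minimal sketch (not the paper's exact v-MADR objective, whose precise formulation appears in the full text), the snippet below computes the signed regression deviations of a linear model f(x) = w·x + b, then the absolute deviation mean and the absolute deviation variance that v-MADR jointly minimizes; the function name `deviation_stats` is ours for illustration.

```python
import numpy as np

def deviation_stats(w, b, X, y):
    """Absolute regression deviation mean and variance for a linear
    model f(x) = w.x + b.

    Illustrative sketch only: the paper's v-MADR objective combines
    these two statistics (with trade-off parameters) rather than
    reporting them separately.
    """
    # Signed regression deviations: positive and negative values
    # are both retained before taking absolute values.
    d = y - (X @ w + b)
    abs_d = np.abs(d)
    mean_abs = np.mean(abs_d)                    # absolute deviation mean
    var_abs = np.mean((abs_d - mean_abs) ** 2)   # absolute deviation variance
    return mean_abs, var_abs

# Toy usage: a perfect fit yields zero mean and zero variance.
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 2.0])
print(deviation_stats(np.array([1.0]), 0.0, X, y))
```

Minimizing the variance term pushes the deviations toward a uniform magnitude across samples, which is the distributional idea the abstract contrasts with plain structural-risk minimization in v-SVR.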
- absolute regression deviation mean
- absolute regression deviation variance
- dual coordinate descent algorithm
- v-support vector regression
ASJC Scopus subject areas
- Computer Science (all)
- Materials Science (all)