Abstract
Most GNNs for molecular property prediction learn node representations by aggregating information from neighbouring nodes in graph layers. These representations are then passed to subsequent task-specific layers that handle individual downstream tasks. For real-world molecular problems, hyperparameter optimisation (HPO) for both kinds of layers is vital. In this research, we focus on how selecting two types of GNN hyperparameters, those belonging to the graph layers and those of the task-specific layers, affects the performance of GNNs for molecular property prediction. In our experiments, we employed a state-of-the-art evolutionary algorithm, CMA-ES, for HPO. The results reveal that optimising the two types of hyperparameters separately can improve GNN performance, but optimising both types simultaneously leads to the largest improvements.
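As an illustration of the HPO setup described in the abstract, the sketch below shows how a joint search over graph-layer and task-specific hyperparameters could be run with CMA-ES via the `cma` Python package. This is a minimal sketch, not the code used in the paper: the search-space bounds are assumptions, and `train_and_evaluate` is a hypothetical placeholder for training the GNN and returning a validation error.

```python
import cma  # pycma implementation of CMA-ES

def decode(x):
    """Map a continuous CMA-ES vector in [0, 1]^5 onto GNN hyperparameters.
    The ranges below are illustrative assumptions, not the paper's search space."""
    x = [min(max(v, 0.0), 1.0) for v in x]
    return {
        # graph-layer (message-passing) hyperparameters
        "num_graph_layers": 1 + int(round(x[0] * 5)),     # 1..6 layers
        "graph_hidden_dim": 16 + int(round(x[1] * 240)),  # 16..256 units
        # task-specific (readout / MLP) hyperparameters
        "num_task_layers": 1 + int(round(x[2] * 3)),      # 1..4 layers
        "task_hidden_dim": 16 + int(round(x[3] * 240)),   # 16..256 units
        "dropout": 0.5 * x[4],                            # 0.0..0.5
    }

def train_and_evaluate(**hp):
    # Hypothetical placeholder: build the GNN with `hp`, train it on the
    # molecular dataset, and return the validation error (lower is better).
    return (hp["num_graph_layers"] - 3) ** 2 + (hp["dropout"] - 0.1) ** 2

def objective(x):
    return train_and_evaluate(**decode(x))

# Optimise all five hyperparameters jointly; restricting `decode` to one group
# (graph-layer only or task-specific only) gives the separate-optimisation setting.
es = cma.CMAEvolutionStrategy(x0=[0.5] * 5, sigma0=0.25)
while not es.stop():
    candidates = es.ask()                                    # sample candidate vectors
    es.tell(candidates, [objective(c) for c in candidates])  # report validation errors
best_hyperparameters = decode(es.result.xbest)
```

Searching in the unit cube and decoding to concrete ranges keeps the CMA-ES step size comparable across hyperparameters of very different scales (layer counts versus hidden dimensions).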
Original language | English |
---|---|
Title of host publication | GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference Companion |
Publisher | Association for Computing Machinery |
Pages | 1403–1404 |
Number of pages | 2 |
ISBN (Print) | 9781450383516 |
DOIs | |
Publication status | Published - 7 Jul 2021 |
Event | 11th Workshop on Evolutionary Computation for the Automated Design of Algorithms at GECCO 2021 (online), 10 Jul 2021 → 14 Jul 2021, https://bonsai.auburn.edu/ecada/ |
Workshop
Workshop | 11th Workshop on Evolutionary Computation for the Automated Design of Algorithms at GECCO 2021 |
---|---|
Abbreviated title | ECADA 2021 |
Period | 10/07/21 → 14/07/21 |
Internet address | https://bonsai.auburn.edu/ecada/ |
Keywords
- evolutionary computation
- graph neural networks
- hyperparameter optimisation
- molecular property prediction
ASJC Scopus subject areas
- Computer Science Applications
- Software
- Computational Theory and Mathematics