Abstract
Graph neural networks have recently achieved great success in a variety of inference tasks, including materials property prediction. Nevertheless, because their representation capacity is inherently local, global representations of materials' structures can only be achieved by expanding model complexity, which in turn scales up training times and memory consumption. In this work we focus on efficiently capturing global interactions ``in-model'', through long-range edge attention with a minimal memory footprint. We introduce a novel ``contextual'' message passing scheme that better captures global interactions by attending over edges from both the local and global environment of each node in an edge-update fashion. The performance of the proposed model (LiCOMPGNN) is tested on a diverse set of materials property prediction benchmarks, where it is competitive with state-of-the-art models on several prediction tasks whilst being an order of magnitude smaller in trainable parameters. We further extend the framework to a multiplex graph setting for solid-state data, taking reciprocal-space features into account in a multimodal message passing regime. We demonstrate the representation capacity of the proposed variant, alongside others, in maintaining supercell invariance in crystalline property prediction tasks.