Meet Graph-Mamba: A Novel Graph Model that Leverages State Space Models (SSMs) for Efficient Data-Dependent Context Selection
Graph Transformers struggle to scale in graph sequence modeling because of the high computational cost of attention, and existing attention sparsification methods do not adequately capture data-dependent context. State space models (SSMs) such as Mamba are effective and efficient at modeling long-range dependencies in sequential data, but adapting them to non-sequential graph data is challenging. Many sequence models…
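To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of what data-dependent context selection over a graph can look like: nodes are flattened into a sequence, here ordered by degree as a hypothetical prioritization choice, and then passed through a simplified selective recurrence whose gates depend on each node's features, so the model decides per node how much past context to retain, in linear time rather than with quadratic attention. All weights, dimensions, and the ordering rule below are illustrative assumptions.

```python
# Minimal sketch of selective, data-dependent context aggregation over a graph.
# Assumptions: degree-based node ordering, toy random weights, NumPy only.
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: adjacency matrix and random node features.
num_nodes, feat_dim, state_dim = 6, 4, 8
adj = (rng.random((num_nodes, num_nodes)) < 0.3).astype(float)
adj = np.triu(adj, 1)
adj = adj + adj.T                      # symmetric, no self-loops
x = rng.standard_normal((num_nodes, feat_dim))

# 1) Graph -> sequence: order nodes by degree (low to high) so that
#    higher-degree nodes appear later and see more accumulated context.
order = np.argsort(adj.sum(axis=1))
seq = x[order]

# 2) Simplified selective state-space recurrence: the retention gate a_t and
#    input gate b_t are functions of the current node's features
#    (data-dependent selection), unlike a fixed linear recurrence.
W_a = rng.standard_normal((feat_dim, state_dim)) * 0.1
W_b = rng.standard_normal((feat_dim, state_dim)) * 0.1
W_in = rng.standard_normal((feat_dim, state_dim)) * 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

h = np.zeros(state_dim)
outputs = []
for x_t in seq:
    a_t = sigmoid(x_t @ W_a)           # how much previous context to keep
    b_t = sigmoid(x_t @ W_b)           # how much of this node to write in
    h = a_t * h + b_t * (x_t @ W_in)   # linear-time scan over the node sequence
    outputs.append(h.copy())

outputs = np.stack(outputs)            # per-node, context-aware representations
print(outputs.shape)                   # (num_nodes, state_dim)
```

The point of the sketch is the contrast with attention: instead of computing pairwise scores over all node pairs, each node's gates decide what to carry forward from the sequence so far, which is what makes the context selection both data-dependent and cheap.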
