These leaderboards are used to track progress in Set-to-Graph Prediction.
No benchmarks available.
Use these libraries to find Set-to-Graph Prediction models and implementations.
No datasets available.
No subtasks available.
This work presents sparse second-order Transformers with kernel attention that achieve significant performance improvements over invariant MLPs and message-passing graph neural networks on large-scale graph regression and set-to-(hyper)graph prediction tasks.
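To make the task concrete, below is a minimal PyTorch sketch of set-to-graph prediction: a plain Transformer encoder embeds the elements of an input set, and a pairwise MLP scores every ordered pair of elements to produce an edge-logit matrix. This is not the sparse second-order architecture described above; the class, module, and parameter names are illustrative assumptions.

    # Minimal set-to-graph sketch (illustrative only, not the paper's model).
    import torch
    import torch.nn as nn

    class SetToGraph(nn.Module):
        def __init__(self, in_dim: int, hidden_dim: int = 64, num_layers: int = 2):
            super().__init__()
            self.embed = nn.Linear(in_dim, hidden_dim)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=hidden_dim, nhead=4, batch_first=True
            )
            self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
            # Scores each ordered pair (i, j) from concatenated element embeddings.
            self.edge_mlp = nn.Sequential(
                nn.Linear(2 * hidden_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, 1),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, set_size, in_dim) -> edge logits: (batch, set_size, set_size)
            h = self.encoder(self.embed(x))              # (B, N, H)
            n = h.size(1)
            hi = h.unsqueeze(2).expand(-1, -1, n, -1)    # (B, N, N, H)
            hj = h.unsqueeze(1).expand(-1, n, -1, -1)    # (B, N, N, H)
            return self.edge_mlp(torch.cat([hi, hj], dim=-1)).squeeze(-1)

    # Example: predict adjacency probabilities for a batch of 10-element sets.
    model = SetToGraph(in_dim=8)
    edges = torch.sigmoid(model(torch.randn(4, 10, 8)))  # (4, 10, 10)

The pairwise concatenation used here is a common baseline for mapping permutation-equivariant element embeddings to edge predictions; the paper's second-order attention operates on such pair representations directly and more efficiently.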