3260 papers • 126 benchmarks • 313 datasets
Property prediction involves forecasting or estimating a molecule's inherent physical and chemical properties based on information derived from its structural characteristics. It facilitates high-throughput evaluation of an extensive array of molecular properties, enabling the virtual screening of compounds. Additionally, it provides the means to predict the unknown attributes of new molecules, thereby bolstering research efficiency and reducing development times.
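The virtual-screening workflow described above can be sketched as a simple descriptor-based regression: featurize molecules, fit a model on measured properties, then rank unseen candidates by predicted value. The feature vectors, property values, and molecule names below are all illustrative placeholders, not real data; ridge regression stands in for whatever model a practitioner would actually use.

```python
import numpy as np

# Hypothetical training set: each row is a molecular descriptor vector
# (e.g., heavy-atom count, ring count, H-bond donors) with a measured
# property (e.g., log-solubility) as the target. Values are made up.
X_train = np.array([
    [6, 1, 0],
    [8, 1, 1],
    [3, 0, 1],
    [10, 2, 0],
], dtype=float)
y_train = np.array([-2.1, -0.7, 0.3, -3.6])

# Ridge regression in closed form: w = (X^T X + lam*I)^-1 X^T y
lam = 1e-2
A = X_train.T @ X_train + lam * np.eye(X_train.shape[1])
w = np.linalg.solve(A, X_train.T @ y_train)

def predict(features):
    """Predict the property for a new molecule's descriptor vector."""
    return float(np.asarray(features, dtype=float) @ w)

# Virtual screening: rank candidate molecules by predicted property,
# without synthesizing or measuring any of them.
candidates = {"mol_A": [7, 1, 1], "mol_B": [4, 0, 2], "mol_C": [9, 2, 0]}
ranked = sorted(candidates, key=lambda m: predict(candidates[m]), reverse=True)
```

In practice the descriptors would come from a cheminformatics toolkit and the model would usually be a graph neural network, as the papers below illustrate, but the screen-by-prediction loop is the same.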
Using message passing neural networks (MPNNs), state-of-the-art results are demonstrated on an important molecular property prediction benchmark; the authors suggest that future work should focus on datasets with larger molecules or more accurate ground-truth labels.
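A single MPNN layer can be sketched in a few lines: a message phase that aggregates transformed neighbor features over the molecular graph, an update phase that combines them with the current atom states, and a readout that pools atom states into a graph-level vector for property prediction. The graph, features, and weight matrices below are toy placeholders, not the paper's architecture.

```python
import numpy as np

# Toy molecular graph: 3 atoms with adjacency matrix A and
# 2-dimensional atom features H (values are illustrative).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

rng = np.random.default_rng(0)
W_msg = rng.normal(size=(2, 2))   # message-function weights (random stand-ins)
W_upd = rng.normal(size=(4, 2))   # update-function weights

def mpnn_step(H, A):
    # Message phase: each atom sums transformed features of its neighbors.
    M = A @ (H @ W_msg)
    # Update phase: combine current state with aggregated messages.
    return np.tanh(np.concatenate([H, M], axis=1) @ W_upd)

H1 = mpnn_step(H, A)
# Readout: sum-pool atom states into one graph-level vector; a linear
# head on this vector would produce the predicted property.
graph_vec = H1.sum(axis=0)
```

Stacking several such steps lets information propagate across bonds, which is what gives MPNNs their edge over fixed molecular fingerprints.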
A new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs) that avoid negative transfer and significantly improve generalization across downstream tasks, yielding absolute improvements of up to 9.4% in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance on molecular property prediction and protein function prediction.
An architectural component is presented that interfaces with perceptual representations, such as the output of a convolutional neural network, and produces a set of task-dependent abstract representations that are exchangeable and can bind to any object in the input by specializing through a competitive procedure over multiple rounds of attention.
A conditional molecular design method that facilitates generating new molecules with desired properties is presented, built as a semi-supervised variational autoencoder trained on a set of existing molecules with only partial annotation.
This work proposes the Molecule Attention Transformer, whose key innovation is to augment the attention mechanism in the Transformer with inter-atomic distances and the molecular graph structure, and shows that the learned attention weights are interpretable from a chemical point of view.
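The idea of augmenting attention with distances and graph structure can be illustrated as mixing three row-stochastic matrices: standard scaled dot-product attention, a softmax over negative inter-atomic distances (closer atoms attend more), and row-normalized adjacency. This is a loose sketch of the general technique, not the paper's exact formulation; the mixing weights and toy inputs are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def molecule_attention(Q, K, V, dist, adj, la=0.5, ld=0.3, lg=0.2):
    # Standard scaled dot-product attention over atom embeddings.
    scores = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
    # Distance term: nearby atoms receive more attention mass.
    dist_w = softmax(-dist)
    # Graph term: row-normalized adjacency (bonded neighbors only).
    adj_w = adj / np.maximum(adj.sum(axis=1, keepdims=True), 1e-9)
    # Mixture of the three sources; weights la + ld + lg sum to 1,
    # so each row of the combined matrix still sums to 1.
    attn = la * scores + ld * dist_w + lg * adj_w
    return attn @ V, attn

# Toy 3-atom molecule with random embeddings (illustrative only).
rng = np.random.default_rng(1)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
dist = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]], dtype=float)
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
out, attn = molecule_attention(Q, K, V, dist, adj)
```

Because the geometric and topological terms are interpretable by construction, inspecting `attn` shows how much each atom's context came from proximity versus bonding versus learned similarity.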
Experimental results on the tasks of graph classification and molecular property prediction show that InfoGraph is superior to state-of-the-art baselines and InfoGraph* can achieve performance competitive with state-of-the-art semi-supervised models.
A graph convolutional model is introduced that consistently matches or outperforms models using fixed molecular descriptors as well as previous graph neural architectures on both public and proprietary data sets.
A molecular multimodal foundation model is proposed that is pretrained on molecular graphs and their semantically related textual data via contrastive learning; it enhances molecular property prediction and can generate meaningful molecular graphs from natural language descriptions.
This work proposes a deep convolutional neural network architecture, MUST-CNN, that uses a novel multilayer shift-and-stitch technique to generate fully dense per-position predictions on protein sequences and beats state-of-the-art performance on two large protein property prediction datasets.
Extensive experiments on the Open Graph Benchmark show that DeeperGCN significantly boosts performance over the state of the art on the large-scale graph learning tasks of node property prediction and graph property prediction.