Single-step retrosynthesis is the task of predicting the set of reactant molecules that produce a given target (product) molecule in one reaction step.
These leaderboards are used to track progress in Single-step Retrosynthesis.
Use these libraries to find Single-step Retrosynthesis models and implementations.
No subtasks available.
The Conditional Graph Logic Network is proposed: a conditional graphical model built on graph neural networks that learns when rules from reaction templates should be applied, implicitly considering whether the resulting reaction would be both chemically feasible and strategic.
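As a rough illustration of the rule-application step that such a model scores, the sketch below applies hand-written retro-templates to a product with RDKit and keeps the outcomes that parse chemically. The two templates and the absence of any learned scoring network are simplifying assumptions for illustration, not the paper's method.

```python
# Minimal sketch: apply candidate retro-templates to a product and keep the
# applicable ones. GLN-style models additionally score applicability with
# graph neural networks over the template pattern and the product graph.
from rdkit import Chem
from rdkit.Chem import AllChem

# Hypothetical retro-templates (product pattern >> reactant patterns).
RETRO_TEMPLATES = [
    "[C:1](=[O:2])[NH1:3]>>[C:1](=[O:2])Cl.[NH2:3]",            # amide <- acid chloride + amine
    "[C:1](=[O:2])[O:3][CH3:4]>>[C:1](=[O:2])O.[O:3][CH3:4]",   # methyl ester <- acid + methanol
]

def applicable_templates(product_smiles):
    """Return (template, reactant_sets) pairs whose product-side pattern matches."""
    product = Chem.MolFromSmiles(product_smiles)
    hits = []
    for tpl in RETRO_TEMPLATES:
        rxn = AllChem.ReactionFromSmarts(tpl)
        reactant_sets = set()
        for outcome in rxn.RunReactants((product,)):
            smis = []
            for mol in outcome:
                try:
                    Chem.SanitizeMol(mol)
                    smis.append(Chem.MolToSmiles(mol))
                except Exception:
                    break
            else:
                reactant_sets.add(".".join(sorted(smis)))
        if reactant_sets:
            hits.append((tpl, sorted(reactant_sets)))
    return hits

if __name__ == "__main__":
    # N-methylacetamide: only the amide template should fire.
    for tpl, reactants in applicable_templates("CC(=O)NC"):
        print(tpl, "->", reactants)
```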
It is demonstrated that applying augmentation techniques to the SMILES representation of the target data significantly improves the quality of the reaction predictions.
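A minimal sketch of the augmentation step itself, assuming RDKit's randomized SMILES output; the downstream sequence-to-sequence model and training loop are omitted.

```python
# Minimal sketch of SMILES augmentation: enumerate randomized (non-canonical)
# SMILES strings for the same molecule. Training on several such variants of
# each product/reactant pair is the augmentation idea being described.
from rdkit import Chem

def randomized_smiles(smiles, n_variants=5):
    """Return up to n_variants distinct random SMILES strings for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    variants = set()
    # doRandom=True starts the atom traversal at a random atom, producing a
    # different but chemically equivalent SMILES string on each call.
    for _ in range(20 * n_variants):
        variants.add(Chem.MolToSmiles(mol, canonical=False, doRandom=True))
        if len(variants) >= n_variants:
            break
    return sorted(variants)

print(randomized_smiles("CC(=O)Oc1ccccc1C(=O)O"))   # aspirin
```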
It is argued that representing the reaction as a sequence of edits enables MEGAN to efficiently explore the space of plausible chemical reactions, maintaining the flexibility of modeling the reaction in an end-to-end fashion and achieving state-of-the-art accuracy on standard benchmarks.
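To make the edit-sequence view concrete, here is a small hand-written example using RDKit's editable molecule API. The edit vocabulary and the hard-coded atom indices are illustrative assumptions, not MEGAN's actual action space, which is predicted autoregressively by a graph attention model.

```python
# A reaction expressed as a sequence of graph edits applied to the product:
# N-methylacetamide -> acetyl chloride + methylamine.
from rdkit import Chem

def apply_edits(product_smiles, edits):
    """Apply (action, args) edits to the product graph and return reactant SMILES."""
    rw = Chem.RWMol(Chem.MolFromSmiles(product_smiles))
    for action, args in edits:
        if action == "delete_bond":
            rw.RemoveBond(*args)                       # args = (atom_i, atom_j)
        elif action == "add_atom":
            new_idx = rw.AddAtom(Chem.Atom(args[0]))   # args = (atomic_num, attach_to)
            rw.AddBond(args[1], new_idx, Chem.BondType.SINGLE)
    mol = rw.GetMol()
    Chem.SanitizeMol(mol)
    return Chem.MolToSmiles(mol)

# Product atom indices for "CC(=O)NC": C0, C1, O2, N3, C4.
edits = [
    ("delete_bond", (1, 3)),   # break the amide C-N bond
    ("add_atom", (17, 1)),     # attach Cl to the carbonyl carbon -> acid chloride
]
print(apply_edits("CC(=O)NC", edits))   # expected: CC(=O)Cl.CN (up to SMILES ordering)
```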
This study models single-step retrosynthesis with a template-based approach, using modern Hopfield networks (MHNs) to associate two different modalities, reaction templates and molecules, which allows the model to leverage structural information about the reaction templates.
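The association step can be sketched in plain NumPy: a molecule embedding queries a memory of template embeddings, and the softmax attention weights act as template relevance scores. The random embeddings below stand in for the learned molecule and template encoders.

```python
# Minimal sketch of a modern-Hopfield association/retrieval step for template relevance.
import numpy as np

rng = np.random.default_rng(0)
d, n_templates = 64, 1000
template_memory = rng.normal(size=(n_templates, d))   # placeholder template encodings
molecule_query = rng.normal(size=(d,))                # placeholder molecule encoding
beta = 1.0 / np.sqrt(d)                               # inverse temperature

scores = beta * template_memory @ molecule_query       # similarity of the query to each memory row
relevance = np.exp(scores - scores.max())
relevance /= relevance.sum()                           # softmax over templates

# Hopfield retrieval: the updated state is a relevance-weighted mixture of the memories.
retrieved = relevance @ template_memory

top5 = np.argsort(relevance)[::-1][:5]
print("top-5 template indices:", top5, "relevance:", relevance[top5])
```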
This work proposes to leverage both representations and designs a new pre-training algorithm, dual-view molecule pre-training (DMP for short), that can effectively combine the strengths of both types of molecule representation.
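A toy PyTorch sketch of the dual-view idea: encode the same molecule both as a SMILES token sequence and as a graph, then tie the two representations together with a consistency loss. Both encoders and the simple cosine loss are placeholders, not the paper's architecture or objectives.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmilesEncoder(nn.Module):
    def __init__(self, vocab_size=64, d=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=d, nhead=4, batch_first=True), num_layers=2)

    def forward(self, tokens):                    # tokens: (batch, seq_len) int ids
        return self.encoder(self.embed(tokens)).mean(dim=1)   # pooled molecule embedding

class GraphEncoder(nn.Module):
    def __init__(self, node_dim=16, d=128):
        super().__init__()
        self.proj = nn.Linear(node_dim, d)        # toy stand-in for a message-passing GNN

    def forward(self, node_feats):                # node_feats: (batch, n_atoms, node_dim)
        return self.proj(node_feats).mean(dim=1)  # mean-pooled, permutation invariant

def dual_view_consistency_loss(smiles_emb, graph_emb):
    # Pull the two views of the same molecule together (1 - cosine similarity).
    return (1 - F.cosine_similarity(smiles_emb, graph_emb, dim=-1)).mean()

# Dummy batch: 8 molecules, 20 SMILES tokens, 12 atoms with 16 features each.
tokens = torch.randint(0, 64, (8, 20))
nodes = torch.randn(8, 12, 16)
loss = dual_view_consistency_loss(SmilesEncoder()(tokens), GraphEncoder()(nodes))
print(loss.item())
```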
This work proposes a single-step, template-free, Transformer-based method dubbed RetroPrime, integrating chemists' retrosynthetic strategy of decomposing a molecule into synthons and then generating reactants by attaching leaving groups.
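The decompose-then-complete idea can be illustrated with RDKit, with a hard-coded bond cut and leaving group standing in for the two Transformer stages RetroPrime actually uses.

```python
# (1) Cut a bond in the product to obtain synthons, then (2) turn the dummy
# attachment points into a leaving group or a hydrogen to obtain reactants.
from rdkit import Chem

def dummies_to(frag, atomic_num):
    """Replace dummy attachment atoms ([*]) with an element; delete them if atomic_num == 0."""
    rw = Chem.RWMol(frag)
    dummy_idxs = [a.GetIdx() for a in rw.GetAtoms() if a.GetAtomicNum() == 0]
    for idx in sorted(dummy_idxs, reverse=True):
        if atomic_num == 0:
            rw.RemoveAtom(idx)
        else:
            rw.GetAtomWithIdx(idx).SetAtomicNum(atomic_num)
            rw.GetAtomWithIdx(idx).SetIsotope(0)
    mol = rw.GetMol()
    Chem.SanitizeMol(mol)
    return Chem.MolToSmiles(mol)

product = Chem.MolFromSmiles("CC(=O)Nc1ccccc1")            # acetanilide
bond = product.GetBondBetweenAtoms(1, 3)                   # the amide C-N bond
synthons = Chem.FragmentOnBonds(product, [bond.GetIdx()], addDummies=True)
print("synthons:", Chem.MolToSmiles(synthons))

reactants = []
for frag in Chem.GetMolFrags(synthons, asMols=True):
    is_acyl = frag.HasSubstructMatch(Chem.MolFromSmarts("[CX3]=O"))
    # Acyl synthon gets a Cl leaving group (acid chloride); the amine synthon just gets an H.
    reactants.append(dummies_to(frag, 17 if is_acyl else 0))
print("reactants:", ".".join(reactants))                   # acetyl chloride + aniline
```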
A local retrosynthesis framework called LocalRetro is proposed, motivated by the chemical intuition that molecular changes occur mostly locally during chemical reactions; it shows promising round-trip accuracies of 89.5% and 99.2% at top-1 and top-5 predictions on the USPTO-50K dataset of 50,016 reactions.
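Round-trip accuracy itself is easy to state in code: a prediction counts as correct at top-k if any of the top-k reactant sets is mapped back to the recorded product by a forward reaction predictor. The forward model below is a stub standing in for a trained forward predictor.

```python
def round_trip_accuracy(predictions, products, forward_model, k=1):
    """predictions[i] is a ranked list of reactant SMILES strings for products[i]."""
    hits = 0
    for reactant_lists, product in zip(predictions, products):
        if any(forward_model(r) == product for r in reactant_lists[:k]):
            hits += 1
    return hits / len(products)

# Toy usage with a fake forward model that "knows" one reaction.
fake_forward = {"CC(=O)Cl.CN": "CC(=O)NC"}.get
preds = [["CC(=O)Cl.CN", "CC(=O)O.CN"]]
print(round_trip_accuracy(preds, ["CC(=O)NC"], fake_forward, k=1))   # 1.0
```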
A novel Graph2SMILES model is described that combines the power of Transformer models for text generation with the permutation invariance of molecular graph encoders, which mitigates the need for input data augmentation.
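A skeletal PyTorch sketch of the graph-encoder-to-sequence-decoder layout: per-atom embeddings from a (toy) graph encoder serve as the memory of a Transformer decoder that emits SMILES tokens. The per-atom linear encoder, dimensions, and vocabulary are placeholders for the paper's message-passing encoder and positional scheme.

```python
import torch
import torch.nn as nn

class GraphToSequence(nn.Module):
    def __init__(self, node_dim=16, d_model=128, vocab_size=64):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, d_model)   # placeholder graph encoder
        self.tok_embed = nn.Embedding(vocab_size, d_model)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model=d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, node_feats, tgt_tokens):
        memory = self.node_proj(node_feats)             # (batch, n_atoms, d_model), order-agnostic memory
        tgt = self.tok_embed(tgt_tokens)                # (batch, seq_len, d_model)
        seq_len = tgt.size(1)
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        return self.out(self.decoder(tgt, memory, tgt_mask=causal))

model = GraphToSequence()
logits = model(torch.randn(2, 12, 16), torch.randint(0, 64, (2, 20)))
print(logits.shape)   # (2, 20, 64): per-position SMILES token logits
```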
A template-based single-step retrosynthesis model based on Modern Hopfield Networks is introduced, which learns an encoding of both molecules and reaction templates in order to predict the relevance of templates for a given molecule.
This work proposes an innovative retrosynthesis prediction framework that can compose novel templates beyond those seen during training, producing novel templates for 15 USPTO-50K test reactions that are not covered by the training templates.