Authors
Jike Wang, Chang-Yu Hsieh, Mingyang Wang, Xiaorui Wang, Zhenxing Wu, Dejun Jiang, Benben Liao, Xujun Zhang, Bo Yang, Qiaojun He, Dongsheng Cao, Xi Chen, Tingjun Hou
Publication date
2021/10
Journal
Nature Machine Intelligence
Volume
3
Issue
10
Pages
914-922
Publisher
Nature Publishing Group UK
Description
Machine learning-based generative models can generate novel molecules with desirable physicochemical and pharmacological properties from scratch. Many excellent generative models have been proposed, but multi-objective optimization in molecular generation tasks remains quite challenging for most existing models. Here we propose the multi-constraint molecular generation (MCMG) approach, which satisfies multiple constraints by combining a conditional transformer and reinforcement learning algorithms through knowledge distillation. A conditional transformer is used to train a molecular generative model by efficiently learning and incorporating structure–property relations into a biased generative process. A knowledge distillation model is then employed to reduce the model's complexity so that it can be efficiently fine-tuned by reinforcement learning, enhancing the structural diversity of the …
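The knowledge-distillation step mentioned in the description can be illustrated with a minimal, pure-Python sketch of the standard soft-target loss (KL divergence between temperature-softened teacher and student distributions, scaled by T²). This is an illustrative sketch of the general technique, not the paper's actual implementation; the function names and the temperature value are assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution at a given temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# A student that exactly matches the teacher incurs zero loss;
# any mismatch gives a positive penalty.
teacher = [2.0, 0.5, -1.0]
print(distillation_loss(teacher, teacher))              # 0.0
print(distillation_loss(teacher, [0.1, 0.1, 0.1]) > 0)  # True
```

In the MCMG pipeline this kind of loss would transfer the conditional transformer's output distribution to a smaller student model, which is cheaper to fine-tune with reinforcement learning afterward.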
Total citations
2022: 26 · 2023: 49 · 2024: 36