Extracting Molecular Properties from Natural Language with Multimodal Contrastive Learning

07/22/2023
by   Romain Lacombe, et al.

Deep learning in computational biochemistry has traditionally focused on neural representations of molecular graphs; however, recent advances in language models highlight how much scientific knowledge is encoded in text. To bridge these two modalities, we investigate how molecular property information can be transferred from natural language to graph representations. We study property prediction performance gains after using contrastive learning to align neural graph representations with representations of textual descriptions of their characteristics. We implement neural relevance scoring strategies to improve text retrieval, introduce a novel chemically valid molecular graph augmentation strategy inspired by organic reactions, and demonstrate improved performance on downstream MoleculeNet property classification tasks. We achieve a +4.26 gain versus models pre-trained on the graph modality alone, and a +1.54 gain compared to the recently proposed MoMu model, a contrastively trained molecular graph/text model (Su et al. 2022).
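The contrastive alignment described above can be illustrated with a CLIP-style symmetric InfoNCE objective, in which each matched (graph, text) embedding pair in a batch is a positive and all other pairings are negatives. This is a minimal sketch under that assumption; the function names, the temperature value, and the exact loss form are illustrative, not taken from the paper.

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length so dot products become cosine similarities."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def info_nce(graph_embs, text_embs, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of embedding pairs.

    graph_embs[i] and text_embs[i] are treated as a positive pair;
    every other (i, j) combination in the batch is a negative.
    Lower loss means the two modalities are better aligned.
    """
    g = [l2_normalize(v) for v in graph_embs]
    t = [l2_normalize(v) for v in text_embs]
    n = len(g)
    # Cosine-similarity logits, scaled by temperature.
    logits = [[dot(g[i], t[j]) / temperature for j in range(n)] for i in range(n)]

    def cross_entropy(row, target):
        # Numerically stable log-softmax cross-entropy for one row of logits.
        m = max(row)
        log_z = m + math.log(sum(math.exp(x - m) for x in row))
        return log_z - row[target]

    # Graph-to-text direction: row i should select text i.
    loss_g2t = sum(cross_entropy(logits[i], i) for i in range(n)) / n
    # Text-to-graph direction: column i should select graph i.
    loss_t2g = sum(cross_entropy([logits[j][i] for j in range(n)], i)
                   for i in range(n)) / n
    return 0.5 * (loss_g2t + loss_t2g)
```

With perfectly aligned embeddings the loss approaches zero, while mismatched pairings drive it up, which is the signal used during pre-training to pull each graph embedding toward the embedding of its own description.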
