
Dataset Information


Extracting chemical-protein relations using attention-based neural networks.


ABSTRACT: Relation extraction is an important task in the field of natural language processing. In this paper, we describe our approach to BioCreative VI Task 5: text mining chemical-protein interactions. We investigate multiple deep neural network (DNN) models, including convolutional neural networks, recurrent neural networks (RNNs) and attention-based RNNs (ATT-RNNs), to extract chemical-protein relations. Our experimental results indicate that ATT-RNN models outperform the same models without attention, and that the attention-based gated recurrent unit (ATT-GRU) achieves the best micro-averaged F1 score (0.527) on the test set among the tested DNNs. In addition, the word-level attention weights show that the attention mechanism is effective at selecting the most important trigger words when trained with semantic relation labels, without the need for semantic parsing or feature engineering. The source code of this work is available at https://github.com/ohnlp/att-chemprot.
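For illustration, the sketch below shows how an attention-based GRU relation classifier of the kind described in the abstract can be put together: a bidirectional GRU encodes the sentence, a word-level attention layer pools the hidden states into a single vector, and a linear layer maps that vector to the relation classes. This is a minimal PyTorch reconstruction with placeholder names and hyperparameters (AttGRURelationClassifier, the layer sizes, and the toy inputs are assumptions), not the authors' implementation, which is available in the linked repository.

# Minimal sketch of a word-level attention GRU relation classifier (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttGRURelationClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=200, hidden_dim=128, num_classes=6):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Word-level attention: one score per token, computed from its hidden state.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, mask):
        # token_ids, mask: (batch, seq_len); mask is 1 for real tokens, 0 for padding.
        embedded = self.embedding(token_ids)             # (batch, seq_len, embed_dim)
        hidden, _ = self.gru(embedded)                   # (batch, seq_len, 2*hidden_dim)
        scores = self.attn(hidden).squeeze(-1)           # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)              # attention distribution over tokens
        context = torch.bmm(weights.unsqueeze(1), hidden).squeeze(1)  # (batch, 2*hidden_dim)
        logits = self.classifier(context)
        # Returning the weights allows inspection of which trigger words receive attention.
        return logits, weights

# Toy usage: a batch of 2 padded sentences of length 5.
model = AttGRURelationClassifier(vocab_size=10000)
ids = torch.randint(1, 10000, (2, 5))
mask = torch.ones(2, 5, dtype=torch.long)
logits, attn_weights = model(ids, mask)
print(logits.shape, attn_weights.shape)  # torch.Size([2, 6]) torch.Size([2, 5])

Returning the attention weights alongside the logits is what makes the word-level analysis mentioned in the abstract possible: the highest-weighted tokens can be read off as the trigger words the model relied on.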

SUBMITTER: Liu S 

PROVIDER: S-EPMC6174551 | biostudies-literature | 2018 Jan

REPOSITORIES: biostudies-literature


Publications

Extracting chemical-protein relations using attention-based neural networks.

Liu Sijia, Shen Feichen, Komandur Elayavilli Ravikumar, Wang Yanshan, Rastegar-Mojarad Majid, Chaudhary Vipin, Liu Hongfang

Database: The Journal of Biological Databases and Curation, 2018


Similar Datasets

| S-EPMC8064235 | biostudies-literature
| S-EPMC7784314 | biostudies-literature
| S-EPMC4834205 | biostudies-literature
| S-EPMC5338769 | biostudies-literature
| S-EPMC7251675 | biostudies-literature
| S-EPMC8448872 | biostudies-literature