Named Entity Recognition (NER) and Relation Extraction in Scientific Publications

Anshika Singh
Ankit Garg

Abstract

Scientific publications are essential sources of information for researchers across many fields, but the rapidly growing number of publications makes it difficult for researchers to keep up with the latest advances. Extracting key phrases and the relationships between them from scientific papers is therefore an important task in natural language processing, as it helps researchers efficiently identify relevant articles and draw valuable insights from them. This research addresses key phrase extraction, classification, and relationship identification in scientific publications. The problem is divided into two sub-problems: extracting key phrases and classifying them into the PROCESS, TASK, and MATERIAL categories, and identifying relationships between the extracted phrases. To address these sub-problems, SciBERT, the MiniLM Sentence Transformer, and SVMs are used. Together, these techniques enable efficient processing and analysis of scientific text for key phrase extraction, classification, and relationship identification. By tackling these challenges effectively, researchers can navigate the vast body of scientific literature more efficiently, identifying relevant articles and uncovering valuable connections and insights within the text.
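To make the first sub-problem concrete, the sketch below treats key phrase extraction and classification as BIO token tagging on top of SciBERT, one standard way to set up this kind of NER task. It is a minimal sketch rather than the authors' implementation: the BIO label set is an assumption derived from the PROCESS, TASK, and MATERIAL categories above, and the randomly initialised classification head must be fine-tuned on annotated data before its predictions mean anything.

```python
# Minimal sketch of sub-problem 1: key phrase extraction and classification
# as BIO token tagging over SciBERT. The checkpoint is the public SciBERT
# release; the label set is an assumption derived from the PROCESS / TASK /
# MATERIAL categories in the abstract. The classification head is randomly
# initialised here and must be fine-tuned on an annotated corpus before use.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

LABELS = ["O",
          "B-PROCESS", "I-PROCESS",
          "B-TASK", "I-TASK",
          "B-MATERIAL", "I-MATERIAL"]

tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "allenai/scibert_scivocab_uncased", num_labels=len(LABELS)
)

sentence = "We train a convolutional network for medical image segmentation."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits            # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0]

# Print one predicted BIO label per subword token.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label_id in zip(tokens, pred_ids.tolist()):
    print(f"{token:20s} {LABELS[label_id]}")
```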
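For the second sub-problem, one plausible way to combine the MiniLM Sentence Transformer with an SVM, consistent with the abstract, is to embed each candidate key phrase pair with MiniLM and let the SVM decide whether the pair is related. The sketch below assumes that reading; the toy phrase pairs and the "related"/"none" labels are illustrative placeholders, not the paper's data or relation inventory.

```python
# Minimal sketch of sub-problem 2: relationship identification between
# extracted key phrases. Each candidate pair is embedded with a MiniLM
# sentence transformer and classified with an SVM. The pairs and labels
# below are toy placeholders; a real run needs an annotated corpus.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.svm import SVC

encoder = SentenceTransformer("all-MiniLM-L6-v2")

pairs = [
    ("neural network", "deep learning model"),
    ("image segmentation", "semantic segmentation"),
    ("graphene", "machine translation"),
    ("support vector machine", "protein folding"),
]
labels = ["related", "related", "none", "none"]

def featurize(a: str, b: str) -> np.ndarray:
    """Concatenate both phrase embeddings with their elementwise difference."""
    emb_a, emb_b = encoder.encode([a, b])
    return np.concatenate([emb_a, emb_b, emb_a - emb_b])

X = np.stack([featurize(a, b) for a, b in pairs])
clf = SVC(kernel="rbf").fit(X, labels)

# Predict a relation label for an unseen key phrase pair.
print(clf.predict([featurize("recurrent network", "sequence model")]))
```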

Article Details

How to Cite
[1]
Anshika Singh and Ankit Garg, “Named Entity Recognition (NER) and Relation Extraction in Scientific Publications”, IJRTE, vol. 12, no. 2, pp. 110–113, Jul. 2023, doi: 10.35940/ijrte.B7846.0712223.
Section: Articles


