Enhancing Contextual Masking in Reversible Linguistic Steganography with Ensemble Methods

Mrs. M. Prasha Meena
N J S Deepalakshmi
R Dharsni
R Subashree

Abstract

Text authentication, which preserves digital identities and content, helps prevent a range of cybercrimes. Digital signatures are a widely used means of authenticating text. Linguistic steganography offers one approach: the signature is hidden within the words of the text itself, which simplifies data management. However, such modifications risk distorting the text and can lead automated computing systems to make inappropriate decisions or to produce unnoticed changes in their outputs. This has motivated growing interest in reversible steganography, which allows any distortion introduced during embedding to be removed. This paper replaces random masking with contextual masking in a BERT-based model. The goal of this research is a reversible steganographic system tailored to natural-language text. Our model uses pre-trained BERT as a transformer-based masked language model and reversibly embeds messages through predictive word substitution. To quantify predictive uncertainty, we introduce an adaptive steganographic technique based on Bayesian deep learning, and we replace Monte Carlo sampling with ensemble methods to balance imperceptibility. Experiments show that the proposed system balances imperceptibility and capacity while closely preserving the semantics of the cover text.
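
The following is a minimal sketch of the idea described above, assuming the Hugging Face transformers and PyTorch libraries: a pre-trained BERT masked language model predicts a contextually masked token, and the disagreement among a small ensemble (variance of the predicted distributions) serves as the predictive-uncertainty score. The model names, the example sentence, the fixed mask position, and the variance-based score are illustrative assumptions, not the authors' implementation; in a real system the ensemble members would be independently fine-tuned copies rather than repeated loads of the same checkpoint.

# Illustrative sketch only -- not the authors' implementation.
# Contextual masking with a pre-trained BERT masked language model,
# with ensemble disagreement used as a predictive-uncertainty score.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumption: in the real system the members would be independently
# fine-tuned BERT copies; identical checkpoints are loaded here only
# to keep the sketch self-contained (their variance will be ~0).
MODEL_NAMES = ["bert-base-uncased"] * 3
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAMES[0])
models = [AutoModelForMaskedLM.from_pretrained(n).eval() for n in MODEL_NAMES]

def ensemble_mask_predictions(text, mask_index):
    # Mask one context-dependent position (here: a fixed index) and
    # return each ensemble member's probability distribution over it.
    enc = tokenizer(text, return_tensors="pt")
    input_ids = enc["input_ids"].clone()
    input_ids[0, mask_index] = tokenizer.mask_token_id
    dists = []
    with torch.no_grad():
        for model in models:
            logits = model(input_ids=input_ids,
                           attention_mask=enc["attention_mask"]).logits
            dists.append(torch.softmax(logits[0, mask_index], dim=-1))
    return torch.stack(dists)          # shape: (n_members, vocab_size)

cover = "the quick brown fox jumps over the lazy dog"
dists = ensemble_mask_predictions(cover, mask_index=4)   # masks "fox"
mean_dist = dists.mean(dim=0)                            # ensemble average
uncertainty = dists.var(dim=0).sum().item()              # disagreement score
# Low uncertainty -> the masked position is predictable enough to carry a
# hidden bit via predictive word substitution; high uncertainty -> skip it.
candidates = torch.topk(mean_dist, k=2).indices.tolist()
print(tokenizer.convert_ids_to_tokens(candidates), uncertainty)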

Article Details

How to Cite
[1]
Mrs. M. Prasha Meena, N J S Deepalakshmi, R Dharsni, and R Subashree, “Enhancing Contextual Masking in Reversible Linguistic Steganography with Ensemble Methods”, IJRTE, vol. 13, no. 1, pp. 31–40, May 2024, doi: 10.35940/ijrte.A8066.13010524.
Section
Articles
