IV. EXPERIMENTAL RESULTS & ANALYSIS
A. Experimental Results
We evaluated our proposed model on a dataset of 500 interview questions drawn from various industries. Using an 80/20 train-test split, we trained the model for 5 epochs. The model achieved high test accuracy and outperformed several baseline models, including logistic regression and support vector machines.
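As a minimal sketch of this evaluation setup (not the exact pipeline used in this work), the following Python snippet fine-tunes a BERT classifier on labelled interview questions with an 80/20 split and 5 training epochs. The dataset file name, column names, label scheme, model checkpoint, and hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch: fine-tune a BERT classifier on labelled interview questions.
# File name, columns ("question", "important"), and hyperparameters are assumed.
import pandas as pd
import torch
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from transformers import (BertTokenizerFast, BertForSequenceClassification,
                          Trainer, TrainingArguments)

df = pd.read_csv("interview_questions.csv")            # columns: question, important (0/1)
train_df, test_df = train_test_split(df, test_size=0.2, random_state=42)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

class QuestionDataset(torch.utils.data.Dataset):
    """Wraps a dataframe of questions and binary importance labels."""
    def __init__(self, frame):
        self.enc = tokenizer(list(frame["question"]), truncation=True,
                             padding=True, max_length=64)
        self.labels = list(frame["important"])
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

args = TrainingArguments(output_dir="out", num_train_epochs=5,
                         per_device_train_batch_size=16)
trainer = Trainer(model=model, args=args, train_dataset=QuestionDataset(train_df))
trainer.train()

# Evaluate on the held-out 20% split.
preds = trainer.predict(QuestionDataset(test_df)).predictions.argmax(axis=-1)
print("test accuracy:", accuracy_score(test_df["important"], preds))
```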
B. Analysis
Our results show that the proposed approach is effective in predicting the importance of interview questions. The BERT-based approach significantly outperformed traditional machine learning models, which is likely due to contextualized word embeddings that capture the meaning of the input text. We also analysed the features learned by the model and found that the most influential ones related to the specific industry and job position associated with each question. To further evaluate the effectiveness of our approach, we conducted a user study in which participants rated the usefulness of the model's predictions for a set of interview questions. Participants found the predictions to be accurate and useful, with an average rating of 4.5 out of 5.
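To illustrate the sentence-similarity check referred to in this work, the sketch below embeds a candidate question with a pre-trained Sentence-BERT model and compares it against a small bank of known important questions by cosine similarity. The model checkpoint, the example questions, and the 0.6 threshold are assumptions, not the exact configuration used in our experiments.

```python
# Hypothetical illustration of a sentence-similarity check with Sentence-BERT.
# Checkpoint name, question bank, and threshold are assumed for this example.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

important_bank = [
    "Tell me about a time you resolved a conflict in your team.",
    "Explain the difference between a process and a thread.",
]
new_question = "How would you handle a disagreement with a colleague?"

# Embed the bank and the query, then compute cosine similarities.
bank_emb = model.encode(important_bank, convert_to_tensor=True)
query_emb = model.encode(new_question, convert_to_tensor=True)
scores = util.cos_sim(query_emb, bank_emb)[0]

best = scores.max().item()
label = "likely important" if best > 0.6 else "less important"
print(f"closest match score: {best:.2f} -> {label}")
```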
Overall, the experimental results and analysis demonstrate that the proposed approach is effective in predicting important questions for interview preparation, and can outperform traditional machine learning models. However, there is still room for improvement in identifying situational questions and addressing the ambiguity of some input questions.
V. CONCLUSION
Our proposed approach for predicting important interview questions using BERT and sentence similarity check is effective in capturing the context and meaning of interview questions, outperforming traditional machine learning models. The results demonstrate the potential of using pre-trained BERT models for natural language processing tasks, such as interview preparation. This research can help job seekers better prepare for interviews by identifying the most important questions to focus on, leading to more successful job interviews.
VI. ACKNOWLEDGEMENTS
We would like to express our gratitude to all the individuals who have supported and assisted us in completing this project. Firstly, we would like to thank our professor, Mrs. Deepika Tyagi, for her guidance and encouragement throughout the research process. We would also like to extend our appreciation to the developers of the BERT model and the pre-trained models used in our study, which have greatly contributed to the success of our research. Finally, we would like to acknowledge our families and friends for their constant support and motivation.
References
[1] Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers) (pp. 4171-4186).
[2] Reimers, N., & Gurevych, I. (2019). Sentence-BERT: Sentence embeddings using Siamese BERT-networks. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP) (pp. 3982-3992).
[3] Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... & Polosukhin, I. (2017). Attention is all you need. In Advances in Neural Information Processing Systems (pp. 5998-6008).
[4] Hugging Face (2021). Transformers. https://huggingface.co/transformers/
[5] InterviewBit (2021). Interview Preparation: Coding Interview Questions and Answers. https://www.interviewbit.com/