An Efficient Comparison of Neural Network Methods to Evaluate Student Performance


GRD Journals- Global Research and Development Journal for Engineering | Volume 6 | Issue 1 | December 2020 ISSN- 2455-5703

An Efficient Comparison of Neural Network Methods to Evaluate Student Performance

Dr. V. S. R. Kumari
Principal (Professor), Department of Electronics and Communication Engineering, Sri Mittapalli Institute of Technology for Women / JNTU Kakinada

Suresh Veesa
Associate Professor, Department of Electronics and Communication Engineering, Sri Mittapalli Institute of Technology for Women / JNTU Kakinada

Srinivasa Rao Chevala
Assistant Professor, Department of Electronics and Communication Engineering, Sri Mittapalli Institute of Technology for Women / JNTU Kakinada

Abstract
In present educational systems, predicting student performance is becoming more difficult. Predicting performance ahead of time can help both students and their instructors monitor a student's progress. Many institutions have now adopted continuous assessment systems, which benefit students by encouraging regular study and steady improvement. Recently, Neural Networks have seen widespread and effective use in a wide range of data mining applications, frequently outperforming other classifiers. This study aims to explore whether Neural Networks are a suitable classifier for predicting student performance from Learning Management System (LMS) data in the context of Educational Data Mining. To assess the suitability of Neural Networks, we compare their predictive performance against six other classifiers on the same dataset: Naive Bayes, k-Nearest Neighbours, Decision Tree, Random Forest, Support Vector Machine and Logistic Regression, each trained on data obtained during each course. The features used for training originate from LMS data collected while each course was running, and range from usage data, such as time spent on each course page, to grades obtained for course assignments and tests. After training, the Neural Network beats all six classifiers in terms of accuracy and is comparable to the best classifiers in terms of recall. We conclude that Neural Networks outperform the six other algorithms tested on this dataset and could be effectively used to predict student performance.
Keywords- Neural Networks, Random Forest, Support Vector Machine, Logistic Regression

I. INTRODUCTION
Predicting student performance is a valuable application for schools, instructors and students. Schools can admit high-calibre students according to their academic record. For the teacher, it helps with monitoring students' performance and providing better teaching strategies. For students, it gives them time to improve and adjust their own performance. Many studies have discussed this topic. Amirah et al. [1] found that the cumulative grade point average was a significant attribute and has frequently been used. They investigated several prediction models, including Decision Tree, Neural Network, Naïve Bayes, and Support Vector Machine. The outcome was that the Neural Network achieved the highest accuracy, followed by the Decision Tree. Moreover, there is now a great deal of distance education, in which students often feel isolated because of the lack of communication. Education is no longer a one-time event but a lifelong undertaking. One reason is that working lives are now so long and fast-changing that people need to keep learning throughout their careers [3]. While the classic model of education is not scaling to meet these evolving needs, the wider market is developing to enable workers to learn in new ways. Massive open online courses (MOOCs), offered by organizations such as Udacity and Coursera, are now focusing much more directly on courses that make their students more employable. At Coursera and Udacity, students pay for short programs that award micro-credentials and Nanodegrees in technology-focused subjects such as self-driving vehicles and Android. Additionally, universities are offering online degrees to make it easier for professionals to access opportunities to develop their skills (e.g., Georgia Tech's Computer Science graduate degree).
However, widening access to cutting-edge vocational subjects does not automatically guarantee student success [2]. In a classic classroom, where student numbers are limited, various elements of interaction enable the teacher to assess an individual student's degree of engagement quite effectively and to predict their learning outcomes (e.g., successful completion of a course, course withdrawals, final grades). In the world of MOOCs, the significant increase in student numbers makes it

All rights reserved by www.grdjournals.com




unrealistic for even experienced human teachers to conduct such individual assessments. An automated system that accurately predicts how students will perform in real time could help in this situation. It would be a significant tool for making smart decisions about when to make live educational interventions during the course (and with whom), with the aim of increasing engagement, providing motivation and empowering students to succeed. The student performance prediction problem has been partially studied within the learning analytics and educational data mining communities as the student dropout (or completion) prediction problem, which is an important subclass of the student performance prediction problem. In our proposed work, we evaluate the performance of various machine learning algorithms and Neural Networks for predicting student performance. The Neural Network shows better prediction accuracy compared to the other machine learning algorithms.

II. METHODOLOGY
A. Random Forest
Random forests, or random decision forests, are an ensemble learning method for classification, regression and other tasks that works by building a large number of decision trees at training time and outputting the class that is the mode of the individual trees' classes (classification) or their mean prediction (regression). Random decision forests correct for decision trees' tendency to overfit their training set. Each tree is trained on a random sample of the data, and the forest aggregates the individual tree votes into a final prediction.

B. NN Working Procedure
To demonstrate how to build a neural network for student performance prediction, we build a three-layer neural network that separates one performance class from another. The network we build is small enough to run on a CPU. Traditional neural networks that excel at image classification have many more parameters and take a long time to train on an ordinary CPU. However, our objective is to show how to build a real-world neural network using TensorFlow. Neural networks are essentially mathematical models for solving an optimization problem. They are made of neurons, the basic computational unit of a neural network. A neuron takes an input (say x) and performs some computation on it (say, multiplies it by a weight w and adds a bias b) to produce a value z = wx + b. This value is passed through a non-linear function called an activation function (f) to produce the final output (activation) of the neuron. There are many kinds of activation functions; one popular choice is the sigmoid. A neuron that uses the sigmoid as its activation function is called a sigmoid neuron. Neurons are named after their activation functions, and there are many kinds, such as ReLU and tanh.
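The neuron computation described above (z = wx + b followed by a sigmoid activation) can be sketched in a few lines. This is an illustrative NumPy sketch, not the paper's implementation; the input, weight and bias values are arbitrary.

```python
import numpy as np

def sigmoid(z):
    # Squash any real value into the (0, 1) range.
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    # A single neuron: weighted input plus bias, then the activation.
    z = np.dot(w, x) + b
    return sigmoid(z)

# Example with two input features and hand-picked (illustrative) weights:
# z = 2.0*0.5 + 1.0*(-1.0) + 0.5 = 0.5, sigmoid(0.5) ~ 0.6225
out = neuron(x=np.array([0.5, -1.0]), w=np.array([2.0, 1.0]), b=0.5)
print(round(float(out), 4))  # -> 0.6225
```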
If you stack neurons in a single row, it is called a layer, which is the next building block of neural networks. [Figure: neurons stacked into layers]
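Stacking such neurons into layers gives the three-layer network described above. Below is a minimal sketch of a single forward pass; the paper mentions TensorFlow, but plain NumPy is used here for brevity, the layer sizes are illustrative, and the weights are random placeholders rather than trained values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative layer sizes: 5 input features -> 8 hidden units -> 3 output
# classes (e.g., HIGH / MEDIUM / LOW). These weights are untrained placeholders.
W1, b1 = rng.normal(size=(8, 5)), np.zeros(8)
W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)

def forward(x):
    h = sigmoid(W1 @ x + b1)          # hidden-layer activations
    logits = W2 @ h + b2              # output-layer pre-activations
    e = np.exp(logits - logits.max()) # numerically stable softmax
    return e / e.sum()                # class probabilities summing to 1

probs = forward(rng.normal(size=5))
print(probs.shape)  # one probability per performance class
```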

To predict the student performance class, multiple layers operate on one another's outputs, and this process continues until no further improvement is obtained.

C. Naive Bayes
Naive Bayes, one of the most commonly used algorithms for classification problems, is a simple probabilistic classifier based on Bayes' theorem. It estimates the probability of each feature occurring in each class and returns the class with the highest probability.
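The Naive Bayes procedure above can be illustrated with scikit-learn's Gaussian variant. The dataset here is synthetic, standing in for the student-performance data (5 features, 3 classes); it is a sketch of the classifier, not the paper's actual experiment.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the student dataset: 5 features, 3 classes.
X, y = make_classification(n_samples=300, n_features=5, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# Fit per-class Gaussian likelihoods and predict the most probable class.
nb = GaussianNB().fit(X_tr, y_tr)
acc = nb.score(X_te, y_te)
print(f"Naive Bayes accuracy: {acc:.3f}")
```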





D. Decision Tree Algorithm
This algorithm builds a training model by arranging all similar records in the same branch of a tree, and continues until all records are arranged in the tree. The complete tree is then used as the classification model.

E. SVM
Machine learning involves predicting and classifying data, and to do so we employ various machine learning algorithms according to the dataset. A Support Vector Machine (SVM) is a linear model for classification and regression problems. It can solve linear and non-linear problems and works well for many practical problems. The idea behind SVM is simple: the algorithm creates a line or a hyperplane that separates the data into classes. The radial basis function (RBF) kernel is a popular kernel function used in various kernelized learning algorithms, in particular in support vector machine classification. As a simple example, for a classification task with only two features, a hyperplane can be thought of as a line that linearly separates and classifies a set of data.
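The two classifiers just described can be sketched side by side with scikit-learn. As before, the dataset is a synthetic stand-in for the student data, so the scores only illustrate the mechanics, not the paper's results.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=5, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# Decision tree: recursively splits records into branches of similar records.
tree = DecisionTreeClassifier(random_state=42).fit(X_tr, y_tr)

# SVM with the RBF kernel, which handles non-linear class boundaries.
svm = SVC(kernel="rbf").fit(X_tr, y_tr)

tree_acc = tree.score(X_te, y_te)
svm_acc = svm.score(X_te, y_te)
print(f"Decision Tree: {tree_acc:.3f}, SVM (RBF): {svm_acc:.3f}")
```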

III. RESULTS AND DISCUSSION
We designed a Deep Neural Network classifier model to predict the performance of students. This step is carried out once the dataset has been preprocessed: data cleaning and data transformation.
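The transformation step mentioned above can be sketched with scikit-learn's standard utilities: scaling numeric features and encoding the class labels as integers. The feature names and values below are invented for illustration, not taken from the paper's dataset.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler, LabelEncoder

# Toy records: [minutes on course pages, assignment grade, quiz grade]
# (hypothetical features, loosely modelled on the LMS features described).
X_raw = np.array([[120.0, 55.0, 60.0],
                  [300.0, 90.0, 85.0],
                  [45.0,  40.0, 35.0]])
labels = ["MEDIUM", "HIGH", "LOW"]

X = StandardScaler().fit_transform(X_raw)   # zero mean, unit variance per column
y = LabelEncoder().fit_transform(labels)    # map class names to integer codes

print(X.mean(axis=0).round(6))  # each feature now centred near 0
print(y)                        # integer class codes
```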

On the application screen, we ran all seven algorithms and obtained their accuracies. Of all the algorithms, the neural network achieved the highest accuracy, at 89.84%. The application uses random test data, so the accuracy may vary between runs.
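A seven-way comparison like the one on the application screen can be reproduced in outline with scikit-learn. This sketch uses a synthetic dataset, so its accuracies will not match the 89.84% reported above; it only illustrates the experimental setup.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for the student dataset.
X, y = make_classification(n_samples=400, n_features=5, n_informative=4,
                           n_redundant=0, n_classes=3, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

# The seven classifiers compared in the paper.
models = {
    "Naive Bayes": GaussianNB(),
    "KNN": KNeighborsClassifier(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
    "SVM": SVC(kernel="rbf"),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Neural Network": MLPClassifier(max_iter=2000, random_state=42),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```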

In the graph, the x-axis represents the algorithm names and the y-axis the accuracy of those algorithms. From the graph we can conclude that the NN algorithm obtained the highest accuracy. Clicking the "Upload & Predict New Student Performance" button uploads a test dataset and then predicts student performance by applying the trained neural network model.





In this screen, each record in the test dataset receives a predicted performance value of HIGH, MEDIUM or LOW.
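Turning the network's class probabilities into the HIGH/MEDIUM/LOW labels shown on screen amounts to an argmax over each row. The probability rows and the class ordering below are invented for illustration.

```python
import numpy as np

# Hypothetical class-probability rows from a trained network, one per student.
# The column order [HIGH, LOW, MEDIUM] is an assumption for this sketch.
class_names = np.array(["HIGH", "LOW", "MEDIUM"])
probs = np.array([[0.7, 0.1, 0.2],
                  [0.1, 0.8, 0.1],
                  [0.2, 0.3, 0.5]])

# Pick the highest-probability class for each record.
predicted = class_names[probs.argmax(axis=1)]
print(predicted.tolist())  # -> ['HIGH', 'LOW', 'MEDIUM']
```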

IV. CONCLUSION
In this paper, we evaluated the performance of various machine learning algorithms and Neural Networks for predicting student performance. The Neural Network shows better prediction accuracy compared to the other machine learning algorithms: KNN, Naïve Bayes, SVM, Logistic Regression, Random Forest and Decision Tree. To implement this project we used a student performance dataset from the Kaggle website.

REFERENCES
[1] V. Lahari Sowmya and Dr. A. Kousar Nikhath. "A Qualitative Evaluation and Comparison of Cognitive Learning Analytics for Blended Learning Technology". International Journal of Advanced Science and Technology, Vol. 29, No. 08, Sept. 2020, pp. 5356-0, http://sersc.org/journals/index.php/IJAST/article/view/32283.
[2] Barzilay, Regina, Michael Collins, Julia Hirschberg, and Steve Whittaker. 2000. "The rules behind roles". In Proceedings of AAAI-00.
[3] Baxendale, Phyllis B. 1958. "Man-made index for technical literature--An experiment". IBM Journal of Research and Development, 2(4):354-361.
[4] Biber, Douglas. 1995. Dimensions of Register Variation: A Cross-Linguistic Comparison. Cambridge University Press, Cambridge, England.
[5] Brandow, Ronald, Karl Mitze, and Lisa F. Rau. 1995. "Automatic condensation of electronic publications by sentence selection". Information Processing and Management, 31(5):675-685.
[6] Carletta, Jean. 1996. "Assessing agreement on classification tasks: the kappa statistic". Computational Linguistics, 22(2):249-254.
[7] Choi, Freddy Y. Y. 2000. "Advances in domain independent linear text segmentation". In Proceedings of the Sixth Applied Natural Language Conference (ANLP-00) and the First Meeting of the North American Chapter of the Association for Computational Linguistics (NAACL-00), pages 26-33.
[8] CMP_LG. 1994. The computation and language e-print archive.
[9] Dunning, Ted. 1993. "Accurate methods for the statistics of surprise and coincidence". Computational Linguistics, 19(1):61-74.
[10] Barzilay, Regina, Kathleen R. McKeown, and Michael Elhadad. 1999. "Information fusion in the context of multi-document summarization". In Proceedings of the 37th Annual Meeting of the Association for Computational Linguistics (ACL-99), pages 550-557.
[11] https://cv-tricks.com/tensorflow-tutorial/training-convolutional-neural-network-for-image-classification/
[12] https://gnss.geosystems.ru/dls-2021-8ggcv/tensorflow-sigmoid.html
[13] https://justontheinternet.com/convolutional-neural-networks-using-tensorflow/
[14] https://towardsdatascience.com/https-medium-com-pupalerushikesh-svm-f4b42800e989?gi=d5736103512f
[15] https://www.broadhorizons.co.za/1521835454-jGexwWbX_separate+machine+bulls+mercadolibre-jGexwWbX.html
[16] https://aylien.com/blog/support-vector-machines-for-dummies-a-simple-explanation



