2016
COMPUTATIONAL LINGUISTICS IN ENGINEERING AND RESEARCH
CLEAR
Department Of Computer Science & Engineering Government Engineering College Sreekrishnapuram, Palakkad A Simple Initiative
CLEAR March 2016
Editorial
News & Updates
CLEAR June 2016 Invitation
Last Word
CLEAR Magazine (Computational Linguistics in Engineering And Research)
M. Tech Computational Linguistics, Dept. of Computer Science and Engineering, Govt. Engineering College, Sreekrishnapuram, Palakkad - 678633
www.simplegroups.in
simplequest.in@gmail.com

Chief Editor
Dr. Ajeesh Ramanujan, Assistant Professor, Dept. of Computer Science and Engineering, Govt. Engineering College, Sreekrishnapuram, Palakkad - 678633

Editors
Deepthi, Kavya V J, Reshma M M, Swedha B
The Much Harder Soft Computing by Gayathri E S
Fascinating World of Neural Networks by Swedha B
Darwin Evolution and Genetic Algorithms by Deepthi
Fuzzy Logic – The Past, Present and the Future by Reshma M M
Particle Swarm Optimization by Kavya V J
Cover Page and Layout: Gayathri E S, Reshma M M, Swedha B
Dear Readers,

Greetings!
This edition of CLEAR is a special issue consisting of articles covering the major research done in the area of Soft Computing, which, unlike conventional hard computing, is tolerant of imprecision, uncertainty, partial truth and approximation. The scope of this issue covers the Soft Computing techniques of Neural Networks, Fuzzy Logic, Genetic Algorithms and Particle Swarm Optimization. The good response from our readers reflects the wider acceptability and growing awareness of language processing among the technical community, which is definitely gratifying for the entire CLEAR team. On this positive note, I proudly present this Special Issue of CLEAR to the readers, for critical evaluation and feedback.

Best Regards,
Dr. Ajeesh Ramanujan
(Chief Editor)
WORKSHOP ON CLOUD COMPUTING AND BIG DATA ANALYTICS
The four-day workshop was held at GEC Sreekrishnapuram from February 29 to March 3rd, 2016. Eminent personalities from NIT Surathkal, GCE Kannur and MESCE Kuttippuram gave sessions on Cloud Computing Issues and Challenges, Green Cloud Computing, Mobile Computing, Cloud Service Development, Virtualization and Virtual Machines, Big Data Analytics using Hadoop, Security and CloudSim. On the first day of the workshop there was a session on Introduction to Cloud Computing & Issues and Challenges by Dr. Chandrasekaran K, NIT Surathkal. The next session discussed Green Cloud Computing and Mobile Cloud Computing; it was an interesting session. The second day session mainly focused on Introduction to Virtualization and Virtual Machines by Mr. Lino Abraham Varghese, MESCE, Kuttippuram. The afternoon session was handled by Mr. Sreejith V P, GCE Kannur, about Cloud Service Development, followed by hands-on training on communication between VMs. The third day session was on Big Data Analytics using Hadoop by Mr. Sreejith V P, GCE Kannur, and the next session was training on setting up a Hadoop single-node cluster by Mr. Lino Abraham Varghese. The last day dealt with Cloud Security and CloudSim by Edet Bijoy K from MESCE, Kuttippuram.
WORKSHOP ON SEMANTIC WEB
A one-day workshop was held at GEC Sreekrishnapuram on February 15th, 2016. The session was handled by Sangeetha, Research Scholar from IIT Madras. In the forenoon session she explained ontology and the working of Protégé 5. She claims that through this work we can replace the existing database, in which we need to formulate queries for each instance, since Protégé has the ability to map automatically from the already given information. The afternoon session ended with a familiarisation of the Protégé 5 toolkit. It was an interesting and lively session.
WORKSHOP ON SOFT COMPUTING TECHNIQUES
The five-day workshop on Soft Computing techniques was held at GEC Sreekrishnapuram from 8-12th February 2016. Eminent personalities from NIT Trichy and CET gave sessions on the topics: Genetic Algorithms, Neural Networks, Fuzzy Logic and Particle Swarm Optimization (PSO). On the first day of the workshop there was a session on Introduction to Soft Computing and ANN by Mr. Sishaj P Simon, Assistant Professor, NIT Trichy, followed by a MATLAB familiarization session. The second day session discussed PSO and Genetic Algorithms and was handled by Mr. U Sreenivasulu Reddy, Assistant Professor, NIT Trichy; it was an interesting session. The third day session mainly focused on Fuzzy Logic, by Mr. Vipin Vasu M, Assistant Professor, CET. On the fourth and fifth days the session was on Soft Computing techniques using MATLAB by R. Thamaraiselvan, Senior Analyst, Infosys, Chennai. The session was highly beneficial to staff and students, giving them an exposure to MATLAB.
PUBLICATIONS
"Sentiment Analysis for Malayalam Movie Reviews using YamCha and Fuzzy Logic", Dhanaraj V, Reshma R, Sreetha S and Binu R, published in the NCILC 2016 conference held at CUSAT.

"A New Way of Topic Modelling using MALLET for Current Job Trends", Bhavya V R, Athira M, Soorya K and Ajeesh Ramanujan, published in nCORETech-2016 (National Conference on Recent Advances in Engineering & Technology) held at LBS College, Kasaragod.

"Ramanujan Sum based Image Kernels for Computer Vision", Krishnaprasad P and Ajeesh Ramanujan, published in ICEEOT 2016.

"Novelty Detection using Ramanujan Kernel", Krishnaprasad P and Ajeesh Ramanujan, accepted at the International Conference on Data Mining and Big Data 2016.

"A Rule Based Approach for Malayalam-English Translation", Suneera C M and Kala M T, accepted at ICCEECON 2K16 (International Conference on Technical Challenges in Instrumentation, Computer Science, Civil, Mechanical, Electronics and Electrical Engineering).
SIMPLE group congratulates all the authors for their achievement!!
The Much Harder Soft Computing Gayathri E S M Tech, Computational Linguistics GEC, Sreekrishnapuram gayathryes@gmail.com
Soft computing is not as soft as it sounds; the impact it can make in the world is hard indeed. What makes it so revolutionary in the present scenario is a relevant topic of discussion. Soft computing is likely to play an important role in science and engineering in the future. Its successful applications and rapid growth suggest that its impact will be felt increasingly in the coming years. Soft computing encourages the integration of soft computing techniques and tools into both everyday and advanced applications. In computer science, soft computing is the use of inexact solutions to computationally hard tasks, such as the solution of NP-complete problems, for which there is no known algorithm that can compute an exact solution in polynomial time. Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind. A system or method can survive among fast-growing modern technologies only if it can be applied to get better results in real-time applications. So this is the
point which makes soft computing a highly sought-after technology. There is a wide range of real-time application areas concerned with soft computing, including data analysis and data mining, optimization, fault diagnosis, control, pattern recognition, signal processing, image and video processing, traffic and transportation systems, parameter estimation, system identification, robust solutions, adaptive systems, self-organization and failure analysis, and multi-objective optimization. Employing various soft computing techniques can lead to better business decision-making, and the same holds in many other fields such as engineering, technology and public services. The idea behind soft computing is to model the cognitive behaviour of the human mind, and it is a foundation of conceptual intelligence in machines. The principal constituents of Soft Computing (SC) are Fuzzy Logic (FL), Neural Computing (NC), Evolutionary Computation (EC), Machine Learning (ML) and Probabilistic Reasoning (PR), with the latter subsuming belief networks, chaos theory and parts of learning theory. Soft computing solutions are unpredictable, uncertain, and lie between 0 and
one. Soft computing became a formal area of study in computer science in the early 1990s. Earlier computational approaches could model and precisely analyze only relatively simple systems. More complex systems arising in biology, medicine, the humanities, management sciences, and similar fields often remained intractable to conventional mathematical and analytical methods. However, it should be pointed out that the simplicity and complexity of systems are relative, and many conventional mathematical models have been both challenging and very productive. Soft computing deals with imprecision, uncertainty, partial truth, and approximation to achieve practicability, robustness and low solution cost. As such, it forms the basis of a considerable number of machine learning techniques. Recent trends tend to involve evolutionary and swarm intelligence based algorithms and bio-inspired computation. Generally speaking, soft computing techniques resemble biological processes more closely than traditional techniques, which are largely based on formal logical systems, such as sentential logic and predicate logic, or rely heavily on computer-aided numerical analysis. Soft computing techniques are intended to complement each other. Unlike hard computing schemes, which strive for exactness and full truth, soft computing techniques exploit the given tolerance of imprecision, partial truth, and uncertainty for a particular problem.
Another common contrast comes from the observation that inductive reasoning plays a larger role in soft computing than in hard computing. It used to be obvious that the world was designed by some sort of intelligence; what else could account for fire and rain and lightning and earthquakes? Above all, there is the human mind, which works in twisted and perverse ways. So the idea of modelling the cognitive behaviour of the human mind in machines is a much harder task. But remember, harder work can give better results.
Twitter gives access to Politicians’ Deleted tweets again…
The micro-blogging site Twitter has reversed its controversial decision to ban popular tool Politwoops, which archived politicians' deleted social media updates to bring more transparency to public dialogue. Politwoops is the only comprehensive collection of deleted tweets by politicians that offers a window into what they hoped you did not see. New CEO Jack Dorsey in October last year spoke about their responsibilities to developers and users in the areas of government transparency.

Visit: http://www.bgr.in/news/twitter-gives-access-to-politicians-deleted-tweets-again/
Fascinating world of Neural Networks Swedha B M Tech, Computational Linguistics GEC, Sreekrishnapuram swedhathrisala93@gmail.com
Recently, in December 2015, I came to read an article in "The Times of India" related to neural networks. It was about a brain-like computer chip developed by Chinese scientists from China's Zhejiang province: a computer chip that works like the brain. They made a striking claim that "it can perform intelligent computer tasks by simulating a human brain's neural networks". This prompted me to read much more about it, and I was redirected to the area of Artificial Neural Networks. So, through this article my aim is to introduce Artificial Neural Networks and the present state of the field. The basic definition of an artificial neural network (ANN), or just neural network (NN), is an interconnected group of artificial neurons that uses a mathematical or computational model for information processing based on a connectionist approach to computation. The model of the artificial neural network was inspired by the biological neural network.
Research in the field of Artificial Neural Networks began in 1943, the first work being proposed by neurophysiologist Warren McCulloch and mathematician Walter Pitts, who wrote a paper on how neurons might work. In order to describe how neurons in the brain might work, they also modelled a simple neural network using electrical circuits. In 1949, Donald Hebb wrote The Organization of Behavior, which pointed out that neural pathways are strengthened each time they are used, a concept fundamentally essential to the ways in which humans learn. We learn something repeatedly and try to remember it; thus, by continuously training or learning, performance can be improved. Similarly, Hebb argued that if two nerves fire at the same time, the connection between them is enhanced [3]. As computers became more advanced in the 1950s, it finally became possible to simulate a hypothetical neural network. The first step towards this was made by Nathanial Rochester from the IBM research
laboratories. Unfortunately for him, the first attempt to do so failed. In 1959, Bernard Widrow and Marcian Hoff of Stanford developed models named "ADALINE" and "MADALINE". MADALINE was the first neural network applied to a real-world problem. In 1962, Widrow and Hoff developed a learning procedure based on the idea that when one of the active perceptrons has a big error, one can adjust the weight values to distribute it across the network, or at least to adjacent perceptrons. Kohonen and Anderson developed a similar network independently of one another in 1972; they both used matrix mathematics to describe their ideas, without realizing that what they were describing was an array of analog ADALINE circuits. The first multilayered network was developed in 1975, and it was an unsupervised network. In 1982, interest in the field was renewed. John Hopfield of Caltech presented a paper to the National Academy of Sciences; his approach was to create more useful machines by using bidirectional connections, where previously the connections between neurons had been one way only. That same year, Reilly and Cooper used a "Hybrid network" with multiple layers, each layer using a different problem-solving strategy. In 1986, with multiple layered neural networks in the news, and given the Widrow-Hoff learning procedure mentioned earlier, discussions arose on how to extend the Widrow-Hoff rule to multiple layers. Three independent groups of researchers, one of which included David
Rumelhart, a former member of Stanford's psychology department, came up with similar ideas, now called back-propagation networks because they distribute pattern recognition errors throughout the network. Hybrid networks use just two layers, while back-propagation networks use many. The result is that back-propagation networks are "slow learners", possibly requiring thousands of iterations to learn.
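To make the learning procedure above concrete, here is a minimal sketch of a Widrow-Hoff style (LMS / delta rule) update for a single ADALINE-like linear unit; the toy AND-gate data, learning rate and epoch count are illustrative assumptions, not details from any of the systems mentioned.

```python
# Minimal Widrow-Hoff (LMS / delta rule) sketch for one linear unit:
# each example's error is used to nudge the weights, distributing the
# error across the connections as described above. Illustrative only.

def train_adaline(samples, targets, lr=0.1, epochs=20):
    w = [0.0] * len(samples[0])    # one weight per input
    b = 0.0                        # bias term
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            y = sum(wi * xi for wi, xi in zip(w, x)) + b      # linear output
            err = t - y                                       # error on this example
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]  # delta rule update
            b += lr * err
    return w, b

# Toy usage: learn the logical AND of two inputs (targets -1 / +1).
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [-1, -1, -1, 1]
print(train_adaline(X, T))
```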
Fig1. Knightscope at Plug and Play Expo 55
Now, neural networks are used in several applications. From the troubled early years of developing neural networks to the unbelievable advances in the field, neural networks have been a fascinating source of intellectual enjoyment for computer scientists around the world. Support Vector Machines and other, much simpler methods such as linear
classifiers gradually overtook neural networks in machine learning popularity, but the advent of deep learning in the late 2000s sparked renewed interest in neural nets. The advancement of Deep Learning (DL) and Neural Networks (NN) is currently driving the most ingenious inventions of our time. Their incredible ability to learn from data and environment makes them the first choice of machine learning scientists, and they lie at the heart of products such as self-driving cars, image recognition software and recommender systems. People think the neural network is an extremely difficult topic to learn; therefore, either some of them don't use it, or the ones who do use it as a black box. Deep learning was a stepping stone to the development of robots, and many news articles give information about the advancements in this particular area. One among them is "Chinese channel hires Microsoft's robot as weather reporter": this robot's voice is much closer to a human voice than any other speech synthesizer. It is named Xiaoice, and it is actually a piece of software developed by Microsoft using smart cloud and big data, drawing on concepts from emotional computing technology that help it incorporate feelings and comments based on weather data. Chinese social and gaming giant Tencent published its first business report written by a robot.
Robots are starting to appear everywhere: driving cars, cooking dinners, and even as robotic pets. Another related achievement is in the area of crime: a fully autonomous robot named Knightscope (the K5 beta prototype) is used to predict crime in schools and neighbourhoods [1]. Figure 1 shows the image of Knightscope at Plug and Play Expo 55. Robots have even been used to judge beauty contests [2]; recent advances in Deep Learning have made machine recognition of beauty aspects far better than ever before. Eventually, through this technology, machines may even learn to judge another machine's appearance, opening up the possible world of robot beauty contests. The fundamental idea behind the nature of neural networks is that if it works in nature, it must be able to work in computers. I hope that in future editions many articles regarding the developments in the field of Neural Networks will be added by the SIMPLE group to attract the readers. References: [1] http://techcrunch.com/2015/12/31/meet-knightscopes-crime-fighting-robots/
[2] http://techcrunch.com/2016/01/01/the-first-international-beauty-contest-judged-by-robots/
[3] http://cs.stanford.edu/people/eroberts/courses/soco/projects/2000-01/neural-networks/index.html
Darwin Evolution and Genetic Algorithms Deepthi M Tech, Computational Linguistics GEC, Sreekrishnapuram deepthisuthan@gmail.com
It was in 1858 that Darwin's theory of evolution by natural selection was proposed. Though highly controversial at the time, there remains little doubt about evolution and natural selection in today's world. Darwin suggested that "survival of the fittest" was the basis for organic evolution: the species with useful adaptations to the environment are more likely to survive and produce progeny than those with less useful adaptations. It is on these ideas of genetics and evolution that Genetic Algorithms are based. Genetic Algorithms (GAs) are a heuristic search and optimization technique that mimics the process of natural evolution. GAs are part of the field of evolutionary computing, a rapidly growing area of artificial intelligence. Search and optimization are the key terms that define the areas where GAs are applied. In solving a problem, some solutions might be better than others, and the search space of solutions can be quite large. GAs can do both tasks, going through the entire search space and finding the best solution quite efficiently, by mimicking the processes used in nature, i.e., selection, crossover and mutation. The computational analogy of these concepts has quite a similar meaning
to its biological counterparts, though they vary in the entities they act upon. A genetic algorithm begins with a set of solutions (represented by chromosomes in the biological setting) called the population. The solutions in a GA are represented as strings using suitable encoding schemes. Solutions from one population are taken and used to form a new population, motivated by the possibility that the new population will be better than the old one. Solutions are selected according to their fitness to form new solutions (offspring); the more suitable they are, the more chances they have to reproduce. Selection (survival of the fittest) is enforced by this fitness function. Crossover combines two parents to produce an offspring; this new string might be better than the parents. Mutation is a genetic operator used to maintain genetic diversity from one generation of a population of chromosomes to the next. Mutation is an important part of the genetic search, and helps to prevent the population from stagnating at any local optima. One simple mutation operator is flip bit, which simply inverts a bit of the chosen string, i.e., 0 goes to 1 or 1 goes to 0. Mutation occurs during evolution according to a user-definable mutation probability, usually set to a fairly low value, with 0.01 a good first choice. These processes are repeated from one generation to the next, until a best solution or an end condition has been satisfied.
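As a concrete illustration of this cycle, the sketch below evolves binary strings using fitness-proportionate selection, single-point crossover and flip-bit mutation, with the mutation rate set to the 0.01 suggested above; the toy fitness function (counting 1-bits), population size and generation count are illustrative assumptions.

```python
import random

# Minimal genetic algorithm sketch: binary-string chromosomes,
# roulette-wheel selection, single-point crossover, flip-bit mutation.

def fitness(chrom):
    return sum(chrom)                        # toy objective: maximise 1-bits

def select(pop):
    # Fitness-proportionate (roulette-wheel) selection
    total = sum(fitness(c) for c in pop)
    if total == 0:
        return random.choice(pop)
    r = random.uniform(0, total)
    acc = 0.0
    for c in pop:
        acc += fitness(c)
        if acc >= r:
            return c
    return pop[-1]

def crossover(p1, p2):
    point = random.randrange(1, len(p1))     # single-point crossover
    return p1[:point] + p2[point:]

def mutate(chrom, rate=0.01):
    # Flip-bit mutation: 0 -> 1 or 1 -> 0 with low probability
    return [bit ^ 1 if random.random() < rate else bit for bit in chrom]

def evolve(pop_size=30, length=20, generations=50):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(select(pop), select(pop))) for _ in range(pop_size)]
    return max(pop, key=fitness)             # best individual found

print(evolve())
```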
The application areas of genetic algorithms are vast, ranging from automotive design to computational creativity. Engineering design is one such field, where GAs are used to get the most out of a range of materials to optimize the structural and operational design of buildings, factories, machines, etc. To focus on a more specific application, an evolved antenna used in radio communications is designed fully or substantially by an automatic computer design program that uses an evolutionary algorithm such as a genetic algorithm. This sophisticated procedure has been used in recent years to design antennas for mission-critical applications involving stringent, conflicting, or unusual design requirements, such as unusual radiation patterns, for which none of the many existing antenna types are adequate. The computer program starts with simple antenna shapes, then adds or modifies elements in a semi-random manner to create a number of new candidate antenna shapes. These are then evaluated to determine how well they fulfil the design requirements, and a numerical score is computed for each. Selection is done at this stage, and a portion of the antennas with the worst scores are discarded, leaving behind a group of high-scoring antennas. Using these antennas, the computer repeats the procedure, generating a population of even higher-scoring designs. After a number of iterations, the population of antennas is evaluated and the highest-scoring design is
chosen. The resulting antenna often outperforms the best manual designs, because it has a complicated asymmetric shape that could not have been found with traditional manual design methods. An example of an evolved antenna is an X-band antenna evolved for a 2006 NASA mission called Space Technology 5 (ST5). The detailed procedure, with the initial designs and their performance evaluations, is published in the paper Automated Antenna Design with Evolutionary Algorithms by Gregory S. Hornby, Al Globus, Derek S. Linden and Jason D. Lohn. The ST5 mission successfully launched on March 22, 2006, so this evolved antenna represents the world's first artificially evolved object to fly in space. Similar methods for military antenna design using simple and competent genetic algorithms have also been developed and used.
Fig1: The 2006 NASA ST5 spacecraft antenna. This complicated shape was found by an evolutionary computer design program to create the best radiation pattern.
Another interesting area of application of Genetic Algorithms that is closely related to Linguistics is Joke and Pun Generation. Among the linguistic applications of GAs 13
- including a STANDUP program, inspired by JAPE (an automated pun generator), designed to support communication strategies for people working with children who suffer from communication disabilities - are GAs that search for jokes and puns. These come under the category of artificial creativity and AI. Such GAs let you input a word describing the subject you wish to joke about and return a variety of solutions. As is evident, this clever use of GAs could prove quite useful to wannabe punsters and stand-up comedians. Biomimetic invention, trip, traffic and shipment routing, computer gaming, encryption and code breaking, computer-aided molecular design, gene expression profiling, marketing and merchandising: the list goes on and on. Basically, any problem that deals with modelling and optimization and has a credible fitness function could be solved using a GA. But most readers might find the application of GAs to natural language processing more significant. Natural Language Processing is concerned with the interactions between computers and human (natural) languages. Since language evolves, so does the study of it. In the paper Evolutionary Algorithms in Natural Language Processing, Lars Bungum and Bjorn Gamback investigate how evolutionary computation has been employed in NLP, ranging from efforts to induce grammars to models of language development through parameter optimization and search. Statistical natural language processing (NLP) and evolutionary algorithms (EAs) are two
very active areas of research which have been combined many times. In general, statistical models applied to NLP tasks require designing specific algorithms to be trained and applied to process new texts, and the development of such algorithms may be hard. This makes EAs attractive, since they offer a general design while providing high performance in particular conditions of application. EAs do not guarantee reaching the optimum solution, only an approximation whose quality depends on the time and memory devoted to solving the problem. However, the problems in statistical NLP are tackled assuming a statistical model, which is itself an approximation to the real data; therefore, it makes sense to save resources by accepting a high-quality approximation. The paper How evolutionary algorithms are applied to statistical natural language processing by Lourdes Araujo collects some of these applications of EAs and neural networks to NLP tasks. Grammar induction, also known as grammatical inference, is the process in which a system generates a grammar given corpora. Given the complexity of natural language and the availability of large bodies of text, grammar induction is and will continue to be an important part of Natural Language Processing and Machine Learning. Grammars are represented by the genetic algorithm as a finite string of non-terminals and pre-terminals corresponding to the rules of a context-free grammar in Chomsky Normal Form. Fitness is based on the
number of sentences correctly parsed by the grammar from a selection of language examples, inversely related to the number of random sentences parsed by the grammar, and discounted by the length of the string representing the grammar (a sketch of this fitness function follows below). Evolutionary approaches like these can produce results with high precision and recall, although such grammars are dissimilar to the grammars designed by humans. The very concept of computer algorithms being based on the evolution of organisms can be quite intriguing; even more so is their extensive use in many laborious problem areas. GAs are very helpful when the developer does not have precise domain expertise, because GAs possess the ability to explore and learn from their domain. The usefulness and gracefulness of genetic algorithms is what gives them an edge over other traditional methods.
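A minimal sketch of the grammar-induction fitness described above. The parses predicate is a caller-supplied stand-in for a real chart parser over the CNF rules (e.g. CKY), and the discount weight alpha is an assumed constant; this shows only the shape of the scoring, not a working induction system.

```python
# Sketch of the grammar-induction fitness described above. `parses` is a
# hypothetical predicate standing in for a CNF chart parser such as CKY;
# `alpha` is an assumed discount weight on grammar length.

def grammar_fitness(grammar, parses, examples, random_sentences, alpha=0.01):
    parsed_good = sum(1 for s in examples if parses(grammar, s))
    parsed_random = sum(1 for s in random_sentences if parses(grammar, s))
    # Reward coverage of real sentences, stay inversely related to the
    # number of random strings parsed, and discount by grammar length.
    return parsed_good / (1.0 + parsed_random) - alpha * len(grammar)

# Toy usage with a one-rule "grammar" and a trivial parse predicate.
toy_grammar = "S->ab"
toy_parses = lambda g, s: s == "ab"
print(grammar_fitness(toy_grammar, toy_parses, ["ab", "ba"], ["xx", "ab"]))
```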
References:
[1] "Evolved antenna," Wikipedia, the free encyclopaedia, 03-Jul-2015.
[2] "15 Real-World Applications of Genetic Algorithms." [Online]. Available: http://brainz.org/15-real-world-applications-genetic-algorithms/. [Accessed: 21-Mar-2016].
[3] Gregory S. Hornby, Al Globus, Derek S. Linden and Jason D. Lohn, "Automated Antenna Design with Evolutionary Algorithms," American Institute of Aeronautics and Astronautics.
[4] Lars Bungum and Bjorn Gamback, "Evolutionary Algorithms in Natural Language Processing," Norwegian Artificial Intelligence Symposium, Gjovik, 22 November 2010.
[5] Lourdes Araujo, "How evolutionary algorithms are applied to statistical natural language processing," Artif Intell Rev (2007).
[6] Margaret Aycinena, Mykel J. Kochenderfer and David Carl Mulford, "An Evolutionary Approach to Natural Language Grammar Induction".
MSI's new Tobii gaming laptop lets you play games with your eyes

The latest gaming laptop from MSI turns your eyes into a controller. At CES in Las Vegas, the company announced a slate of new gaming-focused devices, ranging from thin and light notebooks to powerful all-in-one desktops, but the most interesting is the GT72S Tobii, the first PC to integrate Tobii's eye-tracking technology. MSI hasn't revealed any specs for the machine, but the new laptop will include both eye-tracking and facial recognition support. Last year Assassin's Creed Rogue became the first blockbuster game to utilize the tech, letting you control the in-game camera by looking around. Since then a number of other games have adopted it, including DayZ and spaceship sim Elite: Dangerous. Ubisoft's upcoming open-world shooter The Division will also support the eye-tracking technology in some form.
Visit: http://www.theverge.com/2016/1/6/10710110/msi-tobii-gaming-laptop-eye-tracking-gt72s-ces-2016
FUZZY LOGIC – THE PAST, PRESENT AND FUTURE Reshma M M M Tech, Computational Linguistics GEC, Sreekrishnapuram reshma12393@gmail.com
Fuzzy logic is a powerful tool in the technology world. It has been applied to many fields, from control theory to artificial intelligence (AI). Simply put, fuzzy logic is a problem-solving control system methodology used in many data acquisition and control systems. It can be implemented in hardware, software, or a combination of both. FL helps us to reach a definite conclusion from a set of noisy, imprecise, ambiguous or missing inputs. In general, fuzzy logic provides an inference structure that enables appropriate human reasoning capabilities. The utility of fuzzy sets lies in their ability to model uncertain and ambiguous data and to provide suitable decisions, as in Fig. 1.
Fig1: A Fuzzy Logic System accepting imprecise data and providing a decision.
Fuzzy logic, introduced in the year 1965 by Lotfi Zadeh (Professor at the University of California, Berkeley), is a mathematical tool for dealing with uncertainty. It is a form of many-valued logic in which the truth values of variables may be any real number between 0 and 1 (considered to be "fuzzy"), where 0.0 represents absolute falseness and 1.0 represents absolute truth. Zadeh presented FL not as a control methodology but as a way of processing data by allowing partial set membership rather than crisp set membership or non-membership. The main applications of fuzzy logic are in control systems. The Japanese were the first to utilize fuzzy logic for practical applications. The first notable application was on the high-speed train in Sendai, in which fuzzy logic was able to improve the economy, comfort, and precision of the ride. It has also been used in recognition of hand written symbols in Sony pocket computers, flight aid for helicopters, controlling of subway systems in order to improve driving comfort, precision of halting, and power economy, improved fuel consumption for automobiles, single-button control for washing machines, automatic 16
motor control for vacuum cleaners with recognition of surface condition and degree of soiling, and prediction systems for early recognition of earthquakes through the Institute of Seismology, Bureau of Meteorology, Japan. Principal applications of fuzzy logic are in control systems, consumer products, decision analysis, pattern recognition and robotics. Recently, FL has become important in applications like natural language processing, financial engineering, and legal reasoning. Many natural phenomena are complex and cannot be modelled using one-dimensional classes and/or one-dimensional variables. For example, in pattern recognition, objects can be represented by a set of measurements and are regarded as vectors in a multidimensional space. Often, it is not practical to assume that this multidimensional information can be represented via a simple combination of variables and operators on one-dimensional clauses. In such cases fuzzy logic is very important: these complex cases can be handled by fuzzy logic systems.
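Before turning to applications, here is a minimal sketch of the machinery described above: triangular membership functions and two Mamdani-style if-then rules combined with min (AND) and a weighted-average defuzzification. The linguistic variables (temperature, humidity, fan speed) and all breakpoints are invented for illustration.

```python
# Minimal fuzzy-logic sketch: triangular memberships plus two
# "if-then" rules evaluated with min (AND); all values illustrative.

def tri(x, a, b, c):
    """Triangular membership: 0 outside [a, c], rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fan_speed(temp, humidity):
    # Fuzzify the crisp inputs into membership degrees in [0, 1]
    hot = tri(temp, 25.0, 35.0, 45.0)
    warm = tri(temp, 15.0, 25.0, 35.0)
    humid = tri(humidity, 50.0, 75.0, 100.0)

    fast = min(hot, humid)   # Rule 1: IF temp is hot AND air is humid THEN fan is fast
    medium = warm            # Rule 2: IF temp is warm THEN fan is medium

    # Defuzzify with a weighted average (fast ~ 100% speed, medium ~ 50%)
    total = fast + medium
    return 0.0 if total == 0 else (100.0 * fast + 50.0 * medium) / total

print(fan_speed(32.0, 80.0))   # a warm-to-hot, humid day
```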
EMERGING APPLICATIONS OF FUZZY LOGIC

I. Using Fuzzy Logic for Risk Assessment:
Risk is commonly associated with an unexpected event that results in large negative consequences. Fuzzy logic can be used for operational risk assessment, hazard or disaster risk assessment, health
risk assessment, financial risk assessment and strategic risk assessment.

II. Fuzzy logic in the multi-agent financial decision support system:

Financial decisions are made under conditions of risk and uncertainty that influence their level of performance. These decisions are usually supported by decision support systems and various computational models, among them multi-agent systems which use various methods based on mathematics, statistics, finance or artificial intelligence. Using fuzzy logic as the agents' knowledge representation allows the trading decision to be closer to real experts' decisions (made under conditions of risk and uncertainty), which are also taken with a certain level of probability. A fuzzy logic system allows expressing the agent's knowledge in a flexible, intuitive way.
III. Forecasting:
FL can be used as a method to calculate the output value for input data that lies outside the initially predicted scope. FL is a powerful tool in computing the probability level of decisions. It can be used in the implementation of different automated learning algorithms.
IV. A Fuzzy Logic Model to Predict the Bioleaching Efficiency of Copper Concentrates in Stirred Tank Reactors:
Multiplicity of the chemical, biological, electrochemical and operational variables 17
and the nonlinear behaviour of metal extraction in bioleaching environments complicate the mathematical modelling of these systems. To predict copper and iron recovery from a copper flotation concentrate in a stirred tank bioreactor, a fuzzy logic model can be used. The input variables used are the method of operation (bioleaching or electrobioleaching), the type of bacteria used for the experiment, and time (days), and the outputs are the recoveries of copper and iron. A relationship was developed between the stated inputs and the outputs by means of "if-then" rules.
The resulting fuzzy model showed a satisfactory prediction of the copper and iron extraction. The results of this study suggested that fuzzy logic provides a powerful and reliable tool for predicting the nonlinear and time-variant bioleaching processes.
V. Using 'fuzzy logic' to optimize hybrid solar/battery systems:
A group of researchers in Tunisia and Algeria show how fuzzy logic has helped them create an ideal photovoltaic system that obeys the supply-and-demand principle and its delicate balance. They described a new sizing system for a solar array and a battery in a standalone photovoltaic system. The fuzzy logic system accepts the consumed energy and the monthly average of daily solar radiation as inputs and then outputs the photovoltaic panel surface and the battery capacity. The method applies Matlab/Simulink interfaces, which aren't complicated compared with other forms of simulation and model-based design; in fact, it's possible to build a graphic integrated user interface to facilitate the usage of the proposed system. The photovoltaic/battery sizing system provides the capacity of the batteries used in the hybrid system and determines the surface of the PVP to be used. Its primary significance lies in its simplicity of usage and its efficiency in optimizing cost and losses. The system may find use in a number of areas, including domestic dwellings, public buildings and industrial electrical settings, as well as for agricultural water pumping in the field.
Fig2: The fuzzy logic algorithm, which reads the consumed energy and the monthly average of daily solar radiation, and gives as output the PVP surface and the battery capacity.
The researchers are planning an extension of the present system, such as appending a wind or fuel cell energy source. The system can also be improved by adding an electrolyser to permit it to convert lost photovoltaic solar energy into hydrogen that can be stored in a special tank and then used elsewhere.
Fuzzy logic is an emerging technology which has numerous applications. It is already used in many industrial applications, and the technology is still growing: researchers are trying to find new dimensions of it, and it can be used to realize many new and innovative ideas.

References:
[1] Fuzzy logic - Wikipedia, the free encyclopaedia.
[2] [Book] Principles of Soft Computing by Dr. S. N. Sivanandam and S. N. Deepa.
[3] "Using Fuzzy Logic for Risk Assessment," Arnold F. Shapiro (Warren Center for Actuarial Studies & Research, University of Manitoba, Winnipeg, Manitoba, Canada) and Marie-Claire Koissi (Department of Mathematics, University of Wisconsin-Eau Claire, WI), published by Casualty Actuarial Society, Canadian Institute of Actuaries, Society of Actuaries, 2015.
[4] "Fuzzy logic in the multi-agent financial decision support system," Jerzy Korczak, Marcin Hernes and Maciej Bac, Wrocław University of Economics, Proceedings of the Federated Conference on Computer Science and Information Systems, pp. 1367-1376, 2015.
[5] "A Fuzzy Logic Model to Predict the Bioleaching Efficiency of Copper Concentrates in Stirred Tank Reactors," Ali Ahmadi and Mohammad Raouf Hosseini (Department of Mining Engineering, Isfahan University of Technology, Isfahan, Iran), International Journal of Nonferrous Metallurgy, Vol. 04, No. 01, 2015.
[6] "New optimally technical sizing procedure of domestic PVP/Battery system," Chokri Ben Salah, Kheireddine Lamamra and Anis Fatnassi, Journal of Renewable and Sustainable Energy, 2015.
Mark Zuckerberg's 2016 challenge: To build an AI for his home

Facebook CEO Mark Zuckerberg has set himself the task of building a personal Artificial Intelligence system to run his home. Zuckerberg hopes his AI will be like Jarvis in the Iron Man series.
In a Facebook post Zuckerberg wrote, "Every year, I take on a personal challenge to learn new things and grow outside my work at Facebook. My challenges in recent years have been to read two books every month, learn Mandarin and meet a new person every day. My personal challenge for 2016 is to build a simple AI to run my home and help me with my work. You can think of it kind of like Jarvis in Iron Man." Zuckerberg adds that he will begin by exploring the technology that is already present, and then will teach the AI his own voice to control pretty much everything in his house, from music to lights to temperature.
Visit: http://indianexpress.com/article/technology/social/mark-zuckerbergs-2016-challenge-to-build-an-ai-for-his-home/
Particle Swarm Optimization and its Applications Kavya V J M Tech, Computational Linguistics GEC, Sreekrishnapuram kavyavj91@gmail.com
Particle swarm optimization (PSO) is a population-based stochastic optimization technique developed by Dr. Eberhart and Dr. Kennedy in 1995, inspired by the social behaviour of bird flocking and fish schooling. PSO shares many similarities with evolutionary computation techniques such as Genetic Algorithms (GAs). In the past several years, PSO has been successfully applied in many research and application areas, and it has been demonstrated that PSO gets better results in a faster, cheaper way compared with other methods. In PSO, a collection of particles (called a "swarm") moves around in the search space looking for the best solution to an optimization problem. Each particle has its own velocity that drives the direction it moves in. This velocity is affected by both the position of the particle with the best fitness and each particle's own best fitness. Fitness refers to how well a particle performs: in a flock of birds this might be how close a bird is to a food source; in an optimization algorithm it refers to the proximity of the particle to the optima.
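A minimal sketch of the update loop just described, in which each particle blends its inertia with pulls toward its personal best and the swarm's global best; the sphere objective and the coefficients (inertia w, cognitive c1, social c2) are conventional illustrative choices, not values from any cited application.

```python
import random

# Minimal particle swarm optimization sketch: every particle keeps a
# velocity and a personal best, and is also attracted to the global best.

def pso(objective, dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                  # personal bests
    gbest = min(pbest, key=objective)            # global best so far
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=objective)
    return gbest

# Toy usage: minimise the sphere function, whose optimum is the origin.
sphere = lambda p: sum(x * x for x in p)
print(pso(sphere))
```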
Particle swarm optimization has been used for approaches that work across a wide range of applications, as well as for specific applications focused on a particular requirement. Here are various applications of PSO.
I. Applications of PSO
A. Identifying Shoplifters

Using today's technology, the art of identifying shoplifters can be converted into a science. Particularly within stores and common shopping areas, shoplifters today are using newer techniques to be as invisible as possible: while in the act, they create distractions by working as a flash mob. By using Swarm Intelligence (SI) algorithms such as PSO, the Bees Algorithm, Ant Colony Optimization, the Bat Algorithm, etc., retailers can create simulations to understand potential threats based on how merchandise is placed in the store. PSO is a good algorithm to apply to the flash mob scenario, as it helps retailers understand the behaviour of each individual (particle), and ideally of the leader coordinating the flash mob, who is browsing at a store. The store
format may have multiple levels (n dimensions) with various departments for different categories and merchandise, and the flash mobs organize themselves based on the information they gather moving through the store.

B. Image Processing

In image compression, PSO can be used to speed up the search for a near-best match block for a given block to be encoded. PSO-based Fractal Image Compression (FIC) shows that PSO can efficiently find suitable domain blocks while preserving the retrieved image quality compared to full-search FIC. Full-search FIC can exactly find the best domain block corresponding to each range block, but it is very time consuming; PSO provides a faster way to encode the range blocks. Digital image segmentation is one of the major tasks in digital image processing: the process of subdividing a digital image into its constituent objects. PSO can also be used for image segmentation: it can search for cluster centres in an arbitrary data set automatically, without any prior knowledge of the number of naturally occurring regions in the data, with applications to image segmentation.

C. Robot Path Planning

PSO has been demonstrated to be a useful technique for robot path planning in dynamic environments with mobile obstacles and multiple goals, as a feasible approach for self-organized control of a robot to avoid obstacles throughout its trajectory. It works in cluttered and changing environments with
moving obstacles.
D. Vehicle Routing Problem

The Vehicle Routing Problem is concerned with finding efficient routes of minimum total cost, beginning and ending at a central depot, for a fleet of vehicles serving a number of customers for some commodity, such that each customer is visited exactly once by one vehicle, while satisfying constraints such as capacity constraints, duration constraints and time window restrictions. PSO can be applied to this problem to seek the optimal solution, offering advantages in operational speed, robustness, population size and quality of optimization results, among others.

E. Stock Price Forecasting

Forecasting is the process of making assumptions about future changes based on existing data. The more accurate the forecasting, the more helpful it is for making decisions about the future. Improved PSO and support vector machines are used for efficient prediction of various stock indices.

F. Cloud Computing

Cloud computing has spread fast in the fields of computing, research and industry in the last few years. As part of the service offered, there are new possibilities to build applications and provide various services to the end user through virtualization over the internet. Task
scheduling is a most significant matter in cloud computing because the user has to pay for resource use on the basis of time, so the scheduler should distribute the load evenly among the system resources, maximizing utilization and reducing task execution time. A Dynamic Adaptive Particle Swarm Optimization has been proposed to address the PSO inertia weight trade-off, where a large inertia weight facilitates a global search while a small inertia weight facilitates a local search. This enhances the performance of the basic PSO algorithm, optimizing task runtime by minimizing the makespan of a particular task set while maximizing resource utilization.

G. Speech Enhancement

Speech enhancement is the process of modifying an audio signal so as to improve its speech quality. This is made particularly challenging by the fact that the final judge of the quality of speech is the human ear, which is subjective. This means that a modification that improves the speech quality of one specific signal may actually reduce the quality of another. Despite this, there are several general techniques, such as echo cancellation, signal separation, and noise reduction, that can improve the speech quality of a signal. PSO helps to improve upon the noise reduction technique known as two-channel noise reduction. In this method, we have the speech signal to be enhanced in one channel, and a second channel with only noise in it. The noisy channel is used to adaptively design a filter which will remove the noise,
and this filter is then applied to the channel which contains the speech. PSO can be used to improve upon the existing two-channel noise reduction technique by improving the speed at which the optimal noise reduction filter is found. In addition, PSO-based speech enhancement can provide more noise reduction than traditional methods relying on gradient-based algorithms, for example the Normalized Least Mean Squares (NLMS) approach. This is due to PSO's greater resilience to becoming trapped in local minima as compared to the gradient-based approaches.

H. Cryptography

In order to provide secure data communication in wireless sensor networks, cryptographic techniques are used. Particle swarm optimization can be used to identify the plain text from the cipher text. The optimized solution is attained by seeking the least error rate: among all the particles, the best error rate is computed by comparing the generated cipher text, and a candidate value of the original text is kept or replaced according to whether its error rate is lower or higher.
II. Conclusion
There are many more application areas where PSO is used, and the list is growing rapidly. Researchers in many countries are experimenting with particle swarms and applying them to real-world applications.
References:
[1] http://www.vocal.com/particle-swarm-optimization/
[2] http://risnews.edgl.com/retail-news/Improving-Loss-Prevention-by-Applying-Swarm-Intelligence104313
[3] K. Uma, P. Geetha, A. Kannan, K. Umanath, "Image Compression Using Optimization Techniques", International Journal of Engineering Research and Development, Vol 5, pp. 01-07, 2012.
[4] Anita Tandan, Rohit Raja, Yamini Chouhan, "Image Segmentation Based on Particle Swarm Optimization Technique", IJSETR, Vol 3, 2014.
[5] Md. Rakibul Islam, Md. Tajmiruzzaman, Md. Mahfuzul Haque Muftee, Md. Sanowar Hossain, "Autonomous Robot Path Planning Using Particle Swarm Optimization in Dynamic Environment with Mobile Obstacles & Multiple Target", ICMIEE, 2014.
Soft robots that react to patient’s mood
Imagine a health care robot that displays a patient's temperature and pulse, and even reacts to their mood. New super-elastic 'skin' could make it possible! Researchers at Cornell University in the US have developed an electroluminescent "skin" that stretches to more than six times its original size while still emitting light.
[6] Yang Peng, Ye-mei Qian, “A Particle swarm optimization to vehicle routing problem with fuzzy demands”, Journal of Convergence Information Technology, Vol 5, 2010. [7] M. Karazmodeh, S. Nasiri, and S. Majid Hashemi, “Stock Price Forecasting using Support Vector Machines and Improved Particle Swarm Optimization”, Journal of Automation and Control Engineering, Vol 1, 2013. [8] Ali Al-maamari, Fatma A. Omara, “Task Scheduling Using PSO Algorithm in Cloud Computing Environments”, International Journal of Grid Distribution Computing, Vol 8, 2015.
The discovery could lead to significant advances in health care, transportation, electronic communication and other areas. It has a lot of special features.

Visit: http://www.freepressjournal.in/soon-soft-robots-that-react-to-patients-mood/796017
[9] http://www.vocal.com/particle-swarm-optimization/speech-enhancement/
[10] Swapna B. Sasi, N. Sivanandam, “A Survey on Cryptography using Optimization algorithms in WSNs”, Indian Journal of Science and Technology, Vol 8, 2015.
M.Tech Computational Linguistics Dept. of Computer Science and Engg, Govt. Engg. College, Sreekrishnapuram Palakkad www.simplegroups.in simplequest.in@gmail.com
SIMPLE Groups Students Innovations in Morphology Phonology and Language Engineering
Article Invitation for CLEAR June 2016

We are inviting thought-provoking articles, interesting dialogues and healthy debates on multifaceted aspects of Computational Linguistics for the forthcoming issue of the CLEAR (Computational Linguistics in Engineering And Research) Journal, publishing in June 2016. The suggested areas of discussion are:
The articles may be sent to the Editor on or before 10th June, 2016 through the email simplequest.in@gmail.com. For more details visit: www.simplegroups.in

Editor, CLEAR Journal
Representative, SIMPLE Groups
Hello world,

Most real-world problems do not exist in an ideal setting; they are pervasively imprecise and uncertain, and so conventional techniques for finding their solutions often tend to be inadequate. Precision and certainty come at a cost. This is where Soft Computing finds its significance. Soft Computing is an emerging approach that takes the human mind as its role model. Unlike its conventional counterpart, hard computing, soft computing aims to exploit a tolerance for approximation, uncertainty, imprecision and partial truth to achieve a close resemblance to human-like decision making. Neural Networks, Fuzzy Logic and Genetic Algorithms are the methodologies that form the core of soft computing. This issue of CLEAR focuses on the research done in the field of soft computing. The articles have been written in the hope of giving an insight into this fast-emerging field. CLEAR always welcomes ideas that are refreshing yet purposive, and is thankful to everyone who helps realise this target. The SIMPLE group welcomes more aspirants to this area. Wish you all the best!!!

Deepthi
Department of Computer Science & Engineering Government Engineering College Sreekrishnapuram, Palakkad
www.gecskp.ac.in