What is the history of Hadoop?


It is a well-known fact that Hadoop was designed specifically to handle Big Data. Here we will look briefly at the history of Hadoop.

Everybody knows Google; it is probably the most popular search engine on the internet. To serve search results to its users, Google had to store enormous amounts of data. In the 1990s, Google started looking for ways to store and process such volumes of data. Finally, in 2003, it presented the world with an innovative Big Data storage concept called GFS, the Google File System: a technique for storing very large amounts of data. In 2004 it followed with another technique called MapReduce, a model for processing the data stored in GFS. It took Google years to come up with these concepts for storing and processing Big Data and to fine-tune them.

These techniques, however, were shared with the world only as descriptions in white papers. The world was given only an idea of what GFS is and how it would store data in theory, and of what MapReduce is and how it would process the data stored in GFS in theory. So people had knowledge of the


approach, but that was all; there was no working model or code on offer. Then, in 2006-07, another major search engine company, Yahoo, came up with implementations known as HDFS and MapReduce, based on the white papers published by Google. So, finally, HDFS and MapReduce are the two core ideas that make up Hadoop.

Hadoop was originally created by Doug Cutting. Those who know a little about Hadoop know that its logo is a yellow elephant, and many people wonder why Doug Cutting chose such a name and such an emblem for his project. There is a reason behind it: the elephant is symbolic in the sense that it stands for an answer to Big Data. "Hadoop" was in fact the name that Doug Cutting's young son had given to his favorite soft toy, a yellow elephant, and that is where both the name and the logo of the project come from. Thus, this is the brief story behind Hadoop and its name.

Yahoo had already built many such frameworks before, and they worked fairly well, but it seemed time to take a step back and reconsider what such a system might look like when designed from scratch. And from scratch they began: having looked at Apache Hadoop and thought it was too primitive, Eric14 and his team started writing the system


from scratch. Well funded, and staffed with strong engineers, they would likely have succeeded in building a "better" Hadoop, but it would have taken a great deal of time.

The base Apache Hadoop framework is composed of the following modules:

- Hadoop Common: the libraries and utilities needed by the other Hadoop modules;
- Hadoop Distributed File System (HDFS): a distributed file system that stores data on commodity machines, providing very high aggregate bandwidth across the cluster;
- Hadoop YARN: a resource-management platform responsible for managing compute resources in clusters and using them to schedule users' applications;
- Hadoop MapReduce: an implementation of the MapReduce programming model for large-scale data processing.

The term Hadoop has come to refer not just to the base modules above, but also to the ecosystem: the collection of additional software packages that can be installed on top of or alongside Hadoop, such as Apache Pig, Apache Hive, Apache HBase, Apache Phoenix, Apache Spark, Apache ZooKeeper, Cloudera Impala, Apache Flume, Apache Sqoop, Apache Oozie, and Apache Storm. Apache Hadoop's MapReduce and HDFS components


were inspired by Google's papers on MapReduce and the Google File System. The Hadoop framework itself is mostly written in the Java programming language, with some native code in C and its command-line utilities written as shell scripts. Though MapReduce Java code is most common, any programming language can be used with "Hadoop Streaming" to implement the "map" and "reduce" parts of a user's program. If you want to build your career in Oracle, you can take an Oracle certification course.
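To make the "map" and "reduce" idea concrete, here is a minimal word-count sketch in the Hadoop Streaming style: the mapper and reducer are ordinary programs that read lines and write "key<TAB>value" records, and the framework sorts the mapper output between the two phases. The function names and sample text here are illustrative; a real Streaming job would read from standard input and be launched with the hadoop-streaming jar.

```python
from itertools import groupby

def mapper(lines):
    # Map phase: emit one "word\t1" record per word.
    # In a real Streaming job these lines would come from sys.stdin.
    for line in lines:
        for word in line.strip().lower().split():
            yield f"{word}\t1"

def reducer(sorted_records):
    # Reduce phase: records arrive sorted by key (Hadoop's shuffle/sort),
    # so consecutive records with the same word can be summed with groupby.
    pairs = (record.split("\t") for record in sorted_records)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

# Simulate the framework locally: map, sort (the shuffle), then reduce.
text = ["hadoop stores big data", "mapreduce processes big data"]
word_counts = list(reducer(sorted(mapper(text))))
print(word_counts)
```

With Hadoop Streaming, the same two small programs would run unchanged across a cluster; only the plumbing (stdin/stdout and the distributed sort between the phases) is supplied by the framework.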

http://crbtech.in/Student-Reviews/Oracle-Reviews

