  1. Hadoop Tutorial

    This tutorial has been prepared for professionals aspiring to learn the basics of Big Data Analytics using Hadoop Framework and become a Hadoop Developer. Software Professionals, …

  2. Hadoop - MapReduce - Online Tutorials Library

    During a MapReduce job, Hadoop sends the Map and Reduce tasks to the appropriate servers in the cluster. The framework manages all the details of data-passing such as issuing tasks, …
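
    The map → shuffle → reduce data flow described in that snippet can be sketched, independent of Hadoop itself, as a minimal in-process word count in Python (the function names here are illustrative, not Hadoop API):

    ```python
    from collections import defaultdict

    def map_phase(text):
        # Emit (word, 1) pairs, like a Mapper's map() calls.
        for word in text.split():
            yield word.lower(), 1

    def shuffle(pairs):
        # Group values by key, as the framework does between map and reduce.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        # Sum the grouped counts per key, like a Reducer's reduce() calls.
        return {key: sum(values) for key, values in groups.items()}

    counts = reduce_phase(shuffle(map_phase("the quick fox the fox")))
    print(counts)  # {'the': 2, 'quick': 1, 'fox': 2}
    ```

    In a real Hadoop job the same three stages run distributed across the cluster, with the framework handling task placement and data movement.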

  3. HBase - Installation

    This chapter explains how HBase is installed and initially configured. Java and Hadoop are required to proceed with HBase, so you have to download and install Java and Hadoop in your …

  4. MapReduce - Hadoop Implementation

    MapReduce is a framework that is used for writing applications to process huge volumes of data on large clusters of commodity hardware in a reliable manner. This chapter takes you through …

  5. Big Data Hadoop: Hands-On Course for Beginners

    The world of Hadoop and "Big Data" can be intimidating: hundreds of different technologies with cryptic names make up the Hadoop ecosystem, but you'll go hands-on and learn how to use them …

  6. Sqoop Tutorial

    It is used to import data from relational databases such as MySQL and Oracle into Hadoop HDFS, and to export data from the Hadoop file system to relational databases. This is a brief tutorial that explains …

  7. Hadoop - Quick Guide - Online Tutorials Library

    All Hadoop commands are invoked by the $HADOOP_HOME/bin/hadoop command. Running the Hadoop script without any arguments prints the description for all commands.

  8. Hadoop - Introduction - Online Tutorials Library

    The Hadoop framework application works in an environment that provides distributed storage and computation across clusters of computers. Hadoop is designed to scale up from a single server …

  9. Hadoop - Environment Setup - Online Tutorials Library

    Before installing Hadoop in a Linux environment, we need to set up Linux using ssh (Secure Shell). Follow the steps given below to set up the Linux environment.

  10. Apache Flume Tutorial - Online Tutorials Library

    Flume is a standard, simple, robust, flexible, and extensible tool for ingesting data from various data producers (e.g., web servers) into Hadoop. In this tutorial, we will be using simple and …