Hadoop Developer Job Description

September 18th, 2019

Hadoop Developers are responsible for designing and coding Hadoop applications. Hadoop is an open-source framework that stores and processes Big Data across clusters of commodity hardware. Essentially, a Hadoop Developer builds applications to manage and maintain a company’s Big Data.

Hadoop Developer Job Description Template

We are looking to hire a skilled Hadoop Developer to help build our Big Data infrastructure and storage software. Your primary responsibility will be to design, build, and maintain the company’s Hadoop infrastructure. You may also be required to evaluate existing data solutions, write scalable ETL pipelines, produce documentation, and train staff.

To ensure success as a Hadoop Developer, you should have in-depth knowledge of the Hadoop APIs, high-level programming skills, and strong project management ability. Ultimately, a top-class Hadoop Developer designs and implements bespoke Hadoop applications to manage current and future Big Data infrastructure.

Hadoop Developer Responsibilities:

  • Meeting with the development team to assess the company’s Big Data infrastructure.
  • Designing and coding Hadoop applications to analyze data collections (see the MapReduce sketch after this list).
  • Creating data processing frameworks.
  • Extracting data and isolating data clusters.
  • Testing scripts and analyzing results.
  • Troubleshooting application bugs.
  • Maintaining the security of company data.
  • Creating data tracking programs.
  • Producing Hadoop development documentation.
  • Training staff on application use.
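
To give a concrete sense of the application-development responsibility above, here is a minimal sketch of a classic MapReduce word-count job written against the Hadoop Java API. It is illustrative only: the class name and the input and output directories are placeholder assumptions, not details from this posting.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sums the per-word counts emitted by the mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Such a job would typically be packaged as a JAR and submitted to the cluster with something like `hadoop jar wordcount.jar WordCount <input dir> <output dir>`.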

Hadoop Developer Requirements:

  • Bachelor’s degree in Software Engineering or Computer Science.
  • Previous experience as a Hadoop Developer or Big Data Engineer.
  • Advanced knowledge of the Hadoop ecosystem and its components.
  • In-depth knowledge of Hive, HBase, and Pig (see the HBase sketch after this list).
  • Familiarity with MapReduce and Pig Latin scripts.
  • Knowledge of back-end programming languages, including JavaScript and Node.js, and of object-oriented analysis and design (OOAD).
  • Familiarity with data loading tools such as Sqoop and Flume.
  • High-level analytical and problem-solving skills.
  • Good project management and communication skills.
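
As an illustration of the HBase requirement above, the sketch below writes and reads a single row through the HBase Java client API. It is a hypothetical example: the table name ("events"), column family ("d"), and column qualifier ("clicks") are placeholders assumed for the sake of the demo.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseClientSketch {
  public static void main(String[] args) throws Exception {
    // Reads hbase-site.xml from the classpath to locate the cluster.
    Configuration config = HBaseConfiguration.create();

    try (Connection connection = ConnectionFactory.createConnection(config);
         Table table = connection.getTable(TableName.valueOf("events"))) { // hypothetical table

      // Write one row: row key "user-42", column family "d", qualifier "clicks".
      Put put = new Put(Bytes.toBytes("user-42"));
      put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("clicks"), Bytes.toBytes("17"));
      table.put(put);

      // Read the same row back and print the stored value.
      Get get = new Get(Bytes.toBytes("user-42"));
      Result result = table.get(get);
      byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("clicks"));
      System.out.println("clicks = " + Bytes.toString(value));
    }
  }
}
```

In practice, the connection settings would come from the cluster’s hbase-site.xml, and the table and column names would follow the company’s own schema.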
