Hadoop Developer Job Description

September 18th, 2019

Hadoop Developers are responsible for designing and coding Hadoop applications. Hadoop is an open-source framework for storing and processing Big Data across clusters of commodity hardware. Essentially, a Hadoop Developer builds applications to manage and maintain a company’s Big Data.

Hadoop Developer Job Description Template

We are looking to hire a skilled Hadoop Developer to help build Big Data infrastructure and storage software. Your primary responsibility will be to design, build, and maintain Hadoop infrastructure. You may also be required to evaluate existing data solutions, write scalable ETLs, develop documentation, and train staff.

To ensure success as a Hadoop Developer, you should have in-depth knowledge of the Hadoop API, strong programming skills, and solid project management ability. Ultimately, a top-class Hadoop Developer designs and implements bespoke Hadoop applications to manage current and future Big Data infrastructure.

Hadoop Developer Responsibilities:

  • Meeting with the development team to assess the company’s Big Data infrastructure.
  • Designing and coding Hadoop applications to analyze data collections.
  • Creating data processing frameworks.
  • Extracting data and isolating data clusters.
  • Testing scripts and analyzing results.
  • Troubleshooting application bugs.
  • Maintaining the security of company data.
  • Creating data tracking programs.
  • Producing Hadoop development documentation.
  • Training staff on application use.

Hadoop Developer Requirements:

  • Bachelor’s degree in Software Engineering or Computer Science.
  • Previous experience as a Hadoop Developer or Big Data Engineer.
  • Advanced knowledge of the Hadoop ecosystem and its components.
  • In-depth knowledge of Hive, HBase, and Pig.
  • Familiarity with MapReduce and Pig Latin scripts.
  • Knowledge of back-end programming languages such as JavaScript and Node.js, and of object-oriented analysis and design (OOAD).
  • Familiarity with data loading tools such as Sqoop and Flume.
  • High-level analytical and problem-solving skills.
  • Good project management and communication skills.
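To give candidates a sense of the MapReduce model referenced in the requirements above, here is a minimal, hypothetical word-count sketch in plain Python that mirrors the map and reduce phases Hadoop runs at scale. The function names and sample input are illustrative only, not part of any Hadoop API.

```python
import sys
from itertools import groupby

def map_words(lines):
    """Map phase: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_counts(pairs):
    """Reduce phase: sum the counts for each word. The sort stands in
    for Hadoop's shuffle step, which groups pairs by key between phases."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield (word, sum(count for _, count in group))

if __name__ == "__main__":
    sample = ["big data big wins", "data pipelines"]
    print(dict(reduce_counts(map_words(sample))))
```

In a real deployment, the mapper and reducer would run as separate distributed tasks (for example via Hadoop Streaming), with the framework handling the shuffle and sort between them.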
