
Hadoop Administrator

Contract: Birmingham, Alabama, US

Salary Range: $60.00 - $80.00 | Per Hour

Job Code: 359041

End Date: 2025-03-22

Job Status: Expired

This Job is no longer accepting applications
Title: Hadoop Administrator
Location: Remote
Duration: 12 months
Pay Range: $60/hr - $80/hr
 

Must Have

  • Experienced Hadoop Admin with an understanding of Cloudera, Ranger, YARN, HDFS, and Red Hat is a must


Primary Responsibilities

  • Oversees implementation and ongoing administration of Hadoop infrastructure and systems
  • Manages Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.
  • Analyzes latest Big Data analytic technologies and innovative applications in both business intelligence analysis and new offerings
  • Aligns with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and expand existing environments
  • Handles cluster maintenance and creation/removal of nodes using tools like Ganglia, Nagios, Cloudera Manager Enterprise
  • Handles performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Monitors Hadoop cluster job performance and handles capacity planning
  • Monitors Hadoop cluster connectivity and security
  • Manages and reviews Hadoop log files
  • Handles HDFS and file system management, maintenance, and monitoring
  • Partners with infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
  • Collaborates with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
  • Acts as point of contact for vendor escalation
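As an illustrative sketch of the routine cluster-health and HDFS maintenance tasks listed above, the commands below show checks a Hadoop administrator typically runs. This assumes a configured Hadoop client with cluster access; log paths are examples and vary by distribution.

```shell
# Report overall HDFS capacity and live/dead DataNodes
hdfs dfsadmin -report

# Check filesystem health and list block locations (run as the HDFS superuser)
hdfs fsck / -blocks -locations

# Confirm the NameNode is not stuck in safemode
hdfs dfsadmin -safemode get

# List running YARN applications for capacity-planning review
yarn application -list -appStates RUNNING

# Review recent NameNode log entries (example path; varies by install)
tail -n 100 /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log
```

On Cloudera-managed clusters, much of this monitoring is surfaced through Cloudera Manager, but the underlying CLI checks remain useful for scripting and troubleshooting.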


Requirements

  • Bachelor's degree in a related field
  • Seven (7) years of experience in architecture and implementation of large and highly complex projects


Skills and Competencies

  • Experience with Airflow, Argo, Luigi, or similar orchestration tool
  • Experience with DevOps principles and CI/CD
  • Experience with Docker and Kubernetes
  • Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
  • Experience with streaming technologies such as Kafka, Flink, or Spark Streaming
  • Experience working with the Hadoop ecosystem, building data assets at enterprise scale
  • Strong communication skills through written and oral presentations
Job Requirements
  • Cloudera
  • Ranger
  • Yarn
  • HDFS
  • Red Hat
Reach Out to a Recruiter
  • Recruiter: SHAKTHIKANNAN KONAR
  • Email: shakthikannan.konar@collabera.com
  • Phone: 9737345512