Application Programmer

Contract: Atlanta, Georgia, US

Salary: $62.00 Per Hour

Job Code: 348076

End Date: 2024-06-06

About the role:

  • We are seeking a Hadoop ETL Developer to support production (L3) in Treasury ADS.
  • The individual will be responsible for understanding the design, proposing high-level and detailed design solutions, and suggesting out-of-the-box technical solutions to resolve business and technical problems that arise in real-time production.
  • This role may require flexibility to work on weekends at least once a month, with the possibility of compensatory time off during weekdays.
  • As an individual contributor in BAU, the person should possess strong analytical skills to make quick decisions during challenging situations.

Responsibilities:

  • Engage in discussions with the information architecture team to develop design solutions and propose new technology adoption ideas.
  • Participate in project meetings and collaborate with nearshore and offshore teammates in a matrix environment.
  • Coordinate with support teams such as L2, development, testing, and upstream and downstream partners.
  • Perform BAU support activities including analysis, coding, proposing improvement ideas, and driving development activities offshore.
  • Work on multiple projects concurrently, taking ownership and pride in the work done.
  • Partner with Business Analysts to understand requirements and design solutions to address real-time production issues.
  • Identify gaps in technology and propose viable solutions.
  • Take accountability for technical deliveries from offshore.
  • Ensure adherence to defined process quality standards, best practices, and high-quality levels in all deliverables.
  • Adhere to team's governing principles and policies.

Required Skills:

  • 10+ years of experience in IT
  • Strong working knowledge of ETL, database technologies, big data, and data processing skills
  • 3+ years of experience developing ETL solutions using tools like Informatica, SSIS, etc.
  • 3+ years of experience developing applications using Hadoop, Spark, Impala, Hive, and Python
  • 3+ years of experience running, using, and troubleshooting ETL workloads on the Cloudera Hadoop ecosystem
  • Experience with Autosys JIL scripting.
  • Proficient scripting skills in Unix shell and Perl
  • Experience troubleshooting data-related issues.
  • Experience processing large amounts of structured and unstructured data with MapReduce.
  • Experience developing data extraction applications with SQL and relational databases.
  • Experience with data movement and transformation technologies.
  • Experience in Python or Scala programming is good to have.
  • Understanding of the end-to-end process of the application and all its aspects like upstream, database model, data processing, and data distribution layers

 

Job Requirements:
  • Hadoop
  • Cloudera
  • HDFS
  • Hive
  • Apache
  • ETL
  • Informatica
  • SSIS
Reach Out to a Recruiter
  • Recruiter: Sushmita Singh
  • Email: sushmita.k@collabera.com
  • Phone: 9737345553
Apply Now