The right talent can transform your business—and we make that happen. At Collabera, we go beyond staffing to deliver strategic workforce solutions that drive growth, innovation, and agility. With deep industry expertise, a global talent network, and a people-first approach, we connect you with professionals who don’t just fit the role but elevate your business. Partner with us and build a workforce that powers success.
Ab Initio Developer/Data Engineer
Contract: Charlotte, North Carolina, US
Salary Range: $60.00 - $70.00 | Per Hour
Job Code: 363421
End Date: 2025-08-01
Job Title: Ab Initio Developer / Data Engineer
Location: Charlotte, NC (Hybrid – 3 Days Onsite)
Duration: 12–24 Months
Pay Rate: $60-$70/hr
Job Summary:
We are seeking a highly skilled Ab Initio Developer / Data Engineer with extensive experience in large-scale data environments, cloud platforms, and enterprise-level systems. The ideal candidate will possess a strong background in Ab Initio, Teradata, GCP (Google Cloud Platform), and BigQuery, with a solid foundation in SQL and ETL development. Exceptional communication skills and the ability to work in Agile environments are essential.
Required Qualifications:
- 7+ years of experience as a Data Engineer
- 6+ years of hands-on experience with Ab Initio
- 6+ years of experience working with Teradata
- 3+ years of experience with Google Cloud Platform (GCP)
- 3+ years of experience with BigQuery
- Strong expertise in SQL and ETL/ELT processes
- Experience with Agile methodologies and tools like JIRA (3+ years)
- Proven ability to interact and collaborate with technical stakeholders
- Experience in enterprise-level environments
- Excellent written and verbal communication skills
Day-to-Day Responsibilities:
- Design, develop, and test robust and scalable data pipelines using Ab Initio
- Implement and optimize ETL/ELT processes for high-performance data movement
- Write, debug, and tune complex SQL queries for data extraction, transformation, aggregation, and reporting—particularly for Teradata and BigQuery
- Develop and manage data ingestion processes into GCP BigQuery to handle large datasets (see the illustrative sketch after this list)
- Monitor and manage GCP resources for data processing and storage efficiency
- Collaborate with cross-functional teams to define data architecture, flows, and design patterns
- Continuously optimize cloud-based data workloads to improve performance and cost-effectiveness
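For illustration only, a minimal Python sketch of the kind of BigQuery ingestion and SQL work these responsibilities describe, using the google-cloud-bigquery client library; the project, bucket, dataset, table, and column names are hypothetical, and GCP credentials are assumed to be configured in the environment:

    from google.cloud import bigquery

    # Hypothetical project name; authentication is assumed to be set up already.
    client = bigquery.Client(project="my-project")

    # Batch-load CSV files from Cloud Storage into a BigQuery table.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/deposits/*.csv",            # hypothetical bucket/path
        "my-project.deposits.daily_transactions",   # hypothetical table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes

    # Aggregate the loaded data with standard SQL.
    query = """
        SELECT account_id, SUM(amount) AS total_amount
        FROM `my-project.deposits.daily_transactions`
        GROUP BY account_id
        ORDER BY total_amount DESC
        LIMIT 100
    """
    for row in client.query(query).result():
        print(row.account_id, row.total_amount)

This is a sketch of one batch-ingestion pattern, not a prescribed implementation; in practice the role would also involve tuning such pipelines in Ab Initio and optimizing the equivalent SQL for Teradata.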
Nice to Have:
- Experience with Java, Python, or other scripting languages for automation
- Experience with Spark, Hadoop, MapR, and data lakes
- GCP certification(s) are a plus
- Background in Banking/Financial Technology (Deposits, Payments, Cards, etc.)
Job Requirements:
- Ab Initio
- Google Cloud Platform (GCP)
- BigQuery
- ETL
- SQL
Reach Out to a Recruiter
- Recruiter: Shoeb Khan
- Email: shoeb.khan@collabera.com
Apply Now
