GCP Data Engineer

Contract: Phoenix, Arizona, US

Salary: $60.00 Per Hour

Job Code: 355936

End Date: 2024-12-15


Client's Domain - Financial
 
Job Title - Data Engineer

Location - Phoenix, AZ 85054 - Hybrid

Duration - 6+ Months (Potential Contract-to-Hire)

Pay Rate - $55-$60 per hour

Note:

  • Minimum 7 years of experience.

About the Role:

  • We are seeking a talented Data Engineer to join our team and help us unlock the power of data. In this role, you will be responsible for designing, developing, and maintaining robust data pipelines to support critical business initiatives.
  • You will work closely with data scientists, analysts, and business stakeholders to ensure data quality, accessibility, and security.
  • This is a hybrid role based in Phoenix, offering a flexible work arrangement. You will have the opportunity to work on cutting-edge projects, collaborate with talented colleagues, and contribute to the growth of our organization.
Key Responsibilities:
  • Data Pipeline Development: Build and maintain efficient data pipelines using Python and SQL to ingest, transform, and load data from various sources into our data warehouse (a minimal sketch follows this list).
  • GCP Expertise: Leverage GCP services such as Dataflow, Dataproc, and BigQuery to implement scalable and cost-effective data solutions.
  • Bitquery Mastery: Utilize Bitquery to extract valuable insights from blockchain data and integrate them into our data pipelines.
  • Infrastructure Development: Design and implement robust data infrastructure, including data lakes, data warehouses, and data marts, to support our growing data needs.
  • AI/ML Integration: Collaborate with AI/ML teams to integrate machine learning models into our data pipelines and applications.
  • Data Governance and Security: Ensure data quality, security, and compliance with industry standards and regulations.
  • Performance Optimization: Continuously monitor and optimize data pipelines for performance and efficiency.
  • Cross-functional Collaboration: Work closely with teams across the organization to understand business requirements and translate them into technical solutions.
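To illustrate the kind of pipeline work described above, the following is a minimal sketch of loading a CSV extract into BigQuery with the google-cloud-bigquery Python client. The project, dataset, table, and file names are placeholders, not details from this posting; a production pipeline would add schema management, validation, and error handling.

    from google.cloud import bigquery

    # The client picks up credentials from the environment;
    # the project ID below is a placeholder.
    client = bigquery.Client(project="example-project")

    # Append rows from a CSV file into an existing table,
    # letting BigQuery infer the schema from the file contents.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the header row
        autodetect=True,      # infer column types
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    with open("transactions.csv", "rb") as source_file:
        load_job = client.load_table_from_file(
            source_file,
            "example-project.analytics.transactions",  # placeholder table ID
            job_config=job_config,
        )

    load_job.result()  # block until the load job finishes
    print(f"Loaded {load_job.output_rows} rows.")
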
Required Skills and Experience:
  • Strong proficiency in SQL
  • Working experience with GCP; knowledge of blockchain technology and Bitquery
  • Expertise in building and maintaining data pipelines using Python
  • Familiarity with data warehousing and data lake concepts  
Preferred Skills and Experience:
  • Experience with AI/ML techniques and tools
  • Prior experience working with a financial or banking client.
Job Requirements
  • Python
  • SQL
  • GCP
  • AI/ML
Reach Out to a Recruiter
  • Recruiter: Mradul Khampariya
  • Email: mradul.khampariya@collabera.com
  • Phone: 813-937-1148
