Senior GCP Data Engineer
Location – Remote, Full-Time
Experience – 5 to 8 years
Budget – 1L to 1.50L
Job Summary:
We are seeking a highly skilled and experienced Senior GCP Data Engineer to join our data team. The ideal candidate will design, build, and optimize scalable data pipelines and solutions on Google Cloud Platform (GCP). You will work closely with data architects, analysts, and cross-functional teams to build robust data infrastructure and deliver reliable insights.
Key Responsibilities:
- Design and develop scalable and reliable data pipelines using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Composer, etc.).
- Collaborate with stakeholders to understand data requirements and deliver efficient solutions.
- Implement data quality checks and monitoring frameworks.
- Optimize data workflows for performance and cost-effectiveness.
- Develop ETL/ELT processes and automate data ingestion from multiple sources.
- Ensure data security and compliance with organizational policies.
- Maintain comprehensive documentation of systems, processes, and best practices.
- Provide technical mentorship and review code of junior engineers.
Required Skills and Qualifications:
- 5 to 8 years of experience as a Data Engineer, including at least 3 years of hands-on experience with Google Cloud Platform (GCP).
- Strong expertise in BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow), and Cloud Functions.
- Proficiency in Python and SQL for data manipulation and automation.
- Experience with Infrastructure as Code (IaC) tools such as Terraform for managing cloud resources is a plus.
- Familiarity with data modeling, data warehousing concepts, and performance tuning.
- Solid understanding of CI/CD, DevOps practices, and version control (Git).
- Excellent communication and problem-solving skills.
Preferred Qualifications:
- GCP Professional Data Engineer Certification is a strong plus.
- Experience in streaming data processing (Apache Beam, Kafka, etc.).
- Background in working with large-scale distributed systems.
- Exposure to data governance and data catalog tools.