Senior Data Engineer (GCP)

Posted 2026-05-05
Remote (USA) | Full-time | Immediate start

We are looking for an experienced Senior Data Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and optimize scalable, secure, high-performance data pipelines and cloud-native data architectures.

The ideal candidate has deep hands-on experience in large-scale data engineering, ETL/ELT pipeline development, real-time data processing, and cloud-based data warehousing. This role suits a self-driven professional who can independently manage complex data projects in a remote, freelance environment.

    Key Responsibilities:
  • Design, develop, and maintain scalable data pipelines and workflows on GCP
  • Build robust data architectures using BigQuery, Dataflow, Cloud Storage, and Cloud Composer
  • Develop and optimize ETL/ELT pipelines for structured and unstructured data from multiple data sources
  • Implement real-time and batch data processing solutions
  • Ensure data quality, integrity, governance, security, and compliance
  • Optimize performance, scalability, and cost efficiency of GCP-based solutions
  • Design and maintain data warehouse schemas, data models, and partitioning strategies
  • Work closely with data scientists, BI teams, analysts, and engineering stakeholders
  • Implement monitoring, alerting, and troubleshooting mechanisms for data pipelines
  • Resolve complex data engineering and performance bottlenecks
  • Follow best practices for data lifecycle management, backup, and disaster recovery
  • Support CI/CD deployment processes for data engineering workflows
    Required Skills & Qualifications:
  • 10–15 years of experience in Data Engineering / Data Platform roles
  • Strong hands-on expertise in Google Cloud Platform (GCP)
  • Proven experience with BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow), Cloud Storage, and Pub/Sub
  • Strong proficiency in Python and SQL
  • Experience with distributed data processing frameworks
  • Expertise in data warehousing, data modeling, schema design
  • Strong experience in ETL/ELT architecture and pipeline optimization
  • Familiarity with streaming platforms such as Kafka / Pub/Sub
  • Experience with Git, CI/CD pipelines, and deployment automation
  • Strong analytical, debugging, and problem-solving skills
  • Ability to work independently in a remote and outcome-driven setup
    Preferred Qualifications:
  • GCP Professional Data Engineer Certification
  • Experience with Docker and Kubernetes
  • Exposure to multi-cloud environments (AWS / Azure)
  • Experience in large-scale real-time streaming systems
  • Knowledge of Spark, Dataproc, or similar big data ecosystems
