
Data Architect, Google Cloud Platform (GCP)

Job Description
As a Data Architect, you will lead the design and delivery of cloud-native data architectures and solutions for our clients. You will work closely with business stakeholders, data engineers, and developers to build robust data platforms that enable advanced analytics, machine learning, and real-time data processing. This role requires a mix of technical expertise, consulting skills, and leadership to drive successful outcomes in data-driven projects.
Responsibilities:
- Design and implement scalable, secure, and high-performance data architectures on Google Cloud Platform (GCP).
- Define and implement data lake and data warehouse architectures using GCP services such as BigQuery, Cloud Storage, Dataplex, and Dataform.
- Develop strategies for data migration to GCP from on-premises or other cloud platforms, ensuring minimal disruption and optimal performance.
- Architect and oversee the implementation of batch data pipelines using tools such as Dataflow, Dataform, and Cloud Data Fusion (see the batch pipeline sketch after this list).
- Guide the development of data models optimized for performance, scalability, and cost-efficiency in BigQuery and other GCP services.
- Define and implement best practices for data governance, data quality, lineage, security, and compliance in GCP environments, leveraging tools like Cloud DLP, IAM, and Dataplex.
- Partner with stakeholders to establish real-time analytics pipelines using services like Pub/Sub, Dataflow, and BigQuery streaming.
- Provide expertise in data partitioning, clustering, and query optimization to reduce costs and improve performance (see the table-design sketch after this list).
- Lead the adoption of serverless solutions and modern data engineering practices, including CI/CD pipelines for data workflows using tools like Cloud Build, GitHub Actions, or Terraform.
- Evaluate and recommend GCP-native AI/ML tools such as Vertex AI and AutoML for advanced analytics and predictive modeling.
- Serve as a trusted advisor to clients, presenting technical solutions, architectural roadmaps, and cost optimization strategies.
- Conduct workshops, proofs of concept (POCs), and training sessions to help clients adopt GCP technologies.
- Lead end-to-end implementation of data solutions, including ETL/ELT pipelines, data lakes, and data warehouses, ensuring delivery within scope, budget, and timeline.
- Troubleshoot and resolve complex issues related to GCP infrastructure, data pipelines, and integrations.
- Monitor and optimize the performance and cost of GCP data systems, leveraging tools like Cloud Monitoring, Cloud Logging, and BigQuery BI Engine.
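
To make the batch-pipeline responsibility concrete, the following is a minimal Apache Beam sketch of the kind of job that runs on Dataflow. The project id, bucket, target table, and the two-column CSV layout handled by parse_csv_line are hypothetical placeholders for illustration, not details from this role.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line: str) -> dict:
    """Parse one CSV line into a dict matching the target schema (assumed two-column layout)."""
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}


def run():
    # Runner and GCP settings would normally come from command-line flags;
    # DataflowRunner executes the same pipeline graph on the managed service.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",            # hypothetical project id
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadCSV" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv",
                                             skip_header_lines=1)
         | "Parse" >> beam.Map(parse_csv_line)
         | "WriteBQ" >> beam.io.WriteToBigQuery(
               "my-project:analytics.orders",  # hypothetical table
               schema="order_id:STRING,amount:FLOAT",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))


if __name__ == "__main__":
    run()
```

The same graph runs locally with runner="DirectRunner", which is a common way to test a pipeline before submitting it to Dataflow.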
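The partitioning and clustering item is ultimately a table-design decision. Below is a minimal table-design sketch using the google-cloud-bigquery client; the analytics.events table, its schema, and the project id are assumptions for illustration only.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

table = bigquery.Table(
    "my-project.analytics.events",  # hypothetical table
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("event_type", "STRING"),
    ],
)
# Daily time partitioning lets BigQuery prune whole partitions when a query
# filters on event_ts, so only the relevant days are scanned.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
# Clustering sorts data within each partition by these columns, so filters on
# customer_id or event_type skip blocks and scan fewer bytes.
table.clustering_fields = ["customer_id", "event_type"]

client.create_table(table, exists_ok=True)
```

Because BigQuery bills on-demand queries by bytes scanned, partition pruning and clustering are the most direct levers for the cost reductions this responsibility describes.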
Qualifications:
- 7+ years of experience in data architecture, data engineering, or related roles, with at least 3 years of hands-on experience in Google Cloud Platform (GCP).
- Proven track record of delivering data lake, data warehouse, and real-time analytics solutions on GCP.
- Expertise in GCP services including BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, and Cloud SQL/Spanner.
- Proficiency in designing and implementing ETL/ELT pipelines using Cloud Data Fusion, Apache Beam, or Cloud Composer.
- Experience with streaming data pipelines using Pub/Sub and Dataflow (see the streaming sketch after this list).
- Familiarity with Vertex AI, AutoML, and AI Platform Pipelines for machine learning workflows.
- Strong understanding of IAM roles, service accounts, VPC Service Controls, and encryption best practices.
- Proficiency in SQL for data modeling, querying, and optimization in BigQuery.
- Strong programming skills in Python or Java, with experience in building reusable data pipelines and frameworks.
- Experience with Terraform or Deployment Manager for infrastructure as code (IaC) in GCP environments.
- Familiarity with CI/CD pipelines for data workflows using Cloud Build or other DevOps tools.
- Proven ability to lead technical teams and deliver complex projects.
- Excellent communication and stakeholder management skills, with the ability to explain technical concepts to non-technical audiences.
- GCP certifications such as Professional Data Engineer or Professional Cloud Architect are preferred.
- Experience with data mesh or data fabric architectures is a plus.
- Knowledge of multi-cloud and hybrid cloud strategies is a plus.
- Familiarity with other cloud platforms such as AWS or Azure is a plus.
- Hands-on experience with data observability tools such as Monte Carlo or Databand is a plus.
- Willingness to travel to client sites up to 50% of the time.
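
As a sketch of the streaming qualification above, the pipeline below reads JSON events from a Pub/Sub subscription and writes them to BigQuery with streaming inserts via Apache Beam on Dataflow. The subscription, table, schema, and project id are hypothetical placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",            # hypothetical project id
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    # Pub/Sub is an unbounded source, so the pipeline must run in streaming mode.
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadPubSub" >> beam.io.ReadFromPubSub(
               subscription="projects/my-project/subscriptions/events-sub")
         | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
         | "WriteBQ" >> beam.io.WriteToBigQuery(
               "my-project:analytics.events_live",  # hypothetical table
               schema="event_ts:TIMESTAMP,customer_id:STRING,event_type:STRING",
               method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS))


if __name__ == "__main__":
    run()
```

With streaming inserts, rows typically become queryable within seconds of publication, which is what makes the Pub/Sub-to-BigQuery pattern the standard approach to real-time analytics on GCP.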