Data Architect - Data Lake & Data Engineering

Overview

Schedule: Full-time
Career level: Senior-level
Benefits: Career Development

Job Description

Total Number of Openings: 1

Chevron is accepting online applications for the position Data Architect - Data Lake & Data Engineering through April 3, 2026 at 11:59 p.m. (Central Time).  

Enable business opportunities through data availability and accessibility. Perform and/or coordinate end-to-end data lifecycle management activities from source to analytics, including movement, storage, modeling, enhancement, integration, quality, and security of data throughout the enterprise. Focus on data reusability, business outcomes, and cost efficiency.

Senior individual contributor architect accountable for hands-on design, enablement, and production ownership of Chevron’s Enterprise Lakehouse architecture, implementing best practices in data engineering for AI-ready data, and architecting the enterprise AI platform. This role defines and operationalizes governed, scalable, and cost-efficient Lakehouse patterns primarily on Azure Databricks with Unity Catalog, while selectively evaluating and guiding usage of Microsoft Fabric for specific workloads. It aligns data engineering practices with the Lakehouse pattern and helps architect the data and AI stack using Databricks, Fabric, and other tools.

This is a deeply technical architecture role balancing standards and strategy with direct implementation, validation, and enablement across data engineering teams. The role has no people management responsibilities; however, it is responsible for mentoring and guiding other data/solution architects on the team.

Key Responsibilities 

Core Deliverables 

  • Enterprise Lakehouse reference architectures and standards 

  • Production-validated Unity Catalog patterns and guidance 

  • Documented evaluations and recommendations for Microsoft Fabric and Azure Databricks 

  • Reusable architecture artifacts that enable consistent, governed delivery 

Enterprise Lakehouse Architecture & Ownership 

  • Define, evolve, and own the Enterprise Lakehouse architecture, with Azure Databricks as the primary data engineering platform and Microsoft Fabric evaluated for targeted workloads. 

  • Maintain hands-on ownership of architectural standards ensuring they are practical, enforceable, and proven at scale. 

  • Consult on design of scalable Lakehouse patterns supporting analytics, AI/ML, and application consumption. 

Azure Databricks & Unity Catalog Enablement 

  • Lead adoption and operationalization of Unity Catalog, including: 

    • Catalog, schema, and storage location design 

    • Identity, access boundaries, and privilege models 

    • Data sharing, lineage, and governance alignment 

  • Define and validate Delta Lake/Delta table standards for performance, interoperability, and long-term maintainability. 

  • Provide hands-on guidance, examples, and enablement to data engineering teams using Spark, Delta, and Databricks SQL (see the sketch below). 
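
Below is a minimal, illustrative sketch of the kind of Unity Catalog and Delta pattern this enablement work covers, assuming a Databricks workspace attached to a Unity Catalog metastore where the referenced external storage location is already registered; every catalog, schema, group, column, and storage name is hypothetical.

```python
from pyspark.sql import SparkSession

# On Databricks this returns the cluster's existing session.
spark = SparkSession.builder.getOrCreate()

catalog = "corp_analytics"          # hypothetical catalog name
schema = "refined"                  # hypothetical schema name
reader_group = "analytics_readers"  # hypothetical account-level group
# Hypothetical storage path; assumed to sit under an external location
# already registered and granted in Unity Catalog.
location = "abfss://lake@examplestorage.dfs.core.windows.net/refined"

# Catalog and schema with an explicit managed storage location.
spark.sql(f"CREATE CATALOG IF NOT EXISTS {catalog} MANAGED LOCATION '{location}'")
spark.sql(f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}")

# Coarse-grained privilege model: readers can discover and query, nothing more.
spark.sql(f"GRANT USE CATALOG ON CATALOG {catalog} TO `{reader_group}`")
spark.sql(f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{reader_group}`")
spark.sql(f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{reader_group}`")

# Delta table carrying properties that a table standard might require.
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS {catalog}.{schema}.well_production (
        well_id STRING,
        produced_on DATE,
        oil_bbl DOUBLE
    )
    USING DELTA
    TBLPROPERTIES (
        'delta.autoOptimize.optimizeWrite' = 'true',
        'delta.autoOptimize.autoCompact'   = 'true'
    )
""")
```

Grants at the catalog and schema level keep the privilege model coarse and auditable; finer-grained table, column, or row-level controls can be layered on where governance requires it.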

Microsoft Fabric 

  • Perform targeted architecture evaluations of Microsoft Fabric capabilities, including: 

    • OneLake architecture and domain organization patterns 

    • Shortcut vs. mirroring approaches (tradeoffs, limitations, and governance impact) 

    • Security and governance alignment with Unity Catalog (duplication risks, access boundaries) 

    • Capacity planning and workload placement (Fabric capacities vs. Azure Databricks workloads) 

    • Interoperability patterns between Azure Databricks and Fabric (see the sketch after this list) 

  • Contribute to readiness assessments for Fabric features (e.g., Lakehouse, OneLake, Real-Time Intelligence), with clear recommendations on appropriate use. 
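
As one illustration of the interoperability patterns mentioned in the list above, the sketch below reads a Delta table that lives in a Fabric Lakehouse directly from Databricks through OneLake's ABFS-style endpoint, avoiding a data copy; the workspace, lakehouse, and table names are hypothetical, and the cluster is assumed to already hold credentials (for example, a service principal) that OneLake accepts.

```python
from pyspark.sql import SparkSession

# On Databricks this returns the cluster's existing session; authentication
# to OneLake (e.g., service principal configuration) is assumed to be in place.
spark = SparkSession.builder.getOrCreate()

# Hypothetical Fabric workspace, Lakehouse item, and table name.
onelake_table_path = (
    "abfss://ExampleWorkspace@onelake.dfs.fabric.microsoft.com/"
    "ExampleLakehouse.Lakehouse/Tables/dim_asset"
)

# Because the table is stored as open Delta, it can be queried in place from
# Databricks; that is the core tradeoff to weigh against mirroring, which copies data.
df = spark.read.format("delta").load(onelake_table_path)
df.limit(10).show()
```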

Architecture Standards, Governance & Operations 

  • Establish and enforce enterprise architecture standards aligned with security, compliance, and data governance policies. 

  • Influence CI/CD patterns for Lakehouse assets and architecture artifacts in Azure DevOps. 

  • Partner with platform and governance teams to ensure consistent data quality, lineage, observability, and cost optimization guardrails. 

  • Track and communicate architectural risks, priorities, and blockers impacting Enterprise Lakehouse adoption. 

Cost, Performance & Optimization

  • Optimize cost efficiency by influencing: 

    • Workload placement decisions 

    • Storage and compute patterns 

    • Consumption and access models 

  • Validate performance and scalability of architecture through hands-on testing and tuning (illustrated in the sketch below). 
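
A brief sketch of what that hands-on validation can look like, assuming the same Databricks context as the earlier examples; the table name, filter column, and predicate are hypothetical, and OPTIMIZE with ZORDER is shown only as one common Delta tuning lever.

```python
import time

from pyspark.sql import SparkSession

# On Databricks this returns the cluster's existing session.
spark = SparkSession.builder.getOrCreate()

table = "corp_analytics.refined.well_production"  # hypothetical table name

# File count and total size often explain slow scans (many small files).
spark.sql(f"DESCRIBE DETAIL {table}").select("numFiles", "sizeInBytes").show()

def timed_count(predicate: str) -> float:
    """Return wall-clock seconds for a simple filtered count against the table."""
    start = time.perf_counter()
    spark.sql(f"SELECT COUNT(*) FROM {table} WHERE {predicate}").collect()
    return time.perf_counter() - start

before = timed_count("produced_on >= '2025-01-01'")

# Compact small files and co-locate rows on a commonly filtered column.
spark.sql(f"OPTIMIZE {table} ZORDER BY (produced_on)")

after = timed_count("produced_on >= '2025-01-01'")
print(f"filtered count before: {before:.1f}s, after: {after:.1f}s")
```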

Technology Stack

  • Azure Databricks (primary data engineering platform) 

  • Unity Catalog 

  • Delta Lake/Delta Tables 

  • Apache Spark (PySpark/Spark SQL) 

  • Microsoft Fabric/OneLake (architecture evaluation and selective usage) 

  • Azure Storage 

  • Azure DevOps (work item tracking; CI/CD concepts) 

Stakeholders & Collaboration

  • Product Manager – Data Lake & Data Engineering 

  • Data Engineering, Data Lake, and Unified Data Enablement teams 

  • Platform, security, and governance stakeholders 

  • Strategic technology vendors 

Documentation, Mentorship & Collaboration

  • Develop and maintain clear, consumable architecture artifacts, including logical and physical models, data flow diagrams, and reference patterns. 

  • Mentor data engineers and peer architects on Lakehouse standards, best practices, and platform usage. 

  • Collaborate closely with Product Management, Data Engineering, Data Lake, and Unified Data Enablement teams. 

  • Engage with vendors and internal stakeholders on architecture, governance, and platform direction. 

Relocation Options:

Relocation will not be considered.

International Considerations:

Expatriate assignments will not be considered.

Chevron regrets that it is unable to sponsor employment Visas or consider individuals on time-limited Visa status for this position.

U.S. Regulatory notice:

Chevron is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religious creed, sex (including pregnancy), sexual orientation, gender identity, gender expression, national origin or ancestry, age, mental or physical disability, medical condition, reproductive health decision-making, military or veteran status, political preference, marital status, citizenship, genetic information or other characteristics protected by applicable law.

We are committed to providing reasonable accommodations for qualified individuals with disabilities. If you need assistance or an accommodation, please email us at emplymnt@chevron.com.

Chevron participates in E-Verify in certain locations as required by law.


FAQs About Data Architect - Data Lake & Data Engineering Jobs at Chevron

What is the work location for this position at Chevron?
This job at Chevron is located in Houston, Texas, according to the details provided by the employer. Some roles may also include multiple work locations depending on requirements.
What pay range can candidates expect for this role at Chevron?
The employer has not shared pay details for this role.
What employment type applies to this position at Chevron?
Chevron lists this role as a Full-time position.
What experience level is required for this role at Chevron?
Chevron is looking for a candidate with senior-level experience.
What benefits are offered by Chevron for this role?
Chevron offers Career Development for this position. Actual benefits may vary depending on the employer's policies and employment terms.
What is the process to apply for this position at Chevron?
You can apply for this role at Chevron using the direct application link on the job page.