Lead Data Engineer (P1952)

Job Description
Cincinnati, OH or Chicago, IL
We are a full-stack data science company and a wholly owned subsidiary of The Kroger Company. We own 10 petabytes of data and collect 35+ terabytes of new data each week, sourced from 62 million households. As a member of our engineering team, you will use cutting-edge technologies to develop applications that turn our data into actionable insights used to personalize the customer experience for shoppers at Kroger. We practice Agile development with Big Room Planning to align cross-functional squads (Product, Architecture, Data Science, and Engineering) around building scalable enterprise applications.
SUMMARY:
As a Lead Data Engineer, you will design and implement solutions to ingest, store, transform, and distribute big data on Azure Cloud. Our developers use Databricks (Spark, SQL, PySpark), Databricks Workflows, Python, JSON, and SQL to build products, tools, and features that power customer personalization at scale. Communication and collaboration with Product and Data Science are critical to success in this role.
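To illustrate the kind of transform step such a pipeline performs, here is a minimal sketch in plain Python (stdlib only, so it runs anywhere; on Databricks this logic would typically be expressed as a PySpark DataFrame job). The schema, function names, and aggregation are illustrative assumptions, not an actual 84.51° pipeline:

```python
from collections import defaultdict
from datetime import date, timedelta

def week_start(d: date) -> date:
    """Truncate a date to the Monday of its ISO week."""
    return d - timedelta(days=d.weekday())

def weekly_household_spend(transactions):
    """Aggregate per-household spend by week.

    `transactions` is an iterable of (household_id, txn_date, amount)
    tuples -- an assumed, illustrative schema.
    Returns {(household_id, week_start_date): total_spend}.
    """
    totals = defaultdict(float)
    for household_id, txn_date, amount in transactions:
        totals[(household_id, week_start(txn_date))] += amount
    return dict(totals)
```

In a PySpark version, the same shape would be a `groupBy` on household and a week-truncated timestamp column, written out as a curated table for downstream consumers.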
RESPONSIBILITIES: Take ownership of features and drive them to completion through all phases of the 84.51° SDLC. This includes external-facing and internal applications as well as process-improvement activities such as:
- Develop business domain understanding and bridge it to technical stakeholders
- Set technical strategy in partnership with Data Science, Product, Architecture, and Kroger partners
- Oversee team(s) of engineers spanning operations and development, with a blended offshore/onshore model
- Mentor junior engineers
- Lead design and development of Databricks-based solutions
- Develop REST APIs hosted in Kubernetes
- Execute unit and integration testing
- Collaborate with senior engineers to ensure consistent development practices
- Bring new perspectives to problems and be driven to improve yourself and the way things are done
QUALIFICATIONS, SKILLS, AND EXPERIENCE:
- Bachelor's degree, typically in Computer Science, Management Information Systems, Mathematics, Business Analytics, or another technically rigorous program
- 5+ years of professional data development experience
- Strong understanding of Agile Principles (Scrum)
- 5+ years of development experience with Spark (Databricks, PySpark)
- Full understanding of ETL and data warehousing concepts
- 3+ years of development experience with Python
- 1+ years of development experience with REST APIs
- 1+ years of experience with Kubernetes
#LI-SSS