Senior Software Engineer (Storage Infrastructure)

Rokt
New York, NY

Job Description

We are Rokt, a hyper-growth ecommerce leader.

Rokt is the global leader in ecommerce, unlocking real-time relevance in the moment that matters most. Rokt's AI Brain and ecommerce Network power billions of transactions, connect hundreds of millions of customers, and are trusted to do so by the world's leading companies.

We are a team of builders helping smart businesses find innovative ways to meet customer needs and generate incremental revenue. Leading companies drive an additional 10-50% of revenue, and often all of their profit, from the extra products or services they sell. This economic edge unleashes a world of possibilities for growth and innovation.

The Rokt engineering team builds best-in-class ecommerce technology that provides personalized, relevant experiences for customers globally and empowers marketers with sophisticated, AI-driven tooling to understand consumers better. Our bespoke platform handles millions of transactions per day and considers billions of data points, giving engineers the opportunity to build technology at scale, collaborate across teams, and gain exposure to a wide range of technologies.

We are looking for a Senior Software Engineer (Storage Infrastructure)

Target total compensation ranges from $300,000 to $325,000, including a fixed annual salary of $200,000 to $225,000, an employee equity plan grant, and world-class benefits.

Equity grants are issued in good faith and are subject to company policies, board approval, and individual eligibility.

As our Storage Engineer, you'll own the design, deployment, and operation of our data backbone: multi-region Kafka clusters and Schema Registry, Cassandra NoSQL services, and the Datalake/OLAP stack (DeltaStreamer, Trino, encryption UDFs, and analytics gateways). You'll ensure that data pipelines flow securely and efficiently, that schemas evolve safely, and that our storage platform scales to meet real-time and batch analytics demands.
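
As one concrete illustration of what "schemas evolve safely" can mean in practice: a candidate schema can be checked against the latest registered version in a Confluent-compatible Schema Registry before a producer ships. The Python sketch below is a minimal example under assumptions; the registry URL and the orders-value subject are hypothetical placeholders, not Rokt's actual tooling.

    import json
    import requests

    # Hypothetical registry endpoint and subject; substitute real values.
    REGISTRY_URL = "http://schema-registry.internal:8081"
    SUBJECT = "orders-value"

    # Candidate Avro schema: adds a field with a default, which stays
    # backward-compatible under the registry's default compatibility mode.
    candidate = {
        "type": "record",
        "name": "OrderEvent",
        "fields": [
            {"name": "order_id", "type": "string"},
            {"name": "amount_cents", "type": "long"},
            {"name": "currency", "type": "string", "default": "USD"},
        ],
    }

    resp = requests.post(
        f"{REGISTRY_URL}/compatibility/subjects/{SUBJECT}/versions/latest",
        headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
        data=json.dumps({"schema": json.dumps(candidate)}),
    )
    resp.raise_for_status()
    print("compatible with latest:", resp.json()["is_compatible"])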

About the role:

  • Operate and tune self-hosted Cassandra clusters (including the k8ssandra operator), backing real-time data access and high-throughput ingestion services (a client-side connection sketch follows this list).
  • Architect, deploy, and maintain multi-region Kafka brokers, Schema Registry, MirrorMaker, and Kafka Connect clusters to power resilient event streaming (a producer-side sketch follows this list).
  • Build and enhance Datalake ingestion pipelines (DeltaStreamer, Cassandra Streamer) and encryption/decryption UDFs, ensuring secure, auditable data at rest and in motion.
  • Manage the OLAP layer: design and operate Trino engine clusters, gateways, custom plugins, and Superset dashboards for ad-hoc analytics and BI (a query sketch follows this list).
  • Oversee artifact and backup services (Nexus, Datadog dashboards, datalake backup solutions), enforcing schema compatibility, automated restores, and cost-efficient storage tiering.
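
For the Cassandra item above, a minimal sketch of what "backing real-time data access" looks like from a client, using the DataStax cassandra-driver: datacenter-aware, token-aware routing with LOCAL_QUORUM so cross-region hops stay off the hot path. Host names, the local datacenter name, and the keyspace are illustrative assumptions, not details from Rokt's deployment.

    from cassandra import ConsistencyLevel
    from cassandra.cluster import Cluster, ExecutionProfile, EXEC_PROFILE_DEFAULT
    from cassandra.policies import DCAwareRoundRobinPolicy, TokenAwarePolicy

    # Prefer replicas in the local datacenter and pin consistency to
    # LOCAL_QUORUM so reads and writes never wait on remote regions.
    profile = ExecutionProfile(
        load_balancing_policy=TokenAwarePolicy(
            DCAwareRoundRobinPolicy(local_dc="us-east")  # placeholder DC name
        ),
        consistency_level=ConsistencyLevel.LOCAL_QUORUM,
    )

    cluster = Cluster(
        contact_points=["cassandra-seed-1.internal", "cassandra-seed-2.internal"],
        execution_profiles={EXEC_PROFILE_DEFAULT: profile},
    )
    session = cluster.connect("events")  # placeholder keyspace

    row = session.execute("SELECT release_version FROM system.local").one()
    print("connected, Cassandra", row.release_version)
    cluster.shutdown()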
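
For the Kafka item, a sketch of producing a schema-governed event with the confluent-kafka client: the AvroSerializer resolves the record's schema against Schema Registry before the bytes reach a broker, which is what keeps topics and schemas evolving in lockstep. Broker and registry addresses, the topic name, and the OrderEvent schema are illustrative assumptions.

    from confluent_kafka import Producer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer
    from confluent_kafka.serialization import MessageField, SerializationContext

    # Illustrative Avro schema; real subjects would live in Schema Registry.
    SCHEMA = """
    {
      "type": "record",
      "name": "OrderEvent",
      "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount_cents", "type": "long"}
      ]
    }
    """

    registry = SchemaRegistryClient({"url": "http://schema-registry.internal:8081"})
    serialize = AvroSerializer(registry, SCHEMA)

    producer = Producer({"bootstrap.servers": "kafka-us-east.internal:9092"})
    value = serialize(
        {"order_id": "o-123", "amount_cents": 4599},
        SerializationContext("orders", MessageField.VALUE),
    )
    producer.produce("orders", value=value)
    producer.flush()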
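
And for the OLAP item, a sketch of an ad-hoc query reaching the Trino layer through a gateway, using the trino Python client. The gateway host, catalog, schema, and table names are placeholders for illustration only.

    import trino

    # Connect through the analytics gateway; catalog and schema are placeholders.
    conn = trino.dbapi.connect(
        host="trino-gateway.internal",
        port=8080,
        user="analytics",
        catalog="datalake",
        schema="events",
    )

    cur = conn.cursor()
    cur.execute(
        "SELECT event_date, count(*) AS events "
        "FROM page_views GROUP BY event_date ORDER BY event_date DESC LIMIT 7"
    )
    for event_date, events in cur.fetchall():
        print(event_date, events)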