
Auto-apply to these data science jobs

We've scanned millions of jobs. Simply select your favorites, and we can fill out the applications for you.

Geico Insurance · Chicago, IL

$88,150 - $157,850 / year

At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers' expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported and proud to work here. That's why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.

GEICO is looking for a customer-obsessed and results-oriented Product Manager to support our Data Ingestion and Movement platform. This role will help drive product direction for our data ingestion, ETL/ELT pipelines, and data movement services, focusing on enabling reliable data flow into our lakehouse and other data stores. The ideal candidate will have a technical background in data engineering and experience delivering scalable data platforms and data pipeline solutions.

Description

As a Product Manager for Data Ingestion and Movement, you will be responsible for supporting the product vision and execution for GEICO's data ingestion and movement products. To successfully shape a platform that enables pipeline-as-a-service and supports a scalable data mesh architecture, a strong technical understanding of data pipelines, data integration patterns, data orchestration, ETL/ELT processes, and platform engineering is essential. Your goal is to abstract complexity and empower domain teams to autonomously and efficiently build, deploy, and govern data pipelines. This role also requires stakeholder management skills and the ability to bridge technical solutions with business value.
Key Responsibilities

- Support the development and execution of data ingestion and movement platform vision aligned with business goals and customer needs
- Help create and maintain a clear, prioritized roadmap for data ingestion and movement capabilities that balances short-term delivery with long-term strategic objectives
- Support evangelizing the Data Ingestion and Movement platform across the organization and help drive stakeholder alignment
- Stay abreast of industry trends and the competitive landscape (Apache Kafka, Apache Airflow, AWS Glue, Azure Data Factory, Google Cloud Dataflow, etc.) to inform data ingestion strategy
- Support requirement gathering and product strategy for data ingestion, ETL/ELT pipelines, and data movement services
- Understand end-to-end data ingestion workflows and how data movement fits into the broader data ecosystem and downstream analytics
- Support data governance initiatives for data lineage, quality, and compliance in data ingestion and movement processes
- Ensure data ingestion and movement processes adhere to regulatory, compliance, and data quality standards
- Partner with engineering on the development of data ingestion tools, pipeline orchestration services, and data movement capabilities
- Help define product capabilities for data ingestion, pipeline monitoring, error handling, and data quality validation to improve reliability and performance
- Support customer roadshows and training on data ingestion and movement capabilities
- Build instrumentation and observability into data ingestion and movement tools to enable data-driven product decisions and pipeline monitoring
- Work closely with engineering, data engineering, and data teams to ensure seamless delivery of data ingestion and movement products
- Partner with customer success, support, and engineering teams to create clear feedback loops
- Translate data ingestion and movement technical capabilities into business value and user benefits
- Support alignment across multiple stakeholders and teams in complex, ambiguous environments

Qualifications

Required
- Understanding of data ingestion patterns, ETL/ELT processes, and data pipeline architectures (Apache Kafka, Apache Airflow, Apache Spark, AWS Glue, etc.)
- Experience with data integration APIs, connectors, and data pipeline orchestration tools
- Basic understanding of data pipeline monitoring, observability, and data quality validation practices
- Experience in cloud data ecosystems (AWS, GCP, Azure)
- Proven analytical and problem-solving abilities with a data-driven approach to decision-making
- Experience working with Agile methodologies and tools (JIRA, Azure DevOps)
- Good communication, stakeholder management, and cross-functional collaboration skills
- Strong organizational skills with the ability to manage product backlogs

Preferred
- Previous experience as a software or data engineer is a plus
- Strong business acumen to prioritize features based on customer value and business impact
- Experience with data ingestion tools (Apache Kafka, Apache NiFi, AWS Kinesis, Azure Event Hubs, etc.)
- Knowledge of data lineage, data quality frameworks, and compliance requirements for data ingestion
- Insurance industry experience

Experience
- 5+ years of technical product management experience building platforms that support data ingestion, ETL/ELT pipelines, data engineering, and data infrastructure
- Track record of delivering successful products in fast-paced environments
- Experience supporting complex, multi-stakeholder initiatives
- Proven ability to work with technical teams and translate business requirements into technical product specifications
- Experience with customer research, user interviews, and data-driven decision making

Education
- Bachelor's degree in computer science, engineering, management information systems, or related technical field required
- MBA/MS or equivalent experience preferred

Annual Salary: $88,150.00 - $157,850.00

The above annual salary range is a general guideline.
Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary to be offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, the work location as well as market and business considerations. At this time, GEICO will not sponsor a new applicant for employment authorization for this position.

The GEICO Pledge:

Great Company: At GEICO, we help our customers through life's twists and turns. Our mission is to protect people when they need it most and we're constantly evolving to stay ahead of their needs. We're an iconic brand that thrives on innovation, exceeding our customers' expectations and enabling our collective success. From day one, you'll take on exciting challenges that help you grow and collaborate with dynamic teams who want to make a positive impact on people's lives.

Great Careers: We offer a career where you can learn, grow, and thrive through personalized development programs, created with your career - and your potential - in mind. You'll have access to industry-leading training, certification assistance, career mentorship and coaching with supportive leaders at all levels.

Great Culture: We foster an inclusive culture of shared success, rooted in integrity, a bias for action and a winning mindset. Grounded by our core values, we have an established culture of caring, inclusion, and belonging that values different perspectives. Our dynamic, multi-faceted teams are led by supportive leaders, driven by performance excellence and unified under a shared purpose. As part of our culture, we also offer employee engagement and recognition programs that reward the positive impact our work makes on the lives of our customers.

Great Rewards: We offer compensation and benefits built to enhance your physical well-being, mental and emotional health and financial future.
- Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family's overall well-being.
- Financial benefits including market-competitive compensation; a 401(k) savings plan vested from day one that offers a 6% match; performance and recognition-based incentives; and tuition assistance.
- Access to additional benefits like mental healthcare as well as fertility and adoption assistance.
- Workplace flexibility, including our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled. GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.

Posted 30+ days ago

American International Group · Atlanta, GA
Be part of something groundbreaking

At AIG, we are making long-term investments in a brand-new, innovative Generative AI team, designed to explore new possibilities for how artificial intelligence can be applied in insurance and beyond, and we need your help. With the support and investment needed to explore new frontiers in Generative AI, you'll be working alongside talented colleagues, innovating and contributing to projects that will transform how we manage risk and serve our customers. This team is central to our vision of the future and the core of our business offering. We will incorporate best-in-class engineering and product management principles, and your contribution will be critical to its success. To rapidly advance and innovate, we need your skills and expertise to build world-class products. If you're excited by the opportunity to create meaningful impact at scale and shape the future of the insurance industry, we'd love to hear from you.

Who we are

AIG is one of the leading global commercial and personal insurance organizations, with one of the most far-reaching property casualty networks. We provide world-class products and expertise to businesses and individuals in approximately 190 countries and jurisdictions. At AIG, we're reshaping how the world manages risk, and we're inviting you to be a key part of that transformation. As a Data Architect, you'll have the opportunity to make a meaningful impact, leveraging and further developing your skills to guide groundbreaking AI initiatives. If you're looking for a place to grow your career and where your contributions will shape the future, AIG is where you belong.

How you will create an impact

As a Data Architect, you will drive the development of AIG's data architecture and ontology frameworks, ensuring that our data systems are robust, scalable, and future-proof.
You'll shape how data is modeled, integrated, and governed across the organization, helping business applications deliver faster with greater quality and compliance. Your role will be crucial in creating reusable enterprise information assets, and you will collaborate closely with solution architects, product owners, and engineering teams to deliver high-impact business outcomes. This is an opportunity to play a key role in ensuring data architecture supports innovation, compliance, and improved business performance.

Your responsibilities include
- Creating and managing data models and ontologies for large-scale business applications across multiple geographies
- Assisting in the design and integration of enterprise information assets, services, and business systems to ensure efficient data usage and compliance with governance standards
- Documenting and continuously improving data architecture systems, while educating teams on data governance standards and best practices
- Collaborating with product owners, engineers, and the Data Governance team to ensure high data quality, security, and privacy compliance across business applications
- Monitoring market and industry trends in data architecture and applying them to enhance AIG's information lifecycle management and data integration processes

What is needed to be successful
- Data modeling and data analysis experience: expertise in creating complex data models, with a proven track record of modeling with 3NF, dimensional, and Data Vault methodologies, and of contributing to data architecture strategy at an enterprise level
- Strong data analysis, data profiling and SQL skills
- Experience with best practices in data governance, security and privacy in modeling and data architecture
- Strong knowledge of data technologies, data design practices and data integration
- Passion for integrity and clarity of data
- Experience with master and reference data management
- Experience across sub-functional data domain areas such as reporting, analysis, conceptualization and design of data assets across warehouses and marts
- Passion for using the latest technology for data innovation, including AI and LLM integration
- Experience (or certification) with data modeling tools, ETL tools, Snowflake, Dimensional Modeling, Data Vault modeling, ER modeling
- Experience (or certification) working with NoSQL databases like MongoDB; experience with XML/JSON
- Solid communication skills, verbal and written
- Agile / Rally development experience or training required
- Experience with (or strong enthusiasm about) AI-driven solutions

It would be nice if you have
- Experience in Data Warehousing
- Experience using the Snowflake platform
- A good understanding of ontologies
- Experience with Palantir Foundry or a similar tool (highly desired)
- Experience in insurance or financial services

Ready to solve bigger problems? We would love to hear from you. Veterans are encouraged to apply.

#LI-AIG #LI-CM1 #AI #GenAI #artificialintelligence #DataEngineering #DataMining

At AIG, we value in-person collaboration as a vital part of our culture, which is why we ask our team members to be primarily in the office. This approach helps us work together effectively and create a supportive, connected environment for our team and clients alike.

Enjoy benefits that take care of what matters

At AIG, our people are our greatest asset. We know how important it is to protect and invest in what's most important to you. That is why we created our Total Rewards Program, a comprehensive benefits package that extends beyond time spent at work to offer benefits focused on your health, wellbeing and financial security - as well as your professional development - to bring peace of mind to you and your family.

Reimagining insurance to make a bigger difference to the world

American International Group, Inc.
(AIG) is a global leader in commercial and personal insurance solutions; we are one of the world's most far-reaching property casualty networks. It is an exciting time to join us - across our operations, we are thinking in new and innovative ways to deliver ever-better solutions to our customers. At AIG, you can go further to support individuals, businesses, and communities, helping them to manage risk, respond to times of uncertainty and discover new potential. We invest in our largest asset, our people, through continuous learning and development, in a culture that celebrates everyone for who they are and what they want to become.

Welcome to a culture of inclusion

We're committed to creating a culture that truly respects and celebrates each other's talents, backgrounds, cultures, opinions and goals. We foster a culture of inclusion and belonging through learning, cultural awareness activities and Employee Resource Groups (ERGs). With global chapters, ERGs are a cornerstone for our culture of inclusion. The talent of our people is one of AIG's greatest assets, and we are honored that our drive for positive change has been recognized by numerous recent awards and accreditations.

AIG provides equal opportunity to all qualified individuals regardless of race, color, religion, age, gender, gender expression, national origin, veteran status, disability or any other legally protected categories. AIG is committed to working with and providing reasonable accommodations to job applicants and employees with disabilities. If you believe you need a reasonable accommodation, please send an email to candidatecare@aig.com.

Functional Area: DT - Data
AIG PC Global Services, Inc.

Posted 30+ days ago

DXC Technology · Ashburn, VA
Job Description: DXC Technology (NYSE: DXC) helps global companies run their mission-critical systems and operations while modernizing IT, optimizing data architectures, and ensuring security and scalability across public, private, and hybrid clouds. The world's largest companies and public sector organizations trust DXC to deploy services across the Enterprise Technology Stack to drive new performance levels, competitiveness, and customer experience. Learn more about how we deliver excellence for our customers and colleagues at DXC.com.

Location: Hybrid with up to 50% travel to client location. Candidates located within 25 miles of a DXC office are required to work onsite two days per week. Preferred locations: Plano, TX · Detroit/Farmington Hills, MI · Nashville, TN · New York City, NY · New Orleans, LA · Ashburn, VA · Tulsa, OK

Overview: The SAP Data Migration SME is responsible for defining and executing the data migration strategy for SAP implementations, ensuring data integrity, quality, and compliance throughout the migration lifecycle. This role involves collaborating with business and technical teams to design robust migration plans, oversee ETL processes, and provide subject matter expertise on SAP data management tools and methodologies.

Key Responsibilities:
- Develop and own the end-to-end data migration strategy aligned with SAP implementation goals.
- Define data governance standards, quality metrics, and compliance requirements.
- Create migration roadmaps, timelines, and risk mitigation plans.
- Lead data extraction, transformation, and loading (ETL) processes using SAP tools (e.g., LSMW, SAP Data Services, Migration Cockpit).
- Perform data mapping, cleansing, and validation to ensure accuracy and completeness.
- Oversee reconciliation and audit of migrated data post-go-live.
- Work closely with functional consultants, business process owners, and technical teams to gather requirements and validate migration outputs.
- Conduct workshops and training sessions for end-users and project teams.
- Implement data quality checks and error handling mechanisms.
- Ensure compliance with data security and privacy regulations.
- Maintain detailed documentation of migration processes, data dictionaries, and mapping specifications.
- Provide post-migration support and troubleshoot data-related issues.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or related field.
- 5+ years in SAP data migration projects, preferably in S/4HANA environments.
- Strong knowledge of ETL tools and SAP migration utilities (LSMW, BODS, Migration Cockpit).
- SQL, data modeling, and data validation techniques.
- Familiarity with SAP modules (FI, MM, SD, PP).
- Excellent analytical, problem-solving, and communication skills.
- Ability to manage multiple projects under tight deadlines.

Preferred Certifications:
- SAP Certified Application Associate - Data Migration.
- Experience with SAP S/4HANA conversion projects.

Soft Skills:
- Strong leadership and stakeholder management.
- Attention to detail and commitment to data integrity.
- Ability to work in cross-functional teams and global environments.

Must be legally authorized to work in the United States without the need for sponsorship now or in the future.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive. If you are an applicant from the United States, Guam, or Puerto Rico, DXC Technology Company (DXC) is an Equal Opportunity employer.
All qualified candidates will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, pregnancy, veteran status, genetic information, citizenship status, or any other basis prohibited by law. We participate in E-Verify. In addition to the posters already identified, DXC provides access to prospective employees for the Federal Minimum Wage Poster, Federal Polygraph Protection Act Poster as well as any state or locality specific applicant posters. To access the postings in the link below, select your state to view all applicable federal, state and locality postings. Postings are available in English, and in Spanish, where required. View postings below. Postings Link

Disability Accommodations

If you are an individual with a disability, a disabled veteran, or a wounded warrior and you are unable or limited in your ability to access or use this site as a result of your disability, you may request a reasonable accommodation by contacting us via email. Please note: DXC will respond only to requests for accommodations due to a disability.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 1 week ago

Pacific Life · Newport Beach, CA

$134,280 - $164,120 / year

Job Description: Providing for loved ones, planning rewarding retirements, saving enough for whatever lies ahead - our policyholders count on us to be there when it matters most. It's a big ask, but it's one that we have the power to deliver when we work together. We collaborate and innovate - pushing one another to transform not just Pacific Life, but the entire industry for the better. Why? Because it's the right thing to do. Pacific Life is more than a job, it's a career with purpose. It's a career where you have the support, balance, and resources to make a positive impact on the future - including your own.

We're actively seeking a talented Data Engineer to join our Pacific Life Team in Newport Beach, CA. We are looking for self-starters to help shape the future of data engineering and drive data-driven success. As a Data Operations Engineer, you'll move Pacific Life, and your career, forward by leading and acting as an individual contributor on our Data Ops support team. You will fill a Data Operations Engineer role that sits on a team of 15 people in the PL Data division. Your colleagues will include other Data Ops Engineering professionals, and you will be interfacing with internal/external business and IT stakeholders.

How you'll help move us forward:
- Partnering with data architects, analysts, engineers, and business stakeholders to understand data requirements and deliver solutions in an urgent and effective manner.
- Implementing automated workflows to streamline data collection, processing, and analysis.
- Monitoring data pipelines to detect and resolve issues promptly.
- Reducing manual intervention to minimize errors and increase efficiency.
- Promoting a culture of continuous improvement and agile methodologies.
- Demonstrating adaptability, initiative and inquisitiveness in issue resolution.
- Ensuring compliance with data governance policies and regulations.
- Analyzing data to identify trends, patterns, and insights.
- Designing, building, and maintaining scalable data pipelines for data extraction, transformation, and loading.
- Applying proficiency in languages such as Python and PowerShell for data pipeline management.
- Using strong knowledge of SQL for querying and optimizing databases and managing data.

The experience you bring:
- Bachelor's degree in Computer Science, Mathematics, Data Science or Statistics
- 5+ years of experience in design, development, and data management
- 5+ years of experience and proficiency in SQL, ETL, data transformation, and data operations tools (Snowflake, Redshift, Informatica, Matillion, DBT, Python, Control-M)
- 2+ years of experience with DevOps and CI/CD
- Effective communication and facilitation, both verbal and written
- Team-oriented: collaborating effectively with team and stakeholders
- Analytical skills: strong problem-solving skills with the ability to break down complex data solutions

What makes you stand out:
- 2+ years of experience with Snowflake, Data Build Tool (DBT), and cloud services
- Experience with automation, optimization and innovation in data management and batch cycle environments
- Understanding of data catalogs, glossaries, data quality, and effective data governance
- Financial services domain knowledge
- Data-driven individual with the ability to set up effective processes in your sphere of ownership
- Strong communication with the ability to translate business requirements into technical specifications

You can be who you are. People come first here. We're committed to a diverse, equitable and inclusive workforce. Learn more about how we create a welcoming work environment through Diversity, Equity, and Inclusion at www.pacificlife.com. What's life like at Pacific Life? Visit Instagram.com/lifeatpacificlife.

Benefits start Day 1. Your wellbeing is important. We're committed to providing flexible benefits that you can tailor to meet your needs. Whether you are focusing on your physical, financial, emotional, or social wellbeing, we've got you covered.
- Prioritization of your health and well-being including Medical, Dental, Vision, and a Wellbeing Reimbursement Account that can be used on yourself or your eligible dependents
- Generous paid time off options including Paid Time Off, Holiday Schedules, and Financial Planning Time Off
- Paid Parental Leave as well as an Adoption Assistance Program
- Competitive 401k savings plan with company match and an additional contribution regardless of participation

Base Pay Range: $134,280.00 - $164,120.00

The base pay range noted represents the company's good faith minimum and maximum range for this role at the time of posting. The actual compensation offered to a candidate will be dependent upon several factors, including but not limited to experience, qualifications and geographic location. Also, most employees are eligible for additional incentive pay.

EEO Statement: Pacific Life Insurance Company is an Equal Opportunity/Affirmative Action Employer, M/F/D/V. If you are a qualified individual with a disability or a disabled veteran, you have the right to request an accommodation if you are unable or limited in your ability to use or access our career center as a result of your disability.
To request an accommodation, contact a Human Resources Representative at Pacific Life Insurance Company.

Posted 3 weeks ago

Stryker Corporation · Kalamazoo, MI

$69,100 - $139,600 / year

Work Flexibility: Remote

Join Stryker's global transformation journey! As a Senior Master Data Deployment Analyst, you'll play a critical role in ensuring the accuracy and integrity of material and finance master data during our SAP implementation. This is an opportunity to influence global data standards, drive process excellence, and enable seamless go-live execution across divisions and geographies.

What You Will Do
- Coordinate with business stakeholders to document and execute go-live activities, including ramp-up and ramp-down tasks, building strong relationships
- Ensure completeness and accuracy of material master data through rigorous data quality checks and validation
- Lead data cleansing, conversion, and migration activities in collaboration with business SMEs
- Perform manual data loads and updates as required to support project timelines
- Support testing activities (Integration, UAT, Regression, DITL) by preparing scenarios and data for successful outcomes
- Analyze and resolve data defects identified during data loads or testing, ensuring timely remediation
- Develop and maintain work instructions and Power BI dashboards to monitor data quality and load progress
- Deliver training and onboarding for local data readiness resources, promoting global master data standards

What You Will Need

Required Qualifications
- Bachelor's degree in Information Sciences, IT, Finance, Engineering, or related field
- Minimum of 2 years of experience working with material master data
- Functional experience with ERP processes (e.g., OTC, PTP, FTS, RTR) and their master data dependencies
- Prior experience with ERP or large-scale systems
- Proficiency in Microsoft Office and data analysis tools, particularly Excel

Preferred Qualifications
- Minimum of 3 years of experience in SAP material master data; finance master data experience
- Experience with SAP MDG and S/4HANA applications
- Data experience related to the medical device or pharmaceutical industry
- Knowledge of master data governance principles and best practices
- Familiarity with Power BI for reporting and dashboard creation

$69,100 - $139,600 salary plus bonus eligible + benefits. Individual pay is based on skills, experience, and other relevant factors.

Posted: November 26, 2025
Travel Percentage: 20%

Stryker Corporation is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, ethnicity, color, religion, sex, gender identity, sexual orientation, national origin, disability, or protected veteran status. Stryker is an EO employer - M/F/Veteran/Disability. Stryker Corporation will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.

Posted 6 days ago

Canoo logo
CanooOklahoma City, OK
Job Title Canoo Data Platform- Data Engineer About Canoo Canoo's mission is to bring EVs to Everyone and build a world-class team to deploy this sustainable mobility revolution. We have developed breakthrough electric vehicles that are reinventing the automotive landscape with pioneering technologies, award-winning designs, and a unique business model that spans all owners in the full lifecycle of the vehicle. Canoo is starting production and is distinguished by its pioneering and experienced team of technologists, engineers, and designers. With offices around the country, the company is scaling quickly and seeking candidates who love to challenge themselves, are motivated by purpose, and possess a strong desire to get things done. The "Canoo Way" Canoo's success is the direct result of our disciplined application of our core operating principles and drills, which are based on three main principles: Think 80/20 ("Important versus less important"), Act 30/30 ("Reduce waste and increase output"), and Live 90/10 ("We have each other's back"). We hire based on "MET"- Mindset, Equipment and willingness to Train - and seek individuals that take accountability and deliver results being Humble, Hungry to succeed, and Hunting for opportunities to win. We train our team to engage with each other by modulating between their intellect (iQ) and emotional intelligence (eQ) applying Facts, Finesse, and Force when they communicate. The principles and drills of the CANOO Way have been fundamental to our success, our ability to grow, continuously improve, innovate and are at the core of our day-to-day operations. Job Purpose As a Data Engineer, you will be responsible for developing and maintaining highly scalable data pipelines that enable data transformation and load between internal systems, IoT devices (electric vehicles), external backend systems, and frontend user interfaces. 
You will design and implement data streams ensuring data quality, data integrity, security, and high performance. Additionally, you will collaborate with cross-functional teams to continually integrate all company systems. Responsibilities (80s of the Position) Work with stakeholders to gather data and reporting requirements to build dashboards and data flows. Create infrastructure-as-code, deployment pipelines, developer tools, and other automations. Understand product requirements, engage with team members and customers to define solutions, and estimate the scope of work required. Deliver solutions that can keep up with a rapidly evolving product in a timely fashion. Required Experience Google Cloud Platform (GCP), GCS, BigQuery Expertise with one or more back-end languages such as Python, Go, TypeScript, JavaScript, etc. SQL expertise- DBT experience a plus. Experience with cloud services like GCP, AWS or Azure. Kafka Dashboarding and Reporting- Superset, Looker Git- BitBucket/Gitlab Kubernetes- Mid-Level Experience Preferred Experience Python Python dependency management and custom packages Expertise with Google Cloud Platform (GCP) Data Warehousing - partitioning, segmentation Internet of Things (IoT) and MQTT Docker Terraform - experience a plus CI/CD tooling- Jenkins/git-ci Understanding of automotive and embedded software systems Travel Requirements Onsite presence in the office is required; this is not a remote or hybrid role. Travel may be required on an occasional basis for events such as team meetings or working with manufacturers or subject-matter experts on particular tasks. Physical Requirements for Non-Physical Positions While performing the duties of this job, employees may be required to sit for prolonged periods of time, occasionally bending or stooping, lifting up to 10 pounds, and prolonged periods of computer use.
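To make the pipeline work concrete: ingesting vehicle (IoT) telemetry into a warehouse like BigQuery typically starts with validating records and assigning a partition key. The sketch below is a generic illustration with an invented schema (vin, timestamp, battery_soc), not Canoo's actual data model.

```python
# Hedged sketch of a telemetry validation step ahead of a BigQuery load.
# Schema and ranges are hypothetical examples.
from datetime import datetime

def validate_telemetry(record):
    """Return True if a telemetry record is well-formed and in plausible range."""
    try:
        datetime.fromisoformat(record["timestamp"])  # ISO-8601 check
    except (KeyError, ValueError):
        return False
    soc = record.get("battery_soc")
    return (isinstance(record.get("vin"), str)
            and isinstance(soc, (int, float))
            and 0 <= soc <= 100)

def partition_key(record):
    """Date prefix of the timestamp, mirroring a date-partitioned table."""
    return record["timestamp"][:10]

good = {"vin": "VIN123", "timestamp": "2024-05-01T12:00:00", "battery_soc": 87.5}
bad = {"vin": "VIN123", "timestamp": "not-a-time", "battery_soc": 87.5}
print(validate_telemetry(good), validate_telemetry(bad), partition_key(good))
```

A real pipeline would route the rejects to a dead-letter store rather than drop them, so data quality issues stay visible.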
Reasonable Accommodations Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position. What's Cool About Working Here... Meaningful, challenging work that will redefine automotive landscape and make EVs available to everyone Comprehensive Health Insurance Equity Compensation Flexible Paid Time Off Casual workplace with an unbelievable feeling of energy Canoo is an equal opportunity-affirmative action employer and considers all qualified applicants for employment based on business needs, job requirements and individual qualifications, without regard to race, color, religion, sex, age, disability, sexual orientation, gender identity or expression, marital status, past or present military service or any other status protected by the laws or regulations in the locations where we operate. We also consider qualified applicants with criminal histories consistent with applicable federal, state and local law. Any unsolicited resumes or candidate profiles submitted in response to our job posting shall be considered the property of Canoo Inc. and its subsidiaries and are not subject to payment of referral or placement fees if any such candidate is later hired by Canoo unless you have a signed written agreement in place with us which covers the applicable job posting. Canoo maintains compliance with the OFCCP. As such, please feel free to review the following information: https://www.dol.gov/agencies/ofccp/posters https://www.dol.gov/agencies/olms/poster/labor-rights-federal-contractors If you are a person with a disability needing assistance with the application process, please call (214) 529-8055 or email us at TalentAcquisition@canoo.com Equal Employment Opportunity Posters Equal Employment Opportunity Posters | U.S. Department of Labor (dol.gov)

Posted 30+ days ago

Aledade logo
AledadeMyrtle Point, OR
As a Senior Technical Product Manager, Clinical Data Platform, you will be a key member of the team responsible for designing scalable technical, service, and unstructured extraction solutions that support Aledade's various lines of business, with a focus on Medicare Advantage within the clinical data platform. In this role, you will partner with business owners, AI researchers, engineers and clinical informaticists to define project goals, solution scope, implementation approaches, and rollout plans. You will play a critical role in expanding the clinical platform's ability to ingest, standardize and serve clinical data at scale, including structured and unstructured data, to power downstream LLM use cases. Your primary focus will be to help identify and productionize AI/ML capabilities that extract and improve clinical insights from unstructured data to enhance data quality and completeness. As a platform-focused product manager, you will bridge the gap between strategic business needs and core platform capabilities, ensuring Aledade delivers a high-quality, future-ready clinical data foundation that can support AI innovation and scale operationally.
Primary Duties: Define and drive both short and long term technical roadmaps for data pipeline infrastructure, ensuring scalable, reliable ingestion and transformation of structured and unstructured data across diverse upstream sources and downstream consumers to deliver maximum value with minimum risk Partner cross functionally with engineering, analytics and key business stakeholders to identify data requirements, translate them into technical specifications and support implementation through backlog grooming, solution design and adoption oversight Monitor pipeline performance and data quality metrics, proactively investigate anomalies with SQL or equivalent query tools to drive root cause analysis and implement improvements to support data completeness, timeliness, analytics and generative AI initiatives. Work with internal teams and end users to develop a deep understanding of requirements, perform thoughtful technical solution designs, use data to test hypotheses, and support teams throughout execution. Write detailed user stories for new features, capturing detailed descriptions of business rationale, requirements, and success criteria that are defined by measurable outcomes. Ongoing optimization of live user workflows and capabilities including monitoring of key metrics & internal user feedback Minimum Qualifications: 8+ years of product or technical program management experience in healthcare data platforms, interoperability, or machine learning infrastructure, with a focus on clinical data ingestion and transformation, technology-enabled services industry, or a SaaS product. Experience using and writing queries against data for the purposes of performing preliminary research to inform solution design and build internal business understanding. Strong understanding of the software development lifecycle, Agile methodologies, and cross functional collaboration across engineering, informatics and data science teams. 
Product development experience supporting LLM pipelines or retrieval-augmented generation workflows using structured and unstructured healthcare data Proven ability to bridge business objectives and platform capabilities in environments requiring data standardization and semantic normalization. Preferred Knowledge, Skills, and/or Abilities: Excellent organizational and communication skills with an emphasis on problem-solving and building subject matter expertise. Intermediate understanding of EHR data integration and applicable data standards, including FHIR, QRDA, CCDA, SNOMED CT, LOINC, ICD-10, CPT and RxNorm Experience applying project management principles and techniques with an eye towards execution, including skills in leading and managing change within the team and the broader initiative. Knowledge of healthcare administrative and clinical data sets, including demographics, financials, encounters, labs, diagnoses, and medications. Familiarity with software development environments, version control systems and basic coding or scripting languages to better communicate with development teams and participate in technical discussions. Basic understanding of clinical workflows across inpatient, ambulatory, and ancillary care settings, including how data is captured and used in EHR systems. Experience working with EHR, practice management, revenue cycle tools, or population health platforms to support clinical or operational use cases. Experience with clinical datasets to ensure accurate patient record linkage, data integration, and interoperability between clinical, administrative and claims data sources. Experience applying NLP and/or named entity extraction methods for extracting structured clinical insights from free-text or unstructured data (e.g., clinical notes, CCDAs, scanned documents, and/or images). Experience working with clinical or similar data pipelines including ingestion, normalization, and mapping to standardized terminologies and schemas.
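"Mapping to standardized terminologies" of the kind listed above (LOINC, SNOMED CT, RxNorm) usually means a crosswalk from a source system's local codes. A minimal, illustrative sketch follows; the local codes are invented, and a production mapping would come from a curated, versioned crosswalk rather than a hard-coded dict.

```python
# Hypothetical terminology-normalization step: map local lab codes to LOINC.
# Local codes ("GLU", "HGB") are invented for illustration.
LOCAL_TO_LOINC = {
    "GLU": "2345-7",  # Glucose [Mass/volume] in Serum or Plasma
    "HGB": "718-7",   # Hemoglobin [Mass/volume] in Blood
}

def normalize_lab(observation):
    """Attach a standard code when the local code is in the crosswalk."""
    out = dict(observation)
    loinc = LOCAL_TO_LOINC.get(observation.get("local_code"))
    out["loinc_code"] = loinc          # None signals an unmapped code
    out["mapped"] = loinc is not None
    return out

print(normalize_lab({"local_code": "GLU", "value": 98}))
print(normalize_lab({"local_code": "XYZ", "value": 1}))
```

Tracking the unmapped residue (the `mapped: False` rows) is what drives the data completeness metrics the role is asked to improve.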
Proficient in SQL, with experience querying large healthcare datasets in PostgreSQL or similar environments. Physical Requirements: Sitting for prolonged periods of time. Extensive use of computers and keyboard. Occasional walking and lifting may be required. Who We Are: Aledade, a public benefit corporation, exists to empower the most transformational part of our health care landscape - independent primary care. We were founded in 2014, and since then, we've become the largest network of independent primary care in the country - helping practices, health centers and clinics deliver better care to their patients and thrive in value-based care. Additionally, by creating value-based contracts across a wide variety of health plans, we aim to flip the script on the traditional fee-for-service model. Our work strengthens continuity of care, aligns incentives and ensures primary care physicians are paid for what they do best - keeping patients healthy. If you want to help create a health care system that is good for patients, good for practices and good for society - and if you're eager to join a collaborative, inclusive and remote-first culture - you've come to the right place. What Does This Mean for You? At Aledade, you will be part of a creative culture that is driven by a passion for tackling complex issues with respect, open-mindedness and a desire to learn. You will collaborate with team members who bring a wide range of experiences, interests, backgrounds, beliefs and achievements to their work - and who are all united by a shared passion for public health and a commitment to the Aledade mission.
In addition to time off to support work-life balance and enjoyment, we offer the following comprehensive benefits package designed for the overall well-being of our team members: Flexible work schedules and the ability to work remotely are available for many roles Health, dental and vision insurance paid up to 80% for employees, dependents and domestic partners Robust time-off plan (21 days of PTO in your first year) Two paid volunteer days and 11 paid holidays 12 weeks paid parental leave for all new parents Six weeks paid sabbatical after six years of service Educational Assistant Program and Clinical Employee Reimbursement Program 401(k) with up to 4% match Stock options And much more! At Aledade, we don't just accept differences, we celebrate them! We strive to attract, develop and retain highly qualified individuals representing the diverse communities where we live and work. Aledade is committed to creating a diverse environment and is proud to be an equal opportunity employer. Employment policies and decisions at Aledade are based on merit, qualifications, performance and business needs. All qualified candidates will receive consideration for employment without regard to age, race, color, national origin, gender (including pregnancy, childbirth or medical conditions related to pregnancy or childbirth), gender identity or expression, religion, physical or mental disability, medical condition, legally protected genetic information, marital status, veteran status, or sexual orientation. Privacy Policy: By applying for this job, you agree to Aledade's Applicant Privacy Policy available at https://www.aledade.com/privacy-policy-applicants

Posted 30+ days ago

Plaid logo
PlaidSan Francisco, CA

$180,000 - $270,000 / year

We believe that the way people interact with their finances will drastically improve in the next few years. We’re dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products. Plaid powers the tools millions of people rely on to live a healthier financial life. We work with thousands of companies like Venmo, SoFi, several of the Fortune 500, and many of the largest banks to make it easy for people to connect their financial accounts to the apps and services they want to use. Plaid’s network covers 12,000 financial institutions across the US, Canada, UK and Europe. Founded in 2013, the company is headquartered in San Francisco with offices in New York, Washington D.C., London and Amsterdam. #LI-Hybrid The main goal of the DE team in 2024-25 is to build robust golden data sets to power our business goals of creating more insights based products. Making data-driven decisions is key to Plaid's culture. To support that, we need to scale our data systems while maintaining correct and complete data. We provide tooling and guidance to teams across engineering, product, and business and help them explore our data quickly and safely to get the data insights they need, which ultimately helps Plaid serve our customers more effectively. Data Engineers heavily leverage SQL and Python to build data workflows. We use tools like DBT, Airflow, Redshift, ElasticSearch, Atlanta, and Retool to orchestrate data pipelines and define workflows. We work with engineers, product managers, business intelligence, data analysts, and many other teams to build Plaid's data strategy and a data-first mindset. Our engineering culture is IC-driven -- we favor bottom-up ideation and empowerment of our incredibly talented team. We are looking for engineers who are motivated by creating impact for our consumers and customers, growing together as a team, shipping the MVP, and leaving things better than we found them. 
You will be in a high-impact role that will directly enable business leaders to make faster and more informed business judgments based on the datasets you build. You will have the opportunity to carve out the ownership and scope of internal datasets and visualizations across Plaid, which is a currently unowned area that we intend to take over and build SLAs on. You will have the opportunity to learn best practices and up-level your technical skills from our strong DE team and from the broader Data Platform team. You will collaborate closely and build strong cross-functional partnerships with teams across Plaid, from Engineering to Product to Marketing/Finance. Responsibilities Understanding different aspects of the Plaid product and strategy to inform golden dataset choices, design and data usage principles. Having data quality and performance top of mind while designing datasets. Leading key data engineering projects that drive collaboration across the company. Advocating for adopting industry tools and practices at the right time. Owning core SQL and Python data pipelines that power our data lake and data warehouse. Delivering well-documented data with defined dataset quality, uptime, and usefulness. Qualifications 4+ years of dedicated data engineering experience, solving complex data pipeline issues at scale. You have experience building data models and data pipelines on top of large datasets (on the order of 500TB to petabytes). You value SQL as a flexible and extensible tool, and are comfortable with modern SQL data orchestration tools like DBT, Mode, and Airflow. You have experience working with different performant warehouses and data lakes; Redshift, Snowflake, Databricks. You have experience building and maintaining batch and realtime pipelines using technologies like Spark, Kafka. You appreciate the importance of schema design, and can evolve an analytics schema on top of unstructured data. You are excited to try out new technologies.
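The orchestration tools this posting names (DBT, Airflow) share one core idea: tasks declare their upstream dependencies, and the scheduler runs them in topological order. A minimal stdlib sketch of that idea, with invented task names:

```python
# Sketch of dependency-ordered execution, the concept behind DBT/Airflow DAGs.
# Task names are hypothetical; graphlib is in the standard library (3.9+).
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract_transactions": set(),
    "clean_transactions": {"extract_transactions"},
    "build_golden_dataset": {"clean_transactions"},
    "refresh_dashboard": {"build_golden_dataset"},
}

# static_order() yields tasks only after all their dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and backfills on top, but dataset lineage reduces to exactly this graph.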
You like to produce proof-of-concepts that balance technical advancement and user experience and adoption. You like to get deep in the weeds to manage, deploy, and improve low level data infrastructure. You are empathetic working with stakeholders. You listen to them, ask the right questions, and collaboratively come up with the best solutions for their needs while balancing infra and business needs. You are a champion for data privacy and integrity, and always act in the best interest of consumers. The target base salary for this position ranges from $180,000/year to $270,000/year in Zone 1. The target base salary will vary based on the job's location. Our geographic zones are as follows: Zone 1 - New York City and San Francisco Bay Area Zone 2 - Los Angeles, Seattle, Washington D.C. Zone 3 - Austin, Boston, Denver, Houston, Portland, Sacramento, San Diego Zone 4 - Raleigh-Durham and all other US cities Additional compensation in the form(s) of equity and/or commission are dependent on the position offered. Plaid provides a comprehensive benefit plan, including medical, dental, vision, and 401(k). Pay is based on factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience and skillset, and location. Pay and benefits are subject to change at any time, consistent with the terms of any applicable compensation or benefit plans. Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable. We recognize that strong qualifications can come from both prior work experiences and lived experiences. We encourage you to apply to a role even if your experience doesn't fully match the job description. We are always looking for team members that will bring something unique to Plaid! Plaid is proud to be an equal opportunity employer and values diversity at our company. 
We do not discriminate based on race, color, national origin, ethnicity, religion or religious belief, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, military or veteran status, disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local laws. Plaid is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance with your application or interviews due to a disability, please let us know at accommodations@plaid.com Please review our Candidate Privacy Notice here .

Posted 30+ days ago

T logo
Tech9Lehi, UT
Senior Data Engineer (Oracle FDI / Data Warehouse) About Us Tech9 is shaking up a 20-year-old industry, and we're not slowing down. Recognized by Inc. 5000 as one of the nation's fastest-growing companies, we are dedicated to building innovative, high-quality software and data solutions. Our team is passionate about craftsmanship, collaboration, and delivering technology that makes an impact. We offer a 100% remote working environment with a supportive culture that empowers you to do your best work. Role Overview We are seeking a Senior Data Engineer with deep expertise in Oracle and Fusion Data Intelligence (FDI) to support the Elevate Finance team as they enter the data warehouse build phase. The project has completed discovery and requirements gathering—now it needs a highly technical, hands-on engineer to design and build scalable pipelines and warehouse components within Oracle’s ecosystem. You will partner closely with the existing data engineer and project leadership to translate business requirements into data models, reporting objects, and production-ready data flows. This role requires someone who can operate independently, ramp quickly, and deliver high-quality engineering work within a fast-moving finance data environment. Responsibilities Data Pipeline Development : Design, build, and maintain data pipelines and warehouse components within the Oracle ecosystem. Oracle FDI Expertise : Develop, support, and optimize reporting in Oracle Fusion Data Intelligence , including report creation when needed. Warehouse Build-Out : Enable the next phase of the Elevate Finance data warehouse, ensuring scalability, accuracy, and alignment with business needs. Collaboration & Requirements Translation : Work closely with the Elevate data engineer and project leadership to translate requirements into data models and deliverables. Best Practices : Contribute to standards for data quality, architecture, performance, and long-term maintainability. 
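One warehouse pattern a build-out like this typically involves is the Type 2 slowly changing dimension: changed attribute values expire the old row and append a new current version. The sketch below is generic, plain-Python illustration; in an Oracle FDI build the equivalent logic would live in the pipeline/merge layer, and the column names are invented.

```python
# Generic SCD Type 2 merge sketch (hypothetical columns, not Oracle-specific).
def scd2_merge(dimension, incoming, key, tracked):
    """Expire changed rows and append new current versions."""
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old and all(old[c] == rec[c] for c in tracked):
            continue                       # no tracked attribute changed
        if old:
            old["is_current"] = False      # expire the previous version
        dimension.append({**rec, "is_current": True})
    return dimension

dim = [{"account_id": "A1", "cost_center": "CC-10", "is_current": True}]
scd2_merge(dim, [{"account_id": "A1", "cost_center": "CC-20"}],
           "account_id", ["cost_center"])
print([(r["cost_center"], r["is_current"]) for r in dim])
```

A production version would also stamp effective-from/to dates on each row so point-in-time finance reporting stays reproducible.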
Problem Solving : Troubleshoot issues, optimize data flows, and ensure accuracy across financial datasets. Minimum Qualifications Strong expertise with Oracle FDI (Fusion Data Intelligence) — core requirement. Experience developing or enabling reports in Oracle FDI — major plus and highly preferred. Senior-level data engineering experience , with a proven ability to design and build pipelines, not just perform analysis or requirements gathering. Ability to operate independently, learn fast, and contribute meaningfully from day one. Strong communication and collaboration skills to work with cross-functional teams. Preferred Qualifications Experience with data architecture , including conceptual, logical, or physical modeling. Background working in finance or ERP-related data environments. Familiarity with data quality frameworks or governance practices. Why Join Us? High Impact : Play a key role in building a finance data warehouse from the ground up. Flexibility : 100% remote role based in LATAM with flexible hours. Collaboration : Work with a talented engineering team and directly influence architectural decisions. Growth : Deepen expertise in Oracle’s modern data ecosystem while contributing to a strategic enterprise initiative. Interview Process Our interview process is designed to move efficiently while ensuring transparency and alignment: 1. Introductory Call – 15 minutes - A brief conversation with our recruiting team to discuss your background, data engineering experience, and alignment with the role. 2. On-Demand HireVue Screening –15–30 minutes- Behavioral and situational questions focused on ownership, communication, data problem-solving, and ability to work independently. 3. Internal Technical Interview #1 – 1 hour- A deep technical discussion on Oracle FDI , Oracle-based data engineering, data modeling, and end-to-end pipeline design. 4. 
Internal Technical Interview #2 – 1 hour- Scenario-based evaluation focused on warehouse build-out, scalability, optimization, and navigating ambiguity in finance data environments. 5. Hiring Manager Interview – 30 minutes- A conversation about your experience, collaboration style, and ability to drive engineering work with limited oversight. 6. Client Interview – 1 hour- A final discussion with client stakeholders to validate technical alignment, communication style, and ability to deliver within a finance data warehouse setting. To ensure you've received our notifications, please whitelist the domains jazz.co, jazz.com, and applytojob.com Powered by JazzHR

Posted 1 week ago

K logo
KDA Consulting IncBethesda, MD
We are seeking a Senior Data Scientist/Architect to support our customer onsite in Bethesda, MD. This position will work directly with a Chief Data & Artificial Intelligence Officer (CDAO) to support a fast-paced, dynamic environment and work closely with a government customer and IT developers to validate the customer's data and data analytics requirements. Primary Job Duties Establish data environments, and serve as the data visualization subject matter expert. Create and translate trend, time-series, and predictive analyses, and facilitate stakeholder decision-making technical exchange meetings. Participate in hands-on data science work for data modeling, design, model training, and implementation to meet enterprise requirements. Establishing the system environment for large data sets, considering aspects such as data capture, storage, extraction, process and analysis. Using data visualization tools to create roadmaps, placemats and process flows; and creatively displaying data to facilitate customer decision-making processes. Applying data wrangling and exploration techniques. Manipulating data to gather insights from large structured and unstructured data sets and developing trend analyses. Developing algorithms to facilitate data analysis, support identification of data errors, and developing pattern analysis from multiple disparate data sources. Providing input on software implementation for analytics. Validating customer requirements and determining feasibility of solutions and alternative options. Contributing to and collaborating on executive-level briefings for the CDAO. Provide excellent organizational, analytical, and data science abilities; compile and organize statistical information retrieved and present findings to management. Strong math, coding and analytical skills are essential to complete job requirements successfully. Qualifications Current TS/SCI with Poly is required.
Bachelor’s degree in Computer Science, Computer Engineering, or relevant STEM field. 10+ years of total professional experience in a technical setting, with at least 5 of those years as a Data Scientist or in a related role. Ability to implement effective data solutions to ingest, curate, store and retrieve agency mission, business and IC data. Ability to establish a system environment for large data sets, considering aspects such as data capture, storage, extraction, process and analysis. Experience using data visualization tools to create roadmaps, placemats and process flows; and the ability to creatively display data to facilitate customer decision-making processes. Ability to manipulate data from disparate structured and unstructured sources to develop models and other detailed analysis that provide meaningful data insights. Experience with programming languages like Python, Java, and SQL, and cloud technologies. Ability to work in a dynamic and fast-paced mission environment which requires balancing competing priorities and schedules. Desired Skills: Have a depth of experience in data science, to include demonstrated experience with programming languages (e.g. Python, SQL), statistical programs (e.g. R and MS Excel), and data visualization tools (e.g. Tableau, Power BI, QlikView). Data scientist certification from AWS, Google, Microsoft, Oracle, etc. Demonstrated experience with data modeling tools such as ER/Studio, Erwin, or Lucid Chart. Deep understanding of machine learning and artificial intelligence for logical applications and techniques. Have experience working in Agile and DevOps environments.
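The trend and time-series analysis duties above often start with something as simple as a trailing moving average to smooth noisy counts. A stdlib-only illustration (the data is invented):

```python
# Minimal trend-smoothing sketch: trailing moving average over a time series.
def moving_average(series, window):
    """Trailing moving average; returns one value per full window."""
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

# Hypothetical monthly record counts from some data source.
monthly_counts = [10, 12, 11, 15, 18, 17]
print(moving_average(monthly_counts, 3))
```

In practice this would be a first pass before the predictive modeling the role describes; visualization tools like Tableau or Power BI compute the same rolling statistics natively.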

Posted 30+ days ago

Axiom Software Solutions Limited logo
Axiom Software Solutions LimitedAtlanta, GA
Job Description: We are seeking a highly skilled and experienced Senior Data Engineer to join our team. The ideal candidate will have a strong background in data engineering, with a specialization in Matillion, SSIS, Azure DevOps, and ETL processes. This role will involve designing, developing, testing, and deploying ETL jobs, collaborating with cross-functional teams, and ensuring efficient data processing. Key Responsibilities: Design, develop, test, and deploy Matillion ETL jobs in accordance with project requirements. Collaborate with the Data and BI team to understand data integration needs and translate them into Matillion ETL solutions. Create and modify Python code/components in Matillion jobs. Identify opportunities for performance optimization and implement enhancements to ensure efficient data processing. Collaborate with cross-functional teams, including database administrators, data engineers, and business analysts, to ensure seamless integration of ETL processes. Create and maintain comprehensive documentation for Matillion ETL jobs, ensuring knowledge transfer within the team. Create, test, and deploy SQL Server Integration Service (SSIS) packages and schedule them via Active Batch scheduling tool. Create Matillion deployment builds using Azure DevOps CI/CD pipeline and perform release manager activities. Review code of other developers (L2, L3-BI/DI) to ensure code standards and provide approval as part of code review activities. Resolve escalation tickets from the L2 team as part of the on-call schedule. Working knowledge of API and Postman tool is an added advantage. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in data engineering, with a focus on ETL processes. Proficiency in Matillion, SSIS, Azure DevOps, and ETL. Strong knowledge of SQL, Python, and data integration techniques. Experience with performance optimization and data processing enhancements. 
Excellent collaboration and communication skills. Ability to work in a fast-paced, dynamic environment. Preferred Skills: Experience with cloud platforms such as AWS or Azure. Knowledge of data warehousing and data modeling. Familiarity with DevOps practices and CI/CD pipelines. Strong problem-solving skills and attention to detail.
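The posting mentions creating and modifying "Python code/components in Matillion jobs." Matillion's Python Script component runs ordinary Python with access to job variables; the sketch below simulates that context with a plain dict so the logic is runnable anywhere. The variable names are hypothetical.

```python
# Sketch of a small transformation one might place in a Matillion Python
# Script component. Job variables are simulated with a dict here; inside
# Matillion they would come from the job's variable context.
def derive_load_batch(variables):
    """Derive a batch label from job variables (names are hypothetical)."""
    env = variables.get("environment", "dev")
    run_date = variables["run_date"]          # e.g. set by the scheduler
    return f"{env}_{run_date.replace('-', '')}"

context = {"environment": "prod", "run_date": "2024-06-30"}
print(derive_load_batch(context))  # prod_20240630
```

Keeping such components as small pure functions also makes them reviewable in the Azure DevOps code-review workflow the role describes.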

Posted 30+ days ago

Care It Services logo
Care It ServicesDallas, Texas

$50 - $60 / hour

Benefits: Company parties Competitive salary Dental insurance Free food & snacks Work location: Dallas, TX / Atlanta, GA End client: IBM/DTV Completely on-site position Job Description We are looking for an experienced Databricks Subject Matter Expert (SME) with expertise in Data Profiling and Data Modeling to join our growing team. In this role, you will be responsible for leveraging Databricks to drive end-to-end data solutions, ensuring data quality, and optimizing data pipelines for performance and scalability. You will also play a pivotal role in designing, implementing, and maintaining data models that align with business requirements and industry best practices. The ideal candidate should have deep experience with the Databricks ecosystem, including Spark, Delta Lake, and other cloud-based data technologies, combined with a strong understanding of data profiling and data modeling concepts. You will collaborate closely with data engineers, data scientists, and business analysts to ensure data integrity, accuracy, and optimal architecture. Skills and Qualifications: Technical Skills: Databricks (including Spark, Delta Lake, and other relevant components) 8+ years of hands-on experience with Databricks or related technologies. Strong expertise in Data Profiling tools and techniques. Experience in Data Profiling and Data Quality management. Experience in Data Modeling, including working with dimensional models for analytics. Advanced knowledge of SQL, PySpark, and other scripting languages used within Databricks. Experience with Data Modeling (e.g., relational, dimensional, star schema, snowflake schema). Hands-on experience with ETL/ELT processes, data integration, and data pipeline optimization. Familiarity with cloud platforms (AWS, Azure, Google Cloud) and cloud data storage technologies. Proficiency in Python, Scala, or other programming languages commonly used in Databricks.
- Experience in Data Governance and Data Quality practices.
- Familiarity with Machine Learning workflows within Databricks is a plus.

Thank you
venkatesh@careits.com

Compensation: $50.00 - $60.00 per hour

Who We Are
CARE ITS is a certified woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. Care ITS was established in 2010. Since then we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally with our headquarters in Plainsboro, NJ, and focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.

Posted 3 weeks ago

Q logo
QCHI / LendNation Open Career, Lenexa, Kansas
QCHI / LendNation is looking for an experienced Data Analyst / Data Scientist to join our Marketing team, someone who can independently drive high-impact analysis and shape strategic decisions. This role is not entry-level. The ideal candidate brings proven experience in online lending analytics, fraud detection, and credit risk performance management, preferably in subprime or alternative financial services. This role owns analytic projects end-to-end: defining objectives, extracting and validating data, performing analysis, presenting findings, and ensuring recommendations are adopted. The individual will partner with underwriting, marketing, product, and external vendors to quantify risk, identify growth opportunities, and improve portfolio performance. Success in this role means translating complex analysis into clear, persuasive recommendations that directly influence strategy, decisioning, and financial outcomes.

The right candidate will be:
- Experienced in credit and fraud analytics, not just reporting
- Able to challenge assumptions and propose analytical solutions
- Self-directed, accountable, and comfortable prioritizing multiple workstreams

REQUIRED SKILLS and EXPERIENCE:
- 5+ years analytics experience in online lending, fintech, consumer credit, fraud, or a related industry
- 5+ years hands-on experience using R or Python for data extraction, modeling, automation, and analysis
- Demonstrated expertise with:
  - Credit risk analysis, segmentation, and underwriting decision support
  - Fraud detection analytics (velocity checks, behavioral/identity risk indicators)
  - Portfolio performance measurement and optimization
  - Online marketing analytics and customer lifecycle performance
- Strong SQL skills, including SAS
- Ability to communicate insights visually and verbally to both technical and non-technical audiences (Power BI, Tableau, etc.)
- Proven ability to drive outcomes and influence leadership through data-based recommendations

Posted 3 weeks ago

Sprinter Health logo
Sprinter Health, San Francisco, CA
About Sprinter Health
At Sprinter Health, our mission is reimagining how people access care by bringing it directly to their homes. Nearly 30% of patients in the U.S. skip preventive or chronic care simply because they can’t get to a doctor’s office. For many, the ER becomes their first touchpoint with the healthcare system—driving over $300B in avoidable costs every year. By using the same technologies that power leading marketplace and last-mile platforms, we deliver care where people are, especially those who need it most. So far, we’ve supported more than 2 million patients across 22 states, completed 130,000+ in-home visits, and maintained a 92 NPS. Our team of clinicians, technologists, and operators has raised over $125M to date from investors like a16z, General Catalyst, GV, and Accel, and enjoys a multi-year runway. Data is core to how we deliver care, allocate clinicians, and understand population risk—and we’re now building the team to scale that impact.

About the Role
We’re hiring a Senior / Staff Data Scientist to tackle high-impact problems across logistics, patient behavior, clinical operations, and population health. This is a 0→1 role where your work directly shapes how care is delivered, optimized, and scaled. You’ll turn messy, real-world healthcare data into insights that drive product direction, operational efficiency, and patient outcomes. If you like asking hard questions and using data to solve problems that matter in the real world, this is that opportunity.

Office Location
We are a hybrid company based in the Bay Area with offices in both San Francisco and Menlo Park. We care about work-life balance and understand that there will be times where flexibility is needed.
What you will do
- Explore clinical, operational, and patient engagement data to surface actionable insights
- Analyze routing, capacity planning, and supply-demand dynamics across regions
- Identify drivers of preventive care engagement and patient follow-through
- Predict cancellations, risk, and future care needs using behavioral and clinical signals
- Design and evaluate A/B experiments across product, growth, and operations
- Support internal teams with dashboards, analytics, and ad hoc decisions
- Help shape the Data function’s standards, tooling, and processes as an early leader

What you have done
- 5+ years in data science, advanced analytics, or applied data modeling
- Worked with large, messy datasets to answer ambiguous, high-stakes questions
- Built dashboards or visualizations using Looker, Tableau, Grafana, Superset, or similar
- Written high-quality SQL across warehouses like BigQuery, Snowflake, or Redshift
- Influenced product, ops, or strategy decisions through data insight and storytelling
- Presented findings to leadership and collaborated with cross-functional teams

What gives you an edge
- Experience with Python, Pandas, Scikit-learn, or ELK (Elasticsearch + Kibana)
- Exposure to GCP data tools like BigQuery, DataFlow, or DataForm
- Familiarity with healthcare data (claims, HL7, CCDAs, HIEs)
- Experience with logistics, routing, or operational optimization problems
- Designed or analyzed growth experiments or user behavior funnels
- Communicated insights to C-level stakeholders or mentored junior talent

Our Tech Stack
Python; SQL; Superset, Looker, Tableau, Grafana; BigQuery, Snowflake, Redshift; Elasticsearch / Kibana; FHIR, HL7, claims data (bonus); Pandas, Scikit-learn, DataForm, DataFlow

What we offer
- Meaningful pre-IPO equity
- Medical, dental, and vision plans 100% paid for you and your dependents
- Flexible PTO + 10 paid holidays per year
- 401(k) with match
- 16-week parental leave policy for the birthing parent, 8 weeks for all other parents
- HSA + FSA contributions
- Life insurance, plus short and long-term disability coverage
- Free daily lunch in-office
- Annual learning stipend

Sprinter Health is an equal opportunity employer. We value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status or other protected classes. Beware of recruitment fraud and scams that involve fictitious job descriptions followed by false job offers. If you are applying for a job, you can confirm the legitimacy of a job posting by viewing current open roles here. All legitimate job postings will require an application to be made directly on our official Sprinter Health Careers website. Job-related communications will only be sent from email addresses ending in @sprinterhealth.com. Please ensure that you’re only replying to emails that end with @sprinterhealth.com.

Posted 3 weeks ago

C logo
Castleton Commodities International LLC, Houston, TX
Castleton Commodities International (CCI) is seeking a Senior Data Operations Analyst to help support the day-to-day management, data quality, and continuous enhancement of the firm's enterprise Master Data Management platform, TIBCO EBX. You will work with an offshore support team, business data owners, and technology partners to make certain that master data is complete, trusted, and readily available to trading, risk, finance, and analytics applications. Your remit spans monitoring platform health, troubleshooting data issues, coordinating the ServiceNow work queue, contributing to data model designs, and validating releases delivered through a structured SDLC (Jira).

Responsibilities:

Platform & Data Operations
- Act as a senior SME and L2/L3 support resource for TIBCO EBX, collaborating with an offshore team to ensure appropriate coverage.
- Enter and maintain master/reference data in EBX, enforcing stewardship workflows and governance rules.
- Monitor application jobs, security, and integrations; escalate issues and document resolution steps.
- Manage the ServiceNow work queue: triage, prioritize, assign, and track incidents, enhancements, and service requests against defined SLAs.

Data Design & Quality
- Partner with data owners to design and extend EBX data models, hierarchies, validation rules, and stewardship workflows.
- Investigate and resolve data errors surfaced by downstream systems or data quality rules; perform root-cause analysis and propose sustainable fixes.
- Develop data quality dashboards (Omni/Power BI) to track KPIs such as completeness, duplication, and timeliness.

Release & Change Management
- Coordinate with engineering to test EBX configuration changes, code deployments, and version upgrades.
- Author and execute regression and user-acceptance test (UAT) scripts; validate mappings between EBX and consuming systems (REST/SOAP, SQL, Kafka, etc.).
- Champion change-control best practices, ensuring all stories and tasks are effectively managed in Jira from requirements through deployment.

Continuous Improvement & Collaboration
- Analyze recurring data defects to recommend automation or rule enhancements that reduce manual touch points.
- Deliver training and knowledge-transfer sessions for end-users and offshore analysts on EBX workflows and best practices.
- Support audit, compliance, and SOX requests related to MDM operational controls.

Qualifications:
- 5+ years of hands-on experience operating or supporting a commercial MDM platform (TIBCO EBX preferred; Informatica, Reltio, SAP MDG, etc. are acceptable).
- Solid grasp of core MDM concepts: golden-record management, hierarchy/versioning, data quality rules, stewardship workflows, matching/merging, and reference-data integration.
- Demonstrated experience managing or supporting operational queues in ServiceNow and project backlogs in Jira.
- Proficiency in SQL and one scripting language (Java, Python, or similar) for data investigation and automation.
- Proven record of partnering with offshore or managed-service teams, including defining SLAs and run-books.
- Strong analytical and problem-solving skills; ability to translate data symptoms into root cause across complex data flows.
- Excellent written and verbal communication skills; comfortable interfacing with both technical teams and front-office stakeholders.
- Must be able to work effectively in a fast-paced, dynamic, and high-intensity environment, including an open-floor plan if applicable to the position, with timely responsiveness and the ability to work beyond normal business hours when required.

Preferred Qualifications:
- Prior exposure to energy-trading or commodity-trading reference data.
- Experience configuring EBX data models, workflows, validation rules, and user roles.
- Familiarity with data-catalog/governance tooling (Collibra, Alation, Atlan) and their integration with MDM.
- Knowledge of API integrations (REST/SOAP), message queues (Kafka), and cloud data platforms (Azure Synapse, Amazon Redshift, Databricks).
- ITIL or similar service-management certification.

Employee Programs & Benefits:
CCI offers competitive benefits and programs to support our employees, their families, and local communities. These include:
- Competitive comprehensive medical, dental, retirement and life insurance benefits
- Employee assistance & wellness programs
- Parental and family leave policies
- CCI in the Community: Each office has a Charity Committee, and as part of this program employees are allocated 2 days annually to volunteer at the selected charities.
- Charitable contribution match program
- Tuition assistance & reimbursement
- Quarterly Innovation & Collaboration Awards
- Employee discount program, including access to fitness facilities
- Competitive paid time off
- Continued learning opportunities

Visit https://www.cci.com/careers/life-at-cci/ to learn more! #LI-CD1

Posted 4 weeks ago

P logo
Plaid Inc., San Francisco, CA

$180,000 - $270,000 / year

We believe that the way people interact with their finances will drastically improve in the next few years. We're dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products. Plaid powers the tools millions of people rely on to live a healthier financial life. We work with thousands of companies like Venmo, SoFi, several of the Fortune 500, and many of the largest banks to make it easy for people to connect their financial accounts to the apps and services they want to use. Plaid's network covers 12,000 financial institutions across the US, Canada, UK, and Europe. Founded in 2013, the company is headquartered in San Francisco with offices in New York, Washington D.C., London, and Amsterdam. #LI-Hybrid

The main goal of the DE team in 2024-25 is to build robust golden datasets to power our business goal of creating more insights-based products. Making data-driven decisions is key to Plaid's culture. To support that, we need to scale our data systems while maintaining correct and complete data. We provide tooling and guidance to teams across engineering, product, and business and help them explore our data quickly and safely to get the data insights they need, which ultimately helps Plaid serve our customers more effectively. Data Engineers heavily leverage SQL and Python to build data workflows. We use tools like DBT, Airflow, Redshift, Elasticsearch, Atlan, and Retool to orchestrate data pipelines and define workflows. We work with engineers, product managers, business intelligence, data analysts, and many other teams to build Plaid's data strategy and a data-first mindset. Our engineering culture is IC-driven: we favor bottom-up ideation and empowerment of our incredibly talented team. We are looking for engineers who are motivated by creating impact for our consumers and customers, growing together as a team, shipping the MVP, and leaving things better than we found them.
You will be in a high-impact role that directly enables business leaders to make faster and more informed business judgements based on the datasets you build. You will have the opportunity to carve out the ownership and scope of internal datasets and visualizations across Plaid, a currently unowned area that we intend to take over and build SLAs on. You will have the opportunity to learn best practices and up-level your technical skills from our strong DE team and from the broader Data Platform team. You will collaborate with and build strong cross-functional partnerships with teams across Plaid, from Engineering to Product to Marketing and Finance.

Responsibilities
- Understanding different aspects of the Plaid product and strategy to inform golden dataset choices, design, and data usage principles
- Keeping data quality and performance top of mind while designing datasets
- Leading key data engineering projects that drive collaboration across the company
- Advocating for adopting industry tools and practices at the right time
- Owning core SQL and Python data pipelines that power our data lake and data warehouse
- Delivering well-documented data with defined dataset quality, uptime, and usefulness

Qualifications
- 4+ years of dedicated data engineering experience, solving complex data pipeline issues at scale
- You have experience building data models and data pipelines on top of large datasets (on the order of 500 TB to petabytes)
- You value SQL as a flexible and extensible tool, and are comfortable with modern SQL data orchestration tools like DBT, Mode, and Airflow
- You have experience working with different performant warehouses and data lakes: Redshift, Snowflake, Databricks
- You have experience building and maintaining batch and realtime pipelines using technologies like Spark and Kafka
- You appreciate the importance of schema design, and can evolve an analytics schema on top of unstructured data
- You are excited to try out new technologies
You like to produce proof-of-concepts that balance technical advancement and user experience and adoption. You like to get deep in the weeds to manage, deploy, and improve low level data infrastructure. You are empathetic working with stakeholders. You listen to them, ask the right questions, and collaboratively come up with the best solutions for their needs while balancing infra and business needs. You are a champion for data privacy and integrity, and always act in the best interest of consumers. $180,000 - $270,000 a year The target base salary for this position ranges from $180,000/year to $270,000/year in Zone 1. The target base salary will vary based on the job's location. Our geographic zones are as follows: Zone 1 - New York City and San Francisco Bay Area Zone 2 - Los Angeles, Seattle, Washington D.C. Zone 3 - Austin, Boston, Denver, Houston, Portland, Sacramento, San Diego Zone 4 - Raleigh-Durham and all other US cities Additional compensation in the form(s) of equity and/or commission are dependent on the position offered. Plaid provides a comprehensive benefit plan, including medical, dental, vision, and 401(k). Pay is based on factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience and skillset, and location. Pay and benefits are subject to change at any time, consistent with the terms of any applicable compensation or benefit plans. Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable. We recognize that strong qualifications can come from both prior work experiences and lived experiences. We encourage you to apply to a role even if your experience doesn't fully match the job description. We are always looking for team members that will bring something unique to Plaid! 
Plaid is proud to be an equal opportunity employer and values diversity at our company. We do not discriminate based on race, color, national origin, ethnicity, religion or religious belief, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, military or veteran status, disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local laws. Plaid is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance with your application or interviews due to a disability, please let us know at [email protected] Please review our Candidate Privacy Notice here.

Posted 30+ days ago

Edelman logo
Edelman, Bogota, NJ
We are looking to expand our capabilities with support in global research, analytics, performance, and data consultancy. This role is ideal for someone early in their career who is eager to learn and develop skills in media and social monitoring and analysis. You will gain valuable experience in a dynamic, fast-paced, and highly international environment.

Experience
- 1 to 3 years of experience working in data collection, monitoring, reporting, and visualization in advertising/PR agencies or marketing departments
- Excellent verbal and written communication skills
- Strong numeracy and analytical skills
- Strong knowledge of the Microsoft Suite (Word, PowerPoint, Excel)
- Good organizational skills and ability to manage tasks effectively
- Strong degree of accuracy and attention to detail
- Understanding of North American business culture
- Knowledge of PR/communications and marketing
- Ability to work closely with international stakeholders and adapt to the demands of a global organization
- Ability to work independently and as part of a team
- Enjoys working in a dynamic & fast-paced environment
- English fluency (B2)

Key responsibilities

Data Collection and Management
- Support the collection of data from various media sources (social media, television, radio, print, digital advertising channels, etc.)
- Help ensure data accuracy and consistency within projects
- Learn and assist in creating basic search queries and taxonomies using Boolean language

Monitoring & Data Analysis
- Assist in setting up and maintaining alert systems according to project needs (such as taxonomies, frequency, etc.)
- Support the analysis of media performance metrics like visibility, reach, engagement, conversions, or ROI
- Participate in data analysis related to audience behavior, content preferences, and market trends specific to the Americas region
Reporting and Visualization
- Help prepare clear, actionable reports (daily, weekly, monthly, or quarterly) using visual storytelling
- Support the creation and maintenance of live dashboards within media monitoring tools
- Assist in customizing reports to meet regional requirements, including language and cultural considerations

If you checked off everything above, apply now and take on this challenge where you can grow professionally, connect with expert teams, receive training locally, regionally, and globally, and LIVE our TRUST culture (open, inclusive, and innovative).

Posted 4 weeks ago

CZ Biohub logo
CZ Biohub, Redwood City, CA

$241,000 - $331,100 / year

Biohub is leading the new era of AI-powered biology to cure or prevent disease through its 501(c)(3) medical research organization, with the support of the Chan Zuckerberg Initiative.

The Team
Biohub supports the science and technology that will make it possible to help scientists cure, prevent, or manage all diseases by the end of this century. While this may seem like an audacious goal, in the last 100 years biomedical science has made tremendous strides in understanding biological systems, advancing human health, and treating disease. Achieving our mission will only be possible if scientists are able to better understand human biology. To that end, we have identified four grand challenges that will unlock the mysteries of the cell and how cells interact within systems, paving the way for new discoveries that will change medicine in the decades that follow:
- Building an AI-based virtual cell model to predict and understand cellular behavior
- Developing novel imaging technologies to map, measure, and model complex biological systems
- Creating new tools for sensing and directly measuring inflammation within tissues in real time, to better understand inflammation, a key driver of many diseases
- Harnessing the immune system for early detection, prevention, and treatment of disease

The Opportunity
At Biohub, we are generating unprecedented scientific datasets that drive biological modeling innovation:
- Billions of standardized cells of single-cell transcriptomic data, with a focus on measuring genetic and environmental perturbations
- Tens of thousands of donor-matched DNA & RNA samples
- PB-scale static and dynamic imaging datasets
- TB-scale mass spectrometry datasets
- Diverse, large multi-modal biological datasets that enable biological bridges across measurement types and facilitate multi-modal model training to define how cells act
After model training, we make all data products available through public resources like CELLxGENE Discover and the CryoET Portal, used by tens of thousands of scientists monthly to advance understanding of genetic variants, disease risk, drug toxicities, and therapeutic discovery. As a Senior Staff Data Scientist, you'll lead the creation of groundbreaking imaging datasets that decode cellular function at the molecular level, describe development, and predict responses to genetic or environmental changes. Working at the intersection of data science, biology, and AI, you'll define data needs, format standards, analysis approaches, quality metrics, and our technical strategy, creating systems to ingest, transform, validate, and deploy data products. Success in this role means delivering high-quality, usable datasets that directly address modeling challenges and accelerate scientific progress. Join us in building the data foundation that will transform our understanding of human biology and move us along the path to curing, preventing, and managing all disease.

What You'll Do
- Define the technical strategy for a robust imaging data ecosystem: build data ingestion pipelines, define data formats, and write validation tools, QC metrics, and analysis pipelines.
- Collaborate with ML engineers, AI researchers, and data engineers to iteratively evaluate, refine, and grow datasets to maximize model performance.
- Discover and define new data generation opportunities, and manage the delivery of those data products to our AI team.
- Collaborate with engineers, product managers, UX designers, and other data scientists to publish valuable datasets as part of CZI's open data ecosystem.

What You'll Bring
- 10+ years of experience with large-scale biological imaging data.
- Demonstrated delivery of multiple large biological data products.
- Experience with big data: extraction, transport, loading, databases, standardization, validation, QC, and analysis.
- Experience with processing and orchestration pipelines, such as Argo Workflows and Databricks.
- Strong fundamentals in statistical reasoning and machine learning.
- Experience with biological data analysis and QC best practices.
- Excellent written and verbal communication skills.
- Enthusiasm to ramp up on technologies and learn new domains.
- Experience working in a multidisciplinary environment (engineering, product, AI Research).

Compensation
The Redwood City, CA base pay range for a new hire in this role is $241,000 - $331,100. New hires are typically hired into the lower portion of the range, enabling employee growth in the range over time. Actual placement in range is based on job-related skills and experience, as evaluated throughout the interview process.

Better Together
As we grow, we're excited to strengthen in-person connections and cultivate a collaborative, team-oriented environment. This role is a hybrid position requiring you to be onsite for at least 60% of the working month, approximately 3 days a week, with specific in-office days determined by the team's manager. The exact schedule will be at the hiring manager's discretion and communicated during the interview process.

Benefits for the Whole You
We're thankful to have an incredible team behind our work. To honor their commitment, we offer a wide range of benefits to support the people who make all we do possible:
- A generous employer match on employee 401(k) contributions to support planning for the future
- Paid time off to volunteer at an organization of your choice
- Funding for select family-forming benefits
- Relocation support for employees who need assistance moving

If you're interested in a role but your previous experience doesn't perfectly align with each qualification in the job description, we still encourage you to apply as you may be the perfect fit for this or another role. #LI-Hybrid

Posted 30+ days ago

Geico Insurance logo
Geico Insurance, Chevy Chase, MD

$88,150 - $157,850 / year

At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers' expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported and proud to work here. That's why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers. GEICO is looking for a customer-obsessed and results-oriented Product Manager to support our Data Ingestion and Movement platform. This role will help drive product direction for our data ingestion, ETL/ELT pipelines, and data movement services, focusing on enabling reliable data flow into our lakehouse and other data stores. The ideal candidate will have a technical background in data engineering and experience delivering scalable data platforms and data pipeline solutions. Description As a Product Manager for Data Ingestion and Movement, you will be responsible for supporting the product vision and execution for GEICO's data ingestion and movement products. To successfully shape a platform that enables pipeline-as-a-service and supports a scalable data mesh architecture, a strong technical understanding of data pipelines, data integration patterns, data orchestration, ETL/ELT processes, and platform engineering is essential. Your goal is to abstract complexity and empower domain teams to autonomously and efficiently build, deploy, and govern data pipelines. This role also requires stakeholder management skills and the ability to bridge technical solutions with business value. 
Key Responsibilities
- Support the development and execution of data ingestion and movement platform vision aligned with business goals and customer needs
- Help create and maintain a clear, prioritized roadmap for data ingestion and movement capabilities that balances short-term delivery with long-term strategic objectives
- Support evangelizing the Data Ingestion and Movement platform across the organization and help drive stakeholder alignment
- Stay abreast of industry trends and the competitive landscape (Apache Kafka, Apache Airflow, AWS Glue, Azure Data Factory, Google Cloud Dataflow, etc.) to inform data ingestion strategy
- Support requirement gathering and product strategy for data ingestion, ETL/ELT pipelines, and data movement services
- Understand end-to-end data ingestion workflows and how data movement fits into the broader data ecosystem and downstream analytics
- Support data governance initiatives for data lineage, quality, and compliance in data ingestion and movement processes
- Ensure data ingestion and movement processes adhere to regulatory, compliance, and data quality standards
- Partner with engineering on the development of data ingestion tools, pipeline orchestration services, and data movement capabilities
- Help define product capabilities for data ingestion, pipeline monitoring, error handling, and data quality validation to improve reliability and performance
- Support customer roadshows and training on data ingestion and movement capabilities
- Build instrumentation and observability into data ingestion and movement tools to enable data-driven product decisions and pipeline monitoring
- Work closely with engineering, data engineering, and data teams to ensure seamless delivery of data ingestion and movement products
- Partner with customer success, support, and engineering teams to create clear feedback loops
- Translate data ingestion and movement technical capabilities into business value and user benefits
- Support alignment across multiple stakeholders and teams in complex, ambiguous environments

Qualifications

Required
- Understanding of data ingestion patterns, ETL/ELT processes, and data pipeline architectures (Apache Kafka, Apache Airflow, Apache Spark, AWS Glue, etc.)
- Experience with data integration APIs, connectors, and data pipeline orchestration tools
- Basic understanding of data pipeline monitoring, observability, and data quality validation practices
- Experience in cloud data ecosystems (AWS, GCP, Azure)
- Proven analytical and problem-solving abilities with a data-driven approach to decision-making
- Experience working with Agile methodologies and tools (JIRA, Azure DevOps)
- Good communication, stakeholder management, and cross-functional collaboration skills
- Strong organizational skills with the ability to manage product backlogs

Preferred
- Previous experience as a software or data engineer is a plus
- Strong business acumen to prioritize features based on customer value and business impact
- Experience with data ingestion tools (Apache Kafka, Apache NiFi, AWS Kinesis, Azure Event Hubs, etc.)
- Knowledge of data lineage, data quality frameworks, and compliance requirements for data ingestion
- Insurance industry experience

Experience
- Minimum 5+ years of technical product management experience building platforms that support data ingestion, ETL/ELT pipelines, data engineering, and data infrastructure
- Track record of delivering successful products in fast-paced environments
- Experience supporting complex, multi-stakeholder initiatives
- Proven ability to work with technical teams and translate business requirements into technical product specifications
- Experience with customer research, user interviews, and data-driven decision making

Education
Bachelor's degree in computer science, engineering, management information systems, or a related technical field required. MBA/MS or equivalent experience preferred.

Annual Salary
$88,150.00 - $157,850.00
The above annual salary range is a general guideline.
Multiple factors are taken into consideration to arrive at the final hourly rate/ annual salary to be offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, the work location as well as market and business considerations. At this time, GEICO will not sponsor a new applicant for employment authorization for this position. The GEICO Pledge: Great Company: At GEICO, we help our customers through life's twists and turns. Our mission is to protect people when they need it most and we're constantly evolving to stay ahead of their needs. We're an iconic brand that thrives on innovation, exceeding our customers' expectations and enabling our collective success. From day one, you'll take on exciting challenges that help you grow and collaborate with dynamic teams who want to make a positive impact on people's lives. Great Careers: We offer a career where you can learn, grow, and thrive through personalized development programs, created with your career - and your potential - in mind. You'll have access to industry leading training, certification assistance, career mentorship and coaching with supportive leaders at all levels. Great Culture: We foster an inclusive culture of shared success, rooted in integrity, a bias for action and a winning mindset. Grounded by our core values, we have an an established culture of caring, inclusion, and belonging, that values different perspectives. Our teams are led by dynamic, multi-faceted teams led by supportive leaders, driven by performance excellence and unified under a shared purpose. As part of our culture, we also offer employee engagement and recognition programs that reward the positive impact our work makes on the lives of our customers. Great Rewards: We offer compensation and benefits built to enhance your physical well-being, mental and emotional health and financial future. 
Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family's overall well-being. Financial benefits including market-competitive compensation; a 401K savings plan vested from day one that offers a 6% match; performance and recognition-based incentives; and tuition assistance. Access to additional benefits like mental healthcare as well as fertility and adoption assistance. Supports flexibility- We provide workplace flexibility as well as our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year. The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled. GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.

Posted 30+ days ago

TP-Link CorpIrvine, CA
ABOUT US: Headquartered in the United States, TP-Link Systems Inc. is a global provider of reliable networking devices and smart home products, consistently ranked as the world's top provider of Wi-Fi devices. The company is committed to delivering innovative products that enhance people's lives through faster, more reliable connectivity. With a commitment to excellence, TP-Link serves customers in over 170 countries and continues to grow its global footprint.

We believe technology changes the world for the better! At TP-Link Systems Inc., we are committed to crafting dependable, high-performance products to connect users worldwide with the wonders of technology. Embracing professionalism, innovation, excellence, and simplicity, we aim to help our clients achieve remarkable global performance and enable consumers to enjoy a seamless, effortless lifestyle.

KEY RESPONSIBILITIES

  • Design and build scalable data pipelines: Develop and maintain high-performance, large-scale data ingestion and transformation, including ETL/ELT processes, data de-identification, and security management.

  • Data orchestration and automation: Develop and manage automated data workflows using tools like Apache Airflow to schedule pipelines, manage dependencies, and ensure reliable, timely data processing and availability.

  • AWS integration and cloud expertise: Build data pipelines integrated with AWS cloud-native storage and compute services, leveraging scalable cloud infrastructure for data processing.

  • Monitoring and data quality: Implement comprehensive monitoring, logging, and alerting to ensure high availability, fault tolerance, and data quality through self-healing strategies and robust data validation processes.

  • Technology innovation: Stay current with emerging big data technologies and industry trends, recommending and implementing new tools and approaches to continuously improve data infrastructure.

  • Technical leadership: Provide technical leadership for data infrastructure teams, guiding architecture decisions and system design best practices. Mentor junior engineers through code reviews and knowledge sharing, lead complex projects from concept to production, and help foster a culture of operational excellence.

Bilingual Mandarin Required

Posted 1 week ago


Product Manager, Data Platform - Data Ingestion And Movement

Geico InsuranceChicago, IL

$88,150 - $157,850 / year


Job Description

At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities.

Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers' expectations while making a real impact for our company through our shared purpose.

When you join our company, we want you to feel valued, supported and proud to work here. That's why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.

GEICO is looking for a customer-obsessed and results-oriented Product Manager to support our Data Ingestion and Movement platform. This role will help drive product direction for our data ingestion, ETL/ELT pipelines, and data movement services, focusing on enabling reliable data flow into our lakehouse and other data stores. The ideal candidate will have a technical background in data engineering and experience delivering scalable data platforms and data pipeline solutions.

Description

As a Product Manager for Data Ingestion and Movement, you will be responsible for supporting the product vision and execution for GEICO's data ingestion and movement products. To successfully shape a platform that enables pipeline-as-a-service and supports a scalable data mesh architecture, a strong technical understanding of data pipelines, data integration patterns, data orchestration, ETL/ELT processes, and platform engineering is essential. Your goal is to abstract complexity and empower domain teams to autonomously and efficiently build, deploy, and govern data pipelines. This role also requires stakeholder management skills and the ability to bridge technical solutions with business value.

Key Responsibilities

  • Support the development and execution of data ingestion and movement platform vision aligned with business goals and customer needs

  • Help create and maintain a clear, prioritized roadmap for data ingestion and movement capabilities that balances short-term delivery with long-term strategic objectives

  • Support evangelizing the Data Ingestion and Movement platform across the organization and help drive stakeholder alignment

  • Stay abreast of industry trends and competitive landscape (Apache Kafka, Apache Airflow, AWS Glue, Azure Data Factory, Google Cloud Dataflow, etc.) to inform data ingestion strategy

  • Support requirement gathering and product strategy for data ingestion, ETL/ELT pipelines, and data movement services

  • Understand end-to-end data ingestion workflows and how data movement fits into the broader data ecosystem and downstream analytics

  • Support data governance initiatives for data lineage, quality, and compliance in data ingestion and movement processes

  • Ensure data ingestion and movement processes adhere to regulatory, compliance, and data quality standards

  • Partner with engineering on the development of data ingestion tools, pipeline orchestration services, and data movement capabilities

  • Help define product capabilities for data ingestion, pipeline monitoring, error handling, and data quality validation to improve reliability and performance

  • Support customer roadshows and training on data ingestion and movement capabilities

  • Build instrumentation and observability into data ingestion and movement tools to enable data-driven product decisions and pipeline monitoring

  • Work closely with engineering, data engineering, and data teams to ensure seamless delivery of data ingestion and movement products

  • Partner with customer success, support, and engineering teams to create clear feedback loops

  • Translate data ingestion and movement technical capabilities into business value and user benefits

  • Support alignment across multiple stakeholders and teams in complex, ambiguous environments

Qualifications

Required

  • Understanding of data ingestion patterns, ETL/ELT processes, and data pipeline architectures (Apache Kafka, Apache Airflow, Apache Spark, AWS Glue, etc.)

  • Experience with data integration APIs, connectors, and data pipeline orchestration tools

  • Basic understanding of data pipeline monitoring, observability, and data quality validation practices

  • Experience in cloud data ecosystems (AWS, GCP, Azure)

  • Proven analytical and problem-solving abilities with a data-driven approach to decision-making

  • Experience working with Agile methodologies and tools (JIRA, Azure DevOps)

  • Good communication, stakeholder management, and cross-functional collaboration skills

  • Strong organizational skills with ability to manage product backlogs

Preferred

  • Previous experience as a software or data engineer is a plus

  • Strong business acumen to prioritize features based on customer value and business impact

  • Experience with data ingestion tools (Apache Kafka, Apache NiFi, AWS Kinesis, Azure Event Hubs, etc.)

  • Knowledge of data lineage, data quality frameworks, and compliance requirements for data ingestion

  • Insurance industry experience

Experience

  • 5+ years of technical product management experience building platforms that support data ingestion, ETL/ELT pipelines, data engineering, and data infrastructure

  • Track record of delivering successful products in fast-paced environments

  • Experience supporting complex, multi-stakeholder initiatives

  • Proven ability to work with technical teams and translate business requirements into technical product specifications

  • Experience with customer research, user interviews, and data-driven decision making

Education

  • Bachelor's degree in computer science, engineering, management information systems, or related technical field required

  • MBA/MS or equivalent experience preferred

Annual Salary

$88,150.00 - $157,850.00

The above annual salary range is a general guideline. Multiple factors are taken into consideration to arrive at the final annual salary offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, the work location, and market and business considerations.

At this time, GEICO will not sponsor a new applicant for employment authorization for this position.

The GEICO Pledge:

Great Company: At GEICO, we help our customers through life's twists and turns. Our mission is to protect people when they need it most and we're constantly evolving to stay ahead of their needs.

We're an iconic brand that thrives on innovation, exceeding our customers' expectations and enabling our collective success. From day one, you'll take on exciting challenges that help you grow and collaborate with dynamic teams who want to make a positive impact on people's lives.

Great Careers: We offer a career where you can learn, grow, and thrive through personalized development programs, created with your career - and your potential - in mind. You'll have access to industry leading training, certification assistance, career mentorship and coaching with supportive leaders at all levels.

Great Culture: We foster an inclusive culture of shared success, rooted in integrity, a bias for action and a winning mindset. Grounded by our core values, we have an established culture of caring, inclusion, and belonging that values different perspectives. Our teams are dynamic and multi-faceted, led by supportive leaders, driven by performance excellence and unified under a shared purpose.

As part of our culture, we also offer employee engagement and recognition programs that reward the positive impact our work makes on the lives of our customers.

Great Rewards: We offer compensation and benefits built to enhance your physical well-being, mental and emotional health and financial future.

  • Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family's overall well-being.
  • Financial benefits including market-competitive compensation; a 401K savings plan vested from day one that offers a 6% match; performance and recognition-based incentives; and tuition assistance.
  • Access to additional benefits like mental healthcare as well as fertility and adoption assistance.
  • Support for flexibility: We provide workplace flexibility as well as our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled.

GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.
