
Auto-apply to these data entry jobs

We've scanned millions of jobs. Simply select your favorites, and we can fill out the applications for you.

PricewaterhouseCoopers, Rosemont, Illinois
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.
The Opportunity

As a Cloud Engineering, Data & Analytics - Data Science professional, you will engage in the dynamic world of data analysis, leveraging your skills to transform complex data into actionable insights. This role involves working closely with clients to understand their needs and deliver data-driven solutions that enhance decision-making processes. You will be at the forefront of utilizing advanced data analytics tools and techniques to uncover trends and opportunities within financial markets.

As an Associate, you will focus on learning and contributing to client engagements while developing your skills and knowledge to deliver quality work. You will be exposed to clients, learning how to build meaningful connections, manage projects, and inspire others. This role encourages you to embrace challenges as opportunities for growth, enhancing your personal brand and technical knowledge.

In this role, you will be part of a team that supports the Data and Analytics group, where you will apply your skills in data modeling, machine learning, and statistical analysis to deliver impactful solutions.
Your contributions will be vital in helping clients navigate complex data landscapes, safeguard data integrity, and drive business success through informed decision-making.

Responsibilities:
- Conducting complex data analysis to extract meaningful insights and support decision-making processes
- Developing algorithms and predictive models using machine learning techniques to enhance data-driven insights
- Utilizing Python and statistical analysis software for data modeling and validation tasks
- Building and maintaining data pipelines to ensure seamless data integration and management
- Creating interactive dashboards and visualizations using Power BI and Tableau to communicate data findings effectively
- Collaborating with teams to conduct analytic research and customer analysis for business improvement
- Implementing data security measures to protect sensitive information and maintain data integrity
- Engaging in exploratory data analysis to identify trends and patterns within large datasets
- Supporting client engagements by applying business data analytics techniques to address specific needs

What You Must Have:
- Currently pursuing or have completed a Bachelor's degree
- Client service associate positions are entry-level roles intended for job seekers who are completing or have recently completed their final academic year of educational requirements

What Sets You Apart:
- Preference for one of the following fields of study: Management Information Systems, Information Technology, Computer Science, Data Analytics, Data Science, Statistics, Mathematics
- Preference for a 3.3 overall GPA
- Demonstrating proficiency in Python and machine learning
- Utilizing Power BI and Tableau for data visualization
- Conducting complex data analysis and predictive analytics
- Developing data-driven insights for client support
- Engaging in algorithm development and data modeling
- Excelling in data security and validation techniques
- Leveraging AI to create efficiencies, innovate ways of working, and deliver distinctive outcomes

Travel Requirements: Up to 80%
Job Posting End Date:

Learn more about how we work: https://pwc.to/how-we-work

PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: https://pwc.to/H-1B-Lottery-Policy.

As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or any other status protected by law. For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, the San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all.

The salary range for this position is $61,000 - $100,000, plus individuals may be eligible for an annual discretionary bonus. For roles that are based in Maryland, this is the listed salary range for this position. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications and location, and applicable employment laws.
PwC offers a wide range of benefits, including medical, dental, vision, 401k, holiday pay, vacation, personal and family sick leave, and more. To view our benefits at a glance, please visit the following link: https://pwc.to/benefits-at-a-glance

Posted 6 days ago

CrackaJack Digital Solutions LLC, Phoenix, AZ
In-person interview round mandatory. Tech stack: Big Data, Spark, Python, SQL; GCP is a must.

We need a hardcore, heavy-hitting Data Engineer who is extremely skilled, able to function independently, and able to manage their deliverables:
- Capable of writing ETL pipelines in Python from scratch
- Expert in OOP principles and concepts
- Able to independently write efficient and reusable code for ETL pipelines
- Expert in data modeling concepts such as schemas and entity relationships
- Expert at analyzing and developing SQL queries in various dialects (SQL Server, DB2, Oracle)
- Familiar with Airflow and understands how to develop DAGs
- Expert in data warehouses such as BigQuery and Databricks Delta Lakehouse, and how to programmatically ingest, cleanse, govern, and report data out of them
- Expertise in Spark
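A "from scratch" Python ETL pipeline of the kind this posting asks for can be sketched, in miniature, with the standard library alone. The CSV payload, table name, and cleansing rule here are hypothetical stand-ins for a real source feed and warehouse:

```python
import csv
import io
import sqlite3

# Extract: parse CSV from an in-memory source (stands in for a real feed).
raw = "id,amount\n1,10.5\n2,bad\n3,7.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep only rows whose amount parses as a float.
clean = []
for r in rows:
    try:
        clean.append((int(r["id"]), float(r["amount"])))
    except ValueError:
        continue  # a real pipeline would route rejects to a quarantine table

# Load: write the cleansed rows into a target table and verify.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 17.75
```

In production the same extract/transform/load stages would be factored into Airflow tasks wired together as a DAG, with Spark doing the heavy lifting at scale.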

Posted 30+ days ago

OmegaHires, Minneapolis, MN
Job Role: Data Analyst with SQL, Data Modeling, Financial/Banking domain
Location: Minneapolis, MN (3 days in office, 2 days WFH); local candidates only
Duration: Long-term contract
Key Skills: Business data analysis, data analysis, SQL, data discovery, data modeling
Must be from the Financial/Banking domain

Qualifications:
- Bachelor's degree with 10 years of experience
- Financial industry experience: Demonstrated experience working in the financial services sector is required.
- Data analysis background: Proven track record in a data analyst role, including hands-on experience with data extraction, analysis, and reporting.
- Technical skills: Proficiency in writing SQL queries is essential. Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Soft skills: Exceptional verbal and written communication skills with the ability to build strong working relationships.
- Accountability: A demonstrated history of taking ownership of tasks and projects from inception to completion.
- Problem-solving: A very strong analytical mindset with a passion for solving complex business problems.
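The data-discovery SQL this role centers on typically looks like aggregation queries over transactional data. A toy, hypothetical banking-flavored example (table name, accounts, and amounts are all invented), runnable against an in-memory SQLite database:

```python
import sqlite3

# Hypothetical transactions table for a banking-style analysis.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?)",
    [("A", 100.0), ("A", -40.0), ("B", 250.0), ("B", 50.0)],
)

# A typical data-discovery query: net balance per account, largest first.
query = """
    SELECT account, SUM(amount) AS balance
    FROM txns
    GROUP BY account
    ORDER BY balance DESC
"""
balances = list(conn.execute(query))
for account, balance in balances:
    print(account, balance)
```

The same GROUP BY / ORDER BY pattern carries over directly to the production dialects a bank would use; only the connection and scale change.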

Posted 5 days ago

Ankura Consulting Group, Tampa, Florida
Ankura is a team of excellence founded on innovation and growth.

Ankura Corporate Overview: The goal of Ankura Business Services (ABS) is to provide support and assistance to Ankura's client facing business groups. All of Ankura's ABS groups work together to achieve a common goal, which is to create innovative solutions for our clients. ABS includes Finance, Legal, IT Services, Marketing and Communications, Real Estate, Conflict Check, Operations & Workplace Services, and the People Office. Collaborative lateral thinking, hard-earned experience, expertise, and multidisciplinary capabilities drive results. Together, Ankura's ABS groups work in harmony to Protect, Create, and Recover Value for our clients.

Role Overview: Ankura is seeking a seasoned Data Engineer / Business Intelligence professional with strong technical acumen and demonstrated experience in dashboarding, data modeling, and workflow automation to join the Operations team. This position will play a pivotal role in building and maintaining the firm's reporting infrastructure, integrating disparate data sources, and developing intuitive dashboards that enable informed operational decision-making. The ideal candidate has extensive experience designing and scaling data pipelines, optimizing reporting outputs, and applying advanced analytics to improve business performance. Candidates must demonstrate a disciplined approach to data governance, confidentiality, and security, as they will be handling sensitive operational and financial information. Prior experience within the Professional Services sector is a plus. The role will be filled at the Senior Director level. This position is ideally hybrid based for a candidate located near an Ankura office in the eastern or central time zone, but there is potential to be remote based (in the eastern or central time zone).
Responsibilities:

Core Responsibilities
- Data Engineering & ETL: Design, build, and maintain efficient data pipelines using SQL, Alteryx, Microsoft Fabric, and other ETL tools to integrate and transform data from multiple systems (e.g., Workday ERP, Salesforce CRM, Intapp).
- Dashboard Development: Lead the design and development of executive-level dashboards and reports in Power BI and Microsoft Fabric, ensuring outputs are intuitive, scalable, and actionable.
- Advanced Data Modeling: Architect and optimize large-scale data models, applying best practices in normalization, schema design, and performance tuning for analytics and reporting.
- Automation & Optimization: Identify opportunities to streamline and automate data workflows, reduce manual reporting, and increase the speed and reliability of business insights.
- Analytics & Insights: Apply statistical and predictive analytics techniques (Python, R, or Alteryx) to analyze large datasets, forecast trends, and generate meaningful operational insights.
- Collaboration: Partner with IT, Finance, and business leadership to ensure reporting solutions meet stakeholder needs and align with firm-wide operational excellence priorities.
- Governance & Security: Uphold rigorous standards around data governance, security, and confidentiality in handling sensitive compensation, financial, and client information.

Technical Responsibilities
- Power BI & Fabric Expertise: Develop complex dashboards, write advanced DAX formulas, use M language for data transformations, and leverage Microsoft Fabric for end-to-end data management, including data lakes, pipelines, and governance.
- Data Engineering Skills: Create scalable ETL pipelines using SQL, Alteryx, and Fabric Data Factory, integrating structured and unstructured data sources.
- Workflow Automation: Use tools such as Alteryx, Power Query, or Python to automate repetitive tasks and improve reporting efficiency.
- Predictive Analytics: Leverage statistical tools (Python, R, Alteryx, KNIME) to model and forecast business performance metrics.
- Workday ERP Integration: Support reporting and data integration from Workday PSA and Financials, including API usage and data migration.
- Visualization: Deliver compelling, executive-ready reports and presentations via Power BI, Fabric dashboards, Excel, and PowerPoint.

Requirements:
- Bachelor's degree in Data Engineering, Computer Science, Business Analytics, or a related field. Advanced degree preferred.
- 12+ years of experience in data engineering, BI, or analytics roles (experience in the Professional Services industry preferred).
- Expert proficiency with Power BI (DAX, M), SQL, Alteryx/KNIME, and Microsoft Fabric.
- Strong background in ETL design, large data model development, and BI dashboarding.
- Experience with Workday ERP and integrating ERP data with BI tools.
- Proficiency with Excel (advanced functions, VBA), Power Query, Power Pivot, and PowerPoint.
- Preferred skills include Python, R, Tableau, Salesforce CRM, and Workday Adaptive Planning.
- Excellent analytical, problem-solving, and communication skills, with the ability to translate technical outputs into business insights.
- Experience managing cross-functional data projects, with proven ability to deliver under tight deadlines.
- Strong organizational skills and the ability to manage multiple priorities across stakeholders.

#LI-MJ1 #LI-Hybrid

Ankura is an Affirmative Action and Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against based on disability.
Equal Employment Opportunity Posters: If you have a disability and believe you need a reasonable accommodation to search for a job opening, submit an online application, or participate in an interview/assessment, please email accommodations@ankura.com or call toll-free +1 312-583-2122. This email and phone number are created exclusively to assist disabled job seekers whose disability prevents them from being able to apply online. Only messages left for this purpose will be returned. Messages left for other purposes, such as following up on an application or technical issues unrelated to a disability, will not receive a response.

Posted 3 days ago

Ontrac Solutions, Chicago, IL
Industry: Private Equity-Backed Manufacturing
Location: Remote (US-based preferred)
Start Date: ASAP
Duration: 3-6 months with high potential for extension (Phase 2)

Make a Real Impact from Day One

Are you a seasoned Data Engineer looking for a project where your work directly drives financial decisions and business transformation? Join us on a high-priority initiative for a PE-backed manufacturer undergoing a major ERP shift. Your mission: help establish a critical monthly inventory valuation process by unlocking data from a legacy AS400 system. This is a unique opportunity to lead a highly visible, hands-on project from the ground up: solving real business problems, working with engaged stakeholders, and setting the stage for long-term modernization.

What You'll Do
- Design and build an interim data extraction pipeline from a legacy AS400 (IBM iSeries) system.
- Create a monthly "book of record" for inventory valuation, enabling financial close and business reporting.
- Engineer a method to lock monthly labor and material inputs for data consistency and audit-readiness.
- Collaborate directly with finance, ops, and executive stakeholders, with no red tape.
- Set the foundation for Phase 2: advanced visualization and ongoing maintenance.

What You Bring
- 8+ years of experience as a data engineer or technical data consultant.
- Solid track record of working with legacy systems, especially AS400/DB2 (strongly preferred).
- Strong skills in ETL pipeline development, SQL, and handling structured enterprise data.
- Comfortable working in fast-paced, private equity-backed or high-growth environments.
- Deep understanding of inventory valuation, cost inputs, or financial close processes.
- A self-starter, comfortable owning deliverables and working independently.

Tech Environment
- AS400 / DB2 (preferred), legacy ERPs
- Python, SQL, Bash/shell scripting
- SQL Server, Oracle, Snowflake
- Git or version control workflows
- Power BI, Tableau (for future phases)

Why You'll Love This Role
- Immediate impact on a business-critical initiative
- Flexible, remote work setup with autonomy
- A highly collaborative client team that values technical leadership
- Opportunity to own a project end-to-end, not just maintain someone else's code
- Visibility and influence with executive stakeholders
- Possibility for follow-on work in analytics, visualization, and ERP transformation
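One way the "lock monthly inputs for audit-readiness" requirement could be approached is a content-hashed snapshot of the period's book of record. This is only an illustrative sketch; the field names and values are hypothetical, not the client's actual schema:

```python
import hashlib
import json

# Hypothetical monthly labor and material inputs extracted from the legacy system.
inputs = {"period": "2024-06", "labor_rate": 42.50, "material_cost": 1180.00}

# Serialize deterministically, then fingerprint, so the locked record
# can be verified unchanged at audit time.
canonical = json.dumps(inputs, sort_keys=True)
digest = hashlib.sha256(canonical.encode()).hexdigest()
locked_record = {"inputs": inputs, "sha256": digest}

# Verification step: re-hash the stored inputs and compare fingerprints
# to detect any tampering or drift after the monthly close.
check = hashlib.sha256(
    json.dumps(locked_record["inputs"], sort_keys=True).encode()
).hexdigest()
print(check == locked_record["sha256"])  # True
```

In practice the locked snapshot would live in a durable store (e.g., a warehouse table keyed by period), with the hash recorded alongside it for close and audit sign-off.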

Posted 30+ days ago

Care It Services, Dallas, Texas
Benefits: Company parties, competitive salary, dental insurance, free food & snacks
Work location: Dallas, TX / Atlanta, GA
End client: IBM/DTV
Completely on-site position

Job Description

We are looking for an experienced Databricks Subject Matter Expert (SME) with expertise in data profiling and data modeling to join our growing team. In this role, you will be responsible for leveraging Databricks to drive end-to-end data solutions, ensuring data quality, and optimizing data pipelines for performance and scalability. You will also play a pivotal role in designing, implementing, and maintaining data models that align with business requirements and industry best practices. The ideal candidate should have deep experience with the Databricks ecosystem, including Spark, Delta Lake, and other cloud-based data technologies, combined with a strong understanding of data profiling and data modeling concepts. You will collaborate closely with data engineers, data scientists, and business analysts to ensure data integrity, accuracy, and optimal architecture.

Skills and Qualifications:
- 8+ years of hands-on experience with Databricks or related technologies (including Spark, Delta Lake, and other relevant components)
- Strong expertise in data profiling tools and techniques
- Experience in data profiling and data quality management
- Experience in data modeling, including dimensional models for analytics (e.g., relational, dimensional, star schema, snowflake schema)
- Advanced knowledge of SQL, PySpark, and other scripting languages used within Databricks
- Hands-on experience with ETL/ELT processes, data integration, and data pipeline optimization
- Familiarity with cloud platforms (AWS, Azure, Google Cloud) and cloud data storage technologies
- Proficiency in Python, Scala, or other programming languages commonly used in Databricks
- Experience in data governance and data quality practices
- Familiarity with machine learning workflows within Databricks is a plus

Contact: venkatesh@careits.com

Compensation: $50.00 - $60.00 per hour

Who We Are

CARE ITS is a certified woman-owned and operated minority company (certified as WMBE). At CARE ITS, we are world-class IT professionals helping clients achieve their goals. Care ITS was established in 2010. Since then, we have successfully executed several projects with our expert team of professionals, each with more than 20 years of experience. We operate globally with our headquarters in Plainsboro, NJ, with focused specialization in Salesforce, Guidewire, and AWS. We provide expert solutions to our customers in various business domains.
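Data profiling, the first skill this posting asks for, boils down to summarizing a column before trusting it: counts, null rates, cardinality, top values. A minimal stdlib sketch (the column values are invented; real profiling would run over Spark DataFrames in Databricks):

```python
from collections import Counter

# Hypothetical raw column values to profile before modeling.
values = ["red", "blue", "red", None, "green", "red", None]

non_null = [v for v in values if v is not None]
profile = {
    "count": len(values),                       # total rows
    "nulls": sum(v is None for v in values),    # missing values
    "distinct": len(set(non_null)),             # cardinality
    "top": Counter(non_null).most_common(1)[0], # mode and its frequency
}
print(profile)
```

The same summary per column, computed at scale, is what tells you whether a field is fit to key a dimensional model on or needs a data-quality rule first.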

Posted 30+ days ago

C3 AI, Redwood City, CA
C3 AI (NYSE: AI) is the Enterprise AI application software company. C3 AI delivers a family of fully integrated products: the C3 Agentic AI Platform, an end-to-end platform for developing, deploying, and operating enterprise AI applications; C3 AI applications, a portfolio of industry-specific SaaS enterprise AI applications that enable the digital transformation of organizations globally; and C3 Generative AI, a suite of domain-specific generative AI offerings for the enterprise. Learn more at: C3 AI

As a member of the C3 AI Data Science team, you will work with some of the largest companies on the planet to help them build the next generation of AI-powered enterprise applications. You will collaborate directly with data scientists, software engineers, and subject matter experts in defining new AI solutions that provide our customers (c3.ai/customers/) with the information they need to make informed decisions and enable their digital transformation. Your role will involve finding the appropriate machine learning algorithms and implementing them on the C3 AI Platform to ensure they can run at scale. C3 AI Data Scientists are equipped with modern development tools, IDEs, and AI agents to maximize productivity and accelerate solution delivery.

Qualified candidates will have in-depth knowledge of the most common machine learning techniques and their application. You will also understand the limitations of these algorithms and how to tweak them, or derive from them, to achieve similar results at a large scale.

Note: This is a client-facing position which requires travel. Candidates should have the ability and willingness to travel based on business needs.

Responsibilities:
- Designing and deploying machine learning algorithms for industrial applications such as predictive maintenance, demand forecasting, and process optimization.
- Collaborating with data and subject matter experts from C3 AI and its customer teams to seek, understand, validate, interpret, and correctly use new data elements.
- Driving adoption and scalability of Generative AI and Deep Learning systems within C3 AI's products.

Qualifications:
- MS or PhD in Computer Science, Electrical Engineering, Statistics, or equivalent fields.
- Applied machine learning experience (regression and classification; supervised, self-supervised, and unsupervised learning).
- Strong mathematical background (linear algebra, calculus, probability, and statistics).
- Proficiency in Python and object-oriented programming (e.g., JavaScript).
- Familiarity with key Python packages for data wrangling, machine learning, and deep learning such as pandas, sklearn, tensorflow, torch, langchain, etc.
- Ability to drive a project and work both independently and in a cross-functional team.
- Smart, motivated, can-do attitude, and seeks to make a difference in a fast-paced environment.
- Excellent verbal and written communication.
- Ability to travel as needed.

Preferred Qualifications:
- Experience with scalable ML (MapReduce, Spark).
- Experience in Generative AI, e.g., Large Language Models (LLMs), embedding models, prompt engineering, and fine-tuning.
- Experience with reinforcement learning.
- A portfolio of projects (GitHub, papers, etc.).
- Experience working with modern IDEs and AI agent tools as part of accelerated development workflows.

C3 AI provides excellent benefits, a competitive compensation package, and a generous equity plan.

California Base Pay Range: $123,000 - $185,000 USD

C3 AI is proud to be an Equal Opportunity and Affirmative Action Employer. We do not discriminate on the basis of any legally protected characteristics, including disabled and veteran status.
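The classification side of the applied-ML experience this role asks for can be illustrated with one of the simplest supervised learners, a nearest-centroid classifier, in pure Python. The "sensor reading" data and labels are entirely made up for the sketch; production work would use sklearn or a deep learning framework:

```python
import math

# Toy 2-D training data: label -> points (hypothetical sensor readings).
train = {
    "normal": [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1)],
    "faulty": [(4.0, 4.2), (3.8, 4.0), (4.1, 3.9)],
}

# Fit: compute one centroid per class by averaging each coordinate.
centroids = {
    label: tuple(sum(coord) / len(pts) for coord in zip(*pts))
    for label, pts in train.items()
}

def predict(point):
    """Assign the label of the nearest class centroid (Euclidean distance)."""
    return min(centroids, key=lambda lbl: math.dist(point, centroids[lbl]))

print(predict((1.1, 0.9)))  # normal
print(predict((3.9, 4.1)))  # faulty
```

This is the kind of baseline a data scientist would fit first, before reaching for more expressive models such as gradient-boosted trees or neural networks in a predictive-maintenance setting.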

Posted 30+ days ago

Bristol Myers Squibb, San Diego, CA
Working with Us

Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us.

Summary: As part of the Translational Data Products team, you will directly support translational medicine leaders in their mission to discover biomarkers that guide patient selection and treatment response for BMS assets. Your work will enable exploratory data analysis that drives crucial biomarker decisions at the heart of translational research. You will bridge data engineering with innovation: orchestrating advanced pipelines, ensuring auto-generated ETL and schema mappings are correct, and experimenting with the newest techniques, such as MCP servers, prompt engineering strategies (ReACT, chain-of-thought, etc.), and LLM-assisted tooling, to make biomarker data accessible, trustworthy, and actionable.

Key Responsibilities:
- Enable biomarker discovery: Deliver data pipelines and mappings that help translational leaders identify biomarkers (molecular, digital, imaging) for patient stratification and treatment response.
- Innovate with AI/LLMs: Explore and apply cutting-edge approaches (MCP servers, prompt orchestration, auto-schema mapping, LLM-based ETL generation) to accelerate and improve data workflows.
- Data orchestration: Oversee ingestion from diverse sources (vendor feeds, raw instruments, CSV, PDF, etc.), ensuring automated ETL and source-to-target mapping & transformation (STTM) outputs meet stakeholder needs.
- Quality and profiling: Assess and validate source data, documenting any cleaning, normalization, or semantic mapping that needs to be applied for optimal QC, and identify where improvements are required versus merely convenient.
- Hands-on implementation: Build or adapt tools/scripts (Python, SQL, AWS Glue, Databricks, etc.) when automation falls short.
- Stakeholder collaboration: Act as a partner to translational medicine leaders, communicating progress and brainstorming next steps as priorities evolve.
- Agile team contribution: Participate actively in standups, design sessions, sprint demos, and innovation discussions.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Bioinformatics, or a related field.
- 5+ years of experience in data engineering, ideally with exposure to life sciences or healthcare.
- Strong experience with data integration from heterogeneous sources (structured, semi-structured, unstructured).
- Proficiency in AWS, Python, and SQL, with the ability to prototype and automate workflows.
- Hands-on expertise with ETL frameworks (AWS Glue, Databricks, Airflow).
- Familiarity with modern AI/LLM approaches for data transformation and semantic mapping is highly desirable.
- Excellent communication skills to engage both technical and scientific stakeholders.
- Comfortable in agile, exploratory, scientific environments.

What Makes This Role Unique:
- Direct scientific impact: Your work connects directly to patient-centric translational decisions.
- Innovation: You are encouraged to explore new technologies and approaches, not just maintain existing ones.
- Automation first: Instead of building every pipeline from scratch, you orchestrate and validate auto-generated ETLs and mappings.
- Collaborative science + engineering: You will brainstorm with scientists, demo working solutions, and help shape the future of translational data products.

If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career.

Compensation Overview:
- Cambridge Crossing: $148,850 - $180,374
- San Diego, CA, US: $148,850 - $180,374
- Tampa, FL, US: $135,320 - $163,976

The starting compensation range(s) for this role are listed above for a full-time employee (FTE) basis. Additional incentive cash and stock opportunities (based on eligibility) may be available. The starting pay rate takes into account characteristics of the job, such as required skills, where the job is performed, the employee's work schedule, job-related knowledge, and experience. Final, individual compensation will be decided based on demonstrated experience. Eligibility for specific benefits listed on our careers site may vary based on the job and location. For more on benefits, please visit https://careers.bms.com/life-at-bms/. Benefit offerings are subject to the terms and conditions of the applicable plans then in effect and may include the following: medical, pharmacy, dental and vision care; wellbeing support such as the BMS Living Life Better program and employee assistance programs (EAP); financial well-being resources and a 401(K); financial protection benefits such as short- and long-term disability, life insurance, supplemental health insurance, business travel protection and survivor support.
Work-life programs include paid national holidays and optional holidays, Global Shutdown Days between Christmas and New Year's holiday, up to 120 hours of paid vacation, up to two (2) paid days to volunteer, sick time off, and summer hours flexibility. Parental, caregiver, bereavement, and military leave. Family care services such as adoption and surrogacy reimbursement, fertility/infertility benefits, support for traveling mothers, and child, elder and pet care resources. Other perks like tuition reimbursement and a recognition program. Uniquely Interesting Work, Life-changing Careers With a single vision as inspiring as "Transforming patients' lives through science ", every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues. On-site Protocol BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. 
BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for COVID-19 and keep up to date with COVID-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in, or expect to work from, Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/
Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.

Posted 3 weeks ago

Crane Worldwide Logistics, Houston, TX
ESSENTIAL JOB FUNCTIONS
Data Pipeline & Platform Development
Build and maintain pipelines using dbt, Prefect, and Terraform.
Develop and manage connectors across sources and targets including Kafka, RDBMSs, and Snowflake.
Implement schema evolution, validation rules, and automated testing.
Support high-availability and disaster recovery design for Snowflake and Materialize.
Data Product Engineering
Author and review schemas and data contracts for consistency and governance.
Develop and optimize dbt models for Snowflake and Materialize analytics layers.
Configure clusters and role-based access for shared environments.
Document datasets to ensure discoverability and proper usage across teams.
Stakeholder Collaboration
Partner with BI developers, analysts, and business teams to deliver datasets that support reporting, dashboards, and integrations.
Investigate and resolve data issues, ensuring durable fixes.
Participate in design reviews to align technical solutions with business requirements.
Collaboration & Standards
Contribute to PR and design reviews for pipelines and models.
Support platform governance, observability, and best practices for data quality.
Work with adjacent teams (Ops & Reliability, Analytics, Product) to align on SLAs and data definitions.
Other duties as assigned.
OTHER SKILLS AND ABILITIES
Proficiency in Python and SQL for building and optimizing data pipelines.
Hands-on experience with dbt for modeling and testing, and Terraform for infrastructure-as-code.
Familiarity with modern data platforms: Snowflake, Materialize, Kafka, HVR, Fivetran, or Stitch.
Understanding of data contracts, observability, and governance practices.
Experience with CI/CD tools (GitHub Actions, GitLab CI, or similar).
Ability to translate business needs into scalable technical solutions.
Knowledge of compliance frameworks (e.g., GDPR, CCPA, SOC 2) a plus.
EDUCATION AND EXPERIENCE
Quantitative bachelor's degree preferred but not required.
Prior experience in a data engineering or data-heavy backend software engineering role.
PHYSICAL REQUIREMENTS
Job requires the ability to use vision, adjust focus and work on a standard computer screen. Job may require extended sitting or standing and use of standard office equipment.
CERTIFICATION AND LICENSES
Professional certification may be required in some areas.
MUST COMPLETE PI ASSESSMENT IN ORDER TO BE CONSIDERED FOR THE POSITION: https://assessment.predictiveindex.com/bo/28w/Candidate_Link
WHY SHOULD YOU WORK FOR CRANE?
At Crane, we believe in providing our employees with excellent benefits at a Great Place to Work. We offer:
Quarterly Incentive Plan.
136 hours of Paid Time Off, which equals 17 days for the year, that can be used for Sick Time or for Personal Use.
Excellent Medical, Dental and Vision benefits.
Tuition Reimbursement for education related to your job.
Employee Referral Bonuses.
Employee Recognition and Rewards Program.
Paid Volunteer Time to support a cause that is close to your heart and contributes to our communities.
Employee Discounts.
Wellness Incentives that can go up to $100 per year for completing challenges, in addition to a discount on contribution rates.
Come join the leader in logistics and take your career in the right direction.
Disclaimer: The above statements are intended to describe the general nature and level of work being performed by people assigned to this position. They are not to be construed as an exhaustive list of all responsibilities, duties, and skills required of personnel so classified. All personnel may be required to perform duties outside of their normal responsibilities from time to time, as needed. The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. We maintain a drug-free workplace and perform pre-employment substance abuse testing.
This position requires the final candidate to successfully pass an E-Verify Check. More Information: http://www.dhs.gov/e-verify Company benefits are contingent upon meeting eligibility requirements and plan conditions.
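The "validation rules, and automated testing" responsibility this posting describes can be illustrated with a small, hypothetical sketch. The schema and field names below are invented for the example; a real pipeline would typically express the same checks as dbt tests or data-contract definitions rather than hand-rolled Python.

```python
# Illustrative pipeline validation sketch. REQUIRED_SCHEMA and the
# record fields are hypothetical, not Crane's actual data model.

REQUIRED_SCHEMA = {
    "shipment_id": str,   # hypothetical source columns
    "weight_kg": float,
    "origin": str,
}

def validate_record(record: dict) -> list[str]:
    """Return a list of human-readable validation errors (empty = valid)."""
    errors = []
    for field, expected_type in REQUIRED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}, "
                          f"got {type(record[field]).__name__}")
    return errors

def partition_batch(batch: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into (valid, rejected) records, as a dead-letter
    pattern would before loading rows into a warehouse table."""
    valid, rejected = [], []
    for record in batch:
        (valid if not validate_record(record) else rejected).append(record)
    return valid, rejected
```

The same split-and-quarantine idea underlies most automated pipeline testing: valid rows flow downstream, rejected rows are routed aside with their error messages for investigation.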

Posted 30+ days ago

LIVE NATION ENTERTAINMENT INC, San Juan, PR
Job Summary:
WHO ARE WE?
Live Nation Entertainment is the world's leading live entertainment company, composed of global market leaders: Ticketmaster, Live Nation Concerts, and Live Nation Media & Sponsorship. Ticketmaster is the global leader in event ticketing with over 620 million tickets sold annually and approximately 10,000 clients worldwide. Live Nation Concerts is the largest provider of live entertainment in the world, promoting more than 50,000 events annually for nearly 7,000 artists in 40+ countries. These businesses allow Live Nation Media & Sponsorship to create strategic music marketing programs that connect more than 1,200 sponsors with the 145 million fans that attend Live Nation Entertainment events each year. For additional information, visit www.livenationentertainment.com.
WHO ARE YOU?
Passionate and motivated. Driven, with an entrepreneurial spirit. Resourceful, innovative, forward thinking and committed. At Live Nation Entertainment, our people embrace these qualities, so if this sounds like you then please read on!
THE TEAM
The Core Data Services org is at the center of Data and Analytics initiatives across the entire Live Nation enterprise. We are at the beginning of our journey to build an enterprise data platform capable of being the true backbone for data needs across the organization. Our mission is to make reliable data available and enable value creation by the data community of engineers, analysts and decision makers. The Core Data Services org consists of Product and Engineering teams overseeing Architecture, Platform Engineering, Data Engineering, Business Intelligence Engineering and Operations.
THE ROLE
As a Data Architect within the Core Data Services team, you will be responsible for the comprehensive solution and architecture of our Enterprise Data platform. We are seeking an experienced professional with a track record of building solutions for large-scale systems at the enterprise level.
You will be responsible for designing, implementing, and optimizing end-to-end data solutions that empower our organization to harness the full potential of data. You will collaborate closely with cross-functional teams, including data engineers, data scientists, and business stakeholders, to drive innovation and ensure the seamless integration of data and analytics within our ecosystem. Within this role, you will encounter a multitude of opportunities to effect meaningful change across various dimensions, from technological enhancements and streamlined processes to the development and mentorship of talented individuals. Your contributions will have a significant impact on our business and technology team, reaffirming our commitment to excellence. This role is about driving transformation - defining how data is modeled, governed, and delivered in a global, high-volume environment. If you're excited to influence architecture decisions from day one, thrive in ambiguity, and want to leave a lasting mark on how data is done at scale, this role is for you.
WHAT THIS ROLE WILL DO:
Architecture & Strategy
Design and evolve our Databricks Lakehouse architecture with a focus on scalability, cost efficiency, and reliability in response to changing requirements, addressing various data needs across the business.
Define and implement a tiered data ecosystem with maturity layers to transform data systematically, ensuring each layer serves a specific purpose, from raw data ingestion to refined, analytics-ready data.
Outline the vision and requirements, and lead development of the Data Lakehouse by aligning technical architecture with business needs and long-term vision.
Continuously evaluate and introduce new patterns, tools, or practices to make our data platform more scalable, resilient and reliable, and then work across our team to put your ideas into action.
Modeling, Standards & Governance
Develop conceptual, logical, and physical data models to serve as the backbone of integrated, reusable and scalable data, supporting the current and future development of data strategies.
Establish and enforce standards for metadata, lineage, and data quality across the ecosystem.
Define patterns and playbooks that guide engineering teams in building consistent, future-proof pipelines.
Enhance data discoverability by working closely with data analysts and business stakeholders to make data easily accessible and understandable to them.
Develop and enforce data engineering, security, and data quality standards through automation.
Develop and implement strategies for seamlessly integrating data from diverse sources across different systems and platforms, including real-time and batch processing.
Delivery & Enablement
Collaborate with engineers, analysts, and business teams to design solutions that are accessible, usable, and trustworthy.
Mentor and provide architectural guidance to engineering staff.
Lead workshops and training to raise data literacy and promote best practices.
WHAT THIS PERSON WILL BRING:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
10+ years of tech industry experience, including 5+ years of experience architecting and implementing big data strategies.
5+ years of expertise with cloud-based data platforms (Databricks, AWS, Snowflake, Spark).
Deep knowledge of data modeling using Erwin or similar tooling (3NF, dimensional/star schema), medallion architectures, ETL processes, data integration, and Data Warehousing concepts.
Experience with Big Data technologies such as Databricks, Spark, Hadoop, and NoSQL databases.
Experience in architecting data pipelines and solutions for streaming and batch integrations using tools/frameworks like dbt, Talend, Azure Data Factory, Spark, Spark Streaming, etc.
Experience with Confluent Kafka real-time data processing and API platforms.
Strong understanding of data governance, lineage, and compliance.
Experience optimizing for cost and performance at scale (e.g., caching, partitioning strategies).
Excellent communication and interpersonal skills for effective collaboration with technical and non-technical teams, with the ability to translate architecture into business value.
BENEFITS & PERKS
Our motto is 'Taking Care of Our Own' through 6 pillars of benefits:
HEALTH: Medical, vision, dental and mental health benefits for you and your family, with access to a health care concierge, and Flexible or Health Savings Accounts (FSA or HSA)
YOURSELF: Free concert tickets, generous paid time off including paid holidays, sick time, and personal days
WEALTH: 401(k) program with company match, stock reimbursement program
FAMILY: New parent programs including caregiver leave and baby bonuses, plus fertility, adoption, foster, or surrogacy support
CAREER: Career and skill development programs with School of Live, tuition reimbursement, and student loan repayment
OTHERS: Volunteer time off, crowdfunding match
EQUAL EMPLOYMENT OPPORTUNITY
We aspire to build teams that reflect and support the fans and artists we serve. Every day we aim to promote environments where everyone can be themselves, contribute fully, and thrive within our company and at our events. As a growing business we will encourage you to develop your professional and personal aspirations, enjoy new experiences, and learn from the talented people you will be working with. Live Nation is an equal opportunity employer.
It hires and promotes employees based on their experience, talent, and qualifications for the job and does not tolerate discrimination toward employees based on age (40 and over), ancestry, color, religious creed (including religious dress and grooming practices), family and medical care leave or the denial of family and medical care leave, mental or physical disability (including HIV and AIDS), marital status, domestic partner status, medical condition (including cancer and genetic characteristics), genetic information, military and veteran status, political affiliation, national origin (including language use restrictions), citizenship, race, sex (including pregnancy, childbirth, breastfeeding and medical conditions related to pregnancy, childbirth or breastfeeding), gender, gender identity, and gender expression, sexual orientation, intersectionality, or any other basis protected by applicable federal, state or local law, rule, ordinance or regulation. We will consider qualified applicants with criminal histories in a manner consistent with the requirements of the Los Angeles Fair Chance Ordinance, San Francisco Fair Chance Ordinance and the California Fair Chance Act and consistent with other similar and / or applicable laws in other areas. Live Nation affords equal employment opportunities to qualified individuals with a disability. For this reason, Live Nation will make reasonable accommodations for the known physical or mental limitations of an otherwise qualified individual with a disability who is an applicant or an employee consistent with its legal obligations to do so. As part of its commitment to make reasonable accommodations, Live Nation also wishes to participate in a timely, good faith, interactive process with a disabled applicant or employee to determine effective reasonable accommodations, if any, which can be made in response to a request for accommodations. 
Applicants and employees are invited to identify reasonable accommodations that can be made to assist them to perform the essential functions of the position they seek or currently occupy. Any applicant or employee who requires an accommodation in order to perform the essential functions of the job should contact either the hiring manager for the role or a Human Resources representative to request the opportunity to participate in a timely interactive process. HIRING PRACTICES The preceding job description has been designed to indicate the general nature and level of work performed by employees within this classification. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job. Live Nation recruitment policies are designed to place the most highly qualified persons available in a timely and efficient manner. Live Nation may pursue all avenues available, including promotion from within, employee referrals, outside advertising, employment agencies, internet recruiting, job fairs, college recruiting and search firms.
The expected compensation for this position is: $144,000.00 USD - $180,000.00 USD
Pay is based on a number of factors including market location, qualifications, skills, and experience.
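The tiered ("medallion") layering this role describes, where raw ingested data is refined step by step into analytics-ready form, can be sketched with stdlib Python. The event fields and cleaning rules below are hypothetical, not Live Nation's actual model; in practice each layer would be a set of Lakehouse tables rather than in-memory lists.

```python
# Hypothetical medallion sketch: bronze (raw) -> silver (cleaned,
# deduplicated) -> gold (analytics-ready aggregate). Field names invented.
from collections import defaultdict

def to_silver(bronze_rows: list[dict]) -> list[dict]:
    """Clean and deduplicate raw ticket-sale events by (event_id, order_id),
    dropping malformed rows and conforming types."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = (row.get("event_id"), row.get("order_id"))
        if None in key or key in seen:
            continue  # drop malformed or duplicate rows
        seen.add(key)
        silver.append({"event_id": row["event_id"],
                       "order_id": row["order_id"],
                       "tickets": int(row.get("tickets", 0))})
    return silver

def to_gold(silver_rows: list[dict]) -> dict:
    """Aggregate tickets sold per event: a refined, analytics-ready rollup."""
    totals = defaultdict(int)
    for row in silver_rows:
        totals[row["event_id"]] += row["tickets"]
    return dict(totals)
```

The point of the layering is that each stage has one job: bronze preserves the raw feed, silver enforces keys and types, gold answers business questions.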

Posted 6 days ago

C3 AI Inc., Redwood City, CA
C3 AI (NYSE: AI) is the Enterprise AI application software company. C3 AI delivers a family of fully integrated products: the C3 Agentic AI Platform, an end-to-end platform for developing, deploying, and operating enterprise AI applications; C3 AI applications, a portfolio of industry-specific SaaS enterprise AI applications that enable the digital transformation of organizations globally; and C3 Generative AI, a suite of domain-specific generative AI offerings for the enterprise. Learn more at: C3 AI
As a Data Scientist / Senior Data Scientist -- Optimization, you will work with some of the world's largest companies to design and deliver the next generation of AI-powered enterprise applications, where optimization solutions play a critical role. Our team focuses on developing scalable, explainable optimization models and algorithms tailored to diverse industry verticals, including supply chain, manufacturing, food processing, agriculture, health care, and more. You will work alongside peer data scientists, software engineers, subject matter experts and business stakeholders to deliver the end-to-end optimization solutions that drive measurable business value and enable digital transformation for our customers. Your responsibilities will span the entire solution lifecycle, including: understanding business requirements and translating them into optimization problems, analyzing historical data to uncover insights and guide solution design, developing advanced optimization models and algorithms to address complex business challenges, building prototypes and products to validate solutions and demonstrate value, deploying production-grade solutions on the C3 AI Suite, and ensuring scalability and robustness in real-world operations. C3 AI Data Scientists are equipped with modern development tools, IDEs, and AI agents to maximize productivity and accelerate solution delivery.
Qualified candidates should possess deep expertise in operations research and optimization, along with a solid understanding of common machine learning techniques and their practical applications. Candidates with extensive experience may be considered for more senior positions within this category.
Responsibilities:
Provide thought leadership and expertise in optimization, guiding strategic decision-making and technical direction.
Research, design, implement, and deploy optimization solutions for enterprise applications leveraging the C3 AI Suite.
Assist and enable C3 AI customers to build their own optimization applications on the C3 AI Suite.
Collaborate with or lead a small technical team on a project, and identify potential risks and implement mitigation strategies to ensure project success.
Partner with cross-functional teams to translate optimization model insights into actionable business strategies and measurable outcomes.
Develop, maintain and enhance optimization frameworks, libraries, and tools to ensure scalability and efficiency while contributing to the continuous improvement of the C3 AI Suite.
Stay informed on state-of-the-art optimization techniques, promote best practices, and foster an innovative and collaborative work environment at C3 AI.
Qualifications:
MS or PhD in Operations Research, Applied Mathematics, Computer Science, Artificial Intelligence, Industrial Engineering or equivalent.
Deep understanding of optimization (constrained, stochastic, convex and non-convex optimization problems, and LP, QP, MILP, and MINLP problems and solvers).
Strong mathematical background (linear algebra, calculus, statistics, numerical simulation).
Demonstrated expertise in Python (or a similar object-oriented programming language), with hands-on experience in at least one mathematical programming library or framework.
Ability to drive a project and work both independently and in a team.
Smart, motivated, can-do attitude, and seeks to make a difference.
Curiosity and willingness to learn about our customers' industries.
Excellent verbal and written communication in English.
Work will be conducted at our office in Redwood City, California.
Preferred Qualifications:
Professional experience applying optimization in a customer-facing role.
Practical experience with at least one type of commercial solver (e.g., Gurobi).
Knowledge of git and experience with GenAI productivity tools.
Experience with applied/scalable ML, state-of-the-art deep learning, and reinforcement learning problem formulation and model architecture design.
A portfolio of projects (GitHub, etc.) and publications at top-tier peer-reviewed conferences or journals is a plus.
Experience working with modern IDEs and AI agent tools as part of accelerated development workflows.
C3 AI provides excellent benefits, a competitive compensation package and a generous equity plan.
California Base Pay Range: $123,000 - $185,000 USD
C3 AI is proud to be an Equal Opportunity and Affirmative Action Employer. We do not discriminate on the basis of any legally protected characteristics, including disabled and veteran status.
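For readers unfamiliar with the MILP problems and solvers the qualifications mention: on a toy instance the feasible set can simply be enumerated, which is what this stdlib sketch does. A production solver such as Gurobi or CBC uses branch-and-bound instead, and the objective and constraints below are made up purely for illustration.

```python
# Brute-force solution of a tiny two-variable MILP (illustrative only):
#   maximize 3x + 4y   s.t.   2x + 3y <= 12,   x + y <= 5,   x, y >= 0 int.
from itertools import product

def solve_toy_milp():
    """Enumerate all candidate integer points and keep the best feasible one.
    The variable bounds 0..6 follow from the constraints (x <= 6, y <= 4)."""
    best_value, best_point = None, None
    for x, y in product(range(7), repeat=2):
        if 2 * x + 3 * y <= 12 and x + y <= 5:  # feasibility check
            value = 3 * x + 4 * y               # objective
            if best_value is None or value > best_value:
                best_value, best_point = value, (x, y)
    return best_value, best_point
```

Enumeration is exponential in the number of variables, which is exactly why real MILP work relies on solvers and careful model formulation rather than search over the whole grid.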

Posted 2 weeks ago

Geico Insurance, Chevy Chase, MD
At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers' expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported and proud to work here. That's why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers. GEICO is looking for a customer-obsessed and results-oriented Product Manager to support our Data Ingestion and Movement platform. This role will help drive product direction for our data ingestion, ETL/ELT pipelines, and data movement services, focusing on enabling reliable data flow into our lakehouse and other data stores. The ideal candidate will have a technical background in data engineering and experience delivering scalable data platforms and data pipeline solutions.
Description
As a Product Manager for Data Ingestion and Movement, you will be responsible for supporting the product vision and execution for GEICO's data ingestion and movement products. To successfully shape a platform that enables pipeline-as-a-service and supports a scalable data mesh architecture, a strong technical understanding of data pipelines, data integration patterns, data orchestration, ETL/ELT processes, and platform engineering is essential. Your goal is to abstract complexity and empower domain teams to autonomously and efficiently build, deploy, and govern data pipelines. This role also requires stakeholder management skills and the ability to bridge technical solutions with business value.
Key Responsibilities
Support the development and execution of data ingestion and movement platform vision aligned with business goals and customer needs
Help create and maintain a clear, prioritized roadmap for data ingestion and movement capabilities that balances short-term delivery with long-term strategic objectives
Support evangelizing the Data Ingestion and Movement platform across the organization and help drive stakeholder alignment
Stay abreast of industry trends and competitive landscape (Apache Kafka, Apache Airflow, AWS Glue, Azure Data Factory, Google Cloud Dataflow, etc.) to inform data ingestion strategy
Support requirement gathering and product strategy for data ingestion, ETL/ELT pipelines, and data movement services
Understand end-to-end data ingestion workflows and how data movement fits into the broader data ecosystem and downstream analytics
Support data governance initiatives for data lineage, quality, and compliance in data ingestion and movement processes
Ensure data ingestion and movement processes adhere to regulatory, compliance, and data quality standards
Partner with engineering on the development of data ingestion tools, pipeline orchestration services, and data movement capabilities
Help define product capabilities for data ingestion, pipeline monitoring, error handling, and data quality validation to improve reliability and performance
Support customer roadshows and training on data ingestion and movement capabilities
Build instrumentation and observability into data ingestion and movement tools to enable data-driven product decisions and pipeline monitoring
Work closely with engineering, data engineering, and data teams to ensure seamless delivery of data ingestion and movement products
Partner with customer success, support, and engineering teams to create clear feedback loops
Translate data ingestion and movement technical capabilities into business value and user benefits
Support alignment across multiple stakeholders and teams in complex, ambiguous environments
Qualifications
Required
Understanding of data ingestion patterns, ETL/ELT processes, and data pipeline architectures (Apache Kafka, Apache Airflow, Apache Spark, AWS Glue, etc.)
Experience with data integration APIs, connectors, and data pipeline orchestration tools
Basic understanding of data pipeline monitoring, observability, and data quality validation practices
Experience in cloud data ecosystems (AWS, GCP, Azure)
Proven analytical and problem-solving abilities with a data-driven approach to decision-making
Experience working with Agile methodologies and tools (JIRA, Azure DevOps)
Good communication, stakeholder management, and cross-functional collaboration skills
Strong organizational skills with ability to manage product backlogs
Preferred
Previous experience as a software or data engineer is a plus
Strong business acumen to prioritize features based on customer value and business impact
Experience with data ingestion tools (Apache Kafka, Apache NiFi, AWS Kinesis, Azure Event Hubs, etc.)
Knowledge of data lineage, data quality frameworks, and compliance requirements for data ingestion
Insurance industry experience
Experience
Minimum 5+ years of technical product management experience building platforms that support data ingestion, ETL/ELT pipelines, data engineering, and data infrastructure
Track record of delivering successful products in fast-paced environments
Experience supporting complex, multi-stakeholder initiatives
Proven ability to work with technical teams and translate business requirements into technical product specifications
Experience with customer research, user interviews, and data-driven decision making
Education
Bachelor's degree in computer science, engineering, management information systems, or related technical field required
MBA/MS or equivalent experience preferred
Annual Salary
$88,150.00 - $157,850.00
The above annual salary range is a general guideline.
Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary to be offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, and the work location, as well as market and business considerations. At this time, GEICO will not sponsor a new applicant for employment authorization for this position. The GEICO Pledge: Great Company: At GEICO, we help our customers through life's twists and turns. Our mission is to protect people when they need it most and we're constantly evolving to stay ahead of their needs. We're an iconic brand that thrives on innovation, exceeding our customers' expectations and enabling our collective success. From day one, you'll take on exciting challenges that help you grow and collaborate with dynamic teams who want to make a positive impact on people's lives. Great Careers: We offer a career where you can learn, grow, and thrive through personalized development programs, created with your career - and your potential - in mind. You'll have access to industry leading training, certification assistance, career mentorship and coaching with supportive leaders at all levels. Great Culture: We foster an inclusive culture of shared success, rooted in integrity, a bias for action and a winning mindset. Grounded by our core values, we have an established culture of caring, inclusion, and belonging, that values different perspectives. Our dynamic, multi-faceted teams are led by supportive leaders, driven by performance excellence and unified under a shared purpose. As part of our culture, we also offer employee engagement and recognition programs that reward the positive impact our work makes on the lives of our customers. Great Rewards: We offer compensation and benefits built to enhance your physical well-being, mental and emotional health and financial future.
Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family's overall well-being.
Financial benefits including market-competitive compensation; a 401K savings plan vested from day one that offers a 6% match; performance and recognition-based incentives; and tuition assistance.
Access to additional benefits like mental healthcare as well as fertility and adoption assistance.
Supports flexibility: We provide workplace flexibility as well as our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year.
The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability or genetic information, in compliance with applicable federal, state and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled. GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship to the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment with mutual respect by and for all associates and applicants.

Posted 2 weeks ago

CNA Financial Corp., Chicago, IL
You have a clear vision of where your career can go. And we have the leadership to help you get there. At CNA, we strive to create a culture in which people know they matter and are part of something important, ensuring the abilities of all employees are used to their fullest potential. Individual contributor with expertise within a domain of insurance data management associated with finance, actuarial, or corporate processing and reporting applications. Promotes P&C data standards and corporate coding structures through deep knowledge of CNA's products and services and of external statutory, regulatory and financial (GAAP) and internal management information reporting requirements. Provides informal data-related technical guidance and formally leads teams on a project basis. Develops designs and/or integrates data solutions to complex business problems.
JOB DESCRIPTION:
Essential Duties & Responsibilities
Performs a combination of duties in accordance with departmental guidelines:
Evaluates defined client area requirements and processes for application changes to data values or complex structures. Performs data analysis, develops recommendations and approves requests to ensure that changes comply with corporate data policy and/or reporting needs. Designs high-level, functional and detailed designs for data solutions. Consults with client and application areas to analyze data usage and end-to-end system capabilities, identifies risks, and recommends and/or executes resolution. Drives accountability for data integrity by developing and executing necessary processes and controls around the flow of data. Collaborates with management, technical staff and subject matter advisors to understand business needs/issues, troubleshoot problems, conduct root cause analysis and develop cost-effective resolutions for data anomalies.
Participates in integrated data architecture discussions and recommends use of existing or new data elements that will enhance current systems and support overall corporate and business goals. Remains externally focused by providing technical consultation to clients and IT management to ensure development of efficient application systems utilizing established procedures, methodologies and data policy. Influences the development of data policy. Participates in or leads data-related projects. Analyzes, develops, and/or executes specifications for data mapping and transformation processes within and between applications. Verifies accuracy of table changes and data transformation processes. Ensures adequacy of test plans and monitors application testing results associated with data transformation processes between applications. Utilizes data and/or metrics from applicable systems to review data processes, identify issues, determine resolution and/or escalate problems that require data, system or process improvements. May perform additional duties as assigned. Defines, publishes and enforces changes to corporate codes. Reporting Relationship Typically Reports to Manager or above Skills, Knowledge & Abilities Solid analytical and problem-solving skills. Solid communication, interpersonal and presentation skills to work effectively among all levels of internal/external partners/clients. Ability to multitask in a fast-paced, dynamic environment. Ability to work independently as well as in a team environment. Possesses some influence management skills. Skillful at learning new system applications/functions. Readily adapts to change. Relationships with business partners in IT, Finance, Actuarial and Data Stewards helpful in resolution of issues, root cause analysis, and driving results-oriented change. 
Education & Experience Bachelor's degree or equivalent work experience. Typically a minimum of five to seven years' experience in data management, accounting, systems development, data analysis or systems/business analysis as required by position. Typically a minimum of three to five years of project management experience preferred. Advanced computing skills including, but not limited to: MS Office Suite, SQL, PeopleSoft, and Business Objects. Basic mainframe knowledge (e.g., Job Control Language (JCL)) helpful, but not required. One or more data, insurance or functional certifications such as IIA, DMIP, LIMA, AIDM, CIDM, or CPA helpful but not required. Knowledge of statistical, regulatory and other National Association of Insurance Commissioners (NAIC) information will be required when dealing with external agencies. #LI-ED1 #LI-HYBRID In certain jurisdictions, CNA is legally required to include a reasonable estimate of the compensation for this role. In District of Columbia, California, Colorado, Connecticut, Illinois, Maryland, Massachusetts, New York and Washington, the national base pay range for this job level is $72,000 to $141,000 annually. Salary determinations are based on various factors, including but not limited to, relevant work experience, skills, certifications and location. CNA offers a comprehensive and competitive benefits package to help our employees - and their family members - achieve their physical, financial, emotional and social wellbeing goals. For a detailed look at CNA's benefits, please visit cnabenefits.com. CNA is committed to providing reasonable accommodations to qualified individuals with disabilities in the recruitment process. To request an accommodation, please contact leaveadministration@cna.com.
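The data-mapping and transformation specification work described in this posting can be illustrated with a small sketch. The source/target field names and transformation rules below are hypothetical, not CNA's actual coding structures:

```python
# Hypothetical mapping spec (source field -> target field + transform),
# illustrating a data mapping/transformation step between two applications.
MAPPING = {
    "pol_no":   ("policy_number", str.strip),
    "prem_amt": ("premium_usd", lambda v: round(float(v), 2)),
    "st_cd":    ("state_code", str.upper),
}

def transform(record):
    """Apply the mapping spec to one source record."""
    return {dst: fn(record[src]) for src, (dst, fn) in MAPPING.items()}

def verify(source_rows, expected_rows):
    """Accuracy check of the transformation, row by row."""
    return all(transform(s) == e for s, e in zip(source_rows, expected_rows))
```

Keeping the mapping as data rather than code makes it reviewable against corporate data policy, which is the kind of control the role emphasizes.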

Posted 30+ days ago

Guidehouse logo
Guidehouse, Arlington, VA
Job Family: Data Science Consulting Travel Required: Up to 25% Clearance Required: Ability to Obtain Public Trust What You Will Do: We are seeking an experienced Data Engineer to join our growing AI and Data practice, with a dedicated focus on the Federal Civilian Agencies (FCA) market within the Communities, Energy & Infrastructure (CEI) segment. This individual will be a hands-on technical contributor, responsible for designing and implementing scalable data pipelines and interactive dashboards that enable federal clients to achieve mission outcomes, operational efficiency, and digital transformation. This is a strategic delivery role for someone who thrives at the intersection of data engineering, cloud platforms, and public sector analytics. Client Leadership & Delivery Collaborate with FCA clients to understand data architecture and reporting needs. Lead the development of ETL pipelines and dashboard integrations using Databricks and Tableau. Ensure delivery excellence and measurable outcomes across data migration and visualization efforts. Solution Development & Innovation Design and implement scalable ETL/ELT pipelines using Spark, SQL, and Python. Develop and optimize Tableau dashboards aligned with federal reporting standards. Apply AI/ML tools to automate metadata extraction, clustering, and dashboard scripting. Practice & Team Leadership Work closely with data architects, analysts, and cloud engineers to deliver integrated solutions. Support documentation, testing, and deployment of data products. Mentor junior developers and contribute to reusable frameworks and accelerators. 
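The ETL/ELT pipeline work this role centers on follows a common extract-transform-load shape. A minimal plain-Python sketch of that shape (in practice this would be a Spark or Databricks job; the field names and quality rule are illustrative):

```python
# Minimal ETL sketch: extract -> transform -> load.
# Plain Python is used only to show the shape of the pipeline; the role
# described above would implement this with Spark/Databricks.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: drop rows failing a basic quality check, derive a field."""
    for r in records:
        if r.get("amount") is None:
            continue  # quality check: skip rows with missing amounts
        yield {**r, "amount_cents": int(round(r["amount"] * 100))}

def load(records):
    """Load: collect into a target table (a list standing in for a warehouse)."""
    return list(records)

def run_pipeline(source_rows):
    return load(transform(extract(source_rows)))
```

Because each stage is a generator, records stream through without materializing intermediate copies, the same property distributed engines provide at scale.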
What You Will Need: US Citizenship is required Bachelor's degree is required Minimum TWO (2) years of experience in data engineering and dashboard development Proven experience with Databricks, Tableau, and cloud platforms (AWS, Azure) Strong proficiency in SQL, Python, and Spark Experience building ETL pipelines and integrating data sources into reporting platforms Familiarity with data governance, metadata, and compliance frameworks Excellent communication, facilitation, and stakeholder engagement skills What Would Be Nice To Have: AI/LLM Certifications Experience working with FCA clients such as DOT, GSA, USDA, or similar Familiarity with federal contracting and procurement processes What We Offer: Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace. Benefits include: Medical, Rx, Dental & Vision Insurance Personal and Family Sick Time & Company Paid Holidays Position may be eligible for a discretionary variable incentive bonus Parental Leave and Adoption Assistance 401(k) Retirement Plan Basic Life & Supplemental Life Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts Short-Term & Long-Term Disability Student Loan PayDown Tuition Reimbursement, Personal Development & Learning Opportunities Skills Development & Certifications Employee Referral Program Corporate Sponsored Events & Community Outreach Emergency Back-Up Childcare Program Mobility Stipend About Guidehouse Guidehouse is an Equal Opportunity Employer-Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco. 
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or guidehouse@myworkday.com. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse's Ethics Hotline. If you want to check the validity of correspondence you have received, please contact recruiting@guidehouse.com. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant's dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.

Posted 5 days ago

MassMutual Financial Group logo
MassMutual Financial Group, New York, NY
The Opportunity Join our dynamic team as a Data Software Engineer - Corporate Technology Data Engineering & Analytics, where you'll play a pivotal role in leading the design, development, and administration of complex AWS environments and contribute to enterprise-wide data initiatives. The ideal candidate will bring experience working with global delivery teams, demonstrate thought leadership in data architecture, and actively mentor junior engineers to build internal capability. The Team You'll be an integral part of our esteemed Corporate Technology Team, focused on Data Engineering & Analytics. Our team operates on a global scale, driving innovation and excellence across diverse areas of expertise. In this role, you'll support high-impact Corporate Technology Finance initiatives, ensuring alignment with organizational objectives and delivering impactful outcomes. This is an opportunity to collaborate closely with our Corp Technology leadership team as well as our CFO customers. Our team thrives on collaboration, innovation, and a shared commitment to excellence. Together, we're shaping the future of technology within our organization and making a lasting impact on a global scale. Join us and be part of a dynamic team where your contributions will be valued and your potential unleashed. The Impact: Lead the design, development, and deployment of end-to-end data engineering solutions on AWS, ensuring scalability, performance, and security. Administer and manage a broad range of AWS services such as EC2, S3, Glue, Lambda, IAM, and more in a production environment. Develop and maintain infrastructure as code using tools like AWS CloudFormation or Terraform. Design and orchestrate complex data pipelines using Apache Airflow or AWS Step Functions to enable reliable ELT/ETL processes. Implement data quality checks, monitoring, and alerting to ensure data integrity and reliability. 
Optimize data workflows for performance and cost-efficiency in the cloud. Drive the implementation of Operational Data Store (ODS) patterns and integrate with downstream data warehouse and analytics layers. Collaborate with global delivery teams across time zones to deliver high-quality, well-documented solutions on schedule. Develop and evaluate proof-of-concepts (POCs) for AWS-native and third-party tool integrations (e.g., Databricks, Snowflake). Conduct architecture reviews, provide code-level guidance, and establish best practices across cloud, data, and DevOps efforts. Mentor junior engineers and support team skill development through peer reviews, technical coaching, and design sessions. Participate in roadmap planning and contribute to long-term data platform strategy and technology selection. Ensure compliance with data governance, security, and operational standards across all engineering activities. The Minimum Qualifications Bachelor's in Computer Science, Engineering or related technical field 8+ years of experience building and managing cloud solutions, preferably in the AWS ecosystem, including infrastructure and data services 2+ years of experience with data warehousing concepts, ODS, dimensional modeling, and scalable architecture design 2+ years of experience with Terraform, Apache Airflow, Databricks or Snowflake in a production or large-scale prototype environment 2+ years of experience in Python, SQL, and automation scripting 2+ years of experience with containerization (Docker, ECS, EKS) The Ideal Qualifications Master's degree in Computer Science, Engineering or related field AWS Certifications (Solutions Architect, Data Analytics, or DevOps Engineer). 
Knowledge of streaming data technologies (e.g., Kinesis, Kafka) Exposure to data lake and data warehouse architectures Experience with monitoring and observability tools (e.g., CloudWatch, Datadog, Prometheus) Exposure to machine learning pipelines and MLOps tools (e.g., SageMaker, MLflow, Kubeflow) Demonstrated experience working with global delivery teams and cross-functional stakeholders Strong communication skills with a proven ability to work across functional teams Experience with data lake architecture, data governance frameworks, and modern metadata management Familiarity with modern DevOps practices including CI/CD pipelines, monitoring, and alerting in cloud environments Experience with data cataloging tools (e.g., AWS Glue Data Catalog, Apache Atlas) Understanding of data privacy regulations and compliance (e.g., GDPR, HIPAA) Exceptional communication and interpersonal skills Ability to influence and motivate teams without direct authority Excellent time management and organizational skills, with the ability to prioritize multiple initiatives #LI-RK1 Salary Range: $134,400.00-$176,400.00 At MassMutual, we focus on ensuring fair, equitable pay by providing competitive salaries, along with incentive and bonus opportunities for all employees. Your total compensation package includes either a bonus target or, in a sales-focused role, a Variable Incentive Compensation component. Why Join Us? We've been around since 1851. During our history, we've learned a few things about making sure our customers are our top priority. In order to meet and exceed their expectations, we must have the best people providing the best thinking, products and services. To accomplish this, we celebrate an inclusive, vibrant and diverse culture that encourages growth, openness and opportunities for everyone. A career with MassMutual means you will be part of a strong, stable and ethical business with industry leading pay and benefits. And your voice will always be heard. 
We help people secure their future and protect the ones they love. As a company owned by our policyowners, we are defined by mutuality and our vision to put customers first. It's more than our company structure - it's our way of life. We are a company of people protecting people. Our company exists because people are willing to share risk and resources, and rely on each other when it counts. At MassMutual, we Live Mutual. MassMutual is an Equal Employment Opportunity employer Minority/Female/Sexual Orientation/Gender Identity/Individual with Disability/Protected Veteran. We welcome all persons to apply. Note: Veterans are welcome to apply, regardless of their discharge status. If you need an accommodation to complete the application process, please contact us and share the specifics of the assistance you need. For more information about our extensive benefits offerings, please check out our Total Rewards at a Glance.
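The pipeline orchestration this role calls for (Apache Airflow or AWS Step Functions) boils down to running tasks in dependency order over a directed acyclic graph. A minimal sketch of that idea in plain Python using the standard library; this is not Airflow's actual API, and the task names are illustrative:

```python
from graphlib import TopologicalSorter

# Minimal sketch of DAG-style orchestration, the service Airflow or Step
# Functions provide: each task runs only after its upstream tasks complete.

def run_dag(tasks, deps):
    """tasks: name -> callable; deps: name -> set of upstream task names."""
    order = TopologicalSorter(deps).static_order()  # dependency-respecting order
    return {name: tasks[name]() for name in order}
```

A real orchestrator adds what this sketch omits: scheduling, retries, backfills, and per-task monitoring, which is why a managed service is preferred over hand-rolled ordering.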

Posted 30+ days ago

Johnson & Johnson logo
Johnson & Johnson, Spring House, PA
At Johnson & Johnson, we believe health is everything. Our strength in healthcare innovation empowers us to build a world where complex diseases are prevented, treated, and cured, where treatments are smarter and less invasive, and solutions are personal. Through our expertise in Innovative Medicine and MedTech, we are uniquely positioned to innovate across the full spectrum of healthcare solutions today to deliver the breakthroughs of tomorrow, and profoundly impact health for humanity. Learn more at https://www.jnj.com Job Function: Data Analytics & Computational Sciences Job Sub Function: Data Science Job Category: Scientific/Technology All Job Posting Locations: Cambridge, Massachusetts, United States of America, Spring House, Pennsylvania, United States of America, Titusville, New Jersey, United States of America Job Description: Johnson & Johnson Innovative Medicine is recruiting for a Knowledge Graph Engineer, R&D Data Science & Digital Health - Data Strategy and Products. The primary location is Barcelona or Madrid, Spain, but the role is also open to Titusville, NJ; Spring House, PA; or Cambridge, MA. Our expertise in Innovative Medicine is informed and inspired by patients, whose insights fuel our science-based advancements. Engineers like you work on teams that save lives by developing the medicines of tomorrow. Join us in developing treatments, finding cures, and pioneering the path from lab to life while championing patients every step of the way. Learn more at https://www.jnj.com/innovative-medicine Job Responsibilities We are committed to using innovative technology to improve healthcare outcomes worldwide. As part of this mission, we are seeking a Knowledge Graph Engineer to join our Data Strategy and Products team to standardize and connect biomedical and clinical data. You will be a hands-on technical contributor with depth in semantic technologies, ontology, and graph data modeling, plus strong familiarity with the life sciences domain. 
You will connect enterprise master data with R&D data across the entire product lifecycle so trusted, interoperable knowledge powers analytics, search, and AI across Johnson and Johnson Innovative Medicine. Contribute to the design and implementation of a scalable knowledge graph infrastructure focused on data standardization and interoperability. Curate and extend ontologies for clear mapping into established biomedical ontologies and controlled terminologies using RDF standards. Apply graph-based data modeling for efficient organization, integration and retrieval to ensure system flexibility and long-term maintainability. Stand up SPARQL/GraphQL/REST services; develop pipelines to ingest, normalize and map concepts across data sources. Extend and curate ontologies (e.g., diseases, drugs, targets, pathways) and maintain synonyms, cross-references, and provenance. Partner with cross-functional teams to enable NLP/RAG over graphs, features for predictive modeling, and terminology services for search and study design tools. Work with IT and DevOps teams to deploy and manage the graph database infrastructure, focusing on high availability, scalability, and recovery operations. Create and maintain documentation, such as data dictionaries, data lineage, and data flow diagrams, to facilitate understanding of the knowledge graph. Job Qualifications Desired Ph.D. or master's degree in bioengineering, computer science, IT, bioinformatics, physics, mathematics, or related fields, with an emphasis on semantic technologies and biomedical applications. At least 5 years of professional experience in health informatics, or at least 7 years of professional experience, with additional consideration for candidates with graduate degrees or equivalent experience. Programming background in parser combinators, natural language processing, and linked data (RDF triple stores and property graphs). 
Demonstrated experience in large-scale knowledge graph construction, ontology development, and data integration in pharmaceutical or healthcare domains. Proficiency in semantic web technologies (SPARQL, RDF, OWL) and familiarity with graph databases (Neo4j, Amazon Neptune). Proven work with complex biomedical datasets, including genomics, proteomics, and high-throughput screening data. A track record in a pharmaceutical, biotech, or related research environment is preferred. Proficiency in various data storage solutions (SQL, key-value, column, document, graph stores) and data modeling techniques (semantic data, ontologies, taxonomies). Experience with Git, CI/CD stacks (Jenkins, GitLab, Azure DevOps), DevOps tools, metrics/monitoring, and containerization technologies (Docker, Singularity). Strong skills in analysis, problem-solving, organizational change, project delivery, and managing external vendors. Demonstrated agile decision-making, performance management, continuous learning, and commitment to quality. Ability to multi-task, prioritize work, exhibit organizational skills and flexibility to deliver maximum business value. Capacity to translate discussions into user requirements and project plans. Willingness to travel less than 25% to conferences and internal meetings. #JRDDS The anticipated base pay range for this position is $146,200 to $197,800 USD. Additional Description for Pay Transparency: Subject to the terms of their respective plans, employees and/or eligible dependents are eligible to participate in the following Company sponsored employee benefit programs: medical, dental, vision, life insurance, short- and long-term disability, business accident insurance, and group legal insurance. Subject to the terms of their respective plans, employees are eligible to participate in the Company's consolidated retirement plan (pension) and savings plan (401(k)). 
This position is eligible to participate in the Company's long-term incentive program. Subject to the terms of their respective policies and date of hire, employees are eligible for the following time off benefits: Vacation - 120 hours per calendar year Sick time - 40 hours per calendar year; for employees who reside in the State of Washington - 56 hours per calendar year Holiday pay, including Floating Holidays - 13 days per calendar year Work, Personal and Family Time - up to 40 hours per calendar year Parental Leave - 480 hours within one year of the birth/adoption/foster care of a child Condolence Leave - 30 days for an immediate family member; 5 days for an extended family member Caregiver Leave - 10 days Volunteer Leave - 4 days Military Spouse Time-Off - 80 hours Additional information can be found through the link below. https://www.careers.jnj.com/employee-benefits
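The knowledge-graph work described in this posting rests on the RDF model: facts stored as subject-predicate-object triples and matched by pattern, which is what SPARQL variables do. A toy illustration with hypothetical biomedical entities; a real system would use an RDF store (e.g., Neo4j or Amazon Neptune) rather than an in-memory set:

```python
# Toy triple store illustrating the RDF subject-predicate-object model.
# Entity names are illustrative, not drawn from any real ontology release.
triples = {
    ("imatinib", "treats", "CML"),
    ("imatinib", "targets", "BCR-ABL"),
    ("CML", "is_a", "leukemia"),
}

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None acts as a wildcard,
    playing the role of a SPARQL variable."""
    return {
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    }
```

For example, `query(s="imatinib")` returns every fact about that drug, and `query(p="is_a")` walks the ontology's subsumption edges.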

Posted 3 days ago

The New York Times Company logo
The New York Times Company, New York, NY
The mission of The New York Times is to seek the truth and help people understand the world. That means independent journalism is at the heart of all we do as a company. It's why we have a world-renowned newsroom that sends journalists to report on the ground from nearly 160 countries. It's why we focus deeply on how our readers will experience our journalism, from print to audio to a world-class digital and app destination. And it's why our business strategy centers on making journalism so good that it's worth paying for. About the Role The New York Times is looking for a senior data engineer to join the Customer-Facing Data Products team to develop real-time data pipelines and APIs that process events and serve aggregated data for customer-facing use cases. You will report to the Engineering Manager for the Customer-Facing Data Products team and build widely reusable solutions to help partner teams solve our most important real-time needs, including behavioral and targeting use cases. This is a hybrid role based in our New York City headquarters. Responsibilities Develop real-time data pipelines using Apache Kafka, Apache Flink, and other streaming technologies. Ingest and organize structured and unstructured data for widespread reuse across patterns. Implement mechanisms to ensure data quality, observability and governance best practices. Collaborate with software engineers and infrastructure teams to improve pipeline performance and integrate solutions into production environments. Stay current with the latest advancements in streaming data processing and related technologies. Grow the skills of colleagues by providing clear technical feedback through pairing, design, and code review. Collaborate with product and partner teams to meet shared goals. 
Demonstrate support and understanding of our value of journalistic independence and a strong commitment to our mission to seek the truth and help people understand the world. Basic Qualifications: 5+ years of full-time data engineering experience shipping real-time solutions with event-driven architectures and stream-processing frameworks. Experience with AWS and its service offerings and tools. Understanding of modern API design principles and technologies, including REST, GraphQL, and gRPC for data serving. Programming fluency with Python. Experience using version control and CI/CD tools, such as GitHub and Drone. Preferred Qualifications: Experience developing pipelines with Apache Kafka, Apache Flink, or Spark Streaming. Experience with SQL and building APIs with Go and Protobuf. Understanding of cloud-native data platform technologies including data lakehouse and medallion architectures. REQ-018499 The annual base pay range for this role is between $140,000 and $155,000 USD. The New York Times Company is committed to being the world's best source of independent, reliable and quality journalism. To do so, we embrace a diverse workforce that has a broad range of backgrounds and experiences across our ranks, at all levels of the organization. We encourage people from all backgrounds to apply. We are an Equal Opportunity Employer and do not discriminate on the basis of an individual's sex, age, race, color, creed, national origin, alienage, religion, marital status, pregnancy, sexual orientation or affectional preference, gender identity and expression, disability, genetic trait or predisposition, carrier status, citizenship, veteran or military status and other personal characteristics protected by law. All applications will receive consideration for employment without regard to legally protected characteristics. The U.S. Equal Employment Opportunity Commission (EEOC)'s Know Your Rights Poster is available here. 
The New York Times Company will provide reasonable accommodations as required by applicable federal, state, and/or local laws. Individuals seeking an accommodation for the application or interview process should email reasonable.accommodations@nytimes.com. Emails sent for unrelated issues, such as following up on an application, will not receive a response. The Company will further consider qualified applicants, including those with criminal histories, in a manner consistent with the requirements of applicable "Fair Chance" laws. For information about The New York Times' privacy practices for job applicants click here. Please beware of fraudulent job postings. Scammers may post fraudulent job opportunities, and they may even make fraudulent employment offers. This is done by bad actors to collect personal information and money from victims. All legitimate job opportunities from The New York Times will be accessible through The New York Times careers site. The New York Times will not ask job applicants for financial information or for payment, and will not refer you to a third party to do so. You should never send money to anyone who suggests they can provide employment with The New York Times. If you see a fake or fraudulent job posting, or if you suspect you have received a fraudulent offer, you can report it to The New York Times at NYTapplicants@nytimes.com. You can also file a report with the Federal Trade Commission or your state attorney general.
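The real-time aggregation work this role describes (e.g., Flink jobs over Kafka event streams) often reduces to windowed computations. A minimal tumbling-window count in plain Python; the event shape and window size are illustrative, and a real stream processor would add watermarks and late-event handling:

```python
from collections import Counter

# Minimal sketch of a tumbling-window aggregation, the kind of real-time
# computation a Kafka/Flink pipeline performs. Events are (timestamp_seconds,
# key) pairs; we count events per key within fixed, non-overlapping windows.

def tumbling_window_counts(events, window_seconds):
    """Return {window_start_second: Counter(key -> count)}."""
    windows = {}
    for ts, key in events:
        start = (ts // window_seconds) * window_seconds  # window assignment
        windows.setdefault(start, Counter())[key] += 1
    return windows
```

For behavioral and targeting use cases, the window results would then be served through an API rather than returned in memory.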

Posted 30+ days ago

Geico Insurance logo
Geico Insurance, Katy, TX
At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities. Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers' expectations while making a real impact for our company through our shared purpose. When you join our company, we want you to feel valued, supported and proud to work here. That's why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers. GEICO is looking for a customer-obsessed and results-oriented Product Manager to support our Data Ingestion and Movement platform. This role will help drive product direction for our data ingestion, ETL/ELT pipelines, and data movement services, focusing on enabling reliable data flow into our lakehouse and other data stores. The ideal candidate will have a technical background in data engineering and experience delivering scalable data platforms and data pipeline solutions. Description As a Product Manager for Data Ingestion and Movement, you will be responsible for supporting the product vision and execution for GEICO's data ingestion and movement products. To successfully shape a platform that enables pipeline-as-a-service and supports a scalable data mesh architecture, a strong technical understanding of data pipelines, data integration patterns, data orchestration, ETL/ELT processes, and platform engineering is essential. Your goal is to abstract complexity and empower domain teams to autonomously and efficiently build, deploy, and govern data pipelines. This role also requires stakeholder management skills and the ability to bridge technical solutions with business value. 
Key Responsibilities Support the development and execution of data ingestion and movement platform vision aligned with business goals and customer needs Help create and maintain a clear, prioritized roadmap for data ingestion and movement capabilities that balances short-term delivery with long-term strategic objectives Support evangelizing the Data Ingestion and Movement platform across the organization and help drive stakeholder alignment Stay abreast of industry trends and competitive landscape (Apache Kafka, Apache Airflow, AWS Glue, Azure Data Factory, Google Cloud Dataflow, etc.) to inform data ingestion strategy Support requirement gathering and product strategy for data ingestion, ETL/ELT pipelines, and data movement services Understand end-to-end data ingestion workflows and how data movement fits into the broader data ecosystem and downstream analytics Support data governance initiatives for data lineage, quality, and compliance in data ingestion and movement processes Ensure data ingestion and movement processes adhere to regulatory, compliance, and data quality standards Partner with engineering on the development of data ingestion tools, pipeline orchestration services, and data movement capabilities Help define product capabilities for data ingestion, pipeline monitoring, error handling, and data quality validation to improve reliability and performance Support customer roadshows and training on data ingestion and movement capabilities Build instrumentation and observability into data ingestion and movement tools to enable data-driven product decisions and pipeline monitoring Work closely with engineering, data engineering, and data teams to ensure seamless delivery of data ingestion and movement products Partner with customer success, support, and engineering teams to create clear feedback loops Translate data ingestion and movement technical capabilities into business value and user benefits Support alignment across multiple stakeholders and teams 
in complex, ambiguous environments Qualifications Required Understanding of data ingestion patterns, ETL/ELT processes, and data pipeline architectures (Apache Kafka, Apache Airflow, Apache Spark, AWS Glue, etc.) Experience with data integration APIs, connectors, and data pipeline orchestration tools Basic understanding of data pipeline monitoring, observability, and data quality validation practices Experience in cloud data ecosystems (AWS, GCP, Azure) Proven analytical and problem-solving abilities with a data-driven approach to decision-making Experience working with Agile methodologies and tools (JIRA, Azure DevOps) Good communication, stakeholder management, and cross-functional collaboration skills Strong organizational skills with ability to manage product backlogs Preferred Previous experience as a software or data engineer is a plus Strong business acumen to prioritize features based on customer value and business impact Experience with data ingestion tools (Apache Kafka, Apache NiFi, AWS Kinesis, Azure Event Hubs, etc.) Knowledge of data lineage, data quality frameworks, and compliance requirements for data ingestion Insurance industry experience Experience Minimum 5+ years of technical product management experience building platforms that support data ingestion, ETL/ELT pipelines, data engineering, and data infrastructure Track record of delivering successful products in fast-paced environments Experience supporting complex, multi-stakeholder initiatives Proven ability to work with technical teams and translate business requirements into technical product specifications Experience with customer research, user interviews, and data-driven decision making Education Bachelor's degree in computer science, engineering, management information systems, or related technical field required MBA/MS or equivalent experience preferred Annual Salary $88,150.00 - $157,850.00 The above annual salary range is a general guideline. 
Multiple factors are taken into consideration to arrive at the final hourly rate/annual salary offered to the selected candidate. Factors include, but are not limited to, the scope and responsibilities of the role, the selected candidate's work experience, education and training, and the work location, as well as market and business considerations. At this time, GEICO will not sponsor a new applicant for employment authorization for this position.

The GEICO Pledge:

Great Company: At GEICO, we help our customers through life's twists and turns. Our mission is to protect people when they need it most, and we're constantly evolving to stay ahead of their needs. We're an iconic brand that thrives on innovation, exceeding our customers' expectations and enabling our collective success. From day one, you'll take on exciting challenges that help you grow and collaborate with dynamic teams who want to make a positive impact on people's lives.

Great Careers: We offer a career where you can learn, grow, and thrive through personalized development programs, created with your career - and your potential - in mind. You'll have access to industry-leading training, certification assistance, and career mentorship and coaching with supportive leaders at all levels.

Great Culture: We foster an inclusive culture of shared success, rooted in integrity, a bias for action, and a winning mindset. Grounded by our core values, we have an established culture of caring, inclusion, and belonging that values different perspectives. Our multi-faceted teams are led by supportive leaders, driven by performance excellence, and unified under a shared purpose. As part of our culture, we also offer employee engagement and recognition programs that reward the positive impact our work makes on the lives of our customers.

Great Rewards: We offer compensation and benefits built to enhance your physical well-being, mental and emotional health, and financial future.
  • Comprehensive Total Rewards program that offers personalized coverage tailor-made for you and your family's overall well-being.
  • Financial benefits including market-competitive compensation; a 401(k) savings plan vested from day one that offers a 6% match; performance- and recognition-based incentives; and tuition assistance.
  • Access to additional benefits like mental healthcare as well as fertility and adoption assistance.
  • Flexibility: we provide workplace flexibility as well as our GEICO Flex program, which offers the ability to work from anywhere in the US for up to four weeks per year.

The equal employment opportunity policy of the GEICO Companies provides for a fair and equal employment opportunity for all associates and job applicants regardless of race, color, religious creed, national origin, ancestry, age, gender, pregnancy, sexual orientation, gender identity, marital status, familial status, disability, or genetic information, in compliance with applicable federal, state, and local law. GEICO hires and promotes individuals solely on the basis of their qualifications for the job to be filled. GEICO reasonably accommodates qualified individuals with disabilities to enable them to receive equal employment opportunity and/or perform the essential functions of the job, unless the accommodation would impose an undue hardship on the Company. This applies to all applicants and associates. GEICO also provides a work environment in which each associate is able to be productive and work to the best of their ability. We do not condone or tolerate an atmosphere of intimidation or harassment. We expect and require the cooperation of all associates in maintaining an atmosphere free from discrimination and harassment, with mutual respect by and for all associates and applicants.
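The pipeline concerns this posting returns to repeatedly - ingestion, data quality validation, and error handling that routes bad records aside rather than dropping them silently - can be sketched in a few lines. All names here (`Record`, `validate`, `ingest`) and the insurance-flavored fields are illustrative assumptions, not a GEICO or vendor API:

```python
from dataclasses import dataclass

# Hypothetical record shape for illustration only -- not a real schema.
@dataclass
class Record:
    customer_id: str
    premium: float

def validate(rec: Record) -> bool:
    # Data-quality gate: reject rows that would corrupt downstream analytics.
    return bool(rec.customer_id) and rec.premium >= 0

def ingest(raw_rows):
    accepted, rejected = [], []
    for row in raw_rows:
        rec = Record(**row)
        # Failed rows go to a "rejected" list (a stand-in for a
        # dead-letter queue) instead of being silently discarded.
        (accepted if validate(rec) else rejected).append(rec)
    return accepted, rejected

good, bad = ingest([
    {"customer_id": "c-1", "premium": 120.0},
    {"customer_id": "", "premium": -5.0},
])
```

In a production platform the same gate would sit inside an orchestration tool such as Airflow or a stream processor such as Kafka, with the rejected records feeding the pipeline-monitoring metrics the role is asked to help define.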

Posted 2 weeks ago

C logo
C3 AI Inc. - Tysons Corner, VA
C3 AI (NYSE: AI) is the Enterprise AI application software company. C3 AI delivers a family of fully integrated products: the C3 Agentic AI Platform, an end-to-end platform for developing, deploying, and operating enterprise AI applications; C3 AI applications, a portfolio of industry-specific SaaS enterprise AI applications that enable the digital transformation of organizations globally; and C3 Generative AI, a suite of domain-specific generative AI offerings for the enterprise. Learn more at: C3 AI

As a Data Scientist / Senior Data Scientist - Federal Optimization, you will partner with some of the largest and most mission-critical organizations in the world to design and deliver the next generation of AI-powered enterprise applications. Our team focuses on developing scalable, explainable optimization models and algorithms tailored to federal use cases across domains such as logistics, operations, resource planning, and more. You'll work cross-functionally with data scientists, engineers, subject matter experts, and federal stakeholders to deliver full-lifecycle solutions: from translating client input into soft and hard constraints, to deploying robust, production-grade optimization tools on the C3 AI Suite. Qualified candidates should possess deep expertise in operations research and optimization. This role requires US Citizenship.

Responsibilities:
  • Research, design, implement, and deploy optimization solutions for enterprise applications leveraging the C3 AI Suite.
  • Transform client requirements into mathematical formulations.
  • Partner with cross-functional teams to translate optimization model insights into actionable strategies and measurable outcomes.
  • Assist and enable federal customers to build their own optimization applications on the C3 AI Suite.
  • Develop, maintain, and enhance optimization frameworks, libraries, and tools to ensure scalability and efficiency while contributing to the continuous improvement of the C3 AI Suite.
  • Stay informed on state-of-the-art optimization techniques, promote best practices, and foster an innovative and collaborative work environment at C3 AI.

Qualifications:
  • U.S. Citizenship (and willingness to obtain a security clearance).
  • Bachelor's in Computer Science, Electrical Engineering, Statistics, Operations Research, or equivalent fields.
  • Strong foundation in optimization techniques (e.g., LP, MILP, MINLP) and solvers.
  • Strong mathematical foundation (linear algebra, calculus, statistics, probability).
  • Proficiency in Python and experience with mathematical programming libraries.
  • Excellent communication skills and the ability to work independently or in teams.
  • Motivated, curious, and eager to learn about federal mission domains.

Preferred Qualifications:
  • MS or PhD in Operations Research, Applied Mathematics, Computer Science, Industrial Engineering, or a related field.
  • Active TS/SCI with CI or Full-Scope Polygraph.
  • Professional experience applying optimization in federal or customer-facing environments.
  • Familiarity with commercial solvers (e.g., Gurobi), Git, and GenAI tools.
  • Understanding of machine learning, deep learning, or reinforcement learning.
  • Portfolio of relevant projects or publications.

C3 AI provides a competitive compensation package and excellent benefits. Candidates must be authorized to work in the United States without the need for current or future company sponsorship.

C3 AI is proud to be an Equal Opportunity and Affirmative Action Employer. We do not discriminate on the basis of any legally protected characteristics, including disabled and veteran status.
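The "transform client requirements into mathematical formulations" step can be made concrete with a toy resource-planning problem: maximize tons moved subject to a budget, with integer decision variables (a tiny MILP). The numbers and option names are invented, and the brute-force search below only stands in for the commercial solvers (e.g., Gurobi) the posting mentions:

```python
from itertools import product

COST = {"truck": 3, "plane": 10}   # cost per unit (invented numbers)
TONS = {"truck": 5, "plane": 20}   # capacity per unit

BUDGET = 30

# Objective: maximize 5*t + 20*p  subject to  3*t + 10*p <= 30,
# with t and p non-negative integers. The feasible set is tiny,
# so we enumerate it directly.
best = max(
    ((t, p) for t, p in product(range(11), range(4))
     if COST["truck"] * t + COST["plane"] * p <= BUDGET),
    key=lambda tp: TONS["truck"] * tp[0] + TONS["plane"] * tp[1],
)
tons_moved = TONS["truck"] * best[0] + TONS["plane"] * best[1]
```

In a real engagement the same objective and constraints would be written declaratively in a modeling library and handed to a solver; the formulation itself, not the search, is the product of the requirements-translation work described above.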

Posted 30+ days ago

Procter & Gamble logo
Procter & Gamble - Cincinnati, OH
Job Location: CINCINNATI GENERAL OFFICES

Job Description

Are you passionate about data and analytics and eager to drive significant business impact? Procter & Gamble is looking for interns interested in various roles within the data and analytics domain, including Business Analysts, Data Asset Managers, and Data Engineers. You will have the opportunity to apply innovative technologies to solve complex business problems and contribute to data-driven decision-making. The work you will be doing could range from analyzing large datasets to provide actionable insights, to managing data assets to ensure data quality and accessibility, to engineering robust data pipelines that feed analytical models and support business operations. Regardless of the specific role, as a Data, Analytics, & Data Engineering intern, you will be someone who leverages data to solve business problems.

Example Responsibilities by Type:
  • Data Asset Manager: Manage and oversee data assets, ensuring data quality, accessibility, and compliance. Collaborate with teams to implement data governance practices and maintain data integrity.
  • Data Engineer: Design and build robust data pipelines that extract, transform, and load (ETL) data from various sources. Ensure data is accurate, timely, and accessible for analysis, reporting, and the development of machine learning/artificial intelligence models.

Job Qualifications
  • In the process of obtaining a Bachelor's or Master's degree in MIS, Computer Science, Data Science, Applied Math, Statistics, Operations Research, Analytics, or a like degree.
  • Strong analytical and problem-solving skills, with experience in programming languages such as Python, SQL, R, or similar (experience in a specific language is not required; you will be expected to learn new languages).
  • Familiarity with data visualization tools (e.g., Tableau, Power BI) and an understanding of data management practices.
  • Ability to communicate technical concepts effectively to both technical and non-technical colleagues.
  • You must be available during the summer of 2026, from mid/late May through early August.

Preferred:
  • A history of solving complex problems with innovative solutions.
  • Experience with cloud platforms such as Microsoft Azure or Google Cloud Platform.
  • Understanding of data governance and data quality best practices.
  • Relevant experience or coursework in data analysis, data management, or related fields.

Compensation for roles at P&G varies depending on a wide array of non-discriminatory factors including but not limited to the specific office location, role, degree/credentials, relevant skill set, and level of relevant experience. At P&G, compensation decisions are dependent on the facts and circumstances of each case. Total rewards at P&G include salary + bonus (if applicable) + benefits. Your recruiter may be able to share more about our total rewards offerings and the specific salary range for the relevant location(s) during the hiring process.

We are committed to providing equal opportunities in employment. We value diversity and do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Immigration sponsorship is not available for this role. For more information regarding who is eligible for hire at P&G, along with other work authorization FAQs, please click HERE. Procter & Gamble participates in E-Verify as required by law. Qualified individuals will not be disadvantaged based on being unemployed. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Job Schedule: Full time
Job Number: R000137031
Job Segmentation: Internships
Starting Pay / Salary Range: $29.00 - $50.00 / hour
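The Data Engineer bullet above ("extract, transform, and load (ETL) data from various sources") reduces to a three-stage pattern. The CSV payload, field names, and in-memory SQLite target below are invented for illustration; production pipelines would target the cloud platforms the posting names:

```python
import csv
import io
import sqlite3

# Invented sample payload, with one malformed row ("abc") to show cleansing.
RAW = "sku,units_sold\nA-1,10\nA-2,abc\nA-1,5\n"

def extract(text):
    # Extract: parse the raw source into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: drop rows with non-integer counts, aggregate per SKU.
    totals = {}
    for r in rows:
        if r["units_sold"].isdigit():
            totals[r["sku"]] = totals.get(r["sku"], 0) + int(r["units_sold"])
    return totals

def load(totals):
    # Load: write the cleaned aggregate to a queryable store.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (sku TEXT PRIMARY KEY, units INTEGER)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", totals.items())
    return db

db = load(transform(extract(RAW)))
rows = sorted(db.execute("SELECT sku, units FROM sales"))
```

Each stage maps to one responsibility in the internship description: the transform step is where the "accurate, timely" data-quality guarantee is enforced before anything reaches analysis or model training.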

Posted 30+ days ago

PricewaterhouseCoopers logo

Cloud Engineering, Data & Analytics - Data Science - Summer/Fall 2026 Associate

PricewaterhouseCoopers - Rosemont, Illinois


Job Description

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Associate

Job Description & Summary

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:

  • Apply a learning mindset and take ownership for your own development.
  • Appreciate diverse perspectives, needs, and feelings of others.
  • Adopt habits to sustain high performance and develop your potential.
  • Actively listen, ask questions to check understanding, and clearly express ideas.
  • Seek, reflect, act on, and give feedback.
  • Gather information from a range of sources to analyse facts and discern patterns.
  • Commit to understanding how the business works and building commercial awareness.
  • Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.
The Opportunity

As a Cloud Engineering, Data & Analytics - Data Science professional, you will engage in the dynamic world of data analysis, leveraging your skills to transform complex data into actionable insights. This role involves working closely with clients to understand their needs and deliver data-driven solutions that enhance decision-making processes. You will be at the forefront of utilizing advanced data analytics tools and techniques to uncover trends and opportunities within financial markets.

As an Associate, you will focus on learning and contributing to client engagements while developing your skills and knowledge to deliver quality work. You will be exposed to clients, learning how to build meaningful connections, manage projects, and inspire others. This role encourages you to embrace challenges as opportunities for growth, enhancing your personal brand and technical knowledge.

In this role, you will be part of a team that supports the Data and Analytics group, where you will apply your skills in data modeling, machine learning, and statistical analysis to deliver impactful solutions. Your contributions will be vital in helping clients navigate complex data landscapes, maintain data integrity, and drive business success through informed decision-making.

Responsibilities
  • Conducting complex data analysis to extract meaningful insights and support decision-making processes
  • Developing algorithms and predictive models using machine learning techniques to enhance data-driven insights
  • Utilizing Python and statistical analysis software for data modeling and validation tasks
  • Building and maintaining data pipelines to ensure seamless data integration and management
  • Creating interactive dashboards and visualizations using Power BI and Tableau to communicate data findings effectively
  • Collaborating with teams to conduct analytic research and customer analysis for business improvement
  • Implementing data security measures to protect sensitive information and maintain data integrity
  • Engaging in exploratory data analysis to identify trends and patterns within large datasets
  • Supporting client engagements by applying business data analytics techniques to address specific needs

What You Must Have
  • Currently pursuing or have completed a Bachelor's degree
  • Client service associate positions are entry-level roles intended for job seekers who are completing or have recently completed their final academic year of educational requirements

What Sets You Apart
  • Preference for one of the following fields of study: Management Information Systems, Information Technology, Computer Science, Data Analytics, Data Science, Statistics, Mathematics
  • Preference for a 3.3 overall GPA
  • Demonstrating proficiency in Python and machine learning
  • Utilizing Power BI and Tableau for data visualization
  • Conducting complex data analysis and predictive analytics
  • Developing data-driven insights for client support
  • Engaging in algorithm development and data modeling
  • Excelling in data security and validation techniques
  • Leveraging AI to create efficiencies, innovate ways of working, and deliver distinctive outcomes
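Of the responsibilities listed, "developing algorithms and predictive models" with Python is the most concrete; a minimal version is an ordinary-least-squares line fit, written out by hand so the statistics stay visible. The data points are invented, and client work would use established libraries rather than this sketch:

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = slope * x + intercept:
    # slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented sample data with an exact linear relationship (y = 2x).
slope, intercept = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
```

Once fitted, the pair (slope, intercept) is the predictive model: plugging a new x into slope * x + intercept yields the forecast, and the same residual arithmetic underlies the validation tasks the role describes.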

Travel Requirements

Up to 80%

Job Posting End Date

Learn more about how we work: https://pwc.to/how-we-work

PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: https://pwc.to/H-1B-Lottery-Policy.

As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or any other status protected by law. For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, the San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all.

The salary range for this position is $61,000 - $100,000, plus individuals may be eligible for an annual discretionary bonus. For roles that are based in Maryland, this is the listed salary range for this position. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications, and location, and applicable employment laws. PwC offers a wide range of benefits, including medical, dental, vision, 401(k), holiday pay, vacation, personal and family sick leave, and more.
To view our benefits at a glance, please visit the following link: https://pwc.to/benefits-at-a-glance
