
Auto-apply to these data science jobs

We've scanned millions of jobs. Simply select your favorites, and we can fill out the applications for you.

Senior Data Scientist-Lead Data Engineer (N373)
Heluna Health | Los Angeles, California
Salary Range: $9,333.00 - $12,576.46 Monthly

SUMMARY
Housing for Health (HFH) is a program office within Community Programs, a division under the Los Angeles County Department of Health Services (DHS) for the County of Los Angeles. HFH was created to support people experiencing homelessness with complex clinical needs. We support people in obtaining housing, improving their health and thriving in their communities. HFH is a core component of Los Angeles County's effort to respond to the homeless emergency. Where appropriate to the job function, a hybrid work schedule may be available, with employees working both remotely and from the office, as needed.

This position offers a unique opportunity to shape the data infrastructure of one of the most impactful homelessness service systems in the nation. The Lead Data Engineer will oversee the foundational architecture for Community Programs' Databricks-based Lakehouse, supporting seamless integration, governance, and analysis of health, housing, and justice data across initiatives including Housing for Health and others. The role reports to the Director of Data Analytics and Evaluation and will guide efforts across data ingestion, transformation, privacy, and CalAIM claims workflows. This is a chance to architect a scalable data environment from the ground up in a mission-driven context. The engineering team plays a central role in the County's data strategy, with opportunities for mentorship, innovation, and cross-sector impact.

ESSENTIAL FUNCTIONS
Lead the development of bronze, silver, and gold data layers from highly normalized data sources, translating complex table relationships into denormalized, analyst-ready structures.
Implement GitHub-based CI/CD workflows that manage Dev and Prod separation, parameterized notebook execution, and automated testing.
Use Azure Data Factory and related tools to ingest and orchestrate data flows from diverse internal and external sources.
Collaborate closely with analytics and privacy engineers to ensure models align with data governance and privacy requirements.
Support integration of MDM-linked datasets and promote entity resolution strategies across systems.
Monitor pipeline health and lead resolution of data quality issues in coordination with analysts and external partners.
Review and approve engineering code as part of the CI/CD workflow; unblock or resolve issues holding up promotion to Prod.
Define architecture standards and RBAC models in Unity Catalog for Departmental and Countywide access.
Mentor engineering team members on best practices and solution design.
Produce clear, maintainable documentation and SOPs to support onboarding, data catalog use, and model transparency.

JOB QUALIFICATIONS
Option I: Two (2) years of experience independently leading complex data engineering projects, including developing and applying methods to collect, structure, and transform data using statistical, programmatic, and engineering techniques to support data-driven decision-making, at a level equivalent to the Los Angeles County class of Data Scientist.
Option II: A Bachelor's degree from an accredited college or university in a field such as Data Science, Data Engineering, Computer Science, Statistics, Mathematics, Machine Learning, Business Analytics, Psychology, or Public Health, with at least 12 semester or 18 quarter units in data science, quantitative research methods, programming, or statistical analysis – AND – six (6) years of experience in data engineering, including two (2) years in a lead or supervisory capacity. A Master's or Doctoral degree from an accredited college or university in a field of applied research such as Data Science, Machine Learning, Mathematics, Statistics, Business Analytics, Psychology, or Public Health may substitute for up to two (2) years of experience.

Certificates/Licenses/Clearances
A valid California Class C Driver License or the ability to utilize an alternative method of transportation when needed to carry out job-related essential functions. Successful clearing through the Live Scan and the Health Clearance process with the County of Los Angeles.

Other Skills, Knowledge, and Abilities
7+ years building scalable data pipelines in cloud environments.
Advanced proficiency in Databricks (Delta Live Tables, Unity Catalog), Spark, SQL, and Python.
Experience with CI/CD (GitHub, Azure DevOps), infrastructure-as-code (Terraform), and Azure tools (Data Factory, Synapse).
Skilled in MDM integration and entity resolution.
Familiarity with MLflow, experience embedding data expectations within ETL/ELT pipelines, and data observability practices.
Strong grasp of Medallion Architecture, performance optimization, and ETL/ELT design.
Working knowledge of HIPAA, 42 CFR Part 2, and public-sector data governance.
Experience mentoring teams and leading technical standards.

PHYSICAL DEMANDS
Stand: Frequently
Walk: Frequently
Sit: Frequently
Reach Outward: Occasionally
Reach Above Shoulder: Occasionally
Climb, Crawl, Kneel, Bend: Occasionally
Lift/Carry: Occasionally - Up to 15 lbs.
Push/Pull: Occasionally - Up to 15 lbs.
See: Constantly
Taste/Smell: Not Applicable
Not Applicable = Not required for essential functions; Occasionally = 0-2 hrs./day; Frequently = 2-5 hrs./day; Constantly = 5+ hrs./day

WORK ENVIRONMENT
General Office Setting, Indoors, Temperature Controlled

EEOC STATEMENT
It is the policy of Heluna Health to provide equal employment opportunities to all employees and applicants, without regard to age (40 and over), national origin or ancestry, race, color, religion, sex, gender, sexual orientation, pregnancy or perceived pregnancy, reproductive health decision making, physical or mental disability, medical condition (including cancer or a record or history of cancer), AIDS or HIV, genetic information or characteristics, veteran status or military service.
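The bronze/silver/gold responsibilities above describe a medallion-style Lakehouse on Databricks. As a hedged sketch only (the table names, columns, and cleaning rules below are invented for illustration and are not Heluna Health's actual schema), a bronze-to-silver step in PySpark might look like this:

```python
# Hypothetical bronze -> silver medallion step; table and column names are
# invented for illustration and do not reflect any real HFH data model.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_demo").getOrCreate()

# Bronze: raw, highly normalized source records ingested as-is.
bronze = spark.table("bronze.housing_referrals")

# Silver: typed, deduplicated, analyst-ready records.
silver = (
    bronze
    .filter(F.col("referral_id").isNotNull())                 # drop incomplete rows
    .withColumn("referral_date", F.to_date("referral_date"))  # normalize types
    .dropDuplicates(["referral_id"])                          # one row per referral
)

silver.write.mode("overwrite").saveAsTable("silver.housing_referrals_clean")
```

A gold layer would typically join several such silver tables into denormalized, reporting-ready views, which is the "analyst-ready structures" goal the posting names.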

Posted 1 day ago

Director, Data Scientist & Data Operations
Eisai | Nutley, New Jersey
At Eisai, satisfying unmet medical needs and increasing the benefits healthcare provides to patients, their families, and caregivers is Eisai's human health care (hhc) mission. We're a growing pharmaceutical company that is breaking through in neurology and oncology, with a strong emphasis on research and development. Our history includes the development of many innovative medicines, notably the discovery of the world's most widely-used treatment for Alzheimer's disease. As we continue to expand, we are seeking highly motivated individuals who want to work in a fast-paced environment and make a difference. If this is your profile, we want to hear from you.

The Data Operations Group at Eisai, Inc. is looking for a Director - Data Scientist/Programmer to drive drug development through predictive modeling of disease and drug response. The Data Scientist will work closely with Biostatisticians in the Stat Methodology / Machine Learning group to support projects across various stages of development. This role will be integral in providing actionable insights for critical data science projects that are vital to our business. This position may be either office-based (hybrid) in Nutley, NJ, or remote-based.

Responsibilities
Data Analysis for Strategic Insights: Skilled in analyzing complex datasets to extract actionable insights, identify key trends and patterns, and provide data-driven recommendations that support and guide strategic decision-making.
Data Storage & Processing: Extensive experience in designing, managing, and optimizing data storage solutions. Expertise in building and automating data pipelines for efficient data processing.
AI-Driven Data Preparation: Experienced in leveraging AI and machine learning algorithms for automated data preparation, streamlining the transformation of raw data into high-quality, actionable insights. Skilled in using these techniques to create dynamic and interactive visualizations via Power BI, facilitating better decision-making and business intelligence.
Natural Language Processing (NLP) & Large Language Models (LLM): Hands-on expertise in applying NLP techniques and LLMs to process and analyze unstructured data, generating insightful infographics and data-driven narratives. These methods help to uncover hidden patterns and deliver actionable insights for stakeholders in a visually compelling format.
Pipeline Orchestration & Automation: Experienced in automating and orchestrating complex data pipelines using tools like Apache Airflow, Prefect, and Dagster to ensure seamless data flow and efficient workflows.
Data Quality & Consistency: Proficient in establishing and enforcing validation rules to ensure data integrity, consistency, and high-quality standards throughout the data lifecycle.
Incremental Data Loads: Skilled in implementing incremental data loading strategies to optimize data refresh cycles and minimize resource consumption.
Event-Driven Automation: Implemented event-driven automation to ensure real-time and dynamic updates for dashboards, enhancing decision-making with live data.
Low-Latency Data Processing: Ensured optimal performance and low-latency processing for delivering real-time, time-sensitive insights to stakeholders.
Dashboard Optimization: Leveraged parameterized queries and other optimization techniques to enhance the performance and responsiveness of Power BI dashboards.
Data Communication & Visualization: Proficient in presenting complex data findings to non-technical stakeholders through clear, visually compelling reports, interactive dashboards, and presentations that facilitate easy understanding and informed decision-making.
Exploratory Data Analysis (EDA): Skilled in conducting thorough exploratory data analysis to assess data quality, uncover insights, and deepen understanding of data characteristics, ensuring data readiness for analysis and model building.
Feature Engineering: Expertise in engineering relevant features from raw datasets to enhance model performance, improve predictive accuracy, and support the development of robust machine learning models.

Qualifications
Bachelor's Degree from an accredited institution with 7+ years of experience in a related role required; Master's degree preferred.
In-depth knowledge of statistical analysis, machine learning algorithms, and data modeling techniques.
Proficiency in programming languages such as Python or R, with hands-on experience in data manipulation and analysis libraries (e.g., pandas, NumPy, scikit-learn).
Experience with data visualization tools (e.g., Tableau, matplotlib) for effectively communicating insights.
Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus.
Strong problem-solving abilities, critical thinking, and the capacity to manage complex projects independently.
Excellent communication and presentation skills, capable of translating complex concepts for both technical and non-technical audiences.

Eisai Salary Transparency Language:
The base salary range for the Director, Data Scientist & Data Operations is $196,800 - $258,300. Under current guidelines, this position is eligible to participate in the Eisai Inc. Annual Incentive Plan & Eisai Inc. Long Term Incentive Plan. Final pay determinations will depend on various factors including but not limited to experience level, education, knowledge, and skills. Employees are eligible to participate in Company employee benefit programs. For additional information on Company employee benefits programs, visit https://us.eisai.com/careers-at-eisai/benefits. Certain other benefits may be available for this position; please discuss any questions with your recruiter.

Eisai is an equal opportunity employer and as such, is committed in policy and in practice to recruit, hire, train, and promote in all job qualifications without regard to race, color, religion, gender, age, national origin, citizenship status, marital status, sexual orientation, gender identity, disability or veteran status. Similarly, considering the need for reasonable accommodations, Eisai prohibits discrimination against persons because of disability, including disabled veterans. Eisai Inc. participates in E-Verify.
E-Verify is an Internet based system operated by the Department of Homeland Security in partnership with the Social Security Administration that allows participating employers to electronically verify the employment eligibility of all new hires in the United States. Please click on the following link for more information: Right To Work E-Verify Participation
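The Director posting above calls out incremental data loads and pipeline orchestration with tools like Apache Airflow. As a hedged sketch of the watermark pattern only (the connection strings, tables, and columns are hypothetical placeholders, not Eisai systems), an incremental load might look like this:

```python
# Watermark-based incremental load sketch; DSNs, table names, and columns
# are hypothetical and used purely for illustration.
import pandas as pd
from sqlalchemy import create_engine, text

source = create_engine("postgresql+psycopg2://user:pass@source-host/clinical")   # hypothetical
target = create_engine("postgresql+psycopg2://user:pass@warehouse-host/analytics")  # hypothetical

with target.connect() as conn:
    # Last successfully loaded timestamp: the "high-water mark".
    watermark = conn.execute(
        text("SELECT COALESCE(MAX(updated_at), '1900-01-01') FROM stg_lab_results")
    ).scalar()

# Pull only rows changed since the previous run instead of a full refresh.
changed = pd.read_sql(
    text("SELECT * FROM lab_results WHERE updated_at > :wm"),
    source,
    params={"wm": watermark},
)

# Append the delta; downstream models then merge or deduplicate as needed.
changed.to_sql("stg_lab_results", target, if_exists="append", index=False)
```

In an orchestrator such as Airflow, a step like this would typically run on a schedule or in response to an upstream event, which is the event-driven refresh pattern the posting describes.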

Posted 1 week ago

Support Engineering Data Specialist (Support Engineering Data)
Boeing | Oklahoma City, Oklahoma
Support Engineering Data Specialist (Support Engineering Data)
Company: The Boeing Company

Boeing Global Services (BGS) is seeking a Product Support Maintenance Engineer to provide support for Oklahoma City, OK. The candidate selected for this Maintenance Engineering position will work within the Engineering Capability team to support Maintenance Engineering activities across multiple programs. This position will work collaboratively with Boeing employees and contractors to fulfill the Product Support Engineer contractual requirements. This position is expected to be 100% onsite. The selected candidate will be required to work onsite at the Oklahoma City, OK facility.

Position Responsibilities:
Analyzes complex engineering designs and design changes to determine maintenance/repair requirements, procedures and instructions
Influences product designs and processes to ensure supportability and maintainability
Develops and uses fault isolation procedures and techniques
Creates systems theory descriptions to ensure common understanding of systems and components
Conducts engineering analyses to verify the accuracy of maintenance/repair data
Develops and supports the generation and review of logistics support documentation, including Integrated Logistics Support Plans (ILSP), Product Support Analysis Plan (PSAP), Maintenance Task Analysis (MTA), support equipment recommendation data (SERD), Level of Repair Analysis (LORA) and other logistics supportability analysis (LSA) documents
Develops and provides technical solutions to customers; evaluates customers' maintenance operations
Develops procedures and techniques to allow users to maintain and repair products
Participates in organizational activities that influence the Aerospace Industry
Uses a variety of tools and common desktop software to gather, analyze, and report data
Works closely with program management and engineering to accomplish tasks
Participates in or leads in-person and virtual working group meetings
Confidently and professionally works with customers and suppliers
Understands and works within the boundaries of budget and schedule demands
Works in a team environment
Demonstrates excellent verbal and written communication skills
Works collaboratively to reach decisions that are mutually acceptable to multiple engineering and support teams

This position requires the ability to obtain a US Security Clearance for which the US Government requires US Citizenship. An interim and/or final U.S. Secret Clearance Post-Start is required.

Basic Qualifications (Required Skills/Experience):
Bachelor's degree and a minimum of 2 years of prior relevant experience in a field such as electrical, mechanical, or systems engineering
1-2 years' experience reading and interpreting aircraft engineering drawings and/or process specifications
1-2 years' experience with aircraft technical manuals and task relationships, and familiarity with Technical Orders and/or Technical Order System requirements or equivalent
Understanding and interpretation of engineering data, with previous experience using Technical Orders to maintain military aircraft

Preferred Qualifications (Desired Skills/Experience):
Bachelor's degree in a related technical field or higher preferred
Experience with Product Support Analysis (PSA), LSA, GEIA-STD-0007 or MIL-HDBK-502A
Proficiency using LSA software applications, models, and databases (e.g., Slicwave, EAGLE, Powerlog, Compass)
Experience using and applying military and commercial Maintenance Manual Publications, Instructions for Continued Airworthiness, Overhaul Manuals, Drawings, Technical Orders, Illustrated Parts Manuals, and Wiring Diagrams toward aircraft systems Maintenance Planning
Experience with any of the following: Radar, Computer (Mission), Comm NAV, Power Plant, Avionics, Hydraulics, or Electrical & Environmental
Strong analytical and problem-solving skills, with the ability to evaluate complex technical information and make informed decisions
Experience in commercial or military airworthiness certification
Experience using Microsoft Office products such as Outlook, PowerPoint, Excel and Word
Proficient at reviewing military detailed work instructions, reading and interpreting blueprints, drawings, or specifications
Ability to understand the big picture and the inter-relationships of all positions and activities in the system, including the impact of changes in one area on another area. This includes the ability to see and understand the inter-relationships between components of systems and plans, anticipate future events, and apply the principles of systems thinking to accelerate performance
Minimum of 1 year of experience with cost account and project engineering management
Aircraft experience working on the B-52 Bomber or AWACS (Airborne Warning and Control System)

Typical Education/Experience: Education/experience typically acquired through advanced technical education (e.g., Bachelor's degree) and typically 2 or more years' related work experience, or an equivalent combination of technical education and experience (e.g., Master's degree, 6 years' related work experience, etc.).

Travel: 10% travel will be required to support program activities, including site visits.

Relocation: Relocation assistance is not a negotiable benefit for this position. Candidates must live in the immediate area or relocate at their own expense.

Drug Free Workplace: Boeing is a Drug Free Workplace where post-offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.

Shift: This position is for 1st shift.

At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities. The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and several programs that provide for both paid and unpaid time away from work. The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements. Pay is based upon candidate experience and qualifications, as well as market and business considerations.

Summary pay range: $60,350 - $81,650

Language Requirements: Not Applicable
Education: Not Applicable
Relocation: Relocation assistance is not a negotiable benefit for this position.

Export Control Requirement: This position must meet export control compliance requirements. To meet export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. §120.15 is required. "U.S. Person" includes U.S. Citizen, lawful permanent resident, refugee, or asylee.
Safety Sensitive: This is a safety-sensitive position and is subject to random drug testing. Security Clearance: This position requires the ability to obtain a U.S. Security Clearance for which the U.S. Government requires U.S. Citizenship. An interim and/or final U.S. Secret Clearance Post-Start is required. Visa Sponsorship: Employer will not sponsor applicants for employment visa status. Contingent Upon Award Program This position is not contingent upon program award Shift: Shift 1 (United States of America) Stay safe from recruitment fraud! The only way to apply for a position at Boeing is via our Careers website. Learn how to protect yourself from recruitment fraud - Recruitment Fraud Warning Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law. EEO is the law Boeing EEO Policy Request an Accommodation Applicant Privacy Boeing Participates in E – Verify E-Verify (English) E-Verify (Spanish) Right to Work Statement Right to Work (English) Right to Work (Spanish)

Posted 3 days ago

Experienced Data Engineer - Data Engineering
Plaid | San Francisco, CA
We believe that the way people interact with their finances will drastically improve in the next few years. We're dedicated to empowering this transformation by building the tools and experiences that thousands of developers use to create their own products. Plaid powers the tools millions of people rely on to live a healthier financial life. We work with thousands of companies like Venmo, SoFi, several of the Fortune 500, and many of the largest banks to make it easy for people to connect their financial accounts to the apps and services they want to use. Plaid's network covers 12,000 financial institutions across the US, Canada, UK and Europe. Founded in 2013, the company is headquartered in San Francisco with offices in New York, Washington D.C., London and Amsterdam. #LI-Hybrid

The main goal of the DE team in 2024-25 is to build robust golden data sets to power our business goals of creating more insights-based products. Making data-driven decisions is key to Plaid's culture. To support that, we need to scale our data systems while maintaining correct and complete data. We provide tooling and guidance to teams across engineering, product, and business, and help them explore our data quickly and safely to get the data insights they need, which ultimately helps Plaid serve our customers more effectively. Data Engineers heavily leverage SQL and Python to build data workflows. We use tools like DBT, Airflow, Redshift, ElasticSearch, Atlanta, and Retool to orchestrate data pipelines and define workflows. We work with engineers, product managers, business intelligence, data analysts, and many other teams to build Plaid's data strategy and a data-first mindset. Our engineering culture is IC-driven -- we favor bottom-up ideation and empowerment of our incredibly talented team. We are looking for engineers who are motivated by creating impact for our consumers and customers, growing together as a team, shipping the MVP, and leaving things better than we found them.

You will be in a high-impact role that will directly enable business leaders to make faster and more informed business judgements based on the datasets you build. You will have the opportunity to carve out the ownership and scope of internal datasets and visualizations across Plaid, which is a currently unowned area that we intend to take over and build SLAs on. You will have the opportunity to learn best practices and up-level your technical skills from our strong DE team and from the broader Data Platform team. You will collaborate with and build strong, cross-functional partnerships with teams across Plaid, from Engineering to Product to Marketing/Finance.

Responsibilities
Understanding different aspects of the Plaid product and strategy to inform golden dataset choices, design, and data usage principles. Have data quality and performance top of mind while designing datasets.
Leading key data engineering projects that drive collaboration across the company. Advocating for adopting industry tools and practices at the right time.
Owning core SQL and Python data pipelines that power our data lake and data warehouse, with well-documented data and defined dataset quality, uptime, and usefulness.

Qualifications
4+ years of dedicated data engineering experience, solving complex data pipeline issues at scale.
You have experience building data models and data pipelines on top of large datasets (on the order of 500TB to petabytes).
You value SQL as a flexible and extensible tool, and are comfortable with modern SQL data orchestration tools like DBT, Mode, and Airflow.
You have experience working with different performant warehouses and data lakes, such as Redshift, Snowflake, and Databricks.
You have experience building and maintaining batch and real-time pipelines using technologies like Spark and Kafka.
You appreciate the importance of schema design, and can evolve an analytics schema on top of unstructured data.
You are excited to try out new technologies. You like to produce proof-of-concepts that balance technical advancement with user experience and adoption.
You like to get deep in the weeds to manage, deploy, and improve low-level data infrastructure.
You are empathetic working with stakeholders. You listen to them, ask the right questions, and collaboratively come up with the best solutions for their needs while balancing infra and business needs.
You are a champion for data privacy and integrity, and always act in the best interest of consumers.

The target base salary for this role is $182,250 - $297,640 per year. Additional compensation in the form(s) of equity and/or commission is dependent on the position offered. Plaid provides a comprehensive benefit plan, including medical, dental, vision, and 401(k). Pay is based on factors such as (but not limited to) scope and responsibilities of the position, candidate's work experience and skillset, and location. Pay and benefits are subject to change at any time, consistent with the terms of any applicable compensation or benefit plans.

Our mission at Plaid is to unlock financial freedom for everyone. To support that mission, we seek to build a diverse team of driven individuals who care deeply about making the financial ecosystem more equitable. We recognize that strong qualifications can come from both prior work experiences and lived experiences. We encourage you to apply to a role even if your experience doesn't fully match the job description. We are always looking for team members who will bring something unique to Plaid!

Plaid is proud to be an equal opportunity employer and values diversity at our company. We do not discriminate based on race, color, national origin, ethnicity, religion or religious belief, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, military or veteran status, disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local laws. Plaid is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance with your application or interviews due to a disability, please let us know at accommodations@plaid.com. Please review our Candidate Privacy Notice here.
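The posting above describes SQL and Python pipelines orchestrated with tools like DBT and Airflow. As an illustrative sketch only (assuming a recent Airflow 2.x install with the Bash operator and an existing dbt project; the DAG id, model tag, and schedule are invented, not Plaid's), a daily golden-dataset refresh might be wired up like this:

```python
# Minimal Airflow DAG sketch for a daily "golden dataset" refresh; the dbt
# project, model tag, and schedule are hypothetical, for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="golden_datasets_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # rebuild once per day, after upstream loads land
    catchup=False,
) as dag:
    # Check source freshness first, then build only the models tagged as golden
    # datasets along with their tests.
    freshness = BashOperator(
        task_id="dbt_source_freshness",
        bash_command="dbt source freshness",
    )
    build = BashOperator(
        task_id="dbt_build_golden",
        bash_command="dbt build --select tag:golden",
    )
    freshness >> build
```

Keeping the transformation logic in dbt and only the scheduling in Airflow is a common split of responsibilities, though teams also run dbt through dedicated providers instead of the Bash operator.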

Posted 30+ days ago

DoD Military SkillBridge - Data Center Operations (Critical Facilities Eng and Data Center Technician)
Equinix, Inc. | Atlanta, GA
Who are we? Equinix is the world's digital infrastructure company, operating over 260 data centers across the globe. Digital leaders harness Equinix's trusted platform to bring together and interconnect foundational infrastructure at software speed. Equinix enables organizations to access all the right places, partners and possibilities to scale with agility, speed the launch of digital services, deliver world-class experiences and multiply their value, while supporting their sustainability goals. Joining our operations team means that you will be at the forefront of all we do, maintaining critical facilities infrastructure as part of a close-knit team delivering best-in-class service to our data center customers. We embrace diversity in thought and contribution and are committed to providing an equitable work environment that is foundational to our core values as a company and is vital to our success. This is not the official job application; this posting is specifically for Equinix Pathways Military Program. Please note that this program is exclusively for Military Spouse Fellows or Active-duty US military personnel transitioning out of the military to civilian workforce. Equinix is the world's digital infrastructure company, operating 200+ data centers across the globe and providing interconnections to all the key clouds and networks. Businesses need one place to simplify and bring together fragmented, complex infrastructure that spans private and public cloud environments. Our global platform allows customers to place infrastructure wherever they need it and connect it to everything they need to succeed. We are a fast-growing global company with 70+ consecutive quarters of growth. Through our innovative portfolio of high-performance products and services, we have created the largest, most active global ecosystem of nearly 10,000 companies, including 1,800+ networks and 2,900+ cloud and IT service providers in over 26 countries spanning five continents. Are you a military service member within or nearing your 180-day window for separation from active duty? Are you a military spouse fellow wanting to engage a technology career? If this is you, then Equinix has a unique opportunity to help kick-start the next chapter in your professional life. We are accepting applications for our very own DoD SkillBridge Fellowship Program supporting the Data Center Operations, Critical Facilities. The training will be on the cutting-edge of technology in a digital infrastructure environment supporting an initiative designed to link our nation's warfighters to the best employment opportunities available. Do you have a background in Electronics, Electrician, or HVAC and skilled Mechanical trades? This could be your next career move! In this DoD Skillbridge Program with Equinix, you will: Learn how your military experience translates into a rewarding civilian career Explore a career or industry you might want to pursue upon separation from active duty Earn real-world industry qualifications and certifications Build experience and competency in your trade/ profession with our team Expand your professional network of contacts Gain familiarity with corporate culture. Responsibilities Has a substantial understanding of the job while working on assignments that are moderately difficult requiring judgement in resolving issues or making recommendations. Focus is on moderately difficult tasks, using substantial understanding of standard operating procedure. Supports the overall team. 
Facility / Infrastructure Maintenance Performs moderately difficult preventative and corrective maintenance checks on-site to facility components. You will perform site inspections and monitor the building and IBX alarms Performs moderately difficult repairs, maintenance, installations, and on-site inspections to facility systems. Supports energy efficiency measures Monitors the Building Monitoring System (BMS) and resolves moderately difficult alarm issues that require judgement in resolving while following standard operating procedures Operates and maintains plumbing, fire suppression, and safety systems Operates critical infrastructures under the supervision of more senior technical staff Normally receives little instruction on daily work, general instructions on newly introduced assignments Customer Operations Technician Queue Management & Reporting Prioritize and manage service requests to meet deadlines Maintain detailed records and audit reports Installations Rack and stack equipment in data centers Perform fiber terminations using a fusion splicer Set up telecom cabinets, fiber trays, and cage wiring. Troubleshoot fiber and copper circuits Support standard cross-connect work orders Other troubleshooting procedures Site Administration & Incident Support Performs moderately difficult site logs for permits, such as Maintenance Operation Protocol (MOPs) and scripts Identifies Single Points of Failure (SPOFs) and makes recommendations Responds to all moderately difficult on-site incidents, including failures, problems and delays Uses substantial understanding in following operating procedures to support on-site administration Work Orders & Additional Projects Completes routine work requests and circuit installations Troubleshoots and maintains office equipment (if necessary); supports auxiliary equipment and machines with problem-solving and repairs to avoid/minimize downtime Makes minor changes to mechanical, electrical, and specialized systems, as directed Carries out infrastructure projects Collaboration Collaborates with others to resolve moderately difficult facility incidents Effectively collaborates within the department; may mentor team members on general maintenance activities Provides stakeholders of inventory needs in order to maintain optimal stock levels of critical parts and equipment May recommend infrastructure projects Qualifications Must meet all eligibility requirements outlined in DOD Instruction 1322.29 and NAVADMIN 222/15. Required Technical associates degree, military-technical school, or civilian technical trade school completion Education level: Working on bachelor's degree or relevant experience with 1-4 years in Mechanical Engineering or related field. Comprehensive knowledge of critical infrastructure i.e., UPS, generator, BMS, chillers, life safety systems Coursework in HVAC design or heat transfer and thermodynamics Knowledge of HVAC testing and balancing methodologies Knowledge of IT hardware and other data center operations functions Good time management habits, ability to multi-task, to sustain focus on long tasks. Ability to communicate thoughts and technical ideas. Attitude of taking initiative, enthusiasm, eagerness to learn, teamwork, creativity. Ability to lift 50 lbs The ability to prioritize effectively, balance assigned work and exceptional organization skills required for our constantly evolving environments. Strong interpersonal and communication skills essential for team-based work assignments. 
Presentation skills with colleagues and clients of all levels Skillbridge Internship positions are open to active duty and/or transitioning military members. Compensation or pay for this role is made through the service member's current enlistment contract based on pay guidelines set by the Department of Defense (DoD) Equinix is committed to ensuring that our employment process is open to all individuals, including those with a disability. Equinix is an Equal Employment Opportunity and Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to unlawful consideration of race, color, religion, creed, national or ethnic origin, ancestry, place of birth, citizenship, sex, pregnancy / childbirth or related medical conditions, sexual orientation, gender identity or expression, marital or domestic partnership status, age, veteran or military status, physical or mental disability, medical condition, genetic information, political / organizational affiliation, status as a victim or family member of a victim of crime or abuse, or any other status protected by applicable law. (Equal Opportunity / AA / Disabled / Veterans Employer) Equinix is committed to ensuring that our employment process is open to all individuals, including those with a disability. If you are a qualified candidate and need assistance or an accommodation, please let us know by completing this form. Equinix is an Equal Employment Opportunity and, in the U.S., an Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to unlawful consideration of race, color, religion, creed, national or ethnic origin, ancestry, place of birth, citizenship, sex, pregnancy / childbirth or related medical conditions, sexual orientation, gender identity or expression, marital or domestic partnership status, age, veteran or military status, physical or mental disability, medical condition, genetic information, political / organizational affiliation, status as a victim or family member of a victim of crime or abuse, or any other status protected by applicable law.

Posted 30+ days ago

Associate Director, Data Scientist & Data Operations
Eisai US | DC, WA
At Eisai, satisfying unmet medical needs and increasing the benefits healthcare provides to patients, their families, and caregivers is Eisai's human health care (hhc) mission. We're a growing pharmaceutical company that is breaking through in neurology and oncology, with a strong emphasis on research and development. Our history includes the development of many innovative medicines, notably the discovery of the world's most widely-used treatment for Alzheimer's disease. As we continue to expand, we are seeking highly motivated individuals who want to work in a fast-paced environment and make a difference. If this is your profile, we want to hear from you.

The Data Operations Group at Eisai, Inc. is looking for an Associate Director - Data Scientist/Programmer to create insightful, AI-powered, instantaneous dashboards. The Data Scientist will work closely with Data Operations and Biostatisticians on data ingestion, streaming, and dashboard production to support projects across various stages of development. This role will be integral in providing actionable insights for critical projects that are vital to our business. This position may be either office-based (hybrid) in Nutley, NJ, or remote-based.

Responsibilities
Data Extraction & Analysis: Extract and manipulate complex datasets to generate detailed reports, charts, and graphs, analyzing for outliers, root causes, business impacts, correlations, and discrepancies. Proactively propose alternative solutions to optimize business outcomes.
AI-Driven Data Preparation: Skilled in utilizing AI and machine learning techniques for automated data preparation, particularly for creating dynamic visualizations and insights through Power BI tools.
Natural Language Processing & LLM: Hands-on experience with Large Language Models (LLM) and Natural Language Processing (NLP) techniques to generate insightful infographics and actionable data-driven insights.
Data Preparation for Analysis: Prepare data for modeling by cleaning datasets, addressing missing values, and eliminating outliers to ensure high-quality inputs for accurate and effective model development.
Insight Generation for Drug Discovery: Identify patterns and root causes within data to generate meaningful insights that directly support and drive the drug discovery and development processes.
Comprehensive Data Integration: Integrate diverse data sources (clinical, biological, etc.) to create comprehensive analyses that provide a holistic view of ongoing projects, facilitating informed decision-making.
Model Training & Data Quality Assessment: Assess the quality of data for model training and testing, ensuring reliable and accurate models for predictive analysis and decision support.
Clear Communication of Findings: Present data-driven proposals and findings in a clear and actionable format, offering insights and recommendations that inform strategic business decisions.
Collaboration Across Teams: Collaborate with data scientists, biostatisticians, and the Data Standards team on data collection, feature design, and cross-team initiatives, ensuring consistency and alignment in data practices.
Data Communication & Visualization: Effectively communicate complex findings to various audiences using clear writing, data visualizations, BI reports, and dashboards, ensuring accessibility and understanding across technical and non-technical stakeholders.

Qualifications
Master's degree in Computer Science from an accredited institution, with research projects.
Extensive expertise in data modeling techniques.
Proficient in Python or R, with strong experience in data manipulation and analysis libraries.
Skilled in using data visualization tools (e.g., Tableau, matplotlib) to present insights clearly and effectively.
Strong problem-solving and critical thinking abilities, with the capability to manage complex projects independently.
Excellent communication and presentation skills, with the ability to convey complex concepts to both technical and non-technical audiences.

Eisai Salary Transparency Language:
The base salary range for the Associate Director, Data Scientist & Data Operations is $171,100 - $224,600. Under current guidelines, this position is eligible to participate in the Eisai Inc. Annual Incentive Plan & Eisai Inc. Long Term Incentive Plan. Final pay determinations will depend on various factors including but not limited to experience level, education, knowledge, and skills. Employees are eligible to participate in Company employee benefit programs. For additional information on Company employee benefits programs, visit https://us.eisai.com/careers-at-eisai/benefits. Certain other benefits may be available for this position; please discuss any questions with your recruiter.

Eisai is an equal opportunity employer and as such, is committed in policy and in practice to recruit, hire, train, and promote in all job qualifications without regard to race, color, religion, gender, age, national origin, citizenship status, marital status, sexual orientation, gender identity, disability or veteran status. Similarly, considering the need for reasonable accommodations, Eisai prohibits discrimination against persons because of disability, including disabled veterans. Eisai Inc. participates in E-Verify. E-Verify is an Internet based system operated by the Department of Homeland Security in partnership with the Social Security Administration that allows participating employers to electronically verify the employment eligibility of all new hires in the United States. Please click on the following link for more information: Right To Work E-Verify Participation
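The "Data Preparation for Analysis" duty above (cleaning datasets, addressing missing values, eliminating outliers) can be illustrated with a short pandas sketch; the file, columns, and IQR rule below are hypothetical and do not represent Eisai data:

```python
# Hedged data-preparation sketch: missing-value handling plus a simple IQR
# outlier screen; dataset and column names are invented for illustration.
import pandas as pd

df = pd.read_csv("trial_labs.csv")  # hypothetical clinical extract

# Address missing values: require the key identifier, impute a numeric
# measure with the median so extreme values do not skew the fill.
df = df.dropna(subset=["subject_id"])
df["alt_value"] = df["alt_value"].fillna(df["alt_value"].median())

# Eliminate gross outliers with a 1.5 * IQR rule before modeling.
q1, q3 = df["alt_value"].quantile([0.25, 0.75])
iqr = q3 - q1
mask = df["alt_value"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)
clean = df[mask]

print(f"kept {len(clean)} of {len(df)} rows after outlier screening")
```

In practice the thresholds and imputation strategy would be agreed with biostatisticians rather than fixed in code, which is the cross-team alignment the posting emphasizes.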

Posted 30+ days ago

Senior Data Engineer, Data Platform - Intelliscript (Remote)
Milliman | Brookfield, WI
What We Do Milliman IntelliScript is a group of a few hundred experts in fields ranging from actuarial science to information technology to clinical practice. Together, we develop and deploy category-defining, data-driven, software-as-a-service (SaaS) products for a broad spectrum of insurance, health IT and life sciences clients. We are a business unit within Milliman, Inc., a respected consultancy with offices around the world. Candidates who have their pick of jobs are drawn to IntelliScript's entrepreneurial and collaborative culture of innovation, excellence, exceptional customer service, balance, and transparency. Every single person has a voice in our company, and we challenge each other to push the outer limits of our full, diverse potential. And we've shown sustained growth that ensures you'll have room to grow your skillset, responsibilities, and career. Our team is smart, down-to-earth, and ready to listen to your best ideas. We reward excellence and offer competitive compensation and benefits. Visit our LinkedIn page for a closer look at our company and learn more about our cultural values here. Milliman invests in skills training and career development and gives all employees access to a variety of learning and mentoring opportunities. Our growing number of Milliman Employee Resource Groups (ERGs) are employee-led communities that influence policy decisions, develop future leaders, and amplify the voices of their constituents. We encourage our employees to give back to their varied professions, including leadership in professional organizations. Please visit our website to learn more about Milliman's commitments to our people, diversity and inclusion, social impact, and sustainability. What this position entails IntelliScript's Data has been a key part of our success and is critical to our future. In this position as a Senior Data Engineer of IntelliScript's Data Platform, you will be responsible for designing and implementing robust Data Platform solutions that meet our business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our projects. What you will be doing Data Platform: Creation of a Databricks Data Warehouse(s) and Lakehouse solutions for a healthcare data focused enterprise. 
Data Governance: Configuring and maintaining unity catalog to enable enterprise data lineage and data quality Data Security: Building out Data Security protocols and best practices including the management of identified and de-identified (PHI/PII) solutions External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance ETL: Building solutions within Delta Live Tables and automation of transformations Medallion Architecture: Building out performant enterprise-level medallion architecture(s) Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions Large Data Management: Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles Costs: Working with the business to build cost effective and cost transparent Data solutions Pipeline/ETL Management: You will help architect, build, and maintain robust and scalable data pipelines, monitoring, and optimizing performance Experience working with Migration tools i.e., AWS DMS, AWS Glue, Fivetran, integrate.io Identify and implement improvements to enhance data processing efficiency Experience with building out effective pipeline monitoring solutions Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based 'big data' technologies. Data Modeling: Lead design, implementation, and maintenance of standards based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data Assemble large, complex data sets that meet functional and non-functional business requirements Develop and maintain data models, ensuring they align with business objectives and data privacy regulations Collaboration: Partner internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data Effectively work with Data, Development, Analysts, Data Science, and Business team members to gather requirements, propose, and build solutions. Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices. Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics. Create data tools for clinical, analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. 
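The ETL, Medallion Architecture, and data-expectation responsibilities above map naturally onto Databricks Delta Live Tables. As a hedged sketch only (it runs solely inside a DLT pipeline where `spark` is provided by the runtime, and the table names, columns, source path, and expectation rule are invented, not Milliman's), a bronze-to-silver flow with an embedded expectation might look like:

```python
# Delta Live Tables sketch with an embedded data expectation; runs only inside
# a Databricks DLT pipeline. Names, path, and rule are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw claims landed as-is from the source extract.")
def claims_bronze():
    # `spark` is supplied by the DLT runtime; the path is a placeholder.
    return spark.read.format("json").load("/volumes/raw/claims/")

@dlt.table(comment="Silver: typed, validated claims for downstream models.")
@dlt.expect_or_drop("valid_member", "member_id IS NOT NULL")  # expectation embedded in the pipeline
def claims_silver():
    return (
        dlt.read("claims_bronze")
        .withColumn("service_date", F.to_date("service_date"))
        .dropDuplicates(["claim_id"])
    )
```

Gold-layer tables would build on the silver output, and Unity Catalog would govern who can read each layer, in line with the governance and security duties listed above.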
What we need 7+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products Expert level experience working in Databricks and AWS Expert level experience working in both relational and non-relational databases such as SQL Server, PostgreSQL, and MongoDB Experience managing and standardizing clinical data from structured and unstructured sources Experience building and managing solutions on AWS Expert knowledge in healthcare standards including FHIR, C-CDA, and traditional HL7 Expert knowledge in clinical standards/ontologies including ICD10/SNOMED/NDC/LOINC/Rx Norm Expert in building out data models, data warehouses, designing of data lakes for enterprise (and product use) Familiarity with designing and building APIs, ETL and data ingestion processes and utilization of tools to support enterprise solutions. Experience in performance tuning, query optimization, security, monitoring, and release management. Experience working with and managing large, disparate, identified and de-identified data sets from multiple data sources What you bring to the table Demonstrated "let's find a way to do it" attitude - no task is too big or too small Effective collaboration and communication across multiple technical and non-technical disciplines Comfortable working through ambiguous situations Able to teach & mentor others on new/emerging technologies Customer obsessed with a business-centric focus Able to understand both strategic and tactical needs and balance appropriately Driven, thorough and self-directed Able to lead through influence and persuasion Wish list Bachelor's degree or master's degree in computer science, data engineering or related field 10+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products Health and Life Insurance business experience Associate or Professional level solution architecture certification in Azure and/or AWS Experience in Snowflake Location The expected application deadline for this job is June 30, 2025. This position is open to remote work. Applicants must be willing to travel to the Milliman office in Brookfield, WI as needed and travel nationwide for meetings, conferences, and team events. The salary range is $98,000 - $222,000, depending on relevant factors, including but not limited to, education, relevant work experience, qualifications, skills, certifications, location, etc. If relevant experience is less than 10 years the range would be $98,000 - $198,000; for experience of greater than 10 years, the range would be $109,000 - $222,000. In addition, we offer a performance-based bonus plan, profit sharing, and generous benefits. Milliman Benefits (Full time U.S. positions only) At Milliman, we focus on creating an environment that recognizes - and meets - the personal and professional needs of the individual and their family. 
We offer competitive benefits which include the following based on plan eligibility: Medical, dental and vision coverage for employees and their dependents, including domestic partners A 401(k) plan with matching program, and profit-sharing contribution Employee Assistance Program (EAP) A discretionary bonus program Paid Time Off (PTO) starts accruing on the first day of work and can be used for any reason; full-time employees will accrue 15 days of PTO per year, and employees working less than a full-time schedule will accrue PTO at a prorated amount based on hours worked Family building benefits, including adoption and fertility assistance and paid parental leave up to 12 weeks for employees who have worked for Milliman for at least 12 months and have worked at least 1,250 hours in the preceding 12-month period Commuter Program, which allows you to use pre-tax dollars to pay for your parking or public transit expenses to get to and from work. You may utilize this benefit any time throughout the year and funds will be available the first of the month following your first contribution A minimum of eight paid holidays Milliman covers 100% of the premiums for life insurance, AD&D, and both short-term and long-term disability coverage Flexible spending accounts allow employees to set aside pre-tax dollars to pay for dependent care, transportation, and applicable medical needs Equal Opportunity All qualified applicants will receive consideration for employment, without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 30+ days ago

Data Analyst - Corporate Technology Data Engineering & Analytics
MassMutual Financial Group | Springfield, MA
The Opportunity
Join our dynamic team as a Data Analyst - Corporate Technology Data Engineering & Analytics, where you'll play a pivotal role in driving the execution of our data strategy. This role is crucial in driving digital transformation and operational efficiency across financial systems that support Actuarial, FP&A, Reinsurance, Treasury, Accounting and Tax functions, and it requires a balance of technical expertise, business acumen, and leadership capabilities. You will work cross-functionally to deliver insights, mentor team members, define standards, and collaborate with both onshore and offshore teams to enable high-quality, scalable data solutions.

The Team
You'll be an integral part of our esteemed Corporate Technology Team, focused on Data Engineering & Analytics. Our team operates on a global scale, driving innovation and excellence across diverse areas of expertise. As a Data Analyst, you'll play a pivotal role in high-impact Corporate Technology Finance initiatives, ensuring alignment with organizational objectives and driving impactful outcomes. This is an opportunity to collaborate closely with our Corp Technology leadership team as well as our CFO customers. Our team thrives on collaboration, innovation, and a shared commitment to excellence. Together, we're shaping the future of technology within our organization and making a lasting impact on a global scale. Join us and be part of a dynamic team where your contributions will be valued and your potential unleashed.

The Impact:
Analyze data related to life insurance operations including policy administration, reinsurance, reserves, and risk to generate actionable insights.
Develop and maintain comprehensive data mapping documents and work closely with data engineering teams to ensure accurate data integration and transformation.
Collaborate with business stakeholders to gather requirements, define KPIs, and create reporting solutions aligned to strategic objectives.
Partner with data engineers, architects, and IT to validate datasets, optimize queries, and ensure scalable data pipelines.
Work directly with onshore teams to coordinate analysis, share requirements, and maintain consistent quality and delivery timelines.
Mentor junior analysts by reviewing their work, offering guidance, and promoting professional growth.
Define and implement standards for documentation, analysis, and data governance across the analytics team.
Apply strong data modeling knowledge (star/snowflake schema, normalization, dimensional modeling) to support efficient analysis and reporting.
Help build dashboards, scorecards, and visualizations using Tableau, Power BI, or Strategy to communicate insights to stakeholders.
Ensure data quality and compliance with internal controls and industry regulations.

The Minimum Qualifications
Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or related technical field.
8+ years of experience as a Data Analyst.
5+ years of experience in the Finance domain within the insurance industry.
2+ years of experience with Vertica/Teradata or other similar tools for querying, performance optimization, and large-scale data analysis.
2+ years of experience in Python and SQL.
2+ years of experience writing detailed source-to-target mapping documents and collaborating with technical teams on data integration.
2+ years of experience with at least one of the following:
Investment Principles: Knowledge of different asset classes, investment strategies, and financial markets.
Quantitative Finance: Understanding of financial modeling, risk management, and derivatives. Regulatory Framework: Awareness of relevant financial regulations and compliance requirements. The Ideal Qualifications Master's degree in Data Science, Statistics, Mathematics, Computer Science, Actuarial Science, or related field. Experience working in hybrid onshore-offshore team environments. Deep understanding of data modeling concepts and experience working with relational and dimensional models. Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences. A strong understanding of statistical concepts, probability and accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios. Strong understanding of life insurance products and business processes across the policy lifecycle. Proven track record of analytical and problem-solving skills. A solid understanding of Financial Accounting Systems and knowledge of accounting principles, reporting, and budgeting. Strong data analysis skills for extracting insights from financial data. Proficiency in data visualization tools and reporting software is also important. Experience integrating financial systems with actuarial, policy administration, and claims platforms. Familiarity with actuarial processes, reinsurance, or regulatory reporting requirements. Experience with General Ledger systems such as SAP and forecasting tools like Anaplan. Exceptional communication and interpersonal skills. Ability to influence and motivate teams without direct authority. Excellent time management and organizational skills, with the ability to prioritize multiple initiatives. #LI-RK1 Salary Range: $144,800.00-$190,000.00 At MassMutual, we focus on ensuring fair, equitable pay by providing competitive salaries, along with incentive and bonus opportunities for all employees. Your total compensation package includes either a bonus target or, in a sales-focused role, a Variable Incentive Compensation component. Why Join Us? We've been around since 1851. During our history, we've learned a few things about making sure our customers are our top priority. In order to meet and exceed their expectations, we must have the best people providing the best thinking, products and services. To accomplish this, we celebrate an inclusive, vibrant and diverse culture that encourages growth, openness and opportunities for everyone. A career with MassMutual means you will be part of a strong, stable and ethical business with industry-leading pay and benefits. And your voice will always be heard. We help people secure their future and protect the ones they love. As a company owned by our policyowners, we are defined by mutuality and our vision to put customers first. It's more than our company structure - it's our way of life. We are a company of people protecting people. Our company exists because people are willing to share risk and resources, and rely on each other when it counts. At MassMutual, we Live Mutual. MassMutual is an Equal Employment Opportunity employer Minority/Female/Sexual Orientation/Gender Identity/Individual with Disability/Protected Veteran. We welcome all persons to apply. Note: Veterans are welcome to apply, regardless of their discharge status. If you need an accommodation to complete the application process, please contact us and share the specifics of the assistance you need.
For more information about our extensive benefits offerings, please check out our Total Rewards at a Glance.
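A note for readers unfamiliar with the source-to-target mapping and dimensional modeling skills this listing asks for: the sketch below is purely illustrative, written in pandas, with hypothetical column names rather than MassMutual's actual schemas. A mapping document enumerates, for each target column, the source column and the transformation that produces it, and that mapping can then be applied mechanically:

# Illustrative only: applying a toy source-to-target mapping with pandas.
# Source/target column names are hypothetical, not MassMutual's schemas.
import pandas as pd

source = pd.DataFrame({
    "POL_NBR": [" P001", "P002 "],
    "ISS_DT": ["2021-03-01", "2022-07-15"],
    "FACE_AMT": ["250000", "500000"],
})

# target column -> (source column, transformation applied to that column)
mapping = {
    "policy_number": ("POL_NBR", lambda s: s.str.strip()),
    "issue_date":    ("ISS_DT",  pd.to_datetime),
    "face_amount":   ("FACE_AMT", lambda s: s.astype(float)),
}

target = pd.DataFrame({tgt: fn(source[src]) for tgt, (src, fn) in mapping.items()})
print(target.dtypes)

In a dimensional model, a table produced this way would typically load a fact or dimension table in a star schema.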

Posted 2 days ago

Senior/Lead Data Scientist, Data Integrity
YouGov PLC, New York, NY
YouGov has an exciting opportunity open for a Senior Data Scientist / Lead Data Scientist to come and work for a Global Online Research Company, offering insight into what the world thinks. Role Overview We are seeking a talented and self-motivated Senior Data Scientist / Lead Data Scientist to join our team. This role is critical to ensuring the integrity and credibility of our research data. You will work independently to develop new strategies to detect, prevent, and mitigate research fraud, and be responsible for communicating the results of new and existing fraud prevention programs to internal and external stakeholders through white papers, ad hoc reports, and in-person presentations. This is a hands-on role that requires analytical, communication, and project management skills. About the team We develop models and predictions using a combination of surveys, voter databases, census microdata, behavioral tracking and other sources. We handle the most difficult data problems at our company, including massive datasets, missing data, self-selection, measurement error, fraud detection, attrition, and many more. Key Responsibilities Develop comprehensive statistical models to detect, prevent, and respond to research fraud. Lead investigations into suspected cases of research fraud, including data manipulation, respondent fraud, and other unethical practices. Exercise considerable creativity, foresight, and judgment while working independently on complex research projects - including methods selection, techniques, and evaluation criteria. Convey complex information and persuade several diverse stakeholders/audiences across multiple cultures. Author white papers, ad hoc reports, and in-person presentations to communicate results. Coordinate with senior stakeholders to understand project requirements. Work collaboratively with the existing data science team. Required Qualifications Master's or PhD in data science, statistics, quantitative social science, or equivalent business experience. Experience with (and ability to learn) a variety of statistical methods - regression, causal inference, multilevel regression models, classification models, machine learning, MRP, LLMs. Experience performing statistical analyses on large datasets. Proficiency in Python and/or R for data analysis, modeling, and visualization. Ability to write clean, maintainable, well-documented code. Skilled at communicating results and research methodology through reports. #LI-PM1 Company Description and Culture YouGov is a global online research company, offering insight into what the world thinks. We speak daily to our panel of over 27 million registered members to understand opinion and behaviors around the world. We have a strong reputation as a source of accurate data and we're trusted by the world's biggest brands to get it right, making us the most quoted market research source in the world. Why join YouGov? Join our global team to help us achieve our social mission: to make millions of people's opinions heard for the benefit of our local, national, and international communities. Understanding diversity of opinion requires diversity of background. Although our global panel of millions of people worldwide powers our research, our biggest asset is our people. If our research is to be truly representative of what the world thinks, we need people from all walks of life to be part of the team to bring their perspective to the work we do. Life at YouGov We are driven by a set of shared values.
We are fast, fearless, and innovative. We work diligently to get it right. We are guided by accuracy, ethics, and proven methodologies. We respect and trust each other, bringing these values into everything that we do. We strive to provide YouGovers with best-in-class benefits to support their physical, financial, and emotional wellbeing. We want our employees to have a sense of belonging and uniqueness in a supportive workplace, so they can bring their full selves to work. Equal Opportunity Employer As an Equal Opportunity Employer, qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy, gender identity or expression, and sexual orientation), parental status, national origin, marital status, age, disability, genetic information, HIV status, political affiliation, socioeconomic background, veteran status or any other characteristic protected by law or in line with our responsibilities as a fair and ethical employer. All employment decisions are made based on occupational qualifications, merit, and business need. Data Privacy To find out how we collect and use your personal data when you apply for a role at YouGov, please read our privacy notice at https://jobs.yougov.com/privacy
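Since the role above is built around statistical detection of research fraud, here is a deliberately simplified, hedged first-pass screen; it is not YouGov's methodology, and the column names and thresholds are invented for illustration:

# Illustrative only: a naive first-pass fraud screen for survey responses.
# Column names and thresholds are hypothetical, not YouGov's actual approach.
import pandas as pd

responses = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "duration_seconds": [620, 45, 510, 38],  # time to complete the survey
    "q1": [3, 5, 2, 4], "q2": [4, 5, 1, 4], "q3": [2, 5, 3, 4], "q4": [5, 5, 2, 4],
})

grid = responses[["q1", "q2", "q3", "q4"]]
responses["straightlined"] = grid.nunique(axis=1).eq(1)   # identical answer to every item
responses["speeder"] = responses["duration_seconds"] < 60  # implausibly fast completion
responses["flag_for_review"] = responses["straightlined"] | responses["speeder"]

print(responses[["respondent_id", "flag_for_review"]])

A production approach would replace these hard-coded rules with the statistical and machine-learning models the posting describes.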

Posted 3 days ago

Data Governance And Master Data Manager
Mimedx Group Inc., Marietta, GA
At MIMEDX, our purpose starts with helping humans heal. We are driven by discovering and developing regenerative biologics utilizing human placental tissue to provide breakthrough therapies addressing the unmet medical needs for patients across multiple areas of healthcare. Possessing a strong portfolio of industry-leading advanced wound care & surgical products, we are committed to making a transformative impact on the lives of patients we serve globally. Will you join us on this journey? We are excited to add a Data Governance and Master Data Manager to our QOR Team! The position will pay between $105,000-$130,000 based on previous relevant experience, educational credentials, and location. POSITION SUMMARY: The Manager, Data Governance & Master Data will lead efforts to establish, oversee, and enhance data governance and master data management (MDM) practices across the organization. The role ensures data accuracy, integrity, security, and accessibility while aligning with business strategies and compliance requirements. The role is responsible for gathering data from all stakeholders, defining KPIs, integrating data for process improvement, and filling the gaps between the current and future data architecture. This position collaborates with cross-functional teams to develop frameworks, policies, and solutions to improve data quality and usability. ESSENTIAL DUTIES AND RESPONSIBILITIES: Develop and implement the organization's data governance framework, policies, and standards. Define roles and responsibilities for data stewardship, ownership, and custodianship. Monitor and audit data governance processes to ensure alignment with organizational goals. Provide leadership in resolving data quality issues and establishing data quality metrics and KPIs. Work with stakeholders, architects, and engineers from different domains to implement, test, and deploy solutions for the business and manufacturing master data domains, working end to end on their implementation. Gather business requirements across multiple application teams and identify master data contributing to process improvements. Design, develop, test, and debug Master Data solutions that will be used by end users or integrated with other applications. Use modern software development methodologies and programming languages, follow secure coding practices and software legal compliance guidelines, analyze user stories, write both functional and test code, automate build and deployment, and perform unit, integration, and end-to-end testing of applications. Design, develop, and implement MES solutions to improve manufacturing processes and data flow. Configure MES/ERP software to meet specific production requirements, ensuring system scalability and reliability. Maintain, upgrade, and troubleshoot MES applications and related infrastructure. Collaborate with manufacturing teams to ensure the MES/ERP integrates seamlessly with equipment and existing systems. Conduct root cause analysis with appropriate subject matter experts to identify and address the root causes of production issues.
PROBLEM SOLVING: Problems are difficult and typically undefined, where information is difficult to obtain. Solutions require analysis and investigation to understand the root cause of the problem. Assesses issues thoroughly and solves complex problems quickly; removes roadblocks. DECISION MAKING/SCOPE OF AUTHORITY: Achieves planned results by decisions and actions based on professional methods, business principles, and practical experience. May recommend/make decisions regarding existing or new programs/initiatives that have a significant impact on business operations/outcomes and carry potential consequences if unsuccessful. SPAN OF CONTROL/COMPLEXITY: Acts independently to determine methods and procedures on new or special assignments. May supervise the activities of major projects and/or vendors. EDUCATION/EXPERIENCE: Bachelor's Degree in Computer Science, Information Management, Statistics or Engineering and a minimum of 5-8 years of experience. Strong understanding of 21 CFR Part 11 guidelines, cGxP and GAMP5, IoT practices, and IIoT/Industry 4.0 strategy. Experience in MES systems like Tulip, POMSnet, Koerber PAS-X, Siemens OpCenter, etc. Proficient understanding of data modeling, data quality, and data profiling. Experience with architecture, design, and development of data integration solutions using MSMQ, HiveMQ, etc. SKILLS/COMPETENCIES: Proficiency in MES/ERP platforms (e.g., Siemens SIMATIC IT, Rockwell FactoryTalk, Dassault DELMIA, Wonderware MES, SAP ME) and MES systems like POMSnet, Koerber PAS-X, Siemens OpCenter, QAD, etc. Proficiency in programming languages (e.g., Python, Java, C#, .NET). Knowledge of scripting languages like JavaScript, SQL, or VBA. Experience with data integration solutions using MSMQ, HiveMQ, etc. Experience with integration technologies such as OPC UA, REST APIs, SOAP, or middleware tools. Ability to communicate clearly and build relationships within the team and across various levels within the organization. Create accountability and a sense of urgency for execution and plan attainment. Knowledge of processes that ensure strong problem-solving and risk-management assessment techniques, always ensuring compliance with regulatory guidelines. WORK ENVIRONMENT/EXPECTED BUSINESS TRAVEL: The work is typically performed in a normal office environment. Role routinely uses standard office equipment. Minimal travel required.
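One of the duties above is establishing data quality metrics and KPIs for master data. As a hedged illustration only - the material master fields are hypothetical and this is not MIMEDX's framework - basic completeness and uniqueness metrics can be computed like this:

# Illustrative only: simple completeness/uniqueness metrics for a master data table.
# Field names are hypothetical and do not reflect MIMEDX's systems.
import pandas as pd

materials = pd.DataFrame({
    "material_id": ["M-001", "M-002", "M-002", "M-004"],
    "description": ["Wound graft 2x3", None, "Wound graft 4x4", "Surgical mesh"],
    "uom": ["EA", "EA", "EA", None],
})

metrics = {
    "completeness_description": materials["description"].notna().mean(),
    "completeness_uom": materials["uom"].notna().mean(),
    "uniqueness_material_id": materials["material_id"].is_unique,
    "duplicate_ids": int(materials["material_id"].duplicated().sum()),
}
print(metrics)  # e.g. completeness 0.75, duplicate_ids 1

Metrics like these are typically tracked over time and surfaced to data stewards as governance KPIs.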

Posted 30+ days ago

Lead Business Consultant, Enterprise Data Management & Data Integration
SimCorp, Atlanta, GA
WHAT MAKES US, US Join some of the most innovative thinkers in FinTech as we lead the evolution of financial technology. If you are an innovative, curious, collaborative person who embraces challenges and wants to grow, learn and pursue outcomes with our prestigious financial clients, say Hello to SimCorp! At its foundation, SimCorp is guided by our values - caring, customer success-driven, collaborative, curious, and courageous. Our people-centered organization focuses on skills development, relationship building, and client success. We take pride in cultivating an environment where all team members can grow, feel heard, valued, and empowered. If you like what we're saying, keep reading! WHY THIS ROLE IS IMPORTANT TO US At SimCorp, we facilitate the streamlining of investments, accounting, and operations for major global financial institutions. We do this through IT systems, processes, and financial knowledge. Implementing our software by way of high-quality projects is at the core of what we do. To introduce our software to our clients, business consultants are essential to us. In the role of Lead Business Consultant specializing in Data Management and Data Integration, your participation will be instrumental in the expansion of our market unit and in advancing the outcomes of our clients' projects. You will normally be working full-time on implementation projects, which require your special expertise in Data Management and Data Integration, and you will be responsible for all aspects of the project stream. You will act as an important sparring partner for your clients and effectively monitor, coordinate, and escalate issues as needed. WHAT YOU WILL BE RESPONSIBLE FOR Responsibility for major parts of SimCorp One implementation projects: You know the customer requirements and processes related to Investment Operations at banks, asset managers and insurers and outline solutions for optimal use of SimCorp One. Close cooperation with the project manager and a key role in developing projects to success. Gain understanding of SimCorp's Global Standard Solutions and utilize them for project delivery. Key contact for our customers for all questions in connection with SimCorp One. Instruct and mentor less experienced colleagues on the job. Contribute to improving best practices for implementation processes and promote topics across projects. Actively participate in know-how exchange with colleagues on an international level. WHAT WE VALUE Most importantly, you can see yourself contributing and thriving in the position described above. How you gained the skills needed for doing that is less important. We expect you to have expertise at several of the following: Previous experience as an Analyst or (Implementation) Consultant for asset managers, asset owners, banks or consulting firms. Hands-on experience with SimCorp One / SimCorp Dimension (SCD). Excellent know-how in at least two of the following areas: Enterprise Data Management; Data Integration architecture, patterns and standards (WS*, REST API, JSON, XML); XSLT and other scripting languages like Python; Relational Databases (Oracle, SQL Server) and/or ETL tools; C# or any other object-oriented language; Testing, e.g., Test-Driven Development, agile testing, test automation, test methodologies; System performance improvements; Continuous integration and delivery (CI/CD). Experience working on software implementation projects in the financial industry.
Ability to manage assigned tasks and deliver the results on time. Basic understanding of financial industry/products and related workflows. Very good communication skills and proficiency in English - both oral and written. Ability to travel to client's site. BENEFITS An attractive salary, bonus scheme, and pension are essential for any work agreement. However, in SimCorp, we believe we can offer more. Therefore, in addition to the traditional benefit scheme, we provide an extensive work-life balance and opportunities for professional development: there is never just one route - we offer an individual approach to professional development to support the direction you want to take. For New York City only: The salary range for this position is $133,000 - $170,000. Additionally, employees are eligible for an annual discretionary bonus, and benefits including health care, leave, and retirement plans. Your total compensation may vary based on role, location, department and individual performance. NEXT STEPS Applications are continuously assessed, so please send your CV in English as soon as possible. Please note: Only applications sent through our system will be processed. For those keen on exploring opportunities with SimCorp but questioning the alignment with this position, we welcome you to submit your CV for consideration. SimCorp is on an exciting growth journey, and our Talent Acquisition Team is ready to assist you in discovering the right role for you. The approximate time to consider your CV is three weeks. We are eager to continually improve our talent acquisition process and make everyone's experience positive and valuable. Therefore, during the process, we will ask you to provide your feedback, which is highly appreciated. WHO WE ARE For over 50 years, we have worked closely with investment and asset managers to become the world's leading provider of integrated investment management solutions. We are 3,000+ colleagues with a broad range of nationalities, education, professional experiences, ages, and backgrounds in general. SimCorp is an independent subsidiary of the Deutsche Börse Group. Following the recent merger with Axioma, we leverage the combined strength of our brands to provide an industry-leading, full, front-to-back offering for our clients, with SimCorp as the overarching company brand and Axioma as a key product brand. SimCorp is an equal-opportunity employer. We are committed to building a culture where diverse perspectives and expertise are integrated into our everyday work. We believe in the continual growth and development of our employees, so that we can provide best-in-class solutions to our clients. #Li-Hybrid
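For context on the integration skills listed above (REST APIs, JSON, and data transformation), here is a generic, hedged sketch; the endpoint and field names are hypothetical and this is not a SimCorp or SimCorp One API:

# Illustrative only: pull JSON from a hypothetical REST endpoint and flatten it
# into rows for a downstream staging table. Not a SimCorp API.
import requests

URL = "https://example.com/api/positions"  # hypothetical endpoint

payload = requests.get(URL, timeout=30).json()

rows = [
    {
        "portfolio": pos["portfolio"]["id"],
        "instrument": pos["instrument"]["isin"],
        "quantity": float(pos["quantity"]),
    }
    for pos in payload.get("positions", [])
]
print(rows[:3])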

Posted 1 week ago

Senior Data Operations Engineer, Commercial & Medical Data Solutions
Revolution Medicines, Inc., Redwood City, CA
Revolution Medicines is a clinical-stage precision oncology company focused on developing novel targeted therapies to inhibit frontier targets in RAS-addicted cancers. The company's R&D pipeline comprises RAS(ON) Inhibitors designed to suppress diverse oncogenic variants of RAS proteins, and RAS Companion Inhibitors for use in combination treatment strategies. As a new member of the Revolution Medicines team, you will join other outstanding professionals in a tireless commitment to patients with cancers harboring mutations in the RAS signaling pathway. The Opportunity: We are seeking an experienced Data Operations Engineer to support our Commercial and Medical Affairs team, ensuring that critical data is readily available, reliable, and secure. In this position, you will design and automate robust data pipelines that support data-driven insights and decisions in our emerging oncology biopharmaceutical company. You'll work at the intersection of data engineering and operations, managing cloud infrastructure and upholding data governance standards to maintain data quality and compliance. This role is key to empowering analysts and data scientists with timely, well-governed data - ultimately helping the organization bring life-saving cancer therapies to patients more efficiently. The ideal candidate has experience in DataOps, cloud-based data platforms, and automation, with a strong background in handling data sources such as sales performance, claims, patient data, CRM, and digital marketing metrics. Responsibilities will include: Design, build, and automate ETL/ELT workflows to ingest, transform, and integrate data from multiple sources (sales, marketing, clinical, etc.) into our cloud data platform. Ensure pipelines are scalable and efficient, and minimize downtime through automation and monitoring. Manage and optimize cloud-based data infrastructure (AWS and Azure) for data storage and processing. Oversee the provisioning of resources, scheduling of jobs, and infrastructure-as-code deployments to ensure high availability and performance of data systems. Implement data governance best practices, including data quality checks, validation processes, and metadata management. Maintain data privacy and compliance with industry regulations (e.g., HIPAA, GDPR), ensuring that sensitive data is handled securely and ethically. Develop continuous integration/continuous deployment (CI/CD) pipelines for data workflows and analytics applications. Use modern DevOps tools and containerization (Docker, Kubernetes) to deploy updates to data pipelines, databases, and analytics tools rapidly and reliably. Work closely with data scientists, BI analysts, and business stakeholders to understand data needs and translate them into technical solutions. Ensure data is accessible and well-structured for analytics, machine learning models, and business intelligence dashboards. Set up monitoring, alerting, and logging for data pipelines and databases to proactively identify issues and improve system reliability. Troubleshoot and resolve data pipeline failures, data discrepancies, or performance bottlenecks in a timely manner to minimize impact on business operations. Create and maintain clear documentation for data pipelines, infrastructure configurations, and processes. Champion DataOps best practices across teams, mentoring junior engineers and guiding developers in efficient data engineering and operational excellence.
Required Skills, Experience and Education: Bachelor's degree (or equivalent experience) in Computer Science, Data Engineering, Information Systems, or related field. 7+ years of hands-on experience in Data Engineering, DevOps, or DataOps roles, with a track record of designing scalable data pipelines and infrastructure. Experience in pharma/life sciences. Excellent written and verbal communication skills, able to clearly explain complex data pipelines and infrastructure concepts to both technical colleagues and non-technical stakeholders. Strong team player who partners well with cross-functional teams on requirements and solutions, open to giving and receiving constructive feedback and sharing knowledge. Analytical mindset with a solution-oriented approach, capable of troubleshooting issues across the tech stack (data, code, infrastructure) and driving problems to resolution. Comfortable working in ambiguous environments, defining operating models, processes, roles, and responsibilities while executing and building capabilities and platforms. Self-motivated and accountable, with a high sense of ownership over deliverables. Strong experience with cloud platforms such as AWS or Azure (e.g., S3/ADLS, Lambda/Functions, EC2/VMs, Glue/Data Factory, etc.). Ability to architect and manage data warehouses or lakehouse solutions on the cloud (Databricks preferred). Proficiency in SQL for data querying and manipulation, as well as programming in Python (Pandas, PySpark, or similar data frameworks) for building pipeline logic and automation. Experience with containerization and orchestration tools (Docker and Kubernetes) to deploy data services and ensure reproducible environments. Knowledge of workflow orchestration platforms (Airflow, Airbyte, Fivetran, or similar) for scheduling and managing complex data workflows and integrations. Hands-on experience implementing CI/CD pipelines using tools like Jenkins, GitLab CI/CD, GitHub Actions, or Azure DevOps. Expertise in using infrastructure-as-code (Terraform, CloudFormation) and configuration management (Ansible, Helm) to automate deployments and environment management. Experience with big data processing frameworks (Spark) or streaming platforms (Kafka, Kinesis). Demonstrated ability to implement monitoring/logging (CloudWatch, Datadog, Splunk, or ELK stack) for data systems. Familiarity with version control (Git) and collaborative development workflows. Experience with supporting data science and AI/ML workflows, such as provisioning data for machine learning models, or knowledge of MLOps principles. Solid understanding of relational and NoSQL databases (e.g., PostgreSQL, SQL Server, MongoDB) and data modeling concepts. Preferred Skills: Experience with oncology data or commercial/medical affairs pharma data at the time of launch. Understanding of industry-specific data sources, terminology, and compliance requirements is a strong plus. Familiarity with regulations and standards such as HIPAA, GDPR, and GxP as they pertain to data handling and software validation in pharma. Ability to optimize data pipelines for analytics tools like R, SAS, or visualization platforms (Tableau, Power BI). Relevant certifications such as AWS Certified Data Analytics or Azure Data Engineer that demonstrate validated expertise. Experience leading data engineering projects or initiatives. Ability to coordinate work among team members, manage project timelines, and engage with stakeholders to gather requirements and report progress.
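The responsibilities above revolve around automated ETL/ELT workflows for commercial data sources. The sketch below is a minimal, hedged illustration of that extract-transform-load pattern; the file names and columns are hypothetical, and a production pipeline would use the cloud and orchestration tooling named in the posting rather than local files:

# Illustrative only: a toy extract-transform-load step.
# File names and columns are hypothetical; a real pipeline would use the
# orchestration and cloud tooling described in the posting.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["account_id"])                 # basic data-quality gate
    df["sale_date"] = pd.to_datetime(df["sale_date"])
    return df.groupby(["account_id", "sale_date"], as_index=False)["units"].sum()

def load(df: pd.DataFrame, path: str) -> None:
    df.to_parquet(path, index=False)

if __name__ == "__main__":
    load(transform(extract("raw_sales.csv")), "curated_sales.parquet")

Each step would normally be wrapped in an orchestrator task with monitoring and data quality checks attached.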
The base salary range for this full-time position is $158,000 to $198,000 for candidates working onsite at our headquarters in Redwood City, CA. The range displayed on each job posting is intended to be the salary for an individual working onsite in Redwood City and will be adjusted for the local market a candidate is based in. Our salary ranges are determined by role, level, and location. Individual pay is determined by multiple factors, including job-related skills, experience, market dynamics, and relevant education or training. Please note that base salary is one part of the overall total rewards program at RevMed, which includes competitive cash compensation, robust equity awards, strong benefits, and significant learning and development opportunities. Revolution Medicines is an equal opportunity employer and prohibits unlawful discrimination based on race, color, religion, gender, sexual orientation, gender identity/expression, national origin/ancestry, age, disability, marital status, medical condition, and veteran status. Revolution Medicines takes protection and security of personal data very seriously and respects your right to privacy while using our website and when contacting us by email or phone. We will only collect, process and use any personal data that you provide to us in accordance with our CCPA Notice and Privacy Policy. For additional information, please contact privacy@revmed.com. #LI-Hybrid #LI-YG1

Posted 30+ days ago

Canoo Data Platform - Data Engineer
Canoo, Oklahoma City, OK
Job Title: Canoo Data Platform - Data Engineer About Canoo Canoo's mission is to bring EVs to Everyone and build a world-class team to deploy this sustainable mobility revolution. We have developed breakthrough electric vehicles that are reinventing the automotive landscape with pioneering technologies, award-winning designs, and a unique business model that spans all owners in the full lifecycle of the vehicle. Canoo is starting production and is distinguished by its pioneering and experienced team of technologists, engineers, and designers. With offices around the country, the company is scaling quickly and seeking candidates who love to challenge themselves, are motivated by purpose, and possess a strong desire to get things done. The "Canoo Way" Canoo's success is the direct result of our disciplined application of our core operating principles and drills, which are based on three main principles: Think 80/20 ("Important versus less important"), Act 30/30 ("Reduce waste and increase output"), and Live 90/10 ("We have each other's back"). We hire based on "MET" - Mindset, Equipment and willingness to Train - and seek individuals that take accountability and deliver results being Humble, Hungry to succeed, and Hunting for opportunities to win. We train our team to engage with each other by modulating between their intellect (iQ) and emotional intelligence (eQ) applying Facts, Finesse, and Force when they communicate. The principles and drills of the CANOO Way have been fundamental to our success, our ability to grow, continuously improve, innovate and are at the core of our day-to-day operations. Job Purpose As a Data Engineer, you will be responsible for developing and maintaining highly scalable data pipelines that enable data transformation and load between internal systems, IoT devices (electric vehicles), external backend systems, and frontend user interfaces. You will design and implement data streams ensuring data quality, data integrity, security, and high performance. Additionally, you will collaborate with cross-functional teams to continually integrate all company systems. Responsibilities (80s of the Position) Work with stakeholders to gather data and reporting requirements, to build dashboards and data flows. Create infrastructure-as-code, deployment pipelines, developer tools, and other automations. Understand product requirements, engage with team members and customers to define solutions, and estimate the scope of work required. Deliver solutions that can keep up with a rapidly evolving product in a timely fashion. Required Experience: Google Cloud Platform (GCP), GCS, BigQuery. Expertise with one or more back-end languages such as Python, Go, TypeScript, JavaScript, etc. SQL expertise; DBT experience a plus. Experience with cloud services like GCP, AWS or Azure. Kafka. Dashboarding and Reporting: Superset, Looker. Git: BitBucket/GitLab. Kubernetes: mid-level experience. Preferred Experience: Python; Python dependency management and custom packages. Expertise with Google Cloud Platform (GCP). Data Warehousing: partitioning, segmentation. Internet of Things (IoT) and MQTT. Docker. Terraform experience a plus. CI/CD tooling: Jenkins/git-ci. Understanding of automotive and embedded software systems. Travel Requirements: Onsite presence in the office; this is not a remote or hybrid role.
Travel may be required on an occasional basis for events such as team meetings or working with manufacturers or subject-matter experts on particular tasks. Physical Requirements for Non-Physical Positions While performing the duties of this job, employees may be required to sit for prolonged periods of time, occasionally bending or stooping, lifting up to 10 pounds, and prolonged periods of computer use. Reasonable Accommodations Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the position. What's Cool About Working Here... Meaningful, challenging work that will redefine the automotive landscape and make EVs available to everyone. Comprehensive Health Insurance. Equity Compensation. Flexible Paid Time Off. Casual workplace with an unbelievable feeling of energy. Canoo is an equal opportunity-affirmative action employer and considers all qualified applicants for employment based on business needs, job requirements and individual qualifications, without regard to race, color, religion, sex, age, disability, sexual orientation, gender identity or expression, marital status, past or present military service or any other status protected by the laws or regulations in the locations where we operate. We also consider qualified applicants with criminal histories consistent with applicable federal, state and local law. Any unsolicited resumes or candidate profiles submitted in response to our job posting shall be considered the property of Canoo Inc. and its subsidiaries and are not subject to payment of referral or placement fees if any such candidate is later hired by Canoo unless you have a signed written agreement in place with us which covers the applicable job posting. Canoo maintains compliance with the OFCCP. As such, please feel free to review the following information: https://www.dol.gov/agencies/ofccp/posters https://www.dol.gov/agencies/olms/poster/labor-rights-federal-contractors If you are a person with a disability needing assistance with the application process, please call (214) 529-8055 or email us at TalentAcquisition@canoo.com Equal Employment Opportunity Posters Equal Employment Opportunity Posters | U.S. Department of Labor (dol.gov)
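Because the role above moves telemetry from vehicles (IoT, MQTT, Kafka) into BigQuery, here is a hedged sketch of a consumer loop using the kafka-python client; the topic, brokers, and message fields are hypothetical and this is not Canoo's actual stack or schema:

# Illustrative only: consume JSON vehicle telemetry from a Kafka topic and keep
# well-formed records. Topic, brokers, and fields are hypothetical.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "vehicle-telemetry",                         # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    if {"vin", "timestamp", "battery_soc"} <= event.keys():
        # In a real pipeline this row would be staged and loaded into BigQuery.
        print(event["vin"], event["timestamp"], event["battery_soc"])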

Posted 30+ days ago

Senior Data Engineer, Real World Data
Proscia, Philadelphia, PA
This position is on-site 2-3 days in Philadelphia About Proscia Pathology is at the center of medicine and is undergoing a profound transformation as the final frontier of digitization in healthcare. We started Proscia to accelerate pathology's transition from microscope to images, and to use AI to change the way we think about cancer. We can't change medicine alone. That's why we're looking for curious thinkers. Big dreamers. Developers, evangelists, pathologists, and scientists. Exceptional talent to help us use AI for good and advance humankind. At Proscia, we push the limits of medicine and technology, solving problems the world has never solved before. We build software used by thousands of scientists and pathologists, who work on the front lines of fighting cancer for patients globally. To accelerate our vision, Proscia has raised over $100M in capital from world-class healthcare and technology investors. About this Position As a Senior Data Engineer, you will contribute to Proscia's growing Real World Data (RWD) business, which operates as a "startup within a startup." In this entrepreneurial environment, you will help build and scale innovative data solutions that drive better outcomes for cancer patients and support cutting-edge research into therapies and drug regimens. Success in this role requires independence, adaptability, and close collaboration with cross-functional teams while maintaining alignment with broader engineering initiatives. What You'll Do Working at a startup like Proscia means wearing many hats, but when you come to work, you can expect to focus on the following: Develop and refine data pipelines in collaboration with the RWD team and customers, ensuring seamless extraction, transformation, and loading (ETL) of data from customer systems into Proscia's repositories. Work independently to identify and implement process improvements, such as automating manual data migration tasks to accelerate data flows. Design, build, and optimize API integrations, including REST and event-driven/pub-sub architectures, to enable real-time data ingestion, transfer, and quantification. Modify and create Python scripts to allow customers to efficiently upload data, including images, metadata, and other formats. Architect data models, data warehouse design patterns and scalable database solutions to support rapidly growing data while ensuring efficient performance and metadata processing. Collaborate with cross-functional teams, including the core engineering and AI teams, to align technical strategies and ensure seamless integration of RWD solutions with the larger Proscia platform. Operate with a startup mindset, contributing to the RWD team's agility and entrepreneurial spirit while delivering high-impact results. What We're Seeking We're looking for people who are smart, nice, and get stuff done. Proscia thrives on entrepreneurial thinkers who excel in fast-paced, challenging environments and are ready to build the plane while flying it. Our ideal candidate has: 3-5+ years of professional experience with Python scripting for data management. 3+ years of experience with detailed knowledge of data warehouse technical architectures. Strong analytical skills and attention to detail, with a proactive approach to solving problems. Experience with JavaScript and familiarity with modern frameworks (preferred). Proficiency in SQL and/or PostgreSQL. Hands-on experience with REST API development and integrations. Experience in cloud platforms, particularly AWS (preferred). 
Expertise with Snowflake or similar data warehouse platforms. Software engineering principles: testing (definition of unit tests, integration tests), setting up CI/CD pipelines, and experience with containerization technologies like Docker and orchestration tools like Kubernetes. The ability to work independently within a cross-functional team and coordinate effectively across other engineering teams to design and implement high-quality data pipelines. A Bachelor's degree in Computer Science, Computer Engineering, or Electrical Engineering (Master's degree preferred). Experience working in the life sciences/biopharmaceutical industry is a plus. Beyond Just Work At Proscia, we want our people to thrive inside and outside the office. Along with competitive pay, we provide comprehensive benefits, flexible schedules, and insurance options to promote long-term health and personal growth. Our creative, collaborative office environment in Philadelphia is designed for agility, complete with writable walls and breakout spaces. Remote teammates stay connected through innovative collaboration tools and regular opportunities for in-person interaction. We celebrate diversity and foster a culture where everyone belongs. Proscia is proud to be an equal-opportunity workplace.
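The responsibilities above include Python scripts that let customers upload images and metadata through REST integrations. The snippet below is an assumption-laden sketch: the endpoint, token handling, and payload fields are hypothetical and do not represent Proscia's actual API:

# Illustrative only: upload a whole-slide image plus metadata to a hypothetical
# REST endpoint. URL, token, and payload fields are assumptions, not Proscia's API.
import json
import requests

API_URL = "https://example.com/api/v1/slides"   # hypothetical
TOKEN = "REPLACE_ME"

metadata = {"case_id": "CASE-0042", "stain": "H&E", "scanner": "AT2"}

with open("slide_0042.svs", "rb") as image_file:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {TOKEN}"},
        files={"image": image_file},
        data={"metadata": json.dumps(metadata)},
        timeout=300,
    )
response.raise_for_status()
print("Uploaded:", response.json().get("id"))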

Posted 30+ days ago

Head Of Data Privacy Americas / Regional Director Data Privacy
Dba: Zeiss Group, White Plains, NY
About Us: How many companies can say they've been in business for over 177 years?! Here at ZEISS, we certainly can! As the pioneers of science, ZEISS handles ever-changing environments in a fast-paced world, meeting them with cutting-edge technologies and continuous advancements. ZEISS believes that innovation and technology are the key to a sustainable future and solutions for global change. We have a diverse range of portfolios throughout the ZEISS family in segments like Industrial Quality & Research, Medical Technology, Consumer Markets and Semiconductor Manufacturing Technology. We are a global company with over 42,000 employees and have over 4,000 in the US and Canada alone! Make a difference, come join the team! Location/Region: This position is located in White Plains, New York. What's the role? As a Head of Data Privacy Americas / Regional Director Data Privacy, you get to work with an astonishing team that plays a vital role in Zeiss Share Services. Showcase your skills and experience with process enhancement. The Regional Director, Data Privacy will oversee the regional operational management of the North America data privacy framework, applicable policies and procedures, and the related governance operating model. The role will work closely and collaboratively with the local Data Privacy Coordinators, Corporate Data Privacy Office, Corporate Counsel, and other stakeholders. This position reports to the Head of Compliance & Legal Affairs, with functional alignment with the Head of Corporate Data Privacy. Sound Interesting? Here's what you'll do: Act as primary subject matter expert and resource on issues related to data privacy. Provide guidance and training to internal teams on privacy matters that affect the company's products, customers, and our customers' patients. Lead the North America Privacy Program as part of the ZEISS Data Privacy Framework and work in conjunction with local Data Privacy Coordinators, legal and other relevant colleagues to review products, vendors, agreements, and initiatives, to advise on privacy/data security, consumer protection, patient privacy, and other related matters in accordance with HIPAA, CCPA, FTC principles, and other applicable international, federal, and state requirements. Support internal counsel in key privacy risk management activities including but not limited to: policy drafting and review, risk and control definition, coordination of recurring audit activities, and providing day-to-day "on-call" support for high-priority privacy-related matters. Communicate detailed regulatory requirements to the businesses, the Information Security Office, Internal Audit, as well as other members of the Corporate Data Privacy Office. Primary point of contact and coordinator for internal and external data privacy inquiries concerning North America, e.g., ZEISS internal inquiries, customer-related inquiries, audit responses, or possible privacy-related disputes. Governance of incident response, issue management, and training content development and coordination with key stakeholders as necessary to effect forensic investigations, crisis management activities, notifications to affected individuals, interaction with customers or vendors, responding to federal and state regulatory inquiries and litigation-related inquiries. Serve as Data Privacy Coordinator for Carl Zeiss, Inc. Do you qualify? 10+ years of relevant work experience, including hands-on management and proven contribution at both strategic and operational levels.
Deep understanding of complex data privacy laws and principles, including HIPAA, GDPR, and CCPA. Expertise in triaging privacy-related questions and issue spotting. IAPP certification required. JD from an accredited law school or similar degree preferred. Excellent written, verbal, and social communication skills. Strong work ethic and sense of accountability and integrity. Solid team success orientation and ability to work both independently and collaboratively with diverse teams across the organization. Self-starter, with a demonstrated ability to identify issues, resolve problems and drive projects to completion. Demonstrated capacity to work independently. Trustworthy, positive, energetic, optimistic attitude with a willingness to work directly to achieve goals. A creative problem solver who is eager to learn about new ideas and concepts. We have amazing benefits to support you as an employee at ZEISS! Medical, Vision, Dental, 401k Matching, Employee Assistance Programs, Vacation and sick pay - the list goes on! The annual pay range for this position is $170,000 - $190,000. The pay offered for this role may be influenced by factors such as job location, scope of role, qualifications, education, experience, & complexity/specialization/scarcity of talent. This position is also eligible for a performance bonus or sales commissions. ZEISS also offers robust benefits, including medical plans, retirement savings plan and paid time off. Zeiss is an Equal Opportunity Employer. Your ZEISS Recruiting Team: Jo Anne Mittelman Zeiss provides Equal Employment Opportunity without unlawful regard to an Applicant's race, color, religion, creed, sex, gender, marital status, age, national origin or ancestry, physical or mental disability, medical condition, military or veteran status, citizen status, sexual orientation, pregnancy (includes childbirth, breastfeeding or related medical condition), genetic predisposition, carrier status, gender expression or identity, including transgender identity, or any other class or characteristic protected by federal, state, or local law of the employee (or the people with whom the employee associates, including relatives and friends).

Posted 30+ days ago

Director, Data Scientist & Data Operations
Eisai US, DC, WA
At Eisai, satisfying unmet medical needs and increasing the benefits healthcare provides to patients, their families, and caregivers is Eisai's human health care (hhc) mission. We're a growing pharmaceutical company that is breaking through in neurology and oncology, with a strong emphasis on research and development. Our history includes the development of many innovative medicines, notably the discovery of the world's most widely-used treatment for Alzheimer's disease. As we continue to expand, we are seeking highly-motivated individuals who want to work in a fast-paced environment and make a difference. If this is your profile, we want to hear from you. The Data Operations Group at Eisai, Inc. is looking for a Director, Data Scientist/Programmer to drive drug development through predictive modeling of disease and drug response. The Data Scientist will work closely with Biostatisticians in the Stat Methodology / Machine Learning group to support projects across various stages of development. This role will be integral in providing actionable insights for critical data science projects that are vital to our business. This position may be either office-based (hybrid) in Nutley, NJ, or remote-based. Responsibilities Data Analysis for Strategic Insights: Skilled in analyzing complex datasets to extract actionable insights, identify key trends and patterns, and provide data-driven recommendations that support and guide strategic decision-making. Data Storage & Processing: Extensive experience in designing, managing, and optimizing data storage solutions. Expertise in building and automating data pipelines for efficient data processing. AI-Driven Data Preparation: Experienced in leveraging AI and machine learning algorithms for automated data preparation, streamlining the transformation of raw data into high-quality, actionable insights. Skilled in using these techniques to create dynamic and interactive visualizations via Power BI, facilitating better decision-making and business intelligence. Natural Language Processing (NLP) & Large Language Models (LLM): Hands-on expertise in applying NLP techniques and LLMs to process and analyze unstructured data, generating insightful infographics and data-driven narratives. These methods help to uncover hidden patterns and deliver actionable insights for stakeholders in a visually compelling format. Pipeline Orchestration & Automation: Experienced in automating and orchestrating complex data pipelines using tools like Apache Airflow, Prefect, and Dagster to ensure seamless data flow and efficient workflows. Data Quality & Consistency: Proficient in establishing and enforcing validation rules to ensure data integrity, consistency, and high-quality standards throughout the data lifecycle. Incremental Data Loads: Skilled in implementing incremental data loading strategies to optimize data refresh cycles and minimize resource consumption.
Event-Driven Automation: Experienced in implementing event-driven automation to ensure real-time and dynamic updates for dashboards, enhancing decision-making with live data. Low-Latency Data Processing: Skilled in ensuring optimal performance and low-latency processing for delivering real-time, time-sensitive insights to stakeholders. Dashboard Optimization: Experienced in leveraging parameterized queries and other optimization techniques to enhance the performance and responsiveness of Power BI dashboards. Data Communication & Visualization: Proficient in presenting complex data findings to non-technical stakeholders through clear, visually compelling reports, interactive dashboards, and presentations that facilitate easy understanding and informed decision-making. Exploratory Data Analysis (EDA): Skilled in conducting thorough exploratory data analysis to assess data quality, uncover insights, and deepen understanding of data characteristics, ensuring data readiness for analysis and model building. Feature Engineering: Expertise in engineering relevant features from raw datasets to enhance model performance, improve predictive accuracy, and support the development of robust machine learning models. Qualifications Bachelor's Degree from an accredited institution with 7+ years of experience in a related role required; Master's degree preferred. In-depth knowledge of statistical analysis, machine learning algorithms, and data modeling techniques. Proficiency in programming languages such as Python or R, with hands-on experience in data manipulation and analysis libraries (e.g., pandas, NumPy, scikit-learn). Experience with data visualization tools (e.g., Tableau, matplotlib) for effectively communicating insights. Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus. Strong problem-solving abilities, critical thinking, and the capacity to manage complex projects independently. Excellent communication and presentation skills, capable of translating complex concepts for both technical and non-technical audiences. Eisai Salary Transparency Language: The base salary range for the Director, Data Scientist & Data Operations is $196,800 - $258,300. Under current guidelines, this position is eligible to participate in: Eisai Inc. Annual Incentive Plan & Eisai Inc. Long Term Incentive Plan. Final pay determinations will depend on various factors including but not limited to experience level, education, knowledge, and skills. Employees are eligible to participate in Company employee benefit programs. For additional information on Company employee benefits programs, visit https://us.eisai.com/careers-at-eisai/benefits. Certain other benefits may be available for this position; please discuss any questions with your recruiter. Eisai is an equal opportunity employer and as such, is committed in policy and in practice to recruit, hire, train, and promote in all job qualifications without regard to race, color, religion, gender, age, national origin, citizenship status, marital status, sexual orientation, gender identity, disability or veteran status. Similarly, considering the need for reasonable accommodations, Eisai prohibits discrimination against persons because of disability, including disabled veterans. Eisai Inc. participates in E-Verify.
E-Verify is an Internet based system operated by the Department of Homeland Security in partnership with the Social Security Administration that allows participating employers to electronically verify the employment eligibility of all new hires in the United States. Please click on the following link for more information: Right To Work E-Verify Participation
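Returning to the technical responsibilities in this listing, incremental data loading is called out explicitly. As a hedged sketch (table and column names are hypothetical, SQLite stands in for a real warehouse, and production pipelines would use the orchestration tools named above), a watermark-based incremental load looks like this:

# Illustrative only: watermark-based incremental load with pandas + SQLite.
# Table/column names are hypothetical; production pipelines would target a real
# warehouse and run under an orchestrator such as Airflow, Prefect, or Dagster.
import sqlite3
import pandas as pd

conn = sqlite3.connect("analytics.db")
conn.execute("CREATE TABLE IF NOT EXISTS curated_events (event_id INT, loaded_at TEXT)")
conn.commit()

# 1. Find the current watermark (latest timestamp already loaded)
watermark = pd.read_sql("SELECT MAX(loaded_at) AS wm FROM curated_events", conn)["wm"].iloc[0]

# 2. Keep only source rows newer than the watermark
source = pd.DataFrame({"event_id": [1, 2, 3],
                       "loaded_at": ["2024-01-01", "2024-02-01", "2024-03-01"]})
new_rows = source if watermark is None else source[source["loaded_at"] > watermark]

# 3. Append just the increment
new_rows.to_sql("curated_events", conn, if_exists="append", index=False)
print(f"Loaded {len(new_rows)} new rows")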

Posted 1 week ago

Lead Business Consultant, Enterprise Data Management & Data Integration
SimCorp, New York, NY
WHAT MAKES US, US Join some of the most innovative thinkers in FinTech as we lead the evolution of financial technology. If you are an innovative, curious, collaborative person who embraces challenges and wants to grow, learn and pursue outcomes with our prestigious financial clients, say Hello to SimCorp! At its foundation, SimCorp is guided by our values - caring, customer success-driven, collaborative, curious, and courageous. Our people-centered organization focuses on skills development, relationship building, and client success. We take pride in cultivating an environment where all team members can grow, feel heard, valued, and empowered. If you like what we're saying, keep reading! WHY THIS ROLE IS IMPORTANT TO US At SimCorp, we facilitate the streamlining of investments, accounting, and operations for major global financial institutions. We do this through IT systems, processes, and financial knowledge. Implementing our software by way of high-quality projects is at the core of what we do. To introduce our software to our clients, business consultants are essential to us. In the role of Lead Business Consultant specializing in Data Management and Data Integration, your participation will be instrumental in the expansion of our market unit and in advancing the outcomes of our clients' projects. You will normally be working full-time on implementation projects, which require your special expertise in Data Management and Data Integration, and you will be responsible for all aspects of the project stream. You will act as an important sparring partner for your clients and effectively monitor, coordinate, and escalate issues as needed. WHAT YOU WILL BE RESPONSIBLE FOR Responsibility for major parts of SimCorp One implementation projects: You know the customer requirements and processes related to Investment Operations at banks, asset managers and insurers and outline solutions for optimal use of SimCorp One. Close cooperation with the project manager and a key role in developing projects to success. Gain understanding of SimCorp's Global Standard Solutions and utilize them for project delivery. Key contact for our customers for all questions in connection with SimCorp One. Instruct and mentor less experienced colleagues on the job. Contribute to improving best practices for implementation processes and promote topics across projects. Actively participate in know-how exchange with colleagues on an international level. WHAT WE VALUE Most importantly, you can see yourself contributing and thriving in the position described above. How you gained the skills needed for doing that is less important. We expect you to have expertise at several of the following: Previous experience as an Analyst or (Implementation) Consultant for asset managers, asset owners, banks or consulting firms. Hands-on experience with SimCorp One / SimCorp Dimension (SCD). Excellent know-how in at least two of the following areas: Enterprise Data Management; Data Integration architecture, patterns and standards (WS*, REST API, JSON, XML); XSLT and other scripting languages like Python; Relational Databases (Oracle, SQL Server) and/or ETL tools; C# or any other object-oriented language; Testing, e.g., Test-Driven Development, agile testing, test automation, test methodologies; System performance improvements; Continuous integration and delivery (CI/CD). Experience working on software implementation projects in the financial industry.
Ability to manage assigned tasks and deliver the results on time. Basic understanding of financial industry/products and related workflows. Very good communication skills and proficiency in English - both oral and written. Ability to travel to client's site. BENEFITS An attractive salary, bonus scheme, and pension are essential for any work agreement. However, in SimCorp, we believe we can offer more. Therefore, in addition to the traditional benefit scheme, we provide an extensive work-life balance and opportunities for professional development: there is never just one route - we offer an individual approach to professional development to support the direction you want to take. For New York City only: The salary range for this position is $133,000 - $170,000. Additionally, employees are eligible for an annual discretionary bonus, and benefits including health care, leave, and retirement plans. Your total compensation may vary based on role, location, department and individual performance. NEXT STEPS Applications are continuously assessed, so please send your CV in English as soon as possible. Please note: Only applications sent through our system will be processed. For those keen on exploring opportunities with SimCorp but questioning the alignment with this position, we welcome you to submit your CV for consideration. SimCorp is on an exciting growth journey, and our Talent Acquisition Team is ready to assist you in discovering the right role for you. The approximate time to consider your CV is three weeks. We are eager to continually improve our talent acquisition process and make everyone's experience positive and valuable. Therefore, during the process, we will ask you to provide your feedback, which is highly appreciated. WHO WE ARE For over 50 years, we have worked closely with investment and asset managers to become the world's leading provider of integrated investment management solutions. We are 3,000+ colleagues with a broad range of nationalities, education, professional experiences, ages, and backgrounds in general. SimCorp is an independent subsidiary of the Deutsche Börse Group. Following the recent merger with Axioma, we leverage the combined strength of our brands to provide an industry-leading, full, front-to-back offering for our clients, with SimCorp as the overarching company brand and Axioma as a key product brand. SimCorp is an equal-opportunity employer. We are committed to building a culture where diverse perspectives and expertise are integrated into our everyday work. We believe in the continual growth and development of our employees, so that we can provide best-in-class solutions to our clients. #Li-Hybrid

Posted 1 week ago

Member of Technical Staff, Data Platform (Data and Annotation Tools)
Inflection AI, Palo Alto, CA
Inflection AI is a public benefit corporation leveraging our world-class large language model to build the first AI platform focused on the needs of the enterprise.

Who we are:

Inflection AI was re-founded in March of 2024, and our leadership team has assembled a team of kind, innovative, and collaborative individuals focused on building enterprise AI solutions. We are an organization passionate about what we are building; we enjoy working together and strive to hire people with diverse backgrounds and experience.

Our first product, Pi, provides an empathetic and conversational chatbot. Pi is a public instance built from our 350B+ frontier model with our sophisticated fine-tuning (10M+ examples), inference, and orchestration platform. We are now focusing on building new systems that directly support the needs of enterprise customers using this same approach. Want to work with us? Have questions? Learn more below.

About the Role

As a Member of Technical Staff on our Data Platform team, you will be instrumental in creating innovative data tools that transform raw inputs into high-quality datasets for ML training and labeling. Your work will focus on designing and building robust pipelines for data transformation, filtering, analysis, and cleaning. We are looking for data engineers who deeply understand ML and are passionate about developing next-generation tools that empower our data curation processes at scale.

This is a good role for you if you:

  • Have extensive experience working in ML environments, with a keen understanding of how high-quality data drives model performance.
  • Are skilled at designing and implementing data tools that streamline the process of dataset creation, data annotation, and labeling.
  • Possess a strong background in building systems for efficient data transformation, filtering, analysis, and cleaning.
  • Thrive in innovative, fast-paced settings where you can directly impact the quality and reliability of training data for cutting-edge AI applications.

Responsibilities include:

  • Designing, building, and maintaining state-of-the-art data tools that convert raw data into high-quality datasets for ML training and labeling (a minimal pipeline sketch follows this list).
  • Developing robust workflows and pipelines for data transformation, filtering, analysis, and cleaning that enhance dataset quality.
  • Collaborating with ML researchers, data scientists, and engineers to ensure our data tools meet the rigorous standards required for enterprise-grade AI.
  • Continuously evaluating and integrating emerging technologies to keep our data curation processes at the forefront of innovation.
  • Driving the evolution of our data platform to support scalable, efficient, and effective data annotation and labeling pipelines.
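As a purely illustrative sketch of the kind of curation pipeline this role builds (not Inflection AI's actual tooling), the snippet below applies simple cleaning and filtering steps before records would be handed to annotation: whitespace normalization, a length filter, and exact-duplicate removal. All thresholds and field names are arbitrary choices for the example.

```python
"""Illustrative dataset-curation pipeline: clean, filter, and dedupe raw text records.

Generic sketch, not Inflection AI's internal tooling; thresholds and field names
are arbitrary example values.
"""
from dataclasses import dataclass


@dataclass
class Record:
    record_id: str
    text: str


def normalize(text: str) -> str:
    """Collapse runs of whitespace and strip leading/trailing spaces."""
    return " ".join(text.split())


def curate(records: list[Record], min_chars: int = 20, max_chars: int = 4000) -> list[Record]:
    """Return cleaned, length-filtered, deduplicated records ready for annotation."""
    seen: set[str] = set()
    curated: list[Record] = []
    for rec in records:
        text = normalize(rec.text)
        if not (min_chars <= len(text) <= max_chars):
            continue  # drop records too short or too long to label usefully
        if text in seen:
            continue  # drop exact duplicates after normalization
        seen.add(text)
        curated.append(Record(rec.record_id, text))
    return curated


if __name__ == "__main__":
    raw = [
        Record("a1", "  The quick   brown fox jumps over the lazy dog. "),
        Record("a2", "The quick brown fox jumps over the lazy dog."),  # duplicate after cleanup
        Record("a3", "too short"),
    ]
    print([r.record_id for r in curate(raw)])  # -> ['a1']
```

Real curation tooling would layer on fuzzy deduplication, quality scoring, and provenance tracking, but the filter-and-clean structure is the core idea.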
Employee Pay Disclosures

At Inflection AI, we aim to attract and retain the best employees and compensate them in a way that appropriately and fairly values their individual contributions to the company. For this role, Inflection AI estimates that the starting annual base salary will fall in the range of approximately $175,000 - $350,000, depending on experience; the actual starting annual base salary may be above or below this range.

Benefits

Inflection AI values and supports our team's mental and physical health. We are focused on building a positive, safe, inclusive, and inspiring place to work. Our benefits include:

  • Diverse medical, dental, and vision options
  • 401k matching program
  • Unlimited paid time off
  • Parental leave and flexibility for all parents and caregivers
  • Support of country-specific visa needs for international employees living in the Bay Area

Interview Process

Apply: Please apply on LinkedIn or our website for a specific role. After speaking with one of our recruiters, you'll enter our structured interview process, which includes the following stages:

  • Hiring Manager Conversation: an initial discussion with the hiring manager to assess fit and alignment.
  • Technical Interview: a deep dive with an Inflection Engineer to evaluate your technical expertise.
  • Onsite Interview: a comprehensive assessment, including a domain-specific interview, a system design interview, and a final conversation with the hiring manager.

Depending on the role, we may also ask you to complete a take-home exercise or deliver a presentation. For non-technical roles, be prepared for a role-specific interview, such as a portfolio review.

Decision Timeline

We aim to provide feedback within one week of your final interview.

Posted 30+ days ago

QA Analyst Lead - Data Lake, Data Warehouse
Huntington Bancshares Inc, Columbus, OH
Description

Huntington Bank is looking for a Lead QA Test Analyst on our Data Lake / Data Warehouse team. In this role you will be part of a team working to develop solutions that enable the business to leverage data as an asset at the bank. As a Lead QA Test Analyst, you will develop test strategies and test plans for Data Lake projects, ensure all IT SDLC processes are documented and practiced, and work closely with multiple technology teams across the enterprise. You will also execute test cases and communicate status to project team members and key stakeholders. Key technologies include Azure DevOps, Python, Data Lake, AWS Cloud, Snowflake, Zena, and DataStage. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Key Responsibilities

  • Actively participate in the review of project requirements, data mappings, and technical design specifications.
  • Create test strategies and test plans for Data Lake projects, mapping back to the project requirements to ensure proper test coverage.
  • Execute test cases in Azure DevOps using manual and/or automated test processes.
  • Execute SQL database queries to support test execution; strong SQL skills are required.
  • Using knowledge of the code, perform ETL validation according to the data mapping, execute data profiling, data reconciliation, metadata validation, and initial and delta validation for different SCD types (a minimal reconciliation sketch appears after the qualifications below).
  • Analyze data, troubleshoot data issues, and create action plans to address data quality issues.
  • Coordinate test execution with other application teams and UAT partners.
  • Create and communicate test status with project team members and stakeholders.
  • Identify, document, and communicate testing defects; collaborate with the project team on defect analysis and triage.
  • Support continuous improvement by identifying and pursuing opportunities to define or enhance QA processes.
  • Perform functional, regression, negative, and migration testing for data warehouse projects.

To perform these duties, the Lead QA Testing Analyst position requires theoretical and practical knowledge of quality assurance, testing principles, and ETL technologies and tools, including AWS Cloud, Data Lake, Snowflake, DataStage, Python, ASG Zena, Infogix, Tableau, Azure DevOps, Mainframe, and SharePoint.

Basic Qualifications

  • Bachelor's degree
  • 5+ years of ETL testing experience in a data warehouse environment
  • 1+ years of experience as a Lead or Subject Matter Expert (SME)

Preferred Qualifications

  • 5+ years of experience writing SQL queries
  • 2+ years of experience with Snowflake and AWS Cloud
  • 2+ years of experience leading QA Analysts on a project team
  • Experience in the financial services (banking) industry
  • Experience testing on Snowflake and AWS S3/EC2/EMR
  • Experience with data governance and data management approaches
  • Excellent verbal and written communication skills
  • Ability to effectively prioritize and execute tasks
  • Detail-oriented and highly motivated, with strong organizational, analytical, and problem-solving skills
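As a hedged illustration of the ETL-validation work described above (not Huntington's actual test framework), the sketch below compares row counts between a source extract and a warehouse target table and lists keys present in the source but missing in the target. The table names, key column, and the in-memory SQLite database are stand-ins invented for the example.

```python
"""Illustrative source-to-target reconciliation check for ETL validation.

Generic sketch only; table names, keys, and columns are hypothetical, and an
in-memory SQLite database stands in for the real source and warehouse.
"""
import sqlite3


def reconcile(conn: sqlite3.Connection, source: str, target: str, key: str) -> dict:
    """Compare row counts and find keys present in source but missing in target."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    missing = conn.execute(
        f"SELECT s.{key} FROM {source} s LEFT JOIN {target} t ON s.{key} = t.{key} "
        f"WHERE t.{key} IS NULL"
    ).fetchall()
    return {
        "source_rows": src_count,
        "target_rows": tgt_count,
        "missing_keys": [row[0] for row in missing],
    }


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE stg_customer (customer_id INTEGER, name TEXT);
        CREATE TABLE dw_customer  (customer_id INTEGER, name TEXT);
        INSERT INTO stg_customer VALUES (1, 'Ann'), (2, 'Bo'), (3, 'Cy');
        INSERT INTO dw_customer  VALUES (1, 'Ann'), (2, 'Bo');
        """
    )
    print(reconcile(conn, "stg_customer", "dw_customer", "customer_id"))
    # expect one missing key: customer_id 3
```

A fuller test suite would add column-level comparisons and SCD-specific checks (for example, verifying effective-date ranges for Type 2 history), but this count-and-missing-key pattern is the usual starting point.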
Exempt Status: Yes (not eligible for overtime pay)

Workplace Type: Office

Our Approach to Office Workplace Type

Certain positions outside our branch network may be eligible for a flexible work arrangement. We're combining the best of both worlds: in-office and work from home. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. Remote roles will also have the opportunity to come together in our offices for moments that matter. Specific work arrangements will be provided by the hiring team.

Huntington is an Equal Opportunity Employer.

Tobacco-Free Hiring Practice: Visit Huntington's Career Web Site for more details.

Note to Agency Recruiters: Huntington Bank will not pay a fee for any placement resulting from the receipt of an unsolicited resume. All unsolicited resumes sent to any Huntington Bank colleagues, directly or indirectly, will be considered Huntington Bank property. Recruiting agencies must have a valid, written, and fully executed Master Service Agreement and Statement of Work for consideration.

Posted 30+ days ago

Director, Data Scientist & Data Operations
Eisai US, Providence, RI
At Eisai, satisfying unmet medical needs and increasing the benefits healthcare provides to patients, their families, and caregivers is Eisai's human health care (hhc) mission. We're a growing pharmaceutical company that is breaking through in neurology and oncology, with a strong emphasis on research and development. Our history includes the development of many innovative medicines, notably the discovery of the world's most widely used treatment for Alzheimer's disease. As we continue to expand, we are seeking highly motivated individuals who want to work in a fast-paced environment and make a difference. If this is your profile, we want to hear from you.

The Data Operations Group at Eisai, Inc. is looking for a Director, Data Scientist/Programmer to drive drug development through predictive modeling of disease and drug response. The Data Scientist will work closely with Biostatisticians in the Stat Methodology / Machine Learning group to support projects across various stages of development. This role will be integral in providing actionable insights for critical data science projects that are vital to our business. This position may be either office based (hybrid) in Nutley, NJ, or remote based.

Responsibilities

  • Data Analysis for Strategic Insights: Analyze complex datasets to extract actionable insights, identify key trends and patterns, and provide data-driven recommendations that support and guide strategic decision-making.
  • Data Storage & Processing: Design, manage, and optimize data storage solutions; build and automate data pipelines for efficient data processing.
  • AI-Driven Data Preparation: Leverage AI and machine learning algorithms for automated data preparation, streamlining the transformation of raw data into high-quality, actionable insights, and create dynamic, interactive visualizations in Power BI to support decision-making and business intelligence.
  • Natural Language Processing (NLP) & Large Language Models (LLMs): Apply NLP techniques and LLMs to process and analyze unstructured data, generating insightful infographics and data-driven narratives that uncover hidden patterns and deliver actionable insights to stakeholders in a visually compelling format.
  • Pipeline Orchestration & Automation: Automate and orchestrate complex data pipelines using tools such as Apache Airflow, Prefect, and Dagster to ensure seamless data flow and efficient workflows.
  • Data Quality & Consistency: Establish and enforce validation rules to ensure data integrity, consistency, and high-quality standards throughout the data lifecycle.
  • Incremental Data Loads: Implement incremental data loading strategies to optimize data refresh cycles and minimize resource consumption (a minimal watermark-based sketch follows this list).
  • Event-Driven Automation: Implement event-driven automation to ensure real-time, dynamic updates for dashboards, enhancing decision-making with live data.
  • Low-Latency Data Processing: Ensure optimal performance and low-latency processing for delivering real-time, time-sensitive insights to stakeholders.
  • Dashboard Optimization: Leverage parameterized queries and other optimization techniques to enhance the performance and responsiveness of Power BI dashboards.
  • Data Communication & Visualization: Present complex data findings to non-technical stakeholders through clear, visually compelling reports, interactive dashboards, and presentations that facilitate easy understanding and informed decision-making.
  • Exploratory Data Analysis (EDA): Conduct thorough exploratory data analysis to assess data quality, uncover insights, and deepen understanding of data characteristics, ensuring data readiness for analysis and model building.
  • Feature Engineering: Engineer relevant features from raw datasets to enhance model performance, improve predictive accuracy, and support the development of robust machine learning models.
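As a hedged, generic illustration of the incremental-loading pattern mentioned above (not Eisai's actual pipeline), the sketch below extracts only rows newer than a stored watermark and then advances the watermark. The table, column, and file names are invented for the example, and an in-memory SQLite database stands in for the real source.

```python
"""Illustrative watermark-based incremental load.

Generic sketch only: the source table, timestamp column, and watermark file
are hypothetical, and an in-memory SQLite database stands in for the source.
"""
import json
import sqlite3
from pathlib import Path

WATERMARK_FILE = Path("last_load.json")  # stores the high-water mark between runs


def read_watermark() -> str:
    """Return the last loaded timestamp, or a floor value on the first run."""
    if WATERMARK_FILE.exists():
        return json.loads(WATERMARK_FILE.read_text())["last_updated_at"]
    return "1970-01-01T00:00:00"


def incremental_extract(conn: sqlite3.Connection) -> list[tuple]:
    """Fetch only rows changed since the last run and advance the watermark."""
    watermark = read_watermark()
    rows = conn.execute(
        "SELECT subject_id, measure, updated_at FROM lab_results "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    if rows:
        WATERMARK_FILE.write_text(json.dumps({"last_updated_at": rows[-1][2]}))
    return rows


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE lab_results (subject_id TEXT, measure REAL, updated_at TEXT);
        INSERT INTO lab_results VALUES
            ('S-001', 4.2, '2024-05-01T10:00:00'),
            ('S-002', 3.8, '2024-05-02T09:30:00');
        """
    )
    print(f"Loaded {len(incremental_extract(conn))} new rows")  # all rows on the first run
```

Orchestration tools such as Airflow or Prefect would schedule a job like this and persist the watermark in a metadata store rather than a local file, but the delta-by-timestamp logic is the same.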
Qualifications

  • Bachelor's degree from an accredited institution with 7+ years of experience in a related role required; Master's degree preferred.
  • In-depth knowledge of statistical analysis, machine learning algorithms, and data modeling techniques.
  • Proficiency in programming languages such as Python or R, with hands-on experience in data manipulation and analysis libraries (e.g., pandas, NumPy, scikit-learn).
  • Experience with data visualization tools (e.g., Tableau, matplotlib) for effectively communicating insights.
  • Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus.
  • Strong problem-solving abilities, critical thinking, and the capacity to manage complex projects independently.
  • Excellent communication and presentation skills, capable of translating complex concepts for both technical and non-technical audiences.

Eisai Salary Transparency Language:

The base salary range for the Director, Data Scientist & Data Operations is $196,800 - $258,300. Under current guidelines, this position is eligible to participate in the Eisai Inc. Annual Incentive Plan and the Eisai Inc. Long Term Incentive Plan. Final pay determinations will depend on various factors, including but not limited to experience level, education, knowledge, and skills. Employees are eligible to participate in Company employee benefit programs. For additional information on Company employee benefit programs, visit https://us.eisai.com/careers-at-eisai/benefits. Certain other benefits may be available for this position; please discuss any questions with your recruiter.

Eisai is an equal opportunity employer and as such is committed in policy and in practice to recruit, hire, train, and promote in all job qualifications without regard to race, color, religion, gender, age, national origin, citizenship status, marital status, sexual orientation, gender identity, disability, or veteran status. Similarly, considering the need for reasonable accommodations, Eisai prohibits discrimination against persons because of disability, including disabled veterans. Eisai Inc. participates in E-Verify.
E-Verify is an Internet-based system operated by the Department of Homeland Security in partnership with the Social Security Administration that allows participating employers to electronically verify the employment eligibility of all new hires in the United States. Please see the following links for more information: Right To Work and E-Verify Participation.

Posted 1 week ago

