
Data Science Jobs 2025 (Now Hiring) – Smart Auto Apply

We've scanned millions of jobs. Simply select your favorites, and we can fill out the applications for you.

IUNU – Seattle, WA
At IUNU (“you knew”), our mission is to deliver confidence at scale to the commercial greenhouse industry. We built LUNA, a computer vision platform that autonomously tracks plant development to turn visual data into high-value decisions. Deployed in over 15 countries, our technology empowers growers with critical insights like yield forecasting and pest detection to drive operational efficiency and reduce waste. We are looking for a Senior Engineer who is passionate about applying their technical expertise to solve real-world problems and build a more sustainable future for agriculture.

About the Role: This is a core systems engineering position for a builder who wants to solve complex challenges at the intersection of distributed systems and horticultural science. You will draw on your experience with data, algorithms, graphs, mathematics, and software engineering to enhance and extend the core of our LUNA system with robust, scalable services. Working closely with our computer vision team and staff horticulturalists, you'll transform the image and sensor data we gather into unique insights for growers.

Responsibilities:
- Design and implement the core distributed data processing engine that powers IUNU's platform, moving beyond simple aggregation to handle large-scale, high-dimensionality datasets.
- Optimize for performance and scale by applying advanced concurrency patterns and algorithmic techniques to maximize throughput across our distributed compute environment.
- Design deterministic, event-driven workflows that guarantee data integrity and exactly-once processing, handling backpressure and late-arriving data from a non-deterministic physical world.
- Drive the technical direction of the team by championing rigorous design reviews, observability best practices, and fault-tolerant architecture that balances speed of execution with long-term system stability.
- Partner with the product and computer vision teams to translate abstract horticultural requirements into concrete, scalable technical solutions that directly improve yield forecasting and operational efficiency for growers.

Requirements:
- 5+ years of professional software engineering experience.
- Expert-level knowledge of Python, including modern language features, performance optimization, concurrency primitives (threading, multiprocessing), and best practices in production-grade code.
- Advanced mastery of relational database internals, specifically PostgreSQL: query optimization, relational algebra, and the distinct challenges of time-series data storage.
- Proven hands-on experience designing, building, and operating systems that process and aggregate large datasets, with expertise in distributed data processing frameworks and efficient aggregation pipelines.
- Deep understanding of algorithms and data structures, with the ability to analyze time/space complexity, select optimal solutions for real-world problems, and implement efficient algorithmic logic.
- Proven track record of designing, implementing, and productionizing high-performance algorithms that operate reliably at scale in distributed environments.
- Solid theoretical and practical knowledge of graph theory, including traversal algorithms (DFS, BFS), shortest-path algorithms, topological sorting, cycle detection, and centrality measures, plus experience applying graph algorithms to real systems (e.g., dependency resolution, social networks, recommendation engines, or knowledge graphs).
- Comprehensive understanding of data pipeline dynamics: scheduling strategies, event-driven vs. time-based triggering, deterministic execution guarantees, idempotency, exactly-once and late-data handling, and backpressure management.
- Hands-on experience with production orchestration platforms such as Kubernetes (including operators, CRDs, and Helm), Argo Workflows, Airflow, Prefect, Dagster, Temporal, or equivalent frameworks, with an emphasis on reliability, observability, and scaling of complex workflows.
- Strong grasp of OS-level concurrency mechanisms (mutexes, semaphores, condition variables, read-write locks, atomic operations) and practical experience implementing correct, high-performance multi-threaded or multi-process systems in Python and/or lower-level languages.
- High-ownership mindset, bias for action, and ability to thrive in complex problem spaces.

Desired:
- Hands-on experience with the Google Cloud Platform ecosystem.
- Advanced knowledge of PostgreSQL and TimescaleDB.
- Strong mathematical or statistical programming skills within the Python ecosystem.
- Data engineering experience including ETL/ELT, AI/ML, and LLM pipelines.

Diversity: At IUNU, we're committed to providing a safe and inclusive environment. We are dedicated to the happiness and success of all of our employees, and strive to foster a workplace in which individual differences are recognized, appreciated, nurtured, and respected. Diversity is important, and we strongly encourage people of all identities and backgrounds to confidently apply for a job with us if this is a role that interests and excites you. We want you to feel comfortable bringing your whole self to work with you, with all of your talents and strengths.

Powered by JazzHR
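As a small illustration of the graph-theory skills this posting names (topological sorting and cycle detection), here is a minimal Python sketch of Kahn's algorithm applied to a pipeline dependency graph; the stage names are invented for the example and are not IUNU's:

```python
from collections import deque

def topological_sort(graph):
    """Kahn's algorithm: return nodes in dependency order.

    `graph` maps each node to the nodes that depend on it.
    Raises ValueError if the graph contains a cycle.
    """
    # Count incoming edges for every node, including ones
    # that only appear as targets.
    indegree = {node: 0 for node in graph}
    for targets in graph.values():
        for t in targets:
            indegree[t] = indegree.get(t, 0) + 1
    # Start from nodes with no unmet dependencies.
    queue = deque(n for n, deg in indegree.items() if deg == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for t in graph.get(node, ()):
            indegree[t] -= 1
            if indegree[t] == 0:
                queue.append(t)
    if len(order) != len(indegree):
        raise ValueError("dependency cycle detected")
    return order

# Hypothetical pipeline stages: ingest -> clean -> {aggregate, features} -> report
stages = {
    "ingest": ["clean"],
    "clean": ["aggregate", "features"],
    "aggregate": ["report"],
    "features": ["report"],
    "report": [],
}
print(topological_sort(stages))
# ['ingest', 'clean', 'aggregate', 'features', 'report']
```

The same routine doubles as the cycle detector the posting mentions: if any nodes remain unscheduled, the dependency graph is cyclic.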

Posted 1 week ago

Pipecare Group – Houston, TX
PIPECARE Group is currently looking for an EMAT Data Analyst Level 3 to join our team in Houston, Texas. By providing technology- and service-focused solutions to the international arena of the oil and gas industry, the PIPECARE Group of companies has been helping customers ensure the integrity of their pipeline and facility assets for over 20 years. Due to our global focus and international growth, PIPECARE is seeking an experienced EMAT Data Analyst (Level 3) to support our continued growth. The selected candidate will work with our project execution teams to ensure the timely and accurate reporting of in-line inspection results, with a focus on the custom-tailored reporting solutions that satisfy our customers' needs. This includes reviewing customer requirement specifications, processing in-line inspection data, analyzing and identifying pipeline features and anomalies within processed data sets, applying industry-accepted anomaly assessment criteria, ensuring the quality and accuracy of the final results, and compiling the results of our inspection activities into a concise, comprehensive, custom-tailored report for our customers.

Industry/sector: Oil & Gas / In-Line Inspection services
Qualification: Certification in EMAT technology (Level 3)
Minimum experience: 15+ years working as an EMAT Data Analyst
Other requirements: solution-oriented attitude; hands-on approach; disciplined; team player; self-motivated

Responsibilities include:
- Checking and approving tool performance during the PTT.
- Checking the data quality of ILI runs.
- EMAT data analysis (valid or expired ASNT Level III or Level II certificate).
- Checking and implementing dig verification tasks at sites and preparing relevant reports.
- Reviewing the software interface and software user manuals.
- Preparing/reviewing DAD quality documentation.
- Ensuring accurate tool sensitivity values are provided to the TM in the tool checklist.
- Preparing run-specific assessment reports.
- Identifying obstructions in the pipeline.
- Producing technically valid preliminary/final reports.
- Informing HO-DAD about the results and/or implementing the results into the reports.
- Ensuring that coordinates are synchronized with the data.
- Alerting R&D and the DA Team Leader / DA Manager regarding software problems.
- Updating documentation and producing and updating standard quality procedures.
- Executing all other tasks, within the assigned job role, as requested by the DA Team Leader, DA Manager, and/or Executive Team.

Qualified candidates will possess:
- A college degree in engineering or a related field.
- Database development and implementation experience.
- Process analysis and requirement/functional specification development experience.
- Quality assurance of databases and reporting experience.
- Experience working on large, complex, and multiple databases.
- Proficiency with analytical tools and instruments such as Excel, Microsoft Access, Minitab, and SPSS.
- A high ability to work with numbers.
- Strong written and verbal communication skills.
- An analytical mind that can process information logically.
- A professional level of English.

Job requirements:
- Ability to work for extended periods in a stationary position at computers and workstations.
- Ability to pass vision acuity and color differentiation examinations.
- Business travel may be required for internal training, internal meetings, site visits, and customer meetings (international travel may be required).
- Ability to work flexible hours based on business and project needs.
- Ability to work either independently or within a team to ensure project success.

Physical and mental requirements:
- Lifting and carrying: ability to lift and carry up to 50 pounds.
- Mobility: must be able to walk and climb to perform duties, including maneuvering within a refinery or plant environment and accessing elevated platforms via ladders and stairwells.
- Communication: sufficient clarity of speech and hearing, or other communication capabilities, to communicate effectively.
- Focus and multitasking: ability to maintain focus and multitask effectively.
- Safety equipment: must be able to wear safety equipment as required by the safety department for personal protection, if/where needed in manufacturing environments.
- Personal mobility and reflexes: sufficient personal mobility and physical reflexes, with or without reasonable accommodations, to perform office duties and travel to off-site locations when necessary.

About PIPECARE Group: PIPECARE Group offers comprehensive in-line inspection services to identify and size pipeline threats. Utilizing advanced technologies such as Magnetic Flux Leakage, Transverse Field Inspection, Ultrasound, and specialized tools, PIPECARE ensures precise detection and assessment of various pipeline anomalies. What we do: PIPECARE provides in-line inspection services to locate, identify, and size threats, supporting integrity management requirements. Check out our AI technology and other cutting-edge technologies on YouTube: PIPECARE Group - YouTube; SMART AI CALIPER - Inspection experience like never before.

Inspection technologies:
- Magnetic Flux Leakage (MFL): detects and sizes general corrosion and metal loss anomalies, especially circumferentially oriented.
- Transverse Field Inspection (TFI): detects and sizes general corrosion and metal loss anomalies, primarily axially oriented.
- Ultrasound (UT): detects and sizes general and other metal loss anomalies with high depth-sizing accuracy.
- Ultrasonic Crack Detection: detects and sizes cracks and colonies of cracks.
- Caliper (Geometry): detects and sizes deviations from the ideal circular shape of a pipeline (dents, ovalities, wrinkles, etc.).

Specialized tools and technologies:
- Combo tools: use multiple measurement systems in various combinations.
- Specialized tubing technologies: designed for furnace and downhole operations.

Equal Opportunity Employer: We are an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.

Posted 30+ days ago

MetroSys – Santa Barbara, CA
Overview: MetroSys is seeking an experienced Data Engineer with a strong background in Python and Microsoft Azure environments. The ideal candidate will have at least 5 years of experience building and optimizing data pipelines, managing data storage solutions, and integrating systems through APIs. This role will focus on developing robust data pipelines and warehouse solutions to support enterprise-level data initiatives.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines to support analytics, reporting, and operational workloads.
- Build and optimize data storage solutions in Microsoft Azure, including Azure Data Lake and related services.
- Integrate third-party and internal systems using APIs for data ingestion and synchronization.
- Collaborate with data architects, analysts, and business stakeholders to ensure data solutions meet requirements.
- Implement best practices for data governance, quality, and security across the pipeline lifecycle.
- Monitor, troubleshoot, and improve pipeline performance and reliability.

Qualifications:
- 5+ years of hands-on experience as a Data Engineer or in a similar role.
- Strong proficiency with Python for data manipulation, automation, and pipeline development.
- Proven experience with Microsoft Azure data services (Azure Data Factory, Data Lake, Synapse, SQL Database).
- Solid understanding of data warehouse concepts and storage optimization techniques.
- Experience designing and consuming APIs for system integration.
- Strong problem-solving skills and the ability to work independently in a remote environment.

Preferred Skills:
- Knowledge of cloud security practices and compliance requirements.
- Familiarity with CI/CD pipelines for data workflows.
- Experience with large-scale enterprise data projects.
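For a concrete sense of what "Python for pipeline development" means in a role like this, here is a minimal, self-contained sketch of a transform step that aggregates raw rows into a summary; the column names and schema are hypothetical, not MetroSys's:

```python
import csv
import io
import json
from collections import defaultdict

def transform(csv_text):
    """Transform step of a tiny ETL pipeline: aggregate raw event
    rows into per-region totals. Columns (`region`, `amount`) are
    hypothetical for the example."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

# Raw extract, as it might arrive from an upstream API or file drop.
raw = "region,amount\nwest,10.5\neast,3.0\nwest,2.5\n"
print(json.dumps(transform(raw)))
# {"west": 13.0, "east": 3.0}
```

In a real Azure pipeline the extract and load stages would talk to Data Lake or an API rather than in-memory strings, but keeping the transform a pure function like this makes it easy to unit test and monitor.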

Posted 30+ days ago

Brook Services – Cameron, WI
Position Overview: The Remote Data Entry Clerk is responsible for accurately entering, updating, and maintaining information in digital databases while working from home. This role requires attention to detail, strong organizational skills, and the ability to handle large volumes of data efficiently.

Key Responsibilities:
- Enter, verify, and update data in company databases and systems.
- Review documents for accuracy, completeness, and consistency.
- Maintain organized digital records and files.
- Perform data quality checks and correct errors as needed.
- Generate reports and summaries as required.
- Collaborate with other departments to support data-related tasks.
- Maintain confidentiality of sensitive information.
- Meet daily or weekly data entry targets with accuracy.

Required Skills and Qualifications:
- Strong typing skills (minimum 40 WPM preferred).
- Excellent attention to detail and organizational abilities.
- Proficiency in Microsoft Office (Excel, Word) or Google Workspace.
- Familiarity with database management systems.
- Strong written and verbal communication skills.
- Ability to work independently and manage time effectively.
- High school diploma or equivalent (Associate degree preferred).

Preferred Qualifications:
- Previous experience in data entry or administrative support.
- Familiarity with CRM or ERP software.
- Experience working remotely.

Posted 30+ days ago

IT Automation LLC – MA
Key Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines using Informatica to move data into Snowflake.
- Develop and optimize data models, schemas, and views within Snowflake.
- Implement and monitor data ingestion, transformation, and quality assurance processes.
- Ensure all solutions comply with HIPAA/HITECH and organizational security standards.
- Support HL7, FHIR, and X12 EDI interoperability initiatives.
- Collaborate with data analysts and business teams to fulfill reporting and analytics requirements.
- Document data flows, metadata, and transformation logic for transparency and traceability.

Required Skills:
- 3–5+ years of experience in data engineering or ETL development.
- Strong proficiency with Snowflake, Informatica (PowerCenter/IICS), and SQL.
- Deep understanding of data warehousing, data governance, and data quality principles.
- Experience with structured and semi-structured data (JSON, XML, CSV).
- Familiarity with AWS, Azure, or GCP storage and services.

Preferred Qualifications:
- Scripting experience with Python for automation.
- Hands-on experience with IDQ, Data Catalog, Git, and CI/CD pipelines.
- Snowflake or Informatica certification(s) highly desirable.

Posted 30+ days ago

Red Arch Solutions – Annapolis Junction, MD

$200,000 - $229,000 / year

Position Description: Red Arch Solutions is hiring an HPC Software Engineer in Annapolis Junction, Maryland. The Software Engineer shall be responsible for the design, development, test, and sustainment of an end-to-end Big Data processing platform that provides downstream systems with aggregated analytic results and allows end users to query, visualize, and analyze event data from multiple data sources.

Position Required Skills:
- Active TS/SCI with Polygraph.
- Master's degree in computer science or a related discipline from an accredited college or university, plus five (5) years of experience as a SWE in programs and contracts of similar scope, type, and complexity; OR a Bachelor's degree in computer science or a related discipline from an accredited college or university, plus seven (7) years of such experience; OR nine (9) years of such experience.
- Experience using the Linux CLI.
- Experience with Bash/Python scripting.
- Experience developing Java applications in a Linux environment.
- Experience developing with the Spring framework, including Spring Boot.
- Experience processing Big Data.
- Demonstrated experience with system design and architecture.
- Experience with web development, HTTP, and REST services.
- Experience with NoSQL technologies such as Elasticsearch and Accumulo.
- Experience with CI/CD principles, concepts, best practices, and tools such as Jenkins and GitLab CI.

Position Desired Skills:
- Experience with the Atlassian tool suite (Jira, Confluence).
- Experience with the Git version control system.
- Experience with test-driven development.
- Experience with containerization technologies such as Docker.

Salary range for this position: $200,000 - $229,000. The Red Arch Solutions pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Determination of official compensation or salary relies on several factors including, but not limited to, level of position, job responsibilities, geographic location, scope of relevant work experience, educational background, certifications, contract-specific affordability, organizational requirements, alignment with local internal equity, and alignment with market data.

Red Arch Solutions Benefits Snapshot:
- 100% paid employee healthcare premiums; CareFirst Advantage best-in-class benefits.
- HaloScripts concierge prescription medication service.
- Generous PTO and 11 paid federal holidays.
- 10% 401(k): 6% match vested day one and up to a 4% profit-sharing contribution.
- Annualized bonus compensation and spot bonuses for hard work.
- Tuition reimbursement, 529 college savings plan, and college loan payback program.
- Generous referral bonuses.

Red Arch Solutions provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws. Have more questions about Red Arch? Email us at Careers@RedArchSolutions.com

Posted 3 weeks ago

Airtable – San Francisco, CA

$179,500 - $221,500 / year

Airtable is the no-code app platform that empowers people closest to the work to accelerate their most critical business processes. More than 500,000 organizations, including 80% of the Fortune 100, rely on Airtable to transform how work gets done.

At Airtable, we're passionate about democratizing software creation: empowering anyone to build powerful, flexible tools without writing code. With our shift to an AI-native platform, customers can now generate full apps and deploy AI agents directly into their workflows. Data engineering plays a critical role in this evolution by delivering the insights our teams rely on to improve user experience, measure agent impact, and understand how the business is performing at scale. As one of the data engineers at Airtable, you'll make an enormous contribution to our data engineering efforts: you'll design and own mission-critical data pipelines that enable decision-making, partner with company leaders to create scalable data solutions, and launch innovative alerting and visualization solutions.

What you'll do:
- Work between our engineering organization and stakeholders from our data science, growth, sales, marketing, and product teams to understand the data needs of the business and produce pipelines, data marts, and other data solutions that enable better product and growth decision-making.
- Design and update our foundational business tables to simplify analysis across the entire company.
- Continue to improve the performance and reliability of our data warehouse.
- Build and enforce a pattern language across our data stack, ensuring that our data pipelines and tables are consistent, accurate, and well understood.

Who you are:
- You have 5+ years of professional experience designing, creating, and maintaining scalable data pipelines, preferably in Airflow.
- You've wrangled enough data to understand how often the complex systems that produce data can go wrong.
- You are proficient in at least one programming language (preferably Python) and are willing to become effective in others as needed to get your job done.
- You are highly effective with SQL and understand how to write and tune complex queries.
- You're passionate and thoughtful about building systems that enhance human understanding.
- You communicate with clarity and precision in written form, and you have experience communicating with graphs and plots.

Compensation awarded to successful candidates will vary based on their work location, relevant skills, and experience. Our total compensation package also includes the opportunity to receive benefits and restricted stock units, and may include incentive compensation. To learn more about our comprehensive benefit offerings, please check out Life at Airtable. For work locations in the San Francisco Bay Area, Seattle, New York City, and Los Angeles, the base salary range for this role is $179,500 — $221,500 USD. Please see our Privacy Notice for details regarding Airtable's collection and use of personal information relating to the application and recruitment process.

🔒 Stay safe from job scams: All official Airtable communication will come from an @airtable.com email address. We will never ask you to share sensitive information or purchase equipment during the hiring process. If in doubt, contact us at hr@airtable.com.
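The posting's point about complex data systems "going wrong" is commonly addressed by making pipeline loads idempotent, so that replaying a batch or receiving a late correction cannot corrupt the result. A minimal sketch under assumed record fields (`id`, `updated_at`), not Airtable's actual stack:

```python
def upsert(table, records):
    """Idempotently merge records into `table`, keyed by `id`.

    The newest version of each row wins (compared by `updated_at`),
    so late-arriving corrections are applied and replaying the same
    batch leaves the table unchanged.
    """
    for rec in records:
        current = table.get(rec["id"])
        if current is None or rec["updated_at"] >= current["updated_at"]:
            table[rec["id"]] = rec
    return table

table = {}
batch = [
    {"id": 1, "updated_at": 10, "value": "a"},
    {"id": 1, "updated_at": 12, "value": "b"},  # late correction wins
]
upsert(table, batch)
upsert(table, batch)  # replaying the batch is a no-op
print(table[1]["value"])
# b
```

The same last-writer-wins merge is what a `MERGE`/upsert statement does in a warehouse; writing loads this way is one standard answer to duplicate delivery and out-of-order data.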

Posted 30+ days ago

Sauce Labs – Santa Clara, CA

$161,000 - $200,000 / year

About Us: At Sauce Labs, we empower the world's top enterprises - like Walmart, Bank of America, and Indeed - to deliver quality web and mobile applications at speed. Our industry-leading platform ensures continuous quality across the SDLC, using AI-powered analytics to identify key quality signals from development through production. With our unified solution, teams can release and innovate with confidence, knowing their apps will always look, function, and perform exactly as they should. Backed by TPG and Riverwood Capital, we are shaping the future of digital confidence - join us!

The Role: We are seeking a motivated and detail-oriented Data Center Engineer to join our global infrastructure team. This is an entry-level to early-career role designed for candidates looking to build hands-on experience in data center operations and infrastructure support. As a Data Center Engineer, you will support the physical maintenance of our server and device environments, assist with routine deployments, and collaborate with senior team members to maintain high availability across our data centers and device labs.

Responsibilities:

Infrastructure Deployment & Maintenance
- Assist with racking, stacking, and cabling of servers, switches, and storage devices.
- Help set up mobile device testing infrastructure in device farm environments.
- Support basic hardware installations, upgrades, and physical maintenance.
- Monitor and report on power usage and rack-level connectivity.

Troubleshooting & Support
- Perform basic diagnostics on hardware and cables.
- Provide hands-on support under guidance from remote engineers (smart hands).
- Maintain and update asset inventory systems.
- Monitor alerts and escalate incidents as needed.

Security & Compliance
- Follow established protocols for physical access to data center environments.
- Escort vendors and visitors as required and log access appropriately.

Inventory & Logistics
- Track equipment shipments and log hardware check-ins/outs.
- Assist with regular inventory audits and organization of parts and tools.
- Help process RMAs and coordinate the return of faulty hardware.

Documentation & Reporting
- Maintain accurate records of tasks, installations, and incidents.
- Contribute to knowledge base updates and process documentation as needed.

Data Center Operations & Infrastructure Management
- Assist with the physical setup of servers, storage, and network hardware under the guidance of senior staff.
- Support routine tasks during scheduled maintenance and infrastructure upgrades by following documented procedures.
- Learn and follow best practices for equipment handling, cabling, and physical infrastructure standards.

Device Farm Management
- Help prepare and organize real mobile devices for lab and device farm environments.
- Follow instructions to connect and maintain devices used for testing purposes.
- Assist teams by gathering logs and reporting basic device or connectivity issues across Android, iOS, and other platforms.

Networking & Systems Administration
- Provide hands-on support for Linux, macOS, Android, and iOS systems as instructed.
- Use remote management tools like iDRAC, iLO, or similar interfaces to perform basic checks or reboots under direction.
- Learn and follow system access and logging protocols when interacting with server hardware.

Monitoring, Security & Compliance
- Monitor infrastructure using predefined dashboards and tools (e.g., Prometheus, Zabbix, Nagios) and escalate alerts as needed.
- Follow established checklists to ensure systems are deployed according to documented security and compliance standards.
- Participate in routine maintenance tasks and provide on-the-ground support during incident response activities.

Collaboration & Cross-Functional Support
- Work closely with more senior operators and engineers on assigned tasks.
- Respond to support tickets and service requests, escalating when appropriate.
- Collaborate with other infrastructure and development teams to support hardware needs.

Required Skills:
- 1-3 years of experience in a technical, data center, or IT support role.
- Familiarity with basic IT hardware concepts (servers, networking gear, cabling).
- Interest in physical infrastructure and hands-on technical work.
- Understanding of TCP/IP basics and cable types (fiber, copper).
- Comfort using standard tools and following technical documentation.
- Ability to lift and move equipment safely and follow safety protocols.
- Willingness to travel domestically up to 20% of the time to support in-person collaboration with your team and to provide hands-on support for remote data center locations.

Nice to Haves:
- Associate's or Bachelor's degree in Information Technology, Computer Science, or a related field, or equivalent practical experience.
- Exposure to Linux systems and command-line tools.
- Basic familiarity with mobile devices and remote management interfaces (e.g., iDRAC, iLO).
- Interest in learning about CI/CD tools, virtualization, or monitoring platforms.

Soft Skills:
- Strong attention to detail and a methodical work style.
- Clear communication and willingness to ask questions.
- Eagerness to learn new technologies and grow in a technical environment.
- Ability to follow structured processes and work independently with guidance.

While Sauce Labs is a hybrid workplace, this is primarily an in-person role. Where appropriate, opportunities for remote work will be identified and offered. Please note our privacy terms when applying for a job at Sauce Labs. Sauce Labs is proud to be an Equal Opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender identity/expression/status, sexual orientation, age, marital status, veteran status, or disability status.

Security responsibilities at Sauce: At Sauce, we commit to supporting the health and safety of employees and properties, partnering with internal stakeholders to learn and act on ever-evolving security protocols and procedures. You'll be expected to fully comply with all policies and procedures related to security at the department and org-wide level, and to exercise a 'security first' approach to how we design, build, and run our products and services.

We are excited to share the base salary for this position, exclusive of fringe benefits, potential bonuses, or stock-based compensation. Your base salary will be determined based on factors such as geographic location, skills, education, and/or experience, along with its relationship to the base salaries of current team members at Sauce Labs who are similarly situated. Benefits and perks that we offer include health coverage (medical, dental, and vision) along with disability and life insurance. In addition, Sauce Labs offers parental leave benefits, flexible time off, professional development, and a 401(k) retirement plan with match. To see more about benefits and perks at Sauce Labs, please check out our careers page at saucelabs.com/company/careers.

US Compensation Range: $85,000 — $105,000 USD

Posted 30+ days ago

Nextdoor – San Francisco, CA

$175,000 - $203,000 / year

#Team Nextdoor Nextdoor (NYSE: NXDR) is the essential neighborhood network. Neighbors, public agencies, and businesses use Nextdoor to connect around local information that matters in more than 340,000 neighborhoods across 11 countries. Nextdoor builds innovative technology to foster local community, share important news, and create neighborhood connections at scale. Download the app and join the neighborhood at nextdoor.com . Meet Your Future Neighbors As a Data Scientist at Nextdoor, you will help design and oversee product experiments and own complex analyses to drive company and product strategy. We use a semi-embedded team structure, in which a group of data team members works on a specific product or pillar, interfacing directly with product and engineering stakeholders. The Data Science group consists of people from a diverse set of backgrounds and perspectives, trained in fields as wide-ranging as economics, physics, statistics, and operations research. We are an integral part of the product development organization and play an active and collaborative role in building and improving the product. At Nextdoor, we offer a warm and inclusive work environment that embraces a hybrid employment experience, providing a flexible experience for our valued employees. The Impact You'll Make Feed is a mission-critical surface within our platform – helping neighbors discover meaningful content through an intuitive interface and surfacing the right content through intelligent ranking. As a Data Scientist on the Feed team, you will partner with Analytics Engineers to advance our data foundation, collaborate with our cross-functional teams to develop and execute product roadmaps, and define/own the ways we measure success and elevate experimentation capabilities of the team. We are seeking an entrepreneurial and driven data scientist to accelerate our efforts and play a significant role in our data-centric culture. 
This person will work closely with cross-functional teams such as product, engineering, and design to develop and deliver metrics, analyses, solutions, and insights. Successful candidates will demonstrate technical acumen, product expertise, and business sense, and be enthusiastic about making a positive impact through timely execution. You are passionate about leveraging the power of data to drive product changes with quality and agility.

Your responsibilities will include:
- Partner with cross-functional teams, including product management, design, engineering, and research, to support product development efforts
- Design and measure experiments to inform feature decisions
- Develop key strategic insights through exploratory data analysis to inform future investments or pivots in strategy
- Build scalable metrics and dashboards to empower efficient decision-making at Nextdoor
- Participate in in-person Nextdoor events such as trainings, off-sites, volunteer days, and team-building exercises
- Build in-person relationships with team members and contribute to Nextdoor's company culture

What You'll Bring To The Team
- 3-5 years of data science and product analytics experience
- BS and/or MS in a quantitative discipline: statistics, operations research, computer science, engineering, applied mathematics, physics, economics, etc.
- Experience designing trustworthy experiments and analyzing complex product A/B testing results
- Experience identifying, designing, and delivering ergonomic tables that amplify the speed and accuracy of key analyses and self-serve reports
- Expert knowledge of SQL and Python programming, including common scientific computing packages and data science tools such as numpy, pandas, and scikit-learn
- Excellent communication skills, with the ability to synthesize, simplify, and explain complex problems to different types of audiences, including executives, and to compile compelling narratives
- Innate curiosity around finding meaningful insights that inform the way we think about and develop both our product and our business strategies
- Eagerness to explore and apply AI and emerging technologies to reimagine how work gets done

Rewards
Compensation, benefits, perks, and recognition programs at Nextdoor come together to create one overall rewards package. The starting base salary for this role in the San Francisco, CA area is expected to range from $175,000 to $203,000 on an annualized basis, or potentially greater if your level of proficiency exceeds the level expected for the role. Compensation may also vary by geography. We also expect to award a meaningful equity grant for this role. With quarterly vesting, your first vest date would fall within the first three months of your start date. Overall, total compensation will vary depending on your relevant skills, experience, and qualifications. We have you covered! Nextdoor employees can choose from a variety of great health plans. At Nextdoor, we empower our employees to build stronger local communities. To create a platform where all feel welcome, we want our workforce to reflect the diversity of the neighbors we serve. We encourage everyone interested in our mission to apply.
We do not discriminate on the basis of race, gender, religion, sexual orientation, age, or any other trait that unfairly targets a group of people. In accordance with the San Francisco Fair Chance Ordinance, we always consider qualified applicants with arrest and conviction records. For information about our collection and use of applicants' personal information, please see Nextdoor's Personnel Privacy Notice.
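The experiment-design work this listing describes comes down to measuring whether a product change moved a metric, and by how much relative to noise. As a hedged illustration (not Nextdoor's actual tooling; all counts are invented), a two-proportion z-test over conversion counts needs only the Python standard library:

```python
# Minimal A/B experiment readout: two-proportion z-test, stdlib only.
# All numbers below are made-up illustration values, not Nextdoor data.
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Return the z-statistic and two-sided p-value for the difference
    between two conversion rates (variant B vs. variant A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 4.8% vs. 5.6% conversion on 10k users per arm
z, p = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z={z:.2f} p={p:.4f}")
```

In practice this would be a library call (e.g. statsmodels' `proportions_ztest`) inside a larger experimentation platform; the sketch just shows the statistic the platform reports.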

Posted 30+ days ago

The Trade Desk – New York, NY
The Trade Desk is a global technology company with a mission to create a better, more open internet for everyone through principled, intelligent advertising. Handling over 1 trillion queries per day, our platform operates at an unprecedented scale. We have also built something even stronger and more valuable: an award-winning culture based on trust, ownership, empathy, and collaboration. We value the unique experiences and perspectives that each person brings to The Trade Desk, and we are committed to fostering inclusive spaces where everyone can bring their authentic selves to work every day.

Do you have a passion for solving hard problems at scale? Are you eager to join a dynamic, globally connected team where your contributions will make a meaningful difference in building a better media ecosystem? Come and see why Fortune magazine consistently ranks The Trade Desk among the best small- and medium-sized workplaces globally.

ABOUT THE ROLE
Data scientists at TTD work closely with engineering throughout the lifecycle of the product, from ideation to production and monitoring. Our data scientists are end-to-end owners. You will participate actively in all aspects of designing, researching, building, and delivering data-focused products for our clients and traders.

This role is responsible for researching and applying state-of-the-art modeling techniques to solve measurement problems in advertising technology. You will collect and explore data, research methodologies, design experiments to validate model results, and implement research work in production. You will work across all methods of measurement in AdTech, with a particular focus on helping to move both internal and external clients toward more sophisticated measurement solutions that prove the causal impact of ads served by the TTD platform.
The work of this role will help build out causal lift measurement and make it more accessible and attainable on our platform.

The main responsibilities include:
- Explore, evaluate, and deploy models to answer the question "Are TTD ads driving incremental lift?"
- Help define and build methods to optimize toward causal effects
- Explore data from various sources within TTD and prototype ETL pipelines to collect it for model training
- Design and analyze experiments (e.g., A/B tests) to validate model results
- Work with cross-functional stakeholders to develop the best implementation and testing plans
- Communicate learnings from data and insights from models in compelling ways to influence product decisions

WHO WE ARE LOOKING FOR
- Proficient in open-source languages, with a strong passion for enhancing and expanding your technical skills
- Hands-on experience developing statistical and machine learning models and solutions using open-source tools and cloud computing platforms
- A deep understanding of the foundations of statistics and machine learning models
- Hands-on experience building models at scale
- A track record of owning a project end-to-end (from research to production) and partnering with a cross-functional team of data scientists, engineers, and product managers to deliver advanced analytics or models
- A keen sense of data intuition and the ability to innovate in the field of causal inference and/or machine learning, as evidenced by achievements such as first-author publications or project successes

WHAT YOU BRING TO THE TABLE
We do not expect you to know every technology we use when you start at TTD. What we care most about is that you can learn quickly and solve complex problems using the best tools for the job.
However, we find that the most successful candidates typically come in with something like the following experience:
- BS/MS with 4+ years, or a PhD with 2+ years, of experience working in a DS or ML role that involves bringing products from ideation to production
- Experience in causal inference and lift measurement
- Experience designing experiments in a production environment
- Proficiency in Python
- Experience in programmatic advertising is a plus
- Experience running heavy workloads on a distributed computing cluster (especially EMR or Databricks), leveraging technologies like Spark to work with large datasets, preferred
- The ability to communicate with diverse stakeholders, make architecture recommendations, ensure effective execution, and measure the quality of outcomes

CO, CA, IL, NY, WA, and Washington DC residents only: In accordance with CO, CA, IL, NY, WA, and Washington DC law, the range provided is The Trade Desk's reasonable estimate of the base compensation for this role. The actual amount may differ based on non-discriminatory factors such as experience, knowledge, skills, abilities, and location. All employees may be eligible to become The Trade Desk shareholders through eligibility for stock-based compensation grants, which are awarded to employees based on company and individual performance. The Trade Desk also offers other compensation depending on the role, such as sales-based incentives and commissions. Expected benefits for this role include comprehensive healthcare (medical, dental, and vision) with premiums paid in full for employees and dependents, retirement benefits such as a 401(k) plan and company match, short- and long-term disability coverage, basic life insurance, well-being benefits, reimbursement for certain tuition expenses, parental leave, sick time of 1 hour per 30 hours worked, vacation time for full-time employees up to 120 hours through the first year and 160 hours thereafter, and around 13 paid holidays per year.
Employees can also purchase The Trade Desk stock at a discount through The Trade Desk's Employee Stock Purchase Plan. The Trade Desk also offers a competitive benefits package. Note: Interns are not eligible for variable incentive awards such as stock-based compensation, retirement plan, vacation, tuition reimbursement, or parental leave. At The Trade Desk, base salary is one part of our competitive total compensation and benefits package and is determined using a salary range. The base salary range for this role is $151,400 – $227,000 USD. As an Equal Opportunity Employer, The Trade Desk is committed to making our job application process accessible to everyone and to providing reasonable accommodations for applicants with disabilities. If you have a disability or medical condition and require an accommodation for any part of the application or hiring process, please contact us at accommodations@thetradedesk.com. You can also contact us at the same email address if you have a disability and need assistance accessing our Company website. When contacting us, please provide your contact information and specify the nature of your accessibility issue.
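The causal-lift question this role centers on ("Are TTD ads driving incremental lift?") is, at its simplest, a randomized holdout comparison: the intent-to-treat difference in conversion rates between users eligible to see ads and a randomly held-out control group, where randomization is what licenses the causal reading. A hedged sketch over simulated data (not The Trade Desk's production methodology; all rates are invented):

```python
# Simulated randomized holdout test for incremental lift, stdlib only.
import random

random.seed(7)

# Invented rates: control converts at a 2% base rate; ads add 0.5
# percentage points of true incremental effect for the treated group.
control = [1 if random.random() < 0.020 else 0 for _ in range(50_000)]
treated = [1 if random.random() < 0.025 else 0 for _ in range(50_000)]

rate_c = sum(control) / len(control)
rate_t = sum(treated) / len(treated)
abs_lift = rate_t - rate_c        # incremental conversions per user
rel_lift = abs_lift / rate_c      # lift relative to the holdout baseline

print(f"control={rate_c:.4f} treated={rate_t:.4f} lift={rel_lift:+.1%}")
```

A production system would layer significance testing, ghost-bid or placebo designs, and variance reduction on top of this difference; the sketch only shows the estimand.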

Posted 30+ days ago

The Trade Desk – New York, NY

$151,400 - $227,000 / year

Job Title: Data Scientist II
Location: 1114 Ave of the Americas, New York, NY 10036
*Telecommuting permitted: work may be performed within normal commuting distance from The Trade Desk, Inc. office in New York, NY.

Job Duties:
- Use technology to develop AI-based search, retrieval, and classification products applied to programmatic advertising.
- Design and deploy machine learning solutions with the objective of transforming unstructured and noisy content from images and web pages into structured and actionable data.
- Identify opportunities to enhance the reliability of audience and contextual targeting strategies offered by the company's platform.
- Build data pipelines using cloud computing and big data to analyze the content of hundreds of millions of URLs on a daily basis and process billions of ad-bidding opportunities.
- Support the data science process and research efforts within The Trade Desk by documenting the results of research efforts and studies and sharing knowledge with other Data Science teams.
- Build AI-powered chatbots to support users' navigation on the platform and enable users to access data insights about their ad campaign performance through natural-language interactions with a virtual assistant.
- Use Python daily as a programming language, along with tools and frameworks such as PySpark, PyTorch, and GitHub, to analyze data, run tests, and build models.

Salary: $151,400 - $227,000/year

Job Requirements: Master's degree (U.S. or foreign equivalent) in Statistics, Financial Engineering, or a related field and three (3) years of experience in the job offered or a related role. Must have three (3) years of experience with: Python and SQL programming; machine learning and NLP; data modeling; statistical modeling; and deep learning frameworks and agentic AI. Must have two (2) years of experience with: big data and cloud computing; and version control.
The compensation disclosure, equity eligibility, and benefits for this role are the same as those described in the Data Scientist listing above; the base salary range for this role is $151,400 – $227,000 USD. As an Equal Opportunity Employer, The Trade Desk is committed to creating an inclusive hiring experience where everyone has the opportunity to thrive.
Please reach out to us at accommodations@thetradedesk.com to request an accommodation or discuss any accessibility needs you may have in accessing our Company website or navigating any part of the hiring process. When you contact us, please include your preferred contact details and specify the nature of your accommodation request or questions. Any information you share will be handled confidentially and will not impact our hiring decisions.

Posted 5 days ago

Infinitive Inc – Ashburn, VA
*Candidates must be local to the Washington D.C. metro area.

About Infinitive: Infinitive is a data and AI consultancy that enables its clients to modernize, monetize, and operationalize their data to create lasting and substantial value. We possess deep industry and technology expertise to drive and sustain the adoption of new capabilities. We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable a high return on investment. Infinitive has been named one of the "Best Small Firms to Work For" by Consulting Magazine six times, most recently in 2023. Infinitive has also been named a Washington Post "Top Workplace", a Washington Business Journal "Best Place to Work", and a Virginia Business "Best Place to Work."

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our clients' data infrastructure. Your expertise in Python, PySpark, ETL processes, and CI/CD (Jenkins or GitHub), along with experience in both streaming and batch workflows, will be essential in ensuring the efficient flow and processing of data to support our clients.

Responsibilities:
- Data Architecture and Design: Collaborate with cross-functional teams to understand data requirements and design robust data architecture solutions. Develop data models and schema designs to optimize data storage and retrieval.
- ETL Development: Implement ETL processes to extract, transform, and load data from various sources. Ensure data quality, integrity, and consistency throughout the ETL pipeline.
- Python and PySpark Development: Utilize your expertise in Python and PySpark to develop efficient data processing and analysis scripts. Optimize code for performance and scalability, keeping up to date with the latest industry best practices.
- Data Integration: Integrate data from different systems and sources to provide a unified view for analytical purposes.
Collaborate with data scientists and analysts to implement solutions that meet their data integration needs.
- Streaming and Batch Workflows: Design and implement streaming workflows using PySpark Streaming or other relevant technologies. Develop batch processing workflows for large-scale data processing and analysis.
- CI/CD Implementation: Implement and maintain continuous integration and continuous deployment (CI/CD) pipelines using Jenkins or GitHub Actions. Automate testing, code deployment, and monitoring processes to ensure the reliability of data pipelines.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Proven experience as a Data Engineer or in a similar role
- Strong programming skills in Python and expertise in PySpark for both batch and streaming data processing
- Hands-on experience with ETL tools and processes
- Familiarity with CI/CD tools such as Jenkins or GitHub Actions
- Solid understanding of data modeling, database design, and data warehousing concepts
- Excellent problem-solving and analytical skills
- Strong communication and collaboration skills

Preferred Skills:
- Knowledge of cloud platforms such as AWS, Azure, or Google Cloud
- Experience with version control systems (e.g., Git)
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Understanding of data security and privacy best practices

Applicants for employment in the U.S. must possess work authorization which does not require sponsorship by the employer for a visa. Infinitive is an Equal Opportunity Employer.
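The ETL responsibilities described in this listing follow the standard extract-transform-load shape: pull records from a source, enforce data-quality rules while casting types, then load into a target table. A minimal, self-contained sketch using only the Python standard library (csv and sqlite3 stand in for the PySpark and warehouse tooling the role actually uses; all table and column names are invented for the example):

```python
# Toy batch ETL: extract from CSV, transform with a quality check, load
# into SQLite. Illustrative only; a real pipeline would use Spark.
import csv
import io
import sqlite3

raw = io.StringIO(
    "order_id,amount,region\n"
    "1,19.99,east\n"
    "2,,west\n"   # missing amount: dropped by the quality check below
    "3,5.00,east\n"
)

# Extract: read source records
rows = list(csv.DictReader(raw))

# Transform: enforce a simple data-quality rule and cast types
clean = [
    (int(r["order_id"]), float(r["amount"]), r["region"])
    for r in rows
    if r["amount"]   # reject records with a missing amount
]

# Load: write the cleaned records into a target table
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INT, amount REAL, region TEXT)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)

total_east = db.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'east'"
).fetchone()[0]
print(f"east revenue: {total_east:.2f}")
```

The same three stages map onto PySpark (`spark.read.csv` → DataFrame filters and casts → `df.write`), with CI/CD pipelines testing each stage; this sketch only illustrates the pattern.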

Posted 2 weeks ago

Lovelytics – Arlington, VA
Lovelytics is a Databricks-focused data and AI consulting firm specializing in artificial intelligence, data, and analytics solutions. Since partnering with Databricks in 2019, Lovelytics has experienced exponential growth, growing from 50 people to over 340 over the past three years. Lovelytics is a trusted partner for many of the most high-profile enterprise clients in Media & Entertainment, Manufacturing, Retail & CPG, Healthcare & Life Sciences, and Financial Services.

Lovelytics is seeking a Sr. Data Engineer (Lead Consultant) with experience delivering strategic Databricks client engagements to join our Data & AI practice! This Lead Consultant will play a key role in delivering on client engagements related to data warehousing, ETL development, data integrations, and data modeling. This is a client-facing role focused on using and migrating to our partner technologies: Databricks, AWS, and Azure, to name a few. In addition to the technical capabilities for this role, we are looking for someone who wants to work in a collaborative, dynamic, and inclusive environment and has a passion for bringing meaning to data. This role is open to remote candidates in the US and Ontario, Canada. This role is not open for work sponsorship at this time.

Primary Responsibilities:
- Utilize consulting and technical skills to work independently in a client-facing project environment.
- Be responsible for your own execution and sometimes lead individual work streams on client engagements, as assigned and under the supervision of an engagement lead.
- Collaborate with other team members to successfully deliver on projects.
- Work effectively and communicate directly with both internal and client and/or partner teams.
- Develop full ownership of your execution on client engagements; you'll become involved in the project planning and solution stages of engagements as well.
- Design and implement complex ETL/ELT pipelines with evidence of improved data processing times.
- Successfully lead small data warehousing projects with measurable performance enhancements under the management of an engagement lead.
- Contribute to real-time data processing solutions and manage streaming data.
- Implement security and compliance measures for data pipelines.
- Design and implement version control and branching strategies, and integrate them into CI/CD for promoting and testing in higher environments.

Our Ideal Candidate's Skills and Experiences:
- B.S. in Computer Science or equivalent
- 3-5 years' experience in data engineering and big data, with at least 2 years working directly with clients and external stakeholders
- Extensive knowledge of data warehousing concepts and hands-on experience deploying pipelines using Databricks (a must)
- Data modeling and database design skills and knowledge of version control
- Excellent verbal and written communication skills
- Able to apply technical skills to engagement needs
- Works with engagement leads and directors to gain exposure to the design and architecture of solutions
- Understands and utilizes Lovelytics tools and client tools

What We Promise You:
- Exciting projects with great clients in varying departments and verticals across the world
- The ability to work closely with experienced data engineers and quickly grow and expand your skillset
- The ability to work closely with companies of all sizes, ranging from Fortune 100 to small local businesses
- A workplace where you are encouraged to challenge the status quo and develop new technologies, methodologies, and processes
- A diverse team consisting of data gurus, experience seekers, and entrepreneurial minds that are always pushing to be better

Lovelytics is an Equal Opportunity Employer. This means you don't have to worry about whether your application process will be fair. We consider all applicants without regard to race, color, religion, age, ancestry, ethnicity, gender, gender identity, gender expression, sexual orientation, veteran status, or disability.

Posted 1 week ago

Lucayan Technology Solutions LLC – Tampa, FL
Location: Tampa, FL (Onsite)
Employment Type: Full-Time, Regular
Department: Intelligence Operations and Analysis
Security Clearance: Top Secret/SCI
Hours: Standard duty hours, Monday–Friday (mission requirements may require flexibility)

About Us
At Lucayan Technology Solutions LLC, we deliver secure, innovative solutions in support of national defense and intelligence missions. As a trusted government contracting partner, we provide top-tier technology and intelligence services that safeguard our nation. Our team is mission-driven, and we are committed to building careers that matter.

Job Summary
Lucayan Technology Solutions is seeking an Operations Analyst – Data Aggregation & Mitigation Support to join our team in Tampa, FL. This role is responsible for compiling, analyzing, and modeling military systems performance and operational data to inform mission strategy and decision-making. The analyst will coordinate with the DoD, Intelligence Community, industry, and academia to identify emerging technologies, streamline processes, and support mission readiness.

Key Responsibilities
- Conduct research and modeling of military systems and operations.
- Aggregate, analyze, and interpret operational and statistical data.
- Develop analytic methods, trade studies, and tools to support decision-making.
- Provide technical and analytical support for training, tactical documentation, and systems design projects.
- Coordinate with DoD, IC, industry, and academia on data aggregation and mitigation support.
- Identify and recommend emerging technologies to enhance current processes.
- Prepare and deliver presentations, reports, and training materials.
- Mentor junior analysts and provide task leadership as needed.
- Maintain knowledge of relevant military operations, technology, and processes.

Work Environment
Onsite position in Tampa, FL supporting U.S. defense operations. Mission-critical, collaborative environment requiring coordination across agencies and partners.
Less than 10% travel may be required.

Minimum Qualifications
- Clearance: Active Top Secret/SCI clearance
- Citizenship: U.S. citizenship required
- Education: Bachelor's degree in Intelligence Studies, Military Operations, Data Science, Engineering, or a related field
- Experience: 6+ years of military or civilian operational/analytical experience; prior Special Operations experience preferred; demonstrated leadership experience
- Strong communication, problem-solving, and analytical skills

Required Certifications
None required. Military training or relevant professional certifications in analysis or operations preferred.

Desirable Qualifications
- Familiarity with AI/ML tools for data analysis
- Experience with advanced data visualization and analytic methods
- Proven ability to mentor junior analysts

Employment Details
- Seniority Level: Mid-to-Senior
- Employment Type: Full-Time, Regular
- Job Function: Data Analysis, Intelligence Support, Operations Analysis
- Industry: Defense & Space, Government Contracting
- Travel: Less than 10%

Why Lucayan?
At Lucayan Technology Solutions, you'll be part of a mission-focused team supporting critical U.S. defense operations. We value our people as our greatest strength and provide opportunities to grow professionally while contributing to national security. Apply now to put your operational analysis expertise to work where it matters most.

Posted 30+ days ago

Trace3 – Irvine, CA

$17 - $22 / hour

Who is Trace3?
Trace3 is a leading Transformative IT Authority, providing unique technology solutions and consulting services to our clients. Equipped with elite engineering and dynamic innovation, we empower IT executives and their organizations to achieve competitive advantage through a process of Integrate, Automate, Innovate. Our culture at Trace3 embodies the spirit of a startup with the advantage of a scalable business. Employees can grow their career and have fun while doing it! Trace3 is headquartered in Irvine, California. We employ more than 1,200 people all over the United States. Our major field office locations include Denver, Indianapolis, Grand Rapids, Lexington, Los Angeles, Louisville, Texas, and San Francisco. Ready to discover the possibilities that live in technology? Come join us!

Street-Smart – Thriving in Dynamic Times
We are flexible and resilient in a fast-changing environment. We continuously innovate and drive constructive change while keeping a focus on the "big picture." We exercise sound business judgment in making high-quality decisions in a timely and cost-effective manner. We are highly creative and can dig deep within ourselves to find positive solutions to different problems.

Juice – The "Stuff" it Takes to be a Needle Mover
We get things done and drive results. We lead without a title, empowering others through a can-do attitude. We look forward to the goal, mentally mapping out every checkpoint on the pathway to success, and visualizing what the final destination looks and feels like.

Teamwork – Humble, Hungry and Smart
We are humble individuals who understand how our job impacts the company's mission. We treat others with respect, admit mistakes, give credit where it's due, and demonstrate transparency. We "bring the weather" by exhibiting positive leadership and solution-focused thinking. We hug people in their trials, struggles, and failures, not just their successes. We appreciate the individuality of the people around us.
JOB SUMMARY:
The Data and Analytics Engineer Intern will assist in designing, building, and testing data platforms and analytics solutions to generate actionable insights for our customers. College-level Junior, Senior, or Master's students only.

WHAT YOU CAN EXPECT TO LEARN:
- The challenges modern businesses are facing with data, and how to address them with leading-edge technology solutions
- How to break down complex data problems and solve them by designing simplified, achievable solutions
- How organizations can leverage data to gain a competitive advantage by extracting patterns from large datasets
- How to harness large volumes of data, such as web traffic and user behaviors, to build pipelines that derive value from raw data for our customers
- Partner with our Data Intelligence Team to determine the best approach to data ingestion, structure, and storage, then work with the team to ensure these are implemented accurately
- Contribute ideas on how to make our customers' data more valuable, and work with members of Trace3's Engineering Team to implement solutions

ELIGIBILITY AND PREFERRED SKILLS:
- Enrollment in the Junior or Senior year of an undergraduate program, or in a master's program, at an accredited college or university
- Candidates should be pursuing a field of study applicable to the Data Intelligence internship
- Cumulative grade point average (GPA) of 3.0 or better; People and Organizational Health may require a copy of the applicant's transcript
- Academic or professional/internship experience working in a professional setting is a plus
- Basic knowledge of data systems, data languages such as SQL or Python, or data visualization is a plus
- Ability to work independently on assigned tasks and to accept direction on given assignments
- Self-motivated, with a customer mindset and a desire to help people
- Enthusiasm for technical problem solving, with attention to detail and strong communication skills
- Ability to learn and research in a dynamic and engaging environment
- Availability to work 40 hours per week throughout the internship

Actual salary will be based on a variety of factors, including location, experience, skill set, performance, licensure and certification, and business needs. The range for this position in other geographic locations may differ. Certain positions may also be eligible for variable incentive compensation, such as bonuses or commissions, that is not included in the base salary.

Estimated Pay Range: $17 – $22 USD per hour

The Perks
- Comprehensive medical, dental, and vision plans for you and your dependents
- 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
- Competitive compensation
- Training and development programs
- Major offices stocked with snacks and beverages
- Collaborative and cool culture
- Work-life balance and generous paid time off

Our Commitment
At the core of Trace3's DNA is our people. We are a diverse group of talented individuals who understand the importance of teamwork and demonstrating leadership, character, and passion in all that we do. We're committed to fostering an inclusive workplace where everyone feels respected, valued, and empowered to grow. We recognize that embracing diversity drives innovation, improves outcomes, fosters collaboration, boosts teammate satisfaction, and builds a more inclusive culture. As an equal opportunity employer, Trace3 bases all employment decisions on individual qualifications, merit, and business requirements. We do not discriminate on the basis of race, color, religion, sex (including gender identity, sexual orientation, and pregnancy), national origin, age (40 or older), disability, genetic information, or any other characteristic protected by federal, state, or local law.
Any demographic information provided is strictly voluntary, kept confidential in accordance with Equal Employment Opportunity (EEO) regulations, and will not be used in employment decisions, including hiring, promotions, or mentorship programs. We are committed to providing equal employment opportunities for all. If you require a reasonable accommodation to complete the application process or participate in an interview, please email recruiting@trace3.com. To all recruitment agencies: Trace3 does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Trace3 employees, or any other company location. Trace3 is not responsible for any fees related to unsolicited resumes/CVs.

Posted 3 weeks ago

Trace3, Reno, NV
Who is Trace3? Trace3 is a leading Transformative IT Authority, providing unique technology solutions and consulting services to our clients. Equipped with elite engineering and dynamic innovation, we empower IT executives and their organizations to achieve competitive advantage through a process of Integrate, Automate, Innovate. Our culture at Trace3 embodies the spirit of a startup with the advantage of a scalable business. Employees can grow their career and have fun while doing it! Trace3 is headquartered in Irvine, California. We employ more than 1,200 people all over the United States. Our major field office locations include Denver, Indianapolis, Grand Rapids, Lexington, Los Angeles, Louisville, Texas, and San Francisco. Ready to discover the possibilities that live in technology? Come Join Us! Street-Smart - Thriving in Dynamic Times We are flexible and resilient in a fast-changing environment. We continuously innovate and drive constructive change while keeping a focus on the “big picture.” We exercise sound business judgment in making high-quality decisions in a timely and cost-effective manner. We are highly creative and can dig deep within ourselves to find positive solutions to different problems. Juice - The “Stuff” it takes to be a Needle Mover We get things done and drive results. We lead without a title, empowering others through a can-do attitude. We look forward to the goal, mentally mapping out every checkpoint on the pathway to success, and visualizing what the final destination looks and feels like. Teamwork - Humble, Hungry and Smart We are humble individuals who understand how our job impacts the company's mission. We treat others with respect, admit mistakes, give credit where it’s due and demonstrate transparency. We “bring the weather” by exhibiting positive leadership and solution-focused thinking. We hug people in their trials, struggles, and failures, not just their successes. We appreciate the individuality of the people around us. 
About the Role: Under the general direction of the Lead, Data Center Services, the Data Center Technician will physically support and provide remote-hand services to customers. This is a Contractor position with a 100% Onsite requirement What You’ll Do: Data Center Infrastructure & Cabling: 30% Support the day-to-day operations of the data center and execute various projects. Perform analysis and consulting of customers' IT infrastructure hardware in production; perform a walkthrough with the customer to determine the ability to deliver the project. Install and manage structured cabling infrastructure projects. Elicit Data Center infrastructure requirements from clients. Install, terminate, test, and label all cross-connects (copper and fiber) in the data center in accordance with standard operating procedures. Perform preventative maintenance and repairs by troubleshooting cabling system issues. Troubleshoot and address physical issues. Create straight-through, crossover, and console cables. Install or move shelves, power strips, rails, cable management, servers, switches, and other equipment as required. Compute/Network/Storage support: 30% Identify and report problem devices. Replace defective parts as directed. Visually inspect equipment for errors or unusual noises. Assist in performing infrastructure readiness tests. Configure OOB (out-of-band) management interfaces on various devices for remote access (compute, storage, and networking). Test and validate remote accessibility to the customer computer, server, and/or network system. Monitor and perform ongoing maintenance on servers and network equipment. Upgrade internal system components, including CPUs, memory, hard drives, and network cables. Create and maintain service desk documentation. Asset Management: 20% Provide a plan, unpack received items, identify parts, and dispose of all non-essential items. Must report results and give recommendations. 
Ensure that power tests and reports on all incoming products are conducted before they are moved to the data center floor, per practices. Digitally scan all incoming paperwork, transmit results, and proactively provide recommendations on gaps. Input all device locations and cable connections into a customer-defined database. Record all received products into a customer-defined database (including asset tag information, product manufacturer, model, and serial number). Consult and conduct inventory control. Compare received products to the packing slip and report discrepancies. Identify customer-defined spare parts and keep them in an inventory-controlled area. Recommend changes. Other: 20% Maintain a high level of customer satisfaction. Review work orders proactively and as post-work quality assurance to ensure requests are handled appropriately. Actively participate during the service transition phase to perform knowledge transfer and documentation. Develop documentation to maintain accurate records of customer computer, server, and network systems. Shipping/loading dock management (report new items received). Accept, fulfill, and report status via the ticketing system, time reports, and email processes. Escort vendors and repair technicians as requested by the customer. Perform crisis management in critical situations or during major service outages. Communicate with stakeholders according to SLAs (service level agreements). Perform root cause analysis. Follow the Company's best practices and Standard Operating Procedures. Mentor lower-level Data Center Technicians. May perform other duties as assigned by supervisor. Qualifications and Interests A minimum of one to three years’ experience in a technical or data center support role. High School diploma/GED required. CompTIA A+, BICSI, or similar certification is preferred. Good understanding of the OSI model. Good command of written and spoken English. Excellent customer service skills. 
Proficiency in Microsoft Office (Word, Excel, PowerPoint, etc.). Must have strong attention to detail. Participation in an on-call rotation for after-hours support. Actual salary will be based on a variety of factors, including location, experience, skill set, performance, licensure and certification, and business needs. The range for this position in other geographic locations may differ. Certain positions may also be eligible for variable incentive compensation, such as bonuses or commissions, that is not included in the base salary. Estimated Pay Range $25 — $30 USD The Perks Comprehensive medical, dental and vision plans for you and your dependents 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability Competitive Compensation Training and development programs Major offices stocked with snacks and beverages Collaborative and cool culture Work-life balance and generous paid time off Our Commitment At the core of Trace3's DNA is our people. We are a diverse group of talented individuals who understand the importance of teamwork and demonstrating leadership, character, and passion in all that we do. We’re committed to fostering an inclusive workplace where everyone feels respected, valued, and empowered to grow. We recognize that embracing diversity drives innovation, improves outcomes, fosters collaboration, boosts teammate satisfaction, and builds a more inclusive culture. As an equal opportunity employer, Trace3 bases all employment decisions on individual qualifications, merit, and business requirements. We do not discriminate on the basis of race, color, religion, sex (including gender identity, sexual orientation, and pregnancy), national origin, age (40 or older), disability, genetic information, or any other characteristic protected by federal, state, or local law. 
Any demographic information provided is strictly voluntary, kept confidential in accordance with Equal Employment Opportunity (EEO) regulations, and will not be used in employment decisions, including hiring, promotions, or mentorship programs. We are committed to providing equal employment opportunities for all. If you require a reasonable accommodation to complete the application process or participate in an interview, please email recruiting@trace3.com . To all recruitment agencies: Trace3 does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Trace3 employees or any other company location. Trace3 is not responsible for any fees related to unsolicited resumes/CVs.

Posted 2 weeks ago

GuidePoint Security, Springfield, VA
GuidePoint Security provides trusted cybersecurity expertise, solutions and services that help organizations make better decisions and minimize risk. By taking a three-tiered, holistic approach for evaluating security posture and ecosystems, GuidePoint enables some of the nation’s top organizations, such as Fortune 500 companies and U.S. government agencies, to identify threats, optimize resources and integrate best-fit solutions that mitigate risk. An active Top Secret/SCI clearance is required prior to consideration for this role. Work is 100% onsite. Work may be performed in either Springfield, VA or St. Louis, MO. A Cybersecurity Data Analysis Specialist is part of a team of skilled cybersecurity professionals that supports the design, build, and sustainment of the organization's Enterprise Audit capability. Cybersecurity Data Analysis Specialists interact daily with a wide variety of industry-leading audit technologies including, but not limited to, an enterprise Security Information and Event Management (SIEM) capability, a long-term analytics platform, and a log aggregation platform. Cybersecurity Data Analysis Specialists provide support to fellow stakeholders when onboarding new and existing Information Technology resources into the organization's Enterprise Audit capability, and maintain all production Enterprise Audit solutions that are servicing the organization. When required, Cybersecurity Data Analysis Specialists support fellow stakeholders in resolving operational audit issues. You'll Bring These Qualifications: Strong understanding of Elastic SIEM and security technologies Related technical experience in security technologies such as Splunk, ArcSight and/or Kibana TS/SCI Clearance (MUST have at least a TOP SECRET for consideration) DOD 8570 IAT Level II certification. 
(Security+ minimum to start) A Counterintelligence Polygraph is preferred; candidates will be required to pass a CI polygraph within 3 months of starting. We use Greenhouse Software as our applicant tracking system and Zoom Scheduler for HR screen request scheduling. At times, your email may block our communication with you. Please be sure to check your SPAM folder so that you don't miss updates on your application. Why GuidePoint? GuidePoint Security is a rapidly growing, profitable, privately-held value-added reseller that focuses exclusively on Information Security. Since its inception in 2011, GuidePoint has grown to over 1000 employees, established strategic partnerships with leading security vendors, and serves as a trusted advisor to more than 4,200 customers. Firmly-defined core values drive all aspects of the business, which have been paramount to the company’s success and establishment of an enjoyable workplace atmosphere. At GuidePoint, your colleagues are knowledgeable, skilled, and experienced and will seek to collaborate and provide mentorship and guidance at every opportunity. This is a unique and rare opportunity to grow your career along with one of the fastest growing companies in the nation. Some added perks… Remote workforce primarily (U.S. 
based only; some travel may be required for certain positions, and working on-site may be required for Federal positions) Group Medical Insurance options: Zero Deductible PPO Plan (GuidePoint pays 90% of the premium for employees and 70% for family plans covering spouse/children/family) or High Deductible Health Plan with HSA (GuidePoint pays 100% of employee premiums and 75% for family plans, and GPS will contribute one lump sum of $500 per employee annually / $1,000 per family annually, including spouse/children/family options) Group Dental Insurance: GuidePoint pays 100% of the premium for employees and 75% of family plans 12 corporate holidays and a Flexible Time Off (FTO) program Healthy mobile phone and home internet allowance Eligibility for retirement plan after 2 months at open enrollment Pet Benefit Option  

Posted 30+ days ago

CG Infinity, Plano, TX
Job Title: Senior Data Consultant Location: Plano, TX Employment Type: Full-time Experience Level: Senior (10+ years) About the Role: We are looking for a highly experienced Data Consultant with a strong background in delivering enterprise-scale data engineering solutions. This role is ideal for a hands-on leader who can architect and implement modern data platforms while also engaging with clients to drive business value. You will work across the full data lifecycle, from strategy and architecture to implementation and optimization, leveraging cutting-edge tools and frameworks. Key Responsibilities: Lead the design and delivery of end-to-end data engineering solutions for large-scale enterprise projects. Architect and implement modern data platforms using tools such as Databricks, Snowflake, Azure Synapse Analytics, and Azure Data Factory. Build robust, scalable data pipelines using Apache Spark, dbt, Delta Lake, and SQL-based transformations. Collaborate with business and technical stakeholders to translate requirements into actionable data strategies. Provide technical leadership and mentorship to engineering teams, ensuring adherence to best practices in data architecture, governance, and DevOps. Participate in client-facing activities including solutioning, proposal development, and account growth initiatives. Required Qualifications: 10+ years of experience in data engineering and architecture, with a strong focus on cloud-native technologies. Proven experience delivering full-lifecycle data solutions in enterprise environments. Deep expertise in at least one of the following: Databricks (including Delta Lake and Spark); Snowflake (data modeling, performance tuning, security); Azure Data Factory (pipeline orchestration, integration); Azure Synapse Analytics or equivalent cloud data warehouses; dbt for transformation and modeling. Strong SQL and Python skills; familiarity with CI/CD practices for data pipelines. 
Experience leading technical teams and managing delivery in a consulting or enterprise setting. Strong passion for business development, client engagement, and account management. Preferred Qualifications: Prior experience in a consulting environment and/or client-facing roles. Familiarity with tools like Power BI, Looker, or Tableau for data visualization. Exposure to data governance frameworks and tools. Powered by JazzHR

Posted 30+ days ago

Addepar, Salt Lake City, UT
Who We Are Addepar is a global technology and data company that helps investment professionals provide the most informed, precise guidance for their clients. Hundreds of thousands of users have entrusted Addepar to empower smarter investment decisions and better advice over the last decade. With client presence in more than 50 countries, Addepar's platform aggregates portfolio, market and client data for over $8 trillion in assets. Addepar's open platform integrates with more than 100 software, data and services partners to deliver a complete solution for a wide range of firms and use cases. Addepar embraces a global flexible workforce model with offices in New York City, Salt Lake City, Chicago, London, Edinburgh, Pune and Dubai. The Role A Data Implementation Specialist (Data Solutions Consultant) is responsible for integrating clients’ portfolio data into Addepar and consulting clients on their most complex data challenges. As a Data Implementation Specialist, you will be responsible for executing on client deliverables in addition to continuous internal tooling and process improvements to help scale our growing business. The ideal candidate will have exceptional analytical and communication skills, thrive in a fast-paced environment, and bring a solutions-oriented approach to all problems they encounter. Addepar takes a market-based approach to pay. A successful candidate’s starting pay will be determined based on the role, job-related skills, experience, qualifications, work location, and market conditions. The range displayed on each job posting reflects the minimum and maximum target base salary for roles in Colorado, California, and New York. The current range for this role is $90,000 - $112,000 (base salary) + bonus + equity + benefits. Your recruiter can share more about the specific salary range for your preferred location during the hiring process. Additionally, these ranges reflect the base salary only, and do not include bonus, equity, or benefits. 
What You’ll Do Translate unique client requirements into flexible and scalable data solutions Lead data conversion projects with Addepar clients to ETL historical portfolio data from their legacy system into Addepar Prioritize and context-switch effectively to complete simultaneous projects, seeing each through to the finish line Identify and drive opportunities to improve our current processes and tools to better streamline, scale, and automate workflows Effectively set, lead, and communicate expectations both internally and externally Communicate with clients in a proactive, consultative, and professional manner Collaborate with internal Services, Sales, Product, and Engineering teams Who You Are Minimum 2+ years of experience working in technology, finance, or consulting Experience with Python programming language is a bonus but not a requirement Experience with financial products and securities modeling Solution-oriented mentality and passion for problem-solving Excellent communication, organizational, and time-management skills Strong work ethic, proactive, and a high contributing teammate Highly organized, close attention to detail, and driven to make processes more efficient. Independent, adaptable, and can thrive in a fast-paced environment Our Values Act Like an Owner - Think and operate with intention, purpose and care. Own outcomes. Build Together - Collaborate to unlock the best solutions. Deliver lasting value. Champion Our Clients - Exceed client expectations. Our clients’ success is our success. Drive Innovation - Be bold and unconstrained in problem solving. Transform the industry. Embrace Learning - Engage our community to broaden our perspective. Bring a growth mindset. In addition to our core values, Addepar is proud to be an equal opportunity employer. We seek to bring together diverse ideas, experiences, skill sets, perspectives, backgrounds and identities to drive innovative solutions. 
We commit to promoting a welcoming environment where inclusion and belonging are held as a shared responsibility. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. PHISHING SCAM WARNING: Addepar is among several companies recently made aware of a phishing scam involving con artists posing as hiring managers recruiting via email, text and social media. The imposters are creating misleading email accounts, conducting remote “interviews,” and making fake job offers in order to collect personal and financial information from unsuspecting individuals. Please be aware that no job offers will be made from Addepar without a formal interview process. Additionally, Addepar will not ask you to purchase equipment or supplies as part of your onboarding process. If you have any questions, please reach out to TAinfo@addepar.com .

Posted 30+ days ago

Addepar, Salt Lake City, UT

$80,000 - $121,000 / year

Who We Are Addepar is a global technology and data company that helps investment professionals provide the most informed, precise guidance for their clients. Hundreds of thousands of users have entrusted Addepar to empower smarter investment decisions and better advice over the last decade. With client presence in more than 50 countries, Addepar's platform aggregates portfolio, market and client data for over $8 trillion in assets. Addepar's open platform integrates with more than 100 software, data and services partners to deliver a complete solution for a wide range of firms and use cases. Addepar embraces a global flexible workforce model with offices in New York City, Salt Lake City, Chicago, London, Edinburgh, Pune, Dubai, and Geneva. The Role A Portfolio Data Consultant (Portfolio Performance Analyst) leads the end-to-end data validation process with clients for data conversion projects. The ideal candidate will have a strong understanding of financial portfolio data, a motivation to work on data problems, outstanding communication skills, and the ability to deliver results in adherence to project deadlines while meeting high-quality standards. They are passionate about understanding our client's needs, taking a hands-on approach to solving problems, working collaboratively with internal teams, and taking ownership of our client's success. Addepar takes a market-based approach to pay. A successful candidate’s starting pay will be determined based on the role, job-related skills, experience, qualifications, work location, and market conditions. The range displayed on each job posting reflects the minimum and maximum target base salary for roles in Colorado, California, and New York. The current range for this role is $80,000 - $100,000 (base salary) + bonus + equity + benefits. Your recruiter can share more about the specific salary range for your preferred location during the hiring process. 
Additionally, these ranges reflect the base salary only, and do not include bonus, equity, or benefits. What You’ll Do Lead the data validation process end-to-end with the client and internal project teams Hold recurring training and working sessions with the clients throughout the data validation project until completion Support clients with researching and identifying portfolio data performance discrepancies and make recommendations on how to best fix the data issues Review, compare, and document differences in system calculation methodologies Coordinate across project teams, communicating regular status updates for assigned data projects while effectively setting expectations Raise key issues to project team members and senior leadership Prioritize and context-switch effectively to complete simultaneous projects, seeing each through to the finish line Identify and drive opportunities to improve current processes, workflows, and tools to increase efficiency and automation Who You Are B.S. in Finance, Mathematics, Statistics, Business, or Economics 2+ years of experience working in Finance and Technology Understands financial markets and has experience with financial products and portfolio data Positive attitude, strong work ethic, proactive, and a high-contributing teammate Independent, adaptable, and can thrive in a fast-paced environment Excellent communication, organizational, and time-management skills Experience programming in Python is a plus Our Values Act Like an Owner - Think and operate with intention, purpose and care. Own outcomes. Build Together - Collaborate to unlock the best solutions. Deliver lasting value. Champion Our Clients - Exceed client expectations. Our clients’ success is our success. Drive Innovation - Be bold and unconstrained in problem solving. Transform the industry. Embrace Learning - Engage our community to broaden our perspective. Bring a growth mindset. 
In addition to our core values, Addepar is proud to be an equal opportunity employer. We seek to bring together diverse ideas, experiences, skill sets, perspectives, backgrounds and identities to drive innovative solutions. We commit to promoting a welcoming environment where inclusion and belonging are held as a shared responsibility. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. PHISHING SCAM WARNING: Addepar is among several companies recently made aware of a phishing scam involving con artists posing as hiring managers recruiting via email, text and social media. The imposters are creating misleading email accounts, conducting remote “interviews,” and making fake job offers in order to collect personal and financial information from unsuspecting individuals. Please be aware that no job offers will be made from Addepar without a formal interview process. Additionally, Addepar will not ask you to purchase equipment or supplies as part of your onboarding process. If you have any questions, please reach out to TAinfo@addepar.com .

Posted 30+ days ago


Senior Software Engineer, Data Platform

IUNU, Seattle, WA


Job Description

At IUNU (“you knew”), our mission is to deliver confidence at scale to the commercial greenhouse industry. We built LUNA, a computer vision platform that autonomously tracks plant development to turn visual data into high-value decisions. Deployed in over 15 countries, our technology empowers growers with critical insights like yield forecasting and pest detection to drive operational efficiency and reduce waste. We are looking for a Senior Engineer who is passionate about applying their technical expertise to solve real-world problems and build a more sustainable future for agriculture.

About the Role:

This role is a core systems engineering position for a builder who wants to solve complex challenges at the intersection of distributed systems and horticultural science. You will leverage your experience with data, algorithms, graphs, mathematics, and software engineering to enhance and extend the core of our LUNA system with robust, scalable systems. Working closely with our computer vision team and staff horticulturalists, you’ll transform the image and sensor data we gather to generate unique insights for growers.

Responsibilities:
  • Design and implement the core distributed data processing engine that powers IUNU’s platform, moving beyond simple aggregation to handle large-scale datasets with high dimensionality.
  • Optimize for performance & scale by implementing advanced concurrency patterns and algorithmic techniques to maximize throughput across our distributed compute environment.
  • Design deterministic, event-driven workflows that guarantee data integrity and exactly-once processing, handling backpressure and late-arriving data in a non-deterministic physical world.
  • Drive the technical direction of the team by championing rigorous design reviews, observability best practices, and fault-tolerant architecture that balances speed of execution with long-term system stability.
  • Partner with the product and computer vision teams to translate abstract horticultural requirements into concrete, scalable technical solutions that directly impact yield forecasting and operational efficiency for growers.
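As a purely illustrative sketch (this is not IUNU's code; every name below is hypothetical), the exactly-once and backpressure ideas in the responsibilities above often reduce to two building blocks: a bounded queue, which makes producers block when consumers fall behind, and idempotent handling keyed on an event ID, which makes at-least-once delivery safe to retry:

```python
import queue
import threading

class IdempotentProcessor:
    """Toy event processor: a bounded queue for backpressure plus
    ID-based deduplication, so redelivered events have no effect."""

    def __init__(self, maxsize=100):
        # Bounded queue: submit() blocks once maxsize is reached,
        # propagating backpressure to upstream producers.
        self.events = queue.Queue(maxsize=maxsize)
        self.seen_ids = set()
        self.results = []
        self.lock = threading.Lock()

    def submit(self, event_id, payload):
        self.events.put((event_id, payload))  # blocks when full

    def process_one(self):
        event_id, payload = self.events.get()
        with self.lock:
            if event_id in self.seen_ids:
                return  # duplicate delivery: already applied, skip
            self.seen_ids.add(event_id)
        self.results.append(payload)

p = IdempotentProcessor()
p.submit("evt-1", {"zone": "A", "count": 42})
p.submit("evt-1", {"zone": "A", "count": 42})  # retried duplicate
p.process_one()
p.process_one()
print(len(p.results))  # → 1: the duplicate was absorbed
```

Production systems typically persist the dedup keys in durable storage (for example, a unique constraint in PostgreSQL) so the guarantee survives restarts, but the shape of the logic is the same.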
Requirements:
  • 5+ years of professional software engineering experience.
  • Expert-level knowledge of Python, including modern language features, performance optimization, concurrency primitives (threading, multiprocessing), and best practices in production-grade code.
  • Advanced mastery of relational database internals, specifically PostgreSQL. Candidates must demonstrate proficiency in query optimization, relational algebra, and the distinct challenges of time-series data storage.
  • Proven hands-on experience designing, building, and operating systems that process and aggregate large datasets, with expertise in distributed data processing frameworks and efficient aggregation pipelines.
  • Deep understanding of algorithms and data structures, with the ability to analyze time/space complexity, select optimal solutions for real-world problems, and implement efficient algorithmic logic.
  • Proven track record of designing, implementing, and productionizing high-performance algorithms that operate reliably at scale in distributed environments.
  • Solid theoretical and practical knowledge of graph theory, including traversal algorithms (DFS, BFS), shortest-path algorithms, topological sorting, cycle detection, centrality measures, and experience applying graph algorithms to real systems (e.g., dependency resolution, social networks, recommendation engines, or knowledge graphs).
  • Comprehensive understanding of data pipeline dynamics, including scheduling strategies, event-driven vs. time-based triggering, deterministic execution guarantees, idempotency, exactly-once/late-data handling, and backpressure management.
  • Hands-on experience with production orchestration platforms such as Kubernetes (including operators, CRDs, and Helm), Argo Workflows, Airflow, Prefect, Dagster, Temporal, or equivalent frameworks, with emphasis on reliability, observability, and scaling of complex workflows.
  • Strong grasp of OS-level concurrency mechanisms (mutexes, semaphores, condition variables, read-write locks, atomic operations) and practical experience implementing correct, high-performance multi-threaded or multi-process systems in Python and/or lower-level languages.
  • High ownership mindset, bias for action, and ability to thrive in complex problem spaces.
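To make the graph-theory requirement above concrete, here is a minimal Python sketch (illustrative only; the task names are invented) of Kahn's algorithm for topological sorting with cycle detection, the standard way to order dependent tasks in a data pipeline:

```python
from collections import deque

def topological_sort(graph):
    """Return a topological order of the DAG `graph`
    (node -> list of dependents); raise ValueError on a cycle."""
    indegree = {node: 0 for node in graph}
    for node in graph:
        for dep in graph[node]:
            indegree[dep] = indegree.get(dep, 0) + 1
    ready = deque(n for n, d in indegree.items() if d == 0)
    order = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for dep in graph.get(node, ()):
            indegree[dep] -= 1
            if indegree[dep] == 0:
                ready.append(dep)
    if len(order) != len(indegree):
        raise ValueError("cycle detected")  # leftover nodes sit on a cycle
    return order

# Hypothetical pipeline stages: edges point from a task to its dependents.
tasks = {"ingest": ["normalize"], "normalize": ["aggregate"], "aggregate": []}
print(topological_sort(tasks))  # → ['ingest', 'normalize', 'aggregate']
```

Python 3.9+ ships the same idea in the standard library as `graphlib.TopologicalSorter`, which raises `CycleError` instead of `ValueError`.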
Desired:
  • Hands-on experience with the Google Cloud Platform ecosystem.
  • Advanced knowledge of PostgreSQL and TimescaleDB.
  • Strong mathematical or statistical programming skills within the Python ecosystem.
  • Data engineering experience including ETL/ELT, AI/ML, and LLM pipelines.
Diversity:

At IUNU, we’re committed to providing a safe and inclusive environment. We are dedicated to the happiness and success of all of our employees, and strive to foster a workplace in which individual differences are recognized, appreciated, nurtured, and respected. Diversity is important, and we strongly encourage people of all identities and backgrounds to confidently apply for a job with us if this is a role that interests and excites you. We want you to feel comfortable bringing your whole self to work with you, with all of your talents and strengths.

