Auto-apply to these jobs in Illinois
We've scanned millions of jobs. Simply select your favorites, and we can fill out the applications for you.
Sr. Data Engineer
Posted 30+ days ago

Business Analyst
Posted 1 week ago

Software Engineer
Posted 3 weeks ago

Director of Construction
Posted 3 weeks ago

Implementation Consultant
Posted 30+ days ago

Director of Housekeeping - Hyatt Regency Schaumburg, IL
Posted 3 weeks ago

Class A OTR Driver
$25+ / project
Posted 30+ days ago

Product Assembler - Paid Weekly!
Posted 4 weeks ago

OTR Non-CDL 24–26ft Box Truck Owner Operators (Must Have Own Truck)
$5,000 - $7,000 / week
Posted 30+ days ago

Assistant Community Manager - Charleston, IL
Posted 30+ days ago

Real Estate Showing Agent (Remote)
$34+ / project
Posted 30+ days ago

Class A Driver (Home Daily)
$1,100 - $1,500 / week
Posted 30+ days ago

Class A CDL Academy
$300+ / week
Posted 4 weeks ago

Retail Sales Lead
Posted 30+ days ago

Lease Purchase Truck Driver Opportunity
$1,700 - $2,000 / week
Posted 30+ days ago

Virtual Data Analysis Intern (Work-at-Home)
$250 - $3,000 / project
Posted 30+ days ago

Busser
Posted 1 day ago

Over-the-Road Truck Driver – Earn $1,500–$2,200/Week – 3,000+ Miles/Week
$1,500 - $2,200 / week
Posted 30+ days ago

Deck and Patio Remodeling Contractor CHI
Posted 30+ days ago

CLASS A OTR REEFER DRIVER - 1099 - NO TOUCH - CHICAGO
$1,500 - $2,300 / week
Posted 30+ days ago

Sr. Data Engineer
Automate your job search with Sonara.
Submit 10x as many applications with less effort than one manual application.
Reclaim your time by letting our AI handle the grunt work of job searching.
We continuously scan millions of openings to find your top matches.

Job Description
WHAT YOU’LL DO:
- Demonstrate understanding and awareness of the critical role terminology data plays in IMO’s products – use this to consistently inform your work
- Update, analyze, fix, enhance, and build IMO products through direct interaction with code and data
- Assemble, analyze, and interpret large and complex data sets using both technical skills and a solid understanding of IMO’s terminology data
- Construct infrastructure for optimal ETL of data from varied sources using SQL and AWS ‘big data’ technologies
- Identify and implement improvements to automate processes, optimize data delivery and performance, implement orchestration frameworks, and redesign data pipeline infrastructure for scalability and reliability (a minimal orchestration sketch follows this list)
- Design data platform components for bulk, transactional, and streaming access
- Create and maintain optimal data pipeline architecture
- Support application-specific availability, scalability, and monitoring of resources and costs
- Develop and document quality source code
- Maintain and improve database schema and data models
- Promote data quality awareness and execute data quality management procedures
- Work cooperatively within an Agile Scrum team to manage conflict and foster trust, commitment, and accountability
- Take ownership, be proactive, and anticipate impacts to take appropriate action
- Implement creative solutions to technical challenges and apply knowledge and learning from various disciplines
- Collaborate cross-functionally in a dynamic and agile environment to translate needs into requirements, assist with data/infrastructure, and partner on the creation of innovative products
- Seek out industry best practices and continuously develop new skills
- Make data-driven decisions
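To ground the orchestration and pipeline bullets above, here is a minimal sketch of a daily ETL job written as an Apache Airflow DAG (Airflow is one of the workflow tools named under "What You'll Need"). The DAG id, schedule, and task bodies are illustrative assumptions, not IMO's actual pipeline.

    # Minimal daily ETL DAG sketch; assumes Apache Airflow 2.4+ (for the
    # `schedule` argument). All names here are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull the day's source records for this run's logical date.
        print(f"extracting records for {context['ds']}")


    def transform(**context):
        # Placeholder: validate, deduplicate, and map records to the target model.
        print("transforming records")


    def load(**context):
        # Placeholder: bulk-load the transformed records into the warehouse.
        print("loading records")


    with DAG(
        dag_id="terminology_etl",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Linear dependency chain: extract, then transform, then load.
        extract_task >> transform_task >> load_task

In a real deployment the task bodies would call out to SQL and AWS services (e.g., S3, EMR, RDS) rather than print, and the DAG would carry retries and alerting.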
WHAT YOU’LL NEED:
- Relevant technical BA/BS degree and five years of experience, OR seven years of relevant professional experience
- Ability to build end-to-end data platforms and collaborate on architecting sound solutions
- Experienced developer in multiple languages, including object-oriented/functional scripting languages (Python); able to train up on additional languages as needed
- Hands-on experience with big data tools (e.g., Spark, Kafka); familiarity with building and optimizing complex data pipelines and architectures
- Proficient in AWS services (EC2, EMR, RDS)
- Strong SQL knowledge, with experience in complex query authoring, relational databases (PostgreSQL), and NoSQL databases (DynamoDB, MongoDB, Elasticsearch)
- Strong analytical, troubleshooting, and problem-solving skills
- Experienced in data modeling and logical/physical database design
- Comfortable working with large, disconnected datasets and building processes that support data transformation, structures, and metadata
- Familiar with agile development and CI/CD processes using tools such as Git and Terraform
- Experience with markup languages such as XML and HTML
- Comfortable performing root cause analyses to identify opportunities for improvement
- Familiarity with stream-processing systems (e.g., Storm, Spark-Streaming) and workflow management tools (e.g., Airflow, Luigi, Azkaban); a minimal streaming sketch follows this list
- Strong communication skills
- Enjoyment of challenges, eagerness to explore new approaches, and willingness to ask for help
- Interest and capacity to independently get up to speed with items listed under “Preferred Experience” below
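As a concrete illustration of the Spark, Kafka, and stream-processing items above, the following is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and maintains per-minute event counts. The broker address and topic name are hypothetical, the job needs the spark-sql-kafka connector package on its classpath, and the console sink stands in for a real target such as S3 or a warehouse table.

    # Minimal streaming sketch: Kafka -> windowed counts -> console.
    # Assumes PySpark with the spark-sql-kafka-0-10 connector available.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("terminology-stream").getOrCreate()

    # Read raw events from a Kafka topic as an unbounded streaming DataFrame.
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "terminology-updates")        # hypothetical topic
        .load()
    )

    # Kafka values arrive as bytes: decode them, then count events per payload
    # within tumbling one-minute windows keyed on the record timestamp.
    counts = (
        events.select(
            F.col("value").cast("string").alias("payload"),
            "timestamp",
        )
        .groupBy(F.window("timestamp", "1 minute"), "payload")
        .count()
    )

    # Emit the running aggregate to the console for inspection.
    query = counts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()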
PREFERRED EXPERIENCE:
- AWS Associate Certification – Data Engineer
- AWS Associate Certification – Solutions Architect
- Experience with ETL and BI tools (Talend, Tableau, Looker)
- Experience with data cataloging standards and building/maintaining them
- AWS Specialty Certification – Machine Learning
- AWS Foundational Certification – AI Practitioner
- Prior experience working with healthcare data
- Exposure to knowledge graph-related technologies and standards (Graph DB, OWL, SPARQL)
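For the knowledge-graph item, here is a minimal sketch using rdflib to run a SPARQL query over a toy in-memory graph. The namespace, triples, and labels are invented for illustration; a production system would query a dedicated graph database instead.

    # Minimal SPARQL sketch over an in-memory RDF graph, using rdflib.
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/terminology/")  # hypothetical namespace

    g = Graph()
    g.add((EX.hypertension, RDF.type, EX.Condition))
    g.add((EX.hypertension, EX.preferredLabel, Literal("Hypertension")))

    # Find every resource typed as a Condition, together with its label.
    results = g.query(
        """
        PREFIX ex: <http://example.org/terminology/>
        SELECT ?concept ?label
        WHERE {
            ?concept a ex:Condition ;
                     ex:preferredLabel ?label .
        }
        """
    )
    for concept, label in results:
        print(concept, label)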