We're looking for a Data Engineer to architect and scale the data backbone that powers our AI-driven donor engagement platform. You'll design and own modern, cloud-native data pipelines and infrastructure that deliver clean, trusted, and timely data to our ML and product teams.
Requirements
- US Citizenship
- Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field
- 2+ years of hands-on experience building and maintaining modern data pipelines using Python-based ETL/ELT frameworks
- Strong Python skills, including deep familiarity with pandas and comfort writing production-grade code for data transformation
- Fluent in SQL, with a practical understanding of data modeling, query optimization, and warehouse performance trade-offs
- Experience orchestrating data workflows using modern orchestration frameworks (e.g., Dagster, Airflow, or Prefect)
- Cloud proficiency (AWS preferred): S3, Glue, Redshift or Snowflake, Lambda, Step Functions, or similar services on other clouds
- Proven track record of building performant ETL/ELT pipelines from scratch and optimizing them for cost and scalability
- Experience with distributed computing and containerized environments (Docker, ECS/EKS)
- Solid data modeling and database design skills across SQL and NoSQL systems
- Strong communication and collaboration skills within cross-functional, Agile teams
Benefits
- Generous Paid Time Off
- 401(k) Matching
- Retirement Plan
- Visa Sponsorship
- Four Day Work Week
- Generous Parental Leave
- Tuition Reimbursement
- Relocation Assistance