This position has been filled
Summary
Epic is seeking a Data Engineer to design, build, and optimize data pipelines for their digital reading platform, ensuring data quality, security, and performance across their cloud-based systems serving millions of users.
Job Description
Fast Facts
Epic is seeking a Data Engineer to design and optimize data pipelines for their digital reading platform for children, ensuring data quality and security.
Responsibilities: Develop ETL/ELT pipelines, enhance cloud data storage, implement data quality checks, and collaborate with teams to deliver data solutions.
Skills: Strong experience in data engineering, ETL/ELT principles, data storage solutions, cloud data platforms, and workflow orchestration tools.
Qualifications: 5+ years of relevant experience, including cloud-native data technologies; familiarity with infrastructure tools is a plus.
Location: San Jose, CA
Compensation: $90,000 - $130,000 annually
Epic is the leading digital reading platform for kids, used by millions of children, families, and educators around the world. With a vast library of high-quality books and learning resources, we empower students to explore their interests, build literacy skills, and develop a lifelong love of reading. As we look to the future, Epic is reimagining what reading can be—more personalized, more interactive, and more accessible than ever before. We’re combining technology, storytelling, and education to shape the next generation of readers.
About the Job
As a Data Engineer at Epic, you will work closely with our development, infrastructure, and data teams to design, build, and optimize data pipelines, ensuring data quality and security while delivering effective data solutions across the organization.
Key Responsibilities:
- Develop robust ETL/ELT pipelines to extract, transform, and load data from diverse sources into our data warehouse.
- Enhance and maintain our cloud-based data storage and processing systems for performance, reliability, and cost-efficiency.
- Implement rigorous data quality checks, monitoring, and security measures across all data assets.
- Proactively identify and address data inconsistencies and bottlenecks, continuously refining our data infrastructure for robust, high-performing data solutions.
- Partner with data analysts and non-technical teams to understand data requirements and shape the development of effective data products.
Job Qualifications:
- 5+ years of experience in data engineering, with a strong grasp of data warehousing, ETL/ELT principles, and data modeling.
- Experience with data storage solutions (e.g., relational databases, data lakes), cloud data platforms (e.g., GCP, AWS), and cloud-native data technologies (e.g., BigQuery, Snowflake).
- Experience with workflow orchestration tools (e.g., Airflow).
- Experience with infrastructure tools (e.g., Terraform, Kubernetes, Docker) is a plus.
Salary Range: $90K to $130K

