This position has been filled

This job is no longer accepting applications.

Summary

Sr. Data Engineer role at InStride focused on designing and maintaining a robust data platform for large enterprises. Provides hands-on technical leadership to engineering teams while contributing to scalable data infrastructure and architecture decisions.

Key Responsibilities: Develop new features as part of an agile team, provide hands-on technical leadership for data engineers, and contribute to architecture discussions. Collaborate cross-functionally with SRE, product, security, and business teams to build and maintain best-in-class data platforms and pipelines.
Skills & Tools: Strong expertise in data pipeline development (batch and streaming), Apache Spark, Airflow, dbt, Kafka, and AWS/Databricks cloud platforms. Proficiency with Python, excellent communication skills, proven cross-functional collaboration ability, and deep understanding of open-source technologies.
Qualifications: Over 5 years of progressive data engineering experience with at least 2 years of Python development and 2+ years leading or mentoring engineers. Experience with MongoDB and TypeScript is a plus; demonstrated success in agile environments and SaaS solutions for enterprise customers.
Location: Los Angeles, CA or Remote (United States)
Compensation: $120,000 – $180,000/year (typical range; not provided by the employer)

Job Description

Fast Facts

Join our team as a Sr. Data Engineer where you'll collaborate with engineers to design and maintain a robust data platform for large enterprises, focusing on scalable workflows and data infrastructure.

Responsibilities: Develop new features as part of an agile team, provide technical leadership, contribute to architecture decisions, and collaborate cross-functionally to enhance data solutions.

Skills: Strong expertise in data pipeline development, proficiency in Apache Spark, Airflow, and cloud platforms like AWS, with agile project delivery experience and cross-functional collaboration.

Qualifications: Over 5 years in data engineering, 2 years with Python, and experience in mentoring engineers; knowledge of MongoDB and TypeScript is a plus.

Location: Los Angeles, CA or Remote

Compensation: Not provided by employer; typical compensation for this position ranges between $120,000 and $180,000.

What we're looking for (role overview):

In this role, you will work with the Manager of Engineering and a diverse team of fellow engineers, contributing to the architecture, development, and sustainment of our platform and ecosystem. The ideal candidate has a deep background in building scalable, reliable data infrastructure and products for large enterprises, a proven track record of designing and optimizing data workflows and platforms, and a passion for enabling data-driven decision-making in support of our mission.

Skills we’d love to see you show off:

  • Data engineering expertise: Strong expertise in building batch and streaming data pipelines, designing and implementing ETL/ELT workflows, and ensuring high-quality, reliable data movement across systems. Proficiency with tools like Apache Spark, Airflow, dbt, Kafka, and similar.
  • Cloud services: Proficiency with cloud data platforms and services, especially AWS and Databricks, to build scalable and reliable systems.
  • Agile methodologies: Extensive experience in delivering projects using agile methodologies, ensuring efficient and effective project execution.
  • Cross-functional collaboration: Proven ability to work closely with SRE, release engineering, security, product, UX, project management, business development, customer success, growth marketing, and executive teams.
  • SaaS solutions: Expertise in delivering SaaS solutions for large enterprise customers, ensuring performance and scalability.
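The pipeline expertise above can be sketched in miniature as a plain-Python extract-transform-load step; tools like Spark, Airflow, and dbt orchestrate this same shape at much larger scale. The CSV fields, function names, and sample data here are illustrative assumptions, not details from the posting:

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dicts (a DataFrame read in Spark)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and drop invalid records (dbt would express this as SQL)."""
    clean = []
    for row in rows:
        try:
            clean.append({"user_id": int(row["user_id"]), "spend": float(row["spend"])})
        except (KeyError, ValueError):
            continue  # quarantine bad rows rather than failing the whole batch
    return clean


def load(rows: list[dict], sink: list) -> None:
    """Load: append validated rows to the destination (a warehouse table in production)."""
    sink.extend(rows)


raw = "user_id,spend\n1,19.99\n2,not_a_number\n3,5.00\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)  # row 2 is dropped by the type checks
```

In a streaming setup (e.g. Kafka consumers), the same transform logic would run per message instead of per batch, which is why keeping it a pure function pays off.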

Who you are (ideal profile):

  • You have over 5 years of progressive experience in data engineering and data infrastructure roles, delivering robust and scalable data solutions for high-growth businesses.
• You have spent at least 2 years working with Python for data workflows and have at least 2 years of experience leveraging open-source data tools and frameworks.
  • You have at least 2 years of experience leading or mentoring other engineers, showcasing your ability to guide teams towards achieving technical excellence.
  • Experience working with MongoDB and TypeScript is a plus. 
• With experience in agile methodologies, you have proven success in cross-functional team environments. You are an excellent communicator, deliver projects on time, demonstrate high levels of emotional intelligence, and thrive when working with large, diverse teams. Experience with product, user experience, and project management teams is a plus.
  • You have a deep understanding of open-source technologies and cloud services, enhancing the scalability, reliability, and innovation of platforms.
• You are passionate about helping our customers enable economic mobility for their employees by establishing equitable access to education for our partners' employees.

How you will create impact (key responsibilities):

  • Work as part of an agile team to develop new features and functionality.
  • Provide hands-on technical leadership for a team of data engineers to build, deliver, and maintain a best-in-class data platform and pipelines.
  • Demonstrate technical expertise by contributing to architecture discussions and decisions around data platforms, tooling, and workflows.
• Drive best practices on schema design, data modeling, and performance optimization.
  • Collaborate with cross-functional teams to achieve high-quality, elegant solutions.
  • Constantly apply best practices and provide recommendations for continuous improvement.
  • Write well-designed, testable, and efficient code.
  • Protect operations by keeping information and data confidential.
  • Proactively communicate and manage expectations with your leader and team.
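The schema-design and testable-code responsibilities above can be illustrated with a minimal row-level schema check. The table fields are hypothetical; in practice this kind of validation often lives in dbt tests or a dedicated data-quality framework rather than hand-rolled code:

```python
# Hypothetical schema for one pipeline table; field names are illustrative only.
SCHEMA = {"event_id": int, "event_type": str, "payload_bytes": int}


def validate_row(row: dict) -> list[str]:
    """Return schema violations for one row; an empty list means the row conforms."""
    errors = []
    for field, expected in SCHEMA.items():
        if field not in row:
            errors.append(f"missing field: {field}")
        elif not isinstance(row[field], expected):
            errors.append(
                f"{field}: expected {expected.__name__}, got {type(row[field]).__name__}"
            )
    return errors
```

Because the check returns structured errors instead of raising, a pipeline can route failing rows to a quarantine table and keep the batch moving, which matches the reliability emphasis throughout this posting.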