
Data Engineering Manager

About Starry:

Starry is reinventing how people connect to and experience the internet. Our mission focuses on two things: first, being an internet service provider committed to simplicity, transparency, and delight; and second, providing high-speed internet to underserved communities locally, nationally, and globally. We approach our mission with cutting-edge wireless technology, a user experience designed to delight, and a diverse and intellectually curious company culture.

 

Why you'll love working here:

Starry is a fast-growing company with incredible ambition to build new markets and new products and services. At Starry, autonomy and creativity are rewarded; you’ll have control of your own time and the opportunity to develop your ideas and initiatives. The team is tightly knit, highly collaborative, and very driven.

 

Who we’re looking for:

The data team is responsible for maintaining our internal analytics platform, which is used by many departments at Starry. We’re looking for a data engineering manager to join us as we continue to build out our analytics platform, tooling, and pipelines. You’ll work with siloed business units to automate business processes and data pipelines while managing a small centralized data engineering team. We are looking for a candidate who relishes autonomy and the design process.

 

What you’ll do:

  • Design data architecture and patterns for analytics and data science
  • Build tools to help democratize data across the company
  • Architect and implement a streaming data platform and processing framework
  • Develop generalized solutions to common analytics problems
  • Translate stakeholder requirements into tickets
  • Research and recommend new technologies to help scale our data infrastructure

 

Qualifications:

  • 5+ years of data engineering experience
  • 1+ years of management experience
  • Mastery of Python and SQL
  • Experience with transactional databases like PostgreSQL
  • Experience with analytics databases like Snowflake
  • Experience with batch ETL
  • Experience with distributed compute
  • Experience with orchestration tools like Airflow or Luigi
  • Experience with, or knowledge of, streaming or near-real-time patterns
  • Experience working with container frameworks like Docker
  • Experience with Git and CI/CD within the context of data engineering
  • Understanding of the data science workflow

 

Bonus points if you have:

  • Experience writing batch ETL in Spark
  • Experience building streaming or near-real-time frameworks on Kinesis or Kafka
  • Experience working with data science teams
  • Experience managing an AWS account

  

We work hard, so we take care of each other and try to enjoy ourselves along the way. We have:

  • Premium medical, dental, and vision coverage with no employee contribution required
  • 12 weeks of paid parental leave
  • Catered meals on a weekly basis
  • Groups for skiing, biking, running, climbing, stretching, shuffleboard, darts, and more