Remote job description
About Apollo
Apollo.io combines a buyer database of over 250M contacts with powerful sales engagement and automation tools in one easy-to-use platform. Trusted by over 160,000 companies including Autodesk, Rippling, Deel, Jasper.ai, Divvy, and Heap, Apollo has more than one million users globally. By helping sales professionals find their ideal buyers and intelligently automate outreach, Apollo helps go-to-market teams sell anything.
In the last year, we've grown ARR 3x, quadrupled our active users, and closed a $110M Series C led by Sequoia Capital in March of 2022. This year, we continue to grow faster each month with record months of sales and added ARR. We hope you apply.
Working at Apollo
We are a remote-first inclusive organization focused on operational excellence. Our way of working ensures clear expectations and an environment to do your best work with ample reward.
Your Role & Mission
As a data engineer, you will be responsible for maintaining and operating the data warehouse and connecting Apollo's data sources to it.
Daily Adventures and Responsibilities
Develop and maintain scalable data pipelines and build new integrations to support continuing increases in data volume and complexity
Implement automated monitoring, alerting, and self-healing (restartable/graceful failure) features while building the consumption pipelines
Implement processes and systems to monitor data quality, ensuring production data is always accurate and available
Write unit/integration tests, contribute to the engineering wiki, and document your work
Define company data models and write jobs to populate data models in our data warehouse
Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture
Competencies
Excellent communication skills to work with engineering, product, and business owners to develop and define key business questions and build data sets that answer those questions
Self-motivated and self-directed
Inquisitive, able to ask questions and dig deeper
Organized and diligent, with great attention to detail
Acts with the utmost integrity
Genuinely curious and open; loves learning
Critical thinking and proven problem-solving skills required
Skills & Relevant Experience
Required:
Bachelor's degree in a quantitative field (Physical/Computer Science, Engineering, or Mathematics/Statistics)
Experience in data modeling, data warehousing, and building ETL pipelines
Deep knowledge of data warehousing with an ability to collaborate cross-functionally
Preferred:
4+ years of experience in data engineering or a similar role
Experience using the Python data stack
Experience deploying and managing data pipelines in the cloud
Experience working with technologies like Airflow, Hadoop, and Spark
Understanding of streaming technologies like Kafka and Spark Streaming
What You'll Love About Apollo
Besides the great compensation package and culture that thrives in openness and excellence, we invest tremendous effort into developing our remote employees' careers. The team embraces that we have a sole purpose: to help customers maximize their full revenue potential on the Apollo platform. This mindset opens us up to a lot of creative approaches to making customers successful at scale. You'll be a significant part of a lean, remote team, empowered to really own your role as a proactive educator. We're very collaborative at Apollo, so you'll be able to lean on your teammates, even in adjacent departments, to help you achieve lofty goals. You'll be supported and encouraged to experiment and take educated risks that lead to big wins. And, you'll have a whole team remotely by your side to help you do it!
Summary
Company name: Apollo
Remote job title: Senior Data Engineer
Job tags: Spark, streaming