Remote job description
Confluent is pioneering a fundamentally new category of data infrastructure focused on data in motion. Have you ever found a new favorite series on Netflix, picked up groceries curbside at Walmart, or paid for something using Square? That's the power of data in motion in action: giving organizations instant access to the massive amounts of data constantly flowing throughout their business. At Confluent, we're building the foundational platform for this new paradigm of data infrastructure. Our cloud-native offering is designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly stream across the organization. With Confluent, organizations can create a central nervous system to innovate and win in a digital-first world.
We're looking for self-motivated team members who crave a challenge and feel energized to roll up their sleeves and help realize Confluent's enormous potential. Chart your own path and take healthy risks as we solve big problems together. We value having diverse teams and want you to grow as we grow: whether you're just starting out in your career or managing a large team, you'll be amazed at the magnitude of your impact.
Consulting Engineers drive customer success by helping them realize business value from the burgeoning flow of real-time data streams in their organizations. In this role you'll interact directly with our customers to provide software development and operations expertise, leveraging deep knowledge of best practices in the use of Apache Kafka, the broader Confluent Platform, and complementary systems like Hadoop, Spark, Storm, relational databases, and various NoSQL databases.
Throughout all of these interactions, you'll build strong relationships with customers, ensure exemplary delivery standards, and have a lot of fun building state-of-the-art streaming data infrastructure alongside colleagues who are widely recognized as leaders in this space.
Promoting Confluent and our amazing team to the community and wider public audience is something we invite all our employees to take part in. This can take the form of writing blog posts, speaking at meetups and well-known industry events about use cases and best practices, or something as simple as releasing code.
What the role entails:
- Preparing for an upcoming engagement, discussing the goals and expectations with the customer and preparing an agenda
- Researching best practices or components required for the engagement
- Delivering an engagement on-site, working with the customer's architects and developers in a workshop environment
- Producing and delivering the post-engagement report to the customer
- Developing applications on the Confluent Platform
- Deploying, augmenting, and upgrading Kafka clusters
- Building tooling for other teams and the wider company
- Testing performance and functionality of new components developed by Engineering
- Writing or editing documentation and knowledge base articles
- Honing your skills, building applications, or trying out new product features
What gives you an edge:
- Deep experience building and operating in-production Big Data, stream processing, and/or enterprise data integration solutions using Apache Kafka
- Experience operating Linux (configure, tune, and troubleshoot both RedHat and Debian-based distributions)
- Experience with Java Virtual Machine (JVM) tuning and troubleshooting
- Experience with distributed systems (Kafka, Hadoop, Cassandra, etc.)
- Proficiency in Java
- Excellent communication skills, with an ability to clearly and concisely explain tricky issues and complex solutions
- Ability and willingness to travel up to 50% of the time to meet with customers
- Bachelor-level degree in computer science, engineering, mathematics, or another quantitative field
Nice to have:
- Experience using Amazon Web Services, Azure, and/or GCP for running high-throughput systems
- Experience helping customers build Apache Kafka solutions alongside Hadoop technologies, relational and NoSQL databases, message queues, and related products
- Experience with Python, Scala, or Go
- Experience with configuration and management tools such as Ansible, Terraform, Puppet, Chef
- Experience writing to network-based APIs (preferably REST/JSON or XML/SOAP)
- Knowledge of enterprise security practices and solutions, such as LDAP and/or Kerberos
- Experience working with a commercial team and demonstrated business acumen
- Experience working in a fast-paced technology start-up
- Experience managing projects, using any known methodology to scope, manage, and deliver on plan no matter the complexity
Come As You Are
At Confluent, equality is a core tenet of our culture. We are committed to building an inclusive global team that represents a variety of backgrounds, perspectives, beliefs, and experiences. The more diverse we are, the richer our community and the broader our impact. Employment decisions are made on the basis of job-related criteria without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or any other classification protected by applicable law.
Confluent requires all employees (in office and remote) in the U.S. to be vaccinated for COVID-19. Consistent with federal, state, and local requirements, Confluent will consider requests for reasonable accommodation based on medical conditions/contraindications or sincerely held religious beliefs where it is able to do so without undue hardship to the company.
Click here to review our California Candidate Privacy Notice, which describes how and when Confluent, Inc. and its group companies collect, use, and share certain personal information of California job applicants and prospective employees.
Company name: Confluent
Remote job title: Consulting Engineer
Job tags: open-source, big data, infrastructure