Founded in 2005 by supply chain risk management expert Jennifer Bisceglie, we have grown into the largest and most influential player in the emerging operational resilience space. This past July we secured $100 million in Series C funding led by the investment firm NightDragon, with additional participation from Series A and B investors Kleiner Perkins and Venrock. This latest round of funding brings Interos’ valuation to $1 billion, earning us unicorn status.
Based in Arlington, Va., Interos has built a breakthrough SaaS platform that leverages artificial intelligence and machine learning to model the entire business ecosystems of companies into a living global map. Users can drill down to any single supplier, anywhere – helping businesses and government organizations reduce risk, avoid operational disruptions, and achieve dramatically superior resilience and performance. We have seen hypergrowth in the past three years and continue to expand our team across the US and several countries in Europe.
We are looking for expert technical leadership to help us design, build, and iterate on our next-generation platform. As someone who has successfully implemented a significant Kafka-based machine learning or data analytics pipeline, you can help us make the right choices and avoid mistakes as we build a scalable, reliable, and fully automated supply chain risk management system for our exponentially growing customer base. You will work with a collaborative team of highly skilled, constantly learning software engineers and data scientists who value good ideas and the ability to create quality solutions.
Primary Purpose of the Position:
The Senior Data Engineer, Kafka will guide and vet our design decisions around Kafka and our next-gen architecture. You will help us optimize our Kafka usage and implementation and mentor other engineers on best practices around Kafka and event streaming.
Responsibilities:
- Kafka topic topology design.
- Kafka message schema evolution.
- Write Kafka producers and consumers in Python.
- Guide data and machine learning engineers on best practices regarding Kafka.
- Design and build near real-time data analytics pipelines.
- Pitch in wherever needed to meet team goals and deliver a quality product.
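Two of the responsibilities above – writing Kafka producers and consumers in Python and managing message schema evolution – hinge on keeping payloads backward compatible as schemas change. Below is a minimal, broker-free sketch of that compatibility contract using plain JSON; all names (the topic, event fields, and v1/v2 versions) are illustrative, and a production pipeline would typically use a Schema Registry-backed format such as Avro rather than hand-rolled JSON.

```python
import json
from typing import Optional

# Hypothetical v1/v2 schemas for messages on a "supplier-events" topic.
# v2 adds an optional "region" field; consumers written against v1
# payloads must keep working (backward-compatible evolution).

def encode_supplier_event_v2(supplier_id: str,
                             risk_score: float,
                             region: Optional[str] = None) -> bytes:
    """Serialize a v2 event as the bytes a Kafka producer would send."""
    event = {"schema_version": 2,
             "supplier_id": supplier_id,
             "risk_score": risk_score}
    if region is not None:
        event["region"] = region
    return json.dumps(event).encode("utf-8")

def decode_supplier_event(payload: bytes) -> dict:
    """Consumer-side decode tolerant of both v1 and v2 payloads:
    unknown fields are preserved, and missing v2 fields get defaults."""
    event = json.loads(payload.decode("utf-8"))
    event.setdefault("region", "UNKNOWN")  # default for pre-v2 messages
    return event

# An old v1 message (no "region") still decodes cleanly on the new consumer.
v1_payload = json.dumps({"schema_version": 1,
                         "supplier_id": "S-42",
                         "risk_score": 0.7}).encode("utf-8")
assert decode_supplier_event(v1_payload)["region"] == "UNKNOWN"
assert decode_supplier_event(
    encode_supplier_event_v2("S-42", 0.7, "EMEA"))["region"] == "EMEA"
```

In a real deployment these bytes would flow through a producer/consumer client library and the compatibility rule (optional fields with defaults) would be enforced by the schema registry rather than by hand.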
Requirements:
- 5+ years of hands-on experience with Kafka cluster management (Apache Kafka or Confluent Platform).
- Broad experience with Kafka as a complete streaming solution, including its various components.
- Experience with both open-source and Confluent Kafka, including ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Confluent Control Center. Experience with Confluent Replicator is appreciated.
- Hands-on experience with Kafka topic sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACLs).
- Knowledge of the Kafka APIs.
- Experience with message serialization and storage formats such as Avro, Parquet, and Protocol Buffers.
- Strong Python coding skills.
- Ability to communicate clearly and willingness to share knowledge and collaborate.
- Minimum education: Bachelor’s degree in Computer Science or a related field, or equivalent experience.
- Experience in Java and an understanding of the differences between Java and Python.
- Experience with performance tuning for distributed processing.
- Strong SDLC experience, along with Agile methodologies and DevOps practices.
- Experience integrating with real-time systems, analytics, and data warehouses.
- Experience with PostgreSQL, SQL, and Snowflake.
- Experience with Spark and/or Databricks batch and streaming processing.
- Experience building microservices using frameworks such as FastAPI or gRPC.
- Comfortable working with Docker and Kubernetes.
- Supervisory Responsibility: This position has no supervisory responsibilities.
- Travel Requirements: This position may require up to 10% travel, typically outside the local area and overnight.
- Work Environment: This job operates in a professional office environment. This role routinely uses standard office equipment such as computers, phones, printers.
- Physical Demands: This is largely a sedentary role. Physical requirements include occasional lifting/carrying of 5 pounds; visual acuity, speech and hearing; hand and eye coordination and manual dexterity necessary to operate a computer keyboard and basic office equipment. Subject to sitting, standing, reaching, walking, twisting, and kneeling to perform the essential functions. Working conditions are primarily inside an office environment.
- FLSA Status: Exempt
Interos is required to comply with Federal Executive Order 14042 mandating employees be vaccinated against COVID-19. Accordingly, as of January 4, 2022, all Interos employees must be fully vaccinated* against COVID-19, unless you have a legally valid medical or religious reason. Any requests for exemptions based on medical or sincerely held religious beliefs will be evaluated confidentially by our Chief People Officer after hiring.
*Fully vaccinated means an individual must have received their second dose of the Moderna or Pfizer vaccine or the single dose of the Johnson & Johnson vaccine no later than 14 days before January 4, 2022.
Benefits:
- Comprehensive Health & Wellness package (Medical, Dental and Vision)
- 10 Paid Holiday Days Off
- Flexible Paid Time Off (FTO)
- 401(k) Employer Matching
- Stock Options
- Career advancement opportunities
- Casual Dress
- On-site gym and dedicated Peloton room at headquarters
- Company Events (Sports Games, Fitness Competitions, Birthday Celebrations, Contests, Happy Hours)
- Annual company party
- Employee Referral Program
Interos is proud to be an Equal Opportunity Employer and will consider all qualified applicants without regard to race, color, age, religion, sex, sexual orientation, gender identity, genetic information, national origin, disability, protected veteran status or any other classification protected by law.
If you are a candidate in need of assistance or an accommodation in the application process, please contact [email protected]