Job Description

The Senior Data Engineer, Streaming, will support and enhance our Kafka- and Kinesis-based streaming data collection platforms. The ideal candidate has hands-on experience with Kafka (open source or Confluent) and AWS Kinesis, proven experience working on cloud platforms, and deep familiarity with AWS. A strong working knowledge of Kafka components (Topics, Producers/Consumers, KStream, KTable, KSQL) is a basic prerequisite, and hands-on experience with Kinesis components (Streams, Firehose, and Data Analytics) is strongly desired. The candidate should have experience working on streaming systems that handle very high data volume and velocity, a good understanding of streaming best practices, and experience with AWS, CI/CD tools, and infrastructure as code.

Responsibilities

Work as part of the Data Engineering team, which manages all data feeds and collects analytical streaming data from all our client software. This is a very fast-paced environment dealing with large volumes of high-velocity data. You must have strong analytical and problem-solving skills, as well as strong organizational and prioritization skills.
Experience

- 7 years of experience in data engineering
- Demonstrated understanding of and experience with streaming data
- Demonstrated expert-level understanding of Kafka
- Demonstrated ownership and ability to manage multiple projects
- Demonstrated strong time-management, prioritization, and organizational skills
- Demonstrated strong troubleshooting and problem-solving skills
- Strong interpersonal skills, with the ability to cultivate relationships and negotiate with internal clients
- Ability to meet deadlines and partner timeline expectations
- Proven strong written and oral communication skills
- Bachelor's degree or equivalent

Qualities/Experience

- Strong knowledge of Java and Scala
- Strong knowledge of Kafka (Topics, Producers/Consumers, KStream, KTable, KSQL)
- Knowledge of AWS Kinesis (Streams, Firehose, and Data Analytics)
- Knowledge of SQL and familiarity with modern cloud databases such as Snowflake
- Solid experience with the following technology stacks:
  - Frameworks: Spring Framework, Apache Flink, JUnit, Mockito
  - Cloud technologies: AWS (VPC, Kubernetes, EC2, Lambda, CloudWatch, etc.)
  - Deployment tools: Jenkins, Docker, Helm, Terraform
  - Monitoring tools: Grafana, Prometheus
  - CI/CD tools: GitHub Actions
- Strong analytical and troubleshooting skills
- At least 7 years of hands-on experience with data streaming systems
- Effective, clear communication of open project risks and dependencies with the data team and other internal teams