- At least 3-4 years of experience in software development;
- Experience with Scala, or Java with a strong desire to learn Scala;
- Experience with Hadoop, Spark;
- Experience with streaming algorithms and practical analysis of real-time data streams. Big data/scaling experience;
- B.Sc. in Computer Science or related field;
- Excellent communication skills and the ability to work directly with native English speakers.
Will be a plus:
- Experience with Flink, Spark Streaming, Kafka Streams, or any other event stream processing engine;
- Experience with Python;
- Experience with Aerospike;
- Experience with AWS.
We offer:
- Flexible working;
- Sharing culture;
- Benefits package.
What you will be doing:
- Developing optimal, well-monitored, reliable ETL pipelines for streaming data flows;
- Working with cutting-edge technologies in a fast-paced, start-up-like environment;
- Experimenting with new tools and technologies, producing POCs to address business needs;
- Carrying out efficient integration with our data providers via various API endpoints for real-time ingestion and batch data in the cloud.