Senior Backend Developer at Aggero, Warsaw
We are Aggero, founded to solve the problem of poor matching between brands and creators, which leads to poor ROI and high variance in the esports and gaming marketing space. We believe in the power of numbers and the power of scale, and we follow a rigorous empirical data philosophy that leverages huge datasets and clever modelling techniques to answer the gaming industry's problems.
Join our mission to empower the growth of the esports and gaming industry, working with leading agencies, brands, esports teams, and events to unlock the untapped opportunities of the new live streaming economy.
At Aggero, we’re all relaxed, passionate about gaming, and dedicated to solving challenging problems with cutting-edge data science techniques.
What you’ll be doing:
You’ll be a core member of the development team and will improve our current infrastructure to gather, analyse and present data to our customers.
You’ll be in charge of scraping and gathering data from APIs, analysing and storing it efficiently.
You’ll focus on building scalable, robust information gathering and storage systems.
You’ll build clever programming tools to work around API rate limits and make scraping as fast as possible.
You’ll introduce new database functionality to improve efficiency and reduce data costs.
You’ll solve problems creatively by providing curated subsets of data from our silos that open up new insights.
You’ll be writing SQL queries that perform interesting analysis and data cleaning.
You’ll have a great time with the laid-back team at Aggero, with plenty of opportunity for play as well as work!
What we’re looking for:
Very comfortable with Python 3+ or Ruby 2.3+.
Experience with data processing technologies such as Spark, Kafka, Storm, Faktory, or Sidekiq.
Experience with NoSQL (e.g. Cassandra, MongoDB, CouchDB, Firebase) and SQL databases (e.g. PostgreSQL).
Experience with time-series databases (e.g. InfluxDB).
Comfortable designing and implementing distributed systems in highly available environments, ideally in a SaaS or commodity web setting.
Experience with Docker, Docker Compose, or Kubernetes.
Proven track record of working with terabyte- or petabyte-scale data infrastructure.
Machine learning experience (e.g. Keras, TensorFlow, Caffe).
Passionate about working with high volumes of heterogeneous data and distributed systems.
Experience working with varied data applications and databases, such as Hadoop, Druid, Spark, or Redshift.
Expert knowledge of SQL, BigQuery or MapReduce.
Enjoy the social aspect of working in teams and goal-oriented environments.
Experience building products from scratch at an early-stage startup (seed or Series A).
Passion for data and data science.
Experience with Azure or AWS.
Experience with Terraform.