Job Description
Responsibilities:
- Building and optimizing data pipelines, architectures, and data sets
- Database optimization and query profiling
- Creation, optimization, and maintenance of Airflow DAGs
- Ingestion of data from external APIs
- Platform stability monitoring and improvement
- Documentation of technical processes
- Advanced data modeling using dbt
- Collaboration with the DevOps team to maintain the services the team owns and uses
Requirements:
- Advanced working knowledge of SQL, experience with relational databases and query authoring, and familiarity with a variety of database systems
- Experience building and optimizing data pipelines, architectures, and data models
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytical skills for working with unstructured datasets
- 3+ years of experience in a Data Engineer role
- Proficiency in English and Russian, written and verbal
Experience using the following software/tools:
- Relational databases, including PostgreSQL and MySQL
- Data pipeline and workflow management tools such as Airflow 2+
- Python 3
- Docker
Nice to have:
- dbt (data build tool)
- Kafka + Debezium
- Grafana
- Tableau
- MLOps
- CI/CD
- AWS
- Vertica