Job description
- Combining raw data from different data sources
- Building and maintaining ETL/ELT data pipelines for the enterprise data warehouse
- Cleansing and preparing data for analysis
- Assisting in optimizing and fine-tuning databases for efficiency and performance
- Overseeing data quality and the efficiency of data usage
- Developing and maintaining datasets
- Assisting in defining and documenting ETL/ELT rules
Qualifications
- Good knowledge of Python and related libraries
- Good knowledge of Apache Airflow and Spark
- Good understanding of ETL/ELT processes and Data Warehousing concepts
- Advanced SQL skills for working with PostgreSQL, Oracle DB, and MPP databases (modifying production data; creating and optimizing stored procedures, functions, triggers, views, etc.)
- Bash scripting and Git
Nice to have
- Experience with Docker
- Experience working with APIs and Web Services
- Proficiency in English