Responsibilities
- Design and implement ETL/ELT pipelines to process retail transaction data
- Build and maintain data warehouses using our self-hosted infrastructure
- Develop automated monitoring systems and data quality checks
- Build and implement new solutions that improve workflows and efficiency across analytics and business intelligence functions
- Collaborate with data analysts and business stakeholders to understand data requirements for reporting and analytics
- Optimize database performance and query efficiency for large-scale datasets
- Create and maintain documentation for data processes and systems
- Support daily data processing workflows for customer analytics
- Identify the root cause of data issues and implement solutions to resolve them
- Help build data products that give FMCG companies insight into market penetration, demand patterns, and distribution
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field
- Strong programming experience in Python
- 1-3 years of experience with SQL and database management
- Understanding of data modeling concepts
- Experience with version control (Git)
- Strong problem-solving and analytical thinking skills
- Detail-oriented and eager to build from the ground up
- Sense of ownership
- Good command of English
- Experience with retail/e-commerce data
- Experience with data modeling, data warehousing, and building ETL pipelines
- Knowledge of data pipeline tools (Prefect, dbt, Airflow)
- Familiarity with data visualization tools (Tableau, Superset, Power BI)
- Familiarity with cloud platforms (AWS, Azure, or GCP)
- Fresh graduates are welcome