Notice period: immediate to 15 days
Hiring mode: Permanent
Job Summary
We are seeking a skilled professional with strong expertise in both data warehouse modeling and modern data transformation using dbt (data build tool). The ideal candidate will design scalable data models and implement efficient ETL/ELT pipelines to support analytics and reporting needs.
Key Responsibilities
- Design and implement scalable data warehouse models (star, snowflake schemas)
- Develop and maintain fact and dimension tables aligned with business requirements
- Build and manage data transformation pipelines using dbt
- Translate business requirements into well-structured, reusable data models
- Implement data validation, testing, and documentation within dbt
- Optimize SQL queries and transformations for performance and scalability
- Collaborate with stakeholders, data engineers, and analysts to gather data requirements
- Ensure adherence to data quality, consistency, and governance standards
- Integrate dbt workflows into CI/CD pipelines and version control systems
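To give candidates a concrete sense of the day-to-day work, the modeling and testing responsibilities above might look like the following minimal dbt sketch. The model and column names (`dim_customer`, `stg_customers`, `customer_id`) are hypothetical placeholders, not part of any specific project:

```sql
-- models/marts/dim_customer.sql (hypothetical dimension table)
-- Builds a customer dimension from a staging model via dbt's ref() macro.
with source as (
    select * from {{ ref('stg_customers') }}
),

renamed as (
    select
        customer_id,
        first_name,
        last_name,
        created_at
    from source
)

select * from renamed
```

Data validation and documentation (per the responsibilities above) live alongside the model in a YAML file, using dbt's built-in generic tests:

```yaml
# models/marts/schema.yml (hypothetical)
version: 2

models:
  - name: dim_customer
    description: "One row per customer, conformed for analytics."
    columns:
      - name: customer_id
        description: "Surrogate key for the customer dimension."
        tests:
          - unique
          - not_null
```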
Required Skills
- Strong experience in data warehouse design and dimensional modeling
- Hands-on expertise with dbt (data build tool)
- Advanced SQL skills for large-scale data transformation
- Experience working with cloud data warehouses (Snowflake, BigQuery, Redshift, or Azure Synapse)
- Solid understanding of ETL/ELT pipelines and modern data architecture
- Experience with Git and CI/CD practices
- Strong problem-solving and analytical skills
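The Git and CI/CD expectations above typically mean running dbt automatically on each pull request. A minimal sketch using GitHub Actions is shown below; the workflow name, target, and adapter choice (`dbt-snowflake`) are illustrative assumptions that would vary by warehouse and team setup:

```yaml
# .github/workflows/dbt_ci.yml (hypothetical)
name: dbt CI
on: pull_request

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Adapter depends on the warehouse (e.g. dbt-bigquery, dbt-redshift)
      - run: pip install dbt-snowflake
      # dbt build runs models and their tests in dependency order
      - run: dbt deps && dbt build --target ci
```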