Key Responsibilities:
Design, develop, and maintain ETL/ELT pipelines
Build and optimize data models for data warehouses and data marts
Develop scalable batch and real-time data processing systems
Integrate data from multiple sources (APIs, databases, files, streaming systems)
Ensure data quality, governance, and security standards
Optimize SQL queries and database performance
Collaborate with BI and analytics teams on reporting requirements
Implement monitoring and logging for data workflows
Required Technical Skills:
Experience with ETL tools
Strong SQL and database knowledge (Snowflake, SQL Server, etc.)
Programming skills in Python / Scala
Experience with cloud platforms: AWS / Azure / GCP
Knowledge of data warehousing concepts (Star Schema, Snowflake Schema, Data Vault)
Familiarity with orchestration tools (Flow Manager, Control-M, etc.)
Advanced SQL skills and database expertise, including Oracle