Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : dbt (Data Build Tool)
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years of full-time education
Summary:
We are seeking a skilled Data Engineer with dbt (Data Build Tool) expertise to design, develop, and maintain robust data pipelines, tools, and infrastructure that support data collection, transformation, and analysis across the organization. This role involves working closely with data analysts, data scientists, and software engineers to ensure data availability, accuracy, and efficiency in our systems.
Key Responsibilities:
Design, build, and optimize scalable data pipelines for data ingestion, processing, and storage.
Develop and maintain data building tools to automate data extraction, transformation, and loading (ETL/ELT) processes.
Integrate data from multiple sources such as APIs, databases, and external platforms.
Ensure data quality, consistency, and reliability through validation and monitoring tools.
Collaborate with cross-functional teams to understand data needs and design efficient data models.
Work with cloud platforms (e.g., AWS, Azure, GCP) and big data tools (e.g., Spark, Databricks, Snowflake).
Monitor system performance and optimize data workflows for speed and cost-efficiency.
Document technical processes, data lineage, and architecture.
Stay updated with emerging technologies and best practices in data engineering and automation.
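To illustrate the kind of work described above, here is a minimal sketch of an extract-transform-load (ETL) step in Python. All names (`extract`, `transform`, `load`, the order fields, and the in-memory `warehouse` list standing in for a real target store) are illustrative assumptions, not part of any specific toolchain used on the team.

```python
def extract():
    # In a real pipeline this would query an API, database, or file drop;
    # here a hard-coded list stands in for the source system.
    return [
        {"order_id": 1, "amount": "19.99", "status": "shipped"},
        {"order_id": 2, "amount": "5.00", "status": "cancelled"},
        {"order_id": 3, "amount": "42.50", "status": "shipped"},
    ]

def transform(rows):
    # Cast amounts to numeric types and drop cancelled orders --
    # a simple example of a data-quality/validation rule.
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in rows
        if r["status"] != "cancelled"
    ]

def load(rows, target):
    # Append the validated rows to the target store and report row count,
    # which a scheduler (e.g., Airflow) could use for monitoring.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In production, each stage would typically be a separate, independently retryable task in an orchestrator such as Airflow, with the transform logic expressed as SQL models in dbt rather than inline Python.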
Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
Strong programming skills in Python, SQL, or Scala.
Experience with ETL tools (e.g., Airflow, Talend, dbt, Informatica).
Knowledge of data modeling and database management systems (e.g., PostgreSQL, MySQL, Redshift, BigQuery).
Experience with data APIs, JSON, and RESTful services.
Familiarity with cloud data ecosystems (AWS Glue, Azure Data Factory, Google Dataflow).
Understanding of version control (Git) and CI/CD pipelines.
Excellent problem-solving and analytical skills.
Preferred Qualifications:
Experience with machine learning data preparation pipelines.
Knowledge of data governance and security best practices.
Exposure to containerization (Docker, Kubernetes).
Experience with data visualization tools (Power BI, Tableau, Looker).
Additional Information:
- The candidate should have a minimum of 5 years of experience with dbt (Data Build Tool).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.