TechOps-DE-CloudOps (5+)
EY | 1 day ago | Kochi

Your key responsibilities

  • Lead the design, monitoring, and optimization of AWS-based data pipelines using services like AWS Glue, EMR, Lambda, and Amazon S3.
  • Oversee and enhance complex ETL workflows involving IICS (Informatica Intelligent Cloud Services), Databricks, and native AWS tools.
  • Collaborate with data engineering and analytics teams to streamline ingestion into Amazon Redshift and lead data validation strategies.
  • Manage job orchestration using Apache Airflow, AWS Data Pipeline, or equivalent tools, ensuring SLA adherence (a minimal orchestration sketch follows this list).
  • Guide SQL query optimization across Redshift and other AWS databases for analytics and operational use cases.
  • Perform root cause analysis of critical failures, mentor junior staff on best practices, and implement preventive measures.
  • Lead deployment activities through robust CI/CD pipelines, applying DevOps principles and automation.
  • Own the creation and governance of SOPs, runbooks, and technical documentation for data operations.
  • Partner with vendors, security, and infrastructure teams to ensure compliance, scalability, and cost-effective architecture.
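
For illustration only, a minimal sketch of the Airflow-based orchestration of an AWS Glue job referred to in the responsibilities above. The DAG id, Glue job name, schedule, and SLA values are hypothetical, and the sketch assumes Airflow 2.4+ with boto3 credentials and region configured on the worker; it shows the shape of the pattern, not a prescribed implementation.

```python
# Minimal sketch: orchestrate a (hypothetical) AWS Glue job from Airflow with an SLA.
# Assumes Airflow 2.4+ and boto3 credentials/region available on the worker.
import time
from datetime import datetime, timedelta

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

GLUE_JOB_NAME = "daily_sales_etl"  # hypothetical Glue job name


def run_glue_job(**_):
    """Start the Glue job and poll until it finishes, failing the task on error."""
    glue = boto3.client("glue")
    run_id = glue.start_job_run(JobName=GLUE_JOB_NAME)["JobRunId"]
    while True:
        state = glue.get_job_run(JobName=GLUE_JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
            break
        time.sleep(30)  # poll every 30 seconds
    if state != "SUCCEEDED":
        raise RuntimeError(f"Glue job {GLUE_JOB_NAME} ended in state {state}")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",            # nightly at 02:00
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_glue_etl",
        python_callable=run_glue_job,
        sla=timedelta(hours=2),      # surface SLA misses to the on-call rotation
    )
```

In practice the polling callable would often be replaced by the Glue operators and sensors from the Amazon provider package; the point here is simply an SLA-aware orchestration task around a managed ETL job.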

Skills and attributes for success

  • Expertise in AWS data services and ability to lead architectural discussions.
  • Analytical thinker with the ability to design and optimize end-to-end data workflows.
  • Excellent debugging and incident resolution skills in large-scale data environments.
  • Strong leadership and mentoring capabilities, with clear communication across business and technical teams.
  • A growth mindset with a passion for building reliable, scalable data systems.
  • Proven ability to manage priorities and navigate ambiguity in a fast-paced environment.

To qualify for the role, you must have

  • 5–8 years of experience in DataOps, Data Engineering, or related roles.
  • Strong hands-on expertise in Databricks.
  • Deep understanding of ETL pipelines and modern data integration patterns.
  • Proven experience with Amazon S3, EMR, Glue, Lambda, and Amazon Redshift in production environments.
  • Experience with Airflow or AWS Data Pipeline for orchestration and scheduling.
  • Advanced knowledge of IICS or similar ETL tools for data transformation and automation.
  • Strong SQL skills, with emphasis on performance tuning, complex joins, and window functions (a brief sketch follows this list).
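
Also for illustration, a small sketch of the window-function style of SQL referenced in the last bullet, run against Redshift from Python. The schema, table, columns, and connection settings are hypothetical, and the sketch assumes the redshift_connector package with credentials supplied through environment variables.

```python
# Sketch: run a window-function query against Redshift (hypothetical table/columns).
# Assumes the redshift_connector package and credentials supplied via environment variables.
import os

import redshift_connector

QUERY = """
SELECT
    customer_id,
    order_date,
    order_total,
    -- running spend per customer, a typical window-function pattern
    SUM(order_total) OVER (
        PARTITION BY customer_id
        ORDER BY order_date
        ROWS UNBOUNDED PRECEDING
    ) AS running_spend,
    -- rank each customer's orders by value
    RANK() OVER (PARTITION BY customer_id ORDER BY order_total DESC) AS order_rank
FROM sales.orders                                -- hypothetical schema.table
WHERE order_date >= DATEADD(day, -30, CURRENT_DATE)
"""


def main() -> None:
    conn = redshift_connector.connect(
        host=os.environ["REDSHIFT_HOST"],
        database=os.environ.get("REDSHIFT_DB", "analytics"),
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    cur = conn.cursor()
    cur.execute(QUERY)
    for row in cur.fetchmany(10):   # preview a handful of rows
        print(row)
    conn.close()


if __name__ == "__main__":
    main()
```

The running total and per-customer ranking are the kinds of queries the tuning work described above would target, alongside distribution and sort-key choices on the underlying tables.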
