DevOps Engineer – Senior (10+ years)
EY | 16 days ago | Noida

Job Description 

 

  • Develop and maintain scalable data pipelines using tools like Apache Kafka, Apache Spark, or AWS Glue to ingest, process, and transform large datasets from various sources, ensuring efficient data flow and processing.
  • Design and implement data models and schemas in data warehouses (e.g., Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake) to support analytics and reporting needs.
  • Collaborate with data scientists and analysts to understand data requirements, ensuring data availability and accessibility for analytics, machine learning, and reporting.
  • Utilize ETL tools and frameworks (e.g., Apache NiFi, Talend, or custom Python scripts) to automate data extraction, transformation, and loading processes, ensuring data quality and integrity.
  • Monitor and optimize data pipeline performance using tools like Apache Airflow or AWS Step Functions, implementing best practices for data processing and workflow management.
  • Write, test, and maintain scripts in Python, SQL, or Bash for data processing, automation tasks, and data validation, ensuring high code quality and performance.
  • Implement CI/CD practices for data engineering workflows using tools like Jenkins, GitLab CI, or Azure DevOps, automating the deployment of data pipelines and infrastructure changes.
  • Collaborate with DevOps teams to integrate data solutions into existing infrastructure, leveraging Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation for provisioning and managing resources.
  • Manage containerized data applications using Docker and orchestrate them with Kubernetes, ensuring scalability and reliability of data processing applications.
  • Implement monitoring and logging solutions using tools like Prometheus, Grafana, or ELK Stack to track data pipeline performance, troubleshoot issues, and ensure data quality.
  • Ensure compliance with data governance, security best practices, and data privacy regulations, embedding DevSecOps principles in data workflows.
  • Participate in code reviews and contribute to the development of best practices for data engineering, data quality, and DevOps methodologies.
  • Mentor junior data engineers, providing guidance on data engineering practices, data architecture, and DevOps tools and techniques.
  • Contribute to the documentation of data architecture, processes, and workflows for knowledge sharing, compliance, and onboarding purposes.
  • Demonstrate strong communication skills to collaborate effectively with cross-functional teams, including data science, analytics, and business stakeholders.

 

 

Desired Profile

 

  • Seeking a DevOps Engineer with 5+ years of hands-on Cloud and DevOps experience, including significant leadership experience. A Bachelor's or Master's degree in Computer Science is required.
  • Must have expert proficiency in Terraform and extensive experience across at least two major cloud platforms (AWS, Azure, GCP). Strong hands-on experience with Kubernetes, Helm charts, and designing and optimizing CI/CD pipelines (e.g., Jenkins, GitLab CI) is essential, as is proficiency in Python and scripting (Bash/PowerShell).
  • Valued experience includes leading cloud migrations, contributing to RFP/RFI processes, and mentoring teams. Excellent problem-solving, communication, and collaboration skills are critical. Experience with configuration management tools (Ansible, Puppet) and DevSecOps principles is required; OpenShift experience is a plus.

 

 

Experience

 

  • 10 years and above