Minimum qualifications:
Bachelor's degree in Computer Science, Engineering, Mathematics, a related field, or equivalent practical experience.
Experience developing and troubleshooting data processing algorithms and software using Python, Java, Scala, Spark, and Hadoop frameworks.
Experience with data processing frameworks on Google Cloud Platform, including analytical and transactional data stores such as BigQuery, Cloud SQL, and AlloyDB.
Preferred qualifications:
Experience in Big Data, information retrieval, data mining, or Machine Learning.
Experience building applications with modern technologies such as NoSQL databases (e.g., MongoDB), SparkML, and TensorFlow.
Experience architecting and developing software for internet-scale, production-grade Big Data solutions in virtualized environments.
Experience with Infrastructure as Code (IaC) and CI/CD tools such as Terraform, Ansible, and Jenkins.
Experience with encryption techniques such as symmetric encryption, asymmetric encryption, Hardware Security Modules (HSMs), and envelope encryption, with the ability to implement secure key storage using a Key Management Service (KMS).
Experience working with data warehouses, including technical architectures, infrastructure components, ETL/ELT and reporting tools, environments, and data structures.