Data Engineer
Skills and Responsibilities:
· Experience using public cloud services.
· GCP certification (Associate Cloud Engineer) is required.
· Experience using Infrastructure as Code tooling like Terraform.
· Experience with the Talend ETL tool.
· Experience with DevOps tooling like GitHub, GitHub Actions, Cloud Build, and Terraform Cloud is a plus.
· Proficiency in languages like Python, Go, or Java is crucial.
· You’ll be automating tasks, creating scripts, and developing infrastructure as code.
· Experience with Docker is a must, including setting up and managing Docker registries and writing Dockerfiles to build custom images.
· Experience setting up Kubernetes or a similar platform on-premises or in the cloud (on-prem Rancher experience is a plus).
· General understanding of continuous integration/continuous deployment (CI/CD) pipelines and how to optimize them for faster software delivery.
· Knowledge of overlay networking required for inter-container communication across nodes and with external servers/infrastructure.
· Experience building CI/CD pipelines using tools like GitHub and Artifactory to reduce cycle times and ensure quality.
· Experience automating system deployments and configuration management using tools like Ansible, Chef, Puppet, Terraform, or SaltStack.
· Strong scripting skills (e.g. shell scripting, Python, Perl, Ansible) for automation.
· Working experience with source control systems like Git.
· Exposure to working on cloud platforms like Azure, GCP, or AWS.
· Experience with Agile/Scrum development methodologies.
· Team player with effective communication skills (verbal and written).
· Able to see tasks through to completion without significant guidance.
· Self-managed and results-oriented with a sense of ownership is required.