We’re looking for a candidate to fill this position at an exciting company.
Responsibilities: Develop security features across different data platforms with the Big Data Analytics Technology Engineering team.
Provide engineering solutions and frameworks to support security and access across different environments and large-scale, data-driven business activities
Ensure the quality and performance of product features
Automate everything to eliminate toil
Prepare and produce releases of software components
Experiment with and assess new approaches to enabling data access in various environments
Collaborate with remote teams
Minimum qualifications: Bachelor’s degree in Software Engineering or Computer Science
Expertise in GCP; GCP certification required
4-8 years of proven software development experience in Big Data full-stack engineering
Experience collaborating with cross-functional teams to define, design, and ship new features
Experience designing and building applications using Java and related technologies on GCP
Solid hands-on experience with Java/J2EE and with frameworks and platforms such as Spring and Kubernetes
Knowledge of build tools such as Maven and Gradle, and of a DevOps environment using tools such as Git (Bitbucket), continuous integration (Jenkins), and continuous deployment (Fastlane)
Ability to code with security and data protection in mind
Experience in problem solving and root-cause analysis of errors throughout Software Development Life Cycle (SDLC) implementation, and in providing audit and compliance support when required
Experience in automated testing is an advantage
Experience with code quality tools such as Sonar, Fortify, or NexusIQ is advantageous
Tech stack: GCP, Hadoop, Spark, Hive, Java, Spring, Spring Boot, Spring Web Services, Spring Cloud, microservices, Kafka, Jenkins, Git, Maven, Kubernetes, OpenShift, AWS, and Linux
Professional, adaptable, and innovative, with a focus on delivery
Self-motivated team player who demonstrates initiative and flexibility