The candidate will be responsible for working with a portfolio of internal users to gather requirements, design and propose enhancement solutions, and deliver the approved solutions in conformance with the bank's technology architecture guidelines, technology information security standards and IT management processes.
• Expert skills in SQL, Scala and Python.
• Good experience in Data Management and ETL.
• Experience with Cloudera Distribution of Hadoop System.
• Proficient understanding of distributed computing principles.
• Management of Hadoop cluster, with all included services.
• Experience with Oracle Big Data Appliance and ODI is an added advantage.
• Proficiency with Hadoop v2, MapReduce and HDFS.
• Good knowledge of Big Data querying tools, such as Pig, Hive and Impala.
• Experience with integration of data from multiple data sources.
• Experience with NoSQL databases, such as HBase, Cassandra and MongoDB.
• Knowledge of various ETL techniques and frameworks, such as Flume.
• Experience with building stream-processing systems, using solutions such as Spark Streaming, Kafka, Storm or Kinesis.
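For illustration only (not part of the role description): the MapReduce model referenced above can be sketched in a few lines of plain Python, with no Hadoop dependency. This toy word count shows the three phases a candidate would be expected to understand: a map step that emits key/value pairs, a shuffle that groups values by key, and a reduce step that aggregates each group.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all values under their key, as the framework
    # would do between the map and reduce stages.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate the grouped values for each key.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data big cluster", "data pipeline"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'cluster': 1, 'pipeline': 1}
```

In a real Hadoop v2 deployment the shuffle is handled by the framework and the map/reduce functions run distributed across the cluster; the logic per phase is the same.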