Mandatory - Minimum 5 years of Java backend development. Java 8+, Spring, Spring Boot, Spring Cloud, REST services, microservices, NoSQL databases, Kafka/MQ, CI/CD, cloud, design patterns. Good to have - Basics of any JS-based UI development.
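The Java 8+ requirement above typically means fluency with lambdas, method references and the Stream API. A minimal, dependency-free sketch (class and method names are illustrative, not from the posting):

```java
import java.util.*;
import java.util.stream.*;

public class StreamDemo {
    // Filter out empty entries, upper-case the rest, and return them sorted --
    // the filter/map/collect idiom introduced in Java 8.
    static List<String> cleanAndSort(List<String> names) {
        return names.stream()
                .filter(n -> !n.isEmpty())
                .map(String::toUpperCase)
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(cleanAndSort(Arrays.asList("kafka", "", "spring")));
        // prints [KAFKA, SPRING]
    }
}
```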
4 to 7 years of Big Data experience, including building data processing applications using Hadoop, Spark, NoSQL databases and Hadoop streaming.
• Expertise in one or more programming languages such as Java, Scala or Python, and in Unix shell scripting.
• Expertise in query languages such as SQL, HiveQL and Spark SQL, and in data transfer tools such as Sqoop.
• Expertise in storage and processing optimization techniques in Hadoop and Spark.
• Experience in using tools such as Jenkins for CI and Git for version control.
• Exposure to Google Cloud Platform (GCP) data components such as Cloud Dataflow, Dataproc, BigQuery and Bigtable is preferred.
• Experience in using MicroStrategy and Power BI reporting tools is preferred.
• Strong problem-solving, communication and articulation skills
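The SQL/HiveQL/Spark SQL bullets above boil down to expressing aggregations such as GROUP BY. Since Hive and Spark need a cluster, here is a hedged plain-JDK stand-in showing the same aggregation logic in the Stream API (the `totalByRegion` name and sample data are invented for illustration):

```java
import java.util.*;
import java.util.stream.*;

public class GroupByDemo {
    // Plain-Java equivalent of:
    //   SELECT region, SUM(amount) FROM sales GROUP BY region
    static Map<String, Integer> totalByRegion(List<Map.Entry<String, Integer>> sales) {
        return sales.stream().collect(Collectors.groupingBy(
                Map.Entry::getKey,                       // the GROUP BY key
                Collectors.summingInt(Map.Entry::getValue))); // the SUM aggregate
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> sales = Arrays.asList(
                new AbstractMap.SimpleEntry<>("east", 10),
                new AbstractMap.SimpleEntry<>("west", 5),
                new AbstractMap.SimpleEntry<>("east", 7));
        Map<String, Integer> totals = totalByRegion(sales);
        System.out.println(totals.get("east") + " " + totals.get("west")); // prints 17 5
    }
}
```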
• 3-5 years of experience in Hadoop or any cloud Big Data components
• Expertise in Java/Scala, SQL, scripting, Teradata, Hadoop (Sqoop, Hive, Pig, MapReduce), Spark (Spark SQL, Spark Streaming, MLlib) or equivalent cloud Big Data components
• Experience in streaming integration with Kafka using Kafka clients/connectors (via Spring Boot or similar)
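The Kafka streaming bullet above is about the producer/consumer pattern over a topic. The real Kafka client needs a broker, so this is a hedged JDK-only sketch that uses a `BlockingQueue` as a stand-in "topic" with a poison-pill shutdown (all names here are illustrative, not Kafka API):

```java
import java.util.*;
import java.util.concurrent.*;

public class QueueDemo {
    // Runs a one-producer/one-consumer pipeline over an in-memory "topic"
    // and returns everything the consumer saw before the poison pill.
    static List<String> runPipeline(List<String> events) throws InterruptedException {
        BlockingQueue<String> topic = new LinkedBlockingQueue<>();
        List<String> consumed = Collections.synchronizedList(new ArrayList<>());
        Thread producer = new Thread(() -> {
            events.forEach(topic::add);
            topic.add("STOP"); // poison pill: tells the consumer to exit
        });
        Thread consumer = new Thread(() -> {
            try {
                String e;
                while (!(e = topic.take()).equals("STOP")) consumed.add(e);
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start(); consumer.start();
        producer.join(); consumer.join();
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline(Arrays.asList("a", "b", "c"))); // prints [a, b, c]
    }
}
```

With a real Kafka client the queue would be replaced by `KafkaProducer.send` and a `KafkaConsumer` poll loop, but the threading and shutdown concerns look the same.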
• 4 years of experience in Hadoop, NoSQL (at least 2 years), RDBMS or any cloud Big Data components
• Experience in building RESTful APIs (microservices) to expose data using frameworks such as Spring Boot; experience with server components, OpenShift, Kubernetes and managing deployment of solutions. Familiarity with messaging systems such as JMS, IBM MQ, ActiveMQ or Kafka is good to have
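The RESTful-API bullet above is, at its core, mapping an HTTP route to a JSON response. Spring Boot needs external dependencies, so here is a hedged sketch using only the JDK's built-in `com.sun.net.httpserver` (the `/api/health` path and payload are invented for illustration):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.*;
import java.net.InetSocketAddress;
import java.net.URL;

public class RestDemo {
    // Starts an embedded HTTP server on a free port, calls /api/health once,
    // stops the server, and returns the JSON body that came back.
    static String healthRoundTrip() throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/api/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes("UTF-8");
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) { os.write(body); }
        });
        server.start();
        try {
            URL url = new URL("http://localhost:"
                    + server.getAddress().getPort() + "/api/health");
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(url.openStream(), "UTF-8"))) {
                return r.readLine();
            }
        } finally {
            server.stop(0); // always release the port
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(healthRoundTrip()); // prints {"status":"UP"}
    }
}
```

In Spring Boot the same endpoint would be a `@RestController` method, with the framework handling serialization and the server lifecycle.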