Required Experience, Skills & Competencies:
● Ability to develop and manage scalable Hadoop cluster environments
● Ability to design solutions for Big Data applications
● Experience with Big Data technologies such as HDFS, Hadoop, Hive, YARN, Pig, HBase, Sqoop, Flume, etc.
● Working experience with Big Data services in a GCP cloud environment
● Experience with Spark, PySpark, Python or Scala, Kafka, Akka, core or advanced Java, and Databricks
● Knowledge of how to create and debug Hadoop and Spark jobs
● Experience with NoSQL technologies such as HBase, Cassandra, and MongoDB, and with Hadoop distributions such as Cloudera or Hortonworks
● Familiarity with data warehousing concepts, distributed systems, data pipelines, and ETL
● Familiarity with data visualization tools such as Tableau
● Good communication and interpersonal skills
● 4-8 years of professional experience, including 3+ years of Big Data project experience
● B.Tech/B.E. from a reputed institute preferred