Role and Responsibilities:

  • 6+ years of experience working with core big data technologies.
  • Strong object-oriented programming experience, e.g. in Java, Python, or Scala.
  • Strong knowledge of Sqoop, Hive, Impala, Spark, Kafka, StreamSets, Oozie, etc.
  • Very good understanding of the big data and analytics landscape and architecture.
  • Strong knowledge of data integration techniques and methodologies.
  • Very good experience with Cloudera.
  • Good understanding of Hadoop ecosystem security.
  • Thorough experience in ingesting, parsing, integrating, and managing large sets of structured and unstructured data.
  • Strong technical delivery management and organizational skills.
  • Write Scala/Spark jobs for data cleansing, transformation, and aggregation.
  • Write producer/consumer applications for the Kafka framework.
  • Write and optimize complex cross-platform SQL queries.
  • Produce unit tests for Spark transformations and helper methods.
  • DevOps skills (e.g. Git, Jenkins, Puppet, Docker, Kubernetes) are a plus.
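The cleansing, transformation, and aggregation work described above follows a common pattern. A minimal sketch in plain Python is shown below; in practice this logic would live in a Spark job using the DataFrame or RDD APIs, and the record fields (`user`, `amount`) are hypothetical:

```python
from collections import defaultdict

def cleanse(record):
    """Drop records missing required fields and normalize strings."""
    if not record.get("user") or record.get("amount") is None:
        return None
    return {"user": record["user"].strip().lower(),
            "amount": float(record["amount"])}

def aggregate(records):
    """Sum amounts per user, skipping records that fail cleansing."""
    totals = defaultdict(float)
    for rec in records:
        clean = cleanse(rec)
        if clean is not None:
            totals[clean["user"]] += clean["amount"]
    return dict(totals)

raw = [
    {"user": "  Alice ", "amount": "10.5"},
    {"user": "alice", "amount": 4.5},
    {"user": "", "amount": 3.0},      # dropped: empty user
    {"user": "bob", "amount": None},  # dropped: missing amount
]
print(aggregate(raw))  # {'alice': 15.0}
```

The same cleanse/aggregate split maps directly onto Spark: the cleansing step becomes a `map`/`filter` (or DataFrame column expressions) and the aggregation becomes a `groupBy` with a sum.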
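The Kafka producer/consumer responsibility boils down to serializing events onto a topic and deserializing them off it. The sketch below uses an in-memory queue standing in for a Kafka topic so it is self-contained; a real application would use a client library such as kafka-python or the Java Kafka clients, and the message format here is hypothetical:

```python
import json
import queue

topic = queue.Queue()  # stands in for a Kafka topic partition

def produce(events):
    """Serialize events to bytes and publish them to the (stand-in) topic."""
    for event in events:
        topic.put(json.dumps(event).encode("utf-8"))

def consume():
    """Drain the topic, deserializing each message back into a dict."""
    messages = []
    while not topic.empty():
        messages.append(json.loads(topic.get().decode("utf-8")))
    return messages

produce([{"id": 1, "status": "ok"}, {"id": 2, "status": "fail"}])
print(consume())  # [{'id': 1, 'status': 'ok'}, {'id': 2, 'status': 'fail'}]
```

With a real client, `produce` would call the producer's send method for a named topic and `consume` would poll a consumer subscribed to it; the serialize/deserialize boundary stays the same.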
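As an illustration of the cross-platform SQL work, the aggregation query below runs against SQLite via Python's standard library; the table and column names are hypothetical, and the same GROUP BY / HAVING shape carries over to Hive, Impala, or Spark SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (user_id TEXT, amount REAL);
    INSERT INTO orders VALUES ('alice', 10.5), ('alice', 4.5), ('bob', 7.0);
""")
# Total spend per user, keeping only users above a threshold,
# highest spenders first.
rows = conn.execute("""
    SELECT user_id, SUM(amount) AS total
    FROM orders
    GROUP BY user_id
    HAVING total > 5.0
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('alice', 15.0), ('bob', 7.0)]
```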
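Unit-testing Spark transformations usually means factoring the per-row logic into pure helper functions that can be asserted on without a cluster. A small sketch with Python's unittest follows; the helper `normalize_amount` is hypothetical:

```python
import unittest

def normalize_amount(value):
    """Helper a Spark job might apply per row: parse and round an amount."""
    return round(float(value), 2)

class NormalizeAmountTest(unittest.TestCase):
    def test_parses_strings(self):
        self.assertEqual(normalize_amount("3.14159"), 3.14)

    def test_accepts_integers(self):
        self.assertEqual(normalize_amount(10), 10.0)

# Run with: python -m unittest <this file>
```

Keeping the transformation logic out of the Spark plumbing this way lets the tests run fast in CI, with a separate (smaller) set of integration tests exercising the actual job.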

Athar Saba Rasidi

