The main function of a Developer Specialty is to design, develop, and implement applications using in-demand languages and technologies (e.g., Java with Apache Hadoop and Spark) to support business requirements.
This role is for a Cloudera Hadoop developer with a minimum of 5 years of experience.
• Excellent understanding/knowledge of Hadoop architecture (1.x and 2.x) and its components, such as HDFS, YARN, JobTracker, TaskTracker, NameNode, DataNode, and the MapReduce paradigm
• Good working knowledge of Hadoop security, such as Kerberos and Sentry
• Working knowledge of and experience with a wide range of big data components, such as HDFS, Sqoop, Flume, Pig, Hive, HBase, etc.
• Practical experience with and in-depth understanding of Hive, Impala, and Spark
• Hands-on experience with Sqoop, Kafka, and Spark Streaming for batch and streaming data ingestion
• Experience in at least one big data stack project (Spark, Flume, Flink, Hadoop, etc.)
• Experience with at least one big data database (HBase, Cassandra, MongoDB, Redshift, etc.)
• Strong programming background with expertise in Scala and/or Python
• Familiarity with the data infrastructure tools landscape, e.g., cloud service providers, software, system monitoring tools, and development environments
Qualifications:
• Bachelor's degree in a technical field such as computer science, computer engineering or related field required
• 5–10 years of experience required
• Hands-on experience designing, developing, and successfully deploying large-scale projects end-to-end
• Hands-on experience following an iterative, agile SDLC
At IQVIA, we believe in pushing the boundaries of human science and data science to make the biggest impact possible – to help our customers create a healthier world. The advanced analytics, technology solutions and contract research services we provide to the life sciences industry are made possible by our 70,000+ employees around the world who apply their insight, curiosity and intellectual courage every step of the way. Learn more at jobs.iqvia.com.