Wednesday, June 28, 2017

Hadoop Operations Lead Job in Mumbai, India

Responsibilities:

- Manage Hortonworks Distribution 2.2.x clusters, including capacity planning

- Manage Hadoop and its ecosystem: Kafka, Storm, Hive, Pig, Spark, HDFS, HBase, Oozie, Sqoop, Flume, ZooKeeper, etc.

- Use UNIX shell/Perl/Python scripting to automate as much of the environment as possible.

- Able to monitor, debug, and perform root cause analysis (RCA) for any service failure.

- Creative, analytical, and problem-solving skills

- Comfortable working in rotational shifts.

- Provide RCAs for critical and recurring incidents.

- Provide on-call service coverage within a larger group 

- Basic knowledge of network infrastructure, e.g. TCP/IP, DNS, firewalls, routers, load balancers, etc.

Skills/Competencies:

- Experience installing, configuring, administering, and operating Hadoop clusters

- Experience installing and configuring Linux-based systems

- Expertise managing Hadoop and its ecosystem: Kafka, Storm, Hive, Pig, Spark, HDFS, HBase, Oozie, Sqoop, Flume, ZooKeeper, etc.

- Hortonworks 2.2.x cluster management and capacity planning experience (experience with other distributions is a plus)

- Experience with middleware technologies such as Tomcat, Apache, MySQL, MQ, WebSphere, Dropwizard, etc.

- Proficient in Java or Scala, with experience writing multi-threaded applications

- Good understanding of UNIX shell/Perl/Python scripting.

- Creative, analytical, and problem-solving skills

- Experience with NoSQL databases is a plus

- Knowledge of tuning individual big data components is desirable

- Ability to guide and mentor Level 2 support engineers 

- Ability to act as shift lead and technical leader when supporting critical issues

Experience:

- 8+ years of Linux administration experience

- 1+ year of Java coding experience

- 2.5+ years of big data component administration experience

- Red Hat Linux certification

Send resume to tania@jobs-n-jobs.com