Hadoop Jobs May 30
Hadoop Administrator – Big Data/Python
Company Name:
Impetus Technologies
Description:
Hands-on experience deploying Big Data clusters involving Hadoop (HDFS/YARN/Hive), Cassandra/MongoDB/HBase, RabbitMQ/Kafka, Storm, Solr, Spark, and ZooKeeper.
Skills required:
Hands-on experience with Ambari/Cloudera Manager.
Hands-on experience with Cloudera/HDP/Apache distributions.
Hands-on experience running tools such as Sqoop, Flume, and Oozie.
Good understanding of Hadoop benchmarks and other performance benchmarks for optimal configuration.
Good understanding of OS parameters, network topologies, and storage options for optimal Hadoop cluster setup.
Hands-on experience with at least one RDBMS – MySQL or PostgreSQL.
Ability to install one of these databases as the metastore database, create schemas, populate data, import/export data, create users/roles, etc.
Hands-on experience configuring cluster security using Kerberos and ACLs.
Hands-on experience with data security using encryption, Sentry, Ranger, etc.
Good understanding of high availability, fault tolerance, data replication.
Hands-on experience configuring HA in a cluster.
Fluent in at least one scripting language (Shell/Perl/Python etc.).
General operational expertise: good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks.
Good knowledge of Linux, as Hadoop runs on Linux.
Familiarity with open source configuration management and deployment tools such as Puppet or Chef and Linux scripting.
Knowledge of Troubleshooting Core Java Applications is a plus.
Sound UNIX skills, including user management, security, and permissions.
Exposure to cloud technologies such as EC2, Azure, or VMware will be an added advantage.
Hands-on experience with monitoring tools such as Nagios and Ganglia; ability to deploy them and create plug-ins for custom monitoring stats.
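The custom-plug-in requirement above usually means writing small Nagios-style checks: scripts that print a one-line status and exit 0/1/2 for OK/WARNING/CRITICAL. A minimal sketch in shell (the `check_dfs_used` name and the thresholds are illustrative, not from the posting):

```shell
#!/bin/sh
# Hypothetical Nagios-style check: classify a DFS-used percentage
# against warning/critical thresholds, using the standard Nagios
# exit codes (0 = OK, 1 = WARNING, 2 = CRITICAL).
check_dfs_used() {
    used=$1            # current DFS usage, as an integer percentage
    warn=${2:-80}      # warning threshold (default 80%)
    crit=${3:-90}      # critical threshold (default 90%)
    if [ "$used" -ge "$crit" ]; then
        echo "CRITICAL - DFS used ${used}%"; return 2
    elif [ "$used" -ge "$warn" ]; then
        echo "WARNING - DFS used ${used}%"; return 1
    else
        echo "OK - DFS used ${used}%"; return 0
    fi
}

# In a real plug-in the figure would come from the cluster itself,
# e.g. parsed out of `hdfs dfsadmin -report`; 42 is a stand-in here.
check_dfs_used 42
```

A real check would also handle the "unable to determine" case with exit code 3 (UNKNOWN), per the Nagios plugin conventions.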
Experience needed:
3-8 Years
Qualification – BE/ B.Tech/ MCA/ MS-IT/ CS /B.Sc/BCA or any other degree.
Location:
Bengaluru, Gurgaon, Noida, Madhya Pradesh
Contact:
HR
Apply via the official link
———————————-
Hadoop Admin for Bangalore Location
Company Name:
Techtalento Softser Technologies Pvt. Ltd.
Description:
Installed and configured a fully distributed, multi-node Hadoop cluster with a large number of nodes.
Addressed and troubleshot issues on a daily basis in a MapR environment.
File system management and monitoring.
Provided Hadoop, OS, Hardware optimizations.
Installed and configured Hadoop ecosystem components such as MapReduce, Hive, Pig, Sqoop, HBase, ZooKeeper, and Oozie.
Skills required:
Experience in administration of Big Data technologies and Hadoop.
Good knowledge on data warehousing and DBMS techniques.
Hadoop cluster planning and implementation.
Writing Oozie workflows and configuring Sqoop.
Building Hadoop clusters using MapR as well as Hortonworks distributions.
Configuring, managing, and tuning HDFS with the MapReduce (v1)/YARN (v2) architecture with high availability.
Excellent understanding/knowledge of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, YARN, HBase, Hive, Sqoop, Pig, ZooKeeper, and MapReduce.
Adept at mapping client requirements, custom designing solutions & troubleshooting for complex software related issues.
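The Sqoop requirement above typically comes down to running imports like the following sketch; the JDBC URL, credentials, table, and target directory are placeholders, not details from the posting:

```shell
# Hypothetical Sqoop import: pull an RDBMS table into HDFS.
# Connection string, user, table, and paths are all placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/raw/orders \
  --num-mappers 4
```

The same invocation is commonly wrapped in an Oozie `sqoop` action so the import runs on a schedule rather than by hand.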
Experience needed:
5-10 Years
Location:
Bangalore
Contact:
Sanjay Poddar
Email Address: sanjay.poddar@techtalentoconsulting.com
Website: http://www.techtalentoconsulting.com
Telephone: 9717300643
Reference Id: Hadoop Admin
Apply via the official link
———————————-
Immediate Opening for Hadoop Admin
Company Name:
Ecentric Solutions Pvt. Ltd.
Description:
We are holding face-to-face (F2F) interviews for Hadoop Admin at our Bangalore location
for candidates who can join immediately or within one week (7 days).
Kindly share your updated profile on harjeet.kaur@ecentrichr.com
Skills required:
Mandatory skills: Primary – Hadoop Admin (Hortonworks or Cloudera).
Secondary – MongoDB/Cassandra.
Experience needed:
3 to 6 years
Location:
Bangalore
Contact:
Harjeet Kaur
Email: harjeet.kaur@ecentrichr.com
Apply via the official link
———————————-