Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Apache Spark, Big Data Analytics, Hadoop, Scala Programming Language
The resource must have a minimum of 2 years of strong hands-on experience with Apache Spark (Spark Core, Spark SQL and Spark Streaming) using the Scala functional language, as well as Hadoop.
The resource should have good experience working in the Big Data ecosystem, including Hadoop, HDFS, Hive, and Oozie or Airflow.
Good to have: experience with AWS EMR, Redshift, S3, Lambda, API Gateway, Python, Kafka, Elasticsearch, Kibana.
The resource should have the ability to design and develop Big Data workflows, ideally with batch processing and/or stream processing.
The resource should have good communication and presentation skills.
The resource should have strong analytical skills and the ability to lead the team.
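As a small illustration of the functional Scala style the role calls for, the sketch below implements a word count with the same map/flatMap/reduce shape used in Spark's RDD API. Plain Scala collections stand in for an RDD here (so the example runs without a Spark cluster); the object and method names are hypothetical, not from the posting.

```scala
// Functional-style word count in plain Scala. Spark's RDD version mirrors
// this pipeline: rdd.flatMap(...).filter(...).map(...).reduceByKey(_ + _).
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.toLowerCase.split("\\s+")) // tokenize each line into words
      .filter(_.nonEmpty)                   // drop empty tokens
      .groupBy(identity)                    // group equal words together
      .map { case (word, occurrences) => word -> occurrences.size } // count each group

  def main(args: Array[String]): Unit = {
    val counts = wordCount(Seq("spark and scala", "spark streaming"))
    println(counts("spark")) // 2
  }
}
```

The same transformation chain carries over almost line-for-line to a batch Spark job, which is why functional-collections fluency is commonly screened for in Spark-Scala interviews.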
Job No: 202069
Apply via the official link.
Upgrade your Skills during this Lockdown
Online Selenium Automation Training :-
Start Date :- 4th October
Trainer Name :- SHAMMI JHA
☸ Course Contents :-
✳ Transitioned 6000+ People into Automation across the World.
✳ Customized Notes on All Topics.
✳ Concept Retention With Fun Examples.
✳ In-Depth Interview Preparation With Dedicated Sessions, Resume Writing Assistance and Much More.
☑ First 4 sessions are free. ☑
✔Duration :- 3 Months
✔Monday to Thursday :- 2 Hours
✔Timing :- 9.30 PM to 11.30 PM IST
✔Fees :- 6000/-
✔Mode of Training :- Online
Phone / WhatsApp Details :-
Shammi Jha :- +91-8305429370