Hexaware Technologies Hiring Technical Architect – Informatica
Company: Hexaware Technologies | Location: Gurgaon | Exp: 3-4 Years
3-5 years of hands-on experience in Informatica PowerCenter.
2 years of experience in Sybase preferred.
Good experience in PL/SQL.
2 years of hands-on experience in Unix/Linux.
Willingness to work in production support.
Good communication skills.
Job Code: 140366
Skill Group: Data Management
Expected Joining Date: 31-Dec-2019
Big Data Engineer – Big Data / Hadoop
Company: Hexaware Technologies | Location: Chennai | Exp: 4-7 Years
Data engineers working in AXA Direct Japan's data lake team carry out a wide variety of business intelligence tasks in a largely AWS-based cloud computing environment.
Typical tasks include:
Building high quality and sustainable data pipelines and ETL processes to extract data from a variety of APIs and ingest into cloud-based services.
Efficiently developing complex SQL queries to aggregate and transform data for analytics teams and general users.
Maintaining accurate and error-free databases and data lake structures.
Conducting quality assessment and integrity checks on both new and existing queries and processes.
Monitoring existing solutions and working pro-actively to rapidly resolve errors and identify future problems before they occur.
Using data visualization tools such as Power BI, SSRS, Tableau, or Looker to develop high-quality dashboards and reports.
Consulting with a variety of stakeholders to gather new project requirements and transform these into well-defined tasks and targets.
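The pipeline-building tasks above follow the classic extract-transform-load pattern. As a minimal, self-contained sketch (the record fields `product` and `amount`, and the sample data, are hypothetical, for illustration only — a real pipeline would pull from APIs and land results in S3 or a warehouse):

```python
import json
import csv
import io
from collections import defaultdict

# Hypothetical raw input: one JSON record per line, a common raw-zone format.
RAW = """\
{"product": "auto", "amount": 120.0}
{"product": "auto", "amount": 80.0}
{"product": "home", "amount": 200.0}
"""

def extract(raw: str) -> list[dict]:
    """Parse one JSON record per line."""
    return [json.loads(line) for line in raw.splitlines() if line.strip()]

def transform(records: list[dict]) -> dict[str, float]:
    """Aggregate amounts by product, like a SQL GROUP BY."""
    totals: dict[str, float] = defaultdict(float)
    for rec in records:
        totals[rec["product"]] += rec["amount"]
    return dict(totals)

def load(totals: dict[str, float]) -> str:
    """Render the aggregates as CSV, ready to land in a curated zone."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["product", "total_amount"])
    for product, total in sorted(totals.items()):
        writer.writerow([product, f"{total:.2f}"])
    return buf.getvalue()

print(load(transform(extract(RAW))))
```

In practice each stage would be a task in an orchestrator such as Airflow or Oozie, with the raw and curated data stored in S3 rather than in-memory strings.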
Who we’re looking for:
The right candidate will be an innovative and adaptable data expert with a strong desire to succeed.
You’ll have demonstrated experience working in a high performing business intelligence or data warehouse environment, excellent communication skills and a passion for problem-solving and learning new technologies.
Working at AXA, you'll be exposed to a wide variety of tasks, tools, and programming languages, so the desire and ability to constantly learn new skills is essential.
We’re looking for people who are passionate about data with an emphasis on quality programming and building the best solution possible.
Required experience and technical skills:
3-5 years of practical experience in data/analytics, with at least 1 year in an engineering / BI role.
At least 1 year of practical experience working on data pipelines or analytics projects with languages such as Python, Scala, or Node.js.
At least 2 years of practical experience working on data pipelines or analytics projects with SQL / NoSQL databases (ideally in a Hadoop based environment).
Strong knowledge and practical experience with at least four of the following AWS services: S3, EMR, ECS/EC2, Lambda, Glue, Athena, Kinesis/Spark Streaming, Step Functions, CloudWatch, DynamoDB.
Strong experience with data processing and ETL orchestration systems such as Oozie, Airflow, Azkaban, Luigi, or SSIS.
Experience developing solutions inside a Hadoop stack using tools such as Hive, Spark, Storm, Kafka, Ambari, and Hue.
Ability to work with large volumes of both raw and processed data in a variety of formats, including JSON, ORC, Parquet, and CSV.
Ability to work in a Linux/Unix environment (predominantly via EMR and the AWS CLI / Hadoop File System).
Experience with DevOps tools such as Jenkins, GitHub, Ansible, Docker, and Kubernetes.
Minimum undergraduate qualifications in a technical discipline such as Computer Science, Data Science, Analytics, Machine Learning, or Statistics; postgraduate qualifications preferred.
Demonstrated experience and expertise in setting up and maintaining cloud data solutions and AWS infrastructure will be highly regarded.
Strong knowledge of cloud-based data security, encryption, and protection methods will also be highly regarded.
Job Code: 140183
State: Tamil Nadu
Skill Group: BigData
Expected Joining Date: 20-Nov-2019