Brief Job Description :-
- Design and develop fault-tolerant, highly distributed, and robust ETL platforms on Hadoop for various business use cases.
- Take care of the complete ETL (Extract, Transform & Load) process.
- Work on structured and semi-structured data to put it to business use; this involves organizing the data (collecting, storing, processing).
- Analyze huge sets of structured and semi-structured data for business analytics solutions using cutting-edge tools and techniques.
- Create data models to reduce system complexity and thereby increase efficiency.
Special skill / knowledge requirements :-
- Solid Knowledge of Operating Systems: Linux
- In-Depth Database Knowledge – SQL and NoSQL
- In-Depth Knowledge of Data Warehousing & ETL Tools – Hadoop, MapReduce, Hive, Pig, Apache Spark, Kafka
- Basic Language Requirement: Python
- Basic Machine Learning Familiarity
Experience: 3-5 years