Big Data Architect
Detailed Job Description:
- To support the development and maintenance of Hadoop solutions for the enterprise.
- To be part of initiatives that bring data into the data lake and deliver insights.
- To perform production support and maintenance of existing datasets in Hadoop.
- To be able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them.
- To be able to perform detailed analysis of business problems and technical environments and apply that analysis in designing solutions.
- To be able to work creatively and analytically in a problem-solving environment.
- To work effectively with all technical personnel (development team, business analysts, security, risk and compliance, data center, project managers, data architects, and testers), and to clearly translate business priorities and objectives into technical solutions.
- Ability to plan and organize technical work and deliverables. Ability to follow guidelines and adhere to the established software development standards and conventions.
- Self-motivated and independent. Able to work with minimum supervision and to work well with stakeholders and project staff.
- Ability to prioritize and multi-task across numerous work streams.
- Strong interpersonal skills; ability to work on cross-functional teams. Strong verbal and written communication skills.
- Deep knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures and data warehousing/BI.
- Demonstrated problem-solving skills. Ability to learn effectively and meet deadlines.
- Strong scripting skills in Linux environments and strong SQL skills.
- Expertise in the Hadoop ecosystem, particularly HDFS (Hortonworks distribution).
- Hands-on experience with Sqoop, Hive, HBase, Spark, Oozie, Python, and Scala is a must.