Hadoop Configuration and Development Engineer

 
Location: Ann Arbor, Michigan
Posted On: 5/5/2017
Job Code: 3454_HE_MI
 
Job Description
 
• The Hadoop Configuration and Development Engineer will be responsible for the administration of distributed computing services within the Hadoop ecosystem.
• This individual will interface with the Data Science, Business Intelligence, and Financial Risk Management groups to identify components that are useful in the advanced analytics workflow.
• Working with Unix administrators and internal technology teams, they will be responsible for integrating components of the ecosystem.
• This position offers an opportunity to explore new Big Data technologies and contribute to Advanced Analytics.

Responsibilities:
• Responsible for implementation and ongoing administration of Hadoop infrastructure
• Align with the systems engineering team to select, install, and upgrade software
• Performance tuning of Hadoop clusters
• Monitor Hadoop cluster job performance and plan capacity (see the sketch after this list)
• Manage and review Hadoop log files
• Help the Application and Operations teams troubleshoot performance issues
• Mentor other team members on performance tuning and tools
• Assist in data modeling, design, and implementation based on recognized standards
• Software installation and configuration
• Query and execution engine performance monitoring and tuning
• Automate manual tasks
• Responsible for migrating code (ETL/ingestion, SQL/data engineering, and data science models) to production using Bitbucket, Git, Jenkins, and Artifactory
• Provide operational instructions for deployments (for example, Java, Spark, Sqoop, Storm)
• Support any process that needs attention in the production environment
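
For illustration, a minimal sketch of the kind of routine check the monitoring, log-review, and automation items above describe. The commands are standard HDFS/YARN CLI calls; the script layout and the optional application-ID argument are hypothetical.

#!/usr/bin/env bash
# Sketch of a daily Hadoop cluster check (assumes the hdfs/yarn clients are on
# PATH and the user has read access to the cluster).
set -euo pipefail

# HDFS capacity summary, useful input for capacity planning
hdfs dfsadmin -report | head -n 15

# Currently running YARN applications, for screening job performance
yarn application -list -appStates RUNNING

# Optionally pull aggregated logs for a finished application ID passed as $1
if [ "$#" -ge 1 ]; then
  yarn logs -applicationId "$1" | tail -n 200
fi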

Requirements:
• One to three years’ experience with Linux in both administrator and user capacities
• Familiarity with some of the following languages: Python, Bash, Scala, Java, and SQL
• Some experience with networking and firewalls
• Some experience with NoSQL
• Ability to learn about the Hadoop ecosystem
• Two-year degree in computer science required
• Military education or experience may be considered in lieu of civilian requirements listed
 
 
Job Requirements
 
 
Cloud Computing, Data Modeling, Engineering, ETL, Firewall, Java, Linux, Management, Networking, Python, Risk Management, SQL, Systems Engineering, UNIX
 

Contact Details

Recruiter: Gomit Bisht
LinkedIn: https://www.linkedin.com/in/gomit-bisht-915397130/