Big Data Administrator

Job ID
758
Location
Downtown Montreal
Role and Responsibilities
IT Unlock’s mission is to improve our clients' current IT environments. We are looking for a talented Big Data Administrator. This is a permanent position, with work done both remotely and onsite in downtown Montreal, starting soon. Great technologies and an exciting challenge await you.

Department Summary:

Our client's infrastructure supports all entities in the Americas for all infrastructure technology requirements, including:
  • Workstations including computers, telephones and email
  • Data center hosting
  • Networks/telecommunications
  • End user computing services
  • Market Data administration infrastructure
  • Data services
Our various fields of expertise enable the organization to:
  • Be a major player in the Group's digital transition
  • Ensure the 24/7 quality and continuity of our infrastructures and their resilience to incidents
  • Advise and accompany business lines and functional divisions through their own transformations
  • Provide a diversified, innovative service offering customized to the requirements of our internal and external partners

Our goal is to deliver quality, cost-effective IT infrastructure services aligned with our customers' needs while effectively managing operational risks and providing value-added, innovative solutions.

Day-to-Day Responsibilities:
  • Install, upgrade, configure, and maintain a Hadoop cluster within a multicultural team.
  • Because security is key for us: set up security (Kerberos, AD, IDMP, SSL, …)
  • As you don’t like to repeat the same task over and over: automate operations, installation, and monitoring of the Hadoop framework.
  • Participate in the continuous evaluation of Hadoop infrastructure requirements and design/deploy solutions (high availability, big data clusters, Hadoop in the public cloud, etc.)
  • Because you always question yourself for the better: monitor, troubleshoot, and configure the cluster
  • As you know that evidence and traces are important in case of an incident: manage and review Hadoop log files
  • As part of a Big Data feature team and a key element of Follow-the-Sun support, you will work with Paris and Bangalore, deliver tasks in Agile sprints, and act upon incidents to ensure 24/7 production.
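To illustrate the log-review and automation duties above, here is a minimal shell sketch that tallies errors and warnings across Hadoop service log files. The log directory and log-line format are assumptions, not the client's actual setup; it falls back to a self-generated sample log so it can be exercised anywhere.

```shell
#!/bin/sh
# Sketch: summarize ERROR/WARN counts across Hadoop service log files.
# LOG_DIR and the log format are assumptions; adjust for your distribution
# (Hortonworks/Cloudera typically log under /var/log/hadoop*).
LOG_DIR="${LOG_DIR:-/var/log/hadoop}"

# Demo fallback: if the real log directory is absent, build a sample log
# so the script can be run anywhere.
if [ ! -d "$LOG_DIR" ]; then
  LOG_DIR=$(mktemp -d)
  printf '%s\n' \
    '2024-05-01 10:00:00 ERROR namenode: disk full' \
    '2024-05-01 10:00:05 WARN  datanode: slow heartbeat' \
    > "$LOG_DIR/hadoop-hdfs-namenode.log"
fi

# Count ERROR and WARN lines per log file and print a one-line summary each.
for f in "$LOG_DIR"/*.log; do
  [ -e "$f" ] || continue
  errors=$(grep -c 'ERROR' "$f")
  warns=$(grep -c 'WARN' "$f")
  printf '%s: %s errors, %s warnings\n' "$(basename "$f")" "$errors" "$warns"
done
```

In practice a report like this would feed a monitoring or alerting pipeline rather than stdout.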

Profile
Technical Skills:        
  • HDFS
  • Knox
  • Kafka
  • HBase
  • Hive
  • Spark
  • Ranger
  • Kerberos
  • SSL
  • Linux
  • Shell programming

Experience Needed:
  • 2+ years of relevant professional experience in Unix and Linux Systems, server hardware, virtualization, RedHat
  • At least 2 years of experience with Hadoop (Hortonworks or Cloudera)
  • Proven experience with automation
  • Excellent troubleshooting techniques
  • Good communication skills by phone, by e-mail, and in documentation

Desired / Plus:
  • Java and/or Python experience
  • Knowledge of Hadoop in the public cloud (e.g. Azure HDInsight)
  • Knowledge of Ansible

Educational Requirements:
  • Degree in Computer Science or related experience

Languages: English, French

Skillset
Required:
• 1+ years of experience in Unix and Linux Systems, server hardware, virtualization, RedHat
• 1+ years of experience with Hadoop (Cloudera preferred, or Hortonworks)
• Experience with automation using Ansible
• Excellent troubleshooting techniques
• Good communication skills by phone, by e-mail, and in documentation
• Bilingual (English, French)

Assets:
• Java and/or Python experience
• Knowledge of Hadoop in the public cloud (e.g. Azure HDInsight)
Number of positions
1
Work Experience
At least 1 year