Sr. Hadoop Developer #52659

Our client requires a Senior Hadoop Developer to help develop Hadoop-based solutions. A solid background in Hadoop platform and solution architectures and strong development skills in Hadoop technologies are required.

Responsibilities

Duties will include, but not necessarily be limited to:

  • Prepare the Hadoop environment for development and testing.
  • Design and develop streaming applications with Spark Streaming, HBase, Cassandra, NiFi and Kafka APIs.
  • Design and develop real-time and batch solutions using Hadoop technologies.
  • Build data lakes based on requirements.
  • Research and experiment with emerging technologies and tools related to big data.
  • Work on proof-of-concept solutions.
  • Work closely with Data Scientists to define and refine the big data platform to achieve corporate business objectives.
  • Ensure the transfer of knowledge to the technical staff.
  • Update or create training materials and documentation.
  • Train the technical team and users, as required.

Mandatory Requirements

  • Minimum 5 years’ experience building and managing distributed solutions dealing with high-volume data in highly scalable environments.
  • 3+ years’ experience with hands-on development in Hadoop technologies (HDFS, YARN, Spark, NiFi, Kafka, Hive, HBase, Sqoop, Ranger, Solr, etc.).
  • Experience building industry standard data lakes.
  • Hands-on experience in Spark.
  • Experience developing both real-time (including streaming) and batch processing solutions in Hadoop.
  • Working knowledge of web technologies and protocols (Java, NoSQL, JSON, REST, JMS).
  • Thorough knowledge of Linux environment and tools.
  • Experienced in industry standard Software Development Life Cycle (SDLC).
  • Strong problem solving and analytical skills.
  • Exceptional customer engagement, interpersonal, presentation and overall communication skills.
  • Exemplary written and oral communication skills.

Desirable Requirements

  • Knowledge of scripting languages (Python, Scala, JavaScript, etc.).
  • Experience developing Hadoop/Spark applications to process unstructured data such as images and video.
  • Hands-on experience with Sqoop and Hive, with an understanding of partitioning, data formats, compression, and performance tuning.
  • Experience integrating Spark applications with other frameworks such as deep learning (TensorFlow, Deeplearning4j, etc.) and machine learning (e.g., scikit-learn).
  • Strong Data Science background.
  • Experience with data visualization tools such as Tableau and Power BI.
  • Thorough understanding of integration concepts (EAI and ETL).
  • Experience with TIBCO or Informatica.
  • Experience in Agile project delivery is an asset.
  • Ability to handle multiple tasks and work well under pressure to meet deadlines and shifting priorities.
  • Previous work experience with municipal or other government organizations, or equivalent.


Job Posting ID: 52659SRHADDEV

Location: Calgary, Alberta

Starting Date: January 2018

Duration: Until October 2018, with possible extensions

Posting Closing Date:

Apply for this Job Posting
Fill in the form below to submit your application for this position.
  • Accepted file types: doc, pdf, docx.