Data Scientist and Big Data Consultant Positions – Immediate Need!!!


All three positions are located in Pennington, NJ, and are contract-to-permanent.

Position 1: Data Scientist

  • The Data Scientist will work with analysts, architects, and software engineers to contribute to big data analytics program initiatives.
  • Will help build a framework of capabilities to support a number of ongoing and planned big data analytics projects.
  • Must have solid experience with business analytics/OLAP tools for visual presentation of data stored in Hadoop HDFS.
  • Should have experience with data ingestion, Hadoop, Cloudera, and statistical models.
  • A finance or banking industry background is preferred.

 

Position 2: Big Data Hadoop – Data Ingestion and Tools Expert

  • Install and configure Cloudera CDH 4.5 and related tools such as Sqoop, Flume, HDFS, MapReduce1/MapReduce2 (YARN), Pig, Hive, HBase, Oozie, ZooKeeper, Avro, Hue, and Cloudera Manager.
  • Implement Kerberos security on the Hadoop cluster and HBase.
  • Implement and configure MapReduce2/YARN.
  • Implement data ingestion patterns using Sqoop2, Flume, REST APIs, and other third-party tools, with Avro for serialization.
  • Skills Required:
  • At least 2 years of experience with the Big Data Hadoop framework and its components, such as MapReduce1 and MapReduce2 (YARN), HDFS, Sqoop2, Flume, Pig, Hive, Hue, Cloudera Manager, HBase, and ETL tools.
  • 5 to 7 years of working experience with databases (Oracle/DB2/MS SQL), data warehousing, ETL, and business intelligence tools.
  • Expert-level understanding of Big Data Hadoop (Cloudera CDH 4.5) installation and configuration of Hadoop tools.
  • Experience implementing Kerberos security in the Hadoop framework and HBase.
  • Strong data modeling experience.

 

Position 3: Big Data Hadoop Developer (Java/Python)

  • 8–10 years of working experience in Java/J2EE development to handle, manipulate, and cleanse data.
  • Must have at least 1 year of experience with Hadoop HDFS data ingestion using Avro, HttpFS, REST APIs, and Java.
  • Must have development experience on Cloudera CDH 4.5.
  • Skills Required:
  • Strong Java/J2EE and Python programming experience for extraction, transformation, and loading (ETL) of data.
  • 2–3 years of experience with data ingestion into HDFS using Avro and data manipulation using Pig and Hive.
  • Experience with data extraction, data massaging/transformation, and data loading into HDFS using Avro; a minimal sketch of this ingestion step follows the list.
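For illustration only, here is a minimal Java sketch of the Avro-based ingestion step described above. The record schema, field names, and output file are hypothetical assumptions, not part of the role; in practice the resulting Avro container file would then be landed in HDFS (for example via HttpFS, hadoop fs -put, or the Hadoop FileSystem API).

    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;

    import java.io.File;
    import java.io.IOException;

    public class AvroIngestSketch {
        public static void main(String[] args) throws IOException {
            // Hypothetical record schema for illustration; a real project would supply its own.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Trade\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"long\"},"
                + "{\"name\":\"symbol\",\"type\":\"string\"},"
                + "{\"name\":\"price\",\"type\":\"double\"}]}");

            // Write one record into a local Avro container file; the file would
            // subsequently be moved into HDFS for downstream Pig/Hive processing.
            try (DataFileWriter<GenericRecord> writer =
                     new DataFileWriter<>(new GenericDatumWriter<GenericRecord>(schema))) {
                writer.create(schema, new File("trades.avro"));

                GenericRecord rec = new GenericData.Record(schema);
                rec.put("id", 1L);
                rec.put("symbol", "ABC");
                rec.put("price", 101.25);
                writer.append(rec);
            }
        }
    }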

 

 Thanks,

Chandra Kanumuri – Altimetrik Corp.

Recruiting Manager – People Experience

Ph: 248-281-2538/ Fax: 248-262-2938
