Lead Data Warehouse
Collabera is ranked amongst the top 10 Information Technology (IT) staffing firms in the U.S., with more than $550 million in sales revenue and a global presence of more than 12,000 professionals across North America (U.S., Canada), Asia Pacific (India, Philippines, Singapore, Malaysia) and the United Kingdom. We support our clients with a strong recruitment model and a sincere commitment to their success, which is why more than 75% of our clients rank us amongst their top three staffing suppliers.
Title: Lead Data Warehouse / Big Data
Location: Torrance, CA
Duration: Months (Good Chance of Extension)
Lead Data Warehouse / Big Data Expert
- Senior resource in a Tech Lead role responsible for delivering quality ETL work using Informatica in a data warehousing environment that involves Hadoop, Linux, R programming, and Netezza.
- An experienced Big Data resource with expertise in administering, developing, and deploying Oracle Big Data Appliance (Oracle BDA) solutions. Proven work experience with hardware and software configurations and enterprise-level installation and configuration of a distributed Hadoop cluster, especially on the Oracle BDA, including its security framework and best practices and standards for development/deployment processes. Experienced in Big Data/Netezza integration and comfortable with Sqoop, Linux shell scripting, related utilities, and R programming.
- Excellent knowledge of Hadoop architecture, administration, and support.
- Hands-on experience with the Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, HBase, Flume).
- Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python), especially R programming.
- Experience with the Hadoop security model, consisting of authentication, service-level authorization, authentication for web consoles, and data confidentiality.
- Hands-on experience leading and providing expertise and guidance to groups across the enterprise in the design and support of Hadoop security integration with the client's LDAP.
- 2 to 3 years' applicable experience developing and/or administering Hadoop production and development data hub environments in a large setting for complex systems (R installation/configuration and Oracle BDA experience highly desirable).
Required Skills:
- Oracle Big Data Appliance / Cloudera Hadoop technical stack, R programming
- Informatica 9.5.1, Red Hat Linux 6.4, AutoSys
- Netezza / IBM PureData Analytics 7.x
Strong pluses:
- Expertise in Hadoop ecosystem products such as HDFS, MapReduce, Hive, Pig, Spark, Avro, HBase, ZooKeeper
- Experience with SQL, Java (MapReduce), and Python (Linux shell-style scripting) development
- Experience with business intelligence, data mining techniques, and analytics functions
- Predictive analytics experience is a PLUS
Requirements:
- Past work experience with hardware and software configurations and enterprise-level installation and configuration, including setting up the R programming environment, its security framework, and best practices and standards for development/deployment processes.
- Past work experience with hardware and software configurations and enterprise-level installation and configuration of a distributed Hadoop cluster, especially on the Oracle BDA, including its security framework and best practices and standards for development/deployment processes.
- Oracle Big Data Appliance, Cloudera Hadoop
- Expertise in architecting, administering, developing, and deploying Big Data solutions.
- Hands-on development and design work on Big Data analytics to support the data scientists.
Contact Details:
Name: Arshdeep Kaur
Phone:
Additional Information
To know more about this opportunity, please contact:
Arshdeep Kaur
Cell:
Email: [@]