Pentaho Specialist

Cloud BC Labs
Newark, NJ
Expired: over a month ago. Applications are no longer accepted.
Job Description

Duration: Long term

Location: Newark, NJ

Hybrid 2-3 days a week (LOCAL Candidates required)

VISA: H1B, H4 EAD, GC & UGC (C2C can work)

Our Role: Our Technology Solutions Group is a dynamic, fast-paced environment, with exciting changes on the horizon under new senior leadership. We are looking for you to be a technical expert on the Pentaho platform and to work across various teams to support the development, enhancement, and maintenance of existing processes. You will need in-depth technical knowledge as a Pentaho ETL developer and should feel comfortable working with large internal and external data sets. We want you to see this challenge as a unique and valuable opportunity, so if this sounds interesting, then PGIM could be the place for you.


8-10+ years of experience in Information Technology, with proficiency in ETL design/development and Data Warehouse implementation/development.

Experienced in the design, development, and implementation of large-scale projects in the Financial, Shipping, and Retail industries using Data Warehousing ETL tools (Pentaho).

Knowledge of the Software Development Lifecycle (SDLC), Agile, and CI/CD deployment.

Excellent data analysis skills.

Experience in Architecting and building Data Warehouse systems and Business Intelligence systems including ETL using Pentaho BI Suite (Pentaho Data Integration Designer / Kettle).

Hands-on experience with the whole ETL (Extract, Transform & Load) process.

Experience in creating and scheduling ETL transformations and jobs using the Pentaho Kettle Spoon designer and Pentaho Data Integration Designer.

Used various steps in Pentaho transformations, including Row Normalizer, Row Denormalizer, Database Lookup, Database Join, Calculator, Add Sequence, and Add Constants, along with various input and output types for data sources including tables, Access, text files, Excel, and CSV files.

Experience integrating Kettle (ETL) with Hadoop and various NoSQL data stores, and loading unstructured data into the Hadoop Distributed File System (HDFS).

Experience in performing Data Masking/Protection using Pentaho Data Integration (Kettle).

Experience in writing shell scripts for various ETL needs.

Deep knowledge of RDBMS.

Proficient in writing SQL statements, complex stored procedures, dynamic SQL queries, batches, scripts, functions, triggers, views, and cursors, and in query optimization.

Quick to understand relational source database systems and data models, enabling accurate transformation logic for data migration and data integration.

Motivated team player with excellent communication, interpersonal, analytical, and problem-solving skills.

Must Have:

ETL Design

Data Warehousing

Cloudera Hadoop

Skills Required:

Hands-on Pentaho developer with 8-10 years of experience

Pentaho background required; we are not investing in someone to learn this on the job, so at least some experience is needed

Financial industry knowledge will help

This is a hands-on developer role

What person will be doing:

Running 200+ jobs across value streams

Deep dives into code

Upgrading Pentaho

Could use someone who can help upgrade jobs based on the new version

Moving from Hadoop to Amazon S3

Production support

Cloud BC Labs
Newark, NJ
07193 USA