
ELIZA BOTOMANI

ebmalinda@gmail.com
(248)-818-2819
Summary
Highly motivated, solutions-driven professional with over 7 years of Data Warehousing experience in the areas of ETL design and development. Involved in the complete Software Development Life Cycle (SDLC) of various projects, including requirements gathering, system design, data modeling, ETL design and development, production enhancements, support and maintenance. Excellent interpersonal and communication skills with an ability to remain highly focused and self-assured in fast-paced, high-pressure environments.

Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage and Ascential DataStage.
Worked on DataStage tools like DataStage Designer, DataStage Director and DataStage
Administrator.
Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and
star/snowflake schema modeling.
Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for
data warehouses.
Developed parallel jobs using different processing stages like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply, Filter and FTP Enterprise.
Used Enterprise Edition/Parallel stages like Datasets, Change Data Capture, Row Generator and many other stages in accomplishing the ETL coding.
Familiar with highly scalable parallel processing infrastructure using parallel jobs and multiple-node configuration files.
Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and
scheduling tools.
Experience in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning and enhancements.
Knowledge of Erwin as a leading data modeling tool for logical (LDM) and physical (PDM) data models.
Extensive experience in design and development of Decision Support Systems (DSS).
Assisted in development efforts for Data marts and Reporting.
Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing,
Regression Testing, User Acceptance Testing (UAT) and Performance Testing.
Worked with various databases like Oracle 10g/9i/8i, DB2, SQL Server, Teradata.

Educational Qualification
Bachelor's in Computers, Oakland University
Technical Skills
ETL Tools: IBM InfoSphere DataStage 8.7, IBM InfoSphere DataStage 8.5, IBM InfoSphere DataStage 8.1 (Parallel & Server), IBM WebSphere DataStage 8.0.1 (Designer, Director, Administrator), Ascential DataStage 7.5.2 (Designer, Director, Administrator, Manager), Informatica 6.1
Database: Oracle 10g/9i/8i, IBM DB2/UDB, Teradata, SQL Server 2003/2005/2008
Data Warehousing: Star & Snowflake schema modeling, Fact and Dimensions, Physical and Logical Data Modeling, Erwin, Cognos
Operating Systems: Windows 7x/NT/XP, UNIX, LINUX, Solaris, MS-DOS, MS Access
Languages/Scripting: C, C++, Java, D2K, Visual Basic, PL/SQL, UNIX Shell scripts
Testing/Defect Tracking: HP Quality Center, Test Director, Bugzilla

Professional Experience
Wells Fargo, Minneapolis, MN

Feb 2013 - Current

Sr. ETL DataStage Developer

Responsibilities:

Involved in understanding of business processes and coordinated with business analysts to get specific
user requirements.
Helped in preparing the mapping document for source to target.
Extensively used DataStage Tools like Infosphere DataStage Designer, Infosphere DataStage Director for
developing jobs and to view log files for execution errors.
Involved in design and development of DataStage batch jobs for loading the data into Huntington's Customer Information System (CIS).
Experienced in developing parallel jobs using various Development/debug stages (Peek stage,
Head & Tail Stage, Row generator stage, Column generator stage, Sample Stage) and processing
stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, FTP, Remove
Duplicate Stage)
Used DataStage as an ETL tool to extract data from source systems and load the data into the SQL Server, DB2 and Oracle databases.
Developed job sequencer with proper job dependencies, job control stages, triggers.
Used the DataStage Director to schedule and run jobs, test and debug their components and monitor performance statistics.
Controlled jobs execution using sequencer, used notification activity to send email alerts.
Imported table/file definitions into the Datastage repository.
Participated in Datastage Design and Code reviews.
Worked in Agile/Scrum environment.
Involved in creating UNIX shell scripts for database connectivity and for executing queries as part of parallel job execution (a shell sketch follows this list).
Worked on programs for scheduling data loading and transformations using DataStage from DB2 to Oracle using SQL*Loader and PL/SQL.
Successfully implemented pipeline and partitioning parallelism techniques and ensured load
balancing of data.
Worked on Zena scheduler tool for scheduling the Datastage batch jobs.
Involved in performance tuning of the ETL process and performed data warehouse testing.
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for
unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
Prepared documentation including requirement specification.
Participated in weekly status meetings.
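
A minimal sketch of the kind of UNIX shell script referenced in the database-connectivity item above, assuming an Oracle target reached through sqlplus; the script name, environment variables and SQL file argument are hypothetical placeholders rather than actual project artifacts:

#!/bin/ksh
# run_presql.ksh (hypothetical) - run a SQL script against Oracle before a
# DataStage parallel job starts; exits non-zero on failure so the calling
# sequencer or scheduler can stop the load.
SQL_FILE=$1

sqlplus -s "${DB_USER}/${DB_PASS}@${DB_SID}" <<EOF
WHENEVER SQLERROR EXIT FAILURE
WHENEVER OSERROR EXIT FAILURE
@${SQL_FILE}
EXIT
EOF

if [ $? -ne 0 ]; then
    echo "Pre-load SQL ${SQL_FILE} failed" >&2
    exit 1
fi

A wrapper in this style keeps the connection details in environment variables set by the job environment rather than hard-coded in each DataStage job.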

Environment: IBM InfoSphere DataStage 8.7 (Parallel & Server), Oracle 10g, DB2, SQL Server 2008, PL/SQL, Flat files, XML files, Zena Scheduler.
L.A. Care, Lynwood, CA

Jun 2011 - Jan 2013

Sr. ETL DataStage Developer


Responsibilities:
Analyzed, designed, developed, implemented and maintained parallel jobs using IBM InfoSphere DataStage.
Involved in the design of the dimensional data model (Star schema and Snowflake schema).

Generated DB scripts from the data modeling tool and created the physical tables in the database.

Worked on SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.

Created some routines (Before-After, Transform function) used across the project.

Experienced in PX file stages that include Complex Flat File stage, DataSet stage, LookUp File
Stage, Sequential file stage.

Implemented Shared containers for multiple jobs and Local containers for the same job as per requirements.

Adept knowledge and experience in mapping source to target data using IBM DataStage 8.x.

Implemented multi-node declaration using configuration files (APT_Config_file) for performance enhancement (a configuration sketch follows this list).

Used DataStage stages, namely Hash File, Sequential File, Transformer, Aggregator, Sort, Datasets, Join, Lookup, Change Capture, Funnel, FTP, Peek and Row Generator, in accomplishing the ETL coding.

Debugged, tested and fixed the transformation logic applied in the parallel jobs.

Extensively used DS Director for monitoring job logs to resolve issues.

Experienced in using SQL*Loader and the import utility in TOAD to populate tables in the data warehouse.

Involved in performance tuning and optimization of DataStage mappings using features like
Pipeline and Partition Parallelism to manage very large volume of data.

Deployed different partitioning methods like Hash by column, Round Robin, Entire, Modulus, and Range
for bulk data loading and for performance boost.

Repartitioned the job flow after determining the best available DataStage PX resource consumption.

Created Universes and reports in Business Objects Designer.

Created, implemented, modified and maintained simple to complex business reports using the Business Objects reporting module.
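
A minimal sketch of the multi-node declaration referenced earlier in this list, assuming a two-node configuration on a single ETL server; the host name, directories and file path are hypothetical placeholders:

#!/bin/ksh
# make_2node_config.ksh (hypothetical) - write a two-node APT configuration
# file and point DataStage parallel jobs at it through APT_CONFIG_FILE.
cat > /etl/config/2node.apt <<'CFG'
{
  node "node1"
  {
    fastname "etl-server"
    pools ""
    resource disk "/etl/datasets/node1" {pools ""}
    resource scratchdisk "/etl/scratch/node1" {pools ""}
  }
  node "node2"
  {
    fastname "etl-server"
    pools ""
    resource disk "/etl/datasets/node2" {pools ""}
    resource scratchdisk "/etl/scratch/node2" {pools ""}
  }
}
CFG

export APT_CONFIG_FILE=/etl/config/2node.apt

Adding nodes to the file increases partition parallelism for existing jobs without changing the job designs, which is the performance lever the declaration is meant to provide.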

Environment: IBM InfoSphere DataStage 8.5, Oracle 11g, Flat files, Autosys, UNIX, Erwin, TOAD, MS SQL Server database, XML files, MS Access database.
Medical Graphics Corporation, Saint Paul, MN
Sr. DataStage Developer / Data Modeler

Jan 2010 - May 2011

MGC Diagnostics is a global medical technology company dedicated to cardiorespiratory health solutions. This singular focus guides its strategy and defines its commitment to customers, employees and shareholders. These attributes make it uniquely qualified to solve today's challenges and uncover solutions for tomorrow's opportunities.
Responsibilities:

Extensively used DataStage for extracting, transforming and loading databases from sources including
Oracle, DB2 and Flat files.
Collaborated with the EDW team on high-level design documents for the extract, transform, validate and load (ETL) process: data dictionaries, metadata descriptions, file layouts and flow diagrams.
Collaborated with the EDW team on low-level design documents for mapping the files from source to target and implementing business logic.
Generated surrogate keys for the dimension and fact tables for indexing and faster access of data in the Data Warehouse.
Tuned transformations and jobs for Performance Enhancement.
Extracted data from flat files and then transformed according to the requirement and Loaded into target
tables using various stages like sequential file, Look up, Aggregator, Transformer, Join, Remove
Duplicates, Change capture data, Sort, Column generators, Funnel and Oracle Enterprise.
Created Batches (DS job controls) and Sequences to control set of jobs.
Extensively used DataStage Change Data Capture for DB2 and Oracle files and employed change
capture stage in parallel jobs.
Executed Pre and Post session commands on Source and Target database using Shell scripting.
Collaborated in design testing using HP Quality Center.
Extensively worked on Job Sequences to Control the Execution of the job flow using various Activities &
Triggers (Conditional and Unconditional) like Job Activity, Wait for file, Email Notification, Sequencer,
Exception handler activity and Execute Command.
Collaborated in Extraction of OLAP data from SSAS using SSIS.
Extensively used SAP R/3 and SAP BW packs
Collaborated with BI and BO teams to find how reports are affected by a change to the corporate data
model.
Collaborated with BO teams in designing dashboards and scorecards for Analysis and Tracking of key
business metrics and goals.
Utilized Parallelism through different partition methods to optimize performance in a large database
environment.
Developed DS jobs to populate the data into staging and Data Mart.
Executed jobs through sequencer for better performance and easy maintenance.
Performed Unit testing for the jobs developed to ensure that they meet the requirements.
Developed UNIX shell scripts to automate file manipulation and data loading procedures.
Scheduled the jobs using AutoSys, Tivoli and Crontab.
Collaborated in developing Java Custom Objects to derive the data using Java API.
Responsible for daily verification that all scripts, downloads, and file copies were executed as planned,
troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.
Provided technical assistance and support to IT analysts and business community.

Environment: IBM InfoSphere DataStage and QualityStage 8.5 (Administrator, Designer, Director), IBM Information Analyzer 8.0.1a, Microsoft SQL Server 2005/2008, IBM DB2 9.1, AIX 6.0, Oracle 11g, Toad 9.5, Java, MS Access, SAP BW, SAP MDM, AS/400, shell scripts, PuTTY, WinSCP, Erwin 4.0, HP Quality Center, Tivoli, Crontab, AutoSys.
Prudential Financial, Newark, NJ

Nov 2008 - Dec 2009

ETL Developer
Prudential has branches worldwide; in order to keep track of the huge amounts of data generated, a data warehouse was developed which aided all levels of management in obtaining a clear perspective on the trend of the business. Budgeting for the company's needs and forecasting the company's business decisions were based on the reports produced using this data warehouse. The system was developed for analyzing and reporting on time-variant data. A database was maintained with the user account details and change requests on user accounts.
Responsibilities:

Provided Technical support to the team as the ETL developer. Addressed best practices and productivity
enhancing issues.
Worked on designing and developing QualityStage jobs.
Loaded data into load, staging and lookup tables. Staging area was implemented using flat files.
Created jobs in DataStage to import data from heterogeneous data sources like Oracle 9i, Text files and
SQL Server.
Generated surrogate IDs for the dimensions in the fact table for indexed and faster access of data in server jobs.
Extensively worked on Job Sequences to Control the Execution of the job flow using various Activities &
Triggers (Conditional and Unconditional) like Job Activity, Wait for file, Email Notification, Sequencer,
Exception handler activity and Execute Command.
Sliced and diced the input data based on business feedback and tested the system.
Designed data masking techniques to mask sensitive information when working with offshore teams.
Assisted Mapping team to transform the business requirements into ETL specific mapping rules.
Enhanced various complex jobs for performance tuning.
Responsible for version controlling and promoting code to higher environments.
Worked on Teradata optimization and performance tuning.
Performed Unit Testing, System Integration Testing and User Acceptance Testing.
Involved in ongoing production support and process improvements. Ran the DataStage jobs through third-party schedulers (a dsjob wrapper sketch follows this list).
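
A minimal sketch of the scheduler hand-off referenced above, assuming the jobs are started through the dsjob command-line interface; the project name, job argument handling and status mapping are hypothetical placeholders for how the actual wrapper scripts behaved:

#!/bin/ksh
# run_ds_job.ksh (hypothetical) - wrapper a third-party scheduler can call
# to run a DataStage job and propagate its finishing status.
PROJECT=DWH_PROJ
JOB=$1

# -run starts the job and -jobstatus waits for it to finish, returning an
# exit code that mirrors the job's finishing status.
dsjob -run -jobstatus "${PROJECT}" "${JOB}"
RC=$?

# Treat only "Finished OK" (status 1 in the versions used here, an assumed
# mapping) as success; anything else is reported back to the scheduler.
if [ ${RC} -ne 1 ]; then
    echo "DataStage job ${JOB} did not finish OK (status ${RC})" >&2
    exit 1
fi
exit 0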

Environment: Ascential DataStage 7.5.2 (Designer, Manager, Director, Administrator), Oracle 9i, TOAD,
SQL/PLSQL, Teradata, Erwin 4.0, UNIX (AIX).
Mellon Financial Corporation, Southfield, MI

Feb 2007 - Oct 2008

ETL DataStage Developer


Mellon Financial Corporation is a global financial services company that provides a wide range of services such as investment management, trust and custody, foreign exchange, securities lending, employee benefits consulting, outsourcing services for benefit plans, stock transfer, proxy solicitation, treasury management and banking services.
Responsibilities:

Worked on DataStage Designer, Manager, Administrator and Director.


Worked with the Business Analysts and the DBAs on requirements gathering, analysis, testing, metrics and project coordination.
Involved in extracting the data from different data sources like Oracle and flat files.
Involved in creating and maintaining Sequencer and Batch jobs.
Creating ETL Job flow design.
Used ETL to load data into the Oracle warehouse.

Created various standard/reusable jobs in DataStage using various active and passive stages like Sort, Lookup, Filter, Join, Transformer, Aggregator, Change Capture, Sequential File and DataSets.
Involved in development of Job Sequencing using the Sequencer.
Used Remove Duplicates stage to remove the duplicates in the data.
Used the Designer and Director to schedule and monitor jobs and to collect performance statistics.
Extensively worked with database objects including tables, views, indexes, schemas, PL/SQL packages,
stored procedures, functions, and triggers.
Created local and shared containers to facilitate ease of maintenance and reuse of jobs.
Implemented the underlying logic for Slowly Changing Dimensions.
Executed Pre and Post session commands on Source and Target database using Shell scripting.
Worked with Developers to troubleshoot and resolve issues in job logic as well as performance.
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for
unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.

Environment: IBM WebSphere DataStage 8.0.1, IBM AIX 5.2, Oracle 10g, XML files, Autosys, MS SQL Server database, sequential flat files, TOAD.
