
Mounica

Email: mona.gunta@yahoo.com | Phone: 682-220-9914

ETL/INFORMATICA DEVELOPER

SUMMARY:
7+ years of experience as an ETL Developer developing ETL processes for Data Warehouse/Data Migration using Informatica Power Center and SSIS (SQL Server Integration Services).
Extensive experience in the design and development of Data Warehouse applications, primarily in Oracle, using PL/SQL programming and IBM DataStage for ETL, Erwin for data modelling, and Perl/shell scripting for batch
processing with Unix/Linux as the solution environment.
Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, and Queryman), Teradata parallel support, and Perl and UNIX shell scripting.
Hands-on experience in data modelling (Erwin), data analysis, data integration, data mapping, ETL/ELT processes, and in applying dimensional data modelling (star schema) concepts in data warehouses.
Hands-on experience with various NoSQL databases such as HBase and MongoDB.
Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
Hands-on experience developing stored procedures, functions, views, triggers, and SQL queries using SQL Server and Oracle PL/SQL.
Data processing experience in designing and implementing Data Mart applications, mainly transformation processes, using the ETL tools Informatica Power Center and IDQ.
Experience with Oracle-supplied packages, dynamic SQL, records, and PL/SQL tables.
Prepared dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies.
Experience importing/exporting data between different sources such as Oracle, Access, and Excel using the SSIS/DTS utilities.
Responsible for translating requirements, business rules, and SQL into Ab Initio graphs.
Hands-on experience with BI tools: Informatica Power Center, Informatica Power Exchange, Cognos, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and MicroStrategy.
Expert-level skills in developing extraction, transformation, and loading (ETL) strategies using Talend Big Data 5.5/6.0 and Informatica Power Center 8.x/9.x.
Experience working with the Salesforce.com IDE, Data Loader, and Salesforce.com Sandbox environments.
Experience with industry software development methodologies such as Waterfall and Agile within the software development life cycle.
Experienced in extracting data from mainframe flat files and converting them into Teradata tables using SAS PROC IMPORT, PROC SQL, etc.
Strong experience in data warehousing and dimensional Star Schema and Snowflake Schema methodologies.
Expertise in using global variables, expressions, and functions for reports, with extensive experience handling sub-reports in SSRS.
Hands-on experience creating ETL transformations and jobs using the Pentaho Kettle (Spoon) designer and scheduling them on Pentaho BI Server.
Experience working with MS SQL Server, Oracle 11g, Oracle APEX, big data platforms, Oracle NoSQL, and Oracle Exadata.
Exposure to Informatica Cloud Services.
Strong in SQL, T-SQL, PL/SQL, SQL*Loader, SQL*Plus, MS SQL, and Pro*C.
Experience with the ODI ETL tool; familiar with version 12c.
Participated in business requirements, use case, Agile stand-up, and defect triage meetings.
Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
Hands-on experience in big data technologies such as Hadoop, Hive, HBase, MapReduce, and HDFS.
TECHNICAL SKILLS:
TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 10/9.6/9.1/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, Power Connect, Power Exchange, Informatica Power Mart 6.2/5.1.2/5.1.1/5.0/4.7.2, Informatica Web Services, Informatica MDM 10.1/9.x, Oracle Data Integrator (ODI) 12c/11g, OBIA/BI Apps 11g/7.9.6.x/7.9.5, Oracle Warehouse Builder (OWB), Informatica CDC, Informatica BDE, Informatica B2B DX/DT (version 10), SQL*Loader, Informatica On Demand (IOD), Flat Files (fixed-width, CSV, tilde-delimited, XML), IDQ, IDE, Data Transformation Services (DTS), Exadata, Metadata Manager, MS SQL Server Integration Services (SSIS)
Dimensional Data Modelling: Dimensional Data Modelling, Star Schema Modelling, Snowflake Modelling, Fact and Dimension Tables, Physical and Logical Data Modelling
Scheduling Tools: OBIEE DAC, Autosys, Tidal, Control-M, Puppet, Chef
Reporting Tools: SSRS, Business Intelligence Tools, Tableau, Power BI, Cognos 8, MS Access
Databases and related tools: Oracle 10g/9i/8i/8/7.x, MS SQL Server 2000/7.0/6.5, Teradata, Netezza, Amazon S3, Vertica, Sybase ASE, PL/SQL, T-SQL, NoSQL, TOAD 8.5.1/7.5/6.2, DB2 UDB, Amazon Redshift, Red Hat Enterprise Linux
Languages: SQL, PL/SQL, SQL*Plus, C, Dynamic SQL, C#, Java; working knowledge of UNIX shell scripting and Perl scripting
Web Technologies: HTML, XHTML, XML
Operating Systems: Microsoft XP/NT/2000/98/95, UNIX, Sun Solaris 5
Cloud Technologies: AWS, Azure, Informatica Cloud

PROFESSIONAL EXPERIENCE:

Datafaction, Los Angeles, CA [Oct 2015 - Present]


Senior ETL/Informatica Developer

RESPONSIBILITIES:
Developed workflows using Informatica Power Center tools (Task Developer, Worklet Designer, and Workflow Designer) in Workflow Manager and monitored the results using Workflow Monitor.
Loaded consignment-related data into the EDW based on business rules; this data is extracted from the SAP source system.
Based on requirements, developed the source-to-target mapping document with business rules and developed the ETL specification documentation.
Designed and developed methodologies to migrate multiple development/production databases from Sybase to Oracle 11g.
Implemented and developed incremental ETL mappings.
Wrote heavy stored procedures using dynamic SQL to populate data into temp tables from fact and dimension tables for reporting purposes (see the sketch after this list).
Implemented a Talend POC to extract data from the Salesforce API as XML objects and .csv files and load the data into a SQL Server database.
Technologies in use included Provia Viaware (WMS), EDI (Trusted Link), JD Edwards, iSQL, Oracle, SQL, UNIX, cron (job scheduler), and shell scripting.
Tuned the performance of mappings by following Informatica best practices and applied several methods to improve performance by decreasing workflow run times.
Created logical and physical data models using Erwin, process flow diagrams and data mapping documents.
Extracted data from Teradata, big data sources, Oracle NoSQL, and Oracle Exadata databases.
Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
Wrote Hive table scripts over AWS S3 for the L1, L2, and L3 layers.
Excellent knowledge of HIPAA standards, EDI (Electronic Data Interchange), EDIFACT, HIPAA code sets, ICD-9 and ICD-10 coding, and HL7.
Extensively used Perl scripts to edit the XML files and calculate line counts according to the client's needs.
Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response time for users.
Involved in data integration and migration using the Salesforce.com Apex Data Loader and the web-based import wizard.
Used Informatica Power Center 9.6.1 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
Developed client-server applications using Oracle APEX 4.2.
Developed processes that conduct service calls through APIs which interface with applications on the cloud.
Wrote PL/SQL scripts for pre- and post-session processes and to automate daily loads.
Two years of expertise developing applications and batch processes using Perl, shell scripting, and JavaScript.
Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
Wrote shell scripts to run workflows in the UNIX environment.
Modified several of the existing mappings based on the user requirements and maintained existing mappings,
sessions and workflows.
Used Informatica B2B Data Exchange to process structured data such as XML.
Loaded and extracted data from multiple cloud applications.
Installed and configured Pentaho BI Server 3.6/3.7 on Red Hat Linux and Windows Server.
Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
Developed mappings to create the EDI 834 file format using Informatica XSDs.
Provided application maintenance and support for Viaware (WMS), JD Edwards (World Software), and the EDI application.
Prepared SQL queries to validate the data in both source and target databases (see the validation example after this list).
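
A minimal PL/SQL sketch of the dynamic-SQL reporting pattern mentioned above (populating a temp table from fact and dimension tables). The procedure, table, and column names (load_report_temp, fct_sales, dim_product) are hypothetical placeholders, not the actual project schema.

    -- Hypothetical sketch: populate a reporting temp table from fact/dimension
    -- tables using dynamic SQL so the target table name can vary per report run.
    CREATE OR REPLACE PROCEDURE load_report_temp (p_temp_table IN VARCHAR2,
                                                  p_load_date  IN DATE) AS
      v_sql VARCHAR2(4000);
    BEGIN
      -- Clear the previous run; DBMS_ASSERT validates the identifier passed in.
      EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || DBMS_ASSERT.simple_sql_name(p_temp_table);

      -- Join fact and dimension tables and insert the reporting slice.
      v_sql := 'INSERT INTO ' || DBMS_ASSERT.simple_sql_name(p_temp_table) ||
               ' (product_name, sale_amt, load_date)
                 SELECT d.product_name, SUM(f.sale_amt), :dt
                   FROM fct_sales f
                   JOIN dim_product d ON d.product_key = f.product_key
                  WHERE f.sale_date = :dt
                  GROUP BY d.product_name';
      EXECUTE IMMEDIATE v_sql USING p_load_date, p_load_date;
      COMMIT;
    END load_report_temp;
    /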
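
And a simple example of the kind of source-to-target validation query mentioned in the final bullet of this list; the table names and the database link (src_db) are illustrative assumptions.

    -- Hypothetical reconciliation check: compare row counts and amount totals
    -- between the source table (over a DB link) and the warehouse target.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(sale_amt) AS total_amt
      FROM sales_src@src_db
     WHERE sale_date = TRUNC(SYSDATE) - 1
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(sale_amt)
      FROM fct_sales
     WHERE sale_date = TRUNC(SYSDATE) - 1;

    -- Rows present in the source but missing from the target, by natural key.
    SELECT order_id FROM sales_src@src_db WHERE sale_date = TRUNC(SYSDATE) - 1
    MINUS
    SELECT order_id FROM fct_sales WHERE sale_date = TRUNC(SYSDATE) - 1;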

ENVIRONMENT: Informatica Power Center 10/9.6, EBS, Informatica BDE, Hive 2.7, HL7, Teradata 12, SSRS, Oracle 11g/10g, PL/SQL, Jitterbit, Perl scripting, SSAS, Autosys, TOAD 9.x, Oracle Financials, shell scripting, Python, Dynamic SQL, Oracle SQL*Loader, SSIS 2008, Sun Solaris UNIX, OBIEE, Windows XP.

Toyota Financial, Torrance, CA [January 2015 - Sep 2015]


ETL Developer

RESPONSIBILITIES:
Developed mappings/sessions using Informatica Power Center 8.6.1 for data loading.
Used PL/SQL procedures in Informatica mappings to truncate data in the target tables at run time.
Created high-level and detailed design documents and was involved in creating the ETL functional and technical specifications.
Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
Wrote UNIX shell scripts and PMCMD commands for FTP of files from the remote server and for backup of the repository and folders.
Developed a number of Ab Initio graphs for the ETL processes based on business requirements using various Ab Initio components.
Optimized ELT and ETL processes using columnar MPP techniques.
Imported Cognos reports into Excel, formatted them using Cognos 8 Go! Office, and scheduled them.
Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
Worked extensively on PL/SQL as part of the process to develop several scripts to handle different scenarios.
Integrated MS SQL Server using Jitterbit to pump the subscriber data downstream from Salesforce.
Created Test cases for the mappings developed and then created integration Testing Document.
Used the advanced features of PL/SQL like Records, Tables, Object types.
Created programming code using advanced concepts of Records, Collections, and Dynamic SQL.
Process-mapped existing EDI technical and business processes and proposed automation areas and "to be" processes.
Automated the Informatica jobs using UNIX shell scripting.
Successfully loaded data into different targets from various source systems such as Oracle databases, flat files, XML files, etc.
Involved in the sign-off of the SOW and MPP plan by the stakeholders.
Designed the physical model and ETLs to source data from current systems.
Designed a flexible ETL process to run as an incremental load or a full load with minimal tweaks (see the sketch after this list).
Developed Mappings, Sessions and Workflows to extract, validate, and transform data according to the business
rules.
Created Oracle 11g databases and replicated Sybase schema objects to Oracle.
Troubleshot PL/SQL procedures and functions to support the corresponding Sybase functionality.
Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
Designed and developed ETL jobs using various DataStage stages, depending on the scenario.
Created ODI packages and scenarios using interfaces, variables, and procedures.
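
A rough SQL sketch of the incremental-versus-full-load design mentioned above, expressed as a control-table pattern rather than the actual Informatica mapping; the etl_load_control, stg_customer, and dw_customer names are hypothetical.

    -- Hypothetical pattern: the same MERGE serves full and incremental runs;
    -- only the watermark stored in ETL_LOAD_CONTROL changes.
    -- Full load: reset last_load_ts to a date earlier than all source data.
    -- Incremental load: last_load_ts holds the previous run's high-water mark.
    MERGE INTO dw_customer tgt
    USING (
      SELECT c.customer_id, c.customer_name, c.status, c.updated_ts
        FROM stg_customer c
       WHERE c.updated_ts > (SELECT last_load_ts
                               FROM etl_load_control
                              WHERE job_name = 'CUSTOMER_LOAD')
    ) src
    ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN UPDATE
      SET tgt.customer_name = src.customer_name,
          tgt.status        = src.status,
          tgt.updated_ts    = src.updated_ts
    WHEN NOT MATCHED THEN INSERT (customer_id, customer_name, status, updated_ts)
      VALUES (src.customer_id, src.customer_name, src.status, src.updated_ts);

    -- Advance the watermark after a successful run.
    UPDATE etl_load_control
       SET last_load_ts = (SELECT MAX(updated_ts) FROM stg_customer)
     WHERE job_name = 'CUSTOMER_LOAD'
       AND EXISTS (SELECT 1 FROM stg_customer);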

ENVIRONMENT: Informatica Power Center 9.1/8.6.1, Workflow Manager, Workflow Monitor, Informatica Power Connect/Power Exchange 9.6, Dynamic SQL, Informatica On Demand (IOD), SSIS/SSRS 2008, Control-M, Data Analyzer 8.1, Task Factory, Oracle BI Apps 7.9.6.1, Hyperion, EDI, Power BI 2013, Oracle Data Integrator 10.1.3.5, Informatica B2B, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Cognos 8.

XO Communications, Dallas, TX [March 2014 - December 2014]


IDQ ETL Developer

RESPONSIBILITIES:
Profiled and analyzed data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency
and performance.
Performed data profiling, cleansing, and standardization using IDQ and integrated it with the Informatica suite of tools.
Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
Extracted data from the Oracle database and spreadsheets, staged it in a single place, and applied business logic to load it into the central Oracle database.
Created connections in Informatica Power Center to retrieve data from different sources using the Java API.
Worked on Cognos 10 (Business Insight and Active Reports) for two demo projects to demonstrate the new features available in Cognos 10 to end users, and built portal pages for them.
Worked on performance tuning of programs, ETL procedures and processes.
Involved in Dimensional modelling (Star Schema) of the Data warehouse and used Erwin to design the business
process, dimensions and measured facts.
Automated the onboarding of new external data sources and trading partners.
Automated the processing of unstructured data formats (PDF, XLS, DOC, etc.).
Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
Identified and eliminated duplicates in datasets through the IDQ 8.6.1 Edit Distance, Jaro Distance, and Mixed Field Matcher components; this enables a single view of customers and helps control mailing-list costs by preventing multiple pieces of mail (a rough SQL analogue follows this list).
Involved in developing an application using SQL and wrote queries to test the data sent through the API.
Configured UNIX shell scripts for executing DataStage jobs.
Used Informatica Power Center for extraction, transformation and load (ETL) of data in the data warehouse.
Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
Developed complex mappings in Informatica to load the data from various sources.
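
The duplicate elimination above was done with IDQ matching components; purely as an illustration, a rough SQL analogue using Oracle's UTL_MATCH package against a hypothetical customer staging table might look like this.

    -- Hypothetical illustration (not the IDQ implementation): flag likely
    -- duplicate customers by fuzzy-matching names and addresses.
    SELECT a.customer_id AS kept_id,
           b.customer_id AS duplicate_candidate_id,
           UTL_MATCH.edit_distance_similarity(a.full_name, b.full_name)      AS name_score,
           UTL_MATCH.jaro_winkler_similarity(a.address_line, b.address_line) AS addr_score
      FROM stg_customer a
      JOIN stg_customer b
        ON a.customer_id < b.customer_id          -- compare each pair only once
       AND a.zip_code = b.zip_code                -- cheap blocking key
     WHERE UTL_MATCH.edit_distance_similarity(a.full_name, b.full_name) >= 90
       AND UTL_MATCH.jaro_winkler_similarity(a.address_line, b.address_line) >= 85;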

ENVIRONMENT: Informatica Power Center 9.0/8.x/7.1.3, Informatica Data Quality, Informatica MDM, Power Exchange 8.6.1, Data Explorer 5.0, Oracle 10g/9i, PL/SQL, SSIS/SSRS, Toad 10.5/9.5, Cognos 8.4, Power BI, Puppet, Windows XP Pro and AIX UNIX, PVCS, Tidal, Magic ticket management, SQL Server, Teradata SQL Assistant, Teradata external loaders (TPump, MLoad).
Vivimed Labs, Hyderabad, India [January 2012-October 2012]
ETL Developer

RESPONSIBILITIES:
Studied the existing OLTP system(s) and created the facts, dimensions, and star schema representation for the data mart (see the DDL sketch after this list).
Used Informatica Power Center for extraction, transformation, and loading (ETL) of data from heterogeneous source systems.
Developed PL/SQL procedures to process business logic in the database and used them via the Stored Procedure transformation.
Used workflow monitor to monitor the jobs, reviewed session/workflow logs that were generated for each
session to resolve issues, used Informatica debugger to identify issues in mapping execution.
Re-engineered many existing mappings to support new and changing business requirements.
Monitored production jobs on a daily basis and worked on issues relating to the job failure and restarted failed
jobs after correcting the errors.
Designed and developed ETL Processes based on business rules, job control mechanism using Informatica Power
Center.
At Vivimed Labs, worked on a CRM application called Electronic Medical Record (EMR).
Worked extensively on complex mappings using source qualifier, joiner, expressions, aggregators, filters, Lookup,
update strategy, stored procedure transformations, etc.
Involved in requirements gathering, functional/technical specification, Designing and development of end-to-end
ETL process for Sales Data Warehouse.
Developed reusable transformations, mapplets, sessions, and worklets to make the Informatica code modular and reusable as required.
Performance-tuned SQL statements and Informatica mappings, and used Informatica parallelism options to speed up data loading.
EMR is healthcare software that helps maintain daily patient records and data from Excel sheets.
Used Informatica Power Center 9.1 to make changes to the existing ETL mappings in each of the environments.
Collaborated with Project Manager, Tech Lead, Developers, QA teams and Business SMEs to ensure delivered
solutions optimally support the achievement of business outcomes.
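
An illustrative DDL sketch of the kind of star schema referred to at the top of this list; the patient/visit tables and columns are hypothetical examples, not the actual data mart model.

    -- Hypothetical star schema: one fact table keyed to two dimensions.
    CREATE TABLE dim_patient (
      patient_key   NUMBER       PRIMARY KEY,  -- surrogate key
      patient_id    VARCHAR2(20),              -- natural key from the source system
      patient_name  VARCHAR2(100),
      date_of_birth DATE
    );

    CREATE TABLE dim_date (
      date_key      NUMBER       PRIMARY KEY,  -- e.g. 20120115
      calendar_date DATE,
      month_name    VARCHAR2(10),
      year_num      NUMBER(4)
    );

    CREATE TABLE fct_visit (
      visit_id      NUMBER       PRIMARY KEY,
      patient_key   NUMBER       REFERENCES dim_patient (patient_key),
      date_key      NUMBER       REFERENCES dim_date (date_key),
      visit_charge  NUMBER(12,2)               -- additive measure
    );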

ENVIRONMENT: Informatica Power Center 9.1, Repository Manager, Designer, Workflow Manager, Oracle 11g, SQL Server, Teradata, XML files, flat files, CSV files, PL/SQL (stored procedures, triggers, packages), Windows.

Atos, Pune, India [September 2008 - December 2011]


DW/ETL Developer

RESPONSIBILITIES:
Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router,
Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
Implemented out-of-the-box analytics reporting functionality, successfully implemented OBIEE Delivers, configured proactive agents (iBots), and configured interactive dashboards to alert the business/field users as per the requirements.
Worked with complex mappings having an average of 15 transformations.
Created and scheduled sessions and jobs to run on demand, on time, or only once.
Monitored Workflows and Sessions using Workflow Monitor.
Designed and developed ETL jobs using various DataStage stages, depending on the scenario.
Created complex mappings involving Slowly Changing Dimensions, implementation of business logic, and capturing the deleted records in the source systems (see the SCD sketch after this list).
Worked on SSIS packages and DTS import/export for transferring data from databases (Oracle and text-format data) to SQL Server.
Used UNIX and shell scripting extensively to enhance the Perl scripts and to develop, schedule, and support Control-M batch jobs that schedule the data generation and reporting.
Involved in design, development and maintenance of database for Data warehouse project.
Involved in Business Users Meetings to understand their requirements.
Developed the OBIEE repository with all three layers: the Physical layer, the Business Model and Mapping layer, and the Presentation layer.
Worked extensively with the connected lookup Transformations using dynamic cache.
Performed unit testing, integration testing, and system testing of Informatica mappings, and coded PL/SQL scripts.
Wrote UNIX scripts, Perl scripts for the business needs.
Coded UNIX scripts to capture data from different relational systems into flat files for use as source files for the ETL process.
Created universes and generated reports using the star schema.
Used Perl and shell scripts to invoke stored procedures for data loads, computation, and report generation.
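
A plain-SQL illustration of the Slowly Changing Dimension handling mentioned in this list (the actual mappings were built in Informatica): a two-step Type 2 pattern against hypothetical stg_customer and dim_customer tables.

    -- Hypothetical SCD Type 2 sketch: expire the current row when an attribute
    -- changes, then insert a fresh version with an open-ended effective range.
    UPDATE dim_customer d
       SET d.end_date    = TRUNC(SYSDATE) - 1,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name OR s.city <> d.city));

    INSERT INTO dim_customer (customer_key, customer_id, customer_name, city,
                              start_date, end_date, current_flg)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.city,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flg = 'Y');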

ENVIRONMENT: OBIEE 10.1.3.2, Informatica Power Center 8.1, Oracle 11g, SSIS/SSRS 2005, UNIX.
