
Aditya Vasista

Ab Initio Developer - Kaiser


Email me on Indeed: indeed.com/r/Aditya-Vasista/a3496da57574792d
Over six years of professional experience in the information technology field, spanning database packages and data warehousing applications.
Four years of experience as an Ab Initio developer building Extraction, Transformation and Loading (ETL) processes.
Knowledge of full life cycle development methodologies (Waterfall, Agile, etc.) for building a data warehouse.
Good command of Ab Initio air commands.
Worked extensively with the various program component sets (Partition, Transform, De-Partition, Miscellaneous, etc.) as well as Database and Dataset components.
Experience with the Ab Initio Co>Operating System, application tuning, and debugging.
Good knowledge of Ab Initio parallelism techniques; implemented a number of graphs using parallelism and Multi File System (MFS) techniques (see the sketch below).
Experience with relational databases: Oracle and SQL Server (SQL and PL/SQL).
A team player with strong technical, analytical, and leadership skills.
Able to communicate with all levels of corporate personnel.
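
As an illustration of the MFS technique noted above, a four-way multifile system might be created with the m_mkfs utility roughly as follows (a sketch only: exact m_mkfs syntax varies by Co>Operating System version, and the host and paths are hypothetical):

# Create a 4-way multifile system: the first URL names the control
# directory, the remaining URLs name the data partitions.
m_mkfs //etlhost/u01/mfs/mfs4 \
       //etlhost/d1/mfs/p0 \
       //etlhost/d2/mfs/p1 \
       //etlhost/d3/mfs/p2 \
       //etlhost/d4/mfs/p3

Graphs reading or writing files under the mfs4 directory then run with four-way data parallelism.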

Work Experience

Ab Initio Developer
Kaiser Permanente, CA
December 2011 to Present
This is a data warehouse package for the health care industry that deals with providers (doctors) and customers. An extract file is created from the pharmacy system of all regions, and the flat files are loaded to the data staging area and transferred from staging to the EDW. This data is then available in the Panel Support Tool for reporting, analysis, or data mart loads. The pharmacy data contains details on sold prescriptions, associated allergies, associated signatures, and drug product elements. It also details pharmacy locations, pharmacy products, and the pharmacy doctors.
Responsibilities:
Involved in system study, business requirements analysis, and documentation.
Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse from legacy systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.
Worked with various components like Reformat, Join, Filter by Expression, Sort, Rollup, and Scan to transform the data.
Replicated operational data into the staging area, transformed and loaded data into warehouse tables using the Ab Initio GDE, and was responsible for automating the ETL process through scheduling and exception-handling routines.
Used database query optimization and I/O tuning techniques for performance enhancements.
Generated database configuration (.dbc) files for the source and target databases, which are used in Ab Initio graphs.
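
A sketch of a Korn shell snippet that might generate such a file for an Oracle source (the field names are typical .dbc entries but vary by Co>Operating System version; every value shown is hypothetical):

#!/bin/ksh
# Write the source-database .dbc consumed by the extract graphs.
cat > src_oracle.dbc <<EOF
dbms: oracle
db_version: 10.2
db_home: $ORACLE_HOME
db_name: SRCDB
user: etl_user
password: ********
EOF
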
Developed graphs to extract Historical and Incremental Data.
Responsible for extracting daily text files from an FTP server and historical data from DB2 tables, cleansing the data, applying transformation rules, and loading to the staging area.
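
A minimal sketch of such a daily FTP pull (server name, credentials, and paths are all hypothetical):

#!/bin/ksh
# Fetch the daily pharmacy extract from the FTP server into the landing area.
RUN_DATE=$(date +%Y%m%d)
ftp -n ftp.example.com <<EOF
user $FTP_USER $FTP_PASS
binary
cd /outbound/pharmacy
get pharmacy_$RUN_DATE.txt /data/landing/pharmacy_$RUN_DATE.txt
bye
EOF
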

Responsible for creating parameterized graphs, which were used to create common DMLs and XFRs during the data transformation phase, prior to creating load-ready files.
Divided Ab Initio graphs into phases with checkpoints to safeguard against failures.
Extensively wrote and used UNIX shell scripts as wrapper scripts.
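
A minimal sketch of such a wrapper, assuming a deployed graph script named load_pharmacy.ksh (all names and addresses are hypothetical):

#!/bin/ksh
# Run the deployed graph, capture its log, and alert support on failure.
GRAPH=$AI_RUN/load_pharmacy.ksh
LOG=/data/logs/load_pharmacy.$(date +%Y%m%d).log

$GRAPH > $LOG 2>&1
rc=$?
if [ $rc -ne 0 ]; then
    # A non-zero exit also lets the scheduler mark the job as failed.
    mailx -s "load_pharmacy failed (rc=$rc)" etl-support@example.com < $LOG
    exit $rc
fi
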
Developed various graphs for data cleansing using Ab Initio functions like is_valid, is_error, and is_defined, plus various string and math functions.
Developed the Ab Initio ETL process, interpreted the transformation rules for all target data objects, and developed the software components to support the transformations, as well as estimating task durations.
Responsible for consolidating customer records from various sources to create a master customer list.
Environment: Ab Initio (GDE 3.0, Co>Operating System 3.0), UNIX and SQL.

Ab Initio Developer
Home Depot, GA, USA
April 2011 to December 2011
Home Depot is one of the nation's largest home improvement retailers, with operations spanning many states. The DSS sources information from three different data sources: PIMS (Product Information Management System), CIMS (Customer Information Management System), and CCAS (Customer Contact Administrative System). This application provides data on product sales in the US, and reports were generated on the various product groups, product promotions, and the market share of competitors' product groups.
Responsibilities:
Understood the business requirements through extensive interaction with users and reporting teams, and assisted in developing the low-level design documents.
Used the Ab Initio ETL tool to design and implement Extract, Transform, and Load processes.
Replicated operational tables into staging tables, transformed and loaded data into warehouse tables using the Ab Initio GDE, and was responsible for automating the ETL process through scheduling and exception-handling routines.
Designed and developed Ab Initio graphs using different Ab Initio components, based on the business requirements.
Developed complex Ab Initio XFRs to derive new fields and satisfy rigorous business requirements.
Responsible for validating the different sources using Ab Initio functions.
Worked with Ab Initio components to create summary tables using the Rollup and Scan components.
Converted user-defined functions and complex business logic of an existing application process into Ab Initio graphs using components such as Reformat, Join, Transform, Sort, and Partition to facilitate the subsequent loading process.
Responsible for deploying Ab Initio graphs and running them through the Co>Operating System.
Used the Enterprise Meta Environment (EME) to perform the necessary modifications.
Improved the performance of Ab Initio graphs by using various performance techniques, such as lookups (instead of joins), in-memory joins, and rollups.
Generated error, reject, and log ports to observe and understand component behavior.
Gained good experience in Korn shell scripting.
Responsible for writing SQL queries to retrieve data.
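
For example, a validation query might be run from a Korn shell script like this (connection details and table name are hypothetical):

#!/bin/ksh
# Row counts per product group in staging, used to reconcile a load.
sqlplus -s $DB_USER/$DB_PASS@$DB_NAME <<EOF
SELECT product_group, COUNT(*) AS row_count
FROM stg_product_sales
GROUP BY product_group;
EXIT
EOF
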
Tested the Ab Initio graph applications.
Interacted extensively with the developers and BAs.
Environment: Ab Initio (GDE 1.16, Co>Operating System 2.16), UNIX, Korn Shell Scripting and SQL.

Ab Initio Developer
Sprint-Nextel Corp
September 2010 to April 2011

The project's main aim is to serve advertisements on mobile phones. The customer feed is processed using Ab Initio and sent to the Ad Server, which serves the advertisements to the mobile phones.
Responsibilities:
Involved in the functional and detailed design of the project.
Gained experience in working with the EME and project management.
Processed and transformed delta feeds of customer data, which arrive daily.
Developed graphs to unload data into tables and validate the new data against the already existing data.
Wrote several shell scripts for project maintenance (to remove old/unused files and move raw logs to the archives).
Wrote shell scripts to download unprocessed raw logs and upload processed logs to the client server.
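
A sketch of the kind of housekeeping these maintenance scripts performed (retention windows and paths are hypothetical):

#!/bin/ksh
# Move raw logs older than 7 days to the archive; purge archives after 30 days.
RAW_DIR=/data/rawlogs
ARCHIVE_DIR=/data/archive
find $RAW_DIR -name '*.log' -mtime +7 -exec mv {} $ARCHIVE_DIR \;
find $ARCHIVE_DIR -name '*.log' -mtime +30 -exec rm -f {} \;
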
Developed dynamic graphs to load data from data sources into tables and to parse records.
Made extensive use of the multifile system, with data partitioned into four partitions for parallel processing.
Made wide use of lookup files when getting data from multiple sources where the size of the data is limited.
Used Ab Initio for error handling by attaching error and reject files to each transformation, making provision for capturing and analyzing the messages and data separately.
Tuned Ab Initio graphs for better performance.
Environment: Ab Initio (Co>Operating System 2.16, GDE 1.16), UNIX, Oracle, EME and Windows.

Ab Initio Developer
Sun Technologies
December 2008 to August 2010
Phase 1 is the first of four phases of a tool suite that will include sales definitions of customers, potential entry, sales segmentation, and territory alignments. Functionality includes integration of web-based alignments, with account, entity, facility, and territory building blocks that support the alignment rules and functionality currently implemented for sales, allowing retirement of the existing alignment system; a utility to centrally store and manage alignment files by legal entity, sector, segment, account, territory, etc.; and mass upload of alignment entries and changes.
Responsibilities:
Developed source data profiling and analysis: reviewing data content and metadata facilitated data mapping and validated assumptions that were made in the business requirements.
Created the mini specs for different applications.
Involved in reviewing the data analysis and best practices.
Developed various Ab Initio graphs for validation using the Data Profiler, comparing the current data with the previous month's data.
Used different Ab Initio components such as Partition by Key, Sort, Rollup, Scan, Reformat, Join, and Fuse in various graphs.
Also used components like Run Program and Run SQL to run UNIX and SQL commands from within Ab Initio.
Wrote several UNIX control scripts, specific to the application, to pass the environment variables.
Responsible for extracting daily text files from an FTP server and historical data from DB2 tables, cleansing the data, applying transformation rules, and loading to the staging area.
Used Ab Initio as the ETL tool to pull data from source systems and to cleanse, transform, and load the data into databases.
Involved in design best practices as well as coding and documentation best practices.
Wrote the .dbc files for the development, testing, and production environments.
Expertise in unit testing and system testing: using sample data, generating and manipulating data, and verifying the functionality, data quality, and performance of graphs.
Performed transformations of source data with transform components like Join, Match Sorted, Dedup Sorted, Denormalize Sorted, Reformat, and Filter by Expression.
Made wide use of lookup files when getting data from multiple sources where the size of the data is limited.
Developed UNIX Korn shell scripts to automate job runs and support the redaction infrastructure, and SQL and PL/SQL procedures to load the data into the database.
Involved in Production implementation best practices.
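
As an illustration, a PL/SQL load procedure of the kind mentioned above might be invoked from a job-run script as follows (package, procedure, and connection names are hypothetical):

#!/bin/ksh
# Kick off the PL/SQL load once the graph has finished.
sqlplus -s $DB_USER/$DB_PASS@$DB_NAME <<EOF
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  alignment_load_pkg.load_alignments(p_run_date => SYSDATE);
END;
/
EXIT
EOF
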
Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), UNIX, Oracle 10.x, SQL Server 2005, Teradata.

Developer
InfoVision Solutions Pvt. Ltd
June 2007 to December 2008
The Asset Tracking and Management System (ATMS) is a software solution for small-scale industries, IT companies, etc. The core objective of the product is the effective management of asset tracking, monitoring, and auditing. The ATMS project has four main modules, which are interconnected for tracking fixed assets. The ATMS user management system manages users by assigning roles and privileges. Users of the ATMS application manage the assets: the application captures fixed-asset information and generates a barcode, which is used to label each asset, and assets are then scanned using barcode, RFID, or mobile scanners. The system also provides facilities such as asset loan, asset transfer, asset insurance, asset depreciation, asset utilization, and asset physical verification.
Responsibilities:
Developed Web Forms using C# and ASP.NET.
Developed Cascading Style Sheets (CSS) for User Interface uniformity throughout the application.
Involved in creating the business layer and data service layer using patterns such as the Singleton and Factory patterns.
Involved in creating the data service layer using the Enterprise Library 2.0 Data Access Application Block.
Developed T-SQL and stored procedures using SQL Server 2005.
Used JavaScript and AJAX to validate client data.
Extensively used GridViews with sorting and paging, and exported GridView data to Excel and PDF documents.
Provided Production support.

Software Engineer
Human Resources (HR)
May 2006 to June 2007
The project was to develop various modules, i.e., Human Resources (HR), Order Entry (OE), Product Media (PM), and Sales History (SH), and to integrate them. It was also required to generate reports in specified formats for a given requirement (e.g., Inventory Position) using Oracle Business Intelligence Publisher (BI Publisher). A SQL query is written to retrieve the data from the various tables; an RTF (Rich Text Format) template is then created in Microsoft Office Word according to the design layouts and uploaded to Oracle BI Publisher for generation of the required reports.
Responsibilities:
Requirement Analysis.
Created stored procedures for Order Entry module.
Performed unit testing of the OE module.
Part of the Report Generation team.
Involved in the complete cycle of report generation.
Responsible for creating the query using the query builder of Oracle BI Publisher.
Responsible for developing the templates according to the required design layouts.
Contributed technical documents regarding the reports.

Environment: Oracle Client, PL/SQL Developer, Oracle BI Publisher, Microsoft Office Word (2003)

Virtusa Consultancy Services, India
2005 to 2005
Environment: Ajax Control Tool Kit, SQL Server 2005, Windows XP.

Education

Bachelor of Engineering
Anna University

Additional Information

Skills:
Database: Oracle, DB2, SQL Server and Teradata
Languages: SQL, PL/SQL, C#
Operating Systems: Windows XP/Vista/7, UNIX (HP-UX, AIX, Sun Solaris), MS-DOS, Linux
Data Warehouse Tools: Ab Initio (Co>Operating System 2.15, 2.16 and 3.0; GDE 1.15, 1.16 and 3.0), EME,
Informatica (7.0/8.0)
