
Rajesh Kumar Reddy B Mobile: +91-6364228947

Mail: rajesh.nirmalab4@gmail.com

Professional Summary:

• 5 years of experience in data warehousing as an ETL Developer using IBM Data
Stage, and in software testing covering ETL/DWH/BI testing and manual testing
• Extensive knowledge on Data Warehousing Concepts and ETL process
• Proficient in developing strategies for Extraction, Transformation and Loading (ETL)
mechanisms.
• Hands-on experience with various stages such as Join, Merge, Remove Duplicates, Filter,
Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, Sequential File,
Sort, Funnel, Peek, and Row Generator.
• Experienced in scheduling Sequence and parallel jobs using Data Stage Director
and scheduling tools.
• Familiar with building highly scalable parallel processing infrastructure using parallel jobs
and multi-node configuration files.
• Extensive experience in Unit Testing, Functional Testing, System Testing, Integration
Testing, Regression Testing, User Acceptance Testing (UAT), and Performance
Testing.
• Experienced with databases such as Oracle, Netezza, and MS SQL Server
• Work experience on projects in the Telecom, Finance, and Sales domains
• Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and
identifying data mismatch
• Experienced with Waterfall and Agile (SDLC) methodologies and the STLC
process
• Familiar with HP ALM for storing business requirements, test cases, test
runs, and test results for every iteration, and for logging defects and linking them back to
the requirements.
• Expertise in Defect Tracking, Defect management and Bug Reporting using tools like HP
ALM, Quality Center.
• Team player with excellent organizational and interpersonal skills and the ability to work
independently and deliver on time.

Academic Qualification:

• B.Tech from JNTU Anantapur, 2012

Work Experience:

• Associated with Tek Systems, Bangalore as Senior Software Engineer from July 2016
to date.
• Associated with Indecomm Global Services Pvt Ltd, Bangalore as QA Engineer from
Aug 2015 to April 2016.
• Associated with Precession Staffers, Bangalore as an Associate from Sep 2014 to May
2015.
Technical Skills:

ETL Tools                 IBM Data Stage v8.1/v8.5/v9.1/v11.5
Databases                 Oracle 10g/11g, Microsoft SQL Server, Vertica
Programming Languages     SQL
Data Interaction Tools    Toad, SQL Developer, Netezza, DBeaver
Reporting Tools           SAP Business Objects, Power BI
Test Management Tools     HP QC/ALM, JIRA

Project Summary:

1) PROJECT NAME Sales Comp Gateway


Client HPE
Solution Environment IBM Data Stage 11.5, DB2, Windows 10, Unix
Role Data Stage Developer

Description:

HPE is one of the largest product-based companies globally, with a wide product portfolio. The aim
of the project is to extract data from different upstream systems, transform it, and load it
into multiple target databases. This data is used to set up quotas for sales representatives
and for analysis and reporting. The project uses a star schema model and helps the client
run its business effectively.

Responsibilities:

• Responsible for creating detailed design and source to target mappings.


• Responsible for communicating with business users and project management to gather
business requirements and translate them into ETL specifications.
• Experience with full development cycle of a Data Warehouse, including requirements
gathering, design, implementation, and maintenance.
• Designed Mappings between sources to operational staging targets, using Star Schema,
Implemented logic for Slowly Changing Dimensions (SCD).
• Involved in the migration of Data Stage jobs from development to QA and then to
production environment.
• Extensively worked on the Data Stage Job Sequencer to schedule jobs to run in
sequence.
• Conduct code walkthroughs and review peer code and documentation.
• Created system for capturing, reporting, and correcting error data.
• Gathered requirements, designed data warehouse and data mart entities, and created
reports for the business.
• Data cleansing, data quality tracking and process balancing checkpoints
2) PROJECT NAME Mycomp Quota (MCQ)
Client HPE
Solution Environment IBM Data Stage 8.1, SQL Server 2012, Windows 8/UNIX
Role Data Stage Developer
Description:

HPE is one of the largest product-based companies globally, with a wide product portfolio. The aim
of the project is to extract data from different upstream systems, transform it, and load it
into multiple target databases. This data is used to set up quotas for sales representatives
and for analysis and reporting. The project uses a star schema model and helps the client
run its business effectively.

Responsibilities:

• Involved in business analysis and technical design sessions with business and technical
staff to develop requirements document and ETL design specifications.
• Worked with SCDs to populate Type I and Type II slowly changing dimension tables from
several operational source files
• Strong knowledge of and experience in mapping source to target data using IBM Data Stage
• Experienced in developing parallel jobs using various Development/debug stages (Peek
stage, Head & Tail Stage, Row generator stage, Column generator stage, Sample Stage)
and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort &
Merge, Funnel, Remove Duplicate Stage)
• Debug, test and fix the transformation logic applied in the parallel jobs
• Used Data Stage Director to schedule and run jobs, test and debug their
components, and monitor performance statistics.
• Involved in performance tuning of the Data Stage job designs.
• Extensively used Director to validate and run jobs with runtime parameters,
and carefully analyzed results to fix defects
• Used Control-M to create drafts for new environment jobs, and ordered those jobs
in Control-M Enterprise Manager
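The Type II slowly-changing-dimension handling mentioned above can be sketched as follows. This is a minimal illustration only; the table and column names (dim_rep, rep_id, region) are hypothetical, not from the actual project, and sqlite3 stands in for the project's real databases.

```python
import sqlite3

# Minimal Type II SCD sketch: a changed attribute expires the current
# dimension row and opens a new versioned row. Names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_rep (rep_id INTEGER, region TEXT,
                      eff_from TEXT, eff_to TEXT, is_current INTEGER);
INSERT INTO dim_rep VALUES (101, 'EMEA', '2015-01-01', '9999-12-31', 1);
""")

new_region, load_date = "APAC", "2016-07-01"

# Expire the current version only if the tracked attribute changed...
cur.execute("""
    UPDATE dim_rep SET eff_to = ?, is_current = 0
    WHERE rep_id = 101 AND is_current = 1 AND region <> ?
""", (load_date, new_region))

# ...then open a new current version carrying the changed value.
if cur.rowcount:
    cur.execute("INSERT INTO dim_rep VALUES (101, ?, ?, '9999-12-31', 1)",
                (new_region, load_date))

rows = cur.execute(
    "SELECT region, is_current FROM dim_rep ORDER BY eff_from").fetchall()
print(rows)  # [('EMEA', 0), ('APAC', 1)] -- history preserved, one current row
```

Rerunning the load with an unchanged region would update zero rows and insert nothing, which keeps the dimension idempotent.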

3) PROJECT NAME Columbus-Integrated Compensation Engine (ICE)


Client HP Inc.
Solution Environment Microsoft Azure Cloud Services, Talend Big Data 6.3.1, SQL Server Management
Studio 12.0, Ambari, Hive, HP ALM, Unix, PuTTY, WinSCP
Role ETL Testing Engineer

Description:

Columbus is a process that displays deal-based orders, shipments, and rebates
with their linkage to the respective CRM opportunity.
With deal/opportunity-based compensation, HP aims at more accurate actual sales
measurement, increased funnel visibility, and ultimately a more accurate sales forecast.

Responsibilities:

• Performed requirement analysis and designed test scenarios


• Involved in preparation of Test approaches and Data analysis
• Prepared test sign-off reports at the end of each test phase.
• Developed test cases based on Functional Specification Documents including test data
preparation for Data Completeness and Data quality.
• Developed SQL queries/scripts to validate data: checking for duplicates, null
values, and truncated values, and ensuring correct data aggregations.
• Performed data quality analysis using advanced SQL skills.
• Tracking and reporting the issues to project team and management during test cycles.
• Prepared daily status reports with details of executed, passed, and failed test cases and
defect status.
• Participated in regular project Standup and status meetings.
• Report test results and defects to onshore managers.
• Coordinated team activities at offshore.
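The duplicate, null, and aggregation checks described above can be sketched with illustrative SQL. The table and column names (src, tgt, order_id, amount) are hypothetical placeholders, and sqlite3 is used here only so the queries run end to end; the real checks targeted the project databases.

```python
import sqlite3

# Illustrative ETL data-completeness/quality checks; names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src (order_id INTEGER, amount REAL);
CREATE TABLE tgt (order_id INTEGER, amount REAL);
INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, NULL);
INSERT INTO tgt VALUES (1, 10.0), (2, 20.0), (2, 20.0), (3, NULL);
""")

# 1) Duplicate check on the target's business key.
dupes = cur.execute("""
    SELECT order_id, COUNT(*) FROM tgt
    GROUP BY order_id HAVING COUNT(*) > 1
""").fetchall()

# 2) Null check on a mandatory column.
nulls = cur.execute(
    "SELECT COUNT(*) FROM tgt WHERE amount IS NULL").fetchone()[0]

# 3) Source-to-target aggregate reconciliation.
src_sum = cur.execute("SELECT COALESCE(SUM(amount), 0) FROM src").fetchone()[0]
tgt_sum = cur.execute("SELECT COALESCE(SUM(amount), 0) FROM tgt").fetchone()[0]

print(dupes)               # [(2, 2)]  -> order_id 2 was loaded twice
print(nulls)               # 1 null in a mandatory column
print(src_sum == tgt_sum)  # False -> totals do not reconcile
```

Each failed check here would be logged as a defect against the load job that produced the target rows.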

4) PROJECT NAME Sales Compensation


Client HP Inc.
Solution Environment Informatica, Oracle 11g, Unix, HP ALM
Role ETL Test Engineer

Description:

The Sales Compensation mission is to design, deploy, and operate a simple, agile, and
competitive sales compensation environment: collaborating to define industry-leading
incentives that motivate the sales force to deliver business objectives, consulting with the
business to define simple and practical solutions, and facilitating processes to deliver an
accurate, timely performance cycle and compensation services.

Responsibilities:

• Performed requirement analysis and designed test scenarios


• Involved in preparation of Test approaches and Data analysis
• Involved in verifying Web UI changes manually with requirements
• Coordination with onshore team for addressing offshore issues
• Prepared test sign-off reports at the end of each test phase.
• Developed test cases based on test matrix including test data preparation for Data
Completeness, Data Transformations, Data quality, Performance and scalability.
• Developed SQL queries/scripts to validate data: checking for duplicates, null
values, and truncated values, and ensuring correct data aggregations.
• Performed data quality analysis using advanced SQL skills.
• Tracking and reporting the issues to project team and management during test cycles.
• Prepared daily status reports with details of executed, passed, and failed test cases and
defect status.
• Responsible for creating technical design specs for universes and reports, and for
testing universes and reports.
• Participated in regular project Standup and status meetings.
• Report test results and defects to onshore managers.
• Monitored team activities at offshore.
5) PROJECT NAME KODIAK
Client KODIAK
Solution Environment MSTR, Netezza, Jira, TestLink
Role QA Engineer

Description:

KODIAK Network is one of the leading providers of innovative mobile enterprise
productivity applications since 2003. The organization has partnered with wireless carriers around
the globe to deploy enterprise applications that enhance employee communication, increase
productivity, simplify operations, and reduce cost.

The Kodiak-powered platform is purpose-built for Push-to-Talk (PTT) and other advanced mobile
solutions that focus on enabling instant and group-based communications. Leveraging its carrier
partners, Kodiak has enabled enhanced PTT communications through its integrated and diverse
solutions offerings to key industry verticals including: Hospitality, Manufacturing, Construction,
Transportation, Utilities, and Energy.
Responsibilities:

• Requirement analysis
• Involved in preparation of Test approaches and Data analysis
• Involved in verifying UI changes manually with requirements
• Coordination with onshore team for addressing offshore issues
• Preparation of test cases using SQL queries
• Execution of Test Cases & results documentation
• Validating reports data with database
• Defect Analyzing and Reporting in JIRA
• Verified column mapping between source and target
• Presented demos to the client at the end of every sprint
• Providing daily status report to managers.

6) PROJECT NAME AIMSCOMS


Client GE Capital (AUS)
Solution Environment Informatica 9.1, Oracle 11g, ALM
Role ETL Tester

Description:

The DF business in Australia and New Zealand is undertaking a project to move from its
current receivables platform, AWARE, onto the global system run by the CDF US team, AIMS. As
part of this project, all Distribution Finance data currently received from AWARE and utilized for
various reporting and business processes will need to be replaced by equivalent AIMS data. Local
data load and data transform processes must be analyzed and updated as needed, to
ensure all business processes that rely on the Data Warehouse data continue after the system
migration.
Responsibilities:

• Requirement Analysis
• Executing the jobs using the Control-M tool
• Test Data preparation to validate all possible scenarios
• Test plan preparation with detailed test cases using SQL queries
• Verifying the ETL data in target database
• Verified column mapping between source and target
• Interactions with Dev teams to resolve the issues
• Reporting daily testing status
• Defect Analyzing and Reporting in QC
