
Ahetesham Sayyad
ETL Developer
+91-7249753045 | sayed.ahetesham@gmail.com | linkedin.com/in/ahetesham-sayyad/

SUMMARY
 Over 2.10+ years of strong experience in Information Technology, with proficiency in ETL development and Data
Warehouse implementation and development.
 Certified Denodo Developer.
 Good understanding and experience in Data Virtualization.
 Fine-tuned and optimized the performance of Complex views in Denodo.
 Experienced in the design, development and implementation of large-scale projects in the media, financial and shipping industries
using Data Warehousing ETL tools (Pentaho).
 Worked for one of the world's largest publishers of monthly magazines, with 25 US titles and close to 300 international
editions.
 Efficiently met tight deadlines for production deployments.
 Excellent Data Analysis skills.
 Hands-on experience with Data Warehouse Star Schema modeling, Snowflake modeling, Fact and Dimension tables, and physical
and logical data modeling.
 Well versed in optimizing the ETL environment and the quality and validation processes that ensure data accuracy and integrity.
 Proficient in investigating and resolving ETL process issues.
 Proficient in writing SQL statements, complex stored procedures, dynamic SQL queries, scripts, functions, triggers,
views, cursors and Oracle PL/SQL, and in query optimization.
 Created custom Denodo views by joining tables from multiple data sources (see the VQL sketch at the end of this summary).
 Integrated Oracle with Denodo as a JDBC data source.
 Hands-on expertise in VQL analysis and various joins and functions.
 Good knowledge of JDBC, XML and Web Services APIs.
 Good knowledge of query optimization and analysis of SQL execution plans in context of heterogeneous joins.
 Sound knowledge of service offerings from Amazon Web Services.
 Able to design highly available applications on AWS across Availability Zones and Regions.
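
Illustrative VQL sketch of the kind of custom Denodo view mentioned above: a derived view joining a base view over an Oracle JDBC source with one over a second relational source. All view and column names (dv_customer_orders, bv_oracle_customers, bv_mysql_orders) are hypothetical, not taken from any actual project.

CREATE OR REPLACE VIEW dv_customer_orders AS
SELECT c.customer_id,
       c.customer_name,
       o.order_id,
       o.order_total,
       o.order_date
FROM bv_oracle_customers c          -- base view over the Oracle JDBC source
  INNER JOIN bv_mysql_orders o      -- base view over a second (MySQL) source
    ON c.customer_id = o.customer_id;

Denodo handles the heterogeneous join across the two sources and exposes the resulting view through its JDBC/ODBC and web service interfaces.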

PROFESSIONAL EXPERIENCE

ETL Developer
Aloha Technology Jan 2017 - July 2019
Exponentia Datalabs July 2019 - Current
Environment: Pentaho Data Integration 7.1 (PDI/Kettle), Denodo Data Virtualization, QlikSense, MySQL Workbench 6.1, MariaDB 10.3.11, Hadoop, Resource Manager

 Deployed five fully automated daily incremental jobs covering more than 300 tables.
 Created several Pentaho Data Integration Transformations and Jobs (Kettle, with Spoon as the UI) to extract data from OLAP systems and load it into staging databases.
 Involved in performance tuning of SQL queries and stored procedures using index tuning.
 Wrote complex stored procedures in MySQL (via MySQL Workbench) to summarize data in the staging environment, then developed PDI jobs to load the summarized data into Data Warehouse/Data Mart Dimension and Fact tables (see the sketch after this block).
 Developed UNIX scripts for scheduling the master, extract and FTP loads using the cron scheduler.
 Worked closely with Business Analysts and data architects on business requirements, design standards and validations.
 Troubleshot production failures and provided root cause analysis; worked on emergency code fixes to production.
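
A minimal MySQL sketch of the staging-summary stored procedures described above, assuming hypothetical staging tables (stg_sales_raw, stg_daily_sales_summary); names and columns are illustrative only.

DELIMITER $$
CREATE PROCEDURE stg_summarize_daily_sales(IN p_load_date DATE)
BEGIN
    -- Re-run safe: clear any previous summary rows for the load date.
    DELETE FROM stg_daily_sales_summary
    WHERE load_date = p_load_date;

    -- Aggregate raw staging rows per store and product, ready for the PDI
    -- job that loads the Data Mart fact table.
    INSERT INTO stg_daily_sales_summary
        (load_date, store_id, product_id, total_qty, total_amount)
    SELECT p_load_date,
           store_id,
           product_id,
           SUM(quantity),
           SUM(quantity * unit_price)
    FROM stg_sales_raw
    WHERE sale_date = p_load_date
    GROUP BY store_id, product_id;
END $$
DELIMITER ;

A procedure like this would typically be invoked by a PDI job or a cron-scheduled script as part of the daily incremental load.
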
Environment: AWS, Apache TOAD 1.5.3, Apache Sqoop, S3 Browser, CloudBerry, PuTTY, WinSCP, FileZilla, Repositories: Git, Tortoise SVN

 Loaded data into STAGE/DW using Transformations/Jobs comprising 300+ transformations and 200+ jobs, all driven by a SMART master ETL job.
 Created logging for each transformation/job in PDI and set up email notifications for failures at each component level.
 Extensive experience in production support to the operational team for daily/weekly/monthly job failures in production environments.
 Created a Bus Matrix representing table dependencies.
 Pertinent experience with Big Data: HDFS, MapReduce, Sqoop and Hive (Apache TOAD); see the HiveQL sketch after this block.
 Problem solving: introduced an efficient, cost-saving approach to generate a complex audit report from Data Integration into an Excel workbook with multiple dynamically generated sheets containing date-wise data analysis of tables, based on template sheets.
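
A hypothetical HiveQL sketch of the HDFS/Sqoop/Hive work listed above: an external table over files landed in HDFS (e.g. by a Sqoop import) and a simple per-day row count of the kind a table-level audit report would use. Paths, table and column names are illustrative only.

CREATE EXTERNAL TABLE IF NOT EXISTS stg_orders (
    order_id     BIGINT,
    customer_id  BIGINT,
    order_total  DOUBLE,
    order_date   STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/staging/orders';

-- Row counts per day, feeding the audit reporting described above.
SELECT order_date, COUNT(*) AS row_cnt
FROM stg_orders
GROUP BY order_date;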

ACADEMIC CREDENTIALS

Bachelor of Computer Science and Engineering


 P.E.S. College of Engineering, Aurangabad (2013-2016, First Class with Distinction).
Diploma in Computer Engineering
 M.H. Saboo Siddik Polytechnic, Mumbai (2010-2013, First Class).
