
Vishal Bhatt

SUMMARY:

A proficient, well-rounded IT professional with myriad interests and 7 years of experience ranging from legacy systems
to contemporary data tools. Looking to expand my horizons and find the next opportunity in Python development and
data analysis.
Strong knowledge of object-oriented development, version control, release management, and change management
Experience in designing and developing Python applications in medium to large-scale IT projects
Experience in building data flows and performing extensive data analysis using Python, R, and SQL
Experience in building end-to-end web applications using a NoSQL database (Neo4j), Flask, d3.js, and JavaScript
Experience in database development and administration in SQL Server and Oracle 10g
Experience in writing complex stored procedures using T-SQL and PL/SQL
Experience in hosting web applications from a Docker container
Experience in UNIX command line and shell scripting
Good understanding of the Hadoop ecosystem

EDUCATION:

School of Information Studies, Syracuse University, NY (GPA: 3.55) May 2012
Master of Science in Information Management

University of Mumbai, Mumbai, India
Bachelor of Engineering, Computer Engineering

TECHNICAL SKILLS:

Programming Languages: Python, R, JavaScript, C++, T-SQL, PL/SQL, JCL


Platforms: Windows XP, Windows Vista, Windows 7, Linux Red Hat ER5, Debian, Ubuntu
Databases: Neo4j, Oracle 9i/10g, MS SQL Server 2005/2008
Software Tools: Docker, Rational Rose, SQL editors, PL/SQL editors, PuTTY, MS Visio, MS Project, SSRS, SSIS, SSAS,
Hive, Pig, CA-ESP (scheduling package), TSO & TPX, JCL, Sysview, Planet, Tandem, C1 Endeavor, Netcool

TRAININGS & CERTIFICATIONS:


Cloudera Data Analyst Training: Using Pig, Hive, and Impala May 2014
Hortonworks Certified Apache Hadoop Developer Feb 2015
ITIL Foundation Certificate in IT Service Management Aug 2013

WORK EXPERIENCE:

Western Union
Lead Software Engineer, Risk Modelling and Statistics, Englewood, CO Mar 2016-present

Western Union is a leader in global payment services. From small businesses and global corporations, to families near
and far away, to NGOs in the most remote communities on Earth, Western Union helps people and businesses move
money.

As a senior software engineer working on the risk modelling team, I am responsible for understanding project
requirements and for owning and implementing various data engineering and fraud assessment solutions using
contemporary tools.

I am also engaged in ad hoc data requests, data manipulation, and statistical computing using R.
I have also performed ad hoc data analysis in the HDFS environment, which involved writing Pig statements and Hive queries.

Data engineering project:

The purpose of the project was to perform extensive data analysis on the transaction data and come up with risk factors
that serve as upstream inputs to various risk models.

Extracted transaction data from various data sources such as SQL Server and Hive using the Python module SQLAlchemy
Performed extensive data wrangling on the extracted data using the pandas and NumPy modules in Python
Pushed the transformed data back to a table in SQL Server
Encapsulated the entire process into a single workflow using the Luigi module in Python so that it runs perpetually in
small batches (see the sketch below)
Used Python's multiprocessing capabilities to run multiple processes for different time periods based on the available
memory on the server

Technical environment: SQL Server 2008, Debian, Python 2.7, Hive
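
The workflow above can be illustrated with a minimal sketch, assuming hypothetical connection strings, table names, and column names (transactions, risk_factors, customer_id, amount): a Luigi task that extracts one day's transactions with SQLAlchemy, wrangles them with pandas, and loads the result back into SQL Server.

import datetime

import luigi
import pandas as pd
from sqlalchemy import create_engine, text


class TransformTransactions(luigi.Task):
    # A date parameter lets the workflow run perpetually in small daily batches
    run_date = luigi.DateParameter()

    def output(self):
        # Marker file tells Luigi this batch has already been processed
        return luigi.LocalTarget("markers/transform_{}.done".format(self.run_date))

    def run(self):
        src = create_engine("mssql+pyodbc://user:pass@SRC_DSN")   # hypothetical source connection
        dst = create_engine("mssql+pyodbc://user:pass@RISK_DSN")  # hypothetical destination connection

        # Extract one day's transactions (table and column names are illustrative)
        df = pd.read_sql(
            text("SELECT customer_id, amount FROM transactions WHERE txn_date = :d"),
            src,
            params={"d": self.run_date},
        )

        # Example wrangling step: per-customer counts and totals as candidate risk factors
        risk_factors = (
            df.groupby("customer_id")["amount"]
              .agg(["count", "sum"])
              .reset_index()
        )

        # Load the transformed data back into SQL Server
        risk_factors.to_sql("risk_factors", dst, if_exists="append", index=False)

        with self.output().open("w") as marker:
            marker.write("done")


if __name__ == "__main__":
    luigi.build(
        [TransformTransactions(run_date=datetime.date(2016, 1, 1))],
        local_scheduler=True,
    )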

Network visualization tool:

The purpose of the project was to build a network visualization tool, as a proof of concept, that would help the fraud
investigation teams identify patterns by inspecting the network associated with the transacting customer.

Built a Neo4j (graph) database using the relational data from the SQL Server database
The Neo4j database served as the data source for the tool
Used Python modules such as py2neo to connect to the Neo4j database from Python (see the sketch below)
Wrote NoSQL queries to produce networked data that was up to 5 levels deep
Built a Python class and used the pandas and igraph modules to perform the necessary aggregations on the raw
networked data
Transformed the aggregated data into JSON
Built a website using the Python microframework Flask
Used AJAX and jQuery to link the drop-down buttons to the back end
Used the d3.js library to render graphs from the JSON data
Added various buttons on the front end
Used extensive JavaScript to filter the graphs based on button clicks
Deployed the web application on a private IP from a Docker container

Technical environment: Debian, Python 2.7, Neo4j 3.0, SQL Server 2008, d3.js v3, jQuery, AJAX, HTML
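
A minimal sketch of the tool's back end, assuming hypothetical connection details, node labels, relationship types, and an illustrative /network endpoint: a Flask route that runs a variable-length Cypher query through py2neo and returns the customer's network as JSON for d3.js to render.

from flask import Flask, jsonify, request
from py2neo import Graph

app = Flask(__name__)

# Hypothetical connection details; py2neo connects to Neo4j over the Bolt protocol
graph = Graph("bolt://localhost:7687", auth=("neo4j", "password"))


@app.route("/network")
def network():
    customer_id = request.args.get("customer_id")

    # Variable-length Cypher pattern: walk the transaction network up to 5 levels deep
    # (label, relationship type, and property names are illustrative)
    query = """
        MATCH path = (c:Customer {id: $customer_id})-[:TRANSACTED_WITH*1..5]-(other)
        RETURN [n IN nodes(path) | n.id] AS node_ids
    """
    rows = graph.run(query, customer_id=customer_id).data()

    # Flatten the paths into the node/link lists that a d3.js force layout expects
    nodes, links = set(), set()
    for row in rows:
        ids = row["node_ids"]
        nodes.update(ids)
        links.update(zip(ids, ids[1:]))

    return jsonify({
        "nodes": [{"id": n} for n in sorted(nodes)],
        "links": [{"source": s, "target": t} for s, t in sorted(links)],
    })


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)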

VISA
Systems Analyst, Distributed Systems, Ashburn, VA Oct 2012-Mar 2016

VISA is a global payments technology company working to enable consumers, businesses, banks and governments to
use digital currency.

The Systems Analyst is responsible for understanding both the technical and business impact within operations and is an
integral part of a team that provides second-level support, resolution, and escalation for batch processing in UNIX/Linux
environments for Visa's core applications, operating systems, and services.

Support activities:

Worked with application and development teams such as Ab Initio, DBAs, and Hadoop support on the recovery of complex
technical problems from a UNIX standpoint
Wrote shell scripts to load data from flat files into an Oracle database
Checked system logs and troubleshot various applications for connectivity issues and memory leaks
Supervised and managed application team implementations, such as new releases and enhancements, with no impact
to service levels while minimizing outages
Supported initiatives such as mock disaster recoveries and tool enhancements to make the systems more robust and
fault tolerant

Development projects:

Database activities for an internal web application:

The purpose of the web application was to display the status of batch jobs with details such as run time, job status
(failed, running, or aborted), and expected completion time.

Determined the data model and architecture of the Oracle database in terms of tables, columns, and the dependencies
between the various tables
Used SQL*Loader to populate the Oracle database with the data from CSV files
Wrote PL/SQL stored procedures to manipulate the data in the database according to the needs of the front end

Technical environment: IBM mainframe, UNIX, Oracle 10g, ASP.NET

Python development work:

A dashboard was built that helped analyze various applications at a high level in terms of the success rate of the batch
jobs running in them.

Fetched the data from SQL Server using SQLAlchemy (see the sketch below)
Built a Python class whose objects were batch jobs, grouped by their severity
Used Python modules such as NumPy, pandas, and datetime to perform extensive data analysis
Pushed the results back to a table in SQL Server using SQLAlchemy
The data in the table was then used to build reports for the dashboard

Technical environment: SQL Server 2008, Python 2.7, SSRS
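
A minimal sketch of the dashboard's data step, assuming hypothetical table and column names (batch_job_runs, application, status): fetch the run history with SQLAlchemy, compute per-application success rates with pandas, and push the summary back to SQL Server for SSRS reporting.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical DSN; any SQL Server connection string supported by SQLAlchemy would do
engine = create_engine("mssql+pyodbc://user:pass@BATCH_DSN")

# Illustrative table and column names for the batch-job run history
jobs = pd.read_sql("SELECT application, job_name, status FROM batch_job_runs", engine)

# Success rate per application: share of runs whose status is 'COMPLETED'
summary = (
    jobs.assign(success=jobs["status"].eq("COMPLETED"))
        .groupby("application")["success"]
        .mean()
        .rename("success_rate")
        .reset_index()
)

# Push the aggregated results back to SQL Server, where the dashboard reports pick them up
summary.to_sql("job_success_rates", engine, if_exists="replace", index=False)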

Research Assistant: School of Information Studies, Syracuse University, NY Sep 2011-May 2012

Interviewed industry professionals to identify challenges in collaboration
Documented and consolidated the findings from the interviews using Excel
Oracle DBA/Developer Intern: Synchronoss Technologies, Bethlehem, PA May 2011-Aug 2011
Created and managed schemas and users
Performed deployments for new releases of a homegrown application
Maintained and managed space as required for tablespaces and segments
Wrote PL/SQL stored procedures for ad hoc data manipulation
Performed database backups using the export and import utilities
Inspected STATSPACK and AWR reports for Oracle tuning
Junior SQL Developer: Paras Cadd, Mumbai, India Aug 2008-Mar 2010

PARASCADD is a specialist engineering services and solutions company offering products and services across a range of
Engineering, Procurement, and Construction segments.

As a Junior SQL Developer, I was engaged in development and maintenance work for a couple of databases.

Developed data mappings that specified the business transformation rules to be applied to the data
Developed a data dictionary that stored information about data such as its meaning, relationships to other data, origin,
usage, and format
Implemented database objects such as procedures, functions, and triggers to achieve a consistent implementation of logic

INDEPENDENT PROJECTS:

Ecommerce Technologies: Website design for online ordering of books Aug 2011-Dec 2011


Gathered the overall website requirements for a consistent site design
Created an ERD using Visio for the online book system
Connected the database to the front end using ADO.NET
Developed the front-end functionality by linking pages, with a home page showing the ten latest offers by default

Advanced Database Management Systems: Project Lifecycle Management System Jan 2011-May 2011


Prepared a project scope document along with business rules and an ERD
Designed tablespaces, tables, schemas, and users, and assigned appropriate roles and privileges
Developed appropriate procedures and functions
Developed forms and generated reports in Oracle 10g, and prepared a final project report
