SQL Server Integration Services (SSIS) is a component of the Microsoft SQL Server database software that can be used to perform a broad range of data migration tasks. The data from various data sources is collected into a single DWH (data warehouse) using SSIS.
SQL Server Analysis Services (SSAS) is used as a tool by organizations to analyze and make sense of information possibly spread out across multiple databases, or in disparate tables. This is called cube data. SSAS is an Online Analytical Processing (OLAP), data mining and reporting tool in SQL Server. SSAS enables users to construct special databases for fast analysis of very large amounts of data.
SQL Server Reporting Services (SSRS): SQL Server Reporting Services (SSRS) is a server-based report generation software system from Microsoft.
Opening →
Selecting data source
Data source → the source of data from where you want to pull the data. Server name → the name of the computer/database. Select a DB and click Next.
Selecting destination → destination → where you want to push the data, to a data provider (flat file, Excel, ..so on..) → server name → database → and click on Next.
Browse to the file destination, select the format and click on Next, which opens the next window.
Select table/view, row delimiter and column delimiter. Click on Edit Mappings to see how the columns are mapped, and click on Next...
Select Run Immediately, or save the SSIS package for future use... and click on Next.
Select Import to import data into the database, or Export to export from the database.
Control flow → Control flow is where you define operations and the order of
execution of those operations. For example, you put in two operations: execute a T-SQL command on a database, and send mail. Then you define their order with a Precedence Constraint; for example, the T-SQL statement should be executed first, and then, if it succeeded, the mail will be sent.
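As a concrete sketch of that control flow: the Execute SQL Task might run a statement like the one below, and the Send Mail Task fires only when the success precedence constraint is satisfied. The table and column names here are made up for illustration, not taken from any real package.

```
-- Hypothetical statement for the Execute SQL Task
-- (dbo.orders and its columns are illustrative only)
UPDATE dbo.orders
SET processed = 1
WHERE processed = 0;
```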
Data flow → Data flow is where you define the data stream: where data comes from
(data sources), how data should be transformed (transformations) and where data should be loaded (data destinations). You can pass data from one component to other components with Data Paths.
TASKS
Task → a task is an individual unit of work having a functionality. Precedence constraints tell the order/flow in which tasks are to be executed.
Integration Services includes the following types of tasks.
Data Flow Task: the task that runs data flows to extract data, apply column-level transformations, and load data.
Data Preparation Tasks: these tasks do the following processes: copy files and directories; download files and data; run Web methods; apply operations to XML documents; and profile data for cleansing.
Workflow Tasks: the tasks that communicate with other processes to run packages, run programs or batch files, send and receive messages between packages, send e-mail messages, read Windows Management Instrumentation (WMI) data, and watch for WMI events.
SQL Server Tasks: the tasks that access, copy, insert, delete, and modify SQL Server objects and data.
Scripting Tasks: the tasks that extend package functionality by using scripts.
Analysis Services Tasks: the tasks that create, modify, delete, and process Analysis Services objects.
Maintenance Tasks: the tasks that perform administrative functions such as backing up and shrinking SQL Server databases, rebuilding and reorganizing indexes, and running SQL Server Agent jobs.
In connection type we can set different types of connections, like ADO.NET etc...
Create two tables emp_a, emp_b with the same number of columns as in the Excel sheet. Open a Microsoft SSIS Integration project and name it importexcel. Right click → Properties → Debugging → set Run64BitRuntime to false. Take a Sequence container and put a data flow task in the Sequence container. Declare two variables: 1) excelpath, datatype string, value equal to the path in which the folder is present; 2) table, datatype string, value equal to emp_a.
Go to the data flow task properties and set DelayValidation to true. Now select an Excel source in the data flow task → under properties set ValidateExternalMetadata to false. Start configuring the Excel source by right click → Edit → under Connection Manager → New → Browse → choose the Excel sheet's folder path → for the sheet of the Excel workbook select Sheet1. Watch Columns / Column Mappings to check (meaning don't mess with it). Click on the connection manager; under Properties, select ExcelFilePath under Expressions and choose the user-defined variable (@[User::excelpath]). Now come to the destination: select an OLE DB destination and under properties set ValidateExternalMetadata to false.
Getting values from a table and enumerating them through a Foreach container
Take a variable, name it (table) and choose its datatype as Object (we can select any number of rows and cols). In connection, set the connection: if it's localhost, set localhost.databasename.
In the SQL statement, write the corresponding SQL statement (select * from tablename). In order to capture the result, set ResultSet to Full result set. Now go to the Result Set and set the result name to 0 and the variable name to User::table.
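The query placed in the Execute SQL Task could look like the following, assuming a table whose columns match the four variables used in the next steps; the table name dbo.city_population is hypothetical.

```
-- Returns every row; with ResultSet = Full result set,
-- the whole rowset lands in the User::table Object variable
SELECT city_name, city_desc, year, t_population
FROM dbo.city_population;
```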
Take a Foreach Loop container and connect the Execute SQL Task to it. Now take four more variables: city_name, city_desc, year, t_population (user defined). Now we will capture the columns, column by column, in the Foreach Loop container. Select the Foreach Loop container and right click it to go to Collection. In the Foreach Loop editor, under Enumerator, select the Foreach ADO enumerator. Under ADO object source variable, select the variable User::table. Select "Rows in the first table" and OK. Now go to Variable Mappings and select all the variables city_name, city_desc, year, t_population (user defined); you can see the indexes as 0, 1, 2, 3 respectively (if we want only one column's data we can choose only one column).
Now take a data flow task. Inside the data flow task, select an OLE DB source. Under OLE DB, select localhost.databasename and OK. Under SQL command text, write the source query. Now add a Derived Column to the data flow task. In Derived Column, under Derived Column Name set four columns c_name, c_des, c_year, c_population, and under Expression add the four user-defined variables. (You can see the length is 0, so under type casting use DT_STR with length 50 and codepage 1252. Don't ask why....) Now add a Row Count (data transformation); it counts the number of rows. Take one more variable, rowcount, and assign it to the Variable property of the Row Count.
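The type cast mentioned above is an SSIS Derived Column expression. A minimal sketch for one of the columns, using the length 50 and codepage 1252 stated in the notes and the variable names from the earlier steps:

```
(DT_STR, 50, 1252)@[User::city_name]
```

DT_STR casts a value to a non-Unicode string; the first number is the string length and the second is the code page (1252 is the Windows Latin-1 code page).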
Getting data in Excel to a table in SSIS and enumerating it through a Foreach container. 1) Take a variable, name it (table), with data type Object. Take an Execute SQL Task. Set the connection.
Constraint editor
Under Evaluation operation you have four options: Constraint, Expression, Constraint or Expression, Constraint and Expression.
The star-marked task has two paths; in such a case, select one of these options.
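When the evaluation operation involves an expression, that expression must evaluate to true or false. A sketch using the rowcount variable declared earlier (the comparison with 0 is an arbitrary example, not from the notes):

```
@[User::rowcount] > 0
```

Combined with a success constraint ("Constraint and Expression"), the downstream task runs only if the previous task succeeded and at least one row was counted.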
Creating variables
Right click on the empty space → Variables. Add Variable to add a new variable. If you select the empty space, the scope will be package level; if you select a particular task, the scope will be task level.
Containers
Containers: containers are used to group tasks into logical units. The default container is the Task Host container. There are basically three containers: 1) For Loop 2) Foreach Loop 3) Sequence container.
Under construction
Data flow tasks
Connection managers: a connection manager is a pointer to the source or destination connection
Data sources
Data source views
Under construction
Connections:
Microsoft
SQL Server Integration Services packages use connections to perform different tasks and to implement Integration Services features:
Connecting to source and destination data stores such as text, XML, Excel workbooks, and relational databases to extract and load data.
Connecting to relational databases that contain reference data to perform exact or fuzzy lookups.
Connecting to relational databases to run SQL statements such as SELECT, DELETE, and INSERT commands and also stored procedures.
Connecting to SQL Server to perform maintenance and transfer tasks such as backing up databases and transferring logins.
Writing log entries in text and XML files and SQL Server tables, and package configurations to SQL Server tables.
Connecting to SQL Server to create temporary work tables that some transformations require to do their work.
Connecting to Analysis Services projects and databases to access data mining models, process cubes and dimensions, and run DDL code.
Specifying existing, or creating new, files and folders to use with Foreach Loop enumerators and tasks.
Connecting to message queues and to Windows Management Instrumentation (WMI), SQL Server Management Objects (SMO), Web, and mail servers.
Connection managers
http://technet.microsoft.com/en-us/library/cc505L60.aspx
Under construction
Transformations
Business Intelligence transformations
Row transformations
Rowset transformations
Split and Join
Auditing
User defined
Business Intelligence →
Slowly Changing Dimension transformation
→ Fuzzy Lookup
→ Fuzzy Grouping
→ Term Extraction
→ Term Lookup
→ Data Mining Query
→ DQS Cleansing