Although PowerShell provides the whole remoting experience, you may still need to ask your network administrator to let you configure remoting on all the machines you need to access. Of course, unless you hold the keys to the kingdom. But it is much simpler without PowerShell Remoting.
When you use the SQLPS module, it automatically lets you connect to any of the SQL Servers on your network. So, YES!! As long as you have permission to manage these servers, SQLPS already gives you the ability to connect remotely to other machines. There's no need to set up remoting with the Enable-PSRemoting or Set-WSManQuickConfig commands.
Simple Remoting #1
So, how can I change from my local SQL Server prompt to another instance? Here's an example:
[sourcecode language=”powershell”]
cd SQLSERVER:\SQL\Your-Machine-name\DEFAULT   # default instance, or
cd SQLSERVER:\SQL\Your-Machine-name\Your-InstanceName
[/sourcecode]
4. Now you can continue and navigate through your SQL Server drive just like a DOS file system.
[sourcecode language=”powershell”]
Dir Databases -force
[/sourcecode]
5. Finally, to connect remotely to another instance, just change "Your-Machine-name" and "Your-InstanceName".
Simple Remoting #2
One thing you need to understand: after you load the SQLPS module, you have access to all the SMO (SQL Server Management Objects) .NET classes. These help you build short and very effective .NET code that extends PowerShell scripting. Also, you don't need to be at the SQLSERVER: drive to use them.
[sourcecode language="powershell"]
$SaveResult = Invoke-Sqlcmd -Query "SELECT * FROM [AdventureWorks].[dbo].[AWBuildVersion];" -ServerInstance "Your-Machine-name\InstanceName"
$SaveResult
[/sourcecode]
Again, there's no need to be at the SQLSERVER: drive for this to work. As you can see, with just a few lines of code you can connect to another SQL Server instance.
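As a quick illustration of the SMO access mentioned above, here's a hedged sketch (the instance name is a placeholder) that builds an SMO Server object and lists its databases. It assumes the SMO assemblies that SQLPS loads are available:

```powershell
# Connect to an instance through SMO and list its database names.
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server "Your-Machine-name\InstanceName"
$srv.Databases | Select-Object Name, RecoveryModel
```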
As I started to build executable solutions in PowerShell for my SSIS packages, I needed to take advantage of providing an exit code value from the application. Creating your own exit code can be very useful for signaling the success or failure of a task. This way you can take the correct action on how your flow is going to behave.
1. Exit N – as shown on MOW's blog (link above), only works in PowerShell.
2. Exit ("N") – I just tried it, but it also only works in PowerShell.
3. [Environment]::Exit("N") – following the MSDN article (link above), it works in both PowerShell and DOS.
Here's where "N" equals the exit number you want to provide. Also, you need to understand that this only works when executing your script with PowerShell.exe from a command prompt: DOS or PowerShell.
Now, as you noticed and are probably wondering, I found out that "Exit N" and Exit("N") behave differently when executed as a script versus compiled under PrimalForms 2011. I will show this in my examples.
In PowerShell V2, I found some different behavior when using the above methods. My goal in this exercise is to provide an exit code that can be used in both PowerShell (as a script and as an executable) and DOS (as an executable). The reason is that you may eventually want to convert your script into an executable.
Here's the basic structure of how this is done, using the example of a division process where we'll be trapping the "Divided by 0" exception using "Try-Catch". Then, after testing our script, we can create the executable that can be included in an SSIS solution.
To validate that the exit code has a value, we need to create an executable using PrimalForms 2011, then use $LastExitCode in PowerShell and %ErrorLevel% in DOS to confirm we have the values we want. Both exit code values need to match in PowerShell and DOS. These sample scripts will run under the PowerShell console, and the compiled *.exe version will run under DOS.
Sample Script 3 – DivideNow3.ps1 – using “Exit N”.
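The original listing isn't reproduced in this copy; a minimal sketch of what DivideNow3.ps1 does, assuming the two arguments and the Try-Catch structure described above (the parameter names and exit values are assumptions), might look like:

```powershell
# Hypothetical sketch of DivideNow3.ps1: divide two arguments, trap the
# divide-by-zero exception, and exit with a distinct code for each outcome.
PARAM([int] $Dividend, [int] $Divisor)

try {
    $result = $Dividend / $Divisor
    Write-Host "Result: $result"
    Exit 0          # success
}
catch {
    Write-Host "Error: $($_.Exception.Message)"
    Exit 1          # failure: divide by zero (code value is an assumption)
}
```

After running it (or the compiled *.exe), $LastExitCode in PowerShell and %ErrorLevel% in DOS should both report the assigned value.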
These are the results we want to get. Both DOS and PowerShell will show the correctly assigned exit codes. This script and executable will work and can be integrated into an SSIS package.
Bonus Script – executing a PS script with PowerShell.exe from the PowerShell console to run DivideNow.ps1
[sourcecode language=”powershell”]
#========================================================================
# Created with: SAPIEN Technologies, Inc., PrimalForms 2011 v2.0.12
# Created on: 9/2/2011 9:21 AM
# Created by: Max Trinidad
# Organization: PutItTogether
# Filename: DivideNow_test.ps1
#========================================================================
# .\DivideNow.ps1 40 5 – This will execute and close PowerShell immediately
# .\DivideNow.ps1 40 0 – This will execute and close PowerShell immediately
# But if I execute with PowerShell.exe
PowerShell -NoExit -Command { `
cd 'C:\Users\Max\Documents\SAPIEN\PrimalForms 2011\Files'; `
& ./DivideNow.ps1 40 5}
[/sourcecode]
Here's where the fun begins. It took me a few hours to figure out how to have my SSIS package pass values into my PowerShell executable. So, here's a sample solution where I found out that my flat file source records didn't match the record layout provided by another Internet source. Yes, believe it or not, sometimes the documentation out there can be way out of date. But thanks to the SSIS "..Source Error Output", you can redirect these error records to a destination object. Then you can analyze these bad records later.
So, if you work creating SSIS packages, then you should know how to use the "Data Flow" component to define your data Source and Destination objects. Now, in cases where you need to isolate records that are causing your package to fail, you need to add a second Destination object to deal with them later. In my example, my second Destination object will be a new SQL table called "CompanyData_SkippedRecords".
Here's my sample text file to be processed by my SSIS package: "CompanyData.txt"
Notice that the first line contains the columns, but some of the data in this file won't match the column layout. If I run this solution with only the Source and Destination objects defined, the package will fail.
Now, we can add another Destination component to isolate these mismatched records. Connect the Source component by dragging the "Red Arrow" to the second Destination component handling the errors.
A "Configure Error Output" window will pop up; change the "Set this value to selected cells:" option to "Redirect row". Then highlight all fields in both the "Error" and "Truncation" columns, and click "Apply" to change "Fail Component" to "Redirect row".
Click "OK" once you have confirmed that all fields have "Redirect row".
These changes are in place for our Flat File Source “Error Output” section.
Now we need to finish setting up our second destination component, labeled "Error Skipped Records Destination", where our error records will be dropped into our new SQL table.
Open our error Destination component. For now, we are going to use the default; the new table will be named "[Error Skipped Records Destination]". So all bad records will be stored in this table.
Then, click “OK” to exit.
At this step we are done with our import and error/mismatch record handling, and we are ready to run our solution. At the end of this step we will have a bad-records table.
Here’s our good data:
And this is the error records table:
Now, for the next step, I need a table to consolidate all my mismatched records. I will be reading from my "[Error Skipped Records Destination]" table and selecting the "[Flat File Source Error Output Column]" field as "DataSkipped" into a new table called "CompanyData_SkippedRecords".
[sourcecode language="sql"]
CREATE TABLE [dbo].[CompanyData_SkippedRecords](
[DataSkipped] [text] NOT NULL,
[ErrRecordsLength] [varchar](7) NOT NULL,
[TotalFieldsPerRecords] [varchar](5) NOT NULL,
[FileGroupSource] [varchar](4) NOT NULL,
[ProcessDate] [datetime] NOT NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
[/sourcecode]
As you may notice, this table is very different from the one we created to store the errors. And here's where we integrate our PowerShell script solution into our SSIS package.
PowerShell Script
In this new table, out of the 5 fields, 4 will be generated by PowerShell, and two of the values come from parameters:
DataSkipped – comes from our Err table.
ErrRecordsLength – the length of each skipped record.
TotalFieldsPerRecords – the number of fields per record (using the delimiter – 1).
FileGroupSource – denotes the type of record that was imported (i.e. Comp).
ProcessDate – the date this Err table was processed.
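A self-contained sketch of how those generated values could be derived for one skipped record (the comma delimiter, sample record, and variable names are assumptions):

```powershell
# One raw skipped record, as read from the Err table's DataSkipped column.
$record = '1001,Contoso,Miami,FL'

$ErrRecordsLength      = $record.Length               # length of the raw record
$TotalFieldsPerRecords = ($record -split ',').Count   # number of delimited fields
$FileGroupSource       = 'Comp'                       # would come in as a parameter
$ProcessDate           = Get-Date                     # date the Err table was processed

"Length=$ErrRecordsLength Fields=$TotalFieldsPerRecords Group=$FileGroupSource"
```

The real executable would then INSERT these values into "CompanyData_SkippedRecords" for each record.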
The purpose of this PowerShell executable is to generate these additional fields to help analyze what's wrong with my text file. This process will read one Err record at a time, generate the additional values, and insert them into the new table.
One requirement: you need to have the SQLPS module installed in the "MyDocuments\WindowsPowerShell\Modules" folder of the user running this SSIS package, or it will not run. If you're using SQL Server 2008 R2, then download the SQLPS module from Chad Miller's blog site: http://sev17.com/2010/07/making-a-sqlps-module/ (SQL Denali CTP3 already has it).
Here's the script (don't be scared!), to be compiled with PrimalForms 2011:
[sourcecode language=”powershell”]
# =============================================================
# Microsoft PowerShell Source File — Created with SAPIEN Technologies PrimalScript 2011
# NAME: AnalyzeSkippedRecords.ps1
# AUTHOR: Max.Trinidad ,
# DATE : 8/18/2011
#
# COMMENT: PowerShell Script to Analyze the Data Skipped for processing and update the Skipped
# table with additional information.
# =============================================================
PARAM([String] $Arg1, [String] $Arg2, [String] $Arg3)
#Input: $Arg1="Err*"; $Arg2 = "Developer"; $Arg3 = "YourComputer"
[/sourcecode]
Notice this script was meant to search for multiple error tables based on the type of file groups (i.e. ADB1, ADB2, and ADB*). Now, using the steps from Passing Parameters to a PowerShell Executable – Part 1 in the previous blog post, I compiled my script and created my executable. Keep in mind that you will need to move this executable to the machine where you are running this SSIS package. Thanks to SAPIEN for this great product!
Create SSIS Variables for input Parameters for the PowerShell Executable
This part took me a few hours to figure out: how to pass parameters using SSIS variables in combination with "Expressions". Yes!! That was the keyword: "Expressions". I couldn't find a blog out there that properly explained this process. Just one that I will give some credit to, because this keyword got my attention: http://dennyglee.com/2006/05/12/ssis-global-variables-and-execute-process-task/ – "..you can modify the Execute Process Task at the point of the Property Expression attribute..". Unfortunately, that blog didn't show step-by-step how this is done. But lucky you, I'm blogging about it.
Here's how it's done:
Create three variables that will hold the parameter values to be injected into the executable:
Now, we add the "Execute Process Task" component, connect it to the "Data Flow Task", and double-click the "Execute Process Task". In the "Execute Process Task Editor", go to the "Process" section and, under "Executable", add the executable name "AnalyzeSkippedRecords.exe".
And now, to set up your input parameters, go to the "Expressions" section:
Here's where you define the input argument(s) for your executable(s). Under "Property expressions", you select the "Arguments" property, and then click on the "…" button to build the expression:
Yes!! This is the money maker (note that there's a space between the double quotes):
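The expression screenshot didn't survive in this copy; a hedged example of what the "Arguments" property expression could look like, with the three SSIS variable names assumed, concatenating the values separated by spaces:

```
@[User::TableNamePattern] + " " + @[User::DatabaseName] + " " + @[User::SQLInstanceName]
```

These would map to the $Arg1, $Arg2, and $Arg3 PARAM values in the compiled script.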
Now, click "OK" twice to save the changes, and we're ready to run our full SSIS package.
Success!! This solution works.
Check my “CompanyData_SkippedRecords” table:
Now, I can keep working on analyzing this data with PowerShell, and/or other tools.
A couple of pointers:
In the "Execute Process Task Editor", you can troubleshoot your executable session if you leave the "WindowStyle" setting at "Normal". This way you can see the console pop up. Then, when everything is working correctly, change it back to "Hidden".
Also, if you're having problems with the PowerShell executable failing, you can add the "Write-Host 'Pause'; Start-Sleep -Seconds 10;" one-liner to pause the session and give you time to see the error while "WindowStyle" is left at "Normal".
Yes!! It was a great event with over 400 people showing up, and only two PowerShell sessions included. Thanks to all the SQL Saturday organizers for their efforts in putting these events together. I'm honored to be part of it. Thanks Scott and Herve!!
Well, here are both presentations: Session 1 – Let's Talk About SQL PowerShell, and Session 2 – Extending your T-SQL with PowerShell, plus more samples:
Many THANKS!! to all who attended my sessions; I know you were all asking for more at the end. This is just an example that there's a lot to talk about with PowerShell and very little time to do it. But don't worry!
I'm in the process of setting up a LiveMeeting event on the third week of every month at 11:30am (EST). It will be a "PowerShell Code Snippet" 30-minute demo with the purpose of showing blocks of code from myself and our PowerShell community. Hopefully, this will speed up your learning by letting you reuse these blocks of code.
Stay Tuned to this blog site for more information.
I almost forgot… Congrats to all the winners who found markings on the flyers I provided. I gave away 2 SAPIEN PowerShell V2 ebooks (X), 2 PluralSight one-month training subscriptions (PL), and 2 Diskeeper Pro Premier defrag licenses for desktop (D).
And there's a reason for still using this method, which I will show in a minute. Now, PowerShell version 2.0 introduced a new command, "Add-Type", used to simplify loading assemblies:
Add-Type -AssemblyName "Microsoft.SqlServer.Smo";
And it works great! Except when you're working with different versions of SQL Server on the same box; then you may experience some issues loading the SMO assembly using "Add-Type".
Here’s the error message:
PS [3] >Add-Type -AssemblyName "Microsoft.SqlServer.Smo";
Add-Type : Could not load file or assembly 'Microsoft.SqlServer.Smo, Version=9.0.242.0, Culture=neutral,
PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
At line:1 char:9
+ Add-Type <<<< -AssemblyName "Microsoft.SqlServer.Smo";
    + CategoryInfo          : NotSpecified: (:) [Add-Type], FileNotFoundException
    + FullyQualifiedErrorId : System.IO.FileNotFoundException,Microsoft.PowerShell.Commands.AddTypeCommand
I'm sure many of you may have seen this error, and that's why people sometimes go back to using the old faithful V1 format.
Now, you need to be careful because you might be loading the wrong SMO assembly to use against your database. But in most cases, I'm sure it's OK to load the latest one because it can handle older versions. This is why you must test all your scripts before putting them in your production environment.
Well, the old and faithful V1 "[reflection.assembly]::Load…" way of loading your SMO assemblies works because it grabs the latest version from your Windows "Assembly" folder. Here's an example:
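The example itself didn't survive in this copy; the classic V1-style call looks like this (note that the partial-name load is deprecated in later .NET versions, which is part of why Add-Type exists):

```powershell
# V1-style load: pulls the newest registered SMO assembly from the GAC.
# Returns $null (instead of throwing) if SMO is not installed on the box.
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
```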
In my environment I have SQL Server Express, SQL Server 2008 R2 Developer, and SQL Server Denali installed. So, when I use the V1 way to load SMO, you will notice in the PS console that it loads the assembly from SQL Server SMO version 11.0.0.0. Notice that the V2 "Add-Type" command was trying to find 9.0.242.0 and failed. Even if this error might be due to a registry issue, that is not the assembly I want to load.
Now, can we still use the V2 "Add-Type" command to load the correct version of these assemblies? Hell yeah!!
Here is how:
1. You need to confirm the full path and filename of the SMO DLL, which is located in the “C:\Windows\Assembly” folder.
2. It helps to load the latest SMO assembly the V1 way, then use the following .NET line to list the location of that assembly: [system.appdomain]::CurrentDomain.GetAssemblies() | Select Location
3. Now, look for your loaded SMO assembly location, then highlight it and press 'Enter' to copy the line: C:\Windows\assembly\GAC_MSIL\Microsoft.SqlServer.Smo\11.0.0.0__89845dcd8080cc91\Microsoft.SqlServer.Smo.dll
4. Use the V2 "Add-Type" command with the "-Path" parameter, and paste (right-click) to add the copied SMO full path. This value must be a string, so put quotes around it. Don't press Enter yet!!
Here's the sample line copied, but with the version changed to "10.0.0.0":
“C:\Windows\assembly\GAC_MSIL\Microsoft.SqlServer.Smo\10.0.0.0__89845dcd8080cc91\Microsoft.SqlServer.Smo.dll”
In order to load a different version, you need to change the version number manually. In my case, I need to change from 11.0.0.0 to my SQL Server SMO version 10.0.0.0. This is why you need to check which SMO versions are already installed on your system under your Windows Assembly folder.
Finally, pressing Enter on the following command will load the SMO assembly for SQL Server 2008/R2 (this is a one-liner):
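The one-liner itself didn't survive in this copy; based on the path built in the previous steps, it would be (the path and version are from my machine, so adjust them to yours):

```powershell
# Explicitly load the SQL Server 2008/R2 SMO (version 10.0.0.0) by its full GAC path.
Add-Type -Path "C:\Windows\assembly\GAC_MSIL\Microsoft.SqlServer.Smo\10.0.0.0__89845dcd8080cc91\Microsoft.SqlServer.Smo.dll"
```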
As we all desperately try to get into the new SQL Server Denali CTP3, we are going to face a few stones in the road. If you have a clean virtual machine to install it on, then you are OK. But if you are going to create another instance side-by-side with another SQL Server engine, then keep this in mind.
First, about the installation experience: there are good points. My installation was smooth. Prior to the CTP3 installation I had to remove my previous SQL Denali CTP1, and that went without a problem.
If you're a SQL developer with multiple versions of SQL Server installed, then you must be aware that you may have a problem opening your SSIS 2008 R2 solution in the R2 "Business Intelligence Development Studio" (BIDS) after installing CTP3.
Now, the last time I did a SQL Server 2008 R2 side-by-side installation, I had SQL Server 2005 BIDS installed and working properly, but installing SQL Server 2008 R2 affected my SQL Server 2005 BIDS. I experienced the same situation after installing CTP3 and trying to use SQL Server 2008 R2 BIDS.
Here’s the error you will get when trying to open a previously created SSIS solution:
What’s the workaround?
1. You need to have your SQL Server 2008 R2 media.
2. Go to your Control Panel, Add/Remove Programs.
3. Right-click "Microsoft SQL Server 2008 R2 (64-bit)" and select "Remove".
4. Follow all the prompts, select the SQL instance, check to remove only the "Business Intelligence Development Studio", and complete the uninstall.
5. I recommend rebooting at the end of this process.
6. Now, reverse this process by right-clicking "Microsoft SQL Server 2008 R2 (64-bit)" again and selecting "Add".
At the end of this process, you will be able to successfully open your SSIS R2 solution in your R2 BIDS.
Let’s keep trying more stuff with SQL Denali CTP3.
Here's one of the functions that's part of my recent SQLDevTools module – "Get-TSQLTableRecordCount" – which is a good example of a multi-purpose command. The main purpose of this function is to display the record count of all tables in the database. But at the same time it can provide a list of all user databases.
Usage #1 – To show all databases:
In case you don't remember the database names in your SQL Server instance, this command gives you the ability to list all databases. Here's an example:
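The example output didn't make it into this copy; a usage sketch (the -SQLInstanceName parameter name is assumed from the module's other functions shown later) would be:

```powershell
# Hypothetical: called without a database name, the function lists all user databases.
Get-TSQLTableRecordCount -SQLInstanceName "Your-Machine-Name\InstanceName"
```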
Here's a quick way to get data off one of your SQL Server tables and dump it to a CSV file, in just a few steps:
1. Make sure you are in the SQLPS console, or have the community SQLPS (or SQLPSv2) module loaded (i.e. Import-Module SQLPS) in either your PowerShell console or the ISE.
2. Have your query ready, then save it into a PowerShell variable:
$sql = "SELECT * FROM [AdventureWorks].[Production].[Location]"
3. The next one-liner will build the PowerShell object and export it to a *.CSV file. (It executes against localhost only; use the -ServerInstance and -Database parameters if you need to run the query elsewhere.)
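The one-liner itself didn't survive in this copy; using standard SQLPS and built-in cmdlets, it would be something like (the output path is an assumption):

```powershell
# Run the query and export the resulting row objects to CSV in one pipeline.
Invoke-Sqlcmd -Query $sql | Export-Csv -Path "C:\Temp\Location.csv" -NoTypeInformation
```

Add -ServerInstance (and -Database) to Invoke-Sqlcmd when targeting a remote instance.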
I'm excited to release this new version of SQLDevTools today, Monday, June 6th. Here's the list of updates:
[sourcecode language="powershell"]
##- SQLDevTools Alpha 0.1.0 Released 06/06/2011:
##-
##- Functions fixed:
##   1. Get-TSQLTableRecordCount - dual syntax issue not working properly.
##   2. Write-TSQLScriptIndex - parameters 'DropIndexOnly' and 'CreateIndexOnly' were misspelled.
##   3. Get-TSQLTableTriggers - was not working; added modify_date field as last_modified.
##- Updated two functions to include renaming of the destination table (Help updated):
##   1. Write-TSQLScriptDFConstraint
##   2. Write-TSQLScriptInsertSelect
##- Added new functions, including two to change the destination table name in the PS object (Help included):
##   1. Rename-TSQLScriptedTableName
##   2. Rename-TSQLScriptedTableIndexes
##   3. Get-TSQLUserDefinedDataTypes
##   4. Get-TSQLXmlSchemaCollections
##   5. Get-TSQLUserDefinedFunctions
##   6. Write-TSQLScriptXmlSchemaCollections
##   7. Write-TSQLScriptUserDefinedFunctions
##   8. Write-TSQLScriptUserDefinedDataTypes
[/sourcecode]
The next release will include views, user stored procedures, and database triggers.
Basically, this completes the basic need to script out most objects needed to rebuild a table, or a series of tables. It takes into account Identity, computed, and user-defined objects. Also, I included the ability to alter (rename) your scripted destination table name and index objects.
For testing these functions, I've used Microsoft's "AdventureWorks" database, which is full of scripting surprises. Keep in mind, I've used a mix of both SMO and embedded T-SQL to generate the results.
Here’s a teaser sample using new function to script User Defined Data Types:
[sourcecode language="powershell"]
## Save PS Object to a file, and then view it:
[string] $ScriptToFile = Write-TSQLScriptUserDefinedDataTypes -SQLInstanceName $SQLInstanceName -DatabaseName $SourcedbName `
-DestinationdbName $DestinationdbName
$ScriptToFile | Out-File c:\Temp\SQLScript01.sql
ii c:\Temp\SQLScript01.sql
[/sourcecode]
Here's a preview of what I've been doing at work. I created about 12 new functions to assist me in automating a task to copy table(s) from one database to another. This process will also rebuild the data, indexes, and constraints from the scripts extracted during the process.
All these functions use embedded T-SQL scripts to assist in scripting the SQL table objects. Finally, I will end up with a module that loads all my functions when needed.
Just recently, I updated two of the main functions I posted in the Microsoft Script Repository:
1. "Function Write-SQLCreateTable" – Updated 05/25/2011: added three new switch parameters so we can create individual scripts: DropTableOnly, CreateTableOnly, and TurnOffIdentity.
2. "Function Write-SQLInsertSelect" – Updated 05/25/2011: added the ability to detect whether there's an identity column so it can add the "Set Identity_Insert …" line, and added a Set-Location to go back to your "Home Drive" when the process completes.
These two updated functions require the SQLPS module or they won't work. That's why I included both the import and the removal of the SQLPS module. Also, these functions will not script out indexes and constraints.
In my new SQLDevTools module, the functions will not import/remove the SQLPS module. It will include functions to script out indexes, check constraints, foreign keys, and even triggers. The function names have changed slightly, and a test-functions script is included to assist in rebuilding a table in another database. More functions will be added to this module. I will make it available to the community through CodePlex when it's ready.
Here’s a preview snapshot of my new functions:
Special Thanks to the following bloggers for their contributions: