PowerShell Deep Dive session at Work – a Success!

At TBC Corp. (Tire Kingdom) I scheduled the second PowerShell meeting for September 28th, close to noon.  I gave a 2-hour PowerShell Deep Dive where everyone brought their laptops so they could follow along with the session.  It was a total success!  I had 15 people signed up and 30 showed up for this session.

This was a diversified group of people which included: System Administrators (Windows, Linux, iSeries, SAP), Data Center Operators, Business Intelligence Developers, SQL Server Administrators, and even some Business Analysts.  The general overview went well, and the whole group was very interested in learning about PowerShell.

It all happened thanks to the support of all the technology managers, who realized this technology needs to be adopted and embraced in their daily responsibilities.  This opens up new opportunities to improve productivity.

I’m honored that the TBC Corp organization allows me to be an “influencer” and provide guidance on this amazing technology.  I thank the Microsoft MVP program for helping me accomplish my goals and keep up to date with technology.

I’m planning to do more TBC PowerShell Monthly meetings so they can easily adopt PowerShell and its amazing new features such as Desired State Configuration.
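For anyone curious about the Desired State Configuration feature mentioned above, here’s a minimal sketch of what a configuration looks like. The configuration name, node, and folder path are just illustrative:

[sourcecode language=”powershell”]
## - Minimal DSC sketch (illustrative names); uses the built-in File resource
## - to make sure a folder exists on the local machine:
Configuration DemoFolder
{
	Node 'localhost'
	{
		File ScriptsFolder
		{
			Ensure          = 'Present'
			Type            = 'Directory'
			DestinationPath = 'C:\Scripts'
		}
	}
}

## - Compile the configuration into a MOF file, then apply it:
DemoFolder -OutputPath 'C:\DSC\DemoFolder';
Start-DscConfiguration -Path 'C:\DSC\DemoFolder' -Wait -Verbose;
[/sourcecode]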

I’m so excited!!

PowerShell Studio working with Datagrid component

I love working with PowerShell. Yes, I admit it! And I love using SAPIEN Technologies’ “PowerShell Studio” product, especially when I can develop a productive Windows application for our Data Center team to use. This makes my job, and theirs, easier to handle.

I just developed a Windows application to handle our company table updates, which monitors failures and recovery via PowerShell. The third-party SAP vendor application our team is using is old and doesn’t give any productive way to handle failures and recovery for these nightly tasks. And here’s where PowerShell comes to the rescue.

So, one of the main components in building this Windows application is the use of the Datagrid. PowerShell has the very useful Out-GridView cmdlet, but the SAPIEN product gives you the .NET Visual Studio look-and-feel for developing Windows applications.

The recent enhancements to the Out-GridView cmdlet have made it a powerful and very useful command:

1. Filtering the data.
2. The “-PassThru” parameter, so you can select one or multiple items.
3. The “-Title” parameter.
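A quick way to try all three of these features together, using Get-Process as sample data:

[sourcecode language=”powershell”]
## - Filterable grid with a custom title; -PassThru returns whatever
## - rows are highlighted when the OK button is clicked:
$selected = Get-Process |
	Select-Object Name, Id, StartTime, WorkingSet |
	Out-GridView -Title 'Select one or more processes' -PassThru;

## - The selection comes back as regular objects for further use:
$selected | Format-Table -AutoSize;
[/sourcecode]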

Now, getting back to using a Datagrid in a SAPIEN PowerShell Windows application. PowerShell Studio gives us the ability to build Windows applications with PowerShell code. For a person who has never used Visual Studio, the learning curve is short thanks to all the wizards this product includes to get you started.

Now, some time ago I posted a blog about creating a Windows application with a datagrid. Here are the links:
1. http://www.maxtblog.com/2015/04/playing-with-powershell-studio-2015-windows-form-12/
2. http://www.maxtblog.com/2015/04/playing-with-powershell-studio-2015-windows-form-22/

After I posted the above blog articles, my MVP friend June Blender reminded me that SAPIEN had a snippet, “ConvertTo-DataTable”, which can be used to convert a PowerShell object to be loaded into the Datagrid. Here’s the blog link:
http://www.maxtblog.com/2015/05/using-powershell-studio-2015-snippet-sample/

But just recently, while building more applications with a Datagrid, I discovered that the DateTime value returned in the datagrid is incomplete and only shows the date.

So, I went back to my own version, which builds the object for the datagrid, to verify that the datetime values are retained.
Then, I decided to create a function, “ConvertTo-DataSetObject”, which is a smaller version and seems to work with ‘datetime‘ values loaded into a Datagrid.

Then, I added the function to my User snippet section. Here’s the function “ConvertTo-DataSetObject” code snippet:

[sourcecode language=”powershell”]
function ConvertTo-DataSetObject
{
<#
.SYNOPSIS
Convert an object to a dataset object for a datagrid.

.DESCRIPTION
Convert an object to a dataset object for a datagrid by creating a global variable "$global:dsObj".

.PARAMETER InputObject
This parameter will accept any PowerShell variable.

.EXAMPLE
PS C:\> ConvertTo-DataSetObject -InputObject $psObject

.NOTES
This function is useful when creating objects for a datagrid component.
#>
	Param (
		[object]$InputObject
	)

	## - Create the DataSet and DataTable objects:
	$global:dsObj = New-Object System.Data.DataSet;
	$global:dtObj = New-Object System.Data.DataTable("PSObjTable");

	## - Get the property names once; they drive both the columns and the rows:
	$propNames = ($InputObject | Get-Member -MemberType '*Property').Name;

	## - Populate columns:
	foreach ($c in $propNames)
	{
		[void]$global:dtObj.Columns.Add($c);
	};

	## - Populate rows:
	foreach ($item in $InputObject)
	{
		## - Initialize row:
		$nwRow = $global:dtObj.NewRow();

		## - Copy each property value into the row:
		foreach ($c2 in $propNames)
		{
			$nwRow[$c2] = $item.$c2;
		}

		## - Add row to the table:
		$global:dtObj.Rows.Add($nwRow);
	};

	$global:dsObj.Tables.Add($global:dtObj);

};

[/sourcecode]
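A quick way to test the function from the console, using Get-Process as sample data, and confirm the datetime values survive the conversion:

[sourcecode language=”powershell”]
## - Build the dataset from a sample object collection:
$psObject = Get-Process | Select-Object Name, Id, StartTime;
ConvertTo-DataSetObject -InputObject $psObject;

## - Inspect the resulting DataTable; StartTime should still show
## - both the date AND the time portion:
$global:dsObj.Tables['PSObjTable'] | Select-Object -First 5 | Format-Table -AutoSize;
[/sourcecode]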

Here are some additional tips when working with the datagrid component:

In the Datagrid properties:
1. Add a descriptive text under CaptionText.
2. Set ReadOnly to True.

Use code to resize all columns before displaying the datagrid:
[sourcecode language=”powershell”]

$datagrid1.PreferredColumnWidth = '120';
# ...
ConvertTo-DataSetObject -InputObject $d;
$datagrid1.SetDataBinding($dsObj, "PSObjTable");

[/sourcecode]

Windows 10, PowerShell, Visual Studio Tools and more

There is always more to learn, and education never stops.

Now that Windows 10 has arrived, take advantage of the new tools to start developing applications. Check out PowerShell version 5.0 and the new Visual Studio 2015 with its tools.

Take advantage of all the free training provided by both Microsoft people and MVPs by connecting to either Microsoft Virtual Academy or Channel 9.  There are no excuses, just do it!

Click on the following banners for more information (it’s free):

DownloadVS2015a

MSVirtualAcada

Channel9a

Don’t forget to check out the new Microsoft PowerShell at:

Also, take the opportunity to check out our PowerShell community sites, such as:

All these are GREAT source of information about PowerShell.  Along with it you can find other organizations that provide free registrations to webcasts, such as:

And, for affordable online training check Pluralsight Video Library: http://www.pluralsight.com/

So, as you can see, there are plenty of resources available to start your education in Microsoft Technologies.

Go ahead and don’t wait any longer!

FLPSUG PowerShell August 2015 special events

A live Florida PowerShell User Group session at PC Professor in West Palm had to be rescheduled to a later date. Topic: “Integrating PowerShell in Visual Studio Community“.

For more information and registration:  (Coming soon)

I’m hoping to provide more live presentations at this location in the future. Stay tuned!!

Also, our online Florida PowerShell User Group meeting is pleased to have PowerShell MVP Trevor Sullivan on August 27th at 7:00pm. Topic: “Microsoft Azure Cloud Automation using PowerShell“.

For more information and registration: http://www.eventbrite.com/e/florida-powershell-user-group-monthly-meeting-august-2015-tickets-18088286552

Come and join us on any of these free events!

FLPSUG – “PowerShell SQL Server SMO Simplicity” video available

Recorded Thursday – 05/28/2015 7:00pm
Speaker Bio: Maximo Trinidad, (Microsoft MVP – PowerShell)

Topic: “PowerShell SQL Server SMO Simplicity” –
In this presentation I will be showing how easy it is to use PowerShell to extract information from SQL Server. I will cover creating, discovering, and formatting objects, which is part of how you use PowerShell. At the same time, I will show the use of SMO to connect and interact with your SQL Server engine to collect information.

Download presentation script file:

Devart dbForge Studio for SQL Server Management Tool

I’ve been using Devart’s dbForge Studio for SQL Server, mainly to help document some of my legacy databases and to build some T-SQL scripts to embed in my PowerShell scripts.

This is a Great tool to have in your SQL arsenal. Check it out!

Download free for 30 days evaluation copy: https://www.devart.com/dbforge/sql/studio/download.html

* Customized Skin
* Database Comparison
* .. Many more interesting features.

PowerShell at South Florida SQL Saturday #379 was a Great Success

Once again I’m thankful to the organizers for having me speak at this “Awesome” event.  I appreciate all who attended my session, making it a Great Success, and they got more.  My “PowerShell with Visual Studio SQL Data Tools” session also became a “SMO Simplicity Recap” session. They got two sessions in one.

Session highlights

1. Visual Studio Community 2013 is the environment to use for integrated development, including Microsoft and third-party tools like:
a. PowerShell Tools for Visual Studio
b. Python Tools for Visual Studio
c. SQL Data Tools – Business Intelligence Development
d. PowerShell Studio 2015 – call from within Visual Studio
e. PrimalXML 2015 – call from within Visual Studio
f. And many more can be added…

2. Visual Studio integration with both Team Foundation Server and GitHub repositories.

3. A quick dive in XML objects.

4. PowerShell error trapping in an integrated solution.

5. PowerShell SMO embedding and executing T-SQL code.

During my presentation I demoed, for the first time, how Visual Studio can trap PowerShell errors from within an SSIS Script Task component. Everyone’s jaw dropped. And besides that, you can also run the script by itself and still trap its errors.

In the SMO session, I showed how useful a ScriptBlock can be when embedding and running T-SQL code while PowerShell reads one object at a time.
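The pattern looks roughly like this. To be clear, this is a sketch of the idea, not the exact demo code; the T-SQL and variable names are illustrative:

[sourcecode language=”powershell”]
## - Sketch: build and run a T-SQL snippet per database, one object at a time:
[system.reflection.assembly]::LoadWithPartialName("Microsoft.SQLServer.Smo") | Out-Null;
$MySQL = New-Object Microsoft.SqlServer.Management.Smo.Server '.';

## - The ScriptBlock builds the T-SQL for whatever database name it is handed:
$tsqlBlock = {
	param($dbName)
	"select '$dbName' as [DatabaseName], count(*) as [TableCount] from sys.tables";
};

foreach ($db in $MySQL.Databases)
{
	## - Invoke the ScriptBlock for this object, then execute the result:
	$qry = & $tsqlBlock $db.Name;
	$result = $db.ExecuteWithResults($qry);
	$result.Tables[0];
};
[/sourcecode]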

My presentation and demo files are all available on the South Florida SQLSaturday Schedule page:
http://www.sqlsaturday.com/379/Sessions/Schedule.aspx

Once again, THANKS to everyone for your attendance and support.

June Month for learning PowerShell and SQL Server SMO

Yes! This is the month you could learn PowerShell and SQL Server Management Objects (SMO) techniques to start building scripts to manage your SQL Servers.

I will be speaking at the following events:

1. South Florida SQL Saturday #379 on 06/13/2015, an all-day event. Session: “PowerShell with Visual Studio SQL Data Tools” (03:00 PM – 04:15 PM). Register at: http://www.sqlsaturday.com/379/Sessions/Schedule.aspx

2. IDERA’s Geek Sync Webinar – “PowerShell Essentials using SQL Server SMO” on 06/16/2015 at 10:00 AM CT (11:00 AM EST). Register at: https://attendee.gotowebinar.com/register/5998631242300195841

At the SQLSaturday event there will be some giveaways during my session.  Go ahead and register for any of these events.  It’s free!

PowerShell SQL Server SMO Simplicity Series – 2

Now that we have the connection to the server, there are a couple of lines we could include to avoid a connection timeout during execution of our T-SQL script using SMO with PowerShell. So, this blog will cover executing a T-SQL script and viewing its results.

The *T-SQL script will build a table with data statistics showing which of my selected database’s indexes need to be either reorganized or rebuilt.

*Note: The following T-SQL script is modified from the original “Index optimization – REBUILD vs. REORGANIZE” by author Sarjen Haque. It can be found at http://sqltouch.blogspot.com/2013/07/index-optimization-rebuild-and.html

[sourcecode language=”powershell”]
$TSQLqry1 = @"
if object_id('$($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]') is not null
begin
Drop Table $($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList];
end

select
'$($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]' as [IdxFragStatDatabase] ,
'$($DatabaseName)' as [IdxFragStatOnDatabase] ,
object_name(o.object_id) as [table_name] ,
schema_name(o.schema_id) as [schema_name] ,
i.name as [index_name] ,
i.type_desc as [index_type] ,
dmv.page_count as [Page_Count] ,
dmv.fragment_count as Fragment_Count,
round(dmv.avg_fragment_size_in_pages, 2, 2) as [avg_fragment_size_in_pages] ,
round(dmv.avg_fragmentation_in_percent, 2, 2) as [avg_fragmentation_in_percent] ,
case when dmv.avg_fragmentation_in_percent <= 5 then 'RELAX'
when dmv.avg_fragmentation_in_percent <= 30 then 'REORGANIZE'
when dmv.avg_fragmentation_in_percent > 30 then 'REBUILD'
end as [action],
getdate() as [RunOnDate]
into $($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]
from sys.partitions as p with ( readpast )
inner join sys.indexes as i with ( readpast ) on i.object_id = p.object_id
and i.index_id = p.index_id
inner join sys.objects as o with ( readpast ) on o.object_id = i.object_id
inner join sys.dm_db_index_physical_stats(db_id(), null, null, null, N'LIMITED') dmv
on dmv.OBJECT_ID = i.object_id and dmv.index_id = i.index_id
and dmv.partition_number = p.partition_number
where objectproperty(p.object_id, 'IsMSShipped') = 0
and (i.name is not Null and i.type_desc <> 'HEAP')
order by [avg_fragmentation_in_percent] DESC,
[table_name],
[index_name]
"@;

[/sourcecode]

Notice that PowerShell will substitute the values for $($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList] within the Here-String @” .. “@.  It’s important to note that the closing “@ of a Here-String must be at the very beginning of its line (no tabs or spaces before it) or you’ll get an error.
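A tiny example of both rules, variable substitution inside the Here-String and the placement of the closing marker:

[sourcecode language=”powershell”]
$tblSchema = 'dbo';
$DatabaseName = 'AdventureWorks2014';

## - $(...) expressions expand inside a double-quoted Here-String:
$qry = @"
select * from $($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]
"@
## - The closing "@ above must start in column 1; any leading
## - whitespace (tab or space) before it causes a parse error.
$qry;
[/sourcecode]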

Back to the connection section. We need to include the following just after the ‘$MySQL = New-Object Microsoft.SqlServer.Management.Smo.Server …‘ line:

$MySQL.ConnectionContext.ConnectTimeout = 21600;   (optional; this example sets 21600 sec = 6 hrs)
$MySQL.ConnectionContext.StatementTimeout = 0;   (set to 0 for no timeout when running T-SQL queries)

Here’s the sample for the code placement:

[sourcecode language=”powershell”]
## – Connect and Execute T-SQL script:
[system.reflection.assembly]::LoadWithPartialName("Microsoft.SQLServer.Smo") | Out-Null;
$MySQL = new-object Microsoft.SqlServer.Management.Smo.Server $SQLServerInstanceName;
#$MySQL.ConnectionContext.ConnectTimeout = 21600; #Optional#
$MySQL.ConnectionContext.StatementTimeout = 0;

[/sourcecode]

If you don’t use the ‘ConnectionContext.StatementTimeout’ you’ll get a timeout error after 10 minutes of execution.

Now, we proceed to create the variables used for the T-SQL script:

[sourcecode language=”powershell”]
[string] $SQLServerInstanceName = '.';
[string] $DatabaseName = 'AdventureWorks2014';
[string] $tblSchema = 'dbo';
[string] $IdxStatDatabase = 'devMaxTest';

[/sourcecode]

The purpose of $IdxStatDatabase is to redirect the index stats report to another database. Then, we use the following line to execute our T-SQL script using SMO with PowerShell.

[sourcecode language=”powershell”]
$r = ($MySQL.Databases[$DatabaseName]).ExecuteWithResults($TSQLqry1);

[/sourcecode]

After the PowerShell script executes, open SQL Server Management Studio (SSMS), go to the database where the index stats data is stored, and query the table to view the results.
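Alternatively, you can read the results straight from PowerShell with the same SMO connection. This sketch reuses the sample variables from this post ($MySQL, $IdxStatDatabase, $tblSchema, $DatabaseName):

[sourcecode language=”powershell”]
## - Read back the top offenders from the stats table just built:
$qryStats = "select top 10 [table_name], [index_name],
[avg_fragmentation_in_percent], [action]
from $($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]
order by [avg_fragmentation_in_percent] desc";

$stats = ($MySQL.Databases[$DatabaseName]).ExecuteWithResults($qryStats);
$stats.Tables[0] | Format-Table -AutoSize;
[/sourcecode]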

This report will help you identify those indexes that need to be taken care of.

Full sample script below:

[sourcecode language=”powershell”]
## – Set variables for T-SQL script:
$SQLServerInstanceName = '.';
$DatabaseName = "AdventureWorks2014";
$tblSchema = "dbo";
$IdxStatDatabase = "devMaxTest";

## – Build the T-SQL script for execution:
$TSQLqry1 = @"
if object_id('$($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]') is not null
begin
Drop Table $($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList];
end

select
'$($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]' as [IdxFragStatDatabase] ,
'$($DatabaseName)' as [IdxFragStatOnDatabase] ,
object_name(o.object_id) as [table_name] ,
schema_name(o.schema_id) as [schema_name] ,
i.name as [index_name] ,
i.type_desc as [index_type] ,
dmv.page_count as [Page_Count] ,
dmv.fragment_count as Fragment_Count,
round(dmv.avg_fragment_size_in_pages, 2, 2) as [avg_fragment_size_in_pages] ,
round(dmv.avg_fragmentation_in_percent, 2, 2) as [avg_fragmentation_in_percent] ,
case when dmv.avg_fragmentation_in_percent <= 5 then 'RELAX'
when dmv.avg_fragmentation_in_percent <= 30 then 'REORGANIZE'
when dmv.avg_fragmentation_in_percent > 30 then 'REBUILD'
end as [action],
getdate() as [RunOnDate]
into $($IdxStatDatabase).$($tblSchema).[$($DatabaseName)_IndexFragmentationStatsList]
from sys.partitions as p with ( readpast )
inner join sys.indexes as i with ( readpast ) on i.object_id = p.object_id
and i.index_id = p.index_id
inner join sys.objects as o with ( readpast ) on o.object_id = i.object_id
inner join sys.dm_db_index_physical_stats(db_id(), null, null, null, N'LIMITED') dmv
on dmv.OBJECT_ID = i.object_id and dmv.index_id = i.index_id
and dmv.partition_number = p.partition_number
where objectproperty(p.object_id, 'IsMSShipped') = 0
and (i.name is not Null and i.type_desc <> 'HEAP')
order by [avg_fragmentation_in_percent] DESC,
[table_name],
[index_name]
"@;

## – Connect, set timeout values:
[system.reflection.assembly]::LoadWithPartialName("Microsoft.SQLServer.Smo") | Out-Null;
$MySQL = new-object Microsoft.SqlServer.Management.Smo.Server $SQLServerInstanceName;
#$MySQL.ConnectionContext.ConnectTimeout = 21600; #Optional#
$MySQL.ConnectionContext.StatementTimeout = 0;

## – Execute T-SQL script:
$r = ($MySQL.Databases[$DatabaseName]).ExecuteWithResults($TSQLqry1);
Write-Host "End of Script";

[/sourcecode]