Chris Umbel

PowerShell 2.0 Out-GridView, ISE and ScriptCmdlets

In my last post I discussed the background-job and remoting systems of PowerShell 2.0. While I personally find those features interesting, there are three more I'd like to discuss that have broader appeal: the Out-GridView cmdlet, ScriptCmdlets and the PowerShell Integrated Scripting Environment (ISE).

Note that PowerShell 2.0 is in CTP 3 at the time of writing and everything is subject to change.

Out-GridView

The Out-GridView cmdlet gives you a lightweight, sortable/searchable grid that you can easily pipe collections into, as demonstrated below.

ls | Out-GridView

ScriptCmdlets

Previously the only way to develop your own cmdlets was to resort to one of the higher-level .Net languages such as VB.Net or C#. That restriction has been removed with the introduction of ScriptCmdlets.

Consider the following ScriptCmdlet that retrieves a user's Twitter status:

Cmdlet Get-UsersStatus
{
    # definition of the Cmdlet's parameters
    param([string] $username)

    $url = ("http://twitter.com/users/{0}.xml" -f $username)

    $webClient = New-Object Net.WebClient
    $responseDoc = New-Object Xml.XmlDocument

    $responseDoc.LoadXml([Text.Encoding]::ASCII.GetString(
      $webClient.DownloadData($url)))

    $responseDoc.SelectSingleNode("/user/status/text")
}

Which can now be executed cmdlet-style:

Get-UsersStatus "chrisumbel"

With a minor change to our param definition we can now accept input from the pipeline:

param([ValueFromPipeline][string] $username)

So our cmdlet can be executed as such:

"chrisumbel", "wimbledon" | Get-UsersStatus

which gets the status of all the users piped into it, chrisumbel and wimbledon in this case.

An interesting thing to consider is that ScriptCmdlets can override other Cmdlets, including those that are built-in.
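For example, a ScriptCmdlet could shadow the built-in Get-Date. The following is a hypothetical sketch using the same CTP 3 Cmdlet syntax as above; the module-qualified name is how the original can still be reached:

```
# hypothetical override of the built-in Get-Date
Cmdlet Get-Date
{
    # decorate the output of the real cmdlet, which remains
    # reachable via its module-qualified name
    "The time is {0}" -f (Microsoft.PowerShell.Utility\Get-Date)
}
```

This is handy for decorating or constraining built-in behavior, though overriding core cmdlets can obviously confuse other scripts that expect the stock behavior.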

PowerShell ISE

A weakness of PowerShell up to this point has been that you typically had to resort to a third-party solution to get a rich development experience. While the PowerShell ISE is certainly not a replacement for some of the fancy commercial offerings, it's far more helpful than Notepad. It grants the scripter syntax highlighting, one-click script running and a tabbed environment for working with multiple scripts.

Check out the following screenshot which contains the code from our Twitter status example:

Conclusion

These features go a long way to making a powerful shell environment even more powerful, not to mention quite a bit more friendly. Here's looking forward to the official release!

Sun Jul 05 2009 17:07:21 GMT+0000 (UTC)

Comments

Asynchronous and Remote Execution with PowerShell 2 CTP3

An interesting feature released with the PowerShell 2 CTP3 is the ability to run background jobs consisting of arbitrary PowerShell code. In order to use this functionality you must download and install the PowerShell 2 CTP3 and the WinRM (Windows Remote Management) 2.0 CTP.

Keep in mind that installing the CTP requires uninstalling previous versions of PowerShell. Depending on the PowerShell and operating system versions involved the procedure can vary. Google/Bing is your friend.

Once you've brought up PowerShell you have to enable remoting by invoking the aptly named Enable-PSRemoting cmdlet:

Enable-PSRemoting -force

Now the meat. Consider the following code:

# copy bigfile.txt on a background thread
$job = start-job -scriptBlock { cp bigfile.txt bigfilecopy.txt }

# here's where we'd perform some other logic while our file copies

# wait for job to finish
wait-job $job

Note that you can also retrieve the status of jobs and the return values with the Get-Job and Receive-Job cmdlets respectively.
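For instance, after kicking off a job you might check on it and harvest its output like so (a minimal sketch using the cmdlets named above):

```
# list all jobs and their current states
Get-Job

# block until our job completes, then collect its output
Wait-Job $job
Receive-Job $job
```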

More interesting still is the ability to execute jobs remotely on other systems running PowerShell and WinRM. This can be demonstrated with the Invoke-Command cmdlet coupled with the -AsJob parameter as follows:

Invoke-Command -ComputerName Comp1, Comp2 -ScriptBlock { cp bigfile.txt bigfilecopy.txt } -AsJob

The -ComputerName parameter's values of Comp1 and Comp2 indicate that our script will execute on two remote machines named Comp1 and Comp2.

In conclusion, PowerShell 2 introduces some interesting features for asynchronous and remote operations, opening many doors for administrators and developers alike with minimal code.

Tue Jun 30 2009 19:50:15 GMT+0000 (UTC)


Understanding Source Code with NDepend and CQL

The longer I work in this industry the more I realize the pain involved when source code gets out of control. The larger the project the harder it is to refactor and the longer it can take to see the need to do so. Then once you've made the decision to refactor the task can become so overwhelming and complex that you do so ineffectively or simply give up.

There have long been tools to combat refactoring complexities, but one of the more interesting modern ones I've found is NDepend. What specifically intrigues me about this tool is its Code Query Language (CQL), which is NDepend's flagship feature. CQL is a language patterned after SQL that can be used to determine which source code elements meet specified criteria. This allows the developer to effectively understand and navigate a codebase, which is critical when developing and refactoring.

To demonstrate I'll need a nicely-sized Visual Studio solution, let's say the Enterprise Library 4.1 application block's source code. According to the report generated after adding the solution to an NDepend project we'll be querying a solution of 1,542 classes spread over 28,281 lines of code which compiles down to 196,966 IL instructions.

Now let's take a look at some actual CQL. Consider the following line:

SELECT TYPES 
WHERE DeriveFrom "System.Drawing.Design.UITypeEditor"

Notice the similarity to SQL: there's a familiar SELECT and a WHERE clause. Although this particular statement lacks a FROM clause, they are supported and will be demonstrated later. The TYPES expression indicates that the query will return .Net types such as classes and structs. The DeriveFrom "System.Drawing.Design.UITypeEditor" expression simply stipulates that all types returned will subclass System.Drawing.Design.UITypeEditor.

The result is a list of types that you can select in NDepend to perform a number of actions, such as navigating directly to the class in Visual Studio or Reflector, or viewing dependency graphs and matrices.

Let's consider another query, this time slightly more complex.

SELECT TYPES 
FROM ASSEMBLIES "Microsoft.Practices.EnterpriseLibrary.Data.Configuration.Design"
WHERE NameLike "Oracle.*Builder"
AND IsSealed 

This query employs a FROM clause which limits the results to the assembly "Microsoft.Practices.EnterpriseLibrary.Data.Configuration.Design". Also the NameLike expression accepts a regular expression which filters the results to types named accordingly. As you might guess, the IsSealed expression causes the query to return only sealed classes.

CQL queries need not always be written manually, as much of the core NDepend functionality simply writes them for you. Interestingly, the software is forthright about this and includes CQL queries directly in its menu system, as shown below.

CQL is certainly one of the most flexible tools a developer can employ to understand a large codebase, especially one they're not entirely familiar with. It's also very intuitive because its set-based query approach is something most developers are already accustomed to. Despite its simplicity it's very feature-rich, and I suggest taking a look at the CQL 1.8 specification for a more detailed view.

I also suggest taking a look at CQL constraints, a more advanced feature that facilitates proactive management of source code.
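To give a taste, a constraint is essentially a CQL query wrapped in a WARN IF clause that NDepend evaluates on every analysis. A sketch of one that flags overly long methods might look like:

```
// warn during analysis if any method exceeds 30 lines of code
WARN IF Count > 0 IN SELECT METHODS WHERE NbLinesOfCode > 30
```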

Mon Jun 22 2009 16:06:48 GMT+0000 (UTC)


Object Oriented Databases with db4o

One of the most notable initiatives in the business software development industry in the last few years is the simplification of data access. The core complication lies in the impedance mismatch between relational storage and object oriented code higher in the stack. Far too much plumbing is required to map object properties to columns in result sets, deal with data type variations and handle other annoyances.

The most popular response to the problem has been to leave relational storage in place and introduce ORM (Object Relational Mapping) frameworks to automate the conversion between relational and object oriented structures. This approach is popular in business environments because having a relational database under the hood offers many advantages such as ease of reporting and integration.

Object oriented databases, however, offer an interesting alternative.

While I personally think it'll be quite some time before object oriented databases are reasonable for most large-scale enterprise applications they're certainly viable for a range of projects and can be incredibly simple to adopt.

As a demonstration I'll whip up a quick .Net console app that will employ db4o, a popular object oriented database product, to store and query some simple sales order data.

First we need some objects to store. I'll define the following "Order" and "Detail" classes for my application to consume, store and query. Nothing special has to be done to these classes to make them eligible for db4o persistence. This will all be purely POCO.

public class Detail
{
	public int ItemNumber { get; set; }
	public short Qty { get; set; }
}

public class Order
{
	public string CustomerName { get; set; }
	public List<Detail> Details = new List<Detail>();
}

Now I'll set up a connection to a database file. Note that it's also possible for db4o to operate against a server over a network.

IObjectContainer database = Db4oFactory.OpenFile("test.db4o");

Keep in mind that the file "test.db4o" does not have to exist on the filesystem before our code is run. If the file does not exist db4o will create it for us.

Now I'll instantiate some Order and Detail objects and persist them to disk via the container's "Store" method.

database.Store(new Order() { CustomerName = "Chris Umbel", 
	Details = new List<Detail>() { 
		new Detail() { ItemNumber = 1, Qty = 3} 
	} 
});

database.Store(new Order() { CustomerName = "Billiam Smith", 
	Details = new List<Detail>() { 
		new Detail() { ItemNumber = 1, Qty = 1 }, 
		new Detail() { ItemNumber = 2, Qty = 1} 
	} 
});

That was pretty easy and about as plumbing-free as could be hoped for. Now I'll use LINQ to query what we've stored.

IEnumerable<Order> orders = 
	from Order o in database 
	from Detail d in o.Details
	where d.ItemNumber == 2
	select o;

foreach (Order o in orders)
{
	Console.WriteLine(o.CustomerName);
}

Which produces the following output:

Billiam Smith

because that was the only order stored containing an item #2.

db4o also offers querying methods other than LINQ, such as its "Native Query" implementation, Query-By-Example and the SODA API, but they're beyond the scope of this introduction.

As a final matter of housekeeping I'll clean up after myself and close my connection to the database file.

database.Close();

Based on that simple example it's evident that persisting objects with db4o is painless and requires minimal plumbing. More importantly, it does not impose any restrictions on the inheritance chain, so you can subclass however you choose.

There are plenty of other object oriented database systems out there each with their own strengths and weaknesses. I recommend checking out ODBMS.org for more info on the subject.

Sun Jun 07 2009 18:37:34 GMT+0000 (UTC)
