Using New-Object -ArgumentList when the constructor takes one parameter that is an array

PowerShell is a great scripting language. One of its greatest features is that you have the full power of the .NET Framework at your disposal when you use it. However, PowerShell is a dynamically typed language, and the .NET CLR was designed for static languages. Because of this, types sometimes do not get resolved correctly when you call CLR assemblies from PowerShell.

I ran into this recently while calling New-Object to invoke the MemoryStream constructor that takes an array of bytes as a parameter. PowerShell did not automatically wrap the byte[] in an object[] containing the byte[]. Instead, I got an error.

[Screenshot: Windows PowerShell New-Object MemoryStream error]

The same error appeared if I tried New-Object System.IO.MemoryStream @($bytes).

I initially solved my problem the hard way: I created a List<byte>, added the array to the list, and called List.ToArray(). Eventually, I figured out how to do it correctly thanks to this MSDN PowerShell Tip of the Week. To make an array of one item, you precede the item with a comma (during my initial experiments I had tried placing the comma afterwards). Since the argument list is not the first parameter of New-Object, I needed to wrap it in parentheses. So I ended up with New-Object System.IO.MemoryStream (,$bytes), which worked as illustrated below.

[Screenshot: Windows PowerShell New-Object MemoryStream success]

My ideas for The Atlantis Interactive Schema Engine

I first became aware of the open source Atlantis Interactive Schema Engine on February 24th, 2011 thanks to a retweet by Paul Randal. It's a .NET assembly which contains most of the “heavy lifting” code related to the schema operations in their various products. Matt Whitfield, the sole proprietor of Atlantis Interactive, is going to set up an account on CodePlex or one of the other forges so people can post bugs, feature requests, and of course patches. Personally, I have several ideas for feature requests. However, at the moment I have no channel for these requests besides emailing Matt. While Matt has been very responsive, I wanted the conversation to be more public, so I talked to Matt, and he agreed it was a good idea to put my ideas in this blog post. Please use the comments to add your own bugs, feature requests, or patches, or to comment on my feature requests.

Now, here are the feature requests.

Integration with ILSpy

ILSpy is a free and open source .NET decompiler. Its development was accelerated when Red Gate's .NET Reflector ceased to be free. It would be great if the library could decompile CLR assemblies to C# with this tool. Even better would be producing a diff of the decompiled source of two CLR assemblies.

PowerShell-related improvements

PowerShell cmdlets

I’ve written about using the schema engine with PowerShell previously. It’s a relatively pleasant experience, but it could be improved with PowerShell cmdlets. I’m pretty new to PowerShell, so I don’t have a clear direction on where I’d like these to go. However, I might consider taking the initiative on this one.

Shortened namespaces for the Enums

Basically, PowerShell has no using statements, and the enums that are used as parameters are several namespaces deep. It might not make sense to change them; I am just throwing this out there.

More flexibility for creating SchemaReaders

I’m looking to be able to do a few things:

  • Pass an existing connection object
  • Create a pooled connection string. While that doesn’t make sense for desktop apps, it might make sense if I was using the library from a web app. Being able to sync databases from a web app would be really cool.

Get the existing connection string from SchemaCollection

It’s self-explanatory.

Be able to script a schema to a Microsoft SQL Server project

I’d like to be able to point the library at a database and have a function in SchemaCollection that will script everything to a Visual Studio database project. Initially it would be easiest to hard-code this functionality. However, it would probably be best if it were template-based (T4?) so that you could easily support other database versions.

Naturally, a long-term possibility would be to do a schema sync between databases and SQL Server projects.

Support for Mono

I actually started a port of this. It’s very easy to get it to compile (comment out about 3 lines), but you do lose geospatial support. For this reason, it might make sense to have separate Mono and .NET builds. In my limited time testing it, I could not get my Linux VM to connect to SQL Server, so I don’t know if it would actually work. I want to pursue this path a bit myself, even if it just leads to me filing upstream bugs in Mono.

Support for other databases

It seems like the object model is pretty well organized. That being said, I realize this exercise would be far from mechanical, and I don’t think cross-database compares will ever be practical at the whole-database level. Below are the specific databases I’d like to see supported.

Microsoft Access

I don’t actually see much of a case for this other than “it can be done” and the fact that all Windows machines ship with the Jet engine.


SQLite

Support for SQLite in .NET is very good via System.Data.SQLite.


PostgreSQL

It’s been a while since I used Npgsql, but it’s pretty stable and robust. That being said, there will always be edge cases here, since Postgres is so customizable and extensible.


So that’s a tall list of requests. When we have an issue tracker, I will transfer them all there. Until then, use the comments.

A BSON ObjectId implementation in JavaScript

Recently I spent a few days implementing the BSON ObjectId data type used by MongoDB in JavaScript so I could generate ObjectIds in a web browser. I originally had a specific problem to solve. However, I ended up reworking my approach in that instance so that I did not have to generate ObjectIds in the browser. Despite this, the code is still a valid approach to solving my original problem. The code is available on the justaprogrammer GitHub org; the project is called ObjectId.js. Although the git repository contains a sample HTML file to illustrate its usage, all you need is the JavaScript file and, if you want to support IE6, json2.js.

How It Works

Originally, I wanted an implementation of ObjectId that could interoperate with the format that the WCF DataContractJsonSerializer produced for objects of type MongoDB.Bson.ObjectId in the official 10gen MongoDB C# driver. This format looked like this:

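An illustrative sketch of that shape (the field names are taken from the list below; the exact property names and casing that DataContractJsonSerializer emits may differ):

```javascript
// Illustrative only: the real serialized property names may differ
// from these; the four fields themselves are described below.
var serialized = {
  "timestamp": 1300000000, // seconds since the Unix epoch (4 bytes)
  "machine": 0x3a7bd5,     // machine id (3 bytes)
  "pid": 0x2b19,           // process id (2 bytes)
  "increment": 0x0000f1    // counter (3 bytes)
};
```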

Now, producing a JSON object that looked like this was trivial. The real issue was filling those four values in a sensible manner. I ended up settling on the following:

  • timestamp: This is supposed to be seconds since the Unix epoch. JavaScript represents time as milliseconds since the Unix epoch, so all I had to do was set this to Math.floor(new Date().valueOf() / 1000).
  • machine: This is normally the first three bytes of the MD5 hash of the hostname. Since I can’t access the hostname from the browser, I store a random number in HTML5 local storage, with a cookie as a fallback. This makes the machine id reasonably consistent for a given combination of machine, logon and browser. Naturally, I could use something like evercookie to make the machine part of the ObjectId more sticky, but I felt this was “good enough”.
  • pid: A random pid is generated every time ObjectId.js is executed, so usually once per page load. This means that the pid changes with each page load, but remains consistent across Ajax calls within a single page.
  • increment: This was simple: increment++ every time I generate a new ObjectId.

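Putting those four pieces together, the generation logic can be sketched in a few lines of plain JavaScript. This is a minimal illustration under the BSON layout (4-byte timestamp, 3-byte machine, 2-byte pid, 3-byte increment), not the actual ObjectId.js source; the names are mine, and the machine id here is purely random rather than persisted to local storage:

```javascript
// Minimal sketch of the field logic described above. Not the real
// ObjectId.js code: the machine id is random instead of persisted.
function hex(value, bytes) {
  // Zero-pad to two hex characters per byte.
  return value.toString(16).padStart(bytes * 2, "0");
}

var machine = Math.floor(Math.random() * 0xffffff);   // 3-byte machine id
var pid = Math.floor(Math.random() * 0xffff);         // 2-byte pseudo-pid
var increment = Math.floor(Math.random() * 0xffffff); // 3-byte counter

function newObjectId() {
  var timestamp = Math.floor(new Date().valueOf() / 1000); // seconds since the Unix epoch
  increment = (increment + 1) % 0x1000000; // wrap at 3 bytes
  return hex(timestamp, 4) + hex(machine, 3) + hex(pid, 2) + hex(increment, 3);
}
```

Each call yields a 24-character hex string in the same layout MongoDB uses for _id values, with only the increment changing between calls on the same page.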
So if you need to generate ObjectIds in JavaScript, check out this class. If you find a bug or have an improvement, fork and send a pull request. Happy coding!

How many ways can we ‘script’ a stored proc or UDF from the command line?

Update: Added a link to the scripts on GitHub.

If you’ve worked with SQL Server for any length of time, you’ve probably written some stored procedures and user defined functions. If you’re like me, you’ve written many. Usually, when you edit a stored proc or UDF, you do it from Visual Studio or SQL Server Management Studio (SSMS). However, what if you want to script the sproc or UDF from the command line? Well, there are quite a few ways to skin this cat, and I’ll go through a few.

Using sqlcmd

The simplest way to do this is with the command line client that ships with SQL Server 2005 and up, sqlcmd, and a system stored proc called sp_helptext. For our example, let’s have sp_helptext script itself.

sqlcmd -S .\sqlexpress2k8r2 -d master -h-1 -s"" -W -w 1024 -Q "EXEC sp_helptext 'sp_helptext'" -o sp_helptext.sql

Let’s break down the arguments:

  • -S .\sqlexpress2k8r2: This is the named instance of SQL Server I am connecting to. If you want to connect to the local default instance, you can omit this.
  • -d master: This is the database we want to connect to. This can be omitted if you want to script an object from the user’s default catalog.
  • -h-1 -s"" -W -w 1024: These format the output of sqlcmd in a way that makes sense for this task. If you want to understand these options, look at the sqlcmd MSDN page or this Stack Overflow question.
  • -Q "EXEC sp_helptext 'sp_helptext'": -Q means “execute this T-SQL and exit.”
  • -o sp_helptext.sql: This writes the command output to the text file sp_helptext.sql.

Using a batch file

Now, we could easily wrap this in a batch file, as I demonstrate here:

C:\Users\zippy\Documents\deleteme>type scriptProcOrUdf.cmd
@echo off
sqlcmd -S %1 -d %2 -h-1 -s"" -W -w 1024 -Q "EXEC sp_helptext '%3'" -o "%3.sql"
So we saved ourselves a few keystrokes with a 2 line batch file.

Using PowerShell

Batch files work, but they’re so 2002. PowerShell is the CLI of choice these days on Windows, so let’s see how we would do this in PowerShell:

So apparently it took me 26 lines of PowerShell (if you remove the BSD license) to accomplish what I could have done in a 2-line batch file. What’s the point, you might be asking? Well, honestly, if you just want the quickest way to script stored procedures to a file from the command line, I’d use the batch file. However, if you wanted to make changes to this script, for example to make it script all the stored procedures and UDFs in the database, you’d have an easier job with PowerShell. You could turn the body of the script into a PowerShell function that takes a SqlConnection object and an object name as parameters. You could then get a list of all the stored procedures and functions with the T-SQL statement SELECT [ROUTINE_NAME], [ROUTINE_TYPE], [ROUTINE_DEFINITION] FROM [INFORMATION_SCHEMA].[ROUTINES]. I’ll leave writing that script as an exercise to the reader.

Using PowerShell and “Original Thinking” to do it in fewer than 26 LOC

In the fictional Star Trek universe, James T. Kirk cheated on the Kobayashi Maru by reprogramming the no-win scenario to allow him to win. He was commended for his original thinking. Our scenario is less dire. However, like Kirk, we can change the rules. I chose to use the open source .NET assembly Atlantis.SchemaEngine.dll. This DLL is used for synchronizing schema between two SQL Server databases. We don’t want to synchronize or compare anything. However, the library synchronizes database schemas by generating SQL scripts, and can therefore generate DDL for any script-able object in a database. So let’s take a look at our leaner, meaner script.

So we went from 26 to 15 lines. That’s a savings of 42%. Some of those are pretty long lines, since we have to fully qualify our namespaces in PowerShell. However, it’s a net simplification compared to making the ADO.NET calls directly.


I showed you 4 ways to script a stored proc. I also left plenty of room for you, the reader, to explore on your own. The code for ScriptProc.ps1 and Script-Object.ps1 lives in the justaprogrammer powershell GitHub repo. If you have any improvements, feel free to leave a comment or pingback below. Also, pull requests are always welcome. Happy Scripting!