Tech in the 603, The Granite State Hacker

SharePoint Saturday NH

I’m psyched to be a part of the founding and building of SharePoint Saturday for New Hampshire. It’s an effort of the Granite State SharePoint Users Group (follow us on Twitter at @NHSharePointUG, or check us out on LinkedIn).

Stay tuned for more info… lots of exciting stuff is happening for the SharePoint community in NH.

Edit: May have jumped the gun on this a bit… the site may not be publicly accessible… yet. I’ll update this post when it’s made public.


Infrastructure Agility via Cloud Technology

I’m honored to have just been published on Edgewater’s public blog…

It’s a bit about managing infrastructure agility. The basic idea is architecting your infrastructure so that you can push off parts to different clouds when you need to, for any of a multitude of reasons. The idea goes a bit beyond virtualization.

Check it out:

http://edgewatertech.wordpress.com/2009/04/24/best-practice-cloud-computing/


Workstation Virtualization

I’ve been having fun (ya, really… fun!) with Microsoft Virtual PC (MS VPC) 2007 SP1 lately.

I’ve put some VMs on a portable USB disk. USB 2.0 isn’t the best connection, but it’s workable. This does a few things… it adds a spindle to the system config, offloading the VMs’ disk overhead from the main disk.

An external disk also means the VM is “portable”. I can launch the VM on my laptop or any other host system (like my home system) without any significant difficulty. Even better, by having the VPC config files on the host, rather than on the external disk, you can tune VM settings (like memory and network connectivity) for optimal conditions on the host system.

One other nice time-saver is differencing disks. You can create a “base” virtual hard disk and then create VHDs that are deltas of the base… by doing this, you can create a primary configuration and then create several machines that inherit that basic config. It came in very handy for a recent product evaluation… I just created a base VM roughly matching what the client expects to host the system on, and then created a VM based on it for each product I wanted to evaluate.

Another nice feature is hardware virtualization assist. At first, I didn’t know my ThinkPad had it, but it turns out to be a BIOS setting. Flip that on, do a cold boot, and VM performance is visibly better. I knew of some of the other features from past experience with MS VPC 2005, but the hardware acceleration is new to me. (Ironically, my newer home desktop, a 64-bit monster with huge RAM, doesn’t support the hardware assist… performance isn’t a problem there, tho.)

One more trick: enable the Undo Disks option… it puts another layer of protection on your VM, allowing you to snap a line on your VM at a point in time that you can back out to. The cool part is that the undo disk is created as a temporary file on the host system (typically on the primary system drive). This distributes load across the spindles even more, which further improves run-time performance. The downside: when it comes time to commit the undo disk, it can take a while.

I still love the idea, going forward, of putting client dev environments on a config like this… Not only does it create a nice level of separation between client system configurations, but when you get your hands on better hardware, migration is not an issue.


WORKAROUND: Misconfigured Windows-Integrated Authentication for Web Services

In trying to drive a process from a SharePoint list, I ran across a problem…

I couldn’t create a web reference in my C# project due to some really weird problem… In the “Add web reference” wizard, I entered my URL, and was surprised by a pop-up titled “Discovery Credential”, asking me for credentials for the site.

Since I was on the local domain and had “owner” permissions to the site, I thought I would just waltz in and get the WSDL.

Ok, so it wants creds… I gave it my own.

Negative…!?!?

After a few attempts and access denied errors, I hit Cancel, and was rewarded by, of all things, the WSDL display… but I still couldn’t add the reference.

After quite a bit of wrestling, it turned out there was an authentication provider configuration problem. The site was configured to use Kerberos authentication, but the Active Directory configuration was not set up correctly. (I believe it needed someone to use SetSPN to update the Service Principal Name (SPN) for the service.)

One way to resolve the problem would have been to set the authentication provider to NTLM, but in my case, I couldn’t (and wasn’t likely to) get that configuration changed on the site (a SharePoint web application) I really needed access to.

In order to make it work, I had to initially create my reference to a similar, accessible site.

(e.g. http://host/sites/myaccessiblesite/_vti_bin/lists.asmx )

Then I had to initialize the service in code, like so:


private void InitWebService()
{
    // Unregister every authentication module except NTLM, so that the
    // client has nothing but NTLM to offer when it negotiates with the
    // misconfigured server.
    System.Net.AuthenticationManager.Unregister("Basic");
    System.Net.AuthenticationManager.Unregister("Kerberos");
    //System.Net.AuthenticationManager.Unregister("Ntlm");
    System.Net.AuthenticationManager.Unregister("Negotiate");
    System.Net.AuthenticationManager.Unregister("Digest");

    SmokeTestSite.Lists workingLists = new SmokeTest.SmokeTestSite.Lists();
    workingLists.Url = "http://host/sites/mybrokensite/_vti_bin/lists.asmx";
    workingLists.UseDefaultCredentials = true;
    workingLists.Proxy = null;
    lists = workingLists;
}

What this accomplishes is to unregister all the other authentication modules in your application domain. (This can only be done once in the same app domain; attempts to unregister the same module more than once while the program’s running will throw an exception.)

With all the other authentication modules disabled in the client, the client and server negotiate and agree on NTLM authentication, which succeeds.
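One caveat: because the unregistration is once-per-AppDomain, it’s worth guarding against running it twice. Here’s a minimal sketch of such a guard (the flag and method names are mine, not part of the original code):

private static bool authModulesUnregistered;

private static void UnregisterAuthModulesOnce()
{
    if (authModulesUnregistered)
    {
        return; // Unregister() throws if called twice for the same scheme.
    }
    // Leave "Ntlm" registered; remove everything else.
    foreach (string scheme in new string[] { "Basic", "Kerberos", "Negotiate", "Digest" })
    {
        System.Net.AuthenticationManager.Unregister(scheme);
    }
    authModulesUnregistered = true;
}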


Anonymous Form Submission to Form Library with InfoPath in MOSS

Here’s a bit of a trick I ran across while helping to develop some MOSS2007 solutions.

I needed to configure InfoPath so that it could submit documents to a site that the submitter would not be able to access. In SharePoint, this is not directly possible.

A common work-around is to set up incoming email for the target list, and submit by email to that. Unfortunately, my client is part-way through a Notes to Exchange migration, so this wasn’t practical in the given time frame.

The solution… create two sites, one that is accessible to the submitter, and the other that is not. On the accessible site, create a new, “hidden” list that the user can submit to. Add an event receiver to that list, such that whenever a new item is added, the item is moved to the real intended target using elevated privileges.

Using the VSeWSS extensions, create a List Definition project that has something like this in the ItemEventReceiver.cs file:

using System;
using System.Security.Permissions;
using System.Runtime.InteropServices;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Security;
using Microsoft.SharePoint.Utilities;
using VSeWSS;

namespace AutoForwardingFormLibrary
{
    [CLSCompliant(false)]
    [TargetList("38aea319-af78-4489-9059-d124c68bf9fe")]
    [Guid("9d0f139b-d9ed-4b6c-b0ba-2353cb3bad85")]
    public class AutoForwardingFormLibraryItemEventReceiver : SPItemEventReceiver
    {
        private SPListItem addedItem = null;

        /// <summary>
        /// Initializes a new instance of the
        /// AutoForwardingFormLibraryItemEventReceiver class.
        /// </summary>
        public AutoForwardingFormLibraryItemEventReceiver()
        {
        }

        /// <summary>
        /// Asynchronous after-event that occurs after a new item has been
        /// added to its containing object.
        /// </summary>
        /// <param name="properties">
        /// A Microsoft.SharePoint.SPItemEventProperties object that represents
        /// properties of the event handler.
        /// </param>
        public override void ItemAdded(SPItemEventProperties properties)
        {
            addedItem = properties.ListItem;
            if (addedItem == null)
            {
                throw new ArgumentNullException("properties.ListItem");
            }
            SPSecurity.RunWithElevatedPrivileges(moveItem_Elevated);
        }

        // Runs with elevated privileges: copies the new item to the real
        // target library (which the submitter can't access), then deletes
        // the original from the accessible drop-off list.
        private void moveItem_Elevated()
        {
            addedItem.CopyTo(SPUrlUtility.CombineUrl(
                "http://targetserver/sites/jimw/docs/formlibrary", addedItem.Name));
            addedItem.Delete();
        }
    }
}
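As an aside, if you’d rather not bind the receiver through VSeWSS’s TargetList attribute, the same receiver can be attached to the drop-off library in code. This is just a sketch; the site URL, library name, and assembly strong name below are placeholders:

using (SPSite site = new SPSite("http://host/sites/myaccessiblesite"))
using (SPWeb web = site.OpenWeb())
{
    // "Drop Off Library" stands in for the hidden, submitter-accessible list.
    SPList dropOff = web.Lists["Drop Off Library"];
    dropOff.EventReceivers.Add(
        SPEventReceiverType.ItemAdded,
        "AutoForwardingFormLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0000000000000000",
        "AutoForwardingFormLibrary.AutoForwardingFormLibraryItemEventReceiver");
}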



“Rapture for the Geeks”: Singularitarianism 101

My wife and I were at Barnes & Noble last week. While we were there, she walked over to me, grinning, with a book called “Rapture for the Geeks”. She was making a personal joke with it, but I was immediately intrigued… which made it all the more hilarious to her.

Being a spiritualist, optimist, scientist, technologist, and sci-fi fanboy (I believe the way Albert Einstein did: everything is a miracle), I’ve felt all my life like humanity’s on the brink (in geological terms) of something fascinating… Needless to say, I bought it and read it.

The author, Richard Dooling, has done some fiction, but this was (mostly) non-fiction. It turns out to be an introduction, for para-technologists, to a concept called the “Technological Singularity”. Dooling doesn’t offer any insights of his own, but brings together a lot of interesting viewpoints from a number of notable “Singularitarians”, especially Ray Kurzweil and Bill Joy.

The thing I found most interesting about this book was not the ideas Dooling relays, but the fact that there appears to be a build-up of buzz around them. Even as the crescendo that is the “Moore’s Law” prediction persists, so does the crescendo of interest around The Singularity.

When I was a kid, back in the early ’80s, I remember having near-philosophical conversations with a wise elder grand-uncle about the progression of technology. I remember wondering what would happen when humanity’s technology over-reached its own capacity to manage it. What would we really do, for example, if we eventually automated ourselves out of all of our “work”?

It’s true… technology sure has come a long way in the past 35 years that I’ve been observing it (my whole life). When I was a preschooler, my folks had this awesome TI calculator… it had a red LED display, and did all four arithmetic functions. Almost a decade later, my first programmable computer was a Timex-Sinclair 1000 with 2K of RAM (expanded to 16K) and a cassette player for storage. Today, my preschool-age children play with “old” 1-gigahertz Pentium III machines while my own machine (a Christmas gift from my wife) is a commercial-off-the-shelf quad-core 2.3-gigahertz monster with 8GB of RAM and a full terabyte of disk space in a RAID 0 array.

With humanity’s ability to abstract and build on its own technology, it’s not a question of “if” we will hit some sort of existentially disruptive technology… the only important questions are “when” and “will life on Earth survive it?”

The really fun part is imagining how The Singularity might manifest itself… artificial intelligence gaining sentience? Discovering the truly united nature of time/space/matter/energy? Teleportation? Limitless energy? Immortality (via perfected nutrition, nano-technology, replaceable parts, or even transference into “robotic” bodies)? Inter-galactic travel? Perhaps God, the uber-geek creator of this simulation we call life, will don His Holy “VR gear” to be present to witness the birth of His grandchild(ren). (I wonder which Cloud He might be riding in on… EC2? Blue Cloud? Surely not Azure… 🙂)

There’s some good humor in the book, but the last few chapters are a really hard trudge. It suffers from more than a couple of bouts of verbal diarrhea… one, for example, is a ten-page rant about making sure you save your work as text. To someone who really is a programmer, he also comes across as a programmatic poseur.

If you are already familiar with these ideas, you’ll probably be insulted by this book. What kept me reading was the false hope that the author had some synthesis of his own on the subject. That said, if you’re not a programmer, and are new to the topic, the first nine chapters are good, and the remaining chapters might be forgivable.

Of course, Dooling does suggest that heightened interest in The Singularity may also just be a symptom of a mid-life crisis. 🙂


“ETL”ing Source Code

The past couple of weeks, I’ve been between projects, which has gotten me involved in a number of “odd jobs”. An interesting pattern I’m seeing in them is querying, joining, and updating data from traditionally “unlikely” sources… especially code.

SQL databases are heavily involved, but I find myself querying system views of the schema itself, rather than its contents. In fact, I’m doing so much of this that I’m finding myself building skeleton databases… no data, just schema, stored procs, and supporting structures.
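To give a flavor of it, here’s a minimal sketch of the kind of schema-level query I mean (the connection string and database name are hypothetical):

using System;
using System.Data.SqlClient;

class SchemaDump
{
    static void Main()
    {
        // Query the catalog views (the schema itself) rather than the data.
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                @"SELECT t.name AS TableName, c.name AS ColumnName, ty.name AS TypeName
                  FROM sys.tables t
                  JOIN sys.columns c ON c.object_id = t.object_id
                  JOIN sys.types ty ON ty.user_type_id = c.user_type_id
                  ORDER BY t.name, c.column_id", conn);
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}.{1} : {2}",
                        reader.GetString(0), reader.GetString(1), reader.GetString(2));
                }
            }
        }
    }
}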

I’m also pulling and updating metadata from the likes of SharePoint sites, SSRS RDL files, SSIS packages… and most recently, CLR objects that were serialized and persisted to a file. Rather than outputting in the form of reports, in some cases, I’m outputting in the form of more source code.

I’ve already blogged a bit about pulling SharePoint lists into ADO.NET DataSets. I’ll post about some of the other fun stuff I’ve been hacking at soon.

I think the interesting part is how relatively easy it’s becoming to write code to “ETL” source code.


Reading SharePoint Lists into an ADO.Net DataTable

[Feb 18, 2009: I’ve posted an update to show the newer technique suggested below by Kirk Evans, also compensating for some column naming issues.]

The other day, I needed to write some code that processed data from a SharePoint list. The list was hosted on a remote MOSS 2007 server.

Given more time, I’d have gone digging for an ADO.NET adapter, but I found some code that helped. Unfortunately, the code I found didn’t quite seem to work for my needs. Out of the box, the code missed several columns for no apparent reason.

Here’s my tweak to the solution:

(The ListWebService points to a web service like http://SiteHost/SiteParent/Site/_vti_bin/lists.asmx?WSDL, and data is an alias for System.Data, e.g. via using data = System.Data;)

private data.DataTable GetDataTableFromWSS(string listName)
{
    ListWebService.Lists lists = new ListWebService.Lists();
    lists.UseDefaultCredentials = true;
    lists.Proxy = null;

    // Find the list by its display name (the Title attribute); you have to
    // pass the list name here.
    XmlNode ListCollectionNode = lists.GetListCollection();
    XmlElement List = (XmlElement)ListCollectionNode.SelectSingleNode(
        String.Format("wss:List[@Title='{0}']", listName), NameSpaceMgr);
    if (List == null)
    {
        throw new ArgumentException(String.Format(
            "The list '{0}' could not be found in the site '{1}'", listName, lists.Url));
    }

    string TechListName = List.GetAttribute("Name");
    data.DataTable result = new data.DataTable("list");
    XmlNode ListInfoNode = lists.GetList(TechListName);
    System.Text.StringBuilder fieldRefs = new System.Text.StringBuilder();
    System.Collections.Hashtable DisplayNames = new System.Collections.Hashtable();

    foreach (XmlElement Field in ListInfoNode.SelectNodes("wss:Fields/wss:Field", NameSpaceMgr))
    {
        string FieldName = Field.GetAttribute("Name");
        string FieldDisplayName = Field.GetAttribute("DisplayName");
        if (result.Columns.Contains(FieldDisplayName))
        {
            FieldDisplayName = FieldDisplayName + " (" + FieldName + ")";
        }
        result.Columns.Add(FieldDisplayName, TypeFromField(Field));
        fieldRefs.AppendFormat("<FieldRef Name=\"{0}\" />", FieldName);
        DisplayNames.Add(FieldDisplayName, FieldName);
    }
    result.Columns.Add("XmlElement", typeof(XmlElement));

    XmlElement fields = ListInfoNode.OwnerDocument.CreateElement("ViewFields");
    fields.InnerXml = fieldRefs.ToString();
    XmlNode ItemsNode = lists.GetListItems(
        TechListName, null, null, fields, "10000", null, null);

    // Lookup fields always start with the numeric ID, then ;# and then the
    // string representation. We are normally only interested in the name,
    // so we strip the ID.
    System.Text.RegularExpressions.Regex CheckLookup =
        new System.Text.RegularExpressions.Regex("^\\d+;#");

    foreach (XmlElement Item in ItemsNode.SelectNodes("rs:data/z:row", NameSpaceMgr))
    {
        data.DataRow newRow = result.NewRow();
        foreach (data.DataColumn col in result.Columns)
        {
            if (Item.HasAttribute("ows_" + (string)DisplayNames[col.ColumnName]))
            {
                string val = Item.GetAttribute("ows_" + (string)DisplayNames[col.ColumnName]);
                if (CheckLookup.IsMatch(val))
                {
                    val = val.Substring(val.IndexOf("#") + 1);
                }
                // Assigning a string to a field that expects numbers or
                // datetime values will implicitly convert them.
                newRow[col] = val;
            }
        }
        newRow["XmlElement"] = Item;
        result.Rows.Add(newRow);
    }
    return result;
}

// The following property supplies the namespaces used by the
// SelectNodes/SelectSingleNode queries above.
private static XmlNamespaceManager _nsmgr;
private static XmlNamespaceManager NameSpaceMgr
{
    get
    {
        if (_nsmgr == null)
        {
            _nsmgr = new XmlNamespaceManager(new NameTable());
            _nsmgr.AddNamespace("wss", "http://schemas.microsoft.com/sharepoint/soap/");
            _nsmgr.AddNamespace("s", "uuid:BDC6E3F0-6DA3-11d1-A2A3-00AA00C14882");
            _nsmgr.AddNamespace("dt", "uuid:C2F41010-65B3-11d1-A29F-00AA00C14882");
            _nsmgr.AddNamespace("rs", "urn:schemas-microsoft-com:rowset");
            _nsmgr.AddNamespace("z", "#RowsetSchema");
        }
        return _nsmgr;
    }
}

private Type TypeFromField(XmlElement field)
{
    switch (field.GetAttribute("Type"))
    {
        case "DateTime":
            return typeof(DateTime);
        case "Integer":
            return typeof(int);
        case "Number":
            return typeof(float);
        default:
            return typeof(string);
    }
}
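For what it’s worth, calling it is straightforward. A quick sketch, assuming a list named “Tasks” with a Title column (both hypothetical):

// Pull the list into a DataTable and dump one column.
data.DataTable tasks = GetDataTableFromWSS("Tasks");
foreach (data.DataRow row in tasks.Rows)
{
    Console.WriteLine(row["Title"]);
}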


The Great Commandment

While I was writing a post the other day, I noticed that I had neglected a topic that I find very important in software development: risk management.

There are only a few guarantees in life. One of them is risk. Companies profit by seizing the opportunities that risks afford. Of course, they suffer losses from incidents of unmitigated risk. All our government and social systems are devices of risk management. In business, risk management is (now, and ever shall be) the great commandment.

Many software engineers forget that risk management is not just for PMs. In fact, software and its development is fundamentally a tool of business and, by extension, of risk management. The practice of risk management in software really extends into every expression in every line of source code.

Don’t believe me? Think of it this way… if it weren’t a risk, it would be implemented as hardware. I’ve often heard hardware engineers say that anything that can be done in software can be done in hardware, and it will run faster. Usually, if a solution is some of the following…
· mature,
· ubiquitous,
· standard,
· well-known,
· fundamentally integral to its working environment

…it is probably low risk, particularly for change. It can likely be cost-effectively cast in stone (or silicon). (And there are plenty of examples of that… it’s what ASICs are all about.)

Software, on the other hand, is not usually so much of any of those things. Typically, it involves solutions which are…
· proprietary,
· highly customized,
· integration points,
· inconsistently deployed,
· relatively complex / error-prone,
· immature or still evolving

These are all risk indicators for change. I don’t care what IT guys say… software is much easier to change than logic gates on silicon.

I’ve dug into this in the past, and will dig in more in future posts, but when I refer to the “great commandment”, this is what I mean.


Application Platform Infrastructure Optimization

In doing some research for a client on workflow in SharePoint, I came across this interesting article about the differences between BizTalk 2006 and Windows Workflow Foundation (WF).

The article itself was worth the read for its main point, but I was also interested in Microsoft’s Application Platform Infrastructure Optimization (“APIO”) model.

The “dynamic” level of the APIO model describes the kind of system that I believe the .NET platform has been aiming at since version 3.0.

I’ve been eyeing the tools… between MS’s initiatives, my co-workers’ project abstracts, and the types of work coming down the pike in consulting. From the timing of MS’s releases and their feature sets, I should have known: the webinars they’ve released on the topic have been around for just over a year.

This also plays into Microsoft Oslo. I have suspected that Windows Workflow Foundation, or some derivative thereof, is at the heart of the modeling paradigm that Oslo is based on.

All this stuff feeds into a hypothesis I’ve mentioned before that I call “metaware”, a metadata layer on top of software. I think it’s a different shade of good old CASE… because, as we all know… “CASE is dead… Long live CASE!”