
How to create an adapter for the TFS Integration Platform - Part VII: WIT Conflict Handling

Note: This post is part of a series and you can find the rest of the parts in the series index.

The WIT adapter needs a custom conflict type and a custom conflict handler, really for no reason other than that the platform expects them.

Conflict Handler

If you have no need for special conflict handling, you can create a simple implementation which just allows for manual resolution, which is what I have done below.

public class SharePointWITGeneralConflictHandler : IConflictHandler
{
    public bool CanResolve(MigrationConflict conflict, ConflictResolutionRule rule)
    {
        return ConflictTypeHandled.ScopeInterpreter.IsInScope(conflict.ScopeHint, rule.ApplicabilityScope);
    }

    public ConflictResolutionResult Resolve(MigrationConflict conflict, ConflictResolutionRule rule, out List<MigrationAction> actions)
    {
        actions = null;

        if (rule.ActionRefNameGuid.Equals(new ManualConflictResolutionAction().ReferenceName))
        {
            return ManualResolve(out actions);
        }

        return new ConflictResolutionResult(false, ConflictResolutionType.Other);
    }

    public ConflictType ConflictTypeHandled
    {
        get;
        set;
    }

    private static ConflictResolutionResult ManualResolve(out List<MigrationAction> actions)
    {
        actions = null;
        return new ConflictResolutionResult(true, ConflictResolutionType.Other);
    }
}

Conflict Type

If you have no reason for a custom conflict type, you can do what I did, which is to re-implement the generic one with even fewer features: only supporting ManualConflictResolution and a very simple scope hint.

public class SharePointWITGeneralConflictType : ConflictType
{
    public static MigrationConflict CreateConflict(Exception exception)
    {
        return new MigrationConflict(
            new SharePointWITGeneralConflictType(),
            MigrationConflict.Status.Unresolved,
            exception.ToString(),
            CreateScopeHint(Guid.NewGuid().ToString()));
    }

    public static MigrationConflict CreateConflict(Exception exception, IMigrationAction conflictedAction)
    {
        return new SharePointWITGeneralConflictType().CreateConflict(exception.ToString(), CreateScopeHint(Guid.NewGuid().ToString()), conflictedAction);
    }

    public override Guid ReferenceName
    {
        get
        {
            return s_conflictTypeReferenceName;
        }
    }

    public override string FriendlyName
    {
        get
        {
            return s_conflictTypeFriendlyName;
        }
    }

    public override string Description
    {
        get
        {
            return s_conflictTypeDescription;
        }
    }

    public SharePointWITGeneralConflictType()
        : base(new SharePointWITGeneralConflictHandler())
    { }

    public static string CreateScopeHint(string sourceItemId)
    {
        return string.Format(CultureInfo.CurrentCulture, "/{0}/{1}", sourceItemId, Guid.NewGuid().ToString());
    }

    protected override void RegisterDefaultSupportedResolutionActions()
    {
        AddSupportedResolutionAction(new ManualConflictResolutionAction());
    }

    private static readonly Guid s_conflictTypeReferenceName = new Guid("{606531DF-231A-496B-9996-50F239481988}");
    private const string s_conflictTypeFriendlyName = "TFS WIT general conflict type";
    private const string s_conflictTypeDescription =
        "This conflict is detected when an unknown exception is thrown during Work Item data submission.";
}

How to create an adapter for the TFS Integration Platform - Part VI: IAnalysisProvider

Note: This post is part of a series and you can find the rest of the parts in the series index.

IAnalysisProvider has a name that is a bit misleading, or at least it was misleading to me, because for a long time I thought it did some analysis of the environment as a pre-step and the real work happened elsewhere. In reality the IAnalysisProvider is the reader part of your adapter; its goal is to get data from your system into a format and/or location that the platform can work with.


IServiceProvider

IAnalysisProvider inherits from IServiceProvider, which means you need to implement its GetService method; mine just returns the current object.

object IServiceProvider.GetService(Type serviceType)
{
    return (IServiceProvider)this;
}

Misc Methods

I am not going to cover every method you need to implement from IAnalysisProvider, because you seldom need to implement them all. For example, my implementation of DetectConflicts just does some logging:

void IAnalysisProvider.DetectConflicts(ChangeGroup changeGroup)
{
    TraceManager.TraceInformation("WSSVC:AP:DetectConflicts");
}

InitializeServices

The first method you must care about is InitializeServices. It is the first method called by the platform, and it does five key things in my scenario:

void IAnalysisProvider.InitializeServices(IServiceContainer serviceContainer)
{
    TraceManager.TraceInformation("WSSVC:AP:Initialize");
    this.analysisServiceContainer = serviceContainer;

    supportedContentTypes = new Collection<ContentType>();
    supportedContentTypes.Add(WellKnownContentType.VersionControlledFile);
    supportedContentTypes.Add(WellKnownContentType.VersionControlledFolder);

    SharePointVCChangeActionHandler handler = new SharePointVCChangeActionHandler(this);
    supportedChangeActions = new Dictionary<Guid, ChangeActionHandler>();
    supportedChangeActions.Add(WellKnownChangeActionId.Add, handler.BasicActionHandler);
    supportedChangeActions.Add(WellKnownChangeActionId.Delete, handler.BasicActionHandler);
    supportedChangeActions.Add(WellKnownChangeActionId.Edit, handler.BasicActionHandler);

    configurationService = (ConfigurationService)analysisServiceContainer.GetService(typeof(ConfigurationService));

    highWaterMarkDelta = new HighWaterMark<DateTime>(Constants.HwmDelta);
    highWaterMarkChangeset = new HighWaterMark<int>("LastChangeSet");
    configurationService.RegisterHighWaterMarkWithSession(highWaterMarkDelta);
    configurationService.RegisterHighWaterMarkWithSession(highWaterMarkChangeset);
    changeGroupService = (ChangeGroupService)analysisServiceContainer.GetService(typeof(ChangeGroupService));
    changeGroupService.RegisterDefaultSourceSerilizer(new SharePointVCMigrationItemSerializer());
}
  • The first part is setting up the types of content we support (the supportedContentTypes collection). You can see above I only care about files and folders.
  • The second part is setting up the change actions we support for reading (the supportedChangeActions dictionary), which in this case are Add, Delete and Edit. Delete is a bit of a lie since we do not actually support it, but we say we do.
  • The third part is getting the configuration service, which is important since we will use it later.
  • The fourth part is getting the HWM, or high watermark, information, which I will explain in a moment.
  • Lastly we register the default item serialiser so that the platform knows how to convert the items.

Registration

We have set up the content types and actions we support; now we need to register them, and the way to do that is with RegisterSupportedChangeActions. What I did was to loop over the actions and then loop over the content types, finally calling RegisterChangeAction. You could do this differently, for example if you only supported some actions on some types:

void IAnalysisProvider.RegisterSupportedChangeActions(ChangeActionRegistrationService contentActionRegistrationService)
{
    TraceManager.TraceInformation("WSSVC:AP:RegisterSupportedChangeActions");
    this.changeActionRegistrationService = contentActionRegistrationService;
    foreach (KeyValuePair<Guid, ChangeActionHandler> supportedChangeAction in supportedChangeActions)
    {
        foreach (ContentType contentType in ((IAnalysisProvider)this).SupportedContentTypes)
        {
            changeActionRegistrationService.RegisterChangeAction(supportedChangeAction.Key, contentType.ReferenceName, supportedChangeAction.Value);
        }
    }
}

ChangeActionHandlers

The ChangeActionHandlers class is a separate abstract class which is used by the IAnalysisProvider during registration, and it provides the minimal functionality for handling the change actions you register (e.g. add a file, update a work item etc…).

public abstract class ChangeActionHandlers
{
    protected ChangeActionHandlers(IAnalysisProvider analysisProvider)
    {
    }

    public virtual void BasicActionHandler(MigrationAction action, ChangeGroup group)
    {
        if (action == null)
        {
            throw new ArgumentNullException("action");
        }

        if (group == null)
        {
            throw new ArgumentNullException("group");
        }

        group.CreateAction(action.Action, 
            action.SourceItem, 
            action.FromPath, 
            action.Path, action.Version,
            action.MergeVersionTo,
            action.ItemTypeReferenceName, 
            action.MigrationActionDescription);
    }
}

High Watermark

High watermarks are a very interesting feature of the platform: they let you store a value in the database which you use to identify change groups, for example the last changeset number or the time of the last delta run. One aspect I liked of their construction is the use of generics, meaning you can work with the types that make the most sense (I use int and DateTime), and you associate a name with each one.

To get the value from the database you call the Reload method; to set the value and save it, call Update.

highWaterMarkDelta.Reload();
highWaterMarkDelta.Update(deltaTableStartTime);
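
To make the lifecycle concrete, here is a hedged sketch of how a high watermark is typically used across one delta pass, pieced together from the calls shown in InitializeServices and GenerateDeltaTable; the name "LastSyncTime" and the local variables are purely illustrative.

HighWaterMark<DateTime> lastSync = new HighWaterMark<DateTime>("LastSyncTime");
configurationService.RegisterHighWaterMarkWithSession(lastSync); // done once, in InitializeServices

lastSync.Reload();                  // read the stored value from the database
DateTime cutOff = DateTime.Now;     // only items changed after lastSync.Value and before cutOff are processed
// ... query the source system for changes in that window and create change groups ...
lastSync.Update(cutOff);            // persist the new value for the next run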

Conflict Types

I have mentioned conflict types briefly before, and I said that my VC adapter does not have a conflict type, which is not 100% true. It does have one, the GenericConflictType, which comes from the platform itself. Below is the code snippet from the WIT adapter, which does have a custom conflict type; the only difference in the VC adapter is that the last line does not exist.

public void RegisterConflictTypes(ConflictManager conflictManager)
{
    TraceManager.TraceInformation("WSSWIT:AP:RegisterConflictTypes");
    this.conflictManagerService = (ConflictManager)analysisServiceContainer.GetService(typeof(ConflictManager));
    this.conflictManagerService.RegisterConflictType(new GenericConflictType());
    this.conflictManagerService.RegisterConflictType(new SharePointWITGeneralConflictType(), SyncOrchestrator.ConflictsSyncOrchOptions.Continue);
}

GenerateDeltaTable

The next method to cover, and the second most important, is GenerateDeltaTable, which is responsible for actually getting the values from the source system. This is done below in two steps: first GetSharePointUpdates and then PromoteDeltaToPending.

void IAnalysisProvider.GenerateDeltaTable()
{
    TraceManager.TraceInformation("WSSVC:AP:GenerateDeltaTable");
    highWaterMarkDelta.Reload();
    TraceManager.TraceInformation("\tWSSVC:AP:Initial HighWaterMark {0} ", highWaterMarkDelta.Value);
    deltaTableStartTime = DateTime.Now;
    TraceManager.TraceInformation("\tWSSVC:AP:CutOff {0} ", deltaTableStartTime);
    GetSharePointUpdates();
    highWaterMarkDelta.Update(deltaTableStartTime);
    TraceManager.TraceInformation("\tWSSVC:AP:Updated HighWaterMark {0} ", highWaterMarkDelta.Value);
    changeGroupService.PromoteDeltaToPending();
}

GetSharePointUpdates

This is a huge method which does the heavy lifting, so I will skip the boring details of talking to SharePoint and focus on what you need to do. First you need to identify what is new; this is done using the HWM and comparing the modified date.

// item has been modified since HWM & before delta table start time
if (item.Modified.CompareTo(highWaterMarkDelta.Value) > 0 && item.Modified.CompareTo(deltaTableStartTime) < 0) 

You also need to figure out if the file is new or an update. In my VC adapter I created a special system called the ProcessLog. This was to cater for a situation caused by SharePoint and won't apply to other systems. Once you have done all of that you can tell the platform about the change by creating an action and saving it. The following code is for VC:

TraceManager.TraceInformation("\tChangeSet:{0} - {1} ({2})", highWaterMarkChangeset.Value, item.Filename, item.AbsoluteURL);
string itemType = item.ItemType.ToWellKnownContentType().ReferenceName;
ChangeGroup cg = CreateChangeGroup(highWaterMarkChangeset.Value, 0);
cg.CreateAction(actionGuid, item, null, item.AbsoluteURL, item.Version, null, itemType, null);
cg.Save();
highWaterMarkChangeset.Update(highWaterMarkChangeset.Value + 1);

and this is the same logic for WIT:

ChangeGroup changeGroup = CreateChangeGroup(highWaterMarkChangeSet.Value, 0);
changeGroup.CreateAction(actionGuid, task, string.Empty, listName, string.Empty, string.Empty,
    WellKnownContentType.WorkItem.ReferenceName, CreateFieldRevisionDescriptionDoc(task));
changeGroup.Save();
highWaterMarkChangeSet.Update(highWaterMarkChangeSet.Value + 1);

Revision Description Doc

While the VC adapter is fairly easy (the downloading is done in the SharePointItem), the WIT adapter doesn't download anything. What it needs instead is a special XML file called a revision description document. You are responsible for generating this document as part of creating the action (you may have noticed that in the sample above).

This is what actually makes the field mapping possible; if you do not understand field mapping you must read Willy-Peter's post on it. You can see below how I create my document, which is built per SharePoint list item, and how I support all the columns, including custom ones:

private static XmlDocument CreateFieldRevisionDescriptionDoc(SharePointListItem task)
{
    XElement columns = new XElement("Columns",
            new XElement("Column",
                new XAttribute("DisplayName", "Author"),
                new XAttribute("ReferenceName", "Author"),
                new XAttribute("Type", "String"),
                new XElement("Value", task.AuthorId)),
            new XElement("Column",
                new XAttribute("DisplayName", "DisplayName"),
                new XAttribute("ReferenceName", "DisplayName"),
                new XAttribute("Type", "String"),
                new XElement("Value", task.DisplayName)),
            new XElement("Column",
                new XAttribute("DisplayName", "Id"),
                new XAttribute("ReferenceName", "Id"),
                new XAttribute("Type", "String"),
                new XElement("Value", task.Id.ToString())));

    foreach (KeyValuePair<string, object> column in task.Columns)
    {
        columns.Add(new XElement("Column",
            new XAttribute("DisplayName", column.Key),
                new XAttribute("ReferenceName", column.Key),
                new XAttribute("Type", "String"),
                new XElement("Value", column.Value)));
    }

    XElement descriptionDoc = new XElement("WorkItemChanges",
        new XAttribute("Revision", "0"),
        new XAttribute("WorkItemType", "SharePointItem"),
        new XAttribute("Author", task.AuthorId),
        new XAttribute("ChangeDate", task.ModifiedOn.ToString(CultureInfo.CurrentCulture)),
        new XAttribute("WorkItemID", task.Id.ToString()),
        columns);

    XmlDocument doc = new XmlDocument();
    doc.LoadXml(descriptionDoc.ToString());
    return doc;
}
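
To make the shape of that document concrete, here is a hedged sketch of roughly what the method above produces for a hypothetical list item; the element and attribute names come straight from the code, while the item values are invented and the exact date format depends on the current culture.

XmlDocument revisionDoc = CreateFieldRevisionDescriptionDoc(item); // 'item' is a hypothetical SharePointListItem
// revisionDoc.OuterXml then looks roughly like this:
//
// <WorkItemChanges Revision="0" WorkItemType="SharePointItem" Author="5"
//                  ChangeDate="2010/05/07 09:30:00" WorkItemID="7">
//   <Columns>
//     <Column DisplayName="Author" ReferenceName="Author" Type="String">
//       <Value>5</Value>
//     </Column>
//     <Column DisplayName="DisplayName" ReferenceName="DisplayName" Type="String">
//       <Value>Write the report</Value>
//     </Column>
//     <Column DisplayName="Id" ReferenceName="Id" Type="String">
//       <Value>7</Value>
//     </Column>
//     <Column DisplayName="Title" ReferenceName="Title" Type="String">
//       <Value>Write the report</Value>
//     </Column>
//   </Columns>
// </WorkItemChanges>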

How to create an adapter for the TFS Integration Platform - Part V: Items (IMigrationItem & IMigrationItemSerializer)

Note: This post is part of a series and you can find the rest of the parts in the series index.

An important class in your implementation is your item class, which is used to identify what an item (be that a file, directory, list item, bug etc…) is. It needs to implement IMigrationItem. This class is more important for VC than for WIT, but in both cases you need to put in all the properties you need to know about. You must also make sure that all the properties are serialisable by the default XML serialiser in .NET.

Power Tip: VC stands for Version Control and refers to an adapter that works with the source control aspects of the system. WI (work items) and WIT (work item tracking) refer to the same thing. File attachments in WI are NOT regarded as VC and must be handled by your WI adapter.

IMigrationItem

WIT

The WIT adapter item is slightly smaller than the VC one since it is just a set of properties. I do not care too much about the inherited DisplayName property here either, and the Download method just logs. The only interesting part is the SimpleDictionary<T,V> I use. SimpleDictionary<T,V> just stores key/value pairs of column information from SharePoint (because you may have customised the columns in SharePoint, I cannot hard code them). The reason I use this rather than the Dictionary<T,V> which .NET provides is that that class cannot be serialised using the default XML serialiser. I will cover SimpleDictionary<T,V> in a later post.

public class SharePointListItem : IMigrationItem
{
    public SharePointListItem()
    {
        this.Columns = new SimpleDictionary<string, object>();
    }

    public string Id { get; set; }
    public DateTime ModifiedOn { get; set; }
    public string AuthorId { get; set; }
    public SimpleDictionary<string, object> Columns { get; set; }    
    public string DisplayName { get; set; }

    public void Download(string localPath)
    {
        TraceManager.TraceInformation("WSSWIT:MI:Download - {0}", localPath);
    }
}
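
Since SimpleDictionary<T,V> is only covered in a later post, here is a minimal sketch of the idea behind such a class, assuming all you need is a key/value store that the default XmlSerializer can round-trip; this is not the actual SimpleDictionary<T,V> implementation.

using System.Collections.Generic;

public class SerializableKeyValue<TKey, TValue>
{
    // Public getters and setters plus the implicit parameterless constructor
    // are what XmlSerializer needs to round-trip each entry.
    public TKey Key { get; set; }
    public TValue Value { get; set; }
}

// Deriving from List<T> keeps the collection serialisable by the default XML
// serialiser, which Dictionary<TKey, TValue> is not.
public class SerializableDictionarySketch<TKey, TValue> : List<SerializableKeyValue<TKey, TValue>>
{
    public void Add(TKey key, TValue value)
    {
        Add(new SerializableKeyValue<TKey, TValue> { Key = key, Value = value });
    }
}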

VC

The VC adapter item is a little more complex with more properties, but the key difference is the DisplayName property, from which I return the filename. Even more important though is the Download method, which is used to get the actual file or folder to a specific location on disk (provided by the localPath parameter) for the platform to use.

Power Tip: When you are downloading files in the IMigrationItem, you are responsible for creating the path too. So make sure you are creating directories and checking which directories already exist.

string IMigrationItem.DisplayName
{
    get { return Filename; }
}

void IMigrationItem.Download(string localPath)
{
    TraceManager.TraceInformation("WSSVC:Item:Download:From {0} to {1}", this.AbsoluteURL, localPath);
    if (this.ItemType == SharePointItemType.File)
    {
        TraceManager.TraceInformation("\tType is file");
        string targetDir = Path.GetDirectoryName(localPath);
        if (!Directory.Exists(targetDir))
        {
            TraceManager.TraceInformation("\tCreating Directory for file - {0}", targetDir);
            Directory.CreateDirectory(targetDir);
        }

        HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(this.AbsoluteURL);
        webRequest.Credentials = this.Credentials;
        using (Stream responseStream = webRequest.GetResponse().GetResponseStream())
        {
            using (FileStream fileStream = new FileStream(localPath, FileMode.CreateNew, FileAccess.ReadWrite))
            {
                byte[] buffer = new byte[1024];
                int bytesRead;
                do
                {
                    // Read data (up to 1k) from the stream
                    bytesRead = responseStream.Read(buffer, 0, buffer.Length);

                    // Write the data to the local file
                    fileStream.Write(buffer, 0, bytesRead);
                } while (bytesRead > 0);
            }
        }

        TraceManager.TraceInformation("\tFile downloaded successfully");
    }

    if (this.ItemType == SharePointItemType.Directory)
    {
        TraceManager.TraceInformation("\tType is directory");
        if (!Directory.Exists(localPath))
        {
            TraceManager.TraceInformation("\tCreating Directory - {0}", localPath);
            Directory.CreateDirectory(localPath);
        }
    }
}

IMigrationItemSerializer

The migration item serializer is just a way to get your item to and from XML using .NET serialisation. My implementation is exactly the same for both VC and WIT.

public class SharePointWITMigrationItemSerializer : IMigrationItemSerializer
{
    public IMigrationItem LoadItem(string itemBlob, ChangeGroupManager manager)
    {
        TraceManager.TraceInformation("WSSWIT:S:LoadItem");
        if (manager == null)
        {
            throw new ArgumentNullException("manager");
        }

        if (string.IsNullOrEmpty(itemBlob))
        {
            throw new ArgumentNullException("itemBlob");
        }

        XmlSerializer serializer = new XmlSerializer(typeof(SharePointListItem));

        using (StringReader itemBlobStringReader = new StringReader(itemBlob))
        {
            using (XmlReader itemBlobXmlReader = XmlReader.Create(itemBlobStringReader))
            {
                return (SharePointListItem)serializer.Deserialize(itemBlobXmlReader);
            }
        }
    }

    public string SerializeItem(IMigrationItem item)
    {
        if (item == null)
        {
            throw new ArgumentNullException("item");
        }

        TraceManager.TraceInformation("WSSWIT:S:SerializeItem - {0}", item.DisplayName);

        XmlSerializer sharePointTaskSerializer = new XmlSerializer(item.GetType());

        using (MemoryStream memoryStream = new MemoryStream())
        {
            sharePointTaskSerializer.Serialize(memoryStream, item);
            memoryStream.Seek(0, SeekOrigin.Begin);
            using (StreamReader streamReader = new StreamReader(memoryStream))
            {
                return streamReader.ReadToEnd();
            }
        }
    }
}

How to create an adapter for the TFS Integration Platform - Part IV: IProvider

Note: This post is part of a series and you can find the rest of the parts in the series index.

IProvider is the first class we will look at implementing for both adapters (WI and VC), as it provides the core information for the platform to talk to our adapter. The first thing your provider needs is the ProviderDescriptionAttribute, which has three properties: ID, Name and Version.

[ProviderDescription("{7F3F91B2-758A-4B3C-BBA8-CE34AE1D48EE}", "SharePoint TIP Adapter - Version Control", "1.0.0.0")]
The ID must be unique and you will need to keep a record of it somewhere, as it is used in the configuration for the platform. The name and version are potentially used to make things easier for users, but I have not seen them used anywhere (maybe in future/different tools).

The only method in the provider is the GetService method, which is used to get the implementations of the interfaces/classes we will be building later. Put another way, this allows the platform to request a class which implements a specific interface:

object IServiceProvider.GetService(Type serviceType)
{
    TraceManager.TraceInformation("WSSVC:Adapter:GetService - {0}", serviceType);

    if (serviceType == typeof(IAnalysisProvider))
    {
        if (analysisProvider == null)
        {
            analysisProvider = new SharePointVCAnalysisProvider();
        }
        return analysisProvider;
    }
    
    if (serviceType == typeof(IMigrationProvider))
    {
        if (migrationProvider == null)
        {
            migrationProvider = new SharePointVCMigrationProvider();
        }
        return migrationProvider;
    }

    if (serviceType == typeof(IServerPathTranslationService))
    {
        if (transalationProvider == null)
        {
            transalationProvider = new SharePointVCAdapterTranslation();
        }
        return transalationProvider;
    }        

    return null;
}

The implementation above is what I used in the SharePoint VC adapter; the WI adapter is the same except it does not have the server path translation service at the end.

Power Tip: Using Visual Studio 2010's new "Generate from Usage" feature makes this stage of development much easier.

How to create an adapter for the TFS Integration Platform - Part III: Overview of adapters

Note: This post is part of a series and you can find the rest of the parts in the series index.

The TFS Integration Platform has two types of adapters: a WI (for work items, tasks, bugs etc…) adapter and a VC (version control) adapter. They are nothing more than a .NET assembly made up of a number of classes which, mostly, inherit from interfaces in the Toolkit project. For both adapters the key interfaces you need to implement are:

  • IProvider: Gives the platform the way to invoke your adapter.
  • IMigrationProvider: This is used for writing to the adapter's source system, so for me SharePoint.
  • IMigrationItemSerializer: This provides support for converting the item to XML.
  • IAnalysisProvider: This is used for reading from the adapter's source system.

Both adapters also need to implement the ChangeActionHandlers abstract class.

The VC adapter also needs:

  • IServerPathTranslationService: Used to translate the path (i.e. directories and such) from other adapters to this adapter and vice versa.

 


While the WIT adapter needs:

  • IConflictHandler: Provides support for handling conflicts during the migration.


So for both adapters you will implement a minimum of six classes, excluding any extra ones you need for your specific requirements.

The core concepts of the adapters are all explained in the interfaces, so it appears very simple to implement, and indeed it is. However, there are some weird things which may catch you out, and we will cover those in detail in future posts.

TraceManager

Something very nice in the platform is the TraceManager class, which is really just a wrapper around System.Diagnostics.Trace but with some extras, such as making sure that what you write ends up in the console window and the log files. You will see it sprinkled throughout my code because it is useful to have when trying to debug later on.
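
As a trivial sketch (the message and the arguments are illustrative), a trace call looks just like string.Format:

TraceManager.TraceInformation("WSSVC:AP:Processed {0} items since {1}", itemCount, lastRunTime);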

Power Tip: The TraceManager puts all information written to it in the log files, so please make sure you do not put any sensitive information in there.

How to create an adapter for the TFS Integration Platform - Part II: Setup of the environment

Note: This post is part of a series and you can find the rest of the parts in the series index.

Getting started with adapter development is not the easiest task because you are left a little out in the wild, so this part will serve as a quick start guide for getting what you need to become a TFS Integration Platform developer.

SQL Server

The TFS Integration Platform requires Microsoft SQL Server so you need to install an instance of that.

TFS

It goes without saying, or maybe it doesn't, that if you plan to write an adapter to integrate with TFS you will need TFS. Even if you don't care about TFS you will want something to test against, and the TFS 2010 adapters are of the highest quality, so they make a great test target (testing between your adapter and TFS). Thankfully TFS 2010 can now be installed natively on Windows 7, which means that as a developer you can have an easy environment.

Target system (SharePoint for me)

Since I was developing for SharePoint I needed a SharePoint installation, which meant a 20GB Windows 7 virtual machine :( Hopefully for you this will be less of an issue.

TFS Integration Platform

The TFS Integration Platform is a software component and database which runs on your machine and handles the actual moving of data around. You can get it from http://tfsintegration.codeplex.com/releases. It may not be obvious which download you want since the team has so many options; you want the tools.


During install you will get an option to install the service, which is recommended for production environments when you want to have the synchronisation running continuously. However for development this is not needed.

Power Tip: Once you have completed the tools install, go into SQL Server and back up the TFSIntegrationPlatform database immediately. Not only are there a few odd bugs roaming around the platform (it's still in beta) which may cause you to need a restore of the database, but if you want to test on a clean environment a restore is quicker than a reinstall.

Platform Source

To build adapters you will also need the source code for the TFS Integration Platform, which you can also get from CodePlex. It is best to get the latest drop of the code, which you can do from the Source Control page by clicking the Download link in the latest version box on the far right.


In there you will find the IntegrationPlatform folder which contains all the code from Microsoft.


Power Tip: Make a common root for the TFS code and yours (in my case I used RangersCode) and then create sub-directories in there for the platform code and your code (so I had My Production and MS Production folders under RangersCode). This keeps the items close, which makes things easier later, while keeping them separate so you can identify them.

The code itself is for Visual Studio 2008; however, you can be just like me and use Visual Studio 2010 and it will work just fine. Once you have done all of this you are finally ready to start writing your adapter!

How to create an adapter for the TFS Integration Platform - Part I: Introduction

Note: This post is part of a series and you can find the rest of the parts in the series index.

Since September 2009 I have been engaged in an ALM Rangers project, namely the TFS Integration Platform, which is:

The TFS Integration Platform is a project developed by the Team Foundation Server (TFS) product group and the Visual Studio ALM Rangers to facilitate the development of tools that integrate TFS with other systems. Currently, the scope of this project is to enable TFS to integrate with other version control and work-item/bug tracking systems, but the eventual goal of this project is to enable integration with a broader range of tools/systems (i.e. build). This platform enables the development of two major classifications of tools: tools that move data unidirectionally into TFS, and tools that synchronize data bidirectionally.

So in short it is an integration system, like BizTalk or SSIS, but specially built for version control and work items. I have not said TFS there because it can migrate between other source control and work item systems, provided adapters exist. Adapters are the logic which allows the platform to connect to a variety of technologies, and my goal has been to build two of them: one for SharePoint lists and one for SharePoint document libraries.

You may have noticed that SharePoint isn't a version control or work item system, so why integrate? Well, lots of companies do use it for ALM related items: lists are used for tracking work items, and document libraries are used to store content which should really be in a source control system. This is the first post in a series which will give you an idea of what is involved in building adapters, show you what to avoid, and hopefully give you a few laughs at my expense.

Now I want to be clear that this series will not cover usage of the platform or any of its core concepts. For those please see the links below, in particular Willy-Peter's blog. You do need to understand a bit about how the platform works before you attempt to build your own adapter.

As all my work was done for the ALM Rangers, the source code for my adapters is included in the code which can be obtained from the CodePlex site.

To help you on your way let’s list a few links which are key for this:

.NET 4 Baby Steps: Part XIII - Tiny steps

Note: This post is part of a series and you can find the rest of the parts in the series index.

There are a bunch of tiny additions in .NET 4 which I have not covered yet, so this post provides a quick hit list of some of the new and improved features; a short code sketch showing a few of them in use follows each list.

New

  • new StringBuilder.Clear: Quick method to clear a string builder.
  • new Stopwatch.Restart: Quick method to reset and immediately restart a stopwatch timer.
  • new IntPtr & UIntPtr: Both have had two new methods added, one for addition and one for subtraction.
  • new Thread.Yield: Allows you to yield execution to another thread that is ready to run on the current processor.
  • new System.Guid: Has got two new methods, TryParse and TryParseExact to allow for testing of the parsing.
  • new Microsoft.Win32.RegistryView: This allows you to request 64bit or 32bit views of the registry.
  • new Environment: Now contains two properties to identify 64bit scenarios:
    • Is64BitOperatingSystem: To identify if the OS is 64bit.
    • Is64BitProcess: To identify if the process is 64bit.
  • new System.Net.Mail.SmtpClient: Support for SSL
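
The sketch below shows a few of the new APIs above in use; it is a hedged, illustrative example and the values are arbitrary.

using System;
using System.Text;
using System.Threading;

class NewApiSamples
{
    static void Main()
    {
        // StringBuilder.Clear - empty the builder without creating a new one.
        StringBuilder builder = new StringBuilder("scratch text");
        builder.Clear();

        // Guid.TryParse - parse without having to catch a FormatException.
        Guid parsed;
        bool parsedOk = Guid.TryParse("7F3F91B2-758A-4B3C-BBA8-CE34AE1D48EE", out parsed);
        Console.WriteLine("Parsed: {0} ({1})", parsedOk, parsed);

        // Environment - detect 64 bit scenarios.
        Console.WriteLine("64 bit OS: {0}, 64 bit process: {1}",
            Environment.Is64BitOperatingSystem, Environment.Is64BitProcess);

        // Thread.Yield - give up the rest of this time slice to another ready thread.
        Thread.Yield();
    }
}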

Improved

  • better Path.Combine: A new overload which lets you combine more than two path segments in a single call.
  • better Compression.DeflateStream & Compression.GZipStream: They have been improved so that they do not try to compress already compressed data.
  • better Compression.DeflateStream & Compression.GZipStream: The 4GB size limit has been removed.
  • better Monitor.Enter: A new overload has been added which allows you to pass in a reference boolean which is set to true if the monitor was successfully entered.
  • better Microsoft.Win32.RegistryOptions: Now includes an option to specify a volatile key which is removed when the system restarts.
  • better Registry keys are no longer limited to 255 characters.
  • better System.Net.Mail.MailMessage: Support for new headers
    • HeadersEncoding: Sets the type of text encoding used in the mail header.
    • ReplyToList: Sets the list of addresses to use when replying to a mail. This replaces ReplyTo which only supported one email address.
  • better System.Net.NetworkCredential: To improve security passwords can now be stored in a SecureString.
  • better ASP.NET Hashing: The default value has been changed from SHA1 to SHA256.
  • better ASP.NET Output caching: Previously setting the output cache to ServerAndClient also required calling SetOmitVaryStar to ensure it would be cached on the client. From .NET 4, calling of SetOmitVaryStar is no longer needed.
  • better TimeZoneInfo.Local & DateTime.Now: Both of these follow the OS daylight savings settings rather than using the .NET Framework settings.
  • better When running on Windows 7, locale info will be retrieved from the OS rather than being stored in the framework.
  • better Unicode: Support for all Unicode 5.1 characters, which adds approximately 1,400 new characters.
  • better ServiceInstaller.DelayedAutoStart: If you are on a more modern OS (Vista, Win 7 etc…) then your services can start as Automatic (Delayed Start). This means they still start automatically, but only after system boot has finished so that the user gets in quickly. This is now possible for your .NET services using the DelayedAutoStart property.
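
And a similar hedged sketch for two of the improved APIs above; the paths and the lock object are illustrative.

using System;
using System.IO;
using System.Threading;

class ImprovedApiSamples
{
    static readonly object syncRoot = new object();

    static void Main()
    {
        // Path.Combine now has overloads taking more than two path segments.
        string fullPath = Path.Combine(@"C:\RangersCode", "My Production", "Adapter", "readme.txt");
        Console.WriteLine(fullPath);

        // Monitor.Enter overload with a ref bool that reports whether the lock was taken,
        // so the finally block only releases a lock this thread actually holds.
        bool lockTaken = false;
        try
        {
            Monitor.Enter(syncRoot, ref lockTaken);
            // ... work with shared state here ...
        }
        finally
        {
            if (lockTaken)
            {
                Monitor.Exit(syncRoot);
            }
        }
    }
}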

.NET 4 Baby Steps: Part XII - Numbers

Note: This post is part of a series and you can find the rest of the parts in the series index.

A new namespace has arrived in .NET 4 for those who spend a lot of time with numbers: System.Numerics, which has two types, BigInteger and Complex – and they are exactly what they say they are. BigInteger is for big integers and Complex is complicated ;)

BigInteger

BigInteger is a struct in System.Numerics, not a built-in primitive type (like float), and it allows you to have an integer with no theoretical upper or lower limit! Why is that cool? Think about Int64, which can hold up to 9,223,372,036,854,775,807. If you have an Int64 with that massive value and you add one to it, the Int64 overflows and becomes -9,223,372,036,854,775,808. That is not possible with BigInteger since it has no upper limit!

It also has methods and properties you can use; for example, some of the properties are:

  • IsZero: Tells you if it equals zero.
  • IsEven: Tells you if it is an even number.

An example of using it:

BigInteger firstBigInt = new BigInteger(Int64.MaxValue);
BigInteger secondBigInt = new BigInteger(Int64.MaxValue);

Console.WriteLine("First BigInt is even? {0}", firstBigInt.IsEven);
Console.WriteLine("First BigInt = 1? {0}", firstBigInt.IsOne);
Console.WriteLine("First BigInt is power of twp? {0}", firstBigInt.IsPowerOfTwo);
Console.WriteLine("First BigInt = 0? {0}", firstBigInt.IsZero);
Console.WriteLine("First BigInt is positive (1), zero (0), or negative (-1)? {0}", firstBigInt.Sign);
Console.WriteLine("{0} multipled by {0} is {1}", Int64.MaxValue, BigInteger.Multiply(firstBigInt, secondBigInt));

You can also use the standard operators (-, +, * etc…) with it.

Running this shows that the result of the multiplication is a 38 digit number, far bigger than anything an Int64 could ever hold.


BigRational

What if you want to work with rational numbers with no limits, rather than integers? Then you can use the BigRational class the BCL team has made available at http://bcl.codeplex.com/

Complex

A complex number is a number that comprises a real number part and an imaginary number part. A complex number z is usually written in the form z = x + yi, where x and y are real numbers, and i is the imaginary unit that has the property i² = -1.

That snippet is the first line from the documentation on System.Numerics.Complex, and unfortunately I am not smart enough to know what they are talking about. So who should understand this?

  • Electrical engineers: Using Complex they can combine resistance (R) and reactance (X) to calculate the impedance (Z).
  • Mathematicians: Vector calculus as well as graphs.
  • People using positional (mapping) info: X, Y coordinates on a map or 2D plane.

For an example I will just wimp out and show you what the MSDN documentation has:

// Create a complex number by calling its class constructor.
Complex c1 = new Complex(12, 6);
Console.WriteLine(c1);

// Assign a Double to a complex number.
Complex c2 = 3.14;
Console.WriteLine(c2);

// Cast a Decimal to a complex number.
Complex c3 = (Complex)12.3m;
Console.WriteLine(c3);

// Assign the return value of a method to a Complex variable.
Complex c4 = Complex.Pow(Complex.One, -1);
Console.WriteLine(c4);

// Assign the value returned by an operator to a Complex variable.
Complex c5 = Complex.One + Complex.One;
Console.WriteLine(c5);

// Instantiate a complex number from its polar coordinates.
Complex c6 = Complex.FromPolarCoordinates(10, .524);
Console.WriteLine(c6);

Each of these writes the complex number to the console in the form (real, imaginary).


Some info on complex is from: http://www.dotnetspider.com/resources/36681-Examples-On-Complex-Class-C-New-Feature.aspx