
30 May 2010

Sharp Tests Ex 1.0.0 release

I’m happy to announce the final release of Sharp Tests Ex 1.0.0

After using it for such a long time in various projects (open source and commercial) without any kind of problem, and enjoying its fluent interface and its ‘Satisfier’, it is time to put a final point to its first version.

In these months I have enjoyed many of the users’ comments, for example the anagram of the project name: Sharp Test sEx.

In one issue, a user defined it as a “sexy framework”… so true, given that Sharp Tests Ex has its Satisfier ;)

A few months ago the NHibernate team approved the usage of Sharp Tests Ex in our tests, and now that it is stable I can use it even in NHibernate.

Before, Sharp Tests Ex was as you can hear here, but now: I can get satisfaction!!
Enjoy it!!… with moderation… LOL

From SVN to Mercurial (in Windows 7)

This is the summary of two successful stories of converting an SVN repository to Mercurial.
Before starting the conversion you will need the latest available releases of TortoiseSVN and TortoiseHg installed on your box.
Now that you have the basic requirements, you need to “activate” Mercurial’s convert extension. On your Windows 7 machine edit the file c:\Users\<YourUserName>\mercurial.ini and add the following two lines:
[extensions]
convert=

The conversion procedure

The real conversion is just one Mercurial command, but if your repository is hosted in the cloud the best procedure is to create a local read-only mirror of your SVN repository first.
In this example I will convert the ConfORM repository hosted on Google Code.

Create a new folder to hold the local mirrors of your SVN repositories:
D:\>md svnmirrors
D:\>cd svnmirrors

To create the mirror (the empty pre-revprop-change hook is needed so that svnsync is allowed to set revision properties on the mirror):

D:\svnmirrors>svnadmin create ConfOrmMirror
D:\svnmirrors>cd ConfOrmMirror\hooks
D:\svnmirrors\ConfOrmMirror\hooks>echo. 2>pre-revprop-change.bat
D:\svnmirrors\ConfOrmMirror\hooks>cd ..\..
D:\svnmirrors>svnsync init --username fabiomaulo file:///d:/svnmirrors/ConfOrmMirror https://codeconform.googlecode.com/svn/trunk/

Now the local mirror is ready to be synchronized with the remote repository (you can sync it any time you need).

D:\svnmirrors>svnsync sync file:///d:/svnmirrors/ConfOrmMirror

The synchronization will take a while, depending on how long the history of your repository is (or on the part that was not yet synchronized).
After the synchronization I’m ready for the conversion.

D:\svnmirrors>md ConfOrmHg

Converting the full history, I also want to ‘convert’ all the committer names in order to follow the Mercurial user-name convention. To achieve this, I will create a new file named authors.txt in the ConfOrmHg folder. In ConfORM, so far, I’m the only committer, so the file contains a single mapping (see the example below).
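The hg convert authors file maps, one per line, an SVN user name to a Mercurial user name; a minimal sketch for this case (the e-mail address is only a placeholder) is:

fabiomaulo = Fabio Maulo <fabiomaulo@example.com>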
Now I have everything needed to run the conversion directly from my local machine, and if I forgot a user name, an e-mail, or whatever else may happen, I don’t need to pay the pain of a conversion over the wire.

D:\svnmirrors\ConfOrmHg>hg convert file:///d:/svnmirrors/ConfOrmMirror ConfOrm -A authors.txt

Work done: now I’m ready to push to the remote Mercurial repository.

If you need to continue working on your SVN repository and only want to be ready for the conversion, you can execute the synchronization any time you want; both the sync and the conversion work only on the differences from the previous state, as shown below.
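For example, to pick up the SVN commits made after the first conversion you simply repeat the same two commands; each run processes only the revisions added since the previous one:

D:\svnmirrors>svnsync sync file:///d:/svnmirrors/ConfOrmMirror
D:\svnmirrors\ConfOrmHg>hg convert file:///d:/svnmirrors/ConfOrmMirror ConfOrm -A authors.txt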

08 May 2010

Azure: TableServiceEntity or TableDataRow


There are a lot of things to say about the Windows Azure SDK, and about how many classes I would remove, especially among those that manage the Table Storage. In the project I’m involved in there is no time, so far, to implement what we would like to have… the main issue is that the SDK has some limitations that the Windows Azure Platform itself does not have.
Today is the day of the base class TableServiceEntity.
The first issue is its name. If you want to confuse a user about what is represented in an Azure table, the best way is to add the suffix “Entity”.
The implementation of the base class proposed in the SDK is this:
public abstract class TableServiceEntity
{
    protected TableServiceEntity()
    {
    }

    protected TableServiceEntity(string partitionKey, string rowKey)
    {
        this.PartitionKey = partitionKey;
        this.RowKey = rowKey;
    }

    public virtual string PartitionKey { get; set; }
    public virtual string RowKey { get; set; }
    public DateTime Timestamp { get; set; }
}
Nice POCO, no? The problem with it is that it is too much POCO (the translation of the word ‘poco’ from Italian and Spanish is: ‘FEW’).

Problem 1

The class TableServiceEntity is merely a base class with the three properties required by Azure’s Table Storage, and I can’t call it an “Entity” because MY DOMAIN has no string property called ‘PartitionKey’, no string property called ‘RowKey’, no “Timestamp” and, above all, in my case I will not use Azure’s table storage to store the state of MY real entities.

Problem 2

Since PartitionKey and RowKey represent the composite primary key of a data-row, and both are required, which class has the responsibility to return valid values composed from some of MY properties?
For me the answer is clear: the instance itself should provide valid values for PartitionKey and RowKey, and neither property should have a public setter.
Since the property Timestamp is completely managed by the Table Storage, should that property have a public setter? … No, it shouldn’t.
Unfortunately the Azure SDK allows neither private nor protected setters!! :((
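To make the intent concrete, this is roughly the shape I would like to write (the class name and members below are only my illustration, not SDK code); the SDK rejects it precisely because the setters are not public:

// Hypothetical base class: keys are computed by the subclass and are not settable from outside.
// The Azure SDK, however, requires public setters, so this shape cannot be used with it.
public abstract class TableRowSketch
{
    public string PartitionKey { get; protected set; }
    public string RowKey { get; protected set; }
    public DateTime Timestamp { get; protected set; }

    // Subclasses would compose the keys from their own properties.
    protected abstract string CreatePartitionKey();
    protected abstract string CreateRowKey();
}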

Problem 3

As said, PartitionKey and RowKey are required, and I would like the values needed to compose them to be required in the constructor. For example I would have something like this:
public NewsInfoPerCategoryData(string category, string title, DateTime createdAt)
I don't want another public parameterless constructor. If needed (and believe me, I know the reason) I can put there a protected parameterless constructor, as sketched below. Well… to be short, if you use the SDK forget it!! You must have a public parameterless constructor. :((
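Put together, the class I would like to write looks more or less like this (the body and the choice of category/title as keys are only my illustration of the intent); the SDK refuses it because there is no public parameterless constructor:

public class NewsInfoPerCategoryData : TableServiceEntity
{
    // The required values go through the public constructor
    // (using the category as PartitionKey and the title as RowKey is only an illustrative choice).
    public NewsInfoPerCategoryData(string category, string title, DateTime createdAt)
        : base(category, title)
    {
        CreatedAt = createdAt;
    }

    // The parameterless constructor exists only for materialization, not for users...
    // ...and this is exactly what the SDK does not accept: it must be public.
    protected NewsInfoPerCategoryData()
    {
    }

    public DateTime CreatedAt { get; set; }
}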
To create a new instance of a class with a protected parameterless constructor, the C# line is:
var instance = Activator.CreateInstance(typeof(TTableEntity), true);
When you have ten minutes, please, use it to create an instance as the result of a query, thanks.

Problem 4

The other problem is the Timestamp. So far I have not checked its behavior directly in the cloud, but in the Development Fabric it seems to have the same precision problem as the DateTime of MS SQL 2005: it is rounded to about 3 milliseconds (hopefully it is only a limitation of the Development Fabric, because if the limitation is the same in the cloud we will have a real PITA managing optimistic locking, or we must hope for the help of fate).

Why is TableServiceEntity too much POCO?

If you use the SDK you will use classes inherited from TableServiceEntity together with the TableServiceContext. The TableServiceContext is a stateful context based on an identity hash-table (why we should use a stateful context to work with a RESTful service is one of those “mysteries of Faith”).
Inside the TableServiceContext, two instances represent the same entity if they have the same hash-code, but the implementation of TableServiceEntity does not care about this fact… in practice the implementation should reflect what happens in the storage, where two “entities” represent the same row if they have the same type (or, we can say, the same table), the same PartitionKey and the same RowKey.

My proposal

Well… not exactly my real proposal… this is more the proposal that “accepts” some of the above constraints.
The code below is under Ms-PL (if you need another license, to be comfortable, let me know).
[DataServiceKey(new[] { "PartitionKey", "RowKey" })]
[CLSCompliant(false)]
public abstract class TableDataRow
{
    private int? requestedHashCode;
    private string partitionKey;
    private string rowKey;

    public string PartitionKey
    {
        get { return partitionKey ?? (partitionKey = CreatePartitionKey()); }
        set { partitionKey = value; }
    }

    public string RowKey
    {
        get { return rowKey ?? (rowKey = CreateRowKey()); }
        set { rowKey = value; }
    }

    public DateTime Timestamp { get; set; }

    protected abstract string CreatePartitionKey();
    protected abstract string CreateRowKey();

    public override bool Equals(object obj)
    {
        return Equals(obj as TableDataRow);
    }

    public bool Equals(TableDataRow other)
    {
        if (ReferenceEquals(null, other))
        {
            return false;
        }
        if (ReferenceEquals(this, other))
        {
            return true;
        }
        return GetType().IsAssignableFrom(other.GetType()) &&
               Equals(other.PartitionKey, PartitionKey) &&
               Equals(other.RowKey, RowKey);
    }

    public override int GetHashCode()
    {
        if (!requestedHashCode.HasValue)
        {
            unchecked
            {
                requestedHashCode = (GetType().GetHashCode() * 397) ^
                                    (PartitionKey != null ? PartitionKey.GetHashCode() : 0) ^
                                    (RowKey != null ? RowKey.GetHashCode() : 0);
            }
        }
        return requestedHashCode.Value;
    }
}
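A hypothetical concrete row built on this base class could look like the following sketch (the class name, the properties and the key conventions are only illustrative):

public class NewsInfoPerCategoryRow : TableDataRow
{
    public string Category { get; set; }
    public string Title { get; set; }
    public DateTime CreatedAt { get; set; }

    protected override string CreatePartitionKey()
    {
        // One partition per category (illustrative convention).
        return Category;
    }

    protected override string CreateRowKey()
    {
        // Reverse-chronological ticks plus the title, so the newest rows sort first (illustrative convention).
        return string.Format("{0:D19}_{1}", DateTime.MaxValue.Ticks - CreatedAt.Ticks, Title);
    }
}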

03 May 2010

C#4 ‘dynamic’ hole

I’m working on a new .NET 4 cloud project to run in Windows Azure. The proof of concept works fine and now I’m working on the real project, writing tests and implementing classes.
I need some classes to initialize the Azure storage based on some conventions; in particular, during the implementation of my TableStorageInitializer<TTableEntity> and, above all, its test, I found a really nice issue with dynamic.
To simplify the situation a little bit, have a look at this simplified implementation:
public class TableStorageInitializer<TTableEntity> where TTableEntity : class, new()
{
    public void Initialize()
    {
        InitializeInstance(new TTableEntity());
    }

    public void InitializeInstance(dynamic entity)
    {
        entity.PartitionKey = Guid.NewGuid().ToString();
        entity.RowKey = Guid.NewGuid().ToString();
    }
}

To test it I’m using a private class declared inside the test class:

private class MyClass
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTime Timestamp { get; set; }
}
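The test simply exercises the initializer with that private class, something along these lines (the exact test body is not important here):

// Called from inside the test class, where MyClass is accessible.
new TableStorageInitializer<MyClass>().Initialize();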


To my very big surprise, what happens is shown in this picture:


[Image: DynamicHole]


As you can see, dynamic does not work: even though the debugger can show the right type with its public properties, the DLR can’t bind to them. The reason is that the runtime binder honors accessibility from the calling assembly: MyClass is a private class of the test fixture, so from the assembly containing TableStorageInitializer its members are not visible and the call fails with a RuntimeBinderException.

How to solve the situation? Well… very very easily: old-school reflection!!!

public void InitializeInstance(object entity)
{
    var entityType = entity.GetType();
    entityType.GetProperty("PartitionKey").SetValue(entity, Guid.NewGuid().ToString(), null);
    entityType.GetProperty("RowKey").SetValue(entity, Guid.NewGuid().ToString(), null);
}

Note: with a public class, dynamic works as expected.