I was puzzled by the decimal type: when I load a currency value from the database, it is serialized as 0.0000, whereas if I assign the value directly in C# code it is serialized as 0. As a result, my AssertEqual() function fails.
After some digging, I found the problem can be simplified into two sets of Debug.Print calls.
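The original prints aren't reproduced here, but a minimal sketch of the idea (assuming one value is assigned as 0m in code and the other arrives from a database decimal/money column as 0.0000m) would look like this:

```csharp
using System.Diagnostics;

class Program
{
    static void Main()
    {
        // Value assigned directly in code: scale 0
        decimal fromCode = 0m;
        // Value as it might arrive from a database currency column: scale 4
        decimal fromDatabase = 0.0000m;

        Debug.Print(fromCode.ToString());      // "0"
        Debug.Print(fromDatabase.ToString());  // "0.0000"

        // They compare equal, so the difference only shows up when converted to text
        Debug.Print((fromCode == fromDatabase).ToString()); // "True"
    }
}
```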
The difference arises because decimal remembers its precision (scale). You can confirm this with Decimal.GetBits(): even though 0 and 0.0000 are both zero, their GetBits() results differ.
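A small sketch of that check: GetBits() returns four ints, and the fourth one carries the scale (the number of digits after the decimal point) in bits 16-23, so 0.0000m shows a scale of 4 even though its value is zero.

```csharp
using System;

class Program
{
    static void Main()
    {
        decimal plainZero = 0m;
        decimal scaledZero = 0.0000m;

        // The fourth element encodes the scale in bits 16-23.
        Console.WriteLine(string.Join(", ", decimal.GetBits(plainZero)));   // 0, 0, 0, 0
        Console.WriteLine(string.Join(", ", decimal.GetBits(scaledZero)));  // 0, 0, 0, 262144 (4 << 16)

        // Extracting the scale explicitly
        int scale = (decimal.GetBits(scaledZero)[3] >> 16) & 0xFF;
        Console.WriteLine(scale); // 4
    }
}
```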