April 2006 Entries
My current project is making me see more and more that iterative development with constant input from the end users is the best way to get a successful outcome. Recently I was involved in a conference call with the end users and the business representatives. Until then, these representatives had given us all of our requirements and we had no direct contact with the people in the field. BIG MISTAKE!
This was compounded by the fact that the client is following a very waterfall approach with tons of documentation. Throw in the fact that they are not willing to re-evaluate timelines and resources after each phase in the process, and all you can do is bring on the antacids.
Now I am not running to throw out all of the documentation and screaming for anarchy. In my opinion, the best approach seems to be to get a well-defined list of the high-level goals of the project. From there, break it into digestible chunks. After each piece is delivered, re-evaluate what should be worked on next. Most importantly, get user feedback early and often.
Ok, so it sounds like Scrum. Maybe it is. If it works, use it. Keep it simple and usable. As always other opinions are welcome.
Rocky Lhotka visited the Downers Grove Microsoft office this evening to present to a packed (around 170 people) CNUG meeting. He presented as comprehensive a coverage of CSLA.NET 2.0 as could be done in two hours.
In that time he covered much of the same material as he did on dnrTV, except that in this format you had the opportunity to pick his brain. One topic the group seemed especially interested in was data-driven authorization; a couple of people asked about it. Rocky explained that different people do authorization in different ways, which is why he implemented the authorization and business rules as methods rather than attributes.
The nice thing about Rocky is that he doesn't hide the warts on .NET or even his own code. He talked about disappointment with the ObjectDataSource but had huge praise for the way WinForms object binding works in .NET 2.0.
There were a couple of times when people asked questions about the way his framework was designed, such as why DataPortal_Fetch is private and isn't overridden. Each time he was able to say that he had tried the alternatives, and while his way may not be perfect, it was the one he decided was best for him.
I think what I like best about his framework is that it demonstrates many good object-oriented concepts but remains flexible enough to work in most situations. There is no way I could do his presentation justice. I will just say that if you get a chance to see Rocky in this type of setting, don't pass it up.
Ok. So I am a glutton for punishment. I just finished writing a chapter on the Logging Application Block and was starting to research some new topics. Now I have agreed to write a chapter on the Cryptography Application Block. Hopefully, with the experience of the first one under my belt, this one will go a little faster. I need to get cranking so that the book can meet the July publishing date. Stay tuned and see how things turn out.
We have been having prolonged adventures with Oracle's wonderful CLOB data type on my current project. We are storing XML data in a CLOB field. In the beginning we used the Oracle ODP.NET provider. When we got into performance testing, we found that saving this data killed our performance. A couple of the developers did some testing and found that the Microsoft Oracle provider performed at least an order of magnitude better. I was a little skeptical about moving to the Microsoft provider because I remembered that certain database conditions weren't reported very well, but timelines always win.
A short time later, the client got a response from Oracle saying we should use a VarChar parameter to send data to a CLOB in the database. We did some testing and it did perform well, but we figured we had changed enough data access code at that point that we would wait until a maintenance release to make the change.
The system went into pilot and suddenly we started getting failures to insert into the table. Again the developers did some research and found that any data over 32K would cause this condition.
The solution? Go back to Oracle's suggestion.
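For anyone fighting the same battle, the binding change can be sketched roughly like this, assuming ODP.NET's OracleParameter API; the table, column, and parameter names here are made up for illustration:

```csharp
using Oracle.DataAccess.Client;

public static class ClobSave
{
    public static void SaveXml(OracleConnection conn, string xml)
    {
        using (OracleCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "UPDATE order_docs SET payload = :payload WHERE id = 1";

            // The slow path we started with: bind explicitly as a CLOB.
            // cmd.Parameters.Add("payload", OracleDbType.Clob).Value = xml;

            // Oracle's suggestion: bind the string as Varchar2 and let the
            // database convert it into the CLOB column. In our testing this
            // performed well, and it did not hit the 32K failure we saw
            // with the Microsoft provider.
            cmd.Parameters.Add("payload", OracleDbType.Varchar2).Value = xml;

            cmd.ExecuteNonQuery();
        }
    }
}
```

The only change from the original code is the OracleDbType on the parameter; the conversion into the CLOB column happens on the database side.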
The moral? Take your medicine now. It tastes worse if you wait.