May 2005 Entries
Microsoft Corporation has launched the Technology Adapter Challenge Contest for Developers and IT Pros.
The contest challenges you to put your talents to work creating Killer Applications using the forthcoming Visual Studio 2005 (Whidbey) and coming up with the most ingenious Tips-and-Tricks for Windows Server 2003 deployment.
The contest is broadly classified under two umbrellas:-
1. Developer Challenge - Taming Whidbey
2. IT PRO Challenge - Taming WS 2003
Come up with solutions - Applications and Tips-and-Tricks - that will put you in a position to win a series of fabulous prizes.
You can register for the contest from Here
For more information on the contest, please check Microsoft Technology Adapter Challenge
Cheers and Happy Participation !!!
Desktop Search Tools are becoming very popular, and of late a lot of popular search engines have been coming up with them.
Desktop Search Engines provide a way to search your hard disk files, documents, Outlook mails etc., more quickly than you can imagine, particularly if you have a huge dump of data in your drives.
Well, I installed a couple of such tools, but ever since I installed the MSN Desktop Search, I have been in love with it.
The first appealing thing is the ease of use. The other things that attract me are the filtering options.
You can filter your search results to just Documents, Email, etc., as well as filter only specific locations such as Files, Outlook etc.,
The UI is simply appealing, and ever since I installed it, I have become a bit lazy about browsing directories just to find an old document.
Try it out; I bet it's worth your while and can save you a lot of time. It takes very little time to scan your drives, and once it's done, you are all set to search.
Years ago, I received a lot of spam mails claiming that somebody in Rajasthan had developed an OS which could provide access to any file within 2 clicks. Well, I don't know the authenticity of the mail, as it usually came with the prefix/suffix "forward this to N people and you will get something good" etc., (though during my early days I believed and forwarded.. but that's a different story). The Desktop Search actually provides access to all your files in a single click. You just need to enter the first few letters of the name of your file and you get a popup with the available files, from where you can just click to open them.
You can try the MSN Desktop Search from http://desktop.msn.com/
Cheers and Happy Desktop Searching !!!
All of us do a Search every day ! I mean the Internet Search !!
We search for everything from movies to what's new in the market. Developers and Software Professionals often use search engines to view information related to the latest technologies, which is shared across the net.
The New MSN Search http://search.msn.com/ is a whole new revolution and experience in your search. It's faster than you expect.
Some of the cool features include:-
1. Intelligent Search Builder
Now you can select the way you want MSN Search to interpret your search terms.
2. The powerful MSNBot
The web crawler which, when I last heard, had indexed more than 5 billion web pages.
3. Most up-to-date information
Updated daily, the index provides relevant, timely and accurate data as quickly as possible – and minimizes frustrating dead links.
4. Encarta Answers
Get direct answers to questions, solve equations, and look up facts with Microsoft Encarta.
5. Music Answers
When you’re searching for an artist, song or album, you’ll get immediate results from content provided by the MSN Music service, giving you the opportunity to take immediate action. One click enables you to listen to a sample from a song, and a second click lets you purchase and download.
6. Image Search
An index of hundreds of millions of images.
7. New Search Tools
With Search Builder and Search Near Me tools you can quickly get results that precisely meet your needs.
Of course, there are many other features which can help you find results quickly and precisely. It doesn't provide you with a long chunk of unrelated information but rather gives you the most relevant information.
Needless to say, the New MSN Search has many a feature to explore to make your search experience easier.
Happy Searching !!!
Tech•Ed 2005 is Microsoft's largest technology educational conference. Attending Tech Ed provides a good knowledge update for Developers/Architects etc., who can update their skills, get a feel of the upcoming technologies and ask questions/get clarifications from experts.
Tech Ed India 2004 was a grand success in all the cities, and it provided a good introduction to Whidbey and Yukon, which are scheduled to be released to market sometime within this year. The Beta versions have been out for a long time, and the developers at Microsoft have listened to the queries from all Beta Users / MVPs etc., and implemented their views/ideas.
Tech Ed 2005 would be a great venue for meeting thousands of peers and to explore solutions with the experts behind the technology.
More information about the Venue, Dates, Agenda etc., can be found at Tech•Ed 2005
I look forward to seeing you all there.
Cheers and Happy Programming !!!
You may receive the error "AllowCustomPaging must be true and VirtualItemCount must be set for a DataGrid with ID DataGrid1 when AllowPaging is set to true and the selected datasource does not implement ICollection." while attempting to bind a DataGrid to a DataReader with the DataGrid's AllowPaging property set to True.
The error occurs because you are binding a DataReader to the DataGrid. The DataReader offers forward-only, read-only access to the data being retrieved from the data source.
Since paging requires the result set to be accessible forwards as well as backwards (to implement the Previous / Next set of records), the DataReader cannot help in this scenario.
Even if you set AllowCustomPaging=true, though the error disappears, you will only be able to see the first set of records and will be unable to use the built-in paging functionality provided by the DataGrid.
The resolution is to use a DataSet, which offers an in-memory representation of data, so that you can navigate back and forth through the result set as well as modify it. This way you can use the built-in paging functionality of your DataGrid.
However, if you want to use only a DataReader, then you need to store all the values of the DataReader in an array etc., and write custom paging functionality yourself.
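As a minimal sketch of the DataSet approach, the code-behind below fills a DataSet and rebinds on each page change. The names here (DataGrid1, the pubs sample database, the Authors table) are illustrative assumptions, not part of the original article.

```csharp
// Sketch: binding a DataGrid to a DataSet so its built-in paging can work.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI.WebControls;

public class AuthorsPage : System.Web.UI.Page
{
    protected DataGrid DataGrid1;

    private void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
            BindGrid();
    }

    private void BindGrid()
    {
        // Fill a DataSet: an in-memory copy that can be traversed back and forth,
        // which is what the DataGrid's paging needs.
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT au_id, au_lname FROM Authors",
            "server=(local);database=pubs;Integrated Security=SSPI");
        DataSet ds = new DataSet();
        adapter.Fill(ds, "Authors");

        DataGrid1.DataSource = ds.Tables["Authors"];
        DataGrid1.DataBind();
    }

    // Wire this to DataGrid1's PageIndexChanged event: move to the
    // requested page and rebind.
    protected void DataGrid1_PageIndexChanged(object source,
        DataGridPageChangedEventArgs e)
    {
        DataGrid1.CurrentPageIndex = e.NewPageIndex;
        BindGrid();
    }
}
```

With AllowPaging="true" set on the DataGrid, the grid then renders the Previous / Next navigation itself.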
Cheers and Happy Programming !!!
<UPDATE November 26, 2007>
I find that a lot of people found this article useful. This article is a continuation of an earlier one, which deals with IIS and SQL Server on the same machine. The link for the same is
This article is in continuation of Part I, where I explained the scenarios in which this error occurs when the Webserver and SQL Server are running on the same machine.
In case you haven't read that, please read Part I of this article before proceeding.
Let us examine the 2nd scenario as follows:-
2. IIS (Webserver) and SQL Server are on different machines.
Check the SQL Server Authentication Mode and set it to Mixed Mode Authentication, as explained in Part I of this article.
Then comes the actual interesting part. Though you configure your SQL Server for Mixed Mode Authentication and add the ASPNET or NETWORK SERVICE account to the databases as explained in Part I of this article, you will still get the error.
The reason is that the accounts under which the ASP.NET worker process runs (ASPNET in Windows 2000 Server and Windows XP Pro., NETWORK SERVICE in Windows Server 2003) are local accounts (local to the web server). Therefore, a database server on a separate machine will not be able to see/recognize these accounts. So if you try using the same steps mentioned above to configure a trusted connection between the web server and the SQL Server, you will still get the error.
The resolution for this is as follows:-
1. Create a Domain Account with privileges similar to ASPNET or NETWORK SERVICE.
2. Grant this Domain Account (DomainName\UserName) access in SQL Server for the necessary databases.
3. Use Impersonation (setting identity impersonate="true") in the web.config of your application.
Now, while enabling impersonation, you can either set the username and password in the web.config itself as follows:-
<identity impersonate="true" userName="DomainName\UserName" password="password" />
However, this defeats the purpose of security, as you are again storing the password in the web.config file.
The second method is to simply set identity impersonate to true and assign the username and password in the IIS.
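With this second approach, the web.config carries only the impersonation flag and no credentials; a minimal sketch of the relevant section:

```xml
<configuration>
  <system.web>
    <!-- No userName/password attributes here; IIS supplies the
         credentials of the configured anonymous account instead. -->
    <identity impersonate="true" />
  </system.web>
</configuration>
```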
To do that, do the following steps
1. Type inetmgr at your command prompt and press Enter.
2. This opens the IIS Manager.
3. Expand the appropriate nodes and select the Virtual Directory of your application.
4. Right Click and select Properties.
5. Switch to the Directory Security Tab.
6. Under Anonymous access and authentication control click Edit
7. Check the Enable Anonymous access in case you want people to access the application without logging in with Windows Logon Screen.
8. Uncheck Allow IIS to control password and enter the DomainName\UserName and Password in the respective boxes. Usually IIS uses the IUSR_MACHINENAME credentials for Anonymous access.
9. Uncheck any other authentication mode that is checked and then press OK twice to exit.
Now the application should be served and the error "Login failed..." shouldn't appear.
Cheers and Happy Programming !!!
The steps and guidelines provided in this article have been explained in many other online resources, MSDN and KB articles; this is just an initiative to further spread awareness and help people who get stumped by this error.
I have freely referred to many resources while writing this article, and I extend my thanks to the respective authors who have helped me in compiling it.
<UPDATE Date=November 26, 2007>
I find that a lot of people have found this article useful and at the same time found it hard to get to the second part. So here below is the link for Part II of this article.
Integrated authentication allows for SQL Server to leverage Windows NT authentication to validate SQL Server logon accounts. This allows the user to bypass the standard SQL Server logon process. With this approach, a network user can access a SQL Server database without supplying a separate logon identification or password because SQL Server obtains the user and password information from the Windows NT network security process.
Choosing integrated authentication for ASP.NET applications is a good choice because no credentials are ever stored within your connection string for your application.
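As a sketch, a trusted connection from code looks like the following; the server and database names ("MyServer", "Northwind") are placeholders, not from the original article.

```csharp
// Sketch: opening a trusted (Windows-authenticated) SQL Server connection.
using System.Data.SqlClient;

class TrustedConnectionSketch
{
    static void Main()
    {
        // No user id or password in the string: Integrated Security=SSPI tells
        // the provider to present the Windows credentials of the current
        // process (for ASP.NET, the worker process account).
        SqlConnection conn = new SqlConnection(
            "server=MyServer;database=Northwind;Integrated Security=SSPI");
        conn.Open();
        conn.Close();
    }
}
```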
However, you may receive the error "Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection" while trying to use Integrated Windows Authentication and your application needs to access the SQL Server Database using Trusted Connection.
Let's examine two scenarios where this error can occur.
1. Your IIS (Webserver) and SQL Server are on the same machine.
The first thing to check is whether the SQL Server is configured to use Mixed Mode Authentication or Integrated Authentication.
To do that, do the following steps :-
1. Click Start - Programs - Microsoft SQL Server - Enterprise Manager to open the Enterprise Manager.
2. Connect to the appropriate Server if the SQL is a client installation.
3. Right click on the Server Node and click on Properties.
4. This opens the Properties Dialog.
5. Switch to the Security Tab.
6. Under the Security section, make sure the Option SQL Server and Windows is selected.
7. This ensures that your SQL Server is running under Mixed Mode Authentication.
The next step is to ensure that the ASPNET account (IIS_WPG in the case of Windows Server 2003) has the appropriate access to the database.
To do that, do the following steps:
1. Open SQL Server Enterprise Manager (Start - Programs - Microsoft SQL Server - Enterprise Manager), select the appropriate server, and expand the Security folder.
2. In the Logins check whether the IIS_WPG is listed.
3. If it is not listed, right-click on Logins and select New Login
4. In the Name: textbox either enter [Server/Domain Name]\IIS_WPG or click on the ellipses button to open the Windows NT user/group picker.
5. Select the current machine’s IIS_WPG group and click Add and OK to close the picker.
6. You then need to set the default database and the permissions to access it. To set the default database, choose from the drop-down list.
7. Next, click on the Database Access tab and specify the permissions.
8. Click on the Permit checkbox for every database that you wish to allow access to. You will also need to select database roles; checking db_owner will ensure your login has all necessary permissions to manage and use the selected database.
9. Click OK to exit the property dialog.
Your ASP.NET application is now configured to support integrated SQL Server authentication.
I will explain the next scenario, where the Webserver and SQL Server reside on different machines, in my next article.
Read Part II
Cheers and Happy Programming.
You may receive the error "BC30138: Unable to create temp file in path 'C:\Windows\TEMP\:' Access is denied" while trying to debug or browse your ASP.NET Applications.
This may occur if the ASPNET account doesn't have sufficient privileges on the 'C:\WINDOWS\TEMP' folder, in Windows versions prior to Windows Server 2003.
In Windows Server 2003, it is the Network Service account that needs to have the necessary rights for that folder.
Even if you assign rights to EVERYONE it won't work, since these accounts are not part of the EVERYONE group.
The error can be resolved by explicitly assigning MODIFY rights to the ASPNET user account (prior to Windows Server 2003) or the NETWORK SERVICE account in the case of Windows Server 2003.
That should solve the problem.
Cheers and Happy Programming !!!
I had a query from a user complaining that his validators were firing on the click of the wrong button. I provided a workaround for his problem which I thought would be worthwhile to share with all of you.
All of us would agree that ASP.NET Validation Controls are a great means of quickly developing user input based forms and enforcing validations for the entries.
Say in an ASPX Page you have the following controls:-
TextBox1, TextBox2 - Two Textboxes
Button1, Button2 - Two Buttons
RequiredFieldValidator1, RequiredFieldValidator2 - Two RequiredFieldValidators associated with each Textbox.
Now, once you declare the above and assign the ControlToValidate property to the TextBoxes respectively, you would expect the first validator to fire only when TextBox1 doesn't get a valid input and the second validator to fire only when TextBox2 doesn't. That is, when you click Button1, which should validate TextBox1, you should get a validation error only if TextBox1 is empty; similarly, if you click Button2, you should get a validation error only if TextBox2 is empty.
However, when you click any of the buttons with either one or both the TextBoxes not meeting the criteria, the form cannot be submitted.
The reason is that the Button (asp:button) is a server control and posts back to the server. However, before the page is posted, the validity of the page is checked, and if either of the two validations is not met, the page is not posted.
The workaround is as follows:-
1. On the page_load event of the page set the Enabled property of the second validator i.e. RequiredFieldValidator2 to false, as follows:-
RequiredFieldValidator2.Enabled = false
2. This would ensure that when Button1 is clicked, the second validator doesnt fire.
3. Then, on the Button click event of Button1, set back the Enabled property of the second validator to True.
RequiredFieldValidator2.Enabled = true
4. Now, the second validator will fire if TextBox2 is left empty when an attempt is made to post the page.
5. Similarly you can also enable and disable other validation controls.
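The steps above can be sketched in the code-behind as follows. This is a minimal sketch of the workaround exactly as described; the handler wiring (Button1's Click event) is assumed from the page setup in the example.

```csharp
using System;
using System.Web.UI.WebControls;

public class ValidatorWorkaroundPage : System.Web.UI.Page
{
    protected RequiredFieldValidator RequiredFieldValidator2;

    private void Page_Load(object sender, EventArgs e)
    {
        // Step 1: disable the second validator so a click on Button1
        // validates only TextBox1.
        RequiredFieldValidator2.Enabled = false;
    }

    // Step 3: wired to Button1's Click event - re-enable the second
    // validator so TextBox2 is validated on the next post.
    private void Button1_Click(object sender, EventArgs e)
    {
        RequiredFieldValidator2.Enabled = true;
    }
}
```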
Well, this is a solution for a specific requirement and may not be generic. However, I am exploring other options, and in the meanwhile, if anyone has a good pointer other than using third-party validation controls, please do let me know.
This article applies to .NET 1.x versions only.
Windows Server 2003 SP1 is a very good security update released by Microsoft, and I hope many of you have already installed it.
However, if for some reason you had to remove SP1, you may no longer be able to browse ASPX pages.
Even after you enable the Web Service Extensions from IIS, you will get a 403.1 error message and will be unable to execute ASPX pages.
The resolution is to uninstall and reinstall IIS.
That should resolve the issue.
Cheers and Happy Programming !!!
In my earlier articles on ASP.NET 1.x versions, I had explained the ways of caching an ASP.NET page, in which we are limited to two options:
i. Caching the whole page using Output Caching - not applicable in many real-time scenarios where some sections of the page need to be dynamic, e.g. stock rates.
ii. Caching usercontrols by Fragment Caching - difficult to implement, since the page needs to be broken into separate usercontrols for enabling/disabling caching.
So, if we had a situation where we wanted the whole page to be cached and only a particular portion left uncached, we had to separate the page into usercontrols and then provide caching only for those sections we needed cached, leaving the dynamic section uncached. The reason is that if you specify output caching for the page, then the whole page, including the controls, will be cached for that duration.
Whidbey, once again, has come up with a resolution for this, thanks to the Substitution control, which lets you specify a section on an output-cached Web page where you want dynamic content substituted for the control.
The Substitution control offers a simplified solution to partial page caching for pages where the majority of the content is cached. You can output-cache the entire page and then use Substitution controls to specify the parts of the page that are exempt from caching.
The syntax for declaring a substitution control is as follows:-
<asp:substitution id="Substitution1"
methodname="GetCurrentDateTime" runat="server" />
Let us examine how we can check the functionality of the Substitution control on an output cached page.
In our ASPX Page, first we will declare the Output Caching next to the Page Directive as follows:-
<%@ OutputCache Duration="60" VaryByParam="none" %>
Here Duration="60" specifies that the page will be cached for 60 seconds.
Then, we declare a label, a substitution control and a button control as follows:-
<asp:label id="lblCurrentDate" text="Label displaying the Current Date !!!" runat="server" />
<asp:substitution id="Substitution1" methodname="GetCurrentDateTime" runat="server" />
<asp:button id="btnRefresh" text="Check Now !!!" runat="server" />
Then, in the codebehind, we specify the text for the Label in the Page_load event as follows:-
void Page_Load(object sender, System.EventArgs e)
{
    lblCurrentDate.Text = DateTime.Now.ToString();
}
We will set the Current Date Time to the label so that when we refresh the page, we can see whether the time is retained or new time is displayed on the Label.
Next, we have to define the method that is specified for the Substitution control. If we examine the substitution control declaration above, we can see that we have specified methodname="GetCurrentDateTime".
When the Substitution control executes, it calls a method that returns a string. The string that the method returns is the content to display on the page at the location of the Substitution control.
We define the method "GetCurrentDateTime" in the codebehind as follows:-
public static string GetCurrentDateTime(HttpContext context)
{
    return DateTime.Now.ToString();
}
After compiling the application, if we browse the page we will find that the text displayed by the Label and the Substitution control is the same, i.e. the current date and time.
However, if you refresh the page by clicking the Button, you will find that the time displayed in the Label remains the same, while the time displayed in the Substitution control is changed (the change might be in seconds).
For subsequent requests too, the time will change in the Substitution control while it remains the same in the Label, until the duration of 60 seconds expires.
Once it expires, both the Label and the Substitution control display the updated current time.
Thus we can see that though we have cached the whole page using output caching, the Substitution control remains dynamic and shows the current time.
This is really a great advantage for developers who would like to use Caching for building robust applications while maintaining dynamic sections of their page as well.
There are other ways to achieve this functionality which I would discuss in my next articles.
Cheers and Happy Programming !!!