Srijith Sarman

Time, space and living


Sunday, May 20, 2012 #

Continued from the last post!

So when I tried to hit the web service again, the 403 error was gone. But now there was a new error, in the classic .NET yellow screen:

Could not find a base address that matches scheme https for the endpoint with binding BasicHttpBinding

Well, this is obviously due to the binding configuration.  Let me open my config file again.
Here it is, the first culprit.

<serviceBehaviors>
          <behavior>
            <!-- To avoid disclosing metadata information, set the value below to false and remove the metadata endpoint above before deployment -->
            <serviceMetadata httpsGetEnabled="true"/>

I changed it to httpGetEnabled. No! The same error!
In fact, if the error were due to httpsGetEnabled, it would read as below:


"The HttpsGetEnabled property of ServiceMetadataBehavior is set to true and the HttpsGetUrl property is a relative address, but there is no https base address.  Either supply an https base address or set HttpsGetUrl to an absolute address. "

The next area is the binding configuration.

Here it is,

    <bindings>
      <basicHttpBinding>
        <binding name="BasicHttpBindingConfig" closeTimeout="00:10:00"
            openTimeout="00:10:00" receiveTimeout="01:00:00" sendTimeout="00:10:00"
            maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
          <readerQuotas maxDepth="64" maxStringContentLength="2147483647"
              maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="16384" />
          <security mode="Transport">
            <!--<transport clientCredentialType="None" />-->
          </security>
        </binding>
      </basicHttpBinding>
      <wsHttpBinding>
        <binding name="WSHttpBindingConfig" closeTimeout="00:10:00" openTimeout="00:10:00"
            receiveTimeout="01:00:00" sendTimeout="00:10:00"
            maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
          <readerQuotas maxDepth="64" maxStringContentLength="2147483647" maxArrayLength="2147483647"
              maxBytesPerRead="4096" maxNameTableCharCount="16384" />
          <reliableSession inactivityTimeout="01:00:00" />
          <security mode="Transport">
            <!--<transport clientCredentialType="None" />-->
          </security>
        </binding>
      </wsHttpBinding>
    </bindings>


The binding is asking for Transport (i.e. HTTPS) security, but the site only exposes an http base address - hence the complaint about a missing https base address. Since I don't have an https binding set up, I changed the security mode to None, and then it's all right!
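If you configure the binding in code instead of in the config file, the same choice would look roughly like this. Just a sketch: use Transport only when the site really exposes an https base address, otherwise None.

using System.ServiceModel;

// Sketch: picking the security mode for a BasicHttpBinding programmatically.
// Transport security needs an https base address; None works over plain http.
BasicHttpBinding binding = new BasicHttpBinding();
binding.Security.Mode = BasicHttpSecurityMode.None;         // the fix described above
// binding.Security.Mode = BasicHttpSecurityMode.Transport; // only if an https binding exists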



HTTP 400 Bad Request
HTTP 404 File Not Found

Anybody who has done some web development will surely have come across these errors. They are arguably the most popular error codes in the HTTP 4xx family ;). Let's take a deeper look at the other common errors and their possible causes. Well, there are countless resources on the web about HTTP error codes, and I need not add one more on top of them. Let me explain my reason, then: for quite some time I had been getting complaints that a consumer of one of my web services was receiving a 403 error. So I thought, why not group all these errors in my blog, for reference?

The 4xx family of errors are client errors. Wikipedia puts it like this: "The 4xx class of status code is intended for cases in which the client seems to have erred. Except when responding to a HEAD request, the server should include an entity containing an explanation of the error situation, and whether it is a temporary or permanent condition. These status codes are applicable to any request method. User agents should display any included entity to the user."

Now let's look at each of these error codes.

400 Bad Request
The request cannot be fulfilled due to bad syntax.
401 Unauthorized
Similar to 403 Forbidden, but specifically for use when authentication is possible but has failed or not yet been provided. The response must include a WWW-Authenticate header field containing a challenge applicable to the requested resource. See Basic access authentication and Digest access authentication.
402 Payment Required
Reserved for future use. The original intention was that this code might be used as part of some form of digital cash or micropayment scheme, but that has not happened, and this code is not usually used. As an example of its use, however, Apple's MobileMe service generates a 402 error ("httpStatusCode:402" in the Mac OS X Console log) if the MobileMe account is delinquent.
403 Forbidden
The request was a legal request, but the server is refusing to respond to it. Unlike a 401 Unauthorized response, authenticating will make no difference.
404 Not Found
The requested resource could not be found but may be available again in the future. Subsequent requests by the client are permissible.


I don't want to copy-paste all the error codes here. But let me look deeper into 403.

Below are the sub statuses of 403.
  • 403.1 - Execute access forbidden.
  • 403.2 - Read access forbidden.
  • 403.3 - Write access forbidden.
  • 403.4 - SSL required.
  • 403.5 - SSL 128 required.
  • 403.6 - IP address rejected.
  • 403.7 - Client certificate required.
  • 403.8 - Site access denied.
  • 403.9 - Too many users.
  • 403.10 - Invalid configuration.
  • 403.11 - Password change.
  • 403.12 - Mapper denied access.
  • 403.13 - Client certificate revoked.
  • 403.14 - Directory listing denied.
  • 403.15 - Client Access Licenses exceeded.
  • 403.16 - Client certificate is untrusted or invalid.
  • 403.17 - Client certificate has expired or is not yet valid.
  • 403.18 - Cannot execute request from that application pool.

The most common reason for a 403 error is directory browsing being disabled on the web server. You can see the sub-status 403.14, which says the directory listing is denied. This means you are requesting a directory path rather than an actual resource (you can check whether your URL ends with a slash), so you need to find the proper path to hit.
Another case I ran into was 403.4 - SSL required. As the error suggests, this is due to the SSL channel not being set up correctly; installing the correct certificate and accessing the service over https should resolve it.

So,
I got an HTTP 403.14 error when I tried to access my web service. Now you know that 403.14 Forbidden is due to the wrong path being used. What I was doing was just right-clicking the site in IIS Manager and clicking Browse. Since this is a WCF service that was not developed by me, I first looked for the .svc file. No, it's not there! Then I searched for ServiceHost. Yes, I found the service host configuration in the app.config. And there it is:


      <serviceHostingEnvironment multipleSiteBindingsEnabled="true">
        <serviceActivations>
          <add relativeAddress="MyService.svc"
               service="BlahBlahBlah.Service" />
        </serviceActivations>
      </serviceHostingEnvironment>

So I browsed to http://localhost/MyService.svc.

Cool, the 403 error disappeared.

Did my service load correctly? No. I will explain that in another post.



Saturday, February 4, 2012 #

It's been just over three years and two months since I last logged in to this blog. The last post, which I made in December 2008, was itself after a one-year gap :). Well, now let me see whether I can keep this activity going. Lots of learning, lots of cool new stuff. Technology itself has changed quite dramatically over this period with the addition of cool new stuff, and I believe my English has improved a little bit as well :). Microsoft developers finally started realizing the drawbacks of the classic ASP.NET architecture and started drifting towards more web-friendly architectures like ASP.NET MVC. Thanks to the huge fan base of Ruby on Rails.

In this context, to start off this new beginning, I will quote an interesting remark I found from Mr. ASP.NET himself, Scott Guthrie. Who better to quote when I want to discuss MVC?


"Some guidance I occasionally give people on my team when working and communicating with others:

1. You will rarely win a debate with someone by telling them that they are stupid - no matter how well intentioned or eloquent your explanation of their IQ problems might be.
2. There will always be someone somewhere in the world who is smarter than you - don't always assume that they aren't in the room with you.
3. People you interact with too often forget the praise you give them, and too often remember a past insult - so be judicious in handing them out as they come back to haunt you later.
4. People can and do change their minds - be open to being persuaded in a debate, and neither gloat nor hold it against someone else if they also change their minds."

Great points!!!  Very true as well.


Sunday, December 7, 2008 #

God... it's been one year since I started neglecting this blog. I must post something. Is there any point in waiting to make a blog entry until you really invent something new? Then I might never post at all.

This is a small tip which might be useful for someone. I haven't checked how original this topic is, but it is the thing on which I wasted some of my time on a precious Sunday. I was just trying to load some Excel files into their corresponding SQL Server database.

Data entry? No way!

I was fairly sure that I could use ADO.NET to do the job. I would prefer it over methods like DTS or SSIS anyway; in any case neither was an option, since I didn't have the required installations on my machine. I wrote the code, but when I started executing it I got this error:

The 'Microsoft.Jet. OLEDB.4.0' provider is not registered on the local machine.

Hell, looking at my code I was not able to see a mistake. I have seen many comments on the net about this error, but thought of trying them later. First I need to get the job done, as there are a lot of alternative ways.
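For the record, the kind of ADO.NET code I had written looks roughly like this. It is only a sketch: the file name, sheet name, connection strings and table names are made up, and it of course needs the Jet provider to be registered.

using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

// Read the worksheet into a DataTable via the Jet provider...
string excelConn = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\test.xls;" +
                   "Extended Properties=\"Excel 8.0;HDR=Yes\"";
DataTable dt = new DataTable();
using (OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM [test$]", excelConn))
{
    da.Fill(dt);
}

// ...and then bulk copy it into the destination table.
using (SqlBulkCopy bulk = new SqlBulkCopy("Data Source=.;Initial Catalog=mydb;Integrated Security=SSPI"))
{
    bulk.DestinationTableName = "dbo.test";
    bulk.WriteToServer(dt);
}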

Another option I had in mind was using a linked server. It worked like a charm.

I did not try using Enterprise Manager and all that. We can do the job simply with a query like this:

Insert into mydb.dbo.test
SELECT *  FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=test.xls;Extended Properties=Excel 8.0')...[test$]

To execute ad hoc distributed queries, you may have to configure your SQL Server accordingly.

The following snippet will do.

EXEC sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
EXEC sp_configure 'Ad Hoc Distributed Queries', 1
GO
RECONFIGURE
GO

One thing to note here: the Excel file should be on the server machine. Otherwise you will get an error like this.

OLE DB provider 'Microsoft.Jet.OLEDB.4.0' reported an error. The provider did not give any information about the error.


Another way of achieving the same result is the OPENROWSET function:

SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=C:\test.xls', 'SELECT * FROM [test$]')


Thursday, September 6, 2007 #



"LINQ will be a language intrinsic capability for universal query".

I saw an interesting comment on JavaLobby in response to the above statement, which appeared in an article regarding LINQ.

It goes like this.

    "LINQ is solving a problem that's not a problem. There's no problem with accessing a DB with SQL, Java objects with Java code, and XML data with a DOM or SAX parser. No need to "unify" these."

There exists some confusion even among .NET programmers about the need for LINQ. We have always been told about the advantages of stored procedures, and LINQ says otherwise. Both arguments have their supporters.

Here I am trying to discuss one feature of LINQ which I find quite useful: querying over datasets.

I have faced situations where I needed to query datasets: join two DataTables and retrieve a result. Since there was no option for that at the time, we moved to other alternatives, which of course proved more tedious. But LINQ handles this wonderfully. We can perform complex join operations in memory itself.

Let's look at why we need this. If we are using an enterprise database such as Oracle or SQL Server, we can run queries directly against the database. But consider the scenarios where we have table information in small DBs like SQLite, flat files, XML files, etc. Performing complex queries against those directly would be out of the question. If we could load these tables into a DataSet and apply queries over the DataSet, life would be very easy, right?

For example,

    var linqSample = from emp in employee.AsEnumerable()          // employee and department are DataTables
                     join dept in department.AsEnumerable()
                       on emp.Field<int>("departmentID") equals dept.Field<int>("departmentID")
                     where emp.Field<int>("age") > 30
                     select new
                     {
                         employeeid = emp.Field<int>("employeeid"),
                         name = emp.Field<string>("employeename"),
                         department = dept.Field<string>("deptname")
                     };


As we know, C# 3.0 introduced the var keyword, so the compiler infers the type of the query result above. (AsEnumerable() and Field<T>() come from LINQ to DataSet, in the System.Data.DataSetExtensions assembly.)


The above query performs an inner join between the two tables, employee and department.
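For completeness, the two DataTables these queries assume can be sketched like this (column names match the ones used in the queries; the data itself is made up):

// using System.Data;
DataTable employee = new DataTable("employee");
employee.Columns.Add("employeeid", typeof(int));
employee.Columns.Add("employeename", typeof(string));
employee.Columns.Add("age", typeof(int));
employee.Columns.Add("departmentID", typeof(int));
employee.Rows.Add(1, "John", 35, 10);
employee.Rows.Add(2, "Mary", 40, 20);

DataTable department = new DataTable("department");
department.Columns.Add("departmentID", typeof(int));
department.Columns.Add("deptname", typeof(string));
department.Rows.Add(10, "Sales");
// departmentID 20 is intentionally missing, so the outer join below has a row with no match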

We can even perform left outer joins by using the DefaultIfEmpty operator, like this:


    var linqSample = from emp in employee.AsEnumerable()
                     join dept in department.AsEnumerable()
                       on emp.Field<int>("departmentID") equals dept.Field<int>("departmentID") into res
                     from dept in res.DefaultIfEmpty()
                     where emp.Field<int>("age") > 30
                     select new
                     {
                         employeeid = emp.Field<int>("employeeid"),
                         name = emp.Field<string>("employeename"),
                         // dept is null when the employee has no matching department
                         department = dept == null ? null : dept.Field<string>("deptname")
                     };

Now you decide whether LINQ is only a kind of syntactic sugar, as some people say.

Friday, May 18, 2007 #


For over a month I was trying to log in to my blog. It all started when the new look of Geeks with Blogs was introduced. I remember logging into the blog and checking some of the new options. After that, it was only after a week that I tried to log in again... but it would not let me in. I was not sure whether I had changed my username and password. I raised a ticket with support and tried sending mails, but no luck. Finally, today, I tried all the permutations and combinations... and got my blog back.

Sunday, January 14, 2007 #

Bulk inserting is not a problem for most developers; lots of examples are out there. Simply doing the bulk insert inside a transaction will do the job. Another common method is to call the DataAdapter.Update() method.

The other day I stumbled a little when I tested the time taken by the bulk delete and bulk update code I had written. Each time, the SQL query with a WHERE condition kills the time. Also, the operation was being done against a client-side database (had it been an enterprise database like Oracle, we could have used tools like sqlldr.exe to load the data). I tried putting all the delete operations in a transaction, but that was not sufficient either. Finally I made up my mind to do all the operations in memory and to perform a bulk insert after truncating the table.

Like,

DataSet dsUpdate = getData();       // data to update
DataSet ds = FetchTableData();      // function to return the current data from the database

// Sometimes I may have to update all the records in the table. So I delete all the records
// that have to be updated, and then insert the new dataset into the table.
DataTable DtChange = Changes(ds.Tables[0], dsUpdate.Tables[0], uniqueColumns);

// After getting the difference, you can add the extra records that have to be inserted to
// DtChange, and then perform the bulk insert operation.


public DataTable Changes(DataTable dt1, DataTable dt2, string[] uniColumns)
{
    DataTable resultTable = new DataTable();

    // Work on copies inside a temporary DataSet so the two tables can be related.
    DataSet ds = new DataSet();
    ds.Tables.AddRange(new DataTable[] { dt1.Copy(), dt2.Copy() });

    // Build the key columns for both tables from the unique column names.
    int keyLen = uniColumns.Length;
    DataColumn[] dt1Cols = new DataColumn[keyLen];
    DataColumn[] dt2Cols = new DataColumn[keyLen];
    for (int i = 0; i < keyLen; i++)
    {
        dt1Cols[i] = ds.Tables[0].Columns[uniColumns[i]];
        dt2Cols[i] = ds.Tables[1].Columns[uniColumns[i]];
    }

    // Relate the two tables on the key columns, without enforcing constraints.
    DataRelation r = new DataRelation(string.Empty, dt1Cols, dt2Cols, false);
    ds.Relations.Add(r);

    // The result table gets the same schema as dt1.
    for (int i = 0; i < dt1.Columns.Count; i++)
    {
        resultTable.Columns.Add(dt1.Columns[i].ColumnName, dt1.Columns[i].DataType);
    }

    // Rows of dt1 that have no child row in dt2 are the difference.
    foreach (DataRow row in ds.Tables[0].Rows)
    {
        DataRow[] childRows = row.GetChildRows(r);
        if (childRows == null || childRows.Length == 0)
            resultTable.ImportRow(row);
    }

    return resultTable;
}


I used SQLite as the client-side database, which is a very fast, zero-configuration database. The bulk insert can easily be done if you put all the records in a transaction. It took hardly a minute for me to insert 50,000 records in a single transaction. When I take away the transaction, it takes nearly 15 minutes!!!
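A minimal sketch of what that looks like with the System.Data.SQLite provider (assumed here; the database file, table and column names are made up):

using System.Data.SQLite;

using (SQLiteConnection conn = new SQLiteConnection("Data Source=test.db"))
{
    conn.Open();
    using (SQLiteTransaction tx = conn.BeginTransaction())
    using (SQLiteCommand cmd = new SQLiteCommand("INSERT INTO test (value) VALUES (@value)", conn, tx))
    {
        cmd.Parameters.Add("@value", System.Data.DbType.String);
        for (int i = 0; i < 50000; i++)
        {
            cmd.Parameters["@value"].Value = "row " + i;
            cmd.ExecuteNonQuery();   // all 50,000 inserts ride on the same transaction
        }
        tx.Commit();                 // without the transaction, each insert becomes its own commit
    }
}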

 


Friday, October 13, 2006 #

I am writing this from an internet cafe. I don't know what to write. In fact, I have been a regular reader of some of the Geeks with Blogs for quite a long time. Just on a whim I registered, and now... I have to start writing. Blogging is an emotional experience, indeed. I know many of our posts may remain unread. But still you post, because it gives you a kind of energy, of confronting the world.

Why do most programming languages start with a "Hello world" program? I came to the conclusion that each language is your opportunity to call out to the world. Likewise, this blog is my voice to the world. And if the world starts recognizing your voice, it will reply "Hello you". And even if the reply never comes, your journey continues.

A salute to all my seniors, who started the journey earlier.


Friday, October 20, 2006 #

Lots of tips are available on the net for improving .NET performance, and there are some common practices we often follow. As everyone knows, .NET has some inherent tools which help in building high-performance applications. But it would be foolish to rely completely on the .NET framework to take care of the application's performance.

Some tips we often hear are avoiding unnecessary loops, disposing of objects after use, using plain arrays instead of ArrayList in order to reduce the overhead of boxing, and so on. Another tip is to use for loops instead of foreach, as the latter actually creates an enumerator; foreach can be used when enumeration is really required. Even though the .NET framework has a built-in garbage collector, it's advised to dispose of objects after use, or to use using blocks, which call the Dispose method for you.
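A couple of those tips in code form, just as a sketch (the file name is made up):

using System;
using System.Collections;
using System.IO;

// ArrayList boxes every int it stores; a plain int[] does not.
ArrayList boxed = new ArrayList();
boxed.Add(42);                       // 42 gets boxed into an object
int[] unboxed = new int[] { 42 };    // no boxing

// A using block disposes the object for you, even if an exception is thrown.
using (StreamReader reader = new StreamReader("input.txt"))
{
    string line = reader.ReadLine();
}

// A plain indexed for loop, the alternative to foreach when enumeration isn't really needed.
for (int i = 0; i < unboxed.Length; i++)
{
    Console.WriteLine(unboxed[i]);
}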

But most of us don't realize the application is slowing down until we actually face the problem. Performance becomes a big headache when we code applications that involve web services, remoting, or other distributed technologies. Hence, the first step when starting with these technologies should be to read some of the articles regarding performance issues. One of the best resources for this is here:

 http://msdn2.microsoft.com/en-us/library/ms998530.aspx

(Today I took printouts of some of the chapters. It would be useful to keep them as a collection.) Though some of the papers were written in the .NET 1.0 period, they are still useful. After reading all these articles, you will have some confidence to face high-end technologies. Also, technology is changing fast: Microsoft and others are bringing out more and more methods to trap performance issues. If we can't adopt today's technologies, things will be more complicated by tomorrow.


Zeno's paradox is a mathematical paradox. During my school days the story of the tortoise and Achilles haunted me very much. I have heard the story in many different forms (in some versions it's a race between a rabbit and a tortoise, since there is already a famous folk story of a race between them), but the idea was the same. Zeno of Elea lived around 450 B.C., and his contributions to the world are perceived as slightly negative.

His paradoxes confounded mathematicians for centuries, and it wasn't until Cantor's development (in the 1860's and 1870's) of the theory of infinite sets that the paradoxes could be fully resolved.

Since I cannot paraphrase the story without losing its soul, I have copied the story below from mathacademy.com.

The Tortoise challenged Achilles to a race, claiming that he would win as long as Achilles gave him a small head start. Achilles laughed at this, for of course he was a mighty warrior and swift of foot, whereas the Tortoise was heavy and slow.
“How big a head start do you need?” he asked the Tortoise with a smile.
“Ten meters,” the latter replied.
Achilles laughed louder than ever. “You will surely lose, my friend, in that case,” he told the Tortoise, “but let us race, if you wish it.”
“On the contrary,” said the Tortoise, “I will win, and I can prove it to you by a simple argument.”
“Go on then,” Achilles replied, with less confidence than he felt before. He knew he was the superior athlete, but he also knew the Tortoise had the sharper wits, and he had lost many a bewildering argument with him before this.
“Suppose,” began the Tortoise, “that you give me a 10-meter head start. Would you say that you could cover that 10 meters between us very quickly?”
“Very quickly,” Achilles affirmed.
“And in that time, how far should I have gone, do you think?”
“Perhaps a meter – no more,” said Achilles after a moment's thought.
“Very well,” replied the Tortoise, “so now there is a meter between us. And you would catch up that distance very quickly?”
“Very quickly indeed!”
“And yet, in that time I shall have gone a little way farther, so that now you must catch that distance up, yes?”


“Ye-es,” said Achilles slowly.
“And while you are doing so, I shall have gone a little way farther, so that you must then catch up the new distance,” the Tortoise continued smoothly.
Achilles said nothing.
“And so you see, in each moment you must be catching up the distance between us, and yet I – at the same time – will be adding a new distance, however small, for you to catch up again.”
“Indeed, it must be so,” said Achilles wearily.
“And so you can never catch up,” the Tortoise concluded sympathetically.
“You are right, as always,” said Achilles sadly – and conceded the race.

Zeno's Paradox may be rephrased as follows. Suppose I wish to cross the room. First, of course, I must cover half the distance. Then, I must cover half the remaining distance. Then, I must cover half the remaining distance. Then I must cover half the remaining distance . . . and so on forever. The consequence is that I can never get to the other side of the room.

This makes motion impossible.
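Of course, the modern resolution is that the infinitely many catch-up distances add up to something finite. Taking the story's numbers (a 10-meter head start, with Achilles ten times as fast as the Tortoise), the total distance he must cover before drawing level is a geometric series:

10 + 1 + 1/10 + 1/100 + ... = 10 / (1 - 1/10) = 100/9 ≈ 11.11 meters

So the infinitely many steps are completed within a finite distance (and a finite time), and Achilles does overtake the Tortoise after all.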


Wednesday, October 25, 2006 #

Serialization is one of those beautiful techniques which every programmer would love to use. It reduces the complexity of an operation to a great extent. Windows Forms uses serialization internally for the clipboard, which is used for copy and paste, and ASP.NET uses the same for storing data in sessions. One major use of serialization comes when we need to pass objects over the network. I am not digging into the details of the BinaryFormatter, SoapFormatter or the XML serializer available with the .NET framework, as there are many resources on these available on the net.

Serialization may seem very easy, as we are already equipped with all the necessary formatters. The only thing we need to do is get a stream which derives from the abstract class System.IO.Stream, and write some code as shown below. We can return the resultant stream accordingly.

                            BinaryFormatter bi = new BinaryFormatter();
                            MemoryStream mem = new MemoryStream();
                            bi.Serialize(mem, graph);   // graph is the object graph to be serialized

Writing a custom formatter is a really challenging task for any programmer. As of now, I know only of Angelo Scotto's CompactFormatter, which has succeeded to a certain extent in emulating the formatters, and it's also faster than the .NET framework formatters. But it's still in beta. The .NET formatters convert object graphs into binary streams (yeah, that's serialization). Even though Microsoft has assured us about the reliability of its formatters, serializing complex data types such as DataSets is indeed a headache: irrespective of the formatter being used, a DataSet is always converted into XML. I think it's better to deal with this issue and its solutions in a separate post.

Even though BinaryFormatter fits almost all needs, we may think it can still be improved. For example, if I serialize a string of length 60, the returned byte array has a length of about 250. But if I use some other method such as Encoding.ASCII.GetBytes(), it returns only 60 bytes. Of course, there is a difference in terms of functionality.
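A quick sketch of that comparison (the exact byte count varies with the string and the framework version, so treat the numbers as indicative):

using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Text;

string s = new string('x', 60);

MemoryStream ms = new MemoryStream();
new BinaryFormatter().Serialize(ms, s);
byte[] formatted = ms.ToArray();           // includes type and format metadata, so noticeably more than 60 bytes
byte[] raw = Encoding.ASCII.GetBytes(s);   // just the 60 character bytes

Console.WriteLine(formatted.Length);
Console.WriteLine(raw.Length);             // 60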

I am not continuing my blah, blah. For references about serialization, to me, the best resources are:

http://msdn.microsoft.com/msdnmag/issues/02/04/net/

http://msdn.microsoft.com/msdnmag/issues/02/07/net/#edupdate

http://msdn.microsoft.com/msdnmag/issues/02/09/net/

After reading all these articles, you will get a clear-cut picture of serialization.


Saturday, December 23, 2006 #

Transferring a large volume of data over the wire is indeed a challenge for developers. Distributed technology offers many ways. Remoting or web services? That's the commonly asked question. Of course, the technology is in full swing, with names like WCF now in front of us.

But my aim is to transfer a large amount of data over the network, so let's set all these technologies apart. Whatever the technology, I would be willing to choose it, provided it suits my requirements.

My requirements are,

1. It should be able to transfer huge data without any loss, say 100 MB.

2. How can I compromise on speed? So it should be fast. I would be happy if this 100 MB of data reached the other end within a second (a little over-ambitious, isn't it?).

3. It should not eat up all the available bandwidth. There should be a provision to control the bandwidth consumption.

4. There should not be any out-of-memory errors.

5. Binary data should be transferred as binary itself.

So I have web services, remoting, Enterprise Services, and a straight DSN connection on my list. In order to set up a DSN client-server application, I would need a WAN or LAN environment, and each client also needs a separate license, so that option is dropped. DCOM is also dropped, since it's an older technology and it's a shame to use DCOM in this century (just kidding...).

Remoting is obviously faster than web services. But considering the first requirement, I have to cut remoting from my list. Why? Can't remoting be used in a scenario to transfer 100 or 200 MB of data? Yes, of course, but it will give us a lot of pain to implement. It's mentioned in MSDN that remoting may cause a huge load on the server if the data grows beyond about 10 MB, and may throw out-of-memory errors. Well, this happens due to the buffering of binary data on the server: remoting is implemented over a straight TCP channel, so the serialization channel will end up buffering the data even if we send it as chunks. Now it's up to us to prevent this from happening and to implement remoting effectively. That means we have to write a suitable algorithm. A nice problem for practice!!

For the second requirement, I thought passing data in binary format would do. We have to implement a suitable algorithm which effectively serializes complex data types such as DataSets. There are some widely popular utility classes, such as the DataSet surrogates and the ghost serializer; each of them has its own advantages and disadvantages. I feel it's good to write your own algorithm which suits your requirement.

There are many tools available for data compression. The most popular is of course SharpZipLib, but considering speed alone I prefer the LZO compressor, which outperforms SharpZipLib. LZO is written in C++, so it is not managed code, whereas SharpZipLib is written in C#.

Now, taking the fifth requirement, we know that passing a byte array straight into a web service channel will cause it to be encoded as base64Binary. This would definitely affect performance. There are mainly two methods on the scene: one is DIME and the other is MTOM. DIME is not obsolete, as a lot of development still happens on .NET 1.1. MTOM is very simple to implement, but powerful too: we can return byte[] as it is, and we only need to modify some configuration settings, which are very well specified in the WSE 3.0 documentation. In DIME we need to attach the binary data to the request SOAP context. Since the attached data travels outside the SOAP packet, it's not fully secured. Now, that wouldn't be a major issue for many of us, as there may be sufficient security measures in the environment.

We can also implement a chunking algorithm which splits the binary data into several chunks. As you know, the default request size in machine.config is 4 MB; we can modify this value according to our requirement, considering the network capacity.
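A rough sketch of such a chunking helper (just the slicing part; how each chunk is sent and reassembled on the other side is up to the implementation):

using System;
using System.Collections.Generic;

// Split a byte array into chunks of at most chunkSize bytes.
static List<byte[]> SplitIntoChunks(byte[] data, int chunkSize)
{
    List<byte[]> chunks = new List<byte[]>();
    for (int offset = 0; offset < data.Length; offset += chunkSize)
    {
        int length = Math.Min(chunkSize, data.Length - offset);
        byte[] chunk = new byte[length];
        Buffer.BlockCopy(data, offset, chunk, 0, length);
        chunks.Add(chunk);   // each chunk can then go out in its own call
    }
    return chunks;
}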

Anyway, there are lots of developments happening in this field, which often confuses a developer. Lots of new technologies are coming up at high speed. In a few months, I think, we will be able to browse all TV programmes. Wouldn't it be funny, minimizing BBC and maximizing Fox TV? (It has started happening, though.)


Friday, December 29, 2006 #

When I moved my web service to a new web server, I started getting a "503 Bad Gateway" error. Server configuration is one thing that we need to take care of: each server may behave differently unless it has been configured to fit our requirements. Additionally, firewalls and the like may already have been set up on the server.
I just added the following line in the class that calls the web service, and it started working.

ws.Proxy=new WebProxy(ws.Url);
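In context it looks something like this (a sketch; MyService and GetData are made-up names standing in for the generated proxy class and one of its web methods):

using System.Net;

MyService ws = new MyService();      // proxy class generated from the WSDL (hypothetical name)
ws.Proxy = new WebProxy(ws.Url);     // route the call through a proxy pointed at the service's own URL
object result = ws.GetData();        // hypothetical web method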

Saturday, March 3, 2007 #

I suddenly started getting this error after the deployment of our project. I hadn't faced this type of problem in the development environment. The project was done in .NET 1.1. I found in some forums that this can occur due to a stack overflow, but I couldn't yet figure out exactly where the stack overflow happens. I hope it can be sorted out.

Edit: In our case it was due to some badly done multi-threading.

Wednesday, March 7, 2007 #

As there is a clear overload for sorting a DataTable - DataTable.Select(filter expression, sort expression) - I don't understand why some developers go for a DataView only for sorting purposes. As we know, in many cases a DataView would not suit all the requirements that can be met with a DataTable or DataSet. And above all, even though we give a sort expression to a DataView, if we retrieve the DataTable back from the DataView, the sorting is gone.

Recently I was confronted with this issue when implementing the sorting feature of a nested DataGrid in an ASP.NET application. A DataView cannot be given as the data source, since we cannot have a child table in a DataView. Immediately I switched back to the old code, i.e. DataTable.Select(filter, sort expression).
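For reference, that overload in use (a sketch; the table and column names are made up):

using System.Data;

DataTable orders = new DataTable("orders");
orders.Columns.Add("customer", typeof(string));
orders.Columns.Add("amount", typeof(int));
orders.Rows.Add("A", 300);
orders.Rows.Add("B", 150);
orders.Rows.Add("A", 50);

// Filter and sort in one call, without creating a DataView.
DataRow[] rows = orders.Select("customer = 'A'", "amount DESC");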