Monday, August 11, 2014

Unity DontDestroyOnLoad causing scenes to stay open


My Unity project has a class (ClientSettings) where most of the game state & management properties are stored.  The class derives from MonoBehaviour and also carries a few utility functions.  However, this object was getting recreated between every scene and I was losing all sorts of useful data.  I learned that with DontDestroyOnLoad I can persist this object between scenes.  Super.


The problem with adding DontDestroyOnLoad to my "ClientSettings" was that suddenly my previous scene would stay alive and continue to execute its update routines.  An important part of the documentation helps shed light on the issue:

"If the object is a component or game object then its entire transform hierarchy will not be destroyed either."

My ClientSettings script was attached to the Main Camera in my first scene.  Because of this, the Main Camera was part of the component's transform hierarchy, and therefore it was also not destroyed when switching scenes.  The first scene's Main Camera Update routine kept executing after the second scene was running - causing some very nasty bugs.

Suddenly I wasn't sure how I should be creating a persistent entity - so I created a new sandbox project and tested different approaches until I found one that works:

In the main scene:

  • Create an empty Game Object named "GameManager" and attach the ClientSettings script to it (a sketch of what that script might look like follows below).
  • Set any properties on the ClientSettings script as appropriate.
  • Create a prefab from the GameManager.
  • Remove the GameManager game object from the main scene.
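For reference, here is a minimal sketch of what a ClientSettings script along these lines might look like.  The singleton guard and the example fields are my own assumptions, not code from the original project:

<code>
using UnityEngine;

public class ClientSettings : MonoBehaviour {

	// Example game state carried between scenes (field names are hypothetical)
	public string PlayerName;
	public int Score;

	private static ClientSettings _instance;

	void Awake () {
		// If a ClientSettings instance already survived from an earlier scene,
		// discard this duplicate so only one copy ever exists.
		if (_instance != null && _instance != this) {
			Destroy(gameObject);
			return;
		}

		_instance = this;

		// Keep this game object (and its entire transform hierarchy) alive across scene loads
		DontDestroyOnLoad(gameObject);
	}
}
</code>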

On the Main Camera, I created a script, MainScript.  This is my primary script for the main scene.

<code>
public GameObject[] prefabs;

private ClientSettings _clientSettings;

// Use this for initialization
void Start () {
	// Instantiate the GameManager prefab, which carries the persistent ClientSettings
	GameObject res = (GameObject)Instantiate(prefabs[0]);

	// Keep a handle on the settings component for the rest of this script
	_clientSettings = res.GetComponent<ClientSettings>();
}
</code>

Now go back out to scene view, and add the new GameManager prefab to the prefabs collection of MainScript.

When the main scene loads, the GameManager is set up, but it is not part of the main scene's hierarchy, so the two are no longer tied together.

Now in our second scene, we have a script - SecondScript - and we can get a reference to the ClientSettings we created in the previous scene like so:

<code>
private ClientSettings _clientSettings;

// Use this for initialization
void Start () {
	// Find the ClientSettings instance that survived from the previous scene
	_clientSettings = FindObjectOfType<ClientSettings> ();
}
</code>

And the scenes can start and finish without creating strange long-running scene side effects.

Thursday, January 23, 2014

Friendly URLs with Visual Studio 2013


A few months ago I set up a Web Forms project for a new client using Visual Studio 2013.  One of the default NuGet packages - ASPNet.FriendlyURLs - was very exciting to see.

As I've been really enjoying MVC's routing tools, it's very nice to see something similar available for Web Forms.  However, once we deployed to the dev server, things went wrong.

Apparently, even on Windows Server 2008, FriendlyURLs relies on the URL Rewrite extension being installed in IIS (more info here) - which really surprised me; I expected having the right version of .NET installed to be sufficient.  Instead of routing the friendly URL addresses, the web server was returning 500 errors.

Rather than add an additional dependency to the project, I removed the FriendlyURLs NuGet package.  Things then worked fine on the web server, but I noticed something strange on my local machine even after removing FriendlyURLs from the solution.

Internet Explorer on my local machine kept sending me to Default instead of Default.aspx, and to Settings instead of Settings.aspx.  Chrome didn't seem to have a problem, and any pages I added afterwards didn't have the redirect issue either.  Somehow Administration.aspx and Import.aspx weren't trying to redirect to their non-existent friendly URL versions.

Digging deeper, I wanted to see if somehow IIS Express was holding onto FriendlyURLs and doing something with the requests coming from IE - but watching the logs, the requests arriving at IIS Express already had the .aspx extension dropped.

This told me that the problem had to be coming from inside Internet Explorer: when IE was first redirected by the FriendlyURLs component several months ago, it kept that redirect in a super secret cache and wouldn't let go.  Now every time I tried to navigate to the unfriendly version (default.aspx), IE said "I know where you really want to go" and redirected me to the friendly URL (default) - which is no longer handled and now returns a 404 error.
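This behavior lines up with how the package registers its routes.  The Visual Studio 2013 Web Forms template typically wires FriendlyUrls up roughly like the sketch below (reconstructed from memory, not taken from this project), and the Permanent redirect mode produces exactly the kind of HTTP 301 response a browser is allowed to cache aggressively:

<code>
using System.Web.Routing;
using Microsoft.AspNet.FriendlyUrls;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        var settings = new FriendlyUrlSettings();

        // Permanent (301) redirects from /Default.aspx to /Default are what IE
        // appears to have cached long after the package was removed.
        settings.AutoRedirectMode = RedirectMode.Permanent;
        routes.EnableFriendlyUrls(settings);
    }
}
</code>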

Apparently somebody at PennState had this problem not too long ago and wrote up a little internal article describing it.  In case that article ever goes away, here is the gist:

A "redirect" is feature that allows a web server to tell a browser "You asked for page X, now I'm going to send you to page Y." This is commonly used to create short URLs, similar to the service bit.ly provides. Unfortunately, IE9 permanently caches these redirects and they cannot be purged by clearing the browser cache.

(Since I'm running IE 11 and seeing this problem, this functionality has been around for quite a while...)

And then their solution:

Solution

There appears to be no way to purge the redirect from the browser cache by using the standard cache purging functionality in the Internet Options configuration screen.

One method that appears to work:

  • Clear your browser history and cache. These instructions are for IE8, but will work in IE9 as well.
  • Go to the Tools menu and enable InPrivate Browsing (anonymous browsing) mode.  This will open a new window.
  • Paste the original URL of the page that incorrectly redirects into the URL bar of the new window
  • Verify this redirects to the correct page.
  • Close and restart Internet Explorer.

Definitely a more complex solution than I would ever have expected for a problem that seems like it should be pretty common...  At least I didn't have to use my black bag of voodoo chicken bones...  I'll save that for next time.

Thursday, September 19, 2013

Options for Azure Blob Storage


A client needs a tool where internet users can upload CAD files through a web portal, and once approved those CAD files need to be accessible to users inside the company.  Azure blob storage is a great place to store these large files - by default everything is available via HTTP - but we wanted a bit of security around the uploaded CAD files, so we didn't want to expose public URLs.  There was also a wide range of users needing access to the CAD files, and some of them needed to be able to update and modify the files.  Temporary URLs can be generated, but that creates the extra overhead of needing to know ahead of time when somebody will want a URL.  There are tools like Cloudberry Azure Explorer, and utilities designed to expose Azure blob storage as a mounted drive, but those would require installing custom software on every machine that might need access to the CAD files.


Better yet: expose the Azure Blob storage data through an FTP portal.  

In a matter of minutes, I was able to download FTP 2 Azure and set up a stand-alone Azure web role using the provided Azure package.  The provided config file lets you define the storage connection string, create custom users mapped directly to Azure blob containers, and assign a custom password to each user.

Finally, I created a web portal with my favorite jQuery file upload widget, allowing outside internet users who do not have the FTP credentials to submit their CAD files to the client's service.  Now anybody with the appropriate FTP credentials can open and modify the uploaded files on the other side without needing to know the details of the Azure blob storage backend, or be concerned with storage keys.


10/3/2013 Update:

After using the FTP 2 Azure package for a few days, the FTP site suddenly "stopped working".  I was able to authenticate against the FTP server, but after login I was unable to get a directory listing.  I downloaded a few different FTP clients to see if I could get more information about what was happening.

Invalid credentials would not authenticate and valid credentials would, so the FTP site appeared to be running...  I removed all of the files from the FTP site and suddenly it started working again.

One by one I brought the files back in until I noticed: files with a space in the name cause FTP 2 Azure to have problems.  The package is open source, so maybe the next step is to find out why spaces are actually causing trouble...  Either way, I modified the upload process so that spaces are replaced with underscores (see the sketch below), and so far the problem of my FTP site not being able to return a directory listing has not come back.
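The workaround itself is only a one-line change in the upload path.  A minimal sketch, assuming the 2013-era Microsoft.WindowsAzure.Storage client library and hypothetical class and container names of my own choosing:

<code>
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class CadUploadService
{
    private readonly CloudBlobContainer _container;

    public CadUploadService(string connectionString, string containerName)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();
        _container = client.GetContainerReference(containerName);
    }

    public void Upload(string originalFileName, Stream content)
    {
        // Replace spaces with underscores so FTP 2 Azure can still list the blob.
        string safeName = originalFileName.Replace(' ', '_');

        CloudBlockBlob blob = _container.GetBlockBlobReference(safeName);
        blob.UploadFromStream(content);
    }
}
</code>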

Sunday, December 16, 2012

SSAS with a slowly changing dimension


At a client, they are interested in trying out some data cubes, so I spun up an instance of SSAS and gave them a demo of some of the awesome opportunities that cubes unlock – but found an interesting puzzle trying to cube-ify some of their data.

The client sells subscriptions for their services, and their subscriptions can be in different statuses at different times.  Whether or not they knew it, their subscriptions were a slowly changing dimension.  And various departments wanted to know how many subscriptions were in a given status at a given time.

With SQL (or MDX), it is fairly straightforward to write queries that return the rows falling between a start and end date, but with a self-service BI style pivot table in Excel, that may not be so easy – so my first goal was to change from date ranges to effective dates.

I created a view of the data where one row for a one-year subscription becomes 365 rows – one for every day of the year.  I then created a measure that does a distinct count of Subscription IDs.  This means that in Excel you can filter the data on a given day and know which subscriptions apply to that day.

However, with 200,000 active subscriptions on any given day, and 365 days in a year, that means 73 million rows of data per year.  Sadly – the distinct count on all that data was less than speedy.  At that point, the suggestion was to change from a measure to a calculation.

Instead of a distinct count of the Subscription ID in the subscription fact table, a Subscription Dimension was created and a calculated measure added which does a distinct count on Subscription ID.

Where the distinct count was scanning through millions of rows in the fact table for an answer, the calculation can focus on only the thousands of rows that match the given query.  The run time on more complex queries now shifted from minutes to seconds – not ideal, but definitely tolerable.

Monday, August 6, 2012

LINQ to SQL & Entity Framework slow relationships


Lately I've inherited some code written by another company that uses quite a bit of LINQ to SQL and quite a bit of lazy loading.  I changed many of the worst offenders to eagerly load the relationships I knew I would want, and still felt like the system was slower than I would have expected.

A sample of my code:

DataContext custom = new DataContext();
DataLoadOptions dlo = new DataLoadOptions();
custom.DeferredLoadingEnabled = false;

// Eagerly load each customer's address along with the customer
dlo.LoadWith<Customer>(c => c.CustomerAddress);
custom.LoadOptions = dlo;

custom.Customer.Where(c => c.Active == true).ToList();

After watching the queries in SQL Profiler, I noticed something interesting - and surprising - about how LoadWith works in LINQ to SQL.

Inside one database transaction, SQL Server first executes the filtered query against the Customer table.  Then, for every customer in the result set, the following query executes against CustomerAddress:

SELECT [t0].[AddressID],[t0].[Address1],[t0].[Address2],[t0].[City],[t0].[State],[t0].[Zip]
FROM [dbo].[CustomerAddress] AS [t0]
WHERE [t0].[AddressID] = @p0

In many of the queries the relationship chain is several tables long, which in the worst cases causes the number of related look-ups to grow geometrically.

For many of those queries, I switched to fetching all of the data in a single LINQ to SQL query and then iterating through the result on the server, assigning the related rows to lists myself rather than leaning on the LINQ to Entities style of navigating relationships.  That brought the run-time down to a much more respectable number.  A rough sketch of the pattern follows.
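This is roughly what that looks like, with hypothetical key names and simplified to the single customer/address relationship from the earlier snippet - not the client's actual code:

<code>
// Pull customers and their addresses back in one round trip by joining in the query,
// then stitch the results together on the server instead of issuing per-row lookups.
var rows = (from c in custom.Customer
            where c.Active == true
            join a in custom.CustomerAddress on c.AddressID equals a.AddressID
            select new { Customer = c, Address = a }).ToList();

// One address per customer in this simplified example; deeper relationship chains
// would get their own joins and lists built the same way.
var addressByCustomer = rows.ToDictionary(r => r.Customer.CustomerID, r => r.Address);
</code>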

Sunday, July 15, 2012

Azure Virtual Machine disappeared on me


I had been putting together a solution for a client over the past few weeks using Microsoft Azure, and had been really enjoying their new Virtual Machine functionality.  The other morning I logged in to make some changes to the virtual machine and noticed it was missing!

After browsing the web and asking around, the answer is that once the free Azure trial runs out, rather than charge any money, Microsoft destroys the virtual machine's compute instance.

The VHD is still on the Azure blob storage, but the compute instance is no more.

Well, not entirely no more - When you try to create a new Azure IaaS Virtual Machine - this time with money on the table - Microsoft Azure will inform you that the name is already taken.  

When Microsoft destroyed the machine, little pieces of the virtual machine were left behind in their infrastructure.  Now it is time to pull out Azure PowerShell to clean up the bits still lying around so that you can get the system back to where you started.

Download and install PowerShell and the Azure Commandlets:  https://www.windowsazure.com/en-us/manage/downloads/

When you start up PowerShell, you will need to connect it to your Azure subscription.  To do this you will need your publish settings file (the same one you use to synchronize Azure with Visual Studio) - pull that down from here: https://windows.azure.com/download/publishprofile.aspx and save it somewhere convenient.

Here are some sample Azure Commandlets for PowerShell: http://msdn.microsoft.com/en-us/library/windowsazure/jj152841

But the ones you want are:

# Connect PowerShell to your Azure subscription using the publish settings file
Import-AzurePublishSettingsFile "D:\path\my.pubsettings"

# Delete the phantom Azure virtual machine
Remove-AzureVM -ServiceName "servicename" -Name "myvirtualmachine"

# You should now be able to create a new virtual machine with the same name as the old one.

# You can also list every active virtual machine on your account - strangely enough,
# the one that Microsoft destroyed will likely not be in this list:
$myMachines = Get-AzureVM
$myMachines
Good luck!



Thursday, February 23, 2012

Type initializer for [ClassName] threw an exception


I was helping a coworker with some code changes in a Windows console application when suddenly the entire program refused to load:

The type initializer for [program] threw an exception.

was thrown from the static void Main() routine starting the console app.

 

The console application would not even load.  Looking at the stack trace of the exception, it seemed to point to configuration problems, so my first thought was to look at the app.config.

Everything checked out.  No issues there.

After digging deeper under the covers, I started looking at the class itself.  The class was defined as static, and a half dozen static properties were being assigned in the class, in addition to the static void Main() that was trying to run.

The exception was the result of attempting to initialize the static variables of the static class, but the stack trace pointed at static void Main() because that is technically where the execution context sits at that point in the application lifecycle.  A contrived reproduction follows.
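To illustrate the pattern (this is a made-up reproduction, not the coworker's actual code): a static field initializer that throws - here because a config setting is missing - surfaces as a TypeInitializationException the first time the type is touched, which for a console app is static void Main().

<code>
using System;
using System.Configuration;

static class Program
{
    // Runs inside the type initializer.  If "RequiredSetting" is missing from
    // app.config, AppSettings returns null, Trim() throws, and the runtime wraps
    // the error as "The type initializer for 'Program' threw an exception."
    private static readonly string RequiredSetting =
        ConfigurationManager.AppSettings["RequiredSetting"].Trim();

    static void Main()
    {
        // The stack trace points here, even though Main never actually started running.
        Console.WriteLine(RequiredSetting);
    }
}
</code>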

Tuesday, November 1, 2011

Javascript memory leaks


Last week I pushed a new application up to production, and started hearing complaints of a memory leak.  After running some diagnostics I learned two exciting things:

http://bugs.jqueryui.com/ticket/7666

The current version of jQuery UI (1.8.16) has a memory leak in the DatePicker control - including the DatePicker control as part of the jQuery UI bundle is all it takes to cause memory to be allocated and never returned until the browser is closed.  Every refresh of the page, or every time a new page is loaded, more memory is allocated to the browser process.

Although my application did not utilize the DatePicker control, it came as part of the default jqueryui package and I did not go out of my way to exclude the DatePicker from the jqueryui package.

This reminded me of an important aspect of software design: the more features a system has, the more features it has that can go wrong.  Although today we may believe our system to be safe and secure, new vulnerabilities will undoubtedly be found tomorrow, proving that we weren't so safe after all.  Exposing additional functionality that is not intended to be used is potentially dangerous, and in my case it caused an unnecessary memory leak.

 

 

At the same time I discovered another memory leak in my implementation of http://datatables.net/ that applies to IE 7.

I had a table set up like so:

/* Using aoColumns */
$(document).ready(function() {
    $('#example').dataTable( {
        "aoColumns": [
            { "fnRender": function ( oObj ) {
                return "<div onMouseOver=\"alert('abc');\">" + oObj.aData[0] + "</div>";
            } }
        ]
    } );
} );

(In other words, I added some basic JavaScript functionality at render time to the output HTML of my DataTables.)

I noticed that when I paged forwards and backwards through my DataTable, memory was being leaked within Internet Explorer.  The HTML of the table contains a JavaScript onMouseOver event that was being registered against the DOM, but since the DataTables library did not do the registering, it did not know to unregister the event before loading the next page of data.

Instead of registering the events on the cells themselves, I switched to event delegation (http://www.quirksmode.org/blog/archives/2008/04/delegating_the.html) - a single listener on the table handles the events as they bubble up from its cells, which is a much better strategy than registering an event listener for every row of a table that could potentially be several hundred rows long.

 

Two memory leaks down.  How many more could there possibly be?...

Monday, October 31, 2011

Custom assemblies for Reporting Services


With SSRS we can quickly generate reports that can be exported to multiple formats.  What happens when you want to extend your report with custom code?  Friday November 11th 2011 (this Veteran's Day) I'll be talking about just that in Eden Prairie, MN during: http://sqlsaturday.com/99/eventhome.aspx

Perhaps you need a custom authentication layer, custom access to data requiring .NET code, or you want to extend some of the controls that come out of the box with SSRS.  In this session we will see an example of extending SSRS to use a .NET library as a data provider, allowing us to use a custom .NET business layer for our report.  We will also show an example extension to the SSRS graph control, allowing us to design custom graphs for SSRS.  Custom code is a great alternative for those considering toolkits like Nevron or Dundas to extend their SSRS graphing capabilities.

Sunday, October 23, 2011

SSRS Multi-Data Source DPE (Data Processing Extension)


SSRS is a wonderful tool for quickly retrieving data from many different data sources and presenting it to the user in a format decided at run time.  One area where SSRS often falls short is when the underlying data needs to come from several different sources.  Perhaps we want to retrieve data from the General Ledger, which lives in Oracle, and join it against a list of departments and employees stored in SQL Server, to display everything in one table.  When this happens, we are unable to join them into one dataset without the use of a linked server.

I've put together a sample SSRS DPE project and uploaded it to Code Plex as a starting point for any interested developers to work from.  Depending on the specific needs of the project I'm certain additional development will be required, but my goal is to create a useful starting point for anybody that needs to use SSRS to report data coming from two or more different databases.

http://www.codeproject.com/KB/reporting-services/SSRSMultiDataSourceDPE.aspx

 

 

Copyright © jkrebsbach