April 4, 2014
In IIS, there is a setting to control the recycling conditions for Application Pools (AppPools). So, what is an AppPool? When a website runs within IIS, it runs within a worker process (w3wp.exe). Running everything in a single worker process puts a high load on that process. Instead, IIS lets you run multiple worker processes using AppPools. A single AppPool can run one or more web applications.
Now, an AppPool has a recycle option which basically restarts the process. When the process recycles, a new process is started simultaneously so that no HTTP requests are missed. However, recycling discards the data that applications store in memory, including cache data, sessions and static variables. This is both good and bad. It's good because any outages caused by memory leaks will not occur, but it's bad because there will be a performance hit once the new process starts.
To make the above situation worse, the default setting in IIS recycles the AppPool every 1740 minutes, that is, every 29 hours. So with the default setting, it will recycle today at midnight, tomorrow at 5am, the day after tomorrow at 10am and so on. To minimise the performance issue, we should change the setting to recycle at a fixed time when usage on the server is low. The performance hit will still be there, but it will impact far fewer people this way.
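As a sketch of how to make that change from the command line (the AppPool name "MyAppPool" is an assumption), appcmd can disable the 1740-minute periodic restart and add a fixed recycle time instead, e.g. 3am:

```shell
REM Disable the interval-based (1740-minute) recycle
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /recycling.periodicRestart.time:00:00:00

REM Recycle at a fixed low-usage time instead (3am here)
%windir%\system32\inetsrv\appcmd set apppool "MyAppPool" /+recycling.periodicRestart.schedule.[value='03:00:00']
```

The same settings are available in IIS Manager under the AppPool's Recycling options.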
February 15, 2014
How many web developers have heard this before: the site runs fine on localhost and in the development environment but is slow in production? I am sure many of us have had this issue in the past and have worked it out after a few trials and errors. The following are a few components to consider when solving the issue.
Scenario 1: Many users have issues with many pages
This is a big problem as it means the site is not usable. I would start by looking into server usage - CPU and memory. If the site is receiving lots of requests, this could cause the CPU and memory usage to spike. Check the server specs - how much CPU and memory it has. One of the web servers I worked on had 2 GB of RAM when my laptop had 4 GB. A server like that definitely needs to be upgraded with better specs.
This would ease the problem but may not solve the issue entirely. If you are receiving far more requests than the server can handle, you need to think about load balancing across multiple servers. Windows Azure can help you do that - http://www.windowsazure.com/en-us/documentation/articles/load-balance-virtual-machines/ .
Also, if the site is database driven, check whether the database is causing the issue. There might be lots of database calls in individual pages, or maybe the master page has a few database calls which slow the whole site down. You might also be logging every request to the database, which hits the database too many times. Monitor your database activity when a page is requested and compare.
Another way to check whether the database is slowing down the site is to create HTML pages with the same content and note the difference in loading time between the two pages.
If you conclude that it is a database-related issue, consider caching your content where possible, and only grab the data that is needed instead of a random set of data.
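As an illustrative sketch (not from the original post; the function, key and query names are made up), a simple time-based cache keeps repeated page loads from hitting the database on every request:

```python
import time

# Hypothetical example: hold a query result for a short period so that
# repeated page loads reuse it instead of querying the database again.
_cache = {}

def cached_query(key, fetch, ttl_seconds=60):
    """Return a cached result for `key`, calling `fetch` only when the
    cached copy is missing or older than `ttl_seconds`."""
    now = time.time()
    entry = _cache.get(key)
    if entry is not None and now - entry[0] < ttl_seconds:
        return entry[1]  # still fresh, no database call
    value = fetch()  # e.g. run "SELECT TOP 10 Title, Url FROM Articles"
    _cache[key] = (now, value)
    return value
```

In ASP.NET the built-in System.Web.Caching.Cache serves the same purpose; the point is the same either way - fetch only the columns and rows the page actually needs, then reuse the result.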
If it is a database-related issue but the application is not really doing any strange queries, it could be that the database server is overloaded. In many cases, the same database server hosts multiple database instances, and chances are another instance is taking all the resources on the server. In this situation, you need to review your hosting arrangement.
To reduce bandwidth usage, you can cache content on the client as necessary. For example, scripts, CSS and images can be cached in the client browser, though there is no benefit for first-time visitors. We can cache content by using the cache control setting for static content in IIS7, or the system.webServer section in the web.config file.
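For instance, a web.config sketch for IIS7 that asks browsers to cache static files for seven days (the duration is an arbitrary choice, not a recommendation from the original post):

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Cache static files on the client for 7 days -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```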
If the HTML file served to the client is large, then HTML minification is another component to look into. You should also remove unnecessary spaces and new lines from the controls where possible. For example, if you have the following kind of snippet in a ListView,
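As a hypothetical illustration (the tag and field names are assumptions, not the original markup), an ItemTemplate written with each part on its own line:

```aspx
<ItemTemplate>
    <li>
        <%# Eval("Title") %>
    </li>
</ItemTemplate>
```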
it will generate 3 new lines for every record returned. So, if you generated 100 records, this would result in 300 extra lines of HTML. Instead of the above, use:
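A hedged illustration of the compacted form (again, the tag and field names are assumptions) - the same template with the whitespace stripped out:

```aspx
<ItemTemplate><li><%# Eval("Title") %></li></ItemTemplate>
```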
Also, put the CSS files at the top within the head section and the scripts at the bottom of the page, before the closing body tag. Putting CSS at the top allows the page to render progressively, and putting scripts at the bottom avoids blocking other content from downloading.
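A minimal page skeleton illustrating this layout (the file names are made up):

```html
<!DOCTYPE html>
<html>
<head>
    <title>Example</title>
    <!-- CSS at the top: the page can render progressively -->
    <link rel="stylesheet" href="site.css">
</head>
<body>
    <p>Page content goes here.</p>
    <!-- Scripts at the bottom: they do not block the content above -->
    <script src="site.js"></script>
</body>
</html>
```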
Static sub-domain for static resources:
A few years back, while working on a site, it was recommended to have all URLs in lowercase and served from one domain for SEO reasons. In my situation, the site was configured to serve from both example.com and www.example.com, and the same link was created in many different ways, like example.com/abc , example.com/Abc , www.example.com/abc . Even though these refer to the same page, it's bad practice to have multiple URLs for the same page. The site was also hosted on IIS 6, and to overcome this I used the ISAPI rewrite tool to redirect pages to one URL format. Though it solved the URL formatting issue, the extra redirects slowed down the pages.
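On IIS 7 and later, the same canonical redirect can be done with the URL Rewrite module instead of an ISAPI tool. A sketch of a web.config rule, assuming the example.com host from above:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Permanently redirect example.com/... to www.example.com/... -->
      <rule name="Canonical host" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^example\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Note that a redirect still costs an extra round trip, so generating the canonical form in your links in the first place remains the better option.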
Scenario 2: One user having issues with speed of the site
If only one of your users has the issue, you can be more or less certain that the problem is at the client end. To start with, check where the user is located. For example, you might be hosting the site in the US whereas the user is visiting it from Australia. The site will certainly be slower due to network latency. You can use the site http://www.webpagetest.org/ to get some idea of how the site performs from various geographical locations.
January 11, 2014
A few days back, while I was at work, I found my computer running slow. I quickly checked Task Manager and found that I was running out of memory. That was a bit strange as I have 8GB of memory. Anyway, what I found is that there were a lot of processes running from Google Chrome that were killing it. Basically, each tab is a separate process, and there is no way to see the total memory usage in Task Manager. So, I added up the memory from each of the processes in the calculator. But that's silly!
May 26, 2012
With the images, I saved them at the size needed rather than scaling down. This reduces the file size of the images. Also, I saved the images at lower quality. Depending on the image, reducing the quality may not reduce the visible quality, yet it can greatly decrease the file size. With some images, I tried converting between PNG and JPG formats. I noticed that with some images, file sizes are greatly reduced when converted from PNG to JPG. This is great as it reduces the amount of data that needs to be downloaded to load the page.
On the site I worked on, I reduced the total page size from around 2MB to just under 1MB just by compressing the images, which greatly improved the performance of the site.
March 20, 2012
The Referer header basically contains the URL of the page that referred to this resource. If the URL can be shortened, then the size of the Referer, and therefore of the header, can be minimised. This really depends on priorities - for example, between www.example.com/a.html and www.example.com/australia.html, the first URL is better for reducing the header size, but the latter is much better for SEO. So, one would really need to check whether it is worth having short URLs over longer ones.
Also, when cookies are set, check that the path for the cookie is correctly set. For example, if cookies are needed only for one application - www.example.com/abc - then the cookie should be set only for www.example.com/abc and not for the whole site - www.example.com/ . This also results in less cookie data being sent back to the server with each request.
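A quick Python sketch (the cookie name and value are made up) showing how the Path attribute scopes a cookie to one application:

```python
from http.cookies import SimpleCookie

# Scope the cookie to /abc so the browser only sends it back
# for requests under that path, not for the whole site.
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["path"] = "/abc"

print(cookie.output())  # e.g. Set-Cookie: session=abc123; Path=/abc
```

In ASP.NET the same thing is done by setting the Path property on the HttpCookie before adding it to the response.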
March 18, 2012
Image maps can be used to combine a few images into one. Converting multiple images to an image map may or may not be possible depending on the requirements. Image maps do not necessarily reduce the size that needs to be downloaded, but they do decrease the number of HTTP requests.
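As a hypothetical illustration (the file name, coordinates and link targets are assumptions), two adjacent buttons drawn in a single image can each be made clickable with an image map:

```html
<!-- One 200x100 image replaces two separate button images -->
<img src="nav.png" usemap="#navmap" width="200" height="100" alt="Navigation">
<map name="navmap">
    <area shape="rect" coords="0,0,100,100" href="/home" alt="Home">
    <area shape="rect" coords="100,0,200,100" href="/about" alt="About">
</map>
```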
Multiple stylesheets can be combined into one. This may or may not be suitable depending on the site's design. If the same CSS is used on most of the pages, it's worth combining the CSS into one file. This again reduces the number of CSS files that need to be downloaded.
The number of image requests can be reduced by combining the background images into a single image and using the "background-image" and "background-position" CSS properties. The "background-position" property selects the segment of the combined image that needs to be displayed.
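A minimal sketch of this sprite technique, assuming a sheet of 16x16 icons named sprites.png (the file name and offsets are assumptions):

```css
/* All icons share one downloaded image */
.icon {
    background-image: url("sprites.png");
    background-repeat: no-repeat;
    width: 16px;
    height: 16px;
    display: inline-block;
}
/* background-position shifts the sheet so the right icon shows */
.icon-home { background-position: 0 0; }
.icon-mail { background-position: -16px 0; }
```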
Every now and then, I notice web pages where images are scaled down for display using the height and width attributes. When larger images are displayed smaller this way, the browser still downloads a larger image than is necessary. These images should instead be resized to the displayed dimensions using tools like Photoshop or Google Picasa.