
April 4, 2014

When a web page loads, a lot of content and several types of files are transferred from the server to the client - dynamic content from ASP.NET / MVC, and static content like images, CSS and JavaScript. Each of these files adds to the performance cost. To avoid some of this cost, we can cache the static content in the client browser. So, in this post, I will talk about how to add an expires header to static files.
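As a minimal sketch, assuming IIS7+ and a site-level web.config (the 7-day max-age is just an example value), the cache header for static files can be set like this:

  <configuration>
    <system.webServer>
      <staticContent>
        <!-- sends Cache-Control: max-age with every static file; 7.00:00:00 = 7 days -->
        <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
      </staticContent>
    </system.webServer>
  </configuration>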

AppPool Recycling and Performance

In IIS, there is a setting to configure the recycling conditions for Application Pools (AppPools). So, what is an AppPool? When a website runs within IIS, it runs within a worker process (w3wp.exe). Running everything in a single worker process puts a high load on that process. Instead, IIS allows multiple worker processes to be run using AppPools. A single AppPool can run one or more web applications.

Now, an AppPool has a recycle option which basically restarts the process. When the process recycles, a new process is started alongside the old one so that no HTTP requests are missed. However, recycling discards the data that the applications store in memory, including cache data, in-process sessions and static variables. This is both good and bad: good because outages caused by memory leaks are avoided, bad because there is a performance hit while the new process warms up.


To make the above situation worse, the default setting in IIS recycles the AppPool every 1740 minutes, that is, every 29 hours. With the default setting, it will recycle today at midnight, tomorrow at 5am, the day after tomorrow at 10am, and so on. To minimise the performance issue, change the setting to recycle at a fixed time when usage on the server is low. The performance hit will still be there, but it will affect far fewer people this way.
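As a hedged sketch, the fixed-time recycle can be set per AppPool in IIS Manager or in applicationHost.config; the pool name and the 3am time below are assumptions:

  <add name="MyAppPool">
    <recycling>
      <!-- time="00:00:00" disables the default 1740-minute periodic restart -->
      <periodicRestart time="00:00:00">
        <schedule>
          <!-- recycle once a day at 3am, when usage is low -->
          <add value="03:00:00" />
        </schedule>
      </periodicRestart>
    </recycling>
  </add>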

February 15, 2014

How many web developers have heard this before: the site runs fine on localhost and in the development environment but is slow in production? I am sure many of us have had this issue in the past and have worked it out after a few rounds of trial and error. The following are a few components to consider when tracking down the issue.

Scenario 1: Many users have issues with many pages

Server issue:

This is a big problem as it means the site is not usable. I would start by looking into the server usage - CPU and memory. If the site is receiving lots of requests, then this can cause CPU and memory usage to spike. Check the server specs - how much CPU and memory it has. One of the web servers I worked on had 2 GB of RAM while my laptop had 4 GB; a server like that definitely needs to be upgraded to better specs.

Upgrading would ease the problem but may not solve the issue totally. If you are receiving far more requests than the server can handle, you need to think about load balancing across multiple servers. Windows Azure can help you do that - http://www.windowsazure.com/en-us/documentation/articles/load-balance-virtual-machines/ .

Database issue:

Also, if the site is database driven, check whether the database is causing the issue. There might be lots of database calls on individual pages, or the master page might make a few database calls that slow every page down. You might also be logging each request to the database, which hits the database far too often. Monitor your database activity when a page is requested and compare.

Another way to check whether the database is slowing down the site is to create an HTML page with the same content and note the difference in load time between the two pages.

If you conclude that it is a database related issue, consider caching your content where possible and only grab the data that is needed instead of everything. For example,

  SELECT * FROM Customer                             -- bad: returns every row
  SELECT TOP 10 * FROM Customer                      -- good: returns only the rows needed
  SELECT * FROM Customer WHERE criteria IN (1, 2)    -- good: filtered to the rows needed

If it is a database related issue but the application is not running any unusual queries, it could be that the database server itself is overloaded. In many cases, the same database server hosts multiple database instances, and chances are another instance is taking all the resources on the server. In that situation, you need to look into your hosting arrangement.

Bandwidth issue:

The speed of a web page depends on the size of the data that is transferred. Check whether the page has huge video files, or a large number of CSS, JavaScript and image files that take a long time to download. In Google Chrome, you can right-click on a page, inspect an element and look at the Network tab to check the file sizes and the number of files served. To avoid multiple CSS and JavaScript files, you can use ASP.NET bundling and minification - http://www.asp.net/mvc/tutorials/mvc-4/bundling-and-minification - to reduce the number of files to one, which improves the download time (see the sketch below). In addition, CDNs can be used to serve standard files like the jQuery script. With a CDN there is a risk that the CDN server might be down, but the chances are low.
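For example, with bundling enabled, the page references one bundled URL per type instead of many individual files. This sketch assumes bundles named ~/Content/css and ~/bundles/site were registered in a BundleConfig class at application start:

  <%-- each Render call emits a single tag for the whole bundle when debug="false" --%>
  <%: System.Web.Optimization.Styles.Render("~/Content/css") %>
  <%: System.Web.Optimization.Scripts.Render("~/bundles/site") %>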

To reduce the bandwidth used, you can cache content on the client where appropriate. For example, scripts, CSS and images can be cached in the client browser, though this gives no benefit to first-time visitors. Static content can be cached by setting the cache control for static content in IIS7, or via the system.webServer section of the web.config file.


HTML structure:

If the HTML file served to the client is large, then HTML minification is another component to look into. You should also remove unnecessary spaces and new lines from the controls where possible. For example, if you have the following snippet in a ListView,

  <ItemTemplate>
      <li>
          <%#Eval("data") %>
      </li>
  </ItemTemplate>


it will generate three new lines for every record returned. So if you generate 100 records, this will result in 300 extra lines of HTML. Instead of the above, use

  <ItemTemplate><li><%#Eval("data") %></li></ItemTemplate>


Images, CSS and JavaScript should have correct sources specified. With an incorrect or empty source, an unnecessary request is made to the server, wasting bandwidth and server compute.

Also, put the CSS files at the top, within the head section, and scripts at the bottom of the page, just before the closing body tag. Putting CSS at the top allows the page to render progressively, and putting scripts at the bottom avoids blocking the download of other content.
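A minimal sketch of that layout (the file paths are placeholders):

  <html>
  <head>
      <!-- stylesheets first so the page can render progressively -->
      <link rel="stylesheet" href="/content/site.css" />
  </head>
  <body>
      <!-- page content -->
      <!-- scripts last so they do not block the content above -->
      <script src="/scripts/site.js"></script>
  </body>
  </html>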


Static sub-domain for static resources:

Cookies are sent across the network with every request the browser makes. For example, for a page on www.example.com, the browser sends a request for each image, CSS file, JavaScript file and any other static resource, and any cookies saved for the domain are sent each time. To improve this, serve the static resources from a sub-domain like static.example.com and make sure there are no cookies associated with that sub-domain.

JavaScript issue:

Sometimes, pages run slow because there is too much JavaScript. Nowadays, we include a number of scripts by default - analytics and tracking, jQuery, MooTools, Modernizr, social sharing buttons and so on. We also have custom scripts for generating items like menus or enforcing compatibility. Most of these are harmless on their own, but a large number of them will slow down your site. For example, functions run from body onload or window.onload directly delay the page's loading.

Loading JavaScript from a large number of domains also has a negative impact, as the browser needs to perform multiple DNS lookups. Each DNS lookup takes 20-120 milliseconds, so 10 DNS lookups can cost roughly half a second.

Also, make sure the same JavaScript resource is not requested multiple times. This can happen when a page is built from a master page and both the master and the child page reference the script. The page will still work, but it wastes bandwidth and slows the site down a little.


Avoid redirects:

A few years back, while working on a site, it was recommended for SEO reasons to have all URLs in lowercase and served from a single domain. In my situation, the site was configured to serve from both example.com and www.example.com, and the same link was created in many different ways - example.com/abc , example.com/Abc , www.example.com/abc . Even though these all refer to the same page, it is bad practice to have multiple URLs for one page. The site was hosted on IIS 6, so I used the ISAPI_Rewrite tool to redirect pages to one URL format. Though it solved the URL formatting issue, the extra redirects slowed down the pages, so use as few redirects as possible.
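On IIS 7+, the same canonicalisation can be done with the URL Rewrite module. A hedged sketch, assuming www.example.com is the preferred host, is below; note that a request that violates both rules still goes through two redirects, so redirects remain something to minimise:

  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 example.com/* to www.example.com/* -->
        <rule name="Canonical host" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
        </rule>
        <!-- 301 any URL containing an uppercase letter to its lowercase form -->
        <rule name="Lowercase" stopProcessing="true">
          <match url="[A-Z]" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>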


Scenario 2: One user having issues with speed of the site

If only one user out of many is having the issue, you can be more or less certain that the problem is at the client end. To start with, check where the user is located. For example, you might be hosting the site in the US while the user is visiting it from Australia; the site will certainly be slower for them due to network latency. You can use http://www.webpagetest.org/ to get an idea of how the site performs from various geographic locations.

Secondly, it could be that the user has a proxy set up which blocks certain files, sites, port numbers, etc. For example, you might be serving an image or JavaScript file from a different location that is blocked by the user's proxy. Alternatively, you might be serving the site on a port other than 80 which the client's proxy blocks. If that is the case, the user needs to talk to their internal support team.

January 11, 2014

A few days back, while I was at work, I found my computer running slow. I quickly checked Task Manager and found that I was running out of memory. That was a bit strange, as I have 8 GB of memory. What I found was a lot of processes from Google Chrome eating it up. Each tab is a separate process, and there is no way to see the total in Task Manager. So I did something silly and added up the memory from each of the processes in the calculator. But that's silly!

May 26, 2012

Compress images to improve performance

Lately, I have been working on improving the performance of a website. Going through the site, I noticed that there are 30+ images, many of them over 50 KB, a few over 100 KB and a couple over 500 KB. Most of these images are not really doing much, so I tried to compress them using Photoshop and save them using the "Save for web" option.

I saved the images at the size they are displayed rather than scaling them down in the page, which reduces the file size. I also saved the images at a lower quality; depending on the image, reducing the quality may not reduce the visible quality, but it can greatly decrease the file size. With some images, I tried converting between PNG and JPG formats, and noticed that some file sizes are greatly reduced when converted from PNG to JPG. All of this reduces the amount of data that needs to be downloaded to load the page.

On the site I worked on, I reduced the total size from around 2 MB to just under 1 MB simply by compressing the images, which greatly improved the performance of the site.

March 20, 2012

One way to improve the performance of a website is to reduce the size of the HTTP headers. The HTTP headers contain information like Host, User-Agent, Referer, Cookie, Accept-Language, etc. For some request headers, like Accept-Language or User-Agent, it is probably not possible to remove or minimise them, as they are generated automatically. But for others, like Cookie and Referer, it is possible to minimise their size.

The Referer header contains the URL of the page that referred to the resource. If that URL can be shortened, the size of the Referer, and therefore of the header, can be reduced. This really depends on priorities - for example, between www.example.com/a.html and www.example.com/australia.html, the first URL is better for reducing header size but the latter is much better for SEO. So one would really need to check whether it is worth having short URLs over longer ones.

For cookies, smaller is better. Cookies are sent back to the server with every request - including requests for images, CSS and JavaScript. That is, if one URL requests 20 items (images, CSS, JavaScript), the cookies are sent back to the server 20 times. A couple of steps can be taken to partially improve this situation.

Static content like images, CSS and JavaScript can be served from a cookie-less domain. For example, set up a new sub-domain called static.example.com that only serves static content. Cookies will then not be sent with requests for static content, improving performance.
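A minimal sketch, with placeholder file names; this assumes no cookie is set with Domain=.example.com, as such a cookie would still be sent to the sub-domain:

  <link rel="stylesheet" href="http://static.example.com/css/site.css" />
  <script src="http://static.example.com/js/site.js"></script>
  <img src="http://static.example.com/images/logo.png" alt="logo" />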

Also, when cookies are set, check that the path for the cookie is set correctly. For example, if a cookie is needed only for one application - www.example.com/abc - then it should be set only for www.example.com/abc and not for the whole site, www.example.com/ . This also results in fewer cookie bytes being sent back to the server.
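As one concrete example, the ASP.NET forms-authentication cookie can be scoped in web.config; the /abc path and login URL here are assumptions:

  <system.web>
    <authentication mode="Forms">
      <!-- path="/abc" means the browser only sends this cookie with requests under /abc -->
      <forms loginUrl="~/abc/login.aspx" path="/abc" />
    </authentication>
  </system.web>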

March 18, 2012

One of the ways to improve the download speed of a website is to decrease the number of HTTP requests. A typical web page contains HTML for the content, JavaScript, stylesheets, images, Flash, etc., and each of these components results in an HTTP request. To decrease the number of requests, a page can either be simplified by using fewer images, less CSS and so on, or optimised using a few design techniques.

Image maps can be used to combine a few images into one. Converting multiple images to an image map may or may not be possible depending on the requirement. Image maps do not necessarily reduce the size that needs to be downloaded, but they do decrease the number of HTTP requests.
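A minimal sketch of an image map - one nav.png download replaces several separate button images (the file names and coordinates are placeholders):

  <img src="nav.png" alt="Navigation" usemap="#navmap" />
  <map name="navmap">
    <!-- each area maps a rectangle of the single image to a link -->
    <area shape="rect" coords="0,0,100,40" href="home.html" alt="Home" />
    <area shape="rect" coords="100,0,200,40" href="about.html" alt="About" />
  </map>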

Multiple stylesheets can be combined into one. This may or may not be suitable depending on the site's design. If the same CSS is used on most of the pages, it is worth combining it into one file, which again reduces the number of CSS files that need to be downloaded.

Similarly, multiple JavaScript files can be combined into one if the same files are used on most of the pages, reducing the number of JavaScript files that need to be downloaded.

For both CSS and JavaScript files, the HTML document should be checked to make sure the same file is not referenced more than once. If it is, some browsers will re-download the file, making the page slower.

The number of image requests can be reduced by combining background images into a single sprite image and using the "background-image" and "background-position" CSS properties. "background-position" selects the segment of the combined image that needs to be displayed.
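A minimal sketch of the sprite technique, assuming a sprite.png that holds two 32x32 icons side by side:

  <style>
      .icon { display: inline-block; width: 32px; height: 32px;
              background-image: url('sprite.png'); }
      .icon-home   { background-position: 0 0; }      /* left segment */
      .icon-search { background-position: -32px 0; }  /* right segment */
  </style>
  <span class="icon icon-home"></span>
  <span class="icon icon-search"></span>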

The size of the web pages can be decreased by compressing the CSS, JavaScript and HTML files. White space can be removed from the files, and online tools like Google Closure Compiler or www.csscompressor.com can be used to compress the documents further.

Every now and then, I notice web pages where a large image is displayed at a smaller size using the height and width properties. In that case, the browser is downloading a larger image than necessary. Such images should be resized to the displayed dimensions using tools like Photoshop or Google Picasa.

Inline JavaScript and CSS can be minified to reduce the size of the HTML document. Scripts and CSS can be embedded using the script and style tags, and the same minification can be applied to those blocks as well.
Reference: Shahed Kazi at AspNetify.com