September 2010 Newsletter
**** The 4specs Perspective
This month's newsletter is intended to help you work with your web designer or IT department. These are recommendations you may want to discuss with them. They represent steps I would take if I were managing your website.
Later this fall, our monthly newsletter will address the expectations we have when selecting a hosting company and questions you may want to consider if you need to find a new host.
4specs is hosted on a FreeBSD Unix server using Apache 2.x, and we have done all of the below as part of our optimizations for speed and search engines. Microsoft servers can generally be set up to do the same. The examples I will provide are from the 4specs setup and .htaccess file.
Speed Up Your Website - Zip The Files
My single biggest suggested change is to use the deflate or gzip feature to send zipped files transparently to your user. Most modern browsers (since 2004 or so) will accept gzipped files with no action from the user. When set up, your server takes a 25,000 byte file and sends it out as about 5,000 bytes. Much faster - and because the html arrives sooner, the css, js and image files also start to download more quickly.
This is enabled by making 2 changes:
1. enabling one line in the Apache httpd.conf file:
LoadModule deflate_module modules/mod_deflate.so
2. and adding this line to the .htaccess file:
AddOutputFilterByType DEFLATE text/html text/plain text/css
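If your pages also use external css and js files, you can compress those as well. Here is a minimal sketch, assuming mod_deflate is loaded - the application/javascript type is my addition, so adjust the list to the types your site actually serves:
<IfModule mod_deflate.c>
# compress text-based responses - images and pdfs are already compressed
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>
To confirm compression is working, request a page with gzip allowed and look for a Content-Encoding: gzip line in the response headers (assuming you have curl available):
curl -s -o /dev/null -D - -H "Accept-Encoding: gzip" http://www.4specs.com/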
You can see more information here:
http://httpd.apache.org/docs/2.0/mod/mod_deflate.html
https://betterexplained.com/articles/how-to-optimize-your-site-with-gzip-compression/
Eliminate Canonical Duplication
Canonical duplication is difficult to explain. The goal is for the search engines to see a single page with a single URL. Here is an example of multiple URLs all leading to the same page:
http://www.example.com/
http://example.com/
http://www.example.com/index.html
http://example.com/index.html
http://www.example2.com/
http://www.example2.com/index.html
The goal is for your server to serve only one URL in response, even if there are multiple ways to reach that information. Here is the example from the 4specs .htaccess file:
RewriteEngine on
RewriteBase /
# Rewrite other domains all to 4specs.com
RewriteCond %{HTTP_HOST} !^www\.4specs\.com [NC]
RewriteRule ^(.*)$ https://www.4specs.com/$1 [L,R=301]
# Rewrite the index file for canonical reasons
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /(([^/]+/)*)index\.html
RewriteRule index\.html$ https://www.4specs.com/%1 [R=301,L]
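You can test the redirects by requesting the alternate URLs and checking for a 301 response with the correct Location header - a quick check with curl (assuming you have it installed):
curl -I http://4specs.com/
curl -I https://www.4specs.com/index.html
Each should answer with HTTP/1.1 301 Moved Permanently and Location: https://www.4specs.com/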
You can also add a tag to the <head> of each page, although the solution above is my recommendation. This can be used where you do not have access to the .htaccess file.
<link rel="canonical" href="http://www.example.com/page.html"/>
If this does not make sense to you, ask your web designer. It should make sense to him or her. Here is a link to more information:
[link broken]
Set a Last Modified Header
Last-Modified: Tue, 17 Aug 2010 18:59:10 GMT
This is important to tell the search engine robots and spiders when the page was last modified. The date and time should be the same as in your sitemap. Typically a content management or database driven website will not respond with this header unless it is specifically configured to do so.
With this information, the search engine robot can send a HEAD request and determine whether the page has changed. With a standard setup on an Apache server, this will be a standard response for straight html pages - meaning pages without an include or other processing. I believe a Microsoft server also sends the last modified header by default.
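You can see this behavior yourself with a conditional request - a sketch with curl, substituting the date from your own page's Last-Modified header:
curl -I https://www.4specs.com/
curl -I -H "If-Modified-Since: Tue, 17 Aug 2010 18:59:10 GMT" https://www.4specs.com/
The first request shows the Last-Modified header; if the page has not changed since the date given, the second answers 304 Not Modified with no page body - exactly what the robots rely on.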
Eliminate ETags
While you are working in the .htaccess file, you can eliminate ETags, since they duplicate the Last-Modified header. Here is the code I use:
Header unset ETag
FileETag None
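One note: the Header directive comes from mod_headers. If you are not certain that module is loaded on your server, a guarded version of the same block avoids a configuration error:
<IfModule mod_headers.c>
Header unset ETag
</IfModule>
FileETag None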
Here is more information:
http://www.websiteoptimization.com/secrets/advanced/configure-etags.html
Set a Cache or Expires Time for Pages and Images
When your pages and images are cached, a returning user's browser does not have to request or download the information again. For the repeat user, this means a faster website. I have all the 4specs html pages cached for 12 hours - so the user will not download or check to see if the page has changed since their last visit within the cache time. I have images and pdfs set for one year. Here is the code I use in the .htaccess file:
ExpiresActive On
<FilesMatch "\.(ico|pdf|jpg|png|gif)$">
Header set Cache-Control "max-age=31536000, public"
</FilesMatch>
ExpiresByType text/html "access plus 12 hours"
ExpiresByType text/css "access plus 2 days"
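Note that ExpiresActive and ExpiresByType come from mod_expires, so that module needs to be loaded as well. To verify the result, request one of your images and look for the Cache-Control and Expires lines in the response - the file name below is only a placeholder, so substitute an image from your own site:
curl -I https://www.4specs.com/favicon.ico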
Here is a link to an article by Google about caching:
https://developers.google.com/speed/docs/insights/LeverageBrowserCaching
Eliminate Default Directory Listings [added 9-18-2010]
Add this code to the .htaccess file to block the automatic listing of any directory without a default or index file. This provides some protection against visitors browsing directories and finding files that may compromise security.
Options -Indexes
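To confirm the change, request a directory on your site that has no index file - the path below is only a placeholder. With the option in place, Apache answers 403 Forbidden instead of listing the contents:
curl -I https://www.4specs.com/some-directory/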
Other Ideas
There are lots of articles on web optimization and speeding up your website. The points above are a good starting point for a discussion with your web developer or IT department. In a future newsletter I plan to cover where I would host a website and why. Here are two more articles to get you started in the process:
https://www.sitepoint.com/web-frontend-optimization-from-the-get-go-part1/
https://www.sitepoint.com/front-end-optimization-from-the-get-go-part-2/
Questions and comments are always appreciated.
--------------------------------------
Colin Gilboy
Publisher - 4specs
Contact us