Reduce, Re-use, We’re Not Talking About Recycling Here.


So your business is green now… how about your website?

Imagine if you printed each page of your site, and then gave them to Google. Then, after you handed them over, Google had to file them away in their archives. The smaller the pages are, the less space your pages would take up, correct?

If we look back at rule no. 1 of the Three Golden Rules Of SEO: “Do right to Google and Google will do right to you,” we uncover a great SEO rule for techs: the more we can reduce the size of the pages on a website, the better the website is for Google to crawl.

So - how do I reduce the size of my pages and my site?

Remove on-page CSS and JavaScript and place them in external files

Reusing the same style sheet and JavaScript file will make your site easier to update and also more consistent. It will also reduce the total number of bytes each page contains when Google has to index it. To do this, you would edit the pages of your site and move the code into a separate file.

Given the sample below, PennDOT has 87 lines of JavaScript on their home page. They could copy and paste that code into a file called home-page.js and replace all those lines of code with one line:

<script src="/home-page.js" language="JavaScript" type="text/javascript"></script>

Moving CSS is the same approach, with slightly different code. Take a look at the 300+ lines of CSS on the Lancaster County Library website. They can easily be moved to an external style sheet: copy and paste the code into a file named home-style.css and replace it with one line in the head section, as follows:

<link href="/home-style.css" rel="stylesheet" type="text/css" />

You’re welcome, Google - we just saved you 8KB of bandwidth and 8KB of storage each time you index those two pages!
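To put a rough number on the savings, here is a small sketch. Only the 87-line count comes from the PennDOT example above; the contents of the inline script are invented for illustration:

```python
# 87 lines of inline JavaScript wrapped in a script block (contents made up)
inline = "<script>\n" + "var x = 1;\n" * 87 + "</script>"

# The one-line replacement that points to an external file instead
external = '<script src="/home-page.js" type="text/javascript"></script>'

# Bytes removed from the HTML that the crawler downloads on every visit
print(len(inline) - len(external), "bytes saved per crawl of this page")
```

The external file is still downloaded once, but it can be cached and reused across every page that references it, instead of being re-sent inside each page's HTML.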


Block duplicate pages with robots.txt

Create a file in the main directory of your website named “robots.txt”. The search engines read this file each time they crawl your site to see which URLs you don’t want included in the index. To determine which URLs to exclude, do a Google search for “site:yourdomain.com” and look at the results.

If you have a lot of duplicate pages, especially from a dynamically generated script, they will most likely appear at the end of the results. Click through to the last page of results and look for the note: “In order to show you the most relevant results, we have omitted some entries very similar to the X already displayed.”

If you like, you can repeat the search with the omitted results included. Click that link and browse through the duplicate results. Once you have determined some URLs to exclude, simply add them to robots.txt, one URL per line, as follows:

User-agent: *
Disallow: /url-to-block

The search engine spiders support blocking an entire directory as follows:

User-agent: *
Disallow: /directory-to-block/

Googlebot specifically supports a wildcard feature. So if you would like to block an entire range of URLs - say, from a web calendar at addresses like /calendar-2009.html - you could do so as follows:

User-agent: googlebot
Disallow: /calendar-*.html
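To see how that wildcard rule behaves, here is a rough sketch of Googlebot-style pattern matching. The `is_blocked` helper is hypothetical, written just for this illustration (it is not part of any library, and real crawlers have additional rules such as the `$` end anchor):

```python
import re

def is_blocked(url_path, disallow_pattern):
    """Return True if url_path matches a robots.txt Disallow pattern,
    treating '*' as Googlebot does: match any run of characters."""
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'
    regex = re.escape(disallow_pattern).replace(r"\*", ".*")
    # Disallow patterns are prefix matches anchored at the start of the path
    return re.match(regex, url_path) is not None

# The calendar rule from above blocks every /calendar-*.html URL...
print(is_blocked("/calendar-2009.html", "/calendar-*.html"))  # True
print(is_blocked("/calendar-2010.html", "/calendar-*.html"))  # True
# ...but leaves other pages alone
print(is_blocked("/contact.html", "/calendar-*.html"))        # False
```

The same prefix logic explains why `Disallow: /directory-to-block/` covers everything inside that directory: any path beginning with the pattern matches.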

See if your web server supports the If-Modified-Since HTTP header

I use a Firefox plugin called Live HTTP Headers so that I can inspect the HTTP server headers. This is a handy troubleshooting tool when testing 301 redirects as well.

To use it, install the plugin in Firefox, go to the Tools menu, and choose “Live HTTP Headers.” Leave the box open and load your website in the browser.

Several lines of text will go whizzing past. Scroll all the way to the top to find the original request and response. Note the headers for a page from Wikipedia in the image below. The first section - “GET /wiki/Google HTTP/1.1” - is the request the browser sent to the server. Note the “If-Modified-Since” line. The second section is the response from the server. What we are looking for here is the first line, “HTTP/1.x 304 Not Modified”, and “Last-Modified: Sat, 24 Oct”… This server does support the If-Modified-Since HTTP header.

When Googlebot spiders this page again, the server can tell it whether the page is newer than the copy already in Google’s cache, saving the bandwidth to re-download the page and the storage space for duplicate copies.
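The server-side decision behind that 304 response can be sketched in a few lines. This is a simplified illustration of conditional-GET logic, not a real server implementation; the `respond` function and its date values are invented for the example:

```python
from email.utils import parsedate_to_datetime

def respond(last_modified, if_modified_since=None):
    """Return the status code a server would send for a resource last
    changed at `last_modified`, given the crawler's If-Modified-Since."""
    if if_modified_since is not None:
        # Unchanged since the cached copy: send headers only, no body
        if parsedate_to_datetime(last_modified) <= parsedate_to_datetime(if_modified_since):
            return 304  # Not Modified
    return 200  # OK: the full page is sent again

# Page unchanged since the crawler's cached copy
print(respond("Sat, 24 Oct 2009 10:00:00 GMT",
              "Sun, 25 Oct 2009 12:00:00 GMT"))  # 304
# Page modified after the cached copy
print(respond("Mon, 26 Oct 2009 09:00:00 GMT",
              "Sun, 25 Oct 2009 12:00:00 GMT"))  # 200
```

A 304 carries no body at all, which is exactly where the bandwidth saving comes from: the crawler simply keeps the copy it already has.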


3 Golden Rules of SEO (for the Tech)


We find, more often than not, that the people who develop business websites and web applications do not understand the basic principles of search engine optimization. Even if they do understand, they may not care, or may not consider it an important enough part of the project, deciding to just worry about it later.

It is much easier to consider these tactics during the project than to try to retrofit them after the fact. For all the techs out there, I’d like to offer my 3 golden rules of SEO:

  1. Do right to Google, and Google will do right to you.
  2. If you try to trick the search engines, it may work today. But eventually, you will be penalized.
  3. You don’t have to do everything right. Just do more right than your competition.

What do these golden rules actually mean?

Do right to Google, and Google will do right to you.

“Do right to Google” means several things. It means following their best practices. Think about what Google’s goal is - they want to provide their users with an efficient way to find the information they are looking for. To do that, they have created an algorithm that ranks sites in the order they believe is most likely to answer what the user has queried. Most of the scoring techniques are secret, but the goal is not.

So think about it this way - anything we can do during site development to help Google do its job is extra points for the site. See the Google Webmaster Tools site for more information.

If you try to trick the search engines, it may work today. But eventually, you will be penalized.

Even though rule number two simply states the opposite of rule number one, it is still important in its own right. If we are doing everything right by Google, we shouldn’t be doing anything wrong, right? Sometimes SEOs employ methods to unnaturally inflate their rankings. These techniques are considered “Black Hat” or “Grey Hat.”

The entire premise of rule no. 2 is that we don’t want to do anything that looks like we are trying to trick the search engines. If you do, you may not get caught today or tomorrow. But one day your site may have points taken away… dropping your ranking.

You don’t have to do everything right. Just do more right than your competition.

Last but not least, you don’t have to do everything right. Just make sure you’re doing more things right than your competition. It makes sense… perfect sense.

Just like winning a race, a football game or even the World Series. In all of those examples, the winners can make all sorts of mistakes and still end up winning. They just need to do better than the competition.

So fear not, developers of the world. You don’t have to become an expert overnight. Just take these three solid rules into account while working on your next project.


PHP Developer Job

We are looking for several experienced PHP web developers for three full-time positions at our Lancaster internet marketing office. We don’t care what’s on your resume; we care more about your abilities and potential. Send us some examples of work that you have done. Impress us with your personal website or any work you’ve done on your own.

Don’t bother to contact us if you…

  • Want to be out the door at 5PM everyday
  • Have issues with code reviews
  • Are not familiar with writing database queries
  • Feel sad when a piece of code you wrote is tossed in the can
  • Have never worked on a web based programming project
  • Feel that you need more than a Linux box with Firefox and Vi to do your job

If you are passionate about programming and want an opportunity to be recognized for your hard work and knowledge, send us your info today.
