Introduction
The Queness weblog is six months old. Before I started this website, I spent a few weeks planning, and WordPress was the first blogging platform that came to mind. I liked it for its simplicity, robustness and plugins. However, things didn't turn out that way. Instead of using WordPress, I decided to write my own blogging platform, which would give me greater flexibility and make website modifications easier.
Due to that decision, I had to start everything from ground zero: planning, design, development, testing, maintenance and optimization. That exposed me to an unfamiliar area - SEO (Search Engine Optimization).
I have been digging around for SEO tips & tricks and reading a lot of tutorials and articles. Then, during the optimization process, I wrote code to generate a unique title and description for every single page, used URL rewriting for search engine friendly URLs, checked the keyword density, wrote a dynamic sitemap and constantly monitored stats from Google Analytics and AWStats in cPanel.
I've been doing SEO for this website ever since, and I reckon it's long term maintenance - something you have to monitor constantly. So, this time, I would like to share what I have learnt during that process. These are the rules/tips/tricks I used for Queness.
1. Less Flash and JavaScript Generated Content
Yes, Flash looks really awesome, but search engines are not able to crawl the content of a Flash movie. And of course, you can't put in hidden text or redirect search engine bots somewhere else (you might get banned by Google). The only practical solution is to create an alternate version of the complete site in HTML so that search engines and non-Flash browsers can view it.
However, if you really want to know how to optimize Flash for SEO, check out these websites:
As for JavaScript generated content, I'm referring to AJAX based websites. Since search engine spiders can't read JavaScript, an AJAX based website can rank pretty badly. I have an article that solves some of the AJAX problems, such as the back button and a static URL for every page - AJAX Driven Website with jQuery + PHP Tutorial. However, the best thing to do is simply not to overuse it.
2. Understand the Way Search Engine Crawl Your Website
"Crawler-based search engines are those that use automated software agents (called crawlers) that visit a Web site, read the information on the actual site, read the site's meta tags and also follow the links that the site connects to performing indexing on all linked Web sites as well. The crawler returns all that information back to a central depository, where the data is indexed. The crawler will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the administrators of the search engine. " - Webopedia
SEO is long term website maintenance: we need to check the keyword density and the structure of the website frequently to make sure it stays optimized. Luckily, there are a lot of tools that make our lives much easier:
3. Use Search Engine Friendly URL
Do not miss this - it's very important. The URL is one of the places where you can include your content keywords. "Even if you can't get your keywords into your domain name, you can put them into your URLs. Search engines read the URLs and assign value to the text they find there." - about.com
We can rewrite dynamic URLs using mod_rewrite, and there are plenty of tutorials out there that show how to do it.
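As a rough sketch of the idea (the script name article.php and the slug parameter are my own naming for illustration, not from any particular tutorial), a mod_rewrite rule in .htaccess can map a keyword-rich URL such as /article/seo-tips-and-tricks to a dynamic script:

RewriteEngine On
# Map /article/keyword-rich-slug to the real dynamic script
RewriteRule ^article/([a-zA-Z0-9-]+)/?$ article.php?slug=$1 [NC,L]

Visitors and search engines only ever see the friendly URL; article.php then looks up the post by its slug.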
4. Stick to One URL
We should stick to one URL only. For example, you have to decide whether to use http://www.websitename.com or http://websitename.com. The main purpose is to gather all the traffic and point it to one destination.
I found this somewhere online; it forces the website URL to appear as http://www.websitename.com by redirecting the non-www version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule (.*) http://www.yourdomain.com/$1 [R=301,L]
5. Title and Meta Tags
The title and description meta tag should be generated dynamically according to the content of the page, and they should be unique as well. Usually, the title will be the heading of the article, and the description should be an excerpt of the content.
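Here is a minimal PHP sketch of the idea - the $post array and its title/excerpt fields are assumptions for illustration; in practice they would come from your database:

<?php
// Hypothetical post record - in a real blog this comes from the database.
$post = array(
    'title'   => 'My Unique Article Heading',
    'excerpt' => 'A short, unique summary of what this article is about.',
);

// Escape the values and keep the description at a sensible length.
$title       = htmlspecialchars($post['title']);
$description = htmlspecialchars(substr($post['excerpt'], 0, 155));
?>
<title><?php echo $title; ?> - Queness</title>
<meta name="description" content="<?php echo $description; ?>" />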
Some people say the meta keywords tag is dead. I reckon so, but I'd still suggest putting some keyword phrases in it. You never know - some search engines might still be using it.
You can always use Google to test some keywords. Choose keyword phrases that are most relevant and specific to what your web page is about. Think from the perspective of someone searching for what you are offering on your site and ask: what would I search for if I were looking for the content on this page?
6. Put Keywords in Anchors/Links, Headings and Body
Always remember: use your keyword phrase a lot, but not too much, otherwise it gets too spammy. Search engines recognize keyword phrases in headings (h1, h2, h3, h4...), and a heading usually carries more weight than the surrounding text.
7. Generate Sitemap Dynamically
Unless you have a website with static content and a fixed structure, we should make sure that the sitemap is generated dynamically. Every time we publish a new page/article/post, it should be included in the sitemap automatically.
Of course, don't forget to submit your sitemap file to the search engines. Google can crawl new content pretty quickly - based on my experience, it took 1-2 hours to crawl a new post.
If you generate the sitemap dynamically, you can also use mod_rewrite to serve it under a static URL. This is the mod_rewrite code to rewrite sitemap.php to sitemap.xml:
RewriteEngine On
# Serve sitemap.php whenever sitemap.xml is requested
RewriteRule ^sitemap\.xml$ sitemap.php [NC,L]
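To give an idea of what sitemap.php could look like, here is a minimal sketch. The get_all_posts() helper is hypothetical - swap in your own database query that returns each post's URL and last-modified date:

<?php
// sitemap.php - outputs the sitemap as XML; served as sitemap.xml via the rewrite rule above.
header('Content-Type: application/xml; charset=utf-8');

// Hypothetical helper - replace with a real database query.
function get_all_posts() {
    return array(
        array('url' => 'http://www.websitename.com/post/seo-tips', 'lastmod' => '2009-06-01'),
        array('url' => 'http://www.websitename.com/post/jquery-ajax', 'lastmod' => '2009-05-20'),
    );
}

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach (get_all_posts() as $post) {
    echo "<url>\n";
    echo '<loc>' . htmlspecialchars($post['url']) . "</loc>\n";
    echo '<lastmod>' . $post['lastmod'] . "</lastmod>\n";
    echo "</url>\n";
}
echo '</urlset>';

Because the list is built from the database on every request, each new post shows up in the sitemap automatically.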
8. Use robots.txt
When a search engine spider comes to your website, it looks for a file called robots.txt, which tells the search engine which files can and cannot be indexed. The following examples show some common configurations; the first allows all search engine spiders to index every file on your website.
# Index all files
User-agent: *
Disallow:

# Don't index any files
User-agent: *
Disallow: /

# Don't index any files in the /admin folder
User-agent: *
Disallow: /admin

# Only allow Googlebot to index your website (block everyone else)
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

# You can also include your sitemap in robots.txt (this is the one I'm using)
User-agent: *
Disallow:
Sitemap: http://www.websitename.com/sitemap.xml
If you want to know more about it, check out this website:
SEO Tools - Robots.txt Tutorial
9. Exchange Links with Other Relevant Websites
A link exchange program is a way to build up backlinks. For a web design and development website, we can take advantage of CSS/website gallery showcases to build backlinks, along with other methods such as blog directories, affiliate programs, friends' websites and forums. In my previous posts, I have gathered a huge number of CSS galleries and blog directories:
- CSS and Web Gallery List: Promote and Increase your Website Traffic
- 20 Essential Blog Directories to Submit Your Blog to
- RSS Blog Directories
10. Use Google Analytics
I find myself checking my Google Analytics stats daily. I love it because it has all the data I need to analyse my web traffic. It has four major sections:
- Visitors
- Traffic Sources
- Content
- Goals
Every section has comprehensive statistics, data and drill-down reports, and the level of detail is simply impressive. After all, it's FREE! This tool is highly recommended.
Final words
I hope these tips will be able to assist you in optimizing your website. Once the SEO is done, we can start promoting our websites to all sorts of free media out there, which will be my follow-up to this post next week. Stay tuned!
Comments
For #3, I read some articles before: search engines like Google do index dynamic URLs, but they rank lower than static (SEO friendly) URLs - someone had tested this. However, with Google's new Caffeine, they might have changed the rules.
7) XSLT is also something to look into. You can style your XML sitemap to look like the rest of your site, so you don't have a plain, boring XML doc. This can be useful for your users who may be browsing around on it.
8) is very useful. The only problem is that if you have a lot of hidden pages, it can be a pain updating the file every time. There is another way you can do this, in the head of your page.
Sorry for the long responses. Really like the article. Simple SEO is the best SEO.
Amazing how many sites have dropped the meta tags for keywords and description.