Complete master guide and tips on SEO (Search Engine Optimization) for beginners

Recently I wrote an article on SEO that introduces the essential things a beginner needs to be aware of and know about. If you haven't had a chance to look at it, please feel free to take a look. It may be worth your time.

The reason for this complete guide is that when I was learning SEO, I had to flip through many tips and guidelines on the World Wide Web and couldn't really find a single article that introduced all of these SEO methods in one place. After a huge amount of research and reading, I finally decided to come up with a guide that can help people out there who are learning SEO. This will be a very long guide covering a lot of things about SEO. Bear with me =)

The sections are split as below:

1. Understand what the search engine spider sees

2. SEO-accessible design for your website

3. SEO development for your website

4. SEO for webmasters

5. NEVER for an SEO-friendly website

6. Conclusion

This post will be maintained to keep up with the ongoing changes in the world of SEO.

Understand what the search engine spider sees

Before we can start optimizing anything, it is essential to understand what the search engine spider can see. These spiders decide whether they like your website or not, which in turn determines whether you are in or out of the search engine's index. Search engine spiders are robots, so they are not able to see what a human can visualize. Thus, it is important to reduce the number of things a spider cannot see and increase the number of things it can see. The spider is interested in knowing more about your website and what it carries, and all it sees are the instructions and text on your pages. Below is a list of things a search engine spider cannot see; these should be avoided or placed at the bottom of the page.

1. Image text

2. JavaScript

3. Flash

4. Frame

SEO-accessible design for your website

SEO is all about making the search engine like you. However, we must be aware that search engine spiders do not care how pretty your website is, only how accessible it is. Designing a website that offers both beauty and accessibility to users and search engines is an important part of a web designer's job. We must always try to retain the user on the site while allowing the search engine spider to crawl as much information as it can from your website. Therefore, all those free templates or web designs are off the shelf! They were not built with SEO in mind, and there is a reason why they are free in the first place. But tweaking them here and there might just solve this problem.

1. The key point is to load the content first so that the search engine spider sees what it wants. The least important things, such as the navigation bar or footer, should be placed at the bottom so that they load after everything else. To do this, you will need to lay out your design with CSS. This way, you can prioritize what is presented to the search engine spider! (See the sketch after this list.)

2. Be aware that these search engine spiders are trying to guess what is important. Thus, never use image text purely for beauty unless you are not doing SEO at all.

3. Provide meaningful image names, titles and alt text. These are important so that images are accounted for by the search engine and the spider understands what each image is about. (Also shown in the sketch after this list.)

4. It is equally important that the sites you design are up to standard. The minimum requirement for a search engine spider to crawl and cache your site is working functionality and display. It is necessary to fully validate your HTML and CSS in accordance with W3C guidelines.
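To make points 1 and 3 concrete, here is a minimal sketch of a content-first page. All the file names, ids and measurements are made up for illustration; the idea is simply that the main content and a properly described image come first in the source, while CSS moves the navigation to the top visually.

<html>
<head>
<style>
#content { margin-top: 60px; } /* leave room for the navigation bar */
#nav { position: absolute; top: 0; } /* shown at the top visually, but last in the source */
</style>
</head>
<body>
<div id='content'>
<h1>SEO guide for beginners</h1>
<p>The main article text loads first, so the spider reads it first...</p>
<!-- meaningful file name, title and alt so the spider knows what the image is about -->
<img src='seo-spider-diagram.jpg' title='How a search engine spider crawls a page' alt='Diagram of a search engine spider crawling a page' />
</div>
<div id='nav'><a href='/seo/'>SEO</a> <a href='/jquery/'>jQuery</a></div>
<div id='footer'>Copyright and other unimportant text.</div>
</body>
</html>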

SEO development for your website

It is important to know what kind of structure these search engine spiders love. If you have developed a structure that isn't SEO friendly, altering or changing it afterwards will squeeze all the juice out of you. It is best to be aware of this from the beginning rather than regretting it later.

1. One of the good practices you may adopt is naming your URLs well. Providing a meaningful URL helps when it comes to searching on a search engine. For example, a page with the URL http://hungred.com/2009/04/02/seo/introduction-seo-search-engine-optimization/ is always better than a URL like http://hungred.com/example.php. It provides more keywords for the search engine to find you with.

2. Do not use ready-made JavaScript navigation menus. These menus are not SEO friendly and search engine spiders do not like them! Simply put, they are not page text. Use an unordered list (<ul> with <li> items) styled with CSS to create your navigation menu instead!
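For example, a plain unordered list that the spider can read as ordinary links, styled into a menu purely with CSS (the id and the link addresses below are made up):

<style>
#menu li { float: left; list-style: none; margin-right: 10px; }
#menu a { text-decoration: none; }
</style>
<ul id='menu'>
<li><a href='/seo/'>SEO guide</a></li>
<li><a href='/jquery/'>jQuery tips</a></li>
<li><a href='/css/'>CSS layout</a></li>
</ul>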

3. Standardize your anchor links! Since we are dealing with robots here, they are instructed to check your linking structure. Instead of linking to http://www.hungred.com/seo, link to http://hungred.com/seo/! The reason is that the spider may end up checking all of the following if you link the first way:

i. http://www.hungred.com/seo

ii. http://hungred.com/seo

iii. http://hungred.com/seo/

Since search engine spiders are not told whether to use the version with or without www, they will check every possible way of linking to your pages. Thus, it is also necessary to tell these search engine spiders how your pages are identified. You can configure this either in your .htaccess or in their webmaster tools.
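For instance, on an Apache server something like the sketch below in your .htaccess would permanently redirect the www address to the non-www one, so the spider only ever sees a single version of each page (example.com is a placeholder, and this assumes mod_rewrite is enabled on your host):

RewriteEngine On
# send www.example.com/anything to example.com/anything with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]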

4. Dynamic pages, especially those that contain a question mark in their URL, are a no-no to the search engine spider! To illustrate this better, say you have 3 URLs as follows:

i. Hungred.com/seo.php?thread=1&&sort=increase

ii. Hungred.com/seo.php?thread=2&&sort=decrease

iii. Hungred.com/seo.php?thread=3&&sort=increase

These 3 URLs are totally different pages. But if the search engine purges the information after the first offending character, the question mark (?), the pages will all look like this:

i. Hungred.com/seo.php

ii. Hungred.com/seo.php

iii. Hungred.com/seo.php

Now you have three duplicate pages and they won't be indexed. Another problem is that dynamic pages do not have meaningful words in their URL, which makes it difficult even for a human to identify what the page really means. It is important to have keywords contained in your URL, as mentioned in point 1. There are definitely solutions to this problem; they require the help of .htaccess, and many open source CMSs have already applied this technique to make themselves SEO friendly.
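As a rough illustration only (this is not the actual rule any particular CMS uses; the pattern, file name and parameter are made up), an .htaccess rewrite can map a keyword-friendly URL onto the real dynamic page behind it:

RewriteEngine On
# /seo/1/ is served by the dynamic page seo.php?thread=1 behind the scenes
RewriteRule ^seo/([0-9]+)/?$ seo.php?thread=$1 [L,QSA]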

5. Use heading tags such as <h1>..</h1> to tell the search engine spider what the most important phrases and keywords on your site are. Search engines nowadays are very smart; they are able to identify cheating in certain ways. Therefore, placing everything inside an <h1> tag will never bring you anywhere. It is best to have a balance of H1, H2 and so on in your site.
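A balanced structure might look something like this (the headings themselves are made-up examples):

<h1>SEO guide for beginners</h1> <!-- one main keyword phrase for the page -->
<h2>What the search engine spider sees</h2>
<p>Content for this part of the guide...</p>
<h2>SEO accessible design</h2>
<p>More content...</p>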

6. Utilize meta data or robots.txt! Unlike open source platforms, where there is a community building SEO friendly tools, most sites do not have such benefits. Your best bet is to use meta data or robots.txt to spell out the things you want crawled and the things you want the spider to avoid. Search engine spiders do not like duplicated content; they penalize you for having it! This is also the reason why we have to instruct search engine spiders not to look at some of the content that is duplicated across pages, such as the navigation bar.
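A robots.txt file is just a short text file at the root of your site; the paths below are made-up examples of sections you might not want crawled. The per-page equivalent is the robots meta tag, for example <meta name='robots' content='noindex,follow' /> in the head of a page.

User-agent: *
Disallow: /print-version/
Disallow: /search-results/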

7. Consider the nofollow attribute when linking to outside sites. This will cause most search engine spiders to ignore the link marked with nofollow and concentrate on the other, more important keywords on your site. But it will definitely sacrifice your relationship with the other site, so consider carefully.
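Marking an outgoing link with nofollow is as simple as adding the rel attribute (the address is a placeholder):

<a href='http://www.example.com/' rel='nofollow'>an outside site</a>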

8. Use external files for scripts and CSS! This ensures that both users and search engine spiders do not need to download them every time, as they can be kept in the cache when the scripts are placed in external files!
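In other words, reference them from external files instead of embedding them in every page (style.css and site.js are made-up file names):

<link rel='stylesheet' type='text/css' href='/css/style.css' />
<script type='text/javascript' src='/js/site.js'></script>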

9. Do not use any hidden links or things that only search engine spiders can see. This will not help your website at all and might even cost you.

10. Name all your files and folders after the keywords your site is built around.

11. Any PHP script that malfunctions might cause the rest of the page to stop displaying. If humans can't see it, robots will never have a better chance of crawling your content. It is important to always check the functionality of your pages to ensure they work flawlessly.

12. Build an optimized site. Always compress all your scripts and images and remove any unnecessary code and comments so that your site loads faster.

13. It is necessary to inform the search engine spider when a page no longer exists. It is a common mistake to let the search engine spider treat an error 404 page as a valid page. This will definitely affect the overall quality of your website. Thus, it is necessary to send out the correct error code in the header when there is an error on the page.

14. Try as much as possible to keep all style and script tags at the bottom of the page, so that they are not prioritized as important by the search engine spider! (13/04/2009)

SEO for webmasters

In order to build an SEO friendly site, the webmaster plays a major part in keeping the site in tip-top condition. Since rankings will drop sometimes, it is necessary to keep up with SEO news and maintain the site accordingly.

1. The most important thing for a website is the type of content it contains. It is critically important to write valuable information and cover the useful issues that a user would want to learn about from your site. Relate your article to the keywords used for the page and try your best to include those keywords in your content. This way the search engine spider will have a higher chance of guessing your keywords.

2. Nowadays search engine spiders are programmed to be smarter and largely ignore meta data when gathering information about a site, but some spiders do still look at these meta tags, especially Yahoo's. Therefore, filling in the meta keywords and description is important so these spiders can check them if they are unable to understand the content of your page. The description should be around 160 characters to be optimal.
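For example (the values are made up, with the description kept under 160 characters):

<meta name='description' content='A complete beginner guide to SEO covering what the search engine spider sees, accessible design, site structure and webmaster maintenance.' />
<meta name='keywords' content='seo, search engine optimization, beginner guide' />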

3. Manually submit a sitemap to the big search engines such as google.com, yahoo.com, msn.com and ask.com. But submit it manually ONCE and once only! If you submit more than once, duplication will be the greatest problem hitting your site. On the other hand, do use the free submission services for your site; they may help to some extent, although most of these smaller search engines take their results from the big ones. You can create a sitemap easily with the free online tools available on the internet; it would be a pain to create one by hand for a site with 100+ pages.
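For reference, a sitemap is just an XML file along these lines (the addresses and dates are made up):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2009-04-02</lastmod>
  </url>
  <url>
    <loc>http://example.com/seo/</loc>
    <lastmod>2009-04-02</lastmod>
  </url>
</urlset>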

4. Fully utilize the webmaster tools provided by these search engines to understand how your site is indexed on their engine. These webmaster tools make it interesting to see how your site has been performing, and you might even get addicted to them! The Google Toolbar, for example, is a great tool for checking your PageRank in Google. The other one is the webmaster console that most search engines provide.

5. It is necessary to link related content across your pages. This helps the spider crawl them as well and increases your chances of being indexed.

6. It is essential to let users know what links they will be clicking. Try to use keywords or meaningful words in the link text. For example, instead of using 'click here', try words like 'discount details available here!'. This helps both users and search engine spiders know what the link is about.
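For example (the address is made up):

<!-- the link text itself tells users and spiders what is behind the link -->
<a href='/promotions/spring-discount/'>discount details available here!</a>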

7. It is best to get links from other sites to your site in order to gain permanent referrals, which is far better than relying on SEO alone, since the algorithm may change one day.

8. Search engine spiders are designed to act like a reviewer approving a website, and a reviewer who has to see the same content again and again gets bored. The best way is to review and update your pages with even more interesting and relevant information regularly so that the search engines stay updated. Users who visit your site also receive this benefit.

9. Analyze the keywords used by your competitors. Fully utilize these keywords and strengthen your content to include them for a better chance of being indexed on top.

10. There are times when some of your links don't work, and people who visit your site dislike such things. This is also true for search engine spiders. Thus, it is important to always check for broken links so that you do not get penalized for them.

11. Provide social network links, such as Facebook and MySpace, so your visitors can get to know more about you. Also provide options for them to bookmark your article on the many other web applications available.

12. It is difficult for many starting websites to attract traffic even with the help of SEO. Thus, it is best to increase your chances by getting recommended by web directories. Here is a list of directories you can submit to.

NEVER for an SEO-friendly website

There are things that an SEO friendly website shall never do. These are the unethical techniques that were practiced by black hat SEO in the past. They can get you banned from popular search engines such as Google, MSN, Yahoo or Ask. Once banned, you might never see your site on these popular search engines again. You may request reconsideration from these search engines after all of these practices have been eliminated; however, search engines keep records of those who have been banned from their services, so the chances of seeing your website on their engine again will be very low.

1. Do not take part in link farms that create multiple links to your site to raise your page rank by unethical means. In particular, avoid links to web spammers.

2. Do not try to perform cloaking! Cloaking is deceiving users by presenting different content to search engines than you display to users. Build for users, not for search engines.

3. Do not try to use sneaky redirects via JavaScript or any other means.

4. Do not load pages with irrelevant keywords in order to attract traffic.

5. Don't create pages with malicious behavior, such as installing viruses, Trojans, or other badware.

6. Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate the search engines' Terms of Service.

Conclusion

This guide covers the necessary things you should know for any SEO friendly site. Honestly, getting your site indexed and matured takes longer than just building an SEO friendly site. The key to achieving results is to consistently update and upgrade your site for the latest SEO trends on the World Wide Web. Lastly, if you guys have anything to add to this post, feel free to give me some feedback on improving the overall guideline.

Overall introduction to SEO (Search Engine Optimization): essentials a beginner needs to know

Introduction

If you have landed on this post, you are probably one of those who have come across the acronym SEO for the first time. SEO, also known as 'search engine optimization' or 'search engine optimizer', is the process of improving the volume of traffic to a web site from search engines via search results. There are billions of people out there using search engines to seek information that matters to them. Without search engines, goodness knows where all the gold in the World Wide Web would be!

Who needs to know and learn SEO?

Personally, I think this should be split into two groups: the group that is required to learn it and the group that is not. Both of them, however, should at least know there is such a term as SEO. Web developers and designers are definitely in the required group. In order to produce a better SEO web site, the required group needs to know SEO to deliver quality services to their customers or themselves. Learning it late will cause quite a few problems when it comes to indexing on search engines. The optional group will most likely be people who simply wish to know more about SEO (that is why you are here) and who don't really need to do much hands-on work with a website.

How does SEO work?

Seriously, SEO works according to the rules of the search engine. In order to keep your site on top of the search results, one has to bind themselves to the rules of the search engine. Different search engines have different rules, so it depends on which search engine you are targeting. However, most search engines work similarly nowadays. They all have crawlers which crawl your submitted site pages, but not all crawled pages end up indexed by the search engine. The reason a page may not be indexed even though it has been crawled is that the crawler looks at different factors (rules) when crawling a site. Of course, these factors are now advised to webmasters by the search engines themselves to prevent the kind of incidents that happened between 1997-2005 (webmasters excessively stuffing irrelevant words in order to climb to the top of the search results).

How does crawling work?

Bear in mind that the spider or Googlebot that enters your site to crawl your data does not stay on your site for long. It basically sees whatever appears to it, grabs what it finds and returns to the server with the information gathered from your site.

Which websites are suitable for SEO?

SEO is a good way to attract traffic to your site, but it definitely does not guarantee any sales. Thus, not all websites are suited to this marketing strategy; other types of internet marketing strategy might work better in some situations. Furthermore, if your website is a business website, it is definitely not advisable to depend on search engines for referrals! The reason is that the search engines are not paid by you to operate, which means they are free to change their search algorithm, and that could severely damage your business operation once they are no longer sending visitors to your site. Thus, it is wiser to get traffic from other, more permanent places such as links from other sites.

Who are the trusted SEOs?

Personally, I think there are MILLIONS of marketing frauds on the World Wide Web. Every single advertisement is so attractive that it makes consumers eager to purchase from them. This is not only happening in SEO but in many other parts of the industry. A lot of research has to be done before purchasing any goods on the World Wide Web. Please do not trust most of the sites that are paid to review these products; they may be honest, but they are being paid after all. Therefore, your best bet is to research very carefully the reviews given by non-paid reviewers who have experience with their services. Google has listed a few things that you must look at when dealing with an SEO.

What are the best search engines for SEO?

The answer really depends on where you are right now. Different countries use different search engines for their daily needs. For example, Baidu dominates search in China, whereas in Russia, Yandex controls 50% of the paid advertisement revenue. It varies from country to country.

Where does Google stand on SEO?

Google has done a great job of educating webmasters on how the Google search engine works. All kinds of fantastic search engine tools have been developed by Google for SEO on their engine. Many talk about Google's methods of indexing sites, and most of us apply the best practices Google has provided for us. There is definitely no guaranteed way of being indexed by a search engine without playing fairly. The best way is to learn SEO and apply it to your website, either by yourself or through your company. There are definitely no shortcuts in SEO (not even if you pay Google). But at the end of the day, you learn something which may benefit you.

Hope this sums up all about SEO.

jQuery Tips and Tricks

I have been busy dealing with jQuery lately, with a project in hand which has given me a lot of hands-on experience with jQuery. Therefore, I would like to share some of the things I came across while working with jQuery. If you are still confused about what I am talking about, you may want to visit my previous post on jQuery.

1. jQuery id and class selectors.

A jQuery selector can only select one id but is able to select multiple classes. For example, if you have multiple elements with the same id,

<div id='sameid'></div>
<div id='sameid'></div>

with a selector declared as follows,

$('div#sameid')

jQuery will only retrieve the first id it sees. However, if your declaration uses a class instead of an id,

<div class='sameid'></div>
<div class='sameid'></div>

with a selector declared as follows,

$('div.sameid')

it will give you a jQuery object that contains the two elements above.

2. jQuery animation lag during hover on multiple objects.

jQuery has a 'hover' event which acts exactly like ':hover' in CSS. However, when we try to render a hover effect on multiple sets of elements, an animation lag may occur so that things don't seem smooth. This may be a problem with the way the code has been written. Let me show you what I mean here: take a look at playgroundblues.com and notice the nice navigation bar on the left-hand side which displays nicely in the left corner. The problem with this navigation bar is that when you move very quickly across the multiple objects several times, you will notice the animation keeps playing even though your hover has already ended.

If we look at his code written for the menu,

(screenshot of the playgroundblues.com menu source code)

You will notice that there are two selections of $('#navigation > li'): one assigns the hover function while the other performs the nice welcome slide when a visitor enters the site. There is no problem with the logic of the code above, but because the animation code is separated into two assignments, it creates the lag during the animation. If all the work is done on a single $('#navigation > li') selection through chaining, the animation lag is eliminated.
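Something along these lines is what I mean (a rough sketch only, not the actual playgroundblues.com code; the animation values are made up, and I have also added .stop() before each hover animation, which clears any queued animations and helps with exactly this kind of lag):

// One $('#navigation > li') selection with everything chained onto it:
// the welcome effect runs first, then the hover handlers are attached
// to the very same jQuery object, so no second selector is needed.
$('#navigation > li')
    .hide()
    .fadeIn(800) // welcome effect when the visitor enters the site
    .hover(
        function(){ $(this).stop().animate({ paddingLeft: '15px' }, 200); }, // mouse enters
        function(){ $(this).stop().animate({ paddingLeft: '0px' }, 200); } // mouse leaves
    );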

3. $(this) and 'this'.

Often enough I find myself getting confused about when to use $(this) and when to use 'this' in jQuery. The difference is that $(this) is the current element wrapped as a jQuery object, while 'this' is the plain calling object in JavaScript. For example, say we want to use jQuery's .each and fadeIn() after a selector such as the one below,

$('div#product').each(function(){
    $(this).fadeIn();
});

Since fadeIn() is a jQuery effect function, we use $(this); but if we are not dealing with jQuery objects, we use 'this' instead.

$('div.product').each(function(){
    var myClass = this.className; // provided you have multiple elements with the class 'product'
});

The reason why 'this' can be used here is that jQuery's .each function iterates over each DOM object. Simply put, 'this' refers to the plain DOM object, while the extra $(...) around it turns it into a jQuery object, which allows you to use jQuery methods on it.

4. find() function in jQuery

Being a forgetful person (which is why I have this blog), I often find myself forgetting what the code $('a', this) in jQuery means. It basically means: find the anchor elements inside 'this'. It is just a jQuery shortcut that substitutes for the find() method.
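In other words, the two lines below do the same thing (assuming 'this' is a DOM element, for example inside an .each callback or an event handler):

var links = $('a', this); // shortcut form
var sameLinks = $(this).find('a'); // equivalent long form using find()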

5. ',' in jQuery selector.

You may have come across $("div#idx, div#idc") and got confused by it, as it doesn't look like anything in the jQuery API! It is just another quick way in jQuery to do a selection in a single declaration instead of multiple statements. The ',' comma represents an 'OR' condition, similar to '||' in Java or JavaScript. So the selector I wrote means "select the div with id idx or the div with id idc".

That is all I can remember from my recent coding with jQuery. If there are any more tips I come across during my coding, I will update them here! Hope these help! Cheers!

How CSS containers overlap and float on each other

We all know the containers that web designers or even programmers use on the web! If we ignore all the coding, you will find that all of these things are actually done with CSS! I created two containers with the following code,

<html>
<head>

<style>
#box1 {
width:450px;
height:338px;
background:#23e;
float: left;

}
#box2 {
width:450px;
height:338px;
background:#000;
padding: 5px 5px;
margin: 10px 10px;
position: absolute;
float: left;
z-index: 1;
}
</style>
<script>

</script>
</head>

<body>
<div id='box1'></div>
<div id='box2'></div>
</body>
</html>

You can have a look at the example HERE; notice that the black box is overlapping the blue box? The blue box has the CSS of box1 while the black box has the CSS of box2. The reason why it is overlapping is the following declaration,

<pre class="brush: php; title: ; notranslate" title="">  position: absolute;
 float: left;
 z-index: 1;

position: absolute must be there to tell the page that the position must absolutely be obeyed, and only with this declaration can z-index be used. z-index: 1 tells the box to go one level up (float upward), while a negative value means it will go downward. So if I want the blue box to overlap the black one instead, I can give the black box z-index: -1; this way it will go below the blue box. Meanwhile, float: left tells the box to appear on the left side. If I do not want them to overlap, I remove the position: absolute declaration; this way z-index no longer applies and the boxes sit side by side, as shown HERE.

CSS is capable of styling and laying out your site! If I have another box named box3 and want it to be placed below the two boxes, what do I do? Assume the code declaration is as below,

<pre class="brush: php; title: ; notranslate" title=""><html>
<head>

<style>
#box1 {
width:250px;
height:338px;
background:#23e;
float: left;

}
#box2 {
width:250px;
height:338px;
background:#000;
padding: 5px 5px;
margin: 10px 10px;
float: left;

}

#box3 {
width:250px;
height:338px;
background:#23e;
float: left;
clear: left;
}
</style>
<script>

</script>
</head>

<body>
<div id='box1'></div>
<div id='box2'></div>
<div id='box3'></div>
</body>
</html> 

In box3 there is a new declaration, clear: left. This tells the page that nothing else may sit on box3's left side, which moves box3 down to a new line. Click HERE for an example. Notice that this kind of layout is called a fixed layout, where the total width of the boxes cannot be more than the user's screen width. If a liquid layout is applied instead, the boxes will resize according to the user's screen size. In order to change the fixed layout to a liquid layout, we just have to specify the widths in % instead of px, as sketched below.
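For example, changing the widths of the three boxes above to percentages turns the fixed layout into a liquid one (the percentages are arbitrary; padding and margin are left out for brevity):

/* each box now takes a share of the screen instead of a fixed 250px */
#box1 { width: 30%; height: 338px; background: #23e; float: left; }
#box2 { width: 30%; height: 338px; background: #000; float: left; }
#box3 { width: 30%; height: 338px; background: #23e; float: left; clear: left; }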

Introduction to jQuery basic 2

There are many powerful methods available on the jQuery wrapper, but I can't really discuss them all right here as it would take too long to read. However, I will explain what each section of the jQuery API covers! This way it will make life easy for us when we are searching the API on the official jQuery site!

jQuery Core:

jQuery Core contains all the functions available in jQuery that you need to extend, create and manipulate jQuery objects.

Selectors:

It's like what the section name says! This section gives you all the selector syntax the jQuery wrapper uses to filter its selection!

Attributes:

The Attributes section provides you with all the functions available in jQuery to manipulate the attributes of DOM elements. This includes CSS classes, attributes and properties.

Traversing:

This section provides all the functions jQuery has for moving around the DOM from the current selection, which is what you will use most when chaining with jQuery.

Manipulation:

jQuery text/DOM Manipulation functions.

CSS:

jQuery CSS functions that manipulate or read the styles applied to your elements.

Events:

All the add-on events that developers can use with jQuery on top of the default JavaScript events available.

Effects:

All the special effects available in jQuery.

Ajax:

jQuery Ajax functions that help reduce the risks of using Ajax technology and remove the need to write lengthy code for it.

Utilities:

jQuery utility functions which provide extra features beyond the ones given in JavaScript.

jQuery UI:

This is the official documentation for jQuery UI, jQuery's visual controls. jQuery UI features a wide range of core interaction plugins as well as many UI widgets. The project homepage is located at jqueryui.com. Please visit these pages for downloading UI and many demos. (taken from the official site)

Well, I believe once you have read the first basic jQuery tutorial, this is basically just a lookup for people who want to know what each individual API section is for. And seriously speaking, once you have done the first part of jQuery, working with jQuery isn't that difficult as long as you can read the API provided by the jQuery team. For people who have a foundation in such APIs from Java, this is a piece of cake!