SEO Training | PPC Training


Thursday, 24 January 2008

Subdomains Vs. Subdirectories for SEO

Posted on 01:53 by Unknown

Subdomain vs Subdirectories



Matt Cutts has revealed that Google now treats subdomains much like subdirectories of a domain, with the aim of limiting the number of results shown from a single website for a given keyword search. Previously, some search marketers used keyworded subdomains to attract search referral traffic, deploying a number of keyword subdomains for the terms they hoped to rank well for.

An article was written about how some local directory sites were using subdomains to achieve good rankings in the search engines. According to the article, the sites ranked well, but not because the keyword appeared in the subdomain: it showed examples of sites that ranked as well, or in some cases better, when the keyword appeared elsewhere in the URL rather than in the subdomain. In Google, both subdirectories and subdomains were working for keyword ranking optimization.

Sites with widely varying quality in their subdomain strategies have been found. If you use subdomains, make sure each one carries primarily unique content that is not simply mirrored on your other subdomains. A subdomain should contain page content that does not appear on the other subdomains; otherwise you risk spamming the search engine indices.

Google's Webmaster Guidelines address this subject directly:

Avoid creating multiple pages, subdomains, or domains with substantially duplicate content.

Many large corporate websites have a little accidental duplicate content, but if you roll out dozens or hundreds of subdomains carrying the same text, it can look as though you are trying to spam the search engines. Don't do it.

If you are deciding how to structure your URLs and site content for natural search marketing, it is best to use keyword directories and subdirectories rather than subdomains. That structure is easier to manage, it looks far more natural and reasonable to the search engines, and there is much less chance of duplicating content.
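As a quick illustration of the two structures, the following Python sketch (not from the original post; the domain names and the helper function are hypothetical, used only for illustration) splits a URL into host and path so you can see at a glance whether a keyword lives in a subdomain or in a subdirectory of the same host.

    from urllib.parse import urlparse

    # Hypothetical example URLs: one keyword-subdomain style, one subdirectory style.
    urls = [
        "http://widgets.example.com/",        # keyword in the subdomain (separate host)
        "http://www.example.com/widgets/",    # keyword in a subdirectory (same host)
    ]

    def describe(url, root="example.com"):
        parsed = urlparse(url)
        host, path = parsed.netloc, parsed.path
        # Whatever sits to the left of the root domain in the host is a subdomain label.
        sub = host[: -len(root)].rstrip(".") if host.endswith(root) else ""
        style = "subdomain" if sub and sub != "www" else "subdirectory"
        return f"{url} -> host={host}, path={path or '/'}, structure={style}"

    for u in urls:
        print(describe(u))

Both URLs can carry the keyword, but only the subdirectory version keeps it on the same host as the rest of the site.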

Many major websites host a number of site applications and sections on subdomains, and some have external providers serving content from different servers. It is convenient to assign a subdomain to a third party that provides a service for you, and as long as the page content on the subdomains is not duplicated, there is no problem.


People have asked several times which is better: a subdirectory or a subdomain.

For content aimed at different countries, separate country-code top-level domains (ccTLDs) are preferred, because they send a signal to the search engines that the content is intended for a different country. For instance, French-language pages could be served on a .fr domain such as www.example.fr.

If you do not want to use ccTLDs for the alternate-language pages, there is no need to worry much about choosing subdomains versus directories/subdirectories: french.example.com will probably work about as well as www.example.com/french. Translated versions of pages are generally not considered duplicate content, because the text itself differs even if the information is the same, and pages in two separate languages will rarely compete for the same keyword search.

ccTLDs are recommended for foreign-language pages for the best performance, but you can use whichever method is most convenient. Before implementing, check which approach will help your business in the long run in terms of SEO traffic to your website.
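To make the three options for language versions concrete, here is a small sketch (the root domain, language and country code are placeholders, not from the original post) that builds the French URL under each strategy discussed above.

    # Placeholder values for illustration only.
    root = "example.com"
    lang = "french"
    cc = "fr"

    options = {
        "ccTLD":        f"http://www.{root.rsplit('.', 1)[0]}.{cc}/",  # www.example.fr
        "subdomain":    f"http://{lang}.{root}/",                      # french.example.com
        "subdirectory": f"http://www.{root}/{lang}/",                  # www.example.com/french
    }

    for name, url in options.items():
        print(f"{name:>12}: {url}")

Whichever form you pick, keep it consistent so the same page does not end up reachable at several URLs.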




WordPress Optimization

Posted on 01:22 by Unknown


WordPress Optimization Tips


  1. WWW or no WWW: Decide whether or not to use www in your URLs, then set up a redirect so that whichever version a reader types, they land on the version you chose.

  2. Permalink structure: Setting a good permalink structure is another important factor. URLs that contain post titles are more search-engine friendly, and more reader friendly, than URLs like "?id=78". If you change your permalink structure after you have already built some links, you can use a WordPress plugin such as Dean's Permalink Migration.

  3. Add a robots.txt file: Use a robots.txt file for SEO purposes. A quick glance at a sample robots.txt file will help you make your WordPress blog Google friendly (a small robots.txt check sketch follows this list).

  4. Use Google Webmaster Tools: Webmaster Tools is another aid for optimizing your site for Google. Take a look at a blogger's guide to Webmaster Tools to learn its advantages and how to work with it.

  5. Custom 404 error page: A popular-posts list or a tag cloud is useful when readers land on the error page. If they were sent by a search engine, you can show posts relevant to their search terms.

  6. Be SEO friendly: Install the All in One SEO plugin and start optimizing the blog to increase search engine traffic. Most avid bloggers use this technique.

  7. Feed redirect: Use the FeedBurner FeedSmith plugin to redirect your feed to your FeedBurner feed, and do not forget to turn off the FeedBurner URLs in the feed; there is no need to make things difficult for other bloggers who want to link to your site. By burning your feed at FeedBurner you also get free statistics about your subscribers.

  8. Contact form: This helps readers get in touch with you.

  9. Subscribe to comments: Turn on email notification when there is a comment on the blog. This also keeps the discussion going between you and the reader.

  10. Separate comments from trackbacks: By default WordPress displays comments and trackbacks together; it is better to separate the two. Your readers might happily pay $10 for this feature.

  11. Sitemaps: Submit a sitemap if you want Google to index your blog. The Sitemap.xml file is generated in a format compatible with a number of search engines; Google XML Sitemaps is a plugin that generates the Sitemap.xml file for you (a minimal sitemap example appears at the end of this post).

  12. Track RSS subscriptions: If you want to increase your RSS subscribers, it is advisable to keep track of new signups.

  13. Add Google Search: Google Search can be added to your blog, letting readers search either your site or the whole web.

  14. Feed autodiscovery: Browsers such as Internet Explorer 7, Firefox, Safari and Opera auto-detect a blog's RSS feed and display an orange feed icon for the reader. Make sure the blog exposes the right feeds for autodiscovery; the comments feed can be made auto-discoverable as well.

  15. Add WordPress 2.3 tags: If your theme was coded before the release of WordPress 2.3, you can add WordPress 2.3 tag support to the older theme.
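As a companion to tip 3, here is a minimal Python sketch (not from the original post; the blog address and paths are placeholders) that uses the standard library's robots.txt parser to check whether a given URL would be crawlable by Googlebot under your current robots.txt.

    from urllib.robotparser import RobotFileParser

    # Placeholder blog address: substitute your own domain.
    ROOT = "http://www.example.com"

    parser = RobotFileParser()
    parser.set_url(ROOT + "/robots.txt")
    parser.read()  # fetches and parses the live robots.txt

    # Check a few representative WordPress paths against Googlebot's rules.
    for path in ["/", "/wp-admin/", "/2008/01/wordpress-optimization/"]:
        allowed = parser.can_fetch("Googlebot", ROOT + path)
        print(f"{path}: {'allowed' if allowed else 'disallowed'} for Googlebot")

If /wp-admin/ comes back disallowed while your posts come back allowed, the robots.txt is doing its job.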

Use this post as a reference, and add your own tips in the comments.
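Tip 11 mentions generating a Sitemap.xml file; as a rough illustration of what such a file contains (this is not the plugin's actual output, and the post URLs are placeholders), the following sketch writes a minimal sitemap by hand.

    from xml.sax.saxutils import escape

    # Placeholder post URLs: the real plugin reads these from the WordPress database.
    post_urls = [
        "http://www.example.com/",
        "http://www.example.com/2008/01/wordpress-optimization/",
    ]

    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><changefreq>weekly</changefreq></url>"
        for u in post_urls
    )
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

    with open("sitemap.xml", "w", encoding="utf-8") as fh:
        fh.write(sitemap)
    print(sitemap)

Once the file is in place, submit it through Google Webmaster Tools (tip 4).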






This content is licensed under a Creative Commons Attribution 2.5 India License

Wednesday, 23 January 2008

SEO FAQs | SEO Tips

Posted on 02:10 by Unknown


Website Structure FAQ

Should we build a site on a flat architecture or a vertical architecture?

My answer is simple. Consider this example: as a user, would you find it easier to navigate through Fig. 1 or Fig. 2?


Websites built on flat architectures help search engine crawlers reach a large number of pages, rather than forcing them to travel through many clicks to get there. Each time a crawler hits a click barrier, the amount of content it picks up from each page link drops. Deeply linked pages take much longer to get their content indexed by search engine bots and therefore take longer to rank. My preference (everything else being equal) is a flat-structured website, for these reasons (a small crawl-depth sketch follows the list):

  1. A faster crawl rate, so more content is picked up per page.
  2. Crawlers can detect new pages and updated content more easily.
  3. Fewer levels of pages limit how far PageRank is diluted, which helps sustain rankings for niche keywords.
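To make the click-depth argument concrete, here is a small Python sketch (the two site structures are invented for illustration) that computes how many clicks from the home page each page sits at under a flat versus a deep architecture.

    from collections import deque

    def click_depths(links, start="home"):
        """Breadth-first search: click depth of every reachable page from the home page."""
        depths = {start: 0}
        queue = deque([start])
        while queue:
            page = queue.popleft()
            for target in links.get(page, []):
                if target not in depths:
                    depths[target] = depths[page] + 1
                    queue.append(target)
        return depths

    # Invented structures: flat (everything linked from home) vs deep (a chain of clicks).
    flat = {"home": ["a", "b", "c", "d"]}
    deep = {"home": ["a"], "a": ["b"], "b": ["c"], "c": ["d"]}

    print("flat:", click_depths(flat))   # every page is 1 click from home
    print("deep:", click_depths(deep))   # page "d" is 4 clicks from home

A crawler with a limited budget per visit reaches page "d" much later in the deep layout, which is the point made above.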

The five most important HTML elements for a site

  1. The Title Tag
  2. The Meta Description
  3. Robots Meta Tag
  4. Anchor Text
  5. ALT Attribute

What is the difference between local link popularity and global link popularity?

Local link popularity refers to links from sites in a specific topical neighborhood (as identified by algorithms such as Teoma - now used by Ask.com), while global link popularity doesn't discriminate and counts all links from any site on the web.

How do search engines treat content inside an IFrame?

The engines all interpret content in an embedded IFrame as belonging to a separate document from the page displaying the IFrame content. Thus links and content inside IFrames refer to the page they come from, rather than the page they are placed on. For SEO, one of the biggest implications of this is that links inside an IFrame are interpreted as internal links (coming from the site the IFrame content is on) rather than external links (coming from the site embedding the IFrame).

Name twelve unique metrics search engines are suspected to consider when weighting links and how each affects rankings positively or negatively

There are dozens of possible answers to this question, but some of the most relevant and important would be (a toy weighting sketch follows this answer):

  • Anchor text (when it matches queries, it can have a significant positive impact)
  • Placement on the page (MSN's research here describes how it may influence rankings)
  • PageRank of the linking page (more PR = more good)
  • Trust in the linking domain (more trust = ++)
  • Link structure in HTML (inside an image, JavaScript, a standard a href, etc.) - although JavaScript links are sometimes followed, they appear to provide only a fraction of the link weight that normal links grant. Likewise, links from images (with anchor text in the form of alt text) appear to provide somewhat less weight than standard HTML links.
  • Temporal nature of the link (when it appeared, how long it stays on the page for, etc.) - can affect how much weight the link is given and be used to identify patterns that may indicate manipulation
  • Use of nofollow
  • Relevance of page content to linked-to page (more relevant = better)
  • Relevance of site to linked-to page (more relevance = better)
  • Text surrounding the link (as the two above)
  • Previous link relationships between the domains (if the page/site has already linked to the page/site in the past, it may be given less weight and this may also be used to identify and discount reciprocal linking schemes)
  • Hosting relationships (if the domains are hosted on the same IP address, or same c-block of IP addresses, the link may lose some of its weight)
  • Registration relationships (if the domains share registration information, it may be interpreted as less editorial and given less weight)

    Courtesy: SEOMOZ
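As a toy illustration of what "weighting links" by several metrics might look like, the sketch below combines a handful of the factors from the list above into a single score. The features, numeric ranges and weights are entirely invented for illustration; this is not Google's algorithm or anyone's real scoring formula.

    # Toy model only: the features and weights are invented, not a real ranking formula.
    def toy_link_score(link):
        score = 0.0
        score += 2.0 if link["anchor_matches_query"] else 0.0  # matching anchor text helps
        score += 1.5 * link["linking_page_pagerank"]           # more PR = more weight
        score += 1.0 * link["domain_trust"]                    # trust in the linking domain, 0..1
        score += 0.5 * link["content_relevance"]               # relevance of the linking page, 0..1
        if link["nofollow"]:
            score = 0.0                                        # nofollow removes the weight
        if link["same_ip_cblock"]:
            score *= 0.5                                       # hosting relationship discount
        return score

    example_link = {
        "anchor_matches_query": True,
        "linking_page_pagerank": 5,     # toy 0-10 scale
        "domain_trust": 0.8,
        "content_relevance": 0.6,
        "nofollow": False,
        "same_ip_cblock": False,
    }
    print(toy_link_score(example_link))  # combined toy score for this one link

Raising or lowering any single weight changes the score, which is the sense in which each metric in the list "affects rankings positively or negatively".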



This content is licensed under a Creative Commons Attribution 2.5 India License





Monday, 7 January 2008

Web 2.0 Concepts and Design

Posted on 02:53 by Unknown

What is Web 2.0?

Web 2.0 is a concept that refers to the second generation of web-based services: social platforms (MySpace, Friendster), communication tools, wikis, folksonomies and so on. The term was coined by O'Reilly to describe a new generation of websites. These services help people communicate and share information with each other online while browsing, which was not possible in Web 1.0 because it was a "read-only" web, whereas Web 2.0 extends it into a "read-write" web.

Google, Amazon, etc. are some examples of web 2.0.

According to Wikipedia-

"Web 2.0 is the business revolution in the computer industry caused by the move to the Internet as platform, and an attempt to understand the rules for success on that new platform."

Basically, Web 2.0 describes web-based applications that focus on user experience and collaboration.

Characteristics of Web 2.0

Web 2.0 has the following characteristics:

  • Data is shared through open-source code as well as open content.
  • It introduces web trends such as article sharing and blog posting, which keep expanding the range of Web 2.0 applications.
  • Web 2.0 delivers web-based applications.
  • It provides social networking capabilities.
  • Users of different websites are able to interact with the web-based application.
  • It is more user friendly, built on newer technologies such as AJAX.
  • It has vast potential for pioneering web applications.
  • It lets people interact with each other through those social networking capabilities.

The tools used here, such as RSS, social bookmarking and AJAX, are powerful and operate faster than the approaches implemented in earlier eras of the web.

In a traditional web application, when you click something you have to wait for the whole page to load, which is time consuming. In Web 2.0, AJAX has made applications popular because the result arrives without reloading the page, as in Google Maps.

Several different business models use the Web 2.0 concept. Some of them are:

  • Web development
  • Web designing
  • Search Engine Optimization
  • E-commerce

Web 2.0 is gaining popularity because of its simplicity and is becoming important for marketers; it redefines the market with new opportunities.

Because of the simplicity of the Web 2.0 concept, various websites are implementing Web 2.0 technologies and becoming successful. Some of them are:

  • YouTube.com: YouTube has gained popularity because of its user-friendly nature and simple concept. People can share video files with each other worldwide. Due to its popularity, it was sold to Google for about $1.65 billion.

  • MySpace.com: On this website anyone can create their own profile, friend list and personal homepage by adding text, images, videos and more, and share both their profile and their pages worldwide. It is one of the most visited websites online.

  • Wikipedia: A huge and famous encyclopedia available free online. It is a resource for everyone, and it can be updated and edited by anyone.

  • Digg.com: Digg.com is an example of social bookmarking; it lets people create friend lists and share websites, opinions and stories globally.

Web 2.0 design:

Web 2.0 designs are kept simple and use high-contrast colors; greens, blues, oranges and pinks appear frequently.

Three basic things are needed for a Web 2.0 design:

1. Text.
2. Object, and
3. Style

The rounded-corner style has made Web 2.0 design popular globally. A user-friendly nature plus clever visual design, layout and copywriting go a long way.

Web 2.0 font and color scheme:

Simple, clean, rounded font styles with pastel or high-contrast colors are used in Web 2.0.





Link Building Factors

Posted on 01:57 by Unknown

Link Building Factors:

  • Number 1: The traditional mutual linking between sites, termed two-way linking. However, many SEOs are of the opinion that Google has diminished the value of this strategy; there is also a belief that it works better when links are built with sites that have relevant content.

  • Number 2: The three-way link, also known as triangular linking, is the method of linking from site A to site B in exchange for a link back from a different site. It is a more developed, more modern link-building strategy than the two-way method, which has fallen out of practice because three-way links appear to Google like one-way links. However, some search engine optimizers believe that Google can identify three-way linking, and there is often a risk of penalties (a small detection sketch follows this list).

  • Number 3: A variant of the three-way method: the SEO offers a link from the SEO's own site, or from a different site, instead of a link from site A. The benefit of this type of link building is that site A needs no outbound links; however, some SEOs are of the opinion that site A might still suffer penalties from Google.

  • Number 4: Four-way linking is favored by a few webmasters. It has the same advantages and disadvantages as the methods in numbers 2 and 3 above.

  • Number 5: Article links: the SEO provides an informative article with links placed in the middle or at the end. These articles are given to webmasters free of cost as content for their sites, in return for a link to the client's site. Publishing the same article on many sites can cause problems, since the copies are duplicate content and the links in each copy are identical. Many of these articles end up submitted to "article farms". It is advisable to offer each webmaster a unique article, but producing many unique articles is often expensive.

  • Number 6: One-way links are also considered effective by some SEOs, even when they come from sites with low PageRank. However, SEOs are also of the opinion that one-way links from FFA (free-for-all) pages and low-quality directories are treated as spam by Google. It is essential to find related sites in order to avoid being marked as spam; even so, this method seldom works well.

  • Number 7: Buying links: SEOs like to buy text links that carry their keywords. Google's webmaster guidelines state that Google does not support buying links, and there is a way to report sites that buy them. A few SEOs think Google spreads this fear so that people buy AdWords instead of advertising elsewhere. SEOs also believe buying banner ads is safer than buying text links, since Google cannot treat them as spam as easily; if the text-based ads are marked as sponsored links, text ads are not a good idea. Some SEOs hold that if there is no keyword in the link, nothing is gained for reputation, although reputation can also be picked up from the site where the banner sits.

  • Number 8: Take A as the site we are trying to promote. The strategy is to create several information sites with relevant content on different IP C-blocks to "take the fall". Site A then receives links from those other sites, B, C and D, which take part in various linking activities in order to build PageRank. This might be considered spam, and the junkier those information sites are, the greater the chance that site A gets hit as well.

  • Number 9: You can also build links naturally. The best way is to create excellent content. This is an easy way to promote an informative site, but it is not ideal for a commercial site, and it is a time-consuming affair.

  • Number 10: A combination of all the above strategies. However, Google can punish a site for any one of them, and you may never know which strategy triggered it.
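As a side note to numbers 1 and 2 above, the sketch below (the link graph is invented) shows how easily a crawler can flag plain reciprocal, two-way links, which is one reason SEOs suspect Google discounts them.

    # Invented link graph: each site maps to the set of sites it links out to.
    links = {
        "A": {"B", "C"},
        "B": {"A"},        # A <-> B is a reciprocal (two-way) pair
        "C": {"D"},
        "D": {"A"},        # A -> C -> D -> A forms a three-way style triangle
    }

    # Reciprocal pairs: both sites link to each other.
    reciprocal = {
        tuple(sorted((a, b)))
        for a, targets in links.items()
        for b in targets
        if a in links.get(b, set())
    }
    print("reciprocal pairs:", reciprocal)   # {('A', 'B')}

Spotting three-way triangles only requires looking for cycles of length three in the same graph, which is why some SEOs doubt that triangular schemes stay hidden for long.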

Are there any other link-building strategies?

You can use interactive gimmicks and unique tools for link building: offer content that other sites can use, in exchange for a free link back to your own site. For instance, if a site operates in the gambling industry, other sites in that niche may ask for a calculator tool they can embed; these tools are often built in Flash, and the site providing them asks for backlinks in return. The offering has to be something that benefits other sites but cannot be conveniently replicated.



