Thursday, July 26, 2007

Seo Help

There are no "hooks" here. I spend a lot of time talking to new prospective clients on the phone or by email almost every day and that often eats into my production time. So, I figure if I can do a good job of explaining the basics, it will save us both time, keep my projects on schedule and let me make a little more money.
You might decide you don't need my help and that with this free information you want to do SEO yourself. Good for you, go for it. On the other hand, you might decide that you need my help or that you want help that goes beyond what you want to learn to do yourself. Good for both of us.

I have links to information in this page, and in the free SEO information I will send you, where I am an affiliate for the products or services I recommend. It is important to me to tell you that I actually USE these products and services. You can consider my recommendation a personal endorsement. If you click through and buy something, I get a commission, thank you. If you decide to sign up under me as a fellow affiliate for any of these goodies, I may get a little spiff when you sell something. So, this is an income opportunity for me and maybe for you, too.

Beyond saving time and making a little money, there is another motive for providing free SEO advice, and that is helping folks get started on their way to success with their website. Compared to 1995 when I built my first website, the internet has millions of times more content online, and new folks starting out face a really tough challenge to compete. It's not easy any more, which is probably why you are reading this. There are folks who helped me when I was starting out, and I am passing on the favor. Remember that movie with Helen Hunt, Pay It Forward?

I have another reason for doing this. Unfortunately, there are a lot of folks providing "services" like submitting your website to a ga-gillion search engines (a total waste of time and money), or folks who have hung out their SEO shingle who may know less than you do. Like any emerging and rapidly moving field, a lust for the opportunity to say I ARE ONE is not necessarily supported by knowledge or value to the customer. This often gives our SEO industry a bad name and makes folks skeptical about my abilities and my services. I don't like that. I want that to change.

So, now that you know where I am coming from, let me tell you a little about me so you will be able to judge the value of what I am giving away for free here.

When I put my first website online in late 1995 I started out doing SEO the hard way (meaning self taught): by trial and error, experimenting, reading and looking at what successful websites were doing. I got some good results, both for my own websites and for my clients. I realized some time in late 2004 that keeping up with all the changes taking place in techniques and algorithms was getting very, very challenging. Is this a FAST MOVING game? You betcha, and it's getting faster and more challenging all the time.

After considerable research, I decided to attend Search Engine Workshops and get Advanced SEO Certified. Search Engine Workshops' curriculum is university approved and eligible for college-level CEUs, or continuing education units. John Alexander and Robin Nobles do a great job and have become real friends as well as mentors. They treat their students like family. Let me tell you, it is great to know and be friends with gurus. Yes it is.

Robin and John teach only ethical SEO techniques and start at the beginning in their training programs for folks new to SEO. They have a 2-day course for beginners, followed by a 3-day course for advanced learners and an ultra-advanced workshop for folks who really know optimization. If you do the whole 5 days, I guarantee your brain will be in a kink for a couple of weeks. There is a lot to learn, they teach it well and they teach it all.

Friday, July 6, 2007

SEO Tips Use keywords in Title

One of the things I'm constantly amazed at is how many web publishers miss one of the easiest ways to maximize their positioning in search engines: simply including the keywords they'd like to be found for in their post titles.

I spend a lot of time looking at articles published on blogs, newspaper sites and other websites, and some days it seems that every second or third one has a title that is cryptic, clever or cute at the expense of Search Engine Optimization (SEO).

To put it bluntly - when it comes to blog SEO, I believe your page and post titles are incredibly important. Google in particular seems to value the words in your title very highly.

Whilst I too feel the temptation to be clever with my post titles from time to time (and sometimes give in to it), I know that if I don't get traffic from search engines, a fairly significant part of my income will disappear.

So if you're writing about a new 'Pink Widget', think about the words a potential reader would type into Google to find the information you're presenting. How would you search the net for information on 'Pink Widgets'?

Without a doubt we'd all include 'pink widget' in the search. We might refine it with a third word like 'price', 'review', 'advice' or 'problems' (which may be worthwhile words to include in the title or body of the content), but the most important words to include in the title are 'pink widgets'. If you don't include them, you've got virtually no chance of being found for that search term unless no one else is writing about them.
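
To make this concrete, here is a hedged sketch of the difference in a page's title tag (the wording is only an example, not a rule): the first version hands the search engines exactly the words people will type, the second gives them almost nothing to match against.

<!-- keyword-rich title -->
<title>Pink Widget Review: Price, Problems and Advice</title>

<!-- clever but unsearchable title -->
<title>You'll Never Guess What Arrived Today!</title>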

Keep in mind that research shows people search the web a lot for names of products and people, and that they are often quite specific in their searches. If you're writing about something specific, make your title reflect this.

Of course it's worth saying that it's not as simple as just stuffing your titles with keywords. First, they need to make sense (no one will click on a link in Google if it's a collection of unrelated words); second, if you put too many words in your title you run the risk of diluting their power and confusing the search engines; and third, you'll disillusion your regular readers if you stuff your titles with too many words.

My advice is to keep it simple - get to the point with your titles and try to put yourself in your reader's shoes.

Thursday, July 5, 2007

Search Engine Optimization India

Launched about six years ago, TARGET SEO India has the unique ability to focus on your markets, both domestic and international, to route select customers directly online or send them to your brick-and-mortar business. This work has also helped us understand the criteria, methodologies and natural SEO techniques suited to individual clients for achieving top search rankings on the major search engines. We envision, and our clients realize, a dramatic increase in online traffic and sales as a result of their targeted search engine promotion and Internet marketing.

Having developed Search Engine Marketing techniques and SEO promotion tools through our web promotion services, we help you prepare and construct your website prior to submitting it to the top search engines. Target SEO India offers Top 10, Top 20 and Top 30 ranking SEO services for Google, Yahoo and other search engines. Use them to ensure that your website achieves and retains a ranking position at the top of web searches. We believe that a Search Engine Optimization company should cater to the individual needs of the client and be readily available to handle any request at the drop of a hat. Our success is largely due to this tailored customer service, and to our commitment to be available any time for an SEO consultation. We are always happy to help establish the most successful and economical search engine optimization campaign to fit your SEO budget and achieve your long-term online goals.

India Seo Company

Search Engine Optimization, better known by the acronym SEO, can be a little difficult to describe as a service. We all know that the main goal of optimization is to improve a site's ranking position in the search engines and increase targeted traffic and business. In today's business world, the most successful businesses need to create effective and economical marketing and promotion for both their online and offline operations.

Tuesday, June 12, 2007

Sandbox Effect - Theory

The Sandbox Effect is a theory used to explain certain behaviors observed with some Internet search engines. It holds that websites with newly registered domains, or domains with frequent ownership or name server changes, are placed in a sandbox (a holding area) in Google's index until Google deems it appropriate for them to rank normally. Webmasters have noticed that such sites will only show up for keywords that are not competitive. The effect does not appear to apply to new pages unless the domain itself is in the sandbox.

The Sandbox Effect is a topic of hot debate among those interested in search engines and search engine optimization. There are many different opinions about it, including the view that the Sandbox Effect doesn't actually exist and that the ranking behavior can be explained as the result of a mathematical algorithm rather than a deliberate policy.

Although the exact details of the Sandbox Effect remain unclear to webmasters, there is general agreement that the Sandbox only applies to search results from Google, and that results from other search engines such as Yahoo or MSN are not affected.

Monday, March 5, 2007

The Traps of a Robots.txt File

When you start making more complicated files, i.e. when you decide to allow different user agents access to different directories, problems can start if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user agents, misspelled directories, missing colons after User-agent and Disallow, and so on. Typos can be tricky to find, but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/

The example above is from a robots.txt file that blocks all agents from the /temp/ directory and then gives Googlebot its own, more restrictive record. The trap is that a crawler obeys only the single record that matches it most specifically; the rules under User-agent: * are not added on top of a crawler-specific record. So Googlebot will follow only its own record. In this example that happens to be fine, because /temp/ is repeated there, but if you forget to repeat it, Googlebot will crawl /temp/ even though you think you have told all agents to stay out of it. You see, the structure of a robots.txt file is simple, but serious mistakes can still be made easily.
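
For contrast, here is a sketch of the version of the same file that actually springs the trap (the directory names are just placeholders): because the Googlebot record no longer repeats /temp/, Googlebot is free to crawl /temp/ even though the record for all other agents forbids it.

User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /cgi-bin/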

Friday, March 2, 2007

Notify Relevant Bloggers of your content

Whilst I don't advocate spamming other bloggers and asking for links, I would recommend that if you write a quality post on a topic you know will interest another blogger, it might be worth shooting them a short and polite email letting them know of your post. Don't be offended if they don't link up, but you might just find that they do, and that in addition to the direct traffic the link generates, it helps build your own page rank in the search engines.

Wednesday, February 28, 2007

Quality Content

There are all kinds of link generating systems out there but in my opinion the best way to get links to your blog is to write quality content that people will want to read. You can solicit links with others or sign up for different link building programs or even buy text links on other sites but the cheapest and probably safest approach is to build inbound links in a natural organic way as others link to your quality content.

Monday, February 26, 2007

Basic SEO tips for Bloggers


I've written previously on this topic, so rather than writing the same thing again in a slightly different way I will republish some of my previous tips below. I hope you find them useful.
Before I start, I'll say that while I do get a lot of search engine traffic, I'm not really an SEO expert (it isn't what I devote most of my time to). If you want to read something by someone who has spent a lot more time and effort on the topic, I recommend looking at an e-book by Aaron Wall - SEO Book (aff). I've actually been reading through it in the last few weeks (it's been on my to-do list for a long time) and have found it really helpful.
SEO experts tend to divide search engine optimization techniques into off site and on site techniques.

Tuesday, February 20, 2007

Robots.txt


It is great when search engines frequently visit your site and index your content, but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index those pages (although in this case the only sure way to keep sensitive data out of the index is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and JavaScript from indexing, you need a way to tell spiders to keep away from these items.
One way to tell search engines which files and folders on your Web site to avoid is with the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your wishes is to use a robots.txt file.
Syntax:
User-agent: *
Disallow:
This allows all search engines to spider the site and index all of its pages.
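
For example, here is a minimal sketch that keeps spiders out of a print-friendly version of your pages, assuming those pages live in a /print/ directory (the directory name is just an assumption for illustration), while leaving the rest of the site open:

User-agent: *
Disallow: /print/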

NOINDEX prevents anything on the page from being indexed.
NOFOLLOW prevents the crawler from following the links on the page and indexing the linked pages.
NOIMAGEINDEX prevents the images on the page from being indexed but the text on the page can still be indexed.
NOIMAGECLICK prevents the use of links directly to the images; instead there will only be a link to the page.
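
These directives belong in a Robots metatag in the page's head section; a minimal sketch combining two of them looks like this (use only the values you actually need, and keep in mind that the exact set of values a given engine honors varies):

<meta name="robots" content="noindex, nofollow">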

What Is Robots.txt?


Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of forcibly preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a note "Please do not enter" on an unlocked door - you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is naïve to rely on robots.txt to protect it from being indexed and displayed in search results.
The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.
The concept and structure of robots.txt were developed more than a decade ago, and if you are interested in learning more, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.
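
As a quick preview of that structure, here is a hedged sketch: a robots.txt file fetched from http://mydomain.com/robots.txt is simply a series of records, each made up of a User-agent line followed by one or more Disallow lines, with a blank line separating the records (the paths below are placeholders).

User-agent: *
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /images/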

Tuesday, February 13, 2007

How to generate quality inbound Links?


Of course, whilst most of us know this, it doesn't make getting such links any easier - it's in the hands of others in many cases. So how do you get such links?
· Quality Content - There are all kinds of link generating systems out there but in my opinion the best way to get links to your blog is to write quality content that people will want to read. You can solicit links with others or sign up for different link building programs or even buy text links on other sites but the cheapest and probably safest approach is to build inbound links in a natural organic way as others link to your quality content.
· Notify Relevant Bloggers of your content - Whilst I don't advocate spamming other bloggers and asking for links, I would recommend that if you write a quality post on a topic you know will interest another blogger, it might be worth shooting them a short and polite email letting them know of your post. Don't be offended if they don't link up, but you might just find that they do, and that in addition to the direct traffic the link generates, it helps build your own page rank in the search engines (more on letting other bloggers know of your posts here).
· Directories - Another way to generate inbound links is to submit your links to directories. I know of webmasters who swear by the benefits of such a strategy - the first thing they do when starting a new site is do the rounds of the directories, submitting links to key pages with appropriate keywords in the links. There are loads of directories out there, many of which offer free submission. Ari Paparo has compiled a list of blog directories that you might want to start with.
· Inter-link your Blogs - Increasingly, bloggers are starting or joining blog networks to enjoy the benefits of multiple sites and writers working together. One of the advantages of networks of sites is that they usually link to one another. In doing so you have complete control over how your sites are linked to from multiple domains. It is worth noting that you should be careful with this approach - if all your sites are hosted on the same server, many think the search engines will work out what you're doing and the impact will be lessened.
· Buy Links - Many professional webmasters have a budget to purchase links from other highly ranked and relevant sites. I won't go into this too much here, but you might like to read more about it in my recent post On Buying Text Links.
· Swap Links - Similarly, many bloggers swap links with other bloggers. Sometimes this happens pretty naturally (you see someone linking to you so you link back), but in many cases the links are strategic ones formally arranged between site owners. I get daily requests for such reciprocal links (I rarely act on them). Whilst there is some benefit in such link swapping, I would again advise caution here, as many SEO experts believe that the search engines have methods for tracking such strategies and devaluing the links. Some try to get around this by doing indirect or triangulated links, i.e. instead of site A and site B doing a direct swap, they involve other sites: A links to C in exchange for D (also owned by C's owner) linking to B (also owned by A's owner) - makes your head hurt, doesn't it!?! There are also a variety of systems around that say they'll take care of such interlinking for you - I know many who use Digital Point's free Co-Op Advertising system. Personally I tend to avoid such schemes and have a policy of linking to sites I think are valuable to my readers. If they link back, then so be it.
If you’re looking for link exchange/buying/selling programs you might like to look at systems like:

Monday, February 5, 2007

How to generate quality inbound Links?


Of course, whilst most of us know this, it doesn't make getting such links any easier - it's in the hands of others in many cases. So how do you get such links?
· Quality Content - There are all kinds of link generating systems out there but in my opinion the best way to get links to your blog is to write quality content that people will want to read. You can solicit links with others or sign up for different link building programs or even buy text links on other sites but the cheapest and probably safest approach is to build inbound links in a natural organic way as others link to your quality content.

Saturday, February 3, 2007

Off Site SEO Techniques


Off site SEO techniques are, as the name suggests, factors from outside the site itself (i.e. from other sites) that impact the blog's ranking in search engines. Many of these factors are outside the blogger's control; however, they are useful to know. The most obvious and probably most powerful offsite factor is Inbound Links (something I've already referred to above).
It is generally agreed that the links pointing to a website are one of the most powerful ways of climbing search engine results pages (in fact many argue it is THE most important factor). To put it most simply, every link to your site is seen by the search engines as a vote of confidence in your site.
Ideally speaking, the best inbound links have three main qualities:
1. they are from higher ranked sites than your own
2. they are relevant to the topic you are writing about
3. they link to you using keywords relevant to your page
Whilst you may not have complete control over who links to you, these are the types of links you should be dreaming of.
How to generate quality inbound Links?
Of course, whilst most of us know this, it doesn't make getting such links any easier - it's in the hands of others in many cases. So how do you get such links?

Wednesday, January 31, 2007

How to Optimize Your Blog for Search Engines

So you’re looking to increase the profitability of your blog for the Christmas period (and beyond). You’ve optimized your AdSense, Chitika and Affiliate programs, you’ve even written a little seasonal content…. but there’s one missing element…. Traffic.
Unless you actually have people viewing your blog, it is very difficult to earn anything from it.
So how do you drive traffic to your blog?
I've written quite a bit about this previously in a number of posts (for example here), but want to spend a little time today talking about Search Engine Optimization (SEO).

Tuesday, January 30, 2007

Accessibility for all users, even search engines

On further reflection, this overlap makes sense. The goal of accessibility is to make web content accessible to as many people as possible, including those who experience that content under technical, physical, or other constraints. It may be useful to think of search engines as users with substantial constraints: they can’t read text in images, can’t interpret JavaScript or applets, and can’t “view” many other kinds of multimedia content. These are the types of problems that accessibility is supposed to solve in the first place.
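
For instance, here is a minimal sketch of markup that works around two of those constraints (the file names and wording are just placeholders): the alt attribute gives both assistive technology and a search spider a text equivalent for the image, and the noscript block keeps a plain-HTML fallback visible when scripts cannot run.

<img src="pink-widget.jpg" alt="Pink widget, front view">
<noscript>
  <p>Read the <a href="/pink-widget.html">full Pink Widget product details</a>.</p>
</noscript>
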
Walking through a few checkpoints
Now that I've discussed the theory of why high accessibility overlaps with effective SEO, I will show how it does so. To do this, I am going to touch upon each Priority 1 checkpoint in the W3C Web Content Accessibility Guidelines that affects search engine optimization.

High Accessibility Is Effective Search Engine Optimization


Many web designers view search-engine optimization (SEO) as a "dirty trick," and with good reason: search engine optimizers often pollute search engine results with spam, making it harder to find relevant information when searching. But in fact, there is more than one type of search-engine optimization. In common usage, "black-hat" SEO seeks to achieve high rankings in search engines by any means possible, whereas "white-hat" SEO seeks to code web pages in a way that is friendly to search engines.

In Using XHTML/CSS for an Effective SEO Campaign, Brandon Olejniczak explains that many web design best practices overlap with those of white-hat SEO. The reason is simple: such practices as separating style from content, minimizing obtrusive JavaScript, and streamlining code allow search engines to more easily spider, index, and rank web pages.

Two years later, I am going to take Brandon's conclusions a step further. I have been a search engine optimizer for several years, but only recently have become infatuated with web accessibility. After reading for weeks and painstakingly editing my personal website to comply with most W3C Web Content Accessibility Guidelines, I have come to a startling revelation: high accessibility overlaps heavily with effective white-hat SEO.

Monday, January 15, 2007

STUDY EASY SEARCH ENGINE OPTIMIZATION

Search engine optimization (SEO) as a subset of search engine marketing seeks to improve the number and quality of visitors to a web site from "natural" ("organic" or "algorithmic") search results. The quality of visitor traffic can be measured by how often a visitor using a specific keyword leads to a desired conversion action, such as making a purchase or requesting further information. In effect, SEO is marketing by appealing first to machine algorithms to increase search engine relevance and secondly to human visitors. The term SEO can also refer to "search engine optimizers", an industry of consultants who carry out optimization projects on behalf of clients.
Search engine optimization is available as a stand-alone service or as a part of a larger marketing campaign. Because SEO often requires making changes to the source code of a site, it is often most effective when incorporated into the initial development and design of a site, leading to the use of the term "Search Engine Friendly" to describe designs, menus, Content management systems and shopping carts that can be optimized easily and effectively.
A range of strategies and techniques are employed in SEO, including changes to a site's code (referred to as "on page factors") and getting links from other sites (referred to as "off page factors"). These techniques include two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who utilize them, as either "white hat SEO", or "black hat SEO". Other SEOs reject the black and white hat dichotomy as an over-simplification.