A simple definition of search engine optimisation in 2015 is that it is a technical and creative process to improve the visibility of a website in search engines, with the aim of driving more potential customers to it.
These free SEO tips will help you create a successful, SEO-friendly website yourself, based on my 15 years' experience of making websites rank in Google. If you need optimisation services, see my SEO audit.
This is a beginner's guide to effective white hat SEO. I deliberately steer clear of techniques that might be grey hat, as what is grey today is often black hat tomorrow, as far as Google is concerned.
No one-page guide can explore this complex topic in full. What you'll read here is how I approach the basics, and these are the basics as far as I remember them. At least, these are answers to questions I had when I was starting out in this field. And things have changed since I started this company in 2006.
Google insists webmasters adhere to their rules, and aims to reward sites with high quality content and remarkable white hat web marketing techniques with high rankings. Conversely, it also needs to penalise websites that manage to rank in Google by breaking these rules.
These rules are not laws, but guidelines for ranking in Google, laid down by Google. You should note that some methods of ranking in Google are, in fact, actually illegal. Hacking, for instance, is illegal.
You can choose to follow and abide by these rules, bend them, or ignore them; all with different levels of success (and levels of retribution from Google's web spam team). White hats do it by the rules; black hats ignore the rules.
What you read in this article is perfectly within the laws and within the guidelines, and will help you increase the traffic to your website through organic, or natural, search engine results pages (SERPS).
While there are a lot of definitions of SEO (spelled 'search engine optimisation' in the UK, Australia and New Zealand, or 'search engine optimization' in the United States and Canada), organic SEO in 2015 is mostly about getting free traffic from Google, the most popular search engine in the world (and the only game in town in the UK).
The guide you are reading is for the more technically minded.
The art of web SEO is understanding how people search for things, and understanding what type of results Google wants to (or will) display to its users. It's about putting a lot of things together to look for opportunity.
A good optimiser has an understanding of how search engines like Google generate their natural SERPS to satisfy users' NAVIGATIONAL, INFORMATIONAL and TRANSACTIONAL keyword queries.
A good search engine marketer has a good understanding of the short term and long term risks involved in optimising rankings in search engines, and an understanding of the type of content and sites Google (especially) WANTS to return in its natural SERPS.
The aim of any campaign is increased visibility in search engines.
There are rules to be followed or ignored, risks to be taken, gains to be made, and battles to be won or lost.
A Mountain View spokesman once called the search engine 'kingmakers', and that's no lie.
Ranking high in Google is VERY VALUABLE: it's effectively free advertising in the best advertising space in the world.
Traffic from Google natural listings is STILL the most valuable organic traffic to a website in the world, and it can make or break an online business.
The state of play, still, is that you can generate your own highly targeted leads, for FREE, just by improving your website and optimising your content to be as relevant as possible for a customer looking for your company, product or service.
As you can imagine, theres a LOT of competition now for that free traffic even from Google (!) in some niches.
The process can be successfully practised in a bedroom or a workplace, but it has traditionally involved mastering many skills as they arose, including diverse marketing technologies, including but not limited to:
- website design
- accessibility
- usability
- user experience
- website development (PHP, HTML, CSS, etc.)
- server management
- domain management
- copywriting
- spreadsheets
- backlink analysis
- keyword research
- social media promotion
- software development
- analytics and data analysis
- information architecture
- looking at Google for hours on end
It takes a lot, in 2015, to rank a page on merit in Google in competitive niches, and the stick Google is hitting every webmaster with (at the moment, and for the foreseeable future) is the QUALITY USER EXPERIENCE stick.
If you expect to rank in Google in 2015, you'd better have a quality offering, not one based entirely on manipulation, or old school tactics.
Is a visit to your site a good user experience? If not, beware MANUAL QUALITY RATERS, and BEWARE the GOOGLE PANDA algorithm, which is looking for signs of poor user experience and low quality content.
Google raising the quality bar ensures a higher level of quality in online marketing in general (above the very low quality we've seen over recent years).
Success online involves HEAVY INVESTMENT in on-page content, website architecture, usability, conversion optimisation balance, and promotion.
If you don't take that route, you'll find yourself chased down by Google's algorithms at some point in the coming year.
This 'what is SEO' guide (and this entire website) is not about the churn-and-burn type of Google SEO (known as webspam to Google).
What Is A Successful Strategy?
Get relevant. Get trusted. Get Popular.
It's no longer just about manipulation. It's about adding quality, and often utilitarian, content to your website which meets a PURPOSE that delivers USER SATISFACTION.
If you are serious about getting more free traffic from search engines, get ready to invest time and effort into your website and online marketing.
Google wants to rank QUALITY documents in its results, forcing those who want to rank high to invest in great content, or great service, that attracts editorial links from other reputable websites.
If you're willing to add a lot of great content to your website, and create buzz about your company, Google will rank you high. If you try to manipulate Google, it will penalise you for a period of time, and often until you fix the offending issue, which we know can LAST YEARS.
Backlinks in general, for instance, are STILL weighed FAR too positively by Google, and they can be manipulated to drive a site to the top positions for a while. That's why black hats do it, and they have the business model to do it. It's still the easiest way to rank a site, even today. But if you are a real business that intends to build a brand online, you can't use black hat methods. Full stop.
Google Rankings Are In Constant Ever-Flux
It's Google's job to MAKE MANIPULATING SERPS HARD.
So the people behind the algorithms keep moving the goalposts, modifying the rules and raising quality standards for pages that compete for top ten rankings. In 2015 we have ever-flux in the SERPS, and that seems to suit Google and keep everybody guessing.
Google is very secretive about its 'secret sauce', and offers sometimes helpful, sometimes vague, and (some say) sometimes misdirecting advice about how to get more valuable traffic from Google.
Google is on record as saying the engine is intent on frustrating search engine optimisers' attempts to improve the amount of high quality traffic to a website; at least (but not limited to) those using low quality strategies classed as web spam.
At its core, Google search engine optimisation is about KEYWORDS and LINKS. It's about RELEVANCE, REPUTATION and TRUST. It is about QUALITY OF CONTENT and VISITOR SATISFACTION. A good USER EXPERIENCE is the end goal.
Relevance, Authority & Trust
Web page optimisation is about making a web page relevant enough for a query, and trusted enough to rank for it.
It's about ranking for valuable keywords for the long term, on merit. You can play by the white hat rules laid down by Google, or you can choose to ignore those, go black hat, and be a spammer. MOST SEO tactics still work, for some time, on some level, depending on who's doing them, and how the campaign is deployed.
Whichever route you take, know that if Google catches you trying to modify your rank using overtly obvious and manipulative methods, then it will class you as a web spammer, and your site will be penalised (normally you will not rank high for important keywords).
These penalties can last years if not addressed; some penalties expire, some do not, and Google wants you to clean up any violations.
Google does not want you to try and modify your rank. Critics would say Google would prefer you paid them to do that, using Google AdWords.
The problem for Google is that ranking high in Google organic listings is real social proof for a business, a way to avoid PPC costs, and still, simply, the BEST WAY to drive REALLY VALUABLE traffic to a site.
It's FREE, too, once you've met the always-increasing criteria it takes to rank top.
In 2015, you need to be aware that what works to improve your rank can also get you penalised (faster, and a lot more noticeably).
In particular, the Google web spam team is currently waging a PR war on sites that rely on unnatural links and other manipulative tactics (and handing out severe penalties if it detects them); and that's on top of the many algorithms already designed to look for other manipulative tactics (like keyword stuffing).
Google is making sure it takes longer to see results from black and white hat SEO, and is intent on ensuring a flux in its SERPS, based largely on where the searcher is in the world at the time of the search, and how near the business is to that searcher.
There are some things you cannot directly influence legitimately to improve your rankings, but there is plenty you CAN do to drive more Google traffic to a web page.
Google has HUNDREDS of ranking factors, with signals that can change daily, to determine where your page ranks in comparison to other competing pages.
You will never find them all. Many ranking factors are on-page or on-site, and some are off-page or off-site. Some are based on where you are, or what you have searched for before.
I've been in online marketing for 15 years. In that time, I've learned to focus on optimising elements in campaigns that offer the greatest return on investment of one's labour.
Learn SEO Basics.
Here are a few simple SEO tips to begin with:
- If you are just starting out, don't think you can fool Google about everything all the time. Google has VERY probably seen your tactics before. So, it's best to keep your plan simple. GET RELEVANT. GET REPUTABLE. Aim for a good, satisfying visitor experience.
- If you are just starting out, you may as well learn how to do it within Google's Webmaster Guidelines first. Make a decision, early, on whether you are going to follow Google's guidelines or not, and stick to it. Don't be caught in the middle with an important project. Do not always follow the herd.
- If your aim is to deceive visitors from Google, in any way, Google is not your friend. Google is hardly your friend at any rate, but you don't want it as your enemy. Google will send you lots of free traffic, though, if you manage to get to the top of search results, so perhaps they are not all that bad.
- A lot of optimisation techniques that are effective in boosting a site's rankings in Google are against Google's guidelines. For example, many links that may have once promoted you to the top of Google may, in fact, today be hurting your site and its ability to rank high in Google. Keyword stuffing might be holding your page back. You must be smart, and cautious, when it comes to building links to your site in a manner that Google *hopefully* won't have too much trouble with in the FUTURE; because they will punish you in the future.
- Don't expect to rank number 1 in any niche for a competitive keyword without a lot of investment and work. Don't expect results overnight. Expecting too much too fast might get you in trouble with the spam team.
- You don't pay anything to get into Google, Yahoo or Bing natural, or free, listings. It's common for the major search engines to find your website pretty easily by themselves within a few days.
- This is made so much easier if your website actually pings search engines when you update content (via XML sitemaps or RSS, for instance).
- To be listed and rank high in Google and other search engines, you really should consider, and largely abide by, search engine rules and official guidelines for inclusion. With experience, and a lot of observation, you can learn which rules can be bent, and which tactics are short term and, perhaps, should be avoided.
- Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from a page to another page is viewed in Google's eyes as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it, in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.
- I've always thought that if you are serious about ranking, do so with ORIGINAL COPY. It's clear search engines reward good content they haven't found before. They index it blisteringly fast, for a start (within a second, if your website isn't penalised!). So make sure each of your pages has enough text content that you have written specifically for that page, and you won't need to jump through hoops to get it ranking.
- If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBLs). If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well, as Google favours diversity in its results. If you have decent original content on your site, you can then let authority websites (those with online business authority) know about it, and they might link to you; this is called a quality backlink.
- Search engines need to understand that a link is a link. Links can be designed to be ignored by search engines with the rel nofollow attribute.
- Search engines can also find your site via other websites linking to it. You can also submit your site to search engines directly, but I haven't submitted any site to a search engine in the last 10 years; you probably don't need to do that. If you have a new site, I would immediately register it with Google Webmaster Tools these days.
- Google and Bing use crawlers (Googlebot and Bingbot) that spider the web looking for new links to crawl. These bots might find a link to your home page somewhere on the web, and then crawl and index the pages of your site, if all your pages are linked together in almost any way. If your website has an XML sitemap, for instance, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE: Google will crawl and index every single page on your site, even pages outwith the XML sitemap.
- Many think Google will not allow new websites to rank well for competitive terms until the web address ages and acquires trust in Google; I think this depends on the quality of the incoming links. Sometimes your site will rank high for a while, then disappear for months. A 'honeymoon period' to give you a taste of Google traffic, no doubt.
- Google WILL classify your site when it crawls and indexes it, and this classification can have a DRASTIC effect on your rankings. It's important for Google to work out WHAT YOUR ULTIMATE INTENT IS. Do you want to be classified as an affiliate site made just for Google, a domain holding page, or a small business website with a real purpose? Ensure you don't confuse Google: be explicit with all the signals you can on your website to show you are a real business, and that your INTENT is genuine and, even more importantly today, FOCUSED ON SATISFYING A VISITOR.
- NOTE: if a page exists only to make money from Google's free traffic, Google calls this spam.
I go into this more, later in this guide.
- The transparency you provide on your website, in text and links, about who you are, what you do, and how you're rated on the web or as a business, is one way that Google could use (algorithmically and manually) to rate your website. Note that Google has a HUGE army of quality raters, and at some point they will be on your site if you get a lot of traffic from Google.
- To rank for specific keyword phrase searches, you generally need to have the keyword phrase, or highly relevant words, on your page (not necessarily all together, but it helps) or in links pointing to your page/site.
- Ultimately, what you need to do to compete is largely dependent on what the competition for the term you are targeting is doing. You'll need to at least mirror how hard they are competing, if a better opportunity is hard to spot.
- As a result of other quality sites linking to your site, the site now has a certain amount of real PageRank that is shared with all the internal pages that make up your website, and that will help provide a signal as to where a page ranks in the future.
- Yes, you need to build links to your site to acquire more PageRank, or 'Google juice', or what we now call domain authority or trust. Google is a links-based search engine: it does not quite understand 'good' or 'quality' content, but it does understand popular content. It can also usually identify poor, or THIN, content, and it penalises your site for that; or at least it takes away the traffic you once had, with an algorithm change. Google doesn't like calling the actions it takes a 'penalty'; it doesn't look good. It blames your ranking drops on its engineers getting better at identifying quality content or links, or the inverse: low quality content and unnatural links. If Google does take action on your site for paid links, it calls this a 'Manual Action', and you will get notified about it in Webmaster Tools, if you sign up.
- Link building is not JUST a numbers game, though. One link from a trusted authority site in Google could be all you need to rank high in your niche. Of course, the more trusted links you build, the more trust Google will have in your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to legitimately get the most from Google.
- Try and get links within page text pointing to your site with relevant, or at least natural looking, keywords in the text link; not, for instance, in blogrolls or site-wide links. Try to ensure the links are not obviously machine generated, e.g. site-wide links on forums or directories. Get links from pages that, in turn, have a lot of links to them, and you will soon see the benefits.
- On-site, consider linking to your other pages by linking to them within text. I usually only do this when it is relevant; often, I'll link to relevant pages when the keyword is in the title elements of both pages. I don't really go in for auto-generating links at all. Google has penalised sites for using particular auto link plugins, for instance, so I avoid them.
- Linking to a page with actual key-phrases in the link helps a great deal in all search engines when you want to feature for specific key-terms; for example, 'seo scotland', as opposed to http://www.hobo-web.co.uk or 'click here'. Saying that, in 2015, Google is punishing manipulative anchor text very aggressively, so be sensible, and stick to brand links and plain URL links that build authority with less risk. I rarely ever optimise for grammatically incorrect terms these days (especially in links).
- I think the anchor text of links in internal navigation is still valuable, but keep it natural. Google needs links to find and help categorise your pages. Don't underestimate the value of a clever, keyword-rich internal link architecture, and be sure to understand, for instance, how many words Google counts in a link; but don't overdo it. Too many links on a page could be seen as a poor user experience.
- Avoid lots of hidden links in your template navigation.
- Search engines like Google spider, or crawl, your entire site by following all the links on your site to new pages, much as a human would click on the links of your pages. Google will crawl and index your pages and, normally within a few days, begin to return your pages in SERPS.
- After a while, Google will know about your pages and keep the ones it deems useful: pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful: too many low quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low quality content.
- Ideally, you will have unique pages, with unique page titles and unique page descriptions, if you deem to use the latter. Google does not seem to use the meta description when actually ranking your page for specific keyword searches if it is not relevant; and unless you are careful, you might end up just giving spammers free original text for their site, and not yours, once they scrape your descriptions and put the text in the main content on their site. I don't worry about meta keywords these days, as Google and Bing say they either ignore them or use them as spam signals.
- Google will take some time to analyse your entire site, examining text content and links. This process is taking longer and longer these days, but is ultimately determined by your domain authority / real PageRank, as Google determines it.
- If you have a lot of duplicate, low quality text already found by Googlebot on other websites it knows about, Google will ignore your page. If your site or page has spammy signals, Google will penalise it, sooner or later. If you have lots of these pages on your site, Google will ignore your efforts.
- You don't need to keyword stuff your text, and look dyslexic, to beat the competition.
- You optimise a page for more traffic by increasing the frequency of the desired key phrase, related key terms, and co-occurring keywords and synonyms, in links, page titles and text content. There is no ideal amount of text, and no magic keyword density. Keyword stuffing is a tricky business, too, these days.
- I prefer to make sure I have as many UNIQUE relevant words on the page as possible, making up as many relevant long tail queries as possible.
- If you link out to irrelevant sites, Google may ignore the page, too; but again, it depends on the site in question. Who you link to, and HOW you link, REALLY DOES MATTER: I expect Google to use your linking practices as a potential means by which to classify your site. Affiliate sites, for example, don't do well in Google these days without some good quality backlinks.
- Many search engine marketers think that who you link out to (and who links to you) helps determine a topical community of sites in any field, or a hub of authority. Quite simply, you want to be in that hub, at the centre if possible (however unlikely), but at least in it. I like to think of this as a good thing to remember for the future, as search engines get even better at determining the topical relevancy of pages, but I have never really seen any granular ranking benefit (for the page in question) from linking out.
- I've got by by thinking that external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your 'Google Juice' once it has been soaked up by the higher pages in your site structure (the home page, your category pages). This is old school, but it still gets me by. I don't think you really need to worry about that in 2015.
- Original content is king and will attract natural link growth, in Google's opinion.
- Too many incoming links, too fast, might devalue your site; but again, I usually err on the safe side. I have always aimed for massive diversity in my links, to make them look more natural. Honestly, I go for natural links in 2015, full stop.
- Google can devalue whole sites, individual pages, template generated links and individual links, if it deems them unnecessary and a poor user experience.
- Google knows who links to you, the quality of those links, and whom you link to. These and other factors help ultimately determine where a page on your site ranks. To make it more confusing, the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority, sometimes it seems that the most relevant page on your site that Google HAS NO ISSUE with will rank.
- Google decides which pages on your site are important or most relevant. You can help Google by linking to your important pages, and by ensuring at least one page is really well optimised, amongst the rest of your pages, for your desired key phrase. Always remember that Google does not want to rank thin pages in results; any page you want to rank should have all the things Google is looking for. (PS: that's a lot these days!)
- It is important you spread all that real PageRank, or link equity, to your keyword / phrase rich sales pages, while as much as possible remains with the rest of the site's pages, so that Google does not demote pages into oblivion, or the 'supplemental results' as we old timers knew them back in the day. Again, this is slightly old school, but it gets me by, even today.
- Consider linking to important pages on your site from your home page, and from other important pages on your site.
- Focus on RELEVANCE first. Then focus your marketing efforts and get REPUTABLE. This is the key to ranking legitimately in Google in 2015.
- Every few months, Google changes its algorithm to punish sloppy optimisation or industrial manipulation. Google Panda and Google Penguin are two such updates, but the important thing is to understand that Google changes its algorithms constantly to control its listings pages (over 600 changes a year, we are told).

The art of rank modification is to rank without tripping these algorithms, or getting flagged by a human reviewer; and that is tricky!
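On the XML sitemap tip above: a sitemap is just a plain XML file listing the URLs you want crawled. As a rough sketch (the example.com URLs and dates are hypothetical, and a real site would usually generate this from its CMS or database), you could build one like this:

```python
# Minimal XML sitemap generator -- a sketch, not a full implementation.
# The URLs and lastmod dates below are hypothetical examples.
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    """pages: list of (url, lastmod) tuples -> sitemap XML string."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("http://www.example.com/", "2015-06-01"),
    ("http://www.example.com/services/", "2015-05-20"),
]
print(build_sitemap(pages))
```

Upload the output as sitemap.xml in your site root and register it in Google Webmaster Tools; and remember, as noted above, a sitemap is inclusive, not exclusive.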
Welcome to the tightrope that is modern web optimisation.
Read on if you would like to learn how to do SEO.
Keyword Research is ESSENTIAL
The first step in any professional campaign is to do some keyword research and analysis.
Somebody asked me about this: a simple white hat tactic, and I think it is probably the simplest thing anyone can do that guarantees results.
The chart above (from last year) illustrates a reasonably valuable four-word term that I noticed a page of mine didn't rank high in Google for, but which I thought probably should, and could, rank for with this simple technique.
I chose it as a simple example to illustrate an aspect of on-page SEO, or 'rank modification', that's white hat, 100% Google friendly and never, ever going to cause you a problem with Google. This trick works with any keyword phrase, on any site, with obviously differing results based on the availability of competing pages in SERPS, and the availability of content on your site.
The keyword phrase I am testing rankings for isn't ON the page, and I did NOT add the key phrase to the page, or to incoming links, or use any technical tricks like redirects or any hidden technique; but as you can see from the chart, rankings seem to be going in the right direction.
You can profit from it if you know a little about how Google works (or seems to work, in many observations, over years, excluding when Google throws you a bone on synonyms). You can't ever be 100% certain you know how Google works on any level, unless it's data showing you're wrong, of course.
What did I do to rank number 1 from nowhere for that key phrase?
I added one keyword to the page in plain text, because adding the actual keyword phrase itself would have made my text read a bit keyword stuffed for other variations of the main term. It gets interesting if you do that to a lot of pages, and a lot of keyword phrases. The important thing is keyword research, and knowing which unique keywords to add.
This illustrates that a key to relevance is a keyword. The right keyword.
Yes, plenty of other things can be happening at the same time. It's hard to identify EXACTLY why Google ranks pages all the time, but you can COUNT on other things happening, and just get on with what you can see works for you.
In a time of light optimisation, it's useful to earn a few terms you SHOULD rank for in simple ways that leave others wondering how you did it.
Of course, you can still keyword stuff a page, or spam your link profile; but it is light optimisation I am genuinely interested in testing on this site: how to get more with less. I think that's the key to not tripping Google's aggressive algorithms.
There are many tools on the web to help with basic keyword research (including the Google Keyword Planner tool, and there are even more useful third party SEO tools to help you do this).
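As a rough illustration of the kind of triage these tools support, here is a minimal Python sketch. The keywords, volumes, positions and thresholds below are all made-up examples; in practice you would export similar data from your keyword tool of choice:

```python
# Sketch: flag "opportunity" keywords -- decent search volume, ranking just
# off the first page (positions 11-30), where on-page work may pay off.
# All keywords, volumes and positions below are hypothetical.

queries = [
    # (keyword, monthly searches, current Google position)
    ("seo tips", 1900, 14),
    ("what is seo", 4400, 3),
    ("seo audit service", 320, 25),
    ("free seo tools", 2900, 55),
]

def opportunities(rows, min_searches=500, best=10, worst=30):
    """Keywords with enough volume, ranked between positions best and worst."""
    picks = [(kw, vol, pos) for kw, vol, pos in rows
             if vol >= min_searches and best < pos <= worst]
    # Highest-volume opportunities first.
    return sorted(picks, key=lambda row: -row[1])

for kw, vol, pos in opportunities(queries):
    print(f"{kw}: {vol} searches/month, currently position {pos}")
```

The idea is simply to surface pages sitting just off page one, where a small on-page improvement (like the single-keyword trick described above) may be the best return on effort.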
You can use many keyword research tools to quickly identify opportunities to get more traffic to a page:
Google Analytics Keyword Not Provided
Google Analytics was the very best place to look at keyword opportunity for some (especially older) sites, but that all changed a few years back.
Google stopped telling us which keywords were sending traffic to our sites from the search engine back in October 2011, as part of privacy concerns for its users:
Google will now begin encrypting searches that people do by default, if they are logged into Google.com already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive referrer data that reveals what those people searched for, except in the case of ads.
Google Analytics now displays 'keyword not provided' instead.
In Google's new system, referrer data will be blocked. This means site owners will begin to lose valuable data that they depend on to understand how their sites are found through Google. They'll still be able to tell that someone came from a Google search. They won't, however, know what that search was. (SearchEngineLand)
You can still get some of this data if you sign up for Google Webmaster Tools (and you can combine this in Google Analytics), but the data even there is limited, and often not entirely accurate. The keyword data can be useful, though, and access to backlink data is essential these days.
If the website you are working on is an aged site, there's probably a wealth of keyword data in Google Analytics:
This is another example of Google making ranking in organic listings HARDER; a change 'for users' that seems to have the most impact on marketers outside of Google's ecosystem. Yes: search engine optimisers.
Now, consultants need to be page-centric (abstract, I know), instead of just keyword-centric, when optimising a web page for Google. There are now plenty of third party tools that help when researching keywords, but most of us miss the kind of keyword intelligence we used to have access to.
Proper keyword research is important because getting a site to the top of Google eventually comes down to your text content on a page, and the keywords in external and internal links. Altogether, Google uses these signals to determine where you rank, if you rank at all.
There's no magic bullet to this.
At any one time, your site is probably feeling the influence of some sort of algorithmic filter (for example, Google Panda or Google Penguin) designed to keep spam sites under control and deliver relevant, high quality results to human visitors.
One filter may be kicking in keeping a page down in the SERPS, while another filter is pushing another page up. You might have poor content but excellent incoming links, or vice versa. You might have very good content, but a very poor technical organisation of it.
Try and identify the reasons Google doesn't rate a particular page higher than the competition; the answer is usually on the page, or in the backlinks pointing to the page.
- Do you have too few inbound quality links?
- Do you have too many?
- Does your page lack descriptive, keyword rich text?
- Are you keyword stuffing your text?
- Do you link out to irrelevant sites?
- Do you have too many advertisements above the fold?
- Do you have affiliate links on every page of your site, and text found on other websites?
Whatever they are, identify issues and fix them.
Get on the wrong side of Google and your site might well be flagged for MANUAL review, so optimise your site as if, one day, you will get that website review from a Google Web Spam reviewer.
The key to a successful campaign, I think, is persuading Google that your page is most relevant to any given search query. You do this by good unique keyword rich text content and getting quality links to that page. The latter is far easier to say these days than actually do!
Next time you're developing a page, consider that what looks spammy to you is probably spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages on the site are emphasised in the site architecture? Which pages would you ignore?
You can help a site along in any number of ways (including making sure your page titles and meta tags are unique) but be careful. Obvious evidence of rank modifying is dangerous.
I prefer simple seo techniques, and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand at least some of the reasons why a page ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high quality text content relevant and suitable for both these audiences, you'll more than likely find success in organic listings, and you might not ever need to get into the technical side of things, like redirects and search engine friendly URLs.
To beat the competition in an industry where it's difficult to attract quality links, you have to get more technical sometimes; and in some industries you've traditionally needed to be 100% black hat to even get in the top 100 results of competitive, transactional searches.
There are no hard and fast rules to long term ranking success, other than developing quality websites with quality content and quality links pointing to them. The less domain authority you have, the more text you're going to need. The aim is to build a satisfying website and build real authority!
You need to mix it up and learn from experience. Make mistakes and learn from them by observation. I've found getting penalised is a very good way to learn what not to do.
Remember there are exceptions to nearly every rule in an ever fluctuating landscape, and you probably have little chance of determining exactly why you rank in search engines these days. I've been doing it for over 10 years, and every day I'm trying to better understand Google, to learn more and learn from others' experiences.
It's important not to obsess about granular ranking specifics that have little return on your investment, unless you really have the time to do so! THERE IS USUALLY SOMETHING MORE VALUABLE TO SPEND THAT TIME ON. That's usually either good backlinks or great content.
The fundamentals of successful optimisation, while refined, have not changed much over the years, although Google does seem a LOT better than it was at rewarding pages with some reputation signals and satisfying content / usability.
Google isn't lying about rewarding legitimate effort, despite what some claim. If they were, I would be a black hat full time. So would everybody else trying to rank in Google.
The majority of small to medium businesses do not need advanced strategies because their direct competition generally has not employed these tactics either.
Most strategies are pretty simple in construct.
I took a medium sized business to the top of Google recently for very competitive terms, doing nothing but ensuring page titles were optimised, re-writing the home page text, and earning one or two links from trusted sites.
This site was a couple of years old, with a clean record in Google, and already had a couple of organic links from trusted sites.
This domain had the authority and trust to rank for some valuable terms, and all we had to do was to make a few changes on site, improve the depth and focus of website content, monitor keyword performance and tweak.
There was a little duplicate content needing sorting out and a bit of canonicalisation of thin content to resolve, but none of the measures I implemented I'd call advanced.
A lot of businesses can get more converting visitors from Google simply by following basic principles and best practices:
- always make sure that every page in the site links out to at least one other page in the site
- link to your important pages often
- link not only from your navigation, but from keyword rich text links in text content; keep this natural and for visitors
- try to keep each page element and content as unique as possible
- build a site for visitors to get visitors, and you just might convert some to actual sales too
- create keyword considered content on the site people will link to
- watch which sites you link to and from what pages, but do link out!
- go and find some places on relatively trusted sites to try and get some anchor text rich inbound links
- monitor trends, check stats
- minimise duplicate or thin content
- bend a rule or two without breaking them and you'll probably be ok

Once this is complete, it's time to add more, and better, content to your site and tell more people about it, if you want more Google love.
OK, so you might have to implement the odd 301, but again, it's hardly advanced.
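A 301 is simply a permanent redirect from an old URL to a new one. As a minimal sketch (the URLs and function name here are hypothetical, not from any particular site), the logic behind a redirect map is just a lookup:

```python
# Minimal sketch of a 301 redirect map. A 301 (permanent) redirect tells
# Google the old URL has moved for good, so the new address inherits it.
REDIRECTS = {
    "/old-services.html": "/services/",
    "/seo-tips.php": "/seo-tutorial/",
}

def redirect_for(path):
    """Return a (status, location) pair for a request path, or None."""
    new_path = REDIRECTS.get(path)
    if new_path is not None:
        return (301, new_path)  # permanent redirect
    return None  # no redirect; serve the page as normal

print(redirect_for("/old-services.html"))  # (301, '/services/')
```

In practice you would usually configure this in the web server itself (an .htaccess rule on Apache, for example) rather than in application code, but the idea is the same.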
I've seen simple seo marketing techniques working for years.
You are better off doing simple stuff better and faster than worrying about some of the more advanced techniques you read about on some blogs. I think it's more productive, cost effective and safer, for most businesses.
Pseudoscience is a claim, belief, or practice posing as science, but which does not constitute or adhere to an appropriate scientific methodology.
Beware folk trying to bamboozle you with science. This isn't a science, when Google controls the laws and changes them at will.
You see, I have always thought that optimisation was about:
- looking at Google rankings all night long
- keyword research
- observations about ranking performance of your own pages and that of others (though not in a controlled environment)
- putting relevant, co-occurring words you want to rank for on pages
- putting words in links to pages you want to rank for
- understanding that what you put in your title is what you are going to rank best for
- getting links from other websites pointing to yours
- getting real quality links that will last, from sites that are pretty trustworthy
- publishing lots and lots of content (did I say lots? I meant tons)
- focusing on the long tail of search!!!
- understanding it will take time to beat all this competition
I always expected to get a site demoted by:
- getting too many links with the same anchor text pointing to a page
- keyword stuffing a page
- trying to manipulate Google too much on a site
- creating a frustrating user experience
- chasing the algorithm too much
- getting links I shouldn't have
- buying links
Not that any of the above is automatically penalised all the time.
I was always of the mind that I don't need to understand the maths or science of Google all that much to understand what Google engineers want.
The biggest challenge these days is to get really trusted sites to link to you, but the rewards are worth it.
To do it, you probably should be investing in some sort of marketable content, or compelling benefits for the linking party (that's not just paying for links; somebody else can always pay more). Buying links to improve rankings WORKS, but it is probably THE most hated link building technique as far as the Google Webspam team is concerned.
I was very curious about the science of optimisation; I studied what I could, but it left me a little unsatisfied. I learned that building links, creating lots of decent content and learning how to monetise that content better (whilst not breaking any major TOS of Google) would have been a more worthwhile use of my time.
Getting better and faster at doing all that would be nice too.
There are many problems with blogs, too, including mine.
Misinformation is an obvious one. Rarely are your results conclusive or observations 100% accurate, even if you think a theory holds water on some level. I try to update old posts with new information if I think the page is only valuable with accurate data.
Just remember: most of what you read about how Google works from a third party is OPINION, and just like in every other sphere of knowledge, facts can change with a greater understanding over time or with a different perspective.
Chasing The Algorithm
There is no magic bullet and there are no secret formulas to achieve fast number 1 rankings in Google in any competitive niche WITHOUT spamming Google.
Legitimately earned high positions in search engines in 2015 take a lot of hard work.
There are a few less talked about tricks and tactics that are deployed by some better than others to combat Google Panda, for instance, but there are no big secrets (no white hat secrets, anyway). There is clever strategy, though, and creative solutions to be found to exploit opportunities uncovered by researching the niche. As soon as Google sees a strategy that gets results, it usually falls outside the guidelines and becomes something you can be penalised for, so beware of jumping on the latest fad.
The biggest advantage any one provider has over another is experience and resource. The knowledge of what doesn't work, and what will actually hurt your site, is often more valuable than knowing what will give you a short lived boost. Getting to the top of Google is a relatively simple process, one that is constantly in change. Professional SEO is more a collection of skills, methods and techniques. It is more a way of doing things than a one-size-fits-all magic trick.
After over a decade practising and deploying real campaigns, I'm still trying to get it down to its simplest, most cost effective processes. I think it's about doing simple stuff right. From my experience, this is realised time and time again: good text, simple navigation structure, quality links. To be relevant and reputable takes time, effort and luck, just like anything else in the real world, and that is the way Google wants it.
If a company is promising you guaranteed rankings and has a magic bullet strategy, watch out. I'd check it didn't contravene Google's guidelines.
How long does it take to see results?
Some results can be gained within weeks, and you need to expect some strategies to take months to see the benefit. Google WANTS these efforts to take time. Critics of the search engine giant would point out that Google wants fast, effective rankings to be a feature of Google's own AdWords sponsored listings.
Optimisation is not a quick process, and a successful campaign can be judged over months, if not years. Most successful, fast ranking website optimisation techniques end up finding their way into Google Webmaster Guidelines, so be wary.
It takes time to build quality, and its this quality that Google aims to reward.
It takes time to generate the data needed to begin to formulate a campaign, and time to deploy that campaign. Progress also depends on many factors:
- How old is your site compared to the top 10 sites?
- How many back-links do you have compared to them?
- How is their quality of back-links compared to yours?
- What is the history of people linking to you (what words have people been using to link to your site)?
- How good a resource is your site?
- Can your site attract natural back-links (e.g. you have good content or a great service) or are you 100% relying on your agency for back-links (which is very risky in 2015)?
- How much unique content do you have?
- Do you have to pay everyone to link to you (which is risky), or do you have a natural reason why people might link to you?
Google wants to return quality pages in its organic listings, and it takes time to build this quality and for that quality to be recognised.
It takes time too to balance your content, generate quality backlinks and manage your disavowed links.
Google knows how valuable organic traffic is and they want webmasters investing a LOT of effort in ranking pages.
Critics will point out that the higher the cost of expert SEO, the better looking AdWords becomes, but AdWords will only get more expensive, too. At some point, if you want to compete online, you're going to HAVE to build a quality website, with a unique offering to satisfy returning visitors; the sooner you start, the sooner you'll start to see results.
If you start NOW and are determined to build an online brand, a website rich in content with a satisfying user experience, Google will reward you in organic listings.
Web optimisation is a marketing channel just like any other, and there are no guarantees of success in any, for what should be obvious reasons. There are no guarantees in Google AdWords either, except that costs to compete will go up, of course.
That's why it is so attractive, but like all marketing, it is still a gamble.
At the moment, I don't know you, your business, your website, its resources, your competition or your product. Even with all that knowledge, calculating ROI is extremely difficult because ultimately Google decides who ranks where in its results; sometimes that's ranking better sites, and sometimes (often) it is ranking sites breaking the rules above yours.
Nothing is absolute in search. There are no guarantees, despite claims from some companies. What you make from this investment is dependent on many things, not least how suited your website is to actually convert the extra visitors you get into sales.
Every site is different.
Big Brand campaigns are far, far different from small business seo campaigns for sites that don't have any links to begin with, to give you but one example.
It's certainly easier if the brand in question has a lot of domain authority just waiting to be awoken, but of course that's a generalisation, as big brands have big brand competition too. It depends entirely on the quality of the site in question and the level and quality of the competition, but smaller businesses should probably look to own their niche, even if limited to their location, at first.
Local SEO is always a good place to start for small businesses.
There ARE some things that are evident, with a bit of experience on your side:
Page Title Tag Best Practice

<title>What Is The Best Title Tag For Google?</title>
The page title tag (or HTML Title Element) is arguably the most important on page ranking factor (with regards to web page optimisation). Keywords in page titles can undeniably HELP your pages rank higher in Google results pages (SERPS). The page title is also often used by Google as the title of a search snippet link in search engine results pages.
For me, a perfect title tag in Google is dependent on a number of factors. I will lay a few down below (I have since expanded page title advice on another page, linked below):
- A page title that is highly relevant to the page it refers to will maximise its usability, search engine ranking performance and click through satisfaction rate. It will probably be displayed in a web browser's window title bar, and in clickable search snippet links used by Google, Bing & other search engines.
- The title element is the crown of a web page, with your important keyword phrase featuring AT LEAST ONCE within it.
- Most modern search engines have traditionally placed a lot of importance in the words contained within this html element. A good page title is made up of keyword phrases of value and/or high search volumes.
- The last time I looked, Google displayed as many characters as it can fit into a block element that's 512px wide and doesn't exceed 1 line of text. So there is NO AMOUNT OF CHARACTERS any optimiser could lay down as exact best practice to GUARANTEE a title will display, in full, in Google, at least, as the search snippet title. Ultimately, only the characters and words you use will determine if your entire page title will be seen in a Google search snippet. Recently Google displayed 70 characters in a title, but that changed in 2011/2012.
- If you want to ENSURE your FULL title tag shows in the desktop UK version of Google SERPS, stick to a shorter title of about 55 characters; but that does not mean your title tag MUST end at 55 characters, and remember your mobile visitors see a longer title (in the UK, in March 2015 at least). I have seen up to 69 characters (back in 2012) but, as I said, what you see displayed in SERPS depends on the characters you use. In 2015 I just expect what Google displays to change, so I don't obsess about what Google is doing in terms of display.
- Google is all about user experience and visitor satisfaction in 2015, so it's worth remembering that usability studies have shown that a good page title length is about seven or eight words and fewer than 64 total characters. Longer titles are less scannable in bookmark lists, and might not display correctly in many browsers (and of course probably will be truncated in SERPS).
- Google will INDEX perhaps 1000s of characters in a title, but I don't think anyone knows exactly how many characters or words Google will actually count AS a TITLE when determining relevance for ranking purposes. It is a very hard thing to try to isolate accurately with all the testing and obfuscation Google uses to hide its secret sauce. I have had ranking success with longer titles; much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
- You can probably include up to 12 words that will be counted as part of a page title, and consider using your important keywords in the first 8 words. The rest of your page title will be counted as normal text on the page.
- NOTE: in 2015, the html title element you choose for your page may not be what Google chooses to include in your SERP snippet. The search snippet title and description is very much QUERY dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or in links to that page, to create a very different SERP snippet title.
- When optimising a title, you are looking to rank for as many terms as possible, without keyword stuffing your title. Often, the best bet is to optimise for a particular phrase (or phrases) and take a more long tail approach. Note that too many page titles and not enough actual page text per page could lead to Google Panda or other user experience performance issues. A highly relevant, unique page title is no longer enough to float a page with thin content. Google cares WAY too much about the page text content these days to let a good title hold up a thin page on most sites.
- Some page titles do better with a call to action; one which reflects exactly a searcher's intent (e.g. to learn something, or buy something, or hire something). Remember, this is your hook in search engines, if Google chooses to use your page title in its search snippet, and there are a lot of competing pages out there in 2015.
- The perfect title tag on a page is unique to other pages on the site. In light of Google Panda, an algorithm that looks for quality in sites, you REALLY need to make your page titles UNIQUE, and minimise any duplication, especially on larger sites.
- I like to make sure my keywords feature as early as possible in a title tag, but the important thing is to have important keywords and key phrases in your page title tag SOMEWHERE.
- For me, when improved search engine visibility is more important than branding, the company name goes at the end of the tag, and I use a variety of dividers to separate terms, as no one way performs best. If you have a recognisable brand, there is an argument for putting this at the front of titles, although Google often will change your title dynamically, sometimes putting your brand at the front of your snippet link title itself.
- Note that Google is pretty good these days at removing any special characters you have in your page title, and I would be wary of trying to make your title or Meta Description STAND OUT using special characters. That is not what Google wants, evidently, and they do give you a further chance to make your search snippet stand out with RICH SNIPPETS and SCHEMA markup.
- I like to think I write titles for search engines AND humans.
- Know that Google tweaks everything regularly; why not what the perfect title keys off? So MIX it up.
- Don't obsess. Natural is probably better, and will only get better as engines evolve. I optimise for key phrases, rather than just keywords.
- I prefer mixed case page titles, as I find them more scannable than titles with ALL CAPS or all lowercase.
- Generally speaking, the more domain trust/authority your SITE has in Google, the easier it is for a new page to rank for something. So bear that in mind. There is only so much you can do with your page titles; your website's rankings in Google are a LOT more to do with OFFSITE factors than ONSITE ones, negative and positive.
- Click through rate is something that is likely measured by Google when ranking pages (Bing say they use it too, and they now power Yahoo), so it is really worth considering whether you are best optimising your page titles for click through rate or optimising for more search engine rankings.
- I would imagine keyword stuffing your page titles could be one area that Google looks at (although I see little evidence of it).
- Remember: think keyword phrase, rather than keyword, keyword, keyword. Think long tail.
- Google will select the best title it wants for your search snippet, and it will take that information from multiple sources, NOT just your page title element. A small title is often appended with more information about the domain. Sometimes, if Google is confident in the BRAND name, it will replace it with that (often adding it to the beginning of your title with a colon, or sometimes appending the end of your snippet title with the actual domain address the page belongs to).

A Note About Title Tags
When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) if this is a spam site or a quality site; for instance, have you repeated the keyword four times or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.
I always aim to keep my html page title elements as simple, and looking as human-generated and unique, as possible.
I'm certainly cleaning up the way I write my titles all the time. How do you do it?
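To make the length guidance above concrete, here is a rough checker; a sketch only, since the real desktop cut-off is pixel width (~512px), not a character count, and 55 characters is just the conservative figure discussed above:

```python
# Heuristic title tag check based on the guidelines discussed above.
# 55 characters is an approximation for safe desktop SERP display, not a rule.
SAFE_TITLE_LENGTH = 55

def check_title(title):
    """Return a list of warnings for a page title element."""
    warnings = []
    if len(title) > SAFE_TITLE_LENGTH:
        warnings.append("may be truncated in desktop SERPs")
    if title != title.strip():
        warnings.append("has leading or trailing whitespace")
    if title.isupper():
        warnings.append("ALL CAPS is less scannable than mixed case")
    return warnings

print(check_title("SEO Tutorial: A Beginner's Guide To White Hat SEO"))  # []
```

A check like this is only a prompt to look again at a title by hand; it cannot tell you whether the title is relevant or compelling.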
More Reading: External Links

Meta Keywords Best Practice
A hallmark of shady 'natural search engine optimisation' companies: the meta keywords tag. Companies that waste time and resources on these items waste clients' money; that's a fact:
I have one piece of advice with the meta keywords tag, which, like the title tag, goes in the head section of your web page: forget about them.
If you are relying on meta keyword optimisation to rank for terms, you're dead in the water. From what I see, Google and Bing ignore meta keywords, or at least place no weight in them to rank pages. Yahoo may read them, but really, a search engine optimiser has more important things to worry about than this nonsense.
What about other search engines that use them? Hang on while I submit my site to those 75,000 engines first [sarcasm!]. Yes, ten years ago early search engines liked looking at your meta keywords. I've seen OPs in forums ponder which is the best way to write these tags: with commas, with spaces, limited to how many characters. Forget about meta keyword tags; they are a pointless waste of time and bandwidth. We could probably save a rainforest with the bandwidth we would save if everybody removed their keyword tags.
Tin Foil Hat Time
So you have a new site. You fill your home page meta tags with the 20 keywords you want to rank for; hey, that's what optimisation is all about, isn't it? You've just told Google, by the third line of text, what to sandbox you for. The meta name=Keywords was actually originally for words that weren't on the page, that would help classify the document.
Sometimes competitors might use the information in your keywords to determine what you are trying to rank for, too.
If everybody removed them and stopped abusing meta keywords, Google would probably start looking at them, but that's the way of things in search engines.
I 100% ignore meta keywords and remove them from pages I work on.
Meta Description Best Practice
Like the title element, and unlike the meta keywords tag, this one is important, both from a human and search engine perspective.
<meta name="Description" content="Get your site on the first page of Google, Yahoo and Bing. Call us on 0845 094 0839. A company based in Scotland." />
Forget whether or not to put your keyword in it; make it relevant to a searcher and write it for humans, not search engines. If you want this short snippet, which accurately describes the page you have optimised, to show for one or two keyword phrases when people use Google to search, make sure the keyword is in there.
I must say, I normally do include the keyword in the description, as this usually gets it into your SERP snippet, but I think it would be a fair guess that more trusted sites would benefit more from any boost a keyword in the meta description tag might have than an untrusted site would.
Google looks at the description, but there is debate whether it actually uses the description tag to rank sites. I think it might, at some level, but again, it is a very weak signal. I certainly don't know of an example that clearly shows a meta description helping a page rank.
Sometimes I will ask a question with my titles and answer it in the description; sometimes I will just give a hint.
That is a lot more difficult in 2015 as search snippets change depending on what Google wants to emphasise to its users.
It's also very important, in my opinion, to have unique title tags and unique meta descriptions on every page on your site.
It's a preference of mine, but I don't generally autogenerate descriptions with my CMS of choice either; normally I'll elect to remove the tag entirely before I do this, and my pages still do well (and Google generally pulls a decent snippet out on its own, which you can then go back and optimise for SERPS). There are times when I do autogenerate descriptions, and that's when I can still make them unique to the page using some sort of server-side PHP.
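The same autogeneration idea can be sketched in Python (the page data and function name below are invented for illustration); the point is that each description is built from the page's own data, so it stays unique, and is trimmed to roughly the 156 characters Google displays:

```python
# Sketch: build a unique meta description per page from its own data,
# truncated to ~156 characters (Google's approximate display limit in 2015).
import html

def meta_description(page_name, summary, max_len=156):
    """Return a meta description tag unique to this page."""
    text = f"{page_name}: {summary}"
    if len(text) > max_len:
        text = text[: max_len - 3].rstrip() + "..."  # trim to display limit
    return f'<meta name="description" content="{html.escape(text, quote=True)}" />'

print(meta_description("SEO Audit", "A manual review of your site against search engine guidelines."))
```

Whatever generates the tag, the test is the same: would the sentence make a human want to click?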
Tin Foil Hat Time
Sometimes I think, if your titles are spammy, your keywords are spammy, and your meta description is spammy, Google might stop right there; even they probably want to save bandwidth at some point. Putting a keyword in the description won't take a crap site to number 1 or raise you 50 spots in a competitive niche, so why optimise for a search engine when you can optimise for a human? I think that is much more valuable, especially if you are in the mix already; that is, on page one for your keyword.
So, the meta description tag is important in Google, Yahoo, Bing and every other engine listing; very important to get it right. Make it for humans.
Oh, and by the way, Google seems to truncate anything over about 156 characters in the meta description, although this may actually be limited by pixel width in 2015.
More Reading: External Links

Robots Meta Tag
Thus far I've theorised about the Title Element, the Meta Description Tag and the Meta Keywords Tag. Next:
The Robots Meta Tag:
<meta name="robots" content="index, nofollow" />
I could use the above meta tag to tell Google to index the page, but not to follow any links on the page, if for some reason I did not want Google following the links on that page.
"By default, Googlebot will index a page and follow links to it. So there's no need to tag pages with content values of INDEX or FOLLOW." (Google)
There are various instructions you can make use of in your Robots Meta Tag, but remember Google by default WILL index and follow links, so you have NO need to include that as a command; you can leave the robots meta tag out completely, and probably should if you don't have a clue.
"Googlebot understands any combination of lowercase and uppercase." (Google)
Valid values for the Robots Meta Tag CONTENT attribute are: INDEX, NOINDEX, FOLLOW, NOFOLLOW. Pretty self-explanatory.
Google will understand and interpret the following robots meta tag values:
- NOINDEX: prevents the page from being included in the index.
- NOFOLLOW: prevents Googlebot from following any links on the page. (Note that this is different from the link-level NOFOLLOW attribute, which prevents Googlebot from following an individual link.)
- NOARCHIVE: prevents a cached copy of this page from being available in the search results.
- NOSNIPPET: prevents a description from appearing below the page in the search results, as well as preventing caching of the page.
- NOODP: blocks the Open Directory Project description of the page from being used in the description that appears below the page in the search results.
- NONE: equivalent to NOINDEX, NOFOLLOW.

Robots META Tag Quick Reference

Term               Googlebot  Slurp  BingBot  Teoma
NoIndex            YES        YES    YES      YES
NoFollow           YES        YES    YES      YES
NoArchive          YES        YES    YES      YES
NoSnippet          YES        NO     NO       NO
NoODP              YES        YES    YES      NO
NoYDIR             NO         YES    NO       NO
NoImageIndex       YES        NO     NO       NO
NoTranslate        YES        NO     NO       NO
Unavailable_After  YES        NO     NO       NO
I've included the robots meta tag in my tutorial as this IS one of only a few meta tags / html head elements I focus on when it comes to managing Googlebot and Bingbot. At a page level, it is a powerful way to control whether your pages are returned in search results pages.
These meta tags go in the [HEAD] section of a [HTML] page and represent the only tags for Google I care about. Just about everything else you can put in the [HEAD] of your HTML document is quite unnecessary and maybe even pointless (for Google optimisation, anyway).
If you are interested in using methods like on-page robots instructions and the robots.txt file to control which pages get indexed by Google and how Google treats them, Sebastian knows a lot more than me.
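If you want to audit which robots directives your pages actually carry (for instance, to catch an accidental NOINDEX), extracting them is straightforward. A minimal Python sketch, using an invented example page:

```python
# Sketch: pull robots meta directives out of a page's HTML with the
# standard library parser, e.g. to spot pages accidentally set to NOINDEX.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            content = a.get("content", "")
            if content:
                self.directives += [d.strip().lower() for d in content.split(",")]

page = '<html><head><meta name="robots" content="index, nofollow" /></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.directives)  # ['index', 'nofollow']
```

Run over a whole site, a check like this quickly flags pages blocked from the index by mistake.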
External Links

H1-H6: Headers
I can't find any definitive proof online that says you need to use Heading Tags (H1, H2, H3, H4, H5, H6), or that they improve rankings in Google, and I have seen pages do well in Google without them. But I do use them, especially the H1 tag on the page.
For me its another piece of a perfect page, in the traditional sense, and I try to build a site for Google and humans.
<h1>This is a page title</h1>
I still generally only use one <h1> heading tag in my keyword targeted pages; I believe this is the way the W3C intended it to be used in HTML4, and I ensure they appear at the top of a page, above relevant page text, written with my main keywords or keyword phrases incorporated.
I have never experienced any problems using CSS to control the appearance of the heading tags making them larger or smaller.
You can use multiple H1s in HTML5, but most sites I find I work on still use HTML4.
I use as many H2-H6 tags as is necessary, depending on the size of the page, but generally I use H1, H2 & H3. You can see here how to use header tags properly (basically, just be consistent, whatever you do, to give your users the best user experience).
How many words in the H1 tag? As many as I think is sensible; as short and snappy as possible, usually.
I also discovered Google will use your Header tags as page titles, at some level, if your title element is malformed.
As always be sure to make your heading tags highly relevant to the content on that page and not too spammy, either.
How Many Words & Keywords?
I get asked this all the time
how much text do you put on a page to rank for a certain keyword?
The answer is there is no optimal amount of text per page; how much text you'll need will be based on your DOMAIN AUTHORITY, your TOPICAL RELEVANCE, how much COMPETITION there is for that term, and HOW COMPETITIVE that competition actually is.
Instead of thinking about the quantity of the text, you should think more about the quality of the content on the page. Optimise this with searcher intent in mind. Well, that's how I do it.
I don't find that you need a minimum amount of words or text to rank in Google. I have seen pages with 50 words outrank pages with 100, 250, 500 or 1000 words. Then again, I have seen pages with no text rank on nothing but inbound links, or other strategies. In 2015, Google is a lot better at hiding away those pages, though.
At the moment, I prefer long form pages with a lot of text, although I still rely heavily on keyword analysis to make my pages. The benefit of longer pages is that they are great for long tail key phrases. Creating deep, information rich pages really focuses the mind when it comes to producing authoritative, useful content.
Every site is different. Some pages, for example, can get away with 50 words because of a good link profile and the domain it is hosted on. For me, the important thing is to make a page relevant to a users search query.
I dont care how many words I achieve this with and often I need to experiment on a site I am unfamiliar with. After a while, you get an idea how much text you need to use to get a page on a certain domain into Google.
One thing to note the more text you add to the page, as long as it is unique, keyword rich and relevant, the more that page will be rewarded with more visitors from Google.
There is no optimal number of words on a page for placement in Google. Every website every page is different from what I can see. Dont worry too much about word count if your content is original and informative. Google will probably reward you on some level at some point if there is lots of unique text on all your pages.
TIP: The inverted pyramid pictured above is useful when creating pages for the web too very useful.
Is there an optimal keyword density? The short answer to this is no.
There is no one-size-fits-all keyword density, no optimal percentage guaranteed to rank any page at number 1. However, I do know you can keyword-stuff a page and trip a spam filter.
Most web optimisation professionals agree there is no ideal percentage of keywords in text to get a page to number 1 in Google. Search engines are not that easy to fool, although the key to success in many fields is doing simple things well (or at least better than the competition).
I write natural page copy where possible, always focused on the key terms. I never calculate density in order to identify the best % - there are way too many other things to work on. I have looked into this. If it looks natural, it's OK with me.
Normally I will try and get related terms in the page, and if I have 5 paragraphs, I might have the keyword in 4 or 5 of those, as long as it doesn't look like I stuffed them in there.
Optimal keyword density is a myth, although there are many who would argue otherwise.
Internal Links To Relevant Pages
I link to relevant internal pages in my site when necessary.
I silo any relevance or trust mainly through links in text content and secondary menu systems, and between pages that are relevant in context to one another.
I don't worry about perfect siloing techniques any more, and don't worry about whether or not I should link to one category from another, as I think the boost many proclaim is minimal on the size of sites I usually manage.
I do not obsess about site architecture as much as I used to, but I always ensure the pages I want indexed are all available from a crawl from the home page, and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact-match anchor text pointing to the page from internal links, but I avoid abusing internals and avoid overtly manipulative internal links that are not grammatically correct, for instance.
There's no set method I find works for every site, other than to link to related internal pages often, without overdoing it, and where appropriate.
What Are Google Sitelinks?
When Google knows enough about the history of a website (or web page), it will sometimes display what are called sitelinks (or mega sitelinks) under the domain of the website in question. This is normally triggered when Google is confident this is the site you are looking for, based on the search terms you used.
At one time in the past you could trigger these for some generic terms by building links to the domain (or if the site had a lot of domain authority), but in 2015 sitelinks are usually reserved for navigational queries with a heavy brand bias - a brand name or a company name, for instance, or the website address.
I've tracked the evolution of Google sitelinks in organic listings over the years, and they are seemingly picked based on a number of factors, a few of which I have observed.
How To Get Google Sitelinks?
Pages that appear in the blue sitelinks are often popular pages on your site, in terms of internal or external links, or even recent posts that may have been published on your blog. Google seems to like to mix this up a lot, perhaps to offer some variety, and probably to obfuscate results to minimise or discourage manipulation.
Sometimes it returns pages that leave me scratching my head as to why Google selected a particular page.
If you don't HAVE sitelinks, have a bit of patience and focus on other areas of your web marketing, like adding more content, or getting some PR or social activity focused on the site. Google WILL give you sitelinks on some terms, ONCE it's convinced your site is the destination most people typing that term in are looking for. That could be a week or months, but the more popular the site is, the more likely Google will catch up fast.
Sitelinks are not something that can be switched on or off, although you can control to some degree which pages are presented as sitelinks. You can do that in Google Webmaster Tools.
Link Out To Related Sites
With regard to on-page seo best practices, I usually link out to other quality, relevant pages on other websites where possible and where a human would find it valuable.
I don't link out to other sites from my homepage. I want all the PR residing in the home page to be shared only with my internal pages. I don't link out to other sites from my category pages either, for the same reason.
I link to other relevant sites (a deep link where possible) from individual pages, and I do it often, usually. I don't worry about link equity or PR leak because I control it on a page-to-page level.
This works for me; it allows me to share the link equity I have with other sites while ensuring it is not at the expense of pages on my own domain. It may even help get me into a neighbourhood of relevant sites, especially when some of those start linking back to my site.
Linking out to other sites, especially using a blog, also helps tell others that might be interested in your content that your page is here. Try it.
I don't abuse anchor text, but I will be considerate, and usually try and link out to a site using keywords these bloggers / site owners would appreciate.
The recently leaked Quality Raters Guidelines document clearly tells web reviewers to identify how USEFUL or helpful your SUPPLEMENTARY NAVIGATION options are, whether you link to other internal pages or pages on other sites.
Redirect Non WWW To WWW
Your site probably has canonicalisation issues (especially if you have an e-commerce website), and it might start at the domain level.
Simply put, http://www.hobo-web.co.uk/ can be treated by Google as a different url from http://hobo-web.co.uk/ even though it's the same page, and it can get even more complicated.
It's thought REAL PageRank can be diluted if Google gets confused about your URLS, and, speaking simply, you don't want this PR diluted (in theory).
That's why many, including myself, redirect non-www to www (or vice versa) if the site is on a Linux/Apache server (in the .htaccess file).
Basically, you are redirecting all the Google juice to one canonical version of a url.
In 2015, this is a MUST HAVE best practice. It keeps it simple when optimising for Google. It should be noted, it's incredibly important not to mix the two types of www/non-www on site when linking your own internal pages!
Note: in 2015, Google asks you which domain you prefer to set as your canonical domain in Google Webmaster Tools.
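As an illustration, a common .htaccess sketch for 301-redirecting the non-www version of a domain to the www version on Apache (the domain here is just an example; adapt and test on your own server before relying on it):

```apache
# Redirect hobo-web.co.uk/... to www.hobo-web.co.uk/... permanently
RewriteEngine On
RewriteCond %{HTTP_HOST} ^hobo-web\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.hobo-web.co.uk/$1 [R=301,L]
```

The [R=301,L] flags make the redirect permanent, so search engines consolidate signals on the www version.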
ALT Tags
NOTE: ALT tags are counted by Google (and Bing), but I would be careful over-optimising them. I've seen a lot of websites penalised for over-optimising invisible elements on a page. Don't do it.
ALT tags are very important and, I think, a very rewarding area to get right. I always put the main keyword in an ALT once when addressing a page.
Don't optimise your ALT tags (or rather, attributes) JUST for Google!
Use ALT tags (or rather, ALT attributes) for descriptive text that helps visitors, and keep them unique where possible, like you do with your titles and meta descriptions.
Don't obsess. Don't optimise your ALT tags just for Google; do it for humans, for accessibility and usability. If you are interested, I ran a simple test using ALT attributes to determine how many words I could use in IMAGE ALT text that Google would pick up.
And remember, even if, like me most days, you can't be bothered with all the image ALT tags on your page, at least use a blank ALT (or NULL value) so people with screen readers can enjoy your page.
Update 17/11/08: Picked this up at SERoundtable about ALT tags:
JohnMu from Google: The alt attribute should be used to describe the image. So if you have an image of a big blue pineapple chair you should use the alt tag that best describes it, which is alt="big blue pineapple chair". The title attribute should be used when the image is a hyperlink to a specific page. The title attribute should contain information about what will happen when you click on the image. For example, if the image will get larger, it should read something like, title="View a larger version of the big blue pineapple chair image".
Barry continues with a quote:
As the Googlebot does not see the images directly, we generally concentrate on the information provided in the alt attribute. Feel free to supplement the alt attribute with title and other attributes if they provide value to your users!So for example, if you have an image of a puppy (these seem popular at the moment) playing with a ball, you could use something like My puppy Betsy playing with a bowling ball as the alt-attribute for the image. If you also have a link around the image, pointing a large version of the same photo, you could use View this image in high-resolution as the title attribute for the link.
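A sketch of that alt/title advice in markup (the filenames are hypothetical; the text follows JohnMu's pineapple chair example):

```html
<!-- ALT describes the image itself -->
<img src="blue-pineapple-chair.jpg" alt="big blue pineapple chair">

<!-- When the image is a link, TITLE describes what clicking will do -->
<a href="blue-pineapple-chair-large.jpg"
   title="View a larger version of the big blue pineapple chair image">
  <img src="blue-pineapple-chair.jpg" alt="big blue pineapple chair">
</a>

<!-- Purely decorative images get a blank (null) ALT for screen readers -->
<img src="border.gif" alt="">
```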
Search Engine Friendly URLs (SEF)
Clean URLs (or search engine friendly urls) are just that: clean, easy to read, simple.
You do not need clean urls in a site architecture for Google to spider a site successfully (confirmed by Google in 2008), although I do use clean urls as a default these days, and have done so for years.
It's often more usable.
Is there a massive difference in Google when you use clean urls?
No, in my experience it's very much a second- or third-order effect, perhaps even less, if used on its own. However, there is a demonstrable benefit to having keywords in urls.
The thinking is that you might get a boost in Google SERPS if your URLs are clean because you are using keywords in the actual page name instead of a parameter or session ID number (which Google often struggles with).
I think Google might reward the page some sort of relevance because of the actual file / page name. I optimise as if they do.
It is virtually impossible to isolate any ranking factor with a degree of certainty.
Where any benefit is slightly detectable is when people (say, in forums) link to your site with the url as the link.
Then it is fair to say you do get a boost because keywords are in the actual anchor text link to your site, and I believe this is the case, but again, that depends on the quality of the page linking to your site i.e. if Google trusts it and it passes Page Rank (!) and anchor text relevance.
And of course, youll need citable content on that site of yours.
Sometimes I will remove the stop-words from a url and leave the important keywords as the page title because a lot of forums garble a url to shorten it. Sometimes I will not and prefer to see the exact phrase I am targeting as the name of the url I am asking Google to rank.
I configure urls the following way:
www.hobo-web.co.uk/?p=292 is automatically changed by the CMS, using url rewrite, to www.hobo-web.co.uk/websites-clean-search-engine-friendly-urls/, which I then break down to something like www.hobo-web.co.uk/search-engine-friendly-urls/
It should be remembered that although Googlebot can crawl sites with dynamic URLs, many webmasters assume there is a greater risk it will give up if the urls are deemed unimportant and contain multiple variables and session IDs (a theory).
As standard, I use clean URLS where possible on new sites these days, and try to keep the URLs as simple as possible and do not obsess about it.
Thats my aim at all times when I optimise a website to work better inGoogle simplicity.
Google does look at keywords in the URL, even at a granular level.
Having a keyword in your URL might be the difference between your site ranking and not, and is potentially useful for taking advantage of long-tail search queries. For more, see Does Google Count A Keyword In The URI (Filename) When Ranking A Page?
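The kind of CMS url rewrite described above can be sketched with mod_rewrite in an Apache .htaccess file (a simplified illustration; a real CMS generates a more general rule, and the page ID here is the ?p=292 example from earlier):

```apache
RewriteEngine On
# Serve the clean url from the underlying dynamic page (?p=292)
RewriteRule ^search-engine-friendly-urls/?$ /index.php?p=292 [L,QSA]
```

The visitor and Googlebot only ever see the clean, keyword-rich url; the server quietly maps it back to the dynamic one.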
Keywords In Bold Or Italic
As I mentioned in my ALT tags section, some webmasters claim putting your keywords in bold or putting your keywords in italics is a beneficial ranking factor in terms of search engine optimising a page.
It is essentially impossible to test this, and I think these days Google could well be using this (and other easy-to-identify on-page optimisation efforts) to identify what to punish a site for, not promote it in SERPS.
Anything you can optimise on your page, Google can use against you to filter you out of results.
I use bold or italics these days specifically for users.
I only use emphasis if it's natural or it is really what I want to emphasise!
Do not tell Google what to filter you for that easily.
I think Google treats websites it trusts far differently to others in some respects.
That is, more trusted sites might get treated differently than untrusted sites.
Keep it simple, natural, useful and random.
Absolute Or Relative URLS
My advice would be to keep it consistent whatever you decide to use.
I prefer absolute urls. That's just a preference. Google will crawl either if the local setup is correctly developed.
What is an absolute URL? Example: http://www.hobo-web.co.uk/search-engine-optimisation/
What is a relative URL? Example: /search-engine-optimisation.htm
Relative just means relative to the document the link is on.
Move that page to another site and it wont work.
With an absolute URL, it would work.
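A quick illustration of the difference, using the example urls above:

```html
<!-- Absolute URL: resolves the same wherever the linking page lives -->
<a href="http://www.hobo-web.co.uk/search-engine-optimisation/">SEO tips</a>

<!-- Relative URL: resolved against the current document's location -->
<a href="/search-engine-optimisation.htm">SEO tips</a>
```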
Subfolders or Files For URLS
Sometimes I use subfolders and sometimes I use files. I have not been able to decide if there is any real benefit (in terms of ranking boost) to using either. A lot of CMS these days use subfolders in their file path, so I am pretty confident Google can deal with either.
I used to prefer files like .html when I was building a new site from scratch, as they were the end of the line for search engines, as I imagined it, and a subfolder (or directory) was a collection of pages.
I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me to use files on most websites I created (back in the day). Once subfolders are trusted, it's six of one and half a dozen of the other as to what the actual difference is in terms of ranking in Google; usually, rankings are more determined by how RELEVANT or REPUTABLE a page is to a query.
In the past, subfolders could be treated differently from files (in my experience).
Subfolders can be trusted less than other subfolders or pages in your site, or ignored entirely. Subfolders *used to seem to me* to take a little longer to get indexed by Google than, for instance, .html pages.
People talk about trusted domains, but they don't mention (or don't think) that some parts of the domain can be trusted less. Google treats some subfolders differently. Well, they used to, and remembering how Google used to handle things has some benefits, even in 2015.
Some say don't go beyond 4 levels of folders in your file path. I haven't experienced too many issues, but you never know.
UPDATED: I think in 2015 it's even less of something to worry about. There are far more important elements to check.
Which Is Better For Google? PHP, HTML or ASP?
Google doesn't care. As long as it renders as a browser-compatible document, it appears Google can read it these days.
I prefer php these days even with flat documents as it is easier to add server side code to that document if I want to add some sort of function to the site.
Does W3C Valid HTML / CSS Help?
Above is a Google video confirming this advice, which I first shared in 2008.
Does Google rank a page higher because of valid code? The short answer is no, even though I tested it on a small-scale test with different results.
Google doesn't care if your page is valid HTML and valid CSS. This is clear: check any top ten results in Google and you will probably see that most contain invalid HTML or CSS. I love creating accessible websites, but they are a bit of a pain to manage when you have multiple authors or developers on a site.
If your site is so badly designed, with so much invalid code that even Google and browsers cannot read it, then you have a problem.
Where possible, if commissioning a new website, demand at least minimum web accessibility compliance on a site (there are three levels of priority to meet), and aim for valid html and css. Actually this is the law in some countries although you would not know it, and be prepared to put a bit of work in to keep your rating.
Valid HTML and CSS are a pillar of best practice website optimisation, not strictly a part of professional search engine optimisation. It is one form of optimisation Google will not penalise you for.
Addition: I usually still aim to follow W3C recommendations that actually help deliver a better user experience:
"Hypertext links. Use text that makes sense when read out of context." W3C Top Ten Accessibility Tips
301 Old Pages
Rather than tell Google via a 404 or some other command that this page isn't here any more, consider permanently redirecting the page to a relatively similar page, to pool any link equity that page might have.
My general rule of thumb is to make sure the information (and keywords) on the old page are contained in the new page - stay on the safe side.
Most already know the power of a 301 redirect and how you can use it to power even totally unrelated pages to the top of Google for a time - sometimes a very long time.
Google seems to think server-side redirects are OK, so I use them.
You can change the focus of a redirect, but that's a bit black hat for me and can be abused - I don't really talk about that sort of thing on this blog. But it's worth knowing you need to keep these redirects in place in your .htaccess file.
Redirecting multiple old pages to one new page works for me, if the information is there on the new page that ranked the old page.
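In an Apache .htaccess file, these permanent redirects can be sketched like this (the paths are hypothetical examples):

```apache
# 301 a single old page to its closest equivalent new page
Redirect 301 /old-page.htm http://www.hobo-web.co.uk/new-page/

# Consolidate several old, thin articles into one bigger article
Redirect 301 /thin-article-1.htm http://www.hobo-web.co.uk/big-article/
Redirect 301 /thin-article-2.htm http://www.hobo-web.co.uk/big-article/
```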
NOTE: This tactic is being heavily spammed in 2015. Be careful with redirects. I think I have seen penalties transferred via 301s. I also WOULDN'T REDIRECT 301s blindly to your home page. I'd also be careful of redirecting lots of low-quality links to one url. If you need a page to redirect old urls to, consider your sitemap or contact page. Audit any page's backlinks BEFORE you redirect it to an important page.
I'm seeing CANONICALS work just the same as 301s in 2015, though they seem to take a little longer to have an impact.
Hint: a good tactic at the moment is to CONSOLIDATE old, thin, under-performing articles Google ignores into bigger, better-quality articles.
I usually then 301 all the old pages to a single source to consolidate link equity and content equity. As long as the intention is to serve users and create something more up-to-date, Google is fine with this.
Duplicate Content
Webmasters are often confused about getting penalised for duplicate content, which is a natural part of the web landscape, especially at a time when Google claims there is NO duplicate content penalty.
The reality in 2015 is that if Google classifies your duplicate content as THIN content, then you DO have a very serious problem that violates Google's website performance recommendations, and this violation will need to be cleaned up.
Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin..
It's very important to understand that if, in 2015, as a webmaster you republish posts, press releases, news stories or product descriptions found on other sites, then your pages are very definitely going to struggle to gain traction in Google's SERPS (search engine results pages).
Google doesn't like using the word penalty, but if your entire site is made entirely of republished content, Google does not want to rank it. If you have a multiple-site strategy selling the same products, you are probably going to cannibalise your own traffic in the long run, rather than dominate a niche, as you used to be able to do.
This is all down to how the search engine deals with duplicate content found on other sites, and the experience Google aims to deliver for its users and its competitors.
Mess up with duplicate content on a website, and it might look like a penalty, as the end result is the same: important pages that once ranked might not rank again, and new content might not get crawled as fast as a result.
Your website might even get a manual action for thin content. Worst-case scenario, your website is hit by the GOOGLE PANDA algorithm.
A good rule of thumb is do NOT expect to rank high in Google with content found on other, more trusted sites, and dont expect to rank at all if all you are using is automatically generated pages with no value add.
See my latest post on Google Advice on Duplicate Content.
Broken Links Are A Waste Of Link Power
The simplest piece of advice I ever read about creating a website / optimising a website was years ago, and it is still useful today:
make sure all your pages link to at least one other page in your site
This advice is still sound today, and the most important piece of advice out there, in my opinion. Yes, it's so simple it's stupid.
Check your pages for broken links. Seriously, broken links are a waste of link power and could hurt your site, drastically in some cases. Google is a link-based search engine - if your links are broken and your site is chock full of 404s, you might not be at the races.
Here's the second best piece of advice, in my opinion, seeing as we are just about talking about website architecture:
link to your important pages often internally, with varying anchor text, in the navigation and in page text content
especially if you do not have a lot of PageRank to begin with!
Do I Need A Google XML Sitemap For My Website?
What is a xml sitemap and do I need one to seo my site for Google?
(The XML Sitemap protocol) has wide adoption, including support from Google, Yahoo!, and Microsoft
No. You do NOT, technically, need an XML Sitemap to optimise a site for Google if you have a sensible navigation system that Google can crawl and index easily. HOWEVER, in 2015 you should have a Content Management System that produces one as a best practice, and you should submit that sitemap to Google in Google Webmaster Tools. Again, best practice. Google has said very recently that XML and RSS are still a very useful discovery method for them to pick out recently updated content on your site.
An XML Sitemap is a file on your server with which you can help Google easily crawl & index all the pages on your site. This is evidently useful for very large sites that publish lots of new content or update content regularly.
Your web pages will still get into search results without an xml sitemap if Google can find them by crawling your website, if you:
Make sure all your pages link to at least one other page in your site
Link to your important pages often (with varying anchor text, in the navigation and in page text content, if you want best results)
Remember, Google needs links to find all the pages on your site, and links spread PageRank, which helps pages rank - so an xml sitemap is not quite a substitute for a great website architecture.
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
Most modern CMS auto-generate xml sitemaps, and Google does ask you to submit a site map in Webmaster Tools, and I do these days.
I prefer to manually define my important pages by links and depth of content, but an XML sitemap is a best practice in 2015 for most sites.
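For illustration, a minimal XML sitemap following the sitemaps.org protocol looks like this (the urls and dates are example values; most CMS plugins generate this file for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.hobo-web.co.uk/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.hobo-web.co.uk/search-engine-optimisation/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```

Only the loc element is required for each url; lastmod, changefreq and priority are optional hints.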
Does Only The First Link Count In Google?
Does the second anchor text link on a page count?
One of the more interesting discussions in the webmaster community of late has been trying to determine which links Google counts as links on pages on your site. Some say the link Google finds higher in the code is the link Google will count, if there are two links on a page going to the same page.
I tested this (a while ago now) with the post Google Counts The First Internal Link.
For example (and I am talking about internal links here), what if you took a page and I placed two links on it, both going to the same page? (OK, hardly scientific, but you should get the idea.)
Will Google only count the first link? Or will it read the anchor text of both links, and give my page the benefit of the text in both links, especially if the anchor text is different in both links? Will Google ignore the second link?
What is interesting to me is that knowing this leaves you with a question. If your navigation array has your main pages linked to in it, perhaps your links in content are being ignored, or at least, not valued.
I think links in body text are invaluable. Does that mean placing the navigation below the copy to get a wide and varied internal anchor text to a page?
As I said, I think this is one of the more interesting talks in the community at the moment, and perhaps Google works differently with internal links as opposed to external links to other websites.
I think quite possibly this could change day to day if Google pressed a button, but I optimise a site thinking that only the first link on a page will count, based on what I monitor, although I am testing this. In fact, I usually only link once from page to page on client sites, unless it's useful for visitors.
Canonical Tag Canonical Link Element Best Practice
When it comes to Google SEO, the rel=canonical link element has become *VERY* IMPORTANT over the years, and never more so.
This element is employed by Google, Bing and other search engines to help them specify the page you want to rank out of duplicate and near-duplicate pages found on your site, or on other pages on the web.
In the video above, Matt Cutts from Google shares tips on the rel=canonical tag (more accurately, the canonical link element) that the 3 top search engines now support.
Google, Yahoo!, and Microsoft have all agreed to work together in a joint effort to help reduce duplicate content for larger, more complex sites, and the result is the new Canonical Tag.
Example Canonical Tag From Google Webmaster Central blog:
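The tag itself is a single line placed in the head section of a duplicate page, pointing at the version you want indexed (the url below is illustrative, not from the original post):

```html
<link rel="canonical" href="http://www.hobo-web.co.uk/seo-tutorial/" />
```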
The process is simple. You can put this link tag in the head section of the duplicate content urls, if you think you need it.
I add a self-referring canonical link element as standard these days to ANY web page.
Is rel=canonical a hint or a directive?
It's a hint that we honor strongly. We'll take your preference into account, in conjunction with other signals, when calculating the most relevant page to display in search results.
Can I use a relative path to specify the canonical, such as <link rel="canonical" href="product.php?item=swedish-fish" />?
Yes, relative paths are recognized as expected with the <link> tag. Also, if you include a <base> link in your document, relative paths will resolve according to the base URL.
Is it okay if the canonical is not an exact duplicate of the content?
We allow slight differences, e.g., in the sort order of a table of products. We also recognize that we may crawl the canonical and the duplicate pages at different points in time, so we may occasionally see different versions of your content. All of that is okay with us.
What if the rel=canonical returns a 404?
We'll continue to index your content and use a heuristic to find a canonical, but we recommend that you specify existent URLs as canonicals.
What if the rel=canonical hasnt yet been indexed?
Like all public content on the web, we strive to discover and crawl a designated canonical URL quickly. As soon as we index it, we'll immediately reconsider the rel=canonical hint.
Can rel=canonical be a redirect?
Yes, you can specify a URL that redirects as a canonical URL. Google will then process the redirect as usual and try to index it.
What if I have contradictory rel=canonical designations?
Our algorithm is lenient: We can follow canonical chains, but we strongly recommend that you update links to point to a single canonical page to ensure optimal canonicalization results.
Can this link tag be used to suggest a canonical URL on a completely different domain?
**Update on 12/17/2009: The answer is yes! We now support a cross-domain rel=canonical link element.**
More reading at http://googlewebmastercentral.blogspot.co.uk/2009/02/specify-your-canonical.html
Is Domain Age An Important Google Ranking Factor?
For traffic from Google, on its own at least, in my experience,no.
Having a ten year old domain that Google knows nothing about is the same as having a brand new domain.
A 10 year old site thats constantly cited by, year on year, the actions of other, more authoritative, and trusted sites? Thats valuable.
But thats not the age of your website address ON ITS OWN in-play as a ranking factor.
A one year old domain cited by authority sites is just as valuable if not more valuable than a ten year old domain with no links and no search performance history.
Perhaps domain age may come into play when other factors are considered, but I think Google works very much like this on all levels, with all ranking factors and all ranking conditions. I don't think you can consider discovering ranking factors without ranking conditions.
Other Ranking Factors:
Domain age (NOT ON ITS OWN)
Length of site domain registration (I don't see much benefit ON ITS OWN, even knowing "valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year". Paying for a domain in advance just tells others you don't want anyone else using this domain name; it is no indication you're going to do something Google will reward you for, IMO)
Domain registration information hidden/anonymous (possibly, under human review, if OTHER CONDITIONS are met, like looking like a spam site)
Site top level domain (geographical focus, e.g. com versus co.uk) (YES)
Site top level domain (e.g. .com versus .info) (DEPENDS)
Sub domain or root domain? (DEPENDS)
Domain past records (how often it changed IP) (DEPENDS)
Domain past owners (how often the owner was changed) (DEPENDS)
Keywords in the domain (DEFINITELY, ESPECIALLY EXACT KEYWORD MATCH, although Google has a lot of filters that mute the performance of an exact-match domain in 2015)
Domain IP (DEPENDS; for most, no)
Domain IP neighbours (DEPENDS; for most, no)
Domain external mentions (non-linked) (I don't think so)
Geo-targeting settings in Google Webmaster Tools (YES, of course)
Rich Snippets
Rich Snippets and Schema Markup can be intimidating if you are new to them, but important data about your business can actually be very simply added to your site by sensible optimisation of any website footer.
This is easy to implement.
An optimised website footer can comply with the law, may help search engines, and can help usability and improve conversions.
Properly optimised, your website footer can also help make your search snippet stand out in Google results pages:
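As a sketch, the business details discussed below can be marked up in the footer with schema.org Organization microdata (the details mirror the example company information later in this section; verify property names against schema.org before relying on them):

```html
<footer itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Hobo Web Limited</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">24 Patrick Street</span>,
    <span itemprop="addressCountry">UK</span>
  </div>
  <a itemprop="email" href="mailto:email@example.com">email@example.com</a>
  <span itemprop="vatID">VAT No. 880 5135 26</span>
</footer>
```

The visible text serves visitors and the law; the itemprop attributes give search engines structured data they can use in rich snippets.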
If you are a business in the UK, your website needs to meet the legal requirements necessary to comply with the UK Companies Act 2007. It's easy to just incorporate this required information into your footer.
Companies in the UK must include certain regulatory information on their websites and in their email footers or they will breach the Companies Act and risk a fine. OUTLAW
Heres what you need to know regarding website and email footers to comply with the UK Companies Act (with our information in bold);
The company name:
Hobo Web
Physical geographic address (a PO Box is unlikely to suffice as a geographic address, but a registered office address would; if the business is a company, the registered office address must be included):
24 Patrick Street, Greenock, PA16 8NB, Scotland, UK
The company's registration number should be given and, under the Companies Act, the place of registration should be stated:
Hobo Web Limited is a company registered in Scotland with company number SC299002
The email address of the company (it is not sufficient to include a 'contact us' form without also providing an email address and geographic address somewhere easily accessible on the site):
email@example.com
The name of the organisation with which the customer is contracting must be given; this might differ from the trading name, and any such difference should be explained:
hobo-web.co.uk is the trading name / style of Hobo Web Limited.
If your business has a VAT number, it should be stated, even if the website is not being used for e-commerce transactions:
VAT No. 880 5135 26
Prices on the website must be clear and unambiguous; also, state whether prices are inclusive of tax and delivery costs:
All Hobo Web prices stated in email or on the website EXCLUDE VAT
The above information does not need to feature on every page, but it should be on a clearly accessible page. However, with Google Quality Raters rating web pages on quality based on Expertise, Authority and Trust (see my recent making high quality websites post), ANY signal you can send to an algorithm, or to a human reviewer's eyes, that you are a legitimate business is probably a sensible move at this time (if you have nothing to hide, of course).
Note: if the business is a member of a trade or professional association, membership details, including any registration number, should be provided. Consider also the Distance Selling Regulations, which contain other information requirements for online businesses that sell to consumers (B2C, as opposed to B2B, sales).
For more detailed information, refer to the UK Companies Act itself.
Although we display most, if not all, of this information in email and website footers, I thought it would be handy to gather this information clearly on one page, explain why it's there, and wrap it all up in a (hopefully) informative post.
Dynamic PHP Copyright Notice in WordPress
Now that your site complies with the Act, you'll want to ensure your website never looks obviously out of date.
While you are editing your footer, ensure your copyright notice is dynamic and will change from year to year automatically.
It's simple to display a dynamic date in your footer in WordPress, for instance, so you never need to change your copyright notice on your blog when the year changes.
This little bit of code will display the current year. Just add it to your theme's footer.php and you can forget about making sure you don't look stupid, or give the impression your site is out of date and unused, at the beginning of every year.
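A minimal sketch of the sort of snippet meant here, assuming a standard WordPress theme's footer.php (the start year and company name are placeholders to swap for your own):

```php
<p>&copy; Copyright 2006 - <?php echo date('Y'); ?> Hobo Web Ltd. All rights reserved.</p>
```

PHP's date('Y') outputs the current four-digit year each time the page is served, so the notice updates itself automatically every January.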
A simple and elegant PHP copyright notice for WordPress blogs.
Adding Schema.org Markup to Your Footer
You can take the information you have from above and transform it with Schema.org markup to give even more accurate information to search engines.
<div>
  <p>
    Copyright 2006-2015 Hobo-Web LTD, Company No. SC299002 | VAT No. 880 5135 26 <br>
    The Stables, 24 Patrick Street, Greenock, PA16 8NB, Scotland, UK | TEL: 0845 094 0839 | FAX: 0845 868 8946<br>
    Business hours are 09.00 a.m. to 17.00 p.m. Monday to Friday - Local Time is <span id="time">9:44:36</span> (GMT)
  </p>
</div>
<div>
  <div itemscope="" itemtype="http://schema.org/LocalBusiness">
    Copyright 2006-2015 <span itemprop="name">Hobo-Web LTD</span>
    <div itemprop="address" itemscope="" itemtype="http://schema.org/PostalAddress">
      ADDRESS: <span itemprop="streetAddress">24 Patrick Street</span>,
      <span itemprop="addressLocality">Greenock</span>,
      <span itemprop="addressRegion">Scotland</span>,
      <span itemprop="postalCode">PA16 8NB</span>,
      <span itemprop="addressCountry">GB</span> |
      TEL: <span itemprop="telephone">0845 094 0839</span> |
      FAX: <span itemprop="faxNumber">0845 868 8946</span> |
      EMAIL: <a href="mailto:firstname.lastname@example.org" itemprop="email">email@example.com</a>.
    </div>
    <span itemprop="geo" itemscope="" itemtype="http://schema.org/GeoCoordinates">
      <meta itemprop="latitude" content="55.9520367">
      <meta itemprop="longitude" content="-4.7667952">
    </span>
    <span>Company No. SC299002</span> | VAT No. <span itemprop="vatID">880 5135 26</span> |
    Business hours are <time itemprop="openingHours" datetime="Mo,Tu,We,Th,Fr 09:00-17:00">09.00 a.m. to 17.00 p.m. Monday to Friday</time>
    Local Time is <span id="time">9:46:20</span> (GMT)
  </div>
  <span class="rating-desc" itemscope="" itemtype="http://schema.org/Product">
    <span itemprop="name">Hobo Web SEO Services</span>
    <span itemprop="aggregateRating" itemscope="" itemtype="http://schema.org/AggregateRating">
      Rated <span itemprop="ratingValue">4.8</span> / 5 based on <span itemprop="reviewCount">6</span> reviews. |
      <a class="ratings" href="https://plus.google.com/b/113802450121722957804/113802450121722957804/about/p/pub?review=1">Review Us</a>
    </span>
  </span>
</div>
Tip: note the code near the end of the above example if you are wondering how to get yellow star ratings in Google results pages.
I got yellow stars in Google within a few days of adding the code to my website template, directly linking my site to information Google already has about my business.
Also, you can modify that link to plus.google.com to link directly to your REVIEWS page on Google Plus, to encourage people to review your business.
Now you can have a website footer that helps your business comply with UK law, is more usable, automatically updates the copyright notice year, and helps your website stick out in Google SERPs.
Keep It Simple, Stupid
Don't Build Your Site With Flash or HTML Frames.
Well, not entirely in Flash, and especially not if you know very little about the ever-improving accessibility of Flash.
Flash is a proprietary plug-in created by Macromedia to infuse (albeit) fantastically rich media into your websites. The W3C advises you avoid the use of such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML, ensuring everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website.
Flash, in the hands of an inexperienced designer, can cause all types of problems at the moment, especially with:

- Accessibility
- Search engines
- Users not having the plug-in
- Large download times

Flash doesn't even work at all on some devices, like the Apple iPhone. Note that Google sometimes highlights if your site is not mobile friendly on some devices. And on the subject of mobile friendly websites, note that Google has alerted the webmaster community that mobile friendliness will be a search engine ranking factor in 2015.
Starting April 21 (2015), we will be expanding our use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in our search results. Consequently, users will find it easier to get relevant, high quality search results that are optimized for their devices. GOOGLE
HTML5 is the preferred option over Flash these days, for most designers. A site built entirely in Flash could cause an unsatisfactory user experience and could affect your rankings, especially in mobile search results. For similar accessibility and user satisfaction reasons, I would also say don't build a site with website frames.
As in any form of design, don't try to re-invent the wheel when simple solutions suffice. The KISS philosophy has been around since the dawn of design.
Keep layouts and navigation arrays consistent and simple, too. Don't spend time, effort and money (especially if you work in a professional environment) designing fancy navigation menus if, for example, your new website is an information site.
It's the same with website optimisation: keep your documents well structured, keep your page Title Elements and text content relevant, use heading tags sensibly, and try to avoid leaving too much of a footprint, whatever you are up to.

Useful resources:

- Google's Mobile-Friendly Test
- http://validator.w3.org/mobile/
- Best Screen Size

A Non-Technical Google SEO Strategy
Here are some final thoughts:
Use common sense. Google is a search engine: it is looking for pages to give searchers results, and 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites link to relevant information content, so content-rich websites get a lot of links, especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its results, so the obvious thing you need to do is ADD A LOT OF INFORMATIVE CONTENT TO YOUR WEBSITE.

I think ranking in organic listings is a lot about trusted links making trusted pages rank, making trusted links making trusted pages rank, ad nauseam, for various keywords. Some pages can pass trust to another site; some pages cannot. Some links can; some cannot. Some links are trusted to pass ranking ability to another page; some are not. YOU NEED LINKS FROM TRUSTED PAGES IF YOU WANT TO RANK AND AVOID PENALTIES & FILTERS.

Google engineers are building an AI, but it is all based on simple human desires to make something happen, or indeed to prevent something. You can work with Google engineers or against them. They need to make money for Google but, unfortunately for them, they need to make the best search engine in the world for us humans as part of the deal. Build a site that takes advantage of this. What is a Google engineer trying to do with an algorithm? I always remember it was an idea first, before it was an algorithm. What was that idea? Think like a Google engineer and give Google what it wants. What is Google trying to give its users? Align with that. What does Google not want to give its users? Don't look anything like that. THINK LIKE A GOOGLE ENGINEER & BUILD A SITE THEY WANT TO GIVE TOP RANKINGS.

Google is a links-based search engine. Google doesn't need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just like you do when clicking on a link.
So you first need to make sure you tell the world about your site, so other sites link to yours. Don't worry about reciprocating to more powerful sites, or even real sites; I think this adds to your domain authority, which is actually better to have than ranking for just a few narrow key terms.
Everything has limits. Google has limits. What are they? How would you go about observing them, or even testing, breaking or benefiting from them, or being penalised by them? It's not a lab setting; you can't test much, if anything, 100% accurately, but you can hypothesise, based on a sensible approach, bearing in mind what a Google engineer would do, and what you would do if Google was yours.

The best way for Google to keep its rankings secret, ultimately, is to have a randomness, or at least a randomness on the surface as it is presented to users of Google, while keeping some things stable: surely the easiest way for it to prevent a curious optimiser finding out how it works. Well, I think that, anyway. And I think this randomness manifests itself in many ways. What will work for some sites might not necessarily work for your sites; not in exactly the same way, anyway. Perhaps no two sites are the same (the conditions are different, for a start, for any two sites), and I'm actually thinking about how to test this, for a bit of fun.

Google may play dice with the Google multi-verse, so beware of that. It uses multiple results and rotates them, and serves different results to different machines and browsers, even on the same computer. Google results are constantly shifting; some pages rank at the top constantly because they are giving Google what it wants in a number of areas, or they might just have a greater number and diversity of more trusted links than yours do.

Google has a long memory when it comes to links and pages and associations for your site; perhaps an infinite memory profile of your site. Perhaps it can forgive but never forget. Perhaps it can forget too, just like us, and so previous penalties or bans can be lifted. I think (depending on the site, because Google can work out if you have a blog or an e-commerce site) Google probably also looks at different historical versions of particular pages, even on single sites. WHAT RELATIONSHIP DO YOU WANT TO HAVE WITH GOOGLE?
Onsite, don't try to fool Google; we're not smart enough. Be squeaky clean on-site and make Google think twice about bumping you for discrepancies in your link profile. Earn Google's trust.

Most of our more lucrative accounts come from referrals from clients who trust us. Before clients told them of us, they didn't know about us; OK, they might have heard about us from people they, in turn, didn't trust that much. Upon the client's testimonial, the referral now trusts us a lot more. These referrals automatically trust us to some extent, and that trust grows when we deliver. The referral now trusts us very much. But it's an uphill struggle from that point on to continue to deliver on that trust and earn even more, because you don't want to dip in trust; it's nice to get more and more trusted. Google works exactly the same way as this human emotion, and search engines have tried for years to deliver a trusted set of sites based on human desire and need. MAKE FRIENDS WITH GOOGLE.

Don't break Google's trust. If your friend betrays you, depending on what they've done, they've lost trust; sometimes that trust is lost altogether. If you do something Google doesn't like, or manipulate it in a way it doesn't want, you will lose trust, and in some cases lose all trust (in some areas). For instance, your pages might be able to rank, but your links might not be trusted enough to vouch for another site. DON'T FALL OUT WITH GOOGLE OVER SOMETHING STUPID.

YOU NEED TO MAKE MORE FRIENDS, AND ESPECIALLY THOSE WHO ARE FRIENDS WITH GOOGLE. When Google trusts you, it's because you've earned its trust to help it carry out what it needs to carry out in the quickest and most profitable way. You've helped Google achieve its goals. It trusts you, and it will reward you by listing your contribution in order of the sites it trusts the most. It will list the friends it trusts the most, who it knows to be educated in a particular area, at the top of these areas. IF GOOGLE TRUSTS YOU, IT WILL LET YOUR PAGES RANK AND, IN TURN, VOUCH FOR OTHER FRIENDS GOOGLE MIGHT WANT INFORMATION ON. Google can be fooled and manipulated, just like you can, but it will probably kick you in the gonads if you break its trust, as I probably would. Treat Google as you would have it treat you.
REMEMBER: IT TAKES TIME TO BUILD TRUST, and that is probably one of the reasons why Google is pushing trust as a ranking signal.
I, of course, might be reading far too much into Google, TRUST and the TIME Google wants us to wait for things to happen on their end... but consider trust to be a psychological emotion Google is trying to emulate using algorithms based on human ideas.
If you do all the above, you'll get more and more traffic from Google over time.
If you want to rank for specific keywords in very competitive niches, you'll need to be a big brand, be picked out by big brands (and linked to), buy links to fake that trust, or get spammy with it in an intelligent way so you won't get caught.
I suppose Google is open to the con, just as any human is, if it's based on human traits.
What Not To Do In Website Search Engine Optimisation
Google has a VERY basic organic search engine optimisation starter guide PDF for webmasters, which they use internally:

"Although this guide won't tell you any secrets that'll automatically rank your site first for queries in Google (sorry!), following the best practices outlined below will make it easier for search engines to both crawl and index your content." (Google)

It is still worth a read, even if it is VERY basic, best-practice search engine optimisation for your site.
No search engine will EVER tell you what actual keywords to put on your site to improve your rankings or get more converting organic traffic, and in Google, that's the SINGLE MOST IMPORTANT thing you want to know!
If you want a bigger PDF, try my free SEO ebook. It's been downloaded by tens of thousands of webmasters.
Here's a list of what Google tells you to avoid in the document:
- choosing a title that has no relation to the content on the page
- using default or vague titles like "Untitled" or "New Page 1"
- using a single title tag across all of your site's pages or a large group of pages
- using extremely lengthy titles that are unhelpful to users
- stuffing unneeded keywords in your title tags
- writing a description meta tag that has no relation to the content on the page
- using generic descriptions like "This is a webpage" or "Page about baseball cards"
- filling the description with only keywords
- copying and pasting the entire content of the document into the description meta tag
- using a single description meta tag across all of your site's pages or a large group of pages
- using lengthy URLs with unnecessary parameters and session IDs
- choosing generic page names like "page1.html"
- using excessive keywords like "baseball-cards-baseball-cards-baseball-cards.htm"
- having deep nesting of subdirectories like "/dir1/dir2/dir3/dir4/dir5/dir6/page.html"
- using directory names that have no relation to the content in them
- having pages from subdomains and the root directory (e.g. "domain.com/page.htm" and "sub.domain.com/page.htm") access the same content
- mixing www. and non-www. versions of URLs in your internal linking structure
- using odd capitalization of URLs (many users expect lower-case URLs and remember them better)
- creating complex webs of navigation links, e.g. linking every page on your site to every other page
- going overboard with slicing and dicing your content (it takes twenty clicks to get to deep content)
- having a navigation based entirely on drop-down menus, images, or animations (many, but not all, search engines can discover such links on a site, but if a user can reach all pages on a site via normal text links, this will improve the accessibility of your site)
- letting your HTML sitemap page become out of date with broken links
- creating an HTML sitemap that simply lists pages without organizing them, for example by subject (Edit Shaun: safe to say, especially for larger sites)
- allowing your 404 pages to be indexed in search engines (make sure that your webserver is configured to give a 404 HTTP status code when non-existent pages are requested)
- providing only a vague message like "Not found", "404", or no 404 page at all
- using a design for your 404 pages that isn't consistent with the rest of your site
- writing sloppy text with many spelling and grammatical mistakes
- embedding text in images for textual content (users may want to copy and paste the text, and search engines can't read it)
- dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation
- rehashing (or even copying) existing content that will bring little extra value to users
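On the 404 point in that list: how you return the correct status code depends on your web server, but as an illustrative sketch, assuming an Apache server (the /404.html path is a placeholder for your own error page):

```apache
# .htaccess sketch: serve a custom error page AND return a real 404 HTTP
# status code when a non-existent page is requested - not a 200, and not a
# redirect to the homepage, which search engines may treat as a "soft 404".
ErrorDocument 404 /404.html
```

You can then verify the header your server actually returns with a crawler or your browser's developer tools; the missing page should report "404 Not Found", not "200 OK".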
Pretty straightforward stuff, but sometimes it's the simple stuff that gets overlooked. Of course, you put the above together with the Google Guidelines for webmasters.
Search engine optimization is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your sites user experience and performance in organic search results.
Don't make these simple but dangerous mistakes...
The aim of any rank modification is to avoid flagging your site as spammy to either the Google algorithms and/or the Google webspam team.
I would recommend you forget about tricks like links in H1 tags etc., or linking to the same page three times with different anchor text on one page.
Forget about "which is best" when considering things you shouldn't be wasting your time with.
Every element on a page is a benefit to you until you spam it.
Put a keyword in every tag and you will flag your site as "trying too hard" if you haven't got the link trust to cut it, and Google's algorithms will go to work.
Spamming Google is often counterproductive over the long term.

- Don't spam your anchor text link titles with the same keyword.
- Don't spam your ALT tags, or any other tags either.
- Add your keywords intelligently.
- Try to make the site mostly for humans, not just search engines.

On-page SEO is no longer as simple as a checklist of "keyword here, keyword there". Optimisers are up against lots of smart folk at the Googleplex, and they purposely make this practice difficult.
For those who need a checklist, this is the sort of one that gets me results:

- Do keyword research
- Identify valuable searcher intent opportunities
- Identify the audience & the reason for your page
- Write utilitarian copy: be useful. Use related terms in your content. Use plurals. Use words with searcher intent, like "buy" and "compare". I like to get a keyword or related term in every paragraph.
- Use emphasis sparingly, to emphasise the important points in the page, whether they are your keywords or not
- Pick an intelligent page title with your keyword in it
- Write an intelligent meta description, repeating it on the page
- Add an image with user-centric ALT attribute text
- Link to related pages on your site within the text
- Link to related pages on other sites
- Your page should have a simple, Google-friendly URL
- Keep it simple
- Share it, and pimp it
You can forget about just about everything else.
What Makes A Page Spam?
There's more on this announcement at SEW.
If A Page Exists Only To Make Money, The Page Is Spam, to Google
"If A Page Exists Only To Make Money, The Page Is Spam" (Google)
In BOTH leaked quality rater guidelines we've seen for Google quality raters, this statement is pretty standout, and it should be a heads-up to any webmaster out there who thinks they are going to make a fast buck from Google organic listings these days.
It should at least make you think about the types of pages you are going to spend your valuable time making.
Without VALUE ADD for Google's users, don't expect to rank.
If you are making a page today with the sole purpose of making money from it, and especially with free traffic from Google, you obviously didn't get the memo.
Consider this from a manual reviewer:
"when they DO get to the top, they have to be reviewed with a human eye in order to make sure the site has quality" (potpiegirl)
Its worth remembering:
- If A Page Exists Only To Make Money, The Page Is Spam
- If A Site Exists Only To Make Money, The Site Is Spam

This is how what you make will be judged, whether that is fair or not.
IS IT ALL BAD NEWS?
Of course not; in some cases, it actually levels the playing field.
If you come at a website thinking it is going to be a load of work and passion, thinking:

- DIFFERENTIATE YOURSELF
- BE REMARKABLE
- BE ACCESSIBLE
- ADD UNIQUE CONTENT TO YOUR SITE
- GET CREDITED AS THE SOURCE OF UNIQUE CONTENT
- HELP USERS (!) IN A WAY THAT IS NOT ALREADY DONE BY 100 OTHER SITES

...then you might actually find you've built a pretty good site, and even a brand.
Google doesn't care about us SEOs or our websites, but it DOES care about HELPING USERS.
So, if you are actually helping your visitors, and not just getting them to another website, you are probably doing at least one thing right.
With this in mind, I am already building affiliate sites differently.
Google has announced they intend to target doorway pages in the next big update. The definition of what a doorway page is will surely evolve over the coming years, and this will all start again, soon.
The last time Google announced they were going after doorway pages and doorway sites was back in 2011, and I have some examples of sites that tanked at the time.
Regarding the images below: all pages on the site seemed to be hit with a -50+ penalty for everything.
First, Google rankings for main terms tanked...
...which led to a traffic apocalypse, of course...
...and they got a nice email from Google WMT:
Google Webmaster Tools notice of detected doorway pages on xxxxxxxx

Dear site owner or webmaster of xxxxxxxx,

We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. Specifically, your site may have what we consider to be doorway pages: groups of "cookie cutter" or low-quality pages. Such pages are often of low value to users and are often optimized for single words or phrases in order to channel users to a single location. We believe that doorway pages typically create a frustrating user experience, and we encourage you to correct or remove any pages that violate our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

Sincerely,
Google Search Quality Team
What Are Doorway Pages?
Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination.

Doorway pages are web pages that are created for spamdexing, that is, for spamming the index of a search engine by inserting results for particular phrases with the purpose of sending visitors to a different page. They are also known as bridge pages, portal pages, jump pages, gateway pages, entry pages and by other names. Doorway pages that redirect visitors without their knowledge use some form of cloaking.

Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our Webmaster Guidelines. Google's aim is to give our users the most valuable and relevant search results. Therefore, we frown on practices that are designed to manipulate search engines and deceive users by directing them to sites other than the ones they selected, and that provide content solely for the benefit of search engines. Google may take action on doorway sites and other sites making use of these deceptive practices, including removing these sites from the Google index.

If your site has been removed from our search results, review our Webmaster Guidelines for more information. Once you've made your changes and are confident that your site no longer violates our guidelines, submit your site for reconsideration.
At the time, I didn't immediately class the pages on the affected sites in question as doorway pages. It's evident that Google's definition of a doorway changes over time.
When I looked in the Google Webmaster Forums, there were plenty of people asking questions about how to fix this at the time, and, as usual, it seemed a bit of a grey area, with a lot of theories... and some of the help in the Google forum is, well, clearly questionable.
A lot of people do not realise they are building what Google classes as doorway pages... and it's indicative that what you intend to do with the traffic Google sends you may, in itself, be a ranking factor not too often talked about.
You probably DO NOT want to register at GWT if you have lots of doorway pages across multiple sites.
Here is what Google has said lately about this algorithm update:

"Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination."

...with examples of doorway pages listed as follows:

- Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
- Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
- Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy
Google also said recently:
Here are questions to ask of pages that could be seen as doorway pages:
- Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site's user experience?
- Are the pages intended to rank on generic terms, yet the content presented on the page is very specific?
- Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site, for the purpose of capturing more search traffic?
- Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
- Do these pages exist as an "island"? Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?

A Real Google-Friendly Website
At one time, a Google-friendly website meant a website built so Googlebot could scrape it properly and rank it accordingly.
When I think "Google friendly" these days, I think of a website Google will rank at the top, if it is popular and accessible enough, and one that won't drop like a f*&^ing stone for no apparent reason one day, even though I followed the Google SEO starter guide to the letter... just because Google has found something it doesn't like, or has classified my site as undesirable, one day.
It is not JUST about original content any more; it's about the function your site provides to Google's visitors, and it's about your commercial intent.
I am building sites at the moment with the following in mind...

- Don't be a website Google won't rank. What Google classifies your site as is perhaps the NUMBER 1 Google ranking factor not often talked about, whether Google determines this algorithmically or, eventually, manually. That is, whether it is a MERCHANT, an AFFILIATE, a RESOURCE or DOORWAY PAGE, SPAM, or VITAL to a particular search: what do you think Google thinks about your website?
- Is your website better than the ones in the top ten of Google now? Or just the same? Ask why Google should bother ranking your website if it is just the same, rather than why it would not, because it is just the same. How can you make yours different? Better.
- Think that, one day, your website will have to pass a manual review by Google; the better rankings you get, or the more traffic you get, the more likely you are to be reviewed. Know that Google classes even some useful sites as spammy, according to leaked documents. If you want a site to fly in Google, it had better do something other than just link to another site for commission.
- Know that, to succeed, your website needs to be USEFUL to a visitor Google will send you, and a useful website is not just a website with a sole commercial intent of sending a visitor from Google to another site, or a "thin affiliate", as Google CLASSIFIES it.
- Think about how Google can algorithmically and manually determine the commercial intent of your website. What are the signals that differentiate a real small business website from a website set up JUST to send visitors to another website? Non-masked affiliate links on every page, for instance, or adverts on your site above the fold, etc., can be a clear indicator of a webmaster's particular commercial intent.
- Google is NOT going to thank you for publishing lots of similar articles and near-duplicate content on your site, so EXPECT to have to create original content for every page you want to perform in Google, or at least not to publish content found on other sites.
- Ensure Google knows your website is the origin of any content you produce (typically by simply pinging Google via XML or RSS). I'd go as far as to say: think of using Google+ to confirm this too. This sort of thing will only get more important as the year rolls on.
- Understand and accept why Google ranks your competition above you. They are either:
1. more relevant and more popular,
2. more relevant and more reputable, or
3. manipulating back-links better than you.
Understand that everyone at the top of Google falls into those categories, and formulate your own strategy to compete. Relying on Google to take action on your behalf is VERY probably not going to happen.
Being relevant comes down to keywords and key phrases: in domain names, URLs, title elements, the number of times they are repeated in text on the page, text in image alt tags, rich markup and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you'll probably trigger spam filters. If it is hidden in on-page elements, beware relying on it too much to improve your rankings.
The basics of GOOD SEO haven't changed for years, though the effectiveness of particular elements has certainly narrowed, or changed in type of usefulness. You should still be focusing on building a simple site using VERY simple SEO best practices. Don't sweat the small stuff, while all the time paying attention to the important stuff: add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT.
Understand how Google SEES your website. CRAWL it, like Google does, with (for example) the Screaming Frog SEO Spider, and fix malformed links or things that result in server errors (500), broken links (400+) and unnecessary redirects (300+). Each page you want in Google should serve a 200 OK header message. And that's all for now.
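The status-code triage described above can be sketched as a tiny script. This is a minimal illustration, not a real crawler; the URLs and status codes below are invented examples.

```python
# Classify crawl results by HTTP status code, as the audit advice suggests:
# every page you want indexed should return 200 OK.

def triage(status_code: int) -> str:
    """Bucket an HTTP status code the way a crawl audit would."""
    if status_code == 200:
        return "ok"              # indexable, nothing to do
    if 300 <= status_code < 400:
        return "redirect"        # consolidate unnecessary redirects
    if 400 <= status_code < 500:
        return "broken link"     # fix or remove links pointing here
    if status_code >= 500:
        return "server error"    # investigate server-side problems
    return "other"

# Hypothetical output of a crawl: URL -> status code returned
crawl_results = {
    "/": 200,
    "/old-page": 301,
    "/missing": 404,
    "/reports": 500,
}

for url, code in crawl_results.items():
    print(f"{url}: {code} -> {triage(code)}")
```

In practice a desktop crawler like Screaming Frog produces this kind of report for you; the point is simply that anything other than "ok" deserves attention.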
This is a complex topic, as I said at the beginning of this in-depth article.
I hope you enjoyed this free DIY SEO guide for beginners. DO keep up to date with the Google Webmaster Guidelines.
If you made it to the bottom here, you should read my Google Panda post, which will take your understanding of this job to a higher level.
Hobo UK SEO A Beginners Guide (2015) Free PDF EBOOK
Congratulations! You've actually just finished reading the first chapter of my 2015 training guide.
Hobo UK SEO A Beginners Guide (2015) is a free PDF ebook, written by myself, that you can DOWNLOAD COMPLETELY FREE from here (2MB), and that contains my notes about driving increased organic traffic to a site within Google's guidelines.
I am based in the UK and most of my time is spent looking at Google.co.uk, so this ebook (and my blog posts) should be read with that in mind. Google is BIG, with many different country-specific search engines with wildly different results in some instances. I do all my testing on Google.co.uk.
It is a guide based on my 15 years experience.
I write and publish to my blog to keep track of thoughts and get feedback from industry and peers. As a result of this strategy I get about 100K visitors a month from Google.
My ebook is meandering; I am not a professional author. But contained within it is largely the information I needed as I took a penalised site to record Google organic traffic levels in 2015.
This is the 3rd version of this document I've published in 6 years, and I hope this, and the previous ones, have reflected my aim of communicating something I have been evidently obsessed with for a long time.
There are no warranties; it is a free PDF. This SEO training guide is my opinions, observations and theories that I put into practice, not advice. I hope you find it useful.
I hope beginners can get something out of it, or out of the links to other high-quality resources it points to.
Click here and subscribe to this blog for free updates.
Pay per click (PPC), also called cost per click, is an internet advertising model used to direct traffic to websites, in which advertisers pay the publisher (typically a website owner or a host of websites) when the ad is clicked. It is defined simply as the amount spent to get an advertisement clicked.
With search engines, advertisers typically bid on keyword phrases relevant to their target market. Content sites commonly charge a fixed price per click rather than use a bidding system. PPC "display" advertisements, also known as "banner" ads, are shown on web sites or search engine results with related content that have agreed to show ads.
In contrast to the generalized portal, which seeks to drive a high volume of traffic to one site, PPC implements the so-called affiliate model, which provides purchase opportunities wherever people may be surfing. It does this by offering financial incentives (in the form of a percentage of revenue) to affiliated partner sites. The affiliates provide purchase-point click-through to the merchant. It is a pay-for-performance model: If an affiliate does not generate sales, it represents no cost to the merchant. Variations include banner exchange, pay-per-click, and revenue sharing programs.
Websites that utilize PPC ads will display an advertisement when a keyword query matches an advertiser's keyword list, or when a content site displays relevant content. Such advertisements are called sponsored links or sponsored ads, and appear adjacent to, above, or beneath organic results on search engine results pages, or anywhere a web developer chooses on a content site.
The PPC advertising model is open to abuse through click fraud, although Google and others have implemented automated systems to guard against abusive clicks by competitors or corrupt web developers.
Pay-per-click, along with cost per impression and cost per order, are used to assess the cost effectiveness and profitability of internet marketing. Pay-per-click has an advantage over cost per impression in that it tells us something about how effective the advertising was. Clicks are a way to measure attention and interest. If the main purpose of an ad is to generate a click, then pay-per-click is the preferred metric. Once a certain number of web impressions are achieved, the quality and placement of the advertisement will affect click through rates and the resulting pay-per-click.
Pay-per-click is calculated by dividing the advertising cost by the number of clicks generated by an advertisement. The basic formula is:
Pay-per-click ($) = Advertising cost ($) / Ads clicked (#)
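As a toy illustration of that calculation (advertising cost divided by clicks), with invented figures:

```python
# Compute the pay-per-click metric from a campaign's spend and click count.
# All numbers below are made-up examples.

def pay_per_click(advertising_cost: float, clicks: int) -> float:
    if clicks == 0:
        raise ValueError("no clicks: pay-per-click is undefined")
    return advertising_cost / clicks

# $500 of spend generating 1,000 clicks costs $0.50 per click
print(pay_per_click(500.00, 1000))
```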
There are two primary models for determining pay-per-click: flat-rate and bid-based. In both cases, the advertiser must consider the potential value of a click from a given source. This value is based on the type of individual the advertiser is expecting to receive as a visitor to his or her website, and what the advertiser can gain from that visit, usually revenue, both in the short term as well as in the long term. As with other forms of advertising targeting is key, and factors that often play into PPC campaigns include the target's interest (often defined by a search term they have entered into a search engine, or the content of a page that they are browsing), intent (e.g., to purchase or not), location (for geo targeting), and the day and time that they are browsing.
In the flat-rate model, the advertiser and publisher agree upon a fixed amount that will be paid for each click. In many cases the publisher has a rate card that lists the pay-per-click (PPC) within different areas of their website or network. These various amounts are often related to the content on pages, with content that generally attracts more valuable visitors having a higher PPC than content that attracts less valuable visitors. However, in many cases advertisers can negotiate lower rates, especially when committing to a long-term or high-value contract.
The flat-rate model is particularly common to comparison shopping engines, which typically publish rate cards. However, these rates are sometimes minimal, and advertisers can pay more for greater visibility. These sites are usually neatly compartmentalized into product or service categories, allowing a high degree of targeting by advertisers. In many cases, the entire core content of these sites is paid ads.
The advertiser signs a contract that allows them to compete against other advertisers in a private auction hosted by a publisher or, more commonly, an advertising network. Each advertiser informs the host of the maximum amount that he or she is willing to pay for a given ad spot (often based on a keyword), usually using online tools to do so. The auction plays out in an automated fashion every time a visitor triggers the ad spot.
When the ad spot is part of a search engine results page (SERP), the automated auction takes place whenever a search for the keyword that is being bid upon occurs. All bids for the keyword that target the searcher's geo-location, the day and time of the search, etc. are then compared and the winner determined. In situations where there are multiple ad spots, a common occurrence on SERPs, there can be multiple winners whose positions on the page are influenced by the amount each has bid. The bid and Quality Score are used to give each advertiser's advert an ad rank. The ad with the highest ad rank shows up first. The predominant three match types for both Google and Bing are broad, exact and phrase match. Google also offers the broad modifier match type, which differs from broad match in that the keyword must contain the actual keyword terms in any order and doesn't include relevant variations of the terms.
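A simplified sketch of the ordering just described: bid multiplied by Quality Score gives an ad rank, and ads are placed in descending ad-rank order. Real auctions are considerably more complex; the advertiser names and numbers here are invented.

```python
# Hypothetical advertisers bidding on the same keyword, each with a
# Quality Score. Ad rank = bid * Quality Score; highest rank shows first.

bids = {"alpha": 2.00, "bravo": 3.50, "charlie": 1.75}
quality_score = {"alpha": 9, "bravo": 4, "charlie": 10}

ad_rank = {name: bids[name] * quality_score[name] for name in bids}
placement = sorted(ad_rank, key=ad_rank.get, reverse=True)

print(placement)  # alpha (18.0) beats charlie (17.5), despite a lower bid
```

Note how a high Quality Score lets a lower bid outrank a higher one, which is the point of blending the two signals.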
In addition to ad spots on SERPs, the major advertising networks allow for contextual ads to be placed on the properties of 3rd-parties with whom they have partnered. These publishers sign up to host ads on behalf of the network. In return, they receive a portion of the ad revenue that the network generates, which can be anywhere from 50% to over 80% of the gross revenue paid by advertisers. These properties are often referred to as a content network and the ads on them as contextual ads because the ad spots are associated with keywords based on the context of the page on which they are found. In general, ads on content networks have a much lower click-through rate (CTR) and conversion rate (CR) than ads found on SERPs and consequently are less highly valued. Content network properties can include websites, newsletters, and e-mails.
Advertisers pay for each click they receive, with the actual amount paid based on the amount bid. It is common practice amongst auction hosts to charge a winning bidder just slightly more (e.g. one penny) than the next highest bidder or the actual amount bid, whichever is lower. This avoids situations where bidders are constantly adjusting their bids by very small amounts to see if they can still win the auction while paying just a little bit less per click.
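The charging rule above can be illustrated with a tiny function. This is a sketch of the generic practice the text describes, not any specific network's billing logic; the figures are hypothetical.

```python
# Winner is charged one penny more than the next-highest bid, or their
# actual bid, whichever is lower.

def price_paid(winning_bid: float, runner_up_bid: float) -> float:
    return round(min(runner_up_bid + 0.01, winning_bid), 2)

print(price_paid(2.50, 1.80))  # winner pays 1.81, not their full 2.50 bid
print(price_paid(2.50, 2.50))  # with a tie, the winner pays their own bid: 2.50
```

This is why shaving your bid by a fraction of a cent gains nothing: the price you pay is driven by the next bidder, not by your own maximum.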
To maximize success and achieve scale, automated bid management systems can be deployed. These systems can be used directly by the advertiser, though they are more commonly used by advertising agencies that offer PPC bid management as a service. These tools generally allow for bid management at scale, with thousands or even millions of PPC bids controlled by a highly automated system. The system generally sets each bid based on the goal that has been set for it, such as maximize profit, maximize traffic at breakeven, and so forth. The system is usually tied into the advertiser's website and fed the results of each click, which then allows it to set bids. The effectiveness of these systems is directly related to the quality and quantity of the performance data that they have to work with; low-traffic ads can lead to a scarcity-of-data problem that renders many bid management tools useless at worst, or inefficient at best.
In 1996, the first known and documented version of a PPC was included in a web directory called Planet Oasis. This was a desktop application featuring links to informational and commercial web sites, and it was developed by Ark Interface II, a division of Packard Bell NEC Computers. The initial reactions from commercial companies to Ark Interface II's "pay-per-visit" model were skeptical, however. By the end of 1997, over 400 major brands were paying between $.005 to $.25 per click plus a placement fee.
In February 1998 Jeffrey Brewer of Goto.com, a 25-employee startup company (later Overture, now part of Yahoo!), presented a pay per click search engine proof-of-concept to the TED conference in California. This presentation and the events that followed created the PPC advertising system. Credit for the concept of the PPC model is generally given to Idealab and Goto.com founder Bill Gross.
Google started search engine advertising in December 1999. It was not until October 2000 that the AdWords system was introduced, allowing advertisers to create text ads for placement on the Google search engine. However, PPC was only introduced in 2002; until then, advertisements were charged at cost-per-thousand impressions or Cost per mille (CPM). Overture has filed a patent infringement lawsuit against Google, saying the rival search service overstepped its bounds with its ad-placement tools.
Although GoTo.com started PPC in 1998, Yahoo! did not start syndicating GoTo.com (later Overture) advertisers until November 2001. Prior to this, Yahoo's primary source of SERPS advertising included contextual IAB advertising units (mainly 468x60 display ads). When the syndication contract with Yahoo! was up for renewal in July 2003, Yahoo! announced intent to acquire Overture for $1.63 billion. Today, companies such as adMarketplace, ValueClick and adknowledge offer PPC services, as an alternative to AdWords and AdCenter.
Among PPC providers, Google AdWords, Yahoo! Search Marketing, and Microsoft adCenter had been the three largest network operators, all three operating under a bid-based model. In 2010, Yahoo and Microsoft launched their combined effort against Google, and Microsoft's Bing began to be the search engine that Yahoo used to provide its search results. Since they joined forces, their PPC platform was renamed AdCenter. Their combined network of third party sites that allow AdCenter ads to populate banner and text ads on their site is called BingAds.
In 2012 Google was ruled to have engaged in misleading and deceptive conduct by the Australian Competition and Consumer Commission in possibly the first legal case of its kind. The Commission ruled unanimously that Google was responsible for the content of its sponsored AdWords ads that had shown links to a car sales website CarSales. The Ads had been shown by Google in response to a search for Honda Australia. The ACCC said the ads were deceptive, as they suggested CarSales was connected to the Honda company. The ruling was later overturned when Google appealed to the Australian High Court. Google was found not liable for the misleading advertisements run through AdWords despite the fact that the ads were served up by Google and created using the companys tools.
Businesses are growing more aware of the need to understand and implement at least the basics of search engine optimization (SEO). But if you read a variety of blogs and websites, you'll quickly see that there's a lot of uncertainty over what makes up the basics. Without access to high-level consulting and without a lot of experience knowing what SEO resources can be trusted, there's also a lot of misinformation about SEO strategies and tactics.
1. Commit yourself to the process. SEO isn't a one-time event. Search engine algorithms change regularly, so the tactics that worked last year may not work this year. SEO requires a long-term outlook and commitment.
2. Be patient. SEO isn't about instant gratification. Results often take months to see, and this is especially true the smaller you are, and the newer you are to doing business online.
3. Ask a lot of questions when hiring an SEO company. It's your job to know what kind of tactics the company uses. Ask for specifics. Ask if there are any risks involved. Then get online yourself and do your own research about the company, about the tactics they discussed, and so forth.
4. Become a student of SEO. If you're taking the do-it-yourself route, you'll have to become a student of SEO and learn as much as you can. Luckily for you, there are plenty of great web resources (like Search Engine Land) and several terrific books you can read. (Yes, actual printed books!) See our What Is SEO page for a variety of articles, books and resources.
5. Have web analytics in place at the start. You should have clearly defined goals for your SEO efforts, and you'll need web analytics software in place so you can track what's working and what's not.
6. Build a great web site. I'm sure you want to show up on the first page of results. Ask yourself, "Is my site really one of the 10 best sites in the world on this topic?" Be honest. If it's not, make it better.
7. Include a site map page. Spiders can't index pages that can't be crawled. A site map will help spiders find all the important pages on your site, and help the spider understand your site's hierarchy. This is especially helpful if your site has a hard-to-crawl navigation menu. If your site is large, make several site map pages. Keep each one to less than 100 links. I tell clients 75 is the max to be safe.
8. Make SEO-friendly URLs. Use keywords in your URLs and file names, such as yourdomain.com/red-widgets.html. Don't overdo it, though. A file name with 3+ hyphens tends to look spammy and users may be hesitant to click on it. Related bonus tip: Use hyphens in URLs and file names, not underscores. Hyphens are treated as a space, while underscores are not.
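As a small illustration of the hyphenated-URL advice, here is a minimal slug generator. The function name and example titles are my own; it simply lowercases a title and joins the words with hyphens, as the tip recommends.

```python
import re

def slugify(title: str) -> str:
    """Make an SEO-friendly file name: lowercase words joined by hyphens
    (not underscores). Keep it to a few words so it doesn't look spammy."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

print(slugify("Red Widgets"))                      # red-widgets
print(f"yourdomain.com/{slugify('Red Widgets')}.html")
```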
9. Do keyword research at the start of the project. If you're on a tight budget, use the free versions of Keyword Discovery or WordTracker, both of which also have more powerful paid versions. Ignore the numbers these tools show; what's important is the relative volume of one keyword to another. Another good free tool is Google's AdWords Keyword Tool, which doesn't show exact numbers.
10. Open up a PPC account. Whether it's Google's AdWords, Microsoft adCenter or something else, this is a great way to get actual search volume for your keywords. Yes, it costs money, but if you have the budget it's worth the investment. It's also the solution if you didn't like the "Be patient" suggestion above and are looking for instant visibility.
11. Use a unique and relevant title and meta description on every page. The page title is the single most important on-page SEO factor. It's rare to rank highly for a primary term (2-3 words) without that term being part of the page title. The meta description tag won't help you rank, but it will often appear as the text snippet below your listing, so it should include the relevant keyword(s) and be written so as to encourage searchers to click on your listing. Related bonus tip: You can ignore the Keywords meta tag, as no major search engine today supports it.
12. Write for users first. Google, Yahoo, etc., have pretty powerful bots crawling the web, but to my knowledge these bots have never bought anything online, signed up for a newsletter, or picked up the phone to call about your services. Humans do those things, so write your page copy with humans in mind. Yes, you need keywords in the text, but don't stuff each page like a Thanksgiving turkey. Keep it readable.
13. Create great, unique content. This is important for everyone, but it's a particular challenge for online retailers. If you're selling the same widget that 50 other retailers are selling, and everyone is using the boilerplate descriptions from the manufacturer, this is a great opportunity. Write your own product descriptions, using the keyword research you did earlier (see #9 above) to target actual words searchers use, and make product pages that blow the competition away. Plus, retailer or not, great content is a great way to get inbound links.
14. Use your keywords as anchor text when linking internally. Anchor text helps tell spiders what the linked-to page is about. Links that say "click here" do nothing for your search engine visibility.
15. Build links intelligently. Begin with foundational links like trusted directories. (Yahoo and DMOZ are often cited as examples, but don't waste time worrying about DMOZ submission. Submit it and forget it.) Seek links from authority sites in your industry. If local search matters to you (more on that coming up), seek links from trusted sites in your geographic area: the Chamber of Commerce, local business directories, etc. Analyze the inbound links to your competitors to find links you can acquire, too. Create great content on a consistent basis and use social media to build awareness and links. (A blog is great for this; see below.)
16. Use press releases wisely. Developing a relationship with media covering your industry or your local region can be a great source of exposure, including getting links from trusted media web sites. Distributing releases online can be an effective link building tactic, and opens the door for exposure in news search sites. Related bonus tip: Only issue a release when you have something newsworthy to report. Don't waste journalists' time.
17. Start a blog and participate with other related blogs. Search engines, Google especially, love blogs for the fresh content and highly-structured data. Beyond that, there's no better way to join the conversations that are already taking place about your industry and/or company. Reading and commenting on other blogs can also increase your exposure and help you acquire new links. Related bonus tip: Put your blog at yourdomain.com/blog so your main domain gets the benefit of any links to your blog posts. If that's not possible, use blog.yourdomain.com.
18. Use social media marketing wisely. If your business has a visual element, join the appropriate communities on Flickr and post high-quality photos there. If you're a service-oriented business, use Quora and/or Yahoo Answers to position yourself as an expert in your industry. Any business should also be looking to make use of Twitter and Facebook, as social information and signals from these are being used as part of search engine rankings for Google and Bing. With any social media site you use, the first rule is: don't spam! Be an active, contributing member of the site. The idea is to interact with potential customers, not annoy them.
19. Take advantage of local search opportunities. Online research for offline buying is a growing trend. Optimize your site to catch local traffic by showing your address and local phone number prominently. Write a detailed Directions/Location page using neighborhoods and landmarks in the page text. Submit your site to the free local listings services that the major search engines offer. Make sure your site is listed in local/social directories such as CitySearch, Yelp, Local.com, etc., and encourage customers to leave reviews of your business on these sites, too.
20. Take advantage of the tools the search engines give you. Sign up for Google Webmaster Central, Bing Webmaster Tools and Yahoo Site Explorer to learn more about how the search engines see your site, including how many inbound links they're aware of.
21. Diversify your traffic sources. Google may bring you 70% of your traffic today, but what if the next big algorithm update hits you hard? What if your Google visibility goes away tomorrow? Newsletters and other subscriber-based content can help you hold on to traffic/customers no matter what the search engines do. In fact, many of the DOs on this list (creating great content, starting a blog, using social media and local search, etc.) will help you grow an audience of loyal prospects and customers that may help you survive the whims of search engines.
Need more advice and guidance on the tips above? Be sure to see our other SEO resources:
Note: This page was first created on June 28, 2007 and has been updated since then to keep it current.
About The Author: Matt McGee is the Editor-In-Chief of Search Engine Land. His news career includes time spent in TV, radio, and print journalism. After leaving traditional media in the mid-1990s, he began developing and marketing websites and continued to provide consulting services for more than 15 years. His SEO and social media clients ranged from mom-and-pop small businesses to one of the Top 5 online retailers. Matt is a longtime speaker at marketing events around the U.S., including keynote and panelist roles. He can be found on Twitter at @MattMcGee and/or on Google Plus. You can read Matt's disclosures on his personal blog. You can reach Matt via email using our Contact page. (Some images used under license from Shutterstock.com.)
There's frequently a perception that these sorts of purchases (big-ticket items purchased by large businesses) are not something that key stakeholders will actually search for.
Several additional variables complicate this further and can be barriers to generating the appropriate search traffic for your B2B company:
Low Useful Search Volume. Frequently, if you sell a really specialised product that solves a very specific problem, you won't have massive numbers of people searching for that item and may struggle to grow traffic. If I am selling human resources software and targeting mid-to-large businesses, variations of "HR software" and "human resources software" may send a few extremely strong early-stage leads, but "niching down" to unique features of my product and specific challenges my prospects confront around my product may start to offer quite low search volume and diminishing returns.
Evolving Search Results. Google is rewarding different kinds of listings today, however, and the page you'd like to see ranking for your desired term can be tough to push high in the search results (and the content you actually can get to rank may well not convert as well as you'd like, and could raise questions with your CEO about why you've got a page about HR applications that doesn't talk about how great your human resources software is).
Think People First, Not Keywords
Like any marketing effort, you want to start with the question:
Who's buying my product?
This exercise is about identifying the person who you'd want to buy your software, not merely the terms you believe people use to describe your product. Ideally, you have already spent time as an organization and a marketing department thinking about this question.
I'd like to get these folks to my website. Business buyers are people, and people search for things, even if they're not searching in the volume I'd like for the most intuitive way to describe my product. I need to get started asking myself questions about these people who are buying software like mine:
What difficulties do these people consistently wrestle with?
What content do they consume on other sites?
How do I create that type of content on my website, and solve those problems?
The answers to these questions will unlock quite a few content ideas for topics that are highly relevant to your target audience. Often, these topics will also represent search terms and keywords that are less competitive and easier to rank for in search results (since they're less obvious and less likely to be targeted by your competitors). There are a number of great ways to get this information, including:
1. Talk To The People You Are Targeting
Innovative idea, right?
An extension of this is to meet regularly with the sales and service people at your organization, to understand what problems customers and prospects most often have, the language they use to describe different issues and feature requests, and the common objections they face.
2. Look At Conference Programs
My firm creates and promotes content for companies. Sometimes this means doing content ideation in a niche we are unfamiliar with. A great early step in fleshing out content ideas is to look at conference agendas.
Organizers here have a strong financial incentive to concentrate presentations and tracks around subjects that are interesting to attendees. To help identify opportunities for my HR software company, I'd look at the agendas for events that HR professionals would be likely to attend:
Using conference programs for B2B keyword research
In this example screenshot from the EBN Benefits Forum & Expo agenda, I can quickly spot some interesting potential content topics, including:
Private Exchange (I could cover the pros and cons, implementation tips, take positions on talking points and so forth)
Private Exchanges vs. Self-Managed Plans
Health Insurance Company Consolidation
That was just the initial agenda I looked at, from the very first conference; as I study a lot of different conferences, I'll start to see common problems, and various combinations of topics I can attack in various content assets.
3. Forums, Support Q&A And Content Sites
I probably have a few of my own, personal support and newsgroup content on my own site. This may be an unmined trove of amazing content ideas. What're my users asking often here? Exactly what are popular feature requests?
Even if I can't build these for my customers instantaneously, detailing a great way to do this manually/outside my applications could be an extremely popular content strength (and will likely map to a search term my prospects are looking for -- if a known segment of your target market is fighting with an issue, it's practically guaranteed a larger cut of folks have the same issue).
It is also possible to utilize exactly the same approach to consider your competitors' forums and support content. If your competitors're featuring a certain support question on the main page of their support section, that's probably because it is a common dilemma their users (who presumably are either my direct prospects or have very similar issues) have.
You can think similarly about popular forum threads, feature requests and more. If Zenefits is a direct competitor for my HR software company, I can see at a glance in their help section how they categorize common questions and issues:
Beyond that, I can plug that subdomain into a tool like SEMrush to see which search terms, specifically, are driving traffic to their help subdomain:
Example of using SEMrush for competitive keyword research for B2B firms
Here is a treasure trove of possible content topics that I know my prospects are likely to be interested in. As I dig into multiple competitors' support sections, I'll again start to see common themes in the subjects being covered and the questions that come up most often.
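One lightweight way to surface those recurring themes is to tally how often each topic appears across the competitor support sections you've reviewed. Here's a minimal sketch; the competitor names and topic labels are made-up examples, not real data:

```python
from collections import Counter

# Hypothetical topic labels pulled from three competitors' help sections
competitor_topics = {
    "competitor_a": ["onboarding", "payroll", "benefits enrollment", "pto tracking"],
    "competitor_b": ["payroll", "benefits enrollment", "reporting"],
    "competitor_c": ["payroll", "pto tracking", "benefits enrollment"],
}

counts = Counter(topic for topics in competitor_topics.values() for topic in topics)

# Topics covered by two or more competitors are strong candidates for content
common_themes = [topic for topic, n in counts.most_common() if n >= 2]
print(common_themes)  # -> ['payroll', 'benefits enrollment', 'pto tracking']
```

In practice you'd populate the lists from your notes as you browse each help section; the point is simply that topics appearing across several competitors are the safest bets.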
4. Content Your Customers And Prospects Are Consuming
What websites do your prospects read often?
5. Tools Your Prospects Are Using
In addition to your product, what other tools are your prospects using?
By helping your prospects evaluate useful tools in categories that are tangential (but not competitive) to your own offering, you can become a trusted source of advice and can frequently rank well for the search terms they're using.
Frequently, these sorts of comparisons will actually outrank the individual tool companies themselves, since a comparison is what these searchers are really looking for, and they'll be more inclined to click on, consume and share it than the sales page of a single tool provider.
"Traditional" Keywords Can Still Be Your Friend, Too: How To Attack Core Keywords And Get More Out Of What's Already Working
Even if a core keyword like "HR software" is highly competitive and hard to rank for with your product page, that doesn't necessarily mean you should ignore it.
You can also focus your efforts on getting more out of the content on your site that is already working (assuming that content exists). In my post on how to squeeze more value from your most important SEO landing pages, I walked through several ways you can capitalize on pages that are already working.
Executing Against Great Topics: Choosing The Right Assets And The Right Offers
By following the process outlined above, you'll likely have a ton of ideas for new, useful content that could drive qualified B2B SEO traffic, in addition to numerous ideas for getting more value from core SEO keywords and from pages that are already driving quality search traffic.
The bad news is, you still have plenty of work to do.
1. Determine Priorities
You first have to triage what's likely a long list of potential opportunities. Here you'll want to look at relevance to your prospects, the potential search volume, and the realistic likelihood that you could rank for these terms.
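One way to make that triage concrete is a simple weighted score per keyword. The formula and the sample numbers below are illustrative assumptions, not a standard metric or real search data:

```python
import math

# (keyword, relevance 0-10, monthly search volume, ranking difficulty 0-100)
# All figures here are invented for the example.
candidates = [
    ("hr software", 10, 18000, 85),
    ("private exchange vs self-managed plans", 8, 400, 25),
    ("how to track pto manually", 7, 900, 15),
]

def priority(relevance, volume, difficulty):
    # Reward relevance and volume, penalize ranking difficulty; log-scale
    # the volume so huge head terms don't automatically dominate the list.
    return relevance * math.log10(volume + 1) * (1 - difficulty / 100)

ranked = sorted(candidates, key=lambda c: priority(c[1], c[2], c[3]), reverse=True)
for keyword, *_ in ranked:
    print(keyword)
```

Notice how the scoring surfaces the specific, low-competition terms above the head term: "HR software" has far more volume, but the difficulty penalty pushes it down the queue.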
2. Map Topics To Content Types
From there, you need to work down the list of likely target keywords and topics and map specific types of content you can create for each of your topics. There are numerous ways to map compelling content types to targeted keywords, and the asset you use will depend on the keywords you're targeting.
Core Keywords. For your core keywords, you may choose to get extremely aggressive and talk frankly about both yourself and all of your competitors, but most businesses will want to take a different approach here.
Low Competition, Specific Terms. For modified variants of your core keywords, or for very specific, lower-competition terms, you may not need a huge resource (shorter content can win sometimes, too) -- a brief, glossary-style overview of a topic may be well placed to rank for the term and may be exactly what searchers were looking for. (Bonus points if you can get your content into the answer box.)
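Those rules of thumb can be encoded as a simple decision helper. The thresholds and asset labels below are assumptions for the sketch, not fixed industry cutoffs:

```python
def suggest_asset(keyword, difficulty, is_core):
    """Suggest a content type for a keyword.

    The rules and the difficulty threshold are illustrative only."""
    if is_core:
        # Core terms usually justify a substantial, aggressive asset
        return "in-depth comparison or pillar page"
    if difficulty < 30:
        # Low-competition, specific terms: short, focused content can win
        return "short glossary-style overview"
    return "detailed how-to guide"

print(suggest_asset("hr software", 85, is_core=True))
print(suggest_asset("private exchange definition", 20, is_core=False))
```

A spreadsheet works just as well for this step; the value is in forcing an explicit asset decision for every keyword on the list.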
3. Create, Promote And Get Leads From Your Content
Finally, you need to create the content, promote it, and map a specific offer to each piece.
Your content creation efforts should be executed with promotion in mind (try to make each asset as easy to promote as possible), and you should have a specific strategy for who will share and link to your content (and why). If you're not sure how to execute on outreach and promotion, there are lots of different resources and plenty of great information on the topic of content promotion.