Saturday, December 24, 2011

Competition Analysis Basics for SEO

You can't solidify your targets until you understand what you're up against. All the keyword research in the world won't help you rank for the keyword phrase “windows” in 6 months with a brand new site. So understanding how to analyze your competitors, and get a feel for who you can compete with in a reasonable period of time, is paramount to creating a solid strategy. I'll also be flashing back a bit on keyword strategy. In the last article we closed with a list of potential keyword phrases, the idea that we needed to divide our phrases into major phrases and longtail phrases, and also a new domain (just to keep things realistic). So where do we go from there?

Generally I start at the top. From the highest-searched phrases to the lowest, I do a quick analysis of the major phrases to determine the long term goals and the short term. I also like to look for what I call “holes” – phrases that have competition levels lower than one would expect given the search volume.

So let's use the example from the last article and imagine a US-based downhill mountain bike company, and let's begin with the major targets. The phrases we'll examine for the purposes of this article are the top 10 phrases as ordered by search volume. They are:

mountain bike
mountain bikes
specialized mountain bike
trek mountain bike
mountain bike frame
full suspension mountain bike
cannondale mountain bike
giant mountain bike
mountain bike parts
mountain bike reviews

So what are we looking for? It's obviously not feasible to do incredibly thorough competition analysis at this stage. I've listed 10 phrases here but in reality there are hundreds to consider, so we need a quick(ish) way to determine the competition levels of phrases. First, let's install a couple of tools to help you make some quick decisions: you'll need the Firefox browser and the SEO Quake add-on. Now when you run a search you'll be able to quickly pull the competitor stats.
I like to look at the PageRank, the links to the ranking page, and the links to the domain as a whole. Remember now – this is the basic competitor analysis here. Here are the stats for the top 5 ranking sites across the 10 top phrases (I'll leave out the URLs so there's no promotion):

Phrase: mountain bike

Site 1 – PR6, 70,268 page links, 71,177 domain links
Site 2 – PR6, 262,609 page links, 290,281 domain links
Site 3 – PR5, 0 page links, 604 domain links
Site 4 – PR6, 101,136 page links, 206,397 domain links
Site 5 – PR5, 741 page links, 118,791,902 domain links

Phrase: mountain bikes
Site 1 – PR5, 33,097 page links, 40,747 domain links
Site 2 – PR6, 42,010 page links, 91,385 domain links
Site 3 – PR6, 262,609 page links, 290,281 domain links
Site 4 – PR6, 101,136 page links, 206,397 domain links
Site 5 – PR5, 25,059 page links, 38,132 domain links

Phrase: specialized mountain bikes
Site 1 – PR6, 101,136 page links, 206,397 domain links
Site 2 – PR1, 1 page links, 206,397 domain links
Site 3 – PR4, 2,001 page links, 2,095 domain links
Site 4 – PR5, 734 page links, 738 domain links
Site 5 – PR2, 4 page links, 230 domain links

Phrase: trek mountain bikes
Site 1 – PR6, 65,464 page links, 178,712 domain links
Site 2 – PR4, 108 page links, 178,712 domain links
Site 3 – PR4, 127 page links, 523 domain links
Site 4 – PR4, 2,001 page links, 2,095 domain links
Site 5 – PR0, 0 page links, 3,854,233 domain links

Phrase: mountain bike frame
Site 1 – PR4, 6,348 page links, 44,535 domain links
Site 2 – PR2, 6 page links, 4,303 domain links
Site 3 – PR4, 196 page links, 523 domain links
Site 4 – PR0, 28 page links, 35 domain links
Site 5 – PR1, 0 page links, 294,361,703 domain links

Phrase: full suspension mountain bike
Site 1 – PR4, 58 page links, 178,712 domain links
Site 2 – PR4, 20 page links, 1,729 domain links
Site 3 – PR3, 7 page links, 9,959,894 domain links
Site 4 – PR5, 240 page links, 290,281 domain links
Site 5 – PR3, 0 page links, 294,362,703 domain links

Phrase: cannondale mountain bikes
Site 1 – PR6, 62,614 page links, 91,301 domain links
Site 2 – PR6, 410 page links, 91,301 domain links
Site 3 – PR4, 0 page links, 2,056 domain links
Site 4 – PR3, 3 page links, 80,580 domain links
Site 5 – PR2, 3 page links, 9,959,894 domain links

Phrase: giant mountain bikes
Site 1 – PR3, 7 page links, 136,232 domain links
Site 2 – PR4, 2,001 page links, 2,095 domain links
Site 3 – PR0, 6 page links, 6 domain links
Site 4 – PR4, 2,262 page links, 2,392 domain links
Site 5 – PR2, 1 page links, 60,131 domain links

Phrase: mountain bike parts
Site 1 – PR4, 610 page links, 2,366 domain links
Site 2 – PR4, 851 page links, 4,303 domain links
Site 3 – PR4, 6,348 page links, 44,535 domain links
Site 4 – PR5, 4,612 page links, 20,931 domain links
Site 5 – PR6, 4,612 page links, 20,931 domain links

Phrase: mountain bike reviews
Site 1 – PR6, 262,609 page links, 290,281 domain links
Site 2 – PR5, 240 page links, 290,281 domain links
Site 3 – PR6, 560 page links, 361,873 domain links
Site 4 – PR5, 0 page links, 604 domain links
Site 5 – PR4, 22 page links, 90,123 domain links

Now, I'd definitely look further down my keyword list than this, but for the purposes of this article let's assume this is all we have. If that's the case – what do you suppose would be the primary choice(s)? Were it up to me I'd go with:

mountain bike frame – we have a range of PageRank, a range of links and a range of sites. Basically – we're not up against a wall of high competition, and the search volume is solid.

full suspension mountain bike – a full range of sites. Higher competition than “mountain bike frame”, but we're looking at a phrase that would sell a whole bike, which needs to be considered, and so a slightly higher competition is acceptable.

So of these two phrases what would I do? Well – if this was all we had to work with, I'd select “full suspension mountain bike” as the main phrase and follow that up with “mountain bike frame” as a major secondary phrase, and thus a prime target for proactive internal page link building and optimization.

So now let's look at whether there are any good longtail phrases. In this industry we'll be looking for specific parts. Since going through all the different types of parts would be a nightmare in an article, I'll focus on a couple of parts I ordered recently: a new handlebar and a new rim. To keep things simple I'm going to focus on just a couple of brands in the research, BUT in reality we'd take the extra time and look into all the part types and all the brands that we'd be able to sell on our site.

So for handlebars, here's the long and short of the numbers and competition. Brands researched: Origin and Easton. “easton handlebars”, with 1,000 estimated searches/mth and low competition outside of the manufacturer, is a great start. Further, when we look up the manufacturer we see that the EA70 and EA90 Easton models are both sought after as well.
When we build our site we obviously want to build a structure and hierarchy that are conducive to longtail rankings overall, but what we're looking for here are ideas as to where to put our energies when it comes to content creation and link building. Handlebars look good by search volume. The average sale per item would be around $25.

And now to rims. Brands researched: Mavic and Sun. “mavic rims” and “sun rims” both come in at 1,900 estimated searches, but the competition for “sun rims” is significantly lower, with lower link counts and lower-PageRank sites ranking. The average sale here is also going to be in the $40 to $45 range.

Based on this, my first efforts for the whole site would be “full suspension mountain bike” for the homepage, “mountain bike frame” as a major internal page, and I'd focus my first efforts on “rims” (“sun rims” specifically). Now – we'd of course look further than this, but what we can see is the direction we'd go if all we had to go on was the above data. As noted – were we launching this site we'd look into every brand and every part type and research further than the top 10 phrases, but that would have made for a book, not an article, and let's be honest – it would have been a very boring book unless you were planning on launching a mountain bike site.

So now you've done enough competition analysis (remember – it's basic research we're talking about) to figure out what direction to head in. In my next article I'm going to cover more advanced competition analysis. We'll go in knowing what we want to accomplish in the way of keywords and be working to map out how to take the top spots. Until then – get your campaigns sorted out for potential keywords and keep reading … this is where it gets really interesting.

Google+ and the Potential Impact on SEO

Although you can only join by invitation at this point, you've no doubt heard of Google+, Google's latest attempt to join (or, in time perhaps, completely overtake?) Facebook and Twitter as a must-have social networking tool. In the months before Google+ was launched, Google began implementing the "+1" button as an option for users to signify that they enjoy a particular site or page – an attempt to gather as much raw data as possible about the popularity and social value of sites and content before Google+ was rolled out to the masses. Preceding the Google+ and +1 button was the introduction of real time search, which was able to incorporate search results from Twitter, blogs and Facebook. Google, it would appear, is realizing the immense value of social media and its impact on web search.

Search will continue to have a social element infused into it: the addition of the +1 button will change search results, as will live feeds from Google+ pages, much like Facebook "likes" and Twitter "tweets" currently affect search results by influencing user decisions through their value as endorsements of certain sites and content. Google definitely wants websites to implement the +1 button in their pages so that it can track and measure changes in click-through rates. The +1 button will also be included on all SERPs as well as all Google+ feeds. What this means is that business owners and marketers must ensure that a positive customer experience is, perhaps more than ever before, their primary focus, in the hope that as many users as possible will +1 their site and, in doing so, endorse their business (and by association, reputation).
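For reference, adding the +1 button to a page is a small snippet. A minimal sketch along the lines of Google's documented markup at the time of writing (where you place the button in the page is up to you):

```html
<!-- Load Google's +1 script once per page -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>

<!-- Render a +1 button at this spot in the page -->
<g:plusone></g:plusone>
```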
While it is plain to see that the introduction of the +1 button was merely a precursor/trial balloon for Google+, the potential impact of the +1 button on search could be the bridge between all of the socially oriented sites and tools on the web and their subsequent influence on search results. Recently, Rand Fishkin, head of SEOmoz, decided to test some theories on the subject of social sites influencing search results. He shared a number of un-indexed URLs via Twitter, both before and after Google had unceremoniously aborted the real time search results feature. Fishkin then repeated the process, only this time using Google+. He requested that his followers on Twitter and Google+ share the post, with the only caveat being that they were not to share it outside of the originating site. What this yielded in terms of hard data was that even though Google has dropped real time search, tweets and re-tweets are still assisting page indexation. As for Google+, Fishkin's test page ended up ranking #1 on Google within a few hours. This illustrates that Google+ can also help pages get indexed, if not quite as quickly as Twitter.

But perhaps the most interesting concept presented by Google+, and one that could potentially have a significant impact on SEO, is the "Google Circles" feature. Circles are interesting because they grant users the ability to share whatever they choose with specific groups, or Circles, of people. As Google+ users build their Circles, they will subsequently be able to see the sites that users in their Circles have +1'd in Google's SERPs. This has enormous potential: users will be far more likely to make a choice or purchase based on the recommendation of people they have invited to their Circles – people they know and whose opinions they trust – than on the recommendation or review of a stranger.
Over time, Circles will become much more defined as more user data is integrated into them, and using that data to market effectively could be a potentially powerful SEO strategy. Basically, Google has taken the ideas behind some of its social media competitors' more influential and successful features in an attempt to make search more about real people. Google+ and the +1 button are enabling users to influence online activity and, as such, they will have an effect on search results.

Many experts are already proclaiming that Google+ will have no impact on SEO whatsoever, citing Google Wave and past attempts by Google to get in on the social side of the net as indicators that this new attempt will also fail. While it is far too early to make any kind of definitive statement as to the long term usefulness or impact of Google+ and the +1 button on SEO, citing past failures as the basis for an argument that Google+ is going to fail as well is short sighted at best. The fact of the matter is, social factors are already intertwined with search, and this is only going to become more prevalent, not less, as these sites expand and the way we interact on the internet continues to evolve. Whether Google+ ends up revolutionizing or merely co-existing with established SEO methodology remains to be seen, but the enormous potential of these features is fairly clear: site ranking methods are changing thanks to the +1 button, and this will likely end up creating an altogether new method of SEO in the future.

Understanding The Canonical Tag

Since the Panda updates from Google earlier this year, duplicate content has become an issue that no website owner can afford to overlook. While the update was designed specifically to target low value sites, content farms and scraped content, its primary aim was to reduce the amount of duplicate content that resulted in masses of spam-ridden search results. As a direct result of the updates to the Google search algorithm, many thousands of both legitimate and nefarious sites were penalized with a significant drop in rankings and traffic.


Duplicate content can include the textual content of a website, content scraped from other sites, or similar content on multiple domains. Duplicate content issues also arise from dynamically generated product pages that display the same content across different sorting options; Google sees these pages as duplicate content.

Of the tactics available, the 301 redirect and the more recent canonical tag are the primary weapons in a web developer's arsenal to combat the problems associated with duplicate content. Unfortunately, many aspiring webmasters do not have a clear understanding of what they are, or how or when each method should be employed.


What is a 301 Redirect?

In most cases a 301 redirect is used when you move your domain to a new host. The redirect tells search engines that your site has moved, but still allows you to preserve your rankings. The other common usage of the 301 is to specify the preferred URL of your domain. Typically you can go to either http://www.exampledomain.com or http://exampledomain.com – they are the same site, but search engines treat them as different URLs. The 301 redirect allows you to specify the “proper” domain and retain the strength of the site's ranking so that it is not split between the two.

The limitation of 301s is that they were only designed to work at the domain level and did not address the duplicate content issues arising from having multiple dynamically driven pages. 301s also require that you have access to the web server hosting your site in order to implement them, and an understanding of the syntax used to describe the parameters.

Introducing the Canonical Tag

Prior to the introduction of the canonical tag, duplicate content was simply ignored, and people used link building practices to game the SERPs and determine which version would be listed first. However, this had the negative systemic effect of inundating the SERPs with webspam, which made it increasingly difficult to get quality, relevant results when performing web searches. As a result, Google introduced the canonical tag in early 2009 as a way to resolve some of the major duplicate content issues faced by the search engines.

The canonical tag was designed as a page level element: you edit the “head” of the HTML document and set the parameters. It is a very simple one line code string that is treated in much the same way as a permanent 301 redirect. It ensures that the PageRank, backlinks and link juice flow to the “proper” URL and are not split between duplicates. It is fully supported by Google, Bing, Yahoo and other search engines.

Another scenario in which you may want to use a canonical tag is when you have web pages that produce “ugly” URLs (http://www.example.com/product.php?item=bluewidgets&trackingid=1234&sessionid=5678) due to advanced sorting features, tracking options and other dynamically driven, user-defined options. You can specify that the clean URL – the “proper” or “canonical” version – is at “location B.” Search engines will then index the URL that you have specified and regard it as the correct one.
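In practice the tag is a single line in the “head” of the page; something like the following sketch (the URL here is illustrative):

```html
<!-- Placed in the <head> of the non-www version of the Blue Widgets page,
     pointing search engines at the preferred www version -->
<link rel="canonical" href="http://www.example.com/product.php?item=bluewidgets" />
```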


*This example tells the search engine that the “correct” version of the Blue Widgets page is located at the www version and not the non-www version of the page.

The main difference between a 301 redirect and the canonical tag is that the latter only works within a single domain or subdomain; that is, you cannot go from domain A to domain B. This has the added benefit of alleviating problems associated with 301 hijacks and similar attacks.


Introduction of The Cross-Domain Canonical Tag

In December of 2009, Google announced a cross-domain rel="canonical" link element (http://googlewebmastercentral.blogspot.com/2009/12/handling-legitimate-cross-domain.html), allowing webmasters of multiple sites with similar content to define specific content as fundamentally sourced from a different domain.

A simple scenario in which the cross-domain tag would be used is if you have three related domains that all feature the same article (or product descriptions, etc.). You can use the cross-domain tag to specify the page that is the authority (or preferred page). As a result, the specified page will collect all associated benefits of PageRank and link juice, and you will not be penalized for duplicate content.
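The cross-domain version is the same link element, simply pointing at another domain. A minimal sketch, with hypothetical domain names:

```html
<!-- Placed in the <head> of the duplicate article on the secondary domains,
     naming the authoritative copy on a different domain -->
<link rel="canonical" href="http://www.domain-a.com/original-article" />
```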

In essence the new tag performs the same function as the 301 redirect, but allows for a much more user-friendly method of implementation.

During the release and subsequent promotion of the canonical tag, Matt Cutts stated that “anywhere from 10%-36% of webhosts might be duplicated content.” According to Cutts, there are several effective strategies to combat the problem of duplicate content including:

Using 301 redirects
Setting your preference in Google to the www or non www version in Google’s Webmaster Tools (http://www.google.com/webmasters/ )
Ensuring that your CMS only generates the correct urls
Submitting a sitemap to Google. They will try to use only those URLs in the sitemap in an effort to pick the “best URL”
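The first strategy above, the 301 redirect, is implemented at the server level. On an Apache server, a minimal .htaccess sketch that redirects the non-www domain to the www version might look like this (the domain name is illustrative):

```apache
RewriteEngine On
# Permanently (301) redirect any request for exampledomain.com
# to the same path on www.exampledomain.com
RewriteCond %{HTTP_HOST} ^exampledomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.exampledomain.com/$1 [R=301,L]
```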
301s Versus rel=canonical?
Some people have concerns over how much link juice they will lose if they use a 301 instead of a canonical tag. There is very little difference in the relative amount of PageRank passed between the two methods.

Matt Cutts from Google addressed the problem by stating:

“You do lose some (PageRank) but the amount is pretty insignificant. This is used to try and stop people from using 301s exclusively for everything within their own site instead of hyperlinks.”
Watch the full video where Matt discusses the issue:

The canonical tag is most appropriately used when you cannot get at the server's configuration to implement the 301 directly, as a web technician is typically required to implement the 301 for you.

The Hack
In the video above Matt addresses the question of relative strength loss between using a 301 Redirect and a rel=canonical tag. In a recent blog post (http://searchenginewatch.com/article/2072455/Hacked-Canonical-Tags-Coming-Soon-To-A-Website-Near-You), Beanstalk SEO's CEO, Dave Davies discusses a possible exploit of this “relative strength loss.”

Matt Cutts sent out a Tweet on May 13th stating, “A recent spam trend is hacking websites to insert rel=canonical pointing to hacker's site. If you suspect hacking, check for it.”

The conclusion is that there is a viable exploit of the rel=canonical tag, and that inserting the tag into a hacked page can be a very effective strategy; on par with 301ing the page itself, but even “better” (for the hacker) in that it likely won't be detected by the site owner.

Davies continues by posing the question: “Is this an issue now or just a warning?” – implying that Google is certainly aware of the hack and will be analyzing ways to detect and penalize those planning to attempt it.

Article Take Aways:

The Panda updates have made the issue of duplicate content a priority for site owners to address.
Use 301s whenever possible. They are more widely supported, and any new search engine that comes onto the market will have to support them as well.
301s only work at the domain level (i.e. pointing domainexample.com to www.domainexample.com)
301s also require that you have access to the web server hosting your site in order to implement them
The rel=canonical tag is a more user-friendly method to accomplish the same task as a 301, designed to work within the site's HTML head section.
The cross-domain canonical tag works almost identically to a 301 redirect.



Facebook Beith Not The Devil


Social media is not going away, and nor should it. For all the naysayers out there who deem Facebook the work of the Devil, just remember: social media has always existed. Since the beginning of time, in fact. It was called word of mouth. The baker told the butcher about his latest new blend of grains, the butcher told the housemaid, the housemaid told the tailor, the tailor told the constable, and gradually everyone knew about the fabulous new blend the baker was using in his breads.

Now, with the evolution of technology, social media has become more than just status updates and shared links to videos of cats snuggling with dogs. The business world has finally begun to see the benefit of this techno-word-of-mouth phenomenon that has 500 million (active) users sharing their every thought. Social media is a marketing tool. On one side of the fence it is a means for the baker to put the word out about his new grain blend. On the other side, it is the opportunity for the butcher, housemaid, tailor and constable to learn of the baker's activities and pass along the information. Therefore it makes good business sense for companies to use Yelp, Twitter, Facebook et al. as their voice to the consumer.

Not long ago the American Academy of Pediatrics released an article warning parents about “Facebook Depression”. Parents were directed to watch their Facebook-friendly children for signs of depression stemming from either too much exposure to social media, or negative interactions taking place there. Undoubtedly there were a lot of parents observing their teen's behavior a little more closely after that article hit the cables. Alongside the warning came an explanation of the “Fear of Missing Out”, or FOMO. Apparently adolescents are glued to their laptops and smartphones 24/7, waiting for the tiniest signal from a social media outlet, just so they don't miss out on Justin Bieber tickets. What parents seem to be missing is that FOMO is natural.
What three year old has not had a temper tantrum at bedtime? The teen version of FOMO is just the same, only on steroids. Parents, it's all part of growing up. Didn't you ever go way too far on the side of punk and stick a safety pin through your nose? The difference now is technology, and the corporate world has noticed.

The music industry has always paid close attention to the demographics of its followers, but now it is so much easier to obtain and use that information. Your teen is not only drooling at the mouth for tweets from peers; they are also watching for the latest news on celebrities, musicians, fashion icons, gaming and probably dozens of other subjects parents would shudder to hear of. Young people with after-school jobs have the most disposable income of any demographic, and business executives know that. Your grumpy 15 year old may not be saddled with Facebook Depression, but there is a pretty good chance he or she is sulking over not having the latest Toms or missing out on the Halo release.

That doesn't mean every parent needs to confiscate the internet. Nope – in fact, let it be. This is a learning experience. Adolescents are moody, demanding, boundary-pushing kids still trying to figure out who they are. Let them continue on the journey; they will thank you for it later. Of course the caveat to all of this is good parenting. We still need to be aware of our children's activities and who their friends are. The internet, not social media in particular, is acting as a vehicle for speeding up the learning process for kids today – in both good and bad ways. Accept their use of social media as a means of skill building, but it will take a sound parental influence with a good sense of boundaries to spot dangerous behavior.

Social media is not going away. It is here to stay because no other form of communication gets information out to the masses as quickly.
And since we are all social beings with an insatiable need for information, social media is our drug. Embrace it. Use it to your advantage. Make Twitter work for you instead of the other way around. Feed your Google+ profile through to Twitter, re-tweet industry-related blogs you follow, Bump your contacts to build up your network, Yelp about favorite restaurants. All this in an effort to get traffic, with your name pinned to it, flowing.

Use social media to be more involved with your teen's activities. They can hide, but they can't hide from the web. Their obsession with social media is your ticket to knowing what they are up to. In the end, you will love the fact that social media is not going away.

Friday, April 29, 2011

Advancing Strategy for Social Marketing


"When it comes to digital marketing I believe marketers need to be more strategists & research minded than idea evaluators and implementers."
After discussing social media this year with senior marketers from several large brands, the implementer reference in the above tweet by Shiv Singh really resonates with me.
More brands are taking (social) community management activities back in house while seeking outside expertise to continue guiding decisions around social strategies and applications.
When it comes to the day-to-day of social marketing, corporate competence is rising -- and the "yeah, I get that, but what's next?" mentality is placing a higher demand on strategy with expectations of research (or at least experience) to back it up.
As I've been preparing to speak about Facebook marketing with custom applications at next week's Online Marketing Summit, I've found a common thread in the key takeaways pertains more to strategy than turn-key tactics. The following is a preview of a couple key topics I'll discuss as part of that presentation.
Game Mechanics for Custom Facebook Applications
For those of you sick of hearing about it, I'll start by saying game mechanics are not a magic silver bullet -- and I took great delight in hearing Gowalla CEO Josh Williams proclaim "we don't need no stinkin' badges" at last month's Snowcial.
However, like Williams, those who have an established understanding of game mechanics are better positioned to get ahead. Why? Because it's a matter of better knowing how human behavior works.
If you're aware of certain ingredients that foster a higher propensity for sharing a social experience on Facebook, then you may realize higher fan growth and engagement as a result of implementation.
I touched on the Sanrio/Hello Kitty gifts application as an example of this when discussing social intelligence for Facebook marketing.
Another recent and impressive implementation of game mechanics (and overall digital strategy) is Vail Resorts' EpicMix, which is also promoted on the Snow.com Facebook page.
Although the application doesn't reside on Facebook, the Connect functionality takes full advantage of Facebook sharing via passive, automated check-ins at six separate ski resorts, all enabled by an RFID chip embedded in your ski pass.
"Passive" means you don't need to pull out a mobile device for checking in. Updates to your Facebook feed are automatically posted based on your location with the pass, and one-time Facebook authorization.
A leading game mechanic in play for EpicMix is the use of more than 200 ski pins (digital "stinkin' badges") you can earn based on the locations you ski at each resort, total feet of elevation skied and more. Although Vail Resorts CEO Rob Katz wasn't specific about adoption rate when asked last month, he was very clear that the number of users signing on to share in Facebook exceeded expectations.
Game on.
Strategic Modeling for Social Strategies
While game mechanics address specific strategies from a human behavior perspective, the bigger and equally important picture pertains to how all elements of social marketing work together for the good of a business.
A valuable, but often overlooked practice is to adopt a model that facilitates a framework for strategy. There are a range of options with strategic models, but the one I follow is a layered ("Four Cs") approach:
  • Content: This is the foundational element, focusing not only on the type of content (video, infographic, written, etc.) but also how to apply supporting research to guide its development and/or justification.

  • Context: Think of this second layer as platforms enabling the display and distribution of your content. Facebook, for example, would be an element of context in this model.

  • Campaigns: This layer puts the context in action, addressing key variables around planning, implementation, supporting applications, visibility efforts, communication, and measurement.

  • Community: As the top layer, the strategic focus centers on loyalty achieved through specific campaigns, advocacy, or customer experiences. Community should be viewed as long-term, with the expectation of learning that can be applied to future iterations of strategy and research.
Practically speaking, we as marketers should be both implementers and "idea evaluators." But as strategists, we're called to a higher accountability -- one that distinguishes originality from repurposing, and activity from productivity.

LinkedIn: 5 Useful Tips to Leverage the Waking Giant


I recently heard a CEO refer to LinkedIn as the sleeping giant of social networks. With last week's announcement of 100 million members, the SEC filing in January (in preparation for going public) and 2010 being LinkedIn's first profitable year in the company's eight-year existence – I'd say the giant has awakened.
For those of us active on LinkedIn, it's obvious the company is moving aggressively. Even this week, you may have received an email invitation to spend your first $50 worth of advertising free of charge via LinkedIn Ads (formerly called DirectAds).
Still, there's a lot to be discovered about how to best utilize this social network. Would you believe the most popular advice about using LinkedIn begins with making sure your account profile/setup is designated "100 percent complete," including the upload of a real picture of yourself?
Sure, that's valuable advice -- but a bit elementary, don't you think? Hopefully, you'll find some of the tips described herein even more useful.
Tip 1: Sharing is Caring
LinkedIn began releasing tools in early 2010 to enable new ways to share, but the end-of-year launch of the LinkedIn share button was a game changer -- essentially catching LinkedIn up to Facebook in its ease of bringing outside content in.
As marketers we want our content shared on LinkedIn just as we want it shared on Facebook. The newness of LinkedIn Share Buttons means we don't yet have hard usage stats. Regardless, the recommendation is to start implementing them around appropriately shareable (especially blog-related) content.
Tip 2: Do Your Detective Work
If we agree that knowledge is power, this should become a fundamental exercise in preparing for meetings, calls, insights and talking points. As best stated by Lindsey Pollak: "look up everyone." Even having small amounts of information about people can break the ice in conversations, giving you an edge in presentations or sales calls.
As a helpful reminder, I use the Rapportive plugin in my web browser, giving me fast access (via the right column of my Gmail account) to a person's LinkedIn, Facebook, recent Tweets and more.
Tip 3: Leverage Expert Content with Advertising
Last year, Guy Kawasaki not only pointed to the prospect of winning new business via LinkedIn by answering questions associated with your expertise -- but also recommended promoting unique (blog) content using LinkedIn's small text ads. As of late January 2011, these text ads enable targeting by job title, company and groups.
Whether you're part of a small emerging business or a Fortune 500, the value of establishing and maintaining thought leadership is priceless. If you have the right (expert) content, promoting it in conjunction with the new ad targeting features is a winning approach.
Note: LinkedIn states that "good ads" in their network will have a click-through rate (CTR) greater than 0.025 percent. Yes, it's low -- but arguably consistent with what many report on the lower-end CTR performance of their Facebook ad campaigns. LinkedIn offers some insight to best practices for advertising here.
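To put that threshold in concrete terms, here is a quick back-of-the-envelope sketch (the impression count is a made-up illustration, not LinkedIn data):

```python
# LinkedIn's stated threshold for a "good ad": CTR greater than 0.025 percent.
good_ctr = 0.025 / 100  # convert percent to a fraction -> 0.00025

# Hypothetical campaign: 200,000 impressions at exactly the threshold CTR.
impressions = 200_000
expected_clicks = impressions * good_ctr

print(expected_clicks)  # 50.0 clicks -- a "good" LinkedIn ad by their measure
```

In other words, even a "good" campaign at that CTR needs a large impression volume before the click counts become meaningful, which is worth factoring into your budget planning.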
Tip 4: Be Part of the New News
Beyond promoting your content with ads, the simple sharing of it may also extend the reach of your expertise, thanks to the new LinkedIn Today news delivery format.
This new feature aggregates and delivers shared headlines from your network and industry via tweets and LinkedIn sharing. It also enables topic search functions with categorical filtering through industries, companies, etc.
The connection/algorithm between LinkedIn share buttons and what appears in LinkedIn Today content isn't clear -- but given the nature of how content is shared in LinkedIn Groups, it's certainly high time you join and begin contributing to at least one.
Tip 5: Get Radical with Recruiting
It would be remiss to skip mentioning LinkedIn as a recruiting tool. From resume searching to identification of funding sources, finding people to fill your knowledge gaps is critical to business growth.
  Although you may not see firsthand use of companies targeting their competitors' employees for hire, Marty Weintraub's step-by-step tutorial demonstrates how the guerilla nature of acquiring talent just got kicked up a notch.
With this "awakening," LinkedIn continues demonstrating it warrants our business attention. If you still find yourself in the process of making friends with this giant, I'd love to hear more about how you are getting along. For quickly ramping up your LinkedIn strategy, also consider tapping into some of the starter resources out there from folks like Lewis Howes.


50+ Tools to Automate Your Link Building


Are you a human link builder? If so, ask yourself this: "if a robot link builder existed, what would I still be able to do that it could not?"
Analyze a complex backlink profile and distinguish quality links from spammy ones? Check. Write a funny personal email that gets someone's attention in the right way? Check. Decide when a phone call might be the best outreach method? Check.
And what could the robot do faster and better than you?
Find every link to a site? Check. Automatically search through SERPs and connect each result to external data? Check. Automatically search for contact information on three different pages and score how closely it matched a person's name? Check. Automatically pre-populate data fields in a CRM? Check.
If you've ever heard the phrase "build on your strengths," the lesson for link building is this: we need to automate as much of the routine "robot work" as possible, and spend more time doing what we're best at: being sentient human link builders.
In this post, we'll look at tools that can help link builders shift their workload to computers as much as humanly possible.
Backlink Data
Let's start with the most basic automation. You need tools to research sites' backlink profiles. These tools crawl the web and build a database of raw data about backlinks.
Each tool provides, at minimum, the ability to look up a list of all the pages linking to a URL or domain. Some include detailed information about each link's anchor text, type (text or image), follow status, and the authority of the linking page, and in some cases the ability to group, sort, search, and filter the results.
  • Majestic SEO: A well-regarded index of link data with information about anchor text, authority, Class C IPs, and relevance, not to mention good sorting and filtering. My only complaint is that their pricing and user-interface is a bit confusing.

  • Open Site Explorer: A very user-friendly tool with anchor text data, follow status, and authority. The only downside is the index may miss some links in the "deep web."

  • Yahoo Site Explorer: Known for being relatively fast to find new links (other indexes are updated monthly), but very limited: it can only return 1,000 links per page or domain and offers no "extra" data such as follow status or filtering capabilities. But it's free! Yahoo also offers an API (Yahoo BOSS), which, according to many, is more current than the Site Explorer website.

  • Google: Yes, their "link:" operator leaves much to be desired, but just because it's incomplete doesn't mean it's useless.

  • Blekko: This new search engine offers tons of free backlink data available from a very deep index.
Site-Level Backlink Analysis
Many tools offer backlink reports at the site or URL level, but are limited to only the data points they have available. So then what do you do if you want to filter a site's backlinks down to only followed inbound links, with toolbar PageRank of at least 5, and no more than 50 outbound links?
Enter site-level backlink analysis tools. These tools supplement raw backlink data with an additional set of metrics, often pulled from one or multiple backlink data providers.
  • Link Diagnosis: Powered by Yahoo BOSS, Link Diagnosis uses a Firefox extension to pull up to 1,000 links per page and lookup metrics such as the toolbar PageRank of each URL, whether the link actually was found on the page, follow status, anchor text of each link, and aggregate level reporting.

  • BacklinkWatch!: Also powered by Yahoo, BacklinkWatch! pulls the first 1,000 links for a page (the most Yahoo will give up), and appends the number of outbound links on the source page along with any flags they find (nofollow, image links, etc.).

  • AnalyzeBacklinks: A simple and free tool that analyzes backlinks to a page and appends the anchor text, total number of links, outbound links, and title of each linking page. One feature I like is the ability to flag links that mention a keyword you've selected.

  • SEOBook Link Harvester: Shows backlinks grouped by linking domain, groups them by top level domain (TLD), and provides summary metrics about the number of incoming links and percentage of deep links to the page.

  • SEOBook Back Link Analyzer: A free downloadable tool that pulls backlink data from Google, MSN, and Yahoo, crawls the linking pages, and builds a table of information about each link including follow status, number of outbound links, page title, and more.

  • SearchStatus Plugin for Firefox: A free Firefox extension from iAcquire that pulls the backlinks from Google, Yahoo, and Bing.

  • SEOLink Analysis: Supplements lists of links produced by Google Webmaster Tools and Yahoo Site Explorer with information about each link's PageRank, anchor text, and follow status.

  • WhoLinksToMe: Produces various detailed backlink reports with views by link, anchor text, country, IP, and more. Many charts and graphs to aid in the analysis. Freemium.

  • Many enterprise SEO packages also offer data-rich, site-level backlink analysis, including BrightEdge, SecondStep, RankAbove's Drive, seoClarity, SEO Diver, SISTRIX Toolbox, and gShift Labs. Link building-specific tools such as Advanced Link Manager, Linkdex, and Cemper's LinkResearchTools also offer powerful backlink analysis features.
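The kind of filter described at the top of this section, followed inbound links with toolbar PageRank of at least 5 and no more than 50 outbound links, is easy to sketch once you have raw data exported from any of these tools. The field names here are hypothetical, not any particular vendor's schema:

```python
# Each dict represents one backlink row exported from a backlink tool.
# The field names (followed, pagerank, outbound_links) are illustrative.
backlinks = [
    {"url": "http://example.com/a", "followed": True,  "pagerank": 6, "outbound_links": 30},
    {"url": "http://example.com/b", "followed": False, "pagerank": 7, "outbound_links": 10},
    {"url": "http://example.com/c", "followed": True,  "pagerank": 3, "outbound_links": 12},
    {"url": "http://example.com/d", "followed": True,  "pagerank": 5, "outbound_links": 80},
]

quality = [
    link for link in backlinks
    if link["followed"]
    and link["pagerank"] >= 5
    and link["outbound_links"] <= 50
]

for link in quality:
    print(link["url"])  # only http://example.com/a passes all three filters
```

The point isn't this particular snippet; it's that once the data is out of the tool and into a structured format, any combination of filters becomes a one-liner instead of an afternoon of manual review.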
SERP-Level Backlink Analysis
It seems like every link builder has a preferred set of data when it comes to competitive analysis. So don't expect any single tool to pull every conceivable piece of data and put it all into the same columns you've always used.
You still may find yourself exporting data to Excel and merging with other data sources. But any time saved from manually copying and pasting data into spreadsheets (or hiring and managing people to do so) can be spent on more human, value-added activities.
Anyone without research tools for SERP analysis is at a competitive disadvantage.
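The merge-in-Excel step mentioned above can itself be partly automated. Here's a minimal sketch of joining two hypothetical CSV exports on the linking URL; the column names are made up for illustration, not any vendor's actual format:

```python
import csv
from io import StringIO

# Hypothetical exports from two different backlink tools, keyed on the
# linking URL. Column names are illustrative only.
tool_a = "url,anchor_text\nhttp://a.com/page,widgets\nhttp://b.com/page,gadgets\n"
tool_b = "url,pagerank\nhttp://a.com/page,5\nhttp://c.com/page,2\n"

def rows(text):
    """Parse a CSV export into a dict keyed on the url column."""
    return {r["url"]: r for r in csv.DictReader(StringIO(text))}

merged = rows(tool_a)
for url, row in rows(tool_b).items():
    # Combine fields for URLs both tools found; keep tool-specific rows too.
    merged.setdefault(url, {"url": url}).update(row)

for url in sorted(merged):
    print(merged[url])
```

In practice you'd read the exported files from disk rather than inline strings, but the join logic is the same: one pass, one key, and the manual copy-and-paste step disappears.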
Link Prospecting Tools
There are many ways to find link opportunities, and the tools listed below can only really scratch the surface when it comes to the universe of link opportunities that some creativity and insight can find.
  • Query generators are the original link automation tools. These tools take a keyword and automatically create dozens or hundreds of canned searches to find common link opportunity types (e.g. resource pages, guest posts, and directories). Here are a few of the most popular: SoloSEO, Ontolo's Link Query Generator, SEOBook's Link Suggest, BuzzStream's Link Building Query Generator (disclaimer: I co-founded BuzzStream), and Webconf's Backlink Builder. I would strongly caution anyone using a list of other people's queries to weed out queries that don't make sense for them -- don't just head down a link building path because a tool suggested you seek a link on every "inurl:links.html" page in your industry.

  • Link prospecting tools build upon the query generator idea, but automate the task of visiting each page and compiling additional metrics (and in some cases, contact info). This can save link builders time by enabling them to prioritize the prospects with the highest value. But you can't just take everything the tools give you. Plan to review each prospect to assess its appropriateness to your link building campaign. Here are a few of the most popular: Ontolo, Adgooroo's Link Insight, and Advanced Link Manager.

  • Cocitation or "hub finder" tools help you find sites that link to multiple competitors. Some look across all links to your competitors, and some analyze the top ranking sites for a given keyword. The best known offerings in this area are SEOBook's Hub Finder, Adgooroo's Link Insight, Raven's SiteFinder, LinkResearchTools, Linkdex, WordTracker Link Builder, SEODiver, and Shoemoney Tools.

  • Proprietary technique research tools use a combination of their own search queries and analysis rules to generate a list of screened, quality link prospect opportunities. Link Insight is known for integrating many of Eric Ward's (a.k.a., "Link Moses") research methods, though I wouldn't call it an Eric-SaaS just yet. Ontolo offers a number of proprietary searches, but also leaves a fair bit of detail and control in users' hands.

  • Checklist-driven link building tools give users bite-sized link building tasks, such as "Today you should request a link on DMOZ!" (except their suggestions tend to be more clever than that): LotusJump, Hubspot, DIYSEO, and SEOScheduler.
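The query-generator idea at the top of the list above is simple enough to sketch yourself. The footprint patterns here are illustrative; as cautioned earlier, weed out any that don't fit your niche:

```python
# A minimal query generator in the spirit of the tools above: take a
# keyword and emit canned link-prospecting searches. The footprint list
# is a made-up sample, not any tool's actual query set.
footprints = [
    '"{kw}" intitle:resources',
    '"{kw}" "guest post"',
    '"{kw}" intitle:directory "add url"',
]

def generate_queries(keyword):
    """Expand every footprint pattern with the given keyword."""
    return [f.format(kw=keyword) for f in footprints]

for query in generate_queries("mountain bike"):
    print(query)
```

Each generated string is then pasted (or scripted) into a search engine to surface candidate pages, which is exactly what the hosted tools do at larger scale.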
Next time, I'll cover tools that address contact research, link management (CRM), link outreach management, and link monitoring.
A note about some tools I won't cover: tools that scrape SERPs for sites, automatically extract email addresses, and send mass email blasts; tools that automate directory submission, "article marketing," and blog commenting; services that blindly place link-laden content on an unknown network of sites; or tools that automate reciprocal link exchanges. These tools exist and some people use them, but I have yet to find them beneficial.
The best strategy for using link building automation tools is to first develop a good process for tracking your link prospecting data and managing your outreach via a structured workflow. Once you have data and process in place, you can start automating some of your routine tasks.
The point of using great tools, whether it's an array of three 24" monitors on your desk, an Aeron chair, or fancy link building tools, is to eliminate energy wasted on low-value activities, work in new ways, and free you up to focus on what you, the human link builder, are uniquely suited to do.

Social Design Strategy Working from the Outside In


A few weeks ago at the Facebook Developer Garage in Paris, design strategist Eric Fisher defined social design in the context of the three core components: identity, conversation, and community.
While most discussions on social design encompass the practice of creating optimized user experiences within applications and websites, Fisher's presentation (and ultimate strategic suggestion to marketers) centered upon the most foundational elements to Facebook's success.
As many of us desire to stay sharp on social media strategy, I'm hopeful the interpretive summary of his presentation that follows will provide a helpful reference (if not a reminder) of what we as marketers should keep top of mind when helping businesses meaningfully engage with their customers.


Facebook's Social Design Components Defined
First, we acknowledge that relationships and trust are foundational to any social network. The level of trust depends on the types of relationships we have, which can be categorized on the spectrum between strong ties (family and close friends) and weak ties (short-lived/formal relationships).
In this context, we can consider the following three components, beginning with the inner circle in the diagram below:
core-components.png
  • Identity: Facebook started by enabling us to define our identities within their network. Inherently, our identities are reinforced through strong ties, or relationships with those we most closely trust who are also in the network.

  • Conversation: We insert our identity into the community through conversation. By listening and responding (sharing), we personally benefit from self-expression -- yet also benefit others who learn or are inspired from our contribution to the conversation. Conversation is what Fisher refers to as "the glue between identity and community."

  • Community: As conversation continues, community is developed around various common values and participating identities. The strength of community grows from weak ties that give back via the conversation. The community response may reinforce or even influence our identity. And as the community influences us, even to the extent of our mere participation in the conversation, so may we influence our strong ties.
Fisher's Key Insight
While Facebook has been about growing community from the inside out (starting with identity) -- we as marketers realize our greatest opportunities by approaching it from the outside in, starting with community.
The widespread adoption of Facebook translates to an enormous volume of communities already established -- and fortunately, we have visibility into any number of them. Practically speaking, the exercise from here becomes one of social business intelligence, learning what we can about existing conversations and communities so we can properly define a conversation and add to the identities participating in them.
At a more granular level, we can get into the kind of strategic modeling that specifically addresses the content, context, and campaigns which inspire people to act and share.
Additional Notes on Social Design
As implied at the start, this post could be considered a very different angle on social design. Although the importance of "traditional" social design merits a separate post, the following are some quick insights and references I hope you will also find useful.
First, if we had to boil down the common goal of social design, it would be to maximize the accessibility, ease and usefulness of social interactions between people and content. Since most agree that social media is about conversations, it's easy to recognize there is great value to designing interfaces that best enable them.
Fortunately, we don't need to become user experience (UX) design experts to take advantage of what's been learned. The following are a few great references, many of which provide highly practical examples and tips:


Five months ago I wrote a post titled, "So you wanna be a user experience designer," in which I gathered all of the resources in my UX arsenal: publications and blogs, books, local events, organizations, mailing lists, webinars, workshops, conferences, and schooling. My intent was to give aspiring user experience designers, or even those on the hunt for additional inspiration, a launching pad for getting started.
The response has been pretty remarkable—the link continues to be sent around the Twitterverse and referenced in the blogosphere. I'm really pleased that so many people have found it to be a useful aid in their exploration of User Experience.
In the post I promised that it would be the beginning of a series, and I'm happy to report that Step 2 is finally here: Guiding Principles.
"Guiding principles" are the broad philosophy or fundamental beliefs that steer an organization, team or individual's decision making, irrespective of the project goals, constraints, or resources.
I have collected a set of guiding principles for user experience designers, to encourage behaviors that I believe are necessary to being a successful practitioner, as well as a set of guiding principles for experience design—which I think anyone who touches a product used by humans should strive to follow.
DISCLAIMER: These lists are meant to be both cogent and concise. While there are certainly other universal truths that I may not have noted, the principles below are the ones I consider to be most critical to designing user experiences and are often the most neglected.
I would love to hear your additions and edits in the comments.

5 Guiding Principles for Experience Designers

  1. Understand the underlying problem before attempting to solve it
     Your work should have purpose—addressing actual, urgent problems that people are facing. Make sure that you can clearly articulate the core of the issue before spending an ounce of time on developing the design. The true mark of an effective designer is the ability to answer "why?". Don't waste your time solving the wrong problems.
  2. Don't hurt anyone
     It is your job to protect people and create positive experiences. At the very minimum you must ensure that you do not cause any pain. The world is filled with plenty of anguish—make your life goal not to add to it.
  3. Make things simple and intuitive
     Leave complexity to family dynamics, relationships, and puzzles. The things you create should be easy to use, easy to learn, easy to find, and easy to adapt. Intuition happens outside of conscious reasoning, so by utilizing it you are actually reducing the tax on people's minds. That will make them feel lighter and likely a lot happier.
  4. Acknowledge that the user is not like you
     What's obvious to you isn't necessarily obvious to someone else. Our thought processes and understanding of the world around us are deeply affected by our genetics, upbringing, religious and geographical culture, and past experiences. There is a very small likelihood that the people you are designing for have all the distinctive qualities that make you you. Don't assume you innately understand the needs of your customers. How many people do you think truly understand what it feels like to be you?
  5. Have empathy
     Empathy is the ability to understand and share another person's perspective and feelings. Step outside your box and try really hard to understand the world from another person's point of view. Go out of your way to identify with their needs. If certain things just don't make sense to you, ask more questions. Ask as many questions as you need to until you finally understand. When you really get what makes people tick and why they do what they do, you'll have a much easier time going to bat to make their lives better. If you aren't trying to make people's lives better, what are you even doing here?

20 Guiding Principles for Experience Design

  1. Stay out of people's way
     When someone is trying to get something done, they're on a mission. Don't interrupt them unnecessarily, don't set up obstacles for them to overcome, just pave the road for an easy ride. Your designs should have intentional and obvious paths, and should allow people to complete tasks quickly and freely.
  2. Present few choices
     The more choices a person is presented with, the harder it is for them to choose. This is what Barry Schwartz calls The Paradox of Choice. Remove the "nice to haves" and focus instead on the necessary choices a person must make in order to greatly impact the outcome.
  3. Limit distractions
     It's a myth that people can multitask. Short of chewing gum while walking, people can't actually do two things simultaneously; they end up giving less attention to both tasks and the quality of the interaction suffers. An effective design allows people to focus on the task at hand without having their attention diverted to less critical tasks. Design for tasks to be carried out consecutively instead of concurrently in order to keep people in the moment.
  4. Group related objects near each other
     Layout is a key ingredient to creating meaningful and useful experiences. As a person scans a page for information, they form an understanding about what you can do for them and what they can do for themselves using your services. To aid in that learning process, and to motivate interaction, don't force people to jump back and forth around disparate areas in order to carry out a single task. The design should be thoughtfully organized with related features and content areas appropriately chunked, and…
  5. Create a visual hierarchy that matches the user's needs
     …by giving the most crucial elements the greatest prominence. "Visual hierarchy" is a combination of several dimensions to aid in the processing of information, such as color, size, position, contrast, shape, proximity to like items, etc. Not only must a page be well organized so that it's easy to scan, but the prioritization of information and functionality ought to mimic real world usage scenarios. Don't make the most commonly used items the furthest out of reach.
  6. Provide strong information scent
     People don't like to guess. When they click around your site or product, they aren't doing so haphazardly; they're trying to follow their nose. If what they find when they get there isn't close to what they predicted, chances are they're going to give up and go elsewhere. Make sure that you use clear language and properly set expectations so that you don't lead people down the wrong path.
  7. Provide signposts and cues
     Never let people get lost. Signposts are one of the most important elements of any experience, especially one on the web where there are an infinite number of paths leading in all directions. The design should keep people aware of where they are within the overall experience at all times in a consistent and clear fashion. If you show them where they came from and where they're going, they'll have the confidence to sit back and relax and enjoy the ride.
  8. Provide context
     Context sets the stage for a successful delivery. By communicating how everything interrelates, people are much more likely to understand the importance of what they're looking at. Ensure that the design is self-contained and doesn't break people out of the experience except for when it's entirely necessary to communicate purpose.
  9. Avoid jargon
     Remember that the experience is about them (the customer), not you (the business). Like going to a foreign country and expecting the lady behind the counter to understand English, it's just as rude to talk to your visitors using lingo that's internal to your company or worse, expressions you made up to seem witty. Be clear, kind and use widely understood terminology.
  10. Make things efficient
     A primary goal of experience design is to make things efficient for the human before making things efficient for the computer. Efficiency allows for productivity and reduced effort, and a streamlined design allows more to get done in the same amount of time. Creating efficiency demonstrates a great deal of respect for your customers, and they'll be sure to notice.
  11. Use appropriate defaults
     Providing preselected or predetermined options is one of the ways to minimize decisions and increase efficiency. But choose wisely: if you assign the defaults to the wrong options (meaning that the majority of people are forced to change the selection), you'll end up creating more stress and processing time.
  12. Use constraints appropriately
     Preventing error is a lot better than just recovering from it. If you know ahead of time that there are certain restrictions on data inputs or potential dead ends, stop people from going down the wrong road. By proactively indicating what is not possible, you help to establish what is possible, and guide people to successful interactions. But make sure the constraints are worthwhile—don't be overly cautious or limiting when it's just to make things easier for the machine.
  13. Make actions reversible
     There is no such thing as a perfect design. No one and nothing can prevent all errors, so you're going to need a contingency plan. Ensure that if people make mistakes (either because they misunderstood the directions or mistyped or were misled by you), they are able to easily fix them. Undo is probably the most powerful control you can give a person—if only we had an undo button in life.
  14. Reduce latency
     No one likes to wait. Lines suck. So do delays in an interface. Do whatever you can to respond to people's requests quickly or else they'll feel like you aren't really listening. And if they really have to wait…
  15. Provide feedback
     …tell them why they're waiting. Tell them that you're working. Tell them you heard them and offer the next step along their path. Design is not a monologue, it's a conversation.
  16. Use emotion
     Ease of use isn't the only measure of a positive user experience; pleasurability is just as important. Something can be dead simple, but if it's outrageously boring or cold it can feel harder to get through. Designs should have flourishes of warmth, kindness, whimsy, richness, seduction, wit—anything that incites passion and makes the person feel engaged and energized.
  17. Less is more
     This isn't necessarily about minimalism, but it is important to make sure that everything in the design has a purpose. Some things are purely functional; other things are purely aesthetic. But if they aren't adding to the overall positivity of the experience, then take them out. Reduce the design to the necessary fundamentals and people will find it much easier to draw themselves in the white space.
  18. Be consistent
     Navigational mechanisms, organizational structure and metaphors used throughout the design must be predictable and reliable. When things don't match up between multiple areas, the experience can feel disjointed, confusing and uncomfortable. People will start to question whether they're misunderstanding the intended meaning or if they missed a key cue. Consistency implies stability, and people always want to feel like they're in good hands.
  19. Make a good first impression
     You don't get a second chance! Designing a digital experience is really no different than establishing a set of rules for how to conduct yourself in a relationship. You want to make people feel comfortable when you first meet them, you want to set clear expectations about what you can and can't offer, you want to ease them into the process, you want to be attractive and appealing and strong and sensible. Ultimately you want to ensure that they can see themselves with you for a long time.
  20. Be credible and trustworthy
     It's hard to tell who you can trust these days, so the only way to gain the confidence of your customers is to earn it—do what you say you're going to do, don't over promise and under deliver, don't sell someone out to fulfill a business objective. If you set people's expectations appropriately and follow through in a timely manner, people will give you considerably more leeway than if they're just waiting for you to screw them over.
The above principles are general and can be applied across many types of experiences. However, some products require a more focused set of directives due to their specific audiences or brand goals. Below are examples of guiding principles that have been made public by some of the best-known organizations. Use these as inspiration, but don't think that just following the same instructions will yield the same results.

Thursday, January 27, 2011

List Of Keyword Tools-Spice up your Keyword Research




1. Google Adwords Keyword Tool

This is one of the most popular keyword tools available online right now, and it's free. Many webmasters feel it's enough to use this tool alone for their keyword research, and many consider it the most authentic keyword tool available, since the data comes directly from the most popular search engine: Google itself.

2. Alexa

Although Alexa is not a typical keyword research tool, it does provide reasonably accurate details about a site and which keywords drive the majority of its traffic. This data can be quite useful when you want to analyze your competition and see where it is getting its valuable hits from, making the tool invaluable for research purposes.

3. Compete

Compete is a tool similar to Alexa, but it's said to give more accurate details about a site than Alexa does. Keep in mind that no tool can give truly accurate data about a site; most strive for near accuracy, and Compete stands out from the crowd by providing valuable data, including the terms a particular site ranks highly for. The free version gives very limited keyword information; if you want to expand your list, you'll need to pay for the pro version.

4. Keywordspy

Keywordspy is similar to Compete, but it gives more in-depth details about your competitors, including your PPC competitors. It's one of the best keyword tools out there; if you haven't tried it before, I highly recommend checking it out.

5. Spyfu

Spyfu gives you a very good idea of the keywords you want to focus on, information about who is already ranking well for those keywords, and stats for domains using PPC on them. One limitation of this tool is that its data is collected only from the US and UK, so if you are focusing on markets outside those countries, it might not be very useful for you.

6. Wordtracker

Perhaps the second most popular keyword tool out there after Google's. Unlike Google's tool, its data is collected from metasearch engines such as Dogpile, so it can't be said to give very accurate search volumes, but it is said to give more or less accurate results for keywords. It comes in both free and paid versions; you can opt for the paid version if you want additional functionality and more in-depth keyword research.

White hat versus black hat


SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.

An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. Since the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines; it is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.

White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

White hat SEO is, in many ways, simply effective marketing: making an effort to deliver quality content to an audience that has requested it. Traditional marketing has allowed this through transparency and exposure, and search engine algorithms, such as Google's PageRank, take this into account.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.

Wednesday, January 19, 2011

How to use Twitter For Business

Twitter is a wonderful business tool, not least because it's free; all it will cost is your time (and if that's in short supply, you can hire a social media marketer to manage it for you).

Used well, Twitter can provide good exposure for your business; but you can also damage your brand with social media marketing if you're not careful, so it's worth learning the biggest dos and don'ts before you start using Twitter.

Tip 1: Be yourself and be human

The beauty of Twitter is that it's a huge global community of human beings (mostly; there are spammer accounts but they're easy to spot, block and report). So do show your human side, especially when using your business account. Talk about things that matter to you: funny things your children say, recent achievements, your favourite band or TV show, and so on. Join in with conversations that interest you - be friendly, show emotion, and use smilies if you want to.

On the other hand, don't be too human. Don't share anything you wouldn't share at a real-world business networking event; keep intimate health problems and controversial or potentially offensive opinions to yourself.

Tip 2: Watch how you write

Some people write well, others don't - that's true in all areas of life, not just on Twitter. You don't need to be a bestselling novelist to use Twitter, but it helps if you have basic literacy skills (and if you use Twitter at the website instead of through a client, your Tweets will be spellchecked as you type anyway - which helps).

However good (or bad) your writing skills are, with Twitter's 140-character limit you'll need to be creative with your Tweets. Your Tweets need to be concise yet informative, and often you'll be trying to squeeze in a URL too (URL shortening services like bit.ly and tinyurl.com are lifesavers).

One definite don't is using text speak. Text speak is fine if you're 13, but as a professional adult promoting your business you're just going to look silly, and won't communicate your messages efficiently - unless you're targeting 13 year olds.

Tip 3: Share and share alike

If you have some good news - related to your business or your personal life - share it; everybody loves a good news story.

Do share links - to your website, your blog, your local news service, or anything else that interests your followers - this is a great way to get conversations going. But do remember to explain what the link's about, or your followers will feel less inclined to click it. And don't Tweet the same link over and over; people will quickly become bored and may stop following you.

Do retweet your friends' links, too; they'll be grateful, and so will your followers if the link is interesting and relevant. But here's a very big 'do' - DO make sure you click the link and read the content before sharing it with your followers, or you could end up sharing a page that's irrelevant or offensive, or which contradicts your usual position on the subject.

Tip 4: Be part of the community

Don't treat Twitter as your personal billboard. It's not: it's a community, millions of members strong, and the community as a whole is not very tolerant of users who constantly advertise. Try to stick to the 80-20 rule when you use Twitter for business: no more than 20% of your Tweets should advertise or self-promote, and at least 80% should be non-promotional. If you can get the ratio down to 90-10 or 95-5, even better.

Listen to what people are saying, and join in. Twitter is a network of conversations, so it's good practice to listen and respond to parts of those conversations that interest you; don't just stand in the middle of the room with a megaphone, shouting "I'm fabulous! I'm selling widgets at 20% off this week!" Again - if you wouldn't do it at a business networking event, don't do it on Twitter.

Do retweet your friends' requests for help (for example, charity appeals and sponsorship requests), and do introduce friends who are new to Twitter and could do with some followers. And again: do retweet useful, interesting links from people you follow, but always check links before sending.

Tip 5: Mind your language

Don't use offensive language when representing your business on Twitter; even mild swearwords can put sensitive souls off following you (and besides - cursing in public is hardly professional).

Use Twitter to answer customer questions and solve their problems, by all means; many organizations use Twitter as a customer services tool very effectively. But never, ever use an impolite or impatient tone with a customer. On Twitter, everything you say is out there for everyone to see, so leave your followers with the best possible impression of your brand at all times... the Internet has a very long memory!

Finally - consider this a bonus tip, since it's not really connected to any of the previous ones - try to enjoy yourself when you use Twitter. Try to embrace all that's good about Twitter - the new friendships and business contacts you'll make, the fun hashtags and trending topics, the strong community spirit - and before long you'll be singing (or is that Tweeting?) Twitter's praises to anyone who'll listen.
 

Monday, January 10, 2011

The Distribution of PageRank Regarding Search Engine Optimisation

Up to this point, we have described how the number of pages and the number of inbound and outbound links, respectively, influence PageRank. Here, we will mainly discuss how far PageRank can be shaped for the purpose of search engine optimisation by a website's internal linking structure.
In most cases, websites are hierarchically structured to a certain extent, as illustrated in our example of a web site consisting of the pages A, B and C. Normally, the root page is the one optimised for the most important search phrase. In our example, the optimised page A has an external inbound link from page X, which has no other outbound links and a PageRank of 10. The pages B and C each receive a link from page A and link back to it. If we set the damping factor d to 0.5, the equations for the single pages' PageRank values are given by



PR(A) = 0.5 + 0.5 (10 + PR(B) + PR (C))
PR(B) = 0.5 + 0.5 (PR(A) / 2)
PR(C) = 0.5 + 0.5 (PR(A) / 2)
Solving the equations gives us the following PageRank values:
PR(A) = 8
PR(B) = 2.5
PR(C) = 2.5
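These values can be checked with a short fixed-point iteration. The sketch below (plain Python, no libraries) holds page X's PageRank fixed at 10, as the text does, and repeatedly applies the three equations until they settle:

```python
# Verify the PageRank values for the hierarchical example:
# X (PR 10, one outbound link) -> A; A -> B, C; B -> A; C -> A.
d = 0.5  # damping factor used in the text

pr = {"A": 1.0, "B": 1.0, "C": 1.0}
for _ in range(100):  # the fixed point is reached long before 100 rounds
    pr["A"] = (1 - d) + d * (10 + pr["B"] + pr["C"])
    pr["B"] = (1 - d) + d * (pr["A"] / 2)  # A has two outbound links
    pr["C"] = (1 - d) + d * (pr["A"] / 2)

print(pr)  # converges to PR(A) = 8, PR(B) = PR(C) = 2.5
```

The same loop, with the update rules adjusted, verifies every example in this article.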
It is generally not advisable to work solely on the root page of a site for the purpose of search engine optimisation. In most cases, it is more reasonable to optimise each page of a site for a different search phrase.
We now assume that the root page of our example website provides satisfactory results for its search phrase, but the other pages of the site do not, and we therefore modify the linking structure of the website: we add a link from page B to page C and vice versa to our formerly hierarchically structured example site. Again, page A has an external inbound link from page X, which has no other outbound links and a PageRank of 10. At a damping factor d of 0.5, the equations for the single pages' PageRank values are given by



PR(A) = 0.5 + 0.5 (10 + PR(B) / 2 + PR(C) / 2)
PR(B) = 0.5 + 0.5 (PR(A) / 2 + PR(C) / 2)
PR(C) = 0.5 + 0.5 (PR(A) / 2 + PR(B) / 2)
Solving the equations gives us the following PageRank values:
PR(A) = 7
PR(B) = 3
PR(C) = 3
The result of adding the internal links is an increase in the PageRank values of pages B and C, so they will likely rise in search engine result pages for their targeted keywords. On the other hand, page A will likely rank lower because of its diminished PageRank.
Generally speaking, the more the hierarchically lower pages of a site are interlinked, the more equally PageRank will be distributed among the pages of the site for the purpose of search engine optimisation.
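These values, too, can be confirmed with a short fixed-point iteration in plain Python. Only the update rules differ from the hierarchical case, reflecting the extra links between B and C (each of the three pages now has two outbound links):

```python
# X (PR 10) -> A; A -> B, C; B -> A, C; C -> A, B. Damping factor 0.5.
d = 0.5

pr = {"A": 1.0, "B": 1.0, "C": 1.0}
for _ in range(100):
    pr["A"] = (1 - d) + d * (10 + pr["B"] / 2 + pr["C"] / 2)
    pr["B"] = (1 - d) + d * (pr["A"] / 2 + pr["C"] / 2)
    pr["C"] = (1 - d) + d * (pr["A"] / 2 + pr["B"] / 2)

print(pr)  # converges to PR(A) = 7, PR(B) = PR(C) = 3
```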
Well-Directed PageRank Distribution by Concentration of Outbound Links
It has already been demonstrated that external outbound links tend to have negative effects on the PageRank of a website's web pages. Here, it shall be illustrated how this effect can be reduced for the purpose of search engine optimisation by the systematic arrangement of external outbound links.
We take a look at another hierarchically structured example site consisting of the pages A, B, C and D. Page A has links to the pages B, C and D. Besides a link back to page A, each of the pages B, C and D has one external outbound link. None of those external pages which receive links from the pages B, C and D link back to our example site. If we assume a damping factor d of 0.5, the equations for the calculation of the single pages' PageRank values are given by


PR(A) = 0.5 + 0.5 (PR(B) / 2 + PR(C) / 2 + PR(D) / 2)
PR(B) = PR(C) = PR(D) = 0.5 + 0.5 (PR(A) / 3)
Solving the equations gives us the following PageRank values:
PR(A) = 1
PR(B) = 2/3
PR(C) = 2/3
PR(D) = 2/3
Now, we modify our example site in a way that page D has all three external outbound links while pages B and C have no more external outbound links. Besides this, the general conditions of our example stay the same as above. None of the external pages which receive a link from pages D link back to our example site. If we, again, assume a damping factor d of 0.5, the equations for the calculations of the single pages' PageRank values are given by



PR(A) = 0.5 + 0.5 (PR(B) + PR(C) + PR(D) / 4)
PR(B) = PR(C) = PR(D) = 0.5 + 0.5 (PR(A) / 3)
Solving these equations gives us the following PageRank values:
PR(A) = 17/13
PR(B) = 28/39
PR(C) = 28/39
PR(D) = 28/39
As a result of our modifications, the PageRank values of every page of our site have increased. Regarding search engine optimisation, it is therefore advisable to concentrate external outbound links on as few pages as possible, as long as this does not lessen a site's usability.
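Both variants of this example can be reproduced numerically with the same kind of fixed-point iteration. In the first loop below, the three external links are spread over B, C and D (two outbound links each); in the second, D carries all three external links plus its link to A (four outbound links):

```python
d = 0.5  # damping factor used in the text

# Variant 1: external links spread over B, C and D.
a, b = 1.0, 1.0  # b stands for PR(B) = PR(C) = PR(D)
for _ in range(200):
    a = (1 - d) + d * (3 * (b / 2))   # each of B, C, D passes half its PR to A
    b = (1 - d) + d * (a / 3)         # A splits its PR over three links
print(a, b)  # converges to PR(A) = 1, PR(B) = PR(C) = PR(D) = 2/3

# Variant 2: all three external links concentrated on page D.
a, b = 1.0, 1.0
for _ in range(200):
    a = (1 - d) + d * (b + b + b / 4)  # B and C pass all, D only a quarter
    b = (1 - d) + d * (a / 3)
print(a, b)  # converges to PR(A) = 17/13, PR(B) = PR(C) = PR(D) = 28/39
```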
Link Exchanges for the purpose of Search Engine Optimisation
For the purpose of search engine optimisation, many webmasters exchange links with others to increase link popularity. As it has already been shown, adding links within closed systems of web pages has no effects on the accumulated PageRank of those pages. So, it is questionable if link exchanges have positive consequences in terms of PageRank at all.
To show the effects of link exchanges, we take a look at an example of two hierarchically structured websites, consisting of pages A, B and C and pages D, E and F, respectively. Within the first site, page A links to pages B and C, and those link back to page A. The second site is structured identically, so the PageRank values for its pages do not have to be computed explicitly. At a damping factor d of 0.5, the equations for the single pages' PageRank values are given by
PR(A) = 0.5 + 0.5 (PR(B) + PR(C))
PR(B) = PR(C) = 0.5 + 0.5 (PR(A) / 2)
Solving the equations gives us the following PageRank values for the first site
PR(A) = 4/3
PR(B) = 5/6
PR(C) = 5/6
and accordingly for the second site
PR(D) = 4/3
PR(E) = 5/6
PR(F) = 5/6
Now, two pages of our example sites start a link exchange. Page A links to page D and vice versa. If we leave the general conditions of our example the same as above and, again, set the damping factor d to 0.5, the equations for the calculations of the single pages' PageRank values are given by



PR(A) = 0.5 + 0.5 (PR(B) + PR(C) + PR(D) / 3)
PR(B) = PR(C) = 0.5 + 0.5 (PR(A) / 3)
PR(D) = 0.5 + 0.5 (PR(E) + PR(F) + PR(A) / 3)
PR(E) = PR(F) = 0.5 + 0.5 (PR(D) / 3)
Solving these equations gives us the following PageRank values:
PR(A) = 3/2
PR(B) = 3/4
PR(C) = 3/4
PR(D) = 3/2
PR(E) = 3/4
PR(F) = 3/4
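The six equations can again be iterated to their fixed point in plain Python; the sketch below confirms the values above, and the symmetry between the two sites shows up directly in the result:

```python
# Two mirror-image sites (A, B, C) and (D, E, F) after the A <-> D exchange.
d = 0.5

pr = {p: 1.0 for p in "ABCDEF"}
for _ in range(200):
    pr["A"] = (1 - d) + d * (pr["B"] + pr["C"] + pr["D"] / 3)
    pr["B"] = pr["C"] = (1 - d) + d * (pr["A"] / 3)  # A now has 3 outbound links
    pr["D"] = (1 - d) + d * (pr["E"] + pr["F"] + pr["A"] / 3)
    pr["E"] = pr["F"] = (1 - d) + d * (pr["D"] / 3)

print(pr)  # A and D converge to 3/2; B, C, E and F converge to 3/4
```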
We see that the link exchange makes pages A and D benefit in terms of PageRank, while all other pages lose PageRank. Regarding search engine optimisation, this means that exactly the opposite effect takes place compared to interlinking hierarchically lower pages internally. A link exchange is thus advisable if one page (e.g. the root page of a site) is to be optimised for one important key phrase.
A basic premise for the positive effects of a link exchange is that both involved pages propagate a similar amount of PageRank to each other. If one of the involved pages has a significantly higher PageRank or fewer outbound links, it is likely that all of its site's pages lose PageRank. An important influencing factor here is the size of a site: the more pages a web site has, the more of the PageRank from an inbound link is distributed to other pages of the site, regardless of the number of outbound links on the page that is involved in the link exchange. In that case, the page involved in the link exchange benefits less itself and cannot propagate as much PageRank to the other page involved in the exchange. All the influencing factors should be weighed up against each other before one trades links.
Finally, it should be noted that it is possible for all pages of a site to benefit from a link exchange in terms of PageRank, without the other site taking part in the exchange losing PageRank. This may occur when the page involved in the link exchange already has a certain number of external outbound links which don't link back to that site; in this case, the exchange link diverts relatively little PageRank beyond what the already existing outbound links lose.

The Effect of the Number of Pages

Since the accumulated PageRank of all pages of the web equals the total number of web pages, it follows directly that an additional web page increases the added up PageRank for all pages of the web by one. But far more interesting than the effect on the added up PageRank of the web is the impact of additional pages on the PageRank of actual websites.
To illustrate the effects of additional web pages, we take a look at a hierarchically structured web site consisting of three pages A, B and C, which are joined by an additional page D on the hierarchically lower level of the site. The site has no outbound links. A link from page X, which has no other outbound links and a PageRank of 10, points to page A. At a damping factor d of 0.75, the equations for the single pages' PageRank values before adding page D are given by




PR(A) = 0.25 + 0.75 (10 + PR(B) + PR(C))
PR(B) = PR(C) = 0.25 + 0.75 (PR(A) / 2)
Solving the equations gives us the following PageRank values:
PR(A) = 260/14
PR(B) = 101/14
PR(C) = 101/14
After adding page D, the equations for the pages' PageRank values are given by
PR(A) = 0.25 + 0.75 (10 + PR(B) + PR(C) + PR(D))
PR(B) = PR(C) = PR(D) = 0.25 + 0.75 (PR(A) / 3)
Solving these equations gives us the following PageRank values:
PR(A) = 266/14
PR(B) = 70/14
PR(C) = 70/14
PR(D) = 70/14
As is to be expected, since our example site has no outbound links, the accumulated PageRank of all pages increases by one after adding page D, from 33 to 34. Further, the PageRank of page A rises marginally. In contrast, the PageRank of pages B and C drops substantially.
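Both states of this example can be reproduced with the same iteration scheme, now at d = 0.75. Note how the sum of all values rises by exactly one while the individual values shift:

```python
d = 0.75  # damping factor used in this example

# Before adding page D: X (PR 10) -> A; A -> B, C; B, C -> A.
pr = {"A": 1.0, "B": 1.0, "C": 1.0}
for _ in range(300):
    pr["A"] = (1 - d) + d * (10 + pr["B"] + pr["C"])
    pr["B"] = pr["C"] = (1 - d) + d * (pr["A"] / 2)
print(sum(pr.values()))  # sums to 33: PR(A) = 260/14, PR(B) = PR(C) = 101/14

# After adding page D on the same hierarchical level as B and C.
pr = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0}
for _ in range(300):
    pr["A"] = (1 - d) + d * (10 + pr["B"] + pr["C"] + pr["D"])
    pr["B"] = pr["C"] = pr["D"] = (1 - d) + d * (pr["A"] / 3)
print(sum(pr.values()))  # sums to 34: PR(A) = 19, PR(B) = PR(C) = PR(D) = 5
```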
The Reduction of PageRank by Additional Pages
As we have seen, adding pages to a hierarchically structured website has nonuniform consequences for the already existing pages. The consequences for websites with a different structure shall be shown by another example.
We take a look at a website consisting of three pages A, B and C, which link to each other in a circle. The pages are then joined by page D, which fits into the circular linking structure. The regarded site has no outbound links. Again, a link from page X, which has no other outbound links and a PageRank of 10, points to page A. At a damping factor d of 0.75, the equations for the single pages' PageRank values before adding page D are given by






PR(A) = 0.25 + 0.75 (10 + PR(C))
PR(B) = 0.25 + 0.75 × PR(A)
PR(C) = 0.25 + 0.75 × PR(B)
Solving the equations gives us the following PageRank values:
PR(A) = 517/37 = 13.97
PR(B) = 397/37 = 10.73
PR(C) = 307/37 = 8.30
After adding page D, the equations for the pages' PageRank values are given by
PR(A) = 0.25 + 0.75 (10 + PR(D))
PR(B) = 0.25 + 0.75 × PR(A)
PR(C) = 0.25 + 0.75 × PR(B)
PR(D) = 0.25 + 0.75 × PR(C)
Solving these equations gives us the following PageRank values:
PR(A) = 419/35 = 11.97
PR(B) = 323/35 = 9.23
PR(C) = 251/35 = 7.17
PR(D) = 197/35 = 5.63
Again, after adding page D, the accumulated PageRank of all pages increases by one, from 33 to 34. But now, every page that already existed before page D was added loses PageRank. The more uniformly PageRank is distributed by the links within a site, the more likely this effect is to occur.
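The circular example behaves the same way under iteration. Since every page here has exactly one outbound link, no division appears in the update rules:

```python
d = 0.75

# Three pages in a circle: A -> B -> C -> A, with X (PR 10) -> A.
pr = {"A": 1.0, "B": 1.0, "C": 1.0}
for _ in range(300):
    pr["A"] = (1 - d) + d * (10 + pr["C"])
    pr["B"] = (1 - d) + d * pr["A"]
    pr["C"] = (1 - d) + d * pr["B"]
print(pr)  # converges to PR(A) = 517/37, PR(B) = 397/37, PR(C) = 307/37

# Page D inserted into the circle: A -> B -> C -> D -> A.
pr = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0}
for _ in range(300):
    pr["A"] = (1 - d) + d * (10 + pr["D"])
    pr["B"] = (1 - d) + d * pr["A"]
    pr["C"] = (1 - d) + d * pr["B"]
    pr["D"] = (1 - d) + d * pr["C"]
print(pr)  # every pre-existing page loses PageRank; the sum rises to 34
```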
Since adding pages to a site often reduces the PageRank of already existing pages, it becomes obvious that the PageRank algorithm tends to favour smaller web sites. Bigger web sites can, however, counterbalance this effect by being more attractive for other webmasters to link to, simply because they have more content.
Nonetheless, it is also possible to increase the PageRank of existing pages by adding pages. To achieve this, as little PageRank as possible should be distributed to the additional pages.