From the category archives:

Search Engine Optimization

Build a Facebook Application as Part of Your Search Engine Optimization Effort

by Teresa Valdez Klein on April 7, 2008

Everyone knows that Web 2.0 technologies have permanently shaken up the practice of Search Engine Optimization. But when people discuss the confluence of Web 2.0 and SEO, they’re usually talking about blogging. After all, we all know that search engines love blogs because they’re dynamic, link to each other frequently, and have well-structured code. A blog will usually outperform meta tags and link exchanges on a static website.

But what about Facebook applications? Until recently, search engines weren’t indexing them. But according to Justin Smith of Inside Facebook:

Facebook recently enabled developers to serve XML sitemaps off the apps.facebook.com domain. Sitemaps are used by webmasters to notify search engines of updates to pages and page structure, and generally are a worthwhile exercise in any SEO strategy. Since apps are served from apps.facebook.com, developers get to ride on the back of Facebook’s PageRank – potentially a big leg up on regular web apps.
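
For reference, here’s what a minimal sitemap in the sitemaps.org protocol format looks like. The application path and dates below are placeholders, not a real app:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://apps.facebook.com/yourappname/</loc>
        <lastmod>2008-04-07</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

You point Google at the file through Webmaster Tools, or with a Sitemap: line in your robots.txt, so the crawler knows which pages exist and when they last changed.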

As of this writing, the domain www.facebook.com has a Google PageRank of 8. It’s entirely possible that a well-optimized application page could be ranked by Google as more relevant than a company’s own website. An inbound link from an application page could also boost your own site’s relevance.

If you’re making the case for developing a Facebook application to reach your audience, don’t forget to mention the SEO benefit to your boss.


What is good ‘SEO copywriting’?

by Jason Preston on February 19, 2008

There’s a five-part series at Copyblogger called SEO Copywriting 2.0.

It’s a genuinely useful breakdown of what you can do with your copy to boost your results in Google. I’d recommend reading the whole series for good ideas on how to tailor your blog posts for a better showing.

But it’s a five-part series, and let’s face it, most of us are lazy. So here’s the big not-so-secret secret: almost 90% of what you can do to get good search results is getting linked to.

As Brian Clark puts it:

That’s why any true SEO copywriter is simply a writer who has a knack for tuning in to the needs and desires of the target audience. And due to the pursuit of links, those needs and desires have to be nailed well before you’ll ever show up in the search engines.

“Ask yourself what creates value for your users,” sayeth Google. As those brainy engineers continue to diligently create better algorithms, combined with people-powered social media tagging and blog-driven links, copywriters with a flair for prompting link response and conversions will become vital members of any search engine marketing effort.

In other words, good SEO copywriting is linkbait.

I think that it goes a little bit farther than that, though: I’m betting on Google. Google’s entire business is based around providing the best search results to whoever is searching.

So my strategy has always been this:

  1. Who do I want to reach?
  2. What are they searching for?
  3. What is the best response to that question?

And that’s what I try to write.


Some Great SEO Tricks for WordPress: robots.txt and the Template Hierarchy

by Teresa Valdez Klein on July 20, 2007

Michael Gray from Graywolf’s SEO Blog has a wonderfully helpful video up on YouTube that shows the search engine optimization downside of using WordPress and how to get around it.

According to Gray, one of the biggest issues with WP from an SEO standpoint is that it puts content in a lot of different places:

  • Main Index
  • Categories
  • Date Archives
  • Author Archives

Duplicate content is a major problem in SEO because it confuses Google. When Google is confused, it gives lower priority to your content. You want to keep nice little silos for all of your information.
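
One common fix, and the robots.txt half of this post’s title, is to pick a single archive view to keep and fence the rest off from crawlers. Here’s a minimal sketch, assuming you keep the main index and category pages, and that your permalinks don’t start with the date (otherwise the year rules below would block the posts themselves):

    # Keep posts, the main index, and category archives;
    # block the duplicate archive views.
    User-agent: *
    Disallow: /author/
    Disallow: /2006/
    Disallow: /2007/

Which silo you keep is a judgment call; the point is that each piece of content should be reachable through exactly one archive path.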



Classic Example of the Inferior Findability of Traditional Web Sites, and How the Blogger Ecosystem Can Help the “Dinosaurs”

by Steve Broback on July 11, 2007

As many of our conference attendees know, I put a high priority on finding news and relevant content hidden away in traditional HTML so we can introduce it into the RSS ecosystem. Done right, this avoids contributing to the echo chamber and creates what economists call a “Pareto improvement”: everyone involved is made better off and no one is made worse off.

A classic example is a post I made today on our bigbusinessjet site. Here’s the chronology:

1) My favorite Firefox plugin, Update Scanner, noticed that one of the better (yet archaic) HTML subject-expert sites we monitor had posted a new article. Aviation gurus Conklin & de Decker have written a piece about aircraft leasing. See below: Update Scanner even highlights the new item on the page.

[Screenshot: Update Scanner highlighting the new article on the monitored page]

2) Click through to the article and read.

[Screenshot: the Conklin & de Decker article page]

Note: at this stage, Google doesn’t yet know the article exists.

[Screenshot: Google search showing the article not yet indexed]

Nor has Google indexed anything (yet) with the same string I used for my post headline.

[Screenshot: Google search showing no results for the headline string]
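
If you want to reproduce this kind of check yourself, two standard Google query forms do the work (the domain below is a placeholder, not Conklin & de Decker’s actual address):

    "some exact headline text"
    site:example.com aircraft leasing

The first, a quoted phrase, asks whether Google has indexed any page containing that exact string; the second restricts results to pages Google has indexed from a single domain.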

3) Write an overview post, link back, and encourage readers to click through.
We get a nice post that’s relevant to our readers, Conklin & de Decker gets an inbound link and the resulting traffic. We win, Conklin & de Decker wins, and (see below) readers who previously had no idea this content existed can now find it.

The good news: five minutes later, Google had already indexed my post.

[Screenshot: Google search showing our post indexed]

The not-so-good news (which should self-correct in a few hours or days): Google sees us, but not yet Conklin & de Decker, for the article-title search string.

[Screenshot: Google results showing our post but not the Conklin & de Decker article]

We’ll follow this over the next few days and see what Google picks up on and when.


In Search of Traffic: How to do all the SEO Tricks in Today’s Wall Street Journal Report with a Business Blog

by Teresa Valdez Klein on April 30, 2007

Today’s Wall Street Journal Report on Search Engine Optimization offers some great advice for businesses that want to boost their search engine rankings. Their approach combines traditional search engine optimization techniques with blog-based evangelism.

But while the article mentions blogs here and there, it never states explicitly that a blog covers most of what the interviewed experts recommend, without a lot of fuss. After the jump, I’ve broken the article down into its basic components and explained why a blog can help you do just about everything the Journal article suggests.


Alexa Rankings Debunked? I Still Think It’s Useful.

by Teresa Valdez Klein on March 5, 2007

John Battelle linked today to Peter Norvig’s comparison study of Alexa vs. actual site statistics.

We all know that Alexa ranking is a fuzzy measure of a site’s actual traffic, and it is extremely vulnerable to selection bias: it only counts visitors who browse with the Alexa toolbar installed. But I contend that it’s still a useful tool.

When we were working out which bloggers should get press passes to CES, we looked at Alexa ranking as one of many factors in deciding whether a blogger had a large enough audience to qualify as “press.” Combined with a number of other qualitative and quantitative factors, Alexa rank can be a good indicator.
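
To make “one of many factors” concrete, here’s a hypothetical sketch in Python of how several signals might be blended so that no single number decides the outcome. The factor names, weights, and caps are all illustrative assumptions, not our actual rubric:

    def press_pass_score(alexa_rank, inbound_links, posts_per_month):
        """Blend several audience signals into one score; every weight is an assumption."""
        # Alexa rank: lower is better, so invert it into a 0-to-1 signal.
        alexa_signal = min(1.0, 100000 / max(alexa_rank, 1))
        # Cap the other signals so one outlier can't swamp the blend.
        link_signal = min(1.0, inbound_links / 500)
        activity_signal = min(1.0, posts_per_month / 20)
        return 0.4 * alexa_signal + 0.4 * link_signal + 0.2 * activity_signal

    # A blog ranked 50,000 on Alexa, with 200 inbound links, posting daily:
    print(press_pass_score(50000, 200, 30))  # about 0.76

The exact numbers matter far less than the structure: any one input can be gamed or biased, so none of them gets to dominate.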

Norvig’s results should serve as a useful reminder that no one statistic or qualitative assessment — especially one that is susceptible to so much bias — should be used as the definitive indicator of a site’s merit.

