Texas Domainers and Dev Convention – Gaylord Texan Nov 3-6

Gabriel Baker is hosting the first Texas Domainers and Developers Convention at the Gaylord Texan from November 3 – 6.  Several hundred people are expected to attend, and the four-day event features a number of speakers.  There will also be a silent and live domain auction, with many Texas-oriented names on the block.

Discounted passes and additional information are available at the official site: http://www.tdadc.com/

 

SEA, SEM, and SEO: Who invented these terms?

My latest Search Insider column takes a look at the origins of a few key terms in the digital marketing lexicon: SEM, SEA, and SEO.  It was spurred by Bob Heyman’s article in Search Engine Land last week.

Here is an excerpt from the article:

A story last week on Search Engine Land (“Who Coined The Term SEO?”, by Bob Heyman) got me to thinking about the somewhat nebulous origins of the term “search engine optimization”, or “SEO”, as well as other common search terms such as “SEM” and “SEA”.  There are a number of claimants and claims around the term “SEO”, so I revisited a few of them, and found a few additional interesting facts along the way.

Before I go into the SEO claims, the origins of the terms “SEM” and “SEA” are pretty clear.  In 2001, Danny Sullivan reached a consensus with the readership of Search Engine Watch on the term “search engine marketing”, noting that the organic-centric SEO no longer covered the full range of tactics in the search space, given the rise of pay-per-click.  The phrase “search engine marketing”, or “SEM”, very logically covered a wide range of tactics related to search engine visibility, and somewhat relegated SEO to a subtheme within the overall practice of search marketing (see “Congratulations, You’re A Search Engine Marketer”).

Read the rest here:

http://www.mediapost.com/blogs/search_insider/?p=891

 

Impact of Vanity Generic Top Level Domains Part 2 – Search Insider

Here is the first half of my current column at MediaPost’s Search Insider.  This is a great story, and it is very exciting to see another piece of Internet history happening before us.

Originally published in Search Insider:

 

In my last column, I discussed some of the challenges of moving an existing website to a new vanity generic top-level domain (gTLD).  In this installment, I will review several existing gTLDs, discuss their branding and search impact, and look at the process of applying for a gTLD.

For a quick recap: ICANN, the governing and administering body for all Internet addresses, voted in late June of this year to allow individuals and corporations to apply for .anything, or literally any word or phrase of more than three characters that is not already taken.  A new top-level domain can be used as a registry, or for one’s own Web presence.  Under the new policy, the following names are true possibilities as a home for an online Web presence:

http://www.2020.abc
http://dallas.cowboys
http://bach.music
http://www.checkmy.email
http://yourband.mp3
http://username.myspace
http://caramelmacchiato.starbucks
http://batman.movie
http://thisisspinaltap.imdb
http://olympics.wikipedia

Let your imagination run wild. But first, let’s take a look at other existing generic TLDs.

The branding of a TLD – why .com will always be king

If the marketing novelty of the vanity gTLD seems to outweigh all other considerations, it may be a good exercise to first analyze the current landscape.  Ever heard of .Museum?  Yes, it’s a real working gTLD (see the redirected URL http://nyc.moma.museum; the main URL is http://www.moma.org), though the average Internet user is wholly unaware of its existence.  .Travel has been in existence for almost two years, but very few travel sites have adopted it as their primary address on the Web.  More commonly, major travel and hospitality brands have reserved these names and pointed them at “Brand.com” to capture traffic and to defend their brands and trademarks.  Other extensions such as .Jobs and .Pro have yet to gain mainstream appeal, even though their categories have wide potential within their respective themes.  Another highly anticipated extension, .Mobi, has also failed to gain mainstream adoption as the default address of the mobile Web, with most major brands choosing to host their mobile presence on their legacy brand.com or a subdomain (e.g., m.cnn.com) targeted to mobile devices.

Your own awareness of these gTLDs (or lack thereof) is a direct reflection of how well that TLD was branded.  Enterprise marketers will face the same challenge if and when they move their existing .com presence to a new extension.  Hosting your Web presence on .com benefits from a TLD brand that everyone has helped build.  The .com domain had no brand until U.S. advertisers got behind it, and a valid question to ask is whether your new gTLD is ready to compete against that level of awareness and trust.  The answer is really simple: no single advertiser has the budget to match the collective ad dollars that have promoted .com, which is synonymous with the Internet more than any other domain brand.  This may be obvious to most readers, but marketers should keep this fact in mind as discussions about changing to gTLDs progress in their respective organizations.
 

The birth of the search-optimized Top Level Domain

Shifting gears a little bit, let’s pick back up on the natural search aspect of gTLDs.  Having a generic keyword theme in a vanity gTLD doesn’t guarantee natural search success or authority.  Just like a new domain, a new top-level domain still has to earn its authority in the search engines.  It has long been recognized by SEOs that engines have shown bias and trust towards content and links on …..

Read the rest of the column here:

http://www.mediapost.com/blogs/search_insider/?p=842


Cuil (“cool”) – New search engine intrigues, but I have questions

This one seemed to come out of the blue this morning.  A new search engine debuts, from a team that includes former AltaVista lead engineer Louis Monier.  I ran a few queries, and the most interesting thing I noticed was that it picked up relevant listings that I had not found before.  But I have a few questions and personal thoughts about my experience:

- “Cuil” (pronounced “cool”) could not be any more confusing, and they would have to get “cool.com” to make it right.  The problem is that the owner of that name is well known for holding on to his names, as he can afford to.  Promoting “cuil.com” will only end up driving traffic to “cool.com”, especially if word of mouth picks up.  Google and Yahoo don’t have this problem.

- I saw a lot of relevant images, but they were misaligned with the results.  The layout implies that each image came from the site listed next to it, but often it didn’t.  It’s confusing, and the mismatched images are irrelevant to the listings.

- The layout can be disorienting, and does not allow for easy scanning of the entire list.  It would be nice if users had more control over the results display, including a standard ordered list, a shorter description/snippet, and more prominence for the URL (a lot of trust is placed in the URL).

- The index could stand to be freshened up a bit.  I found pages that had been removed months ago.

- The top 10 could use more diversity at the domain level.  I found a lot of pages from a single domain that had little to offer.

All said, I’m keeping an eye on this one.  There is lots more to review, but this was my first impression.  Danny Sullivan has a detailed review here: http://searchengineland.com/080728-000100.php
 

 

.Anything – Thoughts on new ICANN gTLDs

My latest column is posted at MediaPost Search Insider, the first of a two-part series on the impact of new vanity ICANN generic top level domains (gTLDs).  The title accidentally got hacked off – it should say “.anythinggoes”, so it looks a little out of context in its current state.

http://www.mediapost.com/blogs/search_insider/?p=831

Also, here are some additional columns I wrote for MediaPost that discuss the importance of a domain move, and the importance of planning for search:

Five Tips For Assessing the Value of Natural Search
http://blogs.mediapost.com/search_insider/?p=624

The Unfolding Search Story of Bodog.com
http://blogs.mediapost.com/search_insider/?p=614

Seven Challenges of SEM Planning and Execution
http://blogs.mediapost.com/search_insider/?p=408

Solutions to Seven Challenges of SEM Planning and Execution
http://www.mediapost.com/blogs/search_insider/?p=417

 

New TLDs mark a different kind of gold rush

ICANN’s approval today of the opening of .anything is going to be a wild show, at the very least.

Here is an overview:

http://news.cnet.com/8301-10784_3-9978448-7.html

 

 

Natural-Born Search Killers, Part 2

This column originally appeared in Search Insider on August 2, 2006.  By Rob Garner

 

In my Aug. 2 column, “Natural-Born Search Killers,” I discussed how not having a URL strategy can kill a search presence, detailing four key elements that contribute to the value of site domains and URLs.  (The elements included link equity, positive search engine equity, bookmark equity, and search investment.)  In this article, I will discuss a few additional elements of URL equity:

1) The consequences of redesigning without a URL strategy

2) Questions every marketer and developer should ask before a redesign

3) How to assess the value of existing URLs

4) How to transition existing URL equity

 

The consequences of redesigning without a URL strategy

Redesigning a Web site without paying attention to the inherent value of URLs can have a tremendously negative impact on search visibility.  Here are a few common issues that you should watch for during your site relaunch:

Indexed pages drop out of the search engines.

New site pages are not found by crawlers.

Backlink history is lost.

Navigation becomes difficult, and visitors cannot find what they are looking for.

Bookmarks are rendered useless, leading to “not found” error pages.

Server bandwidth is wasted.

Valuable site traffic is lost.

Conversions and sales are lost.
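
A quick way to surface several of these problems after a relaunch is to run the old URL list back through the server and look at the status codes.  Below is a minimal sketch of that kind of check; the old_urls.txt file and the mappings it implies are my own illustration, not part of the original column.  Each old URL gets a HEAD request with no redirect-following, and anything that comes back 404 instead of 301 is flagged.

import http.client
from urllib.parse import urlsplit

def head_status(url):
    """Issue a HEAD request without following redirects; return (status, Location)."""
    parts = urlsplit(url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc, timeout=10)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("HEAD", path)
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location", "") or ""

if __name__ == "__main__":
    # One pre-redesign URL per line (hypothetical file built from logs or an old sitemap).
    with open("old_urls.txt") as f:
        old_urls = [line.strip() for line in f if line.strip()]
    for url in old_urls:
        status, location = head_status(url)
        if status == 301:
            print(f"OK    {url} -> {location}")
        elif status == 404:
            print(f"LOST  {url}  (bookmarks and backlinks now hit an error page)")
        else:
            print(f"CHECK {url} returned {status}")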

Calling out these issues should emphasize the importance of paying attention to URL strategy and structure. Here are three key questions every marketer or Web designer should ask before planning and redesigning a Web site:

1. Is a search-engine-friendly URL architecture included in the site’s business and technical requirements?

Getting URL structure and search optimization established as business and technical requirements before starting a redesign goes 90 percent of the way toward maintaining positive search equity.  Trying to tackle complex search issues in the middle or at the end of a redesign is a battle that nobody wins.

2. How much URL equity is established in the current site structure?


URL equity should be based on several factors, including the number of backlinks, the quality of those backlinks, the previous investment in search, the age of the domain and URL structure, and positive search engine equity.

3. How can the existing URL structure be preserved?

If you have a sustainable URL structure, it is a search best practice to maintain that structure when going from one site design to another, in order to preserve positive equity.  While you can’t always avoid changes in URL structure (possibly because of user experience changes or technical issues), making every attempt to maintain a consistent structure will provide better long-term benefits for search.

Assessing URL Equity

Once Web site stakeholders understand the value of preserving URL structures in a redesign, there are many considerations for assessing the quality of the existing URL and link structure.  To assess the value of a URL prior to the redesign, consider the following:

Quality of inbound home page and deep site links – Time should be spent reviewing hundreds, or even thousands, of links through a manual backlink check in a major search engine; key links should be identified and prioritized in order of importance.

Age and history of domain and URLs – The age of a domain and its internal URL structure can also have a major positive impact on a site’s search visibility.

Log file history – Based on incoming search engine referrals and link traffic, reading log files can provide a good indication of which internal site pages are performing well (see the sketch following this list).

Liabilities – URLs can also have a negative history that could reduce the search performance of a Web site.  If a site has ever engaged in tactics that search engines don’t approve of, or if its URLs have been banned at any time, you should consider a change of domain.
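
As an illustration of the log-file point above, here is a rough sketch (my own, not from the column) that tallies search engine referrals per URL from a combined-format access log.  The log path and the list of engine hostnames are assumptions; the idea is simply to see which existing pages carry the most search traffic, so their URLs can be protected or redirected carefully during the redesign.

import re
from collections import Counter

# Matches the request path and referrer in a combined-format access log line.
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]*" \d{3} \S+ "(?P<referrer>[^"]*)"')
SEARCH_HOSTS = ("google.", "yahoo.", "msn.", "ask.")  # hypothetical list of engines to count

def search_referrals(log_path="access.log"):
    counts = Counter()
    with open(log_path, errors="replace") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if not match:
                continue
            referrer = match.group("referrer").lower()
            if any(host in referrer for host in SEARCH_HOSTS):
                counts[match.group("path")] += 1
    return counts

if __name__ == "__main__":
    # The URLs at the top of this list are the ones whose equity most needs preserving.
    for path, hits in search_referrals().most_common(20):
        print(f"{hits:6d}  {path}")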

 

Transitioning URL equity

No matter how much you prepare for a smooth transition, some URLs will inevitably change, and some documents will be removed.  In this event, proper redirection techniques are essential to preserving positive search engine visibility.  In most cases, 301 permanent redirects are the best solution both for pointing multiple domains and for pages that have moved to a new location.

301 redirects vs. 302 redirects: When a page is permanently moved to a new location, a 301 status tells the search engine to remove the previous page from the index and to crawl the new location from that point forward.  A 302 status, by contrast, signals a temporary move, so the engine keeps the old URL in its index.
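
To make the difference concrete, here is a small sketch of a server that answers old URLs with the appropriate status code.  The paths and mappings are made up for illustration, and a production site would normally handle this in its Web server configuration rather than in application code.

from wsgiref.simple_server import make_server

# Hypothetical mappings for illustration only.
PERMANENT_MOVES = {
    "/products.html": "/products/",      # moved for good: 301, engines index the new URL
    "/about-us.html": "/about/",
}
TEMPORARY_MOVES = {
    "/sale": "/holiday-sale/",           # short-lived move: 302, the old URL stays indexed
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in PERMANENT_MOVES:
        start_response("301 Moved Permanently", [("Location", PERMANENT_MOVES[path])])
        return [b""]
    if path in TEMPORARY_MOVES:
        start_response("302 Found", [("Location", TEMPORARY_MOVES[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Current content\n"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()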

Pointing multiple domains: Pointing multiple domains at a single site is acceptable to search engines, as long as 301 redirects are used.  When 302 redirects are used, or when the same content is simply served with a 200 status on more than one domain, you may encounter duplicate content issues.  In a worst-case scenario, duplicate content issues can result in the total ban of pages or an entire site, and will often decrease the overall search engine performance of a Web site (though most duplicate content is removed without any site penalty).
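
For the multiple-domain case, the same idea applies at the hostname level.  The sketch below (again using made-up domain names) sends every request that arrives on a secondary domain to the same path on the canonical domain with a single 301, so the engines see one copy of the content and the equity consolidates in one place.

from wsgiref.simple_server import make_server

CANONICAL_HOST = "www.example.com"   # hypothetical primary domain

def app(environ, start_response):
    host = environ.get("HTTP_HOST", "").split(":")[0].lower()
    path = environ.get("PATH_INFO", "/")
    if host and host != CANONICAL_HOST:
        # example.net, example.org, typo domains, etc. all funnel here permanently.
        start_response("301 Moved Permanently",
                       [("Location", f"http://{CANONICAL_HOST}{path}")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Canonical host content\n"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()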

URL rewriting:  If URLs must change, one solution is to rewrite the new URLs in the same format as the old structure, or to create an entirely new search-friendly structure.  URL rewriting gives you more freedom to change platforms and file names while maintaining a consistent naming convention.
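
Here is a rough illustration of the rewriting idea, with made-up patterns: a small rewrite layer keeps the public, search-friendly URL format stable while mapping each request onto whatever file names the new platform actually uses.

import re

REWRITE_RULES = [
    # public URL pattern                       -> internal resource on the new platform (hypothetical)
    (re.compile(r"^/products/([\w-]+)/$"),        r"/catalog/item.py?slug=\1"),
    (re.compile(r"^/news/(\d{4})/([\w-]+)$"),     r"/cms/article.py?year=\1&slug=\2"),
]

def rewrite(public_path):
    """Return the internal path a public URL maps to, or None if no rule matches."""
    for pattern, target in REWRITE_RULES:
        if pattern.match(public_path):
            return pattern.sub(target, public_path)
    return None

if __name__ == "__main__":
    print(rewrite("/products/blue-widget/"))   # -> /catalog/item.py?slug=blue-widget
    print(rewrite("/news/2008/icann-gtlds"))   # -> /cms/article.py?year=2008&slug=icann-gtlds

Because the public format never changes, bookmarks, backlinks, and indexed listings keep working even though the underlying platform has moved on.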