
Article Archive by Lee Zoumas


June 10 2009

Search Engine Optimization Toolkit for Microsoft IIS 7

by Lee Zoumas

Recently, Microsoft released a beta version of an SEO Toolkit that can be integrated with its web server software, IIS 7 (Internet Information Services), which is part of the Windows Server 2008 installation. The toolkit provides several SEO-related tools for web developers and web server administrators. It shows that Microsoft is committed to making a website’s content more SEO friendly by offering free tools that provide detailed, real-time analysis of a site’s structure, an increasing trend for the company. The SEO Toolkit offers a variety of features, including site analysis, robots exclusion, and sitemaps and sitemap indexes, all accessible through an easy-to-use interface integrated into the IIS 7 administration tool.

Web developers and web administrators have been waiting a long time for a tool like this. The SEO Toolkit will help identify multiple SEO errors, such as duplicate content, broken links, invalid markup, and poorly written title tags, meta descriptions and meta keywords. Traditionally, developers using IIS had to rely on external tools to get this valuable data. Now these tools are available in the same interface where other parts of a website are administered. Although the SEO Toolkit will take a while to catch on, since it is still a beta version for a brand new operating system, I think it’s a great step in the right direction for Microsoft.
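To give a sense of the kinds of checks the toolkit automates, here is a minimal Python sketch (not part of the toolkit, and the URLs below are placeholders) that flags missing title tags, missing meta descriptions, and titles reused across more than one page:

# Rough illustration of common SEO checks: missing/duplicate titles and
# missing meta descriptions. The page list is a placeholder.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen

class HeadScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.has_description = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.has_description = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

pages = ["http://www.example.com/", "http://www.example.com/about"]  # placeholders
titles = defaultdict(list)
for url in pages:
    scanner = HeadScanner()
    scanner.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
    if not scanner.title.strip():
        print(url, "is missing a title tag")
    if not scanner.has_description:
        print(url, "is missing a meta description")
    titles[scanner.title.strip().lower()].append(url)

for title, urls in titles.items():
    if title and len(urls) > 1:
        print("Duplicate title", repr(title), "used by:", ", ".join(urls))

The toolkit performs this kind of analysis (and much more) automatically against a live crawl of the site, directly inside the IIS 7 administration interface.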

To read more about the Search Engine Optimization Toolkit for IIS 7 please visit http://www.iis.net/extensions/SEOToolkit

May 15 2009

Domain Misconfigurations and Duplicate Content

by Lee Zoumas

An often overlooked part of establishing a web presence is paying close attention to how a domain name is configured. Usually, we buy a domain name from a registrar, buy some hosting, and then point that domain name to the host using the registrar’s control panel. However, some registrars or hosts have a wildcard setting that allows all possible combinations of subdomains to resolve back to your main domain. For example, imagine you buy a domain called example.com. Typically, you would want users to browse the site in one of two ways:

http://example.com
http://www.example.com

Ideally, you would want the non-www version to 301 redirect to the www version; otherwise you would have duplicate content. However, wildcard subdomains are sometimes enabled by default in registrar or hosting control panels. Typical settings for a domain configuration will look like this:

[Chart: typical domain configuration settings, including a “* (All Others)” wildcard entry that routes to the main domain]

The problem with the above settings lies in the “* (All Others)” wildcard setting. This basically says “route any subdomain to the main domain.” So all possible combinations will lead back to the main website:

http://wwww.example.com
http://abc.example.com
http://spam.example.com
http://duplicatecontent.example.com

The list could go on and on and could get downright nasty. Basically, what this means is that someone could create a page that links to all of these subdomain variations, and those pages would get indexed by search engines, creating duplicate content for your website. That is why it is imperative to make sure that your domains are not configured this way.
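One quick way to check whether a domain you manage is configured this way is to resolve a clearly made-up subdomain and see whether it points back to the main site. Here is a minimal Python sketch of that check (example.com is a placeholder, and this tests DNS resolution only, not how the web server answers the request):

# If a random, made-up subdomain resolves to the same address as the main
# domain, a "* (All Others)" style wildcard is probably enabled.
import socket
import uuid

def has_wildcard(domain):
    try:
        main_ip = socket.gethostbyname(domain)
    except socket.gaierror:
        return False
    random_sub = "%s.%s" % (uuid.uuid4().hex[:12], domain)
    try:
        return socket.gethostbyname(random_sub) == main_ip
    except socket.gaierror:
        # The made-up subdomain does not resolve, so no wildcard is configured.
        return False

print(has_wildcard("example.com"))

If the check comes back true and you did not intend to serve wildcard subdomains, the setting can usually be removed in the registrar or hosting control panel.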

March 16 2009

Windows Web Servers and Case Sensitive URLs

by Lee Zoumas

A few weeks ago we noticed that one of our clients had a single web page showing up multiple times in Google. In the SEO world, this is known as duplicate content and is generally frowned upon. The URLs for the two entries in question looked something like this:

http://www.domain.com/SubFolder
http://www.domain.com/subfolder

As you can see, the difference in case is what makes search engines treat these as two separate pages, even though they point to the same physical page. In the Linux world, this would not matter so much: URLs are case sensitive, so one of the above URLs would throw a 404 (Page Not Found) error, and in effect only one page would get indexed. In the Windows world, however, URLs are not case sensitive, so both URLs would serve up the same webpage. Upon further examination of the internal linking structure, we noticed that the same case (lower) was being used to reference all of the internal URLs. The problem came from the outside world pointing to the same page in various case combinations, which meant that people linking to the website externally could potentially give it duplicate content penalties from search engines.

The solution to this problem is quite simple and elegant. We added one rewrite rule to the website’s .htaccess file, which permanently redirected (301) any URL containing upper case characters to its all lower case equivalent. Since you cannot predict or enforce how people link to your website, we strongly suggest you use this simple solution to prevent this duplicate content penalty.
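The rewrite rule itself is specific to the server configuration, but the underlying logic is easy to illustrate. Here is a minimal sketch of the same behavior as a Python WSGI middleware (purely illustrative; it is not the rule we added): if the requested path contains any upper case characters, respond with a permanent redirect to the all lower case version; otherwise pass the request through.

# Illustration only: 301-redirect any path containing upper case characters
# to its lower case equivalent, preserving the query string.
def lowercase_redirect(app):
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "")
        if any(c.isupper() for c in path):
            query = environ.get("QUERY_STRING", "")
            location = path.lower() + ("?" + query if query else "")
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)
    return middleware

Whatever form the rule takes, the key point is that the redirect is permanent (301), so search engines consolidate all of the case variations into a single indexed URL.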
