What are the best practices for optimized website design and user experience? How can you design an attractive, user-friendly website that maximizes your ability to be found in the Search Engine Results Pages and drives conversions? Read our expert tips for optimized design and user experience, compelling aesthetic design, website architecture, usability and more.
Many moons ago, I wrote a blog post about properly handling 404 (page not found) errors with ASP.NET. But an often overlooked and underused approach to error handling is the custom 500 error page. For those not familiar with a 500 error, it is the error returned when an unhandled exception occurs on the server. An exception can be triggered by anything from incorrect logic introduced into the code by the developer (not me, of course) to malicious data entered by a user into a form field. If the developer chooses not to implement a custom 500 error page, the end user will be presented with an often very unattractive screen containing information about the exception. This information can be very revealing, and can give a hacker just the right amount of detail they need to compromise your website.
The solution to all of this is to properly configure your web application to display a page that is more user-friendly, pleasing to the eye, and that conveys a custom message to the end user.
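As a sketch of the idea, a minimal web.config configuration for an ASP.NET application might look like the following (the page names Error.aspx and NotFound.aspx are placeholders for your own custom pages):

```xml
<configuration>
  <system.web>
    <!-- RemoteOnly shows the friendly pages to visitors, while a developer
         browsing on the server itself still sees full exception details -->
    <customErrors mode="RemoteOnly" defaultRedirect="~/Error.aspx">
      <!-- Route server errors and missing pages to custom pages -->
      <error statusCode="500" redirect="~/Error.aspx" />
      <error statusCode="404" redirect="~/NotFound.aspx" />
    </customErrors>
  </system.web>
</configuration>
```

With the mode set to RemoteOnly, the raw exception screen is never exposed to outside users, closing off the information leak described above while keeping full details available during development.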
Redesigning a website is a major undertaking. There are branding, design, and functionality considerations, among a handful of others; and on top of all that, you’ve got to launch the site properly from a technical standpoint so that all marketing initiatives continue to operate smoothly. Chief among these is ensuring that the organic health of the website remains stable, and is in a position to grow after launch.
At MoreVisibility, we’ve helped many webmasters properly plan for and launch their newly designed websites, as well as repaired sites that were launched without a proper SEO site transition plan.
Below we’ve compiled a list of the three most critical mistakes we see webmasters make during the design & launch phase, each of which can cause a significant loss of organic traffic after the new site launches.
1. Not Properly Benchmarking Against Current Site
How can webmasters best identify traffic changes and troubleshoot specific issues on the new site if a proper benchmark of activity was never established on the current site? It can be easy to get so caught up in the excitement of design mockups, wireframes, and the anticipation of seeing your new site live that you completely forget to benchmark your current site’s activity. Don’t skip this step; it is vital.
A thorough “snapshot” of the website’s organic health should be taken in the several months leading up to launch. This includes mining data from Google Analytics, Google/Bing Webmaster Tools, and various other keyword trending tools to help provide a comprehensive picture of how the website is performing organically today.
After the new website is launched, this data can be tremendously helpful in evaluating the success of the launch, and also to identify additional opportunities for continued organic growth.
2. Not Recognizing the Importance of URL changes
When a search engine crawls your site, it searches through your pages, content, and URLs, and stores the information it finds in its “memory,” also known as its index. When a user enters a keyword into the search engine bar, the search engine then references its index to determine which webpages to display to the user.
If the URL of one of your pages changes and you don’t directly inform the search engine about it, then it has no way of knowing that anything has changed. It’ll continue to show your old page URL in the search results for a period of time, and when users click on it, they’ll be sent to a non-existent page on your website. Now you’ve got an unsatisfied web searcher, and a search engine that will keep sending users to URLs that don’t exist on your site.
When the search engine re-crawls your site, it won’t find the page it was looking for; the previously high-ranking page can drop completely from the top listings for the keyword, and all of your traffic will go with it.
To fix this problem it’s vital to ensure proper implementation of 301 redirects, which brings us to issue #3…
3. Incorrectly Implementing 301 Redirects
A 301 redirect is essentially a signal to search engines that an existing page on your website has permanently moved to a new location. When it comes to site redesigns, 301 redirects can be your best friend, or your worst nightmare if implemented incorrectly.
Implementing 301 redirects improperly (or not at all) leaves the search engine with no way of knowing where to find the page on your website. The page will be temporarily lost from the search engine’s index, and as already mentioned, when the page isn’t indexed it also drops from the search results. Once this happens, it’s increasingly difficult to get the page to rank as highly as it once did. The traffic that was flowing to the page, along with all of the inbound links you’ve built to it, will be lost until 301 redirects are implemented properly.
On the flip side, proper implementation of 301 redirects ensures that search engines know where to find the existing page on your new website, and that all link juice (the value of links pointing toward the old URL) is now credited to the new page URL instead of the old, non-existent one.
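As an illustration, on an IIS server with the URL Rewrite module installed, a single 301 rule in web.config might look like this (the old and new URLs here are hypothetical examples):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Permanently (301) redirect the old page URL to its new location -->
        <rule name="Redirect old services page" stopProcessing="true">
          <match url="^services\.aspx$" />
          <action type="Redirect" url="/our-services" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

One such rule (or a rewrite map entry) is needed for every URL that changes, which is why mapping old URLs to their new counterparts should be part of the pre-launch plan rather than an afterthought.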
It’s important to have a strategic plan in place prior to the new site launch, and also to monitor the site closely for a period of time after launch to ensure a successful transition has taken place.
Below is a search engine report from a website that overlooked these factors when launching their new website.
Don’t let it happen to you!
One of the fundamentals of constructing a website with SEO in mind is ensuring that all of the site’s pages are accessible to both visitors and search engines via as concise a path as possible. Internal link structure is an important factor in determining a site’s performance in the search engine results pages. The faster a search spider can reach all of your pages, the more likely it is that your most important pages will be crawled and served in search results. Also, by linking explicitly to all of your top-level, category, and service-level pages, you alert the search engines to their relative importance on the site.
If using less SEO-friendly coding elements is something you cannot avoid (because of CMS constraints, or because the website architecture is not conducive to coding changes), web developers can always make certain that the site’s pages are also accessible via an internal HTML sitemap page, an external XML Sitemap, and footer navigation.
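For reference, an XML Sitemap is a simple file listing the URLs you want crawled. A minimal example following the sitemaps.org protocol (with placeholder URLs) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engines to discover -->
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/services/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Once created, the Sitemap can be submitted through Google and Bing Webmaster Tools so the engines have a direct path to every page, regardless of the site’s internal navigation.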
If you want to analyze the internal links on your site, Google Webmaster Tools has a section that lets you view how Google sees these links and how many other pages on your site point to them. From the Webmaster Tools dashboard go to: Your site on the web >> Internal links. This will list the pages on your site (under the “Target Pages” column), along with the count of internal links Google has found pointing to each: