In my last few posts I have covered the different types of metadata and how to use each effectively. In this post you will learn about the “alt attribute,” more commonly (if imprecisely) called the alt tag. The alt tag is the HTML code we use to label the images found on websites.
If you view a site’s source code and look for an image, you will most likely see the image’s file name (a .gif or .jpg, for example) alongside the alt attribute. This is where you label your image. Example:
<img alt="Big Brown Dog" src="dog.gif" />
You might be asking yourself “What is the importance of the alt tag?” Good question! The alt tag is important for three distinct reasons:
1. Some searchers turn images off so that pages load faster on a slow connection. When this is the case, the image will not appear; instead, its alt text is displayed.
2. It’s used by blind and visually impaired readers who access a page with audio-based browsers, or screen readers. These devices read the page aloud so the user can hear its content. If there is no alt tag, the images are skipped and important information can be lost.
3. Universal Search. This has been around for about a year and is still evolving. Universal search uses alt tags, among other information, to display your website’s images in the “blended results,” which mix images, video, news, and regular listings in the search results. The search engines’ algorithms take alt tags into account when assembling these mixed results.
Alt Tag Tips
While it is important to use the alt tag on all of the images on your site, you should not overuse this tool: abusing the tag can have serious consequences in the search results. Below are some dos and don’ts for creating effective alt tags.
– Use on every image on your site
– Describe what the image represents
– Use keywords where applicable
– Make sure each image has a unique alt tag
– Don’t stuff all of your keywords into the tag to “game” the search engines; this is called alt spam
– Don’t add alt tags for things like buttons and images smaller than 10 x 10 pixels. These items are not necessarily important to search, so you don’t have to label them.
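To make the dos and don’ts concrete, here is a short HTML sketch (the file names and alt text are hypothetical examples, not taken from a real site):

```html
<!-- Good: a unique, descriptive alt attribute that uses a keyword naturally -->
<img src="chocolate-lab-puppy.jpg" alt="Chocolate Labrador puppy playing fetch" />

<!-- Bad: alt spam; the tag is stuffed with repeated keywords -->
<img src="dog.jpg" alt="dog dogs puppy puppies cheap dogs buy dogs dog breeder" />

<!-- A tiny decorative image needs no descriptive label; an empty alt keeps the markup valid -->
<img src="spacer.gif" width="1" height="1" alt="" />
```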
As always, alt tags are not the “SEO Golden Ticket” but rather a piece of the bigger picture. Work on Metadata and alt tags, and you will be one step closer to having an optimized website.
In this last installment of our series explaining Metadata for a website, we will cover the Keyword Tag. In the past, savvy webmasters and marketers would stuff this tag with keywords, including words that had nothing to do with their site. When their databases were small, search engines used the Keyword Tag as a way to index pages, and some people abused this to rise to the top undeservedly. As the databases grew and the search engines became more sophisticated, they put a stop to spammers stuffing the Keyword Tag. As a result, less weight is placed on the Keyword Tag today.
By now I may have convinced you to scrap the keyword Metatag altogether, but that is not my intention. To fully optimize a site, you must use all legitimate SEO practices, such as content optimization, link building, and Metatag optimization, including the Keyword Tag. Search engines use many factors in determining rankings, and as mentioned before, while they place reduced weight on the Keyword Tag, it is still part of the SEO big picture. Below are some tips for writing effective Keyword Tags that will not be considered spamming:
– The Keyword Tag should contain only phrases that appear on the page
– Phrases should be unique to the page
– List the primary keyword phrase first
– Target one primary key phrase and two to three supporting keyword phrases per tag
– Separate phrases with commas and use no more than 12 phrases per page
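Applied to a hypothetical page about chocolate Labrador puppies, the tips above might yield a tag like this (the phrases and site name are illustrative only):

```html
<head>
  <title>Chocolate Labrador Puppies | Example Breeder</title>
  <!-- Primary phrase first, then two supporting phrases, comma-separated -->
  <meta name="keywords" content="chocolate labrador puppies, labrador puppy care, labrador breeders" />
</head>
```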
Things not to do when writing Keyword Tags include:
– Keyword Stuffing: repeating keywords over and over in the tag
– Placing unrelated keywords in the tag: adding possibly high-volume search terms that have nothing to do with the content of your site in hopes of tricking the search engines
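For contrast, a tag that commits both of these mistakes might look like this (again, a made-up example):

```html
<!-- Keyword stuffing plus unrelated high-volume terms; likely to be ignored or penalized -->
<meta name="keywords" content="puppies, puppies, puppies, puppies, free music downloads, cheap flights" />
```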
By following these simple steps, you will be well on your way to writing Keyword Tags that will help with the SEO big picture of your site.
The Internet began as the U.S. Department of Defense’s ARPANET (Advanced Research Projects Agency Network); by 1971 the network was only 23 host computers large and spanned several universities and institutions across the United States.
In the ten years that followed, the U.S. National Science Foundation created a separate network called CSNET (the Computer Science Network), developed for institutions without access to ARPANET. CSNET eventually merged with other networks to become the Internet.
Figure 1: ARPANET (then)
Figure 2: Internet (now)
The early version of the Internet was created to provide researchers with a way in which they could more easily communicate, quickly sharing theories, discoveries, and opinions. In the decades that followed the birth of the Internet, the practices of Web 1.0 and the fledgling directory and search engine technologies gave way to improvements under the Web 2.0 banner.
While the arts and technology disciplines burgeoned under Web 1.0 and came to prominence with Web 2.0 advancements, science languished. The technology that was developed by science for science left its parent field behind. But with Web 2.0 and the ever improving ways in which the search engines — and, by extension, search engine optimization (SEO) — are able to perform more relevant and functional information retrieval, science is now cautiously moving into the “Science 2.0” realm.
Just as industries such as music, hardware manufacturing, and software development came to recognize the Internet’s value during the Web 1.0 phase, and to monetize that value with Web 2.0, the Internet is now helping science realize the value lying within its own work. The ability to create a website that is easily crawled and indexed by the search engines, and to create content that is identified as useful and relevant through SEO best practices, has helped make open-access scientific publishing viable. This level of transparency and connectivity is driving science to challenge itself, forcing traditionally opaque institutions such as scientific journals and peer review to justify their role.
Web 2.0 practices, especially the rise of blogging and blogging software, now make it easy for anyone to become a web publisher. Adopting SEO best practices, such as keyword targeting, cultivating trusted and relevant inbound links, using title tags, description meta tags, and keyword meta tags, and keeping content fresh, allows researchers to self-publish: the scientists who grew into Web 2.0 can post ideas and data online and optimize that information so it can be shared quickly and universally. The widespread adoption of social media and the interactivity of social networking mean that scientists who embrace Web 2.0 practices can debate and discuss on blogs, in newsgroups and forums, and via mailing lists, or simply publish research documents open-access on their own websites. This interactivity and transparency have led to a shift in the journal publication and peer review process: scientific research can now be web-published, indexed by search engines, and instantly referenced and downloaded by web searchers, and the rarefied practice of peer review has given way to the immediacy of peer interest.
These shifts are reflected across disciplines all over the Internet. Peer interest, or user interest, and the ability to easily find information using search engines and directories have led to virtual forests of information, which can only be navigated by helping the search engines determine which information is most important. SEO practices such as writing compelling, keyword-rich content and using metadata help the search engines serve up the best possible results for a web search, and optimizing bookmarks, tags, multimedia, and other social media elements refines those results further. Sites such as Amazon.com, Digg.com, and Wikipedia.org have enabled critics, authors, readers, anyone, to generate content and to comment upon it. This sort of “sociability” has now finally seeped into science.
While the age-old customs of scientific journals and peer review won’t be so readily replaced, science embracing Web 2.0, and the now-adolescent child of its creation, the Internet, can only lead to faster advancements and swifter changes. After all, it was the need for improved communication that helped bring about this modern age of Web 2.0, soon to be Web 3.0. As science grows more accustomed to the way its child has matured, future improvements will be considerable. More than a billion pages of web content and documentation have yet to be crawled and indexed by the search engines, information sitting in the deep web. As more and more scientists embrace the Internet and its various technologies, these deep pages, rich with content and relevance, will come to the surface as science pushes the search engines to dive deeper and search better, truly serving up the most relevant and the most useful results.

And as the Internet and the search engines develop and improve thanks to science, so will SEO. Search engine optimization continues to grow and adjust to the ever-shifting Internet, learning along with the search engine algorithms and their robots and spiders. Now, decades after creating the Internet, science is back to push things along, and SEO will be there, ready to adapt to the changes.