Article Archive by Marjory Meechan

April 21 2009

Social Network Theory and Optimal Social Media Marketing Campaigns

by Marjory Meechan

With the rise of social media channels like Facebook, Twitter and countless others, everyone is working hard on social media marketing campaigns to maximize their followers, fans and members in these networks and spread their message as widely as possible. This is really nothing new. Marketing has always been about spreading the word and networking should always have been part of anyone’s business plan.

To maximize the opportunities afforded by this new way to network, it is a good idea to have an understanding of how social networks work. Luckily, sociologists, anthropologists and other social scientists, including those in the field of marketing, have been busy studying social networks for years, so we actually know quite a bit about how people organize themselves and how social behavior spreads. There is a great article on this in Wikipedia that describes it in detail.

One thing that I’ve always found fascinating about the way social networks work is the differing value of social network relationships. In dense social networks, people have many close connections to each other and interact regularly. Because of this, participants in dense social networks tend to strongly influence each other. As a result, they are also usually very homogeneous in their attitudes and behaviors, so much so that it can be difficult to get the group to change. However, when you do, they all change, which can be very valuable if that change involves the adoption of your product or service. For example, when I was in high school, we all had to have Lee jeans with the little leather brand label intact and alpaca sweaters. I have no idea why – everybody just did. Members with many connections in a group are said to have a lot of social capital, in that they have great social influence within the group. However, at some point, somebody had to start the trend, and that’s where understanding social networks is important. In particular, understanding which members of a network spread new ideas and behaviors is critical for a good social marketing campaign.

It might seem that the person with the most connections, and therefore the most social capital in the group, would be the most influential in spreading change. The emphasis in social media marketing on getting lots of friends and followers would seem to follow that theory. However, it turns out that the sheer number of connections does not necessarily mark the most valuable members of the network for spreading a message. In fact, it is the people with the most direct connections between groups who have the most influence on spreading change. These people have “bridging capital” in that they serve as bridges between groups.
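The distinction between having many connections and having bridging connections can be sketched with a toy network. The names, groups and edges below are invented purely for illustration; this is not data from any real network.

```python
# Two dense, fully-interconnected groups (A and B), plus one person ("Ivy")
# whose only two ties reach into both groups.
group = {
    "Ana": "A", "Bob": "A", "Cal": "A", "Dee": "A",
    "Eve": "B", "Fay": "B", "Gus": "B", "Hal": "B",
    "Ivy": None,  # the bridge: belongs to neither dense group
}
edges = [
    # group A: everyone knows everyone
    ("Ana", "Bob"), ("Ana", "Cal"), ("Ana", "Dee"),
    ("Bob", "Cal"), ("Bob", "Dee"), ("Cal", "Dee"),
    # group B: everyone knows everyone
    ("Eve", "Fay"), ("Eve", "Gus"), ("Eve", "Hal"),
    ("Fay", "Gus"), ("Fay", "Hal"), ("Gus", "Hal"),
    # Ivy's two ties are the only links between the groups
    ("Ivy", "Ana"), ("Ivy", "Eve"),
]

def degree(node):
    """Raw connection count (a rough proxy for social capital)."""
    return sum(node in e for e in edges)

def bridging_ties(node):
    """Count a node's edges that cross group boundaries."""
    return sum(node in (a, b) and group[a] != group[b] for a, b in edges)

# Ana has twice as many connections as Ivy, yet a message can only travel
# from group A to group B through Ivy.
print(degree("Ana"), bridging_ties("Ana"))  # 4 1
print(degree("Ivy"), bridging_ties("Ivy"))  # 2 2
```

Ana looks more influential by connection count alone, but Ivy is the only route between the two groups, which is exactly the “bridging capital” the research points to.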

What this means for using social networks to spread your message is that the best people to have in your network are those who have many direct connections to a number of dense social networks, not just lots of connections within a single dense social network. These are the people who will be the innovators and will have the most value for spreading the message.

April 2 2009

How Important are the Results of the Keyword Density Tool for Search Engine Optimization?

by Marjory Meechan

The keyword density tool is a standard part of any search engine optimization tool arsenal but how important are the results of the tool for determining how well a page is optimized? What role does keyword density play in the search engine algorithms?

First, as far as we know, keyword density has never been the formula used by search engines to calculate keyword relevance for a page. Unfortunately, we don’t know what the true formula is (it’s a secret and Google won’t tell).

Most believe that it is actually based on term weight, and that would probably be a better way to evaluate keyword use on a web page. However, one of the main components for calculating term weights is how often the keyword appears across the entire internet. We have no way of knowing that number, because even if we could crawl the whole internet and find all the pages and all the words, we still wouldn’t know whether our crawl matched Google’s or Yahoo’s or any other search engine’s. This is why we can’t create a tool that uses term weight to evaluate keyword use on a page more sharply: we don’t have access to all the data. (It may also explain part of the reason why you see different results across search engines: they have different indexes.)
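To make the point concrete, here is a sketch of TF-IDF, one common term-weighting scheme. The search engines’ real formulas are secret, as noted above, so this is purely illustrative; the tiny “corpus” stands in for the web index we don’t have access to, and the second factor (the IDF) is exactly the part that requires knowing the whole corpus.

```python
import math

def tf_idf(term, doc_words, corpus):
    """Illustrative term weight: frequency in the document, scaled down
    by how common the term is across the whole corpus."""
    tf = doc_words.count(term) / len(doc_words)
    docs_with_term = sum(term in d for d in corpus)
    # This denominator is what we can't compute for the live web:
    # it needs every document in the engine's index.
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

# Invented three-document "corpus" standing in for a search engine's index.
corpus = [
    ["seo", "keyword", "density", "tool"],
    ["link", "building", "guide"],
    ["travel", "blog", "photos"],
]
page = ["keyword", "density", "keyword", "analysis"]
print(round(tf_idf("keyword", page, corpus), 3))  # 0.203
```

Change the corpus and the same page gets a different weight, which is one way to see why two engines with different indexes can rank the same page differently.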

However, there are some things we do know:

The number of times that the key phrase (or parts of it) appears on the page carries some weight. We know this because we have seen rankings change after keywords are added to pages. We have also observed sites incur “penalties” for stuffing keywords into meta tags, alt attributes or fine print at the bottom of the page, a practice commonly referred to as “keyword stuffing”. Too many unnaturally placed keywords may trigger a spam filter and cause rankings to drop. This appears to be algorithmic, so we have to assume that counting the keywords on the page has some place in the algorithm, however small.

The total amount of page content carries some weight. Pages with more content tend to rank better (all other things being equal).

Overall keyword theme relevance carries some weight — search engines evaluate the content based on factors like proximity of the keywords to each other, ordering on the page, position in the overall body content, positioning in important places on the page, as well as the thematic relationship between the key phrase and other words on the page. In some cases, synonyms of the key phrase can even count toward keyword relevance. So, the algorithm is far more complex than a simple keyword density calculation, but keywords undoubtedly do feature in the calculation.

Finally, we know that if search engines cannot properly interpret the code on the page, they cannot “read” the content properly. This could result in parts of the code being included in the text index for the page, possibly lowering its overall keyword relevance. Search engines might even have trouble seeing the keywords on the page at all, which could affect any keyword weighting for relevance. While it is true that they likely attempt to filter out this kind of noise in their indexes, we do not recommend leaving it to the robots: it’s best to control what they see whenever possible.

So, we use keyword density analysis tools as a quick way to look at the content on websites the way that a spider might see it. If the keyword density is too high, it can be a signal that the page may trigger a spam filter. If the keyword density analyzer doesn’t find the keyword at all or the density is very low, it can indicate that the page is not appropriately targeted. Because the keyword density tool is basically a simple spider, if it cannot properly “read” the content, it could indicate problems with the coding on the site.
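A minimal version of such a density check can be sketched in a few lines. This is only an illustration of the calculation, not any particular vendor’s tool, and the sample text and phrase are invented; real tools also fetch and parse the HTML the way a spider would.

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words accounted for by the key phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    # Each hit covers n words, so weight occurrences by phrase length.
    return 100.0 * hits * n / len(words) if words else 0.0

text = ("Our keyword density tool measures keyword density on a page. "
        "Use the keyword density tool as a reality check, not a target.")
print(round(keyword_density(text, "keyword density"), 1))  # 27.3
```

A figure like the 27.3% above is exactly the kind of reading that suggests a page might trip a spam filter, while a reading of zero suggests the page isn’t targeting the phrase at all.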

That said, if the content on the page is properly focused on the keyword theme and the keyword appears in all of the important places (the title tag, the description tag, the headline, and a couple of times in the main text content, ideally in the first and last sentences), the text is probably fine even if you don’t reach the “optimal” keyword density. Furthermore, if your key phrase is very long (more than three words), using it 4% of the time on the page would be very unwieldy, so you may not even want to use the keyword that many times. Content should always be written for people first, with the keyword density tool used only as a kind of reality check to make sure that it’s also going to be okay for spiders. A truly “optimized” page will have content that is useful for visitors and will attract quality inbound links.

March 20 2009

Local Search Map Spam: Real danger in low quality results

by Marjory Meechan

Quality in local search directory results is extremely important because the consequences of bad local search results can be very serious for the user. This is because contact with dubious businesses or individuals found in local search brings those individuals to the user’s door — not just their computer. Consumer alerts from the Federal Trade Commission and the Better Business Bureau describe some of the issues surrounding inaccurate information in locksmith listings that illustrate the concerns that relate to any listing that involves a serviceman being called to the door. Most reports of abuses involve over-charging or other types of consumer scams, but there could easily be more serious consequences. Recently, Google announced that it has taken steps to close loopholes that allowed unscrupulous persons to hijack the listings of real businesses, but Google watchers remain skeptical about how well this will solve the problem.

Local search directories including Google Maps, Yahoo Local and Live Maps are rapidly overtaking more traditional ways of finding local business information. Unlike the old Yellow Pages, where all information was specifically submitted and carefully verified, information in online local directories is usually listed for free and gathered from a combination of user submissions and information from other local online directories. Business owners can “claim” their listings and “correct” inaccuracies, but the monitoring of this has not been very vigilant, and there have been widespread reports of listing hijackings and of multiple accounts being registered. In fact, when Google’s Matt Cutts called for ideas on what areas of webspam his team should tackle in 2009, map spam was the second-most popular choice (no-result search results was first).

Google Maps has been the center of much of the concern, although there are issues throughout the local directory listing sector as well. Originally, Google Maps was more careful about its listings: in order to create or claim a listing, a business owner had to obtain a PIN by mail to access it. However, this required Google to actually mail cards to business owners, and Google eventually switched to telephone verification instead, which has turned out to be far less accurate. Google supplemented the less strict telephone verification with bans and/or penalties for businesses that attempted to game the results. Naturally, this only deters the more scrupulous businesses, and there have been calls for Google to return to stricter verification methods for their results.

In the meantime, business owners need to be very vigilant about their listings. Local search listings should be monitored regularly (at least once a month). Ensure that all information is still valid and that any duplicate listings are removed. Information is available to help you edit or remove listings in Google, Yahoo and Live Maps.

Since these search engines also cull information from other popular local search directories, it is necessary to make sure that these are up-to-date and accurate. Probably the easiest way to check is to search for your business and review any listings that appear for accuracy.

Setting alerts with Google and Yahoo is a good way to monitor mentions of your site automatically.

Finally, it may also be a good idea to make your customers aware of any scams they may encounter by alerting them to the existence of the issue. Customer awareness has been one of the best weapons against phishing scams, in which users are diverted to phony banking or other financial websites via suspicious e-mails. Local search spam is similar in that hijacked local listings may be tricking your customers in much the same way. Combat map spam with awareness. If you are a member of a national organization, include this information on your site, and if there are any alerts or important information about these kinds of issues in your industry or local area, link to them so your customers can be aware.

© 2021 MoreVisibility. All rights reserved.