After Microsoft’s bid for Yahoo! fell apart earlier this year, the proposed deal seemed dead and done — until, that is, billionaire Carl Icahn stepped in and Yahoo finalized a search advertising partnership with Google.
The saga of Microsoft and Yahoo! started in January of this year. That was when the software giant first launched a bid to buy the entirety of the search engine portal.
As we — and the rest of the blogosphere — reported earlier, months of talks ensued, with Yahoo repeatedly turning down offers at various prices. Ultimately, Microsoft offered to buy Yahoo, one of the Internet’s first portals, for $33 a share, a figure that valued the company at $47.5 billion all told. Yahoo flatly refused.
Speculation ran rampant as to why the deal fell apart. One widely reported reason was that Yahoo co-founder and CEO Jerry Yang held out for $37 a share, a price Microsoft CEO Steve Ballmer reportedly considered too high to pay.
The ensuing criticism, from the industry and shareholders alike, was led in no small part by billionaire investor Carl Icahn, who purchased millions of dollars worth of Yahoo shares. In fact, Mr. Icahn tried to oust Yahoo’s current board under the belief that it had made a mistake by not accepting the Microsoft offer.
Coming on the heels of this activity was news that Microsoft is considering a deal with Yahoo that would not involve a full buyout of the company. There are no details as yet on this alternative transaction. Microsoft’s statement says the company “is not proposing to make a new bid to acquire all of Yahoo at this time, but reserves the right to reconsider that alternative.”
After Microsoft’s statement, Yahoo confirmed it was looking at a number of “value maximizing” alternatives with Microsoft, and would assess offers made by the firm.
In a twist to the ongoing saga, Yahoo has since partnered with Google on an alternative search advertising arrangement. So what is the latest in this on-again, off-again scenario? Reports that Microsoft may be willing to sweeten its previous offer for a partial buyout of Yahoo’s search business.
In the meantime, following the end of discussions with Microsoft mere weeks ago, Yahoo’s board said in a statement that a sale leaving the company without an independent search business “would not be in the best interests of Yahoo stockholders.” However, it’s been reported that several of Yahoo’s nine board members, including its chairman, Roy Bostock, have expressed an interest in holding further discussions with Microsoft on a possible deal to sell the search operations.
Should Microsoft increase its bid for just Yahoo’s search assets, what would it mean for the Internet and search, especially in terms of SEO? We may have to wait for the dust to settle from this latest cycle of will-they-won’t-they before we can find out.
With the launch of the new iPhone, announced at Apple’s developer conference on June 9th, both iPhone 2.0 and Apple’s new MobileMe service generated excitement among the attendees. What interests many mobile users is not only the iPhone 3G itself, but the fact that it moves web browsing off AT&T’s slower EDGE network and onto 3G. Simply put, the new iPhone should make web browsing easier.
What does this mean for SEO? Specific to mobile SEO, there could be a shift with the addition of this new player in the mobile device market. One lucrative segment that iPhone 2.0 and MobileMe might contest is working professionals, most of whom currently use the BlackBerry®, made by Research In Motion (RIM).
SEO for mobile devices such as the BlackBerry may be a smaller part of the SEO big picture, especially when compared to “desktop” SEO, but it shouldn’t be ignored. In what might be a stroke of irony, some companies with mobile sites — whether it’s a mobile sub-domain, a mobile portion of their primary domain, or even a .mobi site — have ignored SEO best practices used on their desktop sites when creating their mobile sites.
For example, splash pages and forms pages are common on mobile sites. Traditionally, the home page of a site, mobile or otherwise, has the highest link popularity of any page on the site, and a splash page squanders that link popularity because it rarely targets competitive keywords.
As for forms, if a mobile site uses them only to deliver a personalized mobile experience, it’s better to also provide a text link to a non-personalized version, both for users who prefer to browse the site while not logged in and for mobile search engine spiders to crawl. Yes, there are mobile search engine spiders, and mobile search engines and databases as well. By keeping to SEO best practices regardless of the platform, you help ensure that more of your content is crawled and indexed, giving your mobile site more opportunities to rank for non-branded keywords.
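One quick way to sanity-check that a mobile page exposes crawlable paths, rather than hiding everything behind forms, is simply to count its plain anchor links versus its forms. Here is a minimal sketch using Python’s standard-library HTML parser; the sample markup is hypothetical, standing in for a mobile login page that also offers a non-personalized browse link:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Counts crawlable <a href> links versus <form> elements on a page."""
    def __init__(self):
        super().__init__()
        self.links = 0
        self.forms = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.links += 1   # a spider can follow this
        elif tag == "form":
            self.forms += 1   # a spider generally cannot submit this

# Hypothetical mobile page: one login form, one crawlable alternative link.
page = """<html><body>
<form action="/login"><input name="user"></form>
<a href="/catalog">Browse without signing in</a>
</body></html>"""

audit = LinkAudit()
audit.feed(page)
print(audit.links, audit.forms)  # 1 crawlable link, 1 form
```

A page where the form count dwarfs the link count is a candidate for the text-link treatment described above.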
If you are serious about your search engine optimization and online marketing strategies, you should definitely be contemplating mobile SEO. With cell phone technology now letting subscribers search, browse, and buy online, it is crucial to ensure your site is optimized for mobile devices.
Mobile search technology is no longer in its infancy, but adoption of mobile SEO practices has been slow. By taking the SEO best practices you’ve applied to your desktop website and extending them to your mobile site, you can put your website ahead of your competitors. As popular as the iPhone and the BlackBerry are, many mobile users surf the web on other devices, including ordinary cell phones. Optimizing the pages of your mobile site can help capture their attention as well.
The Internet’s precursor, the U.S. Department of Defense’s ARPANET (Advanced Research Projects Agency Network), first went online in 1969; by 1971 it was only 23 mini-computers large and spanned several universities and institutions across the United States.
In the ten years that followed, the U.S. National Science Foundation created a separate network, CSNET (the Computer Science Network), for institutions without access to ARPANET. CSNET eventually merged with other networks to become the Internet.
Figure 1: ARPANET (then)
Figure 2: Internet (now)
The early version of the Internet was created to provide researchers with a way in which they could more easily communicate, quickly sharing theories, discoveries, and opinions. In the decades that followed the birth of the Internet, the practices of Web 1.0 and the fledgling directory and search engine technologies gave way to improvements under the Web 2.0 banner.
While the arts and technology disciplines burgeoned under Web 1.0 and came to prominence with Web 2.0 advancements, science languished. The technology that was developed by science for science left its parent field behind. But with Web 2.0 and the ever improving ways in which the search engines — and, by extension, search engine optimization (SEO) — are able to perform more relevant and functional information retrieval, science is now cautiously moving into the “Science 2.0” realm.
Much as the Internet helped industries such as musical entertainment, hardware manufacturing, and software development realize its value during the Web 1.0 phase, and enabled those industries to monetize that value with Web 2.0, it is now helping science realize the value lying within its own work. The ability to create a website that is easily crawled and indexed by the search engines, together with SEO best practices that mark content as useful and relevant, has made open-access scientific publishing practical. This level of transparency and connectivity is driving science to challenge itself, forcing traditionally opaque institutions such as scientific journals and peer review to justify their continued role.
Web 2.0 practices, especially the rise of blogging and blogging software, now make it easy for anyone to become a web publisher. Adopting SEO best practices (keyword targeting; cultivating trusted and relevant inbound links; utilizing title tags, description meta tags, and keywords meta tags; and keeping content fresh) allows researchers to self-publish: the scientists who grew into Web 2.0 can post ideas and data online and optimize that information so it can be shared quickly and universally. The widespread adoption of social media and the interactivity of social networking mean that scientists who embrace Web 2.0 can debate and discuss on blogs, in newsgroups and forums, and via mailing lists, or simply publish research documents under open-access terms on their own websites. This interactivity and transparency have led to a shift in the journal publication and peer review process: scientific research can now be web-published, indexed by search engines, and instantly referenced and downloaded by web searchers, and the rarefied practice of peer review has given way to the immediacy of peer interest.
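As a rough illustration of the on-page elements mentioned above, a page’s title and description meta tag can be pulled out programmatically for a quick audit. This is a sketch using Python’s standard library; the sample markup is made up for the example:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Extracts the <title> text and the description meta tag from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data  # accumulate text between <title> tags

# Hypothetical self-published research page.
page = ("<html><head><title>Mobile SEO Tips</title>"
        "<meta name='description' content='Optimizing sites for mobile devices.'>"
        "</head><body></body></html>")

audit = HeadAudit()
audit.feed(page)
print(audit.title)        # Mobile SEO Tips
print(audit.description)  # Optimizing sites for mobile devices.
```

A missing title or an empty `description` surfaced by a check like this is exactly the kind of gap that keeps otherwise useful content from ranking.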
These shifts are reflected across disciplines all over the Internet. Peer interest, or user interest, combined with the ability to easily search for and find information using search engines and directories, has produced virtual forests of information, navigable only if the search engines can determine which information matters most. SEO practices such as writing compelling, keyword-rich content and using metadata help the search engines serve up the best possible results for a web search; optimizing bookmarks, tags, multimedia, and other social media elements refines those results further. Sites such as Amazon.com, Digg.com, and Wikipedia.org let critics, authors, readers, anyone, generate content and comment upon it. This sort of “sociability” has now finally seeped into science.
While the age-old customs of scientific journals and peer review won’t be so readily replaced, science embracing Web 2.0, and the now-adolescent child of its creation, the Internet, can only lead to faster advancements and swifter changes. After all, it was the need for improved communication that helped bring about this modern age of Web 2.0, soon to be Web 3.0. As science grows more accustomed to the way its child has matured, future improvements will be considerable.

There are still more than a billion pages of web content and documentation that have yet to be crawled and indexed by the search engines, information sitting deep within the Internet as part of the deep web. As more scientists embrace the Internet and its technologies, these deep pages, rich with content and relevance, will come to the surface as science pushes the search engines to dive deeper and search better, truly serving up the most relevant and useful results. And as the Internet and the search engines develop and improve thanks to science, so will SEO. Search engine optimization continues to grow and adjust to the ever-shifting Internet, learning along with the search engine algorithms and their robots and spiders. Now, decades after creating the Internet, science is back to push things along, and SEO will be there, ready to adapt to the changes.