Saturday, July 24, 2010

Windows 7 and Windows Server 2008 R2 Service Pack 1 Beta standalone update ISO


Windows 7 and Windows Server 2008 R2 Service Pack 1 Beta

Download: ISO
Supported languages: English, French, German, Japanese, and Spanish.
The download is a 1.2 GB unified ISO that includes the 32-bit SP1 update for Windows 7 and the 64-bit update for both Windows 7 and Windows Server 2008 R2.
Jerry Blogger / CC BY-ND 3.0

Glydo - Discover and Share Related News, Videos, Twitter, Shopping Advice, Web Pages, Information and Hot RSS Feeds 1.0.3

News, video, tweets & shopping advice on what you're doing now.

Glydo automatically detects what you are interested in right now and suggests related web sites, news, tweets from Twitter, videos, shopping advice & information.

Glydo helps you discover new and exciting content related to what you are browsing right now. As you browse the web, Glydo analyzes the web pages you are viewing and suggests recommendations for high quality, interesting content such as:

* Related News - stories and articles relevant to your current interests
* Web pages & blogs - the web pages, bloggers, and blog posts that are most relevant to what you are reading
* Videos - video clips related to your current interests from YouTube and other sources
* Shopping Advice - shopping online? Glydo will show you other, often cheaper, offers and user reviews
* Reference Information - relevant information from Wikipedia, IMDb, etc.
* Tweets from Twitter - related tweets and relevant Twitter users
* Top Headlines - the top headlines and RSS feeds as they occur (not related to the page you are viewing)
* and much more

Recommendations are made easily available at the bottom of your browser and do not interrupt your browsing. Recommendation titles are displayed unobtrusively in a ticker on your status bar, and you can also browse recommended content in more detail through additional buttons. You can also quickly share recommended news, video, and other content with your friends on Twitter, Facebook, MySpace, and other social networks and bookmarking or sharing services.

Glydo is based on cutting edge semantic and contextual discovery technology that understands what you are viewing and gets you the best content on the web right when you want it and with zero effort.

Download Glydo and start discovering cool content right away!

Meaning of name: Gerard


ORIGIN English
MEANING Brave Spearman
As a name that has been popular since the Middle Ages, Gerard personifies an outgoing, adventure-loving individual. It was made most popular by famed French film actor, Gerard Depardieu.
ALTERNATIVES Jerard  Jerry  Gerry  Gerrard  Jirard 
NICKNAMES Gerry  Gerd  Gere 
FAMOUS Gerards Gerard Butler (actor)  Gerard Hopkins (poet)  Gerard Depardieu (actor) 

Meaning of Name: Jeremy

Jerry is a variation of Jeremy.
ORIGIN Hebrew
MEANING Exalted by God
The English form of Jeremiah. Can be found in some versions of the New Testament. Took a huge jump in popularity in 1970, influenced by the character Jeremy Bolt in the television series, "Here Come the Brides." Peaked at No. 14 in 1976, but still remains a favorite among parents today.
ALTERNATIVES Jeramey  Jeremie  Jeramy  Jeromy  Jeremey  Jeramee 
FAMOUS Jeremys Jeremy Sumpter (actor)  Jeremy Shockey (football player)  Jeremy Jackson (actor)  Jeremy Brett (actor)  Jeremy Irons (English actor)  "Jeremy" (1992 Pearl Jam song) 

Firefox 3.6.8 is Released!


Personas for Firefox | ASUS - Republic of Gamers


All New Microsoft Bing Webmaster Tools


Intro to new Microsoft Bing Webmaster Tools

Microsoft launched a revamp of their Bing Webmaster Tools. I talked to them back in June, when they previewed the tools at SMX Advanced, and they told me that they were starting from scratch and rebuilding the tool from the ground up. So how are things different? They say they are focused on three key areas: crawling, indexing, and traffic. They provide charts that enable you to analyze up to six months of this data. Note that none of this information is available unless Silverlight is installed. More on that later.
Crawling, Indexing, and Traffic Data

Microsoft tells me that they provide, per day, the number of:
  • pages crawled
  • crawl errors
  • pages indexed
  • impressions
  • clicks
Sounds pretty cool. Let's go and see it.

Traffic – Impressions and Clicks
The data is very similar to what Google provides. (Although Google currently only provides the latest month’s data. I’m not sure what happened to the historical data they used to provide.)
Bing Webmaster Tools: Traffic Summary
How does the accuracy stack up? I looked at a few samples.
Traffic Comparison
It’s potentially useful to compare click-through rates for Google and Bing, although Google provides the additional data point of average position. Without that on the Bing side, it’s hard to discern anything meaningful from the comparison. Note that for both Google and Bing, the click numbers reported in webmaster tools in some cases vary significantly from what is reported in Google Analytics (and in other cases are nearly exactly the same). Google has some explanation of why the numbers sometimes vary, but my guess is that Google Analytics counts organic Google traffic from more sources than Google Webmaster Tools does. Google Webmaster Tools also clearly buckets the numbers.
Unfortunately, while Microsoft provides six months of data, it appears that you can only view it on screen and can’t download the data. This makes the data much more difficult to use in actionable ways.
Index Summary
Bing Webmaster Tools: Indexing Chart
This chart shows the number of pages in the Bing index per day. This certainly seems useful, but it’s deceptive. Decreased indexing over time seems like a bad thing, worthy of sounding the alarms and investing resources to figure out the cause, but indexing numbers should always be looked at in conjunction with traffic numbers. Is traffic down? If not, there may not be a problem. In fact, if a site has had duplication and canonicalization problems, a reduction in indexing is often a good thing.
The ability to use XML Sitemaps to categorize your page types and submit canonical lists of those URLs to Google and monitor those indexing numbers over time provides much more actionable information. (Of course, Google doesn’t provide historical indexing numbers, so in order to make this data truly actionable, you have to manually store it each week or month.)
Index Explorer
The Index Explorer enables you to view the specific pages of your site that are indexed and filter reports by directory and other criteria.
Bing Webmaster Tools: Index Explorer
Again it can be useful to drill into this data, but it would be significantly more useful if it were downloadable. When you click on a URL, you see a pop-up with controls to block the cache, block the URL and cache, and recrawl the URL. These are the same actions described below (see “Block URLs” and “Submit URLs”).
Crawl Summary
This chart is similar to what Google provides and shows the number of pages crawled each day.
Bing Webmaster Tools: Crawl Data
Crawl errors are still available, but the “long dynamic URLs” and “Unsupported content type” reports are missing. In their places are additional HTTP error code reports. (The previous version of the tool listed only URLs with 404 errors.) Since Google provides all of these reports as well, the additional value is mostly in knowing if BingBot is having particular problems crawling the site that Googlebot isn’t. As with the query data, you can’t download any of this information, only view it on screen, which makes it much more cumbersome to use.
Bing Webmaster Tools: Crawl Details
Block URLs
The new block URLs feature appears to be similar to Google’s URL removal feature. You can specify URLs that you want removed from Bing search results. However, this feature differs from Google’s in that you don’t also have to block the URL with robots.txt or a robots meta tag, or return a 404 for the page. Microsoft told me that they are offering this feature because site owners may need to have a page removed from the search results right away but might not be able to quickly block or remove the page from the site itself.
I find this a bit dangerous as it makes troubleshooting later very difficult. I can see someone blocking a bunch of URLs or a directory and someone else, months or years later, building new content on those pages and wondering why they never show up in the Bing index. Microsoft did tell me that they recommend this feature as a short term, emergency solution only, as the pages will still be crawled and indexed, they simply won’t display in results. But recommended uses and actual uses tend to vary.
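For reference, the conventional ways to keep a page out of search results, mentioned above, work at the site itself. A minimal sketch of both (the path is hypothetical):

```
# robots.txt — prevents crawling of everything under /private/
User-agent: *
Disallow: /private/
```

Or, to let the page be crawled but keep it out of the index, a robots meta tag in the page's head:

```html
<meta name="robots" content="noindex">
```

Unlike the Bing block URLs feature, both of these leave a visible trace on the site itself, which makes later troubleshooting far easier.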
Submit URLs
This feature enables you to “signal which URLs Bing should add to its index”. When I talked to Microsoft back in June, I asked how this feature was different from submitting an XML Sitemap. (And for that matter, different from the existing Submit URLs feature.) They said that you can submit a much smaller number of URLs via this feature (up to 10 a day and up to 50 a month). So I guess you submit XML Sitemaps for URLs you want indexed and use this feature for URLs you REALLY want indexed?
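For comparison, an XML Sitemap has no such tight limit and is the standard way to submit canonical URL lists. A minimal sitemap file, per the sitemaps.org protocol, looks like this (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-07-24</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Splitting a site's page types into separate sitemap files like this is what makes the per-type indexing numbers discussed earlier trackable.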
Silverlight
Yes, I realize this is a technology, not a feature. And in fact, it may well be an obstacle for some users rather than a benefit. (For instance, I primarily use Chrome on my Mac, which Silverlight doesn’t support.) But Microsoft is touting it as the primary new feature of this reboot. Since most of the data is available only graphically, and not as a download, without Silverlight you basically can’t use Bing Webmaster Tools at all.
Bing Webmaster Tools: Silverlight
What’s Missing
Microsoft says that they “hit the reset button and rebuilt the tools from the ground up.” This means that many of the features from the previous version of the tool are now missing. When I spoke to them, they said that they took a hard look at the tool and jettisoned those items that didn’t provide useful, actionable data. So, what have they removed?
  • Backlinks report – This feature did, in fact, have useful data if you invested a little effort in configuring the reports. You could only download 1,000 external links (and the UI showed only 20), but you could see a count of the total number of incoming links and could use filters to download different buckets of 1,000. For instance, you could filter the report to show only links to certain parts of your web site or from certain types of sites (by domain, TLD, etc.). Of course, I have no way of knowing how accurate this data was. It seems just about impossible to get accurate link data no matter what tool you use. Below is some comparison data I grabbed before this report went away yesterday. Backlink Count Comparison
  • Outbound links report
  • Robots.txt validator – This tool enabled you to test a robots.txt file to see if it blocked and allowed what you expected. Google provides a similar tool.
  • Domain score – I don’t think anyone will be sad that this “feature” has gone away. No one could ever figure out what it (or the related page score) could possibly mean.
  • Language and region information – This was potentially useful information, particularly in troubleshooting.
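The robots.txt validator mentioned above can be roughly approximated locally with Python's standard library, which checks a set of rules against specific URLs the same way a crawler would (the rules and URLs below are made up for illustration):

```python
from urllib import robotparser

# Hypothetical robots.txt rules, assumed for illustration.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Check whether specific URLs would be blocked or allowed.
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/about.html"))           # True
```

This only tests the generic crawler rules, of course; a search engine's own validator also reflects how its specific bot interprets the file.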
Overall, the relaunch provides data that’s potentially more useful than before, although this usefulness is limited without the ability to download the data. I also find the Silverlight requirement frustrating, but it remains to be seen if this is a significant obstacle to use of the tool. There’s nothing here that Google doesn’t provide in its tools, but with Bing soon to be powering Yahoo search, site owners may find getting insight on Bing-specific issues and statistics to be valuable. Historical information is great (although you can get this manually from Google if you download the data regularly), but particularly with query data, it’s hard to know how accurate the reports are (for both Google and Bing). In some cases, the data is misleading without additional data points (such as click-through data without position information, or overall indexing trends without details). I always welcome additional information from the search engines, but as always, make sure that the data you use to drive your business decisions is actionable and is truly telling you what you think it is.

Although Bing's tools have improved a lot, Google's are still the best! I recommend using Google Webmaster Tools.