Monday, December 29, 2008

Well, it says First Click Free but turns out to be...

Well, it says First Click Free but turns out to be every click free.

What webmasters will do is just disallow Googlebot from accessing their premium content.

Then how will users be able to find premium content on Google?

Saturday, December 27, 2008

Japanese WMC Blog launched

Konnichiwa! Hajimemashite! *Hello, Nice to meet you!

We just launched a new Webmaster Central Blog in Japanese. For those of you who feel more comfortable reading Japanese, are interested in webmaster-related information from Google, or even want to learn about issues specific to our region and language, we hope you enjoy our Japanese version of the Webmaster Central Blog :D



Thursday, December 25, 2008

Feliz Navidad from the Spanish Webmaster Central team!

About three and a half months ago we kicked off the Spanish Webmaster Central blog hoping to reach many webmasters. Given the time of the year, we would like to say a big ¡Muchas Gracias! to all our readers.

It's been both a pleasure and a great opportunity for us to share our knowledge and hear your feedback. A few of this year's highlights:

On the blog, we shared a number of highlights; at conferences, we had the chance to talk to some of you in person; and last, but not least, there was the Spanish Help Group, with lots of interesting stories.

This is us, several members of the Spanish Webmaster Central team:


From left to right: Cristina, Alvar, Rebecca, and Esperanza in Google's Dublin office, with a holiday touch :)



Wednesday, December 24, 2008

Wishing you and your site a happy holiday!

Your presence is our favorite present -- thanks for joining us for another year of making your site, and therefore the web, a better place.


Every day we see new people commenting and joining the discussion. This holiday season we'll try to update our blog to accommodate your growing needs. Always feel free to let us know how we're doing (especially if we publish a typo! :), because first and foremost and everywhere in the middle, we're trying to improve for you.

Happy holidays from all of us at Webmaster Central.



Monday, December 22, 2008

Quick and easy tips for the holiday rush

Season's greetings, webmasters! We've compiled a list of quick and simple tips for websites preparing for the holiday rush. For online and offline retailers, we understand that your website is a big part of your business, especially this time of year. Whether it's to make the sale online or to increase foot traffic to your brick-and-mortar location, your web presence is a critical part of your business plan. The tips below are fast, free, and can make a big difference.

Verify that your site is indexed by Google (and is returned in search results)
Check your snippet content and page titles with the site: command [site:example.com] -- do they look accurate and descriptive for users? Ideally, each title and snippet should be unique in order to reflect that each URL contains unique content. If anything is missing or you want more details, you can also use the Content Analysis tool in Webmaster Tools. There you can see which URLs on your site show duplicate titles or meta descriptions.


Label your images accurately
Don't miss out on potential customers! Because good 'alt' text and descriptive filenames help us better understand images, make sure you change non-descriptive file names [001.jpg] to something more accurate [NintendoWii.jpg]. Image Search is one of our largest search properties, so you should take advantage of it.
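As a quick sketch (the filename and alt text here are illustrative, not from a real product page), descriptive markup for a product image might look like this:

```html
<!-- Descriptive filename plus concise, accurate alt text -->
<img src="/images/nintendo-wii-console.jpg"
     alt="Nintendo Wii console with one controller" />
```

Keep the alt text short and specific to what the image actually shows; stuffing it with keywords does more harm than good.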

Know what Google knows (about your site)
Check for crawl errors and learn the top queries that bring traffic to your site through Webmaster Tools. See our diagnostics checklist.

Have a plan for expiring and temporary pages
Make sure to serve accurate HTTP status codes. If you no longer sell a product, serve a 404. If you have changed a product page to a new URL, serve a 301 to redirect the old page to the new one. Keeping your site up-to-date can help bring more targeted traffic your way.
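On an Apache server, for instance, both cases can be handled with a couple of lines of configuration (the paths and domain below are hypothetical placeholders):

```apacheconf
# Product page moved to a new URL: serve a 301 permanent redirect
Redirect 301 /products/old-widget.html http://www.example.com/products/new-widget.html

# Discontinued products: make sure the server returns a real 404 status,
# ideally with a helpful custom page (not a 200 "soft 404")
ErrorDocument 404 /not-found.html
```

Whichever server you use, it's worth verifying the actual status code with an HTTP header checker rather than assuming the page "looks right" in a browser.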

Increase foot traffic too
If your website directs customers to a brick-and-mortar location, make sure you claim and double check your business listing in Google Local.


Usability 101
Test the usability of your checkout process with various browsers. Ask yourself if a user can get from product page to checkout without assistance. Is your checkout button easy to find?

Tell us where to find all of your web pages
If you upload new products faster than Google crawls your site, make sure to submit a Sitemap and include 'last modification' and 'change frequency' information. A Sitemap can point Googlebot to your new or hard-to-find content.
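A minimal Sitemap entry carrying both pieces of information looks like this (the URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/new-widget.html</loc>
    <lastmod>2008-12-22</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

See sitemaps.org for the full list of valid `changefreq` values and date formats.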

Manage your sitelinks
Your site may be triggering Sitelinks in the search results, so check the links and make sure the destination pages are fully functional. Remember: in Webmaster Tools you can remove any sitelinks that you don't think users will find useful.


Don't forget to check out these additional resources:


Thursday, December 18, 2008

Sitemap Submission Made Simple

Submitting a Sitemap to Google just became even easier. No longer do you have to specify the Sitemap file type—we'll determine the type of data you're submitting automatically. Let's take a quick look at the kinds of Sitemap files we support as well as the ways they can be submitted to us.

A sample Webmaster Tools account with Sitemaps

Sitemap file formats supported by Google

Part of what makes the web so interesting is that there are so many different kinds of content out there. Do you use videos on your website? If so, send us a Video Sitemap file so that we can send you visitors to those videos! Do you host source-code samples? Submit a Code Search Sitemap! Here are the various kinds of Sitemap files that Google supports at the moment:

If you have multiple Sitemap files that you wish to submit to Google, you can include up to 1,000 of these in an XML Sitemap Index file. If you have more than 1,000 Sitemap files, you can just submit multiple Sitemap Index files - we'd love to take them all!
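A Sitemap Index file is itself a small XML file that simply lists your Sitemap files, for example (file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2008-12-18</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-videos.xml</loc>
  </sitemap>
</sitemapindex>
```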

Submitting your Sitemap files to Google

Once you have your Sitemap files ready and available on your server, all that's left is making sure that the search engines can find them. Google supports three simple ways to submit Sitemap files:
  • Using Google Webmaster Tools
    Submitting your Sitemap files through Google Webmaster Tools is the preferred way of letting us know about them. The main advantage of doing it this way is that you'll always have direct feedback about how your Sitemap files were downloaded (were we able to reach your server?), how they were recognized (were they in the right format?) and what happened to the web pages listed in them (how many were indexed?). To submit your Sitemap files, make sure that your website is verified in Webmaster Tools, then go to "Sitemaps" in Webmaster Tools and enter the file name of your Sitemap(s).

    Sometimes it makes sense to keep your Sitemap file on a different server / domain name. To submit Sitemap files like that, you must verify ownership of both sites in Webmaster Tools and submit the Sitemap on the appropriate site. For instance, if your Sitemap file for http://www.example.com is kept on http://sitemap-files.example.com/ then you need to verify ownership of both sites and then submit the Sitemap file under http://sitemap-files.example.com (even though the URLs listed in it are for http://www.example.com). For more information, please see our Help Center topic on submitting Sitemap files for multiple sites.

  • Listing Sitemap files in the robots.txt file
    Another way of submitting a Sitemap file is to specify the URL in your robots.txt file. If you use this method of submitting a Sitemap file, it will be found by all search engines that support the Sitemaps protocol (although not all of them support the extensions listed above). Since you can specify the full URL of your Sitemap file in the robots.txt file, this method also allows you to store your Sitemap file on a different domain. Keep in mind that while Sitemap files submitted this way are processed on our side, they will not be automatically listed in your Webmaster Tools account. In order to receive feedback on your files, we recommend adding them manually to your account as well.

  • Using an HTTP "ping"
    If your Sitemap files are generated automatically, a convenient way to submit (and re-submit) them is to access the "ping" URL for Google Sitemaps. This URL includes the URL of your Sitemap file. For more information on the "ping" URL for your website, please see the Help Center article on Updating a Sitemap. Feel free to "ping" this URL whenever you update your Sitemap file - we'll know to pick it up and process it again. If you also have your Sitemap file registered in Webmaster Tools, we'll update the status there as well. This method is also valid if your Sitemap file is kept on a different server, but you must still verify both sites in Webmaster Tools as previously mentioned.

    Search engines that are members of sitemaps.org support a similar way of submitting general web Sitemap files.
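To sketch the robots.txt method (the host and file name below are placeholders), the Sitemap reference is just one extra line in your existing robots.txt:

```
# robots.txt for http://www.example.com/
User-agent: *
Disallow:

Sitemap: http://sitemap-files.example.com/sitemap-for-www.xml
```

For the ping method, the request takes the form http://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml (your Sitemap URL, URL-encoded); see the Help Center article on updating a Sitemap for the exact form for your site.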

We hope these simplifications make it even easier for you to send us your Sitemap files!



Wednesday, December 17, 2008

Webmaster Tools in 40 languages!

(Инструменти за уеб администратори, Eines per a administradors web de Google, Webmaster Tools, Googlen Verkkovastaavan työkalut, Εργαλεία για Webmasters, Alat WebMaster, Tīmekļa pārziņa rīki, Žiniatinkli valdytojo įrankiai, Ferramentas para o webmaster do Google, Алатке за вебмастере, Nástroje správcu webu, Orodja za spletne skrbnike, Інструменти для веб-майстра, Công cụ Quản trị Trang Web)

In our recent Webmaster Tools launch, we went live in 14 new languages, bringing our total language support count to 40! With the launch of Bulgarian, Catalan, Croatian, Filipino, Greek, Indonesian, Lithuanian, Latvian, Portuguese (Portugal), Slovak, Slovenian, Serbian, Ukrainian and Vietnamese, Webmaster Tools joins Google products such as Google.com, AdWords, Gmail and Toolbar in meeting the 40 Language Initiative (Google's company-wide initiative to make sure Google products are available in the 40 languages read by more than 98% of Internet users).

Our team is very excited to reach so many of you by offering our tools in 40 languages. At the same time, both the Google Localization and Webmaster Tools teams know that there's more room for improvement in the features and quality of our service. We hope to hear your input in the comments below, especially on the linguistic quality of our new languages.



Sunday, December 14, 2008

Has anyone implemented this and seen results yet? ...

Has anyone implemented this and seen results yet? I am concerned about links being built to pages people have been able to navigate to, but which, once clicked by someone else, would force regionalization. I am also concerned about how other engines may view this.

Thursday, December 11, 2008

Message Center info through our API

Recently we mentioned some updates in the Webmaster Tools GData API: we've just launched a whole new API, the Message Center GData API, as part of the Webmaster Tools API. The Message Center is the way that Google communicates important issues to webmasters regarding their sites—for example, if there's a problem crawling your site, or if someone has requested a change in crawl rate. Until now it was only possible to access these messages through the Message Center section of Webmaster Tools; now you can also use GData to access them as a feed. This way you don't need to continually check your messages in Webmaster Tools—you can retrieve the messages feed automatically and be informed as soon as possible of any critical issues regarding your site.

What can I do?

The Message Center GData API lets you retrieve all messages, mark the messages as read or unread, and delete messages. You can do these tasks using the provided Java client libraries, or you can create your own client code based on the protocol information.

  • Retrieve messages: The messages feed contains all the messages sent to your account. These messages have important information about your verified sites. Examples of messages include infinite spaces warnings and crawl rate change notifications.
  • Mark messages as read or unread: In order to keep track of new communications from Google, you can mark your messages as read or unread, the same way that you would manage your inbox. If you retrieve a single message, this message will be automatically marked as read.
  • Delete messages: It's possible to delete messages using the GData API. However, be careful: if you delete a message through the API, it will also be deleted in your Webmaster Tools account, as both interfaces share the same data.

How do I do it?

You can download code samples in Java for all these new features. These samples provide simple ways to use the messages feed. The following snippet shows how to retrieve the messages feed in a supported language and print all the messages:

  // Connect with the service and authenticate
  WebmasterToolsService service =
      new WebmasterToolsService("exampleCo-exampleApp-1");
  try {
    service.setUserCredentials(USERNAME, PASSWORD);
  } catch (AuthenticationException e) {
    System.out.println("Username or password invalid");
    return;
  }

  // Retrieve messages feed
  MessagesFeed messages;
  try {
    URL feedUrl;
    if (USER_LANGUAGE == null) {
      feedUrl = new URL(MESSAGES_FEED_URI);
    } else {
      feedUrl = new URL(MESSAGES_FEED_URI + "?hl=" + USER_LANGUAGE);
    }
    messages = service.getFeed(feedUrl, MessagesFeed.class);
  } catch (IOException e) {
    System.out.println("There was a network error.");
    return;
  } catch (ServiceException e) {
    System.out.println("The service is not available.");
    return;
  }

  // Print the messages feed
  System.out.println(messages.getTitle().getPlainText());
  for (MessageEntry entry : messages.getEntries()) {
    if (entry.getRead()) {
      System.out.print("   \t");
    } else {
      System.out.print("new\t");
    }
    System.out.print(entry.getDate().toUiString() + "\t");
    System.out.println(entry.getSubject());
  }

Where do I get it?

If you want to know more about GData, you may want to start by checking out the GData website. The homepage of the Webmaster Tools GData API contains a section on the messages feed, with details about the protocol. You can also download the sample Message Center client from the GData download site. It will show you how to use all the Message Center GData API features.




Monday, December 8, 2008

Reintroducing your English Webmaster Help Google Guides

When we announced our latest round of Bionic Posters in the old Help Group, an astute webmaster noted that we had never fully introduced ourselves to the webmaster community. With our announcement of a new English Webmaster Help forum, I'm happy to (re)introduce you all to the Google Guides from around the world who will be hanging out in the English forum:

Guides in Mountain View, California (from left to right): Adam, Maile, Matt Cutts, Chris, Wysz, Matt, Chark, and Adi.

Also in Mountain View: Evan, Jessica, and Nate.

Guides in Kirkland, Washington: Susan, Riona, and Jonathan.

Guides in Zürich, Switzerland: John Mueller and Balázs.

Guides in Hyderabad, India: Koti, Reid (visiting from Mountain View), and Jayan.

Guide in Ottowa Ottawa :), Ontario: Oliver.
All of us look forward to seeing you around the forum! Our Google Guides in non-English Webmaster Help Groups introduce themselves here.



Friday, December 5, 2008

Friend Connect now available in beta to everyone

If you've been looking for a way to grow traffic and make your site more interactive, check out Google Friend Connect -- now in beta and available to all webmasters. Remember that with Friend Connect, you can easily add social features to your site by just copying and pasting a few snippets of code, no programming necessary! Your visitors will be able to join your site, create or link to a profile, interact with other visitors, and invite friends to visit your site. Best of all, your visitors won't be required to create yet another username and password -- Friend Connect lets them sign in using an existing Google, Yahoo, AOL, or OpenID account. To learn more, watch the video below:


So for any of you who were interested in Friend Connect after our first announcement, and also to all of the newer readers out there, go ahead and give Friend Connect a try.



One place for changing your site's settings

One of the many useful features of Webmaster Tools is the ability to adjust settings for your site, such as crawl rate or geographic target. We've been steadily adding settings over time and have now gotten to the point where they merit their own page. That's right, Webmaster Tools now provides a single, dedicated page where you can see and adjust all the settings for your site.

The settings that have been moved to the new Settings page are:
1. Geographic Target
2. Preferred domain control
3. Opting in to enhanced image search
4. Crawl rate control





Changing a Setting
Whenever you change a setting, you will be given an option to save or cancel the change.

Please note: The Save/Cancel option is provided on a per-setting basis, so if you change multiple settings, you'll need to click the Save button associated with each setting.


Expiration of a setting
Some of the settings are time-bounded. That is, your setting will expire after a certain time period. For example, the crawl rate setting has an expiration period of 90 days. After this period, it's automatically reset to the default setting. Whenever you visit the Settings page, you can view the date that your setting will expire underneath the setting name.


That's all there is to it!

We always like adding features and making our interface clearer based on your suggestions, so keep them coming! Please share your feedback (or ask questions) in the Webmaster Help Forum.



Thursday, December 4, 2008

A new look for our Webmaster Help Group

Googlers strongly believe in dogfooding our own products. We manage our work schedules with Google Calendar, publish our blogs on Blogger, and store scads of documentation on Google Sites. So, ever since we launched our first Webmaster Help Group, we've been using Google Groups to facilitate conversations about Webmaster Tools and web search issues.

Today, however, I'm thrilled to announce that our English and Polish Help Groups are getting a makeover. And the changes are more than just skin-deep. Our new Help Forums should make it easier for you to find answers, share resources with others, and have your participation acknowledged.

You can read more about the changes on the Official Google Blog, and then check it out for yourself: English, Polish.

Q: What will happen to the old English and Polish Help Groups?
A: While our old groups are now closed to new posts, they will still be available in read-only mode in case you want to reference any of your favorite posts from the good old days. Many of the most frequently-asked questions (and answers!) have already been transferred to our new Help Forums.

Q: If I was a member of the old group, will I automatically be a member of the new forum?
A: We won't be "transferring" membership from the old groups to the new, so even if you were a member of our Google Groups forum, you'll still need to join the new forum in order to participate. Nicknames and user profiles are also managed separately, so you're welcome to recreate your Google Groups profile in our new forum, or reinvent yourself.

Q: What about the Webmaster Help Groups in other languages?
A: They'll be moving to the new Help Forum format in 2009. Specific dates will be announced in each of the groups as they get closer to their moving date.

Feel free to post any other questions about the new Help Forums in the comments below.




More control of Googlebot's crawl rate

We've upgraded the crawl rate setting in Webmaster Tools so that webmasters experiencing problems with Googlebot can now provide us more specific information. Crawl rate for your site determines the time used by Googlebot to crawl your site on each visit. Our goal is to thoroughly crawl your site (so your pages can be indexed and returned in search results!) without creating a noticeable impact on your server's bandwidth. While most webmasters are fine using the default crawl setting (i.e. no changes needed, more on that below), some webmasters may have more specific needs.

Googlebot employs sophisticated algorithms that determine how much to crawl each site it visits. For the vast majority of sites, it's probably best to choose the "Let Google determine my crawl rate" option, which is the default. However, if you're an advanced user or if you're facing bandwidth issues with your server, you can customize your crawl rate to the speed that works best for your web server(s). The custom crawl rate option lets you tell Googlebot the maximum number of requests per second, and the number of seconds between requests, that are best for your environment.

Googlebot determines the range of crawl rate values available to you in Webmaster Tools, based on our understanding of your server's capabilities. This range may vary from one site to another and over time, based on several factors. Setting the crawl rate to a lower-than-default value may affect the coverage and freshness of your site in Google's search results; setting it to a higher value than the default won't improve your coverage or ranking. If you do set a custom crawl rate, the new rate will be in effect for 90 days, after which it resets to Google's recommended value.

You may use this setting only for root-level sites and sites not hosted on a large domain like blogspot.com (we have special settings assigned for them). To check the crawl rate setting, sign in to Webmaster Tools and visit the Settings tab. If you have additional questions, visit the Webmaster Help Center to learn more about how Google crawls your site, or post your questions in the Webmaster Help Forum.


Written by Pooja Shah, Software Engineer, Webmaster Tools Team


Thursday, November 6, 2008

Hi I have a new question however belive should be...

Hi, I have a new question, though I believe it should belong in another article... here I go. HELP, can anyone help? The reason for the cry for help is that we look after a site that is not showing on Google for search phrases that are expected, and that it used to show strongly and highly for.

I have seen notes from a lot of site owners reporting the same problems. For instance, we have a site that is now not showing for phrases or words prevalent in its domain name and title, while other terms that are less related but still important are showing. Meanwhile, another site that only recently went live is now surpassing the site in question for similar search phrases. Can anyone advise? I had assumed that relevance to the domain name, when searching for that name or phrase, should help, and for every other site and account that seems to be true, but the site in question is not highly profiled, and if it weren't listed on Google Maps it wouldn't be found for the name within its own domain name. The site has had a Sitemap submitted and has been verified, and we also have it in Google Analytics. It shows for many terms, but the puzzle is that it's not high for its most expected term. Also, when we type in the full domain name www.thesite.com.au (not with site:), it shows, but only twice, where any other site would dominate. Are there any issues or things I should consider, or who should I talk to? That would help us understand more and give us the opportunity to adjust and build a better profile.

Wednesday, November 5, 2008

RE: "we sometimes run into situations where import...

RE: "we sometimes run into situations where important content is not publicly available"

That would make the content private (by the publisher's CHOICE), correct?

Saturday, November 1, 2008

Some things I'll probably never understand

I've lived in this world for twenty-some years, but I feel there are still so many things I'll never understand. There are many people whose thinking I'll never be able to comprehend, either. Truly, the older I get, the more complicated my world becomes; wanting to live a simple, uncomplicated life feels out of reach. People often do illogical things, or say illogical things, in order to protect themselves or save face. Who says that just because you're studying abroad, you have to use English on MSN or in email with people who also speak Chinese? Some people feel that if they don't reply in English, they'll seem a notch weaker. To save face, or to show off their ability, they abandon our common mother tongue and chat in an unfamiliar second language. There are a lot of people like that here, so many that I can't be bothered with them, so conversations with them often go like this: they use their clumsy English while I use my fluent Chinese. Ridiculous, right?! I don't care whether people think I use Chinese because my English is poor, because I know what language to use with whom; if you want communication to flow smoothly, you should use the right method. I just want to communicate properly, and I don't want to have to watch for their typos while we chat! Saying all this,

Friday, October 31, 2008

FCF has been around, though perhaps not well known...

FCF has been around, though perhaps not well known, for years. I agree that this is practically begging for a simple Firefox StealThisWebPage plugin, which is why I helped kill the suggestion to implement FCF at my old job. Google Search quickly turns up some blogs on which folks have released one-line Javascript "bookmarklets" for accessing Wall Street Journal, and suggestions for using an existing privacy-focused Firefox plugin to take advantage of FCF.

I don't know if it's possible to truly secure this (especially with Google's insisting that there be no per-individual limit on FCF access). Chances are that the reader knows the "title" of the restricted content. Clearly the reader knows the site's URL. All the reader has to do, as eddie said, is search Google for that title with a "site:" phrase. It doesn't matter what fancy-pants crypto, Web Services, etc. Google might build for FCF -- hacking around FCF will always be fairly easy to do. So the value proposition of FCF is inversely proportional to the cost & difficulty of paying for the restricted content. WSJ wants $103 for an annual subscription, so naturally people will try to abuse that more than if they only asked for $10 per year, or were actually willing to sell a seven-day pass for $1.99.

Thursday, October 30, 2008

John- Can you tell us how long we have to adhere t...

John-

Can you tell us how long we have to adhere to these guidelines before being penalized?

If penalized, would it be just the offending pages or the entire site?

Thanks

Saturday, October 25, 2008

great idea, but unfortunately as is, poor implemen...

great idea, but unfortunately as is, poor implementation.

the main question is: what will keep people from using the "site:" search modifier, or faking the referrer, to see my entire site's premium content for free? it doesn't have to be a perfect solution, but, as is, it's just too easy to be useful.

one idea is to make FCF only available to users logged in to google. google would then pass an anonymous reference to the google user ID to the content provider's site, and the owner of the site can decide how many FCFs to give each user (1, 3, 5, 10, who knows). google would make a web service available to verify that the user ID being supplied was recently active in a google search. of course this locks the user even more into the googlesphere, and restricts opening up FCF to users from other SEs. now if everyone was on openID that problem too would be solved.

using the referrer alone is certainly not protective enough to be useful.

Thursday, October 23, 2008

I agree it may be enough for some types of site in...

I agree it may be enough for some types of sites in the short term (before they go bust, or realise their mistake ;) ), but the aim of FCF is to make premium content freely available, and if it becomes widespread (which I can't see happening in this form - for anything other than non-profit sites, or sites that don't really need signups anyway), awareness and tools will develop to automate this for users. I'd place little value on signups/payments for the content if I had to use a system like this.

I'm suggesting that the entire model of allowing access based on referer is flawed, and suggesting a model by which IMO a better result could be achieved.

FCF is based on the idea that users want to be able to access stuff fully when they click on it (From Google, and by extension from any search engine or directory - why stop there?). I fundamentally don't think that this model is sustainable or securable.

I'm suggesting that it would be a better outcome for everyone if searchers were simply able to distinguish between these results, and choose to filter based on whether they would tolerate signup or pay-for links.

Someone has to pay for the web, and not all content is freely available. I think people will be much more understanding about this (and hence links will convert better to customers) if it's clear to them that the information will cost or require signup before they click on the link.

> Subscription/paid sites need signups to be susta...

> Subscription/paid sites need signups to be sustainable, and
> for a key aspect of their business cannot rely on such
> weak "security".

It could be enough for many sites to just show the registration box to, say, 80% of the people, not caring about all the different ways power users could see the content, as it's not really about "securing" the site from users (after all, Google searchers can see it anyway; though the case may be different if you want to "secure" your site from non-Googlebot crawling). It's more about deceiving, uhm, I mean convincing enough people to send around the link they found in Google believing that the content will show to their friend, and then the friend will see the payment request instead of what was intended to be sent.

Beth Ann,1) You don't need to spoof a referer to "...

Beth Ann,

1) You don't need to spoof a referer to "beat" this; all you need to do is run a Google search for
"site:www.premium-content-here.com" and then open all the links in a new tab. You then get the referer in a perfectly legitimate way, and get access to all the FCF content. Not difficult.

2) How long do you think it would be before browser versions / plugins / mods appear with a "Via Google" button that reloads the page with a Google referer header? Various plugins and browsers already exist to alter these headers (mainly for debugging purposes), and don't require any specialist knowledge.

Subscription/paid sites need signups to be sustainable, and for a key aspect of their business cannot rely on such weak "security".

IMO, a better approach would be for search engines to support metadata about content (via the Sitemap) that specifies whether it's free or requires signup; this would appear in the search results, and searchers could specify whether they want free/pay/subscribe sites in their results. Robust authentication of Googlebot could allow webmasters to permit indexing of a full article whilst retaining it as premium content for customers.

Cloaking content in this way would be far less damaging, as users would be aware that they were about to view a pay / subscription link, and would have the option to filter their search results to restrict these sites if they wished.

If extra processing & development costs are an issue, Google could potentially charge businesses a small amount per clickthrough on links cloaked like this - in the manner of Adwords. The customer would be well aware that the link was pay / register beforehand, so the conversion rates would be high. Google searchers would of course want to be confident that search rankings were not otherwise affected by this, and would be able to filter those results if desired.

Deep / hidden web indexed, customers happy that they are in control of their search results. Businesses happy that their content can be found and conversion rates are high. Google making more money.

Everyone is happy, the world becomes a better place. Free cake for everyone. Steve hailed as saviour of teh interwebs.

Now I'm off to solve the Middle East. What's all the fuss about?

Wednesday, October 22, 2008

@beth ann: The real problem are not those few peop...

@beth ann: The real problem is not those few people who know how to fake referrers in their browsers... the real problem arises from people writing their own bots that always supply a Google referrer, trying to steal the content of my site. As long as there is no 100% reliable way to tell whether someone really did a search on search engine XY or is just supplying a google.com?q=asdfasdf referrer, I will not implement this. I can check whether someone claiming to be googlebot really is googlebot with DNS lookups, but currently there is no way to check the referrer for authenticity, and that's the big problem.
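The DNS check this commenter mentions is a genuine two-step test: reverse-resolve the visitor's IP, confirm the hostname falls under googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP (so a faked PTR record alone is not enough). A minimal sketch; the resolver arguments are injectable purely so the logic can be exercised without network access:

```python
import socket

def is_real_googlebot(ip,
                      reverse_dns=lambda ip: socket.gethostbyaddr(ip)[0],
                      forward_dns=socket.gethostbyname):
    """Return True only if `ip` reverse-resolves to a Google crawl
    hostname AND that hostname forward-resolves back to `ip`."""
    try:
        host = reverse_dns(ip)
    except OSError:
        return False
    # The hostname must be directly under googlebot.com or google.com;
    # endswith() on the dotted suffix rejects lookalikes such as
    # "crawl.googlebot.com.evil.example".
    if not host.endswith(('.googlebot.com', '.google.com')):
        return False
    try:
        return forward_dns(host) == ip
    except OSError:
        return False
```

In contrast to the Referer header, this check cannot be spoofed by the client, because both lookups happen on the server side against DNS records the visitor does not control.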

My feelings on this "service" are mixed, but I've ...

My feelings on this "service" are mixed, but I've got one simple question for the naysayers: How many *regular* people really know how to fake a referrer? Many of the Joe Searchers I know can't figure out the difference between the address box and the Google search box. A few of them might be able to figure out that an advanced search could let them see more content for free, but most people I know aren't that industrious.

Tuesday, October 21, 2008

great info man...

great info man...

B1, Google's cache can be instructed to ignore you...

B1, Google's cache can be instructed to ignore your page by use of a meta tag in the document:

<meta name="robots" content="noarchive">

For premium content this would seem like a good idea anyway, but it doesn't solve all the issues.

I run a subscription-based website and I also sell...

I run a subscription-based website and I also sell a plugin for Wordpress that provides subscription services, so I've wrestled with this subject a lot.

Any subscription-based website admin/author will tell you that they are continually trying to find the happy medium of giving enough teaser (public) material to show what the site is about, its quality and depth, etc, but not giving it all away. After all, your aim is to have people pay to read the rest of it.

This balance can be, and already is, achieved by admins selecting some info for public viewing and restricting the rest.

I'm a big fan of Google and the technology it has introduced over the years, but I don't think this suggestion in its current form will be embraced, because Google is asking them to circumvent this crucial and delicate balance.

I might be more interested in this if Google's search engine would not display the FCF content in its search results (won't this show up in the cached area?), yet still match search terms that the bot found in the FCF page. Also, I don't think users should be given any FCF privileges - only the Google bot, and only if it doesn't reveal exactly what it saw. If the user is interested in the content, then let the website decide whether to let them in - maybe with a free trial if they sign up, a discount on a new subscription, or whatever the admin thinks is best.

Many thanks to Google though for at least sharing with us what goes on inside its incubator and giving us an opportunity to comment. How many other titans out there are so open and engaging? Thank you Google, I appreciate that.

- BC.

OK so Google wants me to agree to make all my prem...

OK so Google wants me to agree to make all my premium content free to anyone who either 1) uses Google's site specific search instead of the navigation I provide OR 2) who fakes the referrer.

WOW what a deal!

David

this is essentially what webmasterworld and google...

this is essentially what webmasterworld and google have been doing since the former erected a pay wall.

Here's (I think) a better solution which avoids mo...

Here's (I think) a better solution which avoids most of these problems. Implement this, and pay me a commission on revenue (just kidding), or give me a job and I'll implement it for you (kidding a bit less ;) ).

----

1) Update the sitemap spec to support the following true/false flags for each page

* Free View (Is it cost free to view the full page)

* Free Preview (Is it cost free to view a preview of the page)

* Registration View (Is registration required to view the page)

* Registration Preview (Is registration required to view a preview of the page)

* Cache Preview (Should a preview of the page be cached)

* Cache page (Should the full page be cached)

(Depending on the model you require, you could base this on pages or articles - which would require some other modifications to the sitemap spec. This could easily be done without breaking for existing sitemaps.)

For each sitemap, indicate if SSL is supported for the URLs spidered (or alternate URLs, etc)

2) Equip googlebot with a client certificate to identify itself.

When googlebot spiders a site, the site can decide if it wants to let Google index the full article or not. If they are a big enough site to support SSL, they can validate the client is Google with higher certainty.

3) In the search listings, show flags depending on whether the information is free to view/preview and whether registration is required. Allow users to filter search results on these criteria, and setup default preferences.

Penalise sites if they are reported (and confirmed) as lying about the cost and registration flags.

Result: You can index the hidden web in a way allowed by the content authors, but still enable the end users to be in control of the type of sites their search will return.

Webmasters could build their registration and authentication system based on the rules in their sitemap file, so everything would be automatically up-to-date and in agreement between themselves and the search engines.

Just pop the cheque in the post ;)
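For concreteness, the per-page flags proposed above might serialize into a sitemap along these lines. This is purely a sketch of the commenter's hypothetical extension: the access namespace URL and the flag names are invented here, and no such extension exists in the real sitemap protocol.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
# Hypothetical namespace for the proposed access flags.
ACCESS_NS = "http://example.com/schemas/access-flags"

def build_sitemap(pages):
    """pages: list of (url, flags) pairs, where flags is a dict of the
    proposed true/false fields, e.g. free_view, registration_preview."""
    ET.register_namespace('', SITEMAP_NS)
    ET.register_namespace('access', ACCESS_NS)
    urlset = ET.Element(f'{{{SITEMAP_NS}}}urlset')
    for url, flags in pages:
        entry = ET.SubElement(urlset, f'{{{SITEMAP_NS}}}url')
        ET.SubElement(entry, f'{{{SITEMAP_NS}}}loc').text = url
        for name, value in flags.items():
            flag = ET.SubElement(entry, f'{{{ACCESS_NS}}}{name}')
            flag.text = 'true' if value else 'false'
    return ET.tostring(urlset, encoding='unicode')

sitemap = build_sitemap([
    ('http://example.com/article1',
     {'free_view': False, 'free_preview': True,
      'registration_view': True, 'cache_page': False}),
])
```

Because the extension lives in its own namespace, existing sitemap consumers would simply ignore the unknown elements, which is how the proposal could avoid breaking existing sitemaps.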

Monday, October 20, 2008

:-)

:-)

This is not something Google is "creating" just en...

This is not something Google is "creating", just endorsing. It is easily implemented, but how many people are actually going to do this? Why would you provide free access only to Google users?

What it boils down to, is will this increase registration conversion rates at sites?

There are two sides to this. First, if showing the "first click for free" does increase the conversion ratio, then you should do that for everyone who visits your site, not just those coming from Google results.

Second, if you decide to do this only for Google users, people are going to spoof their referrer to get to your content for free: not just for the first click, but everywhere this is enabled on your site, which seems like it would reduce registration rates.

I see absolutely no reason a website should choose to do this.

There seems to be some confusion here on this issu...

There seems to be some confusion here on this issue of cloaking. The distinction here is not between users and search engines, but between users coming from one place, and users coming from another. Google apparently wants its customers to have more access to your site than customers from elsewhere - which shouldn't come as a surprise.

*However*, if you allow googlebot to index the whole page, but only show a limited page to browsers (whether referred from Google or not), then it is cloaking.

Google would possibly (probably?) argue that it's okay as long as the googlebot-indexed page has the same content as the page seen by users *when they click through from Google*.

I wonder why anybody would want to implement this. Maybe some of the above enthusiasts could explain what good they see here?
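The referrer-based scheme these comments keep circling reduces to a few lines of server-side logic, which also makes the weakness plain: the Referer header is entirely client-supplied. A minimal sketch; the page strings and the function name are illustrative, not from any real implementation:

```python
from urllib.parse import urlparse

TEASER = "First few paragraphs only; register to read the rest."
FULL = "The entire premium article."

def page_for(referrer, logged_in=False):
    """First Click Free in one function: subscribers always get the
    full page; everyone else gets it only when the visit appears to
    come from a Google results page. Any visitor or bot can send a
    Google referrer on every request, so this gate is trivially
    bypassed."""
    if logged_in:
        return FULL
    host = urlparse(referrer or '').netloc
    if host == 'google.com' or host.endswith('.google.com'):
        return FULL
    return TEASER
```

Note that this is not the same as serving googlebot different content than browsers (the cloaking case above): the branch here depends only on where the *user* came from, not on who is asking.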

I added my thoughts here: http://blogoscoped.com/a...

I added my thoughts here: http://blogoscoped.com/archive/2008-10-20-n30.html

Looks like a new therm should be coined for the th...

Looks like a new term should be coined for things like this: "content neutrality". Or rather, the lack of it in this case.

Also, it's a pity Google has changed its stance on cloaking. Time to update the webmaster guidelines?

Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.

Some examples of cloaking include:

* Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
* Serving different content to search engines than to users.

Am I reading this right? Website owners are now exp...

Am I reading this right?

Website owners are now expected to reward users for using Google to navigate their site rather than the Website's own navigation.

A search for "site:www.restricted-content-here.com" and then opening all the documents in new tabs would seem to provide all "first click free" content. When people figure this out, the only way those sites will make any money / signups is by keeping their valuable content out of this system and only offering teaser content for free - just like now.

Further to that, with this proposal Google doesn't seem to be providing code for owners to use to filter these users, or treating the search traffic differently in any way. While it's not rocket science to implement, implying that this is a Google service seems dubious, as Google doesn't seem to be actually doing *anything* here.

I can see the benefit to general web users (In the short term perhaps), but what's the incentive for website owners to actually implement this? Are they going to be penalised in the search rankings if they don't?

I'm just a bit baffled - am I missing something important being offered here? I assume I must be, as Google usually offers well thought through products.

Dreams of Niu Niu

When Niu Niu passed away in 2006, I just couldn't accept that I would never see her again. At first, every night before bed I would pray that she would visit my dreams. It turns out I worried too much: over these two years she has wandered into my dreams I don't know how many times, and the dreams have grown clearer and more lifelike. Some are quite funny. For example: Niu Niu turned into a horse, yet I still cared for her without hesitation, running everywhere to find carrots for her, though she still behaved exactly like a dog! Sigh... The earliest dreams were of me holding her while she was gravely ill and running, or taking her out to play; those single-moment dreams happened many times. Then came the horse dream just mentioned. In the dream I knew she was a horse, yet I was certain she was Niu Niu, and she acted just like Niu Niu. That dream was more complex and vivid: from how I knew she was a horse, to knowing what she wanted to eat, going to the dream's grocery store and the Hi-Life to find her vegetables, haggling with the shopkeeper and complaining the price was too high, then storming off without buying the spoiled vegetables, bringing her home to eat frozen broccoli, and ordering Kyle not to ride that horse anymore because she hadn't eaten,

Monday, October 13, 2008

Leaving SF & Moving down to LA

At the end of October I'm leaving SF and moving to LA. I've been to LA countless times, but I'm still not used to the weather and lifestyle there; after all, what counts as walking distance means very different things in SF and LA. Between graduation and now I've made several major life decisions. They all came suddenly, but each was a step planned for the future, including leaving San Francisco. Moving to LA has many advantages, the biggest being cheaper rent, but after living in San Francisco for over two years, it's only natural to have grown attached. When I first arrived, the ever-present homeless people and streets that often smelled of urine filled me with disappointment, and constantly hearing about people around me being robbed or treated unkindly made me feel helpless and angry; in Taiwan, after all, everyone looks alike and there are no imposing strangers to fear. But as I adapted to schoolwork, discovered San Francisco's beauty, enjoyed the cool weather, and came to appreciate the convenient transit, I slowly fell in love with the city: the fun of meeting friends at cafes to chat, walking through the financial district under the warm sun, lining up in long queues with everyone else for the most famous brunch,

Wednesday, September 3, 2008

For my best friends in San Francisco!

Farewell, Katrina! Thanks to you, Eve and I had lots of fun in our classes; even when some of them were boring, we still enjoyed going to school. I will remember... the corner where we used to say good-bye to each other after night classes. Our "girls' talk", gossiping and sharing our happiness and sadness. The times we teased each other and passed notes in class to complain how bxxxxx the

Monday, September 1, 2008

A show I really like - That 70's Show

In Taiwan I was a TV kid, and it turns out I'm no exception in America. Haha~ My shows of course include the CSI series, but what's special are some teen and family comedies. They don't have CSI's high tech or intricate scripts, but their down-to-earth stories have made me a loyal viewer. One show is That 70's Show, which I've watched since my very first day in America. It's set in 1970s Wisconsin and follows a few high schoolers and the everyday lives of their families: childhood-sweetheart romance, hilariously silly friendships, and family affection expressed in all sorts of ways. Apart from Demi Moore's current husband Ashton Kutcher, I didn't really know the other actors at first, but they all have solid acting chops. The show also weaves in song-and-dance numbers tied to the plot, an important part of the show; everyone sings and dances wonderfully! Its relaxed, witty, true-to-life style is the best medicine for my dreary life as a student abroad. They rerun it endlessly, but it makes me laugh every single time. There are many other family comedies of this type that are just as hilarious,

Tuesday, August 5, 2008

F-A-B-U-L-O-U-S


Sunday, July 27, 2008

Internship

This semester is my last in graduate school, which means that after 8/20 I officially graduate and enter the workforce. In this place, America, surviving sometimes takes not only working harder than the locals but also a bit of luck; the innate advantages are already fewer, and on top of that you face legal restrictions designed to keep outsiders out. To gain more experience, I was lucky enough to land an internship in early June, working Wednesdays and Thursdays as a marketing intern at a nonprofit. The company doesn't have many rules. At first I wasn't used to it; the new environment made me a little scared, and I didn't even dare go out for lunch. But then I realized that our company really is that free a place: if you feel it's time to go out for lunch, you go; when your work is done, you leave, whether or not it's quitting time; communication between managers and staff is just that open. At first I didn't really know what I could do or contribute for them, so I started with the basics and did what they wanted done as well as I could. I began to get familiar with Google Analytics and helped them analyze their Google AdWords

Friday, July 25, 2008

Another type of person who's hilarious

Lately my tolerance for idiots keeps dropping, so let me vent! Besides having the trait I described before, "thinking you're American just because you've been in the US a while," add pretending your Chinese is bad, plus self-importance. Self-importance comes in many forms, and this person expresses it in the dumbest way. When he wants to interact with you, he isn't straightforward; he lacks even the most basic manners. That may sound vague, so here's an example. When he wants to add you as a friend, instead of simply clicking the option labeled "add xxx as a friend" like any normal person, he sends you a message asking, "Did you add me as a friend?" Out of a hundred, a thousand, ten thousand people, only one would do that: him! Normal people just click the option; they don't coyly message you to ask. If you want to be my friend, why not add me yourself? Why should I have to add you first; does that feel better? Self-importance +1. Even funnier, whenever he wants to message me on MSN, he won't politely say "hi"; he deliberately sends me a wrong message, something obviously meant for someone else, and then expects me to respond,

Tuesday, July 22, 2008

When the conversation doesn't click, even half a sentence is too much

The circle here is small and everyone knows everyone. Most people are friendly, but that minority isn't necessarily so. To me the word "friend" means simple, pure friendship. In a busy life, friends always have a place in your heart. When you're happy, they're happy for you! When you're sad, they care for you in their own way; maybe it's just a few words, but a warm hand has already reached into my heart. When people have time, we steal a moment for a meal, some karaoke, a quick coffee. Even a few hours, a few minutes, is a kind of satisfaction. As time passes, everyone's life plans start to take concrete shape, and spending a whole day shopping and eating together has become a luxury. True friends treasure the little time we have together and the memories we share. But some friends, I'm not sure they fit the word "friend"; maybe they should just be called acquaintances; what they say always leaves you confused. This type of person has a curious trait: he keeps tracking what you're up to and constantly reminds you how great he is. While doing everything he can to make you see how great he is, he doesn't forget to act very humble. Contradictory, right?

Tuesday, April 15, 2008

4/13: A bike ride to Sausalito on a scorching hot day

Yesterday I finally fulfilled a long-held dream! No matter how hellish this week is, I couldn't resist the sun's call, and rode with friends from Fisherman's Wharf across the Golden Gate Bridge to the laid-back little town of Sausalito. So happy, so happy! The route is only about 8 miles, but it took us over two hours. The original post embedded a route map here; the blue dotted line marked our route. Picking up the bikes at Fisherman's Wharf, I felt nervous and excited! I hadn't ridden a bike on the road in ages, and San Francisco's streets feel pretty challenging. Along the way we passed the Marina District, a wealthy neighborhood right by the beach. I thought the route was well designed: we could ride safely in the dedicated bike lane, watching plenty of good-looking joggers go by with a cool breeze blowing, a real treat! As the Golden Gate Bridge drew closer and closer, we stopped to take photos. Getting up to the bridge was no easy feat,

Saturday, April 12, 2008

The Olympic torch relay in San Francisco!

On 4/9 the Olympic torch relay came to San Francisco, the only stop in the entire US. I've never been very interested in the Olympics (Taiwan doesn't compete in many events, after all), but I'd never seen a torch relay and wanted to have a look! Some people said to gather at 5:30 a.m. to grab a good spot. Am I crazy? I don't think I'm quite that enthusiastic! Haha. This relay drew intense international attention, because Tibetan-independence supporters had disrupted it in both Paris and London, ending in chaos. So to keep the torch safe, the San Francisco police and city government came up with several different routes that could change at any moment, meaning you wouldn't necessarily see the torch even at the announced location! Sounds bad, right? Sure enough, I didn't see the torch that day; luckily I only stayed half an hour! The original route started at AT&T Park, so that's where we went. The moment I got off the Muni I saw so many five-star flags and Tibetan-independence flags that I forgot where I was, never mind hearing everyone nearby break into the Chinese national anthem; I chose to quietly watch the police's every move. While waiting, I kept hearing compatriots from across the strait cursing CNN

Wednesday, April 9, 2008

Women want more horses!

If you were really rich and could afford any kind of car, what would you buy yourself? A sports car, an SUV, a sedan, or a minivan? Personally I'd pick a sports car, because it's flashy, even if it's not that comfortable to sit in. But you're rich! What's to fear! Statistics show that middle-aged American women (over 45) are buying two-door and luxury cars at an ever-growing rate. Here are the numbers: since 2000, purchases of two-door cars by middle-aged women are up 297% (the Mazda & Chrysler Crossfire sell best), while purchases of luxury cars (the BMW 3 Series & Audi A4 sell best) are up 97%. Wow, striking numbers! It's hard to imagine how middle-aged women got connected with sports cars! It comes down to a behavior pattern called self-gifting. Every year, Americans spend a full $100 billion on gifts! And 69% of people buy at least one gift a year, mostly for themselves! Why does this pattern appear? Because middle-aged women, after years of devoting themselves to family and children,

Monday, April 7, 2008

Muuuah has grown up!

After a few days of effort, Muuuah has finally grown legs and a water pipe! How do I put it? I'm thrilled!! Haha~ Back when she had no legs, I envied everyone whose pets had legs! After she grew legs, I saw everyone else had water pipes and envied that even more! And then yesterday Muuuah grew a water pipe too! So happy! The next goal is to grow taller; I hope that when people visit my page to say hello, my Muuuah will fly up too! So adorable! BTW, I also decorated her home, going for a sickeningly sweet style. I have to keep working hard to earn money, so I can give her a better room! ^.^ Now for some funny scenes from her visit to my friend babyH: Muuuah is well behaved, following babyH's every step; it is her house, after all! She has to check in there every day, because there are snacks to eat, and now even a TV to watch! Huh?! The two of them are crossing paths as they walk! Is this a new dance move? Muuuah, don't wander off! Or you'll embarrass yourself any moment, just like your owner! That's my Muuuah... my little girl is all grown up!

Wednesday, April 2, 2008

The world has gained one more hope, named "Muuuah"

After Heidi shared the little water-drop life generator, my Muuuah was born. All I can say is: amazing! The site is all in Japanese, but with mama Heidi's detailed guidance and enthusiastic tutorials from many Taiwanese netizens, this water-drop mama is getting more and more skilled! Not only am I busy taking her around to make friends and say hello to earn her living expenses, I'm also frantically collecting seeds for her, so she has a nicer environment to grow and play in! Yesterday I even took her, still a baby, to a hot spring! She immediately invited another baby over to our house to play! Our home still has nothing but bare walls, but I'll do my best! (Diligently saying hello every day @_@) I don't know what kind of water drop Muuuah will grow into; thankfully it won't be a chameleon, or this water-drop mama would feel guilty, haha~ All the other water drops are adorable beyond words! Little water-drop Muuuah, welcome!

Friday, March 28, 2008

Running into a Fudan upperclassman in America

Last night I went with friends to a reporter's book launch at Swig, a bar not far from home; the wind was biting outside and it was sweltering inside. Walking in, we saw everyone eating familiar pan-fried buns and dumplings. It turns out the author is ethnically Chinese but grew up thoroughly American, so she doesn't speak much Chinese. Her book explores fortune cookies, a dessert that often appears in Chinese restaurants in America. Every fortune cookie hides a slip of paper, and everyone gets a saying on theirs; usually I just glance at mine! So the bar was full of Chinese people last night, some who spoke Chinese and some who didn't, plus many San Francisco Chronicle reporters. After the author finished a very quick presentation, I decided to step outside for some air; the bar was extremely crowded and noisy. Having skipped dinner, I started clamoring to go eat good Japanese ramen, and so my friends' friends joined us too. That ramen shop's business was so good,

Monday, February 25, 2008

Burning out, burning out

I've finally finished studying for the midterm. All I can say is ~ my liver is about to give out! I hope it stops raining in SF; I want to go out and have fun this weekend! Please grant me a big bright sun, with just a little cool breeze on top... Mwahaha

Monday, February 18, 2008

**Taro sago soup**

Mm... I love taro; any taro product at all is a favorite of mine. So when restocking this time, I picked up taro and canned taro, plus a bag of sago, to make taro sago soup~ It's actually pretty simple: once the water boils, drop in the sago and cook for 4-5 mins, then turn off the heat and let it sit, covered, for 20 mins. Rinse the cooked sago in cold water and set it aside. Next, cook the taro sweet soup: once the water boils, pour in the canned taro and simmer on low for about ten minutes, then add milk and sugar. Finally, stir in the cooked sago and it's done~ Easy and delicious

Sunday, February 17, 2008

Haha~ time to come back to life!

A few days ago, Xiao-Qiu reminded me that my blog needed some attention. Come to think of it, it really has been a while since I touched it, ever since last semester started. I don't know why, but in America there always seems to be so much churning in my head: maybe schoolwork, maybe the little things of daily life, maybe the future. Thankfully, I can always snap out of it at my most frustrated moments, so I can keep striding forward. Mwahaha~