FCF has been around, though perhaps not well known, for years. I agree that this is practically begging for a simple Firefox StealThisWebPage plugin, which is why I helped kill the suggestion to implement FCF at my old job. A Google search quickly turns up blogs on which folks have released one-line JavaScript "bookmarklets" for accessing the Wall Street Journal, and suggestions for using an existing privacy-focused Firefox plugin to take advantage of FCF.
I don't know if it's possible to truly secure this (especially with Google's insistence that there be no per-individual limit on FCF access). Chances are that the reader knows the title of the restricted content. Clearly the reader knows the site's URL. All the reader has to do, as eddie said, is search Google for that title with a "site:" phrase. It doesn't matter what fancy-pants crypto, web services, etc. Google might build for FCF -- hacking around FCF will always be fairly easy. So the viability of FCF is inversely proportional to the cost and difficulty of paying for the restricted content: WSJ wants $103 for an annual subscription, so naturally people will try to abuse FCF more than if the paper asked only $10 per year, or were actually willing to sell a seven-day pass for $1.99.
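To make concrete how low the bar is: the "site:" trick eddie describes needs nothing more than a search URL. A minimal Python sketch, with a made-up title and domain:

    from urllib.parse import quote_plus

    def google_site_search_url(title, domain):
        # Build the query described above: the article's title restricted
        # to the paywalled site's domain with the "site:" operator.
        query = f'"{title}" site:{domain}'
        return "https://www.google.com/search?q=" + quote_plus(query)

    # Placeholder title and domain, purely for illustration:
    print(google_site_search_url("Some Paywalled Headline", "online.wsj.com"))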
Friday, October 31, 2008
Thursday, October 30, 2008
John-
Can you tell us how long we have to adhere to these guidelines before being penalized?
If penalized, would it be just the offending pages or the entire site?
Thanks
Saturday, October 25, 2008
Great idea, but unfortunately, as is, a poor implementation.
The main question is: what will keep people from using the "site:" search modifier, or faking the referrer, to see my entire site's premium content for free? It doesn't have to be a perfect solution, but, as is, it's just too easy to game to be useful.
One idea is to make FCF available only to users logged in to Google. Google would then pass an anonymous reference to the Google user ID to the content provider's site, and the owner of the site could decide how many FCFs to give each user (1, 3, 5, 10, who knows). Google would make a web service available to verify that the user ID being supplied was recently active in a Google search (sketched below). Of course, this locks the user even more into the Googlesphere and restricts opening FCF up to users from other search engines. If everyone were on OpenID, that problem too would be solved.
Using the referrer alone is certainly not protective enough to be useful.
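A minimal sketch of what that verification call might look like. The endpoint and parameter names are invented -- Google offers no such API -- so this only illustrates the flow proposed above:

    import requests

    # Hypothetical endpoint and parameters; no such Google API exists.
    VERIFY_URL = "https://www.google.com/fcf/verify"

    def recently_searched(anon_user_ref, api_key):
        # Ask the (hypothetical) service whether this anonymous user
        # reference was recently active in a Google search.
        resp = requests.get(
            VERIFY_URL,
            params={"user": anon_user_ref, "key": api_key},
            timeout=5,
        )
        return resp.ok and resp.json().get("recent_search") is True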
Thursday, October 23, 2008
I agree it may be enough for some types of sites in the short term (before they go bust, or realise their mistake ;) ), but the aim of FCF is to make premium content freely available, and if it becomes widespread (which I can't see happening in this form for anything other than non-profit sites, or sites that don't really need signups anyway), awareness and tools will develop to automate this for users. I'd have to place little value on signups/payments for the content to use a system like this.
I'm suggesting that the entire model of allowing access based on referer is flawed, and suggesting a model by which IMO a better result could be achieved.
FCF is based on the idea that users want to be able to access stuff fully when they click on it (from Google, and by extension from any search engine or directory -- why stop there?). I fundamentally don't think this model is sustainable or securable.
I'm suggesting that it would be a better outcome for everyone if searchers were simply able to distinguish between these results, and choose to filter based on whether they would tolerate signup or pay-for links.
Someone has to pay for the web, and not all content is freely available. I think people will be much more understanding about this (and hence links will convert better to customers) if it's clear to them that the information will cost or require signup before they click on the link.
> Subscription/paid sites need signups to be sustainable, and
> for a key aspect of their business cannot rely on such
> weak "security".
It could be enough for many sites to just show the registration box to, say, 80% of people, without worrying about all the different ways power users could still see the content. It's not really about "securing" the site from users (after all, Google searchers can see it anyway; the case may be different if you want to "secure" your site against non-Googlebot crawling). It's more about deceiving -- uhm, I mean convincing -- enough people to pass around the link they found in Google believing the content will show for their friend, who will then see the payment request instead of what was meant to be shared.
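For concreteness, the naive referrer gate all of this hinges on might look like the following sketch -- note that it trusts a header the visitor's own client supplies, which is exactly the weakness raised throughout this thread:

    from urllib.parse import urlparse

    def show_full_article(referer):
        # Naive FCF gate: serve the full article when the Referer claims
        # a Google search; otherwise show the registration box. Trivially
        # defeated, because the client controls the Referer header.
        if not referer:
            return False
        host = urlparse(referer).hostname or ""
        return host == "google.com" or host.endswith(".google.com")

    # Example: a click-through from a Google results page passes the gate.
    print(show_full_article("https://www.google.com/search?q=example"))  # True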
Beth Ann,
1) You don't need to spoof a referer to "beat" this; all you need to do is run a Google search for "site:www.premium-content-here.com" and then open all the links in new tabs. You then get the referer in a perfectly legitimate way, and get access to all the FCF content. Not difficult.
2) How long do you think it would be before browser versions / plugins / mods appear with a "Via Google" button that reloads the page with a Google referer header? Various plugins and browsers already exist to alter these headers (mainly for debugging purposes), and they don't require any specialist knowledge.
Subscription/paid sites need signups to be sustainable, and for a key aspect of their business cannot rely on such weak "security".
IMO, a better approach would be for search engines to support metadata about content (via the sitemap) specifying whether it's free or requires signup, to surface that in the search results, and to allow the searcher to set preferences for whether they want free/pay/subscribe sites in their results. Robust authentication of Googlebot could allow webmasters to permit indexing of a full article while retaining it as premium content for customers.
Cloaking content in this way would be far less damaging, as users would be aware that they were about to view a pay / subscription link, and would have the option to filter their search results to restrict these sites if they wished.
If extra processing and development costs are an issue, Google could potentially charge businesses a small amount per clickthrough on links cloaked like this, in the manner of AdWords. The customer would be well aware that the link was pay/register beforehand, so conversion rates would be high. Google searchers would of course want to be confident that search rankings were not otherwise affected by this, and would be able to filter those results if desired.
Deep / hidden web indexed, customers happy that they are in control of their search results. Businesses happy that their content can be found and conversion rates are high. Google making more money.
Everyone is happy, the world becomes a better place. Free cake for everyone. Steve hailed as saviour of teh interwebs.
Now I'm off to solve the Middle East. What's all the fuss about?
Wednesday, October 22, 2008
@beth ann: The real problem is not the few people who know how to fake referrers in their browsers; the real problem arises from people writing their own bots that always supply a Google referrer while trying to steal the content of my site. As long as there is no 100% reliable way to tell whether someone really did a search on search engine XY or is just supplying a google.com?q=asdfasdf referrer, I will not implement this. I can check whether someone claiming to be Googlebot really is Googlebot with DNS lookups, but there is currently no way to check the referrer for authenticity, and that's the big problem.
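The DNS check mentioned here is Google's documented forward-confirmed reverse DNS procedure; a sketch in Python:

    import socket

    def is_real_googlebot(ip):
        # 1) Reverse-resolve the visitor's IP to a hostname.
        try:
            hostname = socket.gethostbyaddr(ip)[0]
        except socket.herror:
            return False
        # 2) The hostname must be under googlebot.com or google.com.
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # 3) Forward-resolve the hostname and confirm it maps back to
        #    the same IP, so the reverse record can't be faked.
        try:
            return ip in socket.gethostbyname_ex(hostname)[2]
        except socket.gaierror:
            return False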
My feelings on this "service" are mixed, but I've got one simple question for the naysayers: How many *regular* people really know how to fake a referrer? Many of the Joe Searchers I know can't figure out the difference between the address box and the Google search box. A few of them might be able to figure out that an advanced search could let them see more content for free, but most people I know aren't that industrious.
Tuesday, October 21, 2008
B1, Google's cache can be instructed to ignore your page by use of a meta tag in the document:
<meta name="robots" content="noarchive">
For premium content this would seem like a good idea anyway, but it doesn't solve all the issues.
I run a subscription-based website and I also sell a plugin for WordPress that provides subscription services, so I've wrestled with this subject a lot.
Any subscription-based website admin/author will tell you that they are continually trying to find the happy medium of giving enough teaser (public) material to show what the site is about, its quality and depth, etc, but not giving it all away. After all, your aim is to have people pay to read the rest of it.
This balance can be, and already is, achieved by admins selecting some info for public viewing and restricting the rest.
I'm a big fan of Google and the technology it has introduced over the years, but I don't think this suggestion in its current form will be embraced because Google is asking to circumvent this crucial and delicate balance.
I might be more interested in this if Google's search engine would not display the FCF content in its search results (won't this show up in the cached copy?), yet still match search terms the bot found in the FCF page. Also, I don't think users should be given any FCF privileges -- only Googlebot, and only if it doesn't reveal exactly what it saw. If the user is interested in the content, then let the website decide whether to let them in or not -- maybe with a free trial if they sign up, or a discount on a new subscription, or whatever the admin thinks is best.
Many thanks to Google, though, for at least sharing with us what goes on inside its incubator and giving us an opportunity to comment. How many other titans out there are so open and engaging? Thank you, Google; I appreciate that.
- BC.
OK, so Google wants me to agree to make all my premium content free to anyone who either 1) uses Google's site-specific search instead of the navigation I provide, or 2) fakes the referrer.
WOW what a deal!
David
This is essentially what WebmasterWorld and Google have been doing since the former erected a paywall.
Here's (I think) a better solution which avoids most of these problems. Implement this, and pay me a commission on revenue (just kidding), or give me a job and I'll implement it for you (kidding a bit less ;) ).
----
1) Update the sitemap spec to support the following true/false flags for each page
* Free View (Is it cost free to view the full page)
* Free Preview (Is it cost free to view a preview of the page)
* Registration View (Is registration required to view the page)
* Registration Preview (Is registration required to view a preview of the page)
* Cache Preview (Should a preview of the page be cached)
* Cache Page (Should the full page be cached)
(Depending on the model you require, you could base this on pages or articles, which would require some other modifications to the sitemap spec. This could easily be done without breaking existing sitemaps.)
For each sitemap, indicate whether SSL is supported for the URLs spidered (or alternate URLs, etc.).
2) Equip googlebot with a client certificate to identify itself.
When Googlebot spiders a site, the site can decide whether it wants to let Google index the full article. If the site is big enough to support SSL, it can validate that the client is Google with higher certainty.
3) In the search listings, show flags indicating whether the information is free to view/preview and whether registration is required. Allow users to filter search results on these criteria, and to set up default preferences.
Penalise sites if they are reported (and confirmed) as lying about the cost and registration flags.
Result: you can index the hidden web in a way allowed by the content authors, while still letting end users control the type of sites their search will return.
Webmasters could build their registration and authentication system based on the rules in their sitemap file, so everything would be automatically up-to-date and in agreement between themselves and the search engines.
Just pop the cheque in the post ;)
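For concreteness, a single <url> entry carrying the flags proposed above might look like the output of this sketch. The namespace and tag names are invented for illustration; no such sitemap extension exists:

    import xml.etree.ElementTree as ET

    # Invented namespace for the proposed premium-content flags.
    NS = "{http://example.com/schemas/premium-sitemap/0.1}"

    url = ET.Element("url")
    ET.SubElement(url, "loc").text = "https://example.com/articles/42"
    for flag, value in [
        ("freeView", "false"),
        ("freePreview", "true"),
        ("registrationView", "true"),
        ("registrationPreview", "false"),
        ("cachePreview", "true"),
        ("cachePage", "false"),
    ]:
        ET.SubElement(url, NS + flag).text = value

    print(ET.tostring(url, encoding="unicode"))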
Monday, October 20, 2008
This is not something Google is "creating", just endorsing. It's easily implemented, but how many people are actually going to do it? And why would you provide free access only to Google users?
What it boils down to, is will this increase registration conversion rates at sites?
There are two sides to this. First, if showing the first click for free does increase the conversion ratio, then you should do that for everyone who visits your site, not just those coming from Google results.
Second, if you do this only for Google users, people are going to spoof their referrer to get to your content for free -- not just for the first click, but everywhere this is enabled on your site -- which seems like it would reduce registration rates.
I see absolutely no reason a website should choose to do this.
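On the spoofing point: forging the referrer takes one line with any HTTP client, no plugin required. A sketch (the article URL is a placeholder):

    import requests

    # The Referer header is entirely client-controlled, so a "came from
    # Google" visit can be fabricated by any person or bot.
    page = requests.get(
        "https://example.com/premium-article",  # placeholder URL
        headers={"Referer": "https://www.google.com/search?q=anything"},
        timeout=10,
    )
    print(page.status_code)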
There seems to be some confusion here on the issue of cloaking. The distinction is not between users and search engines, but between users coming from one place and users coming from another. Google apparently wants its customers to have more access to your site than customers from elsewhere -- which shouldn't come as a surprise.
*However*, if you allow Googlebot to index the whole page but only show a limited page to browsers (whether referred from Google or not), then it is cloaking.
Google would possibly (probably?) argue that it's okay as long as the Googlebot-indexed page has the same content as the page seen by users *when they click through from Google*.
I wonder why anybody would want to implement this. Maybe some of the above enthusiasts could explain what good they see here?
I added my thoughts here: http://blogoscoped.com/archive/2008-10-20-n30.html
Looks like a new term should be coined for things like this: "content neutrality". Or rather, the lack of it in this case.
Also, it's a pity Google has changed its stance on cloaking. Time to update the webmaster guidelines?
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
Some examples of cloaking include:
* Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
* Serving different content to search engines than to users.
Am I reading this right?
Website owners are now expected to reward users for using Google to navigate their site rather than the website's own navigation.
A search for "site:www.restricted-content-here.com", with all the documents opened in new tabs, would seem to provide all the "first click free" content. Once people figure this out, the only way those sites will make any money or signups is by keeping their valuable content out of this system and offering only teaser content for free -- just like now.
Further, with this proposal Google doesn't seem to be providing code for owners to use to filter these users, or treating the search traffic differently in any way. While it's not rocket science to implement, implying that this is a Google service seems dubious, as Google doesn't seem to be actually doing *anything* here.
I can see the benefit to general web users (In the short term perhaps), but what's the incentive for website owners to actually implement this? Are they going to be penalised in the search rankings if they don't?
I'm just a bit baffled - am I missing something important being offered here? I assume I must be, as Google usually offers well thought through products.