

:: Minus Words

i don’t generally like to block ads. As i’ve said time and time again, the internet is not free. Someone has to pay not only for your connection to it, but for all the various sites and whatnot you connect to. If it’s not you ponying up the payola, it’s probably an advertiser.

i recognize that. For the benefit i get from visiting various sites, i happily accept that i will see some level of advertising. If a site runs what i feel are excessive or overly annoying ads (comics.com), i stop visiting and look for an appropriate replacement (yahoo.com).

Sadly, there are times when i find an ad so egregious that i do run an ad blocker. Exploiting a Firefox hole to force a popup, playing a looped sound file on load with no way for me to turn it off, or floating an ad directly on top of the content until i dismiss it: all of these pretty much result in me adding that particular ad class to the block listing.

i’m now thinking about adding AdWords to that list.

Mind you, i have no problem with Google collecting all sorts of metrics on the folks who want to use their Web Accelerator. i also applaud the fact that it’s an anonymizing proxy. What i take issue with is that it displays content without the ads that are being used to pay for that content.

It hurts the advertiser
Let’s say you’ve got an ad campaign giving away free iPods to the first 500 people who click on an ad. You’ve carefully worked out the details with the ad company to ensure the ad is only shown until all 500 are gone, so that you limit the number of people who will be refused. The accelerator pretty much blows that out of the water. It also means that if you advertise a price for something like an airfare, you may be legally obligated to keep honoring that price far longer than you planned, because someone will be seeing the page well beyond its normal, cache-busting expiration period. It’s not far-fetched at all, folks.
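By “cache-busting” i mean the old trick of tacking a unique value onto each request so that every cache along the way treats it as brand new. A rough sketch, with a made-up ad host:

    // sketch of a typical cache-buster; ads.example.com is a placeholder
    // the timestamp makes every request look new to well-behaved caches
    var ad = new Image();
    ad.src = "http://ads.example.com/offer.gif?cb=" + new Date().getTime();

A proxy that pre-fetches pages and serves them out of its own cache sidesteps exactly that kind of expiry control.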

It hurts the content producer
Sure, it’s nice that you don’t have to foot the bill for your page suddenly being incredibly popular, but how will you ever know it was? What if you decide to remove that page sometime in the future because it’s just not getting that much traffic anyway? You’ll have no idea how much of your audience will now be frustrated and go to your competitor because you “just killed off the one really cool feature that everyone on Slashdot was talking about”.

It doesn’t hurt Google
They get lots of information about how their customers surf. They get lots of information about which pages are popular (in a very insulated manner) that they can then sell off to tracking outfits like the MPAA and RIAA. (Nielsen ain’t got nothing on search engines, kids.) What’s more, how is AdWords currently being distributed? By JavaScript includes, kids.

Guess what probably won’t be read out of local, pre-fetched caches?
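To illustrate: an ad “include” is just a bit of JavaScript the visitor’s browser runs at view time, which then pulls the actual ad script from the ad server. Roughly like so (host and file names made up, not Google’s real ones):

    // sketch of what an ad include boils down to: the browser fetches
    // and runs the ad server's script whenever the page is rendered
    document.write('<script type="text/javascript" '
        + 'src="http://pagead.example.com/show_ads.js"><\/script>');

Since that fetch happens live against the ad server every time the page actually renders, Google still gets its impressions and its metrics whether or not the page itself came out of the accelerator’s cache.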

i’m sorry, but if they’re not going to play fair, i can’t see why i should.

For what it’s worth, i will probably download and screw around with the Web Accelerator, if only to figure out how best to break it. Oh, don’t worry, i’m sure the fine folks at fastclick.net (the ones who happily pop up windows in Firefox) have already grabbed copies and are doing the exact same thing.

Google, Do No(thing but) Evil

Ok, i’ve installed the client and have begun poking around with it.

  1. You can’t specify the install directory (%ProgramFiles%\Google\Web Accelerator\) or the cache directory (%TMP%).
  2. According to the proxy.pac file for IE, Google, gmail, windowsupdate.microsoft.com, local nets (192.168.*, 10.*), and a few other subnets are excluded from caching. (See the sketch after this list for the kind of logic that file uses.)
  3. You can specify various sites you wish to exclude as well. Based on the above, i’d recommend excluding frequently visited news, RSS, blog, weather, finance, movie and TV listing, travel booking, and similar sites.
  4. Which makes me wonder how useful this tool would really be.
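For reference, a proxy.pac file is just JavaScript that decides, per request, whether to use the proxy. A stripped-down sketch of the exclusion logic described in item 2; the host names come from that list, but the proxy address and overall shape are my guesses, not Google’s actual file:

    function FindProxyForURL(url, host) {
      // local subnets never go through the accelerator
      if (isInNet(host, "192.168.0.0", "255.255.0.0") ||
          isInNet(host, "10.0.0.0", "255.0.0.0"))
        return "DIRECT";
      // excluded hosts: google, gmail, windowsupdate, plus your own additions
      if (dnsDomainIs(host, ".google.com") ||
          dnsDomainIs(host, "gmail.com") ||
          dnsDomainIs(host, "windowsupdate.microsoft.com"))
        return "DIRECT";
      // everything else is fetched through the local accelerator client,
      // falling back to a direct connection if the proxy is down
      return "PROXY 127.0.0.1:9100; DIRECT";
    }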

Seriously. Either Google would have to be phenomenal at guessing which pages you’re likely to visit, or have better cross-country routing tables than everyone else’s, for this to really be useful. Don’t services like Akamai do this already? (With the added bonus that content producers can determine what should and shouldn’t be cached.)

Oh well, more diddling to come.
