Friday, March 30, 2007

Republish your site's RSS feed on Twitter

Joel has been wanting to replicate my site's RSS data via Twitter, to do a quasi-realtime update service. Neat idea, even though I'm not the biggest Twitter fan. He sent me this SWiK page, showing how to repurpose an existing feed as a Twitter bot. Very slick.
require 'rubygems'
require 'active_record'
require 'simple-rss'
require 'open-uri'
require 'twitter'

#twitter account to post to
twitter_email = ""
twitter_password = "secret"

#rss feed to post
rss_url = ""
rss_user_agent = ""

#sqlite db
path_to_sqlite_db = "/PATH/TO/db.sqlite"

ActiveRecord::Base.logger =
ActiveRecord::Base.colorize_logging = false

  :adapter => "sqlite3",
  :dbfile  => path_to_sqlite_db

#uncomment this section the first time to create the table
#ActiveRecord::Schema.define do
#  create_table :items do |table|
#    table.column :title, :string
#    table.column :link, :string
#  end

class Item < ActiveRecord::Base
  #render as "title - link", trimming the title so the whole
  #status fits within Twitter's 140-character limit
  def to_s
    "#{self.title[0..(136 - self.link.length)]} - #{}"

#run the beast
rss_items = SimpleRSS.parse open(rss_url, "User-Agent" => rss_user_agent)

for item in rss_items.items
  Item.transaction do
    #only tweet items we haven't already posted
    unless existing_item = Item.find(:all, :conditions => ["link = ?",]).first
      new_item = Item.create(:title => item.title, :link =>
      twitter ||=, twitter_password)

Thursday, March 29, 2007

Are TV audiences getting smarter?

I just caught something from an NBC promo for "Heroes", in which the tagline read something like this:
To enter the contest and make your theory, go to: ""
Could this signal that mainstream TV audiences are getting smarter, at least in terms of URL savvy? Mentioning a web path that goes more than a single directory deep was a major no-no in years past in terms of marketing over traditional broadcast media. Since I work for a media company, it was critical to have friendly URLs for our cross-platform teases. Mental bookmarking away from the PC is easier (translating into actual browser bookmarking later), top-of-mind awareness is preserved and you generate a winning connection with users. This was a major design decision that went into our site from the outset.

Most of the better, non-CMS-dependent web sites in the early days used to do this, too. Microsoft used to be this way but has largely gone away from such a hierarchy. Google modified the rule, typically using, and the like.

If viewers and listeners can tolerate going one more folder down, I guess that's progress.

It's on

Forgive me for gloating just a tad, but I've compiled a list of the stations that we're up against for this year's national Murrow award for best small market TV news web site. We won last year.
What do you think and which is your favorite? And why? There are some pretty good sites this year with some interesting takes on presenting news, events and information. At some point, I'd like to be a judge for the Murrows...maybe there's a "Top Gun" rule - ya know, if you win once, you get to come back as an instructor. :-)

 
Of interest is that a lot of these sites run ASP.NET. I'm just sayin'. wins third straight regional Murrow

This is awesome! I mentioned earlier how it's the awards season for the broadcast industry. I've known about this for a couple days now, but now the media embargo's over so I can let the cat out of the bag - was selected as the top news web site by the Radio and Television News Directors Association for small markets in our region (Guam, Hawaii, Nevada & California).

A special congrats go out to our colleagues at KSEE 24 in Fresno, who dominated our region. Very nice work.

This is our third straight regional Edward R. Murrow Award at KUAM, and we're up for our second straight national award this fall. I'm also crossing my fingers for an Eppy.

Congrats to all this year's winners!

Tuesday, March 27, 2007

KUAM goes desktop

If you're wondering about my fascination with desktop RIAs over the last few weeks, there's a very good reason. We launched our KUAM Desktop initiative today, after having worked with our beta community over the last several weeks on a series of desktop tools and utils, with our new IE toolbar plugin being the first in the series.

This allows us to have yet another aspect of Constant Connectivity with our users, given the fact that the embedded newsticker pulls content from our news RSS feed, and has a search bar that lets users lookup stuff on our site.

I put it up about a half-hour ago and already it's been downloaded a bunch of times by Guam news junkies, so things are looking good!

Saturday, March 24, 2007

Let's talk!

Make some time to check out KUAM's booth at the Home and Living Expo this weekend. We're hosting a very fun and interactive session this year - you and your friends/family can star in a commercial! I'm not presenting anything formally (I wish I was), but it's still a very good time.

I spent a lot of time talking to people about ASP.NET, RIAs, scalability, Ubuntu Linux, Ruby on Rails, RSS and other non-tech subjects like how the Yankees are going to do, the NCAA tournament, sweep arpeggios and other fun things.

I might even stop by the electric guitar exhibit and do my rendition of George Lynch's "Mr. Scary". It's less impressive than "Eruption" but still fun to watch and less predictable.

(If you're lucky, you might be able to catch a sneak peek at some of the new products we're working on for

Hope to see you there!

Thursday, March 22, 2007

Web servers supporting desktop traffic

Serving traffic on the World Wide Web used to be so easy. Long gone are the days when simple web servers merely processed HTTP conversations, answering content requests with pages containing text, images, applets and dynamic script. Now with the Web being truly programmable, there's a lot more to think about in terms of total accessibility.

It's been a busy day at Camp Happy, but I've been taking a couple of moments in between breaking stories, thinking about architectures for desktop RIAs. Particularly, I'm thinking about scalability in the vein of how much additional traffic cross-platform desktop apps, desktop widgets and browser plugins impose on traditional web site infrastructure (a browser and server engaging in a stateless client-server transaction). They're essentially yet another venue to tap your site's resources, so now your server assumes the additional responsibility of pulling data for extra-browser devices and clients.

Ironically, emerging web technologies are pulling us further and further away from the Web as we've known it.

That is, unless the RIAs ship with embedded databases that in some way sync with their source (possible, but not likely), or have some snazzy means of making offline access feasible. Web services built on SOA are a good place to start - using SOAP or REST to architect and maintain servers just for devices and access points not intended for traditional web browsing.

Componentized, middle tier caching across disparate platforms and devices via properly written business objects, not just replication at the individual page level, is critical to sustaining performance and minimizing lag. Done properly, this ensures synchronization across platforms, not having modified data seen on a web page be different than that viewed seconds later on a mobile phone.

Constant consumer connectivity also worries me. Most people close their browsers when done with a web session. Even the truly hardcore exit out of the program every now and then when not actively surfing. But terminating an individual application is far more common than rebooting an entire machine. A user's desktop, being ever-resident, can mean non-stop polling for new data if a desktop RIA's running on top of it. This could result in an escalating amount of service requests or method invocations - a few users could easily tax your system without even trying, just by being connected to it.

From an app provider perspective, if you're building a desktop RIA really take the time to consider and properly use HTTP headers as a means of good data access practice, as is commonly done in well-designed RSS readers. Or, use a clever timeout setting like Digg's BigSpy, which halts incessant loading of information after a certain period of time or if the system detects dormancy. Such a pausing feature is also possible with AJAX.

Data access is changing very rapidly, and developing the APIs through which to get at information is only one part of the equation. You need to properly setup your backend to not only serve up your stuff quickly and in the right format(s), but also to facilitate a growing number of clients. And those programs living on the desktop will very soon be hitting your services with increasing - and possibly alarming - frequency.

Revenue strategy for web services

Adobe notes that its developer revenue isn't really taking off, and interviews with Amazon's Jeff Bezos indicate that he's hopeful that someday, their web services platform will be a major driving force in the company's profit plan. This is something I've been struggling with, too - trying to find that secret sauce for monetizing APIs, RSS feeds, mashup models, and JavaScript imports.

At KUAM, opening up our news data more and more to programmers naturally means we'll need to support more bandwidth to facilitate the additional data transfer from remote clients accessing our stuff. This means more costs to our overhead.

Borrowing from the volume model made popular by Google for Google Maps, I've thought about using free license keys, allowing a certain number of API calls every day. Additional method invocations against the base service would either (a) not facilitate the service thereafter, or (b) apply an overage surcharge. I've even thought about developing tiered rate cards based on similar expected volume/overage levels for high-end consumers.

What revenue model are you using for your web services?

Mashup point/counterpoint

Read/Write Web perfectly and succinctly describes the argument over the legal concerns about web scraping:
Scraping technologies are actually fairly questionable. In a way, they can be perceived as stealing the information owned by a web site. The whole issue is complicated because it is unclear where copy/paste ends and scraping begins. It is okay for people to copy and save the information from web pages, but it might not be legal to have software do this automatically. But scraping of the page and then offering a service that leverages the information without crediting the original source, is unlikely to be legal.

Desktop RIAs & digital footprints

Since the big news of this afternoon seems to be the release of an Apollo-driven YourMinis desktop RIA, I did some snooping and wanted to see the footprint left by Apollo desktop apps. So I downloaded YourMinis' 211K .AIR file, which, being cross-platform, works on a Mac or Windows box. Contrast this against some of the early desktop news readers running WPF and their system demands (typically either WinXP with SP2, 500MB RAM recommended, and the .NET Framework 3.0; or Vista).

Specifically, Forbes' impressive app requires an initial 1.4MB MSI download. I like both approaches, specifically Adobe's, being an easily distributable package, consistent with Flash. There's a slight wait, but nothing too unconscionable.

Wednesday, March 21, 2007

Tired of scraping data? Just ask!

A constant source of stress for modern-day web programmers is finding a good source of data (or two or three). Hackers constantly bemoan how their favorite sources of information don't have public APIs, web services, RSS feeds or other Web 2.0-style remote access. Thus, they're relegated to considering one of the few options we have left: web scraping. This can be tough (coding isn't simple), fickle (if the source changes its HTML, the app breaks), and arguably illegal (a form of unauthorized theft). However, Alex Iskold points out a compelling argument against the theft concern.

With more and more local web programmers getting into mashups and remixing, has become a popular candidate as a data source, since we generate a lot of content frequently. But even with the various ways we offer developers to access our data, sometimes it's not enough. I totally understand. Some people have even e-mailed me asking if they can get a view of a database table, an XML file or an Atom feed of some aspect of our data. In some cases, I've accommodated them. (Hey, making this type of stuff available gives us one more thing to show off in our RSS gallery.)

In fact, a project I've started recently seeks to develop a web UI that serves as contact form and end-user license agreement, and upon submission dynamically builds RSS feeds based on a user's specifications and filtering options. So, you could easily request certain information, inherently comply with the legalese, and have a new source of data...instantly!

(Other more specific APIs, non-standard formats or complex data types to be serialized as XML are routed to site administrators who write the SQL or data shaping to spit the data out in the desired format with the intended structure, with proper frequency.)

So next time you get stuck looking for a good raw source of data, just ask if one can be set up for you. The worst you'll get back is 'no', and it's a safer and more reliable alternative to scraping.

Where's the Linux guy in the Apple spots?

Here's an interesting take on the question of how the Apple commercials might be if Linux was represented.

Want to work with me on cool web stuff?

In addition to actively looking for a full-time web UI designer, I also routinely shop around for developers and designers for seasonal projects we have. And it's about that time. Do the work from home, knock out a task, get paid, and get on with your life. If you're in Guam or Hawaii and want the chance to do some really cutting-edge, meaningful work that gets seen by a ton of people and really expands your abilities, e-mail me at jason [at] kuam [dot] com and send me your resume and URLs to some of the work you've done.

If you can recite every single Java 2 method, property and event in reverse alphabetical order while whistling Dixie, that's impressive, but not the core skill. I'm more interested in attitude than API knowledge. I need people who are constantly creative and like solving interesting problems in innovative ways while under intense deadlines. This isn't an introductory gig, so you'll need to have a high level of technical ability and experience and be willing to bring your 'A' game to the table every day.

I'm also heading-up a group that's developing some really interesting web projects on the side...some really cool data remixing and social sharing concepts not limited to the Guam market. I'll need some help getting these off the ground, so e-mail me if you're interested in joining a really talented, diverse team.

Shying away from MSM

I came across a really engaging headline today while sifting through my aggregator, "Blogs Turn 10 - Who's the Father?", which instantly made me click on the link, but when seeing that the site pointed to CNet, my gut reaction was to close the window on principle. I'm sure the article's brilliant, well-written, properly edited and decently researched, but it just seems like there's a distinct disconnect between the quality of content found in blogs and mainstream media (MSM).

And since I'm in the news media as a professional journalist/developer, this is both contradictory and hypocritical. But that's kind of the harrowing point.

These days, I give a lot more credence to items written by non-professional writers from the blogosphere. The best pure writing I've seen over the past two years, without doubt, is from MSM authors, although bloggers more often than not make more of an impression - asking the right questions, issuing the appropriate challenges, making fair predictions, drawing accurate conclusions, and connecting resources where necessary. They just lack the grand scale circulation system indicative of newspapers, TV networks, publications and radio channels.

MSM sources also typically are slower to respond because of a need to cross-reference, fact-check, cross-promote a piece, in addition to running it through the editorial wash several times.

Take for example an article BusinessWeek did on Digg's Kevin Rose as the reluctant leader of the under-30 Web 2.0 movement. A masterful composition to be sure, but for guys like me in the know, incomplete. There were just perspectives not investigated, important history that was undocumented, achievements not chronicled, and associations that weren't made. And this was written, says the corresponding podcast, over four months. I could have gotten a lot more from Kevin's Wikipedia entry. But it is a major U.S. publication, so where it might be arguably lagging, it does give phenomenal exposure and circulation, as opposed to the relative obscurity of the blogosphere.

The scope in blogs is surely far narrower, but hits the mark more often.

(And in fairness, I did read the CNet piece after all, and it is very good, and bookmark-worthy. And despite my earlier criticism, the BusinessWeek piece is one of my favorite articles on Digg. I guess we lash out against those things we adore the most. But these are still two of the exceptions of MSM.)

Tuesday, March 20, 2007

Your favorite blogger or your evening news anchor?

Steve Safran at Lost Remote has an intriguing post about a survey being undertaken by a researcher trying to prove the correlation of an audience's enjoyment of a media platform (blogging vs. TV news) with the strength of a direct relationship with them. Since I'm on both sides of the equation, I wonder how I'd fare.

Knowing my luck, I'd cancel myself out.

Progressive publishing at

Several of you have asked me for more insight into's web operations and analytics since I wrote a couple of recent items on the topic having to do with traffic, syndication, publishing volume and growth and viewership. An interesting thing to consider, and something I discuss during tech talks, is how our publishing has gotten greater and greater in volume over time.

The chart (full size image available here) plots the time it's taken us to publish stories in blocks of 1,000 since I started in 2000 and we began doing news online. It shows how in Year 1 (when we ran FrontPage on a 14" monitor and a 33.6Kbps dial-up Internet connection), it took us about a full year to run our first 1,000 stories online. Most recently, we set an internal record by pulling off 4-digit publishing in 69 days, and we're on pace this month to do that number in just under two months. Over the last year, we've averaged 88 days per 1,000-article block.

There are several notable milestones within this timeline, such as when we migrated from a static system where we were saving physical files to disk to a dynamic, database-driven templated architecture based on ASP 3.0. We also implemented a custom content management system, built in-house, that publishes stories more rapidly, pulling them straight out of our Win32 newsroom backend. This lets us get stuff online fast, mirrored across web/SMS/WAP/RSS platforms, all in one operation, instead of the sequential authoring so common to blogs. (Before you ask, yes - we used this approach before and it drove me nuts to have to upload stories one at a time, with people reading them as you made them available. I got the idea from a BizTalk service I was using with

But of course, more than bragging about quantity, of chief concern to us is quality.

Web app scale and scalability

I'm borrowing from Om Malik's clever "Scale and Scalability" article and podcast (a must-listen), which in turn borrows, of course, from Jane Austen. I've been consuming a lot of media lately on growing Web 2.0 sites, and handling the additional traffic concerns made possible by the programmable web. And I've been contrasting such growth headaches using the LAMP stack against more formal, closed frameworks.

I'll use Microsoft's .NET as an example and then discuss open source architectures.

Outside of best practice coding and performance tuning of the web applications themselves, and aside from bandwidth considerations, most Microsoft apps can be co-hosted, with several sites per box. Despite the tidal wave of criticism against IIS, this is a credit to Microsoft's web server platform. During growth, the primary mentality of site managers overseeing ASP.NET sites with massive daily traffic is first to cleanly separate machines, with distinct boxes functioning as web servers and database servers. Under larger loads, getting a dedicated box would be the typical next step. The next tier under heavy loads would be to build out distributed processing, adding more servers (web farms) or more processors (web gardens). Features baked directly into the .NET Framework, like garbage collection and managed code, ensure automated, if not better, memory management and object disposal.

The large concern for running such architectures is cost. Closed systems can be pretty expensive to run on your own, and private hosting services, while outstanding, largely aren't as cheap as their LAMP counterparts.

In contrast, since most Web 2.0 endeavors are born of open source, garage frameworks and have limited budgets, the need to organically grow a site's capacity to process requests and serve information can be daunting (outside of theory, I'm not a hands-on server guy). Excellent resources to consult on tools and techniques are the Danga/LiveJournal whitepaper on scaling web apps running LAMP, and the podcasts from Flickr's Cal Henderson and's Joshua Schachter from the 2006 Future of Web Apps conference. Topics discussed include things like using separate servers for HTML and multimedia, memcache, database indexing strategies and more.

Although I totally love Ruby on Rails, my one criticism of it is that we don't have enough projects akin to the size/scope of Amazon, eBay, or another major-scale site running it yet. Lots of blogs, some really cool projects, but not much that gets massive traffic yet. So in my mind, the jury's still out on RoR.

But the one thing that all platforms have in common in terms of scalability management is the critical, basic need for proper planning, testing, performance monitoring, upgrade thresholds (at what baseline point does our traffic get such that we need to start buying more servers), intelligent use of caching, downtime notification for administrators (paging, e-mails, SMS), and clearly-defined escalation paths (who to contact when something goes awry).

Bottom line: don't skimp on scaling.

Softies on Rails take a look back

Jeff and Brian give a hilarious look at how ASP.NET has evolved, noting the inclusion of MVC architecture into the mix. It's funny how much I have in common with these guys. I totally need to make it out to their next training session.

Projects I'm thinking of doing

I'm mulling over several repetitive actions or web fallacies that might make nice side projects. Without actually looking around and seeing if any of these are already in existence, these would be fun to hack out. Among these:
All of these would be open source. Can you think of any more? I always enjoy working with other talented, mature programmers on such work, so get in touch with me offline and let's talk about it.

Report details MSM's community development

Right on the heels of the 2007 State of the Media Report, I'm now engrossed in the Frontiers of Innovation in Community Engagement report. True to form, it chronicles the efforts of mainstream media companies to build applications and embrace user-driven content. Nice work.

Download the 66-page PDF here.

Get up and running with Apollo

So you grabbed Apollo. To help get you started, here are some good free videos on installing, configuring and running Apollo. Slick.

Monday, March 19, 2007

Adobe unveils public alpha of Apollo

Good news for RIA devs, both existing and potential. Adobe announced the release of the first public alpha for Apollo. Grab the runtime and SDK and start hacking!

I'm warming up to membership

I used to really be against the concept of forced membership for web sites, and fought hard for anonymous personalization. (It's good to see that people have developed workarounds.) The web's gotten so membership-centric these days that it seems you can't use any Web 2.0 applications without a login. It just irked me that services couldn't be used without belonging to a system. It was lethargy more than privacy or fear of commercialization of my info - as a user I didn't want to go through the extra step of registration, and as a developer didn't enjoy validating, (re)routing and conditionally displaying content based on inclusion.

I begrudgingly wrote systems like blogs and photo galleries that used membership where appropriate. But I've since gotten over my anti-membership days, mainly out of ubiquity.

If you can't beat 'em, join 'em, I guess...

Usability decisions

I've been a voluntary rhesus monkey in enough IT focus groups to learn a thing or two about human processes and usability. I've participated in several such endeavors for Microsoft, having gotten to use desktop sharing software that would eventually become Live Meeting, and giving honest feedback on everything from the menu layout for Visual Studio to control nomenclature to the syntax coloring scheme applied to T-SQL in Query Analyzer.

So I've grokked a few things about top-of-mind awareness and user psychographics. Today I ran a draft layout for a new project we're working on in-house, observing the reactions to the layout of the controls and the logical flow of things. It's interesting feedback, and we always approach such design by considering the combined feedback of internal resources, our beta user community, and plain common sense.

Funny how simple stuff like this stalls the much more complex task of programming a lot longer.

KUAM - "The Most", v.1.1

I previously mentioned our philosophy on managed growth with our online products, trying to create winning user experiences right out of the gate that get incrementally better. Just today I implemented an additional twist on a home-mashup-turned-feature "The Most", adding another tab to the UI that shows how many times people have e-mailed a KUAM News story, in addition to the top reads, search keywords and video clips.

Thanks to some very helpful feedback (thanks Jonah), we're already mapping out v.2.0.

Basketball jones

At long last, here's the Google Video presentation of the IIAAG boys basketball championship game I broadcast the other night. Journalistic objectivity aside, my alma mater won and the coach is a friend of mine, so I'm happy for the team and the school. Read my thoughts on the game.


Sunday, March 18, 2007

Can't get my LJ fix this month

There's only one publication I actually read anymore in its physical form - Linux Journal. So you can imagine my dismay that the local bookstore for whatever reason hasn't received/unpacked/put out their March mags. They're super behind.


Old and curmudgeonly

Can someone explain to me why "Stephen King's It" was playing in ABC Family's lineup last night? Even though I wound up watching without turning away, am I the only one who finds this fundamentally flawed? What's next - Hellraiser on Nick Jr.?

Geez, am I already that old that things like this matter to me?

IIAAG 2007 boys basketball championship

I did color commentary during our live broadcast of the local high school boys championship basketball game last night, wherein the Sanchez Sharks met the defending champ FD Friars. It was a good game, with Sanchez holding on to beat FD by 7. I'm getting a lot of e-mail from people asking me when we'll stream the game...and that's coming.

I'm uploading the entire 90-minute show to our Google Video channel, but the 317MB file may take a while to pass approval. By Monday, for sure, so keep checking back and I'll have something up when it's ready!

* UPDATE: I just finished uploading and it's pending Google approval, which normally doesn't take too long. Hang in there!

Alan Graham on designing for Web 2.0

Alan Graham continues his excellent series on Web 2.0 design strategies for ZDNet. I enjoyed and Dugg his article, but respectfully disagree with one of his conclusions. He noted in Part 1 that architects should avoid the Perpetual Beta:
I like solutions that know where they are going, and get there. Period. It was a clever little marketing ploy using the "beta" phrase once upon a time…now it is just annoying.

This was a major factor in Tim O'Reilly's seminal writing on the topic, which I took at the time, and continue to believe, to be about the philosophy of continually improving your software by making rapid modifications and upgrades - and, being on the web, implementing them instantly in the background rather than redeploying.
The open source dictum, "release early and release often" in fact has morphed into an even more radical position, "the perpetual beta," in which the product is developed in the open, with new features slipstreamed in on a monthly, weekly, or even daily basis. It's no accident that services such as Gmail, Google Maps, Flickr,, and the like may be expected to bear a "Beta" logo for years at a time.
Understood as being a principle of software design and QA, this is critical and can't be dismissed as being a bad thing. I don't believe the "beta" phase should refer to marketing efforts, promotional tactics, or versioning; it's about developing good products that continually get better. The legend of Flickr (re)deploying every 30 minutes is an example.

If hiding beneath the guise of perpetual beta work means making excuses for touting shoddy programs with buggy code that perform poorly, I'm with Alan.

Tour the Apple Store in Second Life

Having been to the 24-hour Apple Store in New York City, I can tell you that this Second Life video is dead-on accurate.

I just don't get Twitter

I might be the goofy minority, but I haven't jumped full force on the Twitter bandwagon like a virally growing segment of the online community. Half-blog/half-IM...all odd to me. Telling people where you are, what you're doing and who you're doing it with is like encouraging legalized stalking. I'm glad I'm not the only one.

Granted, there are some sound, useful applications for it, like for breaking news or big events.

Maybe it's because I'm largely sedentary. Maybe it's because I'm personally introverted and value my privacy. Maybe it's because I'm in the news media business in a small market, so I know pretty much everything that's going on most of the time anyway. Maybe it's because the group of people with whom I communicate most are already on a closed cellular network.

Maybe I should just try the damn thing already.

Comment spam envy

TechCrunch is getting hit by comment spam 15,000 times a day, reports Michael Arrington. Geez. Before I implemented a custom CAPTCHA on my blogging framework, at its peak I usually got slammed several hundred times daily, and once 2,000 times in 24 hours. That was an unwelcome surprise.

Popularity does have its downside, I guess...

News sites prefer syndication over customization

I've been doing something I don't normally do lately: reading slowly. I've been poring over the phenomenal State of the Media report, absorbing each line in great detail, letting every precept and conclusion soak in like fine prose. (Of course, being a good journalist I'm taking everything with a grain of salt, but it is fine work to be appreciated.)

It comes as no surprise, in the healthy discussion on digital journalism, that for news web sites, syndicated services are trumping user customization/personalization in terms of dominant distribution features:
Apparently, for now, the ability to have content sent to you, or to find what you want, is taking precedence over letting people make a page theirs.
This isn't shocking in the least. RSS/Atom, being text-based, is really easy, inexpensive, templated, low-maintenance, platform-agnostic and low-tech, and can be up and running in a matter of minutes. And the number of outlets, third-party tools and service providers offering managed platforms for syndicated content grows each week. Plus, there are a growing number of companies doing work in the RSS analytics space, as well as those making headway in the in-feed ad insertion market. So the technology can be picked up by anyone and implemented on practically any web server; or there are a plethora of places to get help.
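To illustrate just how low-tech it is: a complete, valid RSS 2.0 feed is only a handful of templated elements (the channel details and URLs below are placeholders), which is why any server-side stack can emit one in minutes:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://www.example.com/</link>
    <description>Latest headlines</description>
    <item>
      <title>First headline</title>
      <link>http://www.example.com/stories/1</link>
      <pubDate>Fri, 30 Mar 2007 08:00:00 +1000</pubDate>
    </item>
  </channel>
</rss>
```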

Contrast this with page customization, which in the context of content-heavy news destinations is a fairly heavy undertaking, to say the least. It makes for a serious software development project, likely rife with advanced JavaScript to enable a high-end presentation layer. An advanced web framework like ASP.NET 2.0 might have a portal API that can power a widgets container, or you can build a custom drag-and-drop UI like The LA Times' personalized portal to enable personalized layouts or an array of default content displays, but that's some serious, costly coding and equally uncheap IDEs.

So while the personalized experience is a feature either not in wide use, or only at the moment a luxury of major market corporate entities, it is growing. But RSS is here, and is formidable.

Emceeing the Miss Guam Tourism Pageant...again!

I was fortunate enough to have been asked to emcee the 2007 Miss Guam Tourism Pageant. I emceed last year with Sonya Artero, and it was great fun. It's on April 28, so make it out to the UOG Field House and check it out!

You can watch our interview with '06 MGT Risha Aguon.

Moderation blues

I hate censorship. Were I old enough at the time, I would have testified at the PMRC hearings, right alongside Dee Snider, Frank Zappa and John Denver. (I was 11 in 1985.) My mood this morning is fittingly as blue as the Blogger theme I've applied to this site...which is sadly about the only measure of control I have left these days.

Since I started, I've waged an escalating battle with my blog's producers over the level of feedback allowed and how closely I can interact with those who read my stuff. Despite this being my personal space and not being affiliated with KUAM, invariably it gets linked back to my station and what I do for a living as an anchorman. And I don't mind producers going through comments in the interest of quality control, and, to no small degree, safety.

Feedback my producers have deemed questionable has gotten to the point that they've decided to turn off anonymous commenting and go back to moderating feedback again, rejecting questionable comments (this happened once last year, and I fought to get the policy removed). Bottom line: they're now deleting more comments than they're allowing to remain. I welcome feedback and appreciate criticism, but it's apparently a bit much for the people whose job it is to make this a safe and happy place to be.

This isn't the result of one particular post or a specific user...just the overall tone certain people use, the relevance (or lack thereof) to the subject matter of the posts, and people's seemingly unending curiosity about personal details and schedules...and utter rage when I don't divulge such.

I wish I could more directly control it, but it's for the best. I've argued long and hard to allow those comments that make it online just to get there in the first place, so believe me when I say I'm with you on this. Consider this blog quasi-read-only for now...and do continue to leave comments and I'll do my best to get them published. To the 99.997% of you that do behave, I apologize and do appreciate your patience.

A few people pissed in the corner of the deep end of the pool, and now everybody's got to get out. So if you know me, you know how to get a hold of me.

That's show biz, baby. :-)

Saturday, March 17, 2007

Fixed searchbar for IE7

Internet Explorer 7 was having some behavioral issues (what a surprise) with the KUAM searchbar I implemented earlier this week. The parameters weren't being passed to the URL, so I wrote a different XML file than the one used by Firefox 2 to manage the bar in IE.

IE apparently expects the parameters right in the URL template, while Mozilla uses separate elements to contain the name/value pairs making up the query string.
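As a sketch of the difference (the search path and parameter name below are placeholders, not KUAM's real endpoint): IE7 reads an OpenSearch description with the query string baked right into the Url template, while Firefox 2's MozSearch variant pulls the name/value pairs out into separate Param elements:

```xml
<!-- IE7: OpenSearch description, parameters inline in the template -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>KUAM Search</ShortName>
  <Url type="text/html"
       template="http://www.example.com/search.aspx?q={searchTerms}"/>
</OpenSearchDescription>

<!-- Firefox 2: MozSearch variant, name/value pairs as separate elements -->
<SearchPlugin xmlns="http://www.mozilla.org/2006/browser/search/">
  <ShortName>KUAM Search</ShortName>
  <Url type="text/html" method="GET"
       template="http://www.example.com/search.aspx">
    <Param name="q" value="{searchTerms}"/>
  </Url>
</SearchPlugin>
```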

Which is better? You be the judge. I'm just glad I got my new toy working.

Friday, March 16, 2007

How to Bluff Your Way in Web 2.0

Here's the talk that's got everyone talking. It's a great tongue-in-cheek stab, presented at this year's SXSW, at the (non)sense that's become of Web 2.0. Perfect timing, given that people are saying the bubble's now bursting.

(I was hoping for a podcast version of "Why XSLT is sexy" myself, but no luck yet.)

KUAM.com on Rails

...well, not really.

Guam's news network will be using Rails: I got the green light this morning from our executive producer to start hacking on a blogging app highlighting the behind-the-scenes goings-on at Camp Happy (think allDay). I'm stoked because it'll be the first customer-viewable web app we put out based on Rails. To date, we've cobbled together minor admin utilities, but nothing I can link to. We'll be putting it on a different domain, which means a new application space and the freedom to deviate from Microsoft architectures.

It's not like JasBlog, which I built in ASP.NET 1.1, or the mini-app I did for our election season last year. (The URL mapping's hella easier in Rails, letting us lean towards RESTful functionality.) I'm not going to attempt to set any landspeed records, although I know guys that have whipped out entire robust blogging apps during a lunch break.

More to come!

Best commenting systems

One thing I've noticed has been the micro-innovation within the blog commenting space. Some developers are getting pretty clever with displaying the feedback their posts generate. This makes reading the comments almost as much fun as the main article itself.

The best examples I've found are those on Softies on Rails, Digg, and the Django Book beta site. Many of them leverage JavaScript effects libraries to do DHTML wipedown animations and such. Very impressive.

What cool, outside the box commenting systems have you noticed?

Thursday, March 15, 2007

You don't need people to make your apps social

Being Dugg. The Slashdot Effect. Having your site Farked. All facets of the network effects that are indicative of, but not exclusive to, Web 2.0 applications. This is the sensible result of the impact community involvement directly has on new media services - allowing users (registered or otherwise) to access, remix, share or interpret your core data in new, innovative, interesting ways.

So while the "Social Web" continues to be all the rage and legions of new sites pop up daily and existing media companies creep towards adoption of emerging paradigms with tons of phenomenal features centered around viral user interactivity, I'm reminded of the fundamentals of network effects, and what makes such services work in the first place. So consider this a lesson in pragmatic technical marketing next time you start pondering how to evolve your online projects.

I realize that not everything's got to have a friends list, AJAX-laden commenting systems, support for one or more IM clients, or tagging to the nth degree. The real magic is in delivering a positive user experience so that growing numbers gravitate towards and frequently use your stuff, promoting it communally.

Recently I implemented a little feature on KUAM.com exhibiting the content of ours that people are consuming with the greatest frequency - "The Most". Similar lists have proven hugely successful across media platforms - from the old radio countdown shows, to Letterman's nightly Top 10 rundowns, to VH1's infinite collection of ordered views of modern day pop culture. So it became obvious to me that expressing such a popularity contest through hypermedia was an opportunity that needed to be exploited. Consequently, we've been able to build a winning social service...without people.

The Most has been received extremely well, has generated positive feedback and has boosted our web traffic, so in that regard it's an instant hit. As a result, our video downloads have tripled. People are re-reading articles and bookmarking items they've already gone over. Users are checking back to see if their search keywords generate more and different results than with previous queries. It's a network effect of a different variety - the "Wow...people are checking this out...it must be good!" upshot. There's no direct user interaction with our lists, but they're the exclusive result of people's browsing behavior.

In other words, we've capitalized on fostering a time-honored marketing device: good ol' fashioned community hype.

The point of this all is that sometimes the most effective solutions are the simplest. And on today's Web this so often gets lost. Developers insist on having bark overshadow bite. Granted, KUAM.com is a well-rooted, existing online property creating and distributing content in a proven industry (news) - not the latest player in the fickle Web 2.0 space. Admittedly, our solution by today's standards was arguably low-tech. With our little creation we haven't wowed anyone from O'Reilly, appeared on Digg's front page or secured a spot atop TechMeme (themselves canonical exhibitions of the power of Web 2.0 network effects), but it works for our users. It was easy to incorporate such a feature into the flow of our existing style of distributing news information without forcing our audiences to learn a radically new UI, or impeding our site's performance. We tapped existing data sources and reshaped the information so that it could be viewed in a new and different way.

Plus, this puts about 40 additional links on my homepage, which is better for SEO. And, more importantly, it gives individual stories one or two more precious hours of primetime exposure.

So give some thought to the effectiveness of your development next time you map out some cool hack. Don't get me wrong - I'm all about next-gen programming, but of greater concern to me is creating the best possible experience. If in certain situations this means scaling back from using high-end techniques and technologies, I've got no problem with that.

Wednesday, March 14, 2007

The Long Tail of Time for news sites

(Get the image at full-size here)

I got some very positive feedback from my blog post on how we've interpreted The Long Tail for news operations. I ran this by Chris Anderson (you should totally buy his book), who echoed my sentiments about The Long Tail of Time - the consumption impact for news stories available through homepage promotion, Digg-esque top-of-stack queuing or various other content delivery methods. This basically justifies the common sense assumption that items with a higher sense of immediacy and promotion will be consumed more heavily. But it also demonstrates the belief that a larger inventory of content helps your overall traffic by constantly showing demand for smaller pieces of lesser individual impact - demand that in the aggregate can actually be greater than for the stories in the head.

I'm prepping some slide decks for a tech talk I'll be giving soon (more on this forthcoming), so I mapped this out by tracking the stories we've produced that people are reading on KUAM.com over a few hours. This illustrates the concept as applied to the news industry, and I assume for other high-volume web publishers as well. Obviously, with sites larger and more consumed than ours (a local small-market affiliate in Guam) the numbers would be proportionately higher. But the trend itself should be consistent.

How are you applying The Long Tail in your industry?

So, Microsoft...where is the Web going?

I got an e-mail announcement for MIX07, at which Ray Ozzie will give a breakdown on what the future holds for the Web and how developers can get involved. Interesting. Seeing as how Microsoft's notable lag with many emerging platforms a couple of years ago (e.g., podcasting, RSS, Web 2.0, AJAX) was the major motivator that led me to embrace open source, I'd be interested in going and listening.

Hell, there's a free copy of Vista Ultimate for attendees.

Tuesday, March 13, 2007

State of the Media

There's a really good report up about the state of the news media, broken down by platform. Very exhaustive, and very good. (When the executive summary's 38 pages, you know it's pretty heavy.)

LCDs, Mac monitors better with DV cams than CRTs

I'm no hardware guru, but since I work for a TV station, I've been able to glean a few things off our videographers. One such tidbit is the fact that LCD monitors, and for some reason, Mac OS displays, work better with most video cameras and DV cams than do CRTs on Windows.

After years of messing around with shooting techniques for web pages, applications and such, and battling the irritating tendency of CRTs to produce vertically rolling lines when shot for TV, we came to the conclusion that it's the Windows refresh rate. Messing with the settings and synching a camera with the refresh rate of the monitor (60 hertz works best, in my experience) for the most part eliminates the lines. But I've never had such problems with our OS X machines, or the 22" Acer LCD display I've got for my XP laptop.

We have yet to test this on Linux.

Monday, March 12, 2007

MioMediaBox - offline RSS reader for PSP

I messed around with the freeware MioMediaBox from MioPlanet this weekend, after looking for a good offline RSS reader for the PSP. The app has a small footprint on your desktop PC, and synchs with your PSP, saving an abstract of each item within a feed as an image; you can set the image quality to save drive space. It's a method of content delivery I certainly admire. Think of Perooz.

It's very clever, and does work, but the limited amount of content that's squished within a single JPG doesn't make for a very good experience. Most of the syndicated content I subscribe to can be pretty verbose, so the truncated content gets lost in the mix. It's nice for quick, one-off reading, but full-on RSS will hopefully be built into the next version.

Really not liking IE right about now

I sussed out a nagging bug this morning in a module I added to KUAM's homepage last night, which I previously wrote about. The bane of my existence was a behavior that only acted up in IE, forcing the middle tab to screw with the content, placing it at the bottom of the DIV. Since the middle tab was vertically the tallest, I concluded it might have something to do with the number of items, and reducing them did the trick.

This sucks, because it worked just fine in Opera, Safari, Konqueror, et al. So we had to sacrifice the potential data we could have displayed to get the UI right. Dang.

KUAM does widgets

I announced today our new formalized widgets library...which at the moment is a series of web services, remote JavaScript imports, and RSS feeds. More importantly, this introduces the Google gadget I built a few weeks back but had yet to promote, as well as the browser searchbar for IE7 and Firefox 2 that I cobbled together yesterday.

The start of the day is good!

Thursday, March 08, 2007

Hack-o-the-year: Google Maps zoom

This is really cool...add some arguments to the query string in Google Maps and get super zoomability.

Wednesday, March 07, 2007

Mashup - KUAM: The Most

What better provider to create mashups from than yourself? A fun little sub-app I've been hacking on for the past couple of weekends is almost ready for primetime. KUAM - The Most highlights the sections of our site that people request with the greatest frequency. It's basically a multi-tabbed UI, displaying lists of our most read stories, our most searched keywords, and our most watched video clips. I'll most likely deploy this as a pluggable IFRAME module, so that it can run safely in its own little security sandbox, and reload without imposing any more data access concerns on pages already using AJAX polling.

The interesting thing about this is that as a developer, this was a neat challenge. The data for the first tab (most read stories) was a simple web service call-&-cache operation...nothing too intimidating, but computationally the most intense process of the three. The second tab (most searched keywords) is the result of reading an existing RSS feed, which is quick and simple.

The third tab is a doozie - I'm reading-in link data from a password-protected stats page on a remote server. This meant performing an authenticated screen scrape and filtering out all the A tags with a pretty heavy-duty regular expression. Because the stats themselves are a descending rollup list that doesn't necessarily include currently playing video clips, I had to compare each one against a roster of available clips, evaluate them, and display only those.
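A rough sketch of that third tab's plumbing in Ruby (the URL, credentials and regex here are illustrative placeholders, not the production code; 2007-era open-uri supported HTTP basic auth directly):

```ruby
require 'open-uri'

# Fetch the password-protected stats page; the URL and credentials
# passed in are placeholders, not the actual endpoint.
def fetch_stats_html(url, user, pass)
  open(url, :http_basic_authentication => [user, pass]).read
end

# Scrape every A tag's href and text out of the page. A blunt regex
# is workable on a machine-generated stats rollup, though it wouldn't
# survive arbitrary hand-written HTML.
def extract_links(html)
  html.scan(/<a\s+[^>]*href=["']([^"']+)["'][^>]*>(.*?)<\/a>/im)
end

# The rollup can list clips that are no longer online, so keep only
# links that match a roster of currently available clips.
def playable(links, available_clips)
  links.select { |href, _text| available_clips.include?(File.basename(href)) }
end
```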

Each tab uses its own sliding caching policy based on the implied expiry nature of the underlying data. It was a little Sunday idea I had that turned into something fun and functional. Look for it soon on KUAM.com!
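The sliding policy itself is a small idea: every cache hit pushes the entry's expiration forward, so frequently requested data stays warm while idle tabs fall back to their source. A minimal Ruby sketch (the per-tab TTLs would come from each feed's implied expiry, which this placeholder class takes as a constructor argument):

```ruby
# A minimal sliding-expiry cache: each read pushes the entry's expiry
# forward, so data stays cached as long as it keeps being requested.
class SlidingCache
  def initialize(ttl_seconds)
    @ttl = ttl_seconds
    @store = {}
  end

  # Return the cached value for key, or rebuild it via the block.
  def fetch(key)
    entry = @store[key]
    if entry && Time.now < entry[:expires]
      entry[:expires] = Time.now + @ttl   # slide the window on every hit
      entry[:value]
    else
      value = yield                       # rebuild from the source
      @store[key] = { :value => value, :expires => Time.now + @ttl }
      value
    end
  end
end
```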

A-ha! A convert!

I've fostered my first Linux migrant, getting my best friend to try Ubuntu on a live CD. He's a longtime Windows systems guy and Microsoft consultant, so while having an open mind, he's not the type that easily embraces new IT things if they don't have Redmond roots.

Bottom line: he fell in love with open source. He marveled at the phenomenal performance he got out of hardware that only decently ran Windows. He really dug GNOME, and I passed him some YouTube clips of Beryl doing its freaky thing. He couldn't believe the choice and wealth of apps in Synaptic. He's already freeing up disk space for a partition on which to install Edgy Eft permanently, and then run Win32 apps through Wine.

Mission accomplished. One down, more to go.

Coaches I'll never bet against

March Madness is a classic time, and I'm reminded why, despite their teams' respective records and outside any controversy, pressure or drama, I will never count out several coaches. These are people whose ability to get players to buy into a team concept, play beyond themselves and achieve something truly great is unprecedented. That having been said, it's also worth noting that my undying loyalty to them is a prime reason for my bracket dismay over the past few years, too.

Why mainstream media shouldn't go social

There's been quite a buzz of late after USAToday redesigned their site, but more importantly introduced a deep, slick array of social interactivity features. It's really impressive in the round, but personally, I'm not down with the whole 'comment on our story' option because it obviously opens up a Pandora's box of anonymous flaming. This I have a problem with.

My station does some work in the social space, but not with news. And this is deliberate. Now don't get me wrong...as an interactive producer for a high-end local news site, I'm all about expanding access to information. But the quality control element is a really hard cog to implement and sustain such that people won't be acting inappropriately in your threads. The unavoidable chance that users, even registered ones, will post inappropriately is too much of a risk to your core competency.

Locally our newspaper, the Pacific Daily News (a great company with whom we've collaborated in the past), did this by implementing phpBB to manage a message board system integrated with its stories, but it's had its share of problems. Case in point: today's coverage of two soldiers who were killed generated a slew of useless comments by handle-safe morons nitpicking whether the cover story drawing of where the men perished was accurate in depicting Guam's distance from Africa. The Florida Sun-Sentinel has also been targeted for its similar use of a message board, with people posting sexist, racist and generally unrelated afterthoughts. I disagree with such a system because it unfairly taints the story and ruins the work of those who put it together. And most importantly, it degrades the user experience for those who actually give a damn.

There's also a lot more at stake. If a garden variety blogger gets ripped by an unscrupulous commenter, it's of little, if any, consequence. If a major organization gets lambasted by enough people, it's a different story entirely.

It's inevitable that we'll spell something wrong or make a minor error - an oversight that can be rectified quickly. But it'll generate a huge number of people citing it with varying degrees of professional candor, taking away from the purpose of the content. I've noted headlines having spelling snafus every now and then. It happens, so we fix it.

We're big and widely distributed enough that if people feel strongly enough about our issues, they'll create their own BBS, and we've got a presence indirectly through deep linking in the blogosphere and personal pages. Think about it: if the major news networks felt so strongly about opening up the lines of communication, wouldn't they have done so already?

The reality is that organizations of the mainstream caliber, when implementing any interactive feature, are going to get a ton of users registered and a tidal wave of usership. That means a big undertaking in terms of realtime moderation. Maintaining order on blogs is tough enough...putting such QA into practice within the stream of high-volume, rapid-production news is really tough.

Admittedly, we have gotten a lot of feedback from users asking when we'll introduce such a commenting system. My answer for many years has been simply: "We won't". We want everyone to have a positive experience, and not lose focus on the story itself. Maybe in the future I'll add something similar, but not with today's web technology and not with so much to lose with such minimal gains. Only time will tell whether a similar fate befalls USAToday with its comment system. I truly hope they break new ground and show us this can work, safely and to the benefit of their users.

I'll be watching.

Never underestimate your site's back catalog

They say in the news biz "you're only as good as your last story". Maybe. Or perhaps what makes a content-rich site successful is its ability to harness its library of information to be seen by more people in more ways, more often.

One thing I've been tracking in painstaking, anally-retentive detail for the last several months is how users use stories on our site. One thing that's stood out is the tendency for people to make use of our back catalog - those stories not immediately available through our wide gallery of online properties. Such activity stands as a perfect application of long tail theory towards web metrics.

People, through electronic access means, react very positively to the way we publish news, meaning they scamper towards new items as soon as they're made available. This is no doubt due to the fact that we push synchronized access to our data through RSS, mobile access, web service/JavaScript import, e-mail and SMS alert messaging very heavily. So users get notified in a variety of formats via a plethora of digital devices as soon as we publish new stuff - and as quickly gravitate to it. But I've also noted a significant trend.

There's a distinct window in which new stories are dominant, in terms of their use. By this I mean content immediately accessible through the normal web channels (primarily our homepage) is overwhelmingly the dominant form of information being requested from our database. We publish throughout the day, and whenever possible do so in batches so people will have lots of new content to look at, share and use. Despite the fact that the stories published on our site get pretty consistent traffic around the clock, the day's current headlines are hit really heavily for only about 2 hours from their time of publication. (Thanks to time zone differences, there's no real peak usage time for KUAM.com.)

After that point legacy stories make up the majority of requested articles, with a much more elongated distribution. These older stories don't see the request frequency of the current items, but vastly exceed them in the total aggregate volume of pages being viewed.

For example, last week we published 110 stories, constituting the whole of the content immediately available off our homepage, on our mobile portal and in our RSS feed. However, during that period more than 1,400 different stories from our site were read in the aggregate. The vast majority of these (77%) were from our back catalog, requested by people using Google, watchlists, our internal search tool, deep linking, remoting clients, aggregation sites, etc. That's pretty amazing exposure, considering all we did was allow people to read our stuff and that we're not promoting it any more aggressively. Or, in most cases, at all.

This makes perfect sense: people read the latest news right away in bursts and then go scrounging for related/relevant previous items, cross-referencing to their hearts' content. Fresh items get eaten up ASAP, but it's the legacy collections that allow our site to survive. When no new information is being put online, older content thrives, statistically speaking.

This usage pattern is one of many facets of our exposure and marketability we're coming to understand and leverage in our business plan - both to competitively serve information and around which to develop revenue streams. So your site's last story, perchance, isn't the secret sauce after all.

Maybe it's the whole of your previous information that makes it a winner.

Tuesday, March 06, 2007

MSM blogs aren't better written, they're better edited

One thing that irks me is when people knock posts by non-professional bloggers, ripping on them for having a less than acceptable writing style. (And I'm a mainstream guy.) Here's the thing: bloggers who represent mainstream media (MSM) news companies, syndication houses, wire services and the like don't necessarily have better writers, they have better editors.

More often than not, the editorial function is the one missing from new media blogs. The painstaking process of fact-checking, catching grammatical and spelling snafus, and citing references is usually handled by another person, which most NM bloggers don't have. Or want, for that matter.

Brave new world, man.

Screenshots of Ubuntu 7.04

I've been quite the Ubuntu evangelist this week, giving out copies of the Dapper Drake live CDs I got from ShipIt a couple of days ago. Here's a cool sneak peek at Feisty Fawn, with mention of Canonical's Herd 5, with screenshots. Nice.

Scoble, I got your back

Ya gotta love Scott Koon - if nothing else, he's a man of great conviction.

I read with great interest Scoble's defense of his writing style and topics of choice, sans code. Lazycoder (Scott) is giving ol' Rob's blog the digital heave-ho from his aggregator due to the fact that Scoble's posts don't include programming examples. I'd be lying if I said this thought hadn't passed my mind on several occasions for certain sites, both for mainstream and new media bloggers.

Scoble isn't a coder, but he's got a solid technical background and upbringing with computers. He worked at Microsoft. Some of that had to have rubbed off on him. Michael Arrington is admittedly a lawyer, not a dev - but he's learned, rock solid and reliable. You ever listen to Jason Calacanis talk tech? He can more than hang, even if he's not personally authoring a new compiler. And Chris Pirillo is about as tech as you can get. He freaking organizes Gnomedex! If Bill Gates actively blogged today, would you deny him a subscription? I'm not sure if he writes software every day anymore, but you can't overlook his contributions to development.

Now in defense of Scott - I've interacted with him on many occasions, and he's a cool cat and a valued resource. I don't write nearly as many code-heavy programming articles as I used to, and it's cost my blog many subscribers; I think he's mentioned this to me offline in the past. Scott's got tremendous theories on tech, which he shares very strongly, and his posts avoid reader apathy. He's a blogger I like.

Because I swim in both ends of the pool as a professional software developer and journalist, I get harangued about this a lot. I'm thought to be too much of a talking head to be tech enough for the geeks; I'm way too binary for mainstream audiences. As a news anchor, the natural criticism against me is that I talk about stuff I don't know nearly enough about. But there are those who see past this implied shortcoming and get at the real gem - the content, how it's shaped, how it's made relevant, and how it's delivered. Not necessarily the underlying qualification about the material, or lack thereof.

My point is that I enjoy reading, listening to, and watching the opinions of those mentioned above on technology because they get the job done. I know they're not "tech". They understand the material enough and ask the right questions. And sometimes, not being knee-deep in this stuff spares them the myopia of dev diehards who get too wound up in their elitism about actually being involved in the process to comment lucidly on the matter at hand. (Again, a trait I'm often guilty of myself.)

If my enjoyment of such stuff makes me less of an informed reader, then ignorance is bliss. So if you're like Scott, unsubscribe from blogs like Scoble's, Chris' and mine if you like. That's your prerogative. And your loss.

Amie Street is a really good idea

I remember thinking that Pandora (and eventually its indirect offspring) were the next big thing in distributing digital music since the iTunes craze. I'd heard about Amie Street and just checked it out thanks to a TechCrunch link to a new BNL album. Freakin' amazing.

As a programmer, I admire the concept of pricing a DRM-free, full-access MP3 proportionately as it grows in popularity - but never higher than $0.99, so as to remain competitive with Apple. Total pat on the back for the team that put this together.

A day spent in the nude

I misplaced my mobile phone this weekend while attending a rosary (an exchange of my-phone-for-your-plate-of-food gone awry), and it was, to say the least, weird. For the first time in 15 years I was unconnected. I'd never felt such simultaneous liberation and helplessness.

I got my unit back, so back to the grind.

Et tu, dumbass?

I realize what a hypocrite I've become. One of our junior producers pointed out, after reading my post on my concerns for productivity due to youngsters' incessant use of social networking apps like MySpace, Facebook and YouTube, "Nice post...kinda reminds me of you using Digg, TechMeme and Technorati."


Sunday, March 04, 2007

Put your Mac to sleep via e-mail

Here's a neat little way to put your Mac to sleep by sending it an e-mail using Automator and a little bit of AppleScript. Sweet.

Gonzo is Agent Smith

This is truly amazing. Ya gotta love it.

Practicality of RSS feeds for 'Top 10' lists

TalkCrunch in December 2006 questioned why Digg's then-new feature set didn't generate an RSS feed for its list of the most popular submitted stories in its queue. Kevin Rose responded logically by saying that RSS feeds are created dynamically for each topic and content category anyway, and that such a list would often just replicate those entries with the items moved around. Michael Arrington quickly pointed out that other sites do this, but only by listing items as they newly crack the queue. It makes for an interesting design decision.

I'm running into the same question at the moment, too. I'm building a consolidated tabbed UI for a 'best of' list for content on KUAM.com. One tab will show our most read stories, another will display the most-used search terms, and a third will rank videos by the frequency of their access. I'm quite sure I won't use an RSS feed, given that we don't produce nearly the volume of content Digg does, and our existing RSS feeds are already ordered by publication date, which is a bit more intuitive.

But, you never know. I may very well spit one out, and if people find it to be a sticky service, it'll stay.
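If I do spit one out, the plumbing is trivial with Ruby's bundled `rss` library. A sketch - the stories, URLs and feed metadata below are made up for illustration, and in practice the list would come from our page-view stats:

```ruby
require 'rss'  # ships with Ruby

# Hypothetical data - in reality this comes from our most-read rankings.
most_read = [
  ["Local story one", "http://example.com/news/1"],
  ["Local story two", "http://example.com/news/2"],
]

feed = RSS::Maker.make("2.0") do |maker|
  maker.channel.title       = "Most-read stories"
  maker.channel.link        = "http://example.com/"
  maker.channel.description = "Our most-read stories, ranked by page views"

  most_read.each do |title, link|
    maker.items.new_item do |item|
      item.title = title
      item.link  = link
    end
  end
end

puts feed  # the finished RSS 2.0 XML
```

Note the items are ordered by popularity rather than publication date, which is exactly the design wrinkle Digg was wrestling with.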

Muppets make their rounds at Disneyland

I ran across this video clip of the animatronic likenesses of Dr. Benson Honeydew and his lovely assistant Beaker in Disneyland...freakin' amazing. Just two weeks ago I saw a show on Discovery about Lucky the Dinosaur. Man, when I was there the most impressive thing was the Country Bear Jamboree.

My, how far we've come.

MySpace'ing: privilege, not a right

These days I feel more and more geriatric. 33 is still a few weeks away, but one of the battles I've been forced to fight often over the last year is limiting junior staffers' access to social networking web applications. I was forced to threaten to terminate one intern because of her reliance on - dare I say even addiction to - MySpace, which honestly shocked her. In her mind, I was infringing on her civil liberties.

We're a pretty liberal organization, and being in the media, we don't limit time spent on cell phones, checking e-mail or most other types of communication. But it's gotten so damaging to productivity at times that I've almost been forced to implement software blocking MySpace, FaceBook, YouTube, and the like. Even people working on web projects, whom I encourage to browse as much as possible, have their focus misplaced. Shoot, when I was just starting out in the real world, I was allowed 2 personal phone calls per day or I got written up.

I realize that the younger crowd that's grown up knowing a Web filled with collaborative services, deep multimedia and rich UIs - nearly all largely available without cost - might not appreciate that engaging in such activities takes away from their daily responsibilities. Forget the problems we've had in the past like porno in the workplace...this is a much more pressing issue because it's inherently legit.

Maybe it's just a lack of professionalism to not self-police. Maybe amidst the fun they can't see that it's distracting them from their work. Maybe I need to be a better supervisor.

How are you handling unproductive behavior in your organization?

Saturday, March 03, 2007

AdSense - good, but not as advertised

Something that's been sticking in my craw lately is online revenue opportunities for niche-focused, small-market operations like mine. KUAM is a local news affiliate in Guam, so our target demographic is very concentrated, relatively small, and without a great deal of growth or turnover. It makes for quite the finite audience.

With banner ads in the traditional sense of the word being a dead medium, the next obvious solution is Google AdSense. We implemented the technology on our news detail pages several months ago, and the revenue derived has worked out quite nicely.

With me being a one-man web shop, all development is done in-house, so KUAM's biggest operational cost is my salary. My pocketbook aside, I've been able to create a very lean operation over the last 7 years, so the money we generate from displaying Google ads and generating clickthroughs more than adequately covers our costs for bandwidth and hosting. But it's not enough, and the returns are diminishing, hence my disproportionate concern.

Here's the problem: since our audience level is relatively static, the same subset of people looks at our stuff every day. (International events like visiting dignitaries or natural disasters always spike traffic, and we occasionally do a piece on an issue affecting those beyond the local market, but for the most part usership is predictable.) With the displayed AdSense ads being generated dynamically by Google based on the context of the stories in which they appear, at some point the same ones are going to keep popping up. People aren't going to click on massage services or sailboat repair links time and time again.

That leads to the next concern: there are a lot of cool ads that run, but since "Guam" is mentioned so many times in our articles and our context is somewhat constrained, ads tend to lean toward that type of content only. And there aren't a huge number of local clients strategically placing their URLs with AdSense, so the pool of relevant choices isn't that deep. We get a lot of repeaters by default, and the clickthrough activity they generate expectedly wanes over time.

Fortunately, Google's revenue-sharing model uses an algorithm that lets content providers partake of money derived both from clickthroughs and from the number of ad impressions we serve. So while it's unconscionable for us to alter our journalistic style to include certain phrasing as a means of attracting fresh ads and avoiding the repetition issue, we've learned to benefit from simply doing more stories. (Which we have been anyway.) Being a quick-moving news operation, we generate loads of content every day, meaning more page views on which to impress Google's ads. So it still works out for us.
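Back-of-the-napkin, the two-sided math works out like this. All the numbers below are hypothetical, for illustration only - not KUAM's actual figures:

```ruby
# Hypothetical numbers only - not KUAM's actuals.
def estimated_monthly_revenue(impressions:, ecpm:, clicks:, cpc:)
  # eCPM = earnings per 1,000 ad impressions; CPC = earnings per click.
  (impressions / 1000.0) * ecpm + clicks * cpc
end

# More stories -> more page views -> more impressions,
# even when per-user clickthroughs wane over time.
puts estimated_monthly_revenue(impressions: 300_000, ecpm: 0.50,
                               clicks: 800, cpc: 0.25)
# => 350.0
```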

This is but one concern that operations like mine have with the AdSense model, and one of the workarounds we've developed to make it work for us. We're a fairly large site serving a surely small userbase with a low growth factor. Despite our best intentions and most precise execution, some personal blogs generate more than we do from AdSense. That's one of the truisms of the web economy that's equal parts harsh reality and liberating freedom. Nonetheless, Google remains a valued partner and an integral way we do business.

How have you used AdSense, and how are you profiting with it?

This is for my homies (of podcasting)

I was just responding to blog comments about where past episodes of the podcast I previously produced have gone (they're deleted, so don't bother...). This made me think: practically none of my favorite podcasters from "the early days" - you know, pre-iTunes 4.9 - are still at it. I really miss some of the first-generation podcasters who did some really cool work. It was so raw, so unspoiled by sponsorship and perfunctory advertising mentions.

Don't get me wrong, I'm really glad to see friends I made like Soccergirl break out. Likewise for Dawn and Drew. And the podfather and Scoble make for a good combination to take Podshow to the next level. I just miss some of the other OG's. You know, the folks who helped shape the craft before Corporate America invariably took it over.

So for honks and giggles, I tried looking up some old friends. Sadly, practically no one's around anymore.

Cartoon Radio Network's gone (but legacy episodes are on Odeo). So is Celebrity Vinyl Heaven. The audio version of RossCode only lasted a couple of months. The Kiss Podcast is still available, but Dion's obviously been busy.

So here's to you, folks...wherever you are. Thanks for being so entertaining.

Combat spam with Thunderbird

Chalk one up for FOSS. I've been running Thunderbird exclusively as my primary POP3 client for about a year, and it's the best thing I ever did to combat - not get rid of - bothersome spam. The junk mail filter, even without custom personalized rules, works great and learns over time what's unnecessary crap and chucks it right into Ye Olde trash folder.

It also helps that my mail server's using the Barracuda firewall, which is awesome. But stuff does, and will continue to, get through - most of which Thunderbird catches and purges. The ones it misses can be filtered out easily with a couple of clicks.

I made the switch in 2006 after getting tired of Outlook's unresponsive spam filters, and angry about the money I'd wasted on the equally ineffective Symantec AntiSpam plugin for Outlook.

Do the right thing...make the switch.

Guys I'd pay to listen to

I'm glad podcasts and blogs are accessible for free. But even if I had to pay, I'd gladly shell out dollars to absorb the opinions of Jason Calacanis, Michael Arrington, Richard Stallman and Doc Searls. Much like mainstream sportswriter Mike Lupica, also a personal favorite, these types of people always evoke some sort of reaction from me - total cohesion with my own thoughts, staunch opposition, outright rage, or uncontrollable laughter at the absurdity and/or relevance of their conclusions.

But dammit, I can't stop listening.

Serendipitous purchases

When it comes to personal entertainment, I'm the furthest thing from an impulse buyer. I'm cold, calculating and focused on getting exactly the DVD I want, or I leave the store/site pissed off. Being in the middle of a big open source project, I've been jonesing for the 2001 movie "Antitrust", one of my all-time favorite movies about computers (see the rest of this distinguished roster).

I took a chance today and strolled into a local video game shop, hoping against hope that my beloved tale of NURV might be available on UMD. No dice. Feeling lucky, I strolled over to the used DVD lane, and after about 2 minutes, there it was. What were the chances? The used bin???

The weekend's off to a good start.

Apollo could finally make Flash truly functional

Despite earlier assertions I've made about Flash, the truth is that I actually enjoy it and think it's a useful technology on the web. It's just been severely mismarketed, resulting in underutilization and its being relegated to a tool for pimping movie sites. Its ability to power functional components within a site also got unfairly pushed to the wayside by emerging technologies that, for whatever reason, people found sexier.

No unique URLs meant no bookmarking beyond the root address and catastrophic results when engaging a browser's Back button, which angered a lot of people. By and large, Flash serves as a rich animation layer rather than anything, well...functional. It has the ability to process forms, but I've yet to see any real, practical, major implementations of web apps where people replaced HTML forms or AJAX with Flash forms.

The demos of Adobe Apollo show that neat stuff is coming down the pipe, particularly in the realm of an API for offline access. This is cool. The way I see it, this could be the next big, and some might argue, the first big evolution for Flash as a development tool.

The first major generation of Flash components that started making their way around the Web were mainly long-form animations that were more efficient, in terms of disk space and loading time, than animated GIFs. The second evolution was evident in the release of ActionScript and a more granular, procedural way to programmatically control movies and components (note that this would be the same era as the doomed Flash form). The introduction of Flex got a little fanfare, but not much of a major stage. Then, several years later, Flash Video gave new life to the platform, with the rise of embedded players on YouTube, MySpace, and countless homegrown offshoots.

I'm a big proponent of the coming development in the RIA and desktop widget space. I'm hoping Apollo will lead the way, with Flash finally getting a chance to star as a true developmental framework.

Friday, March 02, 2007

Don't skimp on your detail pages

Something I've been thinking about lately is how important the design of a site's detail pages is. By that I mean the pages, typically linked to from a homepage or other central location, that display an article's contents. For the vast majority of highly-visible, professionally developed sites, this is a dynamic page that emulates the general aesthetics and layout of the site's homepage, filled with the specific content of a particular article, story, or post.

In this, the age of the almighty RSS/Atom feed, the growing ubiquity of the widget/gadget/badge/module/flake, Google-dominated search, deep linking, and the portable content device, many people find your site first and foremost through detail pages. Historically, the marketing glitz has put the emphasis for ad placement on a site's homepage, and with logical reason. But while traditional advertising and site promotion methods may pimp your root URL, with much greater frequency your stuff's found through your secondary pages than through what was designed as the main entry point into your site.

But surprisingly, I've seen more and more sites build detail pages that inexplicably are shells of their larger homepages, not replicating elements like ads, e-mail links, contact info or headers/footers. I don't know why.

The major web development frameworks all account for making such aesthetic development easy. Ruby on Rails' layouts, Django's templates and ASP.NET 2.0's master pages all provide an easy means of developing a uniform layout for your main and subsequent detail pages. And blogging platforms like Blogger use templates, ensuring the same layout for master and detail pages.
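The idea all of those share can be sketched with nothing more than Ruby's stdlib ERB - not Rails itself, but the same layout-wraps-content principle its layouts formalize. The markup below is hypothetical:

```ruby
require 'erb'

# A bare-bones shared layout: every detail page gets the same
# header/footer wrapped around its content. Markup is hypothetical.
LAYOUT = <<~HTML
  <html>
    <body>
      <div class="header">logo, nav, ad slot</div>
      <%= content %>
      <div class="footer">contact info, e-mail links</div>
    </body>
  </html>
HTML

def render_with_layout(content)
  ERB.new(LAYOUT).result_with_hash(content: content)
end

puts render_with_layout("<p>Full story text goes here.</p>")
```

With this in place, a detail page can't accidentally ship without the ads or contact info, because the wrapper supplies them every time.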

So the lesson shouldn't be lost - you can't underestimate the value or importance of designing your detail pages.

Your site's true traffic, sans accoutrement

A topic of great contention these days is how much true traffic a site's servers handle when its distribution strategy is truly diverse - using cross-platform techniques and technologies like RSS/Atom, AJAX polling and processing, WAP and wireless rendering, e-mail lists, SMS alert frameworks, Flash presentations, various JavaScript imports, widgets/badges/modules/flakes, and the like. Opportunistic marketers are quick to point out that the "page views" generated by RSS feeds (if you even want to call them that) can increase a site's total viewership by several orders of magnitude - trying in vain to counter technologists' claim that such views are fundamentally different from the page requests of traditional desktop web browsing.

I'd blogged previously about how my site could tack on an additional 1.5 million page views if we factored in such extensions. We sport the most diverse collection of online products and services in our region for distributing news, but our traffic conundrum is interesting: for every service we add, we gain a lot more aggregate exposure, but take away from our core profit center: traditional web page views. We effectively spread ourselves thinner and thinner - eyeballs we could capture for ads on the site instead get our content in a variety of formats on other devices or through other channels.

So I'll end with a question - how do you report traffic to your advertisers, and is multiplatform distribution a benefit you try and pass along to a client?

Book market for widgets should be gi-normous

With Apollo having left people with jaws on the floor and having everyone wondering what the future will hold, and Apple, Microsoft, Yahoo!, Google and others already jockeying for position in the desktop content market, it's going to be an interesting year to see not only how cross-platform/cross-OS widgets are embraced by users, but also by the writing community.

I'd mentioned that I'm going to crack open Adam Nathan's "Windows Presentation Foundation" soon and get cracking at coding desktop gadgets. But I'm looking forward more to Apollo as RIA continues to gain traction, preferring its platform-agnostic nature, even if it is a JRE, as many contend. (I'm just not down with the Vista-or-bust mentality.)

But nonetheless, it's going to be interesting to see all the titles that spring to life because of the gold rush to do such development.

Thursday, March 01, 2007

A history of violence

I try to stay away from linking to stories on my company's site, so as not to bring work home with me. But today was just crazy in terms of all the headlines that came down having to do with violence in local schools. And much like I've written before, people ate it up with a spoon...couldn't get enough of the stuff - they were banging away on our search engine with choice keywords almost faster than we could put the stories online.

There's something to be said about the decline of Western Civilization and a growing interest in the misfortune of others. But this is getting out of hand.

My iPod's declining mortality rate

I've noticed my iPod's battery life showing diminishing returns over the past couple of weeks. I charge it fully, and yet I only get about 75 minutes of playback time, and that's with the backlight completely off. And of course, because I'm a moron, I haven't synched my playlists with iTunes on the desktop, so once this cow goes permanently out to pasture, I'm screwed.

Maybe it has something to do with the fact that I lost my original AC charger and have been charging it with my iHome, and at work via a USB cable plugged into my laptop. At any rate, the prognosis isn't looking good.

Creating a new meme monster

Josie's got an interesting post about the various jobs she's had, expressed chronologically. Kinda reminds me of the meme craze where people virally tag each other, demanding they publicly reveal 5 tidbits of personal information people might not otherwise know.

Here's mine (age/position):
I hope it implies progress and upward mobility. :-)

5 of A Kind performs "Juices"

5 of A Kind is a really popular local band that's got a catchy single we've been playing all the time. They performed tonight on our live talkshow.

Check them out on MySpace!

Giving some dap to Digg via YourMinis

Unless you're reading this by subscribing to my blog's RSS or Atom feeds, you've probably noticed the slick not-so-little Digg widget/badge/gadget/module/whatever I placed in the rightmost column. I snagged it from yourminis, which is freakin' sweet and has got a lot of content already. Lots going on in this space with Adobe's demo of Apollo.

Gank some for yourself.

This page is powered by Blogger. Isn't yours?
