Tuesday, February 27, 2007

Growth hurts

KUAM.com's always enjoyed being the most popular site for Guam news, but in the last few days I've faced some pretty extreme overage charges for the bandwidth we've consumed serving files, the queries against our database server, and the amount of disk space we're taking up. Not long ago I was hunting for memory leaks, which turned out to be an excessive number of database-resident images, pushed through a custom HTTP handler, staying resident in the .NET 1.x Cache a bit too long.
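For the curious, the fix amounted to giving those cached image bytes an expiration date. Here's a rough sketch of the pattern - "ImageHandler" and GetImageBytes() are made-up names for illustration, not the actual KUAM code:

using System;
using System.Web;

// Hypothetical handler that streams database-resident images. The leak came
// from caching the bytes with no expiration; an absolute expiration lets the
// Cache evict them instead of holding them forever.
public class ImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string imageId = context.Request.QueryString["id"];
        string cacheKey = "img:" + imageId;

        byte[] bytes = context.Cache[cacheKey] as byte[];
        if (bytes == null)
        {
            bytes = GetImageBytes(imageId);  // placeholder for the real database read
            context.Cache.Insert(cacheKey, bytes, null,
                DateTime.Now.AddMinutes(10),  // absolute expiration, not forever
                System.Web.Caching.Cache.NoSlidingExpiration);
        }

        context.Response.ContentType = "image/jpeg";
        context.Response.BinaryWrite(bytes);
    }

    public bool IsReusable { get { return true; } }

    private byte[] GetImageBytes(string id)
    {
        return new byte[0];  // stand-in; the real version queries the database
    }
}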

And now I've gotten a friendly e-mail from my host telling me that query performance is down a bit, and that I'll need to run some admin scripts to regain speed.

While our head of accounting and I agreed that we'd track the escalating charges over the coming months and then determine whether to upgrade plans, ORCSWeb was pragmatic enough to say, "Dude...just ramp up incrementally...you'll save money." That's why I love ORCSWeb. They know.

I guess it's a good thing when you're feeling the pinch of rampant growth, huh?

FAVICON.ico a bandwidth hog???

Scott Hanselman stumbled across quite a revealing concept, determining that his FAVICON.ico image - the little file that puts his logo in a browser's address bar - sucked up 25GB of data transfer. Enlightening!

I checked my own stats, and while it doesn't impact my bandwidth at the dramatic level Scott's did, it's pretty revealing how much the little Guam flag that adorns my site's main content pages takes up when aggregated.
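The cheapest countermeasure is making sure browsers hang on to the icon instead of re-fetching it constantly. Here's a hedged sketch of one way to do that from ASP.NET - "FaviconHandler" is a hypothetical name, and it assumes you've mapped .ico requests through ASP.NET (setting a far-future Expires header at the web-server level gets the same result without any code):

using System;
using System.Web;

// Hypothetical handler that serves favicon.ico with long-lived cache headers
// so repeat visitors stop re-downloading it on every page view.
public class FaviconHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "image/x-icon";
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.Now.AddDays(30));  // let browsers keep it
        context.Response.WriteFile(context.Server.MapPath("~/favicon.ico"));
    }

    public bool IsReusable { get { return true; } }
}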

Thanks for the tip, Scott.

Having fun at work

My take on anchoring the news has always been to be friendly, knowledgeable and informative. And, when used sparingly and tastefully, to use humor as a means of getting a point across. It's a shame how little this is used throughout the industry.

As an example of such liberal thinking, here's a quip I did about the state of Guam's utilities during last night's live newscast.



It got a nice reaction from our viewers born before 1985. :-)

Monday, February 26, 2007

Unwillingness to syndicate ticks me off

If there's been any technology I've been an ad hoc advocate for over the past 4 years, it's been RSS web syndication. It's fairly simple to learn, automated, platform agnostic, text-based, has ample tool support, and is completely free. There's no reason why anyone with a web site (or even just a data source, for that matter) shouldn't extend their data offering and syndicate their stuff by setting up a feed.
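To put some code behind the "fairly simple" claim, here's roughly all a basic feed amounts to in ASP.NET - the handler name, the Story fields and the GetRecentStories() call are placeholders for whatever data source you already have, not anything from KUAM.com:

using System;
using System.Text;
using System.Web;
using System.Xml;

// A bare-bones RSS 2.0 feed as an ASP.NET handler.
public class RssHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/xml";
        XmlTextWriter w = new XmlTextWriter(context.Response.OutputStream, Encoding.UTF8);

        w.WriteStartElement("rss");
        w.WriteAttributeString("version", "2.0");
        w.WriteStartElement("channel");
        w.WriteElementString("title", "Example Feed");
        w.WriteElementString("link", "http://www.example.com/");
        w.WriteElementString("description", "Latest items");

        foreach (Story s in GetRecentStories())  // placeholder data call
        {
            w.WriteStartElement("item");
            w.WriteElementString("title", s.Title);
            w.WriteElementString("link", s.Url);
            w.WriteElementString("pubDate", s.Published.ToString("r"));  // RFC 1123 date
            w.WriteEndElement();
        }

        w.WriteEndElement();  // channel
        w.WriteEndElement();  // rss
        w.Flush();
    }

    public bool IsReusable { get { return true; } }

    private Story[] GetRecentStories() { return new Story[0]; }

    private class Story
    {
        public string Title;
        public string Url;
        public DateTime Published;
    }
}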

I've been preaching for the Church of Winer pretty intensely to organizations of varying industry, sector and size, encouraging people to use RSS for everything from report generation, to streamlining e-mail clients, to standardizing outbound data streams.


That's why it irked me to no end to receive correspondence tonight from a partner I recommended using RSS with: "Our system is automated and due to continuing budget constraints our resources are limited." Huh?!? This doesn't make any sense to me. I guess it's the principle of the thing. Cost prohibitiveness I can deal with; no IT department, that works for me; but sheer unwillingness to even try it out doesn't gel with me.

One of my missions is to rid the world (Wide Web) of such naive thinking.

Transparency in product promotions

I'm a firm believer in mission statements, and put an equal amount of faith in marketing slogans. But what makes me different is that I rarely separate the two. It's my personal belief that if a rallying cry works internally, there's no reason it shouldn't work with customers...and vice-versa. There can't be a delineation between the ways products and services are perceived - transparency is critical to making them work as live products, opening up the company, and holding employees responsible for upholding that commitment.

Examples of such phrases that we use at KUAM to promote how we do news online:
One such motto I developed recently galvanizes our philosophy on webbifying our content. It makes for a great PowerPoint slide...earning cheers and jeers when I present it during talks I give:
Works for us. Give it a try sometime.

'Amazing Race' features my hometown

Consider it official. Guam is going to be featured in "The Amazing Race All-Stars" on one of my company's stations. I've fielded a couple of calls from some people who found out about this through various forums, message boards and the blogosphere - but I've been sworn to secrecy.

It'll be nice vindication since Palau beat us out with Survivor.

It's our award season, too

Who says the A-listers have to have all the fun? Oscars, Grammys, Emmys, Tonys - that's fine and dandy for Tinseltown, but we at the local media level work our butts off all year, too, and deserve a nod recognizing the very best of the best.

Here are some of the major awards media outlets are getting in the next month or so:
Good luck to all!

Our new 'exhibitfeed' service

I just rolled out NewsLinks, a new service that lets our users keep track of exhibits we mention in our broadcasts, like court documents, transcriptions, police reports, audio/video clips, etc. While it's debatable whether such cross-platform promotion by TV stations is really effective, it works for us and our users are enjoying it.

(Now, we're not claiming originality for this service...while our implementation is unique, other fine news services have similar offerings.) Because several of the exhibits are HTML presentations, the RSS feed I hooked up to NewsLinks makes it function sorta like a non-downloadable podcast - an 'exhibitfeed'.
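For the feed-curious, here's a hedged sketch of how a single NewsLinks-style item could be written with the same XmlTextWriter approach as the feed sketch above - the exhibit fields are made up. HTML exhibits just ride along as the item link, while audio/video exhibits can carry a real <enclosure>, which is what gives the feed its podcast-ish feel:

using System.Xml;

// Hypothetical helper for writing one exhibit item into the feed.
public static class ExhibitFeed
{
    public static void WriteExhibitItem(XmlWriter w, string title, string exhibitUrl,
                                        string mediaUrl, long mediaBytes, string mediaType)
    {
        w.WriteStartElement("item");
        w.WriteElementString("title", title);
        w.WriteElementString("link", exhibitUrl);       // e.g. an HTML-ized court document
        if (mediaUrl != null)                           // only audio/video exhibits
        {
            w.WriteStartElement("enclosure");
            w.WriteAttributeString("url", mediaUrl);
            w.WriteAttributeString("length", mediaBytes.ToString());
            w.WriteAttributeString("type", mediaType);  // e.g. "audio/mpeg"
            w.WriteEndElement();
        }
        w.WriteEndElement();
    }
}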

Enjoy!

Sunday, February 25, 2007

Choice is a good thing

I always wondered what workstation(s) Google staffers use. I'd assumed the company was at least partial to Linux, as later affirmed by the Goobuntu rumors of the company's engineers running an internal custom distro. It's refreshing to know that they at least give new hires their choice of OS: Windows, Mac OS, or Linux.

AJAX and ASP.NET output caching

The muse has been upon me today...wanting to do some cool programming that's wound up either as helpful proof-of-concept samples or as prototypes for neato stuff upcoming on KUAM.com. One such example I was toying with mentally before putting code to screen was the impact of using AJAX form processing in the background on ASP.NET webpages that use output caching.

I was thinking about doing some background asynchronous processing of data, via AJAX, while people read our news stories. The unknown was that most of the pages on which I'd implement the actions are output cached, and while I needed to retain that performance gain, I also needed the AJAX functionality to fire with each page request.

Basically, my demo was for page analytics: seeding and incrementing a tally of how many times certain pages are accessed, on pages that are cached on the server-side. Fortunately, and as I expected, you can use AJAX in such a fashion without losing your ability to cache on the server.
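Here's a rough sketch of how the demo hangs together, with made-up names throughout. The story page keeps its <%@ OutputCache %> directive, and the HTML it serves includes a snippet of script that fires an XMLHttpRequest at a lightweight handler on every load; since the handler itself is never output cached, the tally increments on each request even when the page comes straight out of the server-side cache:

using System.Web;

// Hypothetical, never-cached endpoint that the cached story pages call via
// XMLHttpRequest, so the tally still fires on every request.
public class PageTallyHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string pageKey = context.Request.QueryString["page"];
        string tallyKey = "tally:" + pageKey;

        // Application state keeps the sketch self-contained; a real version
        // would batch the counts out to the database.
        context.Application.Lock();
        int count = (context.Application[tallyKey] == null)
            ? 0 : (int)context.Application[tallyKey];
        context.Application[tallyKey] = count + 1;
        context.Application.UnLock();

        // Make sure the callback response itself is never cached downstream.
        context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        context.Response.ContentType = "text/plain";
        context.Response.Write(count + 1);
    }

    public bool IsReusable { get { return true; } }
}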

Nice.

What churns my brain this Sunday

Things that have got me thinking while in the office this weekend:

Saturday, February 24, 2007

KUAM News promo: Spring '07

An epiphany fell on me like a ton of bricks - people still love commercials, after all! I've been stopped in the street numerous times in the past week by people asking - begging even - for me to digitize our new news promo since we premiered it during what would be considered Sweeps Week, if only Nielsen's holy light shone down upon Guam. (I'm cautiously optimistic about what morphs might come out of it, but whatever. I'm a trusting fool.)

We aim to please, so here you go! Watch it, download it, share it...enjoy it! And keep those e-mails coming!


WPF? WTF?

I haven't looked into anything dealing with Windows Presentation Foundation, not really being into the whole Avalon movement or XAML for that matter. But after sporting some major intellectual wood over the news of several new desktop newsreaders from some major newspapers, I may give it a shot. (Good work by the Seattle Post-Intelligencer, in particular.)

Luckily, the definitive book on the rather complex subject of WPF development from Sams happened to land on my desk this week, so I'm good to go.

Book review - Murach's ASP.NET 2.0 Upgrader's Guide: C# Edition

Murach's ASP.NET 2.0 Upgrader's Guide: C# Edition
by Doug Lowe & Joel Murach

Published by Murach & Associates

This is a great, quick-read guide for those of us not having enough time to exhaustively go through every nuance of the .NET 2.0 Framework - or even all its major aspects - but still needing to be productive and do more than simple examples. The book, for the most part, uses good architectural design, employs best practices, and explains each lesson adequately. It's also a great read for those wanting to get going quickly with Visual Studio 2005.

True to its name, this isn't a book for the first-time .NET coder, and is best suited for experienced programmers wanting a quick primer on the new features of ASP.NET 2.0. The dominant aspects of the Framework are profiled, including web parts, personalization, master pages, data access, navigation and user profiling. (Although I would have preferred caching to have its own dedicated chapter, it was a nice touch to pepper the applicable chapters with proper use of output caching, the Cache API, and database-level caching for SQL Server 2000 and 2005 where it can help.)

As such, it doesn't drill down into the particularities of any one feature on a granular level, but such detail can be accessed from a thousand different MSDN pages, blogs, video feeds, etc. So the book accurately does what it says it does: get you up to speed and ready to build cool web-based stuff with ASP.NET.
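For anyone who hasn't played with the database-level caching the book covers, here's a hedged taste of the SQL Server-backed flavor. The "Northwind"/"Products" names and the GetProducts() call are placeholders, and the SQL Server 2000-style polling dependency also requires the database and table to be enabled for notifications (via aspnet_regsql or SqlCacheDependencyAdmin) plus a <sqlCacheDependency> section in web.config:

using System.Web;
using System.Web.Caching;

// Hedged example: cache a result set until the underlying table changes.
public static class ProductCache
{
    public static object GetCachedProducts(HttpContext context)
    {
        object products = context.Cache["products"];
        if (products == null)
        {
            products = GetProducts();  // placeholder data-access call
            SqlCacheDependency dependency =
                new SqlCacheDependency("Northwind", "Products");  // hypothetical names
            context.Cache.Insert("products", products, dependency);
        }
        return products;
    }

    private static object GetProducts()
    {
        return new object[0];  // stand-in for the real query
    }
}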

Mud rocks out at KUAM

Last night a band with local roots, Mud, came to visit and played live in our Harmon studios. It ruled. The group's got an interesting vibe, and they cranked out "Apt4", which has a clever message to it, with a slick outro solo.




Also, check out the interview we did with Mud on our YouTube channel.

And webheads: make sure to check them out when they play South By Southwest.

Thursday, February 22, 2007

It must be the OS (or, LeBron knows Windows)

There was a time in my life when I thought I was Mr. Microsoft…seems #23 has ganked that firmly away. I just caught on ESPN that King James, whom I love, inked a marketing deal with Bill Gates to tout Vista. So consider the following, if LeBron is from now on to be considered the basketball equivalent of Windows Vista.

LeBron James as Windows Vista - mind-blowing in every way, with latent take-over-the-world ability. Beautifully engineered, well packaged, the toast of the town with limitless potential, with fans and non-fans around the globe having at least heard about it. There are as many mainstream media pundits ripping it for being all bark and no bite as there are devout followers poring over the product's implied superiority.

Tim Duncan as Mac OS X - completely impressive at first glance to anybody. Excellent, refined packaging with rock-solid fundamentals. And while not conforming to the trendy gimmickry of the day, both make their money and dominate by etching their own style and bringing about a renaissance in the way people think about the market.

Allen Iverson as Linux - the initial packaging might be misunderstood as inferior, but there's some world-class engineering within for those who really know what they're looking for in a winner. Both might display some pretty severe compatibility issues, not gelling well with other system components, but when the chips are down they flat-out get the job done in terms of durability, performance and power. Also, various add-ons and enhancements over the years have altered the physical appearance, but haven't changed the game.


News sites shouldn't be curators of the world's data

One of the tasks I've been tapped to perform more and more in recent months is to provide counsel to the local government on web operations. Something that's become painfully apparent is the reliance of the public sector on the private media industry to host its electronic information. Not the aesthetics of a site - just the daily and routine data agencies send out relative to their scope of expertise.

If you're not from here, trust me when I tell you that there's been no organization worse at understanding, using, implementing or managing information technology than the Government of Guam. (It's been a long and arduous road just to get these agencies to scan their documents and then e-mail them to us, so this is a whole new barrel of monkeys we're dealing with.)

Historically over the past decade, my company has hosted various types of data (PDFs, Word documents, HTML-ized Powerpoint slide decks, etc.) in cases where an issue was of critical importance, we could get the info online faster, or our servers would be more reliable in terms of bandwidth, scalability or uptime. (Much in the same way the Starr Report was published on MSNBC.) But it's unconscionable for us to do so for every miniscule law, press release, announcement, retraction, etc. So I guess it's my fault that they never took the initiative and started hosting their own data on their own sites. Ignorance isn't an excuse here...it's lethargy.

But my problem is that I've been facing escalating and costly overage charges from my web host for the hundreds of MB of media we've been forced to archive for agencies that could very well do this for themselves. On another level, I want to be altruistic and foster the growth of other web sites and places to go. My traffic's always been phenomenal, but I want to link to external resources, too. It makes the game better by improving the ecosystem - having more than one destination to get at source information. Much to my chagrin, not every single person who could visit my site does so.

So a lot of my energy lately at meetings and talks has gone toward getting government to be cognizant of the critical importance of taking ownership of its own data. Mainstream media at any level can't be expected to be curators of the world's information - especially when having to pay extra for doing so. We report the news; we don't keep web-accessible archives of everything. Agencies have to be responsible for the integrity of their own stuff...and for making it available. Otherwise, when a court decision, public law or other bit of information changes, it all gets out of synch. We've mastered the craft of working with data - creating it, distributing it, archiving it and sharing it - but that's for our own stuff.

Government's got to get with the program and start putting their own stuff on their own servers, too.

Wednesday, February 21, 2007

1,000 posts in 72 days

My company touts the moniker "Guam's News Leader" in more than one way, and it stands for much more than just a clever marketing device. We own and have (re)set most of the local records for web publishing, and today we eclipsed another milestone: we just became the fastest regional news site to publish 1,000 stories.

Our 20,000th story was posted on December 17, with our 21,000th story going live just now on February 21. We pulled off the volume in 72 days - our fastest time yet between milestones. Our daily story load is about 25 stories every 24 hours, also a regional record. That outpublishes most newspapers, and far exceeds the typical volume for most small-market broadcast stations our size.

It wasn't easy to attain this feat - we hired some new people, made a couple enhancements to our CMS and ramped up our publishing schedule, to the ultimate delight of our users. Onward!

Tuesday, February 20, 2007

Import data dynamically into PowerPoint slides

I recently discovered PresentationPoint, a really slick third-party provider whose app, DataPoint, allows data to be dynamically inserted into PowerPoint slides. I was looking to have a presentation retrofitted with my site's main news RSS feed, which NewsPoint supports. Really cool. DataPoint also supports straight DB queries by managing a datalink through ODBC or via XML.

I e-mailed the support address and a couple of hours later a dude from Belgium wrote me back...and even included a slide deck with my stuff already imported. Whatta sales pitch.

(Ever the naive programmer, I was trying to get this done on my own, but to no avail. Wally was nice enough to point out that this could be done with VBA, but that's a headache unto itself.)

Do I really want to be Dugg?

The Digg and/or Slashdot effect is intriguing, both in the exposure it brings and the performance considerations it introduces. Accurately referred to as:
"A roving random distributed denial of service attack before which web, network and systems administrators alike quake and have terrible nightmares about."
...it's both desired and feared by content managers. On several occasions I've considered adding the perfunctory "Digg this" text and/or graphical link to stories on my site, but I always refrain, knowing that the limited scope of the market we serve - a local news station in Guam - likely won't refer that many, if any, articles of interest. I'm not going to go all out and be ridiculous like sites such as the ASP.NET blogs that have Digg, Reddit and del.icio.us links, turning a really cool and useful utility into overkill. And even several mainstream news sources have Digg well promoted, in the hopes their stories will get Dugg, too.

But I digress - although I'd love for my site's stories to appear on Digg and get massive exposure...I'm scared as hell. I've got a really good host provider with ample bandwidth, but I don't know if it could take a really popular hit. I'm a believer in the oxymoron of controlled chaos. (I used the same cautiously optimistic mentality a few years back when refraining from registering my site's podcasts in the new iTunes Music Store.)

So how to most effectively combat this if you're not in possession of the type of bandwidth that the major news networks have? Don't openly invite people to Digg your site - let them do so voluntarily. Not that I don't encourage such behavior...it's cool as hell. Just take in the traffic that you'd get, but not by telling people to Digg every single story, because there are those that will.

Network effects, like most things in life, are best when taken in tastefully and sparingly.

Printable content on blogs

The topic of having links/buttons/commands on blogs that allow a page to be _properly_ printed is a thorn that's been stuck in my side for quite a while. It's been irritating me more and more lately, as I've backslid into archiving all the interesting posts I find and pasting them into a Word document for later printing and more attentive reading than the loads of crap I otherwise dismiss.

But the problem remains - despite vast innovations in weblogs (advanced archiving, syndication, formatting, comment systems, moderation, etc.) the basic premise of allowing content to be rendered so that it'll print properly is sorely missing. I realize I may be of the minority voice here, but I still print out my stuff.

And I'm not overlooking just commonplace bloggers, either. Mainstream media, primarily news sites, typically use the 'Print this article' link, but more and more these days are lazily wiring such functionality to simply call up the browser's print command. That's stupid, because it doesn't solve the problem of making the page more printer-friendly, which is the whole point.

What I want to see is a link that strips a normal page of all its justification, formatting, tabling, and other accoutrement - just giving me the content. If there are ads, so be it. I don't mind. I just can't stand losing out on information due to incessant formatting.

I find the best examples of printable content on the web to be O'Reilly (see the actual page vs. the printable format) and the cool reformatting employed by NewsHutch. It's usability I echoed on my own site. I guess the next generation of this would be a web service that would take page content and make it print-friendly, and/or dynamically format the content as a PDF, for ultimate cross-platform compatibility.

Sounds like a Web 2.0 service waiting to be invented. :-)
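For what it's worth, here's roughly what the guts of such a service might look like - a crude, hypothetical sketch (made-up handler name, regex scrubbing instead of a real HTML parser), just to show the shape of the idea:

using System;
using System.Net;
using System.Text.RegularExpressions;
using System.Web;

// Hypothetical "print-friendly" service: fetch a URL, throw away scripts,
// styles and markup, and hand back the bare text.
public class PrintFriendlyHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string url = context.Request.QueryString["url"];

        string html;
        using (WebClient client = new WebClient())
        {
            html = client.DownloadString(url);
        }

        // Strip script and style blocks first, then the remaining tags.
        string text = Regex.Replace(html, @"<(script|style)[^>]*>.*?</\1>",
            string.Empty, RegexOptions.Singleline | RegexOptions.IgnoreCase);
        text = Regex.Replace(text, @"<[^>]+>", " ");
        text = HttpUtility.HtmlDecode(text);
        text = Regex.Replace(text, @"\s{2,}", " ").Trim();

        context.Response.ContentType = "text/plain";
        context.Response.Write(text);
    }

    public bool IsReusable { get { return true; } }
}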

Monday, February 19, 2007

Why do some sites load so slowly?

I've given up trying to come up with something profoundly intellectual or referentially cute to title this post with. Being a real let's-find-out-what's-under-the-hood-of-web-apps guy, I'm too busy racking my brain trying to figure out why some big-name sites load so damn slowly. I'm talking about those destinations on the Web that, regardless of client-side Internet connectivity, processor power, memory or browser, just won't kick in like all the rest.

Michael recently released WebWait, a slick web app to test any URL's loading speed under a variety of conditions. So using this for scientific data and the basics of common sense, consider the following:
Yet despite my loathing of the time I waste waiting for pages to finish loading, I continue to visit them, each several times daily. I've got no solutions for the speed conundrum, other than pointing out what might be causing the lags. I'm still a registered, devoted user...albeit with waning patience.

Sunday, February 18, 2007

On operating systems, platforms & communities

Who am I? Perhaps the most rudimentary and critical of questions, it's applicable in any community, including the computer industry. So spoken in the first person from the standpoint of Windows, OS X and Linux die-hard advocates, consider the following in pessimistic defense of the merits of each.


Microsoft
91% of the people in the world using computers can't be wrong. Being the most-used product in the market, legions of people knowing nothing about computers will use our stuff…and remain as ignorant throughout their experience with us. And we like it that way because they'll tell their friends to buy Windows, who'll tell their friends... We'll happily prey on such rubes, making a ton of money, leveraging blind faith and building a community of devout developers who like lemmings will follow us to the ends of the earth, and as quickly proceed to jump off the edge without question or hesitation if we tell them.

We've created a new microeconomy by forcing customers to buy high-end computers with components that 20 years ago could have been classified as a low-end mainframe. We've likewise fostered an entire ecosystem of third-party programs that our users quite honestly can't live without these days (e.g., anti-virus, spyware/malware scanners, firewalls, anti-spam). And the end result of this small fortune a person's invested will be that their machine performs at a (hopefully) tolerable level.

We've completely lost our once-firmly established footing in Web technologies, so we're going to capitalize on capitalism: we'll focus on selling operating systems, productivity suites, business applications and developer tools. And those that dare stand in the way of our mega-million marketing be damned.

For the time being, we'll tell you everything's great while you battle stability, security and scalability concerns…that is, until we release the next version of our software, announce ceased support for legacy versions and bemoan our earlier work as absolute crap to encourage you to buy an upgrade.

All in the name of progress.


Apple
We're better than anyone out there, but we're misunderstood. Our stuff costs more because it quite simply is the best there is, so don't complain about the price. And our formats, platforms and hardware are completely closed off to the rest of the world, so don't expect much in terms of interoperability. You're paying for exclusivity. Across the board, everything we put our name on is the epitome of quality and innovation - devices, software, services. We're totally convinced that we're all a higher order of user - developer, engineer, student, teacher, graphics designer, or information worker. So you can rest assured that when carrying around one of our devices or sporting our logo, you've ascended to a greater plane of existence technically, and even socially.

Yet we so often neglect to acknowledge that we almost went out of business on at least one occasion due to the esoteric nature of our handiwork. The right people get us. If you don't, it's your loss.

But there's always room for you at the table.


Linux
The world (and World Wide Web) would truly be ours if only we could get our act together. We're the quintessential walking contradiction in the Age of Information: we're the supposed superior younger brother of UNIX, we adhere to our staunch beliefs in openness and freedom, and we know beyond a shadow of a doubt we're fundamentally better than either of our commercial contemporaries. But by orders of magnitude we're more complex - completely disorganized and uber-political.

Our stuff runs great on older hardware, but has driver incompatibilities up the wazoo. We've been on the cusp of making a major dent with mainstream audiences for a while, and this could be the year we cross over and hit it big on the desktop. If only we could downplay the geekiness that surrounds us. Yet the one thing we've got going for us is that Google likes us…they really like us.

So tragically, rather than trying to achieve legitimacy by justifying our existence to the layman and addressing the various hardware compatibility issues (i.e., WiFi, graphics cards, laptop suspend/hibernate-on-close, codecs, etc.) that keep us from overcoming a longstanding hump, we pick fights within our own community. We force campiness between supporters of our most popular desktop environments (GNOME vs. KDE), with each calling the other's work complete excrement, and label our cousins - the various offspring of the core kernel - inferior.

How's that for circling the wagons?

Coming home to mama

Despite turning in my Microsoft MVP badge a year and a half ago to embrace the open source movement, despite my proud status as a card-carrying Softie on Rails, and even though everything I do at home now is Ubuntu Linux, I've had to do some .NET 2.0 work this weekend. Call it a slight return - me going back to my roots.

Not having used anything from the 2.0 space since playing with the early alpha bits, I'm admittedly rusty. Lots of positive improvements have been made and I'm genuinely pleased that many of the suggestions I made in focus groups, in one form or another, made it into the final release.

I'm in at the office on a gorgeous Sunday afternoon (at least that's what my co-workers tell me), putting the updated Framework on my Lenovo laptop, along with Visual Web Developer Express Edition. With only a handful of people here, most of them doing paperwork and/or video editing, I've basically got an entire T-1 to myself. Keep in mind that for the majority of my career, I've been Mr. Anti-Visual Studio (particularly the first few .NET versions). I wrote Version 4.0 of my company's site in ASP.NET 1.x in 11 days by coding raw C# syntax in ScITE. So needless to say, this is a research project as much as it is a dev weekend.

Here are some highlights:
So it's been interesting. Nothing to complain about...a nice change of pace. Kinda like having lunch with an ex.

Saturday, February 17, 2007

Book review: Beginning Ubuntu Linux

Beginning Ubuntu Linux - From Novice to Professional
by Keir Thomas, published by APress

Keir Thomas puts everything you need to get started with the distribution of Linux getting the most press these days all in one handy tome. All you'll need is a decent computer, the software, and you're set! The book's chapters - 34 in all - deal with the various aspects of installing/configuring, using and personalizing Ubuntu for your tastes.

Staying true to the political slant of most open source and Linux advocates, the book spares no expense in taking potshots at rival Microsoft, denouncing the company's operating system and Windows' implied failure to live up to its billing as a high-performance, stable, secure operating system. Ubuntu, Thomas preaches, to the rescue.

For history buffs, the book starts out with a well-chronicled backstory on the oft-misconceived beginnings of GNU/Linux and of the free and open source software movement. The book then proceeds to do a great job of talking about the various concerns with Linux's hardware compatibility, including WiFi access, and of finding your way around the default GNOME desktop environment. Differentiations are clearly made between traditional UNIX hardware problems and Ubuntu's ability to auto-detect all but the most obscure of devices and peripherals. An equally rewarding section is Thomas's lesson on using Synaptic and APT repositories to discover, download, install and update the latest software.

Also helpful for Microsoft converts is a nice introduction to working with the UNIX terminal, specifically Ubuntu's use of BASH, and of basics of using the command-line interface for interacting with Linux. Cron and crontab are dealt with sparingly, but nicely.

There are also several very helpful chapters on multimedia. And one of the book's many redeeming sections is evident in the excellent discussions on using Wine, remote terminals, and virtualization to access external resources, or run other Linux or Windows applications from within Ubuntu. It's a part of the book that's very well-written and nicely laid out. And any user will appreciate the healthy section on using OpenOffice, and then the individual chapters laying out basic features and functionality of each of the applications within that productivity suite.

The live CD also ensures you can start playing with the OS right away, for those of you not wanting to download the 700MB ISO image (although the version on CD is likely one or more generations behind the current stable distro).

I found the book to have only a couple of minor shortcomings, namely a surprising lack of a chapter on gaming with Linux, which for a growing population is a major draw. It would also have been beneficial to go into more detail about shell scripting, beyond simple (sub)directory navigation and setting up basic cron jobs. Also, the one thing keeping the book from attaining absolute perfection is the absence of a Mac-centric viewpoint for people coming over from Apple environments, or looking to bridge Linux with that platform.

But those aside, the book is otherwise a one-stop shopping gem - everything you'd need to get up and running in a matter of hours.

Friday, February 16, 2007

ESPN starts daily web-only newscast...sounds familiar

I caught at the end of SportsCenter's 30,000th broadcast on Sunday night that the network was launching a new web-only video newscast, giving users a minute-long streaming presentation of the day's events, scores and storylines. (I haven't been able to find it since then.) I guess great minds think alike.

My station's been doing this, too, for several months. And to great success.

Django book's release delayed until June

Dang. I found out today from a little birdie that APress is delaying the release of the first book to be published on Django. Dang.

While it's a bummer that I won't be able to have something bound and tangible in hand until the mid-part of the year, it's still revolutionary that Jacob and Adrian are putting the chapters online for unofficial review.

Thursday, February 15, 2007

Yahoo!'s new suggestion board: W-E-A-K

"Good artists copy, great artists steal" I think we've all heard at some point by at least a couple notable characters, but Yahoo!'s new suggestion board is totally lame. It's so much of a Digg ripoff it's laughable how little they tried to blur the distinction between the two. If anyone of lesser stature pulled this off, it would be easily dismissed. But this is freakin' Yahoo! One of the big four.

Cory Bergman from LostRemote broke the news about the expected backlash, but it's gotten much more vulgar in the few minutes since.

Major letdown.

System76 rocks!

I'm ordering a new Linux laptop to run Ubuntu and decided to go with System76 after hearing a very reassuring podcast interview with the boys from the Linux Action Show. And others are echoing such praise.

I had a memory question and shot their support address a note, and System76's tech support staff were very friendly, helpful, knowledgeable and fast. They had me at hello.

Wednesday, February 14, 2007

...and the Grammy goes to...Slayer

I was stoked to see Slayer finally get mainstream recognition as musicians in winning their first Grammy (second nomination, not taking a back seat to Tool this time) for Best Metal Performance. Congrats!

p.s. And am I the only guy to realize that it's already been 21 years since "Reign in Blood"? That still blows away most of what's come out since. The most brutal 38 minutes ever put on tape.

News site statistics for the New Web

I participated in a very interesting discussion on a Lost Remote Google group for developers, sharing my thoughts on the changing nature of site analytics for content web sites. The crux of the argument was that recent stats noted how the Fox family of online properties exceeded the web traffic of Yahoo!, supposedly because the latter's heavy application of AJAX to their site reduced the raw page requests and reloading that's traditionally been experienced.

This is a sticky wicket, indeed. If I aggregated all the traffic my site gets from our page requests via the traditional World Wide Web, web service calls, desktop widgets, WAP pages, AJAX-based polling (for tag clouds, etc.), RSS feeds being pinged, Flash Video clips being clicked on and podcatcher clients looking at feeds for updates, remote embedded videos being played back on our servers and other services, I could easily pad my stats and add 1.5 million more "page views" per week. But for clarity, we plainly separate our traffic for clients into what's really generated via our web site through traditional means and then also for enhanced features for the full effect.

Conversely, I have a friend in the biz who cut over a major part of his site to Flash. He was shocked to learn that data transfer surged with his web host, but his page views dropped by 90%.

When doing presentations about how we do news online at KUAM.com, I stress to people that we in the content industry measure metrics differently. Sure, we may get several hundred thousand unique visitors, but each visitor requests perhaps 8-9 pages per session, and typically exhibits several sessions per day. And services like AJAX and Flash and certain types of caching may impact stats, rendering them into a less-than-impressive format, but we ensure clients our stuff's being seen more and more, in a greater variety of formats. This they like hearing.

Bottom line: site stats ain't what they used to be.

Saturday, February 10, 2007

IE7, thou art mine enemy

I've developed an escalating hatred for Internet Explorer 7 lately. I've installed/upgraded Microsoft's web browser on several Windows machines, but I avoid it like the plague on my box. Uh-uh. But the overvalidating and hyperformatting snafus are getting more and more hairy, starting to negatively impact some of the apps I'm building that for years have rendered perfectly in IE, Firefox, Opera, Safari and Konqueror.

Amy shares this grief. I'm praying I'll not have to return to the days of "This site looks best using...". And this was supposed to be my "easy" weekend.

RSS aggregator as archive manager

I've got newfound respect for Google Reader as an RSS aggregator. I run a local news web site, which primarily means I'm concerned with two major things:
- how many stories appear on our homepage (our publishing load)
- how long the stories appear on our homepage (historical availability)

For purposes of remaining relevant and not being stale, most stories get primetime exposure for a half-day or so, and are then demoted to the lower half for easy access, before they're essentially lost and have to be queried through date-specific search tools I've developed, or found through search engines, Google News or other means. The same philosophy applies when publishing our data via RSS - everything is mirrored, of course, between our web-based and syndicated data, but the items that appear in the feed are those published within a 24-hour cycle.
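In code, that 24-hour window is about as dull as it sounds - here's a hedged sketch, where the Story class and its fields are placeholders rather than our actual CMS objects:

using System;
using System.Collections.Generic;

// Everything stays queryable on the site forever; the feed just exposes the
// last day's worth of items.
public class FeedWindow
{
    public class Story
    {
        public string Title;
        public DateTime Published;
    }

    public static List<Story> GetFeedItems(List<Story> allStories)
    {
        DateTime cutoff = DateTime.Now.AddHours(-24);
        List<Story> feedItems = new List<Story>();
        foreach (Story s in allStories)
        {
            if (s.Published >= cutoff)  // only items in the current 24-hour cycle
                feedItems.Add(s);
        }
        return feedItems;
    }
}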

Now back to my point. I found today, after not checking my station's main RSS feed in Google Reader for a few weeks, that the app does a wonderful job of keeping past items cached and accessible. I was able to navigate backwards for an entire month, giving me instant access to more than 500 articles we've published. That's really cool.

I hated to do it...but CAPTCHAs are in!

I've been fighting comment spam for several months now, and I've written some administration utilities that automatically scrub junk posts from the custom blogging framework I wrote a couple of years ago, but lately it's gotten to a volume that's nearly uncontrollable, and certainly bothersome. I've always found CAPTCHAs to be bothersome themselves, but I couldn't avoid them anymore.

Immediately after deploying the extra validation step, bad comments dropped by 100%. Whew!

Sorry to anyone this bothered, even temporarily; at least we caught most of the bad posts before our users saw the junk. If you're a dev, consider this list when rolling your own anti-spam solution.
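And for devs wondering what the moving parts look like, here's a minimal sketch of a home-grown challenge/response check - not the implementation I deployed, just the general shape: stash a random code in session, render it to the visitor (ideally as a distorted image), and reject the comment if the answer doesn't match:

using System;
using System.Web.SessionState;

// Minimal challenge/response sketch - not production-grade, and the random
// code should really be rendered as a distorted image rather than plain text.
public static class SimpleCaptcha
{
    private static readonly Random random = new Random();

    public static string CreateChallenge(HttpSessionState session)
    {
        string code = random.Next(10000, 99999).ToString();
        session["captcha"] = code;
        return code;  // the page renders this for the visitor to type back
    }

    public static bool Validate(HttpSessionState session, string userInput)
    {
        string expected = session["captcha"] as string;
        session["captcha"] = null;  // one shot per challenge
        return expected != null && expected == userInput;
    }
}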

Friday, February 09, 2007

Ubuntu, Linspire/Freespire partner up

Big news as Canonical and Linspire announced plans to integrate their OS'es. This is big for those of us running Linux on the desktop. The forthcoming Ubuntu 7.04 will sport Linspire's CNR interface for e-commerce. This'll be cool.


Monday, February 05, 2007

Back behind the podcast mic...sort of

I've agreed to help out the lads from Luckymonk with a new podcast series for Softies on Rails, the much-needed chronicle of .NET developers who have adopted Ruby on Rails web programming. I was conversing with Brian over the weekend, and he said I might be able to go on the show and talk about my sabbatical from Microsoft technologies in which I embraced open source projects and FOSS.

It should make for a nice time, and it'll be nice to lay down some MP3 audio for syndication again.

'Canes rule Super Bowl 41

Leave it to the players known for their swagger to come home to South Florida to roost. Former Miami Hurricanes Devin Hester and Reggie Wayne scored in what was otherwise a really sloppy game. No surprise that da boyz from "The U" work better in the rain.
