Tuesday, February 27, 2007
And now I got a friendly e-mail from my host telling me that performance is a bit down for queries, and that I'll need to run some admin scripts to regain speed.
While our head of accounting and I agreed that we'd track the escalating charges we face for the months and then determine if we'd upgrade plans, ORCSWeb is pragmatically nice enough to say, "Dude...just ramp up incrementally...you'll save money." That's why I love ORCSWeb. They know.
I guess it's a good thing when you're feeling the pinch of rampant growth, huh?
FAVICON.ico a bandwidth hog???
I checked my own stats, and while not impacting my bandwidth at the dramatic level that Scott's did, it is pretty revealing how much the little Guam flag that adorns my site's main content pages takes up when aggregated.
Thanks for the tip, Scott.
Having fun at work
As an example of such liberal thinking, here's a quip I did about the state of Guam's utilities last night during our live newscast.
It got a nice reaction from our viewers born before 1985. :-)
Monday, February 26, 2007
Unwillingness to syndicate ticks me off
I've been preaching for the Church of Winer pretty intensely to organizations of varying industry, sector and size, encouraging people to use RSS for everything from report generation, to streamlining e-mail clients, to standardizing outbound data streams.
That's why it irked me to no end to receive correspondence tonight from a partner I recommended using RSS with: "Our system is automated and due to continuing budget constraints our resources are limited." Huh?!? This makes no sense to me. I guess it's the principle of the thing. Cost prohibitiveness I can deal with (no IT department works for me, either), but sheer unwillingness to try it out doesn't gel with me.
One of my missions is to rid the world (Wide Web) of such naive thinking.
Transparency in product promotions
Examples of such phrases that we use at KUAM to promote how we do news online:
- On Air. Online. On Demand.
- Guam's best news, entertainment & information.
- Anytime, anywhere, on any device.
- Every story, segment, series and show we do has a corresponding URL. (And most have feeds, too.)
'Amazing Race' features my hometown
It'll be nice vindication since Palau beat us out with Survivor.
It's our award season, too
Here are some of the major awards media outlets are getting in the next month or so:
Our new 'exhibitfeed' service
(Now, we're not claiming originality for this service...while our implementation is unique, other fine news services have similar offerings.) Because several of the exhibits are HTML presentations, the RSS feed I hooked up to NewsLinks makes it function sorta like a non-downloadable podcast - an 'exhibitfeed'.
Sunday, February 25, 2007
Choice is a good thing
AJAX and ASP.NET output caching
I was thinking about doing some background asynchronous processing of data while people read our news stories, via AJAX. The unknown was that most of the pages on which I'd implement the actions are output cached, and while I'd need to retain such a performance gain, I'd need the AJAX functionality to fire with each page request.
Basically, my demo was for page analytics: seeding and incrementing a tally of how many times certain pages are accessed, on pages that are cached on the server-side. Fortunately, and as I expected, you can use AJAX in such a fashion without losing your ability to cache on the server.
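The pattern can be sketched simply (in JavaScript here, with hypothetical endpoint and function names - my actual implementation is ASP.NET/C#): the story page is served from the output cache, but the browser pings a small, uncached endpoint on every load, so the tally still moves.

```javascript
// Client side: this runs on every page view, even when the HTML
// itself came from the output cache, because the browser executes
// the script fresh each time. (Endpoint URL is hypothetical.)
function pingAnalytics(pageUrl) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/analytics/hit.ashx?page=" + encodeURIComponent(pageUrl), true);
  xhr.send(null);
}

// Server side (sketched in JavaScript for illustration): the tally
// endpoint is NOT output cached, so each request increments the
// count while the story pages themselves stay cached.
var pageHits = {};
function recordHit(pageUrl) {
  pageHits[pageUrl] = (pageHits[pageUrl] || 0) + 1;
  return pageHits[pageUrl];
}
```

The trick is simply that the counter lives behind a URL excluded from output caching, so the cached page and the per-request analytics coexist.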
What churns my brain this Sunday
- DRM is the new PMRC
- Ever since I installed the .NET Framework 2.0, Windows Update runs almost daily
- Glad to see VMWare getting some big press
- The Split Browser add-on for Firefox is really cool
- I really enjoy having the office all to myself (T-1 to the 'Net, 19 TV sets, solitude, books)
- I've added Digg to the short & prestigious list of companies I'd like to work for (ESPN and Google being the others)
- Refining my definition of Web 2.0 (again)
- Prepping for the grand Monday rollout of a new service I soft-launched this weekend to archive all exhibits, links and downloads we mention on the air
- I personally can make it rain by washing my car
- Adding our new 'Most popular stories' AJAX process to KUAM.com
- How much I love development, but how much I hate design
Saturday, February 24, 2007
KUAM News promo: Spring '07
We aim to please, so here you go! Watch it, download it, share it...enjoy it! And keep those e-mails coming!
Luckily, the definitive book on the rather complex subject of WPF development from Sams happened to land on my desk this week, so I'm good to go.
Book review - Murach's ASP.NET 2.0 Upgrader's Guide: C# Edition
Murach's ASP.NET 2.0 Upgrader's Guide: C# Edition
by Doug Lowe & Joel Murach
Published by Murach & Associates
True to its name, this isn't a book for the first-time .NET coder, and is best suited for experienced programmers wanting a quick primer on the new features of ASP.NET v.2.0. The dominant aspects of the Framework are profiled, including web parts, personalization, master pages, data access, navigation and user profiling. (Although I would have preferred caching to have its own dedicated chapter, it was a nice touch to pepper the applicable chapters with proper use of output caching, the Cache API, and database-level caching for SQL Server 2000 and 2005 where caching can help.)
As such, it doesn't drill down into the particularities of any one feature on a granular level, but such detail can be accessed from thousands of different MSDN pages, blogs, video feeds, etc. So, the book accurately does what it says it does: get you up to speed and ready to build cool web-based stuff with ASP.NET.
Mud rocks out at KUAM
Also, check out the interview we did with Mud on our YouTube channel.
And webheads: make sure to check them out when they play South By Southwest.
Thursday, February 22, 2007
It must be the OS (or, LeBron knows Windows)
There was a time in my life when I thought I was Mr. Microsoft…seems #23 has ganked that firmly away. I just caught on ESPN that King James, who I love, inked a marketing deal with Bill Gates to tout
LeBron James as Windows Vista - mind-blowing in every way, with latent take-over-the-world potential. Beautifully engineered, well packaged, the toast of the town, with global fans and non-fans having at least heard about it. There are as many mainstream media pundits ripping it for being all bark and no bite as there are devout followers poring over the implied superiority of the product.
Tim Duncan as Mac OS X - completely impressive at first glance to anybody. Excellent, refined packaging with rock solid fundamentals. And while not conforming to the trendy gimmickry of the day, both make their money and dominate by etching their own style and bringing about a renaissance to a new way of thinking about the market.
Allen Iverson as Linux - the initial packaging might be a bit misunderstood as inferior, but there's some world-class engineering within for those that really know what they're looking for in a winner. They might display some pretty severe compatibility issues, not gelling well with other system components, but when the chips are down they both flat-out get the job done, in terms of durability, performance and power. Also, various add-ons and physical enhancements over the years have altered the physical appearance, but haven't changed the game.
News sites shouldn't be curators of the world's data
If you're not from here, trust me when I tell you that there's been no organization worse at understanding, using, implementing or managing information technology than the Government of Guam. (It's been a long and arduous road just to get these agencies to scan their documents and then e-mail them to us, so this is a whole new barrel of monkeys we're dealing with.)
Historically over the past decade, my company has hosted various types of data (PDFs, Word documents, HTML-ized Powerpoint slide decks, etc.) in cases where an issue was of critical importance, we could get the info online faster, or our servers would be more reliable in terms of bandwidth, scalability or uptime. (Much in the same way the Starr Report was published on MSNBC.) But it's unconscionable for us to do so for every miniscule law, press release, announcement, retraction, etc. So I guess it's my fault that they never took the initiative and started hosting their own data on their own sites. Ignorance isn't an excuse here...it's lethargy.
But my problem is that I've been facing escalating and costly overage charges from my web host for the hundreds of MB of media we've been forced to archive for those agencies who should very well do for themselves. On another level, I want to be altruistic and foster the growth of other web sites and places to go. My traffic's always been phenomenal, but I want to link to external resources, too. It makes the game better by improving the ecosystem of having more than one destination to get at source information. Much to my chagrin, not every single person that could visit my site does so.
So a lot of my energy lately at meetings and talks has been working towards getting government to be cognizant of the critical importance of taking ownership for your own data. Mainstream media at any level can't be expected to be curators of the world's information - especially when having to pay extra for doing so. We report the news, we don't keep web-accessible archives of everything. Agencies have to be responsible for the integrity of their own stuff...and for making it available. In the event that a court decision, public law or other bit of information changes, it gets all out of synch. We've mastered the craft of working with data - creating it, distributing it, archiving it and sharing it - but that's for our own stuff.
Government's got to get with the program and start putting their own stuff on their own servers, too.
Wednesday, February 21, 2007
1,000 posts in 72 days
Our 20,000th story was posted on December 17, with our 21,000th story going live just now on February 21. We pulled off the volume in 72 days - the fastest interval between any two such milestones. Our publishing load is about 25 stories every 24 hours, also a regional record. That outpublishes most newspapers, and far exceeds the typical volume for small market broadcast stations our size.
It wasn't easy to attain this feat - we hired some new people, made a couple enhancements to our CMS and ramped up our publishing schedule, to the ultimate delight of our users. Onward!
Tuesday, February 20, 2007
Import data dynamically into PowerPoint slides
I e-mailed the support address and a couple of hours later a dude from Belgium wrote me back...and even included a slide deck with my stuff already imported. Whatta sales pitch.
(Ever the naive programmer, I was trying to get this done on my own, but to no avail. Wally was nice enough to point out that this could be done with VBA, but that's a headache unto itself.)
Do I really want to be Dugg?
"A roving random distributed denial of service attack before which web, network and systems administrators alike quake and have terrible nightmares"...it's both desired and feared by content managers. On several occasions I've considered adding the perfunctory "Digg this" text and/or graphical link to stories on my site, but I always refrain, knowing that the limited scope of the market we serve (we're a local news station in Guam) likely won't refer that many articles of interest, if any. I'm not going to go all out and be ridiculous like sites including the ASP.NET blogs and pile on Digg, Reddit and del.icio.us, turning a really cool and useful utility into overkill. And even several mainstream news sources have Digg well promoted, in the hopes their stories will get picked up, too.
But I digress - although I'd love for my site's stories to appear on Digg and get massive exposure...I'm scared as hell. I've got a really good host provider with ample bandwidth, but I don't know if it could take a really popular hit. I'm a believer in the oxymoron of controlled chaos. (I used the same cautiously optimistic mentality a few years back when refraining from registering my site's podcasts in the new iTunes Music Store.)
So how to most effectively combat this if you're not in possession of the type of bandwidth that the major news networks have? Don't openly invite people to Digg your site - let them do so voluntarily. Not that I don't encourage such behavior...it's cool as hell. Just take in the traffic that you'd get, but not by telling people to Digg every single story, because there are those that will.
Network effects, like most things in life, are best when taken in tastefully and sparingly.
Printable content on blogs
But the problem remains - despite vast innovations in weblogs (advanced archiving, syndication, formatting, comment systems, moderation, etc.) the basic premise of allowing content to be rendered so that it'll print properly is sorely missing. I realize I may be of the minority voice here, but I still print out my stuff.
And I'm not overlooking just commonplace bloggers, either. Mainstream media, primarily news sites, typically use the 'Print this article' link, but more and more these days are lazily only allowing such functionality to call up the client's print queue command. That's stupid, because it doesn't solve the problem of making the page more printer-friendly, which is the whole point.
What I want to see is a link that strips a normal page of all its justification, formatting, tabling, and other accoutrement - just giving me the content. If there are ads, so be it. I don't mind. I just can't stand losing out on information due to incessant formatting.
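As a rough sketch of what that link's handler could do (a hypothetical function, not something any particular site actually runs), it only needs to peel the layout markup away from the words:

```javascript
// Strip a story's HTML down to plain printable text: drop scripts,
// styles, and layout tags, keep the words (ads-as-text and all).
// A real version would preserve paragraphs and images; this is the
// bare idea only.
function makePrinterFriendly(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")  // remove script blocks
    .replace(/<style[\s\S]*?<\/style>/gi, "")    // remove style blocks
    .replace(/<[^>]+>/g, " ")                    // strip remaining tags
    .replace(/\s+/g, " ")                        // collapse whitespace
    .trim();
}
```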
I find the best example of printable content on the web to be O'Reilly (see the actual page vs. the printable format), and that cool reformatting employed by NewsHutch. It's usability I echoed on my own site. I guess the next generation of this would be a web service that would take page content and make it print-friendly, and/or dynamically format the content as a PDF, for ultimate cross-platform compatibility.
Sounds like a Web 2.0 service waiting to be invented. :-)
Monday, February 19, 2007
Why do some sites load so slowly?
Michael recently released WebWait, a slick web app to test any URL's loading speed under a variety of conditions. So using this for scientific data and the basics of common sense, consider the following:
- A local friend's blog isn't exactly a speed demon, typical of most hobbyist projects, likely because of the low-tech architecture...cheap host with limited outgoing bandwidth, CGI as opposed to FastCGI, top-heavy CSS markup, and a PHP back-end calling uncached data, forcing database operations each and every time out.
- Flickr - this is what started this whole inquisition. Granted, the site gets a slight pass because Yahoo!'s injected a ton of resources and money, and its core functionality is image processing, so it's not just text, but for months I've wondered why the wait.
- Digg's search tool - this seems to be the only thing holding back my favorite news site from being an optimal performer. Long, drawn-out waiting periods between queries gets annoying.
- Wikipedia - this is an iffy one, with some searches coming back really rapidly, and others taking eons to return my desired article. Sure, their collective online properties have broken into the Top 10 list of most-visited sites, so traffic is a major concern, but what gives?
- MSNBC.com - the news giant has no excuse to not perform. Ever since they did their most noticeable revamp and implemented .NET, performance has gone right into the toilet. And I don't get it - they're supposedly running on a platform that performs and scales better than anything else on the market, with an architecture utilizing top-notch know-how, what's assumedly a massive server farm, and crowds of developers. What's also weird is that the site apparently is pre-compressed. The ads and Flash gimmickry I could do without, truth be told.
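At least the first bullet's problem, uncached data forcing a database hit on every request, is cheap to fix. Here's a minimal time-based cache sketched in JavaScript, with `fetchFromDatabase` standing in as a hypothetical placeholder for whatever the back-end really does:

```javascript
// A minimal time-based (TTL) cache wrapped around an expensive
// lookup. Repeat requests for the same key within the TTL are
// served from memory and never touch the database.
function makeCachedLookup(fetchFromDatabase, ttlMs, now) {
  var cache = {};            // key -> { value, expires }
  now = now || Date.now;     // injectable clock, handy for testing
  return function (key) {
    var entry = cache[key];
    if (entry && entry.expires > now()) {
      return entry.value;    // cache hit: no database operation
    }
    var value = fetchFromDatabase(key);
    cache[key] = { value: value, expires: now() + ttlMs };
    return value;
  };
}
```

Even a TTL of a few seconds collapses hundreds of identical queries into one on a busy page.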
Sunday, February 18, 2007
On operating systems, platforms & communities
91% of the people in the world using computers can't be wrong. Being the most-used product in the market, legions of people knowing nothing about computers will use our stuff…and remain as ignorant throughout their experience with us. And we like it that way because they'll tell their friends to buy Windows, who'll tell their friends... We'll happily prey on such rubes, making a ton of money, leveraging blind faith and building a community of devout developers who like lemmings will follow us to the ends of the earth, and as quickly proceed to jump off the edge without question or hesitation if we tell them.
We've created a new microeconomy in forcing customers to buy high-end computers with components that 20 years ago could have been classified as a low-end mainframe. We've likewise fostered an entire ecosystem of third-party programs that our users quite honestly can't live without these days (e.g., anti-virus, spyware/malware, firewalls, anti-spam). And the end result of this small fortune that a person's invested will be that their machine performs at a (hopefully) tolerable level.
We've completely lost our once-firmly established footing in Web technologies, so we're going to capitalize on capitalism: we'll focus on selling operating systems, productivity suites, business applications and developer tools. And those that dare stand in the way of our mega-million marketing be damned.
For the time being, we'll tell you everything's great while you battle stability, security and scalability concerns…that is, until we release the next version of our software, announce ceased support for legacy versions and bemoan our earlier work as absolute crap to encourage you to buy an upgrade.
All in the name of progress.
We're better than anyone out there, but we're misunderstood. Our stuff costs more because it quite simply is the best there is, so don't complain about the price. And our formats, platforms and hardware are completely closed off to the rest of the world, so don't expect much in terms of interoperability. You're paying for exclusivity. Across the board, everything we put our name on is the epitome of quality and innovation - devices, software, services. We're totally convinced that we're all a higher order of user - developer, engineer, student, teacher, graphics designer, or information worker. So you can rest assured that when carrying around one of our devices or sporting our logo, you've ascended to a greater plane of existence technically, and even socially.
Yet we so often neglect to acknowledge that we almost went out of business on at least one occasion due to the esoteric nature of our handiwork. The right people get us. If you don't, it's your loss.
But there's always room for you at the table.
The world (and World Wide Web) would truly be ours if only we could get our act together. We're the quintessential walking contradiction in the Age of Information: we're the supposed superior younger brother of UNIX, we adhere to our staunch beliefs in openness and freedom, and we know beyond a shadow of a doubt we're fundamentally better than either of our commercial contemporaries. But by orders of magnitude we're more complex - completely disorganized and uber-political.
Our stuff runs great on older hardware, but has driver incompatibilities up the wazoo. We've been on the cusp of making a major dent with mainstream audiences for awhile, and this could be the year we crossover and hit it big on the desktop. If only we could downplay the geekiness that surrounds us. Yet the one thing we've got going for us is that Google likes us…they really like us.
So tragically, rather than try to achieve legitimacy by justifying our existence to the layman and addressing the various hardware compatibility issues (i.e., WiFi, graphics cards, laptop suspend/hibernate on-close, codecs, etc.) that keep us from overcoming a longstanding hump, we pick fights within our own community. We force campiness between supporters of our most popular desktop environments (GNOME vs. KDE), with each calling the other's complete excrement, and we label our cousins - the various offspring of the core kernel - as inferior.
How's that for circling the wagons?
Coming home to mama
Not having used anything from the 2.0 space since playing with the early alpha bits, I'm admittedly rusty. Lots of positive improvements have been made and I'm genuinely pleased that many of the suggestions I made in focus groups, in one form or another, made it into the final release.
I'm in at the office on a gorgeous Sunday afternoon (at least that's what my co-workers tell me), putting the updated Framework on my Lenovo laptop, along with Visual Web Developer Express Edition. With only a handful of people here, most of them doing paperwork and/or video editing, I've basically got an entire T-1 to myself. Keep in mind that for the majority of my career, I've been Mr. Anti-Visual Studio (particularly the first few .NET versions). I wrote Version 4.0 of my company's site in ASP.NET 1.x in 11 days by coding raw C# syntax in SciTE. So needless to say, this is a research project as much as it is a dev weekend.
Here's some highlights:
- The download was fast (one of those 2.2MB pre-installer deals), the install was not. But, the installation process was self-managed and went off without a hitch.
- VWD, even without SQL Server 2005 Express Edition and the documentation, is a monster program, assuming some 400MB of diskspace. Yikes. Still, it's free, so I won't complain.
- Because of my iffiness about using up free disk space, I opted to only have the Framework locally. The .NET 2.0 Framework, with the SDK, is a 350MB download. Assuming MSDN still synchs the API with the desktop docs I'd be downloading for local viewing, I'll take my chances over the web, thank you very much.
- "You have 30 days to register this product". For freeware? Huh?!? I did it anyway, but the freebies I was promised (free Corbis imagery, free e-book, free temporary hosting) shortly thereafter led to a 404 page. Great.
Saturday, February 17, 2007
Book review: Beginning Ubuntu Linux
by Keir Thomas, published by APress
Keir Thomas puts everything you need to get started with the distribution of Linux getting the most press these days all in one handy tome. All you'll need is a decent computer, the software, and you're set! The book's chapters - 34 in all - deal with the various aspects of installing/configuring, using and personalizing Ubuntu for your tastes.
Staying true to the political slant of most open source and Linux advocates, the book spares no expense in taking potshots at rival Microsoft, denouncing the company's operating system and Windows' implied failure to live up to its billing as a high-performance, stable, secure operating system. Ubuntu, Thomas preaches, to the rescue.
For history buffs, the book starts out with a well-chronicled backstory on the oft-misconceived beginnings of GNU/Linux, and of the free and open source software movement. The book then proceeds to do a great job of talking about the various concerns with Linux's hardware compatibility, including WiFi access, and of finding your way around the default GNOME desktop environment. Differentiations are clearly made between traditional UNIX hardware problems and Ubuntu's nature to auto-detect all but the most obscure of devices and peripherals. An equally rewarding section is Thomas's lesson on using Synaptic and RPM-based repositories to discover, download, install and update the latest software.
Also helpful for Microsoft converts is a nice introduction to working with the UNIX terminal, specifically Ubuntu's use of BASH, and of basics of using the command-line interface for interacting with Linux. Cron and crontab are dealt with sparingly, but nicely.
There are also several very helpful chapters on multimedia. And one of the book's many redeeming sections is evident in the excellent discussions on using Wine, remote terminals, and virtualization to access external resources, or run other Linux or Windows applications from within Ubuntu. It's a part of the book that's very well-written and nicely laid out. And any user will appreciate the healthy section on using OpenOffice, and then the individual chapters laying out basic features and functionality of each of the applications with that productivity suite.
The live CD also ensures you can start playing with the OS right away, for those of you not wanting to download the 700MB ISO image (although the version on CD is likely one or more generations behind the current stable distro).
I found the book to have only a couple of minor shortcomings, namely a surprising lack of a chapter on gaming with Linux, which for a growing population is a major draw. It would also have been beneficial to go into more detail about shell scripting, beyond just simple (sub)directory work and setting up basic cron jobs. Also, I find that the one area keeping the book from attaining absolute perfection would be incorporating a Mac-centric viewpoint for people coming over from Apple environments, or looking to bridge Linux with that platform.
But those aside, the book is otherwise a one-stop shopping gem - everything you'd need to get up and running in a matter of hours.
Friday, February 16, 2007
ESPN starts daily web-only newscast...sounds familiar
My station's been doing this, too, for several months. And to great success.
Django book's release delayed until June
While it's a bummer that I won't be able to have something bound and tangible in hand until the mid-part of the year, it's still revolutionary that Jacob and Adrian are putting the chapters online for unofficial review.
Thursday, February 15, 2007
Yahoo!'s new suggestion board: W-E-A-K
Cory Bergman from LostRemote broke the news about the expected backlash, but it's gotten much more vulgar in the few minutes since.
I had a memory question and shot their support address a note, and System76's tech support staff were very friendly, helpful, knowledgeable and fast. They had me at hello.
Wednesday, February 14, 2007
...and the Grammy goes to...Slayer
p.s. And am I the only guy to realize that it's already been 21 years since "Reign in Blood"? That still blows away most of what's come out since. The most brutal 38 minutes ever put on tape.
News site statistics for the New Web
This is a sticky wicket, indeed. If I aggregated all the traffic my site gets from our page requests via the traditional World Wide Web, web service calls, desktop widgets, WAP pages, AJAX-based polling (for tag clouds, etc.), RSS feeds being pinged, Flash Video clips being clicked on, podcatcher clients checking feeds for updates, remote embedded videos being played back on our servers and other services, I could easily pad my stats and add 1.5 million more "page views" per week. But for clarity, we plainly separate our traffic for clients: what's really generated via our web site through traditional means, and then what comes through enhanced features, for the full effect.
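The separation we report to clients amounts to little more than bucketing each logged request by the channel it came through; a toy sketch (the channel names and field names are made up for illustration):

```javascript
// Roll a request log up into the two buckets reported to clients:
// classic page views vs. everything else (feeds, widgets, AJAX
// polls, video hits, etc.). Channel labels here are hypothetical.
var TRADITIONAL = { web: true };

function summarizeTraffic(requests) {
  var summary = { traditional: 0, enhanced: 0 };
  requests.forEach(function (r) {
    if (TRADITIONAL[r.channel]) summary.traditional++;
    else summary.enhanced++;
  });
  return summary;
}
```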
Conversely, I have a friend in the biz who cut over a major part of his site to Flash. He was shocked to learn that data transfer surged with his web host, but his page views dropped by 90%.
When doing presentations about how we do news online at KUAM.com, I stress to people that we in the content industry measure metrics differently. Sure, we may get several hundred thousand unique visitors, but each visitor requests perhaps 8-9 pages per session, and typically exhibits several sessions per day. And services like AJAX and Flash and certain types of caching may impact stats, rendering them in a less-than-impressive format, but we ensure clients our stuff's being seen more and more in a greater variety of formats. This they like hearing.
Bottom line: site stats ain't what they used to be.
Saturday, February 10, 2007
IE7, thou art mine enemy
Amy shares this grief. I'm praying I'll not have to return to the days of "This site looks best using...". And this was supposed to be my "easy" weekend.
RSS aggregator as archive manager
- how many stories appear on our homepage (our publishing load)
- how long the stories appear on our homepage (historical availability)
For purposes of remaining relevant and not being stale, most stories get primetime exposure for a half-day or so, and are then demoted to the lower half for easy access, before they're essentially lost and have to be queried through date-specific search tools I've developed, or found through search engines, Google News or other means. The same philosophy applies when publishing our data via RSS - everything is mirrored, of course, between our web-based and syndicated data, but the items that appear in the feed are those that have been published within a 24-hour cycle.
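That 24-hour feed window is basically a one-line filter over the story archive; sketched here in JavaScript with a hypothetical `publishedAt` timestamp field:

```javascript
// Select the feed's item list: every story published within the
// last 24 hours, mirroring what the homepage carries. `stories`
// and `publishedAt` (epoch milliseconds) are illustrative names.
var DAY_MS = 24 * 60 * 60 * 1000;

function itemsForFeed(stories, now) {
  return stories.filter(function (s) {
    return now - s.publishedAt <= DAY_MS;
  });
}
```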
Now back to my point. I found today that after not checking my station's main RSS feed for a few weeks in Google Reader, that the app does a wonderful job of keeping past items cached and accessible. I was able to navigate backwards for an entire month, giving me instant access to more than 500 articles we've published. That's really cool.
I hated to do it...but CAPTCHAs are in!
Deploying the extra validation step immediately eliminated 100% of the bad comments. Whew!
Sorry to those that this bothered, even if temporarily, because we caught most of the bad posts before our users saw the junk. If you're a dev, consider this list when rolling your own anti-spam solution.
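If you do roll your own, the validation step itself is trivial; the hard part is generating a challenge that bots can't read. A bare sketch of just the check (hypothetical, not our production code):

```javascript
// Compare the user's typed answer against the challenge answer the
// server stashed for the session. Normalizing case and whitespace
// keeps legitimate commenters from being rejected on a technicality.
function validateCaptcha(expected, submitted) {
  if (typeof submitted !== "string") return false;
  return submitted.trim().toLowerCase() === expected.trim().toLowerCase();
}
```

Reject the comment post outright when this returns false, before it ever touches the database.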
Friday, February 09, 2007
Ubuntu, Linspire/Freespire partner up
Monday, February 05, 2007
Back behind the podcast mic...sort of
It should make for a nice time, and it'll be nice to lay down some MP3 audio for syndication again.
'Canes rule Super Bowl 41
Subscribe to Posts [Atom]