Thursday, April 26, 2007

GPL 3.0 and web services

I've been pondering the impact of the GNU General Public License on web services lately, so here are some cool links I found that help describe the topic.

FSF legal counsel Eben Moglen has a great talk on FLOSS Weekly (podcast downloadable here), in which he talks about GPL 3.0's impact on web services, and whether developers are required to disclose their logic when using remote data sources over the wire.

From another description (HTML version here):
Many companies use software that is subject to the GPL to provide services via the web. This has frustrated many at the FSF and in the open source community because it essentially means that a number of changes and enhancements to Linux and other programs subject to the GPL are created and used as web applications, but never shared with the community at large (which many believe is the very purpose of the GPL). There is a movement within the open source community and apparently in the FSF to close this loophole and require that the provision of web services be accompanied by changes to the source code allowing the provision of those services. There are multiple ways that this issue might be addressed in GPL 3.0, some of which might be acceptable and some not.
There's an equally intriguing post here on GPL 3.0 and SAAS:
In the second draft, this [software as a service] was addressed in section 7(b)4, which would've allowed licensors to optionally add a requirement to their code so that source would remain available even when the software was running as a network service. The language we proposed got the job done, but it was not an elegant solution. Proponents of the requirement...supported the goal but were unsatisfied with the execution.

Monday, April 16, 2007

New record: 1,000 posts in 53 days

We've set a new record. Again. I previously blogged about how we reached a new milestone for ourselves by publishing 1,000 news articles in 72 days. Well, we not only eclipsed that mark...we shattered it. This morning we completed another batch of 1,000 stories, only 53 days after publishing the last batch. This is remarkable, seeing as how our news team is fairly small and we produce all our own content (no correspondents or seasonal contributors).

This is significant, seeing as how it took us nearly a full year of operation to make our first 1,000 stories available. We now do it in well under 2 months - effectively having increased our productivity 600%.

Management likes hearing that. :-)

Sunday, April 15, 2007

Grid-based design gets the nod

Here's a fantastic piece with a ton of helpful resources about using grid design for content-intense sites. This should be a wake-up call to ALL sites that would consider themselves distributors of mainstream news. I've never been a fan of the blog-like UI for mainstream news sites...basing a site's layout on a reverse-chronological stack of articles, where new items push older ones out of immediate accessibility, is a lazy, piss-poor way to present current news.

The major precept is that you've got to be working with a LOT of content, or at least be willing to fill columns and/or rows with imagery or headers. It basically promotes the use of a logical column design, akin to traditional newspapers.

More importantly, it also discusses the inevitable side effect of doing grid-based design: extended vertical pages. This is something I've battled since Day 1.

This should be required reading for those getting into the design business for high-end content sites.

Monday, April 09, 2007

Do Apple mice produce their own form of CTS?

I've got a dull ache in my right wrist today...which intrigues me, seeing as how I spent the majority of the day on my dad's Mac. The usual low-grade carpal tunnel syndrome I get from time to time after using my Windows laptop or Linux box is replaced by a tingly sensation that's more central to my arm, right in the top & middle of the wrist and not off to the side like that which is produced by heavy sessions on 2-button, scrollwheel mice.

So I'm wondering: do Apple mice produce their own traceable form of CTS?

Thursday, April 05, 2007

Mainstream news sites using ping servers

I'm mulling this morning how to maximize awareness of a news site's content across the Web. Specifically, I'm thinking of ways to get my site's stuff exposed as soon as we publish it, almost at the rate that blogs do. I admittedly don't know a lot of in-depth info about ping servers, but I'm starting to look into it.

Most major sites that have been around for a few months will get automatically indexed by Google's main search service, and legitimate, credible news sources will be conditionally included in Google News for indexing there, too. Both are quite fast. Weblogs also rely on pinging ping servers, which notify search services whenever their content changes. This can give blogs a leg-up in being recognized by blog search services like Technorati, Google Blog Search and IceRocket. Mainstream media (MSM) news sites are those whose content changes constantly, so we get listed pretty fast, but we can often get scooped (in terms of recognition and indexing by major crawlers) by such directories and services. I'm wondering if it might help to send XML-RPC pings to the known servers, too.
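For the curious, the ping itself is pretty simple: it's a one-method XML-RPC call, weblogUpdates.ping, carrying your site's name and URL. Here's a minimal sketch of what that would look like; the server address and site details are just placeholders for illustration, not anything we actually use:

```python
# Sketch of a weblogUpdates.ping call. PING_SERVER and the site details
# below are hypothetical examples, not production values.
import xmlrpc.client

PING_SERVER = "http://rpc.pingomatic.com/"  # example aggregator endpoint

def build_ping_payload(site_name, site_url):
    """Build the XML-RPC request body for a standard weblogUpdates.ping."""
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

def send_ping(site_name, site_url, server=PING_SERVER):
    """Send the ping; the server replies with a struct along the lines of
    {'flerror': False, 'message': 'Thanks for the ping.'}."""
    proxy = xmlrpc.client.ServerProxy(server)
    return proxy.weblogUpdates.ping(site_name, site_url)
```

In practice you'd fire send_ping() from the publishing workflow right after a story goes live, so the directories hear about the article at the same moment readers can.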

SEO is pretty much content-agnostic. Blogs and MSM sites alike get indexed and listed by the search engine community all the time. But arguably, ping servers surface new blog content with greater frequency. That's kinda what Google News does.

Might this benefit us in the mainstream? Thoughts, anyone?

Tuesday, April 03, 2007

What happens when mashups are mashed-up?

Playing the role of the contrarian, I'm wondering what the Web will look like when people start hacking sites which are themselves hacks. You know, someone scraping mashup pages that themselves scrape third-party sources. This might introduce the need for some pretty clever error handling patterns in terms of managing downtime and service outages.
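To make the error-handling point concrete, here's a rough sketch of the kind of defensive fetch a mashup-of-a-mashup might need: retry the upstream source a few times, then fall back to the last known-good response so one outage doesn't cascade down the whole chain. All names here are illustrative, not from any real mashup:

```python
# Defensive fetch for stacked mashups: retry, then serve stale cached data
# rather than propagate an upstream outage. Purely illustrative.
import time

_cache = {}  # last known-good response per source URL

def fetch_with_fallback(url, fetch, retries=3, delay=1.0):
    """Try fetch(url) up to `retries` times; on total failure, serve the
    cached copy (stale data beats a broken page)."""
    for attempt in range(retries):
        try:
            data = fetch(url)
            _cache[url] = data          # remember the last good result
            return data
        except Exception:
            if attempt < retries - 1:
                time.sleep(delay)       # brief pause before retrying
    if url in _cache:
        return _cache[url]              # degrade gracefully to stale data
    raise RuntimeError(url + " is down and no cached copy exists")
```

The interesting design question is how many layers deep the staleness compounds: if the mashup you scrape is itself serving cached data, your "fresh" fetch may already be an hour old.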


Emerging market? Desktop RIA mashups

One thing I'm thinking about this morning is how to capitalize on the mashup market with desktop RIAs. With Apollo, WPF/E and Dekoh simplifying development of desktop tools, how is this going to play out in the RIA space, with data access away from the browser?

Web service APIs and RSS/Atom feeds are going to have to work extra hard serving a new and additional generation of calling clients, maybe even more so with the need to communicate asynchronously with devices and apps. Not much has really come out of mashups within the mobile space, so I'm wondering how the development community will embrace the browserless platform as an environment for working with remote data.

Anyone have any thoughts on this? Perhaps we'll see offline versions of CraigsList and
