From aggravation to aggregation

Like it or not, more and more people are reading what we write in newsreaders, or through links from Google or their home pages.
I am already writing with aggregators in mind:

– My titles make more sense than they used to, and my first line gives a better description of what the blog post will be about, since it may be the only thing most people ever see. Users now have more control over how they trawl the great hypertexted database of information. It is less about how we serve it up to them and more about how they want to access it. The use of aggregators also means I won't be getting as many hits on my site, since there is less reason to come all the way over here. Aggregation means the world is tapping the database with a filter, not a funnel, and all of us are changing the way we create content with that in mind.
Joshua Porter, in his article Home Alone? How Content Aggregators Change Navigation and Control of Content, sees two types of aggregators:
“Search engines are the most common type of machine aggregators. They send out spiders to crawl the Web and index pages, and allow users to submit queries to them. Big search engines such as Yahoo! and Google attempt to aggregate the entire Web, while more specialized services such as Blogdex aggregate only a certain subset of the Web—those containing blogs.
Blogs themselves, however, are examples of human-aggregated content because a human makes an explicit choice about what content to include.”
Joshua Porter


Andrew Jones launched his first internet space in 1997 and has been teaching on related issues for the past 20 years. He travels all the time but lives between Wellington, San Francisco and a hobbit home in Prague.


  • Dan says:

    Andrew I have MY YAHOO set up with you on it. Whenever you update your site it tells me so I can get the latest and greatest “ANDREW JONES REMARKS.”

  • I use aggregators from time to time – BlogLines, NetNewsWire etc – but ultimately end up going back to looking at my favourite blogs directly.
    The immediate notification that a new entry has been posted is great but often you don’t get an idea of how much feedback or interest a post generated without looking at comments and trackbacks.
    It also leads one to live “in the moment”, so interesting posts that do generate reflection over the course of a week drop off the radar almost immediately.
    And you lose all the other information and links that others include around the main content. For some that may mean readers never see the books they find helpful, or links to other non-blogged documents and people, and for some their Google ads, Amazon links and PayPal buttons.
    Rather than do the two-step aggregator->browser I’ve gone back mostly to tabbed-collections of sites in Safari and Firefox.

  • Bald Man says:

    I use NetNewsWire Lite almost exclusively to read blogs. I'm too cheap/poor to pay for Internet access at home, so I log on at work or via a free hotspot, update my aggregator, skim the headlines, pull the interesting ones up in my browser, and head home to read the details later. It's a bit of a hassle should I want to follow a link, but I manage.
    One feature I would like is the ability to follow comments via an aggregator. Comments should be a standard part of the RSS feed. When a comment is posted, the feed updates and the aggregator flags it as new.

  • Andrew says:

    good call.
    My own comments come back to me on Google and i can edit them (i never do!!!!) or add to them with one click.
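The comment-following feature Bald Man asks for is essentially what per-post comment feeds later provided: the blog exposes an RSS feed of comments (some engines advertise it with the wfw:commentRss extension), and the aggregator re-polls it, flagging any item whose GUID it hasn't seen before. A minimal sketch of that flagging step, assuming a plain RSS 2.0 comment feed (the sample feed and the new_comments helper are illustrative, not from any particular aggregator):

```python
import xml.etree.ElementTree as ET

def new_comments(feed_xml, seen_guids):
    """Return (guid, title) pairs for feed items not seen before.

    seen_guids is the aggregator's memory of already-read comments;
    it is updated in place so the next poll skips these items.
    """
    root = ET.fromstring(feed_xml)
    fresh = []
    for item in root.iter("item"):
        guid = item.findtext("guid") or item.findtext("link")
        if guid and guid not in seen_guids:
            fresh.append((guid, item.findtext("title")))
            seen_guids.add(guid)
    return fresh

# A tiny stand-in for a per-post comment feed.
FEED = """<rss version="2.0"><channel>
  <title>Comments on: From aggravation to aggregation</title>
  <item><title>Comment by Dan</title><guid>c1</guid></item>
  <item><title>Comment by Bald Man</title><guid>c2</guid></item>
</channel></rss>"""

seen = {"c1"}  # the aggregator has already shown Dan's comment
print(new_comments(FEED, seen))  # only c2 is flagged as new
```

The aggregator side is just this loop run on a timer per subscribed post; the blog side only has to regenerate the feed whenever a comment is posted.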
