Posts Tagged ‘aggregation’

I wasn’t going to post anything on Michael Kinsley’s post about a Felix Salmon article on the New York Observer; the Kinsley piece focuses on whether the quality of writing on the Web matters. But I keep talking to people about it. At least five people in the past 24 hours. So it seems worth pausing to pose this question: whether or not Kinsley is serious (I’m pretty sure he’s joking, but don’t ask me to put money on it), might the point of the following sentence be true?

“Never did it occur to me, until I read Felix’s blog post, that it might be possible, without seeming insane, to argue that all aspects of good writing — accuracy, logic, spelling, graceful turns of phrase, wisdom and insight, puns (only good ones), punctuation, proper grammar and syntax (and what’s the difference between those two again?) — are all overrated.” (And yes, it says “all aspects … are all overrated.” Move on.)

You can read Salmon’s piece here; it may help with the details if you don’t get exactly what Salmon means by this:

“When you’re working online, more is more. If you have the cojones to throw up everything, more or less regardless of quality, you’ll be rewarded for it — even the bad posts get some traffic, and it’s impossible ex ante to know which posts are going to end up getting massive pageviews. The less you worry about quality control at the low end, the more opportunities you get to print stories which will be shared or searched for or just hit some kind of nerve.”

So the question raised here — again, whether or not Kinsley is serious — is how close is this to being correct? Undoubtedly, quality control in the media universe Salmon describes is lacking, but that quality is transitory anyway, as is the audience. If you as a publication are largely reliable, does it matter if you carry writers who really stink? In the online world, the Washington Post’s columnists and Cagle’s can appear side by side under the same set of links, and how many online readers really notice — or care — that the Post’s are better edited and cleaner? I don’t have answers to that yet.

Read Full Post »

John Robinson fooled me. He started a post about the need for innovation with questions that seemed geared to curmudgeonly, 20th-century answers. For instance:

What would you do if:
* Half of your employees — including those in circulation — don’t subscribe?
* Half of your employees — including those in the newsroom — don’t read the paper (except for their own stories)?
* Half of your employees don’t subscribe to your e-newsletters?

I worked up a good, frothy dudgeon and was thinking to myself, “What has happened to John since he left newspapers that he is taking such a troglodyte approach?” — and then I got to the end of his post. So, spoiler alert: he was not writing in inverted-pyramid style. It was more like pyramid style. The end held the answers to my questions.

The “troglodyte” approach would be to require employees to subscribe and read (maybe quiz them, to test whether they really read), but, as John writes, a better idea is to ask your employees why: Why don’t they subscribe? Why don’t they read? If the only thing they read is the stories that carry their byline, then the only thing they care about is what was changed between writing and publishing, which means they don’t care about the content. If the reporters don’t care, why should anyone else? Ask them that. Ask what they SHOULD be writing about to make people read.

Related to this, Peter Osnos had an article in The Atlantic resurrecting the idea that aggregators should pay for the news they aggregate. That ignores the fact that no one pays the aggregators except advertisers, and advertising at current rates is not a source of revenue that would sustain news organizations. Paying for aggregation is an idea that traditional journalists love, but if most news organizations put up hard paywalls, almost all aggregators would simply stop looking and aggregating — just as most people do not subscribe.

Get to the basics: whether your site is free, metered or behind a hard paywall, it’s important to ask what people will pay for and what will make them keep coming back. The same things that make your site worth aggregating are the things that make someone consider subscribing, so whichever model you choose, you end up at the same capitalist question: Is it worth it?

And you can’t change what people want to read. Among the gathering evidence: a Washington Post story.

Read Full Post »

The folks at insidenova.com, the website of the News & Messenger in Manassas and Prince William County, Va., stumbled into an excellent example of how to respond to what you see happening locally in social media. After severe flooding in the region last week, people found themselves without a clearinghouse for information and discussion — but they gravitated to the insidenova Facebook page and were filling it with just such information. So, seeing that, interim managing editor Kari Pugh created a flood information clearinghouse page on Facebook. In just a few hours it had garnered about 250 “likes,” and the community discussion on it was mostly self-sustaining. The community is doing the organizing and exchange of information, but the news organization has facilitated that and put itself at the hub of the conversation.

Read Full Post »

Joplin before and after
Jot this idea down in case a disaster ever levels your city: use Google Street View to get a “before” scene of any place in town. The above is from Joplin, Mo. (pros take note: the “after” photo is by a citizen journalist).

Read Full Post »

cat news
All right, so maybe it won’t be robot editors who take our jobs. Instead it might be the readers themselves who become their own news editors, creating their own personalized news product every day, or as often as they want news. That’s the vision of Ben Huh, creator of the Cheezburger network of sites, best known for funny photos of cats with funnier captions. As described in a Seattle Times article, it doesn’t sound like a radical departure from the direction the Web is already going:

“His plan is to create an open-source platform that people could use to be ‘amateur editors,’ designing and managing their own blend of online news sources and advertising. If there’s enough interest he’d like to develop it as a public tool like blogging platform WordPress.org.

“The end product sounds like a portal creation tool along the lines of Netvibes.com, a site that lets users customize a personal home page with widgets and news feeds.”

5/24/2011 UPDATE: ReadWriteWeb has further details, including a wireframe mockup of what Huh has in mind.

Read Full Post »

(Originally posted on Feb. 25, 2011)
Allbritton Communications unceremoniously demoted TBD.com to the status of glorified E! channel this week. If you remember all the way back to last year, when some people (like me) had high hopes for TBD as a model for local news online, read CJR’s interview with Jim Brady, who stepped down from leading TBD late last year, when it must have become obvious that Allbritton intended to decapitate it. One thing that is true is that TBD’s model — aggregating news throughout the community, whether from partners or from competitors — was a success, at least as measured by traffic. In January, just five months after its debut, it attracted 1.5 million unique visitors, nearly double its December total of 838,000 and far surpassing November’s 715,000, according to internal figures. Over the past three months, TBD’s traffic was substantially higher than that of the websites operated by local TV stations WRC (Channel 4), WUSA (Channel 9) and WTTG (Channel 5), according to Compete.com.

“I’d even go so far to say that that model is, for a local news site, sort of indisputable. The debate over whether you work with people in your community, or whether you just say, ‘Here’s our website, and here’s all the stuff we produced today and that’s it,’ I think that has to be over. Newspapers had that power because they had the power of distribution. But on the web, people are going to go to all different sites, and so if you can be that place that connects people to good content that they’re interested in regardless of source, then you’re going to be the place they start their day. And on the web, that’s how you win: you have to be in somebody’s short list of sites they always go to. People would say, ‘Why are you linking off-site? You’re driving people away from your site!’ But what’s the counter-argument to that, that if you never link off-site, then people will never leave your website?

“I mean, they’re going to leave your website anyway, whether it’s to go check their e-mail or go to TMZ.com or whatever. So the concept that you’re losing people by doing that, is actually the opposite of what’s actually happening — which is that you’re building loyalty by performing the role you’re supposed to perform, which is to be a conduit for useful information.”

Read Full Post »

(Originally posted on Nov. 15, 2010)
SaveTheNews.org has an interesting look behind an experimental hyperlocal news-aggregation site in Boulder, Colo., Slices of Boulder. Steve Outing, who oversees the site, describes it this way: “It’s curation, and aggregation, and intelligent semantic filtering and processing, and text mining, and personalization offered down to a micro level.” Aggregation seems a natural addition to mainstream news sites, but so far not many seem to be doing it. As Outing says:

“If local news organizations are to survive and be relevant, they must learn to curate and aggregate links to the best local content being produced online for their communities. If they don’t take this on, someone else will.”

Read Full Post »

(Originally posted on Nov. 11, 2010)
Poynter.org has a good post today on ways to get people to contribute good content to your site. But I have a beef with the title, because many people are going to look at “content” and think the tips apply only to getting people to send in stories, photos, video, etc. They apply to everything, from in-person conversations and interviews to simple comments on stories or Facebook updates; it’s just trickier online. (Ironically, tip No. 1 is to avoid using the term “user-generated content,” which I’d broaden to avoiding the word “content” as much as possible, though it can’t always be avoided.)

Read Full Post »

(Originally posted on Nov. 10, 2010)
Here’s some long-ish reading that’s well worth the time: a piece by Alan Rusbridger, the editor in chief of The Guardian, on the value that linking and collaboration bring to journalism. You may be unfamiliar with The Guardian because it’s in England, but it’s a leader in the use of new media tools in service of Big J journalism. The post linked above includes several examples of that.

Here’s the important underlying philosophy of the approach:

“Openness is shorthand for the way in which the vast majority of information is, and will continue to be, part of a larger network, only a tiny proportion of which is created by journalists. Information may not want to be free, but it does want to be linked. It’s difficult to think of any information in the modern world which doesn’t acquire more meaning, power, richness, context, substance and impact by being intelligently linked to other information.

“Collaboration refers to the way we can take this openness one stage further. By collaborating with this vast network of linked information — and those who are generating and sharing it — we can be infinitely more powerful than if we believe we have to generate it all ourselves.”

One thing I would differ with Rusbridger on: He describes himself in the post as a utopian based on his own embrace of the changes and experimentation going on in journalism. I’d call him a realist.

Read Full Post »

(Originally posted on Oct. 20, 2010)
The Knight Foundation announced a second round of traditional media outlets (three newspapers, one public radio station) partnering with hyperlocal sites, essentially aggregating headlines from these small sites. In a post titled “Collaboration is the new competition,” Jan Schaffer of J-Lab gives details of how round one went. There’s no monolithic model in it. Some of the partnerships called for links back and forth; some allowed the traditional media partners to republish material from the hyperlocal partners.

Expect to see a lot more of this kind of thing. The resources traditional newsrooms have lost in recent years seem unlikely to return, certainly not soon, given the continuing sluggishness in the advertising market. These kinds of partnerships can help fill the voids that the past years’ cuts have left. No, it won’t be the same. But if you pick your partners carefully, as The Daily Progress has done with Charlottesville Tomorrow — a local nonprofit group that focuses on development and planning issues — then what you get will help your site become the hub where people come first to find reliable local headlines.

10/22 UPDATE: One of the newspapers that participated in round one of the hyperlocal partnerships, The Seattle Times, won the Innovator of the Year award from APME, in part because of that partnership.

Read Full Post »
