May 27, 2009

A New Search Engine for an Old Problem

Posted in Science & Technology at 11:25 am by Maggie Clark

Yes, this is about Wolfram|Alpha. For those of you who’ve heard nothing of this search engine yet, let me answer your first question upfront: Wolfram|Alpha shouldn’t be compared to Google; they’re apples and oranges in the world of internet data-trawling.

What, then, is Wolfram|Alpha, and why on earth would it be useful when we already have Google? I’d usually tell people to go look for themselves: from the main page, for instance, it’s clearly identified as a “computational search engine.” But what does that mean? Doesn’t Google already use algorithms for its searches? And though the About page provides a little more insight, it still stymied a few people I’ve already introduced to the website. Such confusion isn’t surprising, either, when you take a good look at how expansive the language is:

Wolfram|Alpha’s long-term goal is to make all systematic knowledge immediately computable and accessible to everyone. We aim to collect and curate all objective data; implement every known model, method, and algorithm; and make it possible to compute whatever can be computed about anything. Our goal is to build on the achievements of science and other systematizations of knowledge to provide a single source that can be relied on by everyone for definitive answers to factual queries.

I have to smile at this kind of language: it reminds me very much of my own writing, which though intending to convey a lot of information, might be considered so complexly worded as to limit, instead of enhancing, general knowledge about the topic at hand. (I’m working on it!)

And, alack, there is no Simple Wikipedia entry to explain this new site in layman’s terms. Even Wolfram|Alpha itself, though designed in part for comparative queries, lists only a few rudimentary details when tasked to explain what makes it different from Google.

So. If you’ll permit the blind to lead the blind, here’s Wolfram|Alpha in a nutshell:

  • It does not search web pages. You will not get top hits. You will not get related searches. At present, while the system learns, even misspelling something will give you limited returns.
  • It provides, instead, listings — singular or comparative. If you want to see which of two buildings is taller, or what gravity is on Jupiter, or the basic facts about lithium, Wolfram|Alpha is for you.
  • It is, in other words, just the facts. No blog commentary. No video response. No forums or wikis in sight. Pulling from hard data sources, Wolfram|Alpha provides the basics about anything that can be quantified and computed, in whatever ways are available for said thing to be computed. Truly, anything: Here’s the entry for god.
  • Pursuant to this, Wolfram|Alpha can respond to questions that have concrete, fact-based answers. For instance, it can answer with relative ease “Why is the sky blue?”, “What is the gestation period of a human?”, and “What is ten times the surface area of the moon?” And for those of you wondering if this means Wolfram|Alpha doesn’t know the meaning of life, think again.
  • So what on earth is this good for? In an age of sprawling participatory encyclopedias, interactive learning through participation on internet forums, and a whole slew of multimedia ventures — to say nothing of Google itself, which commingles basic search functionality with meta-searching, specialized searches (books, shopping, blogs, news), interactive maps and more — do we really need a website that provides us with “just the facts”?

    Heck. Yes.

    It may come as a surprise, but there are still a great many websites engaged in the whole search engine struggle for survival. There’s Google, of course, and Yahoo, but also Cuil.com — an underdog created by a former Googler who grew dissatisfied with how big the company had grown. (I, personally, have trouble believing anyone would give up access to their incredible catering services.)

    Cuil.com is said to accumulate and store information more efficiently, by linking and dropping similar subject hits on the same computer, so future search bots can find the bulk of their search results in one location. It also has a different layout, prioritizing the presentation of more content from each search result on the search results page. The comments on this Slashdot entry, however, match my own feelings of being underwhelmed by the quality of its search results.

    And then there are niche market tools like Regator.com, which searches the internet for blog posts relating to any topic in question, and Google minus Google, a filter for web searchers tired of seeing Google subsidiary sites (YouTube, Blogger, Knol, etc) prioritized in their searches. Further amendments, like filtering out Wikipedia from search results, are also present — which in my opinion is a nice touch.

    In short, a great many websites are geared towards making the vast stores of information on the internet as accessible as possible by ranking other websites on the basis of information quality and relevance. In doing so, however, the very definitions of quality and relevance have changed dramatically from what they meant years earlier, in the heyday of Altavista and AskJeeves. How could they not, when, far from just being a tool for education, the internet exploded into the complex social realm it now is?

    So now, perhaps, the most accurate information about a subject is not foremost on a search list about it: now, perhaps, it has been supplanted by the most popular website other people visited in relation to that topic. And the most relevant information about another topic might easily be displaced by the most popular piece of entertainment riffing off its theme. And, of course, there remains the question of Wikipedia: Should it come first in Google searches; is it always the most accurate response to whatever query you may have typed in; are more accurate responses buried farther down the list?

    While there is no discounting the incredible developments we’ve made in expanding internet functionality — day in and day out adding to the human element of online operations — it also cannot be denied that there will always be a need for straight answers, too. Think of Wolfram|Alpha as a reminder, then, that for all our dallying in the online realm there still exists a real world — a concrete place with numerous quantifiable attributes just waiting to be described.

    A world that will always await us, should we ever go offline.
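    As a purely illustrative aside, the kind of computation behind a query like “What is ten times the surface area of the moon?” is easy to reproduce by hand. The sketch below assumes the Moon’s mean radius (roughly 1,737.4 km) and the standard sphere surface-area formula; Wolfram|Alpha’s actual curated-data pipeline is, of course, far more elaborate.

```python
import math

# A back-of-the-envelope version of the query "ten times the surface
# area of the moon". Assumed input: the Moon's mean radius, ~1737.4 km.
MOON_RADIUS_KM = 1737.4

surface_area_km2 = 4 * math.pi * MOON_RADIUS_KM ** 2  # sphere: 4*pi*r^2
answer_km2 = 10 * surface_area_km2

print(f"{answer_km2:.3e} km^2")  # roughly 3.8e8 km^2
```

    The point of the comparison is only that such a query has a single computable answer — which is exactly the niche the engine is built for.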


    May 20, 2009

    Participatory Government Online: Not a Pipe Dream

    Posted in Business & technology, Global discourse, Public discourse at 8:13 am by Maggie Clark

    In an undergrad political science course a few years back, I recall being challenged to present explanations for public apathy in Canadian politics. Out of a class of some thirty students, I was the only one to argue that there wasn’t apathy — that low voter turnout among youth was readily offset, for instance, by far higher youth turnout in rallies, discussion forums, and the like. Youth were absolutely talking politics: they just weren’t applying this talk in the strictest of official senses.

    My professor always did love such counterarguments, but my classmates never seemed to buy them. Rather, many argued that the “fact” of disengagement was not only accurate, but also healthier, because it meant that only those who “actually cared” about policy would set it. (We were working, at the time, with figures like only 2 percent of the Canadian population being card-carrying party members.) Many of these same students likewise believed that economics was not only the ultimate driving force in our culture, but also the only driving force that could lead; and also that true democracy was unwise because only a select few (I could only assume they counted themselves among this number) were able to govern wisely.

    At the time, Facebook was two years old. YouTube was one. And the online landscape, though unfurling at a mile a minute, was still light years from its present levels of group interaction. My sources for the presentation in 2006 were therefore an uncertain medley of old and new media: news articles and statistics; online party forums and Green Party doctrine.

    I didn’t have at my disposal, for instance, incredible videos like Us Now, a documentary encapsulating the many ways in which average citizens — seeing truly accessible means of interacting on a collective level with their environment — are achieving great success breaking down the representative government model to something much more one-on-one.

    Nor did I have The Point, which provides anyone with an account and an idea the means to start a campaign, co-ordinate fundraising, organize group activities, and otherwise influence public change. (Really, check it out — it’s fantastic.)

    And most regrettably of all, I didn’t have the Globe and Mail‘s Policy Wiki.
    This last, I just discovered yesterday on BoingBoing.net, when they noticed the Globe and Mail’s newest project on the website: The creation of a collectively developed copyright law proposal, to be sent to Ottawa for their consideration on July 1, 2009.

    As a huge policy geek, and a member of the new media generation to boot, I saw this as a goldmine of opportunity — and yet there is plenty else on the website for other policy development, too: discussion forums and wiki projects alike. So of course, in my excitement, I sent the link to a few members of the old generation — only to receive a curious collection of responses, dismissing the above as an exercise in anarchy, while simultaneously criticizing old-school committees as never accomplishing anything properly.

    Well, old guard, which is it? Is our present model of representative government failing us in certain regards, and should we thus try to engage different policy-building models? Or is the model that, despite early challenges to its legitimacy, created an online encyclopedia as powerful as the Encyclopedia Britannica unfit, by its very nature as an open-source community project, for political consideration?

    Us Now makes the point that the internet’s promise of a more dynamic and accessible global community has had many false starts (spam, scams, and the proliferation of child pornography rings come personally to mind). But long before we became cynical of the internet’s capacity to improve our social impact, we as a society were already well used to doubting the potential of our fellow citizens to act intelligently and in the pursuit of the communal good. You can thank Machiavelli’s The Prince, Elias Canetti’s Crowds and Power, and bastardized readings of Adam Smith’s The Wealth of Nations in part for this.

    A little while ago, however, I got around to reading John Ralston Saul’s The Unconscious Civilization, a CBC Massey Lecture Series essay collection about the rise of the management class and the utter inversion of the democracy/free market equation, to the extent that the notion of democracy itself has suffered massive political distortion. Written just before the first real explosion of online communal projects — be they open source software, open-access socio-political groups, or information-dissemination tools — what Saul wasn’t able to account for in his work was the balancing force of technology itself. Rather, when he wrote these essays, technology was still very much a cornerstone of continued economic distortions in lieu of real democracy. Now, though, it’s clear that technology created through the corporate model has itself emerged as a platform for participatory government — and thus also as the undoing of those same, hierarchical economic forces. Coming full circle is fun!

    So, to get back to this matter of “trusting in the intelligence of individuals, and their capacity to act in the common good,” yes, there is a lot of circumstantial evidence to the contrary on the internet. Heaven knows, for instance, that the low-brow interactions which inspired CollegeHumor.com’s We Didn’t Start The Flame War are in fact a daily, persistent reality online, and make up a substantial percentage of commentary therein.

    Yet any parent will tell you that the way to raise a responsible child is to give her responsibilities to live up to; a child entrusted with none will invariably continue to act like one. So rather than using, as a test of our group potential online, those sites that in no way engender a sense of responsibility for our actions, why not look at those sites that do — like ThePoint.com, and the Globe and Mail Policy Wiki?

    Furthermore, if our current model of representative government no longer yields the level of public engagement we crave (read: in the ways the government wants to see), maybe it’s because citizens at large haven’t been given the opportunity to feel like real participants at all levels of the democratic process. And maybe, just maybe, the internet not only can change that perception, but already is.

    After all, those same students who, in the comfort of a political science classroom just three years back, so boldly proclaimed that collective decision making was a waste of time? You’ll find every last one on Facebook and LinkedIn today.

    May 18, 2009

    To Pay or Not To Pay: The Internet’s Most Intricate Crisis

    Posted in Business & technology at 10:24 am by Maggie Clark

    Within two months of Last.fm, a music streaming service, signing a partnership with four major record labels, Amazon.com saw a 119 percent increase in online music sales. Through an ad-based revenue model Last.fm was able to offer free access to a database of songs numbering in the millions, and to group them into “stations” wherein your tastes would yield similar artists or songs in that vein. The catch was that after three iterations of one song, Last.fm would display an advertisement directing listeners to affiliate partners selling the tune. All in all, it was a sweet deal: We got free music, the big labels got paid, the small labels got exposure, and contrary to popular wisdom about downloaders detracting from music profits, online sales were through the roof.
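    The ad trigger described above amounts to little more than a per-track play counter. Here is a minimal sketch of the idea — the three-play threshold comes from the description above, but the class name and the ad text are purely illustrative inventions, not Last.fm’s real system.

```python
from collections import defaultdict

# Hypothetical sketch of an "ad every third play" revenue trigger.
AD_THRESHOLD = 3  # plays of one track before an affiliate ad is shown

class AdScheduler:
    def __init__(self):
        self.play_counts = defaultdict(int)  # track -> number of plays

    def register_play(self, track):
        """Count one play; return an ad message on every third play."""
        self.play_counts[track] += 1
        if self.play_counts[track] % AD_THRESHOLD == 0:
            return f"Ad: buy '{track}' from an affiliate partner"
        return None

scheduler = AdScheduler()
for _ in range(3):
    msg = scheduler.register_play("Song A")
print(msg)  # the third play triggers the affiliate ad
```

    The elegance of the model was exactly this simplicity: the listener pays nothing, and the counter quietly converts repeated listens into referral sales.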

    So, of course, Last.fm switched to a subscription model on April 22, 2009: now international users have to pay “three” every month — three euros, three dollars: whatever is regionally appropriate. And honestly? This makes tremendous business sense: Last.fm has to pay for every track you listen to from a major label, and when it can’t negotiate adequate terms of payment with a label, sometimes that label just pulls out.

    Nonetheless, as part of the Napster generation I can’t help but note how, the more things change online, the more they’ve ultimately stayed the same. From Napster to Pandora to Muxtape to Seeqpod and, of course, a slew of others, the introduction of free big-label music under any number of guises has always, invariably ended in a curtailing of services (at best), or else a complete redirection of the site’s aims and/or bankruptcy.

    Notice anything funny there? Take a look at how this cycle begins: With the desire to give something away for free. Not to make a profit on it; just to scrape by — and only when profit margins drop deep into the red, to impose fees on the consumers. Yeah, you might say, it’s easy not to try to make money on something you didn’t create (the music). But… if history serves us well, it’s not. People just don’t pass up the opportunity to exploit the work of others for their own profit. So how is it that models like the ones listed above ever existed in the first place?

    The answer perhaps lies in our generation’s unique conditioning: if as individuals we still demanded that our own creative output be viewable solely through a pay system (as Amazon is proposing in blog subscriptions for Kindle), we’d be hypocrites to demand free content from others. But growth on the internet has proven instead too nuanced for such hypocrisy: while some services have always tried to charge for content, the blogosphere, YouTube, GoogleVideo, MySpace, DeviantArt, Flickr, news aggregates, and other such websites have always run on a free viewing model. In short, by now we’re more than used to posting a piece of writing, a photo, a video, or a song online and expecting nothing monetary from it. Art and entertainment have entered into a free-for-all creation domain, and while this doesn’t mean we don’t still hold in high regard those artists and entertainers who dedicate the whole of their lives to such work, it certainly means we have different expectations for our engagement with them.

    As such, the story of those aforementioned music services means just what it seems to mean: That our first push out into the world of the internet is just as likely to be in the pursuit of free access as it is to be about exploitation — and thus, that we as consumers can forever expect to find ourselves latching on to free content, taking it for granted, and having subsequent power plays or business models then wrest that freedom away. A cry of foul will emerge, we’ll flood a comments page with angry protests… and then most of us will clear off, find a new free music service, and repeat.

    Rest assured, this isn’t as hard to stomach as it sounds: we’re already quite used to learning to pay for goods we’d always taken for granted — how else can you explain bottled tap water? But the story of free music is a fast-paced tale that also speaks volumes about deeper, more complex payment issues at work on the internet.

    Because while the struggle for survival of music streaming services caters to our more immediate fears about The Man, there is a longer, more drawn-out battle being waged in turn for the whole of the internet. Yes, I’m talking about the attempts of Internet Service Providers to make heavy internet users pay more, or to divest the whole medium of its equal playing field by allowing some companies to pay for prioritized access, effectively shutting small companies and websites out of the mass market. Or what about Bell Canada, which last year found an ally in the CRTC when the Canadian Association of Internet Providers complained that Bell was “throttling” access for peer-to-peer applications — a direct challenge to net neutrality? When the CRTC sided with Bell in the case, they likewise permitted, and set precedent for, the legality of an ISP interfering with an individual’s use of the service he’s paid for, through “traffic-shaping.”

    And then, of course, there is the anti-piracy bill passed by the French National Assembly on May 12, 2009: anyone caught downloading or sharing copyrighted files three times can now be suspended from the internet for two months to a year on that third notice. Chillingly, the law would not require a trial or court order: All the ISPs need do is send you your warnings, making this a huge win for corporate control of the medium.

    This, then, is the real conflict of the internet — an on-going negotiation being fought in a much more protracted, expansive way than any music streaming service need fear: but a negotiation, nonetheless, that will shape the future of the internet for us and those to come.

    For now we take our freedoms and equality online for granted — just as we do our free music moment by moment. The question is, if the lesson of music streaming services has taught us anything, what can we really say about how free or equal the internet as a whole will be just ten years down the line?

    And what, right now, can we do about it?

    March 22, 2009

    The Only Thing To Fear Is Fear Itself

    Posted in Media overviews at 12:08 pm by Maggie Clark

    The Seattle Post-Intelligencer, a 146-year-old publication, printed its last issue on March 17; moving forward, it will be online-only. The San Francisco Chronicle was similarly threatened last week; it hangs on with the possibility of becoming a non-profit organization, or merging with other regional news outlets.

    Though friends send me links about the most striking losses in the print and broadcast journalism fields, the truth is that I’ve been following job losses, mergers, bankruptcies, and budget cuts for three years now. Anyone with any sustained interest in journalism has done the same for at least as long. The first post on this blog, written a year ago, especially notes the lengthy articles written in various publications over the past few years about the future of journalism — though many will say the discussion began in the ’90s, or earlier, and they’re likely right: I just haven’t been in the field that long yet myself.

    In any case, it’s not a new debate, and that’s precisely what I feel many people don’t realize when they weigh in. Especially striking is how conversation on this topic is framed around these most prominent losses, such that my friends’ questions become “What are you going to do in journalism [in light of this]” or “What do you think the future of journalism will be [in light of this]?” These are good questions, but troubling ones, because they guide the answers as reactions to these events — when really, the answers at this point should stand incident-independent.

    But before I get to these specific answers I should note that those outside the realm of journalism are not the only ones framing their questions and concerns in direct response to these monolithic collapses. Rather, quite a few disgruntled journalism majors and members of the industry are weighing in, too — to tell everyone how they’re “getting out.” In the case of journalism majors, boy, let me tell you: the reaction of many in this regard does nothing to temper my dislike for their programs; rather, these petulant grads make it clear just how many acquire their degrees with an unhealthy dose of entitlement — to jobs, to stability, to automatic repute in the journalism world. The journalism greats of old did not learn journalism in classrooms; they came from other degree programs, or else no university or college education at all, and in either case plucked up enough courage to engage their city publications and start as low-level reporters, working their way up to renown.

    This lack of drive can be felt in the newsrooms just as much as in the classroom: members of the traditional media corps are also jumping ship to other careers. While the financial imperative is understandable, especially among those with families to support, less so is their blind endorsement of others doing the same. I would, for instance, kill to hear someone say “I don’t have the means to pursue the profession in this changing environment — but I wish all the best to those who stick with it anyway.” But of course, to say something like this, one would have to grasp a basic, underlying tenet of this whole transformation: specifically, that its existence is the one fact we can and should rely upon.

    Yes, journalism is changing. Is the human desire for information about the world we live in changing, too? No. Not at all. So there will always be a need for news — and with it, people to acquire, distribute, and analyze this news. As such, our questions as journalists are the same as they’ve ever been:

    1) What is the quality of current news reporting?
    2) What can we do to improve or maintain this quality of reporting?
    3) What areas of our world are under-reported?
    4) How can we address these lapses?

    Some commentators are so mired in questions one and three — their fears about what is being lost in the midst of this dramatic upheaval — that they can’t progress to questions two and four. Frankly, this is more troubling than the answers to one and three themselves. Yes, the quality of current news reporting is greatly diminished by newsroom and foreign bureau cuts, to say nothing of explicit job losses. Yes, huge lapses in the quality of investigative journalism, both at home and abroad, are already being noted by journalism organizations. All right, we get it.

    But as two notable bloggers, Clay Shirky and Steven Berlin Johnson, both note in very lengthy but exceptionally potent essays, the fixation on these problems as signs of The End of Things is narrow-minded and fear-based.

    What we exist in is a time of transition, a time in which new vehicles for reporting will rise up to supplant the old. This means that, as the old forms of media are diminished (I hesitate to suggest they will ever fully disappear: I find that doubtful, myself), gaps in coverage, and sponsorship for this coverage, necessarily emerge. Johnson likens this void to the expectant nature of computer magazine readers in the mid-to-late eighties: the potential for a huge new data stream was there, but as yet unformed; and though many a reader would eagerly await the next issue of, say, MacWorld, surely none could fathom the modern-day equivalent: an excess of Apple news in print and daily online. Shirky emphasizes this point in his own essay: We cannot see the shape of things to come, but in the meantime, the only thing to fear is fear itself.

    And so, like the realm of computer information in the eighties, so too is the world of journalism now increasingly an expectant void — with many current lapses noted in response to the first and third questions, and few concrete, tried-and-true answers to these lapses in the second and fourth. And yet the need for answers to these questions persists — as does humanity’s overarching, all-consuming need for more information about the world we live in.

    The moment that unrelenting inquisitiveness disappears, then we can talk about the sky falling in Journalism Land. Until then, it’s only our own pride and complacency that need to be checked: For journalists today the task is not to moor ourselves to any one vehicle for the acquisition, distribution, or analysis of news that matters: rather, it’s to stay adaptable, keep learning, maintain humility, and engage the changing media landscape with an open mind and a loyal heart.

    And any who can’t manage this (for reasons other than their need to attend to lives in their care) probably weren’t pursuing journalism for the right reasons in the first place — so to them I say, thank you. Thank you for getting the hell out.

    May 11, 2008

    The need for more, and better, mainstream media criticism

    Posted in Media overviews at 10:21 pm by Maggie Clark

    “You know what the worst part of that is? It’s not that the speeches have gotten better; it’s [that] media criticism isn’t as good as it used to be.”

    — Robert Schlesinger, guest author, The Daily Show

    Within the last two years, the Columbia Journalism Review, the Ryerson Review of Journalism, Adbusters, the Tyee, the New Yorker, and the Walrus have written extensively about the challenges facing contemporary media in its on-going bid at maintaining relevance as both political watch-dog and central arbiter of social discourse. Newsroom cutbacks, expansive media monopolies, weak protectionist policies from the government, social pressures from extremist interest groups, and the advent of New Media (and with it, a rapidly transforming revenue structure) are all aspects of a journalism culture that is presently tasked with re-branding itself without ready access to all the resources such an effort requires.

    And indeed, many feel this lack of resources is ultimately to blame for the deficit of effective media criticism at crucial North American turning points in the last fifteen years, but one could just as easily argue — and I would — that a lack of effective media criticism in and of itself marked the industry as “ripe for the picking” by corporations increasingly unfamiliar with journalism’s non-entertainment responsibilities. To elaborate on that reversal, though, I should first deliberate a little on what constitutes “good journalism.”

    To that end, consider a recent Globe & Mail article, which notified readers of the paper’s dominance at the 2008 National Newspaper Awards. One online respondent commented: “take it easy globe, you’re faaaaaar from perfect.” But is perfection even a reasonable aim for journalism? When by its very nature news media is tested every single day, with every single news report it issues, it can’t be: stories necessarily develop over time, new facts regularly emerge to supplant the old, and self-correcting mechanisms are an intrinsic part of the process, thereby confirming the necessary incompleteness of any one day’s product, no matter how thoroughly researched or reasonably presented. No, there is no resting on one’s laurels in an organization constantly tasked with proving itself anew, and so the measure of good media has to be based more on its commitment to that process itself. How tireless is it? How well does it resist complacency, revisit entrenched internal biases, question assumptions, and respond to outside criticism? Good journalism is fallible; but good journalism also knows how fallible it is, and strives very hard to account for subsequent lapses. And when good journalists internalize this state of constant questioning, this aversion to complacency, they can fight even the most aggressive of pressures to the contrary.

    In 2001, for instance, CanWest Global Communications tried to impose a national editorial in its constituent papers — the same editorial, written at CanWest headquarters, for papers all across Canada. Its inclusion would be mandatory, and while local op-ed pieces would still be accepted, they were not allowed to contradict the opinions expressed in the corporate editorial. In the name of maintaining an open forum for public debate, reporters and editors resisted: they went on a byline strike and raised public awareness — especially when a spate of CanWest firings were tied to similar attempts at curtailing different opinions and approaches to the news (with criticism of the Liberal Party and pro-Palestinian comments proving especially dangerous for CanWest staff).

    The CanWest corporation embodies a series of on-going problems for Canadian journalists, but at least where corporate editorials are concerned, journalists can — for the moment — claim victory: CanWest dropped that intended policy the moment public pressure became too much. But here, too, there is no such thing as a “perfect” victory: the freedom of the press, as the fourth pillar of democracy, must be tested and affirmed on a regular and rigorous basis. This is where media criticism comes in — journalism’s answer to the ancient question, “Who will watch the watchers?”

    I can’t say for certain that media organizations would have suffered fewer newsroom cutbacks, or that corporate owners wouldn’t have interfered as much with their editorial decisions, if there had been a more entrenched culture of media criticism in the early 1990s. But to have someone keeping tabs on other organizations, and teaching readers to keep tabs too — this, to me, is a crucial part of journalism’s internal, self-correcting mechanisms, and one I hope very much to participate in throughout my life.

    It is also one that has flourished, oddly enough, in its own absence. When mainstream publications proved unable to provide this public service, the public — settling very easily, and very prominently, into the age of New Media — began supplying this service on their own. Now, in 2008, we see military blogs about the wars in Afghanistan and Iraq rivaling information released through standard channels; the Huffington Post and the Drudge Report heading a broad spectrum of “Second Gen” blog aggregate sites (ones which, unlike Digg or Reddit, have an editorial team setting the front page content); and the Talking Points Memo especially empowering citizens by showing how public pressure can, in fact, improve political accountability.

    Whether or not journalists within mainstream publications are ready, the realm of discourse has broadened, and readers today are far from their passive cousins of yesteryear. To this end, the role of traditional journalism is still changing — still being “re-branded” — but not in any way that really lies outside of its original precepts. Journalism has always been something taken day-by-day — something that requires regular adaptation, and constant self-correction. And so long as Canadian journalists are willing to adapt to the new demands and needs of our population — and especially to acknowledge and make up for the lack of entrenched media criticism within journalism’s walls — we’ll never be perfect, but at least we’ll be far more likely never to forget that fact.