10 2 / 2013
"Because images sent through the application self-destruct seconds after they are opened, Snapchat is being embraced as an antidote to a world where nearly every feeling, celebration and life moment is captured to be shared, logged, liked, commented on, stored, searched and sold."
- JENNA WORTHAM
A Growing App Lets You See It, Then You Don’t, New York Times, February 8, 2013
12 7 / 2012
Tracking your bike ride to work using GPS or snapping a photo of each meal you eat and uploading it to a blog are just a couple of the wonders technology has made possible - and they’re so common that we tend not to even think about how the data trails they create affect us.
Is it possible that these self-tracking practices are a modernized version of diary keeping? An external vantage point from which to understand ourselves? A way to align ourselves more fully with machines? Or a way to exert our identity amidst a relentless flow of data?
CBC Radio journalist Nora Young explores these questions in her book The Virtual Self: How Our Digital Lives Are Altering the World Around Us.
Young provides a well-researched and thought-provoking look into the emerging world of self-tracking. She grounds the discussion in past practices such as Benjamin Franklin’s notebook, in which he rated his success in meeting “virtues” such as temperance, frugality, sincerity and justice - though only one at a time. Or, earlier, the “spiritual diaries” that were popular in 17th- and 18th-century England and France, in which people strove for an objective record of their spiritual lives in order to bring the self closer to God.
Seen in these terms, the goal of self-tracking can be thought of - at its best - as a way to stay on track to meet goals, providing gold stars along the way. At its worst, it’s a reminder that every missed workout, night of binge TV watching and, indeed, sin is being observed through the panoptic lens of the smartphone.
Young explains that self-tracking can be a record of change that helps in “documenting the self into being.” For instance, a teenager will document her thoughts in a private journal and read them later to see a record of her emotional states as she grows into adulthood. “It is a way of creating and reinforcing a self that has substance, a history, and, most importantly, physicality,” she writes.
Indeed, while the digital world threatens to make our physical bodies obsolete (or, for the techno-utopians, to free us from our fleshy prisons), self-tracking, in essence, re-asserts the self in terms - data, maps, infographics - that allow the body and the self to take up space in the digital world.
As a side note, observers saw a different but similar reaction to the increasingly virtual world of the ‘90s: a return to physicality written on bodies themselves in the extreme body modifications of tattoos, piercings and scarification. Marshall McLuhan called this “the discarnate effect,” and Mark Kingwell describes it in his book Dreams of Millennium: “When flesh is rendered virtual and personal integration is shaken, the body becomes a natural site of cultural resistance, perhaps the last meaningful place to resist electronic encroachment.”
Now, however, the capability to “hack” the body through smartphone-assisted workout routines, and the mind through endurance reading groups, is widespread, resulting in a myriad of new modifications.
Of course, it would be nice if we could stop the discussion at the reassertion of individuality amidst a tidal wave of data, but Young presses on, addressing the problem that someone striving to carve out an identity in the digital world inevitably feeds into a stream of information used by states and companies. While many would argue that self-tracking should be personal, aggregated data (such as Canada’s long-form census) provides crucial information about the conditions in which people live - information that could be put to good use, perhaps, as easily as it is used to target an ad or generate a sale. While the surface goal of self-tracking is to gain knowledge about the self, what emerges in the aggregate is knowledge about society.
Perhaps the most troublesome aspect of self-tracking, Young asserts, is that the self cannot really be tracked. There’s simply no metric for personality, nor for a host of other traits.
But self-trackers have found ways to navigate this divide.
One self-tracker Young interviewed said he used tracking merely as a game through which he could get closer to introspection. Realizing that some parts of him can be computed and others cannot, he said that the fact that he’s listening to cues is what matters. And isn’t listening to ourselves - not just to data and statistics - what creates the foundation for a self and the potential for community?
24 2 / 2012
"Your personal privacy should not be the cost of using mobile apps, but all too often it is…. By ensuring that mobile apps have privacy policies, we create more transparency and give mobile users more informed control over who accesses their personal information and how it is used."
20 2 / 2012
"Talking about memes and viral media places an emphasis on the replication of the original idea, which fails to consider the everyday reality of communication — that ideas get transformed, repurposed, or distorted as they pass from hand to hand, a process which has been accelerated as we move into network culture. Arguably, those ideas which survive are those which can be most easily appropriated and reworked by a range of different communities."
06 2 / 2012
"I don’t ask that every program on the air be an exercise in reality, but I would like to see other shows do more, to talk about what we are, where we should be going, and what we lack."
28 1 / 2012
The Tiny Book of Tiny Stories: Volume 1 is an experiment in literature. Published by HarperCollins in December, the collection comprises 67 crowdsourced stories — selected from more than 8,500. Each Twitter-sized story has an accompanying illustration, and each has been carefully curated for the collection.
Good curation (or what publishers would call editing) is surely nothing new, but the content was generated through hitRECord, a collaborative platform for creative content generation. This platform, which essentially lets people edit and re-edit other peoples’ work, is surprising in that it produces relatively high quality work — like Wikipedia for artists.
While Tiny Book and other collaborative projects lack the single, unified vision of an artist or author, the book serves as an interesting example of unconventional publishing and a testament to the ever-expanding boundaries of literature.
23 1 / 2012
LLC sprang up last year in Toronto as a non-profit organization to help women learn Web development. Haley Mlotek wrote about her experience attending an LLC workshop for Toronto Standard, explaining that the group promotes equality, yet its organizers don’t equate its purpose with that of feminism.
Upon first reading the story, I thought the question of whether the group identifies as “feminist” was tangential, but then again, isn’t this clearly the subtext of what’s actually going on? This is an organization that, contrary to the usual co-ed classrooms, creates an environment for women to learn something thought to be the domain of men.
The article identifies problems with Web development’s current culture: that women are intimidated by tech speak, that there are few women at conferences, and that women should not be excluded from the creation of websites and applications.
Given these hurdles, maybe LLC is being too weak in its stance. But then again, because the term “feminism” can have various meanings, I can see how LLC may not want to commit to everything it could imply. For instance, it provides a safe environment for women to learn Web development, but it stops short of breaking down the conference room doors at WWDC.
21 1 / 2012
In a subject that’s obsessively concerned with the future, it takes a certain amount of dedication to dig into its past. Indeed, the rapid movement of technology often shrewdly makes yesterday’s (or, God forbid, last week’s) analysis a dated artifact. Charles Jonscher’s WiredLife: Who are we in the digital age?, published in the heady days of 1999 — before the burst of the tech bubble and amidst trepidation over the Millennium bug — provides a perspective that’s simply different from our current one.
The title question, “Who are we in the digital age?”, is one that we’ve been wrestling with for decades. In contrast to broader questions about how technology has changed the structure of society, as in the case of the Agricultural Revolution or the Industrial Revolution, Jonscher helpfully limits his discussion to just the digital. Suffice it to say, it’s quite difficult to pick apart how the digital age may be changing “who we are”, and our proximity to it largely blinds us to these ways.
Jonscher draws a distinction between how humans and computers think — mainly, that computers don’t think. Indeed, as exciting as the prospect of a computer checkmating the world’s best chess player may be, the sort of “combinatorial” analysis needed to play a good game of chess pales in comparison to the intelligence needed to react to situations in real life, where the combinations are infinite.
There are structural limitations, Jonscher says, to the ability of machines to think, and to our ability to think like machines. It’s easy to forget that complex machines are made up of simple logic gates that open or close depending on the voltage applied to them. While there have been experiments in AI, neural networks, expert systems, and genetic algorithms, the dominant paradigm of the digital age is operations coded into 1s and 0s, and this language is largely incapable of thinking in a human way.
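To make the gate-level picture concrete, here is a minimal sketch (my own illustration, not from Jonscher’s book) of how the other basic gates can be composed from NAND alone — the sense in which simple gates suffice to build up all of digital logic:

```python
# Each "gate" is just a boolean function; a NAND gate outputs False
# only when both inputs are True.
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# NOT is a NAND gate with both inputs tied together.
def not_(a: bool) -> bool:
    return nand(a, a)

# AND is NAND followed by NOT.
def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

# OR follows from De Morgan's law: a OR b == NOT(NOT a AND NOT b).
def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

# Truth table for OR, built purely from NAND gates.
for a in (False, True):
    for b in (False, True):
        print(a, b, or_(a, b))
```

Everything a computer does is, at bottom, enormous compositions of functions like these — which is exactly why its “language” is so far removed from human thought.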
In the early days of computing there was excitement over the invention of the transistor and the microchip, and over computing pioneer Alan Turing’s assertion that any logical problem could be solved using simple gates. However, some of this excitement — at least in academic circles — was quelled by Kurt Gödel’s incompleteness theorem. Gödel demonstrated a weakness not of computing but of logic itself: logical premises won’t always lead to all conclusions, nor are all logical problems computable. Take, for instance, Bertrand Russell’s barber paradox: “There is a barber in a village who shaves all (and only) the men in the village who do not shave themselves.” The question of whether the barber shaves himself makes the statement nonsense.
What’s with all this computer bashing? Computers, after all, outperform humans in the amount of data they can process in a given time. Well, not all functions are created equal. Computers are able to process vast amounts of data, but it becomes harder for them to match humans as they move up the hierarchy from data to information to knowledge. Indeed, fears that machines will replace lawyers and other heavily analytical professionals are, at least for now, quite unfounded.
Indeed, most computer scientists gave up on the dream of a thinking machine by the end of the ‘80s. And in the fallout of the unrealized artificial intelligence dream, people embraced computers as a way to break down the barriers of human communication. “The idea of creating machines to compete with our mental powers has been quietly superseded by the development of networks for us to communicate with each other,” Jonscher writes.
In helping us understand who we are in the digital age, Jonscher shows that machines are merely a medium through which we communicate. This leaves some larger questions unanswered. For instance, the digital age bombards us with more information and competing viewpoints than ever before.
Jonscher’s stance is that human hardware takes much longer to rewire than computer hardware, and that we are, more or less, the same as we’ve always been — we’re just better equipped with information and technology. This argument, in my opinion, falls short given the psychological and social effects that we can’t even begin to imagine. While we may have the same brain structure as the generation before, it remains to be seen what gains and disorders come from the relentless bombardment — not of technology — but of humans.
Of course, given that sophisticated computer systems are created and programmed by people, they can be thought of as a testament to the human mind. And this should give us hope that the human mind will prevail even if machines cannot be taught to think for us.