The Digital Diet Starts Now

As readers of this blog will know, I am often drawn to wondering about the effects of technology on how we think, and I worry about how easily distracted I have become. Well, ’tis the New Year, and it’s traditionally the time of year when we try to change our ways and develop better habits, so rather than just think about these things, I’m trying to do something about them.

Whilst many people are running off to gyms and binning the chocolates in reaction to their Christmas-time consumption, my problem, info-gluttony, requires a similar solution. I need a diet.

And as with trying to tackle the waistline, the problem is not so much how much you consume as what you consume. I think it’s the same deal with info-dieting. Neither food nor information is inherently bad; in fact, you need them both. But deciding what to take in is important. I will be looking at quantity too.

So, some steps I’m taking:

1. Reviewing RSS subscriptions / Twitter following / Facebook Pages
This is tackling head-on the ‘quality’ of my information in-flow. I need to do an audit of what my various tentacles are absorbing. Which blogs am I just not reading? How many titles do I consistently skim over in my RSS feed? Are there Twitter accounts that I get nothing from? They have to go.

2. Narrowing my focus
I have a wide range of interests which I like to keep up with by consuming a lot of media about them. Beyond these I find myself encountering random topics daily. None of this is bad, but I wonder how much I am really absorbing by spreading myself so thin. I need to consider limiting the scope of my consumption, to really drill down into topics and develop a deep understanding of them. Of course, there has to be room for expansion and for encountering things outside of my comfort zone, but I need to try, to some degree, to focus on my core interests. This will be tough, as I like to engage with many different areas.

3. Less multi-tasking
This may seem counter-intuitive, but it goes hand-in-hand with trying to develop deep-thinking skills. Really what it amounts to is this: when I’m consuming something, I focus on that. I have developed a terrible habit of jumping in and out of a text or piece of media whilst reading or watching. This scatter-shot approach to reading is known to hinder proper understanding, and is both a cause and a symptom of the distraction I encounter online. So, from now on, when I’m reading an article, no matter how long, I stick with it until it’s finished. Or, if I find my mind wandering, I evaluate whether or not it’s worth continuing. In the past I have soldiered on, while at the same time jumping to Twitter, instant messaging windows and the like. The result was that I physically finished the piece, but how much did I actually take in? Would it have been better to cut my losses and abandon it? I’ve already begun using a series of tools (such as Instapaper and an iPad) to help streamline and focus my online reading. I hope to extend this to even the smallest piece of online media.

I do think there is a place for using a medium such as Twitter to add a live, social layer to television watching, and I will still do this, but more discerningly. When I’m watching a film, the feeds are going down.

These are just a few ideas I’m sketching down to get me started. I used to be really skeptical of the whole “New Year’s Resolution” idea, but I can see how it can be used as a good point at which to review and change. Let’s see how I get on.

As We May Link

I’ve talked about this before, but it keeps popping up in my thought process. In discussing the cognitive effects of using the Internet, Nicholas Carr argues that the very nature of the medium, its hyperlinked text spreading off into a web of connected documents, is causing our mental processes to deteriorate. Citing neuroscience research, he argues that people who read text filled with embedded links end up “comprehending less than those who read traditional linear text”.

Whilst Carr tackles this from an end-user point of view, today I thought about it from the point of view of content providers. I was reading an article in The Guardian about the Disney-built town of Celebration and the recent troubles it’s been having. In the second paragraph, the article makes an intriguing reference to a recent ‘brutal murder’ and subsequent suicide and police shootout. The article then moves on, but tantalizingly offers you an avenue to explore the event further by way of the ubiquitous hyperlink.

Site Graph of Etsy.com
Photo owned by Noah Sussman (cc)

This mode of writing for the web is now the default. We are living in the age of hypertext, a concept first envisioned by Vannevar Bush in his 1945 article “As We May Think” and later given its name by Ted Nelson. A lot of blogging and online journalism is peppered with links inviting us off to other sources of information. This division of attention, Carr posits, is responsible for degrading our ability to absorb long pieces of text and to develop a deep understanding of them.

It occurred to me: what am I supposed to do when I come across that link? Do I read it now, or come back to it later if I want to? Will it illuminate what I am reading, or do I need to read it in order to get a full understanding of what is being said? In fairness to The Guardian writer, the piece later explains and expands on the violent incidents alluded to earlier, but the dilemma remains. Do I go off and read more about it now? Or wait? At the time I encountered the hyperlink I had no idea that the writer would return to describe these intriguing incidents in greater detail.

I could journey off down the hyperlink and return, or, like a rambling Billy Connolly stand-up story, I may never come back to my point of origin. What if I find another hyperlink on the next page? Might I journey down a digital rabbit-hole of never-ending links?

When this piece runs in the newspaper, it will not have these links. The piece holds up without them; the writer has done a fine job weaving the tale. But the default thinking is that when a story appears online it should fire tentacles off in every direction, dragging you away from the text. There are worse examples out there too (I’m guilty of this myself) where the visible text of a hyperlink is incredibly vague, giving you no real clue as to what you are being led to. This gives you less scope to evaluate whether you need to click it, which somehow makes it all the more enticing.

It seems to me that there is something about the very nature of hypertext, the fact that it sits within the body of text, indeed as the very text you read, that makes it more distracting. This is its gift and its curse. The alternative could be ‘footnotes’ at the end of an article that invite you to explore more after you have finished the main body of text. Wikipedia, probably one of the greatest single hypertext resources out there, uses this method for citations, but also uses in-body hyperlinks. This is, of course, one of the very great things about Wikipedia, and it’s a great example of just how powerful hypertext can be. But what is the net effect? Do we just skim through the material, feverishly clicking one link after another?

Hypertext is at the core of how the Web works, and we are no doubt richer for having such a vast, interlinked body of knowledge at our disposal. But I sometimes wonder if we link simply because we can. And is the result to the overall detriment of our reading experience?

Is your computer a thing of beauty or a disposable tool?

This week Google unveiled its new ChromeOS operating system in more detail, specifically showing off some of the machines that will run it. As an OS which is essentially little more than a browser, it very much revolves around storing information in the cloud.

To demonstrate this they released a promo video in which a user’s information is repeatedly saved from destruction by virtue of the fact that it resides ‘in the cloud’. What is not saved, however, is a plethora of laptops that are destroyed in increasingly violent and extreme ways.

When Gina Trapani linked to this video recently she drew a comparison between Google’s vision of disposable computers and Apple’s slavish devotion to making beautiful machines. Although Google, in fairness, were trying to make an entertaining point about cloud computing rather than a philosophical statement about the disposable nature of computers, the video still seems to revel in the idea of the machines themselves being secondary. Trapani’s comparison with Apple rings very much true.

One thing Apple-bashers often shout about is how Apple computers are ‘style-over-substance’ devices; that they are just ‘pretty’ boxes and nothing more. Well, you know what? Aesthetics are important.

Humanity is drawn to create and be surrounded by beautiful objects. It may be an evolutionary trait. Simply utilitarian design is not enough; look at the blight on the landscape left behind by the brutally utilitarian buildings of the 60s and 70s. It amazes me when I go into PC World and look at the Windows laptops on sale. Many of them are ugly, plastic, cheap-looking things with a horrible build and tactile feel. This goes beyond mere visual aesthetics; if you are to use a machine well, it should feel good to the touch. Many of these machines look and feel terrible.

This also goes beyond aesthetic sensitivities. Research suggests that aesthetically pleasing interfaces are perceived as more usable. And, more obviously, using a cheap, plasticky keyboard can be frustrating and limiting.

This is to say nothing of the Mac’s OS X operating system, which is where these debates usually rest. If, for whatever reason, I were to find myself wanting to use another OS, I would have to look far and wide to find a machine that physically matches an Apple Mac, certainly in the laptop department.

Google seem to want a world where the physical machine is secondary. I think this is a mistake. The things we use matter.

P.S. Engadget say these ChromeOS machines have no USB support. Isn’t that the thing that makes the iPad so horrible and unusable? 😉

Say what you see, see what you say

Damien Mulley has a really interesting post this morning about how people are using multimedia devices and the net to capture, record and share their experiences. We’re seeing it more and more: a breaking news story or simply an interesting observation can be captured and shared like never before. Terms like “citizen journalism” float around, whilst Sky News now ask us to supply amateur footage.

In Damien’s piece he points out that this level of almost ubiquitous connection can be a positive thing:

Maybe the positive with these tools is we are becoming more observational of our surroundings at times, because of these tools

This jumped out at me, because it is at odds with conventional, mainstream wisdom. I wrote only days ago about my fears that ever-present WiFi was helping grow the phenomenon of having to be constantly connected. The way I thought about it, this was a disconnection from ‘reality’. When we tune in, we drop out of our immediate surroundings. I also thought of Chris Ware’s haunting New Yorker cover, showing a neighbourhood of zombie-like figures gazing intently at their mobile devices. I think we all see how many people now walk around with their attention not on their surroundings but on a tiny screen relaying information beamed in from elsewhere.

But Damien’s take is refreshing. Could it be that with the ability to share our experiences we become more aware of them? With the boom in amateur photography, thanks to cheap but impressive technology, are we seeing our environment in a new light? More and more people are taking and sharing their photos, and they are not just the obligatory ‘group shot in the pub’. Someone I was talking to on Twitter recently berated a new mini video camera for not having WiFi, because content could not be immediately shared. It occurred to me then that we are moving towards a state where it is not enough simply to capture the moment; we want to share it too, and that is seen as equally important.

Damien acknowledges that these tools also disconnect us, which is what I alluded to in my previous post. If we are just refreshing our inbox, or reacting to a digital stream (in a closed loop), we are less aware of our environment. These are the charges commonly made against these technologies, and I also sometimes struggle with the idea of people constantly reporting on their surroundings on Twitter. I do indulge in it myself during a football game or a live TV event, and it can be really rewarding to join in a massive conversation. I also appreciate it when someone can report back with insights from a conference I cannot make (or a football match I cannot see). But I sometimes wonder why people do it at other times. I see people talking about what they are doing in a pub or at a party. I’ve sent the obligatory Hipster-cam photo of my pint from time to time, but I find these places are for other types of communication (neither is better or worse than the other, just different, and both have their time in our lives). It also brings to mind a Tommy Tiernan show I once saw, where he berated a woman for taking a picture of him mid-performance and alluded to people living their lives through a viewscreen.

But it’s really interesting to think that these devices could increase our connection with the real world, not decrease it. As we increasingly live in a world where we report and react in real time, will we hone our senses to what is around us and effectively do the opposite of what many tech-bashers predict?

One to think about.

Another apt Ware cover:

Wifi versus Lofi

Wifi is becoming increasingly ubiquitous in public. It started in cafes, then spread to hotels and pubs, then onto modes of transportation (which, I think most people would admit, elicited a bit of “this is the future, now do I get my jetpack?” excitement), and now can be found in barber shops, garden centers and the sides of cliffs. It has reached the point where it is almost expected in any kind of place where people can sit for more than three minutes. On Twitter people regularly champion those places that offer free Wifi and berate those cave-dwelling troglodytes who don’t. If they are somewhere they expect Wifi and it isn’t working, they often vent fury at the ineptitude of the providers. (This of course was wonderfully lampooned by Louis CK in a now infamous appearance on Conan, where he describes a fellow passenger on an airplane who, having learned the Wifi connection was down, remarked “This is bullshit”, to which Louis offered amazement that people could become so irate at the lack of a service they had just discovered existed…)

This desire for public Wifi is perfectly natural, and I myself have long been a supporter of free Wifi in public and on transport. But recently I have found myself amused, and a little concerned, by our incessant demand for it. When I hop on the bus home on a Friday I almost immediately whip out my phone or iPad and connect to the bus’s Wifi service. This is despite having spent most of my day able to dip in and out of the web. I used to just sit there and chill out, either dreamily staring out the window at the world going by, or getting stuck into a book. Now I just roll my day’s browsing onwards until I can get home and connect again. The bus used to serve as a little break from the river of information; now it drives straight through it.

this is more like it (this afternoon)
Photo owned by velkr0 (cc)

A friend recently expressed a wish that the train to Cork would have Wifi so he wouldn’t be bored. I jokingly suggested he read a book. That’s what we used to do on trains, wasn’t it? During the summer I had to get the train up and down to Dublin for the first time in ages due to my broken leg. Aside from the joy of rediscovering the views that unfolded as we whizzed past, views I have lost since I became a bus-user, I was also surprised to find that the Enterprise train service had no Wifi. At first this was slightly irritating, and it was then that I realized I had become infected by this zombie need to connect. It was kind of a wake-up call. I put away my phone and reveled in the little pocket of disconnection I had found.

Of course, you can simply point at me as a man with no willpower. You can argue that it is my own fault that I need (or feel the need) to connect all the time. This is something I have written about before, but my main point is that I have noticed a general trend amongst the public towards expecting, almost demanding, the ability to connect all the time, everywhere.

I read an interesting article recently about the birthplace of Wifi cafe culture, San Francisco. It was one of the first places to see so many coffee shops offer the service, but in recent years a kind of counter-culture movement has emerged, with venues banning Wifi and even priding themselves on, and marketing, their lack of it. I found this fascinating. Having lived with the phenomenon longest, the city is beginning to tire of it and, as it has always done, is rebelling against what has become mainstream.

I am, of course, a complete tech nerd. But I have found myself becoming a tech nerd who is ever-so-slightly concerned with how much technology has come to dominate our lives. I am not advocating smashing the machines (I love my gadgets), but I wonder if we need them all the time, constantly streaming information to us from afar. We need pockets of radio silence. I don’t help myself by purchasing 3G-enabled iPads and smartphones, but I think it’s a discussion we need to have. I have to admit, a little bit of me was delighted to hear about coffee shops that champion their non-Wifi status. There is a tendency online to treat anyone who questions technology as a fossil, a Luddite who just ‘doesn’t get it’. It’s nice to see an alternative. Again, the problem isn’t the technology itself (it rarely ever is); it’s our relationship with it, and how reliant on it we let ourselves become.

The Web of Distractions

A hot topic at the moment is the effect that the web age is having on our brains, specifically issues of focus and concentration in the face of the relentless, almost infinite flow of information we find ourselves swimming in.

Cases are being made both for and against the effect it is having on our thought processes: some claim it is eroding our ability to concentrate, others that it may have positive effects, increasing our ability to remember and to cope with multiple sets of data.

It seems that this debate is being cast as a kind of Luddite vs Technophile battle, but I think both sides have useful things to say.

Alain de Botton describes the situation we find ourselves in thus:

One of the more embarrassing and self-indulgent challenges of our time is the task of relearning how to concentrate. The past decade has seen an unparalleled assault on our capacity to fix our minds steadily on anything. To sit still and think, without succumbing to an anxious reach for a machine, has become almost impossible.

Whilst he may be over-dramatizing the effect, I find a lot of truth in what he says. It’s something I’ve talked to other people about, this feeling that it’s becoming harder and harder to focus on individual things. I’m trying to work on it myself, but often feel like I am failing miserably.

De Botton argues this may be due to a new need to be constantly informed of events as they happen.

The obsession with current events is relentless. We are made to feel that at any point, somewhere on the globe, something may occur to sweep away old certainties—something that, if we failed to learn about it instantaneously, could leave us wholly unable to comprehend ourselves or our fellows.

With the proliferation of ‘real-time’ web services such as Twitter this is becoming more and more apparent. I have noticed this need, if not to be connected at all times, then at least to be capable of being updated quickly and with minimum fuss.

In his new book “The Shallows: What the Internet Is Doing to Our Brains,” Nicholas Carr argues that technology is destroying our ability to concentrate. As I said, this is part of a current discussion investigating the effect this brave new world is having on our brains.

In his review of “The Shallows” for the New York Times, Jonah Lehrer shows that whilst this current debate is indeed of its time, it is not a new discussion at all.

Socrates started what may have been the first technology scare. In the “Phaedrus,” he lamented the invention of books, which “create forgetfulness” in the soul. Instead of remembering for themselves, Socrates warned, new readers were blindly trusting in “external written characters.” The library was ruining the mind.

Needless to say, the printing press only made things worse. In the 17th century, Robert Burton complained, in “The Anatomy of Melancholy,” of the “vast chaos and confusion of books” that make the eyes and fingers ache. By 1890, the problem was the speed of transmission: one eminent physician blamed “the pelting of telegrams” for triggering an outbreak of mental illness. And then came radio and television, which poisoned the mind with passive pleasure. Children, it was said, had stopped reading books. Socrates would be pleased.

For myself, I wouldn’t be of the opinion that the internet is ‘bad’ for us; indeed, I would say that its positives far outweigh any perceived negatives. But I do feel that it is having an effect on how we concentrate and focus. I have definitely felt what Carr described in his 2008 article “Is Google Making Us Stupid?”

I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

This is something I am aware of: I do find my mind wandering, even when I’m reading something fascinating. Carr attributes this to a decade of “spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet.”

I haven’t read Carr’s book (ironically), but I’d be interested to see what solutions, if any, he offers for this problem. It’s kind of pointless to shake a fist at “the internet” for its effects; it isn’t going anywhere, nor should it, and it is only going to burrow deeper and deeper into our lives, just as the written word and the printed book did before it.

The Wall Street Journal recently pitted Carr against Clay Shirky in a series of articles. Carr asks “Does the Internet Make You Dumber?”, whilst Shirky asks “Does the Internet Make You Smarter?”. It seems to me that Shirky takes less of a scientific approach to the question at hand, addressing societal and cultural issues, whilst Carr appears to be more concerned with the neuroscience. (Although Lehrer questions some of Carr’s facts in his NYT review.)

watching sid the science kid on the ipad
Photo owned by jencu (cc)

It’s all fascinating stuff, but on a practical level, what can we do to cope in this digital world?

Clay Shirky opines:

The response to distraction, then as now, was social structure. Reading is an unnatural act; we are no more evolved to read books than we are to use computers. Literate societies become literate by investing extraordinary resources, every year, training children to read. Now it’s our turn to figure out what response we need to shape our use of digital tools.

De Botton offers one solution: an information diet.

The need to diet, which we know so well in relation to food, and which runs so contrary to our natural impulses, should be brought to bear on what we now have to relearn in relation to knowledge, people, and ideas. Our minds, no less than our bodies, require periods of fasting.

I think we all do this from time to time; I know I’ve tried to. What I have found, however, is not so much that I am unable to ‘disconnect’, but that when I ‘reconnect’ I feel the need to trawl back over everything I missed, and that feeling can be overwhelming. The last time I was offline for even a few days, when I came back on I was slightly intimidated by all the information that had piled up: the unread tweets, the Facebook updates and, of course, the reams and reams of RSS in my Google Reader.

Of course, ‘a diet’ doesn’t have to mean ‘fasting’. I think the key may be to be conscious of the stream of information you are absorbing and to make an effort to slim it down, to focus on fewer things but absorb them more fully. I’ve blogged before on the various tools for reading and writing that can help with this, and I’m also beginning to feel the need to curate what I’m taking in: to not try to cover it all, but to focus and really absorb quality information rather than feed this insatiable hunger to see everything. I’m a technophile, most definitely not a Luddite, so I’m not about to throw my laptop in a river, but I am increasingly aware of the need to cope with the level of distraction.

It’s tough though, a fact which is all the more apparent as I type this in one tab of my browser, next to 11 others currently open…

The new “Reader” feature in Safari 5

A few weeks ago I blogged about Readability, a tool which allows you to isolate the content of a webpage and present it in a much more readable way. This, for me, has at least two major advantages: it can help improve the poor legibility and typography of many webpages, and it helps you focus on the content amid a sea of links, flashing ads and the like, all trying to grab you away and make you click, click, click to somewhere else. It’s a really great tool.

Yesterday Apple released the latest version of their web-browsing software, Safari 5, and they’ve built Readability-type functionality directly into it with a feature called “Reader”. When you are on an article or blogpost you can click a little button up in the address bar and the content is immediately plucked out and presented in a much clearer, more focused way. It has minimal customization (as far as I can see) but allows you to increase or decrease the text size. It’s a really nice feature. I’m not a regular Safari user, but I’m giving it a go today to see how the latest version handles. So far it’s pretty good, and things like the “Reader” feature are likely to keep me using it. Readability has the edge for the moment with its customization options, but it’s nice to see browser makers addressing this need and building it directly into the browser.
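For the curious, here is a minimal sketch of the kind of heuristic such a tool could use. This is purely an assumption on my part, not how Readability or Safari’s Reader actually work: score each container element by how much plain paragraph text it holds, penalize link-heavy blocks (navigation bars, ad units), and keep the highest-scoring container as “the article”.

```python
# A toy "reader mode" extractor, standard library only. A rough sketch of
# one plausible heuristic, not Readability's or Apple's actual code.
from html.parser import HTMLParser


class ArticleScorer(HTMLParser):
    """Score container elements by plain-text volume minus link text."""

    BLOCKS = {"div", "article", "section", "td"}

    def __init__(self):
        super().__init__()
        self.stack = []    # ids of currently open container elements
        self.scores = {}   # container id -> accumulated score
        self.texts = {}    # container id -> collected text chunks
        self.in_link = 0
        self.next_id = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.BLOCKS:
            self.next_id += 1
            self.stack.append(self.next_id)
            self.scores[self.next_id] = 0.0
            self.texts[self.next_id] = []
        elif tag == "a":
            self.in_link += 1

    def handle_endtag(self, tag):
        if tag in self.BLOCKS and self.stack:
            self.stack.pop()
        elif tag == "a" and self.in_link:
            self.in_link -= 1

    def handle_data(self, data):
        text = data.strip()
        if not text or not self.stack:
            return
        container = self.stack[-1]
        # Plain text counts for the container; link text counts against it,
        # which penalises navigation bars and ad blocks.
        weight = -0.5 if self.in_link else 1.0
        self.scores[container] += weight * len(text)
        self.texts[container].append(text)

    def best_text(self):
        if not self.scores:
            return ""
        winner = max(self.scores, key=self.scores.get)
        return "\n\n".join(self.texts[winner])


def extract_article(html: str) -> str:
    parser = ArticleScorer()
    parser.feed(html)
    return parser.best_text()


if __name__ == "__main__":
    # A hypothetical page: a link-only navigation div and a text-heavy story div.
    page = """
    <html><body>
      <div id="nav"><a href="/">Home</a> <a href="/archive">Archive</a></div>
      <div id="story">
        <p>The Digital Diet Starts Now.</p>
        <p>Neither food nor information is inherently bad; deciding
           what to take in is what matters.</p>
      </div>
    </body></html>
    """
    print(extract_article(page))
```

Run against that hypothetical page, the navigation div scores negatively (all link text) and the story div wins, so only the article paragraphs are printed. The real tools are far cleverer, but the basic idea of weighing text against links gives a sense of why they work so well on blog posts and news articles.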

This need to help users focus on the actual content they are looking for is becoming more and more prevalent. As we open up more and more lines of communication and throw more and more at users, it’s important that we can offer that content in a clear, straightforward manner.

Content is king!

See also: Readability’s creators on Safari 5

On the iPad, magazines and digital archiving.

So the iPad, with all its hullaballoo and hype and backlash, and the backlash to the backlash, has finally arrived in a blaze of publicity. People love it! People hate it! People think “looks pretty damn nice, but I’m not so sure I need to buy it”. Apple fanboys are heralding it as the Second Coming, while Apple-bashers are lining up to see who can make the most sarcastic remark about its lack of USB ports or something.

The reviews seem to be mainly positive, and the overwhelming opinion is that it’s really hard to judge until you’ve used it. It’s that different.

For me the most interesting thing about it, or the thing that would tempt me most to buy one, is the possibilities it throws up for reading, particularly magazine reading. Much has been made of its role as a potential savior for the print industry, and some heavyweights are lining up to support it.

And we are not talking about just throwing up some PDFs. Some publishers, such as Popular Mechanics, are using this as an opportunity to reinvent the magazine. The results appear to be some mind-blowing publications, with untold possibilities. Brad Colbow has made a great little video exploring some of the first magazines to embrace the iPad.

iPad Magazine Art Direction from Brad Colbow on Vimeo.

The other aspect of these new-fangled magazines that interests me, however, is the idea of digital archiving. A few years ago I supervised a college project which was a prototype for a digital magazine. One of the students’ chief inspirations for the project was the idea of creating digital artifacts that would be kind of frozen in time. One of the guys involved particularly lamented that, in our web era, we have lost the feeling of flicking through old magazines, where in addition to the content you can discover a treasure trove of design styles, photography and advertising. My Dad has been doing extensive research over the past decade into the history of Dundalk Football Club, and in doing so constantly comes across the most amazing bits of design and advertising from bygone eras; these are the delightful bonuses you get when you go on a treasure hunt through old publications.

These days, with dynamic website creation, webpages are not static creatures; they morph and evolve and update on the hour, every hour. I was reminded of all this this morning when I stumbled upon a 1958 Time Magazine profile of Alan Watts, maintained on Time’s website. Of course, it’s amazing that we have this ability, this open resource of history at our fingertips. But there was also something missing. As I read it, what I really wanted was to read it in context. I wanted to see how it sat in that particular issue. Were there photos? How was it typeset? What other content featured in that issue? Sadly, none of that is reproduced. What we get is the content from that article neatly set inside the current Time website. Again, I’m not complaining about this; it’s kind of cool that you can read a 1958 article about Alan Watts and get the ability to retweet it, share it on Facebook or Digg it. But at the same time, I really wish I could read it in its original format (or as close to its original format as you can get on a monitor). I know some newspapers are doing archiving projects where you can bring up PDFs or images of the original paper itself, and it’s this kind of thing I’d like to see more of.

This is somewhere I think the iPad will be able to shine. Colbow’s video shows just how Time Magazine is going to do this: in landscape mode you get the original article, with its original typography and layout. The people who say the iPad is just a glorified web-browser, or that it’s missing X, Y and Z, are missing the point, I think. I don’t know of anything else out there right now with this kind of capability for reading and design. Of course, Apple won’t be the only people to make tablets, but I really think the tablet form factor itself will work because of these kinds of publications.

Interesting times.

Update: John Gruber and Khoi Vinh weigh in on the Popular Science app. Vinh’s insights in particular are very interesting.