One standardised storage format for common data

Is it too much to ask for software developers to agree on one standard storage format for commonly accessed data? Why does every browser I use have its own bookmarking system, rather than one location, separated out from the program itself, that all programs can then access?

For example, Firefox keeps its bookmarks at …/.mozilla/firefox/…default/… (the path differs by OS), Flock does the same in its own location, and so does Chromium. Why not have one bookmark storage standard that gets kept in /home/michael/.bookmarks? Every browser could then access the same place, so my history and bookmarks would be identical no matter which browser I'm using. The same applies to RSS feeds: why can't every application use a storage container kept in /home/michael/.rss? I'm hazy on the details, but I thought this was exactly the sort of thing XML was designed for.
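As a rough sketch of what such a shared store might look like, here's how any application could read and write a single XML bookmarks file. The element names and the file layout below are purely illustrative, not an established standard (though the real-world XBEL format works along broadly similar lines):

```python
import xml.etree.ElementTree as ET

# A hypothetical shared bookmark store; in practice this XML would live
# in a single well-known file such as ~/.bookmarks.
SAMPLE = """<bookmarks>
  <bookmark href="https://www.mozilla.org/">Mozilla</bookmark>
  <bookmark href="https://www.kde.org/">KDE</bookmark>
</bookmarks>"""

def load_bookmarks(xml_text):
    """Parse the shared store into (title, url) pairs any browser could use."""
    root = ET.fromstring(xml_text)
    return [(b.text, b.get("href")) for b in root.findall("bookmark")]

def add_bookmark(xml_text, title, url):
    """Append a bookmark and return the updated XML document."""
    root = ET.fromstring(xml_text)
    ET.SubElement(root, "bookmark", href=url).text = title
    return ET.tostring(root, encoding="unicode")

print(load_bookmarks(SAMPLE))
```

The point isn't this particular format, but that once the store is a plain file in an agreed location, "sharing bookmarks" stops being a feature each browser has to implement and becomes a side effect of reading the same file.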

The Akonadi framework on KDE will go some way towards addressing this problem, but only within the KDE desktop, and only for personal information management (PIM) applications. Am I missing something here? If you know of a way to share resources across applications, please point me in the right direction.

Technology in the classroom: can we make it work?

I’ve been trying to work out how to use technology to enhance both my teaching and my students’ learning, and it’s proving more difficult than I’d initially thought. I’d like to think that laptops and internet access in every classroom would give students real-time access to related content while they engage in meaningful discussion, but that will never happen. Their Facebook profiles and IM conversations are far more interesting than the “Pathology of stroke” or “Justice in access to healthcare”. And that makes sense, in a bizarre kind of way: even while they (or their parents) pay vast sums in tuition fees for the privilege of attending university, most students (in my very limited experience) see studying as inherently boring.

Some studies in American classrooms have all but proven that the distraction of the Internet in class is too strong for students to ignore, and that much of the lesson is spent checking email, catching up with friends and even shopping. Now, after that initial foray into “embracing” technology, it seems as if there’s a move towards banning laptops altogether.

This is the kind of about-turn I’d like to avoid. E-learning will, I have no doubt, be a revolution in education, but it is not the idea that technology for its own sake is the way forward. Just because it’s possible to have Internet access in class, does that mean we should? Rather, teachers must use technology in a way that plays to its advantages while minimising its disadvantages. Just because I put the course reader online doesn’t make it “e-learning”, and neither does having a student blog. The technology in itself doesn’t enhance learning in any way, but how you use it can have powerful implications.

I’ve been toying with the idea of using a wiki to manage a course, whereby any change to the course content, test schedule or mark availability can be syndicated through RSS to all the students in the class. As a course requirement, students would have to both add to and edit course content (moderated, obviously), which can then also be tracked. I think this may be one way to encourage them to engage actively with the content, as well as to introduce concepts like peer review, referencing and drafting, which may also improve their reading and writing skills (another huge problem). The point, though, will be to make the learning outcomes apparent from the beginning, so that students know what’s expected of them. Merely creating a wiki and telling students to “go forth and create content” isn’t enough.
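The syndication side of this is straightforward to sketch. Assuming a hypothetical change log from the course wiki (the field names and URLs below are illustrative only, not tied to any particular wiki engine), a minimal RSS 2.0 feed of course changes could be generated like this:

```python
import xml.etree.ElementTree as ET

# Hypothetical change-log entries from a course wiki.
changes = [
    {"title": "Test schedule updated", "link": "http://wiki.example/schedule"},
    {"title": "Marks released for Assignment 1", "link": "http://wiki.example/marks"},
]

def build_feed(changes):
    """Build a minimal RSS 2.0 document listing recent course changes."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Course updates"
    ET.SubElement(channel, "link").text = "http://wiki.example/"
    ET.SubElement(channel, "description").text = "Changes to content, tests and marks"
    for change in changes:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = change["title"]
        ET.SubElement(item, "link").text = change["link"]
    return ET.tostring(rss, encoding="unicode")

feed = build_feed(changes)
```

Any feed reader the students already use would then pick up schedule changes and mark releases automatically, with no extra work from the teacher beyond editing the wiki.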

I think that technology will fundamentally change the way we teach and the way students learn, but not simply by being thrown at the problem. The trick is to figure out how to use technology to facilitate deep learning by getting students to engage actively with the content. A bad teacher will continue to teach badly, no matter how much “technology” they use.

Link to the article that inspired this post:

How web 2.0 is changing medicine

The British Medical Journal published this article in December 2006, which may not seem long ago by the standards of traditional academic publishing, but which in Internet terms is already old news. It asks, “Is a medical wikipedia the next step?”, a question I think is becoming more and more relevant as we see user-generated content proliferating in all spheres of our lives, and increasingly in the field of healthcare.

The author, Dean Giustini (librarian at the University of British Columbia Biomedical Branch), looks at the advantages of web 2.0 technologies or social software (e.g. RSS, blogs, wikis and podcasts) with particular reference to the creation of open content, improving access to information and the impact all of this has on medicine. We need to be asking ourselves how we can use these new technologies to better inform the way we teach, learn and communicate with our students and colleagues.

I think the final paragraph sums up my own opinion of the role of the Internet in influencing those of us who are creators and publishers of content:

“The web is a reflection of who we are as human beings – but it also reflects who we aspire to be. In that sense, Web 2.0 may be one of the most influential technologies in the history of publishing, as old proprietary notions of control and ownership fall away.”