Using blogs, twitter & wikis to deliver e-learning by Anthea Sutton, Anna Cantrell, Pippa Evans – a presentation from the MMIT national conference – is available on Prezi.
It’s been more than a year since we looked at social media analysis tools and a lot has changed in that time. If an internet year is 7 weeks¹, then a social networking year feels like even less than that. And, as important as it is to have a social media strategy, it’s equally important for that strategy to be a dynamic one, constantly revised.
Some of the tools mentioned last year are still around; ThinkUp has left beta, HootSuite is now a freemium platform and, sadly, TwapperKeeper is no more (although the core functionality is now built into HootSuite). Twitter itself has gone through a couple of iterations since then. In the latest version of the Twitter web client (I’ve lost count by now, let’s just call it #newnewtwitter) you can view the interactions and mentions, activity (what people you follow are up to), browse categories and try to make sense of the latest ‘trends’ (only joking). But there still aren’t really any built-in tools to monitor the reach and effectiveness of your Twitter presence.
There are lots of different tools and apps for exploring Twitter metrics. I’m ignoring Klout and PeerIndex because measuring ‘influence’ is not the same as measuring engagement, and in order to review your social media strategy you need data to show interactions with those who use your library or info service. For similar reasons, I’ve steered clear of social marketing tools such as Socialbakers and the like (also, their website is a bit… busy).
Metrics specifically for Twitter can tell you more about your followers (including reciprocal followers and ‘influential’ followers) but there are also more meaningful measures such as ‘conversations’. If you ask a question on Twitter, for example, how do you track and store the responses? And how can you archive and analyse conversations that occur on Twitter at conferences or around a specific subject?
And these can also be linked to your other web services. Twitter, along with other social networks such as Facebook and LinkedIn, plays a growing role in web traffic referrals. Twitter announced a new web analytics tool late last year in recognition of this, but it’s gone a bit quiet since the initial announcement.
Most Twitter power users manage their Twitter account via a Twitter client. TweetDeck (now owned by Twitter) is a handy way to manage multiple Twitter accounts if you meet the rather stringent browser requirements but doesn’t offer anything in the way of usage statistics or analytics. Similarly, the reporting tools of Hootsuite are largely restricted to Premium account holders.
I’ve heard good things about TwitterCounter (which generates graphs for current and predicted levels of followers) but more in-depth analysis is again limited to premium accounts.
Tweetreach is handy for occasional reports; you can view the ‘reach’ of your latest 50 tweets without signing up for an account.
Tweet effect is also a useful reference but on the various accounts I tried, it didn’t identify any correlation between tweet content and follower loyalty.
ThinkUp is the Twitter analysis and archiving tool that I use the most. It’s particularly good at measuring Twitter-based conversations by keeping track of replies, retweets and inquiries (questions you’ve posted on Twitter). It also has a GeoEncoder plugin to let you map your social networking conversations. The downside (or at least a slight barrier) is that you need your own hosting, but this barrier has been lowered a fair bit by the increasing number of free and shared hosting options available. PHP Fog now offers free ThinkUp hosting which you can have up and running in next to no time.
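The core idea behind this kind of conversation tracking is simple enough to sketch: archive your tweets, then group them into threads by following the reply chain back to your original question. The snippet below is only an illustration of that idea with simplified sample data — it is not ThinkUp’s actual code or schema.

```python
# Sketch of conversation tracking: group archived tweets into a thread
# by following in_reply_to links back to an original question tweet.
# The tweet dictionaries are simplified sample data, not a real schema.

def collect_replies(tweets, question_id):
    """Return all tweets that reply (directly or indirectly) to question_id."""
    thread_ids = {question_id}
    replies = []
    # Repeatedly sweep the archive until no new replies turn up,
    # so replies-to-replies are picked up as well.
    changed = True
    while changed:
        changed = False
        for t in tweets:
            if t["in_reply_to"] in thread_ids and t["id"] not in thread_ids:
                thread_ids.add(t["id"])
                replies.append(t)
                changed = True
    return replies

archive = [
    {"id": 1, "user": "library", "text": "What's your favourite database?", "in_reply_to": None},
    {"id": 2, "user": "alice", "text": "@library Medline!", "in_reply_to": 1},
    {"id": 3, "user": "bob", "text": "@alice same here", "in_reply_to": 2},
    {"id": 4, "user": "carol", "text": "unrelated tweet", "in_reply_to": None},
]

print([t["user"] for t in collect_replies(archive, 1)])  # ['alice', 'bob']
```

Note that the indirect reply from bob (a reply to alice, not to the library account) is still captured, which is exactly what makes thread-based tools more useful than a simple @mention search.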
Tweetstats is a great charting tool. As well as follower stats and frequency charts, you can visualise who you interact with most on Twitter, and even patterns of what time of day you tend to tweet — handy for identifying accidental routines.
Xefer is another great graphing tool built using Yahoo Pipes and Google charts. The Reply Explorer also lists replies to your tweets that you can sort by date or frequency.
And if you’re *really* into visualisation tools, check out TAGSExplorer, a brilliant tool created by Martin Hawksey that lets you create interactive visualisations of your Tweets using a Google Spreadsheet, the Google Visualisation API and some kind of d3.js graphing library magic. This is a particularly great way of doing post-conference social networking analysis because it lets you clearly see the interaction between participants.
Chances are, if your organisation is using Twitter in a significant way, you will need to use a couple of different sources and tools to review how you’re interacting with your network(s). And if there are any you’ve used and found particularly useful or any that we’ve overlooked, let us know in the comments.
API is one of those abbreviations that’s thrown around a lot but can seem a bit abstract. Application Programming Interfaces (APIs) basically define a way for you to interact with a particular application (All clear now? No?). The best way to get your head around what this actually means is to use an API for something.
The Nerdary has a clear introductory guide to APIs, using the Twitter API. And the Twitter API really is a great place to start.
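To make the idea concrete: using an API usually means building a URL with some parameters, getting back structured data (typically JSON), and parsing it. The sketch below follows the style of the old search.twitter.com endpoint — both the URL format and the canned response are illustrative assumptions, not a current, working API.

```python
import json
from urllib.parse import urlencode

# What "using an API" boils down to: build a URL with parameters,
# the service returns structured data (usually JSON), and you parse it.
# The endpoint style below mimics the old search.twitter.com API and
# the response is a canned inline sample -- illustrative, not live.

def search_url(query, count=10):
    """Build a search request URL with the query safely encoded."""
    return "http://search.twitter.com/search.json?" + urlencode(
        {"q": query, "rpp": count})

sample_response = json.loads("""
{"results": [
    {"from_user": "alice", "text": "Loving #mashlib today"},
    {"from_user": "bob",   "text": "#mashlib keynote was great"}
]}
""")

print(search_url("#mashlib"))
for tweet in sample_response["results"]:
    print(tweet["from_user"], "-", tweet["text"])
```

Notice that `urlencode` turns `#mashlib` into `%23mashlib` — getting that encoding wrong is one of the most common beginner stumbles with any web API.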
While Twitter may have bumped RSS off the homepage, you can still subscribe to Twitter using the API and, using Yahoo Pipes, combine and filter these feeds (and even clean up the data a little). This is a particularly handy way of monitoring feedback and mentions on Twitter and combining these into a super social media feed.
The Twitter API documentation will provide generic URLs as a guide which you can then use as RSS feeds in Yahoo Pipes (for example).
For starters, to subscribe to a Twitter user’s lists:
(If you have any problems, there are always cheats available).
You can also subscribe to a Twitter search using the following format:
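The combine-and-filter step that Yahoo Pipes handles can be sketched in a few lines: pool the `<item>`s from several RSS documents and keep only those matching a keyword. The two feeds below are minimal inline samples, so this is a rough sketch of the idea rather than a drop-in monitoring tool.

```python
import xml.etree.ElementTree as ET

# Rough sketch of a Yahoo Pipes "combine and filter" pipe: pool the
# <item>s from several RSS feeds and keep only keyword matches.
# The two feeds here are minimal inline samples.

def merge_and_filter(feeds_xml, keyword):
    """Pool item titles from several RSS documents, keeping keyword matches."""
    items = []
    for xml_text in feeds_xml:
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            title = item.findtext("title", "")
            if keyword.lower() in title.lower():
                items.append(title)
    return items

feed_a = """<rss><channel>
  <item><title>Library mentioned in local press</title></item>
  <item><title>Weather update</title></item>
</channel></rss>"""

feed_b = """<rss><channel>
  <item><title>New library opening hours</title></item>
</channel></rss>"""

print(merge_and_filter([feed_a, feed_b], "library"))
# ['Library mentioned in local press', 'New library opening hours']
```

In practice you would fetch the feed URLs over HTTP rather than inlining them, but the merge-then-filter logic is the same as wiring Fetch Feed modules into a Filter module in Pipes.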
The Stay N Alive blog has an interesting post about how both Twitter and Facebook seem to have unceremoniously ditched RSS.
While Twitter have provided some basic information about how you can still use RSS (using the developer resources — so not particularly user-friendly), it’s still a crying shame that the RSS icon is no longer such a visible presence on the homepage. There are also various workarounds to be found for accessing Facebook feeds but no telling how long these will last.
Pancakes and Mash, the eighth (!) Mashed Library event was held yesterday, organised by the gang at the University of Lincoln. It was a great day looking at various existing projects that involved mashing data together (in various ways and by various means) as well as giving attendees a chance to get involved in library-related mashups.
The keynote speaker was Gary Green (Surrey County Council), one of the key people behind the ‘Voices for the Library’ campaign. Gary gave an overview of how this project began and how it facilitates sharing info between the many local library campaigns underway. He also gave an overview of the various tools that have helped make it such a success. Among these, the VftL team credits Twitter with playing a major role in bringing the project together, while they also use Facebook, Delicious, email lists, Google Fusion tables, wikis and blogging to help get the word out.
One aspect of this I hadn’t fully explored until now was the map of libraries under threat — which is now generated using Google Fusion tables from data on Public Library news. You can see the map on libraries.fromconcentrate.net. If you haven’t explored the Voices for the Library website yet, it’s a great resource of campaign news and advice.
The second session was split into a few different workshops. I attended the ‘Metadata Forum: building a community around metadata’ session led by Stephanie Taylor from UKOLN — while trying my best to also stealthily follow the other sessions on Twitter. The Metadata Forum is a JISC-funded project that aims to build a community around metadata for those who work with it in any capacity. They’re interested in all levels of metadata users, not just the specialists, and based on attendance, a *lot* of us are working with metadata. You can read more about the forum on their blog on the UKOLN website.
The other sessions were “We Can Haz Ur Data!?” with Alex Bilbie & Nick Jackson from the University of Lincoln and “Using Web2.0 tools to save libraries” with Gary Green, building on his keynote talk. Both sounded really interesting and hopefully notes and/or slides will be circulated soon.
After lunch, Alison McNab (De Montfort University) gave a talk on ‘mash at lunchtime’ for those looking for shorter, onsite events and Stephanie Taylor (UKOLN) led the discussion on ‘Across the divide: how geeks and non geeks can have meaningful conversations with each other, and how we’re all the same, really’. There was also the option to get some mashing done in the other spaces open to attendees — which turned into a great walkthrough session on Yahoo Pipes by Paul Stainthorp.
It was a productive and creative day and I think it’s safe to say that everyone went home brimming with ideas and armed with an extensive list of web apps and tools to try out. Thanks to Paul and the rest of the gang at Uni of Lincoln for organising such a great event (and to Elif Varol for the cakes). There are only tentative rumours about the next mashed libraries day at the moment, but keep an eye on the wiki as I’m sure there’ll be more news soon, and there’s plenty to get started with in the meantime.
For those who couldn’t make the conference, you can keep up with Internet Librarian International 2010 in a few different ways. There’s a (hyper)active Twitter feed (and subsequent archive and analysis) and The #ILI2010 Daily for starters. Owen Stephens is also managing to keep a pretty extensive account of proceedings on his Overdue Ideas blog.
update: There’s also some ILI2010 liveblogging on the Geekfest blog.
A social media policy is becoming a must-have for libraries and, luckily, there has been a recent influx of guides to help you get started. As more and more libraries make use of social networking tools, it’s important that this use is planned and managed alongside (and within, where necessary) other library policies.
The Social Media Examiner¹ has published an extensive guide to creating business guidelines for social media. This nicely complements the recent 23 Things Oxford workshop and the subsequent guide to writing and managing a social media policy.
For more specific advice, there’s Beyond Slice Bread’s guide to giving your library a Twitter makeover (and also masses of other advice to be found on this blog).
This is not a complete list (yet!) but definitely a few places to get started.
¹ hat tip to iLibrarian
Great news. Twapper Keeper, the Twitter archiving platform, has gone open source. A version that can be installed on your own server is now available via their Google Project page. A hosted version is also available.
As well as being free and open source, you can also access the Twapper Keeper APIs and export data in a variety of formats. You can find out more at both the blog and community site. There’s also a demo to play with. It would be great to see how this works with a Twitter analysis tool like ThinkUp (formerly ThinkTank).
It’s great to see more libraries using start pages in different and creative ways. I’d previously feared personalised start pages might have had their day, usurped by new browser features and dedicated social media tools like HootSuite and TweetDeck. Instead they seem to have found a niche within libraries, providing new ways to communicate with users and manage increasingly complex social media presences.
One of the main benefits is that start pages provide library staff with an easy way to manage RSS feeds, multiple social networking identities and other web content and share this with users. CILIP’s use of Netvibes to manage the Defining Our Professional Futures project was a good example of this in action.
Public and NHS libraries are already putting start pages to good use as a way of sharing information with users and as a flexible homepage option on public PCs. Eddie Byrne’s presentation about Dublin City Public Libraries’ experiences using Pageflakes and Netvibes gives a good overview of implementing a start page.
For those who haven’t used them before, start pages are personalised browser home pages customised using widgets. There are some comprehensive introductory guides available. The most common tools are iGoogle, Netvibes and Pageflakes (though Pageflakes has fallen behind a bit for various reasons). The Library 2.0 site has a great summary of the top three, but there are many other options available.
There are some lesser known start page platforms providing unique takes that are also worth considering. 3×3 is a pretty basic, fast-loading landing page service. Symbaloo is also an interesting new development with good potential for library use. Netvibes continues to roll out new versions (and a new motto to match), including the Netvibes2go mobile version and the Wasabi edition. Protopage is another one. As well as dedicated start page options, blogging platforms are becoming more flexible and can be used as customisable library portals. I’m still looking for something that works like HootSuite but is more start page than dashboard, though new developments are surfacing regularly.
CILIP’s Multimedia Information and Technology Group held an excellent event on social networking and libraries yesterday (9th July), with speakers covering tools such as Facebook, Twitter, blogs and Foursquare as well as mobile technologies. There’s an overview at http://kwiddows.blogspot.com/2010/07/social-networking-in-libraries-mmit.html, tweets from the seminar at #mmit, and photos on Flickr: http://www.flickr.com/photos/catherinedhanjal/sets/72157624465983986/