I’ve been planning to put something new on to this blog.
This, finally, is the attempt that succeeded.
It turns out writing very little, and saying nothing, is the key.
I’ll try to keep this short. Since leaving university 13 years ago I’ve been getting fatter, and generally not very healthy. It would be fair to say I started running to reduce the beer belly.
Getting the Runkeeper app was a key development; my endeavours went from occasionally trotting aimlessly for a little while (then feeling very smug for the next month or so before doing it again) to actually trying to improve my times, distances and routes. Stats and maps. Pure crack.
I did a couple of 10k runs in 2011, and a half marathon in early 2012. This felt good, and it followed my simple logic that the further I run, the more beer belly is burnt off. The day after the 2012 Brighton Marathon I signed up for 2013. Stupid Boy.
It turns out a marathon is more than just two half marathons. Once you’ve got used to the distance of a half marathon, you can pretty much do it as required.
Marathons are different. Unless you are a freak (or ‘athlete’ as some people call them) your body cannot store enough energy (carbs, glycogen) for a whole marathon, even with carb loading beforehand. This is what they call hitting the wall. Walking or resting will make no difference: like a battery, if you are out of energy you’re stuffed.
As I’ve trained for the marathon I’ve noticed this a lot; things get tougher soon after 14 or 15 miles. Every step is painful, and even a slight change in stride (a kerb, twisting your head to see if it is clear to cross a road) is painful. Stopping is painful. Starting again is almost impossible.
With the bad weather this year I’ve done a number of long runs in the dark, cold, rain and – above all – wind. It’s oddly lonely: you can start running in rush hour and finish when people are going to bed. A number of times I’ve not reached my planned distance.
Most training plans suggest doing a few runs of about 20 miles up until two or three weeks before the marathon. This I have done; the last and furthest in particular was hard. I hit 20 miles some way from home but had to instantly stop, then take tiny slow steps back to my flat (it was very cold, very wet, very windy, woe is me). How could I do six more?
Tomorrow I find out. I haven’t been perfect: I haven’t been out six times a week like many training plans suggest, and I haven’t worked out the exact amount of carbs I should be eating each day or anything like that. And I’m afraid right up until the last week I was eating and drinking (plenty).
I want to finish this. I have no idea how I will do. If I can do 4 hours 40 minutes I will be happy (for reference, I can do a half marathon in under two hours). These last two weeks have been odd; I feel like I’ve lost all I could do two weeks ago, and I’m pretty sure tomorrow I will run a few miles and then want to stop with a stitch.
If you fancy it – and only if you do, no pressure – please do sponsor me a small amount. Everything is very much appreciated, and I’ve found it very touching to see all the people who have done so so far.
I have this theory. In fact I have lots of theories. But for the rest of this paragraph I will restrain myself to blessing you with just one. My theory is that we are all moving to digital music before we are ready for it.
I mean listening to it on an iPhone/iPad thing – sure, fine, we’re all doing that OK. But in the house? Or in the car? How do you listen to it there?
I quite often hear people talk about how they put their CDs in the loft, or never buy real CDs any more. Yeah, why bother with that crazy shit?! And the news which reminded me of this today was that Amazon will allow you to download (and keep in the Cloud) any physical album you buy.
Now, when I do buy music (which is less often now, as the rules of being a grown-up state you buy less music), I do still tend to buy a CD. Even though the first thing I will do is rip it.
Why? Because, for some odd, inexplicable, stupid, economics-defying reason it is still cheaper to pay for something to be designed, made, put together, boxed up, shipped to a warehouse, stored, picked from the warehouse, shipped to a store, unboxed and put on a shelf, have people decide on how it will look in that store, have people on hand to offer advice, someone to take payment, pay expensive rent on said store, and factor in shrinkage, THAN TO PUT A BLOODY 5MB FILE ONLINE TO DOWNLOAD.
This is crazy.
I said CDs are often cheaper than MP3s, so as a quick test I looked on Amazon for Madonna (YOU SEE I AM DOWN WITH THE KIDS). Of the nine albums shown: five are cheaper on CD, three are only available on CD and one, just ONE, is cheaper to download as MP3s. iTunes seems little better.
My buying process goes something like this: like a song on Spotify. Decide to buy it (especially as I don’t subscribe and can often only listen five times). Decide that I like more than four or so songs from the album, at which point, at 99p per song, it is only a little more to buy the whole album. Check the album on Amazon, both to download and to buy; often the physical album is cheaper, and that’s what I’ll buy.
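That break-even point is just a little arithmetic, and can be sketched in a few lines. The prices here are hypothetical examples of my own (the 99p per-track figure comes from the text; the album price is made up), not real store listings:

```python
# Sketch of the album-vs-individual-tracks decision described above.
# Prices are hypothetical examples, not real store listings.

PRICE_PER_SONG = 0.99   # pounds, per-track download price (from the text)
ALBUM_PRICE = 7.99      # pounds, a made-up example album price

def cheaper_to_buy_album(songs_wanted: int) -> bool:
    """True once buying tracks individually would cost more than the album."""
    return songs_wanted * PRICE_PER_SONG > ALBUM_PRICE

# With these example prices the album wins from the ninth track onwards.
for n in range(1, 11):
    print(n, round(n * PRICE_PER_SONG, 2), cheaper_to_buy_album(n))
```

With a cheaper album the crossover naturally moves down towards the “four or so songs” in the text.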
I will then have it on my computer and iPhone (at a bit rate of my choosing – I am a geek) plus have a backup physical copy – which comes in a nice presentation box with photos – that I can also use in the HiFi, lend to a friend and play in a car. Those extras are pretty handy, and worth getting the CD for even if it is a little more than the download-only version. Plus, PLUS, they can’t do an Amazon-Kindle-look-at-us-while-we-delete-an-ebook-from-your-kindle-which-you-previously-purchased. Not with my physical copy they can’t.
But, even with these physical things sitting very very close to me as I type this, I still play mostly via my laptop. The thing is, CDs do skip, and worse, I have to stand up and walk a whole metre to find a CD and put it in the player, and then after a while it will get to the song I don’t care for much and I have to get up again and press skip. I know! And then 15 minutes into the album my attention span will be all used up and I’ll want to instantly change to a completely different song which probably isn’t on any CD I own, let alone the one I’m playing. Is there no end to the grind?
That’s not to say CDs don’t have advantages. For one they are better quality (no compression, dedicated hardware) when they are not skipping, and they also continue to play even when my laptop does the pretty rainbow circle for a mouse pointer. Which, for me, happens every two minutes and lasts 110 seconds each time. Sometimes I avoid switching windows or opening a new tab because I like the song I’m listening to and don’t want it to cut out.
But let’s get back to the question: if you, the general public (yes, that was quite patronising), are abandoning the CD for digital music, then what are you doing?
There seem to be three options: use earphones, use a cable into your existing HiFi, or use a HiFi or specialist device with an iPhone dock.
Now, if you fancy ‘pumping some tunes’ (once again I demonstrate just how with it I am) into your living room then headphones are no good. But do we really all rely on little cables connecting our laptop headphone socket with our HiFi external input socket? Or has everyone dumped their HiFi and just uses their iPhone (and, yes, for you in the corner, also Android devices, bless) in some specialist dock? (The latter of course will give much better quality: a digital signal basically going to the HiFi’s DAC – digital-to-analogue converter, darling – and then on to the HiFi’s amp.)
Or do we all now party in our living rooms to the sound of an internal laptop speaker? Good news for the neighbours. Less so for crazy parties.
My point, which so far I have failed to make in any articulate way, is that we all seem to be running around going ‘remember those CDs? How quaint! We’re all digital now, yeah, we’ve given all our CDs to Oxfam, yar, darling, pass the hummus’, while at the same time we’re not really ready to do so. Cars either just have a CD player, or need to come with a five-year-old child attached to explain how you transfer your music from your ‘digital cloud’ onto a stick your car can play. And even if you can do that with your computer, how do you do it with your iPad, where files are so twentieth century (god, who needs them any more, and who wants a nasty-looking USB port to ruin the smooth lines that Steve himself created)? How do you get files from a device with no files and no USB port to your car?
And Spotify: how can it take over / destroy / save the music industry if there’s no easy way to get the music to sound OK? I’m often amazed when people say they just use Spotify now. How do you play it? Oh, we just play it out of the laptop speakers. Really? Is this progress? It feels like the McDonald’s of progress: instant choice, but not a great step for quality.
Me? My HiFi is on the other side of the room to my laptop and I use Apple’s AirPort Express to stream music wirelessly. It’s not an ideal solution – expensive to buy an AirPort Express just for this, and it requires a special third-party app to stream Spotify and anything else other than iTunes – but it does work.
The whole point of laptops is that they are portable, so I’m surprised there aren’t more common technologies to cheaply take the sound your laptop is making and stream it, with no wires, to your HiFi. I would have thought that would be a common requirement, and yet it seems to be only me looking for it.
We’re told on a regular basis that the music industry is doomed. Mainly due to evil pirates. And the Internet. And Spotify.
We’re also told that Spotify gives the artist a very poor deal, and a number of charts have done the rounds online over the years comparing the money an artist will typically receive from CDs, online sales, singles, radio play and Spotify, with the last being a tiny fraction of the rest.
Something seems to be wrong. Because to me it seems like people are spending money like they never used to, while costs are being cut out. With more money in the industry, and fewer people wanting a cut, this should mean good times. So why doesn’t it?
First, my logic. I don’t have any numbers, but my instinct is that most people (MOST) don’t buy a new CD each month. What would be the average for an adult – a couple a year? We’ll make it four to be generous. Let’s say £10 a CD; that’s £40 per adult a year.
Now it so happens that a Spotify Premium account costs about the same per month as a CD: £10. So for a year that’s £120. A typical person with a Spotify account has therefore gone from putting £40 a year into the music industry right up to £120 a year, tripling what they used to pay.
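The back-of-envelope sums above can be checked in a couple of lines. To be clear, every figure here (£10 per CD, four CDs a year, £10 a month for Premium) is the guesswork from the text, not real data:

```python
# Back-of-envelope comparison from the text above; all figures are guesses.
CD_PRICE = 10          # pounds per album (guess from the text)
CDS_PER_YEAR = 4       # generous guess for a typical adult
SPOTIFY_MONTHLY = 10   # pounds per month for Spotify Premium

cd_spend = CD_PRICE * CDS_PER_YEAR      # yearly spend buying CDs
spotify_spend = SPOTIFY_MONTHLY * 12    # yearly spend on a subscription

print(cd_spend, spotify_spend, spotify_spend / cd_spend)  # 40 120 3.0
```

Change the guesses and the ratio moves, but the subscriber has to buy fewer than one CD a month before streaming stops being the bigger spend.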
Now of course, many people with a Spotify account will be music lovers who, pre-Spotify, would buy more CDs than my plucked-out-of-the-air four, but I know many people with Spotify Premium who I wouldn’t put into that grouping.
And higher up in these ramblings I pointed out just how many extra costs the traditional CD has compared with a digital download. That £40 included a cut for the security guard in HMV, and for the person who does the Health and Safety training for stores in the south west. And don’t forget the guys in the warehouse, or the one who sources the packaging, or the girl who designed the art layout inside the sleeve.
But that £120? Well, yes, Spotify gets a cut, but the rest goes to the record label itself (i.e. the music industry), and hopefully a portion of that will go on to the actual artist. So more money is coming in, and more of it is going to the core of the industry.
There are partial answers, but they don’t explain it all. The music industry complains because that’s what it always does (and I get a feeling it still lives in the excess of a previous era).
Spotify is playing a long-term game, expanding both the number of countries and users, and will hopefully become sustainable. The numbers we have for artists are patchy, mostly from those who have shared (confidential) figures, and mostly from indie outfits. Of course the truth is that it is a long tail, and indies are the tail. Lady Gaga is probably played more than all of them put together, and can also negotiate a higher play fee; combined, that probably means she does quite well out of it. The humble CD did equalise things a little: the price of a CD album did not differ much between major acts and indie bands, so if you bought Lady Gaga and an indie band you would probably pay roughly the same amount. I’ve also a hunch that Gaga fans will play the same song many times, whereas someone who prefers small indie bands is more likely to listen to a wider range of acts. With Spotify’s pay-per-play model that means an indie band has a small audience listening to its music, and that audience listens to each act less per person.
And of course the Spotify model is more long-term for the artist as well. With CDs you get a surge in spending as people buy the CD; they may then listen to it for decades but you earn nothing more directly from this. With Spotify, however, they could go on earning for years without doing any extra work. So while it may look like CDs, downloads, etc. are better earners, we will have to see how they compare over a longer time period.
As an aside, what I don’t get is why the adverts on Spotify often seem quite poor, as if they struggle to sell the advert slots. To me this is advertising gold: audio adverts are harder to ignore than magazine, online or even TV ads. Spotify users are likely to be young, tech savvy and probably not too badly off (they have broadband and a computer), and these sound like the sort of things advertisers like. What’s more, adverts can be tailored based on listening tastes. They should be able to target much more accurately than TV or radio, and hitting the right audience is always the key thing.
Get back to the point and wrap up this bit, Chris. So my point is: Spotify, based on my non-fact-based guesswork, looks like it is getting people to spend more money on music than they did previously, while reducing the number of people who need a cut of that money. So why is the music industry in ruins, Spotify in loss, and artists complaining of a poor deal?
A couple of hours ago it was announced that HMV, the last major music retailer in the UK, is going into administration. This was shocking in that it was and wasn’t shocking.
It wasn’t shocking because anyone who reads the news will have seen a slow drip feed of bad news for HMV, and this Christmas didn’t bring good results.
But it was shocking because it was both the last major music chain (they also did films and games but I wasn’t really interested in those) and the one I’ve visited most in my life. It was also the one I visited when growing up.
Someone tweeted earlier that they’re glad HMV sold Waterstones (the UK’s last major national book store), so as not to bring it down with them. I don’t feel the same. I wish I did. I wish I could say I was the bookish type, always lost in a book when growing up, always reading new things. The truth is I didn’t read much, and I don’t now. The only book shop I remember in Northampton, where I grew up, was WH Smith (later on, Waterstones and The Works did open stores, and in those days WH Smith wasn’t too bad, and not the mess of a store it is today). So I feel bad – and somehow a lesser person – for saying it, but if it was Waterstones announcing closure today I wouldn’t feel the same sense of nostalgia and sentimentality as I do now. I imagine for many towns it will be a choice between WH Smith and the supermarkets, which is depressing.
Luckily I have quite a few music shops near me; most sell CDs I’ve never heard of, and nearly all only exist for a few years before they close and new stores open up to replace them. Resident music is an exception, both by being open eight years (aka ‘forever’ in terms of Brighton’s shops) and by even selling some music I have heard of.
Finally, I never quite understand why companies go into administration in this way. When times are getting tough, why not sell the stores that generate the biggest losses, make the whole company smaller, and then focus on rebuilding a much smaller company? It seems to me that Comet, Jessops and HMV all kept nearly all their stores open right up to administration, and in HMV’s case they often had large stores right in the busiest (aka most expensive) part of the shopping centre. Why not move to smaller units and, while not moving to the edge of town, look into units which were a little less ‘premium’?
I’ve been surprised by a number of the recent closures. Comet may not have been great, but it’s where you often went for a fridge or electrical goods. And while people may be splashing out less at the moment, white goods are not something that has really taken off in terms of online shopping. And Habitat, a store that overpriced everything and yet always seemed busy. I always thought overpriced + busy = win. But clearly not.
And HMV? Yes, it had a LOT of competition from Amazon and the supermarkets, but it was the last high street music seller of note, especially with Virgin Megastores gone. If you wanted a CD or film while in town, that is where you went, so I find it surprising they couldn’t find a way to make that work, even if it meant reducing the number of stores.
The three parts of this post are all about how we listen to music and how we buy it, which are connected. We are listening to it online, even if I suspect we are not doing it correctly (according to me, who obviously makes the calls on these judgements); we are subscribing and streaming, not downloading or buying, which to me should bring more money into the industry and mean it goes to those whose music we actually play; and it looks like we are losing the last real way to buy a physical album on the high street.
The weird thing about technological progress is that no one plans it through, or has any control of the direction. Each little development and change has a knock-on effect on our lifestyles and way of living; sometimes we know this will have bad knock-on effects but there is little we can do. For roughly the last hundred years (maybe a little less) we purchased music from a store, on a circle-shaped thing (mostly), and certainly for the last few decades the most popular concept was the ‘album’ of 10 or so songs released together, with a name and some artwork. Like most publishing industries, we are clinging on to as much of this infrastructure as we can, even though the online environment makes it pointless. But for how much longer?
In 25 days time England and Wales will elect Police and Crime Commissioners for the first time.
Many are cynical that this will politicise high-level decisions by the Police, which it almost certainly will to some extent, but mostly people seem indifferent or simply unaware.
Me? I hoped that the candidates would not be politically aligned with parties – that we could judge them on their policies and priorities, not on whether they are connected to the party we usually vote for. But this was always nothing but naive.
I think the way we split central and locally run services and government in England could be improved, so I approach any change with an open mind to see if it will help. Too much is centralised; local councils are little more than underfunded basic service providers with almost no power, at the whim of central Government control. And there are key services which seem to slip through a democratic gap between the two, the Police and Health being examples.
Who do you hold to account for local Policing? What do you do if it is not up to scratch? Your local councillors? MP?
Sussex Police, my local police force, covers two geographical Counties (East Sussex and West Sussex), and three local authorities (East Sussex, West Sussex, Brighton and Hove City). It is overseen, until next month, by the Sussex Police Authority.
There is a weak link between my local councillor (who I obviously can vote for) and the Sussex Police Authority: there seems to be just one Brighton & Hove City Councillor on the SPA. He isn’t from my ward, so I do not directly elect anyone who oversees Sussex Police. I can ask my local councillors to raise an issue with our representative, or I could try contacting him directly, but I have no comeback if he ignores me. I could write to my MP, but she has no real power, just soft power due to her status. There’s no direct line between the people I elect and those who sit on the Police Authority.
This may seem academic, but the general idea of democracy is that you have some power over those who make decisions on your behalf. For what it is worth, I think the situation is worse with Health (the people who oversee your local hospital are probably not even locally elected officials; your council and MP have no say at all if your local hospital closes).
So, elected Police Commissioners do make the line of accountability very clear: the good folk of Sussex, including myself, elect a Sussex Police and Crime Commissioner. If they don’t do the things I want (or do the things I think they shouldn’t) then I have the right to not vote for them.
However, it does obviously carry the risk of politicising the Police force and encouraging populist policies (which may sound great but may not actually lead to a safer environment).
It does seem quite rare internationally, with the USA being the only obvious example of a country with something similar. For info: France, Sweden and Italy mostly have a national Police force (or several of them) rather than local forces; they do have small local forces, but with limited powers. Germany and Canada have state and regional forces. Oh, and to add to the mix, we are getting a National Crime Agency soon as well – which will probably help with more complex crime, and potentially with the odd situation that the Metropolitan Police act both as the Police for London and as a national force for serious incidents such as terrorism (even though Scotland proudly has a separate legal system and government, it was the Met who dealt with the car bomb at Glasgow Airport).
So, here are some links:
Candidate manifestos should be on the choosemyPCC site from the 26th October. I’m going to have a think about what’s important to me before then, and then see how they compare.
UPDATE 6th November
The Brighton Argus has a section on the election with more information.
UPDATE 12th November
First I want to mention a Radio 4 documentary which covers the Police. As I mention above, some countries, such as France, have one national Police force, whereas others (in fact most) have local forces. The documentary covers Scotland’s move to just one Police force, mostly to save back-office costs and avoid duplication in specialist services. It also looks at reducing the number of Police forces in England. Worth a listen if you are interested.
I also want to highlight this interview with four of the Sussex PCC candidates, and this blog post about the Cambridge PCC, which is much better than mine.
I said above that I was cynical about the political nature of these elections, and therefore I am pleased to say, based on a look through the manifestos and aims:
I am going to vote for Ian Chisnall, an independent candidate for the Sussex PCC.
I like his independence; he has avoided simplistic media-friendly claims about bobbies on the beat and tackling young people in hoodies causing a nuisance. One of his priorities is “Abuse including Domestic Violence, Hate Crime & Trafficking”, which I agree with. Another is “Anxiety, the fear of crime and support for Victims of Crime – Sussex is an area with low levels of crime but not all of us feel safe”. I feel there is a disparity between people’s perception of certain crimes and their actual levels, and this is a useful acknowledgement of that: there’s no point in directing limited Police resources to issues which are more about perception than real crime. He also seems to take a positive approach to Sussex Police, unlike the ‘must drive efficiency and savings through’ line some of the others are taking.
Of the others, Godfrey Daniels – the Labour candidate – stood up well; he had experience and a pragmatic approach. The Lib Dem candidate had the most limited website, and it didn’t really inspire. The Tory candidate had business experience, presumably useful when dealing with budgets, but seemed to focus on rural and business crime – do the residents of Sussex really want the Police to prioritise a theft from Primark over other things? I also found her pledge for a Special Police Constable in each village dubious: would we attract the right number, and calibre, of people to act as Specials, and isn’t it just a form of unpaid internship?
Finally, I’d like to complain about the Home Office advertising campaign, which you can find here. Look at the names: vandal, burglar, mugger. These all focus on one area of crime, and I can’t help feeling that by advertising it in such a way, people will approach these elections thinking only about that area (without sounding flippant: street, common crime). No mention of domestic abuse, or dangerous driving, or serious fraud, or questions of liberty. To me these adverts set the frame of what these elections are about, and were therefore advantageous to those standing for election who emphasised such crimes. The Home Office should have been broader in its advertising.
In any case, I urge you to vote this week, even if you disagree with the principle of these elections. And if you are in Sussex and don’t know who to vote for, I urge you to vote for independent candidate Ian Chisnall.
A lot of online services have tried to introduce social elements into their products. This is normally annoying, but in the case of the BBC’s iPlayer I found it interesting and useful.
The problem was I was about the only person who did.
A brief recap: when the current design was launched, it had an extra (smaller) column on the homepage. Along with Featured and Most Popular was a column called Friends Recommend, or something like that. They made some sensible design choices: rather than having yet another site where you needed to maintain a ‘friends’ list, you could simply point it at Twitter and co and it would do the rest. The problem was that it turned out that at any time it would have maybe three programmes to recommend, each of which just one of my friends had recommended, and that was about it. It was therefore showing ‘anything my friends recommend’ – due to lack of take-up – rather than ‘the most recommended programmes by my friends’, which would probably be more useful and avoid the slightly narrow topics that came through (oh look, Formula 1 and Dr Who).
There were other reasons why it probably didn’t take off. First, each programme on iPlayer had options to favourite it, recommend it, and share it using the usual social suspects. Too many options: I enjoyed something, so do I favourite it or recommend it? What’s more, to recommend, and to see recommendations from your friends, you had to be logged in, whereas watching or listening did not require you to – so most people didn’t. Quietly, the feature disappeared.
But why was I interested in it?
At any given moment iPlayer is a treasure trove of content, especially in radio. It’s much easier for good TV to rise to the top, partly because there is less of it, and partly because good TV tends to be expensive. Any sort of semi-decent drama will be prime-time viewing and probably on the featured section of iPlayer. I’m not shocking anyone by saying daytime output can be ignored; you’re really looking at a few hours of prime time a night on three channels (well, four if you’re really nice and allow BBC Three to be counted, bless it).
But radio is different. You can make the most amazing radio with a script, voice and microphone. Interesting stuff is being pumped out at all times of the day on various stations.
Sometimes – not often – I’ll fall asleep with the radio on Radio 4. Because it’s not something I often do, it makes me drift in and out of sleep, waking for a few minutes every so often, sometimes reaching a level of consciousness that knows what that sound is but can’t quite summon the energy to turn it off. It creates very strange dreams – and snippets of conversations and monologues. The first time I did this, I wanted to listen again to some of the things that were coming back to me once I was awake. It took a while to find – the Radio 4 schedule just said ‘World Service’, and the latter’s schedule was quite difficult to navigate – but I found the programmes in question. Some were really interesting; one was an arts magazine programme with a world slant, and I actually bought a book they were discussing as a result.
All that interesting stuff, in just one night’s broadcasting, on a channel I would never listen to, and programmes I would never bump into on iPlayer. What if others could highlight these gems as they listen to them? What if I could highlight them to others?
I have a rule that I don’t put the TV on unless there is absolutely something I want to watch – and as I never look at the TV Guides, that is quite rare (though I can smell Family Guy on BBC Three a mile off). I have this rule because I am a Lazy Person. If the TV is on I will sit in front of it as if chained, moving for nothing, even if it was stuck on the test card (I scratch my knee, I have to scratch the other).
Because of this I listen to a lot of radio recordings in the evenings; when one programme finishes I stop what I am doing (whether online or – depressingly rarely – something in my flat that doesn’t require an internet connection; ummm, hang on, I’m going to think of an example… like… like the washing up! You see, I’m not so boring) and need to find something else to listen to.
Once I’ve exhausted the ‘most popular’ list of things that take my interest I get a little stuck. The categories on the bottom right of the radio iPlayer page never really work for me (long lists, and the fact they include stuff from all the regional stations doesn’t help). And browsing yesterday’s schedule for each station is a bore. So after the ‘most popular’ list I head for the main radio station pages on the BBC website, with the first stop being, of course, Radio 4.
You get six highlights on the Radio 4 homepage. For me, today is a bad day for them. The first seems to be a running fiction series (no thanks), another talks about making chocolate mousse (which isn’t the same as eating it), and while I’m interested in things music-related – mostly because I’m so ignorant of it – people talking about how a piece has changed them doesn’t really appeal (I’m guessing a couple met each other as a result of it, and someone from a ‘disadvantaged estate’ was destined for a life of crime until hearing it). What I need is highlights, sure, but lots of them.
After that I need to make more effort hunting for things worth a listen. And I need to pace myself: too much hard listening in one night will leave me with nothing to listen to the next. Like drug barons the world over, Radio 4 has learnt the art of limiting supply to its addicts. Bastard.
Where was I? Discovering stuff. Yes. And the thing is there is good stuff where you least think it – even Radio 2. Bloody Radio 2! (Henning Wehn, probably the funniest comedian in the UK, and Michael Grade’s documentary on Television since you ask). Radio 3 has debates, I’ve mentioned the World Service, and 6 Music obviously has a lot worth listening to.
It did occur to me this was an itch I was trying to scratch – and perhaps I could make my millions by developing the universal solution (obscure UK radio documentaries surely have the same mass-market appeal as Facebook). But while the BBC have done much with APIs, supporting developers and Linked Data, I couldn’t see any obvious way to build a third-party site to cater to my (and the millions’) needs with what was available.
I might need to follow in Bob Monkhouse’s footsteps: buy the Radio Times each week and go through the listings with a highlighter (note to self, must purchase highlighter).
But what about the other side of the coin, sharing the things I have listened to? There’s nothing really charitable about this aim. This is good honest preachiness – I’ve decided you should listen to something and YOU’RE JUST GOING TO OBEDIENTLY LISTEN TO IT. You will like what I like.
This is easier. The iplayer has sharing tools to post to various social network sites. I use Twitter and Reddit (and have a Facebook account so I can stalk people, don’t we all). So what if I could use one of the other services offered as a way of recording and sharing what I liked?
Long story short (though if you have reached this far you will have realised it’s more a long story made thankfully a little less long) – I went with Delicious. You may have heard of it.
Delicious was my bookmarking tool of choice for years, until the famous Yahoo! ‘Sunsetting’ slide leaked out. Even though the slide did not state the site was closing, and Delicious provided (as it should – but many don’t) easy export options, I didn’t like the idea of my bookmarks being on a service whose future was questionable. Besides, a recent change to their system seemed to require me to constantly re-enter my password no matter what I tried. When would these first world problems stop haunting me?
So I moved to pinboard.in – in the knowledge that there is no safer place to store my bookmarks than a one-man operation, the one man spending much of his time promoting it, baiting his competitors and mocking his users. Using Pinboard felt good – in as much as a service that remembers links for you can ever make you feel good – and as it had imported everything from Delicious, I hadn’t needed to log in to the latter until last week.
So it goes something like this. If I listen to (or watch) anything I feel is noteworthy, I now use the Delicious share option on iplayer. I use the Delicious note field to add (get this) notes, and text I want to share. I wanted this to be as frictionless as possible and originally wanted to avoid tags, but then decided that, to spare those who follow me on Twitter, I would use the tag ‘t’ for the items I wanted to share. Next I use twitterfeed (now owned by bitly, which pleases me, as I could never work out how they would find a business model), pass it the RSS feed for the ‘t’ tag on Delicious, and tell it to tweet whatever comes in.
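For the curious, the twitterfeed half of the pipeline is simple enough to sketch: poll the per-tag RSS feed and turn each new item into a tweet-sized string. The feed shape below is an assumption based on Delicious’s per-tag feeds of the time (something like feeds.delicious.com/v2/rss/username/t), and the sample item is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Invented sample of what the per-tag Delicious RSS feed might contain.
# <description> holds the Delicious notes field; <link> is the iplayer page.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <item>
      <title>Masters of Money: Marx</title>
      <link>http://www.bbc.co.uk/iplayer/episode/b01n7094</link>
      <description>Stephanie Flanders on Marx - watch it.</description>
    </item>
  </channel>
</rss>"""

def items_to_tweets(feed_xml, limit=140):
    """One tweet per feed item: the notes text plus the link,
    truncated to the old 140-character limit."""
    tweets = []
    for item in ET.fromstring(feed_xml).iter("item"):
        note = item.findtext("description", "").strip()
        link = item.findtext("link", "").strip()
        tweets.append(f"{note} {link}"[:limit])
    return tweets

print(items_to_tweets(SAMPLE_FEED)[0])
```

twitterfeed does all of this for you, of course – the point is just how little glue the whole thing needs.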
And so Delicious – a site I thought I would never use again – is now (until I get bored with the idea) my iplayer diary and, via twitterfeed, a way of telling the world (where world = my followers) what they should be listening to.
The first tweet to come out of this was for Masters of Money: Marx, written and presented by Stephanie Flanders – which, yes, is a TV show (a prime time one at that), flying in the face of most of this article. Interestingly it got a few replies and retweets and a favourite – all down to the brilliance of my tweeting I’m sure, and nothing to do with the programme being incredibly smart, interesting and well made.
So for as long as I remember to do this, I will have a record of what I have watched, and with careful consideration of the annoyance-threshold of my followers, a way to share what I have been listening to and watching.
Monday 8th October
I wrote the above on Saturday. I didn’t quite finish it, and kept meaning to get around to hitting ‘publish’. Then on Monday: BBC launches iPlayer Radio to promote audio content. The literal minutes I had spent typing this were now wasted – to think I could have spent that time staring at new DMs on Twitter saying “have you seen what they are saying in this video….” (No I haven’t, I’m totally clicking on the link later). This post is out of date before it is even published. BBC iPlayer has changed, and the new version seems to be merged with what was the main BBC Radio page.
Now, when things change on the internet strange things happen to people, especially when it’s the BBC homepage (OMG the direct link to 16th century weather formations over Essex has been removed from the homepage, do those overpaid autocratic plebish so-called experts have a clue how much they have destroyed it for EVERYONE) or Facebook (OMG my profile page now has two columns rather than one – does no one understand the pain). Yet for once I feel like getting my green pen out and joining in.
I mean, on the plus side it gives Radio its own space (though some will note bbc.co.uk/radio/ was pretty much already a space for radio) and makes listening online to live and recorded items an integral part of it.
However – green pen time! – where are my Featured and Most Listened lists? While, by definition, highlighting what the editors decide to promote, or what others are listening to, is hardly finding that rare nugget no one else has found, it was a great way of bumping in to things that you would not normally – well – bump in to.
In fact the nearest thing looks to be the highlights on each station’s homepage, similar to those I describe above on the Radio 4 homepage, back in the good ol’ days of early October 2012. Ironically – or intentionally – this highlights and strengthens the original stations that produce the content. On iplayer they were just shows; it was easy to listen to something and have no idea what station it came from (except Radio 4 Extra / Radio 7 – which oddly always adds four or five minutes of recording to the start and end of each programme).
We do have categories, like before, but these always seem to have a little too much noise to the signal. For example Factual (the place to go for things like the chat/music/comedy/not-many-facts Loose Ends, The Bottom Line and Midweek) is currently dominated by “Everything you need to know about Cumbria’s day.”, “All of Oxfordshire’s news, sport and essential information in one place.” and “Digon o sgwrsio, cyngor, cerddoriaeth a chwerthin yn fyw o stiwdio Caerfyrddin yng nghwmni Iola Wyn.” (“Plenty of chat, advice, music and laughter, live from the Carmarthen studio with Iola Wyn.”). (OK, there was the odd national broadcast in there – the odd one – but I didn’t have to point this out. This is how honest I am.) Of course, all of these are just lovely I’m sure, but I don’t want to wash up to them, and I can’t be bothered to scroll through page after page looking for the odd thing I want.
My final observation is that it seems to focus on the current. I’m sure this is deliberate but, for example, iplayer will default to showing yesterday’s schedule, which is useful when you just want to listen to things already available. I’m guessing the search feature of the Radio page is going to be quite a key feature – and they do allude to it in the introduction text – as a way of finding regulars (it could really do with auto-complete).
Today Talis announced they are moving their focus away from Linked Data. I think my initial reaction, which I tweeted, holds true: this is along similar lines to Microsoft announcing plans to move away from Windows.
Earlier this year I was at an event where I bumped in to two people from Talis. The one who knew me introduced me to the other; I think he described me as ‘a long time Talis watcher’. Whether or not I’ve quoted him accurately, that statement is probably true, and I therefore want to muse a little. Forgive me (especially if you are involved).
Here are my thoughts, in a personal capacity. I start with some history – it goes on a bit, feel free to skip past it. At the end I move on to today’s announcement.
First, I want to go back a few years. I had started working at Sussex in late 2002 and this was my first time using/running the Talis Library Management System. Talis’ history is fairly well-known: started in the late sixties as the Birmingham Libraries Cooperative Mechanisation Project (BLCMP), a co-operative shared service project between Birmingham libraries, it developed over the years to provide what became known as a Library Management System (or Integrated Library System), and other libraries – both public and academic – joined the co-operative (i.e. it was customer owned), the system being known as Talis. Around 2000 (which was around when Sussex migrated to Talis), the company changed its name to match the name of its core product – Talis – and (I think) at this point became employee owned. I should add, for those new to such things, that in the UK academic Library Management System (loans, fines, buying books, cataloguing, etc.) market there were probably around six main players. All but Talis were international companies, many with well over 1,000 customers. Talis had roughly 100 customers (half public, half academic). In revenue terms this meant, simply, a tenth of what others had to spend; I was always very aware of this and impressed by the way Talis kept up with the competition.
When I started at Sussex the system here had only been live a year, and the server was a bit of a mess from a system administration point of view. There seemed to be little structure to where components lived on the system, and many components seemed to have several versions installed, yet often an older version seemed to be the one running. The top level of the system had lots of files with silly names such as “ “ or “~”. The previous administrator was new to Unix/Solaris and, to be blunt, had taken to it like a duck to sulphuric acid. My frustration was mostly that I had no idea what was the result of specific Talis requirements and what was the whim of previous Sussex staff; what was essential to the running of the system (no matter how unusual) and what needed to be cleaned up. The crontab for root had about 1,000 lines, and I won’t even start describing the printing system.
One of these frustrations was the web catalogue. For a start it was running the CERN HTTPD (Netcraft pretty much confirmed you could have fitted all the servers still running CERN httpd into a classic Mini). Ironically the first thing I had done at my previous job was migrate the web catalogue to Apache. It is usual with third-party systems to have clear documentation as to which web server/version is supported (or the web server is even installed with the application itself, with no choice or involvement by the customer, as was the case with Prism 2). It took a while to find out if we could move to Apache (perhaps CERN was the only thing they supported), which version we should move to, and whether there was guidance on configuring Apache to serve the web application. Oh, and the web application seemed to live in about five different duplicate locations on the file system (grrr). Like a lot of the web at the time, it used frames, and images for the menu text.
In 2003 Talis released Prism, their next generation web catalogue. It ran on a separate server to the main Talis system. Based on the information we gave them they recommended two (entry level) Dell servers. These were essentially shipped to Talis where they were fully configured; we just had to plug them in. One slight annoyance was that it was a master/slave configuration, with the master passing sessions to the slave to load balance. If the master died, the whole service died.
Prism was a great leap forward: it looked like a modern web application, and I was quite pleased to see it was a Java-based application (the ugly URLs are always a give-away). We had our servers up and running by summer 2003, but did not make it our default catalogue until summer 2004. Prism was not perfect. It timed out after a period of inactivity (nothing worse than leaving a record on screen for the book you must have, only to come back to a ‘session has timed out’ message), and had no relevance ranking. One example: searching for books about the web, books starting with the word ‘web’ would come near the end (‘w’), with lots of unrelated (but matching the search) books coming before. You can see an example here (this uses Manchester Metropolitan University; we have recently shut down ours).
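A toy illustration of the missing relevance ranking (the titles and the ranking rule are invented for the example): Prism sorted alphabetically by title, so the books actually titled “Web …” landed near the end, after every incidental match.

```python
# Invented sample result set for a search on "web".
titles = [
    "Accessibility and the web",
    "Charlotte's Web",
    "Databases on the web",
    "Web design basics",
]

# What Prism did: plain alphabetical order, so "Web design basics" comes last.
alphabetical = sorted(titles)

# A crude relevance rank: titles starting with the query term float to the top.
ranked = sorted(titles, key=lambda t: (not t.lower().startswith("web"), t.lower()))

print(alphabetical[0], "|", ranked[0])
```

Even something this crude would have been an improvement over pure ‘w comes after a’ ordering.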
What made me happy was that Talis saw the release of Prism 1 (and 1.1) as just the very first steps in a long line of developments. I attended the Talis User Symposium 2003 where this was discussed. As a technical aside, I think it was at this event that I asked, if the Linux boxes were to be treated as ‘black boxes’, what would happen regarding security updates, OS upgrades etc. I was told that as they were such a minimal install, and so locked down (they’re public web servers!), this would not be required. I’ve just checked: the last of these boxes will soon be decommissioned here, and ‘cat /etc/redhat-release’ shows “Red Hat Linux release 8.0 (Psyche)” (not to be confused with Red Hat Enterprise Linux or anything modern like that) – we are running a system released in 2002, which we were told we shouldn’t be patching.
If I’m being open about Talis, then I should be open about something else: its customers. I had been involved with a couple of other library systems. Customers were often frustrated with the slow pace of developments and poor service, but meetings and conferences were always professional and diplomatic exchanges, with both sides understanding the realities of the other. So I had never witnessed such negativity and moaning, especially aimed at new developments. Each new product or version was seen by many customers purely in the negative: we’ll have to update our (overly complex and probably unneeded) user guide, it probably won’t work, it will probably have bugs, it won’t be useful to our users, what’s wrong with the old version? At conferences, on mailing lists, in meetings, the (vocal) majority were against anything new, and unable to see that while the new may not be perfect, it was considerably better than what came before and probably even the competition. (I should add this was some years ago, much has changed in the last 10 years, and due to my current job I have not attended any meeting about the LMS for at least seven years.)
In November 2004 two people came down from Talis to talk to a few of us at Sussex. As part of this we had a conversation about Prism and where it was going in the future. This mainly involved us saying ‘wouldn’t it be brilliant if Prism did X’ and them saying ‘that’s great, we’re so pleased you’re saying that, the next version can do that, it’s going to be available any day now’. That release never really came. A fix (Prism 1.2 I think) came out in early 2005 for a bug some users were having (this started out as an ‘only upgrade if you have this bug’ release, but at some point became ‘why haven’t you upgraded to the latest version yet’). Prism 1.3 was released in August 2005, and in 2006 Prism 2 was released. This did have new features: it worked with MARC21 (a newer version of a common – and dire – bibliographic standard), worked with 13-digit ISBNs and worked with Unicode. While good stuff, these were hardly features that users would really notice.
A classic example of requested functionality was the ability to export to Endnote. A method to do this did appear on the Talis Developer Network (TDN), and a senior member of staff at my library emailed me to check that I was fully on top of such things. Only… it was written by me. I had created a filter for Endnote which allowed you to cut and paste the output of Prism into a text file and then use the filter to correctly import the file into your Endnote library (it matched the text displayed next to the record details in Prism with the matching fields in Endnote). I had documented this on the Sussex website, and Talis, with my permission, had adapted it as a guide. I confess, I felt smug.
Around the same sort of time Talis Graphical was being developed. Talis Graphical was a Windows GUI to the Talis system; until then, all staff activity was via a Unix application accessed via a Telnet or SSH client. Like all Unix applications that use function keys (hello Ingres Client), term types and keyboard mappings were a bit of a dark art. I liked Graphical a lot and gave Talis a lot of credit for it, especially compared with some of the other LMS vendors’ Windows applications, which were amazingly bad. I was also impressed that all keyboard functionality was consistent between the traditional Unix application (now called Talis Text) and the new Windows client.
Talis Graphical became Talis Alto on its release. A few years later the phrase Talis Alto started to be used in some places as a term to encapsulate the entire LMS, and gradually this became the norm. A minor thing, but I wished such changes in terminology had been announced; the message – when it was queried – was that the client was the LMS, and hence Alto had always been the new name of the LMS, to distinguish it from the company (named, if you remember, after it).
The Talis Alto LMS was (is) a good system. It had its pros and cons. It used a relational database AND actually used it as a relational database (in the library system world this double is rare), and as a company they were open and approachable. At the same time their documentation was poor, and trying to find out what documentation was current, and likewise what the latest release of a product was, was a pain. At one point, after an update, the ‘scratch’ area on our system kept filling up, and on inspection a good number of very large files were there. After more work than it should have been, it turned out the latest release had a new indexing system: half the files were the (essential) indexes and half were meaningless log files from the indexing operations. All stored within scratch, with no notice to set up jobs to plan for these large files, or to rotate the log files. Elsewhere, it felt like half the system was run by a series of slightly confusing Perl scripts where it was never fully clear what did what. It felt like it was painful for the system to adapt to a MARC21 world, where bibliographic records are imported, bulk changed, and dropped with regularity (mainly for online content).
In 2005/06 Talis started to take on a new lease of life and confidence. Talis Insight 2005 had a list of high profile speakers from the technology and library worlds. A number of new initiatives and projects were started (see some of them at the bottom of the forums). The Talis Developer Network was started. Linked Data was being talked of, Panlibus the blog started (so did many other blogs), and Panlibus the magazine did too. And then there’s Talking with Talis and the Library 2.0 Gang. There was a library mashup competition, and the first mentions of the Talis Platform. Above all, Talis had recruited many new staff, and it was clear from the blogs and concepts such as the TDN that they had some very smart people who understood the web and how to do things properly. This gave me confidence in the applications I could expect to see in the future.
One of the early things I liked was Richard Wallis’ Talis Whisper project, a ‘catalogue’ which showed the price of the item on the right as you moved over search results (using the Amazon API), had basic search auto-complete, and could show you via a Google Map which library had the item. In terms of look and feel, and these features, this was light years ahead of what we had. Soon after, we had Project Cenote – another library catalogue demo, built on top of the new Talis Platform. It was quick to build and showed the power of building applications on top of the new Platform thingy.
Talking of the Platform, this was something that was everywhere: on the library (Panlibus) blog, the TDN, on the new more technical blogs now being set up, and it had its own newsletter. In 2007 some clueless wanna-be-developer tried to go through all these references to the Talis Platform and BigFoot and work out what these things were. The results were embarrassing.
Talis were certainly raising their profile: nationally and internationally, and beyond the traditional library market. Their call for open data, APIs, mashups and open standards struck a chord with me and gave hope for systems working together, especially (as we were Talis customers) systems we were paying for. They were pushing for the things I wanted. And they were telling much larger players in the international library market how to do things right, while talking about cutting edge stuff at the WWW conference. And as we rolled in to 2008, I got on to Twitter, where Talis staff were active, open and doing interesting stuff.
A few things troubled me. First, I loved the Whisper catalogue demo, and waited, and then loved the Cenote catalogue demo. And I knew they were quick demos, and I knew they didn’t have half of the catalogue functionality needed in the real world, and I knew real products need testing, and I knew that they need to be built for exceptions, and for load. But… but… as I looked at Prism – the catalogue that was going to see regular updates and lots of new features from the day it was released in 2003, but didn’t – I couldn’t help thinking: how long would it take to get just some of this stuff into Prism the product, Prism that we pay for? These demos did not time out, had a modern web look, nice URLs, worked on OS X(!). The ideas, projects, research areas, mashups and demos kept on coming on the blogs, but when was this going to filter into, well, you know, the products?
There seemed to be a separation between Talis the producer of cutting edge thinking and ideas, and Talis the library systems vendor, who – like many LMS providers – had a product that was looking somewhat tired, and not exactly technologically cutting edge (though I imagine the small team working on it had much better things to be doing). This separation formalised as Talis showed a clear structure between its library application side and the Talis Platform side. You can see this clearly in the March 2007 Talis homepage and the May 2007 homepage. Would the ideas of web technology, standards and openness filter from one to the other?
And there was something else bothering me: the business plan. The Talis Platform had been public for some time; clearly a lot of development had gone into it, and a lot of documentation, guides, web casts, conference talks and more (all costing time and money). The same regarding modern library technology, with talks around the world, the world-wide Talking with Talis / Library 2.0 Gang podcasts, and mashup competitions. What was the aim? Their library market was the UK and Ireland, and I could see the Platform had a potentially global market, but what was the plan regarding libraries? Were they going global there too? If not, why the push to market themselves (via talks and competitions) globally specifically in the library sector? Why not focus on the markets they planned to sell in and leave the global stuff to those advocating the global Talis Platform? How did a mainly US based Library 2.0 Gang help them with a fairly conservative (and often cynical of technology) UK library customer base? Meanwhile, while it was understandable in the early days of the Platform that they were trying to build momentum and mind share, when were they going to sell something? ‘Click here to buy Platform space’… or ‘Contact us to do a big enterprise deal’. The website seemed short on calls to action. All this work, while good for the community, and great for someone like me, cost money and resources from a small company that I wasn’t convinced could afford to essentially pay for such broad community building, in the hope they would get some of the eventual business.
Talis had grown many tentacles, and had clearly made a name in the linked data and library technology sectors, but hard revenue on the back of this seemed slow. And this is from a customer (normally customers are crying out for companies to spend less time worrying about the bottom line and more on cutting edge stuff!).
In 2007 ‘Prism 3’, a new version of the catalogue which would be hosted, was much trumpeted by Talis, at account meetings and elsewhere, as ready for release by the end of the year. It would be built on the Talis Platform, and all my concerns about ‘the talk’ not becoming ‘the walk’ would be gone.
This was good not just because I wanted to see some of the cool web stuff in our catalogue, but because of something more important. The National Student Survey. For those outside of UK HE, the NSS is big, and doing well in it is important. Essentially, finalists fill out a survey of questions about their experience while at University (rate your feedback from assignments from 1-5), one of which is about the Library. Like most Universities, Units such as the Library at Sussex produced a plan as to how they would increase student satisfaction (and hence the NSS score), and while certainly not a core issue, the catalogue had been noted as a weakness and we had therefore committed to improving it. It was a concern therefore when the end of 2007 came and went, and in early 2008, while we could see early demos of the new product (then ‘Project nugget’), it was now clear it would not be ready for the summer, when Universities traditionally launch new services.
Instead we went for a relatively new (but actually ready to use) product called Aquabrowser (implemented by the likes of Harvard, Cambridge, Edinburgh and Chicago) which could import MARC21 records from any system and act like a catalogue (well it could, if you ignore the need to log in and renew books and place reservations, which as it happens is quite a big need). We hit a very common problem in the library technology world, and one I rant about at every opportunity (including this one): while MARC21 may be a bad standard, it is, at least, a standard, and it is, at least, used by most library systems for bibliographic information. The other type of information essential for a library catalogue is holdings and availability information, i.e. for each item, the shelfmark and whether it is on loan. This is simple information, and a common standard could be developed in an afternoon in a pub on the back of an envelope, and yet one does not exist. Instead, Aquabrowser screen scraped the Prism 2 record page for the item it wanted to show details for (all geeks will dance with joy at this wonderful solution). Active development in Aquabrowser seemed to end the day we signed up for it.
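In the absence of any holdings standard, screen scraping really is as grim as it sounds. A minimal sketch of the idea, with invented markup (the real Prism 2 pages were different, and messier): fetch the record page and pull the shelfmark and loan status out of the HTML by pattern matching.

```python
import re

# Invented stand-in for a Prism 2 record page's holdings table -
# the real markup differed, which is exactly why scraping it is fragile.
RECORD_HTML = """
<table class="holdings">
  <tr><td>Shelfmark</td><td>HB501.MAR</td></tr>
  <tr><td>Status</td><td>On loan until 08 Oct 2012</td></tr>
</table>
"""

def scrape_holdings(html):
    """Pair up label/value cells from the (assumed) holdings table."""
    cells = re.findall(r"<td>(.*?)</td>", html)
    return dict(zip(cells[::2], cells[1::2]))

print(scrape_holdings(RECORD_HTML))
```

The obvious problem: the moment the catalogue changes its HTML, every ‘integration’ built this way silently breaks, which is why a proper holdings standard matters.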
A quick note about another of Talis’ products: TalisList. TalisList was a reading list system, and the unloved ugly duckling. At one event, which a colleague was at, a senior person at Talis said ‘with the new Talis Platform it’s just going to take a few months of Agile scrums to quickly build the new TalisList application, that’s why this stuff is so good’. This follows the given wisdom in IT that to accurately predict development time you should take the time given in months and convert the units to years. And then add a couple more as well. At the end of it, we had Aspire, of which we were the second customer.
In March 2011 Talis announced the sale of the Library division to Capita. This was big news and, I confess, quite shocking; I hadn’t seen it coming. In a sense it quickly made sense: Talis was more and more acting like two companies under the same roof. One provided a traditional library system and made money from support, consulting and extra services. The other had the Talis Platform and a number of exciting developments around it, plus the Education side, built on top of the Platform, which consisted of Aspire.
Talis had made a big statement: they had sold the unit they were named after, which was their history and main source of revenue. It showed a coming of age of the Linked Data work that had been taking place, and a confidence that there was business to be made. The sale of the Library division gave them cash to get going and allowed them to be lean and focused.
The library systems market had matured: most libraries had a system they were happy with, and the cost of changing systems (training, integration with other existing systems) was high. While there was on-going income from service contracts and new developments, this was not a good fit for where Talis wanted to be; for a company like Capita there must be additional benefits in offering such a system as part of a larger suite of products to their public sector customers.
The split was, from the outside, a clean one. The only slight grey areas were the web catalogue (Prism 3), on the Library side, being built on top of the Talis Platform, now on the Talis side (but then, Talis were always keen to host other companies’ data); meanwhile many customers of the library system (Alto), now owned by Capita, were also customers of Aspire, which remained with Talis. Any benefits for customers of having two products from the same supplier were gone.
Talis continued as three units: Talis Education (Aspire), Kasabi (a data store and related services) and Talis Systems, which provided consulting. Presumably the developers and system administrators who managed the actual Platform fitted under the latter.
And now we get to the point of this post. All of the above is really just a very long comment in parentheses, to set the context of where I am coming from. I’ve been following Talis for a long time, partly because it is what I should do when they are our main system supplier, and partly because I was interested in their direction. And as my job moved more into innovation and new developments, again, this was useful for work (as one example, by following Talis, I grew interested in Linked Data, which in turn led me to the idea for SALDA, which became a JISC project).
From today’s announcement:

But there is a limit to how much one small organisation can achieve. In our view, the commercial realities for Linked Data technologies and skills whilst growing is still doing so at a very slow rate, too slow for us to sustain our current levels of investment.
We have therefore made the decision to cease any further activities in relation to the generic semantic web and to focus our efforts on investing in our growing Education business.
Effective immediately we are ceasing further consulting work and winding down Kasabi. We have already spoken to existing customers of our managed services and, where necessary, are working with them on transition plans.
You can see some extra thoughts here.
There are a lot of talented people at Talis, and this must be a sad day for both current and former staff. This has been their passion, and endless code, documentation, talks, presentations, plans and meetings have gone into building what they currently have. To think it could all be turned off must be an incredible blow.
While working in my cosy public sector job, I admire those who take risks. I follow the start-up scene in the US and closer to home, and in my own small way try to encourage those who work in universities to try new ideas and be less afraid of risk (one example being blogging about a key commercial supplier in a way that might damage a client/customer relationship). Those who never fall never take big steps.
And Talis did take a massive risk. Credit to them. They sold, let us be blunt, the cash cow which must have brought in the vast majority of revenue. Yes, this created a large pile of one-off cash, but it would not last, and they now had a set time limit for the Linked Data work to prove itself. Like any company, they would have planned this carefully, trying to predict growth and take-up of their services and plan for profitability.
Which perhaps makes today’s announcement even more surprising. No one could predict how long the economic downturn would continue, how much the Tories would cut back on new public sector IT projects, or how ideas like Kasabi would go down. But my basic calculations as to how far several million pounds from the sale, plus consulting fees, plus Aspire fees, would go did not leave me thinking that the crunch point would be now. Of course no company waits until its last pound before taking appropriate action when outgoings are more than income, but I’m amazed that just 16 months after confidently saying Linked Data is the future for Talis, we now hear the opposite is true.
Two out of three divisions are to close, the third looks like it will be re-engineered to move away from Linked Data. (Ironically, in this blog’s drafts folder is a recent attempt at playing with the Aspire API and being ever so slightly frustrated – while acknowledging my limited experience – about how the linked data concept can result in having to make many calls to the API for what one can get in just one API call using the laughably uncool CSV format).
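The frustration above can be made concrete with a toy sketch. This is emphatically not the real Aspire API – the URIs, fields and structure below are invented for illustration – but it shows the shape of the problem: a linked-data style API where each resource links onward to others means several round trips to assemble one reading list, where a flat CSV export would hand over everything in a single call.

```python
# Hypothetical sketch, NOT the real Aspire API: a toy in-memory "server"
# where each URI returns its own data plus links onward to other resources.
RESOURCES = {
    "/lists/1": {"title": "Week 1 reading", "items": ["/items/a", "/items/b"]},
    "/items/a": {"book": "/books/x"},
    "/items/b": {"book": "/books/y"},
    "/books/x": {"title": "Book X"},
    "/books/y": {"title": "Book Y"},
}

calls = 0

def fetch(uri):
    """Stand-in for one HTTP GET against the API; counts round trips."""
    global calls
    calls += 1
    return RESOURCES[uri]

# Linked-data style: follow links from the list, to each item, to each book.
titles = []
listing = fetch("/lists/1")
for item_uri in listing["items"]:
    item = fetch(item_uri)          # one call per item
    book = fetch(item["book"])      # one more call per book
    titles.append(book["title"])

print(calls)   # 5 round trips to get what one CSV row-set could contain
print(titles)
```

Five requests for two titles; the laughably uncool CSV file would have been one.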
There were some signs: in the last six months a number of key people have left, and Aspire development has made mention of a new infrastructure and platform. I did find talk of this odd when so much of Aspire’s infrastructure is The Platform, removing the need to worry about it, so I guessed one interpretation of moving to a new platform involved moving away from the Linked Data based Talis Platform.
And what about The Platform? It really is the thing that has been there from the start; for many years you could see the release notes for the latest monthly upgrade to the Platform’s software on a wiki, and it was constantly being improved. The announcement does not explicitly say what will happen to the Platform. In the short term, not a lot: Aspire and Prism 3 are built on it, and I have no idea if other third parties are using it (plus, disclosure, we have some data on there).
Perhaps the Platform on its own could be profitable? And if companies don’t like the idea of hosting their data on someone else’s servers, can the Platform be bundled up and sold as a standalone product to be deployed internally by large enterprises? I guess probably not, especially if it is built with other third-party components.
I admire Talis’ steps to create a new company in a new area, in uncharted waters, but wonder if there is anything to be learned from this: could more have been done to test the market and accurately plot take-up, demand, and the best range of products/services before taking the jump?
Aspire (a University reading list system) has good take-up in the UK, and even some international customers. And while breaking into new countries is hard – and every country has a different HE setup (especially the US) – Aspire is a fairly unique product which might have many potential customers in countries that run Universities similarly to the UK. There are also complementary developments which may interest Universities (though with Moodle being free and Open Source, moving towards VLE/e-learning-like functionality would be difficult).
I don’t normally publicly dissect the decisions and history of a business, nor my frustrations and experience with a commercial system in such detail (though when I have here, it is mainly from at least several years ago). I hope I haven’t upset anyone – or at least not too many people. Since I joined Sussex, nearly ten years ago, Talis has reinvented itself several times; I’ve enjoyed trying to second guess its next move.
But I end with this. Never am I reminded so much that I am a creature with a small brain, of at best average intellect and shockingly poor ability to grasp what should be basic concepts, than when I read the blogs of those who work, or have worked, at Talis. Time and again I am blown away by smart thought, insight, comment and ideas. I wish Talis the best in its new direction (so long as that direction involves making Aspire bloody awesome… I’m still a customer).
To read this post you first need to pop along here. [opens in a new tab]
Then go to Street view at the precise point where the pin is (where the road crosses the river).
Do it now, we’ll be waiting for you.
…You hopefully saw a bizarre view: dark, warped, and not at all clear what it was, but at the same time there was the Street View overlay showing there was a road ahead.
Meanwhile, trying to rotate to the left or right results in more oddness. Depending on how you do it, it can feel like you are rotating around the walls of the tunnel. (When you first go to Street View, pull down – the drag that would normally move the camera up to look at the sky – and you should see the tunnel ahead; you can then drag along the top, or bottom, of the screen from left to right, or use the compass in the top left.)
You might well have realised what it is. Try using Street View a little to the right of the pin on the map, where the junction is with what looks like a cul-de-sac, and then move towards the tunnel for an idea of what has happened here. The Street View car couldn’t go through the tunnel with the cameras up, so the bar with the cameras was lowered. This leaves the ‘forward’ camera facing up, and the rear camera facing the car itself. However, the controls still act as if you are seeing the normal view, making it odd to control.
For info, the water is actually the Grand Union Canal. Just below this road is where the river Nene (not far from its source and still small) crosses the canal, heading towards Northampton and on to The Wash. The railway you can see is the West Coast Mainline.
Tomorrow, Tuesday 14th February, is the start of Dev8D 2012, a three-day event for developers working in UK Higher Education. Which I am not really one of, but don’t tell anybody and we should be ok. It’s an event that is on steroids: last time I went, my brain was pounded with more information to digest in 30 minutes than I would normally expect to receive in an entire day. It’s also bloody fantastic. I’m chuffed I can go this year.
The organisers have once again predicted the sessions I would like to attend and deliberately made sure they all clash. The Government should put a stop to this. Last time I tried to counter this by walking between the sessions of interest hoping to benefit from both. This did not work.
Had I mentioned I’m not a developer? Yes. But I’m very aware that whereas developers a few years ago were talking LAMP (Linux, Apache, MySQL, Perl/PHP/Python), they are now talking almost anything but. The OS is of almost zero importance, so too is the web server (unless it is node.js, it seems), MySQL is the uncool kid in the corner while hipsters MongoDB, CouchDB, Solr and Redis are on the dance floor, never-cool PHP has become uncooler, Perl has been forgotten, and Groovy, Ruby (on Rails, of course), Scala, Erlang, Haskell, Clojure and R are where it’s at. This article from Simon Willison sums it up nicely. If I walk away with half a clue as to what some of these technologies are useful for, and how I might use them, then job well done.
Anyway… My Plan:
I am also expecting to consume: (a) Beer (b) Gin. And maybe even interact with people in some sort of attempt at (a) being social (b) networking. If you see me there, say hi.
PS it is against the rules to quiz me after the event about any of the items above.
TwapperKeeper is shutting down, it seems. It’s a popular online tool for archiving hashtags and other Twitter searches, and certainly well used in UK Higher Education, where I work. I actually met John O’Brien, founder of TwapperKeeper, when he attended Dev8D (a developer event for those working in UK HE) a couple of years ago – nice chap.
Anyway, Martin Hawksey has created a wonderful tool for archiving, ummm, archives before they are gone. The tool is actually a Google Spreadsheet, and to me it’s a testament to Google Docs’ power that an application that fetches and stores data from another site can be created using it.
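The core of what a tool like that does is simple enough to sketch. This is a rough, hypothetical illustration – not Martin’s actual spreadsheet code, and the endpoint and field names are made up – but the pattern is the same: page through a search/archive API and append each page of results as rows in a growing table, one row per tweet.

```python
# Hypothetical sketch of the archive-paging pattern, not the real tool.
# fetch_page stands in for one request to a paged archive API; here it
# serves three fake pages, the last one empty to signal the end.
def fetch_page(page):
    fake = {
        1: [{"user": "alice", "text": "#dev8d is great"}],
        2: [{"user": "bob", "text": "slides up #dev8d"}],
        3: [],  # empty page: nothing more to archive
    }
    return fake.get(page, [])

rows = []   # the spreadsheet's growing table, one row per tweet
page = 1
while True:
    batch = fetch_page(page)
    if not batch:
        break  # reached the end of the archive
    rows.extend([t["user"], t["text"]] for t in batch)
    page += 1

print(rows)
```

Swap the fake `fetch_page` for a real HTTP call (in Apps Script terms, something like a `UrlFetchApp` request) and a sheet-append instead of `rows.extend`, and you have the essence of it.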
Here’s my Twappers. Saved thanks to Martin’s brilliant spreadsheet.
I make no claim on owning any of the data. I’m guessing the original tweeters do. Or maybe Twitter Inc. Or Facebook. Actually it’s definitely Facebook. And it’s already alerted your mum that you’re reading this. Sorry.
There’s a long tradition of bloggers blogging about blogging. This post follows in the self-obsessed insular tradition.
This blog post is somewhat unusual in that I’m posting it to nostuff, my blog. I’ve been using Posterous a bit of late and like it a lot. Posterous is also set up to post to a WP instance on nostuff here: http://www.nostuff.org/posterous/. It was set up as a way to archive an externally hosted service, but the end result is very usable (and seems to rank quite highly in Google).
Why am I using Posterous more, and this blog less? A number of Posterous posts have started off as ‘I can’t quite fit this into 140 characters on Twitter, so I’ll use Posterous’, and I normally end up writing much, much more than I intended.
I also find using Gmail as a post creator rather nice; it makes me focus on writing, as it doesn’t support any fancy formatting or blog-specific features.
In fact the composition window of WordPress has always been its weak point. Does this put me off using it? Even though the WP developers have put a lot into the interface, it is, at the end of the day, a TinyMCE (or similar) WYSIWYG editor. The text box always seems a little small to me, and it feels a little like editing a form. I wish it looked more like a Google Doc, taking up nearly all the screen with the text editor, large text, and excellent ‘constant save’ / view changes support.
What’s not helpful is that my blog is hosted in the States, on a server that isn’t always as responsive as it should be, so the experience feels slower than using Gmail. Finally, the categories, tags, permalink etc. all make blogging feel like a ‘heavy’ experience, even though – of course – I’m free to ignore them. The simplicity of Posterous was liberating.
So, for this post, and others recently on this blog, I’ve used a client called ecto to compose it, and at the end post it to WordPress. There is something ironic in doing just about everything in a browser except composing something that is so at its heart a web-based thing, a blog. You could argue that writing (relatively) long bits of text is better suited to local apps than web apps, but this doesn’t explain why I head for Google Docs and Google Spreadsheets by default rather than Word or Excel. I selected ecto many years ago after trying it and MarsEdit. I can’t help thinking I made a bad choice, as ecto has not since had a single update and MarsEdit has gone from strength to strength. Still, while it works I shall resist paying the £28 for MarsEdit. This is a rare area where the Microsoft alternative is better and free.
One to(o) Many
When I started out with this blog many years ago, I thought I was in the same boat as many of my peers. Over time I’ve noticed that I’m quite rare in that nostuff.org/words is an ‘anything goes’ blog – most are either work/professional related or on a particular interest. In fact very few of the blogs I follow are of a general style such as this (Dave Pattern and Tom Roper are two examples I can think of which buck the trend). Many people seem to have a professional blog, perhaps a specialist blog (cooking, running, knitting, etc.) and increasingly perhaps a tumblr for random stuff.
I’m rather keen to keep the general feel. I know I’m guilty of letting work and personal intermingle, but you can categorise blog posts (and probably tweets too) as one of: I’ve done something I want to share; I’ve got a view/opinion on something; and the third, related to the second, I want to reflect on something (which this post probably falls under). I like the idea of this blog reflecting all of those, no matter what subject they are about. This space is a dump of my thoughts and things worth sharing, and I prefer to let categories and tags provide a way of filtering for people who are only interested in part of it. Of course, there’s an argument that you may want to stop people in a professional network (I hate that phrase) from seeing your thoughts and rantings outside of work. It’s a very good argument, but one so far I’ve resisted changing what I do because of it.
Of course, the idea that this is the place for the thoughts and outputs of Chris Keene is nonsense. As mentioned above, I also dump stuff on Posterous. I’m using Google+ more, and there’s flickr, youtube, comments on other blogs and, most of all, twitter. There’s no easy answer to this. I’ve gone away from the route of adding a lot of plugins from other sites to the sides of this blog – it makes things look messy – but that does leave a hole in my general ‘this is my dumping ground’ philosophy.
I haven’t done much with this blog over the last year though I do have some things on my mind.
I have one final idea but it will take more than a bullet point to explain.
For a long time I have felt that blog comments leave me wanting, on my own and other sites. I only get to see those who commented before me, I probably won’t see those left after mine, and people who have already commented will not see mine. If I write a long comment with some good points, I have no way of keeping a record of that comment, i.e. there’s no way to see a list of all the comments I’ve left on other sites. Wanting to refer to an old comment of mine relies on me remembering which blog and post it was connected to. Finally, managing comments on your own blog can be hard work, even with the impressive – free – WordPress spam plugin.
What’s more, I see an increasing number of blog sites either just keep comments turned off (unless you’re very popular, commenting is rare) or use an external commenting solution such as Disqus.
I’d like to use Google+ as my comment system. And I don’t think this is currently possible.
I think Google+ would make an excellent platform for commenting (so would friendfeed, which it is almost identical to). Everyone can see a public post. All comments are listed together under the link to the post on Google+, and everyone can see every comment even if they don’t follow the other person (unlike twitter and, probably (it’s too hard to understand) Facebook). You can come back days later and easily see new comments. If you didn’t see the post you can glance at the comments to see if it is of interest. If you are not interested you can just skip past the post in your stream (the comments will be wrapped up so it won’t take much space). Facebook is too much centred around a closed set of people you follow. Twitter is very much for the here and now, miss it and it’s gone.
As much as I’m a fan of Twitter, it does have flaws (by design, not implementation). As noted above, it’s easy to miss things. I sometimes go back in the timeline and come across a really useful conversation that I could have missed. What’s more, I know that I may see person A’s original tweet, and the conversation that B and C – who I also follow – have with A about it. What I miss (and remember, I was lucky to bump into this at all) is D and E commenting on it with A, which I don’t see as I don’t follow them. Nor do I see A and B replying back to them, and hence I might miss the bulk of the conversation. What’s more, I miss F, who follows C, joining in, which starts a whole new track with others! I miss all of this; in fact no one is guaranteed to see all of it. Meanwhile those who are totally uninterested are getting bored of seeing these tweets about a specific (and probably quite anal) topic.
So I would love for my blog to autopost to Google+ and then use the Google+ post as the place to comment, ideally showing it below the blog post – in a similar way to TechCrunch showing Facebook comments under its posts (but why a tech site uses Facebook, hated by much of the geek community, is beyond me… but then it is TechCrunch).
I still feel having a blog is useful. I’ve never attempted to update it regularly – nor have I ever understood the idea that there is pressure to post on a regular basis.
To an extent, I don’t see a difference between ‘maintaining a personal website’ (which was what we did before blogging) and keeping a blog. Occasionally you have an idea to create a page about something; blog software just makes the process easier. Posts are just (the new) pages.
In my mind an online presence – a website and domain – are essential to those who spend much time on the web. Simply having a series of profiles on popular websites just doesn’t seem the same. And words on nostuff.org continues to be the main part of the nostuff.org content-free experience, living up to its name.
Update: Feel free to leave comments here :)