Now it's time to talk about something else.
This week my partner's sister and niece are visiting from Helsinki, so we've both got a few days off work, and we'll be acting like tourists.
Otherwise, the job this week is to find a local photographer to shoot the pair of us. I've shot her many, many, many times, and we have plenty of nice pictures of me, but we have practically zero photos of the two of us together.
I've spent a lot of time talking to local volunteers and models, because I like shooting them, but I only know a couple of photographers.
Still, it's a big city; we're bound to find somebody suitable :)
There’s still just under a week to be in with a chance of getting your hands on two tickets for the BFI Doctor Who 50th anniversary screening of the Paul McGann TV movie. The man himself will be there on 5th October as part of the panel after the screening. I’m giving away a couple of spare tickets to people who donate to my Malawi Mission, which is raising money for the AMECA Trust to provide better health care in Africa.
Just donate £5 (or more!) to my charity fund between 9th September and 22nd September, mentioning “BFI” in the message, to be in with a chance of winning the tickets. Remember to leave your e-mail address when you donate: don’t donate anonymously! Donate here to enter the draw.
Please see last week’s blog post for the full details and T&Cs. And thank you for donating!
I recently mentioned that there wasn't any built-in node.js functionality to perform IP matching against CIDR ranges.
This surprised me, given that lots of other functionality is available by default.
Happily, the NPM documentation was pretty easy to follow.
Now I can take a rest, and stop talking about blog-spam.
Living dangerously, I switched DNS to point at the new codebase during my lunch hour.
I found some problems immediately, but nothing terribly severe; certainly nothing that couldn't wait until I'd finished work to attend to.
I've spent an hour or so documenting the new API this evening, and now I'm just going to keep an eye on things over the next few days.
The code is faster, definitely. The load is significantly lower than it would have been under the old codebase - although it isn't a fair comparison:
The old XML-RPC API is still present, but now it just proxies to the JSON-version, which is a cute hack. How long it stays alive is an open question, but at least a year I guess.
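The hack boils down to unpacking the XML-RPC struct into a plain object, replaying it against the JSON code, and wrapping the verdict back up as an XML-RPC response. Here's a rough sketch of the shape of that shim; it is not the real blogspam.net code, only handles flat string members, and the field handling in the actual service will differ:

```javascript
// Sketch of the XML-RPC shim: pull flat <struct> string members out of
// the request body, so they can be handed to the JSON-based tests, and
// wrap the verdict in a methodResponse. Illustration only - a regex is
// not a real XML parser.
function xmlRpcStructToObject(xml) {
  const obj = {};
  const member =
    /<member>\s*<name>([^<]+)<\/name>\s*<value>(?:<string>)?([^<]*)(?:<\/string>)?<\/value>\s*<\/member>/g;
  let m;
  while ((m = member.exec(xml)) !== null) {
    obj[m[1]] = m[2];
  }
  return obj;
}

function xmlRpcResponse(verdict) {
  return '<?xml version="1.0"?><methodResponse><params><param>' +
         '<value><string>' + verdict + '</string></value>' +
         '</param></params></methodResponse>';
}
```

The nice property of doing it this way is that the legacy endpoint contains no spam-testing logic of its own, so the two APIs can't drift apart.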
God knows what my WordPress developer details are. I suspect it's not worth updating the WordPress plugin, since nobody ever seemed to love it.
These days the consumers of the API seem to be, in rough order of popularity:
There are a few toy users, like my own blog, and a few other similarly small blogs. All told, since lunchtime I've had hits from 189 distinct sources, the majority of which don't identify themselves. (I'm tempted not to process their requests in the future, but I don't think I can make such a change now without pissing off the world. Oops.)
PS. Those ~200 users? They've had 12,000 spam comments rejected since this afternoon. That's cool, huh?
tl;dr: I wrote some bash scripts; scroll down to have a laugh at how bad I am at it.
Get them with bzr branch lp:~popey/+junk/phablet-testing
Over the past few months we’ve had a lot of fun testing out the Core Apps and more recently new apps submitted for the Ubuntu App Showdown. Frequently someone will drop by #ubuntu-app-devel or the Ubuntu App Developer G+ Community and ask us to test their latest app out.
Many developers develop their apps on an Ubuntu desktop using the Ubuntu Touch SDK Preview and only run them on that PC, with no mobile device to test on. Clearly there are constraints when running apps on mobile devices, and all the developers are keen to ensure theirs work well on all form factors. With this in mind, some of us at Canonical (and some community folks too!) who are fortunate enough to have devices have offered to test the apps out for them.
The Core Apps are in a PPA which gets included in the daily builds, so we can simply apt-get update and apt-get dist-upgrade to get the latest version of those apps, once they’re built and have passed through some automated testing and review. However, many apps aren’t in a PPA, especially those apps being created for the App Showdown. They’re mostly just in personal branches on Launchpad. As we get asked to test these a lot, I wanted to try and automate the process, so it’s not too much of an interruption to my day when people request ad-hoc testing.
The requirements were:-
So that’s why I made those scripts. Hope they’re useful to someone. Let me know if they do/don’t work or if there are better ways to do it. Note: There almost certainly are better ways to do all of it.
Some history of the blogspam service:
Back in 2008 I was annoyed by the many spam-comments that were being submitted to my Debian Administration website. I added some simple anti-spam measures, which reduced the flow, but it was a losing battle.
In the end I decided I should test comments, as users submitted them, via some kind of external service. The intention was that any improvements to that central service would benefit all its users. (So I could move to testing comments on my personal blog too, for example.)
Ultimately I registered the domain-name "blogspam.net", and set up a simple service on it which would test comments and judge them to be "SPAM" or "OK".
The current statistics show that this service has stopped 20 million spam comments since then. (We have to pretend I didn't wipe the counters once or twice.)
I've spent a while now re-implementing most of the old plugins in node.js, and I think I'll be ready to deploy the new service over the weekend. The new service will have to handle two different kinds of requests:
These will be submitted via HTTP POSTed JSON data, and will be handled by node.js. These should be nice and fast.
These will come in via XML-RPC, and be proxied through the new node.js implementation. Hopefully this will mean existing clients won't even notice the transition.
I've not yet deployed the new code, but it is just a matter of time. Being node.js-based it is significantly easier to install, update, and tweak, so hopefully I'll get more contributions too. The dependencies are happily very minimal:
The only significant outstanding issue is that I need to pick a node.js library for performing CIDR lookups - "Does 10.11.12.23 lie within 10.11.12.0/24?" - I'm surprised that functionality isn't available out of the box, but it's the only omission I've found.
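For IPv4, at least, the check itself is only a few lines of bit-twiddling: mask both addresses with the network mask and compare. A hand-rolled sketch of what I want from such a library - no input validation, no IPv6, which is exactly why a proper library is preferable:

```javascript
// Hand-rolled IPv4 CIDR test, illustration only: a real library would
// validate its inputs and handle IPv6 too.
function ipToInt(ip) {
  // "10.11.12.23" -> 32-bit unsigned integer.
  return ip.split('.')
           .reduce((acc, octet) => (acc << 8) + parseInt(octet, 10), 0) >>> 0;
}

function inCidr(ip, cidr) {
  const [network, bits] = cidr.split('/');
  // Build the network mask, e.g. /24 -> 0xFFFFFF00.
  const mask = bits === '0' ? 0 : (~0 << (32 - parseInt(bits, 10))) >>> 0;
  return ((ipToInt(ip) & mask) >>> 0) === ((ipToInt(network) & mask) >>> 0);
}
```

The `>>> 0` dance is needed because JavaScript's bitwise operators work on signed 32-bit integers, and addresses above 128.0.0.0 would otherwise come out negative.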
I've been keeping load & RAM graphs, so it will be interesting to see how the node.js service competes. I expect that if clients were using it, in preference to the XML-RPC version, then I'd get a hell of a lot more throughput, but with it hidden behind the XML-RPC proxy I'm less sure what will happen.
I guess I also need to write documentation for the new/preferred JSON-based API...