LUG Community Blogs

Jonathan McDowell: Fixing my parents' ADSL

Planet ALUG - Sat, 18/01/2014 - 07:22

I was back at my parents' over Christmas, like usual. Before I got back my Dad had mentioned they'd been having ADSL stability issues. Previously I'd noticed some issues with keeping a connection up for more than a couple of days, but it had got so bad he was noticing problems during the day. The eventual resolution isn't going to surprise anyone who's dealt with these things before, but I went through a number of steps to try and improve things.

Firstly, I arranged for a new router to be delivered before I got back. My old Netgear DG834G was still in use and, while it didn't seem to be the problem, I'd been meaning for a while to get something with 802.11n instead of the 802.11g it supports. I ended up with a TP-Link TD-W8980, which has dual-band wifi, ADSL2+ and a GigE switch, and looked to have some basic OpenWRT support in case I want to play with that in the future. Switching over was relatively simple, and as part of that procedure I also switched the ADSL microfilter in use (I've seen these fail before with no apparent cause).

Once the new router was up I looked at trying to get some line statistics from it. Unfortunately, although it supports SNMP, I found it didn't provide the ADSL MIB, meaning I ended up doing some web scraping to get the upstream/downstream sync rates, SNR and attenuation details. Examination of these over the first day indicated an excessive amount of noise on the line. The ISP offer the ability in their web interface to change the target SNR for the line. I increased this from 6dB to 9dB in the hope of some extra stability. This resulted in a 2Mb/s drop in the sync speed for the line, but as this brought it down to 18Mb/s I wasn't too worried about that.
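The scraping itself boils down to fetching the router's status page and pulling numbers out with a regular expression. The sketch below is purely illustrative: the TD-W8980's actual markup is undocumented, so the sample string, the regex and the field names here are all assumptions that would need adapting to the real page.

```python
import re

# Illustrative stand-in for the router's status page -- the real TD-W8980
# markup will differ, so treat the format (and the regex) as an assumption.
SAMPLE = '''
var DSLStatisticsPara = new Array(
"18000", "1000",
"9.1", "12.5",
"33.0", "19.8"
);
'''

def parse_stats(page):
    """Label the quoted numbers scraped from the stats page."""
    values = [float(v) for v in re.findall(r'"([0-9.]+)"', page)]
    keys = ["down_rate_kbps", "up_rate_kbps",
            "down_snr_db", "up_snr_db",
            "down_atten_db", "up_atten_db"]
    return dict(zip(keys, values))

stats = parse_stats(SAMPLE)
print(stats["down_snr_db"])  # log this periodically (e.g. from cron) to graph later
```

Run from cron every few minutes, the resulting log gives exactly the time series needed for the graphs below.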

Watching the stats for a further few days indicated that there were still regular periods of excessive noise, so I removed the faceplate from the NTE5 master BT socket, disconnecting all extensions from the line. This regained the 2Mb/s that had been lost from increasing the SNR target, and watching the line for a few more days confirmed that it had significantly decreased the noise levels. It turned out that the old external ringer that was already present on the line when my parents moved in was still connected, although it had stopped working some time previously. There was also an unused and much-spliced extension in place. Removing both of these and replacing the NTE5 faceplate led to a line that was still stable. At the time of writing the connection has been up since before the new year, significantly longer than it had managed for some time.

As I said at the start I doubt this comes as a surprise to anyone who's dealt with this sort of line issue before. It wasn't particularly surprising to me (other than the level of the noise present), but I went through each of the steps to try and be sure that I had isolated the root cause and could be sure things were actually better. It turned out that doing the screen scraping and graphing the results was a good way to verify this. Observe:

The blue/red lines indicate the SNR for the upstream and downstream links - the initial lower area is when this was set to a 6dB target, and the later area a 9dB target. Green shows the forward error correction (FEC) errors divided by 100 (to make everything fit better on the same graph). These are correctable, but still indicate issues. Yellow shows CRC errors, indicating something that actually caused a problem. They can be clearly seen to correlate with the FEC errors, which makes sense. Notice the huge difference removing the extensions makes to both of these numbers. Also notice just how clear graphing the data makes things - it was easy to show my parents the graph and indicate how things had been improved and should thus be better.
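The correlation the graph shows by eye can also be checked numerically. The figures below are invented for illustration; only the "FEC divided by 100" scaling comes from the graph itself.

```python
# Invented example counts per polling interval, for illustration only.
fec = [1200, 3400, 2900, 150, 90, 60]   # forward error correction errors
crc = [40, 110, 95, 4, 2, 1]            # CRC errors over the same intervals

fec_scaled = [f / 100 for f in fec]      # the "divided by 100" series on the graph

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(pearson(fec, crc))  # close to 1.0 here: the two error counts rise and fall together
```

A coefficient near 1.0 is the numerical version of the visual observation that the CRC errors track the FEC errors.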

Categories: LUG Community Blogs

Chris Lamb: Review: Captain Phillips (2013)

Planet ALUG - Fri, 17/01/2014 - 13:27

Somalia's chief exports appear to be morally-ambiguous Salon articles about piracy and sophomoric evidence against libertarianism. However, it is the former topic that Captain Phillips concerns itself with, inspired by the hijacking of the Maersk Alabama container ship in 2009.

What is truth? In the end, Captain Phillips does not rise above Pontius Pilate in providing an answer, but it certainly tries using more refined instruments than irony or leaden sarcasm.

This motif pervades the film. Obviously, it is based on a "true story" and brings aboard that baggage, but it also permeates the plot in a much deeper sense. For example, Phillips and the US Navy lie almost compulsively to the pirates, whilst the pirates only really lie once where they put Phillips in greater danger.

Notice further that Phillips only starts to tell the truth when he thinks all hope is lost. These telling observations become even more fascinating when you realise that they must be based on the testimony of the...liars. Clearly, deception is a weapon to be monopolised and there are few limits on what good guys can or should lie about if they believe they can save lives.

Even Phillips's nickname ("Irish") is a falsehood – he straight-up admits he is an American citizen.

Lastly, there is an utterly disarming epilogue where Phillips is being treated for shock by clinically efficient medical staff. Not only will it scuttle any "blanket around the shoulders" cliché, but it is probably a highly accurate portrayal of what actually happens post-trauma. This echoes the kind of truth Werner Herzog often aims for in his filmmaking, as well as his guilt-inducing duality between uncomfortable lingering and compulsive viewing.

Another angle worthy of discussion: can a film based on real-world events even be "spoilered"? Hearing headlines on the radio before you read the newspaper hardly robs you of a literary journey...

Captain Phillips does have some quotidian problems. Firstly, the only tool for ratcheting up tension is for the Somalians to launch verbal broadsides at the Americans, with each compromise somehow escalating the situation further. This technique is genuinely effective, but well before the climactic rescue scene—where it is really needed—it has been subject to the most extreme diminishing returns.

(I cannot be the first to notice the "Africans festooned with guns shouting incomprehensibly" trope – I hope it is based on a Babel-esque mechanism of disorientation and miscommunication rather than anything, frankly, unsavoury.)

The racist idea that Africans prefer an AK-47 rotated about the Z-axis is socially constructed.

Secondly, the US Navy acts like a teacher with an Ofsted inspector observing quietly from the corner of the classroom; it is so well-behaved that it strains belief, with no post-kill gloating or even the tiniest of post-arrest congratulations. Whilst nobody wants to see the Navy overreact badly to other military branches getting all the glory, nobody wants to see a suspiciously bland recruitment vehicle either. Paradoxically, this hermetic treatment made me unduly fascinated by them, as if they were part of some military "uncanny valley". Two quick observations:

  • All US-Somali interactions are recorded by a naval officer. No doubt a value-for-money defense against a USS Abu Ghraib, but knowing the plot is based on factual events, it was perhaps a little too Baudrillardian to ponder how the presence of the Navy's cameras in a scene actually lent weight to the film's version of events, crucially without me even knowing whether the parallel "real life" footage is verifiable or not.
  • The navigational computers not only seem to require lines to be drawn repeatedly between points of interest, but the Maersk Alabama's arbitrary relabelling as MOTHERSHIP seems to imply that an officer could humorously rename a contact to something unbecoming of a 12A classification.

The drone footage: I'd love to read (or write) an essay about how Call of Duty might have influenced cinema.

Finally, despite the title, the film is actually about two captains; the skillful liar Phillips and ... well, that's the real problem. Whilst Captain Muse is certainly no caricatured Hook, we are offered little depth beyond a "You're not just a fisherman" faux-revelation that leads nowhere. I was left inventing reasons for his akrasia so that he made any sense whatsoever.

One could charitably argue that the film attempts to stay objective on Muse, but the inability for the film to take any obvious ethical stance actually seems to confuse and then compromise the narrative. What deeper truth is actually being revealed? Is this film or documentary?

Worse still, the moral vacuum is invariably filled by the viewer's existing political outlook: are Somali pirates victims of circumstance who are forced into (alas, regrettable) swashbuckling adventures to pacify plank-threatening warlords? Or are they violent and dangerous criminals who harbour an irrational resentment against the West, flimsily represented by material goods in shipping containers?

Your improvised answer to this Rorschach test will always sit more haphazardly in the film than any pre-constructed treatment ever could.

6/10

Categories: LUG Community Blogs


Steve Kemp: So I found a job.

Planet HantsLUG - Fri, 17/01/2014 - 13:27

Just to recap my life since December:

I had worked with Bytemark for seven years and left for reasons which made sense. I started working for "big corp" with a job that on-paper sounded good, but ultimately turned out to be a poor fit for my tastes.

I spent a month trying to decide "Is this bad, or is this just not what I'm used to?", because I was aware that there would obviously be big differences as well as little ones.

Once I realized that some of the niggles could be fixed but most couldn't, I resigned rather than prolong the initial probationary training period - I knew I wouldn't stay, and it seemed unfair and misleading to stay for the full duration of the probationary period knowing full well I'd leave the moment it concluded. At that point my notice period switched from seven days to one month.

A couple of people were kind enough to get in touch and discuss potential offers: some local, some remote within the UK, and some from abroad (the latter surprised me, but pleased me too).

I spent a couple of days "contracting", by which I really mean doing a few favours for friends, some of whom paid me in Amazon vouchers, and some of whom paid me in beer.

For example, I tweaked the upcoming death Knight site to handle 3000 simultaneous HTTP connections, then upgraded some servers from Squeeze to Wheezy for some other folk.

That aside I've largely been idle for about 10 days and have now picked the company to work for - so I'm going to be a contractor with a day-rate for an American firm for the next couple of months. If that goes well then I'll become a full-time employee, hopefully.

Categories: LUG Community Blogs

Jono Bacon: Growing an Active Ubuntu Advocacy Community

Planet WolvesLUG - Fri, 17/01/2014 - 00:13

Like many of you, I am tremendously excited about Ubuntu’s future. We are building a powerful convergence platform across multiple devices with a comprehensive developer platform at the heart of it. This could have a profound impact on users and developers alike.

Now, you have all heard me and my team rattling on about this for a while, but we also have a wonderful advocacy community in Ubuntu in the form of our LoCo Teams who are spreading the word. I want to explore ways to help support and grow the events and advocacy that our LoCo Teams are doing.

I had a conversation with Jose on the LoCo Council about this today, and I think we have a fun plan to move forward with. We are going to need help though, so please let me know in the comments if you can participate.

Step 1: Ubuntu Advocacy Kit

The Ubuntu Advocacy Kit is designed to provide a one-stop shop of information, materials (e.g. logos, brochures, presentations), and more for doing any kind of Ubuntu advocacy. Right now it needs a bit of a spring clean, which I am currently working on.

I think we need to get as many members of our community as possible using the kit. With this in mind we are going to do a few things:

  • Get the kit cleaned up and up to date.
  • Get it linked on loco.ubuntu.com and encourage our community to use it.
  • Encourage our community to contribute to the kit and add additional content.
  • Grow the team that maintains the kit.

Help needed: great writers and editors.

Step 2: Advocacy App

The Ubuntu Advocacy Kit works offline. This was a conscious decision with a few benefits:

  1. It makes it easier to know you have all relevant content without having to go to a website and download all the assets. When you have the kit, you have all the materials.
  2. The kit can be used offline.
  3. The kit can be more easily shared.
  4. When people contribute to the kit it feels like you are making something, as opposed to adding docs to a website. This increases the sense of ownership.

With the kit contained in an offline HTML state (and the source material in reStructuredText), it wouldn't be that much work to make a click package of the kit that we can ship on the phone, tablet, and desktop.

Just imagine that: you can use the click store to install the Ubuntu Advocacy Kit and have all the information and materials you need, right from the palm of your hand on your phone, tablet, or desktop.

The current stylesheet for the kit doesn’t render well on a mobile device, so it would be handy if we could map the top-level nav (Documentation, Materials etc) to tabs in an app.

We could also potentially include links to other LoCo resources (e.g. a RSS feed view of news from loco.ubuntu.com) and a list of teams.

If you would be interested in working on this, let me know.

Help needed: Ubuntu SDK programmers and artists.

Step 3: Hangout Workshops

I am going to schedule some hangout workshops to go through some tips of how to organize and run LoCo events and advocacy campaigns, and use the advocacy kit as the source material for the workshop. I hope this will result in more events being coordinated.

Help needed: LoCo members who want to grow their skills.

Step 4: LoCo Portal

We also want to encourage wider use of loco.ubuntu.com so our community can get a great idea of the pulse of advocacy, events, and more going on.

Help needed: volunteers to run events.

Feedback and volunteers are most welcome!

Categories: LUG Community Blogs

Aq: Posting to Discourse via the Discourse REST API from Python

Planet WolvesLUG - Thu, 16/01/2014 - 21:05

The Bad Voltage forum is run by Discourse. As part of posting a new episode, I wanted to be able to send a post to the forum from a script. Discourse has a REST API but it’s not very well documented, at least partially because it’s still being worked on. So if you read this post two years after it was written, it might be entirely full of lies. Still, I managed to work out how to post to Discourse from a Python script, and here’s an example script to do just that.

First, you’ll need an API key. If you’re the forum administrator, which I am, you can generate one of these from http://YOURFORUM/admin/api. It is not clear to me exactly what this API key does: in particular, I suspect that it is a key with total admin rights over the forum, so don’t share it around. If there’s a way of making an API key with limited rights to just create posts and that’s it, I don’t know that way; if you do know that way, tell me! Once you’ve got your API key, and your username, fill them into the script as APIKEY and APIUSERNAME.

import json
import logging
import httplib

import requests  # apt-get install python-requests

# based on https://github.com/discoursehosting/discourse-api-php/blob/master/lib/DiscourseAPI.php

# log all the things so you can see what's going on
httplib.HTTPConnection.debuglevel = 1
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True

# api key created in discourse admin. probably super-secret, so don't tell anyone.
APIKEY = "whatever the api key is"
APIUSERNAME = "your username"
QSPARAMS = {"api_key": APIKEY, "api_username": APIUSERNAME}
FORUM = "http://url for your forum/"  # with the slash on the end

# First, get a session cookie
r = requests.get(FORUM, params=QSPARAMS)
SESSION_COOKIE = r.cookies["_forum_session"]

# Now, create the post using that session cookie
post_details = {
    "title": "Title of the new topic",
    "raw": "Body text of the post",
    "category": 7,  # get the category ID from the admin
    "archetype": "regular",
    "reply_to_post_number": 0,
}
r = requests.post(FORUM + "posts", params=QSPARAMS, data=post_details,
                  cookies={"_forum_session": SESSION_COOKIE})

print "Various details of the response from discourse"
print r.text, r.headers, r.status_code

disc_data = json.loads(r.text)
disc_data["FORUM"] = FORUM
print "The link to your new post is:"
print "%(FORUM)st/%(topic_slug)s/%(topic_id)s" % disc_data

Categories: LUG Community Blogs

Steve Engledow (stilvoid): TODO

Planet ALUG - Thu, 16/01/2014 - 17:11

New year this year passed basically the same as last year, though even more enjoyably.

I decided I'd better review my TODO list from last year so here's a diff :)

My ~~first~~ second new year as a dad was a very pleasant one. The Mrs and I polished off ~~a~~ two bottles of ~~champage~~ champagne (I never used to like the stuff - I still can't get enough of it now) and put ~~our favourite~~ a wide variety of tunes on all night. Dylan ~~woke up at precisely two minutes to midnight~~ didn't wake up all night :)

New Year's Resolutions

I don't normally make new year's resolutions but I'll give it a go. Here's my resolution list for this year including those not done from last year.

  • Sort out NatWest

  • Sort out LoveFilm

  • Buy a house

  • Read even more (fiction and non-fiction)

  • Write at least one short story

  • Write some more games

  • Go horse riding

  • Learn some more Turkish

  • Play a lot more guitar

  • Lose at least a stone (in weight, from myself)

  • Try to be less of a pedant (except when it's funny)

  • Try to be more funny ;)

Categories: LUG Community Blogs

Andrew Savory: Home Automation: buttons on the desktop

Planet ALUG - Wed, 15/01/2014 - 19:49

I want a button on my Mac’s desktop to turn on or turn off the lights I have controlled by the Raspberry Pi. Here’s what I’ve got so far.

First, I wrote a script on the Pi to turn the lights on:

#!/bin/bash
/usr/bin/tdtool --on 1
/usr/bin/tdtool --on 2
/usr/bin/tdtool --on 3

For everything here, I also wrote the equivalent to turn the lights off.

Next, I tested running the script via SSH from my Mac:

ssh -f user@raspberrypi.local /home/user/lights_on.sh &>/dev/null

You can use the Applescript Editor to make pseudo-apps, so I wrote an Applescript to execute the SSH command:

tell application "Terminal"

    do script "ssh -f user@raspberrypi.local /home/user/lights_on.sh &>/dev/null"

    activate

end tell

delay 15

tell application "Terminal"

    quit

end tell

I saved this as file format “Application” in my Applications folder, then dragged it to my desktop (an alias is automatically created, leaving the original in Applications). I can now double-click the app and the script runs, or launch it from Spotlight or the wonderful Alfred.

It works, but it’s ugly: a Terminal window pops up for around 15 seconds. There must be a better way.

After some searching, I came across Use automator to create X11 shortcuts in OSX. It’s similar to the Applescript trick but uses Automator instead, so there’s no need for Terminal. I put the SSH command into the “Run shell script” workflow action and saved it as an application.

It works! Now I can turn the lights on or off without the Terminal window popping up.

There are a couple of issues:
  • The tdtool command takes a long time to execute on the Pi. It takes about three seconds for all three plugs to switch, whereas the smart plug remote has an “all on / all off” button which is instant. I need to find out why the command is so slow, and/or a way to control all three in one go.
  • I don’t really want “Lights on” and “Lights off” apps, I want a single app that toggles the state. This could be done by making the server-side script smarter, but I’d really like the app icon to reflect the status of the lights too.
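The toggle idea from the second bullet could be sketched as follows. Since the TellStick can't read plug state back, the script has to remember the last thing it did. Everything here (the device IDs, the state-file path, the tdtool location) is an assumption, and the command runner is injectable so the logic can be exercised without a TellStick attached.

```python
import subprocess

# Assumed device IDs from tellstick.conf and an arbitrary state-file path.
DEVICES = [1, 2, 3]
STATE_FILE = "/tmp/lights_state"

def read_state(path=STATE_FILE):
    """Return True if we last turned the lights on."""
    try:
        with open(path) as f:
            return f.read().strip() == "on"
    except IOError:
        return False  # assume off if we've never toggled before

def toggle(run=subprocess.call, path=STATE_FILE):
    """Flip all devices to the opposite of the recorded state."""
    turn_on = not read_state(path)
    flag = "--on" if turn_on else "--off"
    for dev in DEVICES:
        run(["/usr/bin/tdtool", flag, str(dev)])
    with open(path, "w") as f:
        f.write("on" if turn_on else "off")
    return turn_on
```

A single Automator app could then invoke this one script over SSH, and the icon-reflects-state problem reduces to reading the same state file.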
Room for improvement, but this is good enough for now.

Next up: automating sunsets.
Categories: LUG Community Blogs

Debian Bits: Call for Proposals for the MiniDebConf 2014 Barcelona

Planet HantsLUG - Wed, 15/01/2014 - 19:40

Debian Women will hold a MiniDebConf in Barcelona on March 15-16, 2014. Everyone is invited to both talks and social events, but the speakers will all be people who identify themselves as female. This is not a conference about women in Free Software, or women in Debian, rather a usual Debian Mini-DebConf where all the speakers are women.

Debian Women invites submissions of proposals for papers, presentations, discussion sessions and tutorials for the event. Submissions are not limited to traditional talks: you could propose a performance, an art installation, a debate or anything else. All talks are welcome, whether newbie or very advanced level. Please, forward this call to potential speakers and help us make this event a great success!

Please send your proposals to proposals@bcn2014.mini.debconf.org. Don't forget to include in your message: your name or nick, the title of the event, a description, the language, and any other information that might be useful. Please submit your proposal(s) as soon as possible.

For more information, visit the website of the event: http://bcn2014.mini.debconf.org

We hope to see you in Barcelona!

Categories: LUG Community Blogs

Andrew Savory: Home Automation: Turn it on again

Planet ALUG - Wed, 15/01/2014 - 00:55

In Pi three ways I wrote:

what happens when you combine an RF transmitter, smart sockets with RF receivers, and a Raspberry Pi?

I’m still finding out what can be done, but this is what I’ve discovered so far.

This is what I bought:

First, I set up each of the smart plugs, and paired them with the remote control.

This is already a big improvement: my home office has bookshelves filled with LED lights, but with inaccessible switches. Being able to turn them all on and off from the remote is awesome, but I’d really like them to turn on automatically, for example at sunset. So I need some compute power in the loop. Time for the Pi.

The Pi is running Raspbian. I followed the installation instructions for telldus on Raspberry Pi. See also R-Pi Tellstick core for non-Debian instructions.

Next I tried to figure out the correct on/off parameters in tellstick.conf for the smart plugs. The Tellstick documentation is a bit sparse. Tellstick on Raspberry Pi to control Maplin (UK) lights talks about physical dials on the back of the remote control; sadly the Bye Bye Standby remote doesn’t have this.

Each plug is addressed using a protocol and a number of parameters. In the case of the Bye Bye Standby, it apparently falls under the arctech protocol, which has four different models, and each model uses the parameters “house” and sometimes “unit”.

Taking a brute-force approach, I generated a configuration for every possible parameter for the arctech protocol and codeswitch model:

count=1
for house in A B C D E F G H I J K L M N O P ; do
    for unit in {1..16} ; do
      cat <<EOF
device {
   id = $count
   name = "Test"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "$house"
      unit = "$unit"
   }
}
EOF
      ((count++))
   done
done

I then turned each of them on and off in turn, and waited until the tellstick spoke to the plugs:

count=1
for house in A B C D E F G H I J K L M N O P ; do
    for unit in {1..16} ; do
        echo "id = $count, house = $house, unit = $unit"
        tdtool --on $count
        tdtool --off $count
        ((count++))
    done
done

This eventually gave me house E and unit 16 (and the number of the corresponding automatically generated configuration, 80):

tdtool --on 80
Turning on device 80, Test – Success

But this only turned on or off all three plugs at the same time. I wanted control over each plug individually.

I stumbled upon How to pair Home Easy plug to Raspberry Pi with Tellstick, and that gave me enough information to reverse the process. Instead of getting the tellstick to work out what code the plugs talk, in theory I need to get the tellstick to listen to the plug for the code.

So this configuration should work, in combination with tdtool --learn:

device {
   id = 1
   name = "Study Right"
   protocol = "arctech"
   model = "selflearning-switch"
   parameters {
      house = "1"
      unit = "1"
   }
}

However this tiny footnote on the telldus website says: 

Bye Bye Standby Self learning should be configured as Code switch.

So it seems it should be:

device {
   id = 1
   name = "Study Right"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "1"
      unit = "1"
   }
}

… which is exactly what I had before. Remembering of course to do service telldusd restart each time we change the config, I tried learning again:

tdtool --learn 1
Learning device: 1 Study Right - The method you tried to use is not supported by the device

Well, bother. Looking at the Tellstick FAQ:

Is it possible to receive signals with TellStick?

TellStick only contains a transmitter module and it’s therefore only possible to transmit signals, not receive signals. TellStick Duo can receive signals from the compatible devices.

So it seemed like I was stuck with all-on, all-off unless I bought a TellStick Duo. Alternatively, I could expand my script to generate every possible combination in tellstick.conf, and see if I could work out the magic option to control each plug individually. But since house codes can range from 1 to 67108863, this could take some time.

Rereading Bye Bye Standby 2011 compatible? finally gave me the answer. You put the plug into learning mode, and get the Tellstick to teach the right code to the plug by sending an “off” or an “on” signal:

tdtool --off 3

So setting house to a common letter and setting units to sensible increments, I can now control each of the plugs separately.
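For reference, the resulting per-plug configuration looks something like the following (the device names and unit numbers are illustrative; only the house letter, protocol and model come from the experimentation above):

```
device {
   id = 1
   name = "Study Left"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "E"
      unit = "1"
   }
}
device {
   id = 2
   name = "Study Right"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "E"
      unit = "2"
   }
}
```

With each plug paired to its own unit number, tdtool --on 1 and tdtool --on 2 address them independently (remembering to service telldusd restart after editing the file).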

Next up: some automation.

Categories: LUG Community Blogs

Dick Turpin: Gizza loan.

Planet WolvesLUG - Tue, 14/01/2014 - 12:21
Customer: "Hi we bought a cheap laptop from you but it does not appear to have MS Office on it, do you by any chance have a copy of Office we can borrow?"

Speechless
Categories: LUG Community Blogs

Peter Cannon: Indiegogo campaign fails to hit target.

Planet WolvesLUG - Tue, 14/01/2014 - 10:48

Well sadly there it is folks. We failed to meet the target of £700 for sending The Dick Turpin Road Show to FOSDEM.

We did however raise £368.00, which equates to 52.57% or, as +Thomas Heine said, “Over half the target.” We had 24 people contribute, 1,039 visits and 399 referrals. On a personal level I’d like to thank each person who donated, regardless of the amount. You’re awesome, selfless people, and I appreciate the effort you all made in an attempt to get not only myself but #tdtrs to +FOSDEM

Thankfully (read that as you will) only +Matthew Copperwaite will be there to represent us, so if you were planning on buying me a ginger beer, buy Matt one instead. To all my friends that are going (a fair few are not on G+): I hope you all have a fantastic time, and once again thanks everyone for your kindness.

If you listen to Episode 67 of The Dick Turpin Road Show when it is released you’ll hear that I will hopefully be thanking you all personally in my own famous way.

Pete

Categories: LUG Community Blogs

Debian Bits: Call for Proposals for the MiniDebConf 2014 Barcelona

Planet HantsLUG - Tue, 14/01/2014 - 00:00

Debian Women will hold a MiniDebConf in Barcelona on March 15-16, 2014. Everyone is invited to both talks and social events, but the speakers will all be people who identify themselves as female. This is not a conference about women in Free Software, or women in Debian, but rather a usual Debian Mini-DebConf where all the speakers are women.

Debian Women invites submissions of proposals for papers, presentations, discussion sessions and tutorials for the event. Submissions are not limited to traditional talks: you could propose a performance, an art installation, a debate or anything else. All talks are welcome, whether newbie or very advanced level. Please, forward this call to potential speakers and help us make this event a great success!

Please send your proposals to proposals@bcn2014.mini.debconf.org. Don't forget to include in your message: your name or nick, the title of the event, a description, the language, and any other information that might be useful. Please submit your proposal(s) as soon as possible.

For more information, visit the website of the event: http://bcn2014.mini.debconf.org

We hope to see you in Barcelona!

Categories: LUG Community Blogs

Tony Whitmore: Oh yes they did!

Planet HantsLUG - Mon, 13/01/2014 - 22:08

This weekend I had the pleasure of watching my parents do something I’ve never seen them do before. Acting together. On a stage. In front of people who have paid to be there.

The church that they attend resurrected its annual pantomime this year, for the first time in probably fifteen years. Conversation over Christmas was about little else, and they have clearly put heart and soul into it since August when rehearsals began.

I went along to the performance not knowing too much about what to expect, apart from embarrassment. Dad has been a dame before and something tells me he was near the front of the line of volunteers, offering to don the wig and heels again. It is a measure of his commitment that he shaved his ankles and forearms for the role. Mum claims she was roped into playing the baddie’s son who gets to woo a milkmaid with enormous buckets.

My parents are, I hope they won’t mind me saying, both in their sixties now. Yet you wouldn’t know it if you had seen them throwing themselves around on the stage. Dad actually did a forward roll, in heels. Mum sang and danced and looked so different in the costume and wig that I didn’t recognise her at first!

Of course, there were plenty of other people in the cast doing their bit. With a script and songs written especially for the run, this was not an AmDram production that did things by halves. Particularly impressive were the costumes, with even the chorus dressed in consistent and plausible outfits. Clever use of moving lights kept the lighting rig small. But above all it was fantastic to see my folks having a ball on stage, clearly loving every moment of it. Congratulations.

Categories: LUG Community Blogs

Mick Morgan: strip exif data

Planet ALUG - Sat, 11/01/2014 - 21:52

I have a large collection of photographs on my computer. And each Christmas the collection grows ever larger. I use digiKam to manage that collection, but as I have mentioned before, storing family photographs as a collection of jpeg files seems counter-intuitive to me. Photographs should be on display, or at least stored in physical albums that family members can browse at will. At a push, even an old shoebox will do.

So when this Christmas I copied the latest batch of images from my camera to my PC, I did a quick count of the files I hold – it came to nearly 5,500. Ok, so many of these are very similar, and this is simply a reflection of the ease (and very marginal cost) of photography today compared with the old 35mm days, but even so, that is a lot of images. Disappointingly few of these ever see the light of day because whilst both my wife and I can happily view them on-line, I don’t print enough of them to make worthwhile albums. Sure, actually /taking/ photographs is cheap these days, but printing them at home on the sort of inkjet printer most people possess is rather expensive. Which is where online print companies such as photopanda, snapfish, photobox or jessops come in.

Most of these companies will provide high quality prints in the most popular 6″ x 4″ size for around 5 pence each – so a batch of 40 Christmas pictures is not going to break the bank. But one nice innovation of the digital era is that you can get your photos pre-printed into hard back albums for very reasonable prices. Better yet, my wife pointed me to a “special offer” (70% off) being run by a site she has used in the past. That was such a bargain that I decided to go back over my entire collection and create a “year book” for each of the eleven years of digital images I currently hold (yes, I don’t get out much).

However, I don’t much like the idea of posting a large batch of photographs to a site run by a commercial company, even when that company may have a much less cavalier approach to my privacy than does say, facebook. Once the photographs have been posted, they are outside my control and could end up anywhere. And of course I am not just concerned with the actual images, but the metadata that goes with those images. All electronic images created by modern cameras (or more usually these days, smartphones) contain EXIF data which at the minimum will give date and time information alongside the technical details about the shot (exposure, flash timing etc). In the case of nearly all smartphones, and increasingly with cameras themselves, the image will also contain geo-location details derived from the camera’s GPS system. I don’t take many images with my smartphone, and in any case, its GPS system is resolutely turned off, but my camera (a Panasonic TZ40) contains not just a GPS location system but a map display capability. Sometimes too much technology gets crammed into devices which don’t necessarily need them. As digicamhelp points out, many popular photo sharing sites such as Flickr or picasa helpfully allow viewers to examine EXIF data on-line. It is exactly this sort of capability which is so scarily exploited by ilektrojohn’s creepy tool.

So, before posting my deeply personal pictures of my cats to a commercial site I thought I would scrub the EXIF data. This is made easy with Phil Harvey’s excellent exiftool. This tool is platform independent so OSX and Windows users can take advantage of its capabilities – though of course it is much more flexible when used in conjunction with an OS which offers shell scripting.

Exiftool allows you to selectively edit, remove or replace any of the EXIF data stored in an image. But in my case I simply wanted to remove /all/ EXIF data. This is easy with the "-all=" switch. Thus having chosen (and copied) the images I wanted to post to the commercial site it was a matter of a minute or two to recursively edit those files with find – thus:

find . -type f -iname '*.jpg' -exec exiftool -all= {} \;
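For the curious, what that command does to a JPEG can be sketched in a few lines of Python. EXIF data lives in an APP1 segment near the start of the file, so dropping that segment removes it. This is a rough stdlib-only illustration, not a replacement for exiftool, which also handles other metadata segments and many other file formats:

```python
def strip_exif(jpeg: bytes) -> bytes:
    """Drop EXIF (APP1) segments from a JPEG byte string."""
    if jpeg[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG")
    out = bytearray(jpeg[:2])  # keep the SOI marker
    i = 2
    while i < len(jpeg) - 1:
        if jpeg[i] != 0xFF:
            raise ValueError("corrupt segment marker")
        marker = jpeg[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows, copy the rest
            out += jpeg[i:]
            break
        # segment length field is big-endian and includes its own two bytes
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        segment = jpeg[i:i + 2 + length]
        # skip APP1 segments whose payload identifies itself as EXIF
        if not (marker == 0xE1 and segment[4:10] == b"Exif\x00\x00"):
            out += segment
        i += 2 + length
    return bytes(out)
```

exiftool's "-all=" goes further than this, stripping IPTC, XMP and maker notes too, which is why it is the right tool for the job.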

Highly recommended – particularly if you are in the habit of using photo sharing sites.

Categories: LUG Community Blogs

Steve Kemp: Some productive work

Planet HantsLUG - Sat, 11/01/2014 - 19:15

Having decided to take a fortnight off while looking for a new job, I assumed I'd spend a while coding.

Happily my wife, who is a (medical) doctor, has been home recently so we've got to spend time together instead.

I'm currently pondering projects which will be small enough to complete in a week, but large enough to be useful. Thus far I've just reimplemented the RSS -> chat bridge which I liked a lot at Bytemark.

I have my own chat-server setup, which doesn't have any users but myself. Instead it has a bunch of rooms setup, and different rooms get different messages.

I've now created a new "RSS" room, and a bunch of RSS feeds get announced there when new posts appear. It's a useful thing if you like following feeds, and happen to have a chat-room setup.

I use Prosody as my chat-server, and I use my http2xmpp code to implement a simple HTTP-POST to XMPP broadcast mechanism.

The new script is included as examples/rss-announcer and just polls RSS feeds - URLs which haven't been broadcast previously are posted to the HTTP-server, and thus get injected into the chatroom. A little convoluted, but simple to understand.

This time round I'm using Redis to keep track of which URLs have been seen already.
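The polling logic boils down to "parse the feed, skip anything already seen, announce the rest". A rough stdlib-only Python sketch of that core step, with a plain in-memory set standing in for the Redis calls and the HTTP POST to the chat-room left out:

```python
import xml.etree.ElementTree as ET

def new_links(rss_xml: str, seen: set) -> list:
    """Return feed item links not seen before, marking them as seen."""
    fresh = []
    for item in ET.fromstring(rss_xml).iter("item"):
        url = (item.findtext("link") or "").strip()
        if url and url not in seen:
            seen.add(url)  # Redis equivalent would be SADD on a seen-urls set
            fresh.append(url)
    return fresh
```

Using Redis rather than a set means the "seen" state survives restarts of the poller, which is the whole point of tracking it externally.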

Beyond that I've been doing a bit of work for friends, and have recently setup an nginx server which will handle 3000+ simultaneous connections. Not too bad, but I'm sure we can make it do better - another server running on BigV which is nice to see :)

I'll be handling a few Squeeze -> Wheezy upgrades in the next week too, setting up backups, and doing some other related "consultation".

If I thought there was a big enough market locally I might consider doing that full-time, but I suspect that relying upon random work wouldn't work long-term.

Categories: LUG Community Blogs

Andrew Savory: Mobile modem

Planet ALUG - Fri, 10/01/2014 - 23:01

I was trying to get a handle on how much mobile data download speeds have improved over the years, so I did some digging through my archives. (The only thing I like more than a mail archive that spans decades is a website that spans decades. Great work, ALUG!) Here’s some totally arbitrary numbers to illustrate a point.

In response to A few Questions, this is what I wrote in May 2002:

[Nokia] 7110 works fine with IR (7110 has a full modem). The 7110 supports 14.4bps
connections, but I bet your telco doesn’t

That should have been 14.4kbps (14,400bps). In 2002 the phones were ahead of the network’s ability to deliver. In 2014, not much has changed.

In GPRS on Debian, this is what I wrote in November 2002:

I finally took the plunge and went for GPRS [..]  (up to 5x the speed of a dialup connection over a GSM mobile connection)

Remember when we did dialup over mobile connections? GSM Arena on the Sony-Ericsson T68i states 24-36 kbps. I’m assuming I got the lower end of that.

In 2003 I was using a Nokia 3650 GPRS connection. GSM Arena on the Nokia 3650 states 24-36 kbps. Let’s be generous and assume reality was right in the middle, at 30 kbps.

In 2004 I got a Nokia 6600, which according to GSM Arena could also do 24-36 kbps. It was a great phone, so let's assume the upper bound for the 6600.

In 2008 I upgraded to 3G with the Nokia N80, and wrote:

3G data connections are dramatically better than GPRS

… but sadly I didn’t quantify how much better. According to GSM Arena, it was 384 kbps.

That’s a pretty good and pretty dramatic speed increase.

But then in 2009 I was using the Nokia N900 (and iPhone, HTC Hero, Google Nexus One, …). GSM Arena on the Nokia N900 states a theoretical 10Mbps … quite the upgrade, except O2 were limited to 3.6Mbps.

In 2012 I was using the Samsung Galaxy SII. GSM Arena on the Samsung Galaxy SII promises 21Mbps.

And now the Sony Xperia Z Ultra supports LTE at 42Mbps and 150Mbps. Sadly, the networks don't yet fully support those speeds, but if they did, the chart would be truly dramatic; 2003-2008 starts to look like a rounding error.
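Putting the post's numbers side by side makes the point numerically. These are the figures quoted above (several are theoretical maxima rather than measured speeds, and the GPRS values are the assumed points in the 24-36 kbps range):

```python
# Approximate downlink speeds in kbps, as quoted in the post
speeds = {
    2002: 14.4,   # GSM data over the Nokia 7110
    2003: 30,     # GPRS, middle of the 24-36 range
    2004: 36,     # GPRS, upper bound
    2008: 384,    # 3G on the Nokia N80
    2009: 3600,   # N900-era HSDPA as limited by O2
    2012: 21000,  # HSPA+ on the Galaxy SII
    2014: 42000,  # Xperia Z Ultra, network permitting
}
for year, kbps in speeds.items():
    print(f"{year}: {kbps:>7} kbps ({kbps / speeds[2002]:.0f}x 2002)")
```

On these numbers the whole 2002-2008 improvement (roughly 27x) is smaller than the jump from 2008 to 2014 alone (over 100x).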

I don’t need to use a modem or infrared, either. Things have really improved over the last twelve years!

(This post is probably best read in conjunction with Tom’s analysis of Mobile phone sizes.)

Categories: LUG Community Blogs