News aggregator

Jonathan McDowell: 2014 SPI Board election nominations open

Planet ALUG - Mon, 07/07/2014 - 21:13

I put out the call for nominations for the 2014 Software in the Public Interest (SPI) Board election last week. At this point I haven't yet received any nominations, so I'm mentioning it here in the hope of reaching a slightly wider audience. Possibly not the most helpful place, as I would hope readers who are interested in SPI are already reading spi-announce. There are 3 positions open this election and it would be good to see a bit more diversity in the candidates this year. Nominations are open until the end of Tuesday July 13th.

The primary hard and fast time commitment a board member needs to make is to attend the monthly IRC board meetings, which are conducted publicly via IRC (#spi on the OFTC network). These take place at 20:00 UTC on the second Thursday of every month. More details, including all past agendas and minutes, can be found at http://spi-inc.org/meetings/. Most of the rest of the board communication is carried out via various mailing lists.

The ideal candidate will have an existing involvement in the Free and Open Source community, though this need not be with a project affiliated with SPI.

Software in the Public Interest (SPI, http://www.spi-inc.org/) is a non-profit organization which was founded to help organizations develop and distribute open hardware and software. We see it as our role to handle things like holding domain names and/or trademarks, and processing donations for free and open source projects, allowing them to concentrate on actual development.

Examples of projects that SPI helps include Debian, LibreOffice, OFTC and PostgreSQL. A full list can be found at http://www.spi-inc.org/projects/.

Categories: LUG Community Blogs

HP 255 G1 Laptop with Ubuntu

Planet SurreyLUG - Mon, 07/07/2014 - 13:08

At work I needed a cheap laptop for a computer-illiterate user. Giving them Windows would have meant that they would have had to keep up to date with Windows Updates, with all the potential issues that would cause, along with the need for malware protection. It would also have pushed up the cost: a laptop capable of pushing Windows along reasonably decently would have cost a few hundred pounds at least.

Generally I would just have purchased a low-end Lenovo laptop and installed Ubuntu onto it, but I was aware that Ebuyer had recently launched an HP 255 G1 laptop with Ubuntu pre-installed for £219.99 inc. VAT (just £183 if you can reclaim the VAT).

Buying pre-installed with Ubuntu afforded me the comfort of knowing that everything would work. Whilst Ubuntu generally does install very easily, there are sometimes hassles in getting some of the function buttons working, for brightness, volume etc. Knowing that these issues would all be sorted, along with saving me the time in having to install Ubuntu, seemed an attractive proposition.

Unboxing

My first impressions were good: the laptop comes with a laptop case and the laptop itself looks smart enough for a budget machine. An Ubuntu sticker, instead of the usual Windows sticker, was welcome, although the two sticky marks where previous stickers had been removed were less so. Still, at least they had been removed.

Whilst we are on the subject of Windows’ remnants – the Getting Started leaflet was for Windows 8 rather than Ubuntu. Most Ubuntu users won’t care, but this is poor attention to detail and, if this laptop is to appeal to the mass market, it may cause confusion.

First Boot

Booting up the laptop for the first time gave me an “Essential First Boot Set-up is being performed. Please wait.” message. I did wait, and for quite a considerable time – probably a not dissimilar time to installing Ubuntu from scratch; I couldn’t help but suspect that was precisely what was happening. Eventually I was presented with a EULA from HP, which I had no choice but to accept, short of re-installing from scratch. Finally I was presented with an Ubuntu introduction, which I confess I skipped; suffice to say the new user is welcomed to Ubuntu with spinny things.

The first thing to note is that this is Ubuntu 12.04, the previous LTS (Long Term Support) release. It will be supported until 2017, but it is a shame that the laptop didn’t ship with the latest LTS release, Ubuntu 14.04. Users may of course choose to upgrade.

Secondly, the wireless was slow to detect the wireless access points on the network. Eventually I decided to restart network-manager, but just as I was about to do so, it suddenly sprang into life and displayed all the local access points. Once connected, it will re-connect quickly enough, but it does seem to take a while to scan new networks. Or perhaps I am just too impatient.

Ubuntu then prompted me to run some updates, but the updates failed, as “91.189.88.153” was said to be unreachable, even though it was ping-able. The address is owned by Canonical, but whether this was a momentary server error or some misconfiguration on the laptop, I have no idea.

This would have been a major stumbling block for a new Ubuntu user. For me, running apt-get update and apt-get dist-upgrade worked fine: press Ctrl+Alt+T to bring up the terminal and then type:

$ sudo apt-get update
$ sudo apt-get dist-upgrade

I noticed that this referenced HP-specific repositories, doubtless equipped with hardware-specific packages:

http://oem.archive.canonical.com/updates/ precise-oem.sp1 public
http://hp.archive.canonical.com/updates precise-stella-anaheim public

I assume that adding the latter repository would be a good idea if you purchased the Windows version of this laptop and installed Ubuntu yourself.
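
If you did go down that route, the following is a minimal sketch of how I imagine you would add it, based purely on the repository line shown above; the file name is my own choice and you would also need to trust the archive's signing key, which isn't covered here.

echo "deb http://hp.archive.canonical.com/updates precise-stella-anaheim public" | sudo tee /etc/apt/sources.list.d/hp-oem.list
sudo apt-get update   # refresh the package lists; importing the archive's signing key is not shown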

Hardware

This is a typical chunky laptop.  But, if you were expecting a sleek Air-like laptop for £220, then you need to take a reality shower. What it is, is a good-looking, well-made, traditional laptop from a quality manufacturer. At this price, that really should be enough.

Ubuntu “System Details” reveals that this is running an “AMD E1-1500 APU with Radeon HD Graphics x 2”, running 64-bit with a 724GB drive and 3.5GiB RAM. That would appear to be a lower-spec processor than is typically available online for an HP 255 G1 running Windows, which generally seems to come with a 1.75GHz processor (albeit at twice the price).

The great news was that, as expected, all the buttons worked. So what? Well, it may seem like a trivial matter whether, for example, pressing Fn10 increases the volume or not, but I think many of us have the experience of spending inordinate amounts of time trying to get such things to work properly. And buttons that don’t work continue to irritate until the day you say goodbye to that machine. The fact that everything works as it should is enormously important and is the primary reason why buying Ubuntu pre-installed is such a big deal.

The keyboard and trackpad seem perfectly good enough to me, certainly much better than on my Novatech ultrabook; although not everyone seems to like them. In particular, it is good to have a light on the caps lock key.

I have not tested battery life, but, as this is usually the first thing to suffer in an entry-level machine, I would not hope for much beyond a couple of hours.

For other details on hardware, please refer to the product information and read more reviews here.

Performance

Booting up takes around 45 seconds, plus a further 20 seconds to reach the desktop. That is quite a long time these days for Ubuntu, but fast enough, I would imagine, for most users, and considerably faster than it takes Windows to reach a usable state, at least in my experience.

Because booting is that bit slower, suspend becomes more important: closing the lid suspended the laptop, and opening it again brought up the lock-screen password prompt almost immediately. Repeated use showed this to work reliably.

As to system performance, frankly this is not a fast laptop. Click on Chromium just after boot and it takes about 9 seconds to load; LibreOffice takes about 6 seconds. Even loading System Settings takes a second or two. Once you’ve run them once after each boot, they will load again in less than half the time. Despite the slow performance, it is always perfectly usable, and is absolutely fine for email and web browsing.

The other thing to remember is that this is the performance you can expect throughout the machine’s life – i.e. it will not slow down further as it gets older. Windows users typically expect their computers to slow down over time, largely because of the different way in which system and application settings are stored in Windows. Ubuntu does not suffer from this problem, meaning that a 5-year-old Ubuntu installation should work as fast as it did when it was first installed.

Conclusions

I struggle to think of what else you could buy that provides a full desktop experience for £220. And it isn’t even some cheap unbranded laptop from the developing world. Sure, it isn’t the fastest laptop around, but it is perfectly fast enough for web, email and office documents. And the fact that you can expect it to continue working, with few, if any, worries about viruses, makes it ideal for many users. It certainly deserves to be a success for HP, Ubuntu and Ebuyer.

But, whilst this low-price, low-power combination was ideal for me on this occasion, it is a shame that there are no other choices available pre-installed with Ubuntu. I wonder how many newcomers to Ubuntu will come with the belief that Ubuntu is slow, when in reality it is the low-end hardware that is to blame?

Please HP, Ubuntu and Ebuyer – give us more choice.

And Lenovo, please take note – you just lost a sale.

For more reviews please visit Reevo.


Categories: LUG Community Blogs

Meeting at "The Moon Under Water"

Wolverhampton LUG News - Mon, 07/07/2014 - 08:43
Event-Date: Wednesday, 9 July, 2014 - 19:30 to 23:00
Body: 53-55 Lichfield St, Wolverhampton, West Midlands, WV1 1EQ
Eat, Drink and talk Linux
Categories: LUG Community Blogs

Chris Lamb: Strava Enhancement Suite

Planet ALUG - Sun, 06/07/2014 - 20:55

Today I merged my individual Strava Chrome extensions into a single package, added some features that I thought were still missing and published it to the Chrome Web Store.

It now supports:

  • Infinite scroll: Automatically load more dashboard entries when reaching the bottom.
  • Switch imperial/metric units: Quickly switch from miles/km from the browser address bar.
  • Default to my results: Changes the default leaderboard to "My Results" instead of "Overall" when viewing a segment effort.
  • Hide "find friends": Hides invitations to find and invite friends on Strava.
  • "Enter" posts comment: Immediately posts comment when pressing the "enter" / "return" key in the edit box rather than adding a new line.
  • Compare running: Changes the default sport for the "Side by Side comparison" module to running.
  • Running cadence: Show running cadence by default in elevation profile.
  • Variability Index: Calculate a Variability Index (VI) from the weighted average power and the average power, an indication of how 'smooth' a ride was (a short worked example follows this list).
  • Estimated FTP: Select "Show Estimated FTP" by default on Power Curve.
  • Running TSS: Estimates a run's Training Stress Score (TSS) from its Grade Adjusted Pace distribution.
  • Standard Google Map: Prefer the "Standard" Google map over the "Terrain" view (experimental).
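
To illustrate the Variability Index arithmetic with made-up figures of my own (not numbers taken from the extension): VI is simply the weighted (normalized) average power divided by the plain average power, so a perfectly steady effort scores 1.0.

awk 'BEGIN { printf "VI = %.2f\n", 250 / 220 }'   # a 250W weighted average over a 220W plain average gives a VI of 1.14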

Needless to say, this software is not endorsed by Strava. Suggestions, feedback and contributions welcome.

UPDATE: Also added more aggregate segment data, leaderboard options, showing running heart rate by default, and links to Veloviewer & Race Shape.

View/install in Chrome Web Store.

Categories: LUG Community Blogs

Martin Wimpress: Headless cloudprint server on Debian for MFC-7360N

Planet HantsLUG - Sat, 05/07/2014 - 12:00

I have a Brother MFC-7360N printer at home and there is also one at work. I wanted to get Cloudprint working with Android devices rather than use the Android app Brother provide, which is great when it works but deeply frustrating (for my wife) when it doesn't.

What I describe below is how to Cloudprint enable "Classic printers" using Debian Wheezy.

Install CUPS

Install CUPS and the Cloudprint requirements.

sudo apt-get install cups python-cups python-daemon python-pkg-resources

Install the MFC-7360N Drivers

I used the URLs below to access the .deb files required.

If you're running a 64-bit Debian, then install ia32-libs first.

sudo apt-get install ia32-libs

Download and install the MFC-7360N drivers.

wget -c http://download.brother.com/welcome/dlf006237/mfc7360nlpr-2.1.0-1.i386.deb
wget -c http://download.brother.com/welcome/dlf006239/cupswrapperMFC7360N-2.0.4-2.i386.deb
sudo dpkg -i --force-all mfc7360nlpr-2.1.0-1.i386.deb
sudo dpkg -i --force-all cupswrapperMFC7360N-2.0.4-2.i386.deb

Configure CUPS

Edit the CUPS configuration file commonly located in /etc/cups/cupsd.conf and make the section that looks like:

# Only listen for connections from the local machine.
Listen localhost:631
Listen /var/run/cups/cups.sock

look like this:

# Listen on all interfaces
Port 631
Listen /var/run/cups/cups.sock

Modify the Apache-style directives to allow connections from everywhere as well. Find the following section in /etc/cups/cupsd.conf:

<Location />
# Restrict access to the server...
Order allow,deny
</Location>

# Restrict access to the admin pages...
<Location /admin>
Order allow,deny
</Location>

# Restrict access to the configuration files...
<Location /admin/conf>
AuthType Default
Require user @SYSTEM
Order allow,deny
</Location>

Add Allow All after each Order allow,deny so it looks like this:

<Location />
# Restrict access to the server...
Order allow,deny
Allow All
</Location>

# Restrict access to the admin pages...
<Location /admin>
Order allow,deny
Allow All
</Location>

# Restrict access to the configuration files...
<Location /admin/conf>
AuthType Default
Require user @SYSTEM
Order allow,deny
Allow All
</Location>

Add the MFC-7360N to CUPS

If your MFC-7360N is connected to your server via USB then you should be all set. Log in to the CUPS administration interface on http://yourserver:631 and modify the MFC7360N printer (if one was created when the drivers were installed), then make sure you can print a test page via CUPS before proceeding.
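
If you prefer the shell, you can also do a quick sanity check from the command line. This is just my own suggestion rather than part of the original HOW-TO, and the queue name is an assumption, so check the output of lpstat first.

lpstat -p                                        # list the print queues CUPS knows about
lp -d MFC7360N /usr/share/cups/data/testprint    # send the CUPS test page; queue name assumed, test-page path may differ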

Install Cloudprint and Cloudprint service

wget -c http://davesteele.github.io/cloudprint-service/deb/cloudprint_0.11-5.1_all.deb
wget -c http://davesteele.github.io/cloudprint-service/deb/cloudprint-service_0.11-5.1_all.deb
sudo dpkg -i cloudprint_0.11-5.1_all.deb
sudo dpkg -i cloudprint-service_0.11-5.1_all.deb

Authenticate

Google accounts with 2 step verification enabled need to use an application-specific password.

Authenticate cloudprintd.

sudo service cloudprintd login

You should see something like this.

Accounts with 2 factor authentication require an application-specific password
Google username: you@example.org
Password:
Added Printer MFC7360N

Start the Cloudprint daemon.

sudo service cloudprintd start

If everything is working correctly you should see your printer listed on the following page:

Printing from mobile devices

Android

Add the Google Cloud Print app to your Android devices and you'll be able to configure your printer preferences and print from Android.

Chrome and Chromium

When printing from within Google Chrome and Chromium you can now select Cloudprint as the destination and choose your printer.

Categories: LUG Community Blogs

Adam Trickett: Picasa Web: Tour de France

Planet HantsLUG - Sat, 05/07/2014 - 08:00

The Tour de France visits Yorkshire

Location: Yorkshire
Date: 5 Jul 2014
Number of Photos in Album: 67

View Album

Categories: LUG Community Blogs

Steve Kemp: And so it begins ...

Planet HantsLUG - Thu, 03/07/2014 - 22:00

1. This weekend I will apply to rejoin the Debian project, as a developer.

2. In the meantime I've begun releasing some code which powers the git-based DNS hosting site/service.

3. This is the end of my list.

4. I lied. This is the end of my list. Powers of two, baby.

Categories: LUG Community Blogs

Creating Time-lapse Videos on Ubuntu

Planet SurreyLUG - Thu, 03/07/2014 - 16:17

I’ve previously blogged about how I sometimes set up a webcam to take pictures and turn them into videos. I thought I’d update that here with something new I’ve done: fully automated time-lapse videos on Ubuntu. Here’s what I came up with:-

(apologies for the terrible music, I added that from a pre-defined set of options on YouTube)

(I quite like the cloud that pops into existence at ~27 seconds in)

Over the next few weeks there’s an Air Show where I live and the skies fill with all manner of strange aircraft. I’m usually working so I don’t always see them as they fly over, but usually hear them! I wanted a way to capture the skies above my house and make it easily available for me to view later.

So my requirements were basically this:-

  • Take pictures at fairly high frequency – one per second – the planes are sometimes quick!
  • Turn all the pictures into a time lapse video – possibly at one hour intervals
  • Upload the videos somewhere online (YouTube) so I can view them from anywhere later
  • Delete all the pictures so I don’t run out of disk space
  • Automate it all
Taking the picture

I’ve already covered this really, but for this job I have tweaked the .webcamrc file to take a picture every second, only save images locally and not upload them. Here are the basics of my .webcamrc:-

[ftp]
dir = /home/alan/Pictures/webcam/current
file = webcam.jpg
tmp = uploading.jpeg
debug = 1
local = 1

[grab]
device = /dev/video0
text = popeycam %Y-%m-%d %H:%M:%S
fg_red = 255
fg_green = 0
fg_blue = 0
width = 1280
height = 720
delay = 1
brightness = 50
rotate = 0
top = 0
left = 0
bottom = -1
right = -1
quality = 100
once = 0
archive = /home/alan/Pictures/webcam/archive/%Y/%m/%d/%H/snap%Y-%m-%d-%H-%M-%S.jpg

Key things to note: “delay = 1” gives us an image every second. The archive directory is where the images will be stored, in sub-folders for easy management and later deletion. That’s it: put that in the home directory of the user taking pictures and then run webcam. Watch your disk space get eaten up.

Making the video

This is pretty straightforward and can be done in various ways. I chose to do two-pass x264 encoding with mencoder. In this snippet we take the images from one hour – in this case midnight to 1AM on 2nd July 2014 – from /home/alan/Pictures/webcam/archive/2014/07/02/00 and make a video in /home/alan/Pictures/webcam/2014070200.avi and a final output in /home/alan/Videos/webcam/2014070200.avi which is the one I upload.

mencoder "mf:///home/alan/Pictures/webcam/archive/2014/07/02/00/*.jpg" -mf fps=60 -o /home/alan/Pictures/webcam/2014070200.avi -ovc x264 -x264encopts direct=auto:pass=1:turbo:bitrate=9600:bframes=1:me=umh:partitions=all:trellis=1:qp_step=4:qcomp=0.7:direct_pred=auto:keyint=300 -vf scale=-1:-10,harddup mencoder "mf:///home/alan/Pictures/webcam/archive/2014/07/02/00/*.jpg" -mf fps=60 -o /home/alan/Pictures/webcam/2014070200.avi -ovc x264 -x264encopts direct=auto:pass=2:bitrate=9600:frameref=5:bframes=1:me=umh:partitions=all:trellis=1:qp_step=4:qcomp=0.7:direct_pred=auto:keyint=300 -vf scale=-1:-10,harddup -o /home/alan/Videos/webcam/2014070200.avi Upload videos to YouTube

The project youtube-upload came in handy here. It’s pretty simple with a bunch of command line parameters – most of which should be pretty obvious – to upload to youtube from the command line. Here’s a snippet with some credentials redacted.

python youtube_upload/youtube_upload.py --email=########## --password=########## --private --title="2014070200" --description="Time lapse of Farnborough sky at 00 on 02 07 2014" --category="Entertainment" --keywords="timelapse" /home/alan/Videos/webcam/2014070200.avi

I have set the videos all to be private for now, because I don’t want to spam any subscriber with a boring video of clouds every hour. If I find an interesting one I can make it public. I did consider making a second channel, but the youtube-upload script (or rather the YouTube API) doesn’t seem to support specifying a different channel from the default one. So I’d have to switch to a different channel by default to work around this, and then make them all public by default, maybe.

In addition YouTube sends me a patronising “Well done Alan” email whenever a video is uploaded, so I know when it breaks: I stop getting those mails.

Delete the pictures

This is easy: I just rm the /home/alan/Pictures/webcam/archive/2014/07/02/00 directory once the upload is done. I don’t bother to check if the video uploaded okay first because, if it fails to upload, I still want to delete the pictures, or my disk will fill up. I already have the videos archived, so I can upload those later if the script breaks.

Automate it all

webcam is running constantly in a ‘screen’ window, so that part is easy. I could detect when it dies and re-spawn it, maybe; it has been known to crash now and then. I’ll get to that when it happens.

I created a cron job which runs at 10 mins past the hour, and collects all the images from the previous hour.

10 * * * * /home/alan/bin/encode_upload.sh

I learned about the useful “1 hour ago” option to the GNU date command. This lets me pick up the images from the previous hour and deals with all the silly calculation needed to figure out what the previous hour was.
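
For illustration, the core of that calculation looks something like this; it is a simplified sketch of the idea rather than the actual script (which is linked below).

HOUR=$(date --date="1 hour ago" +%Y/%m/%d/%H)    # e.g. 2014/07/02/00
STAMP=$(date --date="1 hour ago" +%Y%m%d%H)      # e.g. 2014070200
SRC="/home/alan/Pictures/webcam/archive/${HOUR}"
# ...encode ${SRC}/*.jpg to /home/alan/Videos/webcam/${STAMP}.avi and upload it...
rm -rf "${SRC}"                                  # then delete the source images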

Here (on github) is the final script. Don’t laugh.

Categories: LUG Community Blogs

Chris Lamb: Race report: Ironman Austria 2014

Planet ALUG - Thu, 03/07/2014 - 12:13

I arrived in Klagenfurt early on Thursday before Sunday's race and went to register at the "Irondome" on the shores of Lake Wörthersee. I checked up on my bike at Race Force's HQ and had a brief look around the expo before it got busy.

Over the next few days I met up a number of times with my sister who had travelled—via Venice—to support and cheer me. Only the day before the race did it sincerely dawn on me how touching and meaningful this was, as well as how much it helped having someone close by.

I had planned to take part in as much of the "Ironman experience" as possible, but in practice I wanted to stay out of the sun as much as possible and found that there was an unhealthy pre-race tension at the various events, so I kept myself at a slight distance.

Between participants the topic of discussion was invariably the weather forecast but I avoided paying much attention as I had no locus of control; I would simply make different decisions in each eventuality. However, one could not "un-learn" that it reached 40°C on the run course in 2012, landing many in hospital.

As this was my first long-distance triathlon with a corresponding investment of training I decided that conservative pacing and decisions were especially prudent in order to guarantee a finish. Ironman race intensity is quite low but this also means the perceived difference between a sustainable and a "suicide" pace is dangerously narrow.

Despite that, my goal was to finish under 11 hours, targeting a 1:20 swim, a 5:30 bike and a 4:00 marathon.

Race day

I got to sleep at around 10PM and awoke early at 3AM, fearing that I had missed my alarm. I dozed for another hour before being woken at 4AM and immediately started on two strong coffees and waited for a taxi.

Over the next 2 hours I ate two Powerbars, a banana and sipped on isotonic energy drink. I also had a gel immediately before the swim, a total of approximately 600 calories. Many consume much more pre-race, but I had not practised this and there would be plenty of time to eat on the bike.

I got to transition as it opened at 5AM and checked over my bags and bike and then made my way slowly to the lake to put on my wetsuit.

Swim
Distance: 3.8km / 2.4 miles
Time: 1:21:31 (2:08/100m)

I felt my swimming ability to be just on the right of the bell-curve, so I lined myself up according to the organisers' seeding suggestion. I'm quite good with nerves, so in the final ten minutes I kept to myself and remained very calm.

After some theatrics from the organisers, the gun finally went off at 7AM. It was difficult to get my technique "in" straight away but after about 5 minutes I found I could focus almost entirely on my stroke. I didn't get kicked too much and I reached the first turn buoy in good time, feeling relaxed. Between the next two buoys I had some brief hamstring cramps but they passed quickly.

After the second turn I veered off-course due to difficulties in sighting the next landmark—the entrance to the Lend canal—in the rising sun. Once I reached it, the canal itself was fast-moving but a little too congested so I became stuck behind slower swimmers.

The end of the swim came suddenly and glancing at my watch I was pretty happy with my time, especially as I didn't feel like I had exerted myself much at all.

T1
Time: 6:30

Due to the size of Ironman events there is an involved system of transition bags and changing tents; no simple container beside your bike. There was also a fair amount of running between the tents as well. Despite that (and deciding to swim in my Castelli skinsuit to save time) I was surprised at such a long transition split – I'm not sure where I really spent it all.

I also had been under the impression volunteers would be applying sunscreen so I had put my only sun spray in the bike-to-run bag, not the swim-to-bike one. However, I found a discarded bottle by my feet and borrowed some.

Bike
Distance: 180km / 112 miles
Time: 5:30:12 (32.7kph - 20.3mph)

The bike leg consists of two hilly laps just to the south of the Wörthersee.

It felt great to be out on the bike but it soon became frustrating as I could not keep to my target wattage due to so many people on the course. There were quite a few marshals out too, which compounded this – I didn't want to exert any more than necessary in overtaking slower riders, but I also did not want to draft, let alone be caught drafting.

This also meant I had to switch my read-out from a 10-second power average to 3-seconds, a sub-optimal situation as it does not encourage consistent power output. It's likely a faster swim time would have "seeded" me within bikers more around my ability, positively compounding my overall performance.

I started eating after about 15 minutes: in total I consumed six Powerbars, a NutriGrain, half a banana, four full bottles of isotonic mix and about 500ml of Coca-Cola. I estimate I took on about 1750 calories in total.

The aid stations were very well-run, my only complaints being that the isotonic drink became extremely variable—the bottles being half-filled and/or very weak—and that a few were cruelly positioned before hills rather than after them.

I felt I paced the climbs quite well and kept almost entirely below threshold as rehearsed. Gearing-wise, I felt I had chosen wisely – I would not have liked to have been without 36x28 in places and only managed to spin out the 52x12 four or five times. Another gear around 16t would have been nice though.

There was quite heavy rain and wind on the second lap but it did not deter the locals or my sister, who was on the Rupitiberg waving a custom "Iron Lamb" banner. It was truly fantastic seeing her out on the course.

On the final descent towards transition I was happy with my time given the congestion but crucially did not feel like I had just ridden 112 miles, buoying my spirits for the upcoming marathon.

T2
Time: 3:46

Apart from the dismount line which came without warning and having a small mishap in finding my bike rack, transitioning to the run was straightforward.

Run
Distance: 42.195km / 26.2 miles
Time: 3:54:21 (5:33/km - 8:56/mile)

The run course consists of two "out-and-backs". The first leads to Krumpendorf along the Wörthersee and the railway, the second to the centre of Klagenfurt along the canal. Each of these is repeated twice.

I felt great off the bike but it is always difficult to slow oneself to an easy pace after T2, even when you are shouting at yourself to do so. I did force my cadence down—as well as scaring myself with a 4:50 split for the first kilometer—and settled into the first leg to Krumpendorf.

Once the crowds thinned I took stock and decided to find a bathroom before it could become a problem. After that, I felt confident enough to start taking on fuel and the 10km marker on the return to the Irondome came around quickly.

Over the course of the run I had about three or four caffeinated gels and, in the latter stages, a few mouthfuls of Coke. I tried drinking water but switched to watermelon slices as I realised I could absorb more liquid that way while staying on the move, as well as gaining the feeling of security that comes from simply carrying something.

The first visit to Klagenfurt was unremarkable and I was taking care to not go too hard on the downhill gradients there – whilst going uphill is relatively straightforward to pace, I find downhill running deceptively wearing on your quads and I still had 25km to go.

The halfway point came after returning from Klagenfurt and I was spotted by my sister which was uplifting. I threw her a smile, my unused running belt and told her I felt great, which I realised I actually did.

At about 23km I sensed I needed the bathroom again but the next aid station had locked theirs and for some bizarre reason I then did not stop when I saw a public WC which was clearly open. I finally found one at 28km but the episode had made for a rather uncomfortable second lap of Krumpendorf. I did run a little of the Klagenfurt canal earlier in the week, but I wished I had run the route through Krumpendorf instead – there was always "just another" unexpected turn which added to the bathroom frustration.

In a chapter in What I Talk About When I Talk About Running, Haruki Murakami writes about the moment he first ran past 26.2 miles:

I exaggerate only a bit when I say that the moment I straddled that line a slight shiver went through me, for this was the first time I'd ever run more than a marathon. For me this was the Strait of Gibraltar, beyond which lay an unknown sea. What lay in wait beyond this, what unknown creatures were living there, I didn't have a clue. In my own small way I felt the same fear that sailors of old must have felt.

I was expecting a similar feeling at this point but I couldn't recall whether my longest run to date was 30 or 31km, a detail which somehow seemed to matter at the time. I certainly noticed how uphill the final return from Krumpendorf had suddenly become and how many people had started walking, lying down, or worse.

32km. Back near the Irondome, the crowds were insatiable but extremely draining. Having strangers call your name out in support sounds nice in principle but I was already struggling to focus, my running form somewhat shot.

In the final leg to Krumpendorf, the changes of gradient appeared to have been magnified tenfold but I was still mostly in control, keeping focused on the horizon and taking in something when offered. Once I reached Klagenfurt for the last time at 36km I decided to ignore the aid stations; they probably weren't going to be of any further help and running on fumes rather than take nutrition risks seemed more prudent.

The final stretch from Klagenfurt remains a bit of a blur. I remember briefly walking up a rather nasty section of canal towpath, this was the only part I walked outside of the aid stations which again seemed more important (and worrying) at the time. I covered a few kilometers alongside another runner where matching his pace was a welcome distraction from the pain and feelings of utter emptiness and exhaustion.

I accelerated away from him and others, but the final kilometer seemed like an extremely cruel joke, teasing you multiple times with the sights and sounds of the finish before winding you away—with yet more underpasses!—to fill out the distance.

Before entering the finishing chute I somehow zipped up my trisuit, flattened my race number and climbed the finish ramp, completely numb to any "You are an Ironman" mantra being called out.

Overall
Total time: 10:56:20

I spent about 15 minutes in the enclave just beyond the finish line, really quite unsure about how my body was feeling. After trying lying down and soaking myself in water, Harriet took me off to the post-race tent where goulash, pizza and about a litre of Sport-Weiss made me feel human again...

(Full results)

Categories: LUG Community Blogs

Mick Morgan: inappropriate use of technology

Planet ALUG - Mon, 30/06/2014 - 13:49

I have been travelling a lot over the last few months (Czech Republic, Scotland, France, Germany, Austria, Slovenia, Croatia, Italy). That travel, plus my catching up on a load of reading is my excuse for the woeful lack of posts to trivia of late. But hey, sometimes life gets in the way of blogging – which is as it should be.

A couple of things struck me whilst I have been away though. Firstly, and most bizarrely, I noticed a significant number of tourists in popular, and hugely photogenic, locations (such as Prague and Dubrovnik) wandering around staring at their smartphones rather than looking at the reality around them. At first I thought that they were just checking photographs they had taken, or possibly that they were texting or emailing friends and relatives about their holidays, or worse, posting to facebook, but that did not appear to be the case. Then, by chance, I overheard one tourist telling his partner that they needed to “turn left ahead” as they walked past me, so it struck me that they might just possibly be using google maps to navigate. So I watched others more carefully. And I must conclude that many people were doing just that. I can’t help but feel a little saddened that someone should choose to stare at a google app on a small screen in their hand rather than look at the beauty of something like the Charles Bridge across the Vltava.

The second point which struck me was how much of a muppet you look if you use an iPad to take photographs.

Categories: LUG Community Blogs

Steve Kemp: Slowly releasing bits of code

Planet HantsLUG - Sun, 29/06/2014 - 14:37

As previously mentioned I've been working on git-based DNS hosting for a while now.

The site was launched for real last Sunday, and since that time I've received enough paying customers to cover all the costs, which is nice.

Today the site was "relaunched", by which I mean almost nothing has changed, except the site looks completely different - since the templates/pages/content is all wrapped up in Bootstrap now, rather than my ropy home-made table-based layout.

This week coming I'll be slowly making some of the implementation available as free-software, having made a start by publishing CGI::Application::Plugin::AB - a module designed for very simple A/B testing.

I don't think there will be too much interest in most of the code, but one piece I'm reasonably happy with is my webhook-receiver.

Webhooks are at the core of how my service is implemented:

  • You create a repository to host your DNS records.
  • You configure a webhook to be invoked when pushing to that repository.
  • The webhook will then receive updates, and magically update your DNS.

Because the webhook must respond quickly, otherwise github/bitbucket/whatever will believe you've timed out and report an error, you can't do much work on the back-end.

Instead I've written a component that listens for incoming HTTP POSTS, parses the body to determine which repository that came from, and then enqueues the data for later processing.

A different process will be constantly polling the job-queue (which in my case is Redis, but could be beanstalkd or similar – hell, even use MySQL if you're a masochist) and actually doing the necessary magic.
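
As a rough illustration of that consumer side (my own sketch, not the code being released), a worker just blocks on the queue and handles each payload as it arrives:

# Assumes Redis and a list called "webhook-jobs"; both names are illustrative.
while true; do
    payload=$(redis-cli --raw BLPOP webhook-jobs 0 | tail -n 1)   # blocks until a job is queued
    echo "updating DNS for: ${payload}"                           # ...do the real zone update here...
done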

Most of the webhook processor is trivial, but handling different services (github, bitbucket, etc) while pretending they're all the same is hard. So my little toy-service will be released next week and might be useful to others.

ObRandom: New pricing unveiled for users of a single zone - which is a case I never imagined I'd need to cover. Yay for A/B testing :)

Categories: LUG Community Blogs

Martin Wimpress: Humax Foxsat HDR Custom Firmware

Planet HantsLUG - Sun, 29/06/2014 - 12:00

I've had these notes kicking around for absolutely ages. I haven't checked to see if this stuff is still accurate because during the last 12 months or so our viewing habits have changed and we almost exclusively watch streamed content now.

That said, my father-in-law gave us a Humax Foxsat HDR Freesat digital recorder as a thank you for some work I did for him. It turns out the Humax Foxsat HDR is quite hackable.

Hard Disk Upgrade

I contributed to the topic below. The Humax firmware only supports disks up to 1TB, but the hackers' method works for 2TB drives, possibly bigger, though I haven't tried. I've upgraded mine and my father-in-law's to 2TB without any problems.

Custom Firmware

These are the topics that discuss the custom firmware itself and include the downloads. Once the custom firmware is installed and its advanced interface has been enabled, you can enable several add-ons such as Samba, Mediatomb, ssh, ftp, etc.

Web Interface

This is the topic about the web interface. No download is required; it is bundled in the custom firmware above.

Channel Editor

The channel editor is installed and configured via the web interface and is one of my favourite add-ons. It also allows you to enable non-freesat channels in the normal guide.

Non Freesat channels

We get our broadcasts from Astra 2A / Astra 2B / Astra 2D / Eurobird 1 (28.2°E). Other free to air channels are available and KingOfSat lists them all. These channels can only be added via the Humax setup menus, but can then be presented in the normal EPG using the Channel Editor above.

Decrypt HD recordings

I never actually tried this, but what follows might be useful.

OTA updates

This is not so relevant now, since Humax haven't released an OTA update for some time.

I have not yet found a way to prevent automatic over-the-air updates from being applied. When an over-the-air update is applied the Humax still works, but the web interface and all the add-ons stop working. This can be solved by waiting for the custom firmware to be updated (which happens remarkably quickly) and then re-flashing the custom firmware. All the add-ons should start working again.

Categories: LUG Community Blogs

Adam Trickett: Bog Roll: New Time Piece

Planet HantsLUG - Sat, 28/06/2014 - 17:11

Personal portable time pieces are strange things. My first watch was a small red wind-up Timex that I was given when I could read the time. The strap eventually frayed but it was still in good order when I was given a Timex LCD digital watch when they became popular and cheap. That watch lasted all of 12 months before it died, to be replaced by a Casio digital watch which lasted many years and many straps.

At University I had my grandfather's watch, a mechanical Swiss watch that was very accurate as long as I remembered to keep it wound up. I continued to wear it as my regular watch in my first job. The only problems with this watch were that it wasn't shock- or waterproof and that it was no good when I forgot to wind it up! My better half got me a Casio hybrid analogue/digital watch which I used for many years as my outdoors/DIY watch.

I was somewhat disappointed when the resin case failed; though the watch was okay it wasn't wearable anymore, and I replaced it with two watches: another similar Casio and a titanium-cased Lorus quartz analogue-display watch. The Casio failed in exactly the same way as the previous ones: dead straps and then a case failure.

My most recent watch is a Seiko titanium-cased quartz analogue-display watch similar to the Lorus (they are the same company), but this one is slightly nicer and the battery is solar-recharged so it should never need replacing. The strap is also made of titanium, so unlike the continually failing resin straps it should also last quite a bit longer...

The irony of all these watches is that I sit in front of a computer for a living, and often at home – all of which have network-synchronised clocks on them that are far more reliable than any of my wrist-watches. When I'm not at a computer I usually have my mobile phone with me, or another high-precision time piece is available...

Categories: LUG Community Blogs

Brett Parker (iDunno): Sony Entertainment Networks Insanity

Planet ALUG - Sat, 28/06/2014 - 16:54

So, I have a SEN account (it's part of the PSN), I have 2 videos with SEN, and I have a broken PS3 so I can no longer deactivate video (you can only do that from the console itself, yes, really)... and the response from SEN has been abysmal, specifically:

As we take the security of SEN accounts very seriously, we are unable to provide support on this matter by e-mail as we will need you to answer some security questions before we can investigate this further. We need you to phone us in order to verify your account details because we're not allowed to verify details via e-mail.

I mean, seriously, they're going to verify my details over the phone better than over e-mail how exactly? All the contact details are tied to my e-mail account, I have logged in to their control panel and renamed the broken PS3 to "Broken PS3", I have given them the serial number of the PS3, and yet they insist that I need to call them, because apparently they're fucking stupid. I'm damned glad that I only ever got 2 videos from SEN, both of which I own on DVD now anyways, this kind of idiotic tie in to a system is badly wrong.

So, you phone the number... and now you get stuck with hold music for ever... oh, yeah, great customer service here guys. I mean, seriously, WTF.

OK - 10 minutes on the phone, and still being told "One of our advisors will be with you shortly". I get the feeling that I'll just be writing off the 2 videos that I no longer have access to.

I'm damned glad that I didn't decide to buy more content from that - at least you can reset the games entitlement once every six months without jumping through all these hoops (you have to reactivate each console that you still want to use, but hey).

Categories: LUG Community Blogs

Martin Wimpress: Integrating Dropbox photo syncing with Open Media Vault and Plex

Planet HantsLUG - Sat, 28/06/2014 - 12:00

I've installed Open Media Vault on a HP ProLiant MicroServer G7 N54L and use it as media server for the house. OpenMediaVault (OMV) is a network attached storage (NAS) solution based on Debian Linux. At the time of writing OMV 0.5.x is based on Debian 6.0 (Squeeze).

I use a free Dropbox account to sync photos from mine and my wife's Android phones and wanted to automate the import of these photo uploads into Plex, which is also running on Open Media Vault.

Installing Dropbox on Open Media Vault 0.5.x

I looked for a Dropbox Plugin for Open Media Vault and found this:

Sadly, at the time of writing, it is unfinished and I didn't have the time to go and learn the Open Media Vault plugin API.

The Open Media Vault forum does include a Dropbox HOW-TO which is very similar to how I've run Dropbox on headless Linux servers in the past. So, I decided to adapt my existing notes to Open Media Vault.

Create a Dropbox Share

Create a Dropbox share via the OMV WebUI.

  • Access Right Management -> Shared Folders

I gave mine the name "Dropbox". I know, very original.

Installing Dropbox on a headless server

Download the latest Dropbox stable release for 32-bit or 64-bit.

wget -O dropbox.tar.gz "http://www.dropbox.com/download/?plat=lnx.x86"
wget -O dropbox.tar.gz "http://www.dropbox.com/download/?plat=lnx.x86_64"

Extract the archive and install Dropbox in /opt.

cd
tar -xvzf dropbox.tar.gz
sudo mv ~/.dropbox-dist /opt/dropbox
sudo find /opt/dropbox/ -type f -exec chmod 644 {} \;
sudo chmod 755 /opt/dropbox/dropboxd
sudo chmod 755 /opt/dropbox/dropbox
sudo ln -s /opt/dropbox/dropboxd /usr/local/bin/dropboxd

Run dropboxd.

/usr/local/bin/dropboxd

You should see output like this:

This client is not linked to any account... Please visit https://www.dropbox.com/cli_link?host_id=d3adb33fcaf3d3adb33fcaf3d3adb33f to link this machine.

Visit the URL, login with your Dropbox account and link the account. You should see the following.

Client successfully linked, Welcome Web!

dropboxd will now create a ~/Dropbox folder and start synchronizing. Stop dropboxd with CTRL+C.

Symlink the Dropbox share

Login to the OMV server as root and sym-link the Dropbox share you created earlier to the Dropbox directory in the root home directory.

mv ~/Dropbox ~/Dropbox-old
ln -s /media/<UUID>/Dropbox Dropbox
rsync -av -W --progress ~/Dropbox-old/ ~/Dropbox/

init.d

To run Dropbox as a daemon with init.d, create /etc/init.d/dropbox with the following content.

#!/bin/sh
### BEGIN INIT INFO
# Provides:          dropbox
# Required-Start:    $local_fs $remote_fs $network $syslog $named
# Required-Stop:     $local_fs $remote_fs $network $syslog $named
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# X-Interactive:     false
# Short-Description: dropbox service
### END INIT INFO

DROPBOX_USERS="root"
DAEMON=/opt/dropbox/dropbox

start() {
    echo "Starting dropbox..."
    for dbuser in $DROPBOX_USERS; do
        HOMEDIR=`getent passwd $dbuser | cut -d: -f6`
        if [ -x $DAEMON ]; then
            HOME="$HOMEDIR" start-stop-daemon -b -o -c $dbuser -S -u $dbuser -x $DAEMON
        fi
    done
}

stop() {
    echo "Stopping dropbox..."
    for dbuser in $DROPBOX_USERS; do
        # DAEMON is an absolute path, so test it directly
        if [ -x $DAEMON ]; then
            start-stop-daemon -o -c $dbuser -K -u $dbuser -x $DAEMON
        fi
    done
}

status() {
    for dbuser in $DROPBOX_USERS; do
        dbpid=`pgrep -u $dbuser dropbox`
        if [ -z $dbpid ] ; then
            echo "dropboxd for USER $dbuser: not running."
        else
            echo "dropboxd for USER $dbuser: running (pid $dbpid)"
        fi
    done
}

case "$1" in
    start)
        start
        ;;
    stop)
        stop
        ;;
    restart|reload|force-reload)
        stop
        start
        ;;
    status)
        status
        ;;
    *)
        echo "Usage: /etc/init.d/dropbox {start|stop|reload|force-reload|restart|status}"
        exit 1
esac

exit 0

Enable the init.d script.

sudo chmod +x /etc/init.d/dropbox
sudo update-rc.d dropbox defaults

Starting and Stopping the Dropbox daemon

Use /etc/init.d/dropbox start to start and /etc/init.d/dropbox stop to stop.

Dropbox client

It is recommended to download the official Dropbox client to configure Dropbox and get its status.

wget "http://www.dropbox.com/download?dl=packages/dropbox.py" -O dropbox chmod 755 dropbox sudo mv dropbox /usr/local/bin/

You can check on Dropbox status by running the following.

dropbox status

For usage instructions run dropbox help.

Photo importing

So, the reason for doing all this is that I now have a Dropbox instance running on my home file server, and every day it runs a script that I wrote to automatically import new photos into a directory that Plex monitors. I'll post details about my photo sorting script, Phort, at a later date.
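
To give a rough idea of what such a script does (this is a hypothetical sketch, not Phort itself, and both paths are my assumptions), it only needs to move whatever lands in the Dropbox camera-upload folder into the directory that Plex watches:

SRC="/root/Dropbox/Camera Uploads"       # where the Android Dropbox app uploads photos (assumed)
DEST="/media/<UUID>/Photos/incoming"     # a directory inside the Plex photo library (assumed)
mkdir -p "${DEST}"
rsync -av --remove-source-files "${SRC}/" "${DEST}/"   # move new photos; Plex picks them up on its next scan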

Categories: LUG Community Blogs