News aggregator

Meet Up Thursday 29th January 2015 Ye Olde Cock Tavern from 6.30pm: Pre-FOSDEM Drinks

Greater London LUG News - Thu, 15/01/2015 - 12:36
After a bit of a break and after much interest, we've put together a drinks meet-up. It is right before FOSDEM, so we're hoping to get a few people from out of town too who might be in London before heading off on the Eurostar. So do let others know if they're travelling this way.

Feel free to bring laptops and gadgets to show off or get help with, but mainly bring yourself to meet other people with similar interests to your own.

We have booked an area of the ground floor and you should be able to find us pretty easily but we will have a plushy tux just in case.

Ye Olde Cock Tavern
22 Fleet Street
London
EC4Y 1AA

29th Jan 2015
From 6.30pm
Categories: LUG News

Jonathan McDowell: Tracking a ship around the world

Planet ALUG - Tue, 13/01/2015 - 16:07

I moved back from the California Bay Area to Belfast a while back and for various reasons it looks like I’m going to be here a while, so it made sense to have my belongings shipped over here. They haven’t quite arrived yet, and I’ll do another post about that process once they have, but I’ve been doing various tweets prefixed with “[shipping]” during the process. Various people I’ve spoken to (some who should know me better) thought this was happening manually. It wasn’t. If you care about how it was done, read on.

I’d been given details of the ship carrying my container, and searching for that turned up the excellent MarineTraffic, which let me see the current location of the ship. Turns out ships broadcast their location using AIS and anyone with a receiver can see the info. Very cool, and I spent some time having a look at various bits of shipping around the UK out of interest. I also found the ship’s itinerary, which gave me some idea of where it would be calling and when. Step one was to start recording this data; it was time sensitive and I wanted to be able to see historical data. I took the easy route and set up a cron job to poll the location and itinerary on an hourly basis, and store the results. That meant I had the data over time, if my parsing turned out to miss something I could easily correct it, and that I wasn’t hammering Marine Traffic while writing the parsing code.
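The poll-and-store step can be sketched as a small helper. This is only a sketch, not the author's actual script; the fetch callable and destination directory are stand-ins:

```python
import datetime
import os

def store_snapshot(fetch, dest_dir):
    """Fetch the current page and store the raw HTML under a timestamped
    filename, so the parser can be re-run over the history later if it
    turns out to have missed something."""
    html = fetch()  # e.g. a urllib call against the ship's status page
    stamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    path = os.path.join(dest_dir, "status-%s.html" % stamp)
    with open(path, "wb") as f:
        f.write(html)
    return path
```

A crontab entry such as `0 * * * * python poll.py` would then give one snapshot per hour.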

Next I wanted to parse the results, store them in a more manageable format than the HTML, and alert me when the ship docked somewhere or set off again. I’ve been trying to learn more Python rather than doing my default action of turning to Perl for these things, and this seemed like a simple enough project to try out. Beautiful Soup seemed to turn up top for HTML parsing in Python, so that formed the basis. Throwing the info into a database so I could do queries felt like the right move, so I used SQLite - if this had been more than a one-off I’d have considered looking at PostgreSQL and its GIS support. Finally, Tweepy made it very easy to tweet from Python in about 4 lines of code. The whole thing weighed in at only 175 lines of code, mostly around pulling the info out of the HTML, with a little more to deal with checking for state changes against the current status and the last piece of info in the database.

The pieces of information I chose to store were the time of the update (i.e. when the ship sent it, not when my script ran), reported area, reported state, the position and course, reported origin, reported destination and ETA. The fact this is all in a database makes it very easy to do a few queries on the data.
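The state-change check can be sketched against such a table. The column names here are assumptions based on the fields listed above; the author's actual schema may differ:

```python
import sqlite3

def ship_changed_state(db, new_status):
    """Compare the latest stored status against the newly parsed one.
    Store the new row either way, and return True when the ship's state
    changed, e.g. from 'Underway using Engine' to 'Moored' - the moment
    a tweet would be sent."""
    cur = db.execute("SELECT status FROM status ORDER BY updated DESC LIMIT 1")
    row = cur.fetchone()
    changed = row is None or row[0] != new_status["status"]
    db.execute(
        "INSERT INTO status (updated, area, status, speed) VALUES (?, ?, ?, ?)",
        (new_status["updated"], new_status["area"],
         new_status["status"], new_status["speed"]),
    )
    db.commit()
    return changed
```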

How fast did the ship go?

sqlite> SELECT MAX(speed) FROM status;
MAX(speed)
21.9

What areas did it report?

sqlite> SELECT area FROM status GROUP BY area;
area
-
Atlantic North
California
Caribbean Sea
Celtic Sea
English Channel
Hudson River
Pacific North
Panama Canal

What statuses did we see?

sqlite> SELECT status FROM status GROUP BY status;
status
At Anchor
Moored
Stopped
Underway
Underway using Engine

Finally having hourly data lets me draw a map of where the ship went. The data isn’t complete, because the free AIS info depends on the ship being close enough to a receiving station. That means there were a few days in the North Atlantic without updates, for example. However there’s enough to give a good idea of just how well traveled my belongings are, and it gave me an excuse to play with OpenLayers.
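Turning the stored positions into something OpenLayers can draw is a small step. One way, assuming the fixes were stored as latitude/longitude pairs (an assumption; the author's storage format isn't shown), is to emit a GeoJSON LineString:

```python
import json

def track_geojson(positions):
    """Build a GeoJSON LineString from stored (lat, lon) fixes, suitable
    for loading into a map library such as OpenLayers as a vector layer.
    Note that GeoJSON orders coordinates [lon, lat]."""
    return json.dumps({
        "type": "Feature",
        "geometry": {
            "type": "LineString",
            "coordinates": [[lon, lat] for lat, lon in positions],
        },
        "properties": {"name": "ship track"},
    })
```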

(Apologies if the zoom buttons aren’t working for you here; I think I need to kick the CSS in some manner I haven’t quite figured out yet.)

Categories: LUG Community Blogs

Chris Lamb: Giant's Causeway puzzle

Planet ALUG - Mon, 12/01/2015 - 10:34

Over Christmas I found the above puzzle at my parents' house. I don't know if it has a name but it seemed apt to name it after The Giant's Causeway.

The idea is that you swap and/or rotate the outer tiles until the colours at the edges match. The center tile is fixed, or at least I assume it is as the wooden edges are deliberately less rounded.

Of course, solving it manually would be boring so here's a dumb brute force solution. I actually admire how inefficient and stupid it is...

import itertools

class Tile(list):
    def __getitem__(self, x):
        return super(Tile, self).__getitem__(x % len(self))

    def rotate(self, x):
        return Tile(self[x:] + self[:x])

COLOURS = ('YELLOW', 'WHITE', 'BLACK', 'RED', 'GREEN', 'BLUE')
YELLOW, WHITE, BLACK, RED, GREEN, BLUE = range(len(COLOURS))

TILES = (
    Tile((WHITE, YELLOW, RED, BLUE, BLACK, GREEN)),
    Tile((RED, BLUE, BLACK, YELLOW, GREEN, WHITE)),
    Tile((WHITE, BLACK, YELLOW, GREEN, BLUE, RED)),
    Tile((WHITE, BLUE, GREEN, YELLOW, BLACK, RED)),
    Tile((GREEN, BLUE, BLACK, YELLOW, RED, WHITE)),
    Tile((RED, YELLOW, GREEN, BLACK, BLUE, WHITE)),
)

CENTER = Tile((WHITE, BLACK, RED, GREEN, BLUE, YELLOW))

def pairwise(it):
    a, b = itertools.tee(it)
    next(b, None)
    return itertools.izip(a, b)

def validate(xs):
    for idx, x in enumerate(xs):
        if x[idx + 3] != CENTER[idx]:
            raise ValueError("Tile does not match center")

    for idx, (a, b) in enumerate(pairwise(xs)):
        if a[idx + 2] != b[idx + 5]:
            raise ValueError("Tile does not match previous anticlockwise tile")

    return xs

def find_solution():
    # For all possible tile permutations...
    for xs in itertools.permutations(TILES):
        # ... try all possible rotations
        for ys in itertools.product(range(len(COLOURS)), repeat=len(COLOURS)):
            try:
                return validate([x.rotate(y) for x, y in zip(xs, ys)])
            except ValueError:
                pass
    raise ValueError("Could not find a solution.")

for x in find_solution():
    print ', '.join(COLOURS[y] for y in x)

This prints (after 5 minutes!):

$ python giants.py
GREEN, BLUE, RED, WHITE, BLACK, YELLOW
WHITE, BLUE, GREEN, YELLOW, BLACK, RED
YELLOW, GREEN, BLACK, BLUE, WHITE, RED
GREEN, WHITE, YELLOW, RED, BLUE, BLACK
RED, BLUE, BLACK, YELLOW, GREEN, WHITE
BLUE, BLACK, YELLOW, RED, WHITE, GREEN
python giants.py  269.08s user 0.02s system 99% cpu 4:29.36 total

UPDATE: Seems like this puzzle is called "Circus Seven". Now to dust off my Prolog...

Categories: LUG Community Blogs

Chris Lamb: Uploading a large number of files to Amazon S3

Planet ALUG - Fri, 09/01/2015 - 13:31

I recently had to upload a large number (~1 million) of files to Amazon S3.

My first attempts revolved around s3cmd (and subsequently s4cmd) but both projects seem to be based around analysing all the files first, rather than blindly uploading them. Not only does this require a large amount of memory; non-trivial experimentation, fiddling and patching are also needed to avoid unnecessary stat(2) calls. I even tried a simple find | xargs -P 5 s3cmd put [..] but I just didn't trust the error handling.

I finally alighted on s3-parallel-put, which worked out well. Here's a brief rundown on how to use it:

  1. First, change to your source directory. This is to ensure that the filenames created in your S3 bucket are not prefixed with the directory structure of your local filesystem — whilst s3-parallel-put has a --prefix option, it is ignored if you pass a fully-qualified source, ie. one starting with a /.
  2. Run with --dry-run --limit=1 and check that the resulting filenames will be correct after all:
$ export AWS_ACCESS_KEY_ID=FIXME
$ export AWS_SECRET_ACCESS_KEY=FIXME
$ /path/to/bin/s3-parallel-put \
    --bucket=my-bucket \
    --host=s3.amazonaws.com \
    --put=stupid \
    --insecure \
    --dry-run --limit=1 \
    .
[..]
INFO:s3-parallel-put[putter-21714]:./yadt/profile.Profile/image/circle/807.jpeg -> yadt/profile.Profile/image/circle/807.jpeg
[..]
  3. Remove --dry-run --limit=1, and let it roll.
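The appeal of this approach is that files are streamed to a pool of uploaders as they are discovered, rather than being scanned and listed up front. A rough illustration of that shape (not s3-parallel-put's actual implementation; `upload` is a stand-in for a real S3 PUT):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def put_tree(root, upload, workers=5):
    """Walk `root` and hand each file to a worker pool as it is found,
    rather than building and analysing a complete file list first.
    `upload` is a stand-in callable taking (key, local_path); a real
    version would PUT to S3. Returns the number of files uploaded."""
    futures = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                # Key is the path relative to the source directory,
                # matching the "change to your source directory" advice.
                key = os.path.relpath(path, root)
                futures.append(pool.submit(upload, key, path))
    count = 0
    for f in futures:
        f.result()  # re-raise any upload error rather than hiding it
        count += 1
    return count
```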
Categories: LUG Community Blogs

Jonathan McDowell: Cup!

Planet ALUG - Thu, 08/01/2015 - 13:58

I got a belated Christmas present today. Thanks Jo + Simon!

Categories: LUG Community Blogs

MJ Ray: Social Network Wishlist

Planet ALUG - Thu, 08/01/2015 - 04:10

All I want for 2015 is a Free/Open Source Software social network which is:

  • easy to register on (no reCaptcha disability-discriminator or similar, a simple openID, activation emails that actually arrive);
  • has an email help address or online support or phone number or something other than the website which can be used if the registration system causes a problem;
  • can email when things happen that I might be interested in;
  • can email me summaries of what’s happened last week/month in case they don’t know what they’re interested in;
  • doesn’t email me too much (but this is rare);
  • interacts well with other websites (allows long-term members to post links, sends trackbacks or pingbacks to let the remote site know we’re talking about them, makes it easy for us to dent/tweet/link to the forum nicely, and so on);
  • isn’t full of spam (has limits on link-posting, moderators are contactable/accountable and so on, and the software gives them decent anti-spam tools);
  • lets me back up my data;
  • is friendly and welcoming and trolls are kept in check.

Is this too much to ask for? Does it exist already?

Categories: LUG Community Blogs

Chris Lamb: Web scraping: Let's move on

Planet ALUG - Wed, 07/01/2015 - 18:53

Every few days, someone publishes a new guide, tutorial, library or framework about web scraping, the practice of extracting information from websites where an API is either not provided or is otherwise incomplete.

However, I find these resources fundamentally deceptive — the arduous parts of "real world" scraping simply aren't in the parsing and extraction of data from the target page, the typical focus of these articles.

The difficulties are invariably in "post-processing": working around incomplete data on the page, handling errors gracefully and retrying in some (but not all) situations, keeping on top of layout/URL/data changes to the target site, not hitting your target site too often, logging into the target site if necessary and rotating credentials and IP addresses, respecting robots.txt, the target site being utterly braindead, keeping users meaningfully informed of scraping progress if they are waiting on it, the target site adding and removing data resulting in a null-leaning database schema, sane parallelisation in the presence of prioritisation of important requests, difficulties in monitoring a scraping system due to its implicitly non-deterministic nature, and general problems associated with long-running background processes in web stacks.

Et cetera.
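To make one of those concerns concrete, "retrying in some (but not all) situations" might be sketched as a small helper. The error classes here are hypothetical; the point is that only transient failures are retried, and permanent ones propagate immediately:

```python
import time

class TransientError(Exception):
    """e.g. a timeout or an HTTP 503 - worth retrying."""

class PermanentError(Exception):
    """e.g. an HTTP 404 or a layout change - retrying won't help."""

def fetch_with_retry(fetch, attempts=3, delay=1.0, sleep=time.sleep):
    """Retry only transient failures, backing off exponentially between
    attempts; anything else propagates to the caller straight away."""
    for attempt in range(attempts):
        try:
            return fetch()
        except TransientError:
            if attempt == attempts - 1:
                raise
            sleep(delay * (2 ** attempt))
```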

In other words, extracting the right text from the page is by far the easiest and most trivial part, with little practical difference between using an admittedly cute jQuery-esque parsing library and just using a blunt regular expression.

It would be quixotic to simply retort that sites should provide "proper" APIs but I would love to see more attempts at solutions that go beyond the superficial.

Categories: LUG Community Blogs

Steve Engledow (stilvoid): Pinally

Planet ALUG - Wed, 07/01/2015 - 00:03

I've finally found a real, practical use for my Raspberry Pi: it's an always-on machine that I can ssh to and use to wake up my home media server :)

It doubles as yet another syncthing client to add to my set.

And of course, it's running Arch ;)

Categories: LUG Community Blogs

Steve Kemp: Here we go again.

Planet HantsLUG - Tue, 06/01/2015 - 00:00

Once upon a time I worked from home for seven years, for a company called Bytemark. Then, due to a variety of reasons, I left. I struck out for adventures and pastures new, enjoyed the novelty of wearing clothes every day, and left the house a bit more often.

Things happened. A year passed.

Now I'm working for Bytemark again, although there are changes and the most obvious one is that I'm working in a shared-space/co-working setup, renting a room in a building near my house instead of being in the house.

Shame I have to get dressed, but otherwise it seems to be OK.

Categories: LUG Community Blogs

Philip Stubbs: Quad-copter Build Log

Planet HantsLUG - Mon, 05/01/2015 - 21:06
Having enjoyed watching a series of videos on YouTube, I have been inspired to have a go at building a quad copter. I hope to include here some notes that will act as a reminder for myself. If anybody else decides to try this, please don't rely on this information alone, but please do your research.

Radio

This seems like a good place to start. If this bit can't be made to work, then nothing has been wasted on motors, batteries, ESCs etc.
Rather than build an entire transmitter, an old DigiFleet transmitter will be gutted and used for the housing, control sticks, knobs and switches.
I had been hanging onto this transmitter, thinking that I would use it someday, but in the meantime, the NiCAD cells had spilt their guts and pretty much wrecked the circuit boards inside. So nothing was lost by tossing out the control boards, to make space for an Arduino. At the same time, all the parts were removed to give everything a good clean.
Some parts were ordered from Banggood to get started on the radio: an Arduino Pro Mini to use as the receiver, an Arduino Leonardo for the transmitter, and a couple of NRF24L01 boards for the 2.4GHz radio link.

Connecting Arduino to NRF24L01 Boards

After receiving the Arduino boards, the first thing was to load new Blink programs. This confirmed that they worked and that I could program them. The only issue was that, for some reason, the Leonardo will only work as root. When it came to connecting the NRF modules to the Arduino, it was again the Leonardo that proved to be more challenging. In addition to the power lines, the NRFs require five extra connections. Three of those connections are not available on the standard header, but are available on the ICSP header. As the remaining two connections can be assigned to any digital pin, it is possible to have all of the Leonardo's 12 analogue input pins free to use. This would make it possible to connect up all the controls on the radio, if needed.

The diagram below shows how the NRF boards were connected for initial testing.

With this set-up, it has been possible to communicate between the boards from different locations in the house.

Software

The YouTube videos that started this mentioned a specific library for the RF modules. For no good reason, for this test the RadioHead library has been used.
Learnt so far
  • Getting the radios to work was easy.
  • The Leonardo is not quite as simple to use as the Pro Mini, mainly because the built-in USB has to be handled by the Arduino software in a different way.
  • Choosing the Leonardo for the transmitter will allow me to use all 12 analogue inputs.
  • Just because a NRF24L01 board has a plug for an aerial does not mean that it is the high power version!
Categories: LUG Community Blogs

Meeting at "The Moon Under Water"

Wolverhampton LUG News - Mon, 05/01/2015 - 13:32
Event date: Wednesday, 7 January 2015, 19:30 to 23:00

53-55 Lichfield St, Wolverhampton, West Midlands, WV1 1EQ

Eat, drink and talk Linux
Categories: LUG Community Blogs

Bring-A-Box / Pub meet, Epsom Downs, 10th Jan 2015

Surrey LUG - Sun, 04/01/2015 - 23:48
Start: 2015-01-10 12:00 End: 2015-01-10 12:00

We have regular sessions on the second Saturday of each month. To kick off the New Year we're meeting at the Tattenham Corner, Epsom: a commanding view of Epsom Downs, good food (a Beefeater pub), beer and a welcome whether you're a regular Surrey LUG enthusiast or a first-timer.

 

Categories: LUG Community Blogs

Steve Engledow (stilvoid): Keychain and GnuPG >= 2.1

Planet ALUG - Fri, 02/01/2015 - 15:48

A while ago, I started using keychain to manage my ssh and gpg agents. I did this with the following in my .bashrc

# Start ssh-agent
eval $(keychain --quiet --eval id_rsa)

Recently, arch updated gpg to version 2.1.1 which, as per the announcement, no longer requires the GPG_AGENT_INFO environment variable.

Unfortunately, tools like keychain don't know about that and still expect it to be set, leading to some annoying breakage.

My fix is a quick and dirty one; I appended the following to .bashrc

export GPG_AGENT_INFO=~/.gnupg/S.gpg-agent:$(pidof gpg-agent):1

:)

Categories: LUG Community Blogs

Steve Engledow (stilvoid): TODO

Planet ALUG - Fri, 02/01/2015 - 12:58

Here's this year's TODO diff as compared with last year.

New Year's Resolutions
  • Read even more (fiction and non-fiction)

  • Write at least one short story

  • Write some more games

  • Go horse riding

  • Learn some more turkish

  • Play a lot more guitar

    I did this but want to do more!

  • Lose at least a stone (in weight, from myself)

    I almost did this and then put it all back on again

  • Try to be less of a pedant (except when it's funny)

  • Try to be more funny ;)

    I'm sure I've achieved these ;)

  • Receive a lot less email

    In particular, unsubscribe from things I don't read

  • Blog more

    Particularly about technical subjects

  • Write more software

  • Release more software

  • Be a better husband and father

    I think I'm doing alright but I'm sure I can do better

  • Improve or replace the engine I use for my blog

Categories: LUG Community Blogs

Chris Lamb: Goals

Planet ALUG - Thu, 01/01/2015 - 21:36

Dr. Guy Winch:

We used to think that happiness is based on succeeding at our goals, but it turns out not so much.

Most marathon runners, for example — not professionals but the amateur runners — their high for completing the marathon usually disappears even before their nipples stop bleeding.

My point is that the high lasts for a very short amount of time. If you track where people's happiness and satisfaction is, it is in "getting" or in making progress towards our goals. It's a more satisfying, life-affirming, motivating and happy thing than actually reaching them.

So it's a great thing to keep in mind... Health, for example. When you define health as something you want to do, living your life and looking back on the week and saying "That was a healthy week for me: I worked out this number of times, I did this amount of push-ups, I ate reasonably most of the time..." that's very satisfying and that's where you'll feel happy about yourself. And if you're too focused on a scale for some reason, then that's an external thing that you'll hit... and then what?

So it is about creating goals that are longer lasting and really focusing on the journey because that's really where we get our happiness and our satisfaction. Once it's achieved... now what?

Categories: LUG Community Blogs

Chris Lamb: 2014: Selected highlights

Planet ALUG - Wed, 31/12/2014 - 17:23

Previously: 2012 & 2013.


January

Was lent a 15-course baroque lute.

February

Grandpa's funeral. In December he was posthumously awarded the Ushakov Medal (pictured) for his service in the Royal Navy's Arctic Convoys during the Second World War.

March

A lot of triathlon training but also got back into cooking.

April

Returned to the Cambridge Duathlon.

May

Raced 50 and 100 mile cycling time trials & visited the Stratford Olympic pool (pictured).

June

Ironman Austria.

July

Paced my sister at the Downtow-Upflow Half-marathon. Also released the first version of the Strava Enhancement Suite.

August

Visited Cornwall for my cousin's wedding (pictured). Another month for sport including my first ultramarathon and my first sub-20 minute 5k.

September

Entered a London—Oxford—London cycling brevet, my longest single ride to date (269 km). Also visited the Tour of Britain and the Sri Chinmoy 24-hour endurance race.

October

London—Paris—London cycling tour (588 km).

November

Performed Handel's Messiah in Kettering.

December

Left Thread.com.

Categories: LUG Community Blogs

MJ Ray: GPG Transition Statement

Planet ALUG - Tue, 30/12/2014 - 10:34

Rather late but I guess that just confirms it’s really me, right? The signed text and IDs should be at http://mjr.towers.org.uk/transition-statement.txt

Thank you if you help me out here. I'll re-sign keys in a while.

Categories: LUG Community Blogs

Steve Kemp: Reducing, or redirecting at least, charitable donations.

Planet HantsLUG - Tue, 30/12/2014 - 00:00

This is the time of year when there are lots of adverts shown on TV soliciting donations for charities, which frequently end with the two words "thank you".

I've always felt there were too many charities in the world, and that it was hard to half-heartedly give money to one charity this month, one the next, and still another next year. On that basis I decided long ago to give my money solely to three charities. If I had money that was spare, or I felt generous that month, I would give it to one of "my" charities. Any other appeals I just ignored (with minor exceptions for one-off events like tsunamis, etc).

I won't claim credit for this idea, it came directly from my mom who does the same thing. I've given money to the same three charities for twenty years now. Maybe not thousands, but hopefully enough to be useful. Certainly more than I'd have given if my donation were split between more recipients.

Now I'm changing. As of next year only one charitable organization will get my pennies. The other two haven't done anything bad, wrong, or failed/succeeded (sadly), but it feels better for me to stick to a single recipient.

Happy Christmas.

(Details shouldn't matter, but to answer the obvious question the charity I've kept is the RNLI.)

Categories: LUG Community Blogs

Martin A. Brooks: Elite Dangerous: Tips for explorers

Planet HantsLUG - Sun, 28/12/2014 - 17:45

If you’re not into, or don’t have the equipment to go bounty hunting, exploring in Elite Dangerous can be a lucrative way to make money.  Here are some tips for Commanders wanting to have a go.

You will need a detail scanner, which costs CR250,000. You can explore with just the basic scanner, but it's much harder. Assign the scanner to your secondary fire group. If you want to explore deeper space, then you'll need a fuel scoop too. You can get by without a scoop to start with, but it means you need to be careful not to stray too far from an inhabited system. It's a good idea to keep track of the last inhabited system you were at; if you start to run low on fuel, head back.

Pick a system where there’s no navigation information available, look for the red system information icon in the galaxy map. Systems with an actual name rather than a designation are likely inhabited.

As you jump into the system, whilst still in witchspace, set the throttle to zero. You’re going to arrive very close to the local primary and don’t want to get too hot.  As soon as you exit, press and hold secondary fire to charge the scanner, it takes a few seconds before firing.  The first object you’ll immediately pick up is the star, point the ship at it and hit your “target ahead” key.  You’ll see your ship start scanning it, this takes a little while.  If your detail scanner picks up any other nearby bodies, you’ll get an alert telling you.

Target each unexplored object in turn, working from closest to furthest away.   Anything within 5ls you can scan just by rotating your ship and pointing at it, however at some point you’ll need to start moving around the system.

Be careful near the star, it’s easy to overheat your ship.   If an object is on the other side of the star just point your ship away from the star to get some altitude and, when the heat levels decrease, gradually turn to get the target in your sights.

Learning to jockey the FSD’s autothrottle is essential.  If you keep overshooting objects then you’ll just waste time.  Once the object is in your sights, push the throttle forward until the power line turns blue and leave it there.  Your ship will now automatically accelerate and decelerate for you.  You’ll need to get each object within range to scan, the distance depends on the mass of the object.  Stars can be scanned from a long way out, gas giants from about 100ls, planets from around 20ls or 10ls and rock belts 5ls.

Not all systems contain planets, or they might be out of the range of your detail scanner. You can try looking round the sky for objects moving relative to the background; I don't bother and just move on to the next system.

Data you gather can be sold at any station provided you’ve travelled at least 20LY from where you got it.  The least you’ll get is a few hundred credits, however even a single star explored will usually net you around CR1200.   More interesting systems, with high metal content planets, will net you much more.  The highest I’ve seen to date is CR53000.  There seems no point in hoarding the data, the price doesn’t change regardless of distance.

Every so often (it seems to be about 1 in 20) you'll get an interdiction attempt. It's up to you whether you fight or flee, but if you lose your ship you will also lose any navigation data you have not sold. If you get something juicy, it's probably worth making a deliberate trip to an inhabited system to cash it in.

As you travel around a solar system, you’ll see blips on your scope marked Unidentified Signal Source.  If you investigate these then you may be lucky and it’s free cargo (albeit it will be marked stolen), or it may be dumped toxic waste, or it may be a trap.  The traps are not usually hard to evade if you don’t feel like a fight.  Stolen cargo can be offloaded at the black market, you might have to carry it around for a while before you find one.  I usually limit myself to investigating one USS per system, usually once I’m done with exploring.  If it turns out to be a trap then I just jump to the next system.

 

Categories: LUG Community Blogs