News aggregator

Andrew Savory: Multi media

Planet ALUG - Tue, 07/01/2014 - 22:39

My movie collection is a bit of a mishmash, a bunch of different file formats all sat on a Drobo. In the early days I would create AVI, MKV or MP4 rips of my DVDs depending on how and where I wanted to watch them. Sometimes the rips would be split across multiple files. More recently I just copied the DVD wholesale, for conversion later. As a result, ensuring a consistent set of files to copy onto my phone or tablet is a bit of a pain.

With the arrival of the RaspBMC media server, I decided to clean everything up. Some constraints I set:

  • I want to avoid loss of quality from source material (so no re-encoding if possible, only copying).
  • I should be able to do everything from the command line so it can be automated (manipulating video files can be a slow process even without encoding).
  • I want to combine multiple DVDs where possible for easier viewing.
  • My end goal is to have MKV files for most things.
Here’s what I’ve got working so far. Bug fixes and improvements welcome.

~

AVI files

You can glue AVI files together (concatenate them) and then run mencoder over the joined up file to fix up the indexes:

brew install mplayer
cat part1.avi part2.avi > tmp.avi && \
  /usr/local/bin/mencoder -forceidx -oac copy -ovc copy tmp.avi -o whole.avi

This forces mencoder to rebuild the index of the AVI, which allows players to seek through the file. It uses the “copy” audio and video codecs, i.e. no re-encoding, just stream copying.
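
As a quick sanity check (my own suggestion, not part of the original workflow), you can ask mplayer to identify the joined file without playing it; the ID_LENGTH it reports should roughly equal the two parts’ durations combined:

# identify only: no video/audio output, decode zero frames
/usr/local/bin/mplayer -vo null -ao null -identify -frames 0 whole.avi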

~

MKV files

MKV is Matroska, an open-source, open-standard video container format. The process is similar to AVI files, but the mkvmerge tool does everything for you:

brew install mkvtoolnix
/usr/local/bin/mkvmerge -o whole.mkv part1.mkv +part2.mkv

This takes the two parts and joins them together. Again, no re-encoding, just copying.
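
The + prefix tells mkvmerge to append a file to the previous one rather than mux it in as extra tracks, so a rip split across more discs (an illustrative example) is handled the same way:

/usr/local/bin/mkvmerge -o whole.mkv part1.mkv +part2.mkv +part3.mkv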

~

DVD rips

I started using RipIt to back up my DVDs; it can automatically encode DVDs, but once I got my Drobo I opted to keep the originals, so I always have the option to re-encode on a case-by-case basis for the target device without losing the best quality original.

I don’t need to touch most of the DVD copies, but a number of my DVDs are split across several discs, for example Starship Troopers and The Lord of the Rings.

One option would be to encode each DVD at the highest possible quality and then merge the AVI or MKV using the mechanisms above, but I want to avoid encoding if possible.

It turns out that the VOB files on a DVD are just MPEG files (see What’s on a DVD? for more details), so there’s no need to convert to AVI or MP4. We can glue them together as we did with the AVIs, then package them as MKV. The basic method is:

cat *.VOB > movie.vob

The problem is that we need to be selective about the VOB files that are included; there’s no point including DVD menu and setup screen animations, for example. A dirty hack might be to select only the VOB files bigger than a certain threshold size, and just hope that the movie is divided into logical chunks. Something like this, run in a movie directory:

find -s . -name '*.VOB' -size +50M

There’s a catch: the first VOB of each title set (VTS_XX_0.VOB) always contains a menu, so we need to skip those, and we don’t want the menu/copyright message either (VIDEO_TS.VOB):

find -s . \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' ! -iname 'VIDEO_TS.VOB' \) -size +50M 

We can then pipe the concatenated VOB data (the output of cat-ing our selected files) into ffmpeg, which copies it into an MKV file. So far we’re assuming we only want the first audio stream (usually English), and I haven’t investigated how best to handle subtitles yet. The command is:

ffmpeg -i - -vcodec copy -acodec copy foo.mkv

There are a couple of issues with this:

  • The timestamps in the concatenated stream are discontinuous at the joins, so ffmpeg needs to regenerate them (-fflags +genpts).
  • We should also name the output container explicitly (-f matroska) and carry across any subtitle streams (-c:s copy).

So our final command is:

find -s . \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' ! -iname 'VIDEO_TS.VOB' \) -size +50M -exec cat {} \; \
  | ffmpeg -fflags +genpts -i - -f matroska -vcodec copy -acodec copy -c:s copy foo.mkv

The output should be an MKV file roughly the same size as the constituent .dvdmedia directories. You can test it using mkvinfo foo.mkv, which should output information on the MKV file. For some reason, ‘file foo.mkv’ does not recognise it as an MKV file, only as data.

~

Putting it all together

Now that we know how to handle the individual file formats, we can script the whole process.

The next step is to trawl through a disk full of movies and normalise them into one format. At this point, we’re well into XKCD territory (The General Problem, and Is It Worth The Time?), so most of it is left as an exercise for the reader, but a rough starting point is sketched below.
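
Here’s a minimal sketch of what such a script might look like, assuming the Drobo layout described above; the $MOVIES path and the 50M threshold are illustrative guesses, not gospel:

#!/bin/bash
# Sketch: turn each ripped .dvdmedia directory into a single MKV using
# the stream-copying pipeline from the DVD rips section above.
MOVIES="/Volumes/Drobo/Movies"       # illustrative path, adjust to taste

for dir in "$MOVIES"/*.dvdmedia; do
    [ -d "$dir" ] || continue
    out="${dir%.dvdmedia}.mkv"
    [ -e "$out" ] && continue        # skip titles already converted
    find -s "$dir" \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' \
            ! -iname 'VIDEO_TS.VOB' \) -size +50M -exec cat {} \; \
        | ffmpeg -fflags +genpts -i - -f matroska \
            -vcodec copy -acodec copy -c:s copy "$out"
done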

~

Categories: LUG Community Blogs

Tony Whitmore: A message for fans of Sherlock (spoilers!)

Planet HantsLUG - Tue, 07/01/2014 - 21:36

For fans of Sherlock, I’d just like to make it clear that although I am a wedding photographer, I’m not a psycho murderer dude. And I almost never use flash during the daytime. I just do stuff like this…

These photos are from Stuart and Zoe’s fantastic wedding in Greece last autumn. I will be writing more about it soon!

Categories: LUG Community Blogs

Dick Turpin: I just knew you'd say that.

Planet WolvesLUG - Tue, 07/01/2014 - 11:18
Prospect: "I've had your email so we must be on your mailing list?" (No shit Sherlock!)
Me: "OK"
Prospect: "I'm interested in three of the Dell i5 machines, I see they are refurbished can you tell me the history of them and what has been done to them?"
Me: "I'm afraid not......................"

Most suppliers have between 20 and 20,000 units; they come from all sorts of places and for all sorts of reasons. Some need new parts, some just need a clean, and some are in pristine condition and need nothing doing to them. Basically I only trade in Grade A. The machines come into the supplier and are graded, tested and offered for sale with varying lengths of warranty. They don't have time to write a frigging bio on every unit.

If you want a brand spanking new PC then "Pay the fooking price"; if you don't want to pay full whack, then accept what you are buying.

Me: "So three units plus delivery plus VAT is £1285.00"
Prospect: "OK let me just look at that and come back to you."

Now how did I know you were going to say that?
Categories: LUG Community Blogs

Jono Bacon: Ubuntu Loco Team App Dev Schools – Volunteers Needed!

Planet WolvesLUG - Mon, 06/01/2014 - 23:37

2014 is going to be a great year for Ubuntu App Developers. We laid down some fantastic foundations in 2013, but this year we want to extend and grow our community in multiple directions…building a solid, empowered on-ramp for creating awesome apps for Ubuntu.

…but we can’t do this alone, we need your help!

One effort here is to work with our fantastic LoCo Team Community to run a series of Ubuntu App Developer schools across the world. We have one of the greatest advocacy communities anywhere, so this seems like a perfect match.

Fortunately, David Planella has already created some awesome slides and a good tutorial that these schools can work from (he did this for a previous event), and we are here to provide help and guidance about how to run an event.

As such, we are looking for volunteers to run a local Ubuntu App Dev school in your area. Doing this is as simple as:

  • Find a place to run an event and pick a date to run it.
  • Find some other folks in your LoCo who would be interested in helping.
  • Get the material and tune it for your event if needed.
  • Promote the event locally and encourage people to join.
  • Practice the material a few times before the big day, then show up, run the class and have fun.
  • Take lots of pictures!

The last step is really important as we would like to create a montage of the events.

So, if you are interested in participating, send an email to jono@ubuntu.com mentioning which LoCo team you are part of and where you would run the event, and let’s make the magic happen!

Categories: LUG Community Blogs

Meeting at "The Moon Under Water"

Wolverhampton LUG News - Mon, 06/01/2014 - 09:31


53-55 Lichfield St
Wolverhampton
West Midlands
WV1 1EQ

Eat, Drink and talk Linux

Event Date and Time:  Wed, 08/01/2014 - 19:30 - 23:00
Categories: LUG Community Blogs

Steve Kemp: A beginning is a very delicate time.

Planet HantsLUG - Mon, 06/01/2014 - 08:30

Recently I wrote about docker; after a brief diversion into using runit for service management, I then wrote about it some more.

I'm currently setting up a new PXE-boot environment which uses docker for serving DHCP and TFTP, which is my first "real" usage of any note. It is fun, although I now discover I'm not alone in using docker for this purpose.

Otherwise life is good, and my blog-spam detection service recently broke through the 11 million-rejected-comment barrier. The Wordpress Plugin is seeing a fair amount of use, which is encouraging - but more reviews would be nice ;)

I could write about work, as I've not done that since changing job, but I'm waiting for something disruptive to happen first...

ObQuote: Dune. (film)

Categories: LUG Community Blogs

Adam Trickett: Bog Roll: Dithering

Planet HantsLUG - Sun, 05/01/2014 - 14:52

My desktop boxen are getting on. Over a year ago I started to think about replacing at least one of them: Twin Dilema. The boxen are even older now, really feeling the strain, and I've still not done anything about it...

This autumn I ripped the hard disk out of an old Sky+ box I had and put it into my desktop PC. It is faster and larger than the original drive and as a result has brought some life back into my PC. However, the writing is on the wall and it will need replacing this year.

Since I last thought about this, I have managed to get the gas boiler replaced and have the ancient windows done. That cost a fortune, a lot more than the £1k of a decent desktop, but I will get some of that back in reduced energy bills. It looks like a ~10% reduction at the moment. If you assume 10% annual energy-price inflation, that adds up to a total saving over 30 years of around £16.5k, which covers the cost of the boiler but not the windows and doors.
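
As a rough check of that figure (the ~£1,000 annual bill is my assumption, not a number from the post): a £100 first-year saving growing at 10% a year is a geometric series, which bc can sum:

# assumes a ~£1,000/yr bill -> £100 first-year saving, 10% growth, 30 years
echo '100 * (1.1^30 - 1) / 0.1' | bc -l    # ≈ 16449, i.e. roughly £16.5k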

Categories: LUG Community Blogs

Brett Parker (iDunno): Wow, I do believe Fasthosts have outdone themselves...

Planet ALUG - Sat, 04/01/2014 - 11:24

So, got a beep this morning from our work monitoring system. One of our customers' domain names is hosted with livedns.co.uk (which, as far as I can tell, is part of the Fasthosts franchise)... It appears that Fasthosts have managed to entirely break their DNS:

brettp@laptop:~$ host www.fasthosts.com
;; connection timed out; no servers could be reached
brettp@laptop:~$ whois fasthosts.com | grep -i "Name Server"
Name Server: NS1.FASTHOSTS.NET.UK
Name Server: NS2.FASTHOSTS.NET.UK
Name Server: NS1.FASTHOSTS.NET.UK
Name Server: NS2.FASTHOSTS.NET.UK
brettp@laptop:~$ whois fasthosts.net.uk | grep -A 2 "Name servers:"
    Name servers:
        ns1.fasthosts.net.uk      213.171.192.252
        ns2.fasthosts.net.uk      213.171.193.248
brettp@laptop:~$ host -t ns fasthosts.net.uk 213.171.192.252
;; connection timed out; no servers could be reached
brettp@laptop:~$ host -t ns fasthosts.net.uk 213.171.193.248
;; connection timed out; no servers could be reached
brettp@laptop:~$

So, that's Fasthosts' core nameservers not responding. Good start! They also provide livedns.co.uk, so let's have a look at that:

brettp@laptop:~$ whois livedns.co.uk | grep -A 3 "Name servers:"
    Name servers:
        ns1.livedns.co.uk         213.171.192.250
        ns2.livedns.co.uk         213.171.193.250
        ns3.livedns.co.uk         213.171.192.254
brettp@laptop:~$ host -t ns ns1.livedns.co.uk 213.171.192.250
;; connection timed out; no servers could be reached
brettp@laptop:~$ host -t ns ns1.livedns.co.uk 213.171.193.250
;; connection timed out; no servers could be reached
brettp@laptop:~$ host -t ns ns1.livedns.co.uk 213.171.192.254
;; connection timed out; no servers could be reached

So, erm, apparently that's all their DNS servers "Not entirely functioning correctly"! That's quite impressive!

Categories: LUG Community Blogs

Crontab header

Planet SurreyLUG - Sat, 04/01/2014 - 10:47

Very early on in my Linux life, I came across this suggested header for crontab and I’ve used it ever since. So much so that I am always slightly thrown when I come across a crontab without it! No, you don’t need it; yes, the standard commented header works just fine; but if, like me, you prefer things neatly lined up, then this might suit you:

MAILTO=
# _________________________ 1. Minute - Minutes after the hour (0-59)
# |
# |    ______________________ 2. Hour - 24-hour format (0-23)
# |    |
# |    |    ___________________ 3. Day - Day of the month (1-31)
# |    |    |
# |    |    |    ________________ 4. Month - Month of the year (1-12)
# |    |    |    |
# |    |    |    |    _____________ 5. Weekday - Day of the week (0-6, 0 indicates Sunday)
# |    |    |    |    |
#_|____|____|____|____|___Command__________________________________________________________

And if you recognise this as yours, then thank you!
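
For illustration, a hypothetical entry lines up under the header like so (backup.sh is made up):

# e.g. a (hypothetical) backup job, 02:30 every Monday:
  30   2    *    *    1    /usr/local/bin/backup.sh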


Categories: LUG Community Blogs

Dick Turpin: The whole Internet is very slow.

Planet WolvesLUG - Fri, 03/01/2014 - 15:42
Customer: "Hi Pete, any chance you can look at our Internet for us? We could do with it being a bit faster!"
Me: "Hang on a moment while I just turn your dial up a couple of notches."

Happy New Year.
Categories: LUG Community Blogs

Jono Bacon: Ubuntu In 2014

Planet WolvesLUG - Thu, 02/01/2014 - 22:56

Happy new year, friends!

2013 was a phenomenal year for Ubuntu. It is difficult to believe that it was just a year ago today that we announced Ubuntu for phones. Since then we have built and released the first version of Ubuntu for phones complete with core apps, delivered Mir in production on the phone, built a vastly simplified and more powerful new app delivery platform complete with full security sand-boxing, created a powerful smart scopes service to bring the power of native search and online content to devices, delivered a new SDK with support for QML, HTML5, and Scopes, built an entirely new developer.ubuntu.com, created extensive CI and testing infrastructure to ensure quality as we evolve our platform, shipped two desktop releases, extended the charm store, delivered Juju Gui, spun up multiple clouds with Juju, and much more.

In terms of Ubuntu for devices, I mentally picture 2013 as the year when we put many of the core foundational pieces in place. Everything I just mentioned was a huge and significant piece of delivering a world-class Free Software convergence platform. Building this platform is not as simple as building a sexy GUI; there is lots of complex foundational work that needs doing, and I am incredibly proud of everyone who participated in getting us to where we are today…it is a true testament to collaborative development involving many communities and contributors from around the world.

So, 2013 was an intense year with lots of work, some tough decisions, and lots of late (and sometimes stressful) nights, but it laid down the core pillars of what our future holds. But what about 2014?

This time next year we will have a single platform code-base for phone, tablet, and desktop that adapts to harness the form-factor and power of each device it runs on. This is not just the aesthetics of convergence, it is real convergence at the code level. This will be complemented by an Ubuntu SDK in which you can write an app once and deliver it to any of these devices, and an eco-system in which you can freely publish or sell apps, content, and more with a powerful set of payment tools.

These pieces will appear one phase at a time throughout 2014. We are focusing on finishing the convergent pieces on phone first, then bringing them to tablet, and then finally bringing our desktop over to the new convergent platform. Every piece of new technology that we built in 2013 will be consumed across all of these form-factors in 2014; every line of code is an investment in our future.

Even more importantly though, 2014 will be the year when we see this new era of Ubuntu convergence shipping to consumers. This will open up Ubuntu to millions of additional users, provide an opportunity for app developers to get in on the ground floor in delivering powerful apps, and build more opportunity for our community than ever before.

I wish I could tell you that 2014 is going to be more relaxing than 2013. It isn’t. It is going to be a roller-coaster. There are going to be some late nights, some stressful times, some shit-storms, and some unnecessary politics, but my goal is to help keep us working together as a community, keep us focused on the bigger picture, keep our discourse constructive, and to keep the fun in Ubuntu.

Let’s do this.

Categories: LUG Community Blogs

FreeNX NX Startup Session Failed

Planet SurreyLUG - Thu, 02/01/2014 - 11:52

Occasionally users are unable to connect to our FreeNX server; they report an error “Startup Session Failed”. Clicking on “Detail” shows that it is unable to find the server session file.

Searching for solutions suggested a number of options, including removing the server /tmp/.X1***-lock files, or simply removing FreeNX and installing NoMachine’s NXServer instead.

In the end the solution proved remarkably simple:

On the server run:

# nxserver --list
NX> 100 NXSERVER - Version 1.5.0-60 OS (GPL)
NX> 127 Sessions list:

Display Username        Remote IP       Session ID
------- --------------- --------------- --------------------------------
1001    chris           192.168.1.52    1BC6B4B9C3CF4C2B6BD7137AC7FDE5DA
1000    helen           -               87443D16622EC0751551685A93DD023B
1002    michelle        192.168.1.102   DBAC430C8AA8B414A5E2228970E2BBDC

NX> 999 Bye

The session with no remote IP is for a session that has not ended properly. Terminate this session and users should be able to log in once again:

# nxserver --terminate helen

Update: This problem was a little more complex than I first thought; all I had really done was allow one more user to log in before the issue re-occurred. You also need to check that there are no old lock files in /tmp/.X11-unix:

# cd /tmp/.X11-unix
# ls -al
total 316
drwxrwxrwt   2 root     root       4096 2014-01-02 12:48 .
drwxrwxrwt 375 root     root     315392 2014-01-02 12:48 ..
srwxrwxrwx   1 root     root          0 2013-07-24 17:04 X0
srwxrwxrwx   1 helen    helen         0 2014-01-02 10:32 X1000
srwxrwxrwx   1 chris    chris         0 2014-01-02 11:42 X1001
srwxrwxrwx   1 michelle michelle      0 2014-01-02 08:52 X1002
srwxrwxrwx   1 terry    terry         0 2013-09-05 15:05 X1003

Notice that there is a .X11-unix socket for terry (X1003), but no corresponding user shown in nxserver --list above. Remove this spurious file and it should now work.
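
To spot these more quickly, a small script can cross-check the sockets against the active sessions. This is a rough sketch of my own, not from the original procedure; it assumes the nxserver --list output format shown above, and it only reports, never deletes:

#!/bin/bash
# Report X11 sockets in /tmp/.X11-unix whose display number has no
# matching session in `nxserver --list`. Review before removing anything.
active=$(nxserver --list | awk '/^[0-9]+/ {print $1}')

for sock in /tmp/.X11-unix/X1*; do
    [ -e "$sock" ] || continue
    disp=${sock##*/X}                  # e.g. /tmp/.X11-unix/X1003 -> 1003
    if ! grep -qx "$disp" <<< "$active"; then
        echo "stale socket: $sock (display $disp)"
    fi
done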

It would also make sense to ensure that the correct /tmp/.X1***-lock files are present:

# cd /tmp
# ls -al | grep -i 'X.*-lock'
-rw-------  1 nx       nx        0 2014-01-02 10:32 .nX1000-lock
-rw-------  1 nx       nx        0 2014-01-02 11:42 .nX1001-lock
-rw-------  1 nx       nx        0 2014-01-02 08:52 .nX1002-lock
-r--r--r--  1 helen    helen    11 2014-01-02 10:32 .X1000-lock
-r--r--r--  1 chris    chris    11 2014-01-02 11:42 .X1001-lock
-r--r--r--  1 michelle michelle 11 2014-01-02 08:52 .X1002-lock

You should expect to see the correct number of lock files for your user sessions. In my case these required no changes, but if there had been spurious files, removing them would have seemed sensible.

If this has been helpful to you, please do consider rating this post or adding a comment!


Categories: LUG Community Blogs

John Woodard: A year in Prog!

Planet ALUG - Wed, 01/01/2014 - 20:56

It's New Year's Day 2014 and I'm reflecting on the music of the past year. Album-wise there were several okay...ish releases in the world of Progressive Rock. Steven Wilson's The Raven That Refused To Sing was not the absolute masterpiece some have eulogised; a solid effort, though it did contain some filler. Motorpsycho entertained with Still Life With Eggplant, not as good as their previous album but again a solid effort. Magenta as ever didn't disappoint with The 27 Club; wishing Tina Booth a swift recovery from her ill health.

For me, the three stand-out albums, in no particular order, were Edison's Children's Final Breath Before November, which almost made it as album of the year, and Big Big Train with English Electric Full Power, which combined last year's Part One and this year's Part Two with some extra goodies to make the whole greater than the sum of the parts. Also, Adrian Jones of Nine Stones Close fame pulled one out of the bag with his side project Jet Black Sea, which was very different and a challenging listen, hard going at first but surprisingly very good. This man is one superb guitarist, especially if you like emotion wrung out of the instrument like David Gilmour or Steve Rothery.

The moniker of Album of the Year goes to Fish for the incredible Feast of Consequences, a real return to form and his best work since Raingods With Zippos. The packaging of the deluxe edition, with a splendid book featuring the wonderful artwork of Mark Wilkinson, was superb. A real treat, with a very thought-provoking suite about the First World War that really hammered home the saying "Lest we forget". A fine piece that needs to be heard every November 11th.


Gig-wise, Fish at the Junction in Cambridge was again great. His voice may not be what it was in 1985 but he is the consummate performer, very at home on the stage. As a raconteur between songs he is every bit as entertaining as when singing the songs themselves.

The March Marillion Convention in Port Zealand, Holland, where they performed their masterpiece Brave, was very special, as every performance of that incredible album is. The Marillion Conventions are always special, but Brave made this one even more so.
Gig of the year goes again to Marillion, at Aylesbury Friars in November. I had waited thirty years and forty-odd shows to see them perform Garden Party segued into Market Square Heroes, and that glorious night it came to pass. I am now one very happy Progger, or should that be Proggie? Never mind: Viva Progressive Rock!
Categories: LUG Community Blogs

Andy Smith: Yearly (Linux) photo management quandary

Planet HantsLUG - Wed, 01/01/2014 - 14:19

Here we are again, another year, another dissatisfying look at what options I have for local photo management.

Here’s what I do now:

  • Photos from our cameras and my phone are imported using F-Spot on my desktop computer in the office, to a directory tree that resides over NFS on a fileserver, where they will be backed up.
  • Tagging etc. happens on the desktop computer.
  • For quick viewing of a few images, if I know the date they were taken on, I can find them in the directory structure because it goes like Photos/2014/01/01/blah.jpg. The NFS mount is available on every computer in the house that can do NFS (e.g. laptops).
  • For more involved viewing that will require searching by tag or other metadata, i.e. that has to be done in F-Spot, I have to do it on the desktop computer in the office, because that is the only place that has the F-Spot database. So I either do it there, or I have to run F-Spot over X11 forwarding on another machine (slow and clunky!).

The question is how to improve that experience?

I can’t run F-Spot on multiple computers because it stores its SQLite database locally and even if the database file were synced between hosts or kept on the fileserver it would still need the exact same version of F-Spot on every machine, which is not feasible — my laptop and desktop already run different releases of Ubuntu and I want to continue being able to do that.

It would be nice to be able to import photos from any machine but I can cope with it having to be done from the desktop alone. What isn’t acceptable is only being able to view them from the desktop as well. And when I say view I mean be able to search by tags and metadata, not just navigate a directory tree.

It sounds like a web application is needed, to enforce the single point of truth for tags and metadata. Are there actually any good ones that you can install yourself though? I’ve used Gallery before and was never really satisfied with ease of use or presentation.

Your-Photos-As-A-Service providers like Flickr and even to some extent Google+ and Facebook have quite nice interfaces, but I worry about spending many hours adding tags and metadata, not bothering to back it all up, and then one day the service shuts down or changes in ways I don’t like.

I’m normally quite good about backing things up but the key to backups is to make them easy and automatic. From what I can see these service providers either don’t provide a backup facility or else it’s quite inconvenient, e.g. click a bunch of times, get a zip file of everything. Ain’t nobody got time for that, as a great philosopher once wrote.

So.. yeah.. What do you do about it?

Categories: LUG Community Blogs