News aggregator

Jonathan McDowell: Timezones + static blog generation

Planet ALUG - Sun, 18/12/2016 - 23:28

So, it turns out that when you move to static blog generation and do the generation on your laptop, which is usually set to whatever timezone you’re currently physically located in, it can cause URLs to change. That’s especially true if you’re prone to blogging late at night, when even just a shift to DST can change things. I’ve forced Jekyll to UTC by adding timezone: 'UTC' to the config, and ensured all the posts now have timezones for when they were written (a lot of the imported ones didn’t), so hopefully things should be stable from here on.
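The two changes described above look something like this in Jekyll; the timezone key is real Jekyll configuration, while the post title and date shown in the front matter are illustrative:

```yaml
# _config.yml: generate in UTC regardless of the laptop's local timezone
timezone: 'UTC'
```

```yaml
# Front matter of an individual post (illustrative values): the date now
# carries an explicit UTC offset, so the generated URL cannot shift with DST.
---
title: "Timezones + static blog generation"
date: 2016-12-18 23:28:00 +0000
---
```

With both in place, the date used to build a post's URL no longer depends on where the laptop happens to be.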

Categories: LUG Community Blogs

@Yahoo account deleted. Just not worth it any more.

Planet SurreyLUG - Fri, 16/12/2016 - 13:30

@Yahoo account deleted.  Just not worth it any more.

Goodbye!

The post @Yahoo account deleted. Just not worth it any more. appeared first on dowe.io.

Categories: LUG Community Blogs

Steve Kemp: A simple Perl alternative to storing data in Redis

Planet HantsLUG - Thu, 15/12/2016 - 22:00

I continue to be a big user of Perl, and for many of my sites I avoid the use of MySQL which means that I largely store data in flat files, SQLite databases, or in memory via Redis.

One of my servers was recently struggling with RAM, and the surprising cause was "too much data" in Redis. (Surprising because I'd not been paying attention and seen how popular it had become, and also because ASCII text compresses pretty well.)

Read/Write speed isn't a real concern, so I figured I'd move the data into an SQLite database, but that would require rewriting the application.

The client library for Perl is pretty awesome, and simple usage looks like this:

# Connect to localhost.
my $r = Redis->new();

# Simple storage
$r->set( "key", "value" );

# Work with sets
$r->sadd( "fruits", "orange" );
$r->sadd( "fruits", "apple" );
$r->sadd( "fruits", "blueberry" );
$r->sadd( "fruits", "banannanananananarama" );

# Show the set-count
print "There are " . $r->scard( "fruits" ) . " known fruits\n";

# Pick a random one
print "Here is a random one " . $r->srandmember( "fruits" ) . "\n";

I figured that if I ignored the Lua support and the other more complex operations, creating a compatible API implementation wouldn't be too hard. So rather than porting my application to use SQLite directly, I could just use a different client library.

In short, I changed this:

use Redis;
my $r = Redis->new();

To this:

use Redis::SQLite;
my $r = Redis::SQLite->new();

And everything continues to work. I've implemented all the set-related functions except one, and a random smattering of the other simple operations.
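The post doesn't show the wrapper's internals, but the core idea of backing Redis-style set operations with an SQLite table can be sketched as follows. This is my own illustration in Python rather than the Perl of Redis::SQLite, and the schema is an assumption:

```python
import random
import sqlite3

class SetStore:
    """Illustrative sketch: Redis-like set operations backed by SQLite."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        # One row per (set name, member); the UNIQUE constraint gives set semantics.
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS sets ("
            " name TEXT NOT NULL, member TEXT NOT NULL,"
            " UNIQUE (name, member))"
        )

    def sadd(self, name, member):
        # INSERT OR IGNORE silently drops duplicates, matching Redis SADD.
        self.db.execute(
            "INSERT OR IGNORE INTO sets (name, member) VALUES (?, ?)",
            (name, member),
        )

    def scard(self, name):
        (count,) = self.db.execute(
            "SELECT COUNT(*) FROM sets WHERE name = ?", (name,)
        ).fetchone()
        return count

    def smembers(self, name):
        rows = self.db.execute(
            "SELECT member FROM sets WHERE name = ?", (name,)
        ).fetchall()
        return {member for (member,) in rows}

    def srandmember(self, name):
        return random.choice(sorted(self.smembers(name)))

store = SetStore()
for fruit in ("orange", "apple", "blueberry"):
    store.sadd("fruits", fruit)
store.sadd("fruits", "apple")   # duplicates are ignored, as in Redis
print(store.scard("fruits"))    # 3
```

A UNIQUE constraint plus INSERT OR IGNORE is essentially all it takes to get set semantics out of a relational table, which is why an API-compatible swap is plausible.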

The appropriate test-cases in the Redis client library (i.e. with all references to things I didn't implement removed) pass, and my own new tests also give me confidence.

It's obviously not a hard job, but it was a quick solution to a real problem and might be useful to others.

My image hosting site and my markdown sharing site both now use this wrapper, and seem to be performing well - but with more free RAM.

No doubt I'll add more of the simple primitives as time goes on, but so far I've done enough to be useful.

Categories: LUG Community Blogs

Jonathan McDowell: No longer a student. Again.

Planet ALUG - Mon, 12/12/2016 - 22:27

(image courtesy of XKCD)

Last week I graduated with a Masters in Legal Science (now taught as an MLaw) from Queen’s University Belfast. I’m pleased to have achieved a Distinction, as well as an award for Outstanding Achievement in the Dissertation (which was on the infringement of privacy by private organisations due to state-mandated surveillance and retention laws - pretty topical given the unfortunate introduction of the Investigatory Powers Act 2016). However, as previously stated, I had made the decision that I was happier building things, and wanted to return to the world of technology. I looked at a bunch of interesting options, got to various stages in the hiring process with each of them, and happily accepted a role with Titan IC Systems, which started at the beginning of September.

Titan have produced a hardware accelerated regular expression processor (hence the XKCD reference); the RXP in its FPGA variant (which is what I get to play with) can handle pattern matching against 40Gb/s of traffic. Which is kinda interesting, as it lends itself to a whole range of applications, from network scanning to data mining to, well, anything where you want to sift through a large amount of data checking it against a large number of rules. However it’s brand new technology for me to get up to speed with (plus getting back into a regular working pattern rather than academentia), and the combination of that and spending most of the summer post-DebConf wrapping up the dissertation has meant I haven’t had as much time to devote to other things as I’d have liked. However I’ve a few side projects at various stages of completion and will try to manage more regular updates.

Categories: LUG Community Blogs

KITT upgrade to 2016 MBPs

Planet SurreyLUG - Wed, 07/12/2016 - 15:49

Turn Your MacBook Pro Into Kitt From Knight Rider, Compute Like David Hasselhoff.

I finally see the point.

http://gizmodo.com/turn-your-macbook-pro-into-kitt-from-knight-rider-comp-1789382596

Categories: LUG Community Blogs

Indieweb: wish I could do more for you.

Planet SurreyLUG - Wed, 07/12/2016 - 15:14

Despite previous posts advocating the indieweb, sadly I need to trim down the number of WordPress plugins I run.  This is mainly due to seeing a lot more traffic on my site recently, and not having the time or resources to optimise the plugin code running on my virtual server.  I found that the number of plugins on my site (around 48) was really starting to hamper performance.

So it’s with regret that I step out of the indieweb sharing platform, by removing all associated plugins from my WordPress.  Despite being in full agreement with the indieweb mantra of owning one’s own data, I do find some satisfaction and convenience in using WordPress.com’s own tools to do the same job now.  To some extent, they have embraced providing a richer, more social experience through WordPress sites – whether hosted by them, or by “us”.

My only regret is that I couldn’t contribute to the project, the principles of which I wholly believe in and support – if only on an intellectual level.

Good luck Indieweb!

Categories: LUG Community Blogs

Chris Lamb: Free software activities in November 2016

Planet ALUG - Wed, 30/11/2016 - 21:18

Here is my monthly update covering what I have been doing in the free software world (previous month):

  • Started work on a Python API to the UK Postbox mail scanning and forwarding service. (repo)
  • Lots of improvements to buildinfo.debian.net, my experiment into how to process, store and distribute .buildinfo files after the Debian archive software has processed them, including making GPG signatures mandatory (#7), updating jenkins.debian.net to sign them and moving to SSL.
  • Improved the Django client to the KeyError error tracking software, enlarging the test coverage and additionally adding support for grouping errors using a context manager.
  • Made a number of improvements to travis.debian.net, my hosted service for projects that host their Debian packaging on GitHub to use the Travis CI continuous integration platform to test builds on every code change:
    • Install build-dependencies with debugging output. Thanks to @waja. (#31)
    • Install Lintian by default. Thanks to @freeekanayaka. (#33).
    • Call mktemp with --dry-run to avoid having to delete it later. (commit)
  • Submitted a pull request to Wheel (a utility to package Python libraries) to make the output of METADATA files reproducible. (#73)
  • Submitted some miscellaneous documentation updates to the Tails operating system. (patches)
Reproducible builds

Whilst anyone can inspect the source code of free software for malicious flaws, most software is distributed pre-compiled to end users.

The motivation behind the Reproducible Builds effort is to permit verification that no flaws have been introduced — either maliciously or accidentally — during this compilation process by promising identical results are always generated from a given source, thus allowing multiple third-parties to come to a consensus on whether a build was compromised.
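As a toy illustration (mine, not part of the project's tooling) of why builds fail to reproduce: embedding the current time in the output makes every build unique, while pinning the embedded time, as the SOURCE_DATE_EPOCH convention does, restores bit-for-bit identical results:

```python
import time

def build(source, timestamp=None):
    """Toy 'build' step: bundle the source with a build timestamp."""
    ts = time.time() if timestamp is None else timestamp
    return f"built at {ts}\n{source}"

# Two builds of identical source differ, because the embedded timestamp differs.
first = build("hello")
time.sleep(0.01)
second = build("hello")
print(first == second)  # False: not reproducible

# Pinning the timestamp (cf. the SOURCE_DATE_EPOCH convention) makes the
# outputs identical, so independent third parties can verify the artefact.
epoch = 1480000000
print(build("hello", epoch) == build("hello", epoch))  # True: reproducible
```

Timestamps are only one source of non-determinism (file ordering and locales are others), but the verification principle is the same: same input, bit-for-bit same output.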


This month:


My work in the Reproducible Builds project was also covered in our weekly reports (#80, #81, #82, #83).


Toolchain issues

I submitted the following patches to fix reproducibility-related toolchain issues with Debian:


strip-nondeterminism

strip-nondeterminism is our tool to remove specific non-deterministic results from a completed build.


jenkins.debian.net

jenkins.debian.net runs our comprehensive testing framework.

  • buildinfo.debian.net has moved to SSL. (ac3b9e7)
  • Submit signing keys to keyservers after generation. (bdee6ff)
  • Various cosmetic changes, including
    • Prefer if X not in Y over if not X in Y. (bc23884)
    • No need for a dictionary; let's just use a set. (bf3fb6c)
    • Avoid DRY violation by using a for loop. (4125ec5)
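On the first of those cosmetic points: the two spellings behave identically in Python, but if X not in Y reads as the single membership test it is, which is why it's the preferred idiom (my illustrative snippet):

```python
y = {"alpha", "beta"}
x = "gamma"

# `not x in y` parses as `not (x in y)`, so the two forms are equivalent in
# behaviour; `x not in y` just says the same thing with one operator.
preferred = x not in y
discouraged = not x in y
print(preferred, discouraged)  # True True
```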

I also submitted 9 patches to fix specific reproducibility issues in apktool, cairo-5c, lava-dispatcher, lava-server, node-rimraf, perlbrew, qsynth, tunnelx & zp.

Debian
Debian LTS

This month I have been paid to work 11 hours on Debian Long Term Support (LTS). In that time I did the following:

  • "Frontdesk" duties, triaging CVEs, etc.
  • Issued DLA 697-1 for bsdiff fixing an arbitrary write vulnerability.
  • Issued DLA 705-1 for python-imaging correcting a number of memory overflow issues.
  • Issued DLA 713-1 for sniffit where a buffer overflow allowed a specially-crafted configuration file to provide a root shell.
  • Issued DLA 723-1 for libsoap-lite-perl preventing a Billion Laughs XML expansion attack.
  • Issued DLA 724-1 for mcabber fixing a roster push attack.
Uploads
  • redis:
    • 3.2.5-2 — Tighten permissions of /var/{lib,log}/redis. (#842987)
    • 3.2.5-3 & 3.2.5-4 — Improve autopkgtest tests and install upstream's MANIFESTO and README.md documentation.
  • gunicorn (19.6.0-9) — Adding autopkgtest tests.
  • libfiu:
    • 0.94-1 — Add autopkgtest tests.
    • 0.95-1, 0.95-2 & 0.95-3 — New upstream release and improve autopkgtest coverage.
  • python-django (1.10.3-1) — New upstream release.
  • aptfs (0.8-3, 0.8-4 & 0.8-5) — Adding and subsequently improving the autopkgtest tests.


I performed the following QA uploads:



Finally, I also made the following non-maintainer uploads:

  • libident (0.22-3.1) — Move from obsolete Source-Version substvar to binary:Version. (#833195)
  • libpcl1 (1.6-1.1) — Move from obsolete Source-Version substvar to binary:Version. (#833196)
  • pygopherd (2.0.18.4+nmu1) — Move from obsolete Source-Version substvar to ${source:Version}. (#833202)
Debian bugs filed RC bugs

I also filed 59 FTBFS bugs against arc-gui-clients, asyncpg, blhc, civicrm, d-feet, dpdk, fbpanel, freeciv, freeplane, gant, golang-github-googleapis-gax-go, golang-github-googleapis-proto-client-go, haskell-cabal-install, haskell-fail, haskell-monadcatchio-transformers, hg-git, htsjdk, hyperscan, jasperreports, json-simple, keystone, koji, libapache-mod-musicindex, libcoap, libdr-tarantool-perl, libmath-bigint-gmp-perl, libpng1.6, link-grammar, lua-sql, mediatomb, mitmproxy, ncrack, net-tools, node-dateformat, node-fuzzaldrin-plus, node-nopt, open-infrastructure-system-images, photofloat, ppp, ptlib, python-mpop, python-mysqldb, python-passlib, python-protobix, python-ttystatus, redland, ros-message-generation, ruby-ethon, ruby-nokogiri, salt-formula-ceilometer, spykeviewer, sssd, suil, torus-trooper, trash-cli, twisted-web2, uftp & wide-dhcpv6.

FTP Team

As a Debian FTP assistant I ACCEPTed 70 packages: bbqsql, coz-profiler, cross-toolchain-base, cross-toolchain-base-ports, dgit-test-dummy, django-anymail, django-hstore, django-html-sanitizer, django-impersonate, django-wkhtmltopdf, gcc-6-cross, gcc-defaults, gnome-shell-extension-dashtodock, golang-defaults, golang-github-btcsuite-fastsha256, golang-github-dnephin-cobra, golang-github-docker-go-events, golang-github-gogits-cron, golang-github-opencontainers-image-spec, haskell-debian, kpmcore, libdancer-logger-syslog-perl, libmoox-buildargs-perl, libmoox-role-cloneset-perl, libreoffice, linux-firmware-raspi3, linux-latest, node-babel-runtime, node-big.js, node-buffer-shims, node-charm, node-cliui, node-core-js, node-cpr, node-difflet, node-doctrine, node-duplexer2, node-emojis-list, node-eslint-plugin-flowtype, node-everything.js, node-execa, node-grunt-contrib-coffee, node-grunt-contrib-concat, node-jquery-textcomplete, node-js-tokens, node-json5, node-jsonfile, node-marked-man, node-os-locale, node-sparkles, node-tap-parser, node-time-stamp, node-wrap-ansi, ooniprobe, policycoreutils, pybind11, pygresql, pysynphot, python-axolotl, python-drizzle, python-geoip2, python-mockupdb, python-pyforge, python-sentinels, python-waiting, pythonmagick, r-cran-isocodes, ruby-unicode-display-width, suricata & voctomix-outcasts.

I additionally filed 4 RC bugs for incomplete debian/copyright files against node-cliui, node-core-js, node-cpr & node-grunt-contrib-concat.

Categories: LUG Community Blogs

Report of last meeting and a question

West Yorkshire LUG News - Mon, 28/11/2016 - 14:59

It was a good evening – some varied chat. Some around the i3 window manager (http://i3wm.org/), which two folks actively run and another had at least tried. Some container/docker chatter, and some talk around git usage and commit message ‘style’. David, with reference to that last one: the kernel community have a good set of guidelines they adhere to fairly strictly (that is, if your commit does not follow the rules it pretty much isn’t getting merged into the codebase). A good overview can be found in the kernel source tree docs:
https://git.kernel.org/cgit/linux/kernel/git/torvalds/linux.git/tree/Documentation/SubmittingPatches

Some of that is kernel-specific, and some possibly a little outdated (iirc there is no mention of git send-email, and maybe no git format-patch either), and the workflow is email based, while many of us are probably using services such as GitHub, which have a somewhat different (PR-based) workflow – but the essence of how to create a good commit message and organise patches still applies.

And then on to the December meet – the last Thursday would be the 29th of December, and I think we agreed it was looking, shall we say, ‘non-optimal’ for turnout. Thus, right now it is not clear what we will do in December, if anything. There may be an impromptu ‘get together for a beer’, maybe in town. Open for ideas and thoughts here…

Graham

Andy Smith: Supermicro SATA DOM flash devices don’t report lifetime writes correctly

Planet HantsLUG - Sat, 26/11/2016 - 16:43

I’m playing around with a pair of Supermicro SATA DOM flash devices at the moment, evaluating them for use as the operating system storage for servers (as opposed to where customer data goes).

They’re flash devices with a limited write endurance. The smallest model (16GB), for example, is good for 17TB of writes. Therefore it’s important to know how much you’ve actually written to it.

Many SSDs and other flash devices expose the total amount written through the SMART attribute 241, Total_LBAs_Written. The SATA DOM devices do seem to expose this attribute, but right now they say this:

$ for dom in $(sudo lsblk --paths -d -o NAME,MODEL --noheadings | awk '/SATA SSD/ { print $1 }')
do
    echo -n "$dom: "
    sudo smartctl -A "$dom" | awk '/^241/ { print $10 * 512 * 1.0e-9, "GB" }'
done
/dev/sda: 0.00856934 GB
/dev/sdb: 0.00881715 GB

This being after install and (as of now) more than a week of uptime, ~9MB of lifetime writes isn’t credible.

Another place we can look for amount of bytes written is /proc/diskstats. The 10th column is the number of (512-byte) sectors written, so:

$ for dom in $(sudo lsblk -d -o NAME,MODEL --noheadings | awk '/SATA SSD/ { print $1 }')
do
    awk "/$dom / { print \$3, \$10 / 2 * 1.0e-6, \"GB\" }" /proc/diskstats
done
sda 3.93009 GB
sdb 3.93009 GB

Almost 4GB is a lot more believable, so can we just use /proc/diskstats? Well, the problem there is that those figures are only since boot. That won’t include, for example, all the data written during install.

Okay, so, are these figures even consistent? Let’s write 100MB and see what changes.

Since the figure provided by SMART attribute 241 apparently isn’t actually 512-byte blocks we’ll just print the raw value there.

Before:

$ for dom in $(sudo lsblk -d -o NAME,MODEL --noheadings | awk '/SATA SSD/ { print $1 }')
do
    awk "/$dom / { print \$3, \$10 / 2 * 1.0e-6, \"GB\" }" /proc/diskstats
done
sda 4.03076 GB
sdb 4.03076 GB
$ for dom in $(sudo lsblk --paths -d -o NAME,MODEL --noheadings | awk '/SATA SSD/ { print $1 }')
do
    echo -n "$dom: "
    sudo smartctl -A "$dom" | awk '/^241/ { print $10 }'
done
/dev/sda: 16835
/dev/sdb: 17318

Write 100MB:

$ dd if=/dev/urandom bs=1MB count=100 > /var/tmp/one_hundred_megabytes
100+0 records in
100+0 records out
100000000 bytes (100 MB) copied, 7.40454 s, 13.5 MB/s

(I used /dev/urandom just in case some compression might take place or something)

After:

$ for dom in $(sudo lsblk -d -o NAME,MODEL --noheadings | awk '/SATA SSD/ { print $1 }')
do
    awk "/$dom / { print \$3, \$10 / 2 * 1.0e-6, \"GB\" }" /proc/diskstats
done
sda 4.13046 GB
sdb 4.13046 GB
$ for dom in $(sudo lsblk --paths -d -o NAME,MODEL --noheadings | awk '/SATA SSD/ { print $1 }')
do
    echo -n "$dom: "
    sudo smartctl -A "$dom" | awk '/^241/ { print $10 }'
done
/dev/sda: 16932
/dev/sdb: 17416

Well, alright, all is apparently not lost: SMART attribute 241 went up by ~100 and diskstats agrees that ~100MB was written too, so it looks like it does actually report lifetime writes, but it's reporting them as megabytes (10⁶ bytes), not 512-byte sectors.
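Spelling out the arithmetic behind that conclusion (figures taken from the before/after runs above):

```python
before, after = 16835, 16932   # raw SMART attribute 241 values for /dev/sda
delta = after - before
print(delta)                   # 97 raw units after writing ~100 MB

# If the unit really were 512-byte sectors, the delta would be under 50 kB,
# which cannot account for the 100 MB just written:
print(delta * 512)             # 49664 bytes

# Interpreted as units of 1 MB (10**6 bytes), it matches almost exactly:
print(delta * 10**6)           # 97000000 bytes, i.e. ~100 MB
```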

Every reference I can find says that Total_LBAs_Written is the number of 512-byte sectors, though, so in reporting units of 1MB I feel that these devices are doing the wrong thing.

Anyway, I’m a little alarmed that ~0.1% of the lifetime has gone already, although a lot of that would have been the install. I probably should take this opportunity to get rid of a lot of writes by tracking down logging of mundane garbage. Also this is the smallest model; the devices are rated for 1 DWPD so just over-provisioning by using a larger model than necessary will help.

Categories: LUG Community Blogs

Steve Engledow (stilvoid): Win or lose?

Planet ALUG - Wed, 23/11/2016 - 01:02

I never paid any attention in art classes. On reflection, I think we had an awful teacher who more or less ignored those of us with no latent talent or interest. I grew up mildly jealous of people I knew who could draw and always wished I was able.

Over the past few years, I've heard several people say that artistic ability is 10% talent and 90% practice and I've considered giving it a go at some point. Recently, we bought some pencils and a pad for my son and this evening, with a glass of wine at hand and some 70s rock on the stereo, I decided to take the plunge and see what horrors I could submit the unwitting page to.

Here's the first thing I've drawn since school:

It was supposed to be my wife. If you know her, you'll know I failed ;)

I focussed too much on the individual features and not enough on the overall shape. The eyes and hair aren't bad (at least they look something like hers), but the mouth and nose are too large and disproportionate - though recognisable.

I decided to try drawing what was in front of me: a ghost-shaped candle holder:

That's a photo by the way, not my drawing ;)

Here's the drawing. I killed the perspective somewhat but at least it's recognisable!

After I'd drawn the ghost, I decided to have another go at my wife while she wasn't paying attention. This one looks more like her but the eyes look as though she's been in a fight and the hair is a tad more Edward Scissorhands than I'd intended.

Overall, I got a better result than I'd expected from my first three attempts at sketching in 20 years. This might turn into a series.

More than willing to receive criticism and advice from people who know what they're doing with a pencil :)

Categories: LUG Community Blogs

Steve Kemp: Detecting fraudulent signups?

Planet HantsLUG - Mon, 21/11/2016 - 05:37

I run a couple of different sites that allow users to sign up and use various services. In each of these sites I have some minimal rules in place to detect bad signups, but these are a little ad hoc, because the nature of "badness" varies on a per-site basis.

I've worked in a couple of places where there are in-house tests for bad signups, and these usually boil down to some naive and overly-broad rules:

  • Does the phone number's (international) prefix match the country of the user?
  • Does the postal address supplied even exist?

Some places penalise users based upon location too:

  • Does the IP address the user submitted from come from TOR?
  • Does the geo-IP country match the user's stated location?
  • Is the email address provided by a "free" provider?

At the moment I've got a simple HTTP-server which receives a JSON post of a new user's details, and returns "200 OK" or "403 Forbidden" based on some very simple criteria. This is modeled on the blog-comment spam detection server I use - something that is itself becoming less useful over time. (Perhaps time to kill that? A decision for another day.)

Unfortunately this whole approach is very reactive, as it takes human eyeballs to detect new classes of problems. Code can't guess in advance that it should block usernames which could collide with official ones, for example allowing a username of "admin", "help", or "support".
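As a sketch of what such a pre-emptive rule might look like once a human has spotted the pattern (entirely my illustration; the field name and the reserved list are assumptions, not the actual service's code):

```python
# Hypothetical usernames that would collide with official accounts.
RESERVED_USERNAMES = {"admin", "help", "support"}

def check_signup(details):
    """Return an HTTP-style status: 200 to allow the signup, 403 to reject it."""
    username = details.get("username", "").strip().lower()
    if not username or username in RESERVED_USERNAMES:
        return 403
    return 200

print(check_signup({"username": "steve"}))  # 200
print(check_signup({"username": "Admin"}))  # 403: collides with an official name
```

The point stands, though: a rule like this can only be written after someone has noticed the abuse it guards against.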

I'm certain that these systems have been written a thousand times, as I've seen at least five such systems, and they're all very similar. The biggest flaw in all these systems is that they try to classify users in advance of them doing anything. We're trying to say "Block users who will use stolen credit cards", or "Block users who'll submit spam", by correlating that behaviour with other things. In an ideal world you'd judge users only by the actions they take, not by how they signed up. And yet ... it is better than nothing.

For the moment I'm continuing to try to make the best of things; at least by centralising the rules I cut down on duplicate code for myself. I'll pretend I'm being cool, modern, and sexy, and call this a micro-service! (Ignore the lack of containers for the moment!)

Categories: LUG Community Blogs

monthly meeting Thurs 24th Nov 2016

West Yorkshire LUG News - Fri, 18/11/2016 - 14:27

Time has come again for the monthly meeting in the Lord Darcy. Look for a bunch of us sitting round a laptop or two. If you have anything planned in the world of computers, post it on the mailing list or the meetups page.

Optimal development environment

Planet SurreyLUG - Fri, 18/11/2016 - 08:43

After all these years, I can still find no better development environment than GNOME 3, Emacs and Rhythmbox.

A 100% functional desktop environment that’s more flexible than macOS or Windows, more secure, more resource-efficient, faster, cleaner, less obtrusive, quicker to navigate with more economical keyboard shortcuts, and (IMHO) better on the eye too.

Which all matters when you spend whole days looking at code.

Categories: LUG Community Blogs

Debian Bits: Debian Contributors Survey 2016

Planet HantsLUG - Wed, 16/11/2016 - 14:45

The Debian Contributor Survey launched last week!

In order to better understand and document who contributes to Debian, we (Mathieu O'Neil, Molly de Blanc, and Stefano Zacchiroli) have created this survey to capture the current state of participation in the Debian Project through the lens of common demographics. We hope a general survey will become an annual effort, and that each year there will also be a focus on a specific aspect of the project or community. The 2016 edition contains sections concerning work, employment, and labour issues in order to learn about who is getting paid to work on and with Debian, and how those relationships affect contributions.

We want to hear from as many Debian contributors as possible—whether you've submitted a bug report, attended a DebConf, reviewed translations, maintain packages, participated in Debian teams, or are a Debian Developer. Completing the survey should take 10-30 minutes, depending on your current involvement with the project and employment status.

In an effort to reflect our own ideals as well as those of the Debian project, we are using LimeSurvey, an entirely free software survey tool, in an instance of it hosted by the LimeSurvey developers.

Survey responses are anonymous, IP and HTTP information are not logged, and all questions are optional. As it is still likely possible to determine who a respondent is based on their answers, results will only be distributed in aggregate form, in a way that does not allow deanonymization. The results of the survey will be analyzed as part of ongoing research work by the organizers. A report discussing the results will be published under a DFSG-free license and distributed to the Debian community as soon as it's ready. The raw, disaggregated answers will not be distributed and will be kept under the responsibility of the organizers.

We hope you will fill out the Debian Contributor Survey. The deadline for participation is: 4 December 2016, at 23:59 UTC.

If you have any questions, don't hesitate to contact us via email at:

Categories: LUG Community Blogs

Updated: Import Chrome passwords into Firefox

Planet SurreyLUG - Tue, 15/11/2016 - 15:56

Some time back, I wrote a post listing the steps required to migrate passwords stored in Chrome to Firefox

That post was a bit convoluted, so this post is hopefully an improvement!  My intention is to make this process as simple and reliable as possible.  To succeed, you will need:

There are five main steps.  Let’s get started!
  1. In Chrome’s address bar, paste: chrome://flags/#password-import-export

    …then hit enter.

    The option in Chrome should appear like this. Enable it!

In the option that is highlighted, select Enabled and then Relaunch.

  2. Now, in Chrome, navigate to chrome://settings-frame/passwords, scroll down and click Export.  Save the file with a .csv extension.
  3. Locate the CSV file and right click > Open With > LibreOffice Calc (Alternatively, start LibreOffice Calc and open the CSV file).
  4. Using LibreOffice Calc, you will need to modify the CSV file to import it into Firefox.  Do the following:
    1. Right-click on row 1 and select ‘Insert Rows Above’.  This should insert a single row at the top of the sheet.
    2. Copy the following and paste into cell A1, using Shift-Ctrl-V (to ensure you paste as plain text):

       # Generated by Password Exporter; Export format 1.0.4; Encrypted: false
    3. You need to move one column, B, to where column D is – but we don’t want to overwrite your data!
      • At the top of column B, right-click and select Cut.
      • Then right-click again and select Delete Columns – this should remove the now-empty column, and shift-left columns C and D, to positions B and C.
      • Now, on column D, select Paste.  Your url data should now live in column D.
    4. Paste the following into cell A2, using Shift-Ctrl-V:
      hostname username password formSubmitURL httpRealm usernameField passwordField

      When pasting, you may be prompted to select the data format.  Select “Unformatted Text” in the list and click OK.  We are ok with overwriting other cell contents, so “OK” that.

    5. Finally, we’re ready to export this data!  Go to the File menu and select Save As… In the Save As requester that appears, at the bottom check ‘Edit Filter’ and select ‘Text CSV (.csv)’ in the format drop-down:

      Select these options to correctly export your data!

    6. Before we get too excited, there’s just one more step to perform – some textual clean-up! Open up the exported CSV file in your favourite plain-text editor.  In the first row, you may see this: "# Generated by Password Exporter; Export format 1.0.4; Encrypted: false",,,,,,

      Delete the leading " and the trailing ",,,,,, from that line.

      Secondly, do a Find/Replace on double-commas (,,), making them ,"", (with two quotes inserted) instead.  You may need to perform this Find/Replace twice.  Now save the file again.

  5. In Firefox, click on the burger menu and select Add-ons (or just go to about:addons).  Find Password Exporter and click Preferences.  In the Preferences window, click Import Passwords.  Now locate your saved CSV file and load it. You should finally see something like this:

    Importing saved passwords into Firefox. Not easy, but definitely rewarding!
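The spreadsheet gymnastics in step 4 can also be scripted. The sketch below is my own illustration, not part of either tool: it assumes Chrome's export has the four columns name, url, username, password (which is what the column moves above imply), and it quotes every field so the empty columns come out as ,"", exactly as the manual clean-up step produces:

```python
import csv
import io

# Header lines that the Password Exporter add-on expects (from the steps above).
HEADER_COMMENT = "# Generated by Password Exporter; Export format 1.0.4; Encrypted: false"
FIELDS = ["hostname", "username", "password", "formSubmitURL",
          "httpRealm", "usernameField", "passwordField"]

def chrome_to_password_exporter(chrome_csv):
    """Reorder a Chrome export (name,url,username,password) for Password Exporter."""
    out = io.StringIO()
    out.write(HEADER_COMMENT + "\n")
    writer = csv.writer(out, lineterminator="\n", quoting=csv.QUOTE_ALL)
    writer.writerow(FIELDS)
    for row in csv.DictReader(io.StringIO(chrome_csv)):
        # The same column move as step 4: url goes to the formSubmitURL position.
        writer.writerow([row["name"], row["username"], row["password"],
                         row["url"], "", "", ""])
    return out.getvalue()

sample = "name,url,username,password\nexample,https://example.com,alice,s3cret\n"
print(chrome_to_password_exporter(sample))
```

Whether the field-name row itself needs quoting is untested; adjust against a real import if the add-on complains.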

 

Categories: LUG Community Blogs