Planet ALUG

Planet ALUG - http://planet.alug.org.uk/

Chris Lamb: Generating dynamic Python tests using metaclasses

Sun, 27/03/2016 - 16:09

One common testing anti-pattern is a single testcase that loops over multiple yet independent inputs:

from django.test import TestCase

class MyTests(TestCase):
    def test_all(self):
        for x in (
            'foo',
            'bar',
            'baz',
            # etc.
        ):
            self.assertEqual(test(x), None)

Whilst this code style typically occurs in an honourable attempt to avoid a DRY violation, these tests:

  1. Can only report the first failure
  2. Prevent individual inputs from being tested independently
  3. Are typically slower than neighbouring tests, often becoming a performance hotspot
  4. Do not allow for parallel computation, a feature recently added to Django 1.9

(Note that whilst foo, bar, etc. are defined statically above for simplicity, the values could be determined dynamically by, for example, iterating over the filesystem.)
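
As a sketch of the dynamic case, the inputs might be discovered by listing a directory of fixture files (the fixtures path below is purely hypothetical):

import os

# Hypothetical directory of fixture files next to this test module.
FIXTURE_DIR = os.path.join(os.path.dirname(__file__), 'fixtures')

# One input per file on disk, discovered when the module is imported.
INPUTS = sorted(os.listdir(FIXTURE_DIR))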

If you have such tests, consider splitting them out using a metaclass like so:

class MyTestsMeta(type):
    def __new__(cls, name, bases, attrs):
        for x in (
            'foo',
            'bar',
            'baz',
        ):
            attrs['test_%s' % x] = cls.gen(x)

        return super(MyTestsMeta, cls).__new__(cls, name, bases, attrs)

    @classmethod
    def gen(cls, x):
        # Return a testcase that tests ``x``.
        def fn(self):
            self.assertEqual(test(x), None)
        return fn

class MyTests(TestCase):
    __metaclass__ = MyTestsMeta
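
(The __metaclass__ attribute above is the Python 2 spelling; under Python 3 the metaclass is instead declared in the class header, roughly as follows.)

# Python 3: the metaclass is passed as a keyword argument in the class header.
class MyTests(TestCase, metaclass=MyTestsMeta):
    pass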

This has the effect of replacing the single testcase with individual test_foo, test_bar & test_baz testcases. Each test can then be run separately:

$ ./manage.py test myproject.myapp.tests.MyTests.test_baz
Creating test database for alias 'default'...
.
----------------------------------------------------------------------
Ran 1 test in 0.039s

OK
Destroying test database for alias 'default'...

... or we can test them all in parallel:

$ ./manage.py test myproject.myapp.tests.MyTests --parallel
Creating test database for alias 'default'...
.
----------------------------------------------------------------------
Ran 3 tests in 0.065s

OK
Destroying test database for alias 'default'...

You must ensure that the tests have unique names to avoid cases masking each other. In the above example we could simply use the input string itself, but if you have no obvious candidate you could try using Python's built-in enumerate function to generate a unique, if somewhat opaque, suffix:

for idx, x in enumerate((
    'foo',
    'bar',
    'baz',
)):
    attrs['test_%d' % idx] = cls.gen(x)

One alternative to using a metaclass is to generate the test methods and use setattr to bind them to the TestCase class. However, using a metaclass:

  1. Is cleaner and/or more "Pythonic"
  2. Avoids a number of subtle pitfalls with parameter binding (see the sketch after this list)
  3. Prevents your TestCase class from being polluted with loop variables, etc.
  4. Can be composed or abstracted into reusable testing components
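
To illustrate the second point, a naive setattr sketch (reusing the hypothetical test() helper from above) trips over Python's late-binding closures and needs the current value bound as a default argument:

# Naive version: every generated method closes over the same loop variable,
# so once the loop has finished they all test 'baz'.
for x in ('foo', 'bar', 'baz'):
    def fn(self):
        self.assertEqual(test(x), None)
    setattr(MyTests, 'test_%s' % x, fn)

# Fix: bind the current value of x as a default argument.
for x in ('foo', 'bar', 'baz'):
    def fn(self, x=x):
        self.assertEqual(test(x), None)
    setattr(MyTests, 'test_%s' % x, fn)

The gen() classmethod in the metaclass sidesteps this entirely, as each call creates a fresh scope for x.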


Note that you can still use setUp and all the other unittest.TestCase and django.test.TestCase methods as before:

from .utils import MyBaseTestCase  # custom superclass

class MyTestsMeta(type):
    # <snip>

    @classmethod
    def gen(cls, x):
        def fn(self):
            self.assertRedirects(...)
        return fn

class MyTests(MyBaseTestCase):
    __metaclass__ = MyTestsMeta

    def setUp(self):
        super(MyTests, self).setUp()
        # Code here is run before every test

    def test_other(self):
        # Test some other functionality here
        pass

UPDATE: Stu Cox asked: "Do you even need the metaclass, or could you do this with __new__ straight on the TestCase?" Curiously, unittest does not initialise classes in the typical way if you do not explicitly define at least one test_ method.


Jonathan McDowell: Dr Stoll: Or how I learned to stop worrying and love the GPL

Sat, 26/03/2016 - 17:28

[I wrote this as part of BelFOSS but I think it’s worth posting here.]

My Free Software journey starts with The Cuckoo’s Egg. Back in the early 90s a family friend suggested I might enjoy reading it. He was right; I was fascinated by the world of interconnected machines it introduced me to. That helped start my involvement in FidoNet, but it also got me interested in Unix. So when I saw a Linux book at the Queen’s University bookshop (sadly no longer with us) with a Slackware CD in the back I had to have it.

The motivation at this point was to have a low cost version of Unix I could run on the PC hardware I already owned. I had no knowledge of the GNU Project before this point, and as I wasn’t a C programmer I had no interest in looking at the source code. I spent some time futzing around with it and that partition (I was dual booting with DOS 6.22) fell into disuse. It wasn’t until I’d learnt some C and turned up to university, which provided me with an internet connection and others who were either already using Linux or interested in doing so, that I started running a Linux box full time.

Once I was doing that I became a lot more interested in the Open Source side of the equation. Rather than running a closed operating system whose API wasn't even properly specified (or I wouldn't have needed my copy of Undocumented DOS), I had the complete source to both the underlying OS and all the utilities it was using. For someone doing a computer science degree this was invaluable. Minix may have been the OS discussed in the OS Design module I studied, but Linux was a much more feature-complete option that I was running on my desktop and could also peer under the hood of.

In my professional career I’ve always welcomed the opportunities to work with Open Source. A long time ago I experienced a particularly annoying issue when writing a device driver under QNX. The documentation didn’t seem to match the observed behaviour of the subsystem I was interfacing with. However due to licensing issues only a small number of people in the organisation were able to actually look at the QNX source. So I ended up wasting a much more senior engineer’s time with queries like “I think it’s actually doing x, y and z instead of a, b and c; can you confirm?”. Instances where I can look directly at the source code myself make me much more productive.

Commercial development also started to make me more understanding of the Free Software nature of the code I was running. It wasn't just the ability to look at the code which was useful, but also the fact there was no need to reinvent the wheel. Need a base OS to build an appliance on? Debian ensures that the main component is Free for all usage. No need to worry about rolling your own compilers, base libraries etc. From a commercial perspective that allows you to concentrate on the actual product. And when you hit problems, the source is available and you can potentially fix it yourself or at least more easily find out if there's been a fix for that issue released (being able to see code development in version control systems rather than getting a new upstream release with a whole heap of unrelated fixes in it really helps with that).

I had thus progressed from using FLOSS because it was free-as-in-beer, to appreciating the benefits of Open Source in my own learning and employment experiences, to a deeper understanding of the free-as-in-speech benefits that could be gained. However at this point I was still thinking very much from a developer mindset. Even my thoughts about how users can benefit from Free Software were in the context of businesses being able to easily switch suppliers or continue to maintain legacy software because they had the source to their systems available.

One of the major factors that has helped me to see beyond this is the expansion of the Internet of Things (IoT). With desktop or server software there is by and large a choice about what to use. This is not the case with appliances. While manufacturers will often produce a few revisions of software for their devices, eventually there is a newer, shinier model and the old one is abandoned. This is problematic for many reasons. For example, historically TVs have been long lived devices (I had one I bought second hand that happily lasted me 7+ years). However the "smart" capabilities of the TV I purchased in 2012 are already of limited usefulness, and LG have moved on to their current models. I have no intention of replacing the device any time soon, so have had to accept it is largely acting as a dumb display. More serious is the lack of security updates. For a TV that doesn't require a network connection to function this is not as important, but the IoT is a trickier proposition. For example Matthew Garrett had an awful experience with some 'intelligent' light bulbs, which effectively circumvented any home network security you might have set up. The manufacturer's defence? No longer manufactured or supported.

It’s cases like these that have slowly led me to a more complete understanding of the freedom that Free Software truly offers to users. It’s not just about cost free/low cost software. It’s not just about being able to learn from looking at the source to the programs you are running. It’s not even about the freedom to be able to modify the programs that we use. It’s about giving users true Freedom to use and modify their devices as they see fit. From this viewpoint it is much easier to understand the protections against Tivoization that were introduced with GPLv3, and better appreciate the argument sometimes made that the GPL offers more freedom than BSD style licenses.
