Colin's Journal: A place for thoughts about politics, software, and daily life.
I’ve recently had reason to investigate the structure of income tax and national insurance in the UK. It’s always easy to take the headline income tax rate (22% basic and 40% higher) and think of that as the amount of money being handed to the government. Unfortunately the true tax situation is neither that simple, nor that low.
Take, for example, a company that can afford to spend £1000 on a Christmas bonus for a basic rate employee. Using the headline income tax rate you would expect the employee to receive an £880 bonus, yet in practice they will actually receive £593.97. This is an effective tax rate of 40.6%, and that doesn’t take into account VAT, which reduces spending power further to £490 (or a 51% tax rate).
Here’s how the numbers add up:
Employer budget: £1000
Amount paid to employee (this plus the employer NIC adds up to the budget): £886.52
Employer NIC (National Insurance Contribution Class 1 "secondary" at 12.8%): £113.48
Employee income tax (at 22%): £195.04
Employee NIC (Class 1 "primary" at 11% for those earning less than £31,720): £97.52
Total money in the pay packet: £593.97
Higher tax rate payers are, of course, hit harder, but not by as much as you might expect. Earnings of over £31,400 attract the 40% income tax rate, but employee national insurance contributions fall to 1% after £31,720. The resulting pay received for a £1000 company expenditure is £523.05, or an effective tax rate of 47.7%.
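The arithmetic above can be sketched in a few lines of Python. This uses only the 2004/05 rates quoted in this post; real payroll calculations also involve thresholds and personal allowances that are ignored here, so treat it as an illustration rather than a tax calculator:

```python
# Split an employer's budget into gross pay, deductions, and take-home pay,
# using the flat 2004/05 rates quoted above (thresholds/allowances ignored).

def net_bonus(budget, income_tax_rate, employee_nic_rate, employer_nic_rate=0.128):
    # The employer's budget covers gross pay plus employer NIC on that pay:
    #   gross * (1 + employer_nic_rate) = budget
    gross = budget / (1 + employer_nic_rate)
    income_tax = gross * income_tax_rate
    employee_nic = gross * employee_nic_rate
    net = gross - income_tax - employee_nic
    effective_rate = 1 - net / budget
    return gross, net, effective_rate

# Basic rate employee: 22% income tax, 11% employee NIC
gross, net, rate = net_bonus(1000, 0.22, 0.11)
print(f"gross={gross:.2f} net={net:.2f} effective rate={rate:.1%}")
```

Running this reproduces the figures above: £886.52 gross, £593.97 in the pay packet (a 40.6% effective rate), and `net_bonus(1000, 0.40, 0.01)` gives the higher rate figure of £523.05.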
Note: All figures are for the 2004/2005 tax year. National insurance information was taken from the Business Link’s National Insurance contribution rates and allowances site, with the latest income tax thresholds from the Inland Revenue.
I really need to make time for some more photography. It’s been an age since I got out and about with my camera, so long ago in fact that I’m now resorting to digging out old Venice photos!
Following a discussion on the Python Web SIG mailing list I’ve made some corrections and enhancements to WSGIUtils. In the previous release an application registered with the WSGIServer would only receive requests for its particular URL. Now applications will receive all requests made for that URL and all paths beneath it.
In addition I’ve modified the API for wsgiAdaptor to make it more logical. I’ve also included a new method (sendFileForPath) that will send a file from the filesystem for the given path. The API modifications do require small changes to be made to any application written against wsgiAdaptor, but should improve overall usability.
I’ve released these updates as version 0.3. Feedback and bug reports are always welcome!
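For the curious, the URL-prefix behaviour described above can be illustrated with a small stand-alone WSGI dispatcher. This is a minimal sketch of the idea, not WSGIUtils’ actual API — the function name `make_dispatcher` and the registration structure are my own for this example:

```python
# Dispatch each request to the application registered for the longest
# matching URL prefix, so '/blog/2004/10' reaches the '/blog' application.

def make_dispatcher(registrations):
    """registrations maps URL prefixes (e.g. '/blog') to WSGI applications."""
    # Try longer prefixes first, so '/blog/admin' wins over '/blog'.
    prefixes = sorted(registrations, key=len, reverse=True)

    def dispatcher(environ, start_response):
        path = environ.get('PATH_INFO', '/')
        for prefix in prefixes:
            if path == prefix or path.startswith(prefix.rstrip('/') + '/'):
                return registrations[prefix](environ, start_response)
        start_response('404 Not Found', [('Content-Type', 'text/plain')])
        return [b'Not Found']

    return dispatcher
```

An application registered under `/blog` then handles `/blog`, `/blog/2004`, and everything beneath, while unregistered paths fall through to a 404.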
This photo would be better if I’d been able to expose the islands for longer without washing out the sky. I appreciate the mood that the cloudy skies and sunlight reflection create, but the photograph is lacking enough that I almost didn’t post it. The curve of light on the ocean is pleasing, but with more time and a lower position I could have improved on the composition considerably.
I’ve recently been struggling with the OpenOffice distribution that is included with Fedora Core Release 3 (FC3). I’ve encountered three major defects, and only found fixes for two of them:
If anyone has any suggestions for the third problem I’d by happy to hear them!
Overall the number of defects I’ve hit in FC3 has been far higher than the number I encountered with FC2. Parts of the desktop (particularly Evolution and the Gnome panel) crash more frequently than they did in FC2, various processes (again Evolution is the worst culprit) leak memory and overall memory consumption seems to have gone up.
I am now starting the process of looking for a new job. Barring any unexpected turn of events, I will be leaving Amdocs on the 3rd of December.
The last four years have, overall, been an excellent experience. I’ve learnt a huge amount about the Telecoms industry, particularly mobile data, mobile content, and ISPs. I’ve developed a strong technical knowledge base on how to integrate and scale complex systems. I have also had the opportunity to learn a great deal about project management and led several projects to successful completion.
Now I’m looking forward to a new challenge. I’m hoping to secure work as either a Solution Architect (my current role) or a Project Manager (a role I’ve filled and assisted with several times) somewhere in Europe.
My current plan is to return to the UK in late December. There are all sorts of logistical problems I’ll have to resolve over the coming months, and I fully expect to be making at least one return trip to Canada in the spring. I’m unlikely to make any further comment here on progress with my job search until it has come to its conclusion, but be assured that plenty will be happening behind the scenes. Obviously if anyone reading this has any possible leads for Solution Architect/Project Manager jobs please drop me an email!
Photo: Rocks by the sea, at the Kejimkujik Sea Adjunct, Nova Scotia.
What is the European Parliament good for? Why does it exist, when we are already represented by nationally elected representatives? These are good questions that are due some consideration. One of the strongest arguments in favour of adopting the new European Constitution is that it will increase the powers of Parliament. Only if we are sure of Parliament’s purpose does such an argument carry any weight.
In a traditional transnational organisation, such as the UN, all decisions are made by government representatives among themselves. Decision-making in the EU could be done along similar lines, with decisions requiring either a unanimous or qualified majority.
EU legislation typically starts life by the member states asking the Commission to create a directive proposal. For those policy areas subject to “co-decision” the Council of Ministers (i.e. the national governments) and the European Parliament both have to agree to the legislation. The new Constitution will expand Parliament’s role to cover almost all policy areas, making co-decision the norm. It is in the co-decision process that I see the value of the European Parliament.
The European Parliament is exclusively focused on scrutinising European legislation. A constituent, contacting their MEP regarding proposed European legislation, is likely to find them already familiar with it. The European Parliament is a forum in which a knowledgeable debate of proposed legislation can take place. Most importantly, MEPs have the power to suggest amendments to legislation, giving them opportunity to directly address people’s concerns.
In contrast, a nationally elected representative has no opportunity to amend EU legislation. They can lobby their government to make changes in the Council of Ministers, but can be easily ignored by those in power. Naturally, national representatives are focused on the legislation passing through their own parliament, and so are unlikely to have any deep knowledge of legislation being proposed at the EU level.
In short, the only way nationally elected representatives could give proper scrutiny of European legislation would be for it to pass through each and every Member State’s parliamentary process. This is clearly unworkable as any amendment made by one parliament would have to be re-considered by all others.
To answer my own questions: The European Parliament is critical for the scrutiny and amendment of European legislation. Our national representatives are unable to fulfil this role because they are unable to amend European legislation and giving them such powers is unworkable.
Until recently I was using Mod_Python as a development platform for a couple of different web applications. Unfortunately I kept stumbling across nasty bugs that I kept having to work around. The development mailing list for Mod_Python consists of people reporting bugs and patches with no action being taken on the part of the maintainers, so these bugs are likely to remain for the foreseeable future. I’ve now moved over to using WSGI for web development, and in the process created a few libraries to aid development. I’ve bundled these together as WSGIUtils and hope that others in the Python community will find them useful.
The new release of PubTal now includes installation instructions for Linux, MacOS X and Windows. The recently added auto-generation support is included, as well as a small re-arrangement of the manual.
Fixes included in the latest version of RSyncBackup and SimpleTAL address mostly minor defects but are well worth getting. I’ve also decided to stop maintaining TALAggregator. I’ll leave the website information intact, but I’m not actively developing this any further.
(Photo shows an alternative view of Toronto’s waterfront – taken off Cherry Street.)
Last night we took part in the fifth annual Night of Dread at Dufferin Grove Park. Unlike the previous two years I arrived before the parade had left. This gave me a chance to photograph some of the costumes and puppets while there was still some daylight left. Following the parade was a lot of fun, and I managed to fill up all of my memory cards with pictures before the end of the evening.
Although I took lots of photos I’m only expecting a small number of them to come out. The light levels were so low that auto focus wouldn’t work, and I had great difficulty focusing manually. To give myself the best chance of getting pictures in focus and without camera shake I set the aperture to f8, exposure to 1/180s, and used ISO800 to give the flash a decent range. An external flash would have helped of course, but I think I did OK with the built in one.
As most people reading this will already know, the incoming head of the European Commission asked Parliament to postpone the approval vote, conceding that changes to his team are required. After several days of delay Rocco Buttiglione, the major source of the confrontation, has stood down, giving Italy the chance to put forward a more suitable candidate. There are a few other commissioners that the European Parliament has concern over, and it seems possible that more widespread changes may be made to the lineup.
This is undoubtedly good news for accountability in the EU, and should hopefully lead to countries in future putting forward multiple candidates for commission posts. This would raise the overall quality of the commission, and diminish the value of the posts as a gift for favoured politicians.
The anti-EU blog EU Referendum dismisses the significance of this outcome:
Nothing much has changed in the legislative or political structure of the European Union. But the MEPs, bless their little hearts, hug themselves with delight whenever there is a sign of their strictly temporary importance. In a month’s time we would have once again forgotten of their existence unless more stories come out of them claiming expenses. Let them enjoy their moments in the sun.
Ironically the site is dedicated to the rejection of the European Constitution. The same Constitution that would significantly enhance the role of parliament, finally ensuring that all future legislation is backed by directly elected politicians as well as conniving state governments. I recently stumbled across the Wikipedia entry on the Constitution. It is the shortest, clearest summary of what changes the European Constitution introduces that I’ve read to date – highly recommended.
I’ve uploaded the new “Autumn” design for my website. If you’ve any opinion on this new design please let me know, I’ve been staring at it too long to know whether I still like it. I had envisaged something more complex, but as I got into the design I kept coming across good reasons not to add more elements. This design follows the trend of earlier designs by putting a great deal of value on white-space. Most commercially designed websites have a narrow column of content among lots of design elements – something I may try next time.
I encountered numerous difficulties implementing this design in such a way that it works in all of the major web browsers (IE, Firefox, Opera and Safari). Issues ranged from known IE CSS bugs (particularly the double margin bug) through to weird box stacking in Safari.
My new design is fairly simple and yet it took three different approaches before I managed to get something that worked across the four browsers that I’m testing. If I had used absolute positioning on everything it would have been fairly easy, but it would have scaled badly for anyone using a large font size. I’ve had to use some CSS hacks (see “The Box Model Problem”) to make the site legible, although not correct, in IE 5.0/5.5.
Only around 3% of visitors to owlfish.com use IE 5.0 and hopefully by the time I create my next design there will be almost no-one using this browser.
Autumn arrived last week, bringing a final flush of colour to the world prior to the onset of winter. The weather hasn’t played to the season’s strength, with most days being grey with very low cloud and fog. I’ve still got some hope that I’ll be able to capture sunlight streaming through golden leaves before all the trees finish shedding.
Autumn colours, even in grey dispersed light, can still be impressive, and so on Saturday I went to Trinity Bellwoods Park to take a few pictures. I took a couple that are interesting in their own right, and many more that are suitable for editing into website graphics. Shana reminded me that I had originally intended to create a website theme for each season, and so I’ve embarked on creating an Autumn theme.
Autumn colours are warm and friendly, especially in comparison to the rather cool look that I’m currently using, so I think the result will be an improvement. It always takes me longer to finish doing website designs than I anticipate, but I’m currently hoping to roll the new look out within a week.
Today’s picture works best as a full screen image, but hopefully readers will be able to get something from the version I’ve uploaded here. I like this photo particularly because the grey skies, football player, and CN Tower give a great sense of an ordinary dull Toronto day. In contrast the tree’s leaves range from a plain green through to a deep red in a way that seems anything but dull.
It was widely expected that the European Parliament’s quizzing of the new Commissioners would be nothing more than a paperwork exercise. The controversy surrounding the Italian Commissioner Rocco Buttiglione, who has been assigned the Justice, Freedom and Security portfolio by the new Commission head José Manuel Durão Barroso, is proving that assumption to be wrong.
Mr Buttiglione is described by the BBC as a devout Roman Catholic and a close friend of the pope. A Commissioner’s religious beliefs do not normally affect their candidature, but in questioning by the European Parliament Mr Buttiglione has expressed some very unwelcome ideas. By describing homosexuality as a “sin” and the aim of marriage as being “to allow women to have children and to have the protection of a male” he has called into question his suitability for the position.
In the summary of Mr Buttiglione’s hearing before the Civil Liberties Committee he attempts to draw a distinction between morality and the law. The problem is the European Commission’s role as the sole initiator of new legislation. The views of individual Commissioners have a huge effect on EU law, and so their moral views are important.
The European Parliament is only allowed to veto the whole Commission rather than individual members. Whether MEPs will be willing to go that far in trying to stop Rocco Buttiglione is uncertain, but it is now a real possibility. The best political solution would be for Rocco Buttiglione to stand down, something he has indicated he is willing to consider.
The more scrutiny Parliament gives the Commission the better. Often Commissioners are put forward based on their political friendships rather than competency. Parliament is, as it was designed to do, helping us all by ensuring that Commissioners’ fundamental views are compatible with the laws they are asked to administer.
It’s been a while since I’ve taken my camera out and done any photography, and that’s part of the reason why I’ve not updated this web journal over the last two weeks. The other reason for lack of updates is that I’ve been refining the software that tries to answer the question asked above: How many people are reading this?
Every time a web server receives a request for a URL from a client (web browser, search engine, RSS Aggregator, etc) it logs the event into a file. By analysing the web server log file it’s possible to approximate how many people loaded a particular web page, which hopefully gives an indication of how many people read it.
Determining the number of people, versus the number of search engines or other robots, requesting your web page is very difficult. Each log file entry contains the user agent string sent by the client making the request. Most web browsers provide their own distinct user agent string, enabling you to determine whether the request was made by Firefox, IE, Safari or some other web browser. Unfortunately IE’s user agent string includes extensions, e.g. “Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.1.4322)” means IE 6.0 with the .NET framework installed. The range of possible user agents for IE is huge, and so pattern matching is the only practical way to determine whether or not a request came from IE.
Good search engines (such as Google, which uses “Googlebot/2.1 (+http://www.google.com/bot.html)”) provide a user-agent that looks nothing like IE’s user-agent, and so is very easy to tell apart. Others, such as the “Girafabot”, deliberately use user-agents that are easy to confuse with IE, such as “Mozilla/4.0 (compatible; MSIE 5.0; Windows NT; Girafabot; girafabot at girafa dot com; http://www.girafa.com)”.
Even if the user-agent does match a known web browser this doesn’t necessarily mean that it really was that web browser that sent the request – a user can change their browser’s user agent to be anything they like. The only saving grace here is that the vast majority of users don’t bother as there isn’t really any point to changing it.
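The pattern-matching approach described above can be sketched in Python. The list of robot hints here is illustrative — a real log analyser needs a much longer list — but it shows why robot detection must run before the IE check: the Girafabot string would otherwise match as MSIE 5.0.

```python
import re

# Substrings that strongly suggest a robot; deliberately checked before any
# browser patterns, since some robots masquerade as IE.
ROBOT_HINTS = re.compile(r'bot|crawler|spider|slurp', re.IGNORECASE)
MSIE = re.compile(r'MSIE (\d+)\.(\d+)')

def classify(user_agent):
    """Roughly classify a user-agent string from a web server log entry."""
    if ROBOT_HINTS.search(user_agent):
        return 'robot'
    if MSIE.search(user_agent):
        return 'msie'       # matches despite arbitrary extensions like .NET CLR
    if 'Firefox' in user_agent:
        return 'firefox'
    return 'other'
```

With this ordering, “Mozilla/4.0 (compatible; MSIE 5.0; Windows NT; Girafabot; …)” is classified as a robot, while a genuine IE 6.0 string with .NET extensions is still recognised as IE.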
Having decided which requests are legitimately from a web browser rather than a robot, the next challenge is to determine what counts as a unique page visit. If I reload the web page within 5 minutes, it will generate two requests for the page and two entries in the log file. I probably want to count such reloads as a single page visit, up to a cut off point (say 2 hours).
Detecting whether the same client has re-requested the page is fairly easy because the IP address is included in the information logged by the web server, and is unlikely to change between requests. Writing software that counts unique requests in a scalable manner is a challenge, because each request within the last 2 hours for every URL has to be remembered.
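One way to implement the visit counting described above is to remember, per (IP, URL) pair, when the current visit started, and open a new visit once a request falls outside the cut-off window. This is a simplified sketch (the names and the in-memory dictionary are my own; a scalable version would also evict entries older than the window rather than keep them forever):

```python
WINDOW = 2 * 60 * 60  # cut-off point: 2 hours, in seconds

def count_visits(log_entries, window=WINDOW):
    """Count unique page visits per URL.

    log_entries: iterable of (timestamp, ip, url) tuples, sorted by timestamp.
    Requests from the same IP for the same URL within `window` seconds of the
    visit's first request count as one visit.
    """
    visit_start = {}  # (ip, url) -> timestamp of the visit's first request
    visits = {}       # url -> unique visit count
    for ts, ip, url in log_entries:
        key = (ip, url)
        start = visit_start.get(key)
        if start is None or ts - start > window:
            # First request, or past the cut-off: a new visit begins.
            visit_start[key] = ts
            visits[url] = visits.get(url, 0) + 1
    return visits
```

A reload five minutes after the first request leaves the count unchanged, while the same IP returning three hours later counts as a fresh visit.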
Photo: Seaweed on the beach at Kejimkujik Sea Adjunct, Nova Scotia.
Email: colin at owlfish.com