Tuesday, 3 May 2011

3 things I'd change in Perl 5

I replied to Ovid's question on blogs.perl.org about the three things you would change in the Perl 5 language:
In summary: proper OO support, better deployment, and making Perl 5 parsable without the C-based perl5 compiler, which currently makes it hard to virtualise Perl support in a sandbox.

Regards, Peter

Monday, 21 September 2009

Eee BIOS upgrade

I was getting occasional errors from my RunCore SSD where the partition went read-only and refused writes, so I needed to update my Eee PC 1000 BIOS.
This is a bit trickier than you might think as you need a FAT-16 formatted USB stick of less than 2GB, otherwise the Eee updater gives you error messages saying it cannot read 1000.ROM or that it is in the wrong format.
Anyway, to save time for anyone else who is searching how to do this, see http://wiki.eeeuser.com/howto:updatebios.
From Windows, you can right click a mounted USB and choose Format / FAT to get the correct FAT-16 format, then download the latest Eee 1000 BIOS .ZIP file from the Asus site (mine was 1000-ASUS-1003.zip), unzip it and rename the 1000-asus-1003.rom file to 1000.ROM before copying to the USB stick. Put it in your Eee then power on with ALT-F2 held down, making sure you are running on mains power and have a fully charged battery. It takes about 5 minutes to flash update the BIOS ROM.
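If you only have a Linux machine to hand, the stick can be formatted as FAT-16 from a terminal instead. This is a sketch assuming the stick shows up as /dev/sdb1 (check with `sudo fdisk -l` first), and note that it erases the stick:

```shell
# DANGER: make sure /dev/sdb1 really is the USB stick before running this
sudo umount /dev/sdb1            # in case the desktop auto-mounted it
sudo mkdosfs -F 16 /dev/sdb1     # create a FAT-16 filesystem on the stick
```

Then mount it and copy 1000.ROM on as before.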

Sunday, 16 August 2009

Make your Asus Eee 900/1000 run 4x faster

Previously, I described how to set up an Asus Eee 1000 to run Eeebuntu and described the 8GB / 32GB Solid State Disk setup it has. What I didn't realise then is that the 32GB SSD provided was much slower than the 8GB SSD, and that the 8GB SSD wasn't particularly fast either.

You can now purchase a much faster "Runcore" drop-in replacement for the 32GB SSD.
It is 4-6x faster for read/write and makes apps that were sluggish before, such as Firefox, respond quickly. Boot and shutdown times are noticeably faster too.

The RunCore comes in 64GB and 128GB sizes. I went for the 64GB option (enough for me!).
The full title for Googling is RunCore 64GB PATA Mini PCI-e PCIe SSD for ASUS EEE PC 901 and 1000. Phew.
In the US you can buy it from mydigitaldiscount.com ($220) and in the UK I bought mine from memoryc.com for £160 inc. VAT. Check out the comments on there from other happy customers :-)

Installation is simple. Backup all your data to a USB stick (usual warnings about proceeding at your own risk). Shutdown and unplug your Eee PC. Turn it over. On the back there is a panel (red rectangle below) retained by two precision screws (red circles). One of these on mine was hidden under a silver "EeePC" sticker which I had to peel up for access.

Unscrew these using the (provided) precision screwdriver and lift off the panel. You can then remove two screws to release the existing 32GB memory card. It is simple to push in the replacement one and screw it down. Replace the panel and screw that down. Here's my old Asus 32GB card in the box the RunCore card came in from memoryc.com:

Now power back up and on the BIOS boot screen press F2. Go to Advanced, IDE Configuration and you should see [RunCore 64G-C SSD] as the new drive.

What I did was install Eeebuntu 3.0.1 (Ubuntu 9.04) to the 64GB RunCore SSD (format it and use 100% of space for root / partition) and use the Asus 8GB SSD as the Linux swap space (select manual configuration at install time to be able to do this). After that I restored my user files from the backup.

Works like a charm!

Asus Eee 1000 using Ubuntu 9 Jaunty with INQ1 Phone as a Bluetooth Modem

This article follows on from my earlier blog post on how to install Eeebuntu (Ubuntu 8.10 Intrepid) on an Asus Eee 1000 and use it with an INQ1 phone as a bluetooth modem.
I have reinstalled with Eeebuntu 3.0.1 NBR (Ubuntu 9.04 Jaunty) and it's a big improvement. EeePC Tray now works right out of the box and you can skip most of the setup I gave previously. There are still a couple of tricky points, and the procedure for getting an INQ1 phone to work as a bluetooth modem is slightly different (and easier); I outline both below.

Install Eeebuntu

First, install Eeebuntu 3.0.1 NBR onto a USB stick and then onto your Eee 1000 using the same approach as described in my earlier blog post. Run Administration / Update Manager and make sure you have the latest updates installed.

I found the latest acpi utilities were showing up as grey on the Update Manager (it ends up with an interim version that doesn't handle the screen mode switching properly from the EeeControl tray icon). There was also some debris from the install. To fix this do:
$ sudo -s
# apt-get autoremove
# apt-get install eeepc-acpi-utilities

Install Dropbox

The Dropbox package available under Administration / Synaptic does not work with Eeebuntu 3.0.1. When I tried that I got an error:

The program 'dropbox' received an X Window System error.
This probably reflects a bug in the program.
The error was 'BadIDChoice (invalid resource ID chosen for this connection)'.
(Details: serial 722 error_code 14 request_code 53 minor_code 0)

There is a fix described here.
To make it work I downloaded nautilus-dropbox_0.6.1_i386_ubuntu_9.04.deb and http://dl.getdropbox.com/u/17/dropbox-lnx.x86-0.6.510.tar.gz then installed the patched version manually with:

$ sudo dpkg -i nautilus-dropbox_0.6.1_i386_ubuntu_9.04.deb
$ cd; rm -fr .dropbox-dist
$ tar xfz dropbox-lnx.x86-0.6.510.tar.gz
$ .dropbox-dist/dropbox --sync
(enter login details)

On a restart, Dropbox was integrated and working with Eeebuntu nautilus.

Install Blueman

To install Blueman you need to add their repository. Start a terminal session and type:
$ gpg --keyserver=wwwkeys.eu.pgp.net --recv-keys 6B15AB91951DC1E2
$ gpg --export --armor 6B15AB91951DC1E2 | sudo apt-key add -

Then go to Administration / Software Sources, click the Third Party Software tab and click "Add...". Where it says "APT line:" type in, on one line, the repository line given on the Blueman site.

Click 'Add Source', click Close, click Reload. Close the window.
Run Administration / Update Manager and apply updates and restart.
Run Administration / Synaptic Package Manager and search for and install "blueman".

Note - if you follow the instructions I gave in my previous blog for Ubuntu 8.10, you will get an older version of Blueman that gives a Python error complaining it cannot find 'Constants.py'. In that case, do an 'apt-get remove blueman' then follow the instructions above.

On the main menu select Preferences / Bluetooth Manager. If it prompts you to start Bluetooth, say Yes. Now do a Search and bond and trust with your INQ1. As mentioned in the previous blog, if you cannot see your phone check you have turned on bluetooth networking on the INQ1 phone:

menu Settings / Bluetooth = Switch - on ; Visibility - show your phone

menu Settings / Advanced / Connectivity / Modem Connect - via Bluetooth

In Blueman you should see your phone's name below the button bar. Click on it, then click the Setup button at the top. Choose Dialup Networking (DUN) and click Forward. It will prompt you to add a profile type. Select "3 (Handsets)". It should then connect to your phone and report a working connection. Try running menu Internet / Firefox and browsing to a website to test it.

If I find any more issues, I will add them to this blog.

Thursday, 6 August 2009

YAPC::EU::2009 Lisbon

I'm back from the European Perl conference in Lisbon.
Man, that was fun. Lisboa is a great city, with plenty of tourist sights to see before the conference itself. I was staying in Hotel Alif, the main conference hotel, along with many other Perlmongers. The metro system was good and easy to use (wish I could say the same about London ;-) and it was simple getting to the conference venue at the Faculty of Sciences, Lisbon University.

The standard of talks this year was very high and I attended more than usual. In particular, I wanted to hear more about Moose, the meta-object-protocol-based extension to Perl 5.

Moose in Perl 5.10 brings in the best language features of Perl 6, Common Lisp, Smalltalk, Java, Ruby etc. while still being compatible with older Perl 5 code such as the huge library at CPAN.
It lets you handle in a simple, powerful and intuitive way:
  • classes
  • data members and accessors
  • method parameter signatures and type-checking (no more @_ unrolling)
  • roles
  • traits
  • mix-ins
The result is shorter, easier-to-read and more expressive code that is much easier to maintain.
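To give a flavour, here is a minimal sketch of a Moose class (the Point class is my own illustrative example, not from the talks) showing declared attributes with accessors and type-checking:

```perl
package Point;
use Moose;

# 'has' declares an attribute: Moose generates the accessor and
# enforces the Int type constraint on assignment
has 'x' => ( is => 'rw', isa => 'Int', default => 0 );
has 'y' => ( is => 'rw', isa => 'Int', default => 0 );

sub to_string {
    my $self = shift;
    return sprintf '(%d,%d)', $self->x, $self->y;
}

no Moose;
__PACKAGE__->meta->make_immutable;

package main;
my $p = Point->new( x => 3, y => 4 );
print $p->to_string, "\n";
```

No hand-rolled constructor, no @_ unrolling, and `Point->new( x => 'oops' )` would die with a type-constraint error instead of silently storing bad data.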

Moose's core features are now stable and production-ready, and any self-respecting Perl 5 programmer should consider switching to it as soon as possible.

Piers Cawley told us about how he moved from Perl to Ruby, then was lured back to Perl by the cutting-edge language features that have been added to Perl 5. The joy of it was apparently enough to make him sing! Either that or it was the sight of beautiful Teutonic youths ;-)

Yuval Kogman filled in the details of using Moose and some MooseX modules and is running a long best practices Moose training session today in the post-conference training slot.

Paul Fenwick of Perl Training Australia gave us a good run through of modern Perl programming using Moose and autobox with some interesting asides on autodie, PAR::Repository for automatic deployment and updating of applications, and application speed profiling using Devel::NYTProf.

To get you started with Moose I can recommend the Moose book by Dave Rolsky and Stevan Little.

Well done to the organisers, and I look forward to next year's event in Pisa, Italy.

Wednesday, 13 May 2009

Testing with Perl Catalyst

One of the things I really like about Catalyst is how easy it makes writing tests for your applications. There is documentation about testing in the Catalyst tutorial and a more generic book on Perl Testing. Going into more detail, I suppose I would classify these tests at several levels: Model unit, Controller unit, Integration, End-To-End.

Firstly, Model unit tests. The Model encapsulates business logic and access to data persistence, typically via the DBIx::Class Object-Relational Mapper to a database. Strictly speaking, these tests exist outside of Catalyst, although you may need to pick up database access credentials from your Catalyst application configuration file (conf/myapp.yml or similar) using Config::JFDI. This replicates what Catalyst::Plugin::ConfigLoader does but works outside Catalyst.

Usually, I'd expect to start off with tests named 00_setup_database.t and zz_teardown_database.t. These create a known database before unit testing and then tear it down after. You can use a utility like DBIx::Class::Fixtures to do this.
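As a rough sketch of what 00_setup_database.t might look like — here using $schema->deploy and a hand-created row rather than DBIx::Class::Fixtures, and assuming an SQLite test database (all names are illustrative):

```perl
# 00_setup_database.t -- sketch only; MyApp::Schema and the Client
# columns are assumptions standing in for your real schema
use strict;
use warnings;
use Test::More tests => 1;
use MyApp::Schema;

my $schema = MyApp::Schema->connect('dbi:SQLite:t/var/test.db');
$schema->deploy({ add_drop_table => 1 });   # build tables from the schema classes
$schema->resultset('Client')->create({
    client_code => 'TEST01',
    name        => 'A Test Client',
});
ok( $schema->resultset('Client')->count > 0, 'fixture data loaded' );
```

zz_teardown_database.t would then drop the tables or simply delete t/var/test.db.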
Then I'd have 10_schema.t (and 11_schema2.t if I am using more than one schema) to test that I can connect to a database schema, that the sources (tables) are as expected and that I can retrieve a sample datum.
use Test::More tests => 5;
BEGIN { use_ok 'MyApp::Model::MyModel' }
BEGIN { use_ok 'MyApp::Schema' }
my $connect_info = MyApp::Model::MyModel->config->{connect_info};
diag "connecting schema to ".$connect_info->[0];
my $schema = MyApp::Schema->connect( @$connect_info );
my @sources = $schema->sources();
ok( scalar @sources > 0, 'found schema sources: '.join(", ", @sources) );
my $cl;
ok( $cl = $schema->resultset('Client')->find({ client_code => 'TEST01' }),
    'find client with code TEST01' );
is( $cl->name, 'A Test Client', 'client name as expected' );
To this I'd add 20_user.t, 21_client.t (and so on), one per business object. Often these directly relate to a database table, but sometimes they can be a virtual class that uses several tables with tricky logic built in, for example to calculate matching availability of resources over time. With a DBIx::Class schema you can do this and hide the details of the implementation inside the class, or other classes it calls, using a chained resultset approach. But back in the unit test, you'd test the public interface each schema class exposes. For example, create a client record, then check to make sure subsidiary records have been created in other tables. Once you obtain a DBIC $schema handle, as above, the rest is straightforward.
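A sketch of such a business-object test — the 'accounts' relationship and column names here are hypothetical stand-ins for whatever subsidiary records your schema creates:

```perl
# 21_client.t -- sketch; Client and its 'accounts' relationship are
# illustrative, not a real schema
use strict;
use warnings;
use Test::More tests => 3;
use MyApp::Schema;

my $schema = MyApp::Schema->connect('dbi:SQLite:t/var/test.db');
ok( my $client = $schema->resultset('Client')->create({
        client_code => 'TEST02',
        name        => 'Another Test Client',
    }),
    'create client' );
# test the public interface, not the implementation: creating a client
# should also have created its subsidiary records in other tables
ok( $client->accounts->count >= 1, 'subsidiary account record created' );
$client->delete;
is( $schema->resultset('Client')->find({ client_code => 'TEST02' }), undef,
    'client removed again' );
```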

Secondly, Controller unit tests. In the Catalyst world you use Catalyst::Test to achieve this. It sets up a fake Catalyst context so you can test actions. The manual page and the tutorial above explain this clearly so I won't go into detail. You add a unit test for each public action URL to ensure it dispatches to handlers as expected.

use Test::More tests => 2; 
BEGIN { use_ok 'Catalyst::Test', 'MyApp' } 
ok( request('/web/login')->is_success, 'Request should succeed' );
Thirdly, Integration testing. By this I mean running through some typical business workflow scenarios. In effect, expressing the functional requirements and specification of the system in the form of unit tests that will exercise the system as a whole (database, controller, model code, subsidiary library code, view templates).

For a web application (bear in mind Catalyst does not purely have to be used for writing web apps) you can use Test::WWW::Mechanize::Catalyst. This sets up a fake HTTP request object so you can test a sequence of actions as if you were going through a web server. So you might login, retrieve a list of records, view the detail for one of them, update a record, then check the list of records reflects that. After the WWW::Mechanize tests you could then check the database manually to ensure the records are as expected from the system functional specification.

use Test::More;
eval "use Test::WWW::Mechanize::Catalyst 'MyApp'";
plan $@ ? ( skip_all => 'Test::WWW::Mechanize::Catalyst required' )
        : ( tests => 36 );
ok( my $mech = Test::WWW::Mechanize::Catalyst->new, 'Created mech object' );

# logout screen redirects to login screen
$mech->get_ok( 'http://localhost/web/logout', 'logout' );
$mech->content_contains('Log on using', 'logout redirects to login screen');

diag "check login authentication is required to access extranet screens";
# turn off automatic redirect follow so we can check the response code
$mech->requests_redirectable([]);
# 1. check redirect code and location
$mech->get( 'http://localhost/web/extranet/list/booking' );
is($mech->status, 302, 'unauthed user is redirected away from page requiring auth');
like($mech->response->header('location'), qr|/web/login|, 'redirect is to login page');
# 2. check all protected paths
for (qw| index home booking |) {
  my $path = '/web/extranet/'.$_;
  $mech->get( 'http://localhost'.$path );
  is($mech->status, 302, 'unauth redirect for '.$path);
}
# allow automatic redirects again
$mech->requests_redirectable([qw/ GET HEAD POST /]);

# 3. business workflow tests
# get login screen
$mech->get_ok( 'http://localhost/web/login', 'get login' );
$mech->content_contains('Log on using', 'login contains Log on using');
# login
$mech->submit_form( fields => { username => 'cust', password => 'cust' } );
# check logged in successfully
$mech->content_contains('Welcome to the', 'successfully logged in screen');
diag "screen tests";
$mech->get_ok( 'http://localhost/web/extranet/home', 'get home' );
$mech->content_contains('Welcome to the', 'home contains welcome');
$mech->get_ok( 'http://localhost/web/extranet/list/booking', 'list/booking' );
$mech->content_contains('25843011', 'booking list contains booking 25843011');

Having written a range of unit tests, run them all under Devel::Cover to check how well they cover your code.
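A typical invocation looks something like this (assuming Devel::Cover is installed and your tests live under t/):

```shell
cover -delete                                     # clear any old coverage data
HARNESS_PERL_SWITCHES=-MDevel::Cover prove -lr t/ # run the suite under Devel::Cover
cover -report html                                # write an HTML report into cover_db/
```

Open the generated report in a browser to see statement, branch and subroutine coverage per file.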

Fourthly, End-To-End testing. This uses a set of Integration tests to exercise the whole system: feed in data and user actions, then check the output data and files to make sure they are consistent. In the past, I worked on a meter reading and billing system, and as part of a major release one of the tests was a manual one: set up a fixture database, type a set of pre-determined meter readings into the system from real remote handsets communicating over a WAN, process some batch runs on the central server, perform a few edits from the central server Oracle screens, then generate output billing records. You would expect the financial totals to be in balance and for selected bills to reflect the combination of the meter readings over a time period. Most systems are not that complex, but the principle is to use real kit and real-life scenarios. You might run several versions and releases of real web browsers against a copy of your live environment on a staging server and check the resulting screens, any PDFs generated, and the totals in your database.

That's it for today. Happy testing!

Cheers, Peter

Friday, 1 May 2009

Complex databases with DBIx::Class

One thing that is tricky is combining an Object-Relational Mapper, such as DBIx::Class or Rose::DB::Object, with complex real-world situations. Simple database queries are easy in SQL, but there's still a big gain from using an ORM to handle them: there are fewer lines of code, and you can abstract away physical database changes to field names or relations without the calling code even knowing it has happened. Adding in complex joins and the use of database extensions inside triggers or stored procedures makes it difficult to get plain old SQL right, let alone to add extra business methods to an ORM class that depend on the physical database structure and engine.

Here is where some examples come in handy.
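As a small illustration of the chained-resultset style that keeps joins out of the calling code — the Booking/Client schema names here are hypothetical:

```perl
# Hypothetical schema: Booking belongs_to Client
package MyApp::Schema::ResultSet::Booking;
use strict;
use warnings;
use base 'DBIx::Class::ResultSet';

# hide the join behind a named query on the resultset class
sub for_client {
    my ($self, $client_code) = @_;
    return $self->search(
        { 'client.client_code' => $client_code },
        { join => 'client' },
    );
}

# a further refinement that can be chained onto any Booking resultset
sub starting_after {
    my ($self, $date) = @_;
    return $self->search({ 'me.start_date' => { '>=' => $date } });
}

1;

# The caller chains the pieces; DBIx::Class composes them into a
# single SQL statement that only runs when the results are fetched:
#   my @rows = $schema->resultset('Booking')
#                     ->for_client('TEST01')
#                     ->starting_after('2009-01-01')
#                     ->all;
```

If the physical schema changes — say the join becomes a view or the date column is renamed — only these resultset methods need updating, not every caller.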

Matt Trout has written a useful talk on how he implemented a combination of DBIx::Class ORM, custom queries and OO database extensions in a talk he gave to PostgreSQL WEST 2008 (note: requires Firefox to view).

MojoMojo, the wiki engine used for the Catalyst Wiki, implements a complex Entity-Relationship schema including hierarchical page nodes. It is well worth looking at the MojoMojo source code to see how it was done.