Tuesday 30 December 2014

XML Prague 2015

I finally got approval from my boss to attend XML Prague 2015 and registered for it the other day. I'm not presenting this time around, just listening and learning, and very much looking forward to it.

A Note

Noting it's been two months since I last wrote anything here, I feel it is time to add the following:

If you hoped for a new version of ProXist (as hinted by a previous blog entry), sorry. It has not happened yet. It will, eventually.

If you expected something else from me, sorry again. It has not happened yet. It might, if I find out what you're on about.

Contact me if you want blame assigned.

Saturday 25 October 2014

ProXist v2

For the last few days, I've been busy updating ProXist, my XProc abstraction layer and app for eXist. There is a new(-ish) XProc package for eXist that promises to support a lot of the capabilities of Calabash, Norm Walsh's XProc engine, so I decided it was time to test it out and do a new ProXist at the same time.

My XML Prague ProXist version supported only my custom document type and stylesheets, and barely those at that. It was meant to be a demo. For the new version, though, I'm thinking of doing a default implementation for DocBook, including some of the more commonly used stylesheets and a couple of standard pipelines, packaged so they can be used with ProXist. Theoretically, it should just be a question of writing a ProX blueprint XML file, plus something that helps me list DocBook resources included using XInclude.

At the same time, I'm finally updating the ProXist documentation. It's written using DocBook, incidentally, and now part of the git repository.

ProXist is not even close to being finished, but at least I seem to have moved on from procrastinating to actually doing something.

Monday 20 October 2014

I Guess I'll Have to Find Another Email App

About a year and a half ago, I bought Airmail to replace Apple's standard Mail app as my primary OS X email client. Not only was it miles above what Apple could offer, it was a bargain at $1.99. It was a no-brainer; I would gladly have paid more, considering its great feature set.

Lately, though, the Airmail updates have done little to fix the bugs I've encountered, from unread post counts not matching the actual numbers and poor threading of mailing list posts to annoyingly slow performance with large inboxes. These annoyances haven't been enough to make me look for another email client just yet, but that's mostly because I'm lazy and keep hoping that an update will eventually fix the problems.

So imagine how pleased I was earlier today, when I noticed that a version 2.0 of Airmail is available from the App Store. Finally!

Except then I spotted the price tag. The upgrade costs $19.99, and yes, that applies to customers both new and old. Although, for a limited time, you can get it at a special introductory price of $9.99.

WTF?

I feel cheated. It's not the money - a year and a half ago, I would have considered $19.99 more than reasonable - it's that they want to make me pay twice for what is essentially an upgrade, the first major one since I bought version 1.0. See, they've reworked the app "from the ground up" and there's now a "faster engine"; that is, they've finally addressed the performance issues and maybe more, but apparently they think they didn't charge their customers enough the first time around.

And what happens the next time they plan a bigger upgrade? What happens when they move to version 3.0? Do they expect me to pay for the damned thing again, a third time?

So, sorry but no; not only will I not be buying the 2.0, I will also uninstall the 1.0 and replace it with an email client developed by someone who isn't planning to rip me off.

Sunday 19 October 2014

OS X Upgrade

I upgraded my MacBook Pro to the new OS X version, Yosemite, yesterday. Some observations:
  • The new system font looks great. I'm really glad they finally realised how crappy the old one was.
  • As expected, the upgrade zapped my rEFInd bootloader. I can no longer dual boot, so no more Ubuntu until I've rerun the rEFInd script. Unfortunately, the word on the web is that the bootloader becomes awfully slow, so I think I'll wait.
  • The new dock is sort of ugly and sort of old-fashioned. Didn't it use to look like this, a couple of versions ago?
  • And speaking of looks, it all feels to me like something that might have happened if Apple had merged with ToysRus. Which isn't the case, as far as I know, so it probably means that the iOS camp at Apple is winning.
  • I'm feeling a bit cheated because the Handoff features require a newer computer than my mid-2010 model. I would have loved to test them out.
I'm pretty sure I'll have more to say in a few days. For now, though, I'll just hit Publish.

Friday 17 October 2014

On Reflection

Having reread my recent post on HTML5, I suppose I'd better stress that it was never meant to be a commentary on HTML5 itself. It was a rant, something that happened because of the attitudes that I think have grown in tandem with HTML5's rising popularity. I really don't know enough about HTML5 to have an informed opinion beyond what I see suggested and discussed about it in relation to markup in general. My comments should be read in that light.

Take the addition of document semantics in HTML5 as a case in point. The article and section tags, for example, are welcome additions, as are, in my humble opinion, the fairly minor redefinitions of the em and strong semantics. And there's some important work being done in the area of extensible semantics for the web (see, for example, Robin Berjon's XML Prague paper, Distributed Extensibility: Finally Done Right?, and the Web Components web page on best practices), which turned out to be a heated topic at Balisage this year because quite a few of its participants are, like me, grumpy old men defending their own turf.

These are steps in the right direction, because they move away from the presentational horror that is the "old" HTML and towards a more semantic web. Semantics is about meaning, and meaning is now being added to the web rather than simply empty but oh-so-cool visuals. I should add that some very cool visuals are being added, too, but in, and please pardon the joke, a meaningful way.

But, and maybe this is just me, it's when those steps are heralded as original and unique, never thought of before or at least never done right, and when history and past (and working) solutions are ignored or misinterpreted because they are part of a standard (XML or even SGML) that is regarded as failed, that I react. Google's Dominic Denicola provided a case in point when he gave a controversial presentation on the subject, Non-Extensible Markup Language, at Balisage; unfortunately, only the abstract seems to be available at their website.

That grumpy old men thing, above, is meant as a joke, of course, but I imagine there to be some truth in it. Part of the HTML5 crowd will certainly see it that way, because they are trying to solve a very practical problem using a very pragmatic standard. HTML5 is, in part, about keeping old things working while adding new features, and it seems to do the job well. Having to listen to some older markup geeks argue about what XML was actually designed to do must seem utterly beside the point.

So, what to do? Well, I think it's largely about education, both for the newer guys to read up on the historical stuff and for the older guys to try to understand why HTML5 is happening the way it is, and then attempting to meet halfway, because I'm pretty sure it would benefit both sides.

Me, I'm in the midst of the reading up phase, and HTML5 - The Missing Manual is an important part of that.

Wednesday 15 October 2014

Reading Up On HTML5 - A Rant

I've been reading up on HTML5 lately, feeling that I should brush up my web development skills before I attempt an upgrade of a personal web page. Every other web page these days seems to be taking its design cues from some source as yet unknown to me, and I figured HTML5 is probably the place to look.

I've come across it before, of course, both in my day-to-day work and at markup conferences. We've been doing various web solutions and mobile apps involving HTML5 for some time now, and this year's Balisage, to pick the most recent markup conference I've attended, started with a one-day symposium on HTML5 which was both interesting and illuminating. There's a lot of it going on, right now.

And the other day, I came across HTML5 - The Missing Manual and have been reading it on and off since. It seems to be quite comprehensive and more or less what I need right now, but it is not a book primarily written for markup geeks. Don't get me wrong; it is well written, as are all of the Missing Manuals I've read, and it seems to do a reasonably good job explaining its subject.

But--and there's always a but, isn't there?--there's stuff in that book that makes me cringe. Or rather, it's not the book itself but what the book represents. See, the book seems to define that moment in time when I am no longer able to selectively ignore the HTML5 crowd.

There are a couple of terminology issues, of which the most annoying to me right now is that empty elements are apparently called void elements these days. This is not the author doing a reinterpretation of anything, this is the W3C working group talking--I just checked--but it is a symptom of what I am talking about here.

Another symptom, and one known to everyone who's done any web-related markup since the inception of the web, is how markup errors are frequently shrugged at. Nesting errors are the most glaring, of course, and the way they are ignored is to say that everybody does them but the browsers can all handle them, so while you may decide to be anal and actually validate your HTML, the message between the lines is that it's OK and you shouldn't worry about it since the browsers can handle it.

Coupled with this dismissive attitude towards draconian error handling (actually, towards any error handling not related to browser-based limitations) is the book's not-so-subtle hint that this is all the fault of the oh-so-obviously failed XHTML attempt from a few years back, when the W3C tried to introduce XML syntax rather than fix HTML.

And there are the usual mentions of missing end tags (but no recognition of the fact that this was actually an SGML feature, once upon a time) and how that's really not such a bad thing after all because modern browsers can handle it, and the closely related ignorance towards empty (sorry, void) element syntax. It's OK, no need to be anal about it, because modern browsers can handle it. It's a question of style.

The HTML DOCTYPE declaration is explained, of course, and its lack of any kind of versioning indicators, SYSTEM or PUBLIC identifiers, etc, declared to be a good thing. Again, the motivations are browser-based and basically about doing it in this way because the browsers can handle it. The DOCTYPE declaration's existence is only important because of legacy reasons.

And the whole concept of well-formedness is explained as a matter of good style but really not much more than that (since yes, the browsers can handle it), with the benefit mainly being the ability to include a couple of attributes in the root element.

And on and on it goes.

I don't mean to say that HTML5 is a bad set of standards--I know far too little about it to have an informed opinion--but much of the attitude I'm seeing in the wonderful world of HTML5, both now in this book and earlier in the various presentations and discussions I've witnessed or been a part of, seems both frivolous and dangerous. It seems to me to be a combination of reinventing the wheel and ignoring the lessons learned by others, and I can't help wondering what damage it will do to other markup technologies.

The Balisage symposium from earlier this year did give me hope--the people present were both smart and reasonable, more than willing to listen and learn from each other, and the discussion that ensued was informative and balanced--but most people do not go to Balisage, or any other markup conference, for that matter. They learn by doing, and some by reading, and quite a few sources now are like HTML5 - The Missing Manual, well written, comprehensive--and potentially dangerous to anyone with no experience of markup technologies outside their immediate needs, because they reinvent, reinterpret and redefine what they need and ignore everything else.

The result, I fear, is a more complex situation than ever in the world of markup, a situation where different sets of standards apply depending on the context and where XML, while sorely needed by some parts of the community, is ignored by others.

End of rant. Thanks for reading.

Thursday 9 October 2014

I Should Probably...

Following this year's Balisage conference, I should probably do a rewrite and update of the whitepaper I presented. It's on my list of things to do.

On the other hand, I should do an eXist implementation of the version handling system I suggested in that paper. It's also on my list of things to do.

But then again, I still have to finish my ProXist implementation, the one I presented at XML Prague. It is (you guessed it) on my list.

I have lots of good (well, I think so) ideas first presented at XML conferences, many of which deserve (well, I think so) to live beyond them. After all, a lot of work goes into the papers and the presentations, so shouldn't I follow up on them more?

My version handling system, for example, should be easy enough to do in eXist. It doesn't require a revolution, it doesn't require me to learn how to code in Java, it should be enough to spend a couple of nights writing XQueries and XSLT to produce a usable first version.

ProXist is both simpler and more difficult to finish, but on the other hand, the basic application is already out there and works. It's a question of rewriting it to be easier for others to test, which basically means redoing it as a standard eXist app.

Yet, instead of doing something about either of them, here I am, writing this blog post. It's conference procrastination taken to yet another level.

And the next XML Prague deadline is coming up fast.

Wednesday 20 August 2014

Another Take on English vs Swedish

Some years ago, I visited the library to get a C++ book or two to read while waiting for the one I had ordered from Amazon. The only C++ book I found, however, was the Swedish translation of Stephen Prata's C++ Primer. The problem was that I never read computer literature in Swedish (I don't even know the Swedish word for "encapsulation"), so procrastination ensued. I'd heard some good things about the book, though, so in the end I had to try, Swedish or not.

But the book felt weird, as if Prata wasn't really serious about programming. It was as if he had decided to make up the terminology as he went along, and as if I had two foreign languages (computer-Swedish and C++) to master, instead of just one.

And that, I guess, was the real problem. The whole computerised world speaks English, which means that if we need to talk about computing, we need to do it in English or risk not being understood. In Sweden, we'll invariably borrow from English when discussing anything having to do with computers and IT, even if the main conversation is in Swedish. Sometimes the words are semi-translated and (sort of) accepted, but often we just use the English terminology directly, without even stopping to think about what the corresponding Swedish word might be. Yet nobody cares, because we all seem to agree that the Swedish words are, or at least sound, phoney.

It is a problem, however, one of the curses of modern western civilisation, shared by every nation with a native language other than English. Precious few see it that way; most accept the situation without a second thought, with devastating consequences for their native tongues.

Icelanders have a different approach. The authorities will produce official Icelandic translations for every single new word that pops up in a book, television show or United Nations address. Every single one, and not just the English ones either. Everything: German, Spanish, Swahili, it doesn't matter what the language is. See, the Icelandic language police regards every foreign influence as potentially harmful and acts accordingly. For example, they never use "television" or "TV", they use "sjónvarp" (roughly "vision-casting"). "Telephone" is "sími" (original meaning: "thread"), and "computer" is "tölva".

Cool, isn't it?

I'm not entirely sure their way is better. The Icelandic market is far too small for every new book or film or United Nations address to be translated, and certainly not immediately, which means that new and important terminology is not translated immediately either. Which in turn means that the foreign-language terminology is needed after all if you want to keep up, at least for a transitional period. So, by the time the authorities come up with an Icelandic translation, the English terminology might already be in use and hard to get rid of.

There are not that many Icelanders. How many of them can keep up with the advances of western computing?

As for me, I now know how to translate "encapsulation" ("datagömning", which, by the way, is not quite as Swedish a word as it might look to you, dear reader, but that's another story). It's not a word I'll use any time soon. It sounds downright silly.

Saturday 26 July 2014

Further Unity Comments

After a couple of days with Ubuntu and the Unity desktop, I have another couple of comments.

First of all, I don't like the global menu bar. I really don't like it. It takes up space for no good reason - the clock and the rest don't require it, not really - and if you turn off global menus, which I did, it is mostly empty.

Second, I hate the "Amazon Lens". It's not that I don't shop on Amazon - I do, all the time - but I want to choose when to interact with Amazon (or other commercial providers) without my desktop interfering. So I've turned that off, too.

I also don't like how newly opened windows are placed on the desktop. It's probably configurable using some of the tweak tools that are available, but I can't be bothered to look it up.

This morning, I switched back to KDE and the more traditional desktop metaphor, and immediately realised that it's rather boring, too. It's nice, with all kinds of extras and eye candy and stuff, but it's boring and it should be possible for someone to come up with a more modern Linux desktop.

It's just that Unity isn't the answer.

Monday 14 July 2014

New Distro, Part Two

After a couple of days of Kubuntu, my curiosity got the upper hand and I decided to install the Unity desktop alongside KDE.

It's an interesting GUI, I have to admit, but I remain unconvinced. The search-oriented task bar thingy to the left is an odd bird, for example. It is as if Canonical were mixing their desktop metaphors. There must be a task bar because everyone's got one, but it seems as if they've gone out of their way to ensure it is different. I'm not sure if it's a good thing or a bad thing yet.

Worse is that the global menus do not work--as many others have pointed out, on a large screen the menus will simply be too far away from the programme window itself. I know OS X does this too, but the difference, the crucial difference, is that the menus and their behaviour are consistent on a Mac, something they can never be on Linux.

Version 14.04 allows you to move the global menus to their respective windows, which solves the problem but also highlights a less serious one: the top bar, now mostly empty save for a few icons to the right, still takes up space but provides no real benefit.

On the whole, though, the GUI looks nice, with better graphics than I remember from past Ubuntu versions. It looks like a finished product, something that, say, Debian Testing doesn't--the XFCE desktop I briefly tried when deciding on a new distro looks ghastly. I know it's not supposed to have the bells and whistles of a Plasma desktop, Windows 7 or even Gnome, but my god, the damned thing put me off to an extent I didn't think possible.

Submitted My Final Balisage Edit

I submitted the final edit of my Balisage paper, Multilevel Versioning for XML Documents, the other day. While I did try to shorten it (I seem to be unable to produce a short paper) and, of course, correct problems and mistakes pointed out by reviewers, there were no radical changes, and so I am forced to draw one of two possible conclusions:

I am deluded and simply don't know what I'm talking about. This is an awful feeling and happens to me a lot after submitting papers.

The paper suggests something that might actually work.

(There is a third conclusion, obviously, one that is a mix of the two, but let's not go there.)

My paper is about a simple versioning scheme for the eXist XML database, built on top of the versioning extension that ships with it. Its main purpose is to add granularity to versioning: to give an author of XML documents a way to mark significant new versions, as opposed to the long series of saves, each of which produces a new eXist version.

On the surface of it, my scheme is a typical multilevel versioning system, with integers, decimals, centecimals, etc (1, 1.1, 1.1.1, 1.1.2, 1.1.3, 1.2, ...) identifying a level of granularity. The idea is that the lowest level (centecimal, in this case) denotes actual edits while the levels above identify significant new versions. Nothing new or fancy, in other words. What is new (to me, at least; I have not seen it suggested elsewhere) is how the scheme is handled in eXist.

I'm proposing that each level is handled in its own collection, with eXist's versioning extension keeping track of new versions in each of them. When a level change occurs (for example, if a new centecimal version such as 1.3.1 is created from 1.3), the new version is created using a simple copy operation from the decimal collection to the centecimal collection. The operation itself (in this case, a check-out from a decimal version to a centecimal version) is tracked in an XML file that logs each such operation and maps the eXist "standard" version to the new integer, decimal or centecimal revision.
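
To make the mechanics a bit more concrete, here is a minimal sketch of what such a check-out could look like in eXist XQuery. This is not code from the paper: the collection paths, the function name and the version-map layout are all assumptions of mine; xmldb:copy and the versioning extension's v:revisions are the only real eXist functions used (and the module path for the latter may differ per installation).

    xquery version "1.0";
    (: Hypothetical sketch: create centecimal version 1.3.1 from decimal 1.3.
       Collection paths and the version-map layout are made up. :)
    import module namespace v = "http://exist-db.org/versioning"
        at "xmldb:exist:///db/system/versioning/versioning.xqm";

    declare function local:check-out($resource as xs:string,
                                     $from as xs:string,
                                     $to as xs:string,
                                     $new-version as xs:string) {
        (: the check-out itself is a simple copy between collections; the
           versioning trigger on the target collection takes over from here :)
        xmldb:copy($from, $to, $resource),
        (: log the operation, mapping the latest eXist revision of the source
           to the new centecimal version identifier :)
        let $rev := max(v:revisions(doc(concat($from, "/", $resource))))
        return
            update insert
                <check-out resource="{$resource}" from="{$from}" to="{$to}"
                           version="{$new-version}" exist-revision="{$rev}"
                           when="{current-dateTime()}"/>
            into doc("/db/prox/version-map.xml")/version-map
    };

    local:check-out("mydoc.xml", "/db/docs/decimal", "/db/docs/centecimal", "1.3.1")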

The XML file's other big purpose is to provide the resources with a naming abstraction, mapping the name of a resource to its address, so that a named resource in a specific significant version can be located on the system. I propose using URNs, but most naming conventions should work just as well.
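
Purely as an illustration, a version map along those lines might look like this (the element and attribute names are mine, not the paper's):

    <version-map>
      <resource urn="urn:example:mydoc">
        <version id="1.3" collection="/db/docs/decimal"
                 resource="mydoc.xml" exist-revision="42"/>
        <version id="1.3.1" collection="/db/docs/centecimal"
                 resource="mydoc.xml" exist-revision="43"/>
      </resource>
    </version-map>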

Implementation-wise, the XML version map abstraction is very attractive to me as a non-programmer (or rather, someone whose toolkit of programming languages is mostly restricted to those commonly associated with XML technologies), as I believe most of the operations can be implemented in XSLT and XQuery.

But I'm not there yet. I've submitted the final paper and now, I have to produce a sufficiently convincing presentation on the subject.

The presentation is on Tuesday, August 5th, and I'd love to see you there.

Sunday 6 July 2014

Time for a New Distro

Recently, I upgraded my work laptop with an SSD. The laptop, a Lenovo Thinkpad T510, has been pretty reliable but is getting a bit long in the tooth. A conventional 2.5" disk three years old is a cause for concern if used daily, and anyway, SSDs are amazingly fast these days. It's almost like buying a new computer.


I should also mention the Nvidia Optimus graphics card. It's basically two cards in one, an Intel graphics chip for the daily stuff that doesn't require much graphics processing and an Nvidia chip for the stuff that does, the idea being that the OS switches between the two to either save battery or boost performance.

So, anyway, while I simply cloned the Windows partitions from the old disk (using Acronis software), I eventually decided to try a new Linux distro rather than fixing the cloned Debian Sid I've been running since 2010 or so. The Debian system was spread out over several partitions, which caused problems when booting the cloned system--apparently UUIDs changed when cloning, crashing the system.

I wanted something Debian-based, of course. Apt-get rules and all that, and Debian is pretty hard to break even if you run an unstable version of it.

First, I tried the new Linux Mint Cinnamon distro (v 17), having heard some very good things about it. The installation went without a hitch and I was soon able to boot into the desktop (what a pretty one, btw) using the open-source Nouveau display drivers. They were OK but not great, so I decided to replace them with Nvidia's proprietary package and something called nvidia-prime that would allow me to switch between the two graphics chips. This seemed to work well, until I came to work the next morning, placed the laptop into a dock and booted using an external monitor only.

No desktop appeared, just a black screen.

Opening the laptop's lid, I discovered that the desktop was actually there, after all, but only on the laptop screen. Nvidia Settings, the helper software that allows you to configure the X server and screens, was broken and so I couldn't use it to configure the monitors. The Cinnamon display settings would only share the desktop between the two screens but not allow me to only use the external monitor.

Changing from the Nvidia chip to the Intel one did not significantly change this, but introduced a new problem: I no longer had the option to change back to Nvidia.

I looked around to see if there were newer Nvidia packages around, or perhaps a newer kernel, since that's what I would always do in Debian Sid; there would frequently be something in the experimental branch that would help me. Linux Mint, however, while Debian-based, is far from Debian Sid. It is meant to be stable, and anything, um, unstable would have to come from somewhere else entirely.

I found a 3.15 kernel from an Ubuntu branch and installed it, but Linux Mint would then insist that a 3.13 kernel was actually an upgrade, so I gave up and realised Linux Mint wasn't for me after all.

I then spent the following evening (and night) installing and testing Ubuntu 14.04 in place of Linux Mint, as a Google search suggested nvidia-prime would work out of the box in it. It did, but after a few hours of fooling around with Ubuntu, I realised I truly hated Ubuntu's Unity desktop.

Discouraged, I wiped Ubuntu from the disk in favour of Debian's Testing branch, but that didn't go well. I downloaded an ISO, remembering that Debian's installer would not support WiFi cards during the install, only to discover that a) they had switched from Gnome to XFCE as their default desktop and, more importantly, b) my WiFi card's firmware was still considered non-free according to Debian's rather strict criteria, so it was not on the ISO, and I had no wired network hooked up to that laptop.

I could have used the Windows partition or my Macbook Pro to download the missing firmware, of course, but I got annoyed and wiped the disk again, now installing the new Kubuntu 14.04 instead.

Which is where I am now. Kubuntu also handles nvidia-prime out of the box, but it also has the (for me) familiar KDE desktop. It's not perfect (the system fonts, for example, are ghastly and I have to do something about that very soon) but it's good enough for now.

Now, you may be tempted to point out that Nvidia Optimus works out of the box there, too, and with more finesse, but if so, you are missing the point.

Linux is fun, and the very fact that there are so many distros out there speaks in its favour. If something in Windows doesn't work for you, you won't have a Windows alternative. Well, you have Windows 8, but seriously?

Tuesday 22 April 2014

ProXist Documentation, Etc

My XProc abstraction thingy for eXist, ProXist, is not the most well-documented open source project there is, but at least there is now something to read. It's a little something in DocBook, just a first draft and terribly incomplete, but something I'm hoping to make more complete, given enough time.

I also feel it's time to package ProXist as an eXist app rather than a set of misplaced collections.

Paper Woes

I managed to submit my Balisage paper on time, in case you wondered.

Also, I still think my basic idea is a good one. It's simple and, I think, useful. So simple, in fact, that I'm worried that everybody but me thinks it's perfectly obvious.

::sigh::

Tuesday 15 April 2014

Just A Few More Days

The Balisage paper deadline is approaching fast. Deadline is on Good Friday, which is three days from now and close enough to keep me busy until some decidedly ungodly hours come Friday night. Basically I'm interpreting "April 18" as "early morning, April 19, GMT".

My paper is, so far, a study in procrastination. There is an angle bracket or two, and possibly even a semi-intelligent thought, but most of all my long days so far remind me more of my uni days and approaching exams, when any excuse from sorting spoons (I had two) to counting dust bunnies was enough to keep me away from the books, than a serious commitment to markup theory and practice.

It is coming along, though.

Monday 14 April 2014

XPath 3.0

Just when I was finally starting to get used to XPath 2.0.

And there's more. XQuery 3.0 is out, too. They've been busy.

Tuesday 11 March 2014

Linux Ready for the Desktop and All That

My recent XML Prague presentation ran from a Linux partition, the first time in a while I've used Linux for presenting anything. The reasoning was simple; I'd developed the accompanying demo on Linux, on a server on localhost, so it would be much easier to just write a presentation in Open Office than to move the demo to something else.

It wasn't.

I'd fixed every bug in the demo, styled my web pages in an aesthetically pleasing manner (well, for me), and carefully prepared an XML Prague presentation project in oXygen with only the files I would need to show, making sure that they'd fit without scrolling when projected in a lower resolution. I'd bookmarked the important code, and I'd folded everything else. My demo was in great shape.

What I didn't do beforehand (even though I actually meant to) was to test my Linux laptop in dual screen mode, mirroring the laptop screen to an external monitor using that lower projector resolution. That, of course, was what failed.

My talk was immediately after a coffee break so I figured I'd hook up my laptop immediately after the last talk before the break and test all this. How hard could it be?

Well, no mirroring in that lower resolution. Mirroring in a higher one (the laptop's native resolution) was possible but of course, the projector wouldn't work in that resolution. They usually don't. Dual screen mode, outputting two different screens, didn't work because I wouldn't be able to see on my laptop's screen what was being projected for the audience. I tested pretty much every setting there was but to no avail.

And then the (Gnome) window manager decided it couldn't take the abuse any longer and crashed.

I rebooted into KDE, hoping it would fare better, but all I got for my troubles was another crash. Not the same software, mind, but something or other in KDE. I hadn't really tried anything very dramatic, I'd simply changed the display modes a few times.

So I rebooted again and accepted my fate, booting into Gnome and using the dual screen mode where I'd be flying blind unless twisting my head all the way back like that poor girl in The Exorcist, trying to run the demo from the laptop's touchpad in front of me while hurting my neck to see the results on the large screen behind and above me.

If you've watched the conference video (second day, about 7 or 8 hours into the file), you now know why.

My laptop is not particularly fancy or modern. It's a three-year-old Thinkpad with an Nvidia Optimus graphics card, the kind that combines what was then a high-end Nvidia chip with a low-end Intel one, the idea being that you use the former for the graphics-intensive stuff while reserving the latter for the 2D desktop stuff. Optimus still doesn't work properly in Linux, so I run the laptop in Nvidia-only mode. It's not something I blame the Linux developers for--Optimus is proprietary and thus not easily handled in open source--but it is what it is, and the card is quite common.

But other than that, there is nothing very special about my laptop. It just works, mostly. Well, it should.

So is Linux ready for the desktop yet?

Tuesday 25 February 2014

On the Importance of Free T-shirts

My friends at SyncRO Soft, the makers of the oXygen XML Editor, frequently hand out oXygen t-shirts to their users at XML conferences, and the recently concluded XML Prague was no exception. I'm wearing my new one as I write this. It's a very nice shirt, the quality is good, and while the oXygen brand is clearly visible, it is unobtrusive and has a non-commercial feel to it.

I also plan on decorating my laptop with some of the stickers they gave away, and I'm always backing up my data on their USB sticks.

Free t-shirts are a difficult art to master. There's the issue of quality, obviously, and we don't really need to go there, do we? Then there's the issue of the logo, the basic message, and the key here is unobtrusiveness. Yes, it will have to be visible, but that's about it. I'm not a commercial message, I'm a pro, and want to be regarded as one. Your logo is fine but keep it simple, please.

But most importantly, I need to like the brand.

Here, the oXygen people have a unique advantage. Their product is used almost universally in my chosen field. It's used by my peers, pretty much every single one of them, and it's used by our customers. The fact that it is a terrific product and we all rely on it helps, but that's not really why. A lot of people use Windows every day but would never dream of wearing their shirts.

I think the real reason why I like the product is that it always feels as if oXygen is updated based on our feedback and what we need. Part of the reason is that they are as much XML geeks as we are; they are a knowledgeable bunch. It really helps if you know your market, basically.

They also sponsor markup conferences which, to me, makes a lot of business sense. Our field is highly specialised and rather small, so these gatherings are hugely important. I do think that without them, eventually development would stop or be taken over by the really commercial entities.

But even more importantly, they participate. They show up at conferences and they present papers. They share what they know and are glad to listen to us share what we know. They are the most non-commercial commercial entity I know of, because while I certainly know that they are selling a product, I don't really care. See, it's not really us and them, it's just us, license fees or no license fees.

When you are in a position like that, a t-shirt is an opportunity not to be missed. For all of us.

There's another company that I like, MarkLogic, that sells what's probably the most expensive XML database in the world. Lots of people I like and respect work there, which is why I noticed them in the first place. They also participate, they sponsor, and they frequently help develop the standards we all rely on. I have yet to try their product more extensively, but there are free developer licenses for non-commercial purposes and so I will, at some point.

They also hand out t-shirts, and even though I have yet to test MarkLogic Server more extensively, I wear them. And at one point they handed out this awesome USB stick slash bottle opener, a true collector's item.

The key here, though, is participation, not the nice give-aways. They show up and they share, and so I'm much more inclined to share back.

My New Jag

I've switched Jags. It's a MY2004 XJ-8 with a 3.5 V8, this time around, and I'm really pleased.

Sunday 23 February 2014

This Year's XML Prague...

...was fabulous. It always is, don't get me wrong, but this one was the best yet. It's all on video at the conference website, which, all things considered, is a pretty decent substitute if you were otherwise engaged, but Prague this time of year is the XML capital of Europe and the place to be.

For one thing, I think I finally actually understand some of the streaming part of the up-and-coming XSLT 3.0 spec, thanks to Abel Braaksma and Michael Kay, who both presented papers on the subject.

John Lumley presented a paper on lessons learned when finalising a standard library for XSLT/XPath extensions to manipulate binary data, a brilliant talk.

George Bina showed oXygen on mobile devices with the crowd cheering his every swipe of the iPad screen, in what was probably one of the most memorable demos ever at XML Prague.

And there was me, lastly (literally; I was the last scheduled speaker, right before a concluding interactive talk led by Robin Berjon), showing my ProXist demo. It all went surprisingly well, except for a slight problem with Linux and Gnome.

You should have been there.

Thursday 30 January 2014

Trust

With the advent of digital projection comes the age of explicit distrust.

In the olden days, when 35mm projection was the norm and 70mm what you hoped for, the film distributors would usually send the prints to the cinemas well in advance, mostly because the boxes were heavy and the distributors had little control over the actual physical distribution, but also because they imagined it would take the projectionist some time to assemble a 35mm print for a show.

Tipping the balance in the opposite direction was the fact that they also feared that a new film might be illegally scanned at a cinema. Never mind the fact that it wasn't easy to set up a reasonable scanning facility at a cinema without the managers noticing, nor to do the actual deed, as it would invariably involve actually projecting the images for however long the film ran.

It was not common; most new feature films that ended up on Pirate Bay, or whatever the VHS precursor was called, were, in fact, pirated by employees of the production company or by a very early link in the distribution chain, long before the films reached the cinemas.

Today, when films are digital and delivered to cinemas in hot-swappable hard disk drives, the industry uses every means at their disposal to stop the illegal copying.

Unfortunately, as evidenced by the many bittorrent sites on the internet, they mostly focus on the wrong targets.

Modern digital films are frequently encrypted, meaning that they require a digital key to unlock them for a show. This key is matched to not only a specific theatre but also a specific digital projector and server at that theatre. The key is dated, with a start date and time, and an end date and time, so that it will only be valid for the screenings and, usually, for a limited time before and after them.

Now, many producers and distributors will send unencrypted films to film festivals such as the Göteborg Film Festival, their reasoning being that digital keys are not always reliable and can cause problems, but also because sometimes the venue needs to be changed at a short notice. Others send their films encrypted but with liberally timed keys, lasting for days, weeks or even months, to help minimise problems.

Which we appreciate. None of us is against the companies protecting their property; while digital keys are a hassle, the actual license files are small and can easily be distributed through email, FTP, etc.

But not every film distributor or production company will understand or even care about what we think or what problems they help create.

Last night, I screened the Spanish film Living Is Easy with Eyes Closed, a charming story about what happened when John Lennon came to Almeria, Spain, to act in a film. About halfway through, I noticed that the server's screen warned that the film's license key had expired. It stated that the show contains "one or more unlicensed clips".

Someone, somewhere, had decided that since it was the only screening at my cinema, the Draken, there was no need to wait until the show was over, which was around 1 AM. As if, after a 17-hour workday, I would stick around to scan the film, never mind the fact that the server manufacturer, Dolby, has other arrangements in place to prevent me from doing so.

Yet, the film is already available on the internet (I checked).

Now, if someone had fainted in the auditorium during the film (which happened two nights ago, by the way) and I had been forced to pause the film, I would not have been able to continue the show. The key will allow us to finish an ongoing show but not to interrupt it and finish it later.

I'm sure the director, who was present at the screening for a midnight Q&A, would have been thrilled to explain why the audience didn't get to see the conclusion of his film.

Last year, we screened Spring Breakers, one of Hollywood's many attempts at making money from showing bikini-clad teens, in this case Selena Gomez (in spite of the fact that I hear it's easier to watch moving images of scantily clad females on the internet). The license key was timed to run from fifteen minutes before the screening started to one hour and forty-five minutes after, making it impossible for me to check the image format in advance or note a curtain call cue, not to mention the fact that had something happened during the screening and had we been forced to stop it, we would again have risked not being able to finish the show.

The production company didn't stop there, however. They also distributed night vision binoculars to the ushers, with strict orders to monitor the audience during the show. They also wanted to place a guard in the projection booth for the duration of the screening, but I refused, informing the festival that said guard would also have to run the show since I wouldn't be there.


We do our utmost to run beautiful shows. We take pains to ensure that the screening is as good as we can possibly make it - running a few minutes of every film to see that it's OK, that the key works, that the image aspect ratio is the correct one - so that the audience gets value for their money; tickets are quite expensive these days and most of us regard the audience as our true employers. But the production companies and film distributors aren't helping.

I think I speak for most of us when saying that we don't do what we do for them. Had it only been them I doubt I would have bothered at all, to be honest.

Trust is earned; it's not given freely and they haven't done anything to earn mine.

Sunday 26 January 2014

Finally, A Book About eXist

O'Reilly just released an early draft of Erik Siegel and Adam Retter's eXist book, aptly named "eXist". The timing is perfect, as I have not yet finished my XML Prague demo that, as you might know if you've read my earlier blog posts, features eXist.

Guess what I'll be reading the next few days?

This Year's Göteborg Film Festival Logo...

...is pretentious and boring.

I mean, really? What were you thinking? That, after a dozen shows, anyone would still even pretend to like it?

Please, filmmakers: Don't quit your day jobs. If you have them, that is.

Wednesday 22 January 2014

T-2 Days and Counting

The annual Göteborg Film Festival is almost upon us, with only two days left when I write this. 11 days of film and, for me, 11 days of clicking Play because it's all digital now, with the exception of one (1) film.

I've written about the advent of digital film before, but also about the death of a profession, and while I briefly considered another stab at these two subjects, I quickly came to my senses; I feel that I've said pretty much everything I have to say on the subject. This year's festival is merely a confirmation of those two blog posts, and there is little reason to reiterate any of it.

So I'll write about the death of a cinema instead. More specifically, my cinema, the Draken: until last summer, the last surviving Cinerama theatre with its original appearance intact, mostly unchanged by both the ravages of time and pitiful small-screen multiplexes. It survived them both, although Svensk Filmindustri, Sweden's only cinema owner of note, did have plans to convert the theatre into a double-screen abomination in the early seventies.

What it didn't survive, in the end, was the long-planned "renovation" by its owner, Folkets hus, a.k.a. Sweden's working class movement. The original 1950s chandeliers were thrown away and holes were cut into the marble walls to lead the way to toilets forced into the space under the auditorium. Light riggings were carelessly hung up in the auditorium itself, and computer-controlled fluorescent lights with only nominal dimming capabilities were allowed to replace the old auditorium lighting.

And in the large upper foyer, the maritime-themed painting that used to be the pride of the cinema has been replaced by a motorised conference screen. My cinema has been reduced to a pathetic two-screen cinema. Or, rather, a conference hall. Well done, Folkets hus.

Those closest to you are the ones that can hurt you the most.

Oh, and...

...most of the ProX stuff is available on GitHub. Not the eXist web pages yet, but that's because I'm still experimenting with them and there's some work left. There's the Balisage demo, there's the basic ProXist stuff, with pipelines and XQueries and such, and there's the authoring environment (with a Relax NG schema, FO, etc), but no instructions on how to get any of it to run, yet.

I have a test app running locally, a little something that is about as simple as I can make it, but since I am not a web developer (I'm a markup geek), the HTML is awkward, the CSS nonexistent apart from the default eXist stuff, and the XQueries somewhat painful. I do think it's going to be pretty cool, though, and I look forward to presenting it at XML Prague.


ProXist and My XML Prague Paper

I recently submitted the final version of my XML Prague whitepaper about my eXist implementation of ProX, called ProXist (with apologies for the tacky name). While I'm generally pleased with the paper, the actual demo implementation I am going to present at the conference is not quite finished yet and I wish I had another week to fill in the missing parts.

Most of the ProXist stuff works but there are still some dots to connect. For example, something that currently occupies the philosophical part of my brain has to do with how to run the ProX wrapper process, the one that configures the child process that actually does stuff to the input. ProX, so far, has been very much about automation and about things happening behind the scenes, and so I have aimed for as few end user steps as possible.

My Balisage ProX demo was a simple wrapper pipeline that did what it did in one go. Everything was fitted inside that pipeline: selecting the input, configuring the process that is to be applied to the input in an XForm, postprocessing the configured process and converting it to a script that will run the child process, running the child process, saving the results. Everything.

But the other day, while working on the eXist version and toying with its web application development IDE, it dawned on me that there doesn't have to be a single unified wrapper process. If its components are presented on a web page and every one of them includes logic to check if the information from a required step is available or not (for example, a simple check to confirm that an input has been chosen before the process can be configured), they don't have to be explicitly connected.

The web page that presents the components (mainly, selecting the input and configuring the process to be applied to it) then becomes an implicit wrapper. The user reads the page, and the presentation order and the input checks are enough. There is no longer a need for a unified wrapper process.
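
To illustrate, here's a minimal sketch of such a component check in eXist XQuery, assuming the selected input is kept in a session attribute. The "prox.input" attribute and the configure.xq page are names of my own invention, not ProXist's; session:get-attribute is a real eXist function.

    xquery version "1.0";
    (: Hypothetical component on the implicit wrapper page: only offer the
       configuration step once an input has actually been selected. :)
    declare namespace session = "http://exist-db.org/xquery/session";

    let $input := session:get-attribute("prox.input")
    return
        if (empty($input)) then
            <p>Please select an input document first.</p>
        else
            <p><a href="configure.xq?input={$input}">Configure the process
            for {$input}</a></p>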

Now, you may think this is obvious, and I have to admit that it now seems obvious to me, too. But I sometimes find that I can't move from one mindset (for example, that automation bit I mentioned above) to another (such as the situation at hand, the actual environment I implement things in) as easily as I would like. Whether this is because I'm getting older or it's simply who I am, I don't know. In this particular case, I was so convinced that the unified wrapper was the way to go that it got in the way of a better solution.

At least I think it's a better solution. If it isn't, hopefully I can change my mind again and in time.

See you at XML Prague.

Monday 6 January 2014

ProXist

I've been working on an eXist-based implementation of my XProc abstraction layer, ProX, hoping to have something that runs before XML Prague, next month. It turns out that the paper I submitted on the subject has been accepted, so now I guess I just have to.

The ProX implementation should not be terribly complicated to finish, but until recently it risked being rather hackish (is that a word?) because the XMLCalabash eXist module written by Jim Fuller was rather primitive: it only supported pointing out the pipeline to run, plus a single, hard-coded output port. I foresaw a more or less complete rewrite of the ProX wrapper in XQuery.

Luckily, Jim very graciously agreed to rewrite his module into something more immediately usable. I received the first updated module to test in December and the most recent update just a few days ago. He also found a bug in Calabash's URI handling and sent a first fix to me along with the updated module. There are still things to do but for me, Christmas came really early this year.

Oh, and I'm calling the implementation, and the paper, ProXist. Sorry about that.

Happy New Year!

Title says it all. Happy new year, everybody!