CyberTech Rambler

May 6, 2008

ODF validation

Filed under: Uncategorized — ctrambler @ 1:27 pm

I blogged before about Alex Brown’s original test showing that Microsoft Office cannot write valid OOXML. I said the test was problematic because we were validating against a completely new standard that developers had no time to implement. I actually expected Rob Weir to jump on the bandwagon and make a “song and dance” about it, the same way PJ did. However, to his credit, he saw that the test was not a good one. The only time he commented was in the blog post I am discussing below, where he had no choice but to touch on the issue as it is relevant to the discussion. As befits someone with development experience, he agreed that the failure to conform to the ISO OOXML standard can be reasonably explained.
True to his word, Alex Brown made good on his promise to perform the same conformance analysis with ODF. The conclusion of the analysis is that the ODF document he created does not satisfy the ODF standard. Unfortunately, several decisions he made along the way potentially distracted him from the original goal. Some were touched on in Rob Weir’s comment on Brown’s work. This led to more discussion about IDs and IDREFs (Brown’s rebuttal of Weir’s comment, with Weir’s answer at the end of the original comment) and the internal workings of the validator they use. It is interesting to watch experts at work here. Who is right? The answer is not as straightforward as it seems.
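The ID/IDREF dispute boils down to a simple referential check that any validator has to perform: every attribute declared as an ID must be unique within the document, and every IDREF must resolve to one of those IDs. Here is a minimal sketch of that check; note that the attribute names `id` and `ref` are illustrative placeholders, not the actual ODF schema declarations.

```python
import xml.etree.ElementTree as ET

def check_id_idref(xml_text, id_attr="id", ref_attr="ref"):
    # Pass 1: collect declared IDs, flagging duplicates.
    # Pass 2: flag references that resolve to no declared ID.
    # Attribute names are illustrative, not real ODF declarations.
    root = ET.fromstring(xml_text)
    seen, errors = set(), []
    for elem in root.iter():
        value = elem.get(id_attr)
        if value is not None:
            if value in seen:
                errors.append("duplicate ID: " + value)
            seen.add(value)
    for elem in root.iter():
        target = elem.get(ref_attr)
        if target is not None and target not in seen:
            errors.append("dangling IDREF: " + target)
    return errors

doc = '<doc><p id="a"/><p id="a"/><link ref="b"/></doc>'
print(check_id_idref(doc))  # ['duplicate ID: a', 'dangling IDREF: b']
```

A real validator does this as part of schema-driven validation rather than as a separate pass, but the constraint being argued over is the same.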
Weir cast Brown’s analysis as validating the ODF and OOXML standard documents themselves (the ODF one downloaded from OASIS), each in its respective format, against the appropriate ISO standard. Was this the aim of Brown’s analysis? I had always read Brown’s OOXML analysis as showing that an article processed by Microsoft Office 2007 does not satisfy the ISO OOXML standard. It appears I was mistaken. Brown bypassed Microsoft Office completely: he took the OOXML standard document and validated it against the new OOXML spec.

In the ODF analysis, Brown’s decision to use the same OOXML document raises a lot of questions. First and foremost, he should have used the ODF standard document to validate against the ODF spec. I will be the first to admit that this itself is not perfect. For one thing, if need be, it is easier to edit a shorter document by hand. 😉 However, since we are testing a standard document, it is your own fault if you supplied a longer one. While I understand why he chose to use the same document (the OOXML standard document), he introduced a lot of complications simply by needing to convert that document from OOXML to doc via Microsoft Office, then to ODF via the word processor he used. As evidence I will present Weir’s comment about the double hash (##) sign in a URL. The second problem is that the test is also a test of the OOXML-to-doc conversion by Microsoft Office and of the doc-to-ODF conversion by that word processor. Moreover, the first is a backward conversion, and even Microsoft does _not_ guarantee backward compatibility from OOXML to the doc format. All this makes the results for ODF less predictable.
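Weir’s double-hash observation is easy to check mechanically: a URL has at most one fragment separator, so two `#` signs usually mean a converter mangled the link target somewhere along the chain. A small sketch, assuming for illustration that links carry a plain `href` attribute (ODF actually uses a namespaced `xlink:href`):

```python
import xml.etree.ElementTree as ET

def find_bad_fragments(xml_text, href_attr="href"):
    # Collect link targets containing more than one '#' sign,
    # a telltale artifact of a lossy format conversion.
    # The attribute name is an assumption for this sketch.
    root = ET.fromstring(xml_text)
    bad = []
    for elem in root.iter():
        href = elem.get(href_attr)
        if href is not None and href.count("#") > 1:
            bad.append(href)
    return bad

sample = '<doc><a href="ch1.xml#sec1"/><a href="ch2.xml##sec2"/></doc>'
print(find_bad_fragments(sample))  # ['ch2.xml##sec2']
```

A check like this would let a third party separate conversion-chain artifacts from genuine conformance failures in the application under test.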

So, is Brown’s analysis worthless? No. From the beginning his purpose was to stimulate discussion and work on standard conformance for ODF and OOXML, and judging by the responses he got to both postings, he succeeded. Discussion of his findings, including Weir’s critique, is precisely the kind of effort we should encourage, because it pushes both camps to work harder at achieving conformance. While it is unfortunate that his ODF work was not performed in the same “laboratory” setting as his work on OOXML, let’s not forget that he always stresses that his results are a “smoke” test. Therefore, he is permitted a lot of leeway.

Did he manage to prove that the application did not write valid ODF? Yes, he did; even Rob Weir acknowledged it. I acknowledge that the standard we use in validation is very, very high, i.e., not a single failure is tolerated. That is the standard, _and_ everyone agrees with it. However, this standard is so high that I do not really expect any application to conform to either standard, ever. What is more interesting are the answers to these two questions: how many flaws do we find, and how many _types_ of flaws are there? I believe this is what we should measure applications by. Unfortunately, this discussion is more difficult because there is no agreement on what an acceptable measure is.
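Counting flaws versus counting types of flaws gives very different pictures: a thousand instances of one harmless error is not the same as ten distinct ones. A hypothetical sketch of such a tally, assuming validator messages of the form “line: error-type: detail” (the message format and the sample errors are assumptions, not any real validator’s output):

```python
from collections import Counter

def tally_flaws(messages):
    # Group validator messages of the (hypothetical) form
    # "line: error-type: detail" by error type, reporting both
    # the raw count and the number of distinct categories.
    types = Counter(msg.split(":")[1].strip() for msg in messages)
    return {"total": sum(types.values()),
            "distinct": len(types),
            "by_type": dict(types)}

report = tally_flaws([
    "12: attribute-not-allowed: style:join-border",
    "98: attribute-not-allowed: style:join-border",
    "210: element-not-allowed: text:soft-page-break",
])
print(report["total"], report["distinct"])  # 3 2
```

A report like this would let us compare applications on the two axes the paragraph proposes, once the camps agree on what counts as an acceptable score.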

As I had expected, Doug Mahugh picked up on the topic. As usual, he took it at face value without performing any analysis of his own. But he did point us to Stocholm’s article about testing ODF conformance back in Fall 2007. While it is alarming to see that the application did not produce valid ODF, we need to note that the methodology is seriously flawed, since there is NO way for an independent third party to cross-validate the results. For a start, we don’t have the original documents. Second, he did not say which version of ODF he validated against; there is always the possibility that he validated against the wrong one. The claim that all OOXML documents in the test passed ECMA OOXML validation is a smokescreen: it is easy to pass when you write the standard yourself and get it rubber-stamped. However, the conclusion in his last line is valid: we want better ODF conformance!

