CyberTech Rambler

May 26, 2006

Microsoft Open XML Schema and Open Document Format war of words so far

Filed under: Uncategorized — ctrambler @ 1:09 pm

Well, before two parties throw punches or bullets at each other in a fight or a real war, they usually start out fighting with words.

This is precisely what we are seeing with Microsoft Office XML (Open XML Schema) and the Open Document Format (ODF). The real war (over market share) cannot start yet because one party is not ready: Open XML Schema will only become available at the end of this year with the next generation of Microsoft Office (consumers will have to wait until next year, as that release is for corporate customers only). The war of words started with Massachusetts's announcement that it would use ODF instead of Open XML Schema.

Let's cast aside the argument over which is the "truly" free format, and the accessibility issues; they have been covered widely, including on this blog. Let's instead focus on technical merit. To a certain extent this is a beauty contest. In my mind, it started when Microsoft made Open XML Schema available for analysis by officially handing it over to ECMA. That is when both formats became available for comparison.

The first real salvo was fired by the ODF front when it published an article demonstrating that raw ODF XML is more readable to humans. This covers not only the document text but also the other elements that make up the document: pictures, hyperlinks, etc. The ODF element names ("text:p") are also more approachable than Open XML Schema's ("w:t"). The same article argued that ODF's reuse of existing standards is better than Microsoft's recreation of the same things. While this is all true, we must take it with a pinch of salt, because the majority of users will never eyeball raw ODF XML; they will use software that interprets ODF, such as OpenOffice.org and KOffice. Hence the value of human readability should not be overstated: XML is by and large read by machines, which really do not mind how it is presented. Human readability matters to implementers of ODF-enabled software, but that is probably where the advantage stops.
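
For what it is worth, the implementer-facing readability difference is easy to see in code. Below is a minimal sketch using made-up namespace URIs and element structures that only mimic the "text:p" versus "w:t" naming styles discussed above; the fragments are illustrative, not spec-accurate ODF or Open XML:

```python
import xml.etree.ElementTree as ET

# Illustrative fragments only -- NOT taken from the real ODF or Office
# Open XML specifications. They mimic the contrasting naming styles:
# ODF's spelled-out "text:p" versus the terser "w:t".
odf_like = (
    '<office:text xmlns:office="urn:example:office" xmlns:text="urn:example:text">'
    '<text:p>Hello, world</text:p>'
    '</office:text>'
)
msxml_like = (
    '<w:body xmlns:w="urn:example:w">'
    '<w:p><w:r><w:t>Hello, world</w:t></w:r></w:p>'
    '</w:body>'
)

def paragraph_texts(doc, tag):
    """Collect the text of every element whose namespaced tag matches."""
    return [el.text for el in ET.fromstring(doc).iter(tag)]

print(paragraph_texts(odf_like, "{urn:example:text}p"))   # -> ['Hello, world']
print(paragraph_texts(msxml_like, "{urn:example:w}t"))    # -> ['Hello, world']
```

A machine extracts the text either way with equal ease; the difference is only how obvious the element names are to the human writing the query.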

So now we are witnessing the next salvo: Microsoft's Yates says ODF is slow, while IBM's Bob Sutor says Open XML Schema is bloated. Now we are starting to see things that users care about: speed and the ability to exchange documents. Both offer evidence to support their case. Sutor points to the size of the ODF specification (700-odd pages) compared to Open XML Schema's (4000 pages). He argues that Open XML Schema is too detailed and therefore bloated, making full interoperability difficult to achieve.

Yates's argument that ODF is slow does not stack up as well. One critical weakness is his inference from George Ou's study, which compares Office 2003 with OpenOffice.org. One immediately notices a confounding factor in Yates's comparison: the software itself. A further problem is that Office 2003 uses a predecessor of Open XML Schema, not Open XML Schema itself; a lot of water has passed under the bridge between the two.

Yates's comments have always left me with the impression that he is more persuasive with business arguments than with technical ones. Sutor, on the other hand, makes technical arguments better. Sutor is also stronger at mixing the two, while Yates struggles to do so.

Let's wait and see what the third salvo is…

May 23, 2006

Microsoft considers yanking admin rights from employees

Filed under: Uncategorized — ctrambler @ 6:02 pm

ZDNet Australia is carrying a story on Microsoft considering taking away admin rights from employees. The headline should instead be "Can you believe Microsoft employees have 'admin rights' to their computers?"

If even Microsoft finds it necessary to give employees admin rights, what hope do we, the public, have when it comes to deploying the IT security rule of granting the least privilege necessary to carry out a job?

The article says that it is a case of Microsoft “eating its own dog food”. I would say this is actually a case of “Do as I preach, not as I do”.

Shocking, shocking revelation… Worthy of contempt!

May 19, 2006

Who is more reliable or dependable: Microsoft or open source?

Filed under: Uncategorized — ctrambler @ 7:14 pm

In a new BBC documentary, "The Code Breaker", Microsoft commented that open source is not reliable or dependable. Hmm… let's put both in a beauty contest and see who wins.

Well, as people say, beauty is in the eye of the beholder, and this is certainly a poster child for that. When it comes to technical merit, open source is more reliable and dependable today. Microsoft is playing catch-up, and I do not like playing the prediction game. However, when it comes to the "business case", the statement might be correct.

When we talk about business, we are playing the "blame game". If the insurance company refuses to pay out when an IT failure costs your company a million dollars, the blame will fall on the head of IT. This is a game only a suicidal head of IT will play. So proprietary software is definitely more dependable here, if only to save one's own skin.

OK, I know, this is not what Microsoft meant when they said "dependable".

So, back to the "business case". While open source has so far consistently won on the technical definition of reliability and dependability, the business case puts humans into the equation: more precisely, the IT skill of the corporation, or that of the outside help it has access to. There is no point in a company using open source software if it means longer downtime than proprietary software because its staff have no skills beyond shutdown-and-restart for dealing with IT problems. In that case, of course, proprietary software is definitely more reliable and dependable.

The question of how many companies find that open source is not reliable or dependable from the business point of view is a controversial one. As I have said before, it depends on the skills a company can call on to deal with problems. We are in the middle of a process where medium-to-large companies are finding that technical reliability and dependability make open source superior to proprietary software, while smaller companies tend to find that the lack of skills makes open source a bad solution. However, open source innovation for small companies is still in its infancy; it is still looking for the breakthrough that will bring open source to the majority of small businesses.

[John Carroll] What if Microsoft was truly evil

Filed under: Uncategorized — ctrambler @ 2:40 pm

John Carroll, a self-confessed Microsoft lover, published a blog post titled "What if Microsoft was truly evil" in which he fantasizes that it would be heaven for alternative operating system vendors if the 1994 consent decree preventing Microsoft from charging a "Microsoft tax" on non-Microsoft PCs had never happened, eventually leading to a truly restrictive Microsoft environment. His reasoning: if Microsoft could charge $10,000 per copy, people would flock to alternatives.

What a fantasy! Let's inject some reality into it. To be able to charge $10,000 per copy, Microsoft would need a stranglehold on everything that computes. We would not be looking at a scenario where OEMs cannot ship computers without Microsoft on them, but at hardware lock-in that refuses to run anything except Microsoft-authorized software. Anyone even sniffing at creating something Microsoft did not like would see their Microsoft license yanked. Linux would be dead in the water because Linus could not legally write it on his "Microsoft" computer. Even if Linus "pirated" his computer, Microsoft would put pressure on his university, citing fears that Linus would contaminate the computer systems it absolutely cannot live without. Linus would have to choose between a degree and total oblivion, because Microsoft would threaten his employer, future employers, and so on. The Samba team would never have existed, as there would be no need for it (no Unix, no alternative implementation of Microsoft's protocols).

In short, no alternative would be available; Microsoft would see to it. Worse, it would all be "within the basic constraints of property rights", as Carroll likes to champion. Never mind that it would probably trample on several antitrust and unfair-competition laws. Hang on… wouldn't Microsoft have forced governments to change those troublesome competition laws already?

May 18, 2006

Google Web Toolkit

Filed under: Uncategorized — ctrambler @ 1:47 pm

Google has just revealed a new toy for web development, the Google Web Toolkit (GWT). It is designed for writing AJAX-enabled websites. AJAX is known to be very difficult to develop for because of the complex interaction between browser and server, and a lot of people wonder why Google seems to be the only company that manages to use it well. GWT certainly gives us a clue.

Now, if I understand it correctly, with GWT you write Java code and debug your web page as a Java program. Then, when you are ready to put up your web page, you recompile it into AJAX-enabled web pages. Yes, you read that right: you do not need to know anything about AJAX, JavaScript, or how they interact with the web server. You just need to know Java and GWT well. Amazing!

This is the first time I have heard of this development paradigm, where you write and debug an application in one language and have it converted to another language for deployment. Pulling this off is difficult, but by doing so Google achieved the following:

  • It makes AJAX approachable. How much easier can it get than being able to use AJAX (or any language, for that matter) without having to learn it?
  • It is a very good example of using abstraction to make downstream programming easier. The complexity of AJAX is abstracted away from downstream programmers; only the GWT development team needs to know how to manipulate and work with AJAX.
  • It harnesses the power of the Java debugging environment to debug AJAX code.
  • The way GWT harnesses the Eclipse Workbench probably amazes even the Eclipse Workbench development team. It certainly surprises me.

One thing that also surprises me is that GWT does not automatically generate an MS Visual Studio project.

Finally, perhaps surprisingly, I learned a lot about programming just by reading the GWT website.
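
To illustrate the paradigm itself, rather than GWT's actual machinery (which compiles Java to JavaScript), here is a toy sketch in Python: it translates a tiny, hypothetical subset of Python (a single function returning an arithmetic expression) into JavaScript using the standard `ast` module. Everything here is my own illustration of the write-in-one-language, deploy-in-another idea:

```python
import ast

# Toy source-to-source compiler: translates a tiny subset of Python
# (one function whose body is a single `return` of +, -, * arithmetic)
# into JavaScript. You could test and debug `area` as ordinary Python,
# then "deploy" the generated JavaScript -- the same idea GWT applies
# to Java, at vastly greater scale.

_OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*"}

def _emit(node):
    if isinstance(node, ast.BinOp):
        return f"({_emit(node.left)} {_OPS[type(node.op)]} {_emit(node.right)})"
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.Constant):
        return repr(node.value)
    raise NotImplementedError(type(node).__name__)

def to_javascript(source):
    func = ast.parse(source).body[0]       # the function definition
    params = ", ".join(a.arg for a in func.args.args)
    expr = _emit(func.body[0].value)       # body: a single `return`
    return f"function {func.name}({params}) {{ return {expr}; }}"

python_src = "def area(w, h):\n    return w * h"
print(to_javascript(python_src))
# -> function area(w, h) { return (w * h); }
```

The hard part GWT solves, and this toy ignores, is covering a whole language plus browser quirks; but the developer-facing benefit is the same: you never touch the target language.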

May 17, 2006

Why you should always opt-out of offer to receive promotional material

Filed under: Uncategorized — ctrambler @ 7:39 am

ComputerWorld's editor-in-chief, Don Tennant, published an article about Microsoft using licensing enforcement as a ruse to get a salesperson into your company to pitch more Microsoft products. Frank Hayes from the same magazine then followed up with another article confirming what we already suspected: it is official Microsoft policy to use the threat of enforcement to get a salesperson into your company to "help you get the most out of Microsoft licensing", a euphemism for getting you to spend more money on Microsoft products.

Immediately obvious is the question of business ethics, as made clear by both articles: (1) if your intention is to do a sales promotion, make that clear to your recipients. I am sure many of us dislike the experience of friends getting into our house on the pretext of a social visit and then trying to sell us something, let alone a stranger. (An aside: I particularly respect one of my father's friends. We went to his house to pay him a visit. My father then found out that he was a sales representative for a product he had been considering buying, so he placed an order with him. When the order did not arrive, he called him up, and the reply was, "I thought you were just going with the flow of the conversation when you placed the order." He was simply trying not to pressure my father into the sale by following it up. A true friend indeed.) And (2) since the sales pitch starts with the "blackmail" of licensing enforcement, most people will simply pay to make it go away, as they do not want the hassle of the threat materializing.

If you think this is bad, they pulled exactly the same trick on schools. See this article and this Slashdot discussion.

So what does this have to do with the title? It reminds me of the small print on almost every form that says "tick here if you do not want us to contact you about things that might interest you". There is a thing called the Data Protection Act that stops a licensing enforcement division from passing information about customers to the sales division. The small print is there to sidestep this restriction. Hence, if I suspect that people have passed my information around inappropriately this way, I can tell them I never consented to it and challenge them to prove otherwise.

Actually, I am more paranoid than that. In Europe some companies use an "opt-in" system rather than "opt-out": you tick a box to consent to them contacting you. The weakness of this system is that anyone can tick the box for you later. Handwriting analysis of a tick is probably impossible, and an ink test is unlikely to prove that the form was tampered with. That is why I carry a purple-coloured pen with me.

May 15, 2006

Sensible ruling from the Supreme Court on permanent injunctions against patent infringers

Filed under: Uncategorized — ctrambler @ 5:46 pm

The Supreme Court has spoken in the case of "eBay v. MercExchange" (see this ZDNet article). eBay was appealing against a permanent injunction prohibiting it from using a technology patented by MercExchange.

The Supreme Court has, in effect, returned the question of whether a permanent injunction is warranted in this case to the lower court, with the advice that in some cases where the patent holder does not use the patent but merely collects licensing fees from licensees, monetary damages might be sufficient to remedy the infringement.

The Supreme Court expressly said that it takes no position on whether a permanent injunction or monetary damages is the appropriate remedy here. In fact, it is quite likely that the lower court, after reconsidering the case, will still decide that a permanent injunction is appropriate. In my humble, non-lawyer opinion, the important point is that the Supreme Court has opened a door that might one day mean a company that does nothing but license patents finds it must license them to everyone who wants a license, at a reasonable fee, stopping such companies from "gaming the system". It does this by taking away what I call the "intellectual monopoly" of patents from these companies, reducing it to "intellectual property". (See this article for a description of intellectual monopoly versus intellectual property.)

In my opinion, this is an important ruling for intellectual property. The Supreme Court was tossed a hot potato and handled it well.

May 11, 2006

Dogs trained to sniff DVDs, but cannot distinguish fakes from the real thing

Filed under: Uncategorized — ctrambler @ 4:33 pm

Yes, we have a new weapon to combat pirated DVDs: two dogs, to be precise. While it is certainly newsworthy that our canine companions can be put to work weeding out pirates, it is worth pointing out to those about to open the champagne that Lucky and Flo (the two dogs in question) cannot tell a pirated copy from a legitimate one.

The fact that Lucky and Flo's first live test found only legitimate DVDs was hidden in a comment by a FedEx official in the Federation Against Copyright Theft (FACT) press release (see the PDF copy of the 8 May release), after what I can at best characterize as "forward-looking statements" by the Director General of FACT and the MPAA's (Motion Picture Association of America) Director of Optical Disc Operations, Worldwide Anti-Piracy. That press release is peppered with the word "piracy" and boasts that this is a good anti-piracy move. Well, the jury is still out on whether the dogs are really a useful anti-piracy tool. It certainly looks like the failure to find pirated DVDs in their first realistic test is being swept under the carpet.

Truth is, the fact that all the DVDs they found through their "wonder" noses were legitimate does throw doubt on the dogs' usefulness. While I am sure they can sniff out pirated DVDs alongside legitimate ones, it is the success rate at finding illegal DVDs that counts, and in this test the score was zero. Other factors affecting the dogs' usefulness include the ratio of legitimate DVDs to pirated ones, and how much impact on the market for illegal DVDs the dogs can make with every successful find. Only time will tell whether Lucky and Flo have a career as, literally, the dogs of FACT and the MPAA.

For ambitious doggies worldwide, let's wait until the coast is clear before embarking on a career sniffing DVDs.

May 9, 2006

The importance of standardization

Filed under: Uncategorized — ctrambler @ 11:19 am

Microsoft's PR machine is putting some new spin on standardization. This time, Microsoft's Tom Robertson commented on eWeek about ways to achieve interoperability without standardization, and before that Jason Matusow was "economical with the truth" about ODF. Both irked quite a number of respectable bloggers, including Andy Updegrove (Robertson, Matusow), Bob Sutor (Robertson) and Simon Bray (Matusow).

Robertson commented that:

  1. "You can achieve interoperability in a number of ways," said Robertson. Among them: joint collaboration agreements, technology licensing and interoperability pacts.
  2. "Standards are not always appropriate," Robertson said. And in the cases in which they are, "you should standardize only what is necessary."

Both statements are accurate. The first underlines the core of Microsoft's strategy: do not standardize if you can avoid it; use workarounds such as those mentioned. Maybe he should add "reverse engineering" to the mix. The key to understanding the second statement is the word "necessary". From which perspective are we looking: the company's, vendors', consumers', or all of the above? They yield different answers. The bind Microsoft is in is that everyone, including me, starts from the assumption that Microsoft will see it from the company's perspective, to the detriment of vendors and consumers. If anything, Microsoft's actions have only confirmed this.

Despite KOffice's letter to Microsoft and its wide publicity, and despite Microsoft having access to the KOffice and OpenOffice.org source code and the resources to compare and rebut it, Microsoft's Matusow still insists that ODF is supported only by StarOffice/OpenOffice. Robertson says that "[despite having 400+ people working on standards], we're not tapping into these people in the most efficient way possible." I agree.

Robertson and Matusow are part of the Microsoft division that deals with standardization. If I were their boss, I would fire them immediately, unless of course (wink wink) they are just telling the world what I want them to say. The most serious offence is that they are undermining Microsoft's (and their own) credibility on standardization with their comments. As for Mr Robertson, while his statements are accurate, the overall impression he gives is that Microsoft is not interested in standardization if it can get away without it. He demonstrably "fails" to understand the important message the standardization process sends. Matusow is either an ostrich burying his head in the sand, not networking enough to realize that his information is out of date, or downright delusional about ODF, one of Microsoft's biggest threats in the office application space.

Ms Foley, the author of the eWeek article where the quotes from Robertson originate, hit the nail on the head when she said:

"The bottom line: Going forward, Microsoft may end up participating in 10 times as many standards efforts or maybe just a third of the ones in which it participates today, company officials said. But, however it chooses to engage, the company is hoping to present a more unified, well-educated and trained, and standards-savvy face to the public."

Two-faced bastard. 

May 8, 2006

Is revealing the source code really necessary?

Filed under: Uncategorized — ctrambler @ 11:34 am

A while ago, there were some storm clouds over the use of breath-alcohol results in driving under the influence (DUI) charges. In particular, judges in Florida ruled that the source code must be made available to defendants for them to analyze and contest the results. Unfortunately, claiming trade secrecy, manufacturers refused to provide the source code, and this led to problems in prosecution.

A lot is at stake here. Most importantly, the right of defendants to scrutinize and challenge accusations levelled at them. The fundamental issue in these cases is whether the breath-alcohol analysis machine is as accurate "as claimed by the manufacturer". The manufacturers' refusal to allow examination of the source code, even by independent, bonded third parties, is alarming, as it at best suggests bad practice and at worst that the machine is not accurate at all. However, is disclosure of the source code really necessary? I do not think so. At issue is accuracy, a question normally settled by calibrating the machine. Hence an arguably better way of judging the accuracy of the breath-alcohol test is to treat the unit as a black box and subject it to rigorous tests. The easiest way would be to hand the machine over to the defendant to test as a black box, but this is of course impractical. Maybe a similar machine could be provided, but the best compromise is to have an independent party (for example, the Florida state ANSI branch) test and calibrate the machine at regular intervals and to provide this information to the defendant.

Florida's legislature is right to try to close this loophole by saying that only the test results need be disclosed to the defendant, but I fear the legislation may not have struck the correct balance. One major pillar of the argument that the source code should be disclosed is that the ease of updating the program on these breath-alcohol machines means a machine can be running code that has never been tested or calibrated. Providing test results (assuming third-party testing) does NOT make this problem go away.

If the software on one of these machines is updated between calibration runs, that must automatically trigger immediate recalibration. The machine's records must reflect that the software was updated, and the machine must be treated as new from an accuracy point of view, because the modification does change its operating parameters (hopefully for the better).

If you think this is all that is needed, you are wrong. There is a further important requirement: the machine should also be calibrated just before the software is updated. Why? To ensure that results produced between the last calibration and the software update are still accurate. This is a fundamental right of the defendant: to be sure that the equipment that incriminates him is accurate.

Moreover, the machine's history must be provided as well. Each product has its own faults and merits, and each individual machine has its own characteristics. This is important information for defendants in mounting a defence. Without the machine history, the value of the test results to the defendant diminishes, something we must avoid.
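
The calibration discipline described above can be stated precisely as a check over a machine's event log. The sketch below is purely my own illustration (the event kinds and log format are hypothetical, not any real evidentiary standard): it verifies that every software update is immediately preceded by a calibration and immediately followed by a recalibration.

```python
# Hypothetical sketch of the calibrate-before-and-after-update rule.
# An event log is a time-ordered list of (timestamp, kind) pairs,
# where kind is "calibration", "software_update", or "test".

def update_protocol_ok(events):
    """Return True if every software update in the log is bracketed by
    a calibration immediately before it (so results taken since the
    previous calibration remain defensible) and a recalibration
    immediately after it (so subsequent results are defensible)."""
    kinds = [kind for _, kind in events]
    for i, kind in enumerate(kinds):
        if kind == "software_update":
            if i == 0 or kinds[i - 1] != "calibration":
                return False   # no calibration just before the update
            if i + 1 >= len(kinds) or kinds[i + 1] != "calibration":
                return False   # no recalibration just after the update
    return True

log = [
    ("2006-01-03", "calibration"),
    ("2006-02-10", "test"),
    ("2006-03-01", "calibration"),      # calibrate before updating...
    ("2006-03-01", "software_update"),
    ("2006-03-02", "calibration"),      # ...and recalibrate afterwards
    ("2006-04-15", "test"),
]
print(update_protocol_ok(log))   # -> True
```

A log that fails this check would flag every test result between the last good calibration and the update as open to challenge, which is exactly the point being argued above.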

In short, I believe that source code disclosure is not necessary; a rigorous calibration procedure and machine history documentation are needed instead.
