CyberTech Rambler

November 28, 2008

Barbarians might one day turn up at the gate, but not today

Filed under: Uncategorized — ctrambler @ 2:55 pm

Matt Asay quoted Mark Murphy saying that RedHat stock is undervalued. Most interesting, I think, is Murphy’s assertion (with which Matt Asay agrees) that Oracle’s Unbreakable Linux failed to make a dent in RedHat’s Linux offering. This is not really surprising. Oracle, after all, is a database business. Linux is just a “sideline”, and no business can entrust its main operating system to someone who supports it as a side business. Moreover, according to Murphy, Oracle’s support is really bad, even for the minority who converted.

This Unbreakable Linux thing may be Oracle’s way of trying to capitalize on its Linux expertise, but Larry Ellison’s approach leaves a bad taste in the mouth, and is not a good way to start a business. I never thought Oracle qualified as the barbarian at RedHat’s gate. I think we got the confirmation today.

Novell, however, fancies itself as the next barbarian. It recently launched a support program to entice business away from RedHat. Compared to Oracle, Novell is a more credible barbarian: its main business is SuSE, and it has more experience than Oracle in supporting Linux.

Novell might one day be the barbarian. Right now, I don’t know. Linux-to-Linux migration is easier than, say, Windows-to-Linux or Unix-to-Linux migration. For business, it is not only a question of support for Linux, the operating system, but support for the critical applications. It is kind of a chicken-and-egg problem: fewer vendors supporting a platform means fewer customers, and the spiral continues.

As for RedHat, it cannot let its guard down. If anything, this shows that it is a painted target. One thing Red Hat has earned is the right to claim “I’ve arrived” on the operating system scene. If there were nothing to plunder, the barbarians (and their wannabes) would not bother to make the effort to get to its gate.

November 20, 2008

Silverlight loses its trophy

Filed under: Uncategorized — ctrambler @ 4:06 pm

According to Rob Pegoraro of the Washington Post, one of Microsoft’s major Silverlight wins, MLB.com, has decided to ditch it for Adobe’s Flash technology.

It’s just another roadblock Microsoft has to navigate around for its Silverlight technology. I would not say Silverlight is dead yet, but so far I haven’t seen any need to install Silverlight (or Moonlight) in the year and a half since its launch.

Pegoraro is delighted that he does not have to install yet another plugin to see his favourite MLB games. I’m sure a lot of people will share his delight. The million-dollar question is whether the plugin download is the real stumbling block, or just an excuse for failure. I think it is an excuse, if one chooses to blame it for this departure from Silverlight. To me, it is not that big a deal: it is a one-time-only installation step, and for the majority of users, i.e., Windows users, it is a straightforward installation. Let’s not forget that one has to do the same for Flash Player too. The games, I would have thought, would make it worthwhile.

Finally ISO-OOXML text is here

Filed under: Uncategorized — ctrambler @ 3:54 pm

ISO has finally published the ISO-OOXML standard, ahead of schedule, since Alex Brown had predicted December.

November 12, 2008

Rethinking CodePlex? About time …

Filed under: Uncategorized — ctrambler @ 10:43 pm

According to The Register, Microsoft is rethinking CodePlex. Given the recent controversy, I think it is a good idea.

CodePlex’s problems can be explained in just one line: “no clear objectives”. The primary questions: what type of licenses does it accept? What does the collection of licenses say about the website?

A few recent events suggest that Microsoft does not really know what CodePlex is about. First, it had to pull a project; then it appeared that there is one rule for Microsoft and another for the rest of us. All this causes confusion and devalues CodePlex. The worry is that it will eventually, at least in the minds of users, turn CodePlex into a site where you download Windows software, with the promise about source code being “vapourware”. When that happens, what differentiates CodePlex from any other software download site?

I do not think yet another site for open source licenses only (the way OSI defines them, not Microsoft) is the way forward: SourceForge.net already serves that role, and it is a more prestigious site than Microsoft’s. However, a few other options are available which could differentiate CodePlex from the others:

  1. a site for software under any of Microsoft’s “shared source” licenses
  2. a site for all projects sponsored by Microsoft with a “shared-source”-like license
  3. a site which only hosts software from Microsoft under “shared source” licenses

As for reorganizing the site, I don’t think splitting it into subsites based on licenses helps. The example quoted, “academics.codeplex….”, quite simply adds to the confusion.

If anything, sourceforge.net taught us that projects with different licensing terms and conditions can live side by side if the site has a clear objective. Initially, there will probably be confusion, but once that dies down, the clear objective for the site will help users navigate it.

We did not pay a contractor to abandon Linux! Honest!

Filed under: Uncategorized — ctrambler @ 10:13 pm

There is an interesting follow-up in ComputerWorld about the story of Microsoft “bribing” a contractor to abandon Linux. Microsoft denied doing it. However, when you dig deeper, you find that while the denial is technically true, since the contract in question was never executed, it is an admission that Microsoft did offer the bribe.

To me, the central question is ethics. Is it proper business practice to “bribe” a contractor? I don’t know. I have never worked in the business world.

November 11, 2008

Interesting UI development titbit

Filed under: Uncategorized — ctrambler @ 1:39 pm

I don’t like designing and programming UIs. Unfortunately, at work, I am thrust into it; otherwise, the software we develop would be, simply speaking, “unusable by the target audience”. Since then, I have been picking up bits and pieces of advice on UI building, with Apple being the design god I worship and pay homage to.

PCPro magazine did a piece on how Windows 7 (pre-alpha) is not actually faster than Vista, but simply perceived to be faster. According to Graham-Smith, the author, what happened is “They’ve recognised that perceptions of speed focus almost exclusively on interactive performance.”, i.e., they polished the surface but left the underlying system unchanged. Graham-Smith says it is not a con but an inspiring move. I agree this is not a con (yet), that Microsoft’s polishing effort is inspiring, and that the discovery is insightful and a useful fact for UI designers to note.

One thing to note: we are looking at a pre-alpha. It is very possible that making applications run faster will be implemented sometime between now and the final release. Concentrating on the perception that Windows 7 is fast at the pre-alpha stage is a business decision, and a correct one. What we want to see now is applications that do indeed run faster.

Remember, beauty is only skin-deep. Polishing something is necessary to increase its value, but polishing removes a layer from the surface, bringing the inner core one step closer to exposure, and it does nothing to improve the quality of that core.

Ultimately, it is the inner core that matters. Microsoft currently has a problem with Vista: it cannot run on low-spec computers, what they like to call ultra-low-cost PCs, others call netbooks, and I call not-overpriced PCs. That segment of the market has become increasingly important. Vista’s inability to handle those computers is one reason XP just refuses to die. It will be a shame if Microsoft does not fix this in Windows 7.

November 7, 2008

Filed under: Uncategorized — ctrambler @ 4:22 pm

Just from reading the title of the blog post in my RSS reader, and before I read Alex Brown’s post “The Maintenance of ODF”, I had already decided that it was going to be a post supporting an ISO SC34 “land grab” of ODF maintenance. After reading it, I can confirm that it is a land grab. In fact, it is a land grab for land ceded by JTC1 to OASIS. This is what I expect a lot of pro-ODF people to conclude.

Basic arguments

Before I proceed I must make it clear that any changes by OASIS to the edition of ODF it submitted (ODF 1.0 2nd Edition in Alex Brown’s chart) have to be agreed with ISO. OASIS opted, of its own free will, to submit that edition to ISO, and therefore it must seek ISO’s agreement on any changes to that particular edition and perform all the duties it agreed with ISO on the maintenance of that edition of ODF.

I must also make clear that I don’t know the exact mechanism of an ISO maintenance regime, nor am I likely to. For example, I interpret terms like “withdrawal” the way Joe Public would, but it may be used as a technical term to mean the replacement of one standard by its successor, even if the successor is simply an updated version.

I am also looking at the whole thing by drawing parallels with software development, which might not be valid in standardization speak.

I am also comparing ODF with OOXML, because OOXML is by far the best candidate to compare ODF with. This is not a criticism of or comment on OOXML processes, just a comparison, nothing more.

OK. Now that I have spelled out all my disclaimers and my bottom line, let me express my opinion.

First and foremost, ISO does not, and cannot, claim any jurisdiction over other versions of ODF, except for the edition submitted to it. If OASIS chooses not to submit anything to ISO, that is its choice. It is neither unreasonable nor unexpected for OASIS to choose to evolve ODF on its own. If it does, then one consequence is that the passage through ISO approval of the next evolution of ODF will probably be more difficult than if it had got ISO involved. This, however, is the bargain OASIS chose to make, and it just has to live with it. In any case, if my reading of ISO management’s view when rejecting the OOXML appeals is correct, OASIS has little to fear from the JTC1 or SC34 administration, since ultimately the true power lies with member countries, and it can force things past the administrative part of the two committees.

That big, nice time diagram Alex Brown produced looks to me like a chart of the natural evolution of a standard, any standard, no more, no less.

The purpose of errata is to correct problems in the original text, or to clarify ambiguities in it. Normally, errata for an older version of a standard are useful for anyone who has to support that older standard. Generally speaking, one will still issue errata for an older version of a standard, especially when the newer version is incompatible with the older one.

Even if I assume that ODF 1.0 first edition is radically different from ODF 1.0 second edition, I still cannot qualify errata for the first edition as a “fork”; it is a maintenance release. But that is not even the situation here; the case is simpler still. Brown himself acknowledged that issuing these errata is OASIS’s way of incorporating work done in “ODF 2nd edition”, i.e., ISO ODF, into the first edition. I suppose OASIS could have adopted ODF 2nd edition as an “OASIS standard” to avoid the need to call the update to ISO ODF errata, but I am not sure that would clarify the situation.

As for us having multiple versions of ODF which are not ISO standards, I am neither surprised nor do I see it as a problem. People who do not follow the discussions on ODF, and those unaware of how standards and software are made, might find the multiple versions confusing. Those of us in related fields see it as the norm. One thing is clear to me: when explaining the ODF development process to others, I would use Brown’s brilliant illustration, minus the “fork” box.

ISO development moves at a slower pace than members of OASIS expect and want. Not every OASIS ODF standard needs to be an ISO one. In fact, in a fast-moving field, there is merit in not continually updating an ISO standard. I rather like the idea of multiple OASIS ODF versions with, periodically, ISO versions that represent the amalgamation of work done since the last ISO version. In software terminology, the OASIS ODF versions are the development/milestone/beta versions, while ISO ODF is the final release.

Land grab

Where do I see a “land grab”? In the section “Problems”. In the paragraph preceding it, Brown says that OASIS believes it has an agreement that “… allows it to maintain ISO/IEC 26300 in a way which exempts it from the maintenance provisions of the JTC 1 Directives.” Paragraph 4 of the section shows that even Brown agrees that OASIS’s interpretation is correct. That is what Brown is trying to reverse, by claiming that the agreement is in violation of the JTC1 directives.

Let’s make it clear that we are missing one piece of the jigsaw: the reason why JTC1 chose to disregard its own directives and enter into such an agreement with OASIS.

However, simply claiming that the agreement with OASIS violates the JTC1 directives is not carte blanche to pull maintenance back to JTC1. Members of JTC1 may complain, but they did sign away the maintenance rights in the first place. Most of the time we cannot wriggle out of commitments we later wish we had not made; why should JTC1 be allowed to do so?

If the argument actually boils down to the JTC1 directive violation, my only comment is that this shows rules did not mean anything in JTC1 even before OOXML exposed it. People like me, who believe that JTC1 breached its own rules with OOXML, have no sympathy for Brown’s argument. No support. None. Nada. Complain as much as you like that we broke the rules first; I don’t care. You lost the moral high ground as well.

Finally…

From the point of view of a pro-ODF person, I am glad that OASIS did it. I do not think it is a good idea for JTC1 or SC34 to maintain both ODF and OOXML. There is a conflict of interest there, and it would be to the detriment of both standards. Right now, we are seeing a renewed sense of excitement in the office application space, mainly the result of the rivalry between the two. Consumers are reaping the benefits of this rivalry, so why would I want to stop it?

November 3, 2008

Windows’ fragmented installation

Filed under: Uncategorized — ctrambler @ 2:12 pm

When I wrote the piece about my bad experience installing Windows XP, I must admit I was a bit puzzled. My experience confirmed that installing Windows from scratch is not easy, and still resembles what one had to go through to install any operating system on a customized white box 10 years ago. I even concluded that Windows installation technology lags seriously behind Linux because it does not have a “unified installer” that does everything for you. Still, why didn’t the smooth installation of drivers, something everybody touts as an advantage of Windows, help me with the installation?

It finally dawned on me that the reason is that every driver lives in its own la-la land, aware only of itself and Windows and nothing else. In the Windows world, products from the same company often ship with installers that are unaware of the company’s other products. There are one or two dominant installation technologies used by multiple companies, but even they are unaware of other instances of the same installer. In other words, there is no repository of software that we can tap into to download and install what we want, even when products use the same installation technology.

In Linux, we have multiple installation technologies too, just like Windows. However, the providers of those technologies maintain repositories of the drivers and software they can install. Therefore, from inside one installer, you can pull in a lot of different drivers and applications. Initially, this was a necessity, because most vendors did not even have a Linux driver. But this pooling of resources means we now have “unified” installers that can pull in pretty much every piece of software you need.
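The key ingredient is the shared metadata: because the repository records what each package depends on, one tool can resolve and fetch everything in the right order. Here is a toy sketch in Python of that idea; the package names and dependency graph are made up for illustration, and real package managers do far more (versions, conflicts, downloads).

```python
# Toy model of a repository-backed installer: a single shared metadata
# index lets one resolver "install" a package plus everything it needs.
# The packages and dependencies below are invented for illustration.

REPOSITORY = {
    "openoffice.org": ["java-runtime", "fontconfig"],
    "gimp": ["gtk", "fontconfig"],
    "gtk": ["fontconfig"],
    "java-runtime": [],
    "fontconfig": [],
}

def resolve(package, installed=None):
    """Return an install order: dependencies first, then the package."""
    if installed is None:
        installed = []
    for dep in REPOSITORY[package]:
        if dep not in installed:
            resolve(dep, installed)
    if package not in installed:
        installed.append(package)
    return installed

print(resolve("gimp"))  # → ['fontconfig', 'gtk', 'gimp']
```

A per-product Windows installer, by contrast, only knows about its own payload; with no shared index like `REPOSITORY` above, nothing can compute that install order across vendors.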

Of course, open source licensing terms help here. You can easily include OpenOffice.org, GIMP and other programs in your installer because the developers give you the right to do so, and actually encourage it. With proprietary licensing, there is at least a layer of bureaucracy, i.e., you need to ask for permission, and some technical problems, e.g., remitting licensing fees to the developers, that put hurdles in the way of a unified installer. Then you have the competitive rivalry between developers, who may dictate that you exclude rival software if you want to include theirs. All these are hurdles. A unified installer can of course be done, and it might be done one day; it is just more difficult to do in a proprietary world.

So, in a proprietary world, you have no collaboration on a repository of software and driver installation, even though vendors may be using the same installer technology? Don’t ask me why. I don’t buy the idea that the installation technology vendors haven’t thought of it. Maybe the software companies think there is no business case, or the finance department sees it as an avoidable cost. I don’t know.

Having a unified installer saves users a lot of time, but why should vendors care? It is your cost, not theirs. Besides, maybe one day they can charge you for a service that makes this easy for you and therefore saves you time. Would I buy such a service? Let’s just say, given my experience, it is a “maybe”.

Ubuntu’s Linux contributions

Filed under: Uncategorized — ctrambler @ 1:37 pm

People are accusing Ubuntu of not contributing to Linux. I disagree, because I do not think the word “contribution” should be limited to source code contributions to the kernel. I think Ubuntu is contributing to Linux, just not in the traditional sense.

At the minimum, Ubuntu gives GNU/Linux good PR. If you ask me which distribution you should play with as a Linux virgin, I will say Ubuntu. It is sufficiently polished and dummy-proof. Live CDs are nice, but installing from a Live CD is usually difficult. Moreover, since Ubuntu’s installation CD and Live CD are the same disc, I don’t have to explain that when they get to the website they have to select the Live CD download, not the installation one.

Ubuntu’s strength is user engineering. It is one of the distributions more likely to listen to users and cater to their needs rather than the developers’. While Linux developers do “eat their own dog food”, developers and users have different views of what a computer should be. Let’s face it, developers have the “curse” of knowing how the computer works, and Unix developers are a particular breed who know how to do the same thing five different ways, with at least five hoops to jump through whichever way you choose 😉

Want an example? See Shuttleworth’s latest post on the FUSA applet design.

Ubuntu seems to be following the traditional path of someone learning Linux. You start with the big thing, i.e., getting the distribution installed (in Ubuntu’s case, on a lot of computers), then start playing around with applications, before finally going in to play with programming. And with programming, you start small, i.e., creating applets, before diving into the big collaborative codebases, e.g., GNOME or the kernel itself. It is just a matter of time before Ubuntu finds the need to contribute to GNOME or the kernel, if it has not already decided that it needs to.

Ubuntu is not without flaws. Adam will happily point you to a few. I don’t necessarily agree with Adam, but I can see his points.

We need Linux Hater-like bloggers

Filed under: Uncategorized — ctrambler @ 12:59 pm

Yegulalp is correct: we, the “lusers” (the Linux Hater Blog’s term for us), will miss the Linux Hater Blog.

Hate it as much as you like, but we need people who cast a critical eye on Linux development and put developers in the uncomfortable position of having to defend their decisions. Engaging in debates like the ones the Linux Hater blog provided allows us to review ourselves critically.

It is unfortunate that the language hygiene there was so bad that it often drowned the message. We, the lusers, are of course intellectually superior and will filter out all the noise to get to the message. Won’t we?
