Can Ubuntu succeed? Comparing to iOS and Android

Categories: Uncategorized

Last week during the Ubuntu Developer Summit, head honcho Mark Shuttleworth said something to the effect of “iOS and Android have managed to succeed despite Microsoft’s monopoly, and we haven’t” (see the keynote here). Over the following days I mulled it over, and here’s what resulted.

I think it’s not quite as clear-cut as “they have done it and we haven’t”. Microsoft’s monopoly is on the desktop, and it is there that Ubuntu is going directly against Microsoft and perhaps, yes, failing to capture a significant percentage of the market. And I won’t go into the whole “Linux is better than Windows” debate.

Rather, let me point out a key fact about Android’s and iOS’s success: they both succeeded in a market where Microsoft wasn’t a dominant player. Before Apple unleashed the iPhone on the world, the smartphone market was very fragmented, with Microsoft a relevant player (Windows Mobile) but nowhere near the dominance it has on the desktop. Nokia (Symbian) and RIM (BlackBerry OS) were two big players, but both have since been relegated: Nokia to irrelevance (hence the deal with Microsoft), and RIM to a frankly defensive posture, lacking a strategy and scrambling just to stop the exodus of users.

Now, even though Apple and Google are the two strongest players in the smartphone market, things are very much in a state of flux, and no platform can claim the stranglehold that Microsoft has on the desktop. So those companies are forced to innovate and stay on their toes. But the fact is that, even with a product clearly superior to previous offerings, any one of these companies would have had a hell of a time dethroning a hugely dominant player from the field.

Ubuntu’s challenge is greater, as it’s going head-on against Microsoft’s cash cow, and there’s no real competition on the desktop. The only other mainstream operating system with any success is Mac OS X. Apple is content with whatever niche it has carved for itself, and it’s clear that the strides it has made in the past decade owe more to the halo effect of the iPod and iPhone than to OS X’s (admittedly great) merits. So yes, Apple has a superior product, but that still hasn’t propelled it beyond a 10% market share. While I’m at it: it’s easy to forget that the first versions of OS X were kludgy, slow, difficult to use, and riddled with usability problems. It was the iPod, and then the iPhone, that propelled Apple from a fringe player into the powerhouse it is today. In the end, Apple has realized that promoting Mac OS X isn’t worth a big effort, and that the momentum from the iPod and iPhone is enough to keep OS X alive.

So what does Ubuntu need to succeed on the desktop? I have no special insight into this topic, but let’s just recognize that it’s not as simple as looking at, and imitating, Android’s and Apple’s successes, because, as I’ve said, their playing field was a vastly different one. Would a “halo-effect device” help Ubuntu the way the iPhone helped Mac sales? Maybe. Or maybe all Ubuntu needs is endurance, as even hugely dominant players (Ford, IBM, WordPerfect, Netscape) can be surpassed under the right circumstances.

Ubuntu and Community Testing

Categories: Uncategorized

During the Ubuntu Developer Summit (UDS), held last week (May 9-13) in Budapest, Hungary, a very interesting program was discussed: Ubuntu Friendly. The end product of Ubuntu Friendly should be a way for people to find out whether a particular computer system can run Ubuntu properly (it’s friendly!). This has many possible uses, not the least of which is enabling people to choose a system on which Ubuntu has a good chance of working without much tinkering by the end user. This matters in a world where most people compare a preinstalled Windows system (where the manufacturer has done most of the dirty driver installation and enablement work) with a from-scratch Ubuntu installation, where Ubuntu is expected to pick up all the hardware and work adequately with it.

Despite this last requirement, installing Ubuntu is, in my opinion, already a much cleaner and friendlier experience than installing Windows. On my laptop, a Samsung QX410, Ubuntu has a few glitches that require manual configuration (touchpad, hotkeys), but the system is immediately usable out of the box. The same can’t be said of Windows, where a plethora of device drivers is required for even the most basic devices to work. However, since the system is purchased with this work already done, in users’ minds the Ubuntu experience is not as polished as the Windows one.

I digress. The Ubuntu Friendly program seeks to award Ubuntu Friendly status to those systems the community finds work well with Ubuntu. In a way this replaces the Ubuntu Ready program, where manufacturers were responsible for running the tests on their systems. It also complements the existing Ubuntu Certified program, where hardware is tested in-house by Canonical, under more stringent standards, and an official Canonical-sanctioned certificate is issued to machines that are deemed certified, meaning they work with Ubuntu out of the box.
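
Just to make the idea concrete, here is a purely hypothetical sketch of how community-submitted results might be aggregated into a “Friendly” verdict. The actual criteria were still being discussed at UDS; every name and threshold below is invented for illustration.

    # Hypothetical aggregation of community test reports: a system is
    # "Friendly" when enough testers report that its components pass.
    def is_friendly(reports, min_testers=3, min_pass_rate=0.9):
        """reports: one dict per tester, mapping component -> passed."""
        if len(reports) < min_testers:
            return False  # not enough community data yet
        components = set().union(*reports)
        for component in components:
            runs = [r[component] for r in reports if component in r]
            if sum(runs) / len(runs) < min_pass_rate:
                return False  # too many testers saw this component fail
        return True

    # e.g. three testers all reporting wireless, suspend and audio OK:
    # is_friendly([{"wireless": True, "suspend": True, "audio": True}] * 3)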

Needless to say, there’s a lot of community interest in the Friendly program; it’s a win-win situation where the community contributes valuable testing and results, and the world becomes a better place through what will be a large list of systems identified as “Friendly” to Ubuntu.

During UDS, the sessions where this program was discussed were a great success; attendance was good, and I was glad to see people from outside Canonical’s Hardware Certification team participate, with plenty of interest and involvement from community members too. Many valid concerns and good ideas were discussed, and even though an extra session was scheduled for the program, every session ran out of time with people still wanting to contribute.

All in all it’s a very interesting program, one that will hopefully take form quite soon. If you’re interested in seeing what this is all about, here’s the blueprint with a more formal description of what is being planned.

Internet access in Canada – Stop the Meter!

Categories: English

I have no idea what the CRTC is. But they recently issued a ruling that means my internet provider, which I chose precisely because it had no download caps (unlike greedy providers like Bell, Rogers or Videotron), will now have to institute such caps.

In practical terms, this is how it looks. If my link were downloading at 100% capacity around the clock, I could potentially pull down about 1500 GB of data in a month, and I pay $45 a month for this. Now, however, they limit me to 30 GB a month for the same price. If I go over, they charge me $1 per extra GB, up to a limit of $60 a month in overage charges, a point I’d reach after downloading 90 GB. Past that they stop charging extra and I can keep downloading. However, if I hit 300 GB, they cut me off for the rest of the month.
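
To make the scheme concrete, here it is as a small Python sketch, built strictly from the figures above (nothing here comes from the ISP’s actual terms):

    # The capped billing scheme: $45 base for 30 GB, $1 per extra GB
    # (capped at $60/month in overage charges), hard cutoff at 300 GB.
    BASE_FEE = 45.0      # monthly fee, in dollars
    INCLUDED_GB = 30     # allowance before overage kicks in
    OVERAGE_RATE = 1.0   # dollars per extra GB
    OVERAGE_CAP = 60.0   # maximum overage charged in a month
    CUTOFF_GB = 300      # service is cut off past this point

    def monthly_cost(gb_used):
        """Return the month's bill for a given usage in GB."""
        if gb_used > CUTOFF_GB:
            raise RuntimeError("cut off for the rest of the month")
        overage = max(0, gb_used - INCLUDED_GB) * OVERAGE_RATE
        return BASE_FEE + min(overage, OVERAGE_CAP)

    # monthly_cost(30) -> 45.0; monthly_cost(90) -> 105.0;
    # monthly_cost(250) -> 105.0 (the overage charge is already maxed out)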

All of this basically amounts to them not honoring the contractual obligation that says I get X amount of service for Y amount of money. It’s extremely unfair to me, and although I understand the position ISPs are in, at the mercy of the big telcos, I certainly wish they’d put up a bit more resistance to this.

So indeed, the problem for the end user is that using bandwidth becomes more expensive and inconvenient. And however much the telcos whine about “power users” saturating the pipes, the fact is that these power users pay for the bandwidth they use, at rates the providers themselves set; for the providers to now charge more for the same service is a bit ridiculous and speaks of plain greed. It also stifles innovation, the kind that would push big media out of the picture, since I’m effectively penalized for doing things like watching TV online, which is very convenient for me but uses quite a bit of bandwidth (bandwidth that, again, I’d already paid for).

I’d be inclined to make the following counterproposal to my ISP, and ultimately to the big telcos.

Previously I could download 1500 GB in a month for $45, meaning each GB cost me $0.03 (compare this to the abusive $1 per GB for overage, over 30 times that rate). So here’s what I propose: if you want to limit me to 30 GB, then I’ll pay only $0.90 a month, which by their previous rates is a fair amount. Hell, limit me to 300 GB, for which I’ll pay only $9.00 a month.

Optionally, OK, charge more if I go over 30 GB. But conversely, if the power-user vs. common-user argument holds any water, prorate for users who go UNDER the 30 GB: if I download only 10 GB in a month, I should pay only $15 for internet access.
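
Continuing the sketch from above, both counterproposals are trivial to express; again, this is just a toy model built from the numbers in this post:

    # Counterproposal 1: charge the old effective per-GB rate for the cap.
    OLD_FEE = 45.0
    OLD_ALLOWANCE_GB = 1500.0
    EFFECTIVE_RATE = OLD_FEE / OLD_ALLOWANCE_GB   # $0.03 per GB

    def flat_rate_cost(cap_gb):
        """Pay the old per-GB rate for whatever cap they impose."""
        return EFFECTIVE_RATE * cap_gb   # 30 GB -> $0.90; 300 GB -> $9.00

    # Counterproposal 2: keep $45 for 30 GB, but prorate light months.
    def prorated_cost(gb_used, cap_gb=30.0):
        """Prorate the monthly fee when usage stays under the cap."""
        return OLD_FEE * min(gb_used, cap_gb) / cap_gb   # 10 GB -> $15.00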

I bet no ISP would like either of these proposals. Guess what: we users don’t like your proposal either. So please go to www.stopthemeter.ca and raise your voice against this idiotic measure that puts Canadians at a huge disadvantage in technology and connectivity.

The myth of better device support on Windows

Categories: English Geeky Uncategorized

It’s long been argued that peripheral support in Linux is far inferior to that under Windows, and that this has been a factor in Windows’ dominance of the desktop. More and more, the myth that Windows has any kind of technical superiority is giving way to the fact that marketing, and being bundled with nearly every PC sold worldwide, are the real keys to its widespread adoption. Here’s a story to prove that point.

I bought a printer (an HP Photosmart C4780), one of those cheap $50 numbers that eat through ink like crazy. So I came home wondering whether I’d have to install the 500 MB of crap included on the bundled CD to get the printer working with my Mac.

As is usually the case with the Mac, I just plugged it in and both the printer and the scanner worked, without a hitch.

I then did the same on a freshly installed Ubuntu 10.10 laptop. Same story: the printer just worked, and Ubuntu even recognized it the moment it was plugged in, with no need to install drivers or anything.
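
If you want to confirm the detection for yourself, a few lines of Python do it; this is just a sketch, and it assumes the python-cups bindings (not part of the story above) are installed:

    # List the printers CUPS knows about and how they're connected.
    import cups  # from the python-cups package (an assumption)

    conn = cups.Connection()
    for name, attrs in conn.getPrinters().items():
        # device-uri reveals the connection type, e.g. a usb:// address
        print(name, attrs.get("printer-info"), attrs.get("device-uri"))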

Now, on Windows the printer wouldn’t have worked at all without installing a boatload of crap; HP is notorious for the bloat of its bundled software.

The conventional wisdom is that hardware manufacturers care more about Windows, so they ship their hardware with drivers and utilities to make it work there. The burden, then, falls on Apple and the Linux distributions to provide drivers and support for most hardware. It sounds like a daunting task, but they do it, and the end result is that Mac OS and most Linux distros include drivers for nearly everything, right out of the box. This puts them a step ahead of Windows in ease of use, at the cost of perhaps a little bloat. Still, my Ubuntu installation is much leaner than the 16-GB behemoth that is Windows 7.

So there you have it, the myth of better hardware support on Windows, finally debunked.

Now, if I could only get the braindead wireless support on the HP printer to work…

Flash Sucks

Categories: English Geeky Uncategorized

A world without Flash?

I’ve always been a hater of Macromedia/Adobe Flash. Now that the entire Apple-Adobe controversy has rekindled the debate of whether the web is a better or worse place because of Flash, I realized why it is I don’t like Flash.

I also realized that most technically-inclined people dislike Flash too, because they recognize its many shortcomings, unlike the layperson, who only cares about the web being pretty and full of animations and beeps.

Now, before I begin, let me state this: I’m griping about Flash as a web content creation platform/tool. I couldn’t care less about its use as a mobile development tool. A lot of bloggers have expressed more informed opinions on this topic.

For me, a true Flash hater, what Flash does is take control away from the end user, the consumer of content, and hand it to the content creator, the designer.

If you’re the designer this is all fine and dandy; you can control exactly what the user sees, you can tell your application to be exactly this many pixels wide and this many pixels high, and dictate how it looks and behaves down to the pixel and the microsecond. This is why designers love Flash: it not only lets them work in a familiar environment with familiar tools, but also gives them complete control over how and what the user sees and can do.

By the way, don’t be fooled; a designer who claims to know web design but uses only Flash is not a web designer. Flash was created to let designers (Adobe’s primary clientele) claim, untruthfully, that they can design web sites.

(Screenshot: web page scaling fail on Santander’s site.)

The problem is, the web wasn’t meant to be this way. Fundamentally, the kind of content the web was created for was meant to empower the user. This is why the web browser was designed from the very beginning not to impose those parameters (width, height, fonts, and so on); content should adjust to whatever the user agent can display. So web content reflows to adapt to your browser; it should degrade gracefully on systems that lack a certain capability (think Lynx, or visually-impaired users); and it should allow me, the user, to alter how it looks and is rendered. This is why I can disable cookies and JavaScript, replace or remove altogether the CSS used to format my content, decide not to display images, and so on. Even the most complex non-Flash web page consists of text and images, and with a bit of cleverness I can take both and incorporate them into the rest of my workflow: paste them into a document, translate them, email them to someone else. The possibilities are limitless, since web content is delivered to me as-is, as bytes I can not only look at but also manipulate like any other kind of information on my computer.

This freedom is lost on a Flash-only (or Flash-mostly) website. Worse, instead of being content, stuff I can get out of the browser and process and manipulate in other ways, it becomes merely an image, a photograph or movie trapped in the clutches of the Flash plugin. I can’t copy the text, I can’t scroll except through whatever provisions the designer made for me, I can’t easily extract the audio or the images, and I’m limited not by the constraints of my browser but by those set forth by Adobe, through its display plugin, and by the designer. And let’s face it: most designers are clueless about user interfaces and ease of use, unlike the people who designed my web browser, which is rendered mostly useless on a Flash site.

It is this loss of freedom that makes Flash so dangerous, and why I think it would be a good thing for Flash to disappear eventually.

Flash adds nothing of true value to the Web. We could all live happily without the animations, without the desktop-apps-masquerading-as-web-apps made in Flash (write a Web app from the ground up; it’s not that hard), and without all the stupid content that forces me to work its way instead of my way. And luckily, thanks to the advent of HTML5, we won’t need Flash even for the one thing where it proved indispensable: web video. Because, let’s face it, web video was Flash’s killer application; everything else that could once be done only in Flash is now doable with AJAX, CSS and JavaScript. And honestly, if Flash had been such a good technology for those things, we would have stayed with it and not bothered with anything else.

If anything, the existence of so many alternatives to Flash and everything it does is evidence that the world at large truly does not like Flash.

Open letter to Amazon.com: Please make my Kindle not suck

Categories: English Geeky

Update: It appears Amazon is indeed listening; I was able to preorder Robert J. Sawyer’s latest for Kindle delivery, and most of the titles I mention in this post are already available in my region. Thanks, Amazon!

Like (according to Amazon.com) millions of people, I own a Kindle e-book reader. However, I’m a bit irked by the fact that Amazon treats Kindle users as second-class citizens. As early adopters who paid a hefty sum for Amazon’s flagship product, I think we deserve better.

I’ve been a fan of e-ink technology since I first learned about the early, clumsy prototypes. When the original Kindle came out, I nearly jumped at the chance to get one. However, I decided that the hassle of having a Kindle in an unsupported country (Mexico), where I’d have to jump through hoops to get content onto it, wasn’t worth being an early adopter.

So patiently I waited, until, in late 2009, Amazon finally started selling the Kindle, complete with wireless content delivery, in Mexico and a host of other countries. “Great”, I thought. “I get to have my nice gadget, save on shipping costs and delivery time, and I still get to read a lot”.

The story has been a bit different, and it has more to do with politics and commercial interests than with technology. Let’s get this out of the way right now: I have only ONE complaint about the tech side of the Kindle, and it doesn’t even have anything to do with the product itself. More on that later.

So I got my shiny new Kindle and went online to get some books for it. Naturally, I searched for my favorite sci-fi author, Canadian writer Robert J. Sawyer.

To my dismay, there’s very little from him available as Kindle content. None of the books I was interested in were available: not Calculating God, the first RJS book I read; not Factoring Humanity, my all-time RJS favorite; not even the Quintaglio Ascension trilogy, one of the very few RJS titles I haven’t read. They’re simply not available for the Kindle.

Titles are being “kindlefied” all the time, but the selection is still quite shallow.

Sometimes I do find the title I’m looking for, only to be greeted by the message “not available in your region”. Amazon, if you CAN send physical books to my region, why can’t you deliver them to my Kindle? I know you’re going to say it’s not the same, but to me, that doesn’t cut it.

A few days ago I received a notification for Dan Simmons’ latest book: Black Hills was to come out in a few days, and I was offered a nice pre-order discount. However, it didn’t apply to the Kindle edition. So you mean to tell me that, even though I’d click “buy now” this minute AND wait for the book to actually come out and be delivered to my Kindle, I can’t have the discount, and the only way to take advantage of it is to order the dead-tree version? Well, never mind, because the book is on sale right now and there’s no Kindle edition in sight. So I have to either get the hardcover or wait until the publisher decides it’s OK to let the Kindle edition out. It’s ridiculous that a hardcover delivery would have me reading the book sooner than the instantly-delivered electronic version.

Amazon, this is one area where you have to work with the publishers, show them what a big market they’re missing, and help them reach it, because all these artificial restrictions, stemming from their irrational fear of electronic distribution, will only end up hurting their bottom line. I’m able (and more than willing) to purchase books. Look at my past history if you don’t believe me: even with a 50% delivery overcharge (the joys of not being in the United States), I routinely spent over $500 a year on books. Now I’m wary of ordering physical books, since I’d rather use the Kindle and save on delivery costs; however, many of the titles that interest me aren’t available for it.

Interestingly, I find myself loading mostly classic literature onto the Kindle; from Wilkie Collins to Jules Verne, these wonderful titles are available for free in Kindle-compatible formats. This is a consequence of the titles I want not being available: if I have to choose between Jack London’s Call of the Wild (an old book I’ve read a thousand times, available for free at mobipocket.com) and Robert Sawyer’s Starplex (which I haven’t read, but which isn’t available for the Kindle), guess what, I’ll take the former.

Now for my one technical gripe: what’s this about books “optimized for large screens”? So now I need a Kindle DX to read content? That just sucks.

So, Amazon: you have the clout, and also the flexibility, to work with publishers so that both you and they stop treating us like second-class citizens just because we find the convenience of the e-book reader worth the high admission price. A lack of reasonably-priced content shouldn’t be part of that price.

Video tutorials suck (most of the time), or: the bow tie

Categories: English

For a long time the Internet has been a veritable treasure trove of howtos and tutorials: people (mostly) selflessly sharing things that took them a lot of effort to learn, for the benefit of the crowds. Philosophically, this has a lot to do with the Free Software movement. Most people don’t realize it, but this “share freely” idea is what has propelled pieces of software such as Linux and Firefox to their current positions.

I digress. At some point, however, someone decided that a) the Internet was now fast enough to carry video, and b) people were too stupid to read and follow instructions. This brought about the unfortunate rise of the video tutorial. I usually rant against these, as I can still read faster than I can watch some random dude take me through the steps at his own pace (instead of at mine). Video tutorials also suck when you need a quick, compact piece of reference material to refresh your knowledge of a procedure, which would be better served by a 2-KB piece of text than by a 10-MB, 5-minute video.

Still, I must admit there are instances where a video tutorial makes the most sense; some steps in procedures are, indeed, better explained by following the actual action (and perhaps having a narrator telling you what the hell is going on).

I recently found myself needing to learn how to tie a bow tie. None of the text tutorials helped, no matter how well-written or illustrated they were. There is ONE crucial step that basically requires video to be understood. I spent 40 minutes wrestling with text-and-pictures instructions; a video made it clear in under a minute.

So, without further ado, if you EVER need to learn how to tie a bow tie, don’t bother with anything else: these three videos will show you how it’s done.

The first is the one that best explains the CRUCIAL step of “finding the hole”.

The second one goes into a bit more detail; I hate how the guy says “go ahead and” all the time, but his explanations are good.

The final one is hilarious for the way the woman “handles” her male model, but it’s also instructive and explains the crucial step adequately.

Enjoy!

“To its devotees the bow tie suggests iconoclasm of an Old World sort, a fusty adherence to a contrarian point of view. The bow tie hints at intellectualism, real or feigned, and sometimes suggests technical acumen, perhaps because it is so hard to tie. Bow ties are worn by magicians, country doctors, lawyers and professors and by people hoping to look like the above. But perhaps most of all, wearing a bow tie is a way of broadcasting an aggressive lack of concern for what other people think.” —Warren St John, The New York Times

Back to the stone age: a tale of two phones

Categories: English Geeky

So my iPhone fell and got damaged. To its credit, I’d hit it pretty hard several times in the past and it had survived. This time it didn’t, and I had to get a replacement, paying for it since it was out of warranty. The truly painful part, however, was spending a week without the perks of a modern smartphone.

I had to dig out my trusty five-year-old Nokia 7210 (not the Supernova; I mean the original funky-buttoned 7210), a stylish and compact phone which, however, is pretty featureless by modern standards. You can talk on the phone, send SMS (barely; I don’t know how I ever composed messages without a full QWERTY keyboard), and that’s about it. It has no camera and no network access, the screen only does 128 colors, uploading anything requires a tedious conversion process, and it only supports 4-voice polyphonic MIDI ringtones.

Falling back to the Nokia was due in no small part to the death of my BlackBerry’s lame battery; the ’berry would have been a decent temporary replacement for the iPhone, even though it’s not compatible with my data plan. So here’s a tip: when a phone is about to be left indefinitely in a drawer, remove the battery.

Being without the iPhone, what I missed the most was:

  • The QWERTY keyboard is, without a doubt, the most-missed feature. Whether virtual or real, it’s a necessity if you plan on composing a lot of text.
  • The camera. Believe it or not, it’s really useful for a lot of purposes.
  • Synchronization with my computer’s address book. Even a lesser phone can do it, but the Nokia lacks the connectivity (infrared only).
  • The browser. Being able to access the internet anywhere, anytime has become a true necessity.
  • E-mail. Not being able to receive emails periodically or, at least, on demand is crippling; it makes me feel out of touch and claustrophobic.
  • Music. I guess it’s a case of “if you have it, you will use it”; somehow, carrying the iPod around in addition to the Nokia didn’t seem like a good idea.

What I didn’t miss:

  • Ringtones. However weak the Nokia’s ringtone support is, it’s very loud and adequate, and my favorite ringtone ever (acceleration.mid) was available. I like it so much that I made an MP3 of it and loaded it onto the iPhone.
  • GPS. It’s cool to have, but I really don’t use it all that often.
  • Most of my games. I don’t play on the iPhone that often. I should point out that neither the Nokia nor the iPhone has the “snakes” game from older (and newer) Nokia phones; I guess this 7210 got stuck in the past.

Also, in case you hadn’t noticed, the entire point of this rant was to have a new post before the 12th and thus keep my blog updated “more than once every 6 months”.

The bridge of ignominy

Categories: Pinche México

La Jornada reports that, coincidentally, two of the bridges under construction in Mexico City will be finished in June rather than in August as originally planned.

But of course they want to put them “into operation” before the elections in July; that’s why the accelerated schedule is so “well-timed”. As usual with the GDF, the works will be sloppily finished at a forced pace, and for months afterwards workers will still be seen applying the final touches, as happened with the second deck of the Periférico, the Zaragoza interchange, the Eje 3 Oriente bridges and countless other projects.

And however much the capital, and the country in general, needs this infrastructure, it’s absurd to rush it like this just to “deliver” before the elections and give what is actually the government’s JOB an electoral use, even if they claim otherwise. However discreet they try to appear, they should realize that people aren’t fooled: everyone can see, months in advance, these electioneering maneuvers for what they are, an insult to the people who pay for the works with their own money, only to have them “sold” back to us in exchange for our votes.