Countdown bash function

To be put in your .bashrc. Combines a waiting period with a (simple) progress report.

countdown () {
    [ -n "$1" ] || return 1           # bail out if no count was given
    i=$1
    while [ "$i" -gt 0 ]; do
        i=$((i - 1))                  # decrement first, so we count down to 0
        echo -n "$i "
        sleep 1
    done
    echo "LIFTOFF"
}
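
Once the function is sourced (for instance with . ~/.bashrc), usage looks like this; note that it counts from one below the argument, since the decrement happens before the echo:

    $ countdown 3
    2 1 0 LIFTOFF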

The perfect keyboard layout?

I remember an easier time when all keyboards had the same layout (C-64, anyone?) and if you wanted to type special characters you had to resort to arcane command sequences, if they were at all possible.

My, how times have changed.

My first PC compatible had a Spanish keyboard, and you could quite simply tell the OS (MS-DOS) about your keyboard layout. For a while this worked pretty well. Then someone decided that Latin America was so different from Spain that we needed our very own keyboard layout, one that just moves things around needlessly, destroying many years of experience for those of us who were accustomed to the Spanish keyboard. I understand removing the ç, as it’s not used in Latin America, but why move everything else around?

Latin American Keyboard

So basically I got used to the Spanish keyboard, which has worked well in all kinds of OSes, from MS-DOS to Windows, OS/2 and, yes, Linux. The Latin American layout, meanwhile, was such a pariah that at some point its code (la) got taken over by the Latvian keyboard, so after a system upgrade your keyboard was suddenly in Latvian and you had to select “latam” for Latin America.

French Canadian Keyboard

Eventually I happened to get a laptop with a Canadian French keyboard. Luckily, this is not the dreaded French AZERTY keyboard, but basically an English layout with most symbol keys mapped very strangely. So if you want to type the basic alphabet you’re OK, just as you would be with an English keyboard, but things get weird when you need special characters or want to compose accents, cedillas and the like. This was so different from any other layout I’d used that I was basically freaking out. I could just ignore the red characters on my keyboard and use it as a plain English keyboard, but I routinely need to compose text in Spanish and in French, so how would I go about doing that?

And no, the ages-old trick of memorizing ASCII codes for special characters doesn’t cut it: for one, it’s unreliable on Linux (especially in graphical mode), and for another, it’s just primitive! I used to chuckle at all the people I’ve seen through the years with a nice “cheat sheet” taped to their desks listing ASCII codes for frequently used accented characters, as opposed to taking 15 minutes to configure their keyboards to do this natively.

So anyway, while checking out the available keyboard maps under Linux and trying to figure out how to type things on the Canadian keyboard, I came across this wonder of wonders: the US International with AltGr Dead Keys layout.

Basically, it takes the right Alt key (labeled AltGr on my keyboard, a monstrosity I was already used to from the Latin American and Spanish keyboards) and uses it to “compose” or “dead-key” stuff (dead keys work like accents: you press the accent key and the next letter you type comes out accented). In combination with ~, ", ' and `, this lets me type nearly all accented characters with relative ease.

Also, I can use AltGr+vowel to type acute-accented vowels (áéíóú), and AltGr+n for ñ.

Grave accents (è) and tilded letters (ã) can be composed with AltGr+accent (use ` for grave, ~ for tilde), followed by the letter you want to type.
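
If you want to try the layout without hunting through configuration dialogs, setxkbmap can switch to it on the fly. A minimal sketch, assuming an X session and the xkb data most distributions ship (where the variant is called altgr-intl):

    # switch the current X session to US International with AltGr dead keys
    setxkbmap -layout us -variant altgr-intl

    # revert to the plain US layout if it's not for you
    setxkbmap -layout us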

What I like about Linux’s keyboard selection thingy is that you can see an actual layout map. Thus, even if my keyboard doesn’t have the characters stenciled on it, I can take a quick peek and see where the stuff I need might be.

Thus I can type things like ç or €, all with a minimum of fuss. More complicated stuff like ï œ ø is also just one AltGr+key away. All this while preserving a layout that’s familiar to everyone (English), and where most of the strange characters used while programming ({}][\|~) are much easier to reach than on the Spanish keyboard I was used to (which needs AltGr for all sorts of braces and pipes, something that was very painful on my hands).

The actual US International with AltGr dead keys layout, as shown by the Gnome keyboard selection applet.

So there you have it: if you find yourself wrestling with choosing a good physical keyboard layout *and* making it work on your OS, stop pulling your hair out, get an English-layout keyboard and use US International with AltGr Dead Keys!
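
To make the choice stick you can use your desktop’s keyboard settings, or, on Debian and Ubuntu systems, edit the console-setup configuration directly; a minimal sketch, assuming the stock /etc/default/keyboard mechanism:

    # /etc/default/keyboard (Debian/Ubuntu)
    XKBLAYOUT="us"
    XKBVARIANT="altgr-intl"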

Can Ubuntu succeed? Comparing to iOS and Android

Last week during the Ubuntu Developer Summit, head honcho Mark Shuttleworth said something to the effect of “iOS and Android have managed to succeed despite Microsoft’s monopoly, and we haven’t” (see the keynote here). Over the next few days I thought about it a bit, and here’s what resulted.

I think it’s not quite as clear-cut as “they have done it and we haven’t”. Microsoft’s monopoly is on the desktop, and it is there that Ubuntu is going directly against Microsoft and perhaps, yes, failing to capture a significant percentage of the market. And I won’t go into the whole “Linux is better than Windows” debate.

Rather, let me point out a key fact about Android’s and iOS’s success: they both succeeded in a market where Microsoft wasn’t a dominant player. Before Apple unleashed the iPhone on the world, the smartphone market was very fragmented, with Microsoft a relevant player (Windows Mobile) but nowhere near the dominance it has on the desktop. Nokia (Symbian) and RIM (BlackBerry OS) were two big players, but both have since been relegated, one to irrelevance (Nokia, plus the deal with Microsoft), the other (RIM) to a frankly defensive posture, lacking a strategy and scrambling just to stop the exodus of users.

Now, even while Apple and Google are the two strongest players in the smartphone market, things are pretty much in a state of flux, and no platform can claim the stranglehold that Microsoft has on the desktop. So those companies are forced to innovate and stay on their toes. But the fact is that, even with a product that is clearly superior to previous offerings, any one of these companies would have had a hell of a time dethroning a hugely dominant player from the field.

Ubuntu’s challenge is greater, as it’s going head-on against Microsoft’s cash cow, and there’s no real competition on the desktop. The only other mainstream operating system with any success is Mac OS X. Apple is content with whatever niche they’ve carved for themselves, and it’s clear to anyone that the strides they’ve made in the past decade owe more to the halo effect of the iPod and iPhone than to OS X’s (admittedly great) merits. So yes, they have a superior product, but that still hasn’t propelled them beyond a 10% market share. While I’m at it, let me comment: it’s easy to forget that the first versions of OS X were kludgy, slow and difficult to use, and had a myriad of usability problems. It was the iPod and then the iPhone that propelled Apple from a fringe player into the powerhouse they are today. In the end, Apple realizes that promoting Mac OS X is not worth a big effort, and that the momentum from the iPod and iPhone is enough to keep OS X alive.

So what does Ubuntu need to succeed on the desktop? I have no special insight into this topic, but let’s just recognize that it’s not as clear-cut as looking at, and imitating, Android’s and Apple’s successes, because as I’ve said, their playing field was a vastly different one. Would a “halo-effect device” help Ubuntu the way the iPhone helped Mac sales? Maybe. Maybe all Ubuntu needs is endurance, as even hugely dominant players (Ford, IBM, WordPerfect, Netscape) can be surpassed under the right circumstances.

Ubuntu and Community Testing

During Ubuntu Developer Summit (UDS), held last week (May 9-13) in Budapest, Hungary, a very interesting program was discussed. It’s the Ubuntu Friendly program. The end product of Ubuntu Friendly should be a way for people to find out whether a particular computer system can run Ubuntu properly (it’s friendly!). This has many possible uses, not the least of which is to enable people to choose a system on which Ubuntu has a good chance of working without much tinkering by the end-user. This is important in a world where most people compare a preinstalled Windows system (which has had most of the dirty driver installation/enablement work done by the manufacturer) with a from-scratch Ubuntu installation, where Ubuntu is expected to pick up all the hardware and work adequately with it.

Because of this last scenario/requirement, in my opinion, installing Ubuntu is already a much cleaner/friendlier experience than installing Windows; on my laptop, a Samsung QX410, Ubuntu has a few glitches that require manual configuration (touchpad, hotkeys), but the system is immediately usable out of the box. The same can’t be said of Windows, where a plethora of device drivers is required for even the most basic devices to work. However, since the system is purchased with this work already done, in users’ minds the Ubuntu experience is not as polished as the Windows one.

I digress. So the Ubuntu Friendly program seeks to award Ubuntu Friendly status to those systems the community finds work well with Ubuntu. This in a way replaces the Ubuntu Ready program, where manufacturers were responsible for running the tests on their systems. It also complements the existing Ubuntu Certified program, where hardware is tested in-house by Canonical under more stringent standards, and an official Canonical-sanctioned certificate is issued to machines deemed certified, as in, working with Ubuntu out of the box.

Needless to say, there’s a lot of interest from the community in this Friendly program; it’s a win-win situation where the community can contribute valuable testing and results, and the world becomes a better place through what will be a large list of systems that the community has identified as being “Friendly” to Ubuntu.

During UDS, the sessions where this program was discussed were a great success; attendance was good, and I was glad to see people from outside Canonical’s Hardware Certification team participate. Yes, there was a lot of interest and participation from community members too. Plenty of valid concerns and good ideas were discussed, and even though an extra session was scheduled for the program, every session ran out of time with people still wanting to participate.

All in all it’s a very interesting program, one that will hopefully take shape quite soon. If you’re interested in seeing what this is all about, here’s the blueprint with a more formal description of what is being planned.

The myth of better device support on Windows

It’s long been argued that peripheral support in Linux is far inferior to that under Windows, and that this has been a factor in Windows’ dominance of the desktop. More and more, the myth that Windows has any kind of technical superiority is giving way to the fact that marketing, and being bundled with nearly every PC sold worldwide, are the only keys to Windows’ widespread adoption. Here’s a story to prove that point.

I bought a printer (HP Photosmart C4780). It’s one of those cheap, $50 numbers that eat through ink like crazy. So I come home, wondering if I’ll have to install the 500 MB of crap included on the bundled CD to get the printer working with my Mac at home.

As is usually the case with the Mac, I just plugged it in and it worked, both the printer and the scanner, without a hitch.

I then proceeded to do the same on a freshly installed Ubuntu 10.10 laptop. Same story: the printer just worked, and Ubuntu even recognized it the moment it was plugged in, with no need to install drivers or anything.
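
If you’re curious about what happened behind the scenes, CUPS (the printing system Ubuntu uses) can show you; a quick sketch, where the queue name is just an example and yours will differ:

    # confirm the printer shows up on the USB bus
    lsusb | grep -i hewlett

    # list the print queues CUPS set up automatically
    lpstat -p -d

    # send a quick test job (replace the queue name with yours)
    lp -d Photosmart_C4700 /etc/hosts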

Now, on Windows the printer wouldn’t have worked at all without installing a boatload of crap; HP is notoriously bloaty when it comes to bundled software.

The usual wisdom is that hardware manufacturers care more about Windows and ship all their hardware with drivers and utilities to make it work there. The burden, then, is on Apple and the Linux distributions to provide drivers and support for most hardware, which sounds like a daunting task. But they do it, and the end result is that Mac OS and most Linux distros include drivers for everything, right out of the box. This puts them a step ahead of Windows when it comes to ease of use, at the cost of perhaps a little bloat. Still, my Ubuntu installation is much leaner than the 16-GB behemoth that is Windows 7.

So there you have it, the myth of better hardware support on Windows, finally debunked.

Now, if I could only get the braindead wireless support on the HP printer to work…
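
(If you’re in the same boat: HPLIP, HP’s Linux printing package, ships an hp-setup tool that is supposed to handle network/wireless printer configuration; I can’t vouch for how well it copes with this particular model.)

    # HPLIP's setup tool, in interactive (console) mode
    hp-setup -i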

Flash Sucks

A world without Flash?

I’ve always been a hater of Macromedia/Adobe Flash. Now that the entire Apple-Adobe controversy has rekindled the debate over whether the web is a better or worse place because of Flash, I’ve realized why it is I don’t like Flash.

Also, I realized most technically-inclined people dislike Flash too, because they recognize a lot of its shortcomings, unlike the layperson who only cares about the web being pretty, full of animations and beeps and stuff.

Now, before I begin, let me state this: I’m griping about Flash as a web content creation platform/tool. I couldn’t care less about its use as a mobile development tool. A lot of bloggers have expressed more informed opinions on this topic.

For me, a true Flash hater, what Flash does is take control away from the end user, the consumer of content, and hand it to the content creator, the designer.

If you’re the designer this is all fine and dandy; you can control exactly what the user sees, you can tell your application to be exactly this many pixels wide and this many pixels high, and dictate how it looks and behaves down to the pixel and the microsecond. This is why designers love Flash; it not only lets them work in a familiar environment with familiar tools, but also gives them complete control over how and what the user sees and can do.

By the way, don’t be fooled; a designer who claims to know web design but uses only Flash is not a web designer. Flash was created to let designers (Adobe’s primary clientele) say (untruthfully) that they can design web sites.

Web page scaling FAIL (Santander)

The problem is, the web wasn’t meant to be this way. Fundamentally, the kind of content the web was created for was meant to empower the user. This is why the web browser was designed from the very beginning not to impose those very parameters (width, height, fonts, and so on); the content should adjust to whatever the user’s agent can display. So web content reflows to adapt to your browser; it should degrade gracefully for systems that lack a certain capability (think Lynx and visually impaired users). It should also allow me, the user, to alter how it looks and is rendered.

This is why I can disable cookies and JavaScript, replace or even remove altogether the CSS used to format my content, decide not to display images, and so on. Even the most complex non-Flash web page consists of text and images, and with a bit of cleverness I can take both and incorporate them into the rest of my workflow: paste them into a document, translate them, email them to someone else. The possibilities are limitless, since web content is delivered to me as-is, as bytes I can not only look at but also manipulate like any other kind of information on my computer.

This freedom is lost on a Flash-only (or mostly) website. What’s worse, instead of the content being, well, content, stuff I can get out of the browser and process and manipulate in other ways, it becomes merely an image, a photograph or a movie trapped in the clutches of the Flash plugin. I can’t copy the text, I can’t scroll except through the provisions the designer made for me, I can’t easily extract the audio or the images, and I’m basically limited, not by the constraints of my browser, but by those set forth by both Adobe through its display plugin, and the designer. And let’s face it, most designers are also clueless about user interfaces and ease-of-use, unlike the people who designed my web browser, which is rendered mostly useless on a Flash site.

It is this loss of freedom that makes Flash so dangerous, and why I think it would be a good thing for Flash to disappear eventually.

Flash adds nothing of true value to the Web. We could all live happily without the animations, without the desktop-apps-masquerading-as-web-apps made in Flash (write a Web app from the ground up, it’s not that hard), and without all the stupid content that forces me to work its way instead of my way. And luckily, thanks to the advent of HTML5, we won’t need Flash even for the one thing for which it has proven indispensable: web video. Because, let’s face it, web video was Flash’s killer application; everything else that could once be done only in Flash is now doable in AJAX, CSS and JavaScript. And honestly, if Flash had been such a good technology for those things, we would have stayed with it and not bothered with anything else.

If anything, the existence of so many alternatives to Flash and whatever it can do, is evidence that the world at large truly does not like Flash.

Java: what’s the point?

What’s the point of being type-strict if you can typecast anyway?

What’s the point of being object-oriented if the language is so byzantine it forces you into procedural hacks every second step?

What’s the point of having such a huge class library when, at the end of the day, your Java implementation doesn’t behave consistently? (J2ME, I’m talking to you and your Hashtable and Vector classes and their lack of toArray.)

What’s the point of the compiler being so pesky and anal if it can’t even catch scope-related variable disappearance? In a method, a local variable declaration will silently shadow an instance variable of the same name. I mean, if the compiler complains about *everything* else, why in hell doesn’t it complain about THIS?

Bleh.

ColdHeat soldering tool – don’t buy it!

ColdHeat Pro
How nice to begin the new year with a disappointment. I bought a ColdHeat Pro soldering tool, thinking I might use it for quick soldering jobs, since getting out the corded soldering iron is a bit of a ritual.

I made the fatal mistake of not reading the reviews online (such as this and this), which warn that the ColdHeat might not, well, live up to the hype.

As it turns out, the reviews are pretty accurate. The ColdHeat turns a simple task, such as soldering two wires together, into an absurdly difficult affair. Something that would take even me, a pretty inexperienced solderer, a few minutes with the corded iron turned out to be impossible with the ColdHeat. Maybe it’s just that I’m dumb, but the fact remains that the ColdHeat plainly didn’t work for me.

About my only consolation is that the thing was pretty inexpensive!