10.8.10

Random content generator


var links = [];
var i = 12;
while (i--) links.push({
  title: 'This is a title',
  article_text:
    new Array(((100 * Math.random()) | 0) + 1)
      .join('Lots of dolorem in my ipsum. '),
  image:
    'http://9gag.com/photo/' +
    ((32000 * Math.random()) | 0) +
    '_540.jpg'
});
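The Array-join trick above has a sneaky off-by-one worth spelling out (my note, not part of the original snippet):

```javascript
// new Array(n + 1).join(s) repeats s exactly n times: joining n + 1 empty
// slots inserts the separator between them n times.
var three = new Array(4).join('ab');      // 'ababab' (3 copies)
var n = ((100 * Math.random()) | 0) + 1;  // 1..100 -- note the parentheses around |0
var repeated = new Array(n + 1).join('x'); // n copies of 'x'
```

Without the parentheses, `|` binds looser than `+`, and `(100*Math.random())|0 + 1` quietly becomes `rand | 1` instead of `rand + 1`.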


Boom.

7.8.10

Relative font sizes



The discussion over whether pixels (px) or ems (em) are the better unit for sizing content on web pages often ends with the argument that all modern browsers can scale content defined in pixels. That is true, as far as it goes.

Where this argument falls short is that it puts the burden on the user to make the conscious decision to change the font size. The mental process could go like this:

I'm having trouble reading this. I'm squinting. The text is too small. I'll press... what is it? Oh, Ctrl and +. One, two, three... no, one less, so Ctrl and -. There.

The triviality of the example is balanced by how important it is to avoid disrupting user interaction. Any action we require from a user (in this case, attention to what we write) is impoverished when that attention is divided between the task we want performed and adjusting the viewport.

The whole point of affordance and usability is precisely to take the burden of thinking off the user, so that the activity becomes enjoyable. Would I like my readers to enjoy my blog? Yes. Would I like my buyers to enjoy purchasing on my website? Yes. Would you?

Pixels miss an additional element: there is research regarding the ideal width for a line of text (which I defer to you to investigate and debate). When using pixels, the rhythm and proportion of your page are much more difficult to preserve, because font size and line width become independent concerns. If instead the whole layout depends on font size, a person designing and coding a website has a lot more control over the aesthetics without additional cognitive burden.
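A minimal sketch of what an em-driven layout looks like (class names here are illustrative, not from this site):

```css
html { font-size: 100%; }  /* leave the user's preferred size in charge */
body { line-height: 1.5; }
.article { width: 33em; }  /* line length tracks the font, keeping the measure stable */
h1 { font-size: 2em; }     /* headings scale along with everything else */
```

Change the root font size, and the whole composition scales together.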

Author tweaks site css...

*ahem*. Got it?

28.6.10

JavaScript timestamp formatting

I've run into this scenario enough times that I think it's faster to post it and search for it than to open an old file :)

Assume that you're working with a timestamp (number of milliseconds since January 1, 1970), for example:
var t = +new Date;

If you want to make that a relative time like "10 minutes ago", you can use this simple function:

function format(timestamp){
  var now = +new Date;
  var diff = (now - timestamp)/1000; // seconds
  if (diff < 60) return Math.floor(diff) + ' seconds ago';
  if ((diff/=60) < 60) return Math.floor(diff) + ' minutes ago';
  if ((diff/=60) < 24) return Math.floor(diff) + ' hours ago';
  if ((diff/=24) < 7) return Math.floor(diff) + ' days ago';
  if ((diff/=7) < 5) return Math.floor(diff) + ' weeks ago';
  var date = new Date(timestamp);
  var month = ['January', 'February', 'March',
               'April', 'May', 'June', 'July',
               'August', 'September', 'October',
               'November', 'December'][date.getMonth()];
  var YEAR = 365*24*60*60*1000; // in milliseconds, like the timestamps
  var year = (now - timestamp < YEAR) ? '' : ' ' + date.getFullYear();
  return month + ' ' + date.getDate() + year;
}
I hope you find this useful.

Update

Tinkering around, I noticed that Blogger has this:
function relative_time(time_value) {
  var values = time_value.split(" ");
  time_value = values[1] + " " + values[2] + ", " + values[5] + " " + values[3];
  var parsed_date = Date.parse(time_value);
  var relative_to = (arguments.length > 1) ? arguments[1] : new Date();
  var delta = parseInt((relative_to.getTime() - parsed_date) / 1000);
  delta = delta + (relative_to.getTimezoneOffset() * 60);

  if (delta < 60) {
    return 'less than a minute ago';
  } else if(delta < 120) {
    return 'about a minute ago';
  } else if(delta < (60*60)) {
    return (parseInt(delta / 60)).toString() + ' minutes ago';
  } else if(delta < (120*60)) {
    return 'about an hour ago';
  } else if(delta < (24*60*60)) {
    return 'about ' + (parseInt(delta / 3600)).toString() + ' hours ago';
  } else if(delta < (48*60*60)) {
    return '1 day ago';
  } else {
    return (parseInt(delta / 86400)).toString() + ' days ago';
  }
}

11.5.10

Internet Explorer slows down the internet

Sounds like an accusation. It is.

Take web standards, for example. Don't try to feed us your crap about how you have participated and are actively participating in all sorts of initiatives. That's great, but it's the people you employ who are doing that. It is the enthusiasm of human creativity.

Now take JavaScript. Other browsers have implemented version 1.6, 1.7, 1.8 and introduced elegant language features that allow for very cool things to be written with very little cost (size or performance, for example).

But because the majority of people still use a shitty browser (I grant you that it has gotten better, but let's not stray from the point), web developers are forever stuck programming to the lowest common denominator, only because IE6 has to be supported. Or because IE7 or IE8 incorrectly implemented something.

How can a browser slow down the internet? By slowing down progress: it keeps programmers from evolving and educating at higher levels. Language evolution becomes a mere academic exercise, not something that can be popularized through mass adoption. Forget about great ideas; they won't work for your clients.

Can anything be done? At all?
I wait for inertia to catch up. IE will fall. The weight of the giant will leave it behind.

Bah.

6.2.10

Where OAuth and everything social fails

Okay, I get it. The interwebs is like the real world, but binary. We all want to play in it and feel safe. How come it hasn't taken off?

Let's take a hypothetical example, site X. If they have made a big enough name for themselves, I'll trust them. Why? Perhaps because everyone else does. The power of the masses is such that even if we're all wrong, I feel we can all rebel at the same time and it will be okay, which beats landing on some random site that ends up stealing my identity and my cats.

I've just mentioned the fundamental problem: trust. How do you trust a service or a corporation that you cannot see, chase, yell at or run over when it pisses you off? After all, the people in customer service are just human employees; they are not the corporation. Those running the corporation are the closest thing, but who knows what's going on up there. Maybe they are nice and fix problems when they come up, or maybe they laugh at you and close your account.

This is the kind of situation that keeps legitimate businesses from flourishing: lack of trust.

Back to site X. It looks legit. Okay, that means someone knows how to keep up with web standards, etc. That's a good sign: dedication. It's no guarantee, however.

Let's say site X gets acquired by Y, a rogue company without any particular concern for the individuals involved. The site has OAuth and whateverconnect, which allows your identity provider (say, Google or Facebook) to pull the plug on the service if it goes rogue. Unfortunately, the provider can only prevent the rogue company from doing evil on your behalf. It cannot take back the information the company has already acquired about you: information cannot be untold.

This is the fundamental problem between real life (meatware) and virtual life (software): the model doesn't work.

Can you imagine walking into a coffee shop and automatically telling everyone in it your home address and phone number? That's the equivalent of an email address. Giving you a way to contact me back doesn't mean I have to give you a way to contact everyone I know.

Message to identity providers: allow people to see my avatar and nickname and to contact me through your service. That's it. Not my email address, not my contacts, not my fan clubs. It's okay if you aggregate my behavioral data along with everyone else's and profit from that information, as long as you protect my identity. That's your end of the deal in exchange for the information I choose to give you. I do buy stuff, I do need ads, and I understand you need to know about me to show me relevant stuff. Just don't sell my identity. Protect it. Your company depends on the trust that your users have.

Furthermore, knowing who I am doesn't mean you should know everything I do. In a way, that is equivalent to having someone follow you around, even if it's just to figure out what you like best. It's creepy and it's not because I'm doing something I'm ashamed of. It has to do with moods. If I'm in the mood for entertainment, I want to watch entertainment-related ads, not diapers. I don't need a reminder of what I should be doing instead, that's why I'm watching sports in the first place. That's why you shouldn't bundle my identity with my physical container. Each mood is a different account. Don't mix them together.

For the people implementing and designing these standards: pay close attention to how these interactions work in the physical world. After all, these protocols have been around for thousands of years and have evolved to their current state based on what works best: evolution and selection. Being smart is not a substitute for insight and patience, it's just shortsighted.

Before I climb off my soap box, let me add this tangential note: if I'm paying for insurance, I don't just expect to get some money for my stolen motorcycle. I also expect not to have to fill out 30 forms that are just seemingly random combinations of the same information: claim number, name, address, lien holder, who did my service last time... ugh. I get it, the more information the better, but your forms are ridiculous. They are the modern equivalent of torture. Neither physical nor terrible, but very, very annoying. Pay attention to your job; there are people on the other side.

(off the soap box)

My duty as an engineer is to realize that what I create, other people consume. I can make their lives better or worse.

My duty as a human is to realize that my attitude is other people's experience of life, and that if something sucks it is because someone made it that way, even if not on purpose.

My footprint is someone else's path.

I'm in your beaches

2.2.10

Users are the beating heart of the internet.

Users flock to a site; and then what?
Why do they?

The internet is a vast universe. The bleeding edge is always too far away to see, as is the long tail of users. The reality is, there are new people coming online all the time; a constant stream of confused newbies. How do we capture them? How do we create an appealing product?

Timing and marketing are a separate, but crucial component. That is dealt with by people more competent than I am. Technologically, however, I ask these questions: What needs to happen on Gravity to make people want to come back often? What needs to happen to make people feel like sharing stories of good quality? How do we cut down on the chattiness and crank up the quality of content?

This is the generic problem that everyone with a dream faces.

We need to ask each user: what do you like? What interests you? What do you already do? Users look for information; they filter, create and shuffle information. Most importantly, users need feedback. People asking for your advice, people letting you know when your opinion is valuable.

A nerdy way to see it: a user, like our system, has an input, and an output.
The output can be interpreted (and predicted) if the input is known. In terms of quantity, the input is a relative positive (it increments the information in the system), while the output, compared to the input, can be less (information was filtered), more (information was created), dissimilar (shuffled), or a combination of these. All these outputs are embedded with the ideas and feelings of each user, and increasing quality will bring an increasing number of users to the site.

From the information consumption perspective, the user should always be shown the best content first, diminishing in relevance as she scrolls down. From the technical perspective, how does this ranking happen?

There needs to be a very transparent system for tracking user intention: assigning positive and negative points between users and items, and ranking them accordingly. Examples of explicit interaction: liking and orbiting. Weak indicators could be clicks, scrolls and mouse movements. These scores could be propagated to make educated guesses.

Intrinsic value for an object is determined by the positive connections pointing at it. In other words, only external interest determines the value of the object in question, and only relative to those consuming it. When it comes to users and what they post, having someone else like your stuff is the only way to earn credibility, regardless of how good it is according to you.
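As a toy sketch of that scoring idea (a hypothetical model with made-up names, not Gravity's actual ranking):

```javascript
// An item's intrinsic value is the count of positive points from users other
// than its author; ranking just sorts by that score, best content first.
function score(item) {
  return item.likes.filter(function (user) {
    return user !== item.author; // only external interest counts
  }).length;
}

function rank(items) {
  return items.slice().sort(function (a, b) { return score(b) - score(a); });
}

var items = [
  { id: 'a', author: 'ana', likes: ['ana', 'bob'] },        // one external like
  { id: 'b', author: 'bob', likes: ['ana', 'cam', 'dee'] }  // three external likes
];
rank(items).map(function (i) { return i.id; }); // ['b', 'a']
```

Weak signals (clicks, scrolls, mouse movements) could feed the same score with smaller weights.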

However, this is not the whole story. A company's success depends on how useful it is to its users. The metric of usefulness for each user will depend on how often users want to come back and invite the ones they trust. Again, it's a matter of propagating good stuff to the people you care about.

27.1.10

Spacetime sync'ing

When two points are moving through space and time, there is a warp in the spacetime fabric between them, caused by relativity effects.

When computers leave the factory, their clocks are set and Quality Control ensures they are beating in sync. As they are shipped and handled they invariably end up taking different routes through space and time which could make the clocks differ.

If an observer measures the clocks, the frequency and the numerical value might appear the same at both points. But this implies a third, observing, absolute point.

But how do only two points do it?

If one computer sends its current time value, there IS a delay when comparing:

function compare(tick){
  return tick == new Date;
}

Seems like a difficult task to make that function return true.
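An aside on why (my note): with ==, a Date coerces to its string form, so the comparison is lost before timing even enters the picture.

```javascript
// In a loose comparison, an object converts with the default hint, which a
// Date resolves as a *string* ("Fri Feb 13 ..."); comparing a number to that
// string coerces the string to NaN, so the test can never be true.
var t = +new Date;
t == new Date(t);    // false, regardless of any delay
t === +new Date(t);  // true: compare number to number
```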

However, let's assume that the clocks don't differ, and that the warping is actually in the medium through which the computers are communicating. We start with the assumption that the epoch (the origin, or zero) is the same for all computers, and attribute any difference in synchronicity directly to a warp in the communication medium.

Furthermore, assume that the computers in question are physically the same computer, but in different positions in time. Could we send messages back and forth between them?

Analyzing a possible method:

Taking a binary stream (since that is what electronics communicate through), measure the length between changes. That is, instead of counting how many times a 1 or a 0 was encountered, only measure how long the value remains unchanged. An unwarped system does not keep data and time as separate information, because all distances are unitary; therefore we assume that time is constant.

To function as a reliable channel and data storage node, the information exchanged must be unchanged and encrypted. Assuming that each node has its own resolution of time, its unique id would be its time signature, as measured by itself. This unique id is what serves as a crypto key, and not even the client knows it; the key itself is encrypted into the memory, as a signature of how the memory is allocated.

Using then time as part of the data, the problem becomes a matter of storing the data, unwarped.

Since we assume that the two nodes cannot control the warp in the medium, to comply with being a reliable data store each node can assume that it will get what it expects. The first test of the system is its ability to function like a memory:

function recall(reference){
  // the very first thing: to record the first data point
  var now = Number(new Date);
  // while no change, return nothing
  if (!reference) return;

  var callee = arguments.callee;
  var caller = callee.caller;
  
  setTimeout(function(){
  // callback with data response
    caller.call(null, callee[reference]);
  // in this unwarped medium, the ratio remains constant
  // so a client would expect a spacetime delay directly
  // proportional to the data given. In other words, the
  // function takes a predictable amount of time to execute
  }, now / Number(new Date));
}

var bytestream = '11000010';
// calculating lengths = 2, 4, 1, 1:

function measure(){
  var start = Number(new Date);
  var caller = arguments.callee.caller;
  var input = Number(arguments[0]) || 0;
  // function serves as its own memory
  var memory = arguments.callee;
  var response = memory[input];
  setTimeout(function(){
    caller.call(caller, response);
  }, Number(response || 0)/start);
}

Can this work?

27.12.09

Playful

//TODO: get code prettifier for blog :)

I feel like creating art with javascript.


<!doctype html>
<style>
*{margin:0;padding:0;border:0}
body{
height:100%;
width:100%;
}
div{
background:red;
margin:auto;
}
</style>
<body>
<script>
console.log('ok', Phi = Math.sqrt(5)/2 + .5);

(function(step){
  console.log(step);

  plot(step);
  
  if (step > Phi) arguments.callee.call(this, step / Phi);
  else setTimeout(arguments.callee, step * Phi);

})(100); // seed the initial step so the cascade starts

/**
 * Example plot function
 * @param n The step number from balancer function
 */
function plot(n){
  // font size allows for a one-time value change in the page that will affect
  // relatively-defined units, in this case divs with em dimensions.
  var last = parseInt(document.body.style.fontSize)/100 || 0;
  n = n || last;
  // Pick relatively larger number
  // This number is a relative maximum
  var max = Math.sqrt((last || 1) * n) * 100;
  document.body.style.fontSize = max + '%';
  
  // relative rendering number
  var rel = n/max;
  
  var block = document.createElement('div');
  
  block.style.backgroundColor = 'red';
  
  block.style.opacity = block.style.MozOpacity = n/100;
  block.style.height = '1em';
  block.style.width = rel * 50 + '%';
  document.body.appendChild(block);
  if (window.scrollMaxY) document.body.removeChild(document.body.firstChild);
  if (n < 0) document.body.innerHTML = '';
}
</script>
</body>

12.12.09

Secure string interpolation

While working on giddy/gNius, I ran across this article about secure string interpolation in JavaScript, written by the folks at Google's Caja. Very well written, and nowhere close to my "this should work" approach. Check it out.

25.6.09

Structure

An ideal software development environment would be, among other things, reliable.

Currently, systems use tests to keep expectations and reality linked to some degree. Compilers, for example, can also test the code to an extent, but only as far as it is syntactically correct. But there are at least three kinds of errors: syntax, runtime and intent. The first two are familiar; the third signals a disconnect between user expectation and running code. A bug.

I've been working for a while on what I see as The Software Engineering Problem. You know, programmers who are demotivated, project managers who are stressed, designers who are frustrated, etc. I come from a classical engineering background: mechanical, electrical, automotive. As such, I am often taken aback by things happening in software that would just not fly in the meat world. There's a great article about it in Linux Journal.

This is where the system I envision fits: connecting human expectations with computer execution, with minimal explicit human interaction. It is what robots did for Detroit: automate lower level tasks to free resources for higher level planning, with much higher quality results.

For the time being, the intent behind this prototyping UI environment is to minimize the time spent cranking out functionality. That time can be cut, for example, by a meta-testing program that collaborates with the developer to write unit tests automatically; or by making interfaces so easy to write that people with more visual skills don't also need to know a great deal of programming, or rely on someone else (and on the quality of the communication between them) to build a mockup or prototype. This bit alone could amount to several months saved per small feature.

By focusing on human intent, we can build computer systems that are far better, because the name of the problem is Human-Computer Interaction. Even more so if we are aiming for the birth of artificial intelligence, or the emergence of a hive mind. Previously the idea had been to focus on the user, but it seems that engineers were left behind there. "The CLIENT is who we focus on, not the lowly developer. You do what you have to do to rake in the money." Not so much.

I don't know if there's even such a place, but I'm just glad it's not really like that for me. That doesn't mean that it couldn't be improved.

Finally, "capture requirements in the user's terms, and then to try to create an implementation language as isomorphic as possible to the user's descriptions, so that the mapping between requirements and implementation is as direct as possible" --Language Oriented Programming

5.5.09

How to make engineers quit

1. Put them to work with inefficient tools. Good examples: spaghetti code, slow or unneeded compiles and server restarts, infinitely complicated instructions, and a billion systems that only a true god could figure out how to make work together.

2. Inundate them with feature requests and bugs, making sure that the deadlines sound reasonable for an optimal environment. After all, their time should be well accounted for.

So far, you already have a winning recipe. Your engineers will be swamped every time they try to do something because, ideally, it should take as long as they estimated in the first place, even though they made that estimate without really knowing the state of the underlying code.

Of course, you can't know the state of the code until you actually try working with it. If the engineer is not experienced enough, he or she can't figure out that it's not supposed to be this hard in the first place. This makes the engineer feel stupid.

There are more ways to make this fun:

3. Reward those who are eager to write code, but inexperienced. This ensures you get a lot of features at the cost of exploding code. Tell yourself that as long as something is documented, then that's all it needs.

Remember, users never come first. A good and often-used example of this is ignoring that any code written is meant to be used by another, different human. Make sure you don't design for simplicity, but for something more tangible, like page load speed.

4. Don't pay attention to engineers warning about the state of the tools. If the young'uns can deal with it, then clearly that's just an excuse to slack off.

Stay tuned for your next issue: How to lose your business to your ex-employees in a few years. Hint: it has something to do with being able to push features faster.

17.4.09

Anonymous Construct Pattern

I've been furiously working on "the red pill"... this is the latest

JavaScript. A powerful language, misunderstood.

So many people have tried to tame it with libraries. Many have failed, some have succeeded.

The idea behind a library, be it local, company-wide or open source, is to abstract the imperfect parts: not all browsers behave the same, and it's extremely difficult to tame them. Gathering all this knowledge into a library saves a lot of time.

Many have tried to put some structure into JavaScript, but have succeeded only in creating Java-ish. Classes, supers, the works.

A good abstraction is one that we don't have to think about. It just works and it's easy to use.

Let's say that we want to make the powerful parts of JavaScript friendlier for kids who shouldn't play with fire. Let's tame closures for them and offer the scopes they are familiar with: private and public. Classical inheritance is a sack of potatoes: heavy (when full of potatoes) and old school.

Let's say we create a translator for the rest.

Part I: Construct


  • An anonymous construct does not pollute; it is humble. Only its UID is exposed.

    Construct's instance and static (immutable) scaffolds:

    /**
     * @constructor
     */
    (window.Construct = function(){
      // instance
    }).prototype = new function(){
      // static
    };

    Private members, public members:

    (window.Construct = function(){
      this.private = {};
    }).prototype = new function(){
      this.public = {};
    };

    Closures:

    (window.Construct = function(){
      var closure = {};
      this.private = function(){
        with (closure) {
          // private, with access to closure objects
        }
      };
    }).prototype = new function(){
      var closure = {};
      this.public = function(){
        with (closure) {
          // public, with access to closure objects
        }
      };
    };
10.2.09

The time is coming

Will you miss out?

new Date(1234567890123);
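For the record, that's 1234567890 seconds (and change) after the epoch:

```javascript
// The epoch-second odometer rolls over to 1234567890 on a Friday the 13th:
new Date(1234567890 * 1000).toUTCString(); // "Fri, 13 Feb 2009 23:31:30 GMT"
```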

16.12.08

This is why I don't get any work done

(function () {
    var lyrics = [];
    ['badger', 'mushroom', 'snake'].forEach(function(word) {
        window[word] = function(){ lyrics.push(word); };
    });
    function repeat(f, n) {
        for (var i=0; i <= n; ++i) {
            f();
        }
    }
    repeat(function(){
        repeat(badger,11);
        repeat(mushroom,2);
    }, 3);
    repeat(badger,11);
    repeat(snake,4);
    alert(lyrics.join());
})();

I'm sure you remember the song.

21.8.08

Interface design

Nobody doubts that the interfaces of a shiny new iPhone or MacBook were thoroughly thought through. Most of the tangible things we use have a thought process behind them, reflecting that people thought about how these things were to be used. From kitchen tools to a car dashboard, it's evident that this interface design is present.

Lately, as the web evolves, the same concepts are ported into new disciplines: interface design becomes a crucial component of any respectable web site or application. I can't say that most interfaces have been thought through, but the concept is there.

Unfortunately, when it comes to programming interfaces, one can sadly see that they have not been thought through at all.

I'm taking a break from my first attempt at a GWT application to give you an example:

1. To create an Eclipse project:
projectCreator -eclipse MyProject
2. Then, you need to create the application skeleton:
applicationCreator -eclipse MyProject com.mycompany.client.MyApplication

Looks super simple, but:
- Why do I need to repeat -eclipse MyProject? Can't applicationCreator detect an existing Eclipse project? Apparently not.
- From the example, it seems like I should be able to create com.failcorp.something.MyTestApplication, right? No. I get a warning:
Please use 'client' as the final package, as in 'com.example.foo.client.MyApp'. It isn't technically necessary, but this tool enforces the best practice.

Uh, if you're going to enforce it, couldn't the example make it clear that you can change everything except client? You can change com, example, foo and MyApp, but not client. That's a bit obtuse.

No disrespect. GWT is the product of very hard work and deserves lots of props. This example does not in any way represent the quality of GWT, but it does portray a problem that the vast majority of software tools and web applications have: their interfaces are designed by people familiar with the solution --and sometimes not even familiar with the problem, as I point out below--. The result? People familiar with the problem and unfamiliar with the solution spend an unacceptable amount of time learning how to use the tools. This so pervades our programming culture that nobody makes a public stink about it. Once you learn it, it's not a problem; and if you haven't learned it yet, then you should probably feel stupid until you do.

The issue is becoming more obvious in web applications: the features are driven by CEOs, managers, sales people, etc. People who "know" the landscape. How come programmers often end up not implementing what clients most often ask for? Who is the product for? What's worse, these product designers often don't even use the product and are unfamiliar with the problems it attempts to solve. This is why the dogfood concept is so important: if you don't use it, you'll never know whether it's useful.

I mentioned the hardware design of Apple products before: it might not be perfect, but it's superior to the average. Here's an example from the software world: jQuery. This tiny library does about the same as many others. Maybe it's faster, maybe it's smaller, maybe it's hotly debated. What is really different is that it was designed to be easy to use, and it really is. The difference is so vast that it's actually difficult to express to someone who is used to clunky APIs.

Ultimately, is a tool designed only to solve a problem, or does it keep its users in mind? Writing an API or command-line tool should NOT be like slapping buttons on a box.

21.5.08

Enabling the debug menu in Safari (Mac/Win)

In Safari/Mac, from a terminal:
defaults write com.apple.Safari IncludeDebugMenu 1

In Safari/Win:
1. In your favorite XML editor, open:
c:\Documents and Settings\your_username\Application Data\Apple Computer\Safari\Preferences.plist
2. Add, just before the closing </dict>:
<key>IncludeDebugMenu</key>
<true/>

3. Restart Safari

6.3.08

Firebug firepalooza

Today I discovered two fantastic extensions to Firebug.

First, I found that Steve Souders (creator of YSlow, another excellent Firebug extension) wrote LiveCoder, which allows you to edit and save your code on the fly. It has been on my wish/todo list for a while now, as I use Firebug pretty heavily for testing changes on the fly (especially CSS). Pretty handy when you're making pixel-perfect (emperfect?) layouts or trying to decipher a bug. Seems natural that you should be able to save your changes once you've made them, no? WYSIWYG, people, it's obvious.

I hesitated about posting twice on the same day because... well, I want to watch Lost. But I just came across Firecookie and I really think you people should check both out. Firecookie allows you to tinker with your cookies (not that you couldn't with other extensions) within Firebug.

Firefox 3 and JavaScript 1.8

I finally grew a spine and decided to take the plunge into Firefox 3 beta. Half of my extensions don't work, but only one of them is something I use/need: CS Lite (update: available on addonsmirror). All of these work: Adblock Plus, Firebug (1.1.0b10), GMarks, Google Notebook, Greasemonkey, Secure Login and Stylish. Not bad.

About 15 seconds into my adventure I crashed it. Well, I crash everything pretty quickly after I start using it; I think it's a gift. My dad called it "duodenum hands", although back then it wasn't about programs, it was about... objects in general. Wikipedia says the duodenum is where most chemical digestion takes place. As in, destroys everything.

From the what's new in 3.0b3 page, I promptly went to "New in JavaScript 1.8". First thing on the list: Expression Closures.

Are you serious? Expression closures? OK, it's not obvious what it is. It just means you can write function(x) x*x instead of function(x){ return x*x; } when you're using a function as an expression, like setTimeout(function(){...}, 10);. It also means it's useless!

First, the only advantage of this feature is that you'll save exactly 8 characters. That's it. Then, nobody supports it. Well, that's understandable, since it's JavaScript 1.8... but I didn't just mean browsers, I meant IDEs. That's right, you'll be happily coding along with your fancy expression closures and your Aptana will balk, whine and nag. Expression closures!

Next on the list: generator expressions. Sweet! Surely you remember that array comprehensions have been available since JS 1.7. Particularly noteworthy is that generators aren't run until needed, unlike an array comprehension, which is calculated ahead of time. This can be a ginormous performance difference... and you already know this, since it's in your everyday cup o' Python.
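To make the lazy/eager distinction concrete, a sketch in today's standard syntax (ES2015 function*, rather than the old JS 1.7/1.8 forms):

```javascript
// A generator computes values only when they are pulled; a mapped array
// computes everything up front.
function* squares() {
  for (var i = 0; ; i++) yield i * i; // infinite, yet cheap to create
}
var lazy = squares();  // instant: nothing computed yet
lazy.next().value;     // 0
lazy.next().value;     // 1
var eager = [0, 1, 2].map(function (i) { return i * i; }); // all computed at once
```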

Hm, that doesn't work as well as Java. Java sucks anyway. Just kidding. But not really.

Skip a bit further down and you'll see JSON decoding and encoding. While it's not here yet, it's quite interesting. It heralds the next wave of applications: faster, more robust and sort of working in IE11 or whatever :)

Take the plunge. It's fast.

29.1.08

Inverse-Proportioned Columns

Today I accidentally blew my mind with a table layout. Given the following [simplified/spare me validation] code:


<table border=1>
<tr>
<td width=50%>hello</td>
</tr>
</table>


If you decrease the width of the column, you'll see the table grow. If you increase it (up to 100%), you'll see the table shrink. Get it? Reduce the width and the thing grows!

The explanation is simple, however. When a table's layout is auto and a column is defined as a percentage, the table width is calculated as the column's content width multiplied by the inverse of that percentage. In other words, a 20% cell will occupy 1/5 of the table's width. This logic is loosely based on the assumption that you'll have a corresponding number of cells per row (in this case, 5), but nothing requires that to be so.

Verdict: USELESS
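The arithmetic of that rule, as I read it (a sketch of the behavior described above, not the spec's actual algorithm):

```javascript
// Table width = the cell's content width times the inverse of its percentage.
function tableWidth(contentWidth, pct) {
  return contentWidth * (100 / pct);
}
tableWidth(60, 50); // 120: halve the percentage...
tableWidth(60, 25); // 240: ...and the table doubles
```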

23.1.08

Jaxer

John Resig writes about server-side JavaScript with Jaxer on his blog. Basically, it's an extremely simple way to set up a JavaScript server, with the things you would expect a server to be able to do, such as writing and reading files.

I first ran into Jaxer a couple of days ago while doing the latest update of the Aptana plugin for Eclipse. I didn't give it much attention, other than getting upset at the ugly icon on the toolbar. OK, it's not ugly so much as inconsistent. The last thing I want is another "feature" being shoved down my throat (like the Aptana messages — grrr).

Hey, if John thinks it's cool, then it is. His jQuery library is not half bad and his book "Pro JavaScript Techniques" is not terrible either,* so there might be something to Jaxer.

Dion Almaer also posts about it on Ajaxian, adding that the database connection could be integrated with Gears. One thing that immediately caught my eye is that you can use JavaScript 1.8 — on the server.

Personally, I'd like to work on a project where there's only JavaScript to deal with, on both the server and the client side. OK, maybe lately I've been hating C++ more than the average person does, but wouldn't it be nice to use the same language on both sides? To have both sides talk to each other without complicated setups? That's Jaxer's promise. We'll see how it pans out.



*My sarcastic signature comments may confuse the unacquainted. John Resig has earned my respect through his visionary work, most notably jQuery and lately his excellent book.